Last month, General Assembly released a groundbreaking report, “Measuring What Matters,” detailing student outcomes in its full-time programs. The auditing firm KPMG LLP reviewed and validated the results, and the report lays out a wealth of useful data for students thinking about enrolling in General Assembly. For example, 92 percent of students who enrolled in a full-time program graduated. Of those, 76 percent participated in General Assembly’s Career Services program. And of those, 99 percent got a job within 180 days.
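It is worth chaining those figures together to see what they imply for an entering student. The short sketch below is my own illustrative arithmetic, not a calculation from the report, and it assumes each percentage applies to the group produced by the previous step:

```python
# Back-of-the-envelope funnel from the report's chained percentages.
# Each rate is assumed to apply to the cohort from the previous step.
enrolled = 1.00
graduated = enrolled * 0.92              # 92% of enrollees graduated
in_career_services = graduated * 0.76    # 76% of graduates used Career Services
placed_180_days = in_career_services * 0.99  # 99% of those got a job in 180 days

print(f"Share of all enrollees placed via this funnel: {placed_180_days:.1%}")
# -> about 69.2%. Graduates who skipped Career Services may also have found
#    jobs, so this is a lower bound on overall placement, not the headline rate.
```

The point of the exercise is that the eye-catching 99 percent applies only to the final subset; roughly seven in ten enrollees are accounted for by this particular funnel.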
I caught up with the founder and CEO of General Assembly, Jake Schwartz, to learn more about the process behind creating the report, the rationale for producing it, lessons learned, and next steps.
Michael Horn: What was the process by which General Assembly developed and produced this report?
Jake Schwartz: As it turned out, the process of getting our data to a point that it could be reviewed by a Big Four accounting firm was much more complex than we originally anticipated.
We learned quickly that to do this right, we’d need to work with not just one but two of the Big Four accounting firms: one to help us create the framework we would use to track and report these outcomes going forward, and eventually another, independent firm to actually conduct the review.
In some ways, it was as much about operations as it was about data collection. It’s not that our processes were all that far off, but the American Institute of Certified Public Accountants (AICPA) standards require a level of uniformity and precision that isn’t necessarily built into a business like ours, or, I’d imagine, into most educational programs or institutions.
Developing the framework and putting the processes in place to verify our data took about six months. Once we were able to share our data with our financial auditors, KPMG LLP, the review process took about another four months.
These are firms that work with major public companies on all sorts of complex issues, from carbon emissions to labor practices, where there are billions of dollars at stake. It gave us a really profound respect for the rigor that goes into producing these kinds of reports.
Horn: Why did GA choose this overall approach? And why these outcomes, in particular?
Schwartz: When we launched General Assembly in 2011, we knew that our “disruption” wouldn’t be rooted simply in offering educational programs that were shorter and lower cost, but rather in generating a real and quantifiable return on our students’ investment.
My first real job in business was for an investment manager, and we had all sorts of standards we had to comply with around how we were allowed to market and report our performance to our clients. That meant that if you looked at our returns versus another fund’s returns, you could make an apples-to-apples comparison, even when the other fund leveraged a different strategy. New investment strategies were constantly emerging, and the clients were presented with information that enabled them to decide what worked for their investment goals.
Similarly, our goal was to create a framework that would give prospective students and other stakeholders total transparency into outcomes for every student, one that could be applied across a range of similar educational programs and would provide a degree of objectivity without stifling innovation. With the first phase of this project, we focused exclusively on our full-time students, whose goals are fairly consistent (they want to switch into a new field) and where the outcomes involve much less subjectivity (the graduate either got a job or didn’t).
Horn: What can the higher education sector learn from nonfinancial reporting in other industries?
Schwartz: People like to compare what we do at General Assembly to the work of colleges and universities. But the truth is that measuring the outcomes for programs like ours is, in so many ways, much simpler than in higher education, which has all sorts of really profound externalities not just for students, but for local communities and society at large.
I mentioned investment management; I think that’s a powerful analogy and one that fits well in a context with as much diversity—and room to innovate—as education.
The field of measuring and reporting on nonfinancial metrics is also evolving in all sorts of exciting ways, and in a wide range of sectors. So much of a business’s value is tied up in nonfinancial metrics these days, with companies reporting on their cybersecurity, environmental impact, and labor policies. These are the sorts of topics that didn’t always have a place in a corporate report.
I’m surprised that we haven’t seen a shift toward more transparent reporting in education over the years. Imagine if the University of Phoenix had been required to report on a set of student outcomes metrics over the last 15 years. Some trends would have been a lot more obvious. Not only could it have saved students from incurring massive debt and investors from catastrophic losses, it could also have incentivized management to manage growth or invest differently to improve outcomes for students.
Horn: What surprised you the most throughout this process?
Schwartz: We certainly didn’t understand or appreciate (and I don’t think most executives do) the level of rigor required to comply with AICPA standards and get sign-off from a major global accounting firm. It’s really an operational challenge. Obviously, serving 100 students is different than serving 1,000 students … which is different than serving 10,000 students and so forth. The same is true for scaling our data and reporting on our students’ outcomes.
We also learned about the different levels of verification in the world of nonfinancial reporting (examination, review, agreed-upon procedures) and what they mean from an operational perspective. The process forced us to organize and store our data in accordance with the report framework we developed. This was actually a positive for our business: it shines a light on areas that we care a lot about, and it ensures that no single individual can operate to a different standard or hide the truth from the rest of the company. Given our scale relative to the rest of the industry, we felt that it was up to us to figure out how to create that accountability through detailed controls and transparency.
Horn: What are the next steps for GA as far as measuring and reporting student outcomes?
Schwartz: Well, first we want to get this report widely disseminated, and our hope is that it sets a new bar for what educational providers can do to demonstrate their effectiveness. We designed the framework to be open, and we’ve received a ton of great feedback on it already. Beyond that, we hope that this work can help inform a new paradigm of student outcomes-based regulation that might have broader applicability.
Next, we have all sorts of metrics we want to introduce in future student outcomes reports. As we gather more longitudinal data, we can report on results over time, such as salary growth in the first three years and the impact on part-time students’ careers. We are very eager to share that data as we continue to grow and learn about our students.