Fail fast, not spectacularly


Aug 15, 2013

MOOC stalwart Udacity made news again recently, and it wasn’t of the positive, swooning variety.

Udacity announced in January that it would partner with San Jose State University (SJSU) to serve 100 students—half from SJSU and half from nearby community colleges and high schools—in three remedial courses online.

The results from this experiment—one that was celebrated widely when it was first announced—emerged recently, and they were not spectacular. Whereas in traditional remedial classes at SJSU, reportedly 74 percent of students were able to pass the course, in the online version from Udacity, no more than 51 percent were able to pass any of the three classes. What was widely celebrated is now being widely derided.

There are a few things to take away from this moment.

First, as my colleague Michelle Rhee-Weise wrote, citing this as a failure based on a comparison with previous remedial courses is a bit strange given that half of the students were not SJSU students, but came from other community colleges, high schools, and the military.

Second, it’s worth remembering that disruptive innovations often start out worse than the existing products and services. This is why it is important that they not compete with those offerings at the outset, but instead compete against nonconsumption, where the alternative is literally nothing at all, so that the disruption is infinitely better. Once a disruptive innovation gains a foothold, it is then able to improve to serve more demanding users. Serving high school students would seem to be one of those areas of nonconsumption from the perspective of SJSU. So, too, would be using online learning to expand capacity for the hundreds of thousands of students on waiting lists at California colleges who have no other alternative at the moment.

Third, it’s worth applauding Udacity and SJSU for putting their partnership on pause when things were not working perfectly so that they could revisit some assumptions, make some changes, and then try to move forward again. Too often organizations don’t approach innovations as the experiments they are and either stick stubbornly to a plan that isn’t working (and they won’t admit isn’t working) or abandon something completely rather than iterate accordingly. With students in the mix, hitting pause to improve was the right call, but it’s one that all too many organizations don’t take.

Still, it’s worth asking: could Udacity and SJSU have done even better on this score? That is, rather than run a full course and then publicize its failures, could they have taken an approach in which they failed faster with far lower stakes, so that when they hit the ground with their first version, they had a better chance of success?

The answer is yes.

The first step would be to admit that their strategy—like those of all the MOOC companies—is still an emergent one (as opposed to a deliberate one), and therefore they should have followed a discovery-driven planning process.

In a discovery-driven planning process, the key is to start with the desired outcome in mind. From there, the crucial next step is to list all of the assumptions that must prove true to realize the desired outcomes. With the assumptions in hand, organizations then implement a plan to learn—a way to test as quickly and cheaply as possible whether the critical assumptions are reasonable. If they prove true, then organizations can invest in executing the strategy. If assumptions prove false or uncertain, then organizations can change accordingly or continue to test before they have gone too far.

It’s worth thinking about some assumptions that Udacity and SJSU made as they embarked on this path together that they could have tested before they implemented their plan.

One assumption was around pacing. The courses were implemented on a traditional semester schedule. The assumption was that these students could be successful under these traditional, factory-model circumstances. But this was an assumption that the two could have known would not hold before they began the course. As Udacity CEO Sebastian Thrun told EdSurge, “Sal Khan has strong data that says in math in particular, a more flexible pacing is important for success… He’s been preaching go at your own pace and you can turn a C-level student to an A-level student.” In the interview, Thrun also observed that Foothill College’s “Math My Way” program doubled its student pass rates by giving students more time. The evidence around mastery, or competency-based, learning has shown this for a long time. The two institutions didn’t need to test this assumption by running the course; there was enough evidence to have proven it faulty beforehand, so that they could have changed the assumptions around pacing and time accordingly to give themselves a better chance of success.

A second assumption the two made was that students wouldn’t mind navigating multiple platforms. In the course, while instruction was delivered via Udacity’s platform, assessments were delivered via a separate learning management system that SJSU uses. Again, this would have been an easy assumption to test ahead of time with just a few students in an environment that did not rely on students taking a full course.

A third assumption that they made implicitly was that the SJSU professors would create good courses. There were several early warning signs that this might be a problematic assumption. The most obvious one was that with only two weeks between the announcement of the partnership and the beginning of the course, the professors creating the curriculum were still writing it when the courses began because they didn’t have much time to prepare. A second warning sign is that, generally speaking, most faculty members at traditional institutions don’t know much about sound pedagogy, let alone how to create good teaching practices online. BYU-Idaho learned as much when it created its first online offerings. Faculty members put their courses online first, and the results were not great. As has been chronicled in multiple places, when BYU-Idaho then brought in instructional designers to redo the courses with the faculty, the results improved dramatically. What’s also interesting is that Udacity itself had already seemed to learn this lesson, as it has increasingly worked with instructional designers and others focused on good teaching to create its courses rather than star professors who have great name recognition. Too bad the two weren’t able to question this assumption ahead of time and change direction. The resulting courses lacked clearly communicated deadlines for assignments, which created big problems, especially in a semester-long course with an ultimate deadline.

A fourth assumption the two made was that students would know naturally that this wasn’t like a normal MOOC and that there were tutors available to help. As EdSurge reported, “Initially many students were unaware of the online tutors (who are real people) who were available on line to help, 12 hours a day. But over the weeks, it became clear that the tutoring services were crucial.” There is a long literature showing that having teachers in online courses is critical; their role is just different from that of traditional teachers. Communicating clearly in advance that there were tutors to help, and how to reach them, would have seemed like a no-brainer had the partners questioned this unwitting assumption.

Although offering only a few courses is a great way to learn and adjust, given the high stakes involved in this particular case—with even Governor Jerry Brown touting the partnership—Udacity and SJSU could have done better by testing and failing even faster. Then maybe the newspapers would not have had to deviate from their regularly scheduled adoring storyline.

Michael B. Horn

Michael is a co-founder and distinguished fellow at the Clayton Christensen Institute. He currently works as a principal consultant for Entangled Solutions.