Complement and ‘commoditize’ teachers, but don’t substitute for them with tech

Dec 8, 2016

With technology’s presence growing in virtually every field, the specter of technological unemployment has raised its head again and again.

In education, the question of whether technology will replace teachers was a common meme used to fight the emergence of digital learning, but it has increasingly faded as a serious threat.

A new white paper, “Teaching in the Machine Age: How innovation can make bad teachers good and good teachers better,” by Thomas Arnett of the Clayton Christensen Institute uses the theory of disruptive innovation to show why, and it clarifies three ways in which technology is poised to emerge alongside teachers.

Arnett shows that there are three different circumstances in which technology is poised to grow in schools relative to teachers:

  • When schools lack expert teachers because of shortages stemming from geographic limitations or attrition, for example;
  • When expert teachers must serve a wide range of student needs in a single classroom by personalizing learning for each student;
  • And when expert teachers must teach more than academic content.

The basic pattern Arnett elucidates is not confined to education. In all industries, when a disruptive innovation emerges, it typically commoditizes expertise. That is, it takes the intuitive work that top experts do, much of which has never been codified and resides in the heads of a limited few, and distills it first into patterns and ultimately into rules housed in innovations that are relatively affordable and accessible.

This does two things. It enables people who are not experts in a given field to deliver the basic services of experts in that field. And it frees the experts to practice at the top of their craft.

For example, as Clayton Christensen, Jason Hwang, and Jerome Grossman wrote in The Innovator’s Prescription, as quantum mechanics and molecular physics have become better understood and applied, the codification in computer modeling software of how to create new synthetic materials has allowed many more people, who lack the expertise of the world’s top scientists and engineers, to participate in this industry. The impact on society has been breathtaking, in the form of many new materials for covering and building things. The progress, though, did not come from replicating the costly expertise of the top scientists and engineers in the field of synthetic materials, but from commoditizing it and enabling many more scientists and technicians to build on the experts’ initial work. That, in turn, has also allowed the top experts to work on new, challenging problems at the frontier of science.

Similarly, in health care, as Christensen et al. wrote:

Angioplasty has enabled cardiologists to treat many patients who otherwise would have been under the care of a cardiothoracic surgeon or who were ineligible for surgery altogether. Effective HIV medications, genotyping, and routine viral load surveillance have enabled primary care physicians to manage as outpatients those who were once complex inpatient cases treated by infectious disease specialists. Physician assistants, rather than primary care physicians, can adjust blood pressure medications or perform a diabetic patient’s routine examinations with less waiting time in the clinic. Nurses can perform tests for strep throat and prescribe pharmaceutical treatment at low-cost, conveniently located retail kiosks.

And in each of these cases, this has freed up the experts to spend their time tackling more complicated tasks at the upper echelon of their expertise. In other words, the emergence of a disruptive technology frees everyone to practice at the top of their expertise; in the case of health care, that runs from the cardiothoracic surgeons to the cardiologists, and from the primary care physicians to the physician assistants and nurses.

This pattern will hold in education. As Arnett writes:

Despite speculative claims that technology will eliminate the need for face-to-face teachers, teachers’ jobs are not as threatened as some might suggest. As artificial intelligence increasingly takes on human work, the most valued and secure human jobs will be those that require complex social skills—such as teaching. Good teachers do much more than just dispense information and assess students’ knowledge of rote facts and skills: they coach and mentor students, identify and address social and emotional factors affecting students’ learning, and provide students with expert feedback on complicated human skills such as critical thinking, creative problem solving, communication, and project management.

Ensuring that every student has a good teacher is a completely separate challenge, Arnett says. Technology is not a panacea for this problem, but it can help.

From my perspective, the challenge for educators now is twofold.

First, educators must recognize that education technology is not yet at a stage where it will faithfully replicate all that the top experts in the teaching craft do. That is both OK and not a reason to turn away from it. What it can do is empower non-experts, those who are not top teachers, to enhance their performance. That means that school leaders, teachers, union leaders, philanthropists, and others must get creative and comfortable with taking advantage of technology in combination with alternative staffing arrangements that use people in a variety of roles and teams. This may mean eliminating “classrooms” as we have known them and creating new learning environments in which students interact with many adults in multifaceted ways. It also means that many philanthropists eager to create the next “sexy” innovation in education should instead focus on scaling relatively simple combinations of technology with teachers, such as Station Rotation models. For many students nationwide, this would dramatically boost the quality of their education.

Second, top teachers must recognize that technology has a role in their classrooms as well, but that adopting it will mean ceasing to do certain parts of the job that have formerly been synonymous with teaching. For example, computers are good at delivering basic instruction, streamlining assessments, tracking student progress, and providing students with basic, immediate feedback. That means teachers may stop planning for whole-class instruction, delivering lectures, and grading basic tests. In exchange, teachers will be able to spend far more time serving each individual student regardless of where that student is in his or her learning; fostering deeper learning as they work with students on projects and facilitate rich discussions; and developing students’ social and emotional skills. As they do this, teachers will need to grow comfortable ceding some control to students and computers. They will also have to accept that the computer may not be quite as good as they are at certain aspects of the job, like delivering a particular lesson, but that the tradeoff is worth it for students given what teachers will now be able to devote their time to.

In my experience, neither of these things is easy. The complaints from many quarters that technology is “not good enough” are, in many cases, really just manifestations of one of these two challenges. By understanding more clearly how technology commoditizes expertise, and the benefits that follow as it does, educators can hopefully get past those complaints and unleash a brighter future for students and teachers.

As Arnett writes, effective teaching is demanding work, and teachers’ time is scarce. We can’t expect teachers to reach every single student effectively at scale without somehow reconfiguring their existing workloads.


Michael is a co-founder and distinguished fellow at the Clayton Christensen Institute. He currently serves as Chairman of the Clayton Christensen Institute and works as a senior strategist at Guild Education.