History and Background: Digital Learning

Although proponents frequently tout the innovations of digital learning and their potential for radical transformation of education, many of the theories and practices of digital learning actually pre-date these newer technologies.

Educational psychologists in the early 20th century, for example, believed machines could automate teaching and testing. Technologies, such as radio, film, and television, were introduced into classrooms long before computers — often with the same sorts of justifications we hear today: These technologies will “personalize” education; they will allow teachers to work with more students; they will allow students to move at their own pace through course materials. Digital learning might seem new, but it has a long history.

Universities were some of the primary sites for the development of computing technology in the 1950s and 1960s, so it should be no surprise that some of the earliest applications of computers were, in fact, in education.

First Wave: “Computer-Assisted Instruction”

The first wave of digital learning started when computers consisted only of big mainframes (connected to a few terminals or screens) loaded with programs that offered what became known as “computer-assisted instruction” or CAI.

Modeled in many ways on the earlier, pre-digital work of programmed instruction espoused by the behaviorist B. F. Skinner, the first versions of CAI were typically programs loaded onto a computer that presented a student with a question or problem to which they had to key in a response. Some systems would simply record whether the answer was right or wrong; others would congratulate the student for a correct answer or ask the student to try again after a wrong one. These systems would keep a record of each student’s progress so that, ideally, the questions would perfectly match the student’s knowledge and skill level.

Stanford University professor Patrick Suppes, developer of early CAI programs including Dial-a-Drill, argued — as had earlier proponents of teaching machines — that computers would “individualize” education.

“The computer makes the individualization of instruction easier,” Suppes wrote, “because it can be programmed to follow each student’s history of learning successes and failures and to use his past performance as a basis for selecting new problems and new concepts to which he should be exposed next.”

The computer, he believed, would act as a personal tutor and take over classroom instruction from the teacher. He predicted in a 1966 Scientific American article that “in a few more years, millions of school children will have access to what Philip of Macedon’s son Alexander enjoyed as a royal prerogative: the personal services of a tutor as well-informed and responsive as Aristotle.”

Constructionists v. Instructionists

MIT professor Seymour Papert vigorously opposed this type of education technology, viewing it as a misuse of the great creative and intellectual potential of computers for learning.

Papert and his colleagues developed the programming language Logo based on his theory of learning (based in turn on the work of the Swiss psychologist Jean Piaget): that knowledge was developed through construction, not instruction. Designed for children to learn how to program, Logo enabled children to draw graphics – both on-screen and with an external robot “turtle” equipped with a pen attachment.

In his 1980 book Mindstorms, Papert wrote that “In most contemporary educational situations where children come into contact with computers the computer is used to put children through their paces, to provide exercises of an appropriate level of difficulty, to provide feedback, and to dispense information. The computer programming the child. In the Logo environment the relationship is reversed: The child, even at preschool ages, is in control: The child programs the computer. And in teaching the computer how to think, children embark on an exploration about how they themselves think. … Thinking about thinking turns the child into an epistemologist.”

As computing moved from mainframes to personal computers, this divide between “instructionist” and “constructionist” approaches to digital learning continued.

Bill Gates stepped down from his full-time role at Microsoft in 2008, turning his focus to philanthropy. His foundation, which has poured billions of dollars into education technology initiatives, has heavily influenced the shape and direction of digital learning since. (Constructionism is “bullshit,” Gates told Wired magazine in 2011; the Gates Foundation, no surprise, prioritized computer-assisted instruction and its claims of “personalized learning.”)

Meanwhile, Apple seemed to promote more “constructionist” opportunities by touting the possibilities of creative computing for education. (Apple even included Logo in its educational software bundle.)

The Shift to the Internet

By the late 1990s, broadband and cable connections were replacing slow dial-up service, and the internet became the preferred distribution path for digitized lessons and curricula. A growing share of education technologies ran on the web rather than as programs loaded onto individual computers. Universities like Columbia and Yale invested heavily in online courses, offering alumni and others access to college-level (but, typically, noncredit) coursework. These initiatives never quite succeeded, and when the dot-com bubble burst in 2000, many investors (and pundits and educators) soured on the promises of digital learning.

But as access to the internet continued to spread, new non-school players, such as Lynda Weinman (starting in 2002) and Sal Khan (starting in 2004), attracted huge audiences to their online lessons. Corporate and traditional college interest in using technology in education soon rebounded. Internet companies such as Google entered the market, competing with incumbent players, such as Microsoft and Apple, to provide both the software and the hardware (and in the case of Google, email services) for schools.

By 2012, the hype was back in full swing as investors and colleges bet so heavily on massive open online courses that the New York Times declared it “the Year of the MOOC.” These courses, initially non-credit, attracted millions of students and were hailed as an alternative to the high cost of college tuition. But many of the MOOCs were simply taped lectures, and completion rates were abysmal: One study found that only 15% of MOOC students managed to complete their course.

MOOCs didn’t end up stealing students away from traditional colleges, in part because the colleges themselves started heavily marketing their own for-credit online courses. The combination of booming supply, aggressive marketing, and the sheer convenience of online courses for work- and family-stressed adults drove significant growth. The percentage of American college students enrolled in at least one online course rose from 26% in 2012 to 37% in 2019, for example.

But a federal crackdown on misleading advertisements made the market more difficult for many institutions, particularly for-profit colleges – long the leaders in online enrollments – forcing many of them out of business in the mid-2010s.

In addition, research began to show that online courses were uniquely unhelpful to disadvantaged students and that the shift to online instruction could worsen equity gaps in the education system.

Despite this, many schools and organizations continued to push for an expansion of online courses, even at the K-12 level, where standardized testing – thanks to government programs like Race to the Top – was also increasingly conducted digitally.

The Next Wave: COVID-19 and the 2020s 

In their 2008 book Disrupting Class, Clayton Christensen and Michael Horn predicted that by 2019, half of all high school classes would be taught on the internet.

But in 2019 only 21% of (public) high schools offered any online classes. While more students did take courses online (and more schools turned to “blended learning” — a combination of software-based and traditional instruction), most students continued to experience education in brick-and-mortar settings.

That is, until 2020 when the coronavirus pandemic forced everyone into digital learning.

With the outbreak of COVID-19, most schools moved to online learning. Many pundits once again predicted this was digital learning’s moment to shine, that the pandemic would allow it to showcase its efficiency and efficacy.

And in fact, people stuck at home set new MOOC enrollment records. The companies that emerged in 2012 to offer them – namely Coursera, edX, and Udacity – had pivoted to for-profit enterprises: they now charged for many of their classes and, in many cases, had partnered with universities to offer courses for credit.

However, the pandemic also demonstrated that many of ed-tech’s long-standing challenges persist: Many students don’t have access to devices or to high-speed internet. There are concerns about privacy and security of student data, particularly when it comes to online test-proctoring. The type of interactions that are best done online and via computer-assisted instruction are only a small portion of what happens each day in face-to-face learning scenarios at school.

And not surprisingly, many of the online classes that were created rapidly to respond to the pandemic were not well designed. This, compounded with the struggles that both teachers and students faced in teaching and learning during a pandemic, certainly tarnished digital learning’s “moment to shine.”

As students returned to face-to-face learning for the 2021-22 academic year, the results of the year-plus-long experiment with online learning are not entirely clear. It’s hard to tell if parents’, teachers’, and students’ frustrations with or embrace of digital learning will last; and it’s difficult to link any so-called “learning losses” – that is, declines in academic proficiency – to the technologies used during the pandemic as opposed to the general upheaval of the time period.

While there are proclamations that digital learning is “here to stay,” it’s too soon to tell if pandemic learning has been a temporary or permanent shift toward the digital. Certainly many ed-tech companies are banking on the latter, hoping the investments that school administrators have made in hardware and software will continue.
