School Futures

Questionable Assumptions About Virtual Education

As school districts began to offer online education in the wake of Covid-19, many questions were raised. Would virtual education really help stem the tide of the pandemic? Would the cost of childcare put an undue burden on poor parents, many of whom do not have the luxury of staying home with their children? What would be the long-term sociological, academic, and economic impact on teachers and school districts if they endorsed a type of quasi-homeschooling model?

These are all important questions. Interestingly, however, both advocates and opponents of virtual classrooms tend to hold certain basic assumptions in common. In particular, it is widely assumed that these questions are rightly the domain of measurable cost-benefit analysis. While there is widespread disagreement on the factors we should be assessing in comparing classroom to computer learning (e.g., should we be looking at infection rates, or math grades, or enhanced technological literacy, or the economic viability of childcare for two-income families?), most agree that the types of questions being asked can be settled by weighing the benefits against the costs.


Education & Cost-Benefit

The cost-benefit approach to education has been behind the steady rise in virtual education over the last fifteen years. Just as stores have found that they can save costs and reach a larger base by moving online, so many schools, universities, and teachers are finding that virtual classrooms are more efficient.

When I worked as a consultant for an online learning company from 2012–2017, hardly a week went by when I didn't hear comments about the superiority of online learning over what was pejoratively termed "brick and mortar classrooms." There was even a pejorative for paper: "dead trees."

From the Factory to the Classroom

This move towards online education has been part of a larger trend to use machines to achieve greater efficiency. This trend has its roots in the "scientific management" movement of the nineteenth century, which was itself an outgrowth of the industrial revolution.

Frederick Winslow Taylor (1856–1915), the guru of scientific management, increased workplace productivity for factory owners through a strict application of "time-and-motion studies." For Taylor, labor became an exact science, based on minimizing inputs and maximizing outputs. Taylor encapsulated his worldview in his comment that "in the past the man has been first; in the future the system must be first."

Taylor's ethic of efficiency was mediated to middle-class Americans in the early decades of the twentieth century, after portable electricity created new rhythms and routines for their households. Today, Frederick Taylor's pragmatic world of clockwork, standardization, productivity, and efficiency forms the backdrop to the contemporary automation movement, as well as to trends in modern education. Let's take a closer look at modern education's debt to the optimization guru.

When the Common Core initiatives were rolled out in 2010, their architects were candid about the Taylor-type underpinnings of their project. Yet the initiatives merely made explicit the cost-benefit assumptions of the industrial revolution that now permeate nearly all educational models. The idea that more is always better, the tendency to prioritize quantity over quality, the uncritical acceptance of a productivity mindset, the implicit subsidizing of fast thinking over slow thinking, the notion that education is only for career-readiness, and the elimination of non-quantifiable ends from learning—all these are marks of the utilitarian ethic of the industrial revolution.

Perhaps the most telling example of "factory thinking" applied to education is the drift towards virtual classrooms. Not only does online education enable teachers to leverage cutting-edge learning tools, but it also can be delivered at a fraction of the cost of traditional education.

The Politics of Virtuality

For politicians and economists who have been trained to think in cost-benefit terms, it has long seemed obvious that institutions of learning should migrate online whenever possible. On this way of thinking, the Covid-19 crisis simply sped up the inevitable march towards greater efficiency—a march that will only be complete when every school has an online component.

If you think I'm exaggerating, consider an article that Jeb Bush published in the Washington Post on May 3, 2020, titled "It's time to embrace distance learning—and not just because of the coronavirus." In the article Bush pointed to the work schools were doing to quickly move online, and then suggested that such changes should be made permanent:

It's time to learn the lessons from these heroic efforts and plan for a future in which public education can continue without access to classrooms—not just because of a pandemic but because that's the future of learning. . . . Learning is no longer modeled on the traditional classroom but has become digital, individualized and delivered on smartphones or laptops.

Bush was echoed, two days later, by New York governor Andrew Cuomo, who went so far as to question whether traditional classrooms served any useful function at all. For Governor Cuomo, the entire infrastructure of school buildings and classrooms is a relic of a pre-technological past:

The old model of everybody goes and sits in the classroom, and the teacher is in front of that classroom and teaches that class, and you do that all across the city, all across the state, all these buildings, all these physical classrooms—why, with all the technology you have?

That same week, Tal Frankfurt, a technology consultant for Forbes magazine, declared that the Covid-19 crisis would help accelerate the usually slow process of change that accompanies technological advance. "These moments of global emergency," he wrote, "have the invariable ability to progress [in] technology and the wide-scale implementation of technology, in ways previously not thought possible." Frankfurt continued:

With a positive outlook, this crisis can be viewed as a sort of "bypass" button for the application of technological processes and thought patterns that would have taken many more years to adopt in a time of relative peace. One could say that a positive takeaway from disaster is its recurring ability to turn something once viewed as impossible into an accepted aspect of a new reality.

Questioning the New Reality

Certainly, from the perspective of achieving greater optimization by measurable criteria, online education will always win out. After all, virtual learning platforms can operate at a fraction of the cost of a traditional school, saving thousands of dollars in building maintenance and utility costs. Moreover, technology enables education to become a deliverable commodity that can be mass-produced, with decreased inputs and increased outputs. Through digital tools, educators can measure the productivity of their teaching while continually customizing content delivery for greater efficiency.

But what if our unacknowledged assumptions about efficiency are completely wrong? What if some of the most important aspects of education—the bond between student and teacher, the cultivation of curiosity, attentiveness, imagination, and sensitivity to beauty—are not things that can easily be measured by quantifiable criteria? Moreover, what if there are detrimental side effects to virtual learning, and what if these side effects don't become apparent until a generation or two has passed?

Perspectives from Neuroscience

These are sobering questions, and they are made more urgent by preliminary research suggesting that long periods of time spent in front of a screen may be harming students in ways that, while not always quantifiable, should nevertheless be of concern to parents and educators.

Nicholas Carr collected some of this research in his book The Shallows (2010). He synthesized a string of studies showing that material processed through screens is handled by different parts of the brain than printed material is. Some of the liabilities associated with absorbing information from a screen include:

• Reduced long-term memory;

• Erosion of conceptual and contextual thinking;

• Reduced ability to grasp over-arching narratives of meaning (the big picture);

• Reduced ability to make connections between different ideas and facts;

• Reduced ability to put knowledge into schemas.

Perspectives from History

Historians also have an important role to play here, though politicians, economists, and social planners do not generally consider historical perspectives, since such perspectives elude a purely cost-benefit approach.

Scholars specializing in the history of technology tell us that every technological change—including those that are unquestionably for the better—always comes with trade-offs. Even when technological change is inevitable and beneficial, understanding the side effects can help users get the most out of the new technology as they interact with their tools in a way that is critical and self-aware.

Historians also tell us that new communication technologies do not simply serve a passive function as a delivery vehicle, but actually change the way humans interact with, and think about, the content being delivered. It may have become a cliché, but sometimes the medium really is the message. Clocks changed the way we thought about time; maps changed the way we thought about space; calendars changed the way we thought about the seasons; and undoubtedly, digitally mediated learning will change the way we think about education.

To recognize these patterns, even while working to leverage new technologies in positive ways, is not to be a neo-Luddite, but simply to be a thoughtful human being. One can recognize that the clock has been beneficial while simultaneously trying to understand how clocks have influenced our understanding of time. Similarly, one can recognize the benefits of online education while also remaining critical of the way digitally mediated learning is changing our understanding of education.

Confusing Completion with Learning

What are some ways that digitally mediated learning could change our understanding of education? That is a question that David Smith of Calvin University set out to answer in a three-year research project. His study, which used a variety of research modalities and involved thousands of hours of observation, culminated in the book Digital Life Together: The Challenge of Technology for Christian Schools (2020).

While talking to Ken Myers of Mars Hill Audio, Smith commented, "Because efficiency is a strong cultural value for us, we instinctively measure success in terms of being able to get more done in less time." He continued:

We've got a culture that has efficiency as a strong cultural value. . . . And then you look at the way schooling is often structured—and there's research on this going back years—that schooling often does tend to encourage a productivity mindset, so students get the impression over time that what really counts, what's really going to keep them out of trouble and keep teachers happy with them, is just getting stuff done by the deadline. And the good student is the one who gets all the tasks ticked off, who writes more pages, who gets everything turned in early.

We can probably all relate to Smith's observation from our own schooldays. The emphasis on efficiency that is so predominant in Western society in general, and American society in particular, has led most of us to approach learning as a type of cost-benefit game: how can I be the most productive in the least amount of time? Although teachers themselves may give a different message, the structure of the system incentivizes students to excel at quick task completion.

Where Smith's research breaks new ground is in discovering how technologies are reinforcing the deep-seated cultural values of efficiency and productivity. Through classroom observation, focus groups, case studies, surveys, and documentary evidence, Smith and his team found that digitally mediated learning is helping to cement the concept that learning is about task-completion. Again, from Smith's interview with Myers:

And then you add to that the technology layer: you now have devices that let you track more tasks. So it's interesting for me that the technological devices let us all keep more complex to-do lists and more complex calendars, and just keep track of more things, which again is a way of enabling us to pack more things into more space. So with those three layers going on—the structure of schooling, the cultural value, and now the technological values enabling that—you have a strong set of pressures to experience life as an endless succession of getting more tasks done promptly enough to please the person who assigned them to you.

And that has a couple of effects in the learning environment. For the student, one effect it has is that they can start confusing completion with learning. For example, a large portion of students told us on the surveys that they had used the technology to copy answers from online sources into worksheets without understanding what the answers meant. So you use the copy-and-paste function, you find what looks like the right sentence on Wikipedia, you paste it into the worksheet, you turn the worksheet in, the teacher's happy, you get credit, you get a good grade. And it's kind of weird because that shouldn't count as completing the task, but if you've internalized this sense that what the teacher really cares about is getting stuff done, then it feels like completing the task. And of course, you're not actually learning, which was supposed to be the purpose of being in school.

Likewise with reading. If you read and your goal is simply to get the reading done, then you can use the search function in your PDF reader to find the sentence that has the answer to the question on the worksheet in it, then you don't have to read the whole article, you don't have to follow the argument in it, you don't need to learn to think in more complex ways. You just need to find the answer, and the technology lets you do that quicker and therefore find more answers. But it's short-circuiting other forms of learning. So there's a malformation that's going on, that's about rapid task-completion, rather than learning to think, learning to contemplate, learning to appreciate, learning to follow arguments, and so on.

Question the Assumptions

There can be no doubt that online education has been a blessing, and it would be rash to question whether technology has a legitimate role to play in the days ahead. But perhaps we do need to question the emerging assumptions about technology's capabilities, including the idea that anything traditional education can do, online education can do better. Back in the 1990s, Steve Jobs, the late Apple co-founder, admitted that he had come to rethink some of the unacknowledged assumptions behind the movement to fuse education with computer technology. In a 1996 interview with Wired, he said:

I used to think that technology could help education. I've probably spearheaded giving away more computer equipment to schools than anybody else on the planet. But I've had to come to the inevitable conclusion that the problem is not one that technology can hope to solve. What's wrong with education cannot be fixed with technology. No amount of technology will make a dent. . . . Lincoln did not have a Web site at the log cabin where his parents home-schooled him, and he turned out pretty interesting. Historical precedent shows that we can turn out amazing human beings without technology. Precedent also shows that we can turn out very uninteresting human beings with technology. It's not as simple as you think when you're in your 20s—that technology's going to change the world. In some ways it will, in some ways it won't.

If Steve Jobs came to question the simplistic assumptions behind technology-mediated education, perhaps we should, too. 

Phillips has a Master's in History from King's College London and a Master's in Library Science through the University of Oklahoma. He is the blog and media managing editor for the Fellowship of St. James and a regular contributor to Touchstone and Salvo. He has worked as a ghost-writer, in addition to writing for a variety of publications, including the Colson Center, World Magazine, and The Symbolic World. Phillips is the author of Gratitude in Life's Trenches (Ancient Faith, 2020) and Rediscovering the Goodness of Creation (Ancient Faith, 2023) and co-author with Joshua Pauling of We're All Cyborgs Now (Basilian Media & Publishing, forthcoming). He operates the substack "The Epimethean" and blogs at

This article originally appeared in Salvo, Issue #56, Spring 2021.

