Why technology is not always the solution for better education

by Oubai Elkerdi, October 14, 2013

Until quite recently, I was a keen advocate of transforming education through technology. Over the years I was inspired by the ideas and works of pioneers such as Jane McGonigal, Katie Salen, Salman Khan (founder of Khan Academy), Steven Johnson (author of Everything Bad is Good for You), Douglas Thomas and John Seely Brown (authors of A New Culture of Learning), to name a few.

As a technophile and a student of engineering, I believed technology could allow us to interact with one another and with our environments in ways that would not only enrich our experience, but also enhance the condition of our species. Every new tool, I thought, would enable us to better understand and effectively tackle some of the world’s most challenging problems. With every advancement, I saw potential; with every breakthrough, a promise for a better tomorrow.

But as Nicholas Carr writes in his thought-provoking book, The Shallows: What the Internet Is Doing to Our Brains, “an honest appraisal of any new technology, or of progress in general, requires a sensitivity to what’s lost as well as what’s gained. We shouldn’t allow the glories of technology to blind our inner watchdog to the possibility that we’ve numbed an essential part of our self.”

For instance, video games may strengthen our visual-spatial intelligence by immersing us in virtual spaces where we need to learn how to rotate objects in our minds and navigate through various architectures and surroundings. But Carr admonishes that this gained ability “go[es] hand in hand with a weakening of our capacities for the kind of ‘deep processing’ that underpins ‘mindful knowledge acquisition, inductive analysis, critical thinking, imagination, and reflection.’”

In reality, the subtleties and complexities of the real world cannot possibly be encompassed by a computer—no matter how advanced or sophisticated technology becomes. In his manifesto, You Are Not a Gadget, the father of virtual reality technology and digital media guru Jaron Lanier remarks that technology often “captures a certain limited measurement of reality within a standardized system that removes any of the original source’s unique qualities.” This is because the algorithms and tools we develop are a reflection of our subjective understanding of the world, and our minds can neither comprehend nor represent a thing in its entirety.

Artist and internet anthropologist Jonathan Harris came to a similar conclusion after years of working on a series of highly original projects, each of which sought to explore innovative ways of understanding and celebrating personhood. With every project, Harris saw the limitations of his own tools and the difficulty of trying to capture depth and meaning using only digital information, an experience that drastically reshaped his philosophy of technology.

A powerful insight comes from a study by Professor James Evans at the University of Chicago. Evans examined 34 million articles, comparing scholarly papers written before and after the Internet entered academic research. He showed not only that papers written in the digital age drew on a narrower, less varied set of citations, but also that old-fashioned library searching widened the scholar’s horizons precisely because the process involved leafing through more or less unrelated articles before reaching the desired study. As Carr observes, “a search engine often draws our attention to a particular snippet of text, a few words or sentences that have strong relevance to whatever we’re searching for at the moment, while providing little incentive for taking in the work as a whole. We don’t see the forest when we search the Web. We don’t even see the trees. We see twigs and leaves.”

Something is getting lost in the equation

Anyone familiar with the history and philosophy of science knows that paradigm shifts occur not by reinforcing consensus and normal science, but by allowing dissent and divergent thinking. Scientific revolutions happen because brave minds search for possible explanations outside of the box, in unexplored territories. But how can algorithms that lock us in a particular mindset—usually the developer’s worldview—even enable us to question our basic assumptions about nature?

As Evans demonstrated, the tedious and seemingly irrelevant tasks we try to eliminate with every new technology turn out to be the most essential to our learning experience. They elevate us precisely because they fatigue us. Reducing human error by relying on computer efficiency makes our work less thoughtful and less original, and we do not end up learning as much as when we do the hard work.

Photographer Fulvio Bonavia offers an insightful view on the relationship between technology and art: “The big challenge for photography today is that digital makes it much easier to become a photographer, but it is even harder to become a very good photographer. When I worked by hand as an illustrator and graphic designer, I would spend an entire day to make by hand something that I can now do in two minutes with the computer. But all that time I used to spend was not wasted, as I think it made me grow better, teaching me the concentration, the patience, the precision, and the attention to not make a mistake.”

Recent studies in neuroplasticity have shown how every tool we use changes the physical structure of our brains in different ways. Carr elucidates this point using a familiar example: “A page of text viewed through a computer screen may seem similar to a page of printed text. But scrolling or clicking through a Web document involves physical actions and sensory stimuli very different from those involved in holding and turning the pages of a book or a magazine. Research has shown that the cognitive act of reading draws not just on our sense of sight but also on our sense of touch. It’s tactile as well as visual.”

Dozens of studies by psychiatrists, psychologists, neurobiologists, educators, and designers point to the same conclusion: when we go online or facilitate our education through digital technology, we enter environments that promote cursory reading, hurried and distracted thinking, and superficial learning. We think we benefit because we have come to define intelligence by the medium’s own standards. Carr puts it best: “As we come to rely on computers to mediate our understanding of the world, it is our intelligence that flattens into artificial intelligence.”

Seeing so many entrepreneurs and education leaders introduce technology into classrooms, refugee camps, and other places in the hope of democratizing learning worries me more than it gives me hope. It is wonderful to see so many well-intentioned people take an interest in education. Good intentions alone, however, are not enough; in the urgent words of Elias Aboujaoude, “those effects deserve to be understood, studied, and discussed.”

It is especially important for the technology enthusiast to learn about the studies mentioned here and question new-age rhetoric, so that we may deepen our understanding of what is involved and what is at stake. I will conclude with the great words of Jaron Lanier:

“When it comes to people, we technologists must use a completely different methodology. We don’t understand the brain well enough to comprehend phenomena like education or friendship on a scientific basis. So when we deploy a computer model of something like learning or friendship in a way that has an effect on real lives, we are relying on faith. When we ask people to live their lives through our models, we are potentially reducing life itself. How can we ever know what we might be losing?”

Do you think technology can limit individual learning and discovery? Share your thoughts in the comments section below.

--

Oubai is a graduate student in Mechanical Engineering at McGill University. He is interested in crowd-driven innovation and multidisciplinary collaborations. His main passion is human-design interaction and the role design plays in shaping society and culture. Oubai is also the cofounder of the Arab Development Initiative. You can reach him on his blog or on Twitter at @obeikurdy.

 
Oubai Elkerdi, Tue 12.11.2013
Thank you for your comments and feedback.

A general remark that applies to most comments on this post is that the reader is responsible for making sure they have understood what the author has written before jumping to conclusions. This may require the reader to carefully go through the references and the books mentioned (at least 6 in this case), read them in their totality and in relation to one another, explore other relevant works by the authors, and finally make a judgement.

If this sounds like too much work, then we have to sincerely ask ourselves whether we are truly seeking knowledge or whether we are using the article as a pretext for expressing pre-made opinions.

Such a serious topic demands this kind of rigour and honesty. Unfortunately, when I write these posts I have a word-limit that I cannot exceed; otherwise I would happily address this topic in extensive detail using all the evidence I have come across. Alternatively, you—the reader—can take the time to study the books I referenced, and there you would find strong arguments based on reliable studies which respond to your points of disagreement.

Perhaps from now on I should add the following disclaimer at the end of my articles: "Most of what you disagree with has been addressed in great detail by the authors mentioned. Believe it or not, they have anticipated your disbelief and objections and have come up with eloquent and satisfactory arguments to answer them. Please take the time to study them."

I have published enough to know that my words may go unheeded, but I will nevertheless respond to some of the points mentioned in the hope of deepening the level of the discussion and encouraging the reader to read more, which is to say, more than they think they have.

First, the reason we might believe that technology has helped education improve is that our definition of education has been distorted by modernity and by a lack of exposure to previous cultural and intellectual traditions.

In other words, we are ignorant about our past. But if we actually studied history, widely and deeply, we would be surprised to find that our standards for what constitutes true learning have regressed. At this point, people usually come up with all sorts of self-serving arguments as to why we should not compare ourselves to previous generations: we live in different times and conditions. That is a misleading pronouncement, because the fundamental human problems remain the same across history. If that were not the case, then we might as well shut down history as a human endeavour.

How about learning how previous generations dealt with the issues we struggle with today, how this affected their lifestyles, and where they succeeded and failed as a result? In short, we would benefit a great deal if we transcended our cultural provincialism and looked beyond local prejudices. I will not address this point any further, as I have written about it in several previous articles and on my blog. I would only recommend reflecting upon David Orr's informative article, "What is Education For?" (http://www.context.org/iclib/ic27/orr/) and Asad Q. Ahmed's paradigm-shifting article, "Islam's invented Golden Age" (cf. http://www.opendemocracy.net).

Second, I will reiterate a point mentioned by Nicholas Carr, who writes: "The Net is making us smarter, in other words, only if we define intelligence by the Net’s own standards. If we take a broader and more traditional view of intelligence—if we think about the depth of our thought rather than just its speed—we have to come to a different and considerably darker conclusion. … What the Net diminishes is (...) the ability to know, in depth, a subject for ourselves, to construct within our own minds the rich and idiosyncratic set of connections that give rise to a singular intelligence."

As for Nidal's comment: "The brain is a smart machine, if the brain knows information is available all the time, it simply develops better ways to seek this info instead of remembering it." That is true, but the important question is: Is outsourcing memory necessarily a good thing? Here is what is lost in this process of outsourcing and adaptation:

"“The process of long-term memory creation in the human brain,” [Kobi Rosenblum] says, “is one of the incredible processes which is so clearly different than ‘artificial brains’ like those in a computer. While an artificial brain absorbs information and immediately saves it in its memory, the human brain continues to process information long after it is received, and the quality of memories depends on how the information is processed.” Biological memory is alive. Computer memory is not. Those who celebrate the “outsourcing” of memory to the Web have been misled by a metaphor. They overlook the fundamentally organic nature of biological memory. What gives real memory its richness and its character, not to mention its mystery and fragility, is its contingency. It exists in time, changing as the body changes. Indeed, the very act of recalling a memory appears to restart the entire process of consolidation, including the generation of proteins to form new synaptic terminals. … The very act of remembering, explains clinical psychologist Sheila Crowell in "The Neurobiology of Learning", appears to modify the brain in a way that can make it easier to learn ideas and skills in the future. We don’t constrain our mental powers when we store new long-term memories. We strengthen them. With each expansion of our memory comes an enlargement of our intelligence. The Web provides a convenient and compelling supplement to our personal memory, but when we start using the Web as a substitute for personal memory, bypassing the inner processes of consolidation, we risk emptying our minds of their riches. (…) As the experience of math students has shown, the calculator made it easier for the brain to transfer ideas from working memory to long-term memory and encode them in the conceptual schemas that are so important to building knowledge. The Web has a very different effect. It places more pressure on our working memory, not only diverting resources from our higher reasoning faculties but obstructing the consolidation of long-term memories and the development of schemas. The calculator, a powerful but highly specialized tool, turned out to be an aid to memory. The Web is a technology of forgetfulness." (Nicholas Carr, What the Internet is Doing to our Brains)

There is no doubt that technology improves some of our capacities. But as Carr says, "an honest appraisal of any new technology, or of progress in general, requires a sensitivity to what’s lost as well as what’s gained. We shouldn’t allow the glories of technology to blind our inner watchdog to the possibility that we’ve numbed an essential part of our self."

I have a lot more to say about this point, but I trust the reader will consult the books referenced for more evidence. What is of crucial importance here is to fully understand the implications of introducing a technology into our lives and how it might be reshaping our thinking and behaviour. Elias Aboujaoude, a psychiatrist who directs the Impulse Control Disorders Clinic and the Obsessive Compulsive Disorder Clinic at Stanford University, has an important book entitled "Virtually You" about how the virtual world can alter our personality. Sherry Turkle, Professor of the Social Studies of Science and Technology at MIT, has an equally important book, "Alone Together". Those two books, in addition to the ones I have already mentioned, have to be read in order to understand what we misunderstand about technology. When these works are read in relation to one another (and not simply as separate works), we begin to realize some of our biases and gaps in understanding when it comes to technology.

Third, in response to: "Think about it, you cannot allow the student to experience everything, some things may be dangerous and some things may be impossible to experience." I believe I quoted Jaron Lanier, who is not only the father of virtual reality technology but also someone who remains a pioneer in this field, especially in finding ways to help us experience things that are difficult to experience in real life. I would recommend reading his book (which, again, I mentioned explicitly in my article); it contains a lot of insight into how we can do a better job of bringing distant and inaccessible learning experiences within reach.

It seems that many readers have assumed that I am saying we should completely eliminate technology from education. That is not what I am saying, and to believe so is a grave injustice and further proof that you have not really read the article. The title of the article alone should be sufficient indication (it clearly says "not always") that this is not my position.

What I am advocating for, instead, is subjugating technology to humans (not the other way around) and equipping ourselves with the understanding and sensitivity to innovate differently, and to produce tools that do not atrophy our most essential faculties. Today, most digital tools, no matter what skill they enhance within us, also destroy our capacity for deep and critical thinking, patience, reflection, empathy, cross-pollinatory induction and introspection. Without these fundamental skills, which not only define us and deepen our humanity but also allow us to live sustainably on a planet we have been shattering precisely because of a hurried and harried enthusiasm for advancement for its own sake, we become mindless automatons with constant desires that need to be met (see "Consumed" by Benjamin Barber). We seem to find every turn of argument available to us to justify progress for the sake of progress, without ever asking why or considering what the consequences might be. Unlike the many who, out of personal interest and affiliation with certain ideologies and companies, feel pledged to technological development above all, some technologists and scientists have questioned many of our modern tendencies and conceptual mistakes.

Lastly, in response to: "There is no focus at all and most of the references used are not directly related to the same topic, this is due to the fact that the topic is extremely general." As an engineer and an artist it is in my nature to draw on studies and examples from multiple and seemingly unrelated fields. While you may not see the connection, there is certainly one; and I am by no means the first to see and benefit from such meaningful bridges. Part of education is having the ability to transcend our scope of research and interest and pick up wisdom and analogies from neighbouring fields. What is more, and once again, it is your duty as a reader to elucidate your points of confusion and to try to address them through the suggested readings. In the words of Mortimer J. Adler, "We should assume that the author is intelligible until shown otherwise, not that he is guilty of nonsense and must prove his innocence. And the only way you can determine an author’s guilt is to make the very best effort you can to understand him. Not until you have made such an effort with every available turn of skill have you a right to sit in final judgement on him."

One more thing Adler said is relevant here: "The most direct sign that you have done the work of reading is fatigue. Reading that is reading entails the most intense mental activity. If you are not tired out, you probably have not been doing the work. Far from being passive and relaxing, I have always found what little reading I have done the most arduous and active occupation. I often cannot read more than a few hours at a time, and I seldom read much in that time. I usually find it hard work and slow work. There may be people who can read quickly and well, but I am not one of them. The point about speed is irrelevant. What is relevant is activity. To read books passively does not feed a mind. It makes blotting paper out of it."

I quote this because I do expect my readers to sweat and work hard to find the books I reference, to read them, to struggle with them, to study them carefully, and then to agree or disagree. To me, a reader who agrees without understanding is as much of a disappointment as a reader who disagrees without doing the work I expect of them. I keep saying all this because every single one of the points of disagreement brought up in the comments is addressed in multiple ways and from multiple fields of research in the body of work that I use as a foundation for the article. I put those hyperlinks and titles there for a reason.

I hope that these points respond to some of the concerns brought up. If they do not, then I would heartily suggest consulting the works mentioned and researching this topic in depth. The reader should postpone judgement until they have done so.

In the end, what matters is that we educate *ourselves* first and foremost, before we talk about educating our children and whatnot. This is an arduous task that most of us conveniently avoid. We feel content with a degree or two and with reading a handful of studies and books about a topic, usually the recent ones recommended to us by popular trends and ratings. But to be able to talk about education in an informed manner, we need a heavy dose of history, psychology, neuroscience, history and philosophy of science, history and philosophy of art, history and philosophy of education, and the practice and mastery of a few arts and sciences. Only then do we develop the intuitive and rational intelligences necessary to effectively question something such as the role of technology in education. Only then do we become qualified to make informed pronouncements about what should and should not be, and how to move forward.

This is not a trivial task. But unless we elevate our standards to these heights, we will forever remain sophomores. From the comments made so far, it is clear that we have a long way to go. We seem unable to properly read an article, understand it, and tame our urge to disagree long enough to consult the full arguments of the works mentioned. Take a moment to appreciate the great disservice you do to me, as an author, and to yourself, as a reader, when you jump to conclusions without thinking critically and entertaining the possibility that you might be wrong on some points.