Learning to Surf

A college provost encounters the digital revolution.

March 2012: I attend the Council for Christian Colleges and Universities Chief Academic Officers Conference in Nashville, Tennessee. Conversations among my colleagues circle back to one topic: the impending revolution in higher education brought on by the internet. As a recent Washington Post article put it: “Undergraduate education is on the verge of a radical reordering. Colleges, like newspapers, will be torn apart by new ways of sharing information enabled by the internet. The business model that sustained private U.S. colleges cannot survive.”

In a breakout session, a provost from California tells me about free online courses from premier universities such as Michigan, Stanford, and Yale. I learn a new acronym, the MOOC (Massive Open Online Course), which can enroll thousands of students around the world. The next month, Stanford president John Hennessy declares in The New Yorker, “there’s a tsunami coming.” This is all news to me, since apparently I’ve been too busy monitoring faculty teaching loads to notice an approaching tidal wave.

The Nashville conference’s plenary speaker is David Kinnaman, head of the Barna Group and author of UnChristian, who describes the salient characteristics of today’s “digital native” college students. His advice: Get on Twitter.

I download the Twitter app for my iPhone. Immediately upon registering my password, I forget it. Unable to find a “forgot my password” button, I have no way to log in to my account. The Twitter icon remains on my iPhone, mocking me with its promise of an exotic world of tweets and hashtags that lies a forgotten password away.

Reading Up

If my institution—Cornerstone University in Grand Rapids, Michigan—is about to be engulfed by a digital tsunami, then as provost I should probably give my faculty and staff a heads up. Clearly it’s time for some remedial reading. I begin with three recent books that are similar in their analyses and prescriptions: Clayton Christensen and Henry Eyring’s The Innovative University: Changing the DNA of Higher Education from the Inside Out; Richard DeMillo’s Abelard to Apple: The Fate of American Colleges and Universities; and Anya Kamenetz’s DIY U: Edupunks, Edupreneurs, and the Coming Transformation of Higher Education.

These books tell a compelling story of technological and organizational change centered on Christensen’s concept of “disruptive innovations.” It works like this: Organizations tend to innovate in the direction of higher quality. Each year the Chevy Nova gets a little bigger. Manual windows morph into power windows. Seat covers go from vinyl to leather.

Meanwhile, organizations emerge to occupy space at the bottom of the food chain—for example, a little Japanese car company called Honda. At first these newcomers’ products, such as the cramped, underpowered Honda Civic, attract little notice from the big players. Over time, however, these “disruptive innovations” get better while remaining more affordable—and begin taking market share from the major producers, who keep innovating upward and pricing themselves out of the market. End of story: the Civic takes over the market; the Nova disappears.

Today, higher education seems to be in the midst of a disruptive innovation due to digital technology. A decade ago, online courses and for-profit universities seemed to be of such inferior quality that they barely attracted the notice of traditional universities, which were busy trying to become lesser versions of Harvard. But the for-profit universities kept improving, and as internet technology improved, online courses became more sophisticated, interactive, and robust. Hence the crisis point that we’ve arrived at today, in which college costs are spiraling upward and students are seeking lower-cost alternatives through for-profit universities or open online courses. Universities, Christensen advises, need to resist the natural tendency of bigger and better and instead create a DNA of simpler and more affordable.

In Anya Kamenetz’s DIY U (it takes me a while to figure out that “DIY” stands for “Do It Yourself”), the changes hitting higher education are likened to the revolution in the music industry. Most people would agree that a live performance is the best way to hear music, and before the advent of recording technology, thousands of musicians were employed to play live music for silent films, plays, and other shows. But live music is expensive, so recorded music emerged, and many professional musicians had to find a new line of work. Next came digital music, iTunes, and the disruption of the record industry through the “unbundling” of particular songs from records. Anyone with an iPod could purchase a single song for 99 cents instead of having to fork over 15 bucks for an entire album. The music industry has been scrambling to adapt to Apple’s disruption ever since.

Now, says Kamenetz, the same process is transforming higher education. Not only are students discovering that an online, “recorded” educational experience can be a legitimate (and more convenient) substitute for a live classroom lecture, but open online courses are enabling students to unbundle their education by picking and choosing topics and courses from a variety of educational sources. Thus, prognosticators such as Kamenetz and DeMillo (in Abelard to Apple) envision a future in which students earn competency “badges” in highly specialized areas of study and the internet replaces traditional accrediting bodies as the arbiter of what constitutes a legitimate credential. Picture something like an eBay Seller Rating for every student and educational program. In such a world, the traditional residential college would seem to be obsolete.

Of all the writings I come across on the digital revolution in education, one stands out for its eager embrace of the coming changes: Cathy Davidson’s Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn. For Davidson, a Duke University professor of English and Interdisciplinary Studies, the coming tsunami means one thing: excellent surfing.

Drawing on the latest developments in neuroscience, Davidson argues that the internet is transforming how our brains function. While others decry this change (see Nicholas Carr, The Shallows), Davidson applauds it. Rigid classrooms, one-way lectures, and multiple-choice exams need to make way for new forms of open-ended collaborative learning, which are better adapted to today’s easily distracted, internet-nurtured brains that constantly flit from one topic to the next.

Professors can take comfort that those students surfing Facebook during class are simply demonstrating the skill of what Davidson calls “continuous partial attention,” which is essential for thriving in today’s world. As a parent, I spent years trying to limit my kids’ computer time and warning them about decaying brain cells. Now it turns out that I was doing them a disservice. Davidson observes: “Digital natives happen to be the happiest, healthiest, most civic-minded, best adjusted, and least violent” generation on record. Given their interest in collaborating online about everything from global finance to global warming, she concludes, “today’s young digital thinkers could well end up saving our world.”

Davidson applied these notions to her class at Duke. She asked, “What if instead of telling them what they should know, we asked them?” She gave her students iPods so they could record lectures and listen to them while jogging or riding the bus. Students came up with topics to study, new research projects, and even ways to grade each other’s work. “The semester flew by,” she says, “and we went wherever it took us.”

Davidson’s vision of a digital, distracted educational future both intrigues and troubles me. She is clearly a gifted and visionary teacher, and I have no doubt that her course evaluations would eat mine for lunch. I’m always looking for ways to make my teaching more interactive and engaging, and if iPods and group grading can help, I’m all for them.

But I’m not sure where my course on, say, the Reformation would end up if I just let the semester take us wherever students wanted to go. I’m afraid that we would start at the Council of Constance and end up at the 1919 World Series. Moreover, I’m a little uneasy with Davidson’s seeming zeal for virtual reality and her penchant for discussing “innovation” with other denizens of our so-called Creativity Age. At one point Davidson describes a four-day conference on Digital Youth that she attends in South Korea. At the end of the event, she needs a new suitcase, so a hotel concierge directs her to Seoul’s night market—a colorful, chaotic jumble of luggage shops, fortune tellers, and aged Koreans selling kimchi. Maybe it’s just my memorable experiences exploring the back alleys of Germany, South Africa, and Thailand, but I’m hesitant to take advice about the new educational world from someone who seems more attracted to conferences in modern hotels than to the visceral realities of everyday living.

After a month of reading, I’m ambivalent. I see the potential to improve education with technology, but I worry about losing something in the process. Is my resistance simply the instinctive skepticism of a religious historian nurtured in traditional academia, or is the true value of higher education in danger of being lost in our rush to digitize the experience? How does one distinguish between legitimate concern and mere nostalgia? I need more insight.

The Visit

In May I get my big break: My university, which provides laptops for students and faculty, is switching from PCs to MacBooks. Our leadership team is invited to spend a day at Digital Mecca—Apple’s “Infinite Loop” headquarters in Cupertino, California—to learn digital innovation from the very source. I’m Charlie finding the Golden Ticket to visit Willy Wonka’s Chocolate Factory.

To prepare for my visit, I read the biography of Steve Jobs by Walter Isaacson. I’m reassured to discover that in some ways, Jobs differed significantly from Davidson. While Davidson praises distraction and multitasking, Jobs’ strength was his ability to focus and put aside distractions. And while Davidson envisions a future when “most of us will prefer meeting virtually rather than in person,” Jobs believed that creativity comes from face-to-face meetings and the spontaneous discussions that occur when people work in physical proximity to each other. As CEO of Pixar, he designed the company’s headquarters to ensure that people from various units would run into each other randomly when eating, checking mail, or even going to the bathroom.

We fly to California in early June. In some ways, the visit to Apple headquarters is disappointing. Obviously, Apple’s own Willy Wonka, having passed away months earlier, isn’t there to greet us, though huge photographs of Steve Jobs line the entranceway to Apple’s Executive Training Center. Also, while I envisioned a guided tour, perhaps with Oompa-Loompa escorts, of Apple’s secret laboratories where exotic new devices are being tested, I learn that our day will be spent almost entirely in a spacious, attractive conference room. I do have an iPad awaiting me at my assigned seat, but it’s just to sample for the day, and unlike an Everlasting Gobstopper, it’s too big to hide in my pocket. At the end of the day we are escorted to the Apple Company Store, where I purchase a plain black t-shirt with an Apple logo for 16 dollars. So much for disruptive innovations in clothing.

The sessions themselves, however, are expertly prepared, engaging, and stimulating. Apple’s commitment to revolutionizing education isn’t simply an attempt to sell more gadgets. From Apple’s founding in 1976, Steve Jobs sought to humanize technology in order to empower the individual consumer. He did that first with computers, developing a personal computer that could be put in the hands of the everyday person. Later he disrupted the music industry by empowering the individual listener to pick and choose what songs to buy on iTunes. Now Apple plans to transform education by empowering individuals to take greater control of their learning through technology. As our first presenter of the day explains, learning occurs best when there’s collaboration, engagement, and exploration. And those are the qualities, she says, that Apple technology provides.

Apple’s leaders like to say that they stand at the intersection of technology and the liberal arts. Combining the two had been Steve Jobs’ passion ever since he took classes at the small, artsy, hippie-friendly Reed College before dropping out to design computers with Steve Wozniak. That’s music to my ears—and it goes a long way toward allaying my fears about a techie takeover of liberal arts education. Indeed, as the day progresses I start to wonder whether Apple researched our team, identified me as the toughest nut to crack, and designed the program accordingly. Consider: The presentation of iTunes U is conducted by a former philosophy major at a Christian college in Chicago. The course that he demonstrates for us? Church history. Later in the day, a medievalist at yet another Christian college gives a fascinating presentation on how he has transformed his literary theory course through the use of iPads and social media. Perhaps I’m just paranoid, but the lineup seems more than coincidental.

We close our meetings by talking with an Apple executive, who poses a question that lingers with me as I ponder the future of my university: “If all your college does is give students content that is readily available on the internet from a thousand different sources without providing a challenging, compelling context for that material, why should I pay thousands of dollars to send my kid there?” By the end of the day, I’m warming to the prospect that Apple technology can help us create a learning environment that goes beyond information transfer to promote creative, engaged learning.

As I drive to Los Angeles for a conference the next day, I conclude that Apple isn’t just about a device; it’s a way of seeing the world that seeks to blend technology and artistry. Innovations such as the iPod, iPhone, and iPad are emblematic of the worldview expressed in Jobs’ mantra, “simplicity is the ultimate sophistication.” Embracing that worldview, like buying an iPad and entering its seamless, self-contained world of software and hardware, requires going all in, not just trying out a particular gadget. Am I ready to take the plunge?

Making Sense of It All

I’ve been saving Andrew Delbanco’s College: What It Was, Is, and Should Be for last. That’s because after reading Kamenetz and Davidson and touring Apple’s glitzy facilities, I felt I would need the familiar, comforting rhetoric of a fellow historian extolling the value of traditional liberal arts education. And Delbanco, professor of Humanities at Columbia University, doesn’t disappoint. “The habits of thought and feeling” that liberal education seeks to attain, he remarks, are not “commodities to be purchased by and delivered to student consumers. Ultimately they make themselves known not in grades or examinations but in the way we live our lives.”

Reading Delbanco is like returning home after months of travel abroad. Delbanco worries, as I do, about students who are continually connected to the internet. He recognizes the value of new technologies for online learning in areas in which there is a single right answer, but he questions their value in advancing “genuinely humanistic education.” We have not yet, he observes, found a low-cost alternative to expert human teachers.

One of Delbanco’s virtues is that he dispels some of the myths that I have been growing weary of in my sojourn among the futurists. He reminds us that the Puritan founders of colonial colleges believed that learning was more about awakening and transformation than about the transfer of information. And it was to be a collaborative process among students, guided by a professor. That’s why they gathered students together to live in communities. There was plenty of recitation, of course, but peer-to-peer instruction and collaborative learning of the sort trumpeted in Davidson’s experimental classroom were an important part of traditional American colleges. They just had to manage it without the iPods.

Moreover, while I heard some dismissive references at Apple to dry “sage on the stage” classroom lectures, Delbanco’s description of the ideal lecture as more “provocation” than recitation is much closer to my own experience as an undergraduate at the University of Michigan, where a gray-haired Renaissance scholar in a rumpled tweed jacket could scribble three phrases on a chalkboard, then hold us transfixed throughout a 75-minute lecture. At their best, Delbanco states, lectures are times of “exploratory reflectiveness,” not simply the conveying of information. Such lectures still serve a valuable purpose in our wired world, but their power is difficult to capture in an online podcast. In all, Delbanco reaffirms college as a humanizing influence that still holds a powerful attraction for young people today.

Technology, however, doesn’t have to be dehumanizing. The books that line my office shelves and hold such powerful symbolism for me of my “pure” scholarly days are themselves technological devices. The ink and the paper are not the ideas themselves; they’re simply tools designed to transfer those ideas from one human being to another. Our modern digital devices, whatever their drawbacks may be, can do the same, and sometimes in more engaging and aesthetically pleasing ways.

So I’m supportive of the digital revolution. If my university can employ technology to educate our students better while making life easier for professors, then that’s the best of all possible worlds. Why would I not want to make my own iBook for a course, send it to students on their iPads, and then have them work on the subjects in class? If I’m teaching church history, why should I have to do all the work of preparing an overview of the French Revolution when I can direct students to a Khan Academy lecture on the topic, then discuss it with them in class? I’m ready to get on board the technology train—but not without some lingering reservations.

First, I can’t shake Marshall McLuhan’s famous proclamation that “the medium is the message.” Technology is not a neutral ether through which ideas flow from teacher to student. Rather, the technology itself shapes those ideas in profound ways. As an educator, therefore, I sometimes wonder what messages our wonderful devices convey to our students in addition to words, music, and videos.

Davidson gives us a clue. In discussing her use of collaborative grading by students, she remarks, “Welcome to the Internet, where everyone’s a critic and anyone can express a view about the new iPhone, restaurant, or quarterback. That democratizing of who can pass judgment is digital thinking.” How does such digital thinking fit my university if an essential purpose of a Christian college, as Baylor professor Perry Glanzer observed in a recent Christianity Today article, is the nurturing of wisdom? How can a professor be a mentor and a “sage,” in the best sense of the word, in an environment in which everyone has an equal ability to render judgments? And how is such wisdom advanced by the vastly increased access to information via Google and Wikipedia, when such innovations are, as Delbanco notes, “incapable of making distinctions of value”?

Moreover, I suspect that if education is to remain a truly human endeavor in which students gain not just knowledge but wisdom, it will always be a bit messy and unpredictable. And here’s my lingering reservation about Apple: Steve Jobs deplored messiness. The beauty of Apple products—the “secret sauce,” as one of our presenters called it—is that unlike the open system of Microsoft, an Apple device is a closed system that seamlessly integrates software and hardware. That’s what makes it feel so pleasingly simple to a technophobe, but also what makes it resistant to tinkering, playing, and being messed with. Ironically, the device that empowers the user to explore the world is itself impervious to exploration.

As Walter Isaacson observes, “the downside of Jobs’ approach is that his desire to delight the user led him to resist empowering the user.” That is why the novelist Cory Doctorow, explaining why he won’t purchase an iPad, writes, “buying an iPad for your kids isn’t a means of jump-starting the realization that the world is yours to take apart and reassemble; it’s a way of telling your offspring that even changing the batteries is something you have to leave to the professionals.”

A day after my Apple visit, as it happens, I run across an essay in The Chronicle of Higher Education cleverly entitled “A Technological Cloud Hangs Over Higher Education,” by Keith A. Williams. The author, a physicist, observes that the physical objects of many science classrooms—globes, periodic tables, beakers, and tubes—have been rendered obsolete by computer technology and the laboratory-simulation software that exists in “the cloud.” Chemistry experiments performed on a computer can’t go wrong and blow up the classroom. But I wonder what is lost in the process. Is a certain lack of control—and perhaps even danger—integral to the learning process? After all, Christian education is about developing human beings, and human nature, like ammonium nitrate, can be dangerously volatile under certain conditions.

Delbanco’s book includes a wonderful description by the philosopher George Santayana of his experience sitting under the teaching of the great psychologist William James: “In the midst of this routine of the classroom, the spirit would sometimes come upon him, and, leaning his head on his hand, he would let fall golden words, picturesque, fresh from the heart, full of the knowledge of good and evil.” I can recall similar episodes as a graduate student at Notre Dame under the teaching of George Marsden, when, in the midst of sometimes tedious and aimless discussions among students, George would sit back, rest his hand on the top of his head, and express insights that were original, thought-provoking, and timeless. Amid all of our revolutionizing of higher education, I hope that such experiences are more common, not less, when the tsunami recedes.

Rick Ostrander is provost of Cornerstone University in Grand Rapids, Michigan. He is the author most recently of Why College Matters to God: A Student’s Guide to the Christian College Experience (Abilene Christian Univ. Press).

Books discussed in this essay:

Clayton Christensen and Henry Eyring, The Innovative University: Changing the DNA of Higher Education from the Inside Out (Jossey-Bass, 2011).

Cathy Davidson, Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn (Penguin, 2012 [2011]).

Andrew Delbanco, College: What It Was, Is, and Should Be (Princeton Univ. Press, 2013 [2012]).

Richard A. DeMillo, Abelard to Apple: The Fate of American Colleges and Universities (MIT Press, 2013 [2011]).

Walter Isaacson, Steve Jobs (Simon & Schuster, 2011).

Anya Kamenetz, DIY U: Edupunks, Edupreneurs, and the Coming Transformation of Higher Education (Chelsea Green, 2010).

Copyright © 2013 by the author or Christianity Today/Books & Culture magazine.
