CHANCELLOR’S COLLOQUIUM: Rethinking education in the digital world

AT A GLANCE

WHO: Cathy N. Davidson, professor of English, Duke University

WHAT: "Now You See It: Attention and the Future of Learning," first program in the 2011-12 Chancellor's Colloquium Distinguished Speakers Series

WHEN: 4 p.m. Tuesday (Oct. 25)

WHERE: Activities and Recreation Center Ballroom

We need to rethink how we teach, learn and study in the interconnected, digital world, says Cathy N. Davidson, the Ruth F. DeVarney professor of English and John Hope Franklin Humanities Institute professor of interdisciplinary studies at Duke University.

Davidson, author of the recent book Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn, argues that academics need to understand the role of new technologies in their students’ current and future lives — and to find better ways of teaching that take advantage of these new modes of working and interacting online. Doing so will also help students think through the social, economic, aesthetic and intellectual issues raised by the new media they enjoy.

In this interview, Davidson — recently confirmed by the Senate as a presidential appointee to the National Council on the Humanities — gives a sneak preview of how she hopes to prompt those in higher education to think about new ways to thrive in an interactive, digital, global world.

Does teaching in the digital age really differ that much from teaching as it has always been done? What should a 21st-century education look like?

First, it is important to remember that the basic forms of teaching and assessing learning that we use today are quite recent in human history. Basically, the research university — with its specialized disciplines and advanced degrees, its majors and minors — has been around for only a little more than a hundred years.

Standardized multiple-choice tests — now required not only at the end of each school grade as part of our national educational policy but also used as the basis for college and even graduate and professional school admissions — have been around only since 1914.

Item-response testing was created to address a crisis — a teacher shortage — as a way of turning out students the way Ford turned out Model Ts. It is ironic that it became not “standardized” but “the standard” for educational achievement.

Are there better ways, then, to prepare students for the “real world”?

Interestingly, when I talk to businesses, from CEOs to those responsible for hiring and training new employees, they despair that straight-A college graduates know a lot but don’t know how to work independently, or with other adults of different ages and backgrounds, on project management, real-world applications, ethical thinking about practical problems, and other complex tasks that blend skills rarely taught together in the academy. Formal education puts issues in separate boxes; in the real world, they are all equally essential.

What are the main obstacles for academics to overcome to adapt to the digital world?

Everything about how we currently measure achievement is based on metrics that were created by and for a different era. If you measure success and reward success by a system that is irrelevant to the world we live in, you replicate irrelevant education.

Policymakers and legislators want to see great test scores. Yet, privately, almost all will admit the limitations of the tests and the scores, as well as the mismatch between the ideal of individualized, rote achievement within a specific discipline and the kinds of skills that a global, interactive, constantly changing workplace demands.

Academics, too, are often rewarded for highly specialized success along highly narrow and prescribed pathways as reviewed and esteemed by highly specialized peers. Those who try to do adventurous, interdisciplinary research or bold new forms of collaborative learning and teaching often find themselves without peers who can evaluate the quality of their work.

Are some of the obstacles in higher education financial?

Education has had to bear the brunt of both financial retrenchment and a lack of financial support. As long as education is expensive, access is preferential, leaving out some of our brightest minds who simply cannot afford tuition. Taxpayers radically shortchange themselves and their children’s future by not supporting a highly subsidized, excellent public system of higher education.

The issue isn’t simply that taxpayers’ own children will inherit a great educational system; it is that, to be affluent, a contemporary society needs a strong middle class.

Again, I’ve been surprised to find that when I talk to CEOs of Fortune 500 companies, they often worry about the future of the U.S. even for business. Since World War II at least, the U.S. has been a world leader in education, with a strong system and an emphasis on innovation. Yet our public, taxpayer-based support for higher education and our percentage of college graduates have both declined steadily over the last two decades.

From a business point of view, we are losing our competitive advantage in a global system. I think most educators would agree we are rapidly diminishing our intellectual future as well. It’s short-term savings with a consequence of long-term disaster.

Can you give some good examples of how technology is being used in today’s universities?

There are hundreds of different technologies being adapted brilliantly for learning, in school and out. My particular pet peeve, though, is when a technology is simply dumped into a learning environment as if that is the solution, not the beginning.

If we as academics are going to adapt and thrive in a digital age we need to think about the histories of our pedagogy — how we inherited certain forms of teaching and learning. Which ones still serve the world we and our students live in? And which ones are simply forms that once had their day but are no longer very useful, inspiring or successful?

That introspective moment is what, as a historian of technology, I often see about 15 years into a new technology’s adoption. A new generation is on the scene and vocal about how to make current technologies useful and better. They do not remember a “before.” They aren’t stymied by some nostalgia (false or otherwise) for the way “things used to be.”

The insistence that technology doesn’t change us — that it is a tool we need to change — is a turning point, and I believe we are there now. We need to help professors and students understand this moment: to breathe, to realize we’ve gone through remarkably extensive changes, and to see how we can adapt, use, explore and understand new technologies and new media in light of our students’ — and our own — personal, intellectual, social and occupational aspirations.

Karen Nikos covers education, law, business, social sciences, humanities and the arts for the News Service unit of University Communications.

Media Resources

Karen Nikos-Rose, Research news (emphasis: arts, humanities and social sciences), 530-219-5472, kmnikos@ucdavis.edu
