I spent Thursday and Friday at a retreat for Information Resources and technologically inclined faculty. It's my second such retreat, and they are useful for getting a sense of what other folks at the College are working on and concerned about. The group is a mix of administrative computing (who handle networking, telecom, and the like), academic computing (who support office and classroom technology), library (which is part of IR at our College), computer applications faculty (who are also part of IR), and faculty who are in various academic departments.
There were a number of interesting issues, but we spent a good deal of time talking about "information literacy" and how we might help our students attain it. From my perspective the answer is both simple and overwhelming. First, one needs to realize that the modern university functions on print literacy: it demands a basic literacy from its entering students (with FYC often as a gatekeeper) and then provides a more sophisticated literacy upon graduation. Nearly every class includes course materials that must be read. Most professors expect students to take notes on lectures and study those notes. Many courses include research assignments that ask students to read additional material in the library. And a good number of courses ask students to write formal essays or lab reports or something.
In other words, a significant portion of curricular activities contribute to the development of students' print literacy.
That said, few courses explicitly address literacy practices. Of course professors cover the content of the readings they assign, but few spend much time discussing reading practices. Similarly, few spend much time discussing writing practices. Teaching reading and writing is instead seen as a basic, general education-type activity, much of which should have been acquired prior to entering college. As such, composition and perhaps gen ed lit classes are given the task of teaching print literacy. Often such courses are also the place where students are expected to learn basic research skills. That's certainly the case at my College.
In short, students develop their print literacy skills (such as they do) through regular use in coursework but with little curricular opportunity to think about their reading and writing practices.
When we talk about "information literacy," we have to begin by recognizing that print literacy is one part of information literacy. We also have to realize, as I have written about before, that print literacy is not a universal form of literacy. That is, you don't become literate about the digital world by reading books. Sure, there might be some transfer, just as learning about tennis might help you play golf. Maybe. There are obviously some baseline skills, the kind my 6-year-old daughter is learning, that are perhaps universal, but they are not the kind of skills that in and of themselves will make one "literate" in the way we hope our students will become.
In other words, we should continue reading books and writing texts, but we must also incorporate the larger informational universe into our curriculum to the degree that our current curriculum addresses print. That probably includes FYC-type courses that cover digital media the way current composition courses cover reading and writing. Obviously our students, in general, don't need to become professional web designers or videographers any more than they need to become professional writers. They do, however, need to be able to conduct research, evaluate material, and compose their own media in response... just like they do with reading and writing print.
While the answer is fairly clear, accomplishing it is daunting. Right now, I'd say less than 5% of the faculty at my College, or any college for that matter, would be capable of teaching in this context. Graduate schools do not prepare their students for working in such an environment; virtually no faculty produce non-print scholarship, research, or media of any kind (PowerPoint presentations excepted).
Even assuming you could convince 2/3 of the faculty that they needed to acquire this literacy and incorporate it into their teaching (which is probably the wildest assumption anyone could imagine), the faculty development task would be mind-blowing. Imagine helping 150 faculty reach even the level of information literacy we might target for our students. Sure. You could have them come in every day for a month in the summer. Then you could provide them with support all year long as they integrated what they learned into their courses.
Then you could have them come back the next summer to learn about all the new developments over the past year. And the year after that. And the next year. And so on.
Of course, you'd have to pay them for that month, so you'd only increase the cost of faculty by about 10%. That's not too bad, right?
Oh yeah. You'd also have to reduce your research expectations, because everyone (at my college, anyway) gets 90% of their research done in the summer, and you'd have just taken away a third of the time we have to do that.
Essentially one would have to redefine what a college is and what it means to be an academic. That may seem extreme, but we are talking about fundamentally altering the structure of knowledge and learning in an institution that is defined by knowledge and learning.
Of course, there's the other option... Ignore all this and just keep teaching books. See how much sense that makes in 2015. And if 2015 seems far away, think about it in these terms: someone entering graduate school in 2006 will almost certainly still be an untenured professor in 2015, probably the most junior faculty member in his/her department. Are we going to say that the most junior faculty members in higher education ten years from now will still not be information literate? Obviously, if that is going to change, we need to start altering higher education now.
Great post Alex!
I work with faculty at a Community College in central NY and I used to work at your institution.
The point you raise about "redefining college" and "redefining academic life" is a good one. The present "terminal degree" approach/expectation of most graduate schools is not useful during times of rapid technological innovation. My PhD program was in the area of educational technology and ALL of my professors (UW-Madison) were well-schooled in reading/printed literacy, but were not well-versed in developing areas of technology. Again, these were folks who were devoting their professional lives to educational technology.
As a user of technology, and an optimist about the power it potentially brings to teaching and learning, I found that these full-time "technology scholars" usually had graduate students using technology FOR THEM in class, and that created a condition where I could not respect them.
They could TALK technology, but they had no "chops." Can you respect a musician able to talk about music but who cannot perform? In large part, this was the situation. I soon lost interest in completing my terminal degree. They did not "get" it. And, given the exclusive nature of academe, it was apparent that I was not going to qualify for The Club.
Anyway, I have little real hope that today's professoriate will accomplish much in the time frame you identify. Academe is not designed to reward risk-taking and "different thinkers."
The incredible effort and expense you identify as necessary to bring even a small percentage of faculty into 21st-century teaching, thinking, and learning environments is asking too much of our institutions and many of our colleagues. But, as you note, 2015 will soon be upon us. That freshman class is already in the K-12 system today. I think it's going to become quite interesting.
How long will we remain relevant to the students? How long will PowerPoint hide our flawed approach to rethinking teaching and learning? Will learning/accomplishment ever replace seat-time as our most important metric?
Cheers!
KK
Posted by: K Klein | March 15, 2006 at 08:39 PM