08 October 2011

Humanities, Liberal Arts, and Technology <-> Learning, Creating, and Problem-Solving

This post does not include the name "Steve Jobs" in its title. The 'Net is already awash in Jobsian tribute. So this is not really more of that. But I do want to quickly jot down some reflections with regard to one of the late Mr Jobs's relatively recent observations that Apple's extraordinary success (all the more extraordinary as it came in the wake of Apple's near extinction) was largely founded on not just clever technology, but in fact clever technology married with liberal arts, married with the humanities. In an age of economic crisis and (to steal a quote from former professor and musician Tom Lehrer) "international brouhaha" (as, admittedly, are perhaps most ages, but anyway ...) when universities are busily trying to cut corners and programs they view as "nonessential" -- and the victims here tend to be liberal arts and humanities programs -- it is perhaps important for everyone to take a deep breath and think as hard as they possibly can before shooting down programs in the sorts of "nonessential" stuff that in fact underlies the success of what is currently (October 2011) the most valuable technology company in the United States (and was at least briefly the most valuable US company of any kind earlier this year).

Properly speaking, our word "technology", derived from Ancient Greek τεχνολογία (< τέχνη [< PIE *tek̑s- "hew, fabricate"] + -λογία, a suffix relating to "discourse" on, or "study" of, something [< PIE *leg- "collect, speak"]), should probably be understood as the study of how best to apply tools in order to solve problems -- though in my experience a majority of people skip the "study" and "apply" as well as (worst of all) the problem-solving aspects and use "technology" as a kind of synonym for "magic tools which we don't really understand, but that we've been told are pretty awesome". (In relation to which, see for example this essay by the late, great Douglas Adams [a Cambridge man, of course!].)

Even though we see it endlessly repeated, and it should be just blindingly obvious anyway, technology is not an end in itself, nor is it a "magic bullet". Technology by itself solves nothing. Yet even people who talk this talk often have difficulty walking this walk. In contemporary educational circles (as in many other circles), there is a lot of buzz about leveraging ICT to extend the ways and means of helping more people learn more stuff. Even courses and seminars (of the sort with which, for example, my well-meaning but perhaps not terribly savvy university administration bombards its faculty) that knowingly remember to remind educators that ICT tools are not strategies, and that one needs to know the best ways to apply them, equally gush with unfettered -- and largely unguided -- enthusiasm. "Blogs and Wikis and Social Media! Start using them in the classroom today! It will make everything better!" It's all not a little contradictory and incoherent.

To solve things, you need people with the right knowledge and skills -- and the right technology can help them acquire that knowledge and those skills more efficiently, as well as apply them more efficiently. You need to know what problem you have, you need to analyze the task that confronts you in solving it, and then you can choose -- or invent -- the tools that will help you do that.

It is widely stated that Mr Jobs was gifted with an ability to solve problems that people did not yet know they had until presented with the solution (in the form of, for example, an iPhone and all that can be done with it -- or whatever). I would argue this is a close analogue to the ability that educators need. Learners -- perhaps especially early learners, but really all learners -- do not really know exactly what they will need to be doing in the future. Neither do their teachers, their parents, or their (possibly future) bosses. But whatever they are doing, it will require problem-solving abilities. Whether you need to figure out how to make a cup of tea or achieve world peace, that's problem-solving. In pretty much any sphere of human activity, you need to be able to identify what your problem or problems are (often more difficult than it would seem), you need to identify the conditions that will let you know the problem has been solved, and then you need to be able to identify and analyze the various tasks you might need to perform in order to move from the state of "problem exists" to "problem no longer exists". Since it's almost impossible (perhaps increasingly impossible) to predict exactly what problems today's learners will be wrestling with tomorrow (let alone in 5, 10, or 20 years!), we need to help our learners acquire that prodigiously Jobsian quality of being able to solve problems that they not only don't know they have, but that might not even exist yet. Let me point to a quote from the mighty Lemmy Kilmister (who is, now that I think of it, perhaps more like Steve Jobs than I would have otherwise casually thought -- but anyway ...): "Everything you plan is $h!t. All the good things that happen in your life are accidents." I would humbly suggest the following mild amendment to the Lemster's incomparable wisdom: Knowing how to make the accidents work makes the good things happen. Knowing how to make the accidents work is perhaps the highest level of problem-solving.

I had a discussion with a colleague recently in which the theme was basically: "We should throw away all 'product-oriented' education -- that is, education that focuses on acquisition of particular genre-defined knowledge or skills. All our educational processes should be about problem-solving. Genre-defined content and skills -- that is, say, whether you are nominally studying zoology or law -- can only reasonably be a vehicle for learning problem-solving skills. If it happens that you study zoology, and go on to become a zoologist, then great. But you might study zoology and go on to become a lawyer. In that case, your genre-specific knowledge and skills might be, at best, incidental to the problems you will need to solve (though they probably make you a better-informed, generally well-rounded sort of person, which is a result not to be denigrated!); but your more general skills and strategies in terms of task analysis and problem-solving will always apply."

In the widely cited (and justly so) speech Mr Jobs gave at the 2005 Stanford graduation, he famously recounted (among other things) how the experience of attending a calligraphy class (as an audited course at the university from which he had largely dropped out) ultimately affected (at least arguably) the development of what are now familiar aspects of everyone's daily computing experience. He noted that although what he learned in calligraphy class had no particular application to his life at the time (it was just interesting and, clearly, motivating in some way), the experience of that class led him, ten years later, to insist that capacities for typographical elegance (proportional fonts, multiple typefaces, etc.; all of which, admittedly, in the wrong hands can and do lead to "design train-wrecks", but never mind that right now ...) be built into early Macs (not to mention the early Apple LaserWriters, which some may remember as having been pioneeringly ubiquitous, until eventually swamped by the likes of HP). Such capacities were, of course, then popularized to the point of absolute ubiquity when Windows cloned them from the Mac, leading to a whole industry of desktop publishing, and eventually e-publishing, that had not previously existed but that we now take for granted.

Small things? Sure, proportional fonts are not world peace. Some might argue that all this sort of stuff was "nonessential", or an example of "style over substance". Yet solving this problem that people didn't know they had did, in its way, change the world -- which would look rather different if we took away all the weird, supposedly "nonessential" things that Mr Jobs rabidly insisted on in his company's products. And it would be a much less cool, much more dehumanized world. Rather than "style over substance", we have here "substance with style": a consummation devoutly to be wished!

So the "nonessential" turns out to be, perhaps, essential after all -- in rather important ways, for rather important things.

We don't know what problems our learners will face in the future (and nor do they); we can't know what problems they will face -- nor where the inspiration to face those problems will come from. But we clearly cannot (or should not) say things like "Don't study Ancient Greek; it's useless" since, just as clearly, if someone had said something similar to Steve Jobs about his calligraphy class, they would have been horribly, horribly wrong. Calligraphy itself marries together concepts of technology, the liberal arts, and the humanities; it is at once artistic and practical: performing practical functions in an artistic manner. Arguably, it was exactly that understanding and experience which led Mr Jobs to eventually demand not only that his company's products integrally incorporate concepts of technology, the liberal arts, and the humanities in their design, but that their design should likewise facilitate the creation of new products reflecting the integral marriage of those concepts. Or, in other words, it's a self-replicating humanistic philosophy -- or at least a philosophy amenable to self-replication and humanism.

Of course, many educators need no reminder about the significance of the arts and humanities -- especially if they themselves specialize in those areas -- but most of the wiser people in the sciences and technical studies readily recognize this as well, even as professors of literature will recognize the value of scientific literacy, etc. I am reminded of the way that my undergraduate courses at Harvard were organized: half of them were determined by my major, a quarter of them I was free to choose, and another quarter had to be on topics absolutely distinct from my major. (Consider carefully, O ye curriculum designers, the value of such an approach as mandated at one of the world's leading universities!) There's an excellent article from earlier this year on this topic (arts+humanities+tech) that got bandied about certain corners of the blogosphere (well, I guess spheres don't have corners ... but never mind!); another nice one here, and various others elsewhere. Stephen Fry's erudite and insightful tribute to Mr Jobs [Fry being, again, a Cambridge man, of course!] includes some cogent observations on the importance of the "human element" that Jobs always insisted be incorporated into Apple design.

Likewise, in considering the importance of understanding and making use of the relationships between technology and the liberal arts, Mark Randall observed that the "next generation" is already keenly aware of this significance, and summed it up with the phrase: "Thinking will be now more important than knowing." I would, however, argue that thinking has always been more important than knowing. Obviously, some modicum of knowledge is effectively a prerequisite to thinking, but the key change in which we can rejoice is the relative ease with which knowledge can now be acquired. I could sit (and, now that I think of it, actually have sat) on top of a mountain in South America with my iPad and access a tremendous store of information from around the globe -- and I certainly hope that this situation only continues improving. Yet even if I can find information easily, I still need the ability to evaluate and analyze that information in terms of its relevance to whatever task I need to perform or problem I need to solve. That's all about thinking, and technology cannot really help us do that. (At least, not yet!)

If educators (and learners) in this so-called "age of technology" have something to take away from the example of Steve Jobs, it's that what we really need to be able to do (and thus need to learn to do) is solve problems -- whether or not we know what problems we need to solve, and whether or not the problems have even yet come into existence -- and that being good at this sort of thing requires (beyond, apparently, enormous passion, commitment, and not a little insanity) thinking. In particular, problem-solving demands creative thinking; the famous Apple slogan "Think different" is not just an appealingly clever bit of "geek chic" marketing, but perhaps actually foundational to achieving anything genuinely useful. (Cf. The Onion's all too believable headline.) The obvious and easy has already been done; it is thinking differently that allows us to move on to the obscure and difficult: the as-yet unsolved problems.

Technology can be an iPad, or it can be a stick with which to scratch symbols in the mud. Only powerful abilities in creative thinking will help us figure out how best to use our tools to solve our problems, and working with (not just learning facts about) materials and concepts from the arts and humanities -- everyone is, after all, human -- is a powerful means of helping learners develop that creativity. It would probably be for the best if the world were not entirely populated by a zillion Steve Jobses (where would we get enough black turtlenecks, after all?), but if more of us got better at solving problems -- even just the ones we already know we have -- that would probably be all right.



“Technology alone is not enough. It’s technology married with liberal arts, married with the humanities, that yields us the results.”
-- Steve Jobs, iPad 2 event in San Francisco, CA, USA, 03 March 2011

