18 April 2012

Useful Words from Joe Harris

Life has been too busy for the indulgent luxury of blogging, but the discovery that PDF versions of the various articles collected in "Speak Useful Words or Say Nothing": Old Norse Studies by Joseph Harris are made freely available from Cornell's Web site is worth a quick post.

Joe Harris will need no introduction to those even peripherally connected with Old Norse studies, and this collection conveniently brings together a number of previously published articles -- and the convenience is multiplied almost infinitely by their ready and free availability as PDF downloads.

It's also worth pointing to a more recent bit of Joe's ever erudite and readable output:

This is one of a number of relatively recent (scattered over the past decade) papers Joe has produced on the subject of the Rök rune-stone; relatively few of these are readily available online, but it's well worth trying to track them down in any case, if you can. Taken together, they would have the makings of an amazing monograph on Rök which would stand worthily alongside other classics on the topic from the last century (and, incidentally, be the first such book-sized offering on the topic available in English). We can but hope .... ("hint! hint!", Joe, if you're reading! :))

Meanwhile, plenty of food for thought in the various papers of the Cornell article collection.

17 October 2011

CFP: Enduring Barbarism: Heroic Fantasy from the Bronze Age to the Internet

Man, do I ever look forward to video-conferencing technology becoming more widely and easily implemented in academic contexts, because there is no way that I would ever get funding (or be able to justify using my own money) to attend or present at what looks like an utterly fabulous (in all senses of that word, and all of them positive) conference:


Enduring Barbarism: Heroic Fantasy from the Bronze Age to the Internet
College of St. Joseph Popular Culture Conference
Contact email:
Dr. Jonas Prida
jprida@csj.edu
The inaugural popular culture conference will be held at the College of St. Joseph, located in Rutland, Vermont, April 13th-14th, 2012.
Proposal deadline: Dec 15th, 2011.
We are looking for a wide range of topics, figures, panels and cultural studies methodologies to explore the enduring figure of the barbarian in Western popular culture. Graduate students, established faculty, and independent scholars are encouraged to submit ideas. Possible paper topics:
the multi-faceted use of the barbarian in popular culture
rise and fall of heroic fantasy in the 1970s
comic book barbarism
heroic fantasy as a heavy metal trope
the gendered barbarian
explorations of lesser-known sword and sorcery texts
Italian sword and sandal movies
The barbarian’s future
We are actively interested in innovative panel ideas as well.
Please send 250 word paper proposals, 400-500 word panel ideas, or general questions to Dr. Jonas Prida at jprida@csj.edu

08 October 2011

Humanities, Liberal Arts, and Technology <-> Learning, Creating, and Problem-Solving

This post does not include the name "Steve Jobs" in its title. The 'Net is already awash in Jobsian tribute. So this is not really more of that. But I do want to quickly jot down some reflections with regard to one of the late Mr Jobs's relatively recent observations that Apple's extraordinary success (all the more extraordinary as it came in the wake of Apple's near extinction) was largely founded not just on clever technology, but in fact on clever technology married with liberal arts, married with the humanities. In an age of economic crisis and (to steal a quote from former professor and musician Tom Lehrer) "international brouhaha" (as, admittedly, are perhaps most ages, but anyway ...) when universities are busily trying to cut corners and programs they view as "nonessential" -- and the victims here tend to be liberal arts and humanities programs -- it is perhaps important for everyone to take a deep breath and think as hard as they possibly can before shooting down programs in the sorts of "nonessential" stuff that in fact underlies the success of what is currently (October 2011) the most valuable technology company in the United States (and was at least briefly the most valuable US company of any kind earlier this year).

Properly speaking, our word "technology", derived from Ancient Greek τεχνολογία (< τέχνη [< PIE *tek̑s- "hew, fabricate"] + -λογία suffix relating to "discourse" on, or "study" of, something [< PIE *leg- "collect, speak"]), should probably be understood as the study of how best to apply tools in order to solve problems -- though in my experience a majority of people skip the "study" and "apply" as well as (worst of all) the problem-solving aspects and use "technology" as a kind of synonym for "magic tools which we don't really understand, but that we've been told are pretty awesome". (In relation to which, see for example this essay by the late, great Douglas Adams [a Cambridge man, of course!].)

Even though we see it endlessly repeated, and it should be just blindingly obvious anyway, technology is not an end in itself, nor is it a "magic bullet". Technology by itself solves nothing.  Yet even people who talk this talk often have difficulty walking this walk. In contemporary educational circles (as in many other circles), there is a lot of buzz about leveraging ICT to extend the ways and means of helping more people learn more stuff. Even courses and seminars (of the sort with which, for example, my well-meaning but perhaps not terribly savvy university administration bombards its faculty) that knowingly remember to remind educators that ICT tools are not strategies, and that one needs to know the best ways to apply them, equally gush with unfettered -- and largely unguided -- enthusiasm.  "Blogs and Wikis and Social Media!  Start using them in the classroom today! It will make everything better!"  It's all not a little contradictory and incoherent.

To solve things, you need people with the right knowledge and skills -- and the right technology can help them acquire that knowledge and those skills more efficiently, as well as apply them more efficiently. You need to know what problem you have, you need to analyze the task that confronts you in solving it, and then you can choose -- or invent -- the tools that will help you do that.

It is widely stated that Mr. Jobs was gifted with an ability to solve problems that people did not yet know they had until presented with the solution (in the form of, for example, an iPhone and all that can be done with it -- or whatever).  I would argue this is a close analogue to the ability that educators need.  Learners -- perhaps especially early learners, but really all learners -- do not really know exactly what they will need to be doing in the future. Neither do their teachers, their parents, or their (possibly future) bosses.  But whatever they are doing, it will require problem-solving abilities. Whether you need to figure out how to make a cup of tea or achieve world peace, that's problem-solving. In pretty much any sphere of human activity, you need to be able to identify what your problem or problems are (often more difficult than it would seem), you need to identify the conditions that will let you know the problem has been solved, and then you need to be able to identify and analyze the various tasks you might need to perform in order to move from the state of "problem exists" to "problem no longer exists". Since it's almost impossible (perhaps increasingly impossible) to predict exactly what problems today's learners will be wrestling with tomorrow (let alone in 5, 10, or 20 years!), we need to help our learners acquire that prodigiously Jobsian quality of being able to solve problems that they not only don't know they have, but that might not even exist yet.  Let me point to a quote from the mighty Lemmy Kilmister (who is, now that I think of it, perhaps more like Steve Jobs than I would have otherwise casually thought -- but anyway ...): "Everything you plan is $h!t. All the good things that happen in your life are accidents."  I would humbly suggest the following mild amendment to the Lemster's incomparable wisdom: Knowing how to make the accidents work makes the good things happen. Knowing how to make the accidents work is perhaps the highest level of problem-solving.

I had a discussion with a colleague recently in which the theme was basically: "We should throw away all 'product-oriented' education -- that is, education that focuses on acquisition of particular genre-defined knowledge or skills.  All our educational processes should be about problem-solving. Genre-defined content and skills -- that is, say, whether you are nominally studying zoology or law -- can only reasonably be just a vehicle to learn problem-solving skills. If it happens that you study zoology, and go on to become a zoologist, then great. But you might study zoology and go on to become a lawyer. In that case, your genre-specific knowledge and skills might be, at best, incidental to the problems you will need to solve (though they probably make you a better informed, generally well-rounded sort of person, which is a result not to be denigrated!); but your more general skills and strategies in terms of task analysis and problem-solving will always apply."

In the widely cited (and justly so) speech Mr Jobs gave at the 2005 Stanford graduation, he famously recounted (among other things) how the experience of attending a calligraphy class (as an audited course at the university from which he had largely dropped out) ultimately affected (at least arguably) the development of what are now familiar aspects of everyone's daily computing experience. He noted that although what he learned in calligraphy class had no real particular application to his life at the time (it was just interesting and, clearly, motivating in some way), the experience of that calligraphy class led him, ten years later, to insist that capacities for typographical elegance (proportional fonts, multiple typefaces, etc.; all of which, admittedly, in the wrong hands can and do lead to "design train-wrecks", but never mind that right now ...) be built into early Macs (not to mention the early Apple LaserWriters, which some may remember as having been pioneeringly ubiquitous, until eventually swamped by the likes of HP). Such capacities were, of course, then popularized to the point of absolute ubiquity when Windows cloned them from the Mac, leading to a whole industry of desktop publishing, and eventually e-publishing, that had not previously existed but that we now take for granted.

Small things? Sure, proportional fonts are not world peace. Some might argue that all this sort of stuff was "nonessential", or an example of "style over substance". Yet solving this problem that people didn't know they had did, in its way, change the world -- which would look rather different if we took away all the weird, supposedly "nonessential" things that Mr Jobs rabidly insisted on in his company's products. And it would be a much less cool, much more dehumanized world. Rather than "style over substance", we have here "substance with style": a consummation devoutly to be wished!

So the "nonessential" turns out to be, perhaps, essential after all -- in rather important ways, for rather important things.

We don't know what problems our learners will face in the future (and nor do they); we can't know what problems they will face -- nor where the inspiration to face those problems will come from. But we clearly cannot (or should not) say things like "Don't study Ancient Greek; it's useless" since, just as clearly, if someone had said something similar to Steve Jobs about his calligraphy class, they would have been horribly, horribly wrong. Calligraphy itself marries together concepts of technology, the liberal arts, and the humanities; it is at once artistic and practical: performing practical functions in artistic ways. Arguably, it was exactly that understanding and experience which led Mr Jobs to eventually demand that his company's products not only integrally incorporate concepts of technology, liberal arts, and the humanities in their design, but that their design should also facilitate the creation of new products that likewise reflect the integral marriage of those concepts. Or, in other words, it's a self-replicating humanistic philosophy -- or at least a philosophy amenable to self-replication and humanism.

Of course, many educators need no reminder about the significance of the arts and humanities -- especially if they themselves specialize in those areas -- but most of the wiser people in the sciences and technical studies readily recognize this as well, even as professors of literature will recognize the value of scientific literacy, etc.  I am reminded of the way that my undergraduate courses at Harvard were organized: half of them were determined by my major, a quarter of them I was free to choose, and another quarter had to be on topics absolutely distinct from my major. (Consider carefully, O ye curriculum designers, the value of such an approach as mandated at one of the world's leading universities!)  There's an excellent article from earlier this year on this topic (arts+humanities+tech) that got bandied about certain corners of the blogosphere (well, I guess spheres don't have corners ... but never mind!);  another nice one here, and various others elsewhere. Stephen Fry's erudite and insightful tribute to Mr Jobs [Fry being, again, a Cambridge man, of course!] includes some cogent observations on the importance of the "human element" that Jobs always insisted be incorporated into Apple design.

Likewise, in considering the importance of understanding and making use of relationships between technology and the liberal arts, Mark Randall made the observation that the "next generation" is already keenly aware of this significance, and sums it up with the phrase: "Thinking will be now more important than knowing." I would, however, argue that thinking has always been more important than knowing. Obviously, some modicum of knowledge is effectively a prerequisite to thinking, but the key change in which we can rejoice is the relative ease with which knowledge can now be acquired. I could sit (and, now that I think of it, actually have sat) on top of a mountain in South America with my iPad and access a tremendous store of information from around the globe -- and I certainly hope that this situation only continues improving. Yet even if I can find information easily, I still need the ability to evaluate and analyze that information in terms of its relevance to whatever task I need to perform or problem I need to solve. That's all about thinking, and technology cannot really help us do that. (At least, not yet!)

If educators (and learners) in this so-called "age of technology" have something to take away from the example of Steve Jobs, it's that what we really need to be able to do (and thus need to learn to do) is solve problems -- whether or not we know what problems we need to solve, and whether or not the problems have even yet come into existence -- and that being good at this sort of thing requires (beyond, apparently, enormous passion, commitment, and not a little insanity) thinking. Particularly, problem-solving demands creative thinking; the infamous Apple slogan "Think different" is not just an appealingly clever bit of "geek chic" marketing, but actually perhaps foundational to achieving anything genuinely useful. (Cf. The Onion's all too believable headline.) The obvious and easy has already been done; it is thinking differently that allows us to move on to the obscure and difficult, to the as-yet unsolved problems.

Technology can be an iPad, or it can be a stick with which to scratch symbols in the mud. Only powerful abilities in creative thinking will help us figure out how best to use our tools to solve our problems, and working with (not just learning facts about) materials and concepts from the arts and humanities -- everyone is, after all, human -- is a powerful means of helping learners develop that creativity. It would probably be for the best if the world were not entirely populated by a zillion Steve Jobses (where would we get enough black turtlenecks, after all?), but if more of us get better at solving problems -- even just the ones we already know we have -- that would probably be all right.



“Technology alone is not enough. It’s technology married with liberal arts, married with the humanities, that yields us the results.”
-- Steve Jobs, iPad 2 event in San Francisco, CA, USA, 02 March 2011


07 October 2011

Let my computer go!

I have developed an almost rabid antipathy for my institution's IT policies.

I suspect they are not unlike those of many other institutions or enterprises (especially in a world dominated by not-terribly-secure Windows tech and, frankly, a workforce that is curiously un-savvy about how to use their stuff); but never mind all that. I'm not interested in judging things from the standpoint of the most common denominators. Things shall either be organized the way I want, or they are not acceptable -- though I may well lack the power to do other than grit my teeth and provisionally accept them. Alas my institution's IT policies fall into the latter category.

It's probably a familiar problem to the more "digitally literate" or "informationally literate" members of the population: basically, my institutionally supplied desktop computer is locked down in a fairly extreme way, giving me almost no control over things like what I can install or update -- or even whether I can choose my own desktop picture.  Equally, a snake's nest of weird proxy servers often confounds my attempts to use my preferred browsers or email clients -- or even to configure extra email account access in Outlook (effectively the default since, yeah, obviously, it's a Windows shop here).

Now in a sense, this is of course all perfectly fair. It is, after all, not my computer; it's the institution's, and they do have the right to do as they will with it. On the other hand, having the right does not mean they should be exercising it. Or at least not in the rather ham-fisted way that they are.

Firstly, these policies certainly interfere with my ability to do my job (educator, researcher, and all that) effectively -- not least because they interfere with many of my institution's own stated objectives in terms of incorporating a wider range of ICT-mediated tools and materials into the educational experience.  For example, I would be expected to try to incorporate appropriate videos from YouTube into course content. I have no problem with this; videos are cool, and there are plenty of useful (though, of course, far more useless) videos to choose from for my purposes.  On the other hand, if Flash gets upgraded along with much of the content on YouTube, but I can't upgrade Flash thanks to my institution's highly contradictory IT policies, then suddenly I can't see those videos and have no easy way of fixing that problem.

Sure, if I could just ring up the IT guys and say "Please give me Admin privileges, at least until I've fixed this", and they would do it, then no real problem.  But I can't do that. I have to ring or mail the IT guys, desperately request a fix, and then wait quite possibly for days or even weeks (during which time I must harass, harangue, and otherwise plead with them from time to time) for a tech with the appropriate magic powers to show up and Do The Stuff -- though, of course, usually I am granted the right powers only until the log-in session ends, which can be a problem if the tech appears at a silly time when my availability to then do stuff is limited (as, for example, after 6pm last night, thereby prompting this post!), or the install itself requires a restart (as it not infrequently does), or the electricity goes out suddenly (I probably should not use the word "unexpectedly" in this context: we're in South America, and the lights go out whenever some rain god or another sneezes), or ... you get the idea.  For example, despite a Magic Admin Privileges Tech Visit last night (as noted), I absolutely did not have time (thanks to the stupid hour) to run all my upgrades (even those not requiring a restart), and so this morning, of course, my session had logged itself out (I can't control that either), meaning I am now back to Square One (at best) and need to start a whole new round of cajoling to get a new Magic Admin Privileges Tech Visit -- only 12 hours after the last one! -- which could easily take additional days or weeks (not to say considerable time and effort) to achieve.

This is all enormously -- e-nor-mous-ly -- inefficient, ineffective, and counter-productive.

Of course, I do understand where they are coming from. I understand that they have to buy cruddy, easily broken and insecure services and systems because everyone is strapped for cash these days, and, of course, whatever problems y'all have off in the "developed world" in terms of budget, I can only assure you most earnestly that they are far worse here. (It's not like, after all, a computer costs less in South America than in Europe or North America. In fact, it quite possibly costs rather more; and there is, of course, much less money on hand with which to buy it.)  And I recognize that many of their employees -- not least many of my faculty colleagues -- are, for all intents and purposes "digitally illiterate", such that giving them unfettered access to their computers and systems would be the functional equivalent of handing the car keys to a drunken australopithecine. I understand all that.

But the "one-size fits all" IT policy is just wrong. Frankly. it just shouldn't apply to me, or to  people like me who actually know what they are doing. (A minority? Maybe; I dunno. But even if so, surely a growing minority?)  I am, after all, not a supergenius -- but I'm not stupid either. And I have been continuously using computers and computer-mediated communications and tools in both educational environments and the high-tech industry (not to mention at home) for some 25 years now. I am pretty knowledgeable even with regards to the poor, benighted Windows machine on my workplace (home is a Mac shop, of course) in terms of configuring, maintaining, and squeezing the most out of it without horribly messing it up or filling it with weird viruses and other malware. I have an extremely good track record. And, frankly, I know far more about what software and services will help me do my job most efficiently than the IT department -- who may well be a bunch of good, knowledgeable, and clever guys (and whom I strongly suspect would, at least in private, admit or agree to most of what I say) with respect to their jobs, but do not have the specialized knowledge about What I Need To Do which is a necessary prerequisite for deciding The Best Ways And Tools To Do What I Need To Do. Yet here we are.

The problem is the "one size fits all" IT policy. One size simply does not fit all. There should be tiered access and privileges.  There should be some way for me to qualify for greater access and privileges because I can demonstrate that I can use them sensibly and effectively. There should be appropriate ways for the less "digitally/informationally literate" to acquire appropriate knowledge and skills (if they need them, or would benefit from them) to likewise qualify for such access and privileges. After all, in other contexts, we would not just hand the car keys to any hominin who happens along, but ask them to take a course or at least demonstrate compliance with some set of criteria that suggested it was OK to let them have a driving license confirming their automotive access and privileges. Give me the power to do what I do well, and let me get on with it. Work with me, rather than against me. I am not the enemy. I am, actually, the exactly the kind of user that the IT guys would like -- I am pretty sure about this, having worked as an IT guy in the past myself.

Such tiered access would probably be a sensible and effective solution to the problems faced on all sides here -- and I am sure many other terminally frustrated persons besides myself face similar problems, in one form or another. At the very least, it could hardly be worse than the current situation. Alas, in my experience, convincing people to do sensible and effective things is quite a struggle in itself (unless you have a very Big Stick, which sadly I do not, or things would be different). But, well, we'll see.

Hope, as yet, springs eternal -- despite its better judgement.

01 September 2011

Book Note: Sturgis & Oberhelma, eds., The Intersection of Fantasy and Native America (2009)

A while back, I wrote a post musing on the values of multiculturalism as a key factor in creativity, with a special emphasis on the ways that authors of "imaginative literature" (e.g. fantasy fiction, science fiction, etc.) have, have not, or could draw on a range of cultural influences.  One thing I specifically wondered was: "Are there any examples of contemporary fantasy fiction inspired primarily by American traditions?". I put forward a few tentative answers of my own, but it seems likely that such questions are answered in far greater detail by a recent (if not exactly "new") book upon which I stumbled in my virtual travels: Amy H. Sturgis and David D. Oberhelman, eds., The Intersection of Fantasy and Native America: From H.P. Lovecraft to Leslie Marmon Silko (Altadena: Mythopoeic Press, 2009); the book is also listed on Amazon.com, though apparently out of stock at the time of writing.

Alas, I haven't got a copy -- they don't exactly stock this sort of thing at the local bookseller in Chía, Colombia -- but international book orders don't usually go missing on me, so perhaps I can get one eventually.  For the moment, I will have to be content with the promise held out in the publisher's book-blurb -- which is, admittedly, pretty enticingly promising:

A number of contemporary Native American authors incorporate elements of fantasy into their fiction, while several non-Native fantasy authors utilize elements of Native America in their storytelling. Nevertheless, few experts on fantasy consider American Indian works, and few experts on Native American studies explore the fantastic in literature. Now an international, multi-ethnic, and cross-disciplinary group of scholars investigates the meaningful ways in which fantasy and Native America intersect, examining classics by American Indian authors such as Louise Erdrich, Gerald Vizenor, and Leslie Marmon Silko, as well as non-Native fantasists such as H.P. Lovecraft, J.R.R. Tolkien, and J.K. Rowling. Thus these essayists pioneer new ways of thinking about fantasy texts by Native and non-Native authors, and challenge other academics, writers, and readers to do the same. 

I'd also like to point to an interesting interview with one of the editors, Amy H. Sturgis here. Good stuff!

24 August 2011

Language for Learning in Haiti: Kreyòl pale, kreyòl konprann

This article on the BBC News Web site considers debates over whether Haitian schools should be teaching in Haitian Creole or in Standard French.  Some background: Haitian Creole is itself a blend of early modern colloquial French (particularly in terms of vocabulary), West African Niger-Congo languages (particularly in terms of syntax, and perhaps particularly the Fon language), as well as some bits and pieces from Spanish, Arabic, English, and Taíno (this last the extinct Maipurean/Arawakan language of the Caribbean, closely related to modern Wayuunaiki in Colombia and Venezuela).  It's interesting stuff, really; there is what looks to me like a reasonable overview on Wikipedia (yes, I know: academics are supposed to hate Wikipedia, but, well, some of it is perfectly OK for basic intros to topics), and the major book on the language seems to be Spears, Arthur K., and Carole M. Berotte Joseph, eds. 2010. The Haitian Creole Language: History, Structure, Use, and Education. Lexington/Rowman & Littlefield, the intro to which is online (as is a relevant chapter on education and Creole).

Haitian Creole is the mother tongue of most Haitians; only a relatively small educated elite has also learned Standard French.  Yet, at the same time, formal schooling has traditionally been conducted in French.  The debate (on which the BBC reports) is on whether Haitian Creole should become the main language of education on the island.

The obvious answer -- that any reasonably sane person with some knowledge about learning would give -- is, "Yes, the language of learning ought to bloody well be Creole."  As a few experts quoted in the article imply, trying to learn content or skills through a poorly developed second language is a virtual guarantee of disaster.  And, indeed, the education situation in Haiti clearly bears this out -- though, arguably, the educational situation in Haiti would be pretty grim regardless of what language was being used.  Still, it can hardly help that such children as manage to scramble into primary or secondary schooling are instantly confronted by material presented in an alien language.

Still, there seem to be a couple of obstacles to implementing education in Creole.

One is simply that there are no materials: very little in the way of textbooks and stuff like that is available in Creole. This is a real problem for education in many developing countries; frankly, I struggle with it even as a university professor at a private university in (or next to) the relatively affluent (by regional standards) capital city of Colombia. In my wilder dreams and imagination, universities and professors around the world band together to create freely available electronic textbooks that can be readily augmented and updated through version control and (because of their freely available nature) readily translated into any language. (So, like, "Intro to Classical History is now available at Version 4.3 in English, 4.2 in Spanish, and 3.8 in Haitian Creole", etc.)  This would be awesome. But I've no idea yet how to convince enough other people that it would be awesome so as to get any kind of ball rolling. Maybe there already is such a project? (It seems like a no-brainer!) I dunno. Anyway, that's just one challenge for education in Creole.
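For what it's worth, the bookkeeping side of that daydream is trivial to imagine. Here is a minimal sketch (in Python; the languages and version numbers are just the hypothetical ones from the example above) of a manifest that tracks which translations of an openly licensed textbook lag behind the current edition:

```python
# Toy sketch: tracking translation versions of an openly licensed textbook.
# All languages and version numbers are hypothetical illustrations.

CURRENT_VERSION = (4, 3)  # latest revision of the "master" text

# Version of the text most recently translated into each language.
translations = {
    "English":        (4, 3),
    "Spanish":        (4, 2),
    "Haitian Creole": (3, 8),
}

def lagging_translations(current=CURRENT_VERSION):
    """Return the languages whose translations are behind the current version."""
    return {lang: ver for lang, ver in translations.items() if ver < current}

if __name__ == "__main__":
    for lang, ver in lagging_translations().items():
        print(f"{lang} translation is at {ver[0]}.{ver[1]}; "
              f"needs updating to {CURRENT_VERSION[0]}.{CURRENT_VERSION[1]}")
```

The hard part, of course, is the translating and the convincing, not the version-tracking.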

The other obstacle -- which will seem bizarre to many, but reflects a common pattern of belief -- is that many Haitians, especially the poorer and less educated ones who would actually benefit most from education in Creole, are simply dubious about the idea.  After all, it is Standard French that has traditionally been viewed as the gateway to success, and Creole is associated in the minds of many with all that they would like to move up and away from. Equally, many amongst the Haitian elite would (I suspect) see the use of French as something that separates them from the "squalor of the masses", and would perhaps be reluctant to see that linguistic separation dismantled. One Haitian medical student (and, so, probably not representative of the fate of poorer Haitians in the educational system) quoted in the BBC article avers, "Creole is not a scientific language."  Objectively, of course, this is nuts. Any language can be a scientific language; you just have to use it to talk about science. If you don't have the vocabulary you need to talk about some subject, you borrow it or build it. This is what languages have done throughout history.  Doubtless, educated Roman citizens would have thought Proto-Germanic (not that it was called that at the time!) not to be a "scientific language", or would have expressed sentiments of that kind -- but its descendant Modern English is, of course, the current main language of science and technology (a position it has not held for all that long, either).

Still, one has to be able to understand why Haitians would think like this.  It's a not uncommon sort of view in many minority language situations -- though, of course, the opposite extreme also swings into play, with people refusing to speak anything except their minority language.  Of course, Creole isn't a minority language -- not in Haiti, anyway -- since it is the one language that pretty much every Haitian speaks.  But it is historically associated with poverty, slavery, etc., and that seems to fuel resistance to its use in education from both ends of the socioeconomic spectrum.

But it is only down towards the end of the BBC article that what is perhaps the most important point comes up:

But the question of Creole or French as the language of instruction appears to be of less concern to the Ministry than the very different question - how to give students a good grounding in English or Spanish. These are the languages, according to the Ministry of Education's Pierre-Michel Laguerre, that will really open up the world for Haitian children.

Well, amen. I would be forced to cast my vote for English -- and not because I'm a native speaker, but because no other language in the contemporary world gives you access to so much information -- but Spanish is another good choice within the context of the Americas, at least.

Still, my argument would be that Haitian education should simply abandon French immediately. Not that I've anything against French in particular; it's just that I think education will work better when taught in the students' principal tongue (Creole, in this case) and that if that language is not already English (which, again in this case, it is not), then if you have limited resources for teaching a second language, for practical purposes that second language should probably be English.  OK, there are situations where needs might dictate another choice -- I can see why speakers of indigenous languages here in Colombia might find Spanish the most practical second language, since you need to get things done in the rest of the country -- but I can't help but think that if extra widely spoken languages give you extra opportunities for education and subsequent escape from underdevelopment (by which I don't mean "everyone ditching Haiti" but "being able to do more stuff within Haiti"!), then English offers more of those opportunities than French (or Spanish, for that matter).  Although the same medical student mentioned earlier opines, "Whether we want it or not, we are influenced by French because of the history of colonialism - this is not something we can get rid of quickly", I don't think that's the right way of looking at it. You move away from French the moment you start using Creole as the main language of education, and likewise as you start teaching better English.

Again: There's naught wrong with learning French, per se, but at the very least a lot wrong with trying to teach students in French when they haven't learned it. Even if we don't put a lot of stock in the semi-Romantic notion that Creole is the language of Haiti, valuable in its own right, and important to sustain (though I think we probably might as well put some stock in that idea), it's plain good sense to try to educate people in their first language before throwing stuff at them in another. (And, sure, I am an advocate of CLIL as a form of bilingual education -- but implementing a CLIL approach is quite a bit different than just dumping Standard French textbooks on Creole speakers. CLIL implies appropriately scaffolding learners even from very low initial levels in the "target" language in order to achieve grade-appropriate competencies in both content and language -- but I digress.)  There are no materials? Build them -- and they will learn! A pi ta!

18 August 2011

Language in post-Roman Britain and post-Conquest Latin America

There's an interesting post on language in post-Roman Britain over at Guy Halsall's Historian on the Edge (a blog worth following); I tried to leave my own follow-up comment of musings, but the Web UI somehow defeated me -- so I'm posting them here, instead. Hey, it's been months since the last post, so this is as good an excuse as any to get going again.

The (Apparent) Problem of Language Shift in Post-Roman Britain

The mentioned HotE post concerns linguistic change in post-Roman Britain: essentially, the familiar question boils down to "Why does most of Britain end up speaking a Germanic language (i.e. English) instead of a Romance language or a Celtic language?". After all, through comparison with other regions of post-Roman Western Europe, one might have expected Britain to end up speaking Romance despite an influx of Germanic-speaking invaders (Spain, Italy, and especially Gaul all had their share, but ended up mostly Romance-speaking). And we know there was still a sizable British-speaking population in post-Roman Britain; modern Welsh-speakers are their linguistic descendants, as were (are? could be?) Cornish speakers; there were clearly enough British-speakers on hand that some went over to Gaul and gave us Breton. So if Britain hadn't stuck with Romance, it might have gone back to British .... But it didn't: it became (mostly) English-speaking instead.  Even by medieval times, most of what is now technically England was English-speaking.

This seems to worry researchers who perhaps look no further than other bits of post-Roman Western Europe for the models that underlie their expectations -- but I am not entirely sure that it should. More on that later; firstly, some observations around HotE's post and the follow-up comments ....

Language and Identity in post-Roman Britain

The HotE post makes the point that there is very little evidence of impact from British (loanwords, etc.) on English, while on the other hand Old English does show a number of Latin borrowings, and observes that this leads some to think that English got a foothold in the "heartland" of what had been Roman Britain -- the agriculturally productive southern and eastern lowlands. These were indeed the areas earliest and most thoroughly Romanized -- and also the areas in which Latinate culture and identity seem to have most seriously collapsed at the end of the Roman period. This leads some to the conclusion (which I think is not unreasonable) that a new Germanic-speaking elite established itself in this region where British was, if not moribund, perhaps already relegated to an irrecoverably low status, and Latin had just lost its previous high status -- thus permitting a situation in which both British- and Latin-speakers had every reason to switch to Germanic rather than the other way around (as happened in, say, most of Gaul -- though the post makes the useful observation that a comparable collapse of Roman culture/identity seems to have affected northern Gaul, which likewise ended up Germanic-speaking).

The post goes on, however, to note an objection that "Roman identity did matter ... to the Anglo-Saxons when we know about them from written sources - and it also mattered in the highlands". (That's the highland areas of former Roman Britain, really -- most of Wales, the Pennines, Cumbria -- and not really what we now call the Scottish Highlands, which were never very Romanized). From this objection stems an interesting alternative model for the spread of English, seeing it as having been:

"... used as a politically dominant 'lingua franca' in areas where British and Latin were spoken ... in that highland/lowland border zone - along the edge of the villa zone, where the greatest prosperity in late Roman period is attested, and where the most powerful AS kingdoms emerge.  The scenario (if I remember correctly) would run like this.  Pre-Anglo-Saxon British highlanders would know some Latin but not much - enough to be able to make transactions with lowland villa-owners etc, especially to pay taxes and so on.  The villa owners, by contrast, would know no British.  When an Anglo-Saxon military elite came to power, however, both would need to learn Old English to communicate with these warrior aristocrats, and knowing this language would enable them to communicate with each other in the new set up."

I don't object to many of the generalities in this scenario -- and the points about the relative economic prosperity of the highland/lowland border zone region in Roman times and similar political significance of the same region (e.g. Mercia/Northumbria/Wessex) in the Anglo-Saxon period are well made. But I have some problems with the underlying particulars that make it all seem a bit shaky to me.

Firstly, although Roman identity does seem to matter to later English writers, it must be admitted that it's more the kind of Roman identity we would expect to matter to participants in Romano-centric early medieval Christianity. The floruit of this English expression of Romanitas is really something we see most prominently (IMO) in works from the 9th-11th centuries (Bede et alii notwithstanding) or at any rate in post-conversion writings (most English writing being, of course, perforce post-conversion), when society was probably quite different than it was in the formative 400-600 AD (pre-conversion) period.

Old English, of course, preserves borrowings taken from Latin at different periods -- from the pre-English Proto-Germanic period, through the early "Migration" period (say 400-600) in Britain, and of course from later, learned, ecclesiastical contexts (after 600, and perhaps especially in the 9th-11th centuries, the period which seems to have generated most of the Old English writing in the forms that we know it). Dating particular loans depends on philological evidence; for an overview, see Alistair Campbell, Old English Grammar, 3rd edn (Oxford: Oxford University Press, 1983), pp. 199-220, though this is a complex issue that has perhaps yet to be treated in full. (See, for example, the article by Alfred Wollmann, "Early Latin loan-words in Old English", Anglo-Saxon England 22 (1993), pp. 1-26, doi:10.1017/S0263675100004282, which unfortunately my institution does not have access to, so I will have to try to scam it from someone else, but your mileage might be better.) Nevertheless, I am not sure there is much in the evidence that I can see which suggests incoming Germanic-speakers and their descendants (genetic, linguistic, or both) in the period 400-600 had a great deal of interest in picking up and maintaining any kind of Roman identity. Moreover, it seems difficult to argue simultaneously that Roman identity "mattered in the highlands" but also that "Pre-Anglo-Saxon British highlanders would know some Latin but not much". Or, at the very least, this would seem to suggest that the importance of a Roman identity was not necessarily accompanied by use of Latin as a first language (in which case we might as well stop worrying about associations between Roman identity and the dominance of Romance speech everywhere else, too).

Moreover, if English had been gaining ground principally through use as a second-language lingua franca between Romano-British villa owners and British-speaking hillmen (with Roman aspirations?), then why wouldn't there indeed be more evidence of impact from their respective languages on English from a relatively early date? (As an added bonus observation, let's not forget that the original Mediterranean "lingua franca" was essentially an "inter-Romance" pidgin.) Instead, while I think we see modest evidence of impact from early British Latin (or other forms of spoken Latin, depending on what period and region we are talking about) on early Old English (and very, very little evidence of impact from British Celtic), I would suspect that borrowings of Latin words that really have to do with anything about any kind of "Roman identity" are associated with a rather different, Christianized flavour of Romanitas.

Lingue franche & Lingua Anglica

However, there is another, perhaps more curious issue here regarding the use of English as a lingua franca.  For one thing, the proposed model seems to assume either that British highlanders and Romanized lowlanders did not communicate until English became available as a lingua franca, or that they ditched whatever form of communication they used previously in favour of English as a lingua franca once it became available. The former possibility seems unlikely, at least if there was real economic advantage in their communication (which we are assuming, and which adoption of a distinct lingua franca would imply anyway). The second possibility seems more plausible -- though it would seem to misunderstand how a given language typically comes into use as a lingua franca.

Generally speaking, speakers of languages L1 and L2 do not simply pick out language L3 as an ethnolinguistically neutral form of mutual communication.  Generally, speakers of one either end up learning the other, or some pidgin (perhaps > creole) is formed out of the two. A lingua franca L3 is generally only possible when speakers of L1 and L2 have already had some other reason to be learning L3 (typically to communicate with native speakers of L3) but then discover that since they now all know L3 they can thus simply use L3 for mutual communication without worrying about L1 and L2.  This is basically what has happened to Modern English, which now seems to have more second-language speakers than native-language speakers. Not so long ago, people were learning Modern English to communicate with native speakers of Modern English, because over the last few centuries Modern English had developed a significant degree of economic, political, and technological value (thanks to the economic, political, and technological dominance of first the British Empire, and then the United States of America). That's a complicated story in itself, but to make it short, lots of people with many different native languages ended up learning Modern English to communicate with native Modern English speakers, and now those lots of different people have discovered Modern English is a practical lingua franca to speak to other non-native English speakers -- and, well, there you go.

What does this have to do with our highland Britons and lowland Romano-Britons? Well, perhaps they did use (Old) English as a lingua franca; but they hardly would have done so without a phase in which it was first seen (by both groups) as an advantageous way of communicating with native Old English speakers.  Just as with Modern English, the use of Old English as a lingua franca is as much a side effect (if a big, useful, and important side effect) as anything else. Thus, I am not sure it makes sense to invoke the need for a means of communication between British-speakers and Latin-speakers -- whether along the highland/lowland border zone or elsewhere -- as an impetus for the adoption of English.

Moreover, the facts of the case are clearly that there was relatively little impact of British on Old English (certainly very few loanwords, many connected with geographical features -- and, of course, many British place-names were borrowed with attendant "Anglicization" of their pronunciations). There was rather more impact of Latin on English (maybe 600-700 loans -- from all periods), though probably more impact of Latin on British (maybe 800 loans preserved throughout Welsh/Cornish/Breton), and I would suspect (though it would be well worth knowing more about!) that, on the whole, Latin loans into British tend to be a bit earlier (perhaps many from the Roman period itself) and Latin loans into English tend to at least trend later (with at least a sizable number belonging to the post-conversion period).  In all events, the impact of Latin on Old English looks substantially lighter (to me) than the impact of Scandinavian (in the Viking Age) and especially of Norman French (post-Conquest). (And the impact of British is almost negligible in comparison to any of these. Yes, there are those who argue for more substantial Celtic influence on English than is commonly recognized -- as in this book, or the study described here, or this issue of English Language and Linguistics if you are lucky enough to have access, and see summaries and some useful links on this Wikipedia page -- but even if there is something to these ideas, the mere fact that one has to look so hard to come up with still fairly controversial explanations suggests at best a relatively light touch in comparison to all our other examples of linguistic influence.)

In any event, whether you were a native English, Latin, or British speaker, there was clearly very little need to pepper your speech with British words (or, if there was, it was a brief fashion of which almost no trace survives in the later recorded language), and only a modest need to deploy Latinate loans. (And there are examples of Latin borrowings into OE from early periods alongside different re-borrowings of the same word for the same purpose in later periods, suggesting that some early borrowings that survive in our records were not necessarily so widely known as to obviate the need for the later re-borrowings.) There was perhaps even an advantage for native Latin (or British) speakers in speaking a fairly "pure" form of Germanic (though I strongly suspect that "proto-OE" developed as something of a Mischsprache of different "North Sea Germanic" dialects). In other words, if you had been born a British or Latin speaker, you were perhaps learning English in order to participate alongside a native Germanic-speaking elite, and slanging around more Latin (and certainly British!) than the native English-speakers themselves already did might make you look like a bit of a "hillbilly" (whether or not you were from the highland zone ...). Accordingly, the existing idea that "proto-OE" first expanded at the expense of other languages in the economically attractive lowland zone -- where perhaps British was already pretty much obsolete for daily business amongst anyone who mattered, and Latin was quickly being deprecated in the post-Roman environment -- is not so bad after all.  That's not to say that an argument could not be made that the highland/lowland border area was also economically attractive, or that English could not have started its march towards dominance there. Rather, the point is that the evidence suggests that you probably first learned English to communicate with native English-speakers in an environment where there was more Latin than British in use, so any use of English as a lingua franca between native speakers of British or Latin may have been already fairly limited (perhaps by a shortage of British-speakers not also bilingual in Latin) and perhaps short-lived in any case (as the advantages of using English on a daily basis swept away practical uses for Latin or British).

In the comments to the HotE post, the estimable Alex Woolf notes "a brand new article just out by David Parsons in the Transactions of the Philological Society 109 (2011), 113-137, which deals with the relationship between Latin and British in Roman and early post-Roman Britain"; that whole issue of TotPS is in fact devoted to languages of early Britain, and so well worth checking out -- if you can get it! Alex goes on to express discomfort with "the idea that elite languages become widespread vernaculars in this way", by which I understand him to mean through their use as a prestige language/dialect, as he goes on to observe that the argument "that OE is British-free because it is an acrolect is problematic because it clearly emerged in the seventh and eighth century from regional dialects and does have distinctly insular peculiarities (some of which, morphosyntactically, may be derived from British or Romance)". He also speculates interestingly about whether "the dominance of Romance in western Europe may not be partly because it was the lingua franca for bi-lingual Celts and bilingual Germans".  This is all good stuff, though I would once again observe that Celts and Germans on the Continent must have first learned Romance to communicate with native Romance speakers (not each other), with mutual communication in Romance between native Celtic or Germanic speakers being a useful by-product of this process (which eventually proved so very useful that the descendants of our Celtic and Germanic speakers generally stopped bothering with their forefathers' quaintly outdated languages and became monolingual Romance speakers). Anyway, Alex himself has a handy paper on the whole British > English thing, "Apartheid and Economics in Anglo-Saxon England", which is well worth reading, and comes from the book Britons in Anglo-Saxon England (ed. Nick Higham), which is absolutely full of rich food for thought, and which you should beg, borrow, or steal ;) as necessary!

Of course, much of the argument about language shift in post-Roman Britain focuses on how many Germanic-speaking immigrants showed up, and whether they replaced the native population (driving them out, killing them, or -- more subtly -- swamping them genetically through high-status immigrant males monopolizing available local females) or whether the native population switched to the language of a relatively small but very high status incoming elite.  Probably, of course, things went down different versions of these routes in different places at different times.  The end result, however, is a largely English-speaking southeastern and south-central Britain by the 9th century or so -- and speaking an English with relatively little sign of influence from either Celtic or Romance (which, of course, just makes everyone worry all the more when they look to the future, when English is enormously impacted by first Norse, in the Viking Age, and then French, following the Norman Conquest).

Pre-Roman English? Probably Not

Some of the follow-up comments to the post bring up other ideas -- one being that of paediatrician and geneticist (but not historian, archaeologist, or linguist) Stephen Oppenheimer, who theorized that a Germanic language ancestral to English was spoken in pre-Roman Britain.  There are tremendous problems with this idea, of course (another comment notes some critiques at http://www.grsampson.net/QOppenheimer.html), but I think the main problems are that 1) there is no evidence for any sizeable Germanic speech community in Roman or pre-Roman Britain, and 2) there really is no need to try to explain away the lack of Romance and/or British influence on English because -- as long as we are not too obsessed with other parts of post-Roman Western Europe as some sorts of "normative models" -- there is really no particular reason that English should necessarily have been influenced by Romance and/or British.

This idea seems to have found favour with the usual sorts of "mainstream academia wants to hide the awful truth, man!" kind of "popular" or "alternative" history'n'archaeology crowd -- not to say English nationalist types.  But, honestly, though Oppenheimer may well have useful insights on population genetics (though I'm hardly qualified to judge!), it's a fundamental mistake to equate population genetics and speech communities at almost any stage of history.  Sure, there can be a correlation -- but there can just as easily not be.  In some respects, it seems pretty clear that there has been remarkable continuity of population in at least some parts of Britain since the post-glacial re-population, while language has very likely changed/been replaced several times during the same period.

So my advice would be to run far and fast from theories about significant Germanic-speaking communities in Britain much before 400. There's no linguistic evidence to support such ideas -- and no need for such ideas to explain the results we have.

Comparative Examples with Language Shift in Post-Conquest Latin America (Particularly the Colombian Altiplano Cundiboyacense)


With regards to all this, I would like to point to examples from Central and South America, where there is a wide range of outcomes in terms of the fate of indigenous populations and languages vis-à-vis incoming Europeans and the Spanish (or Portuguese) languages. In some areas, local populations largely survived intact; in others they were largely replaced; likewise, indigenous languages survive in some regions, but not so much in (most) others. We can't really apply any universally simplistic model to Latin America; there were clearly different social, political, economic, environmental, and even physiological factors in different regions that led to different outcomes in terms of population and/or language replacement ... or not.

As a specific example, in the particular region around Bogotá, Colombia (where I live), the inputs and outcomes seem broadly similar to what might have happened in post-Roman Britain (though only two languages are really involved in this case). Here, the pre-Hispanic indigenous population has largely survived, though it has mixed to greater or lesser extents with the incoming European (and African) population (with higher percentages of European ancestry in the city, and higher percentages of indigenous ancestry in the hinterlands). The indigenous language, however, became extinct a few centuries after contact/colonization -- with (surprisingly?) very little impact on local varieties of Spanish, despite broad continuity in population in much of the region (particularly, of course, in more remote and rural areas). The impact of borrowings from Arawakan languages in the Caribbean -- or even from Quechua in Peru -- is far, far more evident in local Spanish than borrowings from the local Muisca (Chibchan) language. (Perhaps not insignificantly, although the Taíno language that probably contributed most of the Arawakan borrowings is likewise extinct, the closely related Wayuunaiki language retains a large speech community, as does Quechua in Peru, etc.) In any case, here around Bogotá, Spanish as the language of elite invaders seems to have edged out the local indigenous language without much impact from that language (less than the impact of other indigenous languages in different regions, with different outcomes).

Clearly, the results are profoundly affected by context, but without getting caught up in the sociolinguistics of it all, it must have been the case that it was advantageous to adopt Spanish, but that there was no functional value in adopting terms from Muisca -- because that is basically the result that we have (much as English in Britain lacks much obvious input from British or spoken Latin).  The very, very few Muisca lexical items that were adopted predictably refer to exotic vegetation and the like (for which there were no native Spanish terms), and the need for higher-status speakers to ever use these terms is extremely limited. I've read about them, but I've never heard anyone actually say them.  Moreover, even in some such cases, borrowings from "foreign" indigenous American languages ousted "local" indigenous terms. For example, colonial-era Spanish borrowed a term from the Arhuacan languages of the Caribbean coast, hayo, to identify the plant that the local Muisca called fuhuza, though hayo itself was eventually ousted by the Quechua-derived coca. (And, for the record, it's also very evident that [Modern] English has had far more impact on local Spanish than the local "substrate" or other indigenous languages ever did -- and without there ever being any significant local population of native English speakers to aid the process.)

It seems likely that there were pre-Conquest native Muisca speakers here in the altiplano who were also conversant in the languages of neighbouring groups from the lowlands (it's that highland/lowland thing again! ;)), but clearly the Muisca-speakers and their immediate neighbours all started learning Spanish to communicate with the native Spanish-speakers that were the new power on the block, and though Spanish probably then became the lingua franca of choice amongst speakers of different indigenous languages in the region, the end result was everyone using a Spanish relatively free of any local indigenous terminology.

Back to Post-Roman Britain

In colonial Colombia, perhaps Muisca-speakers found little advantage in adding native terms to their speech since they would aid understanding with neither native Spanish-speakers nor speakers of neighbouring indigenous languages. Perhaps something similar happened in post-Roman Britain -- with the exception that perhaps there were monolingual Latin-speakers using Latin to converse with British-speakers bilingual in Latin -- so perhaps Latin itself remained briefly the lingua franca in early/formative Anglo-Saxon Britain, helping more Latin terms enter OE ... until its utility as a lingua franca was swamped by the greater utility of OE for that purpose (since everyone had to learn OE anyway if they wanted to get in with the native OE-speaking big boys on the block).

So I would agree that a collapse of Roman identity (and thus the usage and prestige of Latin) in lowland Britain may well have been a key element in helping English get a foothold there, but I see little evidence that Roman identity as such mattered to the English until much later times when they were (Roman) Christians (and accordingly borrowing additional Latin terms through exposure to learned ecclesiastical culture). In other words, I suggest "proto-OE" entered something of a "linguistic prestige vacuum" in lowland Britain, with lowland British perhaps largely a dead issue, and British Latin soon to join it. In such a situation, proto-OE, though absorbing some useful Latin vocabulary (which had been going on since Proto-Germanic times anyway), could have rapidly become the "go-to" language, with attendant (apparent) "Anglicization" of the rest of the (Romano-)British population through subsequent generations. In any case, the end result was an English apparently less affected by Latin (let alone British) than it was eventually to be by either Scandinavian or French in later centuries.

Now, as for the details about the particular "whys" of all this .... Well, obviously we want to think about the various local conditions that produced our results, but most importantly it seems to me wrong-headed to assume that it's necessarily strange that Britain ended up mostly English-speaking, or that British had relatively little impact on that English. That (for example) France is not Frankish-speaking (except in the north) is a different sort of outcome, but then it was clearly a different situation -- though not a more predictable one, really.  (Well ... we might decide that we could predict it, once we felt sufficiently comfy with the whys and wherefores -- but could anyone "back in the day" have predicted the respective outcomes in then-Gaul and then-Britain? I'm doubtful ....)  But I see results basically comparable (IMO) to those of Britain whenever I walk out my door here in Colombia. Naturally, in some other part of Latin America, where local conditions were different, I would see some other results (as in fact we also do in different parts of former-Roman Europe: like the aforementioned case of northern Gaul).

In any event, in the ongoing debate about linguistic change(s) in early Britain, there might be value in looking at comparable language replacement scenarios further afield -- for example, in the post-Columbian Americas, as suggested here, but probably also elsewhere -- and not just across the Channel to other bits of post-Roman Western Europe.  There is lots of fun yet to be had with all this. :)