18 December 2014

Budget Home-Recording on a Mac: The Agony and the Ecstasy (Though Probably More of the Latter)

Finally, a proper post on this topic -- spurred by a friend's post to Facebook asking "what do I need to start recording my own music?". Predictably there were a lot of answers -- and many of them were potentially good ones. As with any creative endeavor, "whatever works for you" is a valid answer, though when it comes to home-recording with a personal computer (of the sort one tends to have on hand anyway) or even other digital devices, a lot also depends on what you can do with what you have.

I am focused on a home-recording approach centered around Macs. I'm not an Apple fanboy who queues up to buy a new iThing the moment it's available -- and I'm on a budget, so I'm usually a generation or so behind on iThings, if I have them at all -- but I moved from a DOS-based PC to a Mac IIsi in 1993ish because I didn't like Windows 3.1, and have stayed in Applelandia happily enough ever since.

The obvious recording tool for a Mac user is GarageBand, if for no better reason than that it comes pre-installed on every Mac. Perhaps the other two biggest players are Ableton Live and Pro Tools, widely used on the Windows/etc. side where there is no GarageBand. I've never used either, though I understand Ableton Live is a bit idiosyncratic, "love it or hate it", while Pro Tools is clearly what the "big boys" use in pro studios (and so perhaps worth it to the Mac user if you really need that interoperability with your Windows buddies). Apple's Logic probably has a bigger and better selection of software instruments than either of these, and a very easy-to-use software instrument/MIDI editor -- but it is sold, essentially as a loss-leader, by Apple for about USD 200 (at present). And, frankly, that's giving it away for what Logic is. I think Pro Tools retails at around USD 700 and Ableton Live for perhaps slightly less. So for the home-recorder on a Mac, Logic is a no-brainer ... except that you've already got GarageBand for "free", and it's really quite powerful on its own.

A Tale of Two GarageBands, and Their Logic

GarageBand first appeared around 2004, spun out of Logic, which Apple had acquired from its original German makers, Emagic. Possibly this was some idealistic Jobsian plan to bring music production power to the masses -- and one questions whether the masses have much use for that, however good it might be for them -- but anyway it was a godsend for the budding small-time Mac-using musician. Suddenly your computer had a basic, but functional, digital audio workstation built into it, and as updates and 3rd-party add-ons continued to appear over the years, you could (with a little know-how) start producing demos that actually sounded pretty good.

Something like a road bump appeared in 2013. GarageBand's big brother Logic received a significant overhaul in the form of Logic Pro X, and an effectively new version of GarageBand (aka GarageBand 10, GarageBand X, GarageBand 2013) followed. If you already had/have "original" GarageBand (maximum version number 6.0.5), the new GarageBand installs as a distinct application, with the older version shunted into a subfolder and runnable separately.

GarageBand X offers cool new stuff, like the eminently practical "Drummer" feature (borrowed from Logic Pro X), but -- at least initially -- it disabled or at least obscured a lot of functionality that the "advanced beginner" had come to rely on, like easy use of 3rd-party plug-ins and control of Apple's built-in software instrument settings. Some plug-ins (like IK Multimedia's AmpliTube) simply stopped working in GarageBand X (even if they still worked in Logic Pro X). Luckily, a 2014 update (accompanying Yosemite) restored a lot of this control -- and if your favorite plug-ins are still a bit screwed in new GarageBand, the manufacturer may have come up with a serviceable workaround (as in the case of AmpliTube, which still has no official fix, but can be convinced to do its job).

So with some know-how and decent 3rd-party add-ons, the budding home musician can still make some pretty cool music in GarageBand X even without jumping to Logic -- though the low price of Logic makes the jump a constant temptation.

For my particular part, I'm based in South America (Colombia) where my earning power is low and import costs are high -- so getting by on a budget is a lot more important than it might be in the US or Europe (where part-time students and particularly dedicated lawn-mowing teens can readily out-earn me in real terms!). So I haven't made that jump yet ... though it is tempting, and I might well do so before long. Maybe .... soon ... ish .....

Getting Going

Let us then imagine that one is a budding Mac-equipped musician: What do you really need to get started with recording your own music (or even covers of your favorite songs)?

My focus is on your basic guitar-based rock formula, rather than classical chamber quartets or techno or something -- though, ultimately, a lot of the basics apply everywhere.

Audio interfaces

If you play an instrument, you need a way for that instrument to talk to your computer. Instruments (like electric guitars) and microphones that produce a "traditional" analog audio signal most likely need a separate audio interface unit. If you are mostly just working by yourself at home, a simple USB interface with limited inputs/outputs will do the job. (If you need to record lots of tracks from a full band, etc., something more capable might be in order.) I currently use a somewhat obsolete MOTU MicroBook; if I were getting a new interface, it would probably be a Focusrite Scarlett 2i2 (which, in my case, has the added advantage of being not impossible to find in Colombia -- or in the capital, Bogotá, anyway!).

Since I am principally an electric guitarist/bassist, I also use a MOTU ZBox between the guitar/bass and the interface. This little device supposedly simulates the distinct "Hi" and "Lo" impedance inputs found on many classic amplifiers. Since my real experience with classic amplifiers is limited, it's hard for me to say how well it does this .... But it gets good reviews, is relatively cheap, and it lets you feel a bit more geeky in the guitar department -- which is all cool. You can survive happily without it (I did for many years) but you might as well pick one up.

Amp modelling

At this point it may have become clear that I am not using traditional amplifiers for my guitar/bass; I'm plugging (more or less) straight into the computer, where software emulates the job of the amp. For the traditionalist/purist, this probably elicits gasps of horror and/or snorts of derision. A lot of my recordings float in and around the genres of "classic rock" and "stoner rock", and the latter especially is a place of worship for giant, vintage, smoldering tube amps capable of leveling a mountain whenever the player touches (or even looks at) a string. That kind of thing is, of course, awesome. But it's also totally impractical for my purposes -- being not merely expensive (especially in Colombia) but also likely to wake the wife and child (as well as lay waste to the neighborhood).

GarageBand (and Logic) actually come with very usable amp models for guitar and bass, as well as various effect pedal models. I used the "original GarageBand" models for years and coaxed very useful tones out of them. GarageBand X seems to offer similar kinds of stuff, with much improved interfaces, though I moved to IK Multimedia's AmpliTube a couple of years back. Besides AmpliTube, big players in this area are Line 6 and Guitar Rig. I like AmpliTube's "Custom Shop", which basically sells models through an "app store"-like approach; for example, if you want a model of the classic Marshall amp that Hendrix used, you can buy it online and download it for about USD 20 or so, rather than buying a giant complete suite of amp-modeling software (much of which you might not want) for hundreds of dollars. Amp-modeling software has made great strides as computers have become more powerful, and I very much doubt the average listener can tell the difference between even GarageBand's built-in models and a real amp. Plus, it's extremely practical for the home-recorder who wants to get that "to 11" sound at 3am in their urban apartment. So I'm a fan of this approach.


Microphones

If you want to record anything that doesn't produce an analog audio signal -- like your voice, acoustic instruments, or a real guitar amp's speaker -- you will need microphones. Most mics are either "dynamic" or "condenser". Dynamic mics like the famous Shure SM57 and SM58 are often quite ruggedly built and are commonly used in live situations for vocals, guitar amp mic'ing, etc. They are good at capturing strong sounds (like loud singing and guitar amps) at close proximity, and are often a bit cheaper. Condenser mics are often a bit more delicate (and pricey), tend to capture a wider range of loud/soft sounds, and usually need external ("phantom") power (though this is often built into modern audio interfaces). Condenser mics are often used for vocals and acoustic instruments in studio situations, and if that's your focus, you can probably get a decent entry-level condenser mic like the Røde NT1-A for USD 250-350.

That's probably somewhere I'll go eventually, but so far I've just used the old Shure Beta Green vocal mic (basically a cut-price SM58, I think!) that was originally bought for the vocalist in my first band about 20 years ago. The cognoscenti will not be impressed, I think -- but frankly it's worked just fine so far for my not-very-refined vocal takes.


MIDI controllers

GarageBand has a lot of built-in software instruments -- that is to say, basically sampled sounds accessed via MIDI -- that are most easily used with a USB MIDI controller keyboard. These come in a variety of sizes with a variety of more or less sophisticated features. I have an M-Audio Oxygen 61 (i.e., 61 piano-like keys); I probably would have been fine with a 49-key unit for what I do, but the shop here in Bogotá had a 61-key unit, so I bought it while I could. Very handy for adding even simple instrumental or synth textures, as well as tapping in drum/percussion parts. Speaking of which ....


Drums

People do record acoustic (or electronic) drum kits in their homes; I am not one of those people -- partially because I am not much of a drummer/percussionist, and partially because an acoustic kit would be way too loud to be practical. So you are not going to hear a lot about the arcane arts of mic'ing up an acoustic kit here.

But a rocker needs drums.

You can fill this gap with loops; a bunch come built into GarageBand, and many more are available from 3rd-party providers. I like the loops from "Drums on Demand", which come in a variety of styles and tempos, and are easy to use (as-is or chopped up). Making good use of loops is something of an art in itself -- especially for the rocker looking to emulate the approach of a real drummer -- but there are a few tracks I've produced on which I think I've done OK with this technique.
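For the programming-inclined, the basic idea behind chopping up a loop is simple enough to sketch in a few lines. This is only an illustrative sketch of the general technique -- the filename and the assumption of a one-bar, 4-beat loop are mine, and have nothing to do with how Drums on Demand or GarageBand actually work internally:

```python
import wave

def slice_loop(path, beats=4):
    """Split a one-bar WAV loop into equal per-beat chunks of raw frames.

    Any leftover frames (if the length isn't evenly divisible) are dropped.
    """
    with wave.open(path, "rb") as w:
        params = w.getparams()
        frames_per_beat = w.getnframes() // beats
        slices = [w.readframes(frames_per_beat) for _ in range(beats)]
    return params, slices

def write_loop(path, params, slices):
    """Write slices back out as a new WAV (e.g. with beats reordered)."""
    with wave.open(path, "wb") as w:
        w.setparams(params)
        for chunk in slices:
            w.writeframes(chunk)

# Example: swap beats 2 and 4 of a hypothetical "loop.wav" to vary the groove.
# params, s = slice_loop("loop.wav")
# write_loop("loop_variation.wav", params, [s[0], s[3], s[2], s[1]])
```

Reordering or repeating slices like this is essentially what you'd do by hand in GarageBand's editor, just automated.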

The new(ish) "Drummer" feature of Logic/GarageBand X may be a "loop killer". You can read more about this elsewhere, but basically it uses sets of MIDI drum patterns associated with various styles and tempos, along with sets of samples of different drum kits, to give a fair degree of control over what an imaginary "virtual" drummer is doing. It's pretty fun, and may well get better with time.

Otherwise, I've gotten some of my most satisfying results from hand-programming drum parts. There's a big learning curve here -- one has to learn to think like a drummer to produce something that sounds like a drummer did it -- and it's time-consuming. But it gives a lot of control. I've used a Mac-only application called Doggiebox to program drum parts separately (outside of GarageBand), from which I output audio files (corresponding to separate kick, snare, tom, hat, and cymbal tracks) that I import into GarageBand (sort of as if they were very cleanly recorded acoustic drum tracks). Doggiebox lets you create your own "virtual drum kits" using whatever samples you like; I've long used the "nskit_7" samples -- no longer available under that name, I think, but perhaps available as "NDK Natural Drum Kit". There are other approaches like Toontrack's offerings and Addictive Drums, but I haven't gone that route. If Doggiebox ever stops working, I'll probably stick with Logic/GarageBand's "Drummer" feature as long as possible before investing in another 3rd-party approach!
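To give a flavor of what "thinking like a drummer" means when programming parts, here's a toy step-sequencer sketch. This is pure illustration (it has nothing to do with Doggiebox's actual format): a bar of 4/4 is divided into 16 sixteenth-note steps, and each drum gets its own pattern line -- very much how one ends up visualizing a groove:

```python
# A toy one-bar step sequencer: 16 sixteenth-note steps, "x" = hit, "." = rest.
# The pattern below is a basic rock beat: kick on beats 1 and 3, snare on
# 2 and 4, hi-hats on every eighth note.
PATTERN = {
    "kick":  "x.......x.......",
    "snare": "....x.......x...",
    "hat":   "x.x.x.x.x.x.x.x.",
}

def events(pattern, bpm=120):
    """Yield (time_in_seconds, drum) events for one bar of the pattern."""
    step_dur = 60.0 / bpm / 4  # duration of a sixteenth note at this tempo
    for drum, steps in pattern.items():
        for i, ch in enumerate(steps):
            if ch == "x":
                yield (round(i * step_dur, 4), drum)

hits = sorted(events(PATTERN))
# Each (time, drum) hit could then be rendered by triggering the matching
# sample (kick.wav, snare.wav, ...) at its timestamp -- which is, in spirit,
# what a drum-programming app does for you.
```

Making it sound human is then a matter of varying velocities and nudging timings slightly off the grid -- which is where the real learning curve lives.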

All that said, some simple hand percussion can be fun and "authentic". I actually have two cowbells on hand, and a few different things to hit them with, because ... well, obviously we need more cowbell. :)

Guitars & Basses

Also vital for the rocker. Not much to say here, because if you've got one you like -- and an audio interface to get the sounds into the computer -- you are basically good to go (even with no more than GarageBand's amp models). If you only play one of guitar or bass, you might as well get the other and learn to use it -- 'cause then, along with GarageBand's virtual drummer, you're a basic one-man band.

You can, of course, also get away with using some loops and stuff here. GarageBand offers numerous loops of various kinds of guitar and bass parts, as both software instruments (i.e. MIDI) and real audio samples -- and, of course, a vast array of all sorts of instrument loops (which can readily be augmented by 3rd-party loops) that are good for all sorts of things.

Instruments are usually pricey imports in Colombia. They cost more in real terms than they do in the US, and of course local salaries are much lower. But you can do worse than get decent "budget line" instruments like Fender Squiers or Epiphones and then -- when your finances allow -- replace the pickups with higher-quality 3rd-party gear. (In my experience, the electronics are one of the key areas where budget guitars cut corners.) I have a '90s-era Gibson Les Paul Standard and a mid-'70s Rickenbacker 4001, both purchased long ago when I was a swinging bachelor. There's no way I could afford to replace them now :) especially in Colombia. I have a more recently acquired Epiphone "Inspired by 1964 Texan" electro-acoustic which (though it would benefit from a decent condenser mic to augment its electric output) is great. While it's cool to drool over more instrumental toys, a single electric guitar, electric bass, and electro-acoustic guitar will get the job done.


Monitoring

With all your cool software and hardware music-making toys in place, you still need some way of hearing what you're doing. Your options are basically speakers (e.g. studio monitors) and/or headphones. You can find a range of usable monitoring headphones these days: I use an AKG K240 MkII set. These let me monitor quietly -- which is a big advantage -- but there's a lot of low end you can't hear. Decent home-studio monitors would be nice for those times when I can make a bit more noise, but I haven't gone that route yet. Frankly, I can do a lot of practical work with just some bog-standard Logitech computer speakers. Purists and pros would rightly express their horror -- but you work with what you have, and what you can afford! The main disadvantage is that computer speakers, like headphones, are likely to lose a lot of low end -- and so you may produce mixes that contain a lot of muddy low-end simply because you can't hear it to mix it out. I try to get around this by listening to mixes on the car stereo (itself a long tried-and-tested approach!), but it's imperfect.

Check it out!

Someday, perhaps I'll have better monitors -- along with better mics, and better software. But using what I've got -- and keeping an eye on the numerous informational resources now available (lots of Web sites, videos, and blogs treating different aspects of recording and mixing) -- I've been able to come up with some stuff I'm not overly embarrassed by. :)

You can hear for yourself and decide via the following links:

01 July 2014

The music industry is dead (or at least pining for the fjords)? Long live music!

I can hardly believe one still hears grumbling and complaints about the "death" of music (or even, more properly, "the music industry" as such) at the hands of the Internet and the digital era. What year are we living in? 1973? For good or for ill, we are not.

In any case, "music" is far from dead. In fact, we are perhaps living in a moment of (at least potential) musical creativity and expression unparalleled in human history. Never have our capacities to (1) make music and/or (2) listen to the results been greater for a larger number of people -- and we can reasonably expect (barring global demographic catastrophe, a possibility that we should not entirely rule out!) that trend to continue. There is an enormous amount of good (and, of course, less good) music of essentially any style one can imagine (as well as quite a lot of styles one has probably not imagined!) being made all over the world, as well as numerous opportunities for finding this music that simply didn't exist before. Of course, there are often considerable difficulties of various kinds (legal, technical, informational, etc.) in terms of finding what you already know you want and then being able to listen to it readily ... but we shall see how all that plays out in the years to come.

What is "dead" is "the music industry" as it was known in the 1950s-1990s (essentially: that industry which revolved around pop music from Elvis through Nirvana). This could be seen as a simple consequence of things (technological, social, etc.) having changed faster than that industry itself, but it increasingly seems more like a curmudgeonly grumbling of "Oooo, when I were a lad, things were different!". Despite the niche and somewhat "hipster" (not to say largely nostalgic or pseudo-nostalgic) resurgence of vinyl in recent years, the concept of a "record collection" is largely an artifact of a recording/distribution technology that depended on physical media. And, obviously, before records (vinyl, wax cylinders, CDs, whatever) existed, no one had such "collections" ... except, one supposes, of sheet music ... or of composers themselves (in the "collections" of rich patrons). (Many well-known classical composers of the 18th century or earlier were essentially producing on-demand for aristocratic patrons. Mozart tried to break out and make it on his own, but failed and died in debt. Slightly later, Beethoven got lucky with an expanding middle-class market for his music and was able to survive independently. This all sounds a lot like more recent cases we could probably point to!)

This is not to say that people are not still making money from business ventures associated with music. It seems likely that people will continue to consume music and to want to display their associations with music-makers that they like in some way. For the moment, people will continue to spend at least some money on downloads (or even physical media) of music they like, and if the current technological (and legal) limitations to all-streaming, all-the-time music were overcome (which we might well expect, eventually), they would probably spend some money on subscriptions to such services. Yet even if everyone had completely free streaming access to all music, they might still want to buy the merchandise associated with their favorite artists, see the live shows with other fans, etc. Touring and merchandise have already become the major revenue streams for perhaps the majority of working artists (some version of the "Grateful Dead model"?), and so the evolving music industry is going to be about building relationships with fans and generating senses of community that encourage people to spend their increasingly limited and fragmented entertainment budget of both time and money on a given artist. In other words, if Artist X (and their other fans) interacts with me through social media in what I perceive to be a cool way, I will spend more time paying attention to what they are doing and be more likely to spend money on music, videos, T-shirts, fluffy slippers, concert tickets, etc. This has, to an extent, always been true -- but the ability for people to interact in this way has vastly increased in speed and scale, so it has become accordingly more important.

So though the days of towering rock and pop stars, sales of whose recorded music defined generations, may well be gone, we can probably expect that people will continue to make money from music (at least in the sense of live performances) and merchandise associated with that music (be this band T-shirts, or soft drink adverts, or whatever). Moreover, just as you could get rich in the 19th-century American gold rushes by not necessarily finding gold but by selling picks and shovels to people who might or might not then find gold, there seems to be a huge and as yet not fully exploited market for selling music making tools to an increasingly large audience (as the developing world becomes increasingly interested and accessible). I am unlikely to ever make back the money I have spent on instruments, and software, and other equipment associated with music making, but I have nevertheless spent that money to make my own music -- as have plenty of others. (Thus the redefined direction of this blog.) Perhaps most importantly here, what you can achieve with the kind of musical technology that is increasingly affordable to many is quite astounding in comparison with what was available only a few decades ago.

So: Long live music! It is not only still being made and heard, but is perhaps in fact being made and heard in greater quantities, by a greater number of people, than ever.  I can, in my spare time, record music on my (rather aging and in need of replacement) desktop computer (and increasingly on my mobile devices) that sounds (with my artistic limits!) pretty good (I think!) in comparison with what required state-of-the-art facilities in the year of my birth (more or less midway between Elvis and Nirvana, I think!). I can moreover make that music available to a large percentage of humanity at little to no cost. Yeah, lots of people can do it better than I, and I can't make a living at it (though luckily I don't need to), but -- if you think about it -- that's still pretty amazing and awesome. :)

30 June 2014

A change is as good as a rest

It's been over 2 years since I posted anything to this blog. The truth is that a blog for academic purposes simply doesn't make sense for me, these days. All my real academic thoughts get channeled into my real work at university -- and the pace of writing, presenting, etc. has been increasing. There's simply no time to duplicate work in a blog.

In fact, I'm not sure that blogs have much purpose for individuals these days -- except perhaps for people using them as part of their private workflow, or who are promoting other aspects of their work (fiction authors, perhaps -- I'm not sure the paradigm works so well for researchers). Otherwise, the only blogs that seem interesting are those that have morphed into mini-magazines or that are focused on very particular areas.

Still, I think I will keep this alive for the moment -- though I will repurpose it for musings on my music hobby -- and particularly my home recording hobby, a hobby sufficiently different from what I do for a living that I still make the effort to pursue it. So this will entail some minor redesign and whatnot, but I'll sort that out in the coming days/weeks.

I think, for the moment, I will leave all the older stuff here. A 2-year gap offers a pretty clean break to separate the new direction from the old!

18 April 2012

Useful Words from Joe Harris

Life has been too busy for the indulgent luxury of blogging, but the discovery that PDF versions of the various articles collected in "Speak Useful Words or Say Nothing": Old Norse Studies by Joseph Harris are made freely available from Cornell's Web site is worth a quick post.

Joe Harris will need no introduction to those even peripherally connected with Old Norse studies, and this collection conveniently brings together a number of previously published articles -- and the convenience is multiplied almost infinitely by their ready and free availability as PDF downloads.

It's also worth pointing to a more recent bit of Joe's ever erudite and readable output:

This is one of a number of relatively recent (scattered over the past decade) papers Joe has produced on the subject of the Rök rune-stone; relatively few of these are readily available online, but it's well worth trying to track them down in any case, if you can. Taken together, they would have the makings of an amazing monograph on Rök which would stand worthily alongside other classics on the topic from the last century (and, incidentally, be the first such book-sized offering on the topic available in English). We can but hope .... ("hint! hint!", Joe, if you're reading! :))

Meanwhile, plenty of food for thought in the various papers of the Cornell article collection.

17 October 2011

CFP: Enduring Barbarism: Heroic Fantasy from the Bronze Age to the Internet

Man, do I ever look forward to video-conferencing technology becoming more widely and easily implemented in academic contexts, because there is no way that I would ever get funding (or be able to justify using my own money) to attend or present at what looks like an utterly fabulous (in all positive senses of that word) conference:

Enduring Barbarism: Heroic Fantasy from the Bronze Age to the Internet
College of St. Joseph Popular Culture Conference
Contact email:
Dr. Jonas Prida
The inaugural popular culture conference will be held at the College of St. Joseph, located in Rutland, Vermont, April 13th-14th, 2012.
Proposal deadline: Dec 15th, 2011.
We are looking for a wide range of topics, figures, panels and cultural studies methodologies to explore the enduring figure of the barbarian in Western popular culture. Graduate students, established faculty, and independent scholars are encouraged to submit ideas. Possible paper topics:
the multi-faceted use of the barbarian in popular culture
rise and fall of heroic fantasy in the 1970s
comic book barbarism
heroic fantasy as a heavy metal trope
the gendered barbarian
explorations of lesser-known sword and sorcery texts
Italian sword and sandal movies
The barbarian’s future
We are actively interested in innovative panel ideas as well.
Please send 250 word paper proposals, 400-500 word panel ideas, or general questions to Dr. Jonas Prida at jprida@csj.edu

08 October 2011

Humanities, Liberal Arts, and Technology <-> Learning, Creating, and Problem-Solving

This post does not include the name "Steve Jobs" in its title. The 'Net is already awash in Jobsian tribute. So this is not really more of that. But I do want to quickly jot down some reflections with regard to one of the late Mr Jobs's relatively recent observations: that Apple's extraordinary success (all the more extraordinary as it came in the wake of Apple's near extinction) was largely founded not just on clever technology, but in fact on clever technology married with the liberal arts, married with the humanities. In an age of economic crisis and (to steal a quote from former professor and musician Tom Lehrer) "international brouhaha" (as, admittedly, are perhaps most ages, but anyway ...), when universities are busily trying to cut corners and programs they view as "nonessential" -- and the victims here tend to be liberal arts and humanities programs -- it is perhaps important for everyone to take a deep breath and think as hard as they possibly can before shooting down programs in the sorts of "nonessential" stuff that in fact underlies the success of what is currently (October 2011) the most valuable technology company in the United States (and was at least briefly the most valuable US company of any kind earlier this year).

Properly speaking, our word "technology" -- derived from Ancient Greek τεχνολογία (< τέχνη [< PIE *tek̑s- "hew, fabricate"] + -λογία, a suffix relating to "discourse" on, or "study" of, something [< PIE *leg- "collect, speak"]) -- should probably be understood as the study of how best to apply tools in order to solve problems -- though in my experience a majority of people skip the "study" and "apply" as well as (worst of all) the problem-solving aspects and use "technology" as a kind of synonym for "magic tools which we don't really understand, but that we've been told are pretty awesome". (In relation to which, see for example this essay by the late, great Douglas Adams [a Cambridge man, of course!].)

Even though we see it endlessly repeated, and it should be just blindingly obvious anyway, technology is not an end in itself, nor is it a "magic bullet". Technology by itself solves nothing. Yet even people who talk this talk often have difficulty walking this walk. In contemporary educational circles (as in many other circles), there is a lot of buzz about leveraging ICT to extend the ways and means of helping more people learn more stuff. Even courses and seminars (of the sort with which, for example, my well-meaning but perhaps not terribly savvy university administration bombards its faculty) that knowingly remember to remind educators that ICT tools are not strategies, and that one needs to know the best ways to apply them, equally gush with unfettered -- and largely unguided -- enthusiasm. "Blogs and Wikis and Social Media! Start using them in the classroom today! It will make everything better!" It's all not a little contradictory and incoherent.

To solve things, you need people with the right knowledge and skills -- and the right technology can help them acquire that knowledge and those skills more efficiently, as well as apply them more efficiently. You need to know what problem you have, you need to analyze the task that confronts you in solving it, and then you can choose -- or invent -- the tools that will help you do that.

It is widely stated that Mr. Jobs was gifted with an ability to solve problems that people did not yet know they had until presented with the solution (in the form of, for example, an iPhone and all that can be done with it -- or whatever).  I would argue this is a close analogue to the ability that educators need.  Learners -- perhaps especially early learners, but really all learners -- do not really know exactly what they will need to be doing in the future. Neither do their teachers, their parents, or their (possibly future) bosses.  But whatever they are doing, it will require problem-solving abilities. Whether you need to figure out how to make a cup of tea or achieve world peace, that's problem-solving. In pretty much any sphere of human activity, you need to be able to identify what your problem or problems are (often more difficult than it would seem), you need to identify the conditions that will let you know the problem has been solved, and then you need to be able to identify, analyze the various tasks you might need to perform in order to move from the state of "problem exists" to "problem no longer exists". Since it's almost impossible (perhaps increasingly impossible) to predict exactly what problems today's learners will be wrestling with tomorrow (let alone in 5, 10, or 20 years!), we need to help our learners acquire that prodigiously Jobsian quality of being able to solve problems that they not only don't know they have, but that might not even exist yet.  Let point to a quote from the mighty Lemmy Kilmister (who is, now that I think of it, perhaps more like Steve Jobs than I would have otherwise casually thought -- but anyway ...): "Everything you plan is $h!t. All the good things that happen in your life are accidents."  I would humbly suggest the following mild amendment to the Lemster's incomparable wisdom: Knowing how to make the accidents work makes the good things happen. 
Knowing how to make the accidents work is perhaps the highest level of problem-solving.

I had a discussion with a colleague recently in which the theme was basically: "We should throw away all 'product-oriented' education -- that is, education that focuses on acquisition of particular genre-defined knowledge or skills.  All our educational processes should be about problem-solving. Genre-defined content and skills -- that is, say, whether you are nominally studying zoology or law -- can only reasonably be just a vehicle to learn problem-solving skills. If it happens that you study zoology, and go on to become a zoologist, then great. But you might study zoology and go on to become a lawyer. In that case, your genre-specific knowledge and skills might be, at best, incidental to the problems you will need to solve (though they probably make you a better-informed, generally well-rounded sort of person, which is a result not to be denigrated!); but your more general skills and strategies in terms of task analysis and problem-solving will always apply."

In the widely cited (and justly so) speech Mr Jobs gave at the 2005 Stanford graduation, he famously recounted (among other things) how the experience of attending a calligraphy class (as an audited course at the university from which he had largely dropped out) ultimately affected (at least arguably) the development of what are now familiar aspects of everyone's daily computing experience. He noted that although what he learned in calligraphy class had no particular application to his life at the time (it was just interesting and, clearly, motivating in some way), the experience of that calligraphy class led him, ten years later, to insist that capacities for typographical elegance (proportional fonts, multiple typefaces, etc.; all of which, admittedly, in the wrong hands can and do lead to "design train-wrecks", but never mind that right now ...) be built into early Macs (not to mention the early Apple LaserWriters, which some may remember as having been pioneeringly ubiquitous, until eventually swamped by the likes of HP). Such capacities were, of course, then popularized to the point of absolute ubiquity when Windows cloned those capabilities from the Mac, leading to a whole industry of desktop publishing, and eventually e-publishing, that had not previously existed but that we now take for granted.

Small things? Sure, proportional fonts are not world peace. Some might argue that all this sort of stuff was "nonessential", or an example of "style over substance". Yet solving this problem that people didn't know they had did, in its way, change the world -- which would look rather different if we took away all the weird, supposedly "nonessential" things that Mr Jobs rabidly insisted on in his company's products. And it would be a much less cool, much more dehumanized world. Rather than "style over substance", we have here "substance with style": a consummation devoutly to be wished!

So the "nonessential" turns out to be, perhaps, essential after all -- in rather important ways, for rather important things.

We don't know what problems our learners will face in the future (and nor do they); we can't know what problems they will face -- nor where the inspiration to face those problems will come from. But we clearly cannot (or should not) say things like "Don't study Ancient Greek; it's useless" since, just as clearly, if someone had said something similar to Steve Jobs about his calligraphy class, they would have been horribly, horribly wrong. Calligraphy itself marries together concepts of technology, the liberal arts, and the humanities; it is at once artistic and practical: performing the practical functions in artistic manners. Arguably, it was exactly that understanding and experience which led Mr Jobs to eventually demand that his company's products not only integrally incorporate concepts of technology, liberal arts, and the humanities in their design, but that their design should likewise facilitate the creation of new products reflecting the integral marriage of those concepts. Or, in other words, it's a self-replicating humanistic philosophy -- or at least a philosophy amenable to self-replication and humanism.

Of course, many educators need no reminder about the significance of the arts and humanities -- especially if they themselves specialize in those areas, but most of the wiser people in the sciences and technical studies readily recognize this as well, even as professors of literature will recognize the value of scientific literacy, etc.  I am reminded of the way that my undergraduate courses at Harvard were organized: half of them were determined by my major, a quarter of them I was free to choose, and another quarter had to be on topics absolutely distinct from my major. (Consider carefully, O ye curriculum designers, the value of such an approach as mandated at one of the world's leading universities!)  There's an excellent article from earlier this year on this topic (arts+humanities+tech) that got bandied about certain corners of the blogosphere (well, I guess spheres don't have corners ... but never mind!);  another nice one here, and various others elsewhere. Stephen Fry's erudite and insightful tribute to Mr Jobs [Fry being, again, a Cambridge man, of course!] includes some cogent observations on the importance of the "human element" that Jobs always insisted be incorporated into Apple design.

Likewise, in considering the importance of understanding and making use of relationships between technology and the liberal arts, Mark Randall made the observation that the "next generation" is already keenly aware of this significance, and sums it up with the phrase: "Thinking will be now more important than knowing." I would, however, argue that thinking has always been more important than knowing. Obviously, some modicum of knowledge is effectively a prerequisite to thinking, but the key change in which we can rejoice is the relative ease with which knowledge can now be acquired. I could sit (and, now that I think of it, actually have sat) on top of a mountain in South America with my iPad and access a tremendous store of information from around the globe -- and I certainly hope that this situation only continues improving. Yet even if I can find information easily, I still need the ability to evaluate and analyze that information in terms of its relevance to whatever task I need to perform or problem I need to solve. That's all about thinking, and technology cannot really help us do that. (At least, not yet!)

If educators (and learners) in this so-called "age of technology" have something to take away from the example of Steve Jobs, it's that what we really need to be able to do (and thus need to learn to do) is solve problems -- whether or not we know what problems we need to solve, and whether or not the problems have even yet come into existence -- and that being good at this sort of thing requires (beyond, apparently, enormous passion, commitment, and not a little insanity) thinking. Particularly, problem-solving demands creative thinking; the infamous Apple slogan "Think different" is not just an appealingly clever bit of "geek chic" marketing, but actually perhaps foundational to achieving anything genuinely useful. (Cf. The Onion's all too believable headline.) The obvious and easy has already been done; it is thinking differently that allows us to move on to the obscure and difficult: the as-yet unsolved problems.

Technology can be an iPad, or it can be a stick with which to scratch symbols in the mud. Only powerful abilities in creative thinking will help us figure out how best to use our tools to solve our problems, and working with (not just learning facts about) materials and concepts from the arts and humanities -- everyone is, after all, human -- is a powerful means of helping learners develop that creativity. It would probably be for the best if the world were not entirely populated by a zillion Steve Jobses (where would we get enough black turtlenecks, after all?), but if more of us get better at solving problems -- even just the ones we already know we have -- that would probably be all right.

“Technology alone is not enough. It’s technology married with liberal arts, married with the humanities, that yields us the results.”  
-- Steve Jobs, iPad 2 event in San Francisco, CA, USA, 03 March 2011

07 October 2011

Let my computer go!

I have developed an almost rabid antipathy for my institution's IT policies.

I suspect they are not unlike those of many other institutions or enterprises (especially in a world dominated by not-terribly-secure Windows tech and, frankly, a workforce that is curiously un-savvy about how to use their stuff); but never mind all that. I'm not interested in judging things from the standpoint of the most common denominators. Things shall either be organized the way I want, or they are not acceptable -- though I may well lack the power to do other than grit my teeth and provisionally accept them. Alas, my institution's IT policies fall into the latter category.

It's probably a familiar problem to the more "digitally literate" or "informationally literate" members of the population: basically, my institutionally supplied desktop computer is locked down in a fairly extreme way, giving me almost no control over things like what I can install or update -- or even whether I can choose my own desktop picture.  Equally, a snake's nest of weird proxy servers often confound my attempts to use my preferred browsers or email clients -- or even configure extra email account access in Outlook (effectively the default since, yeah, obviously, it's a Windows shop here).

Now in a sense, this is of course all perfectly fair. It is, after all, not my computer; it's the institution's, and they do have the right to do as they will with it. On the other hand, having the right does not mean they should be exercising it. Or at least not in the rather ham-fisted way that they are.

Firstly, these policies certainly interfere with my ability to do my job (educator, researcher, and all that) effectively -- not least because they interfere with many of my institution's own stated objectives in terms of incorporating a wider range of ICT-mediated tools and materials into the educational experience.  For example, I am expected to try to incorporate appropriate videos from YouTube into course content. I have no problem with this; videos are cool, and there are plenty of useful (though, of course, far more useless) videos to choose from for my purposes.  On the other hand, if Flash gets upgraded along with much of the content on YouTube, but I can't upgrade Flash thanks to my institution's highly contradictory IT policies, then suddenly I can't see those videos and have no easy way of fixing that problem.

Sure, if I could just ring up the IT guys and say "Please give me Admin privileges, at least until I've fixed this", and they would do it, then no real problem.  But I can't do that. I have to ring or mail the IT guys, desperately request a fix, and then wait quite possibly for days or even weeks (during which time I must harass, harangue, and otherwise plead with them from time to time) for a tech with the appropriate magic powers to show up and Do The Stuff -- though, of course, usually I am granted the right powers only until the log-in session ends, which can be a problem if the tech appears at a silly time when my availability to then do stuff is limited (as, for example, after 6pm last night, thereby prompting this post!), or the install itself requires a restart (as it not infrequently does), or the electricity goes out suddenly (I probably should not use the word "unexpectedly" in this context: we're in South America, and the lights go out whenever some rain god or another sneezes), or ... you get the idea.  For example, despite a Magic Admin Privileges Tech Visit last night (as noted), I absolutely did not have time (thanks to the stupid hour) to run all my upgrades (even those not requiring a restart), and so this morning, of course, my session had logged itself out (I can't control that either), meaning I am now back to Square One (at best) and need to start a whole new round of cajoling a new Magic Admin Privileges Tech Visit -- only 12 hours after the last one! -- which could easily take additional days or weeks (not to say considerable time and effort) to achieve.

This is all enormously-- e-nor-mous-ly -- inefficient, ineffective, and counter-productive.

Of course, I do understand where they are coming from. I understand that they have to buy cruddy, easily broken and insecure services and systems because everyone is strapped for cash these days, and, of course, whatever problems y'all have off in the "developed world" in terms of budget, I can only assure you most earnestly that they are far worse here. (It's not like, after all, a computer costs less in South America than in Europe or North America. In fact, it quite possibly costs rather more; and there is, of course, much less money on hand with which to buy it.)  And I recognize that many of their employees -- not least many of my faculty colleagues -- are, for all intents and purposes, "digitally illiterate", such that giving them unfettered access to their computers and systems would be the functional equivalent of handing the car keys to a drunken australopithecine. I understand all that.

But the "one-size-fits-all" IT policy is just wrong. Frankly, it just shouldn't apply to me, or to people like me who actually know what they are doing. (A minority? Maybe; I dunno. But even if so, surely a growing minority?)  I am, after all, not a supergenius -- but I'm not stupid either. And I have been continuously using computers and computer-mediated communications and tools in both educational environments and the high-tech industry (not to mention at home) for some 25 years now. I am pretty knowledgeable even with regards to the poor, benighted Windows machine at my workplace (home is a Mac shop, of course) in terms of configuring, maintaining, and squeezing the most out of it without horribly messing it up or filling it with weird viruses and other malware. I have an extremely good track record. And, frankly, I know far more about what software and services will help me do my job most efficiently than the IT department -- who may well be a bunch of good, knowledgeable, and clever guys (and whom I strongly suspect would, at least in private, admit or agree to most of what I say) with respect to their jobs, but do not have the specialized knowledge about What I Need To Do which is a necessary prerequisite for deciding The Best Ways And Tools To Do What I Need To Do. Yet here we are.

The problem is the "one-size-fits-all" IT policy. One size simply does not fit all. There should be tiered access and privileges.  There should be some way for me to qualify for greater access and privileges because I can demonstrate that I can use them sensibly and effectively. There should be appropriate ways for the less "digitally/informationally literate" to acquire appropriate knowledge and skills (if they need them, or would benefit from them) to likewise qualify for such access and privileges. After all, in other contexts, we would not just hand the car keys to any hominin who happens along, but ask them to take a course, or at least demonstrate compliance with some set of criteria suggesting it was OK to grant them a driving license confirming their automotive access and privileges. Give me the power to do what I do well, and let me get on with it. Work with me, rather than against me. I am not the enemy. I am, actually, exactly the kind of user that the IT guys would like -- I am pretty sure about this, having worked as an IT guy in the past myself.
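To make the "tiered access" idea a bit more concrete, here is a minimal sketch in Python. The tier names, actions, and thresholds are entirely my own invention for illustration -- any real implementation would live in the institution's directory services and endpoint-management tooling, not in a little script -- but the core logic is just this simple: each action has a minimum tier, and a user qualifies for a tier by demonstrating competence.

```python
# A minimal sketch of a tiered IT-privilege model.
# All tier names and actions here are hypothetical, purely for illustration.

from enum import IntEnum


class Tier(IntEnum):
    LOCKED_DOWN = 0   # default: no installs, no settings changes
    SELF_SERVICE = 1  # may update approved software, change own settings
    POWER_USER = 2    # may install from an approved catalogue
    ADMIN = 3         # full local admin rights


# Minimum tier required for each action (illustrative examples)
REQUIRED_TIER = {
    "change_desktop_picture": Tier.SELF_SERVICE,
    "update_approved_plugin": Tier.SELF_SERVICE,
    "install_approved_app": Tier.POWER_USER,
    "install_arbitrary_app": Tier.ADMIN,
}


def may_perform(user_tier: Tier, action: str) -> bool:
    """A user may perform an action if their tier meets the action's minimum."""
    return user_tier >= REQUIRED_TIER[action]
```

The point of the sketch is the policy shape, not the code: a demonstrably competent user gets bumped up a tier (the "driving license"), and the IT department stops fielding tickets for things that user can now safely do alone -- while the locked-down default still protects everyone else.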

Such tiered access would probably be a sensible and effective solution to the problems faced on all sides here -- and I am sure many other terminally frustrated persons besides myself face similar problems, in one form or another. At the very least, it could hardly be worse than the current situation. Alas, in my experience, convincing people to do sensible and effective things is quite a struggle in itself (unless you have a very Big Stick, which sadly I do not, or things would be different). But, well, we'll see.

Hope, as yet, springs eternal -- despite its better judgement.