01 July 2014

The music industry is dead (or at least pining for the fjords)? Long live music!

I can hardly believe one still hears grumbling and complaints about the "death" of music (or even, more properly, "the music industry" as such) at the hands of the Internet and the digital era. What year are we living in? 1973? For good or for ill, we are not.

In any case, "music" is far from dead. In fact, we are perhaps living in a moment of (at least potential) music creativity and expression unparalleled in human history. Never have our capacities to (1) make music and/or (2) listen to the results been greater for a larger number of people -- and we can reasonably expect (barring global demographic catastrophe, a possibility that we should not entirely rule out!) that trend to continue.  There is an enormous amount of good (and, of course, less good) music of essentially any style one can imagine (as well as quite a lot of styles one has probably not imagined!) being made all over the world, as well as numerous opportunities for finding this music that simply didn't exist before. Of course, there are often considerable difficulties of various kinds (legal, technical, informational, etc.) in terms of finding what you already know you want and then being able to listen to it readily ... but we shall see how all that plays out in the years to come.

What is "dead" is "the music industry" as it was known in the 1950s-1990s (essentially: that industry which revolved around pop music from Elvis through Nirvana). This could be seen as a simple consequence of things (technological, social, etc.) having changed faster than that industry itself, but it increasingly seems more like a curmudgeonly grumbling of of "Oooo, when I were a lad, things were different!". Despite the niche and somewhat "hipster" (not to say largely nostalgic or pseudo-nostalgic) resurgence of vinyl in recent years, the concept of a "record collection" is largely an artifact of a recording/distribution technology that depended on physical media. And, obviously, before records (vinyl, wax cylinders, CDs, whatever) existed, no one had such "collections" ... except, one supposes, of sheet music ... or of composers themselves (in the "collections" of rich patrons). (Many well-known classical composers of the 18th century or earlier were essentially producing on-demand for aristocratic patrons. Mozart tried to break out and make it on his own, but failed and died in debt. Slightly later, Beethoven got luckily with an expanding middle-class market for his music and was able to survive independently. This all sounds a lot like more recent cases we could probably point to!)

This is not to say that people are not still making money from business ventures associated with music. It seems likely that people will continue to consume music and to want to display their associations with music-makers that they like in some way. For the moment, people will continue to spend at least some money on downloads (or even physical media) of music they like, and if the current technological (and legal) limitations to all-streaming, all-the-time music were overcome (which we might well expect, eventually), they would probably spend some money on subscriptions to such services. Yet even if everyone had completely free streaming access to all music, they might still want to buy the merchandise associated with their favorite artists, see the live shows with other fans, etc. Touring and merchandise have already become the major revenue streams for perhaps the majority of working artists (some version of the "Grateful Dead model"?), and so the evolving music industry is going to be about building relationships with fans and generating a sense of community that encourages people to spend their increasingly limited and fragmented entertainment budget of both time and money on a given artist. In other words, if Artist X (and their other fans) interacts with me through social media in what I perceive to be a cool way, I will spend more time paying attention to what they are doing and be more likely to spend money on music, videos, T-shirts, fluffy slippers, concert tickets, etc. This has, to an extent, always been true -- but the ability for people to interact in this way has vastly increased in speed and scale, so it has become accordingly more important.

So though the days of towering rock and pop stars, sales of whose recorded music defined generations, may well be gone, we can probably expect that people will continue to make money from music (at least in the sense of live performances) and from merchandise associated with that music (be this band T-shirts, or soft drink adverts, or whatever). Moreover, just as you could get rich in the 19th-century American gold rushes not necessarily by finding gold but by selling picks and shovels to people who might or might not then find gold, there seems to be a huge and as yet not fully exploited market for selling music-making tools to an increasingly large audience (as the developing world becomes increasingly interested and accessible). I am unlikely ever to make back the money I have spent on instruments, software, and other equipment associated with music making, but I have nevertheless spent that money to make my own music -- as have plenty of others. (Thus the redefined direction of this blog.) Perhaps most importantly here, what you can achieve with the kind of musical technology that is increasingly affordable to many is quite astounding in comparison with what was available only a few decades ago.

So: Long live music! It is not only still being made and heard, but is perhaps in fact being made and heard in greater quantities, by a greater number of people, than ever.  I can, in my spare time, record music on my (rather aging and in need of replacement) desktop computer (and increasingly on my mobile devices) that sounds (with my artistic limits!) pretty good (I think!) in comparison with what required state-of-the-art facilities in the year of my birth (more or less midway between Elvis and Nirvana, I think!). I can moreover make that music available to a large percentage of humanity at little to no cost. Yeah, lots of people can do it better than I, and I can't make a living at it (though luckily I don't need to), but -- if you think about it -- that's still pretty amazing and awesome. :)

30 June 2014

A change is as good as a rest

It's been over 2 years since I posted anything to this blog. The truth is that a blog for academic purposes simply doesn't make sense for me these days. All my real academic thoughts get channeled into my actual work at the university -- and the pace of writing, presenting, etc. has been increasing. There's simply no time to duplicate that work in a blog.

In fact, I'm not sure that blogs have much purpose for individuals these days -- except perhaps for people using them as part of their private workflow, or who are promoting other aspects of their work (fiction authors, perhaps -- I'm not sure the paradigm works so well for researchers). Otherwise, the only blogs that seem interesting are those that have morphed into mini-magazines or that are focused on very particular areas.

Still, I think I will keep this alive for the moment -- though I will repurpose it for musings on my music hobby, and particularly on home recording, a pursuit sufficiently different from what I do for a living that I still make the effort to keep it up. So this will entail some minor redesign and whatnot, but I'll sort that out in the coming days/weeks.

I think, for the moment, I will leave all the older stuff here. A 2-year gap offers a pretty clean break to separate the new direction from the old!

18 April 2012

Useful Words from Joe Harris

Life has been too busy for the indulgent luxury of blogging, but the discovery that PDF versions of the various articles collected in "Speak Useful Words or Say Nothing": Old Norse Studies by Joseph Harris are freely available from Cornell's website is worth a quick post.

Joe Harris will need no introduction to those even peripherally connected with Old Norse studies, and this collection conveniently brings together a number of previously published articles -- and the convenience is multiplied almost infinitely by their ready and free availability as PDF downloads.

It's also worth pointing to a more recent bit of Joe's ever erudite and readable output:

This is one of a number of relatively recent papers (scattered over the past decade) that Joe has produced on the subject of the Rök rune-stone; few of these are readily available online, but they are well worth tracking down if you can. Taken together, they would have the makings of an amazing monograph on Rök, one that would stand worthily alongside other classics on the topic from the last century (and, incidentally, be the first such book-sized offering on the topic available in English). We can but hope .... ("hint! hint!", Joe, if you're reading! :))

Meanwhile, plenty of food for thought in the various papers of the Cornell article collection.

17 October 2011

CFP: Enduring Barbarism: Heroic Fantasy from the Bronze Age to the Internet

Man, do I ever look forward to video-conferencing technology becoming more widely and easily implemented in academic contexts, because there is no way that I would ever get funding (or be able to justify using my own money) to attend or present at what looks like an utterly fabulous (in all the positive senses of that word) conference:


Enduring Barbarism: Heroic Fantasy from the Bronze Age to the Internet
College of St. Joseph Popular Culture Conference
Contact email:
Dr. Jonas Prida
jprida@csj.edu
The inaugural popular culture conference will be held at the College of St. Joseph, located in Rutland, Vermont, April 13th-14th, 2012.
Proposal deadline: Dec 15th, 2011.
We are looking for a wide range of topics, figures, panels and cultural studies methodologies to explore the enduring figure of the barbarian in Western popular culture. Graduate students, established faculty, and independent scholars are encouraged to submit ideas. Possible paper topics:
the multi-faceted use of the barbarian in popular culture
rise and fall of heroic fantasy in the 1970s
comic book barbarism
heroic fantasy as a heavy metal trope
the gendered barbarian
explorations of lesser-known sword and sorcery texts
Italian sword and sandal movies
The barbarian’s future
We are actively interested in innovative panel ideas as well.
Please send 250 word paper proposals, 400-500 word panel ideas, or general questions to Dr. Jonas Prida at jprida@csj.edu

08 October 2011

Humanities, Liberal Arts, and Technology <-> Learning, Creating, and Problem-Solving

This post does not include the name "Steve Jobs" in its title. The 'Net is already awash in Jobsian tribute. So this is not really more of that. But I do want to quickly jot down some reflections with regard to one of the late Mr Jobs's relatively recent observations: that Apple's extraordinary success (all the more extraordinary as it came in the wake of Apple's near extinction) was largely founded not just on clever technology, but in fact on clever technology married with liberal arts, married with the humanities. In an age of economic crisis and (to steal a quote from former professor and musician Tom Lehrer) "international brouhaha" (as, admittedly, are perhaps most ages, but anyway ...), when universities are busily trying to cut corners and programs they view as "nonessential" -- and the victims here tend to be liberal arts and humanities programs -- it is perhaps important for everyone to take a deep breath and think as hard as they possibly can before shooting down programs devoted to the sorts of "nonessential" stuff that in fact underlies the success of what is currently (October 2011) the most valuable technology company in the United States (and was at least briefly the most valuable US company of any kind earlier this year).

Properly speaking, our word "technology", derived from Ancient Greek τεχνολογία (< τέχνη [< PIE *tek̑s- "hew, fabricate"] + -λογία suffix relating to "discourse" on, or "study" of, something [< PIE *leg- "collect, speak"]), should probably be understood as the study of how best to apply tools in order to solve problems -- though in my experience a majority of people skip the "study" and "apply" as well as (worst of all) the problem-solving aspects, and use "technology" as a kind of synonym for "magic tools which we don't really understand, but that we've been told are pretty awesome". (In relation to which, see for example this essay by the late, great Douglas Adams [a Cambridge man, of course!].)

Even though we see it endlessly repeated, and it should be just blindingly obvious anyway, technology is not an end in itself, nor is it a "magic bullet". Technology by itself solves nothing. Yet even people who talk this talk often have difficulty walking this walk. In contemporary educational circles (as in many other circles), there is a lot of buzz about leveraging ICT to extend the ways and means of helping more people learn more stuff. Even courses and seminars (of the sort with which, for example, my well-meaning but perhaps not terribly savvy university administration bombards its faculty) that dutifully remind educators that ICT tools are not strategies, and that one needs to know the best ways to apply them, gush equally with unfettered -- and largely unguided -- enthusiasm. "Blogs and Wikis and Social Media! Start using them in the classroom today! It will make everything better!" It's all not a little contradictory and incoherent.

To solve things, you need people with the right knowledge and skills -- and the right technology can help them acquire that knowledge and those skills more efficiently, as well as apply them more efficiently. You need to know what problem you have, you need to analyze the task that confronts you in solving it, and then you can choose -- or invent -- the tools that will help you do that.

It is widely stated that Mr. Jobs was gifted with an ability to solve problems that people did not yet know they had until presented with the solution (in the form of, for example, an iPhone and all that can be done with it -- or whatever). I would argue this is a close analogue to the ability that educators need. Learners -- perhaps especially early learners, but really all learners -- do not really know exactly what they will need to be doing in the future. Neither do their teachers, their parents, or their (possibly future) bosses. But whatever they are doing, it will require problem-solving abilities. Whether you need to figure out how to make a cup of tea or achieve world peace, that's problem-solving. In pretty much any sphere of human activity, you need to be able to identify what your problem or problems are (often more difficult than it would seem), you need to identify the conditions that will let you know the problem has been solved, and then you need to be able to identify and analyze the various tasks you might need to perform in order to move from the state of "problem exists" to "problem no longer exists". Since it's almost impossible (perhaps increasingly impossible) to predict exactly what problems today's learners will be wrestling with tomorrow (let alone in 5, 10, or 20 years!), we need to help our learners acquire that prodigiously Jobsian quality of being able to solve problems that they not only don't know they have, but that might not even exist yet. Let me point to a quote from the mighty Lemmy Kilmister (who is, now that I think of it, perhaps more like Steve Jobs than I would have otherwise casually thought -- but anyway ...): "Everything you plan is $h!t. All the good things that happen in your life are accidents." I would humbly suggest the following mild amendment to the Lemster's incomparable wisdom: Knowing how to make the accidents work makes the good things happen. Knowing how to make the accidents work is perhaps the highest level of problem-solving.

I had a discussion with a colleague recently in which the theme was basically: "We should throw away all 'product-oriented' education -- that is, education that focuses on acquisition of particular genre-defined knowledge or skills. All our educational processes should be about problem-solving. Genre-defined content and skills -- that is, say, whether you are nominally studying zoology or law -- can only reasonably be a vehicle to learn problem-solving skills. If it happens that you study zoology, and go on to become a zoologist, then great. But you might study zoology and go on to become a lawyer. In that case, your genre-specific knowledge and skills might be, at best, incidental to the problems you will need to solve (though they probably make you a better-informed, generally well-rounded sort of person, which is a result not to be denigrated!); but your more general skills and strategies in terms of task analysis and problem-solving will always apply."

In the widely cited (and justly so) speech Mr Jobs gave at the 2005 Stanford graduation, he famously recounted (among other things) how the experience of attending a calligraphy class (as an audited course at the university from which he had largely dropped out) ultimately affected (at least arguably) the development of what are now familiar aspects of everyone's daily computing experience. He noted that although what he learned in calligraphy class had no real particular application to his life at the time (it was just interesting and, clearly, motivating in some way), the experience of that class led him, ten years later, to insist that capacities for typographical elegance (proportional fonts, multiple typefaces, etc.; all of which, admittedly, in the wrong hands can and do lead to "design train-wrecks", but never mind that right now ...) be built into early Macs (not to mention the early Apple LaserWriters, which some may remember as having been pioneeringly ubiquitous, until eventually swamped by the likes of HP). Such capacities were, of course, then popularized to the point of absolute ubiquity when Windows cloned them from the Mac, leading to a whole industry of desktop publishing, and eventually e-publishing, that had not previously existed but that we now take for granted.

Small things? Sure, proportional fonts are not world peace. Some might argue that all this sort of stuff was "nonessential", or an example of "style over substance". Yet solving this problem that people didn't know they had did, in its way, change the world -- which would look rather different if we took away all the weird, supposedly "nonessential" things that Mr Jobs rabidly insisted on in his company's products. And it would be a much less cool, much more dehumanized world. Rather than "style over substance", we have here "substance with style": a consummation devoutly to be wished!

So the "nonessential" turns out to be, perhaps, essential after all -- in rather important ways, for rather important things.

We don't know what problems our learners will face in the future (and nor do they); we can't know what problems they will face -- nor where the inspiration to face those problems will come from. But we clearly cannot (or should not) say things like "Don't study Ancient Greek; it's useless" since, just as clearly, if someone had said something similar to Steve Jobs about his calligraphy class, they would have been horribly, horribly wrong. Calligraphy itself marries together concepts of technology, the liberal arts, and the humanities; it is at once artistic and practical: performing practical functions in artistic ways. Arguably, it was exactly that understanding and experience which led Mr Jobs to eventually demand that his company's products not only integrally incorporate concepts of technology, liberal arts, and the humanities in their design, but that their design should likewise facilitate the creation of new products reflecting the same integral marriage of those concepts. Or, in other words, it's a self-replicating humanistic philosophy -- or at least a philosophy amenable to self-replication and humanism.

Of course, many educators need no reminder about the significance of the arts and humanities -- especially if they themselves specialize in those areas -- but most of the wiser people in the sciences and technical studies readily recognize this as well, even as professors of literature will recognize the value of scientific literacy, etc. I am reminded of the way that my undergraduate courses at Harvard were organized: half of them were determined by my major, a quarter of them I was free to choose, and another quarter had to be on topics absolutely distinct from my major. (Consider carefully, O ye curriculum designers, the value of such an approach as mandated at one of the world's leading universities!) There's an excellent article from earlier this year on this topic (arts+humanities+tech) that got bandied about certain corners of the blogosphere (well, I guess spheres don't have corners ... but never mind!); another nice one here, and various others elsewhere. Stephen Fry's erudite and insightful tribute to Mr Jobs [Fry being, again, a Cambridge man, of course!] includes some cogent observations on the importance of the "human element" that Jobs always insisted be incorporated into Apple design.

Likewise, in considering the importance of understanding and making use of relationships between technology and the liberal arts, Mark Randall made the observation that the "next generation" is already keenly aware of this significance, and summed it up with the phrase: "Thinking will be now more important than knowing." I would, however, argue that thinking has always been more important than knowing. Obviously, some modicum of knowledge is effectively a prerequisite to thinking, but the key change in which we can rejoice is the relative ease with which knowledge can now be acquired. I could sit (and, now that I think of it, actually have sat) on top of a mountain in South America with my iPad and access a tremendous store of information from around the globe -- and I certainly hope that this situation only continues improving. Yet even if I can find information easily, I still need the ability to evaluate and analyze that information in terms of its relevance to whatever task I need to perform or problem I need to solve. That's all about thinking, and technology cannot really help us do that. (At least, not yet!)

If educators (and learners) in this so-called "age of technology" have something to take away from the example of Steve Jobs, it's that what we really need to be able to do (and thus need to learn to do) is solve problems -- whether or not we know what problems we need to solve, and whether or not the problems have even yet come into existence -- and that being good at this sort of thing requires (beyond, apparently, enormous passion, commitment, and not a little insanity) thinking. Particularly, problem-solving demands creative thinking; the infamous Apple slogan "Think different" is not just an appealingly clever bit of "geek chic" marketing, but actually perhaps foundational to achieving anything genuinely useful. (Cf. The Onion's all too believable headline.) The obvious and easy has already been done; it is thinking differently that allows us to move on to the obscure and difficult: the as-yet unsolved problems.

Technology can be an iPad, or it can be a stick with which to scratch symbols in the mud. Only powerful abilities in creative thinking will help us figure out how best to use our tools to solve our problems, and working with (not just learning facts about) materials and concepts from the arts and humanities -- everyone is, after all, human -- is a powerful means of helping learners develop that creativity. It would probably be for the best if the world were not entirely populated by a zillion Steve Jobses (where would we get enough black turtlenecks, after all?), but if more of us get better at solving problems -- even just the ones we already know we have -- that would probably be all right.



“Technology alone is not enough. It’s technology married with liberal arts, married with the humanities, that yields us the results.”
-- Steve Jobs, iPad 2 event in San Francisco, CA, USA, 03 March 2011


07 October 2011

Let my computer go!

I have developed an almost rabid antipathy for my institution's IT policies.

I suspect they are not unlike those of many other institutions or enterprises (especially in a world dominated by not-terribly-secure Windows tech and, frankly, a workforce that is curiously un-savvy about how to use their stuff); but never mind all that. I'm not interested in judging things from the standpoint of the lowest common denominator. Things shall either be organized the way I want, or they are not acceptable -- though I may well lack the power to do other than grit my teeth and provisionally accept them. Alas, my institution's IT policies fall into the latter category.

It's probably a familiar problem to the more "digitally literate" or "informationally literate" members of the population: basically, my institutionally supplied desktop computer is locked down in a fairly extreme way, giving me almost no control over things like what I can install or update -- or even whether I can choose my own desktop picture. Equally, a snake's nest of weird proxy servers often confounds my attempts to use my preferred browsers or email clients -- or even to configure extra email account access in Outlook (effectively the default since, yeah, obviously, it's a Windows shop here).

Now in a sense, this is of course all perfectly fair. It is, after all, not my computer; it's the institution's, and they do have the right to do as they will with it. On the other hand, having the right does not mean they should be exercising it -- or at least not in the rather ham-fisted way that they are.

Firstly, these policies certainly interfere with my ability to do my job (educator, researcher, and all that) effectively -- not least because they interfere with many of my institution's own stated objectives in terms of incorporating a wider range of ICT-mediated tools and materials into the educational experience. For example, I would be expected to try to incorporate appropriate videos from YouTube into course content. I have no problem with this; videos are cool, and there are plenty of useful (though, of course, far more useless) videos to choose from for my purposes. On the other hand, if Flash gets upgraded along with much of the content on YouTube, but I can't upgrade Flash thanks to my institution's highly contradictory IT policies, then suddenly I can't see those videos and have no easy way of fixing that problem.

Sure, if I could just ring up the IT guys and say "Please give me Admin privileges, at least until I've fixed this", and they would do it, then no real problem. But I can't do that. I have to ring or mail the IT guys, desperately request a fix, and then wait quite possibly for days or even weeks (during which time I must harass, harangue, and otherwise plead with them from time to time) for a tech with the appropriate magic powers to show up and Do The Stuff -- though, of course, usually I am granted the right powers only until the log-in session ends. That can be a problem if the tech appears at a silly time when my availability to then do stuff is limited (as, for example, after 6pm last night, thereby prompting this post!), or if the install itself requires a restart (as it not infrequently does), or if the electricity goes out suddenly (I probably should not use the word "unexpectedly" in this context: we're in South America, and the lights go out whenever some rain god or another sneezes), or ... you get the idea. For example, despite a Magic Admin Privileges Tech Visit last night (as noted), I absolutely did not have time (thanks to the stupid hour) to run all my upgrades (even those not requiring a restart), and so this morning, of course, my session had logged itself out (I can't control that either), meaning I am now back to Square One (at best) and need to start a whole new round of cajoling for a new Magic Admin Privileges Tech Visit -- only 12 hours after the last one! -- which could easily take additional days or weeks (not to say considerable time and effort) to achieve.

This is all enormously -- e-nor-mous-ly -- inefficient, ineffective, and counter-productive.

Of course, I do understand where they are coming from. I understand that they have to buy cruddy, easily broken and insecure services and systems because everyone is strapped for cash these days, and, of course, whatever problems y'all have off in the "developed world" in terms of budget, I can only assure you most earnestly that they are far worse here. (It's not like, after all, a computer costs less in South America than in Europe or North America. In fact, it quite possibly costs rather more; and there is, of course, much less money on hand with which to buy it.)  And I recognize that many of their employees -- not least many of my faculty colleagues -- are, for all intents and purposes "digitally illiterate", such that giving them unfettered access to their computers and systems would be the functional equivalent of handing the car keys to a drunken australopithecine. I understand all that.

But the "one-size fits all" IT policy is just wrong. Frankly. it just shouldn't apply to me, or to  people like me who actually know what they are doing. (A minority? Maybe; I dunno. But even if so, surely a growing minority?)  I am, after all, not a supergenius -- but I'm not stupid either. And I have been continuously using computers and computer-mediated communications and tools in both educational environments and the high-tech industry (not to mention at home) for some 25 years now. I am pretty knowledgeable even with regards to the poor, benighted Windows machine on my workplace (home is a Mac shop, of course) in terms of configuring, maintaining, and squeezing the most out of it without horribly messing it up or filling it with weird viruses and other malware. I have an extremely good track record. And, frankly, I know far more about what software and services will help me do my job most efficiently than the IT department -- who may well be a bunch of good, knowledgeable, and clever guys (and whom I strongly suspect would, at least in private, admit or agree to most of what I say) with respect to their jobs, but do not have the specialized knowledge about What I Need To Do which is a necessary prerequisite for deciding The Best Ways And Tools To Do What I Need To Do. Yet here we are.

The problem is the "one size fits all" IT policy. One size simply does not fit all. There should be tiered access and privileges.  There should be some way for me to qualify for greater access and privileges because I can demonstrate that I can use them sensibly and effectively. There should be appropriate ways for the less "digitally/informationally literate" to acquire appropriate knowledge and skills (if they need them, or would benefit from them) to likewise qualify for such access and privileges. After all, in other contexts, we would not just hand the car keys to any hominin who happens along, but ask them to take a course or at least demonstrate compliance with some set of criteria that suggested it was OK to let them have a driving license confirming their automotive access and privileges. Give me the power to do what I do well, and let me get on with it. Work with me, rather than against me. I am not the enemy. I am, actually, the exactly the kind of user that the IT guys would like -- I am pretty sure about this, having worked as an IT guy in the past myself.

Such tiered access would probably be a sensible and effective solution to the problems faced on all sides here -- and I am sure many other terminally frustrated persons besides myself face similar problems, in one form or another. At the very least, it could hardly be worse than the current situation. Alas, in my experience, convincing people to do sensible and effective things is quite a struggle in itself (unless you have a very Big Stick, which sadly I do not, or things would be different). But, well, we'll see.

Hope, as yet, springs eternal -- despite its better judgement.

01 September 2011

Book Note: Sturgis & Oberhelman, eds., The Intersection of Fantasy and Native America (2009)

A while back, I wrote a post musing on the value of multiculturalism as a key factor in creativity, with a special emphasis on the ways that authors of "imaginative literature" (e.g. fantasy fiction, science fiction, etc.) have, have not, or could draw on a range of cultural influences. One thing I specifically wondered was: "Are there any examples of contemporary fantasy fiction inspired primarily by American traditions?". I put forward a few tentative answers of my own, but it seems likely that such questions are answered in far greater detail by a recent (if not exactly "new") book upon which I stumbled in my virtual travels: Amy H. Sturgis and David D. Oberhelman, eds., The Intersection of Fantasy and Native America: From H.P. Lovecraft to Leslie Marmon Silko (Altadena: Mythopoeic Press, 2009); the book is also listed on Amazon.com, though apparently out of stock at the time of writing.

Alas, I haven't got a copy -- they don't exactly stock this sort of thing at the local bookseller in Chía, Colombia -- but international book orders don't usually go missing on me, so perhaps I can get one eventually. For the moment, I will have to be content with the promise held out in the publisher's book-blurb -- which is, admittedly, pretty enticingly promising:

A number of contemporary Native American authors incorporate elements of fantasy into their fiction, while several non-Native fantasy authors utilize elements of Native America in their storytelling. Nevertheless, few experts on fantasy consider American Indian works, and few experts on Native American studies explore the fantastic in literature. Now an international, multi-ethnic, and cross-disciplinary group of scholars investigates the meaningful ways in which fantasy and Native America intersect, examining classics by American Indian authors such as Louise Erdrich, Gerald Vizenor, and Leslie Marmon Silko, as well as non-Native fantasists such as H.P. Lovecraft, J.R.R. Tolkien, and J.K. Rowling. Thus these essayists pioneer new ways of thinking about fantasy texts by Native and non-Native authors, and challenge other academics, writers, and readers to do the same. 

I'd also like to point to an interesting interview with one of the editors, Amy H. Sturgis, here. Good stuff!