Monday, March 24, 2014

Pop Between Realities, Home in Time for Tea 77: Snapshots of 2010

No Future

Near future prediction is always a rough game. The 1984 film 2010, an adaptation of Arthur C. Clarke’s 2001: A Space Odyssey sequel, managed a double error that almost perfectly defines the way in which the classic science fiction vision of the future fizzled. It’s important to note the sort of endearing hubris of the film in the first place. In 1968, when 2001 came out, the future was thirty-three years off. By 1984 the future was only twenty-six years off. There was still the resplendent belief in the onrushing scientific utopia that the culture had promised - a utopia framed as the inevitable historical consequence of the then-present. Hence 2010’s amusing double error - the belief that humanity would be on the verge of going to Jupiter and the belief that the Cold War would still be rumbling on.

Instead the American space program spent 2010 winding down, each of the three surviving Shuttles making their penultimate flights. The Cold War is dead and gone. Indeed, the entire apparatus of the future is long since gone, having quietly disappeared into the rear view mirror. Time memorably spent the tail end of 2009 declaring the aughts to have been the “worst decade ever,” and while this should be taken as Comic Book Guy style hyperbole with periods after every word, the case was compelling enough. A decade-long war on terror culminated in a massive financial crash and the growing realization that electing a black dude President of the United States was not actually going to fix everything.

Even the replacement future, hastily drafted in after the original draft withered on the vine, was floundering. It’s not that the Internet was underperforming as such, but its transformative promise was somewhat less than advertised. What had started as revolutionary technology was now just Facebook.

The future had never really been built past the change of the millennium, and without that to anchor it there was just a vast, featureless present. Even technological advancement had become oddly fixed and predictable. The future came in regularly scheduled product launches headlined by Steve Jobs, incremental, predictable, and mostly leaked to rumor sites a few weeks in advance. It was not even a hugely pessimistic historical moment. Rather it was one captured by a grim banality. Nothing changes. Humanity sits past the ghost point, all the ideas and storming progress of the twentieth century ground to a halt. For a hundred years we had assumed we were building to something - that there was some sort of stable endpoint that all the churn and upheaval pointed towards. Whether that endpoint was utopia or armageddon was up in the air, but the broad idea that history was going somewhere wasn’t. Instead, however, the wheels simply fell off, and the tow truck never came. We sat, abandoned on history’s roadside, left thinking, “Is this all there was?” And, seeing no sign of civilization, we tweeted about it.

Tea and Coalitions

It was pretty clear that Labour was toast. Their only lead in the polls came in the immediate aftermath of Gordon Brown becoming Prime Minister, and as Brown did not call a snap election he was, in effect, doomed, not least because he had the charisma of a dead fish and a nasty habit of accidentally getting caught on microphones calling people bigots. This was hardly surprising: Labour had been in power thirteen years, and that’s a hard streak to maintain. 

What was interesting, in the lead-up to the election, was that nobody seemed particularly enthused about David Cameron either. Indeed, for much of the election the hot story looked for all the world like Nick Clegg and the Liberal Democrats, who, on the back of Clegg’s performance in the UK’s first televised leaders’ debates, were polling ahead of Labour at various points, and threatening to snatch the entire election. In effect, Clegg was successfully executing the Barack Obama playbook, offering younger voters a sense of an alternative to tired political gridlock from the two major parties. In the wake of the expenses scandal, the message seemed to play well.

But a strong performance in the final debate from David Cameron and a fear campaign over the instability a hung parliament would supposedly cause mostly pulled the Lib-Dems back to Earth, and in the final election result they finished only 1% up on their 2005 performance, actually losing five seats due to the vagaries of the UK electoral system. (Buy any Lib-Dem a drink and they’ll explain with vigor.) The final result on May 6th had the Conservatives with just 36.1% of the vote, Labour with 29%, and the Lib-Dems with 23%. The result was, in fact, a hung parliament, and after five days Cameron formed a coalition government with the Liberal Democrats. Not to peek ahead excessively in the narrative, but on the whole this coalition proved disastrous for the Liberal Democrats, who got very little of their agenda enacted and saw their polling support collapse, in large part because of the understandable anger of people who voted for a left-wing party and got a Tory Prime Minister supported by the very party they voted for.

Six months later, in the United States, the midterm elections for the House of Representatives and Class Three of the Senate took place. The Democrats managed the impressive feat of losing six Senate seats, making it four elections running in which they had lost seats in Class Three, and got completely slaughtered in the House of Representatives as a wave of extreme conservatives united under the loose banner of the Tea Party entered office.

In both cases there is something of an official narrative of events. The US elections were supposedly a reaction to the Affordable Care Act, aka Obamacare. The UK elections, despite the failure of the Conservatives to actually attain a majority, supposedly showed a mandate for austerity policies and belt-tightening. This is, after all, the way these things work: an election must always, in the master narrative, display some shift in the public’s desires and opinions that is then realized by the government in power.

In practice, this is nonsense in both cases. The US elections were determined not by any sizable number of people who had previously been Obama supporters defecting to the opposition, but by changes in turnout. Obama was elected by young and minority voters who historically have a lower turnout in the midterm elections. In 2010 the voters who turned out were older and whiter, and they voted as they had in 2008. Nobody’s mind had changed. Similarly, in the UK, a full 52%, an outright majority, voted for one of the two left-leaning parties, versus just over a third for the Conservative party. As has been the case in every single UK election since 1935, the left-leaning electorate is larger but split between two parties.

It is, of course, not that simple. A democracy doesn’t function on votes not cast, and the choice between Labour and the Liberal Democrats is a real one, even if both are to the left of the Conservatives. But while this may not be a narrative that justifies forming a leftist government, it is at least no more dishonest than the narrative of a rightward turn in either country. In both cases the basic result was the same: the political right gained power considerably out of proportion to their actual support, and used it to enact policies that had the effect of ending the Great Recession for the extremely wealthy while deepening its effects on the majority. The frustration, unsurprisingly, was palpable.

The Fame Monster of Peladon

In pop music, meanwhile, the news is Lady Gaga. 

It is not that Lady Gaga is the first visibly performative and manufactured pop star. She’s not even the first to wear her plasticity on her sleeve, nor the first to anchor well-produced dance-pop anthems in a heavily performative aesthetic. Yes, she takes it to an impressively complete level, rejecting the entire idea of a personal identity separate from the pop performance, but it’s still a straight-up lift from the Madonna or David Bowie playbooks, played with the ruthless artistic consistency of Kraftwerk or Laibach. Nevertheless, it is worth looking at her in this precise historical moment, this being, in effect, Peak Gaga. The key single at this moment in time is of course “Bad Romance,” which seems in hindsight to be set to go down as her most enduring and iconic number. 

The song itself is a straight-up bit of uptempo dance pop. It has more than enough hooks to snag in your head, and manages the useful feat of having multiple memorable segments that it can cycle among so as to be catchy without wearing out its welcome. You’re just as likely to end up stuck on the chorus as on the nonsense syllables. The subject matter is one of the great classics of pop music: this relationship is awful and I love it so much.

None of this explains its impact. This is a single that is unapologetically driven in a large part by its music video. It is not that music videos ever died as an art, but there was a vast fallow period between “the days that MTV played music videos” and the point where YouTube took over the function, and Gaga was one of the first artists to exploit their return emphatically. 

“Bad Romance” is not her first music video, but it is the first of her videos to foreground one of the basic problems that faced Stefani Germanotta in becoming a pop star, which is that her face falls significantly outside the narrow band of what is considered normatively and conventionally attractive. In much of her earlier work, videos included, this fact is carefully elided, either through substantial use of makeup or through masks of various sorts (including her signature oversized glasses). But in the video for “Bad Romance” she instead appears made up to further emphasize the sense of physical strangeness. Indeed, the video goes further, blending Lady Gaga’s wide-eyed and unsettling visage with latex-clad monsters dancing spasmodically. This image of monstrosity is, of course, central to the project - the album “Bad Romance” appears on is called The Fame Monster, its title mutating Gaga’s first album, The Fame, into a new form.

The resulting aesthetic is not merely pop spectacle, but pop monstrosity, in the classical sense of monstrousness as something meant to be gazed at and looked upon. But every previous version of the stunningly plastic pop star had been based on a sense of ever-shifting identity: the question of what character the artist would play on a given song or album. Lady Gaga moves beyond that - there is nothing but the raw spectacle. Pop becomes an object intended to be approached as the Other, defined by its very strangeness. The historical moment of Lady Gaga is brief - her next video, for “Telephone,” doesn’t have nearly the impact despite being, in most regards, a better song and having Beyoncé as a collaborator - and her albums have seen declining sales since The Fame Monster. But the point is in many ways less Lady Gaga’s enduring presence than the new relationship with pop that she augurs - one in which we finally treat culture as a monstrous entity to be looked at from beneath as it looms over us, unapproachable and formless.

The Avatar of Inception

Science fiction, at least, is doing fine. The end of 2009 marks the release of James Cameron’s long-awaited followup to Titanic, the sprawling sci-fi epic Avatar, while 2010’s summer is anchored in part by Christopher Nolan’s Inception. Both are interesting films from a certain perspective, in that they are films concerned with questions of identity. Avatar takes a fairly straightforward story of postcolonial aliens and adds to it a whole “becoming the Other” plot that ends in a charmingly posthuman species change. Inception, on the other hand, is an astonishingly well-made execution of the concept of a dumb person’s idea of what a smart movie should be like that attempts to get “it was all a dream” to actually work as an end-of-film reveal.

Neither is a particularly good movie, although neither is a bad one either. They are simply middling movies. And yet both were very successful at the time, which gives at least a flavor of the moment. As suggested, what’s interesting here seems first and foremost to be the obsession with identity and the self. The very notion of “I” is in flux. This is not a new theme, of course - science fiction has been playing with it for decades.

But here the notion of identity is bound up in big-budget, special-effects-based spectaculars. Whatever the films’ conceptual territories might be, both were marketed largely through the existence of their impressive and innovative special effects. They invite their audiences to gaze at a big spectacle, as with much of the culture at the time. But the spectacle is oddly self-denying. To gaze at the special effects of Avatar or Inception is to enter a world in which identity is not so much questioned as rejected. These are not films that explore the frontiers of experience and self, after all. The fragile nature of identity is just taken for granted in films that fit smoothly and uncomplicatedly into other genres. Inception is a heist film. Avatar is countless sci-fi films. It’s just that they’re standard-issue films from which we have been pushed out.

The implication is that our stories no longer need us. That they can get by perfectly well without us. We are extraneous even to our own cultural objects, which have taken on lives of their own. Tellingly, the other major sci-fi film in this period is Iron Man 2, the point at which Marvel’s movie division really revved up its seemingly unstoppable juggernaut of churning films out of pre-existing properties. Film threatens to become a nearly self-sustaining medium, imbued with enough depth and concepts to simply retread existing ground over and over again. Narrative without people. Having killed the author, we’ve now killed the audience. There is only the endless desert of text itself, wandered by nobody. Only ghosts, with nothing left to haunt but themselves. 

106 comments:

  1. her next video for “Telephone” doesn’t have nearly the impact

    I haven't finished reading the article yet, but I do take some issue with this (though your analysis of gaga as a whole is excellent). You're not wrong, but I think there's more to the story. I don't have any empirical data to back this up, but my memory is of Bad Romance being a somewhat unexpectedly exciting event, but Telephone, perhaps as a result, being the massive, hyped-up, excessively anticipated blockbuster. We actually postponed the beginning of a Heidegger lecture in our Culture and Poetics class to watch and deconstruct it on the day it came out. If we're talking about the lasting impact of the songs/videos/pop cultural events, I agree with you. But, within that moment, Telephone was The Empire Strikes Back.

    It was also quite a big step in the evolution of that most ubiquitous modern pop star/event, Beyonce. Her already massive cultural cachet exploded in a slightly different direction thanks to the collaboration - she'd always been 'beloved', 'talented' and 'critically acclaimed' but she finally hit on 'hip' and 'edgy', which has sustained her career since, culminating in her latest album release. You can see it pretty immediately on her next video after Telephone, Why Don't You Love Me?.

    Replies
    1. I don't know how good VEVO views are as empirical data, but using them Bad Romance pummels Telephone by a factor of roughly 2:1. You're absolutely right about the anticipation around Telephone, but it didn't translate into views in the same way. (Though I don't know how many of BR's views are from its initial impact and how many have accumulated since - it's still one of her foundational songs in a way Telephone isn't.)

      There were definitely pre-Bad Romance hints that Gaga was moving in the "grotesque" direction the main post talks about. One of the videos - "Poker Face" I think - has her dancing rather astonishingly, making movements (particularly hand movements) which are deliberately at odds with the rest of the video's movement (and compared to most pop video dancing): there's an uncanny aspect to it, it was the point where a lot of smart pop critics thought, hold on, something's up here. (Not including me, alas, I was focused on the quite boring song.)

    2. "monstrous" not "grotesque", sorry.

    3. I've never seen the video for "Bad Romance." I had "Telephone" on my telephone for a while, so I could watch it at will.

    4. Even more than Poker Face, I think the 'Paparazzi' video was the real precursor to the fame monster stuff... also one of her best songs on that patchy first album

  2. Inception, on the other hand, is an astonishingly well-made execution of the concept of a dumb person’s idea of what a smart movie should be like
    This.

    Inception is a heist film.

    ... and this. It's not some probing, philosophical treatise on the nature of dreams. It's a very good heist film, but hardly even a superior one. It's a real pet peeve of mine when texts attempt to explore dreams and fail to present anything resembling the experience of dreaming. All action films operate according to an arbitrary, though mostly internally consistent, set of rules. They all end with these rules being broken in some way (a miniature narrative collapse?). Just because you've named your rules after concepts relating to dreams doesn't mean you've made a film about dreaming.

    Inception is a very good heist film, but a deadeningly, infuriatingly, unimaginative dream film.

    Replies
    1. Personally I never thought of Inception as a film about dreams at all (or a heist movie). It is a film about films (or more generally, stories). The main characters are directing a film about Cillian Murphy's character, about him finding peace in his problematic relationship with his father. They even talk about this during the planning of the “heist”, about bringing catharsis and that a positive emotion is stronger than a negative one. They construct a movie with Cobb as the director, Ariadne as the set designer and so on. This movie within a movie even follows a classic five act structure in five different levels of the dream: he’s kidnapped on the first level with his uncle and asked for the code of the safe. On the second level Murphy’s character is told that his uncle is the one who arranged the kidnapping in the first place and they decide to turn the tables around. They go to the third level, travelling in his uncle’s dream. During the predictable action scene, all hope is lost as he gets killed. Cobb decides to save him by going to the fourth level, which results in his self-sacrifice. Murphy’s character returns to the third level again, opens the safe and can finally talk to his father. His story ends with a catharsis.

      Inception thus shows us all the beats of a simple drama story. A father and son reconciliation according to a classic structure. We know this to be a lie. His father never actually cared about him and yet emotionally that scene still works. Inception dissects the workings of plots, admitting all the while that it is all just a big lie. Which is reflected in Cobb’s personal story. He could never accept his wife’s suicide, always unsure whether he was dreaming (or you could say, if he was in a film) or if he was awake. This leads to that famous last scene of the top spinning. The most important thing about that scene isn’t that it keeps spinning, it is the fact that Cobb isn’t interested anymore in whether it stops or not. He’s off in the background with his family, happy again. Whether it’s real or not isn’t important for him anymore. He’s seen Murphy’s character be far happier when told a lie, a story. Why should he not embrace his own personal story?

      As a wise man once said, we are all stories in the end.

    2. Ugh! This is like Forest of the Dead, where a very good, and probably fairly obvious, reading of the text is beyond my reach because of my pet peeves (bad dream logic, Joseph Gordon-Levitt)

  3. (Long time reader, first time commenter)

    Well. I don't know if this was a deliberate stylistic choice so as to bookend the blog, or this is just the most you've had to say about an era of Doctor Who history in a while, but this felt very "Dr. Sandifer does 1960's Who" Eruditorum. Which is good, because I really loved that era of the blog.

    I am very excited for the Moffat years.

    Replies
    1. This. Except for the first-time commenter bit.

  4. Oh, I see what you did there. Nice preamble.

    Sidenotes:

    I take "Inception" to be a companion piece to "Memento." Not so much about 'it's all a dream', although the film is actively, repeatedly reminding the audience of its particular dream rules. Rather it is about how it engages its audience via narrative/apparatus. That is, both films feature intricate time compression/expansion through editing techniques. Both films require the audience to pay attention and remember the explicit cause-effect and time-space shifts in order to create tension.

    In "Memento," the main character "forgets," and we bear witness to a narrative folded in half and twisted about back to front at regular intervals. In "Inception," the main characters "dream," and we follow them "dream within a dream within a dream within..." In both films, the events preceding and succeeding each edit are not so much one thing causing another in sequential order as each affecting the other, akin to string theory.

    Discontinuity is an attack, as someone said, on traditional narrative and thinking.

    In short, I wish the whole movie was like the bridge "sequence."

  5. Isn't it interesting that any film or TV series set in the future is often seen to have "got it wrong" when that future arrives (see "Lost in Space", most '60s and '70s Dr Who, "2001", "2010" and "2012"), but any film portraying an alternate future that could have been ("Fatherland", "Confederate States of America", "It Happened Here") is lauded?

    Yet both of them are essentially starting from a point of history and extrapolating forward, the difference being of course that one begins from the present day, while the other begins from the past.

    Both "Fatherland" and "2010" tell a story that can only take place in the political and technological climate that the films are explicitly set in. Neither will work in the "real" 1960s or the "real" 2010. But for some reason one film makes us wince whenever the "Soviet Union" is mentioned while the other elicits our praise and enjoyment whenever we hear the words "Fuhrer" or "Reich".

    Personally I find the first episode of "Lost in Space" much more enjoyable if I just step sideways and watch as if I'm viewing a knowing alternate history saga in which hyperspace travel existed in 1997.

    Replies
    1. Back to the Future Part II has one more good year left in it.

    2. Terry Pratchett once observed that the weirdest "This is not my future" moment in the original 2001 is the videophone booth. Not because it's a videophone, but because it's a phone booth.

  6. Having killed the author, we’ve now killed the audience.

    Thank you for summarising in one line something that I have been trying to articulate in conversations with people for several years now.
    Whilst I do not think that there are no new ideas left (after all, physics was completely understood in 1900!), there is certainly a sense of the swirling vortex around which we are circling in smaller and smaller revolutions - at least in the mainstream media, and by that I do include things like "literary" writing and "arthouse" cinema.
    And yet I do think that there is a decent argument to be made that the interesting work in narrative is happening in other places as a result of the rise of interaction as a component in the structure, which mainstream media is particularly badly suited to dealing with. [I will stress that I realise that audience participation is hardly a new thing, but the ways it is being explored do feel a little different.]
    And whilst even that is horribly constrained by the form - most video games are still linear structures with a beginning, middle and end, after all, although to be fair, much of that is due to the malign mainstream influence - the sense that this might be some sort of escape route from total narrative collapse does tantalise me, however illusory it may be.
    After all, Charlie Brooker nominated Twitter as the most significant video game of the last thirty years and I think he may have underestimated it - it may be the most significant narrative advance for thirty years as well.

    Replies
    1. Twitter ... may be the most significant narrative advance for thirty years as well.

      Would you mind elaborating on that?

      As far as I can see, twitter mostly seems to function as a really efficient threat/insult/trivia delivery platform. It also does a great job as a content generator for lazy journalists ("Twitter storm fury over tweet").

      Now, obviously there is more to it than that, but I am curious as to why people think it is such a significant milestone.

    2. But that's what a narrative medium does - it is a delivery platform for threat/insult/trivia (with a side-order of emotional blackmail and/or action set-pieces depending upon the medium.)
      What makes Twitter interesting is that it blurs the line between external observation and internal participation - as a user, you get to choose which stories you are going to be a part of (or even start), and how far you want to be a part of them or whether or not you want to take them somewhere else entirely, which makes it different from other internet-based narrative forms which are generally trapped by an imposed narrative, even when that narrative tries to be subversive (like an "alternate reality game".) And because there is no imposed narrative that requires a recognised story structure, you end up with something that defies conventional analysis whilst still conforming surprisingly tightly to recognised patterns (like the various statistical analysis papers on trending hashtags.) And that's without considering the nonlinear nature of it, in which the various story strands cross over each other unpredictably.

      The "lazy journalists" thing is probably a red herring - they are just observational reports, although sometimes they may contribute as well. And attempts by people to constrain Twitter into traditional forms (like the Romeo & Juliet experiment etc.) have generally demonstrated that it needs its own forms to work.

      The fact that I have resigned myself to the realisation that I am too far gone to be able to adjust to this sort of multi-stream freeform doesn't stop me from appreciating the potential that it has for changing our understanding of narrative.

    3. Thanks for the reply. I'm still not sure I grasp how anything approaching a meaningful narrative in a story-telling sense can emerge from an interface like twitter. Mind you, I hardly ever use it so I expect that I'm both unable to adjust *and* unable to conceptually appreciate the potential for narrative change as well.

    4. "And yet I do think that there is a decent argument to be made that the interesting work in narrative is happening in other places as a result of the rise of interaction as a component in the structure, which mainstream media is particularly badly suited to dealing with. [I will stress that I realise that audience participation is hardly a new thing, but the ways it is being explored do feel a little different.]"

      I had a discussion with a friend lately who is really into theater about this very subject except in terms of in-person experiences. I think the rise of social media, the need to feel immersed in an experience, and the idea of collectively creating a story has led to some fascinating experiments with non-linear storytelling. For example, the Alternate Reality Games that accompanied media from this era, like Jane talked about in her Lost essay. Those are all about the fictional invading the real world and creating a sense of being a character yourself within a fictional property. Similarly, there's a rise in interactive theater beyond the "audience guesses whodunnit" sort, such as Sleep No More (http://sleepnomorenyc.com/), where the audience physically wanders through a surreal production of Macbeth. Increasing numbers of group participatory art-like events like Improv Everywhere (http://improveverywhere.com/) also reflect this impulse. While none of these examples has had a huge cultural impact on its own, I think they are reflective of a certain tendency that's growing in the culture. In reaction to this "death of the audience," people want to be part of and make the story themselves more than ever rather than just receive it. (Not to say that receiving it isn't still popular - the Marvel movies have made a ton of money and I know I'm probably going to see at least one of them this summer.)

      In terms of Twitter, I think the key distinction is between narrative in a fictional/non-fictional static form and narrative as a personal story we tell about ourselves to ourselves and others. Twitter - and social media in general - blurs the line between the two, making the personal narrative far more performative than it had ever been before.

    5. I write (and sometimes play) what is sometimes called "Freeform Live Action Role-Playing" (in the US it's called Theater-Style LARP) in which all the participants are also characters, and their interactions shape the flow of the eventual story. In essence, you are given a character with a more or less detailed background and some secrets and are told to go and "be" that character for a few hours, interacting with the other characters in both social and plot-advancing ways. (I didn't mention this in my original post because it's a niche part of a niche hobby... kind of like puzzle hunts are within the wider puzzle hobby!)
      But both players and writers are acutely aware that we are still operating within fairly tight constraints (although clearly the writers need to be able to let "their" story go crazy sometimes) because as individuals we know that there's a reason why certain narrative tropes exist and thrive.

      Twitter, on the other hand, is not bound by those same constraints - a person can be both real and fictional at the same time without there being a particular problem (except for those who are unable to make the distinction.) Stand-up comedians with an ambiguous stage persona are perhaps the easiest reference point for this, and look at how difficult some people find it to understand something that is as ostensibly simple as that.

    6. I've read about theater-style LARP - it looks so cool! I'd love to participate in one of them. I recently started role-playing with a GM that allows for much more fluid gameplay than my previous GM and just that change makes for a much more interactive and thoughtful experience.

    7. Can I urge you to give Fiasco a go if you get the chance. It's a sort of mid-point between tabletop RPG and LARP and whilst it is certainly a little extreme in approach, it's also a brilliant piece of design that showcases what can be done without a GM.

      (Yes, I am Mr Unknown who was posting above.)

    8. I'll second the Fiasco recommendation. Especially if you can play it with improvisers. I nearly injured myself laughing.

    9. "I had a discussion with a friend lately who is really into theater about this very subject except in terms of in-person experiences."

      When I was a young man I went to an art exhibition at the Euston Tower in which we were sent up in a lift at half-hour intervals, to wander through a seemingly-deserted office complex as the evidence of something much stranger occurring began to accumulate around us - computer printouts, living angel-shaped statues.

      Also, the Punchdrunk theatre company are known in London for their immersive, site-specific works. The fine line between installation and theatre.

    10. I've worked a lot with interactive, site-specific theatre experiences as writer, performer, director etc. Our inspiration came as much from the DIY ethos of punk and the impromptu venues of rave culture as from classical drama. One of our most successful shows was This Rough Magic, in which we turned a section of Brighton (England) beach into Prospero's island in a loose adaptation of Shakespeare's The Tempest. The audience were free to explore, and what you witnessed depended on where on the beach you were: Caliban forming out of pebbles from the beach itself, the shipwrecked sailors coming ashore on life rafts, and so on. There are many companies doing similar work in the UK and Europe; one of the most interesting is 'dreamthinkspeak', whose version of Chekhov's The Cherry Orchard, presented across several floors of an abandoned art deco department store, was extraordinary.

  7. Avatar - ugh.

    I have never seen such a piece of dross. A film whose message is that it is better for human civilisation to collapse (for that, we are told, is what will happen) and leave the Noble Savages unmolested than to find any kind of compromise. Inception was a very fun film though. Looking forward to Wednesday!

    1. Ditto! I watched Avatar as a part of writing an Art Theory piece condemning 3D film as an inartistic medium for narrative purposes. I got two things out of the experience (besides fodder for my essay):
      1. A splitting headache the size of a James Cameron Blockbuster,
      2. A resolve to never watch another Cameron film, or any film in 3D. I have only broken this anti-3D resolve to watch Day of the Doctor in theatres, which was enjoyable in spite of the 3D.

      As for 'Papyrus: The Movie'...er...'Avatar', it just lends further credence to the saying "No one ever went broke underestimating the intelligence of the American public". An insultingly dumb film, and ugly--both morally and aesthetically--under all its spectacle.

      As other people in this comments section have mentioned, trying to make Inception into some intellectually deep film about dreams and reality is to mistake the nature of the film. Sure, it fails on that level, but so does Doctor Who when people think it's meant to work as a sci-fi blockbuster. I think people have also been thinking too hard about the ending, which is, first and foremost, a brilliant bit of anti-humour. The film has set up the totem as the one item that will tell you if the characters in the film are in a dream or not; we are shown the totem spinning, told by the camera that it is *really* important if it falls or not, and then--just as the totem is starting to wobble--we cut to black.

      It's a joke at the audience's expense, and the people in the theatre when I saw it all laughed aloud because we knew we'd been had. We were all leaning in close to listen to the *terribly important* thing that this summer blockbuster is about to tell us about the nature of reality...and the film basically says "you're taking this too seriously." A great move, in a solidly enjoyable film. I agree it's not a great film on the order of "Lawrence of Arabia" or something, but it is a fun film, and solidly in the "good film" camp, which puts it above 95% of summer blockbusters, especially in the last few years.

    2. I thought pretty much every sequence in Avatar involving machinery (particularly the first, I dunno, 15-20 minutes? I've only seen it once so I don't remember when the Smurf part starts) looked absolutely amazing, one of the only worthwhile uses of 3D in any film I've seen. And more importantly, it looked to me exactly as I'd want a mid-twentieth century science fiction novel to look. I got really excited, thinking, finally, FINALLY we can make pretty much any SF classic you want to name into something that looks the way it's supposed to look.

      Then I started nodding off as the film morphed into a Disney movie. So much potential in those human sequences. Imagine if they HAD made The Word for World is Forest. We'd all come out of the thing slitting our wrists, but from a pure aesthetic viewpoint, holy shit!

    3. A film whose message is that it is better for human civilisation to collapse (for that, we are told, is what will happen) and leave the Noble Savages unmolested than to find any kind of compromise.

      It doesn't seem to me to be saying that. What it says is that the humans, through their unwillingness to compromise, force the choice between the two bad outcomes, and given those options the movie quite justifiably takes the side of the defenders against the aggressors. It's a deeply moral movie, I think, in that it stresses the duty of soldiers to disobey unjust orders. My main problem with the movie is narrative: every twist in the plot is utterly predictable. More here.

    4. Okay, different impressions. I might be misremembering because I never want to see it again; I seem to recall that the voiceover (from Sully?) at the end says that humanity is pretty much doomed, serves them right. I don't think that is good or justified, no matter how shiny happy wonderful the natives are.
      Disobey evil orders is a great message, but I didn't get that message from Avatar particularly strongly. I got "resist cartoonishly evil people", and "join the Noble Savages".

    5. Well, the militarist Republicans seem to have gotten that message from it. They screamed their heads off about how un-American it was.

    6. Politically sensitive people tend to do that, based solely on the accent of the actors involved. The RDA is a futuristic interstellar corporation from Earth, not America; if it had been a British film (or even just one with an all-British cast) then I'm sure the Republicans would have been fine with it, and might even have made knowing comments about how the film was a subtle commentary on British Imperialism.

    7. I went into the theater expecting a textually vapid vehicle for awe-inspiring visuals and was not disappointed. (I happen to enjoy 3D movies anyway, though.) I even considered going to see it again, but I just couldn't stomach sitting through the "tribal ceremony" scenes again.

  8. "Narrative without people. Having killed the author, we’ve now killed the audience. There is only the endless desert of text itself, wandered by nobody. Only ghosts, with nothing left to haunt but themselves. "


    And I feel depressed now. :(

    Fortunately, the perfect antidote to this is imagining Matt Smith showing up holding a mop and wearing a fez... ;)

  9. The Marvel Avengers series has largely felt as if the creators realise that 'let's make money out of our IP' is not actually sufficient reason for a film to exist (Exceptions: Hulk. Thor: The Dark World would also be an exception if Tom Hiddleston and Chris Hemsworth playing opposite each other weren't sufficient reason.) On the other hand, everything I hear about DC makes me think they think it is sufficient reason.

  10. "the Liberal Democrats, who, on the back of Clegg’s performance in the UK’s first televised leaders’ debates, were polling ahead of Labour at various points, and threatening to snatch the entire election."

    Sadly not. The Lib Dems *were* polling ahead of Labour for much of the campaign, but we never had a hope in hell of winning the election, because the other two parties have concentrated areas of power that work for them in FPTP elections, whereas we don't.

    " in a large part because of the understandable anger of people who voted for a left-wing party and got a Tory Prime Minister supported by the very party they voted for."

    Actually, if you look at the angry people, most (though certainly not all) never voted for the Lib Dems at all, but for Labour. It's a version of the well-known thing where far more people say in polls that they voted for the party in power than actually did.

    As for the austerity agenda, sadly the result of the 2010 election made literally no difference to that. Labour's chancellor, Alistair Darling, said before the election that cuts "worse than Thatcher" would be needed, the Lib Dems talked about "savage cuts", and the Tories... are Tories. In fact the current government has cut slightly *less* this Parliament than Labour were planning to (if one can believe their manifesto).

    You keep making two claims -- firstly that Labour is to the left of the Tories (it isn't, and hasn't been for decades. Both have near-identical economic policies these days), and secondly that people voting for the Lib Dems (who *are* a left-wing party) were doing so because they were on the left. That second isn't true, as can be seen by the fact that a lot of Lib Dem voters have switched to UKIP.

    The Lib Dem vote consisted, essentially, of three groups. There were anti-whoever tactical voters, who voted for us because they wanted to "keep the Tories/Labour out". Those voters are always lost to us between elections (and the "keep the Tories out" ones may be lost to us for a while longer, understandably) -- we always get a significant drop in the polls between elections, and that's one reason why.

    The second group, who are lost for the foreseeable future, were the people who were voting for us as a "none of the above" protest vote. They didn't support the Lib Dems, they supported "a plague on both your houses". Those people have switched to UKIP or (to a much lesser extent) the Greens.

    The third group were actual Liberals (in the UK, not the US, sense), who don't define themselves by the "left" or "right" at all. Those have *mostly* stayed with the party, although of course there have been losses there as people have grown disillusioned with various of the horrible things done by the coalition government.

    But the idea that there was ever a huge level of left-wing support for the Lib Dems is exactly as much a myth as the idea that the 2010 election represented a wave of support for austerity...

    1. Going deeper into the past, the problem is one of the formation of the Liberal Democrats. More than the other parties, they are a coalition of convenience between classical liberals of the Orange Book order, who are probably closer to the more liberal wing of the Conservatives, and the SDP remnants who hated Blair for being too right-wing. This alliance has been stretched and uncertain during the coalition, but with the Orange Bookers ascendant and the SDPers mollified by having Vince Cable as Business Secretary, they have held together and started the great act of sniping leading into the General Election.

    2. Darling did say cuts would be needed, but he also said that the cuts ought to be postponed until after the economy picked up. That is a difficult message to sell to an electorate via a media interested in soundbites and largely antipathetic, if not hostile.
      It is true to say that one wouldn't describe either Labour or the Liberal Democrats as left-wing parties. Although Phil is living in the US where things look a bit different, and from his perspective the Tories have eccentric socialist leanings.
      As an outsider looking in, the Liberal Democrats looks like a coalition between left-wingers and right-wingers united by their opposition to state power over the personal. Clegg was always on the right-wing of that.
      The major mistake the Liberal Democrats made was to give a commitment before the election as a flagship policy that they wouldn't raise student fees even in coalition. One can see that after the election that wasn't the policy they wanted to make a stand for, but then they shouldn't have said it was.

    3. That's a myth put about by political journalists who don't have a clue what they're talking about.
      Hint... the SDP were the *RIGHT* wing of the Alliance. Keynes and Beveridge were both members of the Liberal party, while the SDP were a party created to be "centrist" and "moderate". The opposition to the merger among the Liberal party was that the "soggy dims", as they called the SDP, were thought of as a bunch of soft moderate right-wingers.
      (During the 1987 election campaign, David Steel favoured a coalition with Labour should there be a hung Parliament, while Owen favoured Thatcher).
      Further hint -- Vince Cable is a co-author of the Orange Book.
      Also, "the Orange Bookers" are hardly "ascendant". Clegg and Cable are, and Steve Webb's in a reasonably strong position, but Laws is disgraced, Huhne and Oaten both had their political careers end ignominiously, and Susan Kramer lost her seat, lost the election for President of the party, and lost the nomination for Mayor of London candidate.

    4. David, that's largely true, although It's More Complicated Than That.
      (Really, the problem with the tuition fee thing was a ridiculous problem of PR. The new policy is significantly better than the old, and is in effect a graduate tax. If the government had just CALLED it a graduate tax, the whole problem would have gone away. Stupid, stupid, stupid, stupid political failure.)

    5. How can a tripling of tuition fees be a significant improvement? For a start, there was a recent report that suggested that the new system is going to cost more than the one it replaced.

      And furthermore, whether it is described as a graduate tax or not, the tripling of a typical postgraduate's debt is highly likely to discourage people from lower-income backgrounds from going to University.

    6. The tripling of fees is an improvement because students don't have to pay them upfront, only have to start paying when they hit a (high) income threshold, and stop paying after a fixed time, all of which means that the average student ends up paying less than under the previous system, the poorest pay nothing at all, while those who go on to be rich after graduating pay more. The end result is, in everything but name, a progressive graduate tax.

      The report suggests that it will cost more than the system it replaced precisely *because* of this -- if people are paying back less, the system brings in less money.

      And I agree, it certainly would seem likely to discourage people from going to university. Thankfully, though, that likelihood seems to have been averted, as rates of university attendance among those from lower-income backgrounds have actually gone up.
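
      As a toy model of that repayment mechanism, here is a minimal sketch. The 9% rate, £21,000 threshold and 30-year write-off are the commonly quoted figures for the post-2012 English system, but treat them purely as illustrative parameters; interest and pay rises are ignored for simplicity.

```python
def total_repaid(salary, debt, rate=0.09, threshold=21_000, years=30):
    """Income-contingent repayment, ignoring interest and pay rises:
    pay `rate` of income above `threshold` each year until the debt is
    cleared or `years` have elapsed, when the remainder is written off."""
    repaid = 0.0
    for _ in range(years):
        payment = max(0.0, salary - threshold) * rate
        payment = min(payment, debt - repaid)  # never repay more than is owed
        repaid += payment
        if repaid >= debt:
            break
    return repaid

# Below the threshold nothing is ever repaid; a high earner clears the debt
# in full; a middling earner repays only part of it before the write-off.
print(total_repaid(18_000, 27_000))
print(total_repaid(60_000, 27_000))
print(total_repaid(25_000, 27_000))
```

      The point of the sketch is just that what a student actually pays depends on lifetime income rather than on the sticker price of the fees, which is why the scheme behaves like a graduate tax in everything but name.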

    7. The tripling of fees is an improvement because students don't have to pay them upfront, only have to start paying when they hit a (high) income threshold, and stop paying after a fixed time

      Students didn't pay upfront under the old system either, they took out loans.

      But in any case, the point of tripling the fees was ostensibly to sort out the funding of higher education, wasn't it? So if the new system actually takes in less money than the old one, then it isn't achieving its purpose on top of increasing the level of debt that a postgraduate has on leaving.

      Whether they actually manage to pay it all back or not, it's still hanging over them for the rest of their working life.

      I'm no fan of the loan system either, but I remain unconvinced that the new system improves on it. And either way, it was an impossible sell for the Lib Dems on the back of their pre-election pledge; whether they'd presented it as a graduate tax or not, it would still have been perceived as a broken promise.

    8. Darling did say cuts would be needed, but he also said that the cuts ought to be postponed until after the economy picked up.

      In fact, they largely have been, although politically it suits both George Osborne and Ed Balls to pretend otherwise.

    9. It is still open to doubt as to whether the economy is picking up in any meaningful sense or only on paper.

    10. While we're at it: Public spending under Thatcher actually increased. Spending as a percentage of GDP did go down, but not until her last three years in office.

    11. It is still open to doubt as to whether the economy is picking up in any meaningful sense or only on paper.

      The employment statistics would suggest that the economy is picking up in a meaningful sense.

    12. The employment statistics would suggest that the economy is picking up in a meaningful sense.

      But only if you live in London.

    13. And only if you count part-time jobs, zero hour contracts, and other jobs that you can only hold down while still on benefits, scrounging off hardworking families and depriving them of their beer and bingo.

    14. Students didn't pay upfront under the old system either, they took out loans.
      That's not entirely true - some of us are old enough to remember getting a grant to go to university. (Unfortunately, I was also one of the last people to get this, by which time it was so small it meant that you graduated with a bank debt interest rate that made the later loans look like a bargain...)
      I would note that one of the significant LibDem victories in this otherwise horrible battle for them was the maintaining of the repayment threshold at a relatively high level - the Tories realised that the system was indeed going to cost money rather than raise it and tried to force the level significantly down; the LibDems held their ground. But, alas, they won't be recognised for this - expect an election campaign in which Labour (and perhaps even some Tories) will cite the tuition fees decision at every possible opportunity with no nuance involved.

    15. Yay, politics! Apologies, I'm just excited to be reading one of these comment threads in real time as opposed to a historic document.

      Left and right are difficult labels these days, but if you define left leaning as being more comfortable with a large and proactive state, which is not unreasonable, I think it's fair to say Labour is to the left of the lib dems. Alternatively if you take sympathy with civil liberties to be the defining left wing attribute then the positions are reversed.

      I can believe that from the point of view of many readers of this blog, all 3 mainstream parties are pretty right wing, and for many the distinctions between their policies and actions would be minimal. I think those distinctions do exist though.

    16. Scurra
      That's not entirely true - some of us are old enough to remember getting a grant to go to university. (Unfortunately, I was also one of the last people to get this, by which time it was so small it meant that you graduated with a bank debt interest rate that made the later loans look like a bargain...)

      Yes, I was at university at a similar time; the very start of the student loan system. My first loan was the maximum allowed at the time, something like £500 if I recall correctly. Which is one reason why I'm so aghast at the levels of debt that students end up with these days.

      Of course, grants were a lot easier to fund out of general taxation when a much smaller percentage of the population went to university and those that did were pretty much guaranteed to pay more in income tax. It's a good thing that more people get the opportunity to go, but I do wonder whether saddling half an entire generation with enormous debts will ultimately be worth it.

      Whittso
      I can believe that from the point of view of many readers of this blog, all 3 mainstream parties are pretty right wing, and for many the distinctions between their policies and actions would be minimal. I think those distinctions do exist though.

      They do indeed, although the distinctions have been getting narrower and narrower over time. When I did that political compass thing, I came out as a left leaning libertarian. I think there is a useful distinction to be made in separating the large state - small state axis from the authoritarian - libertarian axis. New Labour rapidly became extremely authoritarian, but there has often been a strongly authoritarian streak in leftist UK politics.

      There was an interesting website set up before the 2010 election called Vote for policies which presented manifesto policies without telling you which party put them forward. The party with the most popular policies was the Green Party, even though hardly anybody actually voted for them because of our stupid FPTP electoral system.

    17. "if you define left leaning as being more comfortable with a large and proactive state, which is not unreasonable, I think it's fair to say Labour is to the left of the lib dems."

      On the other hand, if you define left-leaning as being in favour of the aims which a large state is supposed to promote (redistribution of wealth from the rich to the poor), then they're not, at all.

    18. @Triturus - I don't know, I think it's possible to focus too much on policies, where actually the voting decision needs to be informed by other areas. Blairite New Labour had many awful policies, but for me, the worst thing they did was take the country kicking and screaming into an illegal and immoral war (obv YMMV). Nothing in their policies in 2001 would have indicated that was what was going to happen, but you might have noticed and been appropriately worried by the messianic gleam that was already in TB's eye.

      @Andrew - please provide some back up to the suggestion that the current incarnation of the Labour party is 'not at all in favour of the aim of redistribution of wealth'. I mean, I'd probably argue with you that this is too strong a statement even for the prime of New Labour when you could point to Mandelson being 'intensely relaxed' etc., but for the current lot, it seems just unsupportable.

    19. And note, that I'd obviously distinguish between being in favour of an aim and being good at delivering the result...

    20. I would do, if that's what I said. What I said was that they're not *to the left of the Lib Dems* at all.

    21. Everybody I've ever known to do it has come out as left-leaning libertarian on the Political Compass quiz. The questions are a bit leading.

      Being in favour of a large proactive state is not at all a definition of the Left. Certainly it wouldn't have been in pre-Thatcherite Britain, where being Left would mean opposition to the Establishment. It's hard to find definitions of Left and Right which a) don't define one as Not the Other; and b) don't define one side as Awful and Stupid People. Still, I would say that one is more or less left wing as one favours policies that primarily favour the more disadvantaged groups in a society, especially by economic measures. (I say primarily favour as both left and right would mostly say that benefits to the favoured group lead to benefits to other groups as well.) And one is right wing as one favours groups already better off.

    22. And only if you count part-time jobs, zero hour contracts, and other jobs that you can only hold down while still on benefits, scrounging off hardworking families and depriving them of their beer and bingo.

      That picture is at least a year out of date. More recent employment growth has been mainly in full time jobs, and the number of people claiming in-work benefits has fallen.

    23. The political compass has no empirical basis.

      Chris Lightfoot did a fascinating piece of research about eight years ago, in which he gathered the results of a wide-ranging political issues questionnaire and subjected them to a principal components analysis. This is designed to reveal how the answers to the different questions cluster, and to determine how the data can best be represented by plotting in a variable number of dimensions.

      For example, imagine if you just had two questions:

      1. Should we increase taxes to help the unemployed?
      2. Should we cut taxes for the rich to encourage entrepreneurs?

      - then you would imagine that most respondents would reply yes/no or no/yes, and you could account for the data by just using one significant axis (which you might call "left/right wing", though any such label is of course a matter of interpretation rather than mathematics). People being people, there will be some who answer yes to both or no to both, but it is likely that there would be few of these and the analysis would not require another significant axis to deal with these: they would just appear as noise.

      The political compass is an example of explaining political positions using two significant axes, but as I say these axes are not derived from data, they are imposed a priori. Working from the data, there could in principle be three, four or more significant axes. It all depends on how real political attitudes cluster together. There is also no reason why all the axes should be equally significant (as they are in the political compass case): some may be far more important than others.

      Lightfoot's analysis found that, in the UK, the range of political opinions can indeed be described by only two significant axes, but that one is much more important than the other.

      The less significant axis is what Lightfoot called the "Axis of Economics". This roughly corresponds with the traditional economic left/right split, but only accounts for a minor amount of political variation. On this axis, Conservative and UKIP voters are slightly to the right, Labour and Lib Dem voters slightly to the left, but there's a huge degree of overlap.

      The more significant axis, by a long way, was dubbed by Lightfoot the "Axis of UKIP". On one end, where Conservative and UKIP voters tend to be found, are the people who believe in harsh punishments for criminals and in isolationism in foreign policy, while the other end, where Labour and Lib Dem voters tend to be found, has the people who believe in rehabilitating prisoners and internationalist politics. This division explains almost all the variation in political attitudes in the UK. Lib Dem voters tend to be a little bit more towards the rehabilitation/internationalism end of this axis than Labour voters, but again there's a lot of overlap.

      It would be interesting to see how this picture might have changed in the past few years, with the experience of coalition government and as the Iraq war and Labour's authoritarian crime and imprisonment policies fade from memory. Sadly, Lightfoot died tragically in 2007, and as far as I know no-one has picked up this part of his work.

      There's a quick and jolly presentation of the data in this powerpoint presentation, and more detail in the links therein.
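
      For anyone curious about the mechanics, here is a minimal sketch of the method on invented data; the respondents and answers are made up for illustration and have nothing to do with Lightfoot's actual survey. Centre the response matrix and take its singular value decomposition: the right singular vectors are the candidate axes, and the squared singular values show how much of the variation each axis explains.

```python
import numpy as np

# Hypothetical yes/no answers (+1 / -1) from 8 respondents to 4 questions.
# Think of the columns as questions like the two tax examples above plus a
# punishment question and an EU question -- invented purely for illustration.
X = np.array([
    [ 1, -1, -1, -1],
    [ 1, -1, -1, -1],
    [ 1, -1,  1,  1],
    [-1,  1,  1,  1],
    [-1,  1,  1,  1],
    [ 1, -1,  1,  1],
    [-1,  1, -1, -1],
    [ 1, -1, -1, -1],
], dtype=float)

# Centre each question, then decompose. Rows of Vt are the principal axes;
# the squared singular values give the share of variance along each axis.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)

print("axes (rows):")
print(np.round(Vt, 2))
print("share of variance per axis:", np.round(explained, 2))
```

      If most of the variance lands on the first one or two axes, the data really is well described by that many dimensions; how you label those axes ("left/right", "Axis of UKIP") is a matter of interpretation, not mathematics.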

    24. The powerpoint link isn't working for me, Iain.

      (Also: How did isolationism/internationalism and punishment/rehabilitation end up as the same axis? I realize UK political culture is different from US political culture, but over here I'd expect there to be a substantial number of people who prefer to withdraw from foreign conflicts but are not given to super-punitive views on crime.)

    25. Isolationism not in the sense of staying out of foreign conflicts, but in the sense of withdrawing from the European Union, kicking all the immigrants out, and so forth.

    26. That makes more sense—certainly there's a connection between wanting punitive treatment of criminals and punitive treatment of immigrants.

      On trade, though, this must be a place where the differences between the US and UK break down. The closest we have here to the EU is NAFTA, and it's a rather different creature. It's not uncommon for someone with liberal-minded views on crime and immigration to be very opposed to it and other regional trade agreements (and the WTO).

    27. Originally, the opposition to joining the EEC (as was) in the UK was led by the left wing of the Labour Party, and even in the last Euro elections in 2009 there was a "No2EU" coalition made up of the RMT trade union, the Communist Party, the Socialist Workers' Party, the Liberal Party, and others. They believe that the EU's rules mandate capitalism, and oppose that.

      But opposition to the EU in the UK is mostly made up of three complaints. The first is that it's a "loss of sovereignty", usually coupled with grumbles about how we didn't go to war with the Germans twice in the last hundred years just to have them involved in setting our laws. The second is a sense that European regulations are too petty -- people don't like (the tabloid versions of supposed) rules about what size and shape bananas are allowed to be sold or how much meat can be in a sausage. There's an image of small-minded bureaucrats meddling for the sake of it.
      And finally, since the borders opened in Europe, allowing more-or-less free movement between the different countries, there has been a *HUGE* wave of hatred based on the idea that we're being "swamped" by immigrants from Poland or Bulgaria or wherever -- not taking into account that actually, more British people emigrate to other European countries than Europeans move here.
      Imagine what would happen if NAFTA said that Mexicans were free to move to the US and work there, and who the opposition would be then. That's who the opposition to the EU is in the UK. That's who UKIP are.

    28. Andrew - apologies for misreading you above. You're quite right.

      Having said that, your comment at the top of this thread was that Labour aren't a left-wing party, and you've linked that to having the aim of redistributing wealth, so you are implicitly saying that Labour don't have this aim, which I still struggle with.

    29. I have seen little or no evidence of them having that aim in the last twenty years or so. All current mainstream political parties support moderate amounts of redistribution, as they all support the welfare state, progressive taxation and so on. I've seen little evidence that Labour in practice (as opposed to in rhetoric -- and only their rhetoric in opposition, not in government) support any more redistribution than any other mainstream party.

    30. Yep: The EU and NAFTA are rather different creatures. NAFTA doesn't increase Mexicans' ability to move to the US.

      That said, you do hear versions of the sovereignty argument from the left in America: There's a fear that trade agreements, especially the WTO, will override laws passed at home. And doesn't your second argument have more to do with the "Axis of Economics"? It's basically the classic free-market complaint about petty economic regulation, albeit intensified by the fact that there's less control over the regulators (i.e., by the sovereignty factor).

    31. Working Family Tax Credit was old-school redistribution, and something no Tory government would ever have introduced. Labour may not have succeeded in narrowing the gap between richest and poorest, but they sure did slow down the rate of increase. That's damning with faint praise, of course, but Labour are clearly further left than the Tories (and economically, I would argue, also further left than the current Lib Dem leadership, who have embraced economic policies they referred to pre-election as 'economic masochism', thus ensuring a large number of Labour voters will never support them again).

    32. Thanks for the link, Iain.

    33. Andrew:

      Reminds me of this section from Vince Cable's recent skewering of Ed Balls:

      "When we first had these exchanges a couple of years ago, the right hon. Gentleman had a very good football chant going on the Back Benches behind him: “Growth down, inflation up. Unemployment up.” Now of course we have growth up, unemployment down and inflation down. His current favourite is the “millionaires’ tax cut”, which I would find a little more persuasive had I not sat on the Opposition Benches for 10 years being lectured by him and his boss that any increase in the top rate of tax above 40% would be counterproductive and damaging to the economy."

      (Vince Cable is a Liberal Democrat member of the Coalition Government, Ed Balls is the Labour Shadow Chancellor - the opposition's economics spokesman.)

    34. Labour may not have succeeded in narrowing the gap between richest and poorest, but they sure did slow down the rate of increase.

      That's a very questionable statement.

      The usual measure of income inequality is the Gini coefficient. In the UK, the Gini coefficient trended sharply upwards during the Thatcher governments, then was basically flat (with a slight downward trend) under John Major. From 1997, it trended upwards again, though more gently than under Thatcher, and so far under the coalition government it has shown a marked decline.

      (I'm just eyeballing graphs on a screen here, not doing any proper quantitative analysis.)

    35. Thanks for that, Iain. Major and the coalition both presided over a recession, and I'd expect the gap to narrow during those times. So in the direct 'like-for-like' economic circumstances (Thatcher's 80s boom and Blair's noughties boom), inequality grew more slowly under Labour than under the Tories. With that clarification, hopefully you no longer find the statement questionable.

  11. The other thing to mention is that the Tories' "just" 36.1% of the vote was still higher than the 35.2% which Labour got in the 2005 election, which gave them a very, very strong majority. This is worth mentioning because a lot of the anger from Labour types about the eventual result was focused around claims of illegitimacy. These claims are fatuous, as either one wants to accept the result the current system produced (in which case one accepts that the current government is legitimate), or one goes by votes cast (in which case the current government is no less legitimate than any post-war one...)

  12. The problem with politics is that bribing, bullying, and destroying one's enemies is the only way to really effect change. Hence today's liberals become tomorrow's jackboots or petty bureaucrats, and nothing really changes. Even if by some miracle you had a benevolent dictator (the libertarian dream), a utopia would be the worst possible scenario, as they would have to worsen our lot for the sake of bettering it.

    1. Um, no. The libertarian dream is building alternative institutions to bypass the state.

    2. In other words, a de facto state?

    3. This comment has been removed by the author.

    4. Only in the sense that DIY home recordings and a rich web of informal music-sharing networks are a de facto record label. Or, in other words, no.

    5. Yeah, but how does that work for essential services like healthcare, roads, education? It's one thing to replace record companies with lots of individuals uploading to bandcamp from their laptops, but you can't build a hospital or a power station that way.

    6. Whether you can or can't is a conversation too involved for me to dive into at the end of this long day. I'm just making the point that no, the proposed alternatives are not a de facto state. If you replace a centralized institution with a decentralized network, you haven't simply changed the sign on the door.

    7. The essential foundations of civilisation are common defence and waste management. I have yet to see a convincing libertarian or anarchist account of either.

    8. Most civilisations have got by without having much success at waste management. Hence the inability of any city before the age of Joseph Bazalgette even to keep its population static without constant immigration from places where people were spread out enough not to keep crapping in the water supply. By contrast, you can't have civilisation without a legal system, and you can't even think about talking about considering the possibility without a hefty surplus of food production.

      What could have rather more to do with the viability of libertarianism or anarchism is that I cannot think of an attested civilisation without a system of compulsory tax/rent/tribute/service and some sort of elite supported by it. Which is not the same thing as that being demonstrably a requirement, but it is at least suggestive.

      Of course, you don't have to have civilisation at all. But proposals for dispensing with it would obviously have to come up with an acceptable mechanism for bringing the global population down by significantly in excess of six billion.

      Or just wait several decades. Probably.

  13. Dr. S, I'm having a lot of trouble with your last two paragraphs and I think I need to come to office hours.

    What you're saying sounds intuitively right, but I don't really know what it means in concrete terms. You seem to be implying that our stories needed us before -- that there was some point before which film needed its audience in some sense and after which it does not. But I don't know what that point was, and what marked its transition.

    Was it some critical point at which special effects became enough to drive a film's marketability? Was it a stage where the culture became saturated enough with genre material (superhero comics, templates we all recognize like the Spy Film, the Heist, the SF Epic) that filmmakers could, if they chose, simply stop paying attention to actual human behavior in the world and just regurgitate formula back at us? Is it an audience that obviously still needs to pay for a movie (or at least its merchandise) to keep its like profitable, but no longer knows or cares if the movie actually says anything about their lives?

    Or am I completely misunderstanding your point? I know my grades in your class are not the best and that I talk too much when I should be taking notes on the lecture, but I'm doing my best to catch up to everyone else who apparently got it the first time through.

    1. I made this sound like a joke, but I really am perplexed by those two paragraphs, and I feel dumb because several other commenters seem to have found them crystal clear. I would love for one of them to help me get a grip on the concept, if the author himself doesn't want to.

    2. Shut up and admire that Imperial stitching!

    3. ::audience member falls on sword::

    4. I tried writing a precis for "The Avatar Of Inception". I gave up.

      Para 1 states that both films are sci-fi films that are concerned with questions of identity.

      Para 2 states that the concept of identity and self is culturally in flux.

      Para 3:
      In these films, identity is bound up in vast spectacle, and the spectacle is the point of the film. The spectacle is "oddly self-denying" [does that mean the spectacle denies itself, or denies the notion of ego? Is it odd that spectacle should deny ego? Or is it denying ego in an odd way?] To immerse oneself in the spectacle of the film does not encourage the viewer to question the notion of identity [which by inference is the more expected position to take?] but to reject it. The films do not explore notions of experience and self, which are assumed to be fragile and subordinate to the broad sweep of the film's genre. [Is this in contrast to Para 1, which states that the films ARE in fact concerned with identity?]

      I'm lost here. Having not seen Avatar or Inception, I am at a disadvantage.

      Although paragraph 4 makes more sense in isolation:

      Para 4:

      If stories have become spectacle without notions of ego, they are self-perpetuating cultural entities. We make stories that are like other stories. This is not the same as arguing for Campbellian monomyths (i.e. not saying that all stories are the same under the skin), it is speculating that stories are becoming independent of people and new concepts.

      I think our host is leading us into the "what if our stories could think for themselves" territory....

    5. I greatly appreciate the attempt!

      Identity is certainly a concern in Avatar, which sees a human essentially projected into an alien "avatar" and gradually come to feel more at home in that body and identity than his own, which I'm not sure is truly a "post-human" experience so much as a high-tech allegory for "going native." It seemed to me that the familiarity of that real-world trope, rather than the vastness of the spectacle, is what prevented it being an interesting exploration of identity, but I only watched it once and couldn't help zoning out during the boring parts.

      I'm not sure what Inception has to do with identity, unless it's the question of whether we can be sure our ideas are our own. I think, unless we honestly believe there are people who can kidnap us and enter our dreams, this is a non-question, and it's more about reality than identity, but that too I've only seen once, and maybe if I rewatched it with this essay in mind I'd see different things. Certainly no one in the film strikes me as having anything like an identity to start with; they're just reasonably attractive people in suits being photographed beautifully. (It doesn't help that even after all the supposedly literate films he's done, I still can't believe that Leo DiCaprio isn't the kid from Basketball Diaries just pretending to be an adult the whole time.)

      I think our host is leading us into the "what if our stories could think for themselves" territory....

      That's kind of what I figured -- I was just having a lot of trouble figuring out which path he took to get there. I needed the Master to drop an "easy as pi" hint, I think.

      And part of the problem is that I'm not sure I believe "we're all stories in the end," or more to the point that I'm not sure we're just stories in the beginning and the middle, and that that matters.

      I don't think I should be inferring that Doctor Who in the Moffat years is as focused on spectacle over substance as Avatar and (though I liked it) Inception, and that we can't expect to matter as an audience since the show is now just about dicking around with stories that have nothing to do with us or anything apart from other stories, but as I articulate that it seems frighteningly supportable as a criticism.

    6. The resolution of "The Big Bang" is of course extremely dependent on memory and non-linearity which ties it to Memento, presumably. Whether that's in any way part of the 2010 zeitgeist, who knows? But it did give me an opportunity to use the word zeitgeist.

      OK, here's my very late-night take on it. These films are in some sense impersonal; that is, interested not in the self but in spectacle and genre. In contrast, Doctor Who is now beyond fiction and to some extent self-aware. Therefore Phil's thesis will be that series fnarg is a quasi-sentient metafiction in an alchemic genre collision with its spectacle-based qlippothic dark shadow.

      As Matt Smith vaults over the herb garden in a cut scene from The Eleventh Hour, he mutters "Thyme can be re-written".

      See, I can write this stuff too. And I got to say qlippothic.

    7. "Raggedy Man" is also an anagram of "Amy Nerd Gag"

    8. ...that...actually kind of makes sense to me.

      I feel like I should be paying you for this tutoring session.

    9. I think what the "death of the audience" idea is driving at is that the makers of mass-market fictions no longer have to imagine an audience in order to try to communicate something to it or even in order to try to give it something that it will be prepared to pay for, because they now have a large enough repertoire of tried and tested formulas and techniques available to them that they just have to give the pot a stir and pour out another bowl. They don't have to consider the audience because they *know* that enough people will buy enough of what they are selling to make it a commercial success. And yes, it does depend on people continuing to buy the stuff. But as long as the current pattern continues, their implied presence remains entirely dispensable from the creative process and they are, from that point of view, "dead".

      If that is what PS is driving at, it may be true up to a point, but if so the critical turning-point came quite some time ago and was itself more like a reversion to a long-established order than a revolutionary novelty. The situation described sounds quite reminiscent of the production-line methods of the mid-twentieth century Hollywood dream factory. There's an oft-repeated narrative of how American cinema went through a brief golden age in the 70s precisely because the studio bosses lost confidence in the saleability of their old standardised products and were therefore forced to trust and empower directors to come up with new things. Then Spielberg invents the summer blockbuster, Lucas invents the special-effects spectacular, merchandising and the franchise, and it's back to business as usual.

      From that point of view, the present situation is merely the product of the continuing refinement and ramification of the resulting industrial techniques over the course of the last thirty-odd years, and the consequent ongoing decline in commercial unpredictability and hence in the need to consider the audience.

    10. This comment has been removed by the author.

    11. Incidentally, isn't it peculiar how the ability to keep selling people the same material over and over again makes Douglas Adams a folk hero and Lucas a swindling tyrant?

    12. Cute. :)

      Actually, Lucas's main sin is NOT being able to sell the same material, in the sense that he seems to have no idea what made the original Star Wars trilogy engaging and no idea how to recapture that magic. That's a big part of why his endless rereleases of that trilogy are so maddening (which is what I think you were referring to) -- he's not content to fail to deliver new stuff that makes old fans happy (plenty of kids seem content with it, though), he also has to make the old stuff worse and let the good original versions go out of print.

      With Adams, at least the variations are interesting. The differences between Hitchhiker's the radio series (my fave these days), Hitchhiker's the novels (my fave back in the day), Hitchhiker's the TV series (ugh), and Hitchhiker's the video game (tear your hair out) are relatively subtle but they're enough to make you want to experience them all. Hitchhiker's the movie, not so much. Then there's the recycled plots -- City of Death / Dirk Gently, Key to Time / Life, the Universe, and Everything, probably a few more -- which are even more varied. I'd say he's a folk hero despite, not because of, but now this response is already less interesting and fun than your original quip, so I don't know why I'm bothering. :)

  14. I like Lady Gaga's grotesque stylings and presentation. I just wish her music was styled to match it.

    1. But we already have a Björk.

    2. And, indeed, a Marilyn Manson. Just sayin'.

    3. The Warhol analogy is apt. I think she captured the zeitgeist real well; i.e. her inexplicable career as both an inspiration and a cautionary tale. But it stopped being an act...if indeed it ever was. All I see is enormous amounts of creative energy spent trying to maintain her already maxed-out fame. (Warhol's "Love Boat" years.)

    4. Ah ha ha - good points, all.

      But surely the cachet of Bjork and Marilyn Manson has fallen dramatically? Surely there is room for new bizarre musicians? (Alas, with the death of Dave Brockie, GWAR, at least, may be lost to us forever.)

      In any case, I was thinking more along the lines of one of my favorite bands of the late 90s and early 00s, Rasputina - a "Ladies Cello Society" that would perform on said instruments while wearing Victorian underwear, singing strange, often haunting songs about things like the Triangle Shirtwaist Fire, Howard Hughes, and Bolivians eating rats under a dispensation from the Pope.

      Gaga, on the other hand, never seems to get much beyond the average club fare. Often catchy (I do like to listen during a workout), appealing to many, but not so interesting to me personally.

    5. Yes, you're right, of course -- there's no reason we couldn't have someone as visually eccentric as Gaga who also happens to be musically or lyrically eccentric as well.

      Except that of course it doesn't fit her stated goals. Björk enjoyed a brief period of reasonable popularity, mostly in the so-called "alternative" charts, and Manson had his moment back when his moment was a moment, but neither of them was ever going to be a sales juggernaut on a Gaga scale, if only for the simple reason that it's almost impossible to be both bizarre and populist, perhaps by definition. St. Vincent's never going to be Lady Gaga; Janelle Monae's never going to be Beyonce or Rihanna (thank fucking god). If they were, they wouldn't be who they are, and vice versa.

      My point is simply that we can have both. There's some pleasure in Gaga being outwardly strange and apparently free-spirited, but also making music primarily for the spinal column, though I fully understand how it's a letdown for people who gravitate toward the thoroughly strange. I would venture to say that as relentlessly pop-focused as Gaga is musically (which is not an insult), she does occasionally get a little weird lyrically. Born This Way is a really odd record -- it's no Cabin Fever! or even How We Quit the Forest, but it seems like an intentional sweep-with-the-other-arm after the "fame" records, a statement that it wasn't all glamour and champagne and cigarettes but that if you were a smalltown freak who didn't fit in and couldn't behave, Gaga loved you too. However calculated that might have been, I found it really endearing.

      One of my favorite things about Rasputina was that they'd cover songs by Heart and Pat Benatar alongside stuff like "Watch T.V." and "State Fair." What they have in common with Gaga and Scissor Sisters, at least in my imagination, is an imagination and ambition that springs from an eclectic childhood in a small-minded place that nevertheless still feels like home somewhere deep inside, though they're in the big city in the bright lights now. You might be able to fake that kind of texture, but you can't buy it.

  15. "Inception, on the other hand, is an astonishingly well-made execution of the concept of a dumb person’s idea of what a smart movie should be like..."

    You're not saying Christopher Nolan is dumb, are you? A dumb person wouldn't be capable of taking a conceit with as much potential for confusion as "show the scenes from the primary plot in reverse order, separated by scenes from a secondary plot shown in correct chronological order" and turning it into a movie as easy-to-follow and dramatically functional as Memento. (Yes, people love to talk about how that movie is incredibly confusing, but they're wrong, and that's exactly why the film is brilliant.)

  16. This comment has been removed by the author.

  17. It occurs to me that Gaga presents herself as the very opposite of "a monstrous entity to be looked at from beneath as it looms over us, unapproachable." Think, for instance, of that SNL skit where she's on a game show with Justin Timberlake, and the joke is that she remembers the tiniest detail of every conversation she's ever had with a fan while he barely remembers the name of the groupie he hooked up with. There's her "Little Monsters" community, ostensibly a place where her fans -- particularly the artistically inclined (in fashion, music, etc.) -- share what they're doing with each other, and the mailing list presents them as stars in their own right, spotlighting different ones each week or so. The persona she puts out is more mother to monsters, fostering all of her precious wild things as they live out their own eccentricities in public. How much truth there is to this, how much of her fame she actually shares out to others, how much compassion and kinship she actually feels with her fans, may be beside the point: it's part of how she markets herself, which I think is what we're actually talking about. We are at least led to believe that our relationship with Gaga the persona and the artist is, or could be, more personal than any we've had with the less approachable pop stars of the past.

  18. Well, this posting gap is working nicely, isn't it? This post already contains the most boring comment thread ever and the most revealing. I respectfully leave you to work out which is which. (Clue: neither of them is the Lady Gaga one.)
