That's the location of the Rock and Roll Hall of Fame Museum, and this week the nominees for the Hall's Class of 2010 were announced.
A group of music historians takes a look at musicians who've had "a significant impact on the evolution, development and perpetuation of rock and roll" who released their first music at least 25 years ago. The categories are performers, which is the spotlight area, as well as sidemen -- the session and concert performers who often made headliners look better than they might have otherwise.
The Hall also honors non-musicians, such as producers, DJs, promoters and journalists, who've had that significant impact, as well as "early influences" -- musicians whose careers predate the rock era but who left their own significant mark. People like Jelly Roll Morton make it in as "early influences." Morton died when Elvis Presley was a six-year-old and "Rock Around the Clock" performer Bill Haley was a teenager trying to make it as a yodeling country singer, but his jazz and blues innovations shaped some of the post-WWII blues that would cross over into rock and roll on the bridge of Chuck Berry's guitar.
Of course, Oklahoma rockabilly gal Wanda Jackson also makes it in as an "early influence," even though probably 95% of her recorded output dates from the "rock era" that followed "Rock Around the Clock" hitting no. 1 in 1955. Early apparently means different things to different people.
After the nominees are picked, a voting body made up of about 500 rock experts -- which does not include people who merely listen to and buy the music, of course -- casts its ballots. The top five or so who get more than 50% of the vote are in.
When the Hall began 25 years ago, it was nominating and electing people like Presley, Berry, Buddy Holly, Fats Domino, James Brown and so on; musicians whose impact on rock's birth and initial growth was obvious. To this day I know people who say you should never trust a guitarist who can't play Chuck Berry music, and I am one of those people. But check out this year's list of potential inductees at that first link.
OK, Kiss makes some sense. They are indeed a rock band and their Halloween made-up faces were pretty much everywhere between about 1975 and 1979. David Bowie, Marc Bolan and Gary Glitter probably had more influence in creating the glam-rock genre that Kiss worked in, but they'll do because of how large they were on the scene during their time.
Genesis fits even better. Although in their 1980s video-friendly phase they were pretty much the backing band for Phil Collins' pop albums, they began as a full-fledged "progressive rock" band. Along with prog-rockers like Emerson, Lake and Palmer, Yes, Jethro Tull and Rush, Genesis ushered in the use of a variety of instruments in rock songs and expanded the roles of others, including electronic instruments like synthesizers. They moved away from the three-minute format and often included extended instrumental breaks instead of short solos.
Some of the others, though -- ABBA? Really? The Swedish foursome may have sung some nice songs and harmonized well, but rock music? That'd be like calling someone an early influence who was actually contemporary with some of the people they were supposedly influencing...oh. Never mind. And truthfully, the Rock and Roll Hall of Fame is no longer about rock and roll anyway. But their 1976 song "Money, Money, Money" does at least let us know why the Rock and Roll Hall of Fame is not in Memphis (home of Sun Studios), Chicago (home of the Chicago blues), St. Louis (home of Chuck Berry), New Orleans (home of Fats Domino and so much early rhythm and blues) or New York City (home of most everything else) but in Cleveland, Ohio (home of $65 million).
Although many inductees were better known in rock-related genres like R&B, blues, soul and reggae, the HoF pretty much kicked down the genre boundaries in 2006 with Miles Davis' induction. The immensely talented Davis was really in no way a rock musician, working entirely in the jazz arena. Madonna's induction in 2008 confirmed that the phrase "rock and roll" was descriptive of the museum, not the artists enshrined therein. Madonna is a pop and dance music star -- some good and some bad, but she's not a rock and roll musician. But the name's already on the museum and calling it the "Pop Music Hall of Fame" makes it sound less serious than "Rock and Roll Hall of Fame," and the last thing you want to do to music buffs is somehow insinuate that their subject matter is not deadly serious. Plus, "Rock and Roll Hall of Fame" still has an aura of decadence about it that "Pop Music" doesn't. "I'm a rocker" sounds cool. "I'm a popper" sounds like you're a fan of one of those generic colas that discount stores sell in place of the real thing like Dr. Pepper.
Heck, even the Beatles straddle the fence -- while they were obviously a rock act in their pre-moptop days and their first releases, they morphed into a pop/psychedelic band that sometimes did rock songs. Noted rock despiser Frank Sinatra covered George Harrison's "Something" and called it "the greatest love song ever written." "Let It Be" is not a rock song. Neither are "Ob-la-di, Ob-la-da," "A Day in the Life," and so on. (As an aside: Nat "King" Cole is an "early influence" but Sinatra isn't? Can we check with someone on this? And maybe take a look at why Little Walter is in only as a sideman but Willie Dixon is an early influence and John Lee Hooker is a performer?)
Add a subjective standard like "Hall of Fame"-level greatness to a subjective field of artistry like music and you get something that's pretty much The Rock and Roll Hall of Fame because it calls itself that.
If I believed in the concept of a Rock and Roll Hall of Fame, which I kind of join the Sex Pistols in not doing, I would suggest that the criterion should be trying to imagine the modern music industry without the artist under consideration. So, try to picture what modern music would be like if there had been no Elvis. Would it look at all like it does now? Probably not. Same thing with the Beatles. Or the Who, Muddy Waters, Ray Charles and so on.
But how would modern music be all that different if Agnetha, Benny, Björn and Anni-Frid hadn't told us that we could dance, we could jive, having the time of our lives? Sure, there'd have been no Mamma Mia Broadway show and movie, but I don't know if that's a bad thing.
By that criterion, we can pitch a bunch of inductees. Say goodbye to the Dave Clark Five, the Lovin' Spoonful and Billy Joel. We can bar plenty of others too; talented or not, they just weren't that influential. Obviously, this being the Rock and Roll Hall of Fame, the idea of some kind of morals component like the one that keeps Pete Rose out of the baseball Hall of Fame doesn't apply. So uber-creep John Phillips of the Mamas and the Papas gets to stay. And we pretty much start to close the doors on new inductees right about this year. Beginning with the mid-80s, the idea of a band or performer with a major, art-form-altering impact kind of goes away. Nirvana, Smashing Pumpkins and maybe Pearl Jam get in, but we should be ready to face years of no inductees (but since that would mean no $25,000-per-table banquet, we can't have that).
Of course the whole "things would be different if this person hadn't done this" is a high bar. My own CD collection would shrink considerably if I applied it there, but I'm not purporting to run a Hall of Fame.
Does the Arcade Fire rate? Tell ya in 2029. How about indie darling My Morning Jacket? Ask me again in 2024. Radiohead? Hmmm...we might learn that starting in 2018. Coldplay? I'll pretend you didn't ask.
Sure, you could say, Britney Spears had that form-defining effect on pop music. But any little pop tart who could halfway sing would have had the same impact; there was nothing special about Spears that meant if she hadn't done it, nobody would have. Same goes for the boy bands of the 1990s and early 2000s.
A Hall of Fame implies links to traditions and roots in a past. Rock and roll works almost the exact opposite way. When I was a kid, so-called oldies stations played Berry, Charles, Elvis, Jerry Lee Lewis and so on. Then younger baby boomers hit their 30s and became nostalgic for the music from their teens and early 20s, but they didn't like the idea of their music or themselves being old, so they invented the label "classic rock."
Now classic rock includes music 20 years newer than the 1960s rock and soul that it originally meant, and "oldies" has dropped its 1950s-era artists and much of its 1960s catalogue for stuff that had its heyday during the bicentennial. Bob Seger may have sung that "Rock and roll never forgets," but his 2006 single "Wait for Me" didn't crack the top 50 in the US country charts and stalled at 16 on the adult contemporary charts. It didn't make the overall Top 40 at all.
As much as I love quite a bit of rock (and pop) music, I recognize that it's always been kind of ephemeral. And if fame indeed is as fleeting as we've been told, the concept of fame layered onto the Memento-short memory of modern music turns a Rock and Roll Hall of Fame from an institution that honors important contributions to a significant modern art form into just another thing rock fans can argue about.
Which may have been a better idea anyway.
Monday, August 10, 2009
Where Now the News?
Former CBS anchor Dan Rather, a man with more than 40 years in the news biz, would like President Obama to convene a blue-ribbon panel to study the industry's current problems and recommend solutions. He made that call in a speech last month, and then repeated it in a column in Sunday's Washington Post.
Rather notes the increasing financial problems faced by newspapers and major television networks. Newspapers face dropping circulation and ad revenues. Network newscast viewership shrinks along with overall network audience losses. He's pretty persuasive that a big share of the problems of what he calls "the news infrastructure" can be traced to the collapse of the newspaper industry. Classified ads have almost disappeared, and other local ads also have migrated to the internet in large numbers. That's a revenue stream newspapers have to have in order to survive; subscriptions have never come close to paying for running a paper.
I think he's also accurate that most of the other news sources that are replacing newspapers in people's lives still depend on newspapers for a lot of the newsgathering they do. For example, the bloggers that people read are often commenting on or passing on information they picked up from a paper (like now, f'rinstance). Newspapers are often the only outfits assigning reporters to things like city councils, county commissions or school boards every time they meet, not just when a video-friendly confrontation breaks out.
Of course, Rather's call for help is a little ironic, given that his journalism career was spent mostly in television. The ad revenue-driven profit motive he complains about invaded news gathering through the TV screen and then began affecting other areas. The tiny attention spans of many folks and their disinterest in stories that don't directly affect them right now stem from television news.
And the call for some sort of panel of experts to diagnose the problem and recommend treatment brings a chuckle. For one, that kind of plan is also dinosaur thinking. By the time such a commission met, hashed out the problems and came up with solutions, the situation they addressed would be over. For another, we already have a panel in place that's identifying problems and offering solutions and new ways of doing things: The public.
Let's take classified advertising, for example. Does the current model of "junk for sale," arranged in eye-blurring echelons of tiny type work? I don't know what a commission of experts would say, but the fact that people aren't buying those kinds of ads suggests that it doesn't. What does work? Things like Craigslist and appearances in search engines, which people use to get the word out about businesses or individual things they may have to sell. Could newspapers set up a new Craigslist that did the same thing but which paid them a profit? Probably not, but they might be able to work with the people established in the field to find arrangements that helped them out as well.
How about the way newspapers present information -- does that work? Well, since people don't buy or read them as much as they have in the past, I'd guess no. What does seem to work? Well, people will read information online when it's free, so maybe there's a way to promote reading the news for free but somehow link it to page views and create an attraction for sponsors that would cover costs of producing news.
I used to be a journalist, and one of the things I remember about my attitudes of that time is reflected in Rather's suggestion of a panel of experts to study the issue, supported by the President. Despite our well-developed and eternally cultivated skepticism, we trust institutions and systems. Maybe not the way they're working now, but we imagine them the way we think they're supposed to work. In fact, we might see our work of reporting on them as a help towards that ideal. If there's a panel of experts involved, they will develop a new system that will solve the problems, or at least tell us how to solve them.
But institutions and systems can't react with the speed necessary to work in a digital world. Right now, it looks like the arena we call the free market is the only one that can process information fast enough to monitor the changes that are happening and the responses they require. Newspapers, for example, used to base their share of the news game on the twin pillars of speed and accuracy. A newspaper got information out quickly and got it right, or else it didn't survive. The internet means papers are rarely quick enough for people any more. Their own websites will update stories that the paper itself published, and do so often enough that the story in print might be out of date by noon.
As for accuracy, demagogues on left and right have spent enough time assailing media bias that most folks take what they read with several grains of salt -- and we all know we're supposed to reduce our salt intake. Media people themselves have done plenty to help that image -- when the New York Times has to run seven corrections to the Walter Cronkite obit, then who knows what other kinds of boo-boos slip through?
I love reading a paper. I love sitting down with one, unfolding it, scanning headlines, digging into a story, taking some time to process it along with a sip of beverage, flipping the pages and wrestling with them to fold right again, setting it aside knowing I can pick it up later any time I want, learning stuff I didn't know about places or people I'd never heard of, comics, the rattling sound paper makes when you move it or turn pages, the way I learned when I was young to use one finger as a fulcrum to fold it in half and tuck under my arm, the image I get of Al Bundy doing exactly that with a smile on his face as he heads upstairs to reclaim his bathroom...
And I love how they provide a depth of information and context that TV can't match (and which anchor personalities Chip Cappedteeth and Brenda Botox most likely wouldn't understand anyway). Times have changed, the calendar pages have turned and some of those things I love about newspapers are probably going to become part of the past. Rather's right that this situation can be seen as a crisis. There are important dimensions of news we won't get if newspapers leave. But he's also wrong, because they're far too important to leave up to a panel of experts picked by the same kind of people who run the Post Office or Department of Motor Vehicles.
Saturday, March 28, 2009
Grumbly 100th, Nelson!
Today marks the 100th anniversary of the birth of writer Nelson Algren, author of The Man With the Golden Arm and A Walk on the Wild Side, two of the 20th century's top noir novels. (Clicking on the above link will take you to the page of the Nelson Algren Committee, which features a booking photo of the author. That's not a guy who has a "happy" birthday.)
Though born in Detroit, Algren grew up in Chicago and set Golden Arm in the seedy neighborhoods and taverns amongst which he grew up. His book Chicago: City on the Make also explored those areas and the people who lived there in ways that made no friends at the Chamber of Commerce. Golden Arm told the story of Frankie Machine, a morphine addict who was also an amazing poker dealer. The movie version, directed by Otto Preminger, earned Frank Sinatra an Oscar nomination. Algren hated it and sued Preminger for changing the story.
Walk, published in 1956, follows drifter Dove Linkhorn from Texas to New Orleans and back, wading through a sea of pimps, hookers and other assorted undesirables. It opens with a description of Dove's father Fitz, a man whose belief that someone somewhere was cheating him was so ingrained that he "felt that every daybreak duped him into waking and every evening conned him into sleep." An apter description of some people who feel the world owes them something that it's not giving them I've yet to read.
What often fascinated Algren was how people who had little or nothing -- and for whom a whole lot of what they had was poisoned -- tried to retain some sense of their own humanity as they scratched and fought for the means to continue their spare and even sordid existence. He seemed much less interested in why folks with everything sometimes went bad and far more interested in why folks with nothing sometimes kept trying to be good, and he used his novels to try to call attention to the attempts, shredded or otherwise, of those living in what he called "the neon wilderness" to live with some level of dignity, compassion and love. More than one Old Testament prophet might have been able to read Algren with understanding. And us folks who also use the New Testament of our Bibles might find frequent mentions of a fella who kept reminding us that the least, the last and the lost have a place in God's heart as well.
Unfortunately for Algren, Walk was a close look at things that lots of people of the time didn't want to look at closely. Reviews bashed the novel, and the adoration that came to him following Golden Arm turned to disgust and then oblivion. He kept writing, but without much impact. An affair with Simone de Beauvoir led to repeated frustration and loneliness, as earlier affiliations with the Communist Party prevented Algren from getting a passport and living with her in France as he wished. After moving to New Jersey in 1975 and Long Island in 1980, Algren died of a heart attack in 1981. This was well before a re-examination of his work gained him some approval and before the trend of "anniversary editions" of books could have offered him some renewed recognition.
But it's likely that he wouldn't have found a lot of approval in a society that gushes oceans of ink, virtual and otherwise, over the lives and loves of actors, actresses and their bizarre homunculi, "reality show" stars. Or that folks who spend hours talking, writing and reading about people who've done nothing more than be the result of successful fertilization by properly wealthy sperm and ova and whose behavior would shame a cat in heat would care much about people who make less money, show less skin and have more sense.
Though the Nelson Algren Committee has been successful in getting his apartment named a historical site, in having the Nelson Algren Fountain built and in seeing all of his novels and short story collections come back into print, one of the honors the city of Chicago tried to give Algren didn't pan out. Evergreen Street was re-named Algren Street in his memory in 1981, but when the residents complained, the city changed the name back. Algren would probably have appreciated the fuss.
Saturday, March 21, 2009
This Is the Way the Show Ends...
Not with a bang or a whimper -- more like a clunk. We'll be reviewing the series finale of Battlestar Galactica here, so anyone who hasn't seen it and doesn't want to be spoiled should stop reading now.
Starting with a miniseries in 2003, Ronald Moore's reimagined Battlestar Galactica grabbed and repelled fans of the original series and science fiction in general. The 1978-79 show starring Lorne Greene, Richard Hatch and Dirk Benedict sometimes made it to the level of entertaining camp, but spent a lot of time being silly without seeming to realize it. Moore, not saddled with the original's need to milk the Star Wars crowd for viewers, got rid of the flashing lasers, evil emperor-styled thrones and cute furry robot-dog and went for a good deal more realism in terms of the military angle. He also added quite a bit of philosophical layering to the show, which raised questions about what it meant to be human. And he used his different characters to explore a range of theological issues unusual for television, not to mention unusual for the Sci-Fi Channel, home of "Sci-Fi Original" movies like Flu Bird Horror and Alien Apocalypse.
Moore used the basic story from the earlier series. A scientist named Baltar betrayed humanity to the Cylons, artificial life forms who then nearly wiped out humanity in a sneak attack on its Twelve Colonies. The last warship, the battlestar Galactica, collects a few other ships and about forty thousand people and seeks a way to escape the Cylons. They search for a mythical lost planet, home to the 13th tribe of humanity, known as Earth.
But Moore added some touches to give the story depth. Humans had created the Cylons, who rebelled against their makers in an earlier war. The Cylons had also developed models that duplicated human beings down to the cellular level, who moved among the human population unsuspected by their enemies. He gave Galactica much more of a submarine claustrophobia atmosphere than the shiny Star Wars set it resembled in 1978.
Over the course of the show, human beings tried to learn the location of Kobol, the ancient origin world, to see how to find Earth. The civilian government, led by President Laura Roslin, was sometimes allied with and sometimes opposed to the military leadership of Commander (later Admiral) William Adama. Fighter pilots like Lee "Apollo" Adama and Kara "Starbuck" Thrace played roles in both the military story and the character and plot development of the show. Cylons were found in the very midst of the ship's crew -- no one could assume they were safe. And in the middle of all this was Gaius Baltar, the scientist who had given defense computer codes to a woman he thought was a corporate spy but who turned out to be a Cylon. Baltar's treachery was never discovered, and at times he was a political and religious leader while almost constantly haunted by a vision of the Cylon lover to whom he had given the key to humanity's defenses.
Recently, Galactica found Earth, but realized it had been destroyed many years ago. Humans found themselves actually allied with Cylons, some of whom had split from their main group following a civil war. The search for a permanent home was complicated by distrust of the new allies, an abortive mutiny and political coup, and the mystery surrounding Starbuck, who had disappeared while exploring a mysterious planet and then reappeared with the directions to Earth.
So we wind down to the last episode. I've read some responses that suggest it was brilliant and others that it stunk. I personally think it was not bad, but it had a lot of things that go clunk! in it.
The "rag-tag fugitive fleet" finds the planet we call Earth, only we learn that the entire series has taken place about 150,000 years ago. The humans on the Earth they find are primitive, barely able to use tools. At first the colonists plan on settling and living as they did on their own worlds, but Lee Adama argues they should leave all of their technological trappings behind and make a new start in this new world. Everyone agrees, and we see the characters start to settle in on this new world, which they will call Earth because it represents the dream they have had since the beginning. Laura Roslin, suffering from cancer since the show began, finally dies while flying with the man who has come to love her, Admiral Adama. Starbuck, now knowing she was some kind of ghost or angelic messenger of the god the show frequently refers to, disappears, leaving Lee Adama talking to himself. Baltar and the resurrected version of the Cylon to whom he betrayed all of humanity plan to begin a life of farming together. And so on. The angelic messenger versions of this Cylon and Baltar himself show up again to tsk-tsk at how human beings some 150,000 years later (in our day) are once again trying to create artificial life and intelligence.
We do get a great money shot of the fleet moving slowly towards its destruction in the sun, arranged the way they were in the opening credits of the original series, with Glen Larson's original theme in the background.
So five years of show winds up as an eco-fable, which is silly but doesn't take away all of the great work that Moore, his writers and his cast have done. At least the dumb idea of getting rid of all the technology comes from Lee Adama, easily one of the least likable characters on the show other than Baltar and the chief Cylon villain, Cavil.
Plan to move in with a human population that has its own indigenous diseases, without bringing along the modern medicine that could protect both them and you? Clunk!
Figure on developing agriculture all over again without benefit of modern tools or anything to make the agricultural implements needed to do that? Clunk!
Tory Foster's airlocking of Cally Henderson earns her a broken neck at the hands of Henderson's husband, Galen Tyrol. But Baltar's complete betrayal of humanity -- selling out the humans to the Cylons at the attempted settlement on New Caprica, and collaborating with the Cylon regime there to the point of signing execution warrants -- earns him...happily ever after with his Cylon babe? Clunk!
Human/Cylon hybrid child Hera is the potential savior of the human race as well as the Cylons, but she ends up being nothing special to the immediate survivors who settle on Earth. The Baltar and Cylon angelic messengers suggest that a news story about finding a genetic ancestor to all human beings, who is sometimes called "Mitochondrial Eve" or MRCA (Most Recent Common Ancestor) refers to Hera. Unfortunately, since the theory of the MRCA debuted in the late 1980s, developments in molecular science and DNA research have called it into question. Clunk!
In any event, though the show ending didn't live up to the promise of the first two or three seasons, I've had a whole lot of fun watching it and thinking about it. I've appreciated the fact that a TV show didn't shy away from asking questions about God and depicting characters whose religious faith strengthened them and fueled their hope, rather than some sort of serial killer psychosis. So safe voyage, Galactica. Thanks for the ride.