Saturday, June 27, 2009

"Male and female baboons form platonic friendships, where sex is off the menu."

Take heart, beta males. It's not just our species:

BBC Earth News: Female baboons exploit chaperones

Having a caring friend around seems to greatly benefit the females and their infants, as both are harassed less by other baboons when in the company of their male pal.

But why the males choose to be platonic friends remains a mystery.


"So we really don't know what these guys got out of the friendship, other than maybe spending time with a mum and a new baby and having other females seeing this."

The suggestion here is that by chaperoning a female in a platonic relationship, a male might advertise his parental skills to other females, who then might consider him a worthy partner. But as yet, there's no evidence for this or any other reason why males become chaperones.

However, for the females, the benefits of having a chaperone are clear.

"We found direct evidence that friendships provided a social benefit to mothers and infants," says Nguyen.

"We found that mother-infant pairs who spent a lot of time with their male friends received a lot less harassment from other females in the group, and the infants cried a lot less too, than pairs who spent less time hanging out with their male friends."

Friday, June 26, 2009

Buzz Aldrin to NASA: U.S. space policy is on the wrong track

Buzz Aldrin, the second man on the Moon, shares his vision for the future of space exploration in an article for the August issue of Popular Mechanics, available online now. (Story via Republibot; click there for the executive summary.)

As I approach my 80th birthday, I’m in no mood to keep my mouth shut any longer when I see NASA heading down the wrong path. And that’s exactly what I see today. The agency’s current Vision for Space Exploration will waste decades and hundreds of billions of dollars trying to reach the moon by 2020—a glorified rehash of what we did 40 years ago. Instead of a steppingstone to Mars, NASA’s current lunar plan is a detour. It will derail our Mars effort, siphoning off money and engineering talent for the next two decades. If we aspire to a long-term human presence on Mars—and I believe that should be our overarching goal for the foreseeable future—we must drastically change our focus.

See also: Why NASA is still playing catch-up to Star Trek and China.

I'm not feeling stimulated, Redux (Adventures in Job Hunting, Episode 3)

As a "due diligence"-type followup to my earlier post, I offer the following Star Tribune article: Jobs for teens get needed jump-start in Dakota County:

His is one of about 300 teen and young adult jobs created this summer in Dakota County with $410,802 from the federal stimulus package. That's the local share of the $17.8 million sent to the state to generate jobs for disadvantaged teens and young adults ages 15 to 24.

The infusion of youth job funding more than doubles the usual 180 jobs that the Workforce Service of Dakota County funds annually through a state program. The jobs at five public and nonprofit agencies are available to teens and young adults who meet limited-income requirements or have developmental disabilities.

And it includes the surprisingly pertinent note:

The Dakota County libraries saw 33 applications in four days when they posted the library shelving assistant jobs. They plan to hire 18 teens to put books back on the shelves this summer, two for each of the nine branch libraries, likely starting next week.

I'm still not sure how I feel about this. As the first commenter on the Strib article noted:

WOW! 300 part time jobs created at tax payer expense and only 3,000 permanent jobs lost daily in Dakota County. Now that's change you can believe in. What do these 300 jobs entail, checking tire pressure on cars at stop lights?

Thursday, June 25, 2009

Adventures in Job Hunting, Episode 2

"Hudson company seeking an Entry-level, Full-time Administrative Assistant"

*eager click*

"Sorry, but the page you are looking for is no longer available. Use the Quick Search to look for similar jobs."


Or this winner:

"Administrative Assistant, Religion and Philosophy (Carleton College)"

*even more eager click*


Making it even better, the caps and double exclamation points were already there.

Friday, June 12, 2009

Why NASA is still playing catch-up to Star Trek and China

Thomas P.M. Barnett, writing for Esquire, discusses the private space industry and NASA's future under Obama.
All I can say is, thank God we never created a NASA for airplanes. Otherwise, we'd have to suspend the entire space industry's operations for months on end after every crash, lapsing into periods of official mourning each time some "national hero" was lost in airspace. Forgive me, but compared to all the inglorious ways people die here on earth, there's nothing particularly noble about dying in space — even if nobody can hear you scream.

I know, I know: Space travel is infinitely more difficult and way more expensive than air travel. But you have to admit that, if not for the Cold War and the "race to the moon" and "star wars" and so on, we'd have a far larger and more accessible private-sector space industry than the puny one we've got now. That, and we wouldn't still be dicking around with those disco-age space shuttles.

Think of where we could be now if it wasn't for Washington's bureaucracy and "failure of imagination" strangling opportunities for development of the final frontier. Clearly there's a libertarian/pro-free-enterprise argument to be made here, but I'll trust my readers' intelligence and leave it to you to connect the dots.

Suffice it to say that something drastic needs to be done soon to kick-start our space program, before China gets too far ahead for the United States and Europe to catch up. Space is too important a resource to be monopolized by any one country.

Further reading:
Andrew Liptak. "Exploration vs. Scientific Modes of Spaceflight." Worlds in a Grain of Sand, June 10, 2009.
Joe Pappalardo. "Private Space to the Government: 'Get out of the way!'" Popular Mechanics, June 4, 2009.
Virgin Galactic, Wikipedia.

Wednesday, June 10, 2009

I'm not feeling stimulated: Adventures in Job Hunting, Episode 1

"These positions are funded with stimulus funds and are available to youth who live in Dakota County, are between the ages of 15-24 and have a low family income, special needs or other risk factors."

Check, check, ch-- Oh.

As Philip Fry once said, "The underprivileged get all the breaks."

Friday, June 5, 2009

Fox's new pilot Virtuality brings the holodeck to NASA

Via Andrew Liptak's Worlds in a Grain of Sand.

The pilot movie for Fox's potential new series, Virtuality, has been moved up to June 26, at 8 Eastern, presumably on Hulu soon thereafter. It's helmed by Ronald D. Moore, the man behind the Battlestar Galactica reboot, much of Star Trek: Deep Space Nine, and a decent chunk of Star Trek: The Next Generation.

This is the first I've heard of the show, but it seems the original airdate was July 4, which even Fox realized would be suicidal: not even sci-fi nerds will be inside in front of their TVs on Independence Day. The press release:
The crew of the Phaeton is approaching the go/no-go point of their epic 10-year journey through outer space. With the fate of Earth in their hands, the pressure is intense. The best bet for helping the crew members maintain their sanity is the cutting-edge virtual reality technology installed on the ship. It's the perfect stress-reliever until they realize a glitch in the system has unleashed a virus on to the ship. Tensions mount as the crew decides how to contain the virus and complete their mission. Meanwhile, their lives are being taped for a reality show back on Earth in the World Broadcast Premiere of VIRTUALITY airing Friday, June 26 (8:00-10:00 PM ET/PT) on FOX.
I'll be honest: the description doesn’t particularly turn me on. The whole VR thing seems so ’90s now, with the element of "reality-TV in space!" added to make it topical to the 2000s, which are themselves almost over. And the ship is a clunkier Serenity with a satellite tower, but from the costumes it looks like they're going for a relatively near-future setting, so I can see why they opted for something that looks just slightly post-NASA, not a design like the Starships Enterprise (Starship Enterprises?). Essentially it's a series of holodeck episodes set on the ship from 2001.

Nevertheless I’m definitely willing to give this a try since any new (read: wholly original) space-travel sci-fi is very welcome right now, with Battlestar Galactica off the air and no Star Trek series foreseeable anytime soon. The first sentence of the release reminds me of J. Michael Straczynski’s short-lived Babylon 5 spinoff Crusade, where the mission was to search the galaxy for a cure to a nanovirus plague that would devastate the Earth within five years. (It turns out that Bill Lumbergh makes a good starship captain after all.)

io9’s script peek from May 2008 looks respectable as well, with some elements that make it sound a lot more interesting. For instance, the ship's doctor has Parkinson's disease, which means they risk losing their doctor during the mission; on the other hand, the next attempt at the mission would not come for another 20 years.

It remains to be seen how many of those script elements have made it into the final series, of course. I'm just hoping this is one of those shows whose execution is better than it sounds on the label.

UPDATE (via sleepysheepie): The FutonCritic reviews the pilot, explains the various subplots, and deems the show "without a doubt worth your time."

Earth's first starship is The Phaeton. Its mission: search for extraterrestrial life around Epsilon Eridani, one of our nearest Sun-like stars. Its 10-year journey is being financed by The Consortium, a mega-corporation that hopes to make back its investment through various sponsorships, most notably a "Big Brother"-esque reality show about the ship's 12 astronauts.

True nerds will note that Babylon 5 was stationed above the third planet of the Epsilon Eridani system; then again, the star is only 10.5 light-years away and Sun-like, so it has appeared frequently in science fiction.

Thursday, June 4, 2009

This Day In History 1940: Winston Churchill

June 4, 1940: Winston Churchill delivers his famous "We shall never surrender" speech. Click here to listen to the full twelve-minute speech or download an MP3; the famous part starts at roughly ten minutes. It still gives me chills almost 70 years later:
Even though large tracts of Europe and many old and famous States have fallen or may fall into the grip of the Gestapo and all the odious apparatus of Nazi rule, we shall not flag or fail. We shall go on to the end, we shall fight in France, we shall fight on the seas and oceans, we shall fight with growing confidence and growing strength in the air, we shall defend our Island, whatever the cost may be, we shall fight on the beaches, we shall fight on the landing grounds, we shall fight in the fields and in the streets, we shall fight in the hills; we shall never surrender, and even if, which I do not for a moment believe, this Island or a large part of it were subjugated and starving, then our Empire beyond the seas, armed and guarded by the British Fleet, would carry on the struggle, until, in God’s good time, the New World, with all its power and might, steps forth to the rescue and the liberation of the old.

According to Wikipedia (fascinating if accurate):
In the most famous passage, beginning "We shall fight on the beaches..." and ending "...we shall never surrender", the assertions consist entirely of Germanic words descending from Old English, while the only French-derived word is the thing rejected: "surrender".

Tuesday, May 12, 2009

Book review: The Suicide Collectors by David Oppegaard

"The Despair has plagued the earth for five years. Most of the world’s population has inexplicably died by its own hand, and the few survivors struggle to remain alive. A mysterious, shadowy group called the Collectors has emerged, inevitably appearing to remove the bodies of the dead. But in the crumbling state of Florida, a man named Norman takes an unprecedented stand against the Collectors, propelling him on a journey across North America. It’s rumored a scientist in Seattle is working on a cure for the Despair, but in a world ruled by death, it won’t be easy to get there."

The author is an '02 grad of my college, so I showed up at a recent signing/reading on campus and scored an autographed copy.

I know that listing comparisons is lazy and inexact, but with so many parallels suggesting themselves so easily here it's a handy shorthand to convey the "feel" of Oppegaard's story--so bear with me.

As seen in the description, the setup is very much in the vein of Children of Men (though with the depopulation a few stages further along), as well as M. Night Shyamalan's The Happening. The plot is basically a post-apocalyptic road picture, with our protagonist Norman encountering various groups of survivors and overcoming hazards natural and manmade. The mysterious Collectors, who spirit away the remains of the fallen, reminded me of the Strangers in Dark City, and the descriptions of a crumbling, recently abandoned America evoked The Postman (the movie, not the book, which I haven't read). And I mean that as a compliment, not a dig: I really enjoy that movie, its detractors notwithstanding.

As io9 pointed out, it "may be the first novel ever to have a blurb from Marvel Comics' Stan Lee and reviews comparing it to Cormac McCarthy's The Road." (Incidentally, the film adaptation of the latter is scheduled for October 16.)

Verdict: It was OK but not great. You can definitely tell it was a first novel. It has a solid plot and some great individual elements, but it doesn't quite live up to the promise of its setup, and the prose ranges from really good to extremely uninspired. I suppose it's silly to get hung up on vocabulary and simple sentence structure, but that's me. (I can't stand Hemingway for the same reason, which is probably why I never became an English major. I grant that there's probably a lot of great stuff of his that I'm missing out on; if that makes me a Philistine, so be it.)

I give The Suicide Collectors 3.5 out of 5. Then again, my expectations were high because reviews were very positive--for example, it's been named a finalist for the 2008 Bram Stoker Award for Superior Achievement in a First Novel--so your mileage may vary. It's pretty short and is definitely worth a read if you enjoyed any of the plots/films I mentioned above, but I'd check it out of the library or wait for the paperback.

I will say that I am looking forward to seeing what else Oppegaard comes up with. His followup novel Wormwood, Nevada will be released in December 2009, also from St. Martin's Press.

I'd follow my dreams if I could figure out what they are...

Today's quasi-inspirational message is brought to you by xkcd. Click for full view.

Cras te victurum, cras dicis, Postume, semper;
dic mihi, cras istud, Postume, quando venit?
Quam longe cras istud? ubi est? aut unde petendum?
Numquid apud Parthos Armeniosque latet?
Iam cras istud habet Priami vel Nestoris annos.
Cras istud quanti, dic mihi, possit emi?
Cras vives? Hodie iam vivere, Postume, serum est:
ille sapit quisquis, Postume, vixit heri.

-- Martial, Epigrams 5.58

You always say, Postumus, that you will live tomorrow.
Tell me: when will it come, that "tomorrow" of yours?
How far off is it? Where is it? Or from where should it be sought?
Does it lie hidden among the Parthians or the Armenians?
Already your "tomorrow" has as many years as Priam or Nestor.
Tell me: for what price can your "tomorrow" be bought?
You say you'll live tomorrow? It's already too late to live today, Postumus.
He is a wise man, Postumus, who lived yesterday.

Monday, May 11, 2009

Even at 140 characters, you're not safe.

Spammers have found a way to invade even Twitter: no longer just the clunky mass-following method, which I've encountered a couple of times already, but now a devious exploitation of hashtags and the Trending Topics feature in the sidebar. Mashable's Adam Ostrow explains:
How? Simply include the trending term in your tweet. Then, anyone who clicks the trending topic will see your ad, for free. Today’s example comes in the form of “Apple Shampoo,” a song from Blink 182 that is being shared aggressively today because of a tweet from band member Mark Hoppus (@markhoppus).

Here’s an example: most of the tweets below really have nothing to do with the band or the song, but are rather an ad for some sort of affiliate marketing scheme. Visiting the offending user’s account on Twitter, it’s clear that they’re simply using Twitter to push affiliate links, and now exploiting trending topics to gain more traffic.

Incidentally, I also came across this older but still very solid guide to the benefits and use and abuse of Twitter over at The Lost Art of Blogging. (Yeah, I'm months behind the curve on this one. So sue me. That's where the "curmudgeon" and "antiquarian" bits in the header come in.)

Wednesday, May 6, 2009

"Nation ready to be lied to about economy again"

Satirical paper The Onion publishes another winner.
WASHINGTON—After nearly four months of frank, honest, and open dialogue about the failing economy, a weary U.S. populace announced this week that it is once again ready to be lied to about the current state of the financial system. [...]

"I thought I wanted a new era of transparency and accountability, but honestly, I just can't handle it," Ohio resident Nathan Pletcher said. "All I ever hear about now is how my retirement has been pushed back 15 years and how I won't be able to afford my daughter's tuition when she grows up."

"From now on, just tell me the bullshit I want to hear," Pletcher added. "Tell me my savings are okay, everybody has a job, and we're No. 1 again. Please, just lie to my face." [...]

"I know when he's telling the truth, and it bothers me," recently laid-off schoolteacher Mary Hanover said of Obama. "He gets this serious expression on his face and says things like, 'This is the worst economic crisis since the Great Depression.' Who needs to hear that? For Christ's sake, smile a bit and say we just found a diamond mine under Montana that's going to pay for everything. I'll believe you."

Thursday, April 30, 2009

Pixar's Up: the latest theatrical trailer

Sadly, I still haven't found time to watch WALL-E. And I never bothered with Cars, having long since evolved past the stage where talking cars are cool--i.e., I turned seven.

But this looks like another winner from Pixar. I like the premise--it reminds me of William Pène du Bois' 1947 children's novel The Twenty-One Balloons--and, judging from this footage, it definitely has a great sense of understated comedic timing.

Up hits theaters May 29.

Socialism hits home for students in "Texas Tech" morality play

Here's one of those email forwards that boils an extremely complex issue into a little parable, so enjoy the message but take it with the grain of salt that it merits. This one makes an economic point that will resonate with students and anyone who has ever been a student (i.e. basically everyone).

An economics professor at Texas Tech said he had never failed a single student before but had, once, failed an entire class. That class had insisted that socialism worked and that no one would be poor and no one would be rich, a great equalizer. The professor then said ok, we will have an experiment in this class on socialism.

All grades would be averaged and everyone would receive the same grade so no one would fail and no one would receive an A. After the first test the grades were averaged and everyone got a B. The students who studied hard were upset and the students who studied little were happy. But, as the second test rolled around, the students who studied little had studied even less and the ones who studied hard decided they wanted a free ride too; so they studied little.. The second test average was a D! No one was happy. When the 3rd test rolled around the average was an F.

The scores never increased as bickering, blame, name calling all resulted in hard feelings and no one would study for the benefit of anyone else. All failed, to their great surprise, and the professor told them that socialism would also ultimately fail because when the reward is great, the effort to succeed is great; but when government takes all the reward away; no one will try or want to succeed.

Could not be any simpler than that....

It's quite clearly a fabrication, largely obvious from the lack of specifics as well as the fact that no college administration would allow such an experiment to continue once they got wind of it. Still, despite the obvious hyperbole in the last line, it possesses a kernel of truth that we all recognize.

The piece surfaced in March and has been making the rounds of blogs and conservative mailing lists, hitting my inbox from a Libertarian Party Yahoo group I belong to. You may even have seen it already. The piece even got a writeup on Snopes, who add that the tale is at least as old as 1994.

ABC joins Hulu, catches up to the 21st century

Upon closing, the agreement will enhance Hulu's programming line-up through the expanded online distribution of Disney's most popular current and library primetime series and library feature films. In particular, full-length episodes of hit current and library programs like Lost, Grey's Anatomy, Desperate Housewives, Private Practice, Ugly Betty, Scrubs, Greek, Hope and Faith, Less Than Perfect, Wizards of Waverly Place, Phineas and Ferb, Who Wants To Be A Millionaire, General Hospital, The View and The Secret Life of the American Teenager will soon be streamed on Hulu on an ad-supported basis. [...]

Jonathan M. Nelson, CEO of Providence, said "Hulu is creating significant value for users, advertisers and content owners. This balance, together with aggregated professional content and an expanding base of over 200 brand advertisers, is establishing Hulu as a compelling online video monetization platform. Hulu is a bright spot in the new media landscape."

Unfortunately for me, I don't watch any ABC shows, except occasionally Scrubs, but it's always good to see more networks abandon their shitty, laggy proprietary players and join the Hulu bandwagon.

And anyway, there's better stuff in their back catalog which will, I'm sure, make its way online eventually, such as Roots, Boy Meets World, Spin City, Eli Stone, Max Headroom, the Young Indiana Jones Chronicles (although with the DVDs only just released in 2007 and 2008, I'm under no illusions the eps will be posted anytime soon), Galactica 1980 (the original Battlestar Galactica is already on Hulu), even Back to the Future: The Animated Series.

Because, come on, Bifficus Antanneny absolutely needs to see the light of day again, am I right?

Friday, April 24, 2009

Kids and media: Part II

No in-depth commentary here--just today's Zits comic that seemed a nice followup to my earlier, slightly more intellectual post on communication for the next generation. Click for larger, clearer version and have a nice weekend.

Sunday, April 19, 2009

The power of suggestion

A very cool old commercial: simple, yet well executed and memorable.

It's a good reminder that a commercial doesn't need to be elaborate to be effective. I was also going to say it doesn't need to be expensive, but Pepsi probably spent as much getting Michael J. Fox as they spent on, say, this futuristic ad which has a much more striking production.

The YouTube link dates it to 1985, although this post says Michael left the Back to the Future III shoot for three days to film these commercials, which would put it a few years later.

Tech convergence FTW: Adventures in The New Media

Well, I've finally caved and signed up for Twitter, mostly out of curiosity.

I already had Facebook Mobile enabled, which meant I could update my status via text message. Now I've hooked Twitter up to simultaneously update my Facebook status, and I can update Twitter via text too. So with a single text, the same line shows up on Facebook, on Twitter, and in the Twitter feed gadget in the sidebar of this blog. (I've been trying to get my Twitter RSS to merge right into my blog column, but no such luck.)
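For the curious, the RSS-merging idea is conceptually simple: fetch the feed, pull out each item's title and link, and render them as an HTML list you can drop into a sidebar. Here's a minimal sketch using only Python's standard library. The sample feed and the helper name `rss_to_sidebar_html` are invented for illustration; the sample only loosely mimics the per-user RSS feed Twitter exposed at the time, not its actual format.

```python
import xml.etree.ElementTree as ET

# Stand-in for a Twitter-style per-user RSS feed (fields and URLs
# are hypothetical, for illustration only).
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>example_user's tweets</title>
    <item>
      <title>Testing my new text-to-everything pipeline.</title>
      <link>http://twitter.com/example_user/statuses/1</link>
    </item>
    <item>
      <title>Next up: direct corneal feed!</title>
      <link>http://twitter.com/example_user/statuses/2</link>
    </item>
  </channel>
</rss>"""

def rss_to_sidebar_html(rss_text, limit=5):
    """Extract the newest items from an RSS 2.0 feed and render them
    as a plain HTML list suitable for pasting into a blog sidebar."""
    root = ET.fromstring(rss_text)
    items = root.findall("./channel/item")[:limit]
    lines = ["<ul>"]
    for item in items:
        title = item.findtext("title", default="")
        link = item.findtext("link", default="#")
        lines.append('  <li><a href="%s">%s</a></li>' % (link, title))
    lines.append("</ul>")
    return "\n".join(lines)

print(rss_to_sidebar_html(SAMPLE_RSS))
```

In practice you'd fetch the live feed over HTTP and paste (or script) the resulting list into the blog template; escaping the titles for HTML would also be wise before publishing.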

The verdict on this new toy is pending, although I had fun wasting the better part of two days combining Facebook, Twitter, Blogger, Gmail, texting, and Netflix (via a Facebook app which updates my newsfeed with Netflix activity) into a single superhighway of overshare. Next up: direct corneal feed!

All this resulted, of course, in a paradigmatic anecdote of accidental tech overkill: I texted a test post to my Twitter ... which simultaneously updated my Facebook status ... which generated a text notification on my friend Seth's Bluetooth phone.

He was in the next room.

I wonder if this is how Alexander Graham Bell felt.

Friday, April 17, 2009

Kids and media literacy: A young adult's more optimistic perspective on the next generation

"I've never really been very interested in computers themselves. I don't watch them; I watch how people behave around them. That's becoming more difficult to do because everything is 'around them.' " -- William Gibson, February 2006*

It can be difficult to take the topic of children's media use seriously—there is a venerable genre of "sky is falling" articles decrying the declining morals of the younger generation and the inability of parents to cope with advancing technology and an overbearing media. Yawn. But as much as we might roll our eyes, we cannot deny that media technology has changed more in the past couple of decades than in most of human history. It definitely bears studying.

One topic stood out for me: young children's Internet usage—particularly how they communicate with each other, post their own content, and access information. As a member of the first computer-native generation, and one which is now beginning to reproduce, I hope to bring a slightly more nuanced perspective than the knee-jerk Luddism of older generations of educators and child advocates.

The last of a breed

The current generation of college students is the last generation to really remember what it was like "before the Internet." The Gen-Xers (born 1965 to 1982) were older children by the time powerful home computers became widespread, and most of them were long past their formative years before the Internet had become what it is today. We are the transitional generation—the Web came into widespread use during our childhood and teenage years—thus we are the last one to understand firsthand its transformative effects.

Today’s middle- and high-schoolers are at the forefront of this wave. They have grown up with a mouse in one hand and a text-messaging cellphone in the other, but they can still appreciate the significance of recent advances such as the rise of YouTube and Wikipedia and the near-ubiquity of broadband access, which enables streaming video and the uploading and downloading of massive files.

The cycle accelerates?

Our generation is starting to reproduce (scary, I know), and it will be interesting to watch what happens when a highly computer-literate cohort raises children of its own. Many current parents are astoundingly computer-illiterate, particularly when it comes to their children's web usage. Their ability to control the content to which their children are exposed is stymied by this technological barrier. They have enough trouble with television, movies, and music—media in which they themselves are quite technologically conversant.

People will argue that filtering software and V-chips will never work, since kids tend to be more tech-savvy than their parents and will inevitably get around the restrictions. This is true, but it misses the point. Kids will always get their hands on forbidden content, but you don't need to hand it to them on a silver platter. Resigning yourself to the fact that your eight-year-old will sneak a look at a dirty magazine or watch Saw V at the neighbor kid's house is different from piping Laguna Beach and Nip/Tuck directly into his or her bedroom. Restricting content communicates the message that this sort of material is not suitable for them at their current age. (Some types of content, of course, are unacceptable for users of any age, a message that all too many parents seem to have failed to pass along.)

The rebuttal, of course, is the "forbidden fruit" argument: that which is banned will only become more desirable. This is a fair point, but the idea here is not to be arbitrary and draconian, but to combine content restriction with talking to your children about what kinds of programs and images are acceptable/unacceptable, and the reasons for it.

Ubiquitous connectivity

The fact that today’s children are more browser-savvy than their parents is only one component of their technical literacy. There are wider sociological effects as well. Children who have been born in the last few years will never have experienced a world without broadband internet, cell phones, and other forms of connectivity. Because of this, they are increasingly growing up online as much as offline:
“Young students don't differentiate between the face-to-face world and the internet world," said Susan Patrick, who oversees technology for the [Department of Education]. "They were born into the age of the internet. They see it as part of the continuum of the way life is today.” (Wired*)
This manifests in many ways, but particularly in posting user content, interuser communication, and the ways they think about access to information.

There is a general trend among adult and especially teenage users for user-generated content to fragment into smaller and smaller pieces, updated increasingly often. We have gone from emails, to blogs, to instant messaging, to sites like Twitter where users post "tweets" or brief status updates of 140 characters or less.

And with cellphone and wireless access, it becomes possible to update in real time. Updates grow ever more frequent as the size and complexity of the content shrinks correspondingly. I think we are approaching the smallest possible conceptual units, beyond which meaning begins to break down. Indeed, we are also approaching the point where it all blends into a continuous hum of background noise. And while many of us are still entranced by its novelty, perhaps the next generation of kids will learn to tune it out from the beginning, simply as part of their coping mechanism, the same way we learn to tune out extraneous conversations in a room.

The older generation likes to rant about Twitter, but the thing is, it's not for you. And anyway, the quality (or lack thereof) of its content is the fault of the users, not the medium itself.

A more intellectually significant, but still related, advance is the ubiquity of easily-searchable information on the Internet. Of course, databases and searchable catalogs have been around for decades, especially privately, but every year more and more information becomes quickly and publicly accessible. Examples include Google, Wikipedia, and any number of scholarly catalogs like JSTOR.

Wikipedia is something of its own beast, with several obvious drawbacks, so it is perhaps most useful as a portal guiding you to further sources, both online and in print. (This is how I've used it in my own academic and personal research.) For many it has become the first destination for information queries, but one would hope that most people don't stop there.

As more and more content is uploaded to the web, and the web becomes accessible nearly everywhere (with the advent of the Mobile Web on cell phones, Wi-Fi hotspots in restaurants and libraries, and municipal wireless networks going up around the country), the link between information access and physical location is being severed. This conceptual shift is important because it changes the way people learn: there is a growing realization that accurate information is only a wireless device away. This is still a novelty for older users accustomed to physically hunting down a book to look something up. The next generation, and to a lesser extent my own, will soon take it for granted.

Media background noise

Furthermore, I argue that the astounding number of households with almost continuous television activity is evidence of a related trend:
Two out of three zero- to six-year-olds live in homes where the TV is usually left on at least half the time, even if no one is watching, and one-third live in homes where the TV is on “almost all” or “most” of the time; and children in the latter group of homes appear to read less than other children and to be slower to learn to read. ("Zero to Six: Electronic Media in the Lives of Infants, Toddlers, and Preschoolers"*)
A television left running all the time, even if no one is watching, may communicate to children the message that it is an appropriate thing to be absorbing continuously. (Conversely, it may end up as simply background noise to children, as noted above.) It is, in some ways, the opposite of user-generated content, because television viewing is a passive exercise, especially if it's running while the viewer is reading or writing something else. And yet, at the same time, as television converges with the Internet, more and more of the content on the "tube" is user-generated:
Postmodernism conceived of contemporary culture as a spectacle before which the individual sat powerless, and within which questions of the real were problematised. It therefore emphasised the television or the cinema screen. Its successor, which I will call pseudo-modernism, makes the individual’s action the necessary condition of the cultural product. Pseudo-modernism includes all television or radio programmes or parts of programmes, all ‘texts’, whose content and dynamics are invented or directed by the participating viewer or listener (although these latter terms, with their passivity and emphasis on reception, are obsolete: whatever a telephoning Big Brother voter or a telephoning 6-0-6 football fan are doing, they are not simply viewing or listening). (Alan Kirby, "The Death of Postmodernism and Beyond"*)
Children now enter the world without preconceived notions of divisions between “passive” television content and “interactive” Internet content, or between “online” and “offline” experiences, and it will be interesting as time passes to watch how the next generation, growing up entirely within a wired context, will come to terms with the ever-increasing convergence of all these media forms.

What does it mean?

Our generation and those that came before it have had to integrate, to one degree or another, newly invented technologies into our existing lives, but the children of tomorrow will have all these tools available from the beginning, and no one yet knows how they will make sense of them or what uses they will put them to. But I'm definitely looking forward to finding out.

* Works Cited:

Associated Press. "Pre-schoolers Play Online." Wired, 4 June 2005.

Kaiser Family Foundation. "Zero to Six: Electronic Media in the Lives of Infants, Toddlers, and Preschoolers." 2003.

Kirby, Alan. "The Death of Postmodernism and Beyond." Philosophy Now, November/December 2006.

PC Magazine. "Q&A: William Gibson." February 2006.

Wednesday, March 25, 2009

Is Facebook the web's Wal-Mart?

So says Michael Brush on MSN Money:

As a one-stop shop that lets users easily build networks of friends to share news and photos, join groups and search for school and work buddies, it has the potential to bury MySpace, and other competitors the way Wal-Mart has busted local retailers.

In fact, even giants Google (GOOG, news, msgs), Yahoo (YHOO, news, msgs) and Microsoft's (MSFT, news, msgs) MSN might be getting nervous, because tools such as instant messaging and e-mail are built right in.

Brush notes, citing Nielsen, that time spent on social networks and blogs grew 63% from December 2007 to December 2008. Also, the fastest-growing age group is 55-plus, followed by 45-54--which is not actually surprising, since pretty much everyone younger than that is already on networking sites.

Brush's thesis is that Facebook's biggest advance is not sheer numerical growth (amassing members and generating traffic), but the fact that as it grows it takes over more and more of the functions of other sites--IM chats, MySpace's music pages, etc.--becoming, as Brush notes, a one-stop shop. This is convenient for users, but by its very nature it creates a monopoly that squeezes more specialized sites out of the market.

The difference, of course, is that Facebook is not a straightforward profit-making enterprise like Wal-Mart. It relies on ad dollars and more nebulous notions of network-building followed by profit:

In the end, however, Facebook knows so much about its users and has gotten so big -- so much like Wal-Mart -- that it's likely to find some way to make a decent profit. "When you gather a large enough audience, the means will come in terms of generating significant revenue from that," says Darren Chervitz of the Jacob Internet Fund (JAMFX).

Facebook is growing so fast that it might even be a threat to the Internet giants someday, one analyst says.

By 2012, Facebook could surpass Google for total worldwide unique visitors, predicts RBC Capital Markets analyst Ross Sandler.

One reason is that so many people now use Facebook as their starting point on the Internet -- instead of a search engine or a portal. Whether Facebook will actually hurt Google's profit margins, or produce Google-size profits, remains to be seen.

Facebook locked up the network-building phase of the project long ago; now it just needs to make sure it can keep getting money out of it. The more niches it conquers and the more market share it gains, the easier this will be--just as long as it doesn't bombard users with enough ads to kill the goose that lays its golden eggs. But if Facebook really does get big enough to rival Google (something of which I remain skeptical), then networking users will have fewer alternatives to run to--at least until the next big thing debuts and the life cycle starts over again.

Friday, March 20, 2009

Body-building cop's day in court turns ugly


Be careful how you set your mood on MySpace, your status on Facebook and never post dumb comments on video sites because it all can and will be used against you in a court of law.

Most of the article consists of fairly standard warnings not to incriminate yourself on networking sites, along with the note that anything you post to the web will propagate pretty much forever. Hot news if this were 2006.

The really interesting part comes on the second page:

Nick Abrahams, a partner at the Sydney office of law firm Deacons who specialises in technology and media law, said the case reminded him about a famous New Yorker magazine cartoon which shows a dog at a computer accompanied by the words "on the internet, nobody knows you're a dog".

"... this anonymity just doesn't apply anymore. Everyone is accountable for their actions online now," he said. "The internet has come of age and the anonymity has gone."

Note that last bit.

The cartoon and its equally famous quote date from July 5, 1993, when the most sensational part of the nascent Internet was its ability to hide you behind screen names. At worst, you could create a wholly alternate identity for malicious purposes; at best, you simply tailored your online presence to showcase only your positive attributes. (On sites like Facebook, every bit of personal information has been put there voluntarily and is usually chosen with the intent to make the user "look good," or at least neutrally inoffensive.)

But even in the last few years there has been a speedy convergence of "the Internet" and "the real world" (particularly in today's children and teenagers who have grown up with the web and therefore don't distinguish the two as completely separate realms--more on this in a future post).

This is a result of media like Twitter, Facebook, Blogger, YouTube, email, and personal websites becoming increasingly interlinked, and indeed, as Abrahams said, the comfortable anonymity of yesteryear has crumbled away with this increased connectivity.

It's always been the case that anything you post online can be traced back to you, but these days it's easier than ever--and to a large degree people have done it willingly. Increased transparency isn't necessarily bad, but as Mr. Ettienne found out, it certainly has its downsides.

Creative writing is not Mad Libs

I was reading through some archived posts on the website of Orson Scott Card (the Hugo and Nebula award-winning author) and came across this one on Themes and an even older one on Plagiarism, Borrowing, Resemblance, and Influence.

From the first one:

If the writer has a preconceived conscious plan for how to present a particular philosophical point, he will start to ignore his own unconscious ideas and will force the characters to act out his little allegory.* The result is: Bad fiction, and therefore an ineffective presentation of the theme. But if the writer shunts aside those preconceived plans, or subverts them deliberately (i.e., make THOSE ideas belong to a character that the audience is supposed to despise), that very humility leaves the writer free to tap into his unconscious feelings and ideas about how the world works and what is worth telling tales about.

The reader who gets the story that truthfully and powerfully connects with the real world by way of the writer's unconscious understanding of it WILL find "themes" in the story. But they won't necessarily be themes that the writer was aware of, and will almost never be themes that the writer "put" into the tale.

*(Tolkien also disliked allegory, for the same reason--its artificiality.)

It's definitely worth clicking on through to the full columns. OSC also has some words about "literary writers who try to write about themes":

In a way, this is identical to "hack" work - trying to insert elements that will please a particular kind of audience. Most of the time, when these stories work at all, they do so, not because of the "plan" of the work but in spite of it, because of unconscious concerns that bubble up into the story and give it life despite the deadly story-killing "theme" elements that the writer consciously manipulates.

He explicitly says that the writer can't force themes into his work but has to trust that his unconscious will make the connections as he composes.

This is very similar to what he says about influences, in the other column. The first two-thirds is pretty standard stuff about plagiarism, working with sources, and so on. But when he gets to "derivative" creation and "the anxiety of influence," it really gets interesting:

The problem is that real influence is (or should be) unconscious. That is, because you have read certain writers whose stories have been thoroughly absorbed into your memory, you will unconsciously borrow motifs and ideas from those pivotal works without even realizing you're doing it.


Some novice writers, having absorbed utterly wrong lessons about what makes good writing, try to be "influenced" by writers they admire. This is not influence, however - it is borrowing. And it's legitimate, though it is customary to acknowledge your conscious borrowings.

If you want to follow in someone's creative footsteps, you can't just deconstruct their work into a list of representative attributes and try to string them together, or you won't get a coherent product. You'll get either something mechanically derivative or a Frankenstein mess.

OSC also notes:

Many writers, however, far from borrowing or seeking to be "influenced," are desperately afraid of inadvertent influence to the point of paranoia. Since every good idea has already been used, getting too anxious about such chance resemblances is a waste of time. Here's my rule: Any idea you really like that absolutely works for your story is your idea, no matter who else might have used it before.

Similarly, in the acknowledgements to Star Wars: Allegiance, Timothy Zahn noted:

Often a writer's mind functions like a giant food processor, taking in thoughts and ideas from everywhere and then mixing and matching the pieces until something new (or at least unrecognizable) emerges. On the rare occasions when we're actually able to trace something directly to its source, it's only right we acknowledge it.

Creative writing is not plug-and-play, and it's not Mad Libs, but on the subconscious level it is an exercise in mix-and-match. Not to downplay the obvious importance of conscious composition, but beneath it the creative brain combines fragments of everything it has read or seen into new configurations, and half the time the original ingredients aren't recognized until much later.

Finally, 18th-century author Samuel Johnson said the following:

When a man writes from his own mind, he writes very rapidly. The greatest part of a writer's time is spent in reading, in order to write: a man will turn over half a library to make one book.

Looking back at my own ratio of books written (zero) to libraries overturned, I can say he was definitely on the right track.

Thursday, March 19, 2009

The future is just like old age... it's always a few years ahead of wherever you are.

io9's Alyssa Johnson takes a look at sci-fi movies, TV, and books set in 2009 and how they stack up to where we are now.

To be sure, these retrospectives come out every year and are a dime a dozen, but the story caught my eye and it seemed a good enough way to start off the blog.

And come on, any list that includes Family Matters with Freejack and The Postman has got to be good. Right??

The Future Now: Science Fiction Set In 2009

It may be March, but that still counts as the start of the year, right? Let's take a look at what movies, television, and books have predicted for us in the days to come...