Well, the "Mayan Apocalypse" passed without incident. Given that life is apparently going to go on, I'd like to take a minute to register some thoughts on another end-of-the-world (as we know it) theory popular among technophiles: the Singularity.
At its most generic, the term "Singularity" refers to a point in the future at which change begins to occur so rapidly that it becomes completely impossible to predict what will happen. It's the equivalent of a black hole's event horizon, the point past which light can no longer make its way out. After that point, we have no idea.
In that simple context, it's an interesting question to ponder: at what point does our ability to predict the future become so poor as to be essentially worthless? I'd actually argue that the answer is a lot sooner than most futurologists think, but more on that later.
The problem is that the popular interest in the Singularity is based on notions of accelerating computing power and the replication of human intelligence, or some other kind of "Strong AI" with the potential to self-evolve. Essentially, some kind of artificial mind takes the driver's seat for technological development, at which point all bets are off because it will move much faster than we can imagine. Maybe we'll be immortal. Maybe we'll become post-human. Maybe SkyNet will kill us all.
It's fun to speculate about such things, and I'm not arguing against futurism or science fiction. I enjoy both quite a bit. However, I do see a number of fairly obvious flaws in this increasingly popular gestalt that I feel the need to point out, if only to make way for more interesting or pertinent speculation.
Remember: it's the "End of the World as We Know It," not the End of the World.