Tag Archives: science

DEATH ASTEROID FROM SPACE!!!!1!!!!!

Tomorrow, a lump of rock the size of an aircraft carrier will pass between the moon and Earth. Although there will no doubt be conspiracy theorists convinced this thing is going to hit us and do to humanity what another asteroid did to the dinosaurs, reputable agencies are adamant that there’s no danger of it slamming into us and triggering earthquakes, tsunamis and the collapse of civilisation.

Of course, they would say that. This blog couldn’t possibly comment.

 

 

*

 

 

 

 

 

 

 

 

* Please note – may not necessarily be accurate.

Ada Lovelace Day 2011: Telling Stories

I’m not a scientist.

This probably comes as no surprise, given the contents of this blog, but I am interested in the history of science, how discoveries impacted the society around them and vice versa. That’s why I’m interested in the subject of today’s commemoration.

Ada Lovelace was the daughter of Lord Byron, not that they had much of a relationship. Discovering a talent for maths (and earning the nickname ‘the Enchantress of Numbers’), she was the first to recognise the true potential of Charles Babbage’s primitive computing ‘Engines’, going so far as to work out a prototype computer program. It was never used, mainly because of the limitations of the technology, and somewhere along the line Ada’s contribution to the birth of computing became an oft-forgotten historical footnote.

And that’s why today is Ada Lovelace Day, an international day of blogging to celebrate female pioneers in science, engineering and computing whose stories aren’t as well recorded as their male counterparts.

For instance, the majority of staff at Bletchley Park were Wrens – members of the Women’s Royal Naval Service – working on breaking Nazi codes and operating some of the earliest electronic computers, such as Colossus; some of their memories are recorded here.

Those Wrens fall within something of a tradition, because before computers were machines, computers were people, with the term referring to a fairly menial role manually crunching numbers for navigational charts, scientific data and the like. One of these ‘computers’ was Henrietta Swan Leavitt who, while routinely cataloguing stars for Harvard College Observatory, figured out the basis of measuring distances between astronomical objects, which in turn provided evidence for the expansion of the universe. Not bad for $10.50 a week, although sadly you won’t be surprised to hear that she received no recognition for this until after she died in 1921.

The list of unsung female heroes of science goes on. Henrietta Lacks was an African-American woman who died of cervical cancer in 1951 – her cells turned out to be remarkably resilient and became known as the HeLa line, used to make breakthroughs in research into AIDS, cancer and polio, among others. Rosalind Franklin did much of the research that led to our understanding of the structure of DNA, but because her work was published later than that of Crick and Watson, and because of her early death at the age of 37, it has often been overlooked.

It would be cool if… Well, I was going to say if the next Steve Jobs was a woman, but a) it’s uncomfortably soon to be talking about the next Steve Jobs, and b) it’s better to concentrate on being the first you than on being the next anyone else. And yet there’s something in this – in the UK, men are almost six times more likely than women to be employed in SET occupations. As a UKRC research report states, “The under-representation of women in SET is increasingly seen as an issue affecting economic growth and productivity… Research suggests that diverse teams that include men and women are important to innovation and economic development.”

Novelist Neal Stephenson has written an article on ‘Innovation Starvation’, about how we seem to have lost a sense of technological optimism and the resulting inspiration that leads us to carry out epic scientific and engineering projects. There are probably many reasons for this, but one seems fairly obvious – about half the population has become marginalised from contributing to a solution. The first programmer may have been a woman, but the general perception of computing is still that of a male-dominated industry, and that sort of perception has ramifications.

One of the potential solutions to this innovation starvation Stephenson has been involved in is a rediscovery of science fiction as a vehicle for big, inspirational ideas rather than an exploration of tech’s darker side. And maybe that will tie in to the growing visibility of women in SF fandom (another field falsely perceived as overwhelmingly male). That puts an onus on many sci-fi writers, particularly those in more populist media like comics – write better female characters!

That’s not quite as simplistic as it sounds – we help form our society through the stories and narratives we tell, and, well, you can write education strategies till they’re coming out of your ears, but I’m still willing to bet that more people have heard of Watson and Crick than of Rosalind Franklin, and more people have heard of Charles Babbage than of Ada Lovelace. Maybe Ada Lovelace Day’s importance lies simply in making sure we tell a wider range of stories, and that they’re told well – inspirational and aspirational.

After all, Ada’s dad was a poet…

Gutenberg Burning: Information may want to be free, but that’s not the whole story

On Tuesday, Michael S. Hart, founder of Project Gutenberg, died at the age of 64. Project Gutenberg was my first exposure to ebooks, a fantastic resource making thousands of out-of-copyright texts freely available to anyone with internet access and a nice argument for the theory that information wants to be free.

However, just because all this information is out there, doesn’t mean we’re heading towards a techno-utopian society where the total knowledge of humanity is available on a smartphone and where we’re all enlightened and informed about, well, everything. Sure, we’ve got access to more information than ever before, but let’s be brutally honest – we don’t know what to do with it.

So yesterday the Republican presidential candidates held a debate in which climate change sceptics were lionised as heirs to Galileo. A comic book espousing 9/11 conspiracy theories has been published by respected players in the medium. And, of course, there are the modern conspiracy classics achieving traction within the mainstream: Obama’s not really an American. We never went to the moon. Vaccines cause autism.

And I’m aware that, by dismissing these theories, I sound like a closed-minded reactionary. The evidence is stacked against those ideas, the data doesn’t support them, the information renders them urban myths and dogmatic paranoia, but none of that matters nowadays because, while information may or may not want to be free, stories, narratives and memes always have been; the wild is their natural home and we’ve just made it easier for them to spread.

Back in the day, storytellers used to control our narratives – bards, skalds, griots. Somewhere along the line, that control shifted from these outsiders towards power structures – governments, churches, institutions. They were all aware of the power of stories, the way in which information and data and history and people can be woven together to create narratives that inform and define our societies and communities.

The internet has changed all that, and that’s something to be grateful for – it’s great that bloggers, musicians and other creators can get their work out there without having to go through the byzantine structures of the publishing or music industries. At their best, Wikileaks are releasing some hugely important data that deserves to have consequences for those who’d rather keep it quiet.

But there’s a dark side to this. A public spat between Wikileaks and the Guardian has led to the release of information that may threaten the safety of innocent people, activists and whistleblowers who should have had their names redacted but who may now be endangered because a small-p political narrative replaced a more moral storyline. Conspiracy theories take hold, demonising illegal immigrants, gay people, liberals, anyone existing outside the competing narratives warring for our attention. And we know this, we know these stories are often unsupported by evidence, but still they take hold – contradictory information exists, is freely available, but it doesn’t stick. Why would it? Anyone can edit Wikipedia. Anyone can write a blog. And who cares if something’s been peer reviewed – why should we trust the peers doing all that reviewing? Easier to assume the Enemy is wrong than go to the effort of reconstructing our narratives.

Of course, we did train and employ a whole bunch of professionals to help us find, sort and curate all this information we have at our fingertips. We called them librarians, but then we decided they were unnecessary and could be replaced by volunteers (if we were gracious enough to accept that we needed a library in the first place). And this is happening in the UK, in America, in Canada, in…

I don’t know where we go from here. I’d like to say we were heading for a more informed political discourse, but it seems to be getting worse. Maybe the issue isn’t really about information wanting to be free anymore; after all, a lot of it already is. Maybe the real issue is the story we tell with that information, how we use it and communicate it. The people peddling lies and half-truths already understand this: it’s time to stop seeing information as the be-all and end-all; it’s the paintbrush, not the painting. And if we don’t grasp hold of that idea, the pictures we paint may have disastrous consequences for ourselves, our world and our communities.

On This Day in History: The Carrington Event

Nice article over at Wired telling the story of the Carrington Event, a solar storm that, in 1859, caused disruption and general craziness throughout the fledgling telegraph service. The story is interesting in itself, but you can’t help thinking it should be the basis for a great episode of Doctor Who.

T Minus 12 Days: The final curtain for the shuttle programme

If I could go back in time and witness any event of the 20th Century, I think the moon landing would be top of the list. Maybe it’s my inner geek, but the whole idea of looking up and seeing the moon, that big ball of rock that’s been Earth’s companion for billions of years, and knowing that human beings have actually been up there, knowing that technical ingenuity is capable of getting people over 300,000km from home and back again, with less computer processing power than most household gizmos (probably including my microwave)… It’s kinda awe-inspiring.

Earlier today, the space shuttle Atlantis launched for the last time, marking the end of NASA’s shuttle programme. It’s poignant – while the private sector seem to be picking up the reins of the getting-humans-into-space industry, the shuttle always seemed, to me, to be part of a lineage that included the Mercury and Apollo programmes. Somehow, with the end of the shuttle project, it feels even less likely that I’ll see a human being land on Mars in my lifetime. And sure, the shuttle was never going to get us there, but these things have a symbolic value…

But I’m biased. I want to see us go back into space. I know what people are saying – it costs too much, there are problems to be solved here on Earth. Well, yeah, but we haven’t gone beyond our own cosmic back garden for almost forty years and you know what? Those problems still need fixing.

That said, the challenges of the 21st century seem more inward looking than concerned with heading further into space. The internet and social media seem to be rewiring society at the moment, not excitement over space travel. It’s the role of Twitter and Facebook in things like the Arab Spring and the collapse of the News of the World that generate column inches at the moment, and while that’s all fascinating, I don’t really think it counts as awe-inspiring.

“Space travel costs too much” is the refrain we always hear, but I want to know why it’s always staged as a choice between space exploration (and the resulting scientific advances) and, say, eradicating AIDS. Why is it never a choice between space exploration and dropping bombs on people? Why is it never a choice between space exploration and the money used to deal with the greed of bankers and dodgy MP expense claims and media giants, and any other instance of corruption you can think of? Why can’t we do something good at the expense of something bad?

I found it unsettling when I read in the book Moondust that only nine of the twelve men who walked on the moon are still alive. I think it’s because, as the book mentions, the moon landings are often seen as the last optimistic act of the twentieth century; well, we’re eleven years into the 21st and that optimism is still lacking. In a world that’s currently dealing with everything from revolutions to horrifying natural disasters, the idea of a major act of optimism is highly attractive. Part of me wants to see humans walk on Mars, simply because it would be a great historical act that doesn’t involve people killing each other.

Besides, I’m not sure my generation has had its moment to gather around – maybe Live Aid – and there are another couple of generations below me in the same boat. Much as I think the Internet is hugely significant, in 40 years’ time I really hope I’m not sitting in front of a TV documentary celebrating Facebook. What’s our big moment going to be? I’d love it if it was putting a man or woman on Mars, but you know, I’d also love it if we cured cancer or wiped out Third World debt or pioneered a clean energy source that no-one’s even thought of yet. There’s got to be something more that can unite us beyond death and Simon Cowell.

We’re a clever species when we put our minds to it, but we seem to get locked into cycles of destruction. We seize on anything, be it religion, or politics, or race, or land, to perpetuate the darker angels of our natures. Fundamentalisms that have forgotten the fundamentals add fuel to the fire, and while knowledge increases exponentially, I’m not so convinced about wisdom – to paraphrase Smash Mouth, our brains get smart but our hearts get dumb.

Atlantis is due home in twelve days, and then the shuttles will be retired to museums. We’ll see what that means for space exploration in general, but it raises the question of where humanity goes from here – the role of the internet in society is major, but there have to be greater horizons to shoot for; the future shouldn’t be limited to 140 characters, and lifting off will always be more of an adventure than logging on.