Bacteria. Eurgh. Microscopic bugs that we can’t live without. Icky as it is to contemplate (or amazing, if you’re one of those microbiology-fetishist sorts like Mr AllInTheGenes) the human body is home to bazillions of bacteria. That’s right, bazillions. On our skin, in our guts, in our mouths (eurgh eurgh), we wouldn’t function without these tiny troopers. The yoghurt adverts call these our ‘good’ bacteria. But for every Luke Skywalker there’s a Darth Vader — getting infected with ‘bad’ bacteria can cause food poisoning and meningitis and septicaemia and pneumonia and… you get the picture. What’s more, even the good bacteria can turn against us, Anakin Skywalker style, if they find themselves in the wrong part of the body, or if something happens that gives them an advantage over their neighbours. At the moment, we combat unwanted bacteria with antibiotics: compounds that stop the bugs from functioning properly. But as you are probably aware, a crisis is looming. The drugs have stopped working.
Three cheers for summer! The BBQ is ready to go, I’ve successfully run the gauntlet of self-image crises that is the yearly bikini shop and I’ve purchased music only truly suitable for a Californian road trip. I’m heading out into the British sun to cook myself like an overdone roast chicken and I can’t wait. I know I’m not alone here – one look around any workplace after a sunny weekend reveals red noses, reverse panda eyes, and the unlucky few who wince every time they sit down.
Why do we do this? We know that tanning is bad for us, yet we can’t seem to help ourselves. Is it a self-destruct mechanism common to the entire population? Not quite. Research published last week reveals that, to some extent, we actually cannot help ourselves. Tanning, it turns out, is quite literally addictive.
When you sun yourself, skin cells called keratinocytes produce a chemical called proopiomelanocortin, or POMC. POMC is then converted into a hormone that makes adjacent cells called melanocytes produce a brown/black pigment. Kind of like that time you embarrassed the girl next door and made her blush. This is tanning.
POMC can also be converted into another chemical, β-endorphin. This is one of our happy hormones. Endorphin is an opioid (as are opiates like heroin), which explains our love of being in the sun. And this makes good evolutionary sense, because before the age of supplements and vitamin drinks (Oh, VitHit, shall I compare thee to a summer’s day?), we humans needed to get our vitamin D from the sun, or else we’d get rickets and our bones would break and we’d fall down and get eaten by cheetahs, or perhaps by something more humiliating like weasels.
It’s one thing to be happy to be in the sunshine, but quite another to have an addiction. To test the addictive properties of sunlight, scientists from the USA exposed mice to UV for six weeks. They found that their mice became addicted to the ‘sun’. The mice showed many classic signs of addiction that might normally be associated with drug or gambling problems. For example, mice exposed to UV had higher levels of β-endorphin than counterparts kept in the shade, and showed signs of opioid dependence after only two weeks. This included getting ‘the shakes’ when they were prevented from getting their hit of sunshine. Once showing signs of dependence, the mice needed more UV to get the same endorphin high each day, and adopted sun-seeking behaviours — a particularly striking finding, because mice are nocturnal, and so normally prefer to avoid light environments.
Of course, not many people get the shakes when they’re not in the sun. But there is some evidence to show that some people do have UV-seeking behaviours, which is suggestive of addiction. That person you know who says they’re addicted to sunbeds might not be exaggerating after all.
So tanning is addictive. This was great when we needed to get our vitamin D from the sun, and only lived for 40-odd years. But the average life expectancy in the UK is now over 80 years, and skin cancer is a real danger for those of us spending too much time in the sun. What to do? In this study, the scientists cured the mice of their sun obsession by deleting a gene called p53. The protein made by this gene drives the production of POMC, the precursor of both the tanning hormone and β-endorphin. The mutant mice couldn’t tan and did not become addicted to the sun.
Could we dampen p53’s activity in our own bodies to reduce the harmful effects of the sun’s rays? Err, no. p53 is perhaps the best known tumour suppressor gene in humans. This means that deletion or mutation of p53 might stop you tanning, but it will also make you more susceptible to breast cancer, bladder cancer, brain tumours… The list goes on.
There might be a less radical solution to the problem. The authors of this work suggest that, as always when it comes to protection from the sun, the answer is sunscreen. Our addiction, after all, is not to the beautiful feeling of the warm sun on our faces (nice as that is), but to UV. Effective suntan lotions provide a barrier between the skin and UV.
So, we often find ourselves walking tenderly about the workplace, trying not to rub any one painfully burnt area of our body against any other area. We repeatedly lay our pink and peely scalps down gingerly for a night’s unduly hot and uncomfortable sleep. Maybe we should remember that we now have one more reason to listen to Baz Luhrmann, and wear our sunscreen.
I love David Mitchell. Proper love him. Ever since the first episode of his sitcom, Peep Show, I’ve harboured an ill-advised crush that has been the cause of much hilarity amongst my friends. I loved him through the podgy phase, the slightly sweaty phase, the endless-voiceovers-on-dubious-tv-ads phase. My persistence paid off when he emerged, circa 2010, like a butterfly from a chrysalis – thinner, beardier, handsome, presenting left wing TV shows and marrying girls’ girl Victoria Coren. I was smug. I thought he could do no wrong.
He could. He did.
In a baffling rant in a recent column in the Observer (ok, not so recent, but I’ve been busy, yeah, so deal with it), David Mitchell went crazy over the ridiculousness of telomeres (we’ll get to what telomeres are immediately after the ranting has finished). From what I can make out from his ramble, the main problem seems to be that they’re too ruddy complicated. Damn you, science! In a rather incoherent outburst, Mitchell first attacked journalists who attempt to simplify science with metaphors, then went on to bemoan the fact that science language is so complex that he can’t follow it. I think, although I can’t be sure, that his basic point is this – ‘I’ll never properly understand science, so what’s the point in trying? And why should I care anyway?’
But David – telomeres are amazing! Telomeres keep you alive! Telomeres won a Nobel prize, for heaven’s sake! And here’s why I think they’re so cool.
Halloween can be a stressful time of year for those who, like me, are fans of horror films but also great big scaredy cats. So come October the 31st, after a movie marathon, you’ll most likely find me wide awake in bed in my 130-year-old flat, jumping at every creak and groan the decrepit pile of bricks makes. But there’s no real reason to worry, right? After all, we know that post-Twilight-era vampires are friendly, and werewolves won’t be able to get through the front door, what with not having opposable thumbs. Anyway, there’s no full moon this Halloween, probably. But what about zombies? Surely the zombie apocalypse could never happen. Zombies don’t exist. Or do they?
Describe yourself to me in five words. I’m willing to bet that (once I’d eliminated all the rude and ‘witty’ responses) I could guess most of the things you would deem important enough to tell me. Name, age, occupation and sex are factors that most people use to define themselves. Arguably the most important of these is sex.
Let’s not beat about the bush – Alzheimer’s disease is a real bitch. The most common form of dementia, accounting for over 60% of dementia cases in the elderly, it’s estimated to affect more than 1 in 40 people over the age of 70. The symptoms are notorious – problems with memory, bouts of confusion, loss of linguistic skills, mood changes – and these changes can completely alter a sufferer’s personality or sense of self. It’s no wonder, then, that Alzheimer’s disease has such emotional connotations for so many people.
Alzheimer’s is currently a disease with no known cure. The latest drugs can, at best, temporarily halt the march of dementia in a certain subset of patients. After my granddad was diagnosed with probable Alzheimer’s, I began to wonder why this was the case.
Back in July, I received an email from a lovely woman named Jaclyn, asking me to take part in a campaign called the “What I See” Project. According to the press release, WISP, as we shall call it from now on, is ‘a global online platform that recognizes and amplifies women’s voices. Through each person’s unique and honest answer to the universal question “what do you see when you look in the mirror?”, women from all over the world can be empowered by relating to each other’s words.’ For the launch of the campaign, Jaclyn aimed to get 100 bloggers, including me, and 18 uber-successful ambassadors to talk about how they see themselves, in the hope of inspiring many more women to do the same. My first thought was ‘Empowered? Seriously? Are we successful businesswomen here, or are we the Spice Girls?’. My second was ‘Well this sounds like a very nice idea, and I’ll definitely do it because she’s promised that it will bring some more traffic to my blog, but I don’t see how a big group hug is going to help women overcome life’s prejudices’.
Now, I understand that making the above paragraph public is tantamount to admitting that what I see when I look in the mirror is a cynical, sarcastic cow – which probably isn’t too far from the truth – but bear with me, reader.
The UK government ploughs £4.6 billion a year into science and research programmes. Currently, decisions about how this money is spent are left largely up to the scientific community itself, with the government determining how much of the budget is allocated to different sectors of science. Government funds are apportioned to seven research councils, and panels of experts from each council examine all requests for funding in detail before awarding grants – a process called peer review. The principle that scientific experts are the best positioned to decide which projects deserve cash – the so-called ‘Haldane principle’ – has been the cornerstone of scientific policy for decades. However, the peer review process often happens behind closed doors, with little discussion with the public about how funding should be directed. In recent times, with many controversial scientific breakthroughs hitting the headlines, there are increasing calls for the public to be more involved in the decision-making process. So, how much say should the public have in what science is conducted using their hard-earned taxpayer pounds? And how is public engagement with science changing?
Imagine a future in which memories can be implanted into your brain. Want to go to the Caribbean? India? Mars? Can’t afford the trip? No problem – if you have the memories of going, what’s the difference?
Of course, this is the plotline from the recent sci-fi remake Total Recall, and not a world we’re ever likely to inhabit. Or is it? It may sound implausible, but last month, scientists from MIT announced that they had achieved false memory implantation in mice.
At the Future of Humanity Institute in Oxford, a group of academics attempt to unravel the likely cause of the end of the world. The top contenders, so-called ‘global catastrophic risks’, include the sci-fi stalwart totalitarianism, cold war favourite nuclear war and Jeremy Clarkson bugbear global warming. Also on the list is the threat arising from misuse of biotechnology. In an interview with the BBC in March of this year, the director of the FHI, Nick Bostrom, stated that synthetic biology was a primary concern in this area (along with artificial intelligence and nanotechnology). With these technologies advancing at such a rate, he argues, we are not fully able to comprehend the potential dangers of the tools we develop – like, in his words, ‘a dangerous weapon in the hands of a child’.
Admittedly, these guys are paid good money to let us know that the end is nigh. They are bound to err on the side of caution. But they’re not the only people raising such concerns.