Procrastination

is another word I like a lot. My dear old Dad, bless him, has often said that the word procrastination defines him. I think that’s rather unfair, really. Except for the Mr. & Mrs. Perfects out there, we all do it! So there you go, Dad, I never bought it!

Although I have in fact been really efficient today, I started the day procrastinating. While David took Dane to school, I browsed through the news over coffee and stumbled over a couple of odd pieces. I managed to control myself and NOT start blogging about them first thing, but to DO WHAT I HAD TO DO first. Which was homework for the last course of the last semester of my BA in library and information science. The course is about building large websites (i.e. corporate portals) and is quite techie, which suits me just fine. But because academia is academia (can’t think of a better explanation, sorry!), most of the texts are 7–8 years old. Which is perfectly OK if your subject is ancient runes or hieroglyphs, or even if it’s WWII. But I just find it very, very hard to believe that the best stuff available about building portals and content management was written 7–8 years ago!

However, it’s done and my conscience is clear! So now, on to the odd pieces. There was a good one about how to tackle a project and get it over with quickly. I needed that one! And this sad article from the Washington Post about how Bush has rewarded his cronies:

Less than two weeks before leaving office, Bush made sure the senior aides shared a new assignment, naming them to an obscure World Bank agency called the International Center for Settlement of Investment Disputes.

One of the Guardian blogs has a very thought-provoking post about what to do with that Afghan fellow who’s clearly guilty of something, but who’s been tortured so badly that he’s been reduced to a head case. The post is by seasoned Guardian journalist Michael White.

Those of you who know me personally will probably know that I have always been a fierce advocate of the MMR vaccine. A “scientist” published a paper linking the MMR vaccine to autism. It was just the one paper, but it had all the ingredients of A STORY in the press. And it became huge. Suddenly everybody knew a child with autism who’d had the MMR vaccine. The fact that ALL children back then had the vaccine, including children with autism, didn’t get in the way of this scaremongering story. When it was revealed that the “scientist’s” data were falsified and that there is NO link WHATSOEVER between the MMR and autism, this wasn’t A STORY at all. So there was nothing, or almost nothing, about it in the media that people actually read or watch. Which led to a huge drop in the number of children getting the MMR. And now we see the result: a veritable measles epidemic. Try reading about measles and consider that if it hadn’t been for that “scientist”, but primarily if it hadn’t been for the media, who never seem to take responsibility for anything, all these children and teens wouldn’t have to suffer the dreadful complications of measles. The illness would most likely have been eradicated by now! Here’s the story from the Sunday Times.

Sunday morning I read an article (no, not an article – an excerpt from this book) that truly scared me. The writer, James Lovelock, states that it’s too late to save the planet, so all we can do – as Brits – is save ourselves from the hungry hordes fleeing their overheated or flooded homes! It came much too close to the article about the honey bee I had read only a week previously. Have we really come to the brink of our own extinction? And why are we all sitting back doing next to nothing? Probably because it’s just too much for our brains to handle! What I found even scarier than the prospect of living on a diet of strictly local produce – and not enough of it – in 2030 was his suggestion that we need a “strong leader” like Churchill to guide us out of this mess; democracy is no good in such dire straits. I shiver even to write it!

On a less dire note, here’s some recent tech news. Amazon has launched a new version of the Kindle. I still want one and I still can’t have one. There’s no news about when this lovely gadget will be available in Europe; it’s something to do with the difficulty of reaching an agreement with our multiple phone companies. Hmfff. I want it soon, and so, I think, does my husband. Look how many books I’ve bought in the last 3–4 weeks. Admittedly some of them are for course work, but as you can see, not all of them!

Which one should I start reading first? Don’t say Jakob Nielsen, please!

Here’s a funny one – I bet my oldest son will like it. It’s about bragging about your World of Warcraft skills on your résumé… Whether that’s a good idea or not really depends on the job, I’d say!

Speaking of games, here’s an odd piece. I don’t play myself, so the thought hadn’t even occurred to me. But of course – in games that are so life-like there would have to be pregnancies. And it’s fun to read how they go about the deliveries etc. Thanks to Torill for the pointer.

Oh my, dinner is served, says the husband. That’s so nice – I have to go! Sorry for this messy, messy post…


Life as a busy bee and a crippling cold

have kept me from blogging. There’s no running away from the busy bee, but I must extend the working day at the other end! On Sunday I read a very thought-provoking article in The Sunday Times, which they’ve been kind enough to publish online. It’s by another of the paper’s excellent writers, Bryan Appleyard, and it’s about the possibility of actually proving the existence of an afterlife!

I guess that when someone close to you dies or is close to dying, and when you yourself feel mortality creeping up on you, these things become important. I don’t particularly want to “go to Heaven”, but I am no fan of the idea of just disappearing without a trace. I always wonder what atheists tell their children when someone close dies. “Your best friend got run over by a car and now he’s nothing.” It may be that I’m just a coward, but I could never say that!

Something along the same lines is this TED talk by a neuroscientist. You’ll have to bear with her absolutely horrible accent and just listen to what she actually says and the humour with which she says it.


Scatterbrain

I’ve always been a Scatterbrain. My memory is lousy, I have to write everything down and often I forget even that. My mind is always jumping ahead of the current situation – that’s super sometimes, but often it’s more than a little distracting. Today, when I was supposed to do two other things, I stumbled over an article…

I swear, I read the whole thing and my mind almost didn’t jump. I remember where it jumped to along the way, because, given the theme of the article, I made it my business to take note of my mind-jumps.

I was visiting this blog, which is a by-product of some homework I’ve done for my course at uni. The blogger linked to the article in an ambiguous way, which made me click it. And once I’d seen the headline, I just had to read it. The fact that it’s in one of my all-time favourite magazines, The Atlantic, of course made it even more palatable. The writer is Nicholas Carr. He has a blog, which after a cursory glance looks interesting, but demanding. The article is called Is Google Making Us Stupid?

Here are a few excerpts:

Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

But every new technology has had an effect on our brains – as Socrates noted about writing:

In Plato’s Phaedrus, Socrates bemoaned the development of writing. He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their heads, they would, in the words of one of the dialogue’s characters, “cease to exercise their memory and become forgetful.” And because they would be able to “receive a quantity of information without proper instruction,” they would “be thought very knowledgeable when they are for the most part quite ignorant.” They would be “filled with the conceit of wisdom instead of real wisdom.” Socrates wasn’t wrong—the new technology did often have the effects he feared—but he was shortsighted. He couldn’t foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not wisdom).

In the final paragraph he returns to Kubrick’s 2001, which he quoted in the opening paragraph:

Their thoughts and actions feel scripted, as if they’re following the steps of an algorithm. In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.

I don’t have such a gloomy view of my own thinking as Nicholas Carr. I acknowledge the disadvantages, but surely there must also be some great advantages in being able to think “multilaterally” rather than “unilaterally”?

Back to where my mind jumped: at one point it jumped to a piece of Internet lore which I’ve returned to many times: The Last Lecture by Randy Pausch, a university professor who, after being diagnosed with terminal cancer, gave a farewell lecture about grasping life’s opportunities, even in the face of death:

If any of you have not yet sat through it, you really must. It’s a wonderful lecture for us all, and it has been viewed 7½ million times on YouTube! Why did my mind jump to that in the middle of this article? I don’t know!

Also, at the mention of Socrates, I thought about something I’ve recently read by Aristotle (don’t worry, it was in connection with an essay for uni): “A speech (or document or whatever) consists of three things: the speaker, the subject which is treated in the speech, and the hearer to whom the speech is addressed” – ethos, logos & pathos. I thought of that because isn’t it so that sometimes you’re just very, very far from being “the intended audience” of a text – it’s either above you, beneath you or entirely irrelevant to you! When I read stuff like that I get distracted very easily… I’m afraid it happens rather frequently with academic papers for my courses. Sometimes I even think they don’t want me to read it. And certainly not to enjoy reading it.

And twice I suddenly remembered what it was I’d set out to do when I settled at the computer. I wrote it down – must do it when I’ve finished this post ;-)

And in the middle of the article I jumped to read about the writer. I knew I’d looked him up before, but had forgotten. I don’t think that’s something Google has done to my brain. I’m afraid I was like that years before the Internet entered my life (and that was in 1995, if anybody wants to know…).


Here's another couple of reasons why you should vote for Mr. Obama if you're an American

Christopher Hitchens, with whom I agree on very little but whose intelligence I most certainly admire, has this column on Slate. The quote below is his closing lines. Before that he argues very convincingly – read it yourself!

This is what the Republican Party has done to us this year: It has placed within reach of the Oval Office a woman who is a religious fanatic and a proud, boastful ignoramus. Those who despise science and learning are not anti-elitist. They are morally and intellectually slothful people who are secretly envious of the educated and the cultured. And those who prate of spiritual warfare and demons are not just “people of faith” but theocratic bullies. On Nov. 4, anyone who cares for the Constitution has a clear duty to repudiate this wickedness and stupidity.

I received a message on Facebook from my old friend Lone Skovgaard about the power of being FOR something rather than being AGAINST something else. It is a very relevant point. So let it be noted that I’m

  • FOR an improved standing for America in the world √
  • FOR less aggressive meddling in other countries’ affairs √
  • FOR every American’s right to basic medical treatment √
  • FOR tighter restrictions on access to weapons in the US √
  • FOR intelligence and compassion in the White House √
  • etc…

Wikipedia is cool

Lately I’ve been writing essays for a course I’m taking at uni called “Source Reliability”. Readers of this blog will know that I’m rather keen on this subject. Our essays are either accepted or not accepted – they aren’t graded. But the professor comments on them, and he liked my latest essay. It’s about Wikipedia and takes a dispute between the science journal Nature and Encyclopaedia Britannica as its starting point. If you haven’t heard about it, here’s what Wikipedia says (and it’s in fact quite an accurate description):

On 14 December 2005, the scientific journal Nature reported that, within 42 randomly selected general science articles, there were 162 mistakes in Wikipedia versus 123 in Britannica. In its detailed 20-page rebuttal, Encyclopædia Britannica, Inc. characterized Nature’s study as flawed and misleading and called for a “prompt” retraction. It noted that two of the articles in the study were taken from a Britannica year book, and not the encyclopedia; another two were from Compton’s Encyclopedia (called the Britannica Student Encyclopedia on the company’s web site). The rebuttal went on to mention that some of the articles presented to reviewers were combinations of several articles, and that other articles were merely excerpts but were penalized for factual omissions. The company also noted that several facts classified as errors by Nature were minor spelling variations, and that several of its alleged errors were matters of interpretation. Nature defended its story and declined to retract, stating that, as it was comparing Wikipedia with the web version of Britannica, it used whatever relevant material was available on Britannica’s website.
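
To get a feel for those numbers, here is a rough back-of-envelope average per article – my own arithmetic, not a figure quoted by Nature or Britannica:

$$
\tfrac{162}{42} \approx 3.9 \ \text{errors per article (Wikipedia)} \qquad \tfrac{123}{42} \approx 2.9 \ \text{errors per article (Britannica)}
$$

In other words, roughly one extra error per article – which is presumably why the result was widely read as Wikipedia coming surprisingly close to Britannica.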

Below you’ll find my essay – edited only slightly for use here (no footnotes etc.). If you can’t be bothered to read the whole thing – about 1,000 words – then scroll down to the bottom. There you’ll find my tips for what to think about before you delve into a Wikipedia article.

The battle between Encyclopaedia Britannica (hereafter EB) and Nature was intriguing – not least because, in my view, it is somewhat beside the point. Nature’s intentions were honourable, I believe, in letting their very informed readers know whether it can be considered worthwhile – not safe – to use Wikipedia for anything. And they seemed rather baffled themselves at the result: that yes, it is worthwhile, even for the informed user, to consult Wikipedia. In my view the article did not try to put EB down.

One of the more interesting facts the investigation revealed was that the learned reviewers were more sceptical towards the random articles than towards the articles within their fields of expertise. For reasons I can’t quite understand, many teachers at all levels of the schooling system tell their pupils NEVER to use Wikipedia. Many times I’ve heard well-educated and academically trained people say that they never use Wikipedia because it’s completely untrustworthy. But upon closer questioning it turns out they have never actually used it – so how do they know? This is probably why the reviewers were so sceptical towards the articles about subjects outside their intellectual comfort zone.

It is also interesting to note the aggression and fervour with which EB responded to the article. A lot of their response may be correct in a narrow sense, but it is entirely beside the point, because the Wikipedia articles had had the exact same treatment. And the Nature article is actually quite critical of some things in Wikipedia – such as the occasional rather poorly constructed article and poor readability. This fervour may be related to the sad fact that academia frowns upon academics who choose to put their skills to use for the general public. Nature surveyed 1,000 scientists, of whom only 10% had ever helped update Wikipedia. It probably doesn’t improve your academic career to invest time enlightening the public about your speciality.

And then there are all the things you can get from Wikipedia which EB doesn’t give you. There are articles about every little town or village in the Western world, every politician, every pop group, every artist, every historical person, every technical term or gadget known to man – almost. And then there’s the freshness – articles updated at the speed of light when events develop. Apart from the way they are created, these two factors are what really separate Wikipedia from EB, and why comparing them is, to some extent, a bit like comparing apples and pears. Access to EB is also on a subscription basis. In Denmark and here in the UK you can gain free access to EB via your local library, but unfortunately most people don’t know this – or just can’t be bothered. In EB you cannot see when an article was created or updated – or at least I can’t find it. And there are very few outside links and no references.

When I was a child we had two encyclopedias in the house: Lademanns and Gyldendals. I quickly discovered that Lademanns was best for looking up things to do with nature, science and geography, because of its many good colour photographs and illustrations, whereas Gyldendals was best on history and literature, because the entries were better and longer. But – and this is the point – it never occurred to me to doubt the accuracy of any of the articles. And I wasn’t taught to at school either. I didn’t hear about source criticism (kildekritik) until high school (gymnasiet), where I had a history teacher (an elderly gentleman) who made it an issue. It was the first time I had ever heard of anyone questioning a source. Every time he gave us something to read, he asked us to consider who had written it, why he had written it, and who we thought the intended audience was. This simple wisdom has stayed with me ever since, and I try to remember to apply it to everything I read or hear.

The thing about Wikipedia that could perhaps teach many more Internet users source criticism is precisely the knowledge of how it is written and (not) edited. One must always consider the fact that the article one is looking at might just have been tampered with by some idiot or by a person with malicious intent. Or that it is written by somebody with an overblown perception of her own knowledge. This is not a thought that automatically comes to mind when looking something up in EB or another “trusted source”. So I believe that the way Wikipedia is constructed actually encourages its users to be source-critical. And that scepticism could even follow the user when she ventures outside Wikipedia and looks at other sources.

Quite often Wikipedia is an excellent starting point for research on a subject. It usually becomes clear very quickly what kind of person or persons are responsible for a Wikipedia article. Some articles are clearly written by scholars or by extremely knowledgeable amateurs, and their sources are often gold when the goal is to move on to primary sources. Other articles are not so well written or edited, and one instantly gets wary. That is very often reflected in the sources, which will be few and erratic. And I believe this wariness and alertness to be very healthy for the users.

Setting aside the times I use Wikipedia to look up the full name of a pop star or the use of a technical gadget, I try to ask myself these questions while reading a Wikipedia article:

  • What kind of person wrote this? Syntax, writing style, approach to the subject. Is the faulty English there because the writer doesn’t have English as her mother tongue, or is it a warning sign?
  • Why did the person write this? Out of pride, to boast, for political or religious reasons, or because she honestly feels it is her duty to share her knowledge?
  • Does the article have the feel of having been worked over many times? If so, I check the history and discussion pages.
  • What are the sources like? Are there many? Are they online, offline or a mix? How many of them are readily accessible (not necessarily online, but from a library)?
  • How sensitive is the subject? Can I perhaps believe some parts of the article but not others? This may be the case for quite a few historical articles, where the basic facts are agreed on by everybody but historians disagree on the interpretation of certain incidents or documents. It is also the case for articles on pharmaceutical compounds.
  • Am I looking at a subject where recent events have led the article to be expanded or changed? The article about Sarah Palin is an obvious example. One can go back to the version of the article from a couple of weeks before she was chosen as McCain’s running mate and get an impression from that.

The above rules of thumb could very well be applied to most other sources as well. But with most other sources you can’t check the previous versions…


BBC

Yesterday it was the Times, today it is the BBC. Another love of my life – if you’ll allow me to go a bit overboard. In this TED talk the creator of BBC Online, Jonathan Drori, quizzes us on quite a few things we thought we knew…

Why is it hotter in the summer than in winter?

See the video, if you want the answer. And don’t think you know it.

I wake up with the BBC every morning – not on the radio, but as a news update on my phone. You can choose the areas you want info about and – even better – the ones you don’t want to know about. That means I don’t have to read one word about sports! I get some of my technology info from the BBC – the other day we watched (on TV, but you can see it online) an interview with Google’s first employee. He’s still there! And their science news is very good, as is the medical news.

That’s all from me today, folks – I need a screen break…

PS: There is some sports news I do like, and the fact that Murray beat Nadal in the US Open semifinal made it to the main news. In 40 minutes’ time (that’s 10 o’clock PM our time) you can follow the final between Murray and Federer live on BBC Online – not as video, but as blog-like updates every five minutes. Quite cute.


I knew it!

Daydreaming is good for you! I found this article in the Boston Globe via a new blog on The New York Times website, called Ideas. The article is long and thorough and does a good job of explaining the science behind this great news. There’s a funny paragraph where a distinction is made between two kinds of daydreamers:

However, not all daydreams seem to inspire creativity. In his experiments, Schooler distinguishes between two types of daydreaming. The first type consists of people who notice they are daydreaming only when asked by the researcher. Even though they are told to press a button as soon as they realize their mind has started to wander, these people fail to press the button. The second type, in contrast, occurs when subjects catch themselves daydreaming during the experiment, without needing to be questioned. Schooler and colleagues found that individuals who are unaware of their own daydreaming while it’s happening don’t seem to exhibit increased creativity.

“The point is that it’s not enough to just daydream,” Schooler says. “Letting your mind drift off is the easy part. The hard part is maintaining enough awareness so that even when you start to daydream you can interrupt yourself and notice a creative insight.”

I’m happy to say that I usually snap out of it quite easily. Which probably has to do with another trait of mine – being easily distracted…
