Archive | November, 2010

The demon is out of the bottle

November 14, 2010

4 Comments

Your desk at work, is it as chaotic as mine, or clean and ordered? If the latter, I salute you, because it takes work to keep a desk tidy. Otherwise, chaos will soon reign. And while I admit that I should keep my desk cleaner (and no, I won’t share photos here), I have an excellent excuse: it is a fundamental law of nature that disorder and chaos are always increasing.

A measure of this disorder is a quantity called entropy. Cleaning your desk takes work, which creates heat, which is energy wasted on the environment. So even though the entropy of the desk is reduced, overall entropy increases. And this is the second law of thermodynamics: on average, the entropy of the universe is always increasing, whatever the process.

Maxwell's demon uses the thermal motion of particles to let them move up a staircase and then blocks their way back. The effect is the same as with a box: particles at the top of the stairs are warm, those at the bottom cold. Credit: Mabuchi Design Office / Yuki Akimoto

Well, asked James Clerk Maxwell back in 1867, what if you have a box filled with gas at a certain temperature? The box is separated into two compartments by a wall that has a small door. The door is controlled by a small ‘demon’ that lets fast-moving gas molecules pass into the right half and keeps the slow ones in the left. The left half would cool down, and the right one would heat up. Overall, the box is more ordered than before. If the demon itself doesn’t use up any energy (which can be done), entropy would decrease, right? But according to the second law of thermodynamics energy is needed to create order, and the demon wouldn’t use any. Is this then a violation of the second law?
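
The sorting step is easy to mimic numerically. Below is a minimal toy sketch (my own illustration, not from the article) in which a hypothetical demon sorts molecules by speed; the speed distribution and the median threshold are arbitrary assumptions, but the two halves still end up at different effective temperatures.

```python
import random
import statistics

# Toy Monte Carlo sketch of Maxwell's demon (illustrative assumptions only):
# molecules start well mixed, and the demon lets faster-than-median ones
# through to the right half while keeping the slow ones on the left.
random.seed(1)

# Molecular speeds in arbitrary units, drawn from a simple distribution.
speeds = [abs(random.gauss(0.0, 1.0)) for _ in range(10_000)]
threshold = statistics.median(speeds)

left = [v for v in speeds if v <= threshold]   # slow molecules kept in the left half
right = [v for v in speeds if v > threshold]   # fast molecules let through to the right

def mean_kinetic_energy(vs):
    """Mean of v^2, proportional to temperature for an ideal gas."""
    return statistics.mean(v * v for v in vs)

print(f"left half  ~T: {mean_kinetic_energy(left):.2f}")   # cooler
print(f"right half ~T: {mean_kinetic_energy(right):.2f}")  # warmer
```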

Well, actually no. The reason is that there is energy in information. To store a bit of information, a system like a computer memory needs to be put into a defined state, either a ‘1’ or a ‘0’. This reduces entropy. And the same is true for the two compartments of the box: we use the information about whether a molecule is fast or slow to separate them. The energy stored in that information is used to reduce the entropy of the system. So we are fine, the second law is not violated.

The energy that is contained in the information is tiny. For a single bit the lower limit is on the order of 10⁻²¹ joules. In comparison, one calorie, the old energy unit often used for food, corresponds to 4.184 joules.
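
That lower limit is the Landauer bound, k_B·T·ln 2 per bit. As a quick back-of-the-envelope check, assuming room temperature of roughly 300 K:

```python
import math

# Back-of-the-envelope check of the quoted lower limit (Landauer bound),
# assuming room temperature of roughly 300 K.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed temperature, K

energy_per_bit = k_B * T * math.log(2)   # ~2.9e-21 J, i.e. on the order of 10^-21 J
print(f"Landauer limit at {T:.0f} K: {energy_per_bit:.2e} J")

# For scale: one calorie is 4.184 J.
print(f"That is about {energy_per_bit / 4.184:.1e} calories per bit")
```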

[…]

Continue reading...

Transparency in peer review

November 10, 2010

15 Comments

As an editor of a scientific journal, one of my key duties is to organise the peer review of submitted scientific papers. There, I ask other experts to take a look at a paper and let me know their opinion on the technical correctness of its findings, and perhaps also what the importance and impact of the paper could be. The reviewers are aware of the identity of the authors, whereas the identity of the reviewers is not revealed to the authors.

The requirement to use peer review is not set in stone, but it has proven a very useful tool to assess a scientific paper. However, it involves a huge amount of work, with scientists reviewing each other’s papers. Indeed, a lot has been said about the peer review process, whether it should be opened up, completely abolished, and so on. Here I would just like to focus on the issue of transparency, which is the subject of a commentary by Bernd Pulverer from the European Molecular Biology Organization in last week’s issue of Nature. Access is free.

Historically, peer review as such has been known for a long time, but it has only been used systematically since around the mid-20th century. Certainly the very idea of peer review was a new concept to Albert Einstein when, following peer review, his paper on gravitational waves was rejected by Physical Review in 1936:

Dear Sir,

We (Mr. Rosen and I) had sent you our manuscript for publication and had not authorized you to show it to specialists before it is printed. I see no reason to address the — in any case erroneous — comments of your anonymous expert. On the basis of this incident I prefer to publish the paper elsewhere.

Respectfully,
Albert Einstein

And even later on, peer review was not necessarily always used. The famous 1953 Nature paper by Watson and Crick on the structure of DNA was not peer reviewed. Competition was very tough in this case, and as John Maddox, former editor of Nature, allegedly said:

The Watson and Crick paper was not peer-reviewed by Nature… the paper could not have been refereed: its correctness is self-evident. No referee working in the field … could have kept his mouth shut once he saw the structure

Clearly, peer review was not always considered necessary by scientists or publishers. To me, it remains essential. But I think the system could improve, and one area where this could be done with ease is its transparency. At the moment, the process is not fully transparent, either to authors or to scientists other than the reviewers. As it stands, there is a lot of implicit trust in the work of journal editors…

More transparency!

Bernd Pulverer’s commentary describes an effort to increase transparency at The EMBO Journal. Since 2009, the journal has been running a trial in which, upon publication, the anonymized referee reports sent to the authors, the editorial decision letters and the authors’ rebuttals to the reviewers are published as a supplementary file along with the paper. (Disclaimer: Nature Publishing Group publishes this journal on behalf of EMBO.)

[…]

Continue reading...

Real-time holographic video displays could be near

November 3, 2010

6 Comments

A refreshable holographic image of an F-4 fighter jet. Credit: gargaszphotos.com/College of Optical Sciences, The University of Arizona

Holograms may seem like an invention of science fiction films. A famous scene often mentioned in this context is the one from Star Wars where Princess Leia records an important holographic message, ending with the words “Help me, Obi-Wan Kenobi”.

Such visions of holograms aren’t fiction. In a paper published in Nature, Nasser Peyghambarian, Pierre-Alexandre Blanche and colleagues from the College of Optical Sciences at The University of Arizona demonstrate a holographic system capable of displaying holograms at speeds approaching that of video. (And sure enough, they do mention Star Wars in the abstract of the paper…)

Holograms were invented in 1947 by Dennis Gabor. They are made by shining a laser beam on an object and recording the laser light reflected by the object on a photographic film. Simultaneously, a reference beam from the same laser is guided directly to the photographic film, where the two beams interfere. The interference pattern stored in the photographic film contains not only information on the light intensity (as in conventional photos) but also on the phase difference between the two laser beams. The phase difference is a measure of the three-dimensional shape of the object. Together, intensity and phase contain the complete information of a light beam.
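
As a rough numerical illustration of that last point (my own sketch, not from the paper), one can superpose a reference wave and an object wave and look at the recorded intensity |E_ref + E_obj|²; the 633 nm wavelength, beam angles and object phase below are arbitrary assumptions. Shifting the object phase shifts the fringe pattern, which is how the film ends up storing phase information.

```python
import numpy as np

# Rough sketch (illustrative assumptions): the film records the intensity
# |E_ref + E_obj|^2, whose cross term ~ cos(phase difference) is what
# encodes the object's phase, i.e. its three-dimensional shape.
x = np.linspace(0.0, 1.0e-3, 1000)   # a 1 mm strip of "film", in metres
wavelength = 633e-9                  # assumed red HeNe laser line
k = 2 * np.pi / wavelength

E_ref = np.exp(1j * k * x * np.sin(np.deg2rad(+1.0)))   # reference beam, tilted by +1 degree
object_phase = 2.0                                       # assumed phase picked up from the object
E_obj = 0.5 * np.exp(1j * (k * x * np.sin(np.deg2rad(-1.0)) + object_phase))  # weaker object beam

intensity = np.abs(E_ref + E_obj) ** 2   # fringe pattern stored in the film
print(intensity[:5])                     # changing object_phase shifts these fringes
```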

To recover the holographic image, the original laser needs to be used. Therefore, more practical ways of writing holograms have been developed that do not require the original laser for viewing. Regular white light can be used instead. Although the image quality of these holograms is not as good, they are widely used, for example on credit cards. Holograms can also be created artificially, without the use of an actual object, by using a computer to calculate the necessary holographic interference pattern. Alternatively, information from a camera can be digitally scanned and used to create a hologram elsewhere. “Holographic telepresence means we can record a three-dimensional image in one location and show it in another location, in real-time, anywhere in the world,” says Peyghambarian.

[…]

Continue reading...