
[This is a transcript with links to references.]

Welcome everyone to this week’s science news. Today we’ll talk about plants that use quantum mechanics, the first data from a new galaxy survey, quantum utility, online hate groups, photonic computing, the most sensitive power measurement ever, how to map a tunnel with muons, bad climate news that I don’t want to talk about, and you don’t want to hear, but that we need to talk about anyway. And of course, the telephone will ring.

Scientists from the University of Chicago claim that not only do plants use quantum mechanics, they know a few tricks that physicists haven’t yet figured out. In a paper that was just published in PRX, the authors present a computer model of photosynthesis according to which plants use Bose–Einstein condensation for efficient energy transfer.

You might have heard of Bose-Einstein condensation as something that happens at very low temperatures. And indeed, that’s how physicists normally do it. Bose-Einstein condensation is a process in which a particular class of particles, called bosons, all occupy the quantum state of lowest energy. If that happens, these particles all act as one, they’re basically one big quantum state, and perturbations in the condensate can move without friction, so, no energy loss.

Bose-Einstein condensation is the phenomenon behind superfluidity, which requires temperatures close to absolute zero. But at least theoretically you don’t need low temperatures. All you need is some environment in which some kind of boson condenses into the same quantum state.
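What “condensing into the same quantum state” looks like can be seen in a minimal numerical sketch (my own illustration, not from the paper): the Bose-Einstein distribution for two modes, where the occupation of the lowest-energy mode outgrows that of an excited mode as the temperature drops.

```python
import math

def bose_einstein(energy, mu, kT):
    """Mean number of bosons in a mode at `energy`, chemical potential `mu`."""
    return 1.0 / (math.exp((energy - mu) / kT) - 1.0)

# Two modes in arbitrary units: a ground state just above the chemical
# potential and one excited state. As kT drops, the occupation piles up
# in the ground state -- the hallmark of Bose-Einstein condensation.
mu, e_ground, e_excited = 0.0, 0.001, 1.0
for kT in (10.0, 1.0, 0.1):
    n0 = bose_einstein(e_ground, mu, kT)
    n1 = bose_einstein(e_excited, mu, kT)
    print(f"kT = {kT:5.1f}   n_ground / n_excited = {n0 / n1:.3g}")
```

The energies and temperatures here are arbitrary; the point is only the trend, not the plant-specific numbers.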

The authors of the new paper now say that plants can do it without cooling, and that this explains why photosynthesis is so efficient. The idea is that the bosons in plants are created by sunlight. The light hits a part of the chlorophyll molecule called a chromophore. There, it kicks out an electron. The electron itself is a fermion, not a boson, but it leaves behind a gap. The combination of the electron and the gap together is a boson, called an “exciton”. In the new paper, the researchers calculate the molecular environment of these excitons and claim that they condense within a picosecond or so. Energy packages can then travel basically without friction between chromophores.

This is extremely surprising. Physicists have tried to create condensates of excitons in the lab, but that’s turned out to be very difficult. The issue is that the electrons and the holes they leave behind recombine faster than a condensate can form. This finding could become extremely useful, if true, though I have a feeling that it’s going to be somewhat controversial. Be that as it may, if you’ve ever told someone they’re as dumb as a plant, keep in mind that even grass might know more quantum mechanics than you.

The Dark Energy Spectroscopic Instrument survey, DESI for short, conducted at the Kitt Peak Observatory in Arizona, has released its first batch of data, and there sure is a lot of it. DESI is set to map over forty million galaxies, quasars, and stars, and the first batch of data has nearly two million objects logged already. The data set includes distant galaxies as well as stars in the Milky Way and comes from the “survey validation” phase, which took place between 2020 and 2021.

One of the major purposes of this survey is to look for evidence of baryon acoustic oscillations. Those are periodic fluctuations in the plasma of the early universe that should have left an imprint in the distribution of galaxies we observe today.

In a paper that was just published, the collaboration reports that they indeed observed the signal they had been looking for. The significance is low, but it should get stronger as they collect more data.

DESI is also an amazing technological achievement. It has five thousand robotic positioners that move optical fibres and allow it to capture light from objects billions of light-years away, including extremely distant quasars. It can observe more than one hundred thousand galaxies in one night. The telescope is particularly well suited for spectroscopic redshift measurements, which is why it can tell us something about the expansion of the universe and the dark energy that drives it.

The data is all publicly available, so if you have 80 terabytes of free disk space, you can pull it down and then ask your lawn what to do with it.

A team of scientists at IBM has found a new application for current-day quantum computers. Quantum computers are held up as the future of computing, with the ability to solve complex problems in fields such as chemistry and finance in short amounts of time. However, a major obstacle to quantum computers working at their full potential has been noise.

In order for quantum computers to fully realize their potential, we’d need fault-tolerant quantum circuits, but current-day technology is far away from that. Many physicists, however, have argued that even noisy quantum computers can be useful. That’s what IBM set out to show.

In 2019, scientists from Google published a paper in Nature claiming they’d done a calculation on a 53-qubit quantum computer that would have taken 10 thousand years on a conventional computer, an achievement that’s referred to as “quantum advantage”, formerly known as “quantum supremacy”. The 10,000 years claim was swiftly questioned by IBM, and indeed the calculation was later done by a Chinese group on a conventional computer in only 5 minutes.

But hey, Einstein said time is relative anyway, so five minutes in China might well be ten thousand years in Silicon Valley.

The new paper published in Nature now describes how the IBM team used a noisy quantum processor with 127 qubits, called the Eagle processor, to calculate what’s called the Trotterised time evolution of a 2D transverse-field Ising model.

What the heck is this? The Ising model is a model of coupled quantum spins. Transverse-field means the spins also sit in a magnetic field perpendicular to the direction of their coupling, which makes the model harder to solve. That it’s a Trotterised time evolution means they used a specific method that calculates how these coupled quantum spins change in time by breaking the evolution into many small steps. That it’s 2D means… C’mon, you know what that means! It means they can one-to-one encode it with qubits on a flat chip.
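To give a flavour of what Trotterisation does, here is a toy sketch in Python, on a tiny 2-by-2 spin lattice instead of IBM’s 127 qubits, with illustrative coupling values of my own choosing. It splits the evolution into many small steps that alternate between the spin-spin coupling and the transverse field, and compares the result to the exact evolution.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def embed(op, site, n):
    """Place a single-qubit operator at `site` in an n-qubit Hilbert space."""
    out = np.array([[1.0 + 0j]])
    for k in range(n):
        out = np.kron(out, op if k == site else I2)
    return out

n = 4                                      # 2x2 lattice of spins
bonds = [(0, 1), (2, 3), (0, 2), (1, 3)]   # nearest-neighbour pairs on the grid
J = h = 1.0                                # coupling and transverse field (toy values)

H_zz = sum(embed(Z, i, n) @ embed(Z, j, n) for i, j in bonds)
H_x = sum(embed(X, i, n) for i in range(n))
H = -J * H_zz - h * H_x                    # transverse-field Ising Hamiltonian

def evolve(A, t):
    """exp(-i t A) for a Hermitian matrix A, via eigendecomposition."""
    w, v = np.linalg.eigh(A)
    return (v * np.exp(-1j * t * w)) @ v.conj().T

def trotter_step(dt):
    """First-order Trotter step: evolve the ZZ part and the X part separately."""
    return evolve(-J * H_zz, dt) @ evolve(-h * H_x, dt)

t, steps = 1.0, 200
psi0 = np.zeros(2**n, dtype=complex)
psi0[0] = 1.0                              # start with all spins up
exact = evolve(H, t) @ psi0
approx = np.linalg.matrix_power(trotter_step(t / steps), steps) @ psi0
print("overlap with exact evolution:", abs(np.vdot(exact, approx)))
```

With enough small steps the overlap approaches one; the point of the quantum chip is to do this for lattices far too large to hold the state vector in a classical memory.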

And that’s exactly what they did. The chip doesn’t so much compute the Ising model, it *is* the model. Instead of using error correction, they use what they call “error mitigation”. That works kind of like noise removal on an audio track. You do enough runs to learn what the noise spectrum is, and then you subtract it.
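To make the analogy concrete, here is a toy sketch of one common mitigation strategy, zero-noise extrapolation, with entirely made-up numbers. It is not IBM’s exact procedure, but it shows the general logic: deliberately run the circuit at amplified noise levels, learn how the signal decays, and extrapolate back to zero noise.

```python
import numpy as np

rng = np.random.default_rng(42)
TRUE_VALUE = 0.75   # the noiseless expectation value, which we pretend not to know

def measure(noise_scale, shots=100_000):
    """Toy device: noise damps the signal exponentially, shots add statistical jitter."""
    signal = TRUE_VALUE * np.exp(-0.3 * noise_scale)
    return signal + rng.normal(0.0, 1.0 / np.sqrt(shots))

# Run the same circuit at deliberately amplified noise levels...
scales = np.array([1.0, 1.5, 2.0, 3.0])
readings = np.array([measure(s) for s in scales])

# ...then fit the decay and extrapolate back to zero noise.
slope, intercept = np.polyfit(scales, np.log(readings), 1)
mitigated = np.exp(intercept)
print(f"raw reading at scale 1:   {readings[0]:.3f}")
print(f"noise-mitigated estimate: {mitigated:.3f}  (true value {TRUE_VALUE})")
```

The raw reading undershoots the true value; the extrapolated one recovers it, at the price of needing many runs to beat the statistical jitter.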

Since the calculation can’t be done otherwise, it’s somewhat hard to check that it’s correct. So they checked it by comparing with simplified cases and approximations that had previously been calculated by other means. They claim that their quantum chip outperforms any calculation that could be done on a conventional computer and that, quote, “there is indeed merit to pursuing research towards deriving a practical computational advantage from noise-limited quantum circuits”.

It’s actually a really neat paper. One question it raises, though, is whether this was actually an example of a quantum simulation rather than a computation, as the quantum chip basically *is* the model. And there are other ways to simulate the Ising model that have been done before; for example, you can use ultracold gases of Rydberg atoms with suitable couplings. Then again, the chip is probably easier to customize, so I am willing to let it pass as a calculation.

You may now wonder what the utility of this calculation is, other than publishing a paper in Nature about it. Other than better understanding the Ising model, I’m afraid I don’t know of one.

Hi Elon,

Well, you know when this guy from Bank of America, what’s his name…

Haim Israel, right! When he said that quantum computing would be “bigger than fire”, he probably didn’t mean the Trotterised time evolution of a 2D transverse-field Ising model. What kind of utility is that?

You’re right, what do I know about finance. Maybe that’s how it works! Need to talk to my finance manager now, sorry, bye-ee.

A group of physicists from George Washington University has put forward a new theory to describe the dynamics of online hate communities. They suggest that hate groups are, wait for it, non-linear fluids.

In their model, each individual is described by a string of numbers that stand for the individual’s traits. They then assume that if the traits of several individuals are similar enough, they fuse into a community. How fast they fuse and how many fuse depends on platform design and gives rise to what they call the “chemistry” of the platform. They compare their model to a lot of data which they collected on social media, including Twitter, Facebook, and YouTube. The data track communities that are against something, which they call “anti-X” communities. The authors claim their model describes the growth and spread of those communities well.
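A toy sketch of the fusion idea, not the authors’ actual model: individuals are random trait vectors, and any two fuse into the same community when their traits are close enough. The similarity threshold plays the role of the platform chemistry.

```python
from collections import Counter
import numpy as np

rng = np.random.default_rng(7)
n_users, n_traits = 400, 5
traits = rng.random((n_users, n_traits))   # each row: one individual's traits

# Union-find keeps track of which individuals have fused into one community.
parent = list(range(n_users))
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]      # path compression
        i = parent[i]
    return i

threshold = 0.4   # the platform "chemistry": how similar is similar enough to fuse
for i in range(n_users):
    for j in range(i + 1, n_users):
        if np.linalg.norm(traits[i] - traits[j]) < threshold:
            parent[find(i)] = find(j)      # fuse the two communities

sizes = Counter(find(i) for i in range(n_users))
print("largest communities:", sorted(sizes.values(), reverse=True)[:5])
```

Raising the threshold, which is one knob a platform could turn, makes large communities form faster; lowering it keeps users fragmented.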

They also use their model to suggest mitigation policies that make it harder for hate communities to grow, such as increasing the influx of diverse users or changing the platform chemistry. It isn’t exactly the most practical advice.

Overall they find that online-hate communities are well described by non-linear fluids, including turbulence and shockwaves. Who’d have thought that fluids could be so mean?

Researchers from Caltech have done a calculation with light. They ran a type of program known as a cellular automaton.

A cellular automaton is basically the simplest algorithm you can think of. It’s a line of bits, or pixels, with a specific update rule. The possible rules are numbered. Some give very simple patterns, some give fractals, and some are remarkably complex. Rule 30, which you see here, is kinda famous for its complexity.

In this study, the cellular automaton was created by pulses of laser light. The light is split into three paths, each with a different length. Depending on the lengths, the light on each path interferes either destructively or constructively with its neighbours. This encodes a rule. The result is used as input for the next round. Rinse and repeat. This way, the researchers could reproduce some of the most famous cellular automata, including ones that generate fractals, chaos, and solitons, and also rule 30.
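The logic the photonic setup implements is the same as in this plain-software version of an elementary cellular automaton: the binary digits of the rule number are a lookup table, one entry per three-cell neighbourhood.

```python
def step(cells, rule=30):
    """One update of an elementary cellular automaton with periodic boundaries.
    Bit k of `rule` gives the new cell value for neighbourhood pattern k."""
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

width, rows = 31, 15
cells = [0] * width
cells[width // 2] = 1          # single seed cell in the middle
for _ in range(rows):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

Running this prints the familiar chaotic triangle of rule 30; swapping in rule 90 gives the Sierpinski fractal instead.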

Computing with light is one of the avenues that scientists and engineers are pursuing to pack more computing power into smaller space. It’s not just about size. Current computing architectures use electric currents to move around information and that creates a lot of heat. This heat is becoming a problem. Computing with light could overcome this issue.

And that’s what I’d call “light news”.

A group of scientists from Finland has made the most precise power measurement ever by accurately detecting less than a femtowatt. Just in case you’re not familiar with that unit, a femtowatt is one quadrillionth of a watt, and about as much as I produce on my cross-trainer.

A device that measures power is called a bolometer. Bolometers are basically very precise thermometers that absorb a little bit of the power and then infer how much it was from the change in temperature. The researchers have called their new device a nanobolometer. Standard power sensors used in most laboratories measure the power of microwave radiation at the scale of milliwatts, so the new nanobolometer measures power one trillion times lower!
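The inference step is simple thermodynamics. Here it is with illustrative numbers of my own choosing, not the actual device specifications: the absorbed power is the thermal conductance times the temperature rise.

```python
# Bolometer arithmetic with illustrative numbers (not the actual device specs):
# the absorber sits at a base temperature, radiation warms it slightly, and the
# absorbed power follows from P = G * dT, with the thermal conductance G
# calibrated against a reference heater of exactly known power.
G = 2.0e-12          # thermal conductance to the bath, in W/K (made-up value)
T_base = 0.050       # base temperature in kelvin
T_meas = 0.0500004   # temperature with the microwave signal applied

power = G * (T_meas - T_base)
print(f"inferred power: {power:.1e} W")   # well below a femtowatt (1e-15 W)
```

The hard part in practice isn’t this arithmetic but resolving such tiny temperature changes, which is what the continuous calibration is for.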

The team from Aalto University in Finland partnered up with researchers at the quantum-technology companies Bluefors and IQM. Their newly developed nanobolometer is so precise because it’s continuously calibrated by a reference heater that provides an exactly known current.

The reason they’re interested in doing this is that they want to exactly know the power of microwaves, which are often used to manipulate qubits in quantum computers. Though maybe one day it’ll also tell you how many nanocalories you burned on the cross-trainer.

Researchers from the University of Tokyo have successfully tested a new system to map tunnels, and it works with muons.

GPS is all well and good, but it requires contact with satellites, and that works badly underground. If you want to triangulate underground, you need something that can go through rock, lots of rock. For example, muons.

Muons are a type of elementary particle. They are created when cosmic rays hit the upper atmosphere, and constantly rain down on us. Mostly, they go through.

So the idea of the new system is to use a set of detectors on the surface that register when a muon goes through and track the time and direction. A receiver underground then travels along the path one wants to map. Once back on the surface, the muons that the receiver detected can be cross-checked with those at the surface detectors, allowing the path to be reconstructed.
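Here is a geometric toy of that cross-check, my own illustration rather than the team’s algorithm: each matched muon defines a line through a surface crossing point with a known direction, and the receiver position is estimated as the least-squares intersection of those lines.

```python
import numpy as np

rng = np.random.default_rng(3)
receiver = np.array([12.0, -5.0, -30.0])    # true underground position in metres

# Each muon seen by both the surface array and the underground receiver gives
# one line: a surface crossing point (at z = 0) and a downward direction.
points, dirs = [], []
for _ in range(6):
    d = np.array([rng.normal(0.0, 0.3), rng.normal(0.0, 0.3), -1.0])
    d /= np.linalg.norm(d)
    t = receiver[2] / d[2]                  # parameter where the track crosses z = 0
    s = receiver - t * d                    # surface crossing point of the track
    points.append(s + rng.normal(0.0, 0.1, size=3))   # detector noise, ~10 cm
    dirs.append(d)

# The receiver sits near the common intersection of all matched tracks:
# the least-squares point minimising the summed squared distance to the lines.
A = np.zeros((3, 3))
b = np.zeros(3)
for s, d in zip(points, dirs):
    P = np.eye(3) - np.outer(d, d)          # projector orthogonal to the track
    A += P
    b += P @ s
estimate = np.linalg.solve(A, b)
print("estimated position:", np.round(estimate, 2))
```

Because cosmic-ray muons arrive mostly from near-vertical directions, the depth is the weakly constrained coordinate, which fits with the 10-to-20-metre accuracy quoted below.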

They can pinpoint positions to an accuracy of 10 to 20 meters, so if you asked them to look for your keys, the answer would be “somewhere here”.

And now to some bad climate news. Antarctic sea ice levels are already extremely low for this time of the year. In this figure from Elliot Jacobson, you see the current year in red in comparison to the past 30 years in various shades of blue.

In this figure you see the global surface temperature. The current year is the blue line, the green curve is the pre-industrial average. The surface temperature has already breached the 1.5 degrees Celsius line multiple days in a row, even though the Northern Hemisphere summer isn’t yet in full swing.

And in the North Atlantic, ocean temperatures have reached new daily records. This graph shows you just how far outside the normal range the ocean temperature in the North Atlantic is this year.

Some of the global sea heating can be explained by natural climate variability, such as the current transition to the warmer El Niño phase, but the heating in the North Atlantic is extreme. One possible explanation, put forward by Leon Simons on Twitter, is that it’s related to new regulations on ship emissions that went into effect in 2020.

Sulfate aerosols from combustion engines are known to increase the amount of sunlight that is reflected. While a tweet is not a research study, it’s indeed plausible that in past years ship emissions had a cooling effect on the North Atlantic, which is one of the busiest shipping routes. Another reason might be that so far this year there’s been relatively little Sahara dust blown over the North Atlantic, and this dust also has a cooling effect.

The brief summary of what this all means is that the Northern Hemisphere is likely in for several very hot summers. So, yeah, keep on dusting those solar panels, guys.

