
Nearby star cluster houses unusually large black hole

From left to right, zooming in from a wide view of the globular cluster to the region where its central black hole must reside. (credit: ESA/Hubble & NASA, M. Häberle)

Supermassive black holes appear to reside at the center of every galaxy and to have done so since galaxies first formed early in the history of the Universe. Currently, however, we can't entirely explain their existence: it's difficult to understand how they could have grown to supermassive scales as quickly as they did.

A possible bit of evidence was recently found using about 20 years of data from the Hubble Space Telescope. The data comes from a globular cluster of stars that's thought to be the remains of a dwarf galaxy, and it shows that stars near the cluster's core are moving so fast that they should have been ejected from it entirely. That implies that something massive is keeping them there, which the researchers argue is a rare intermediate-mass black hole, weighing in at over 8,000 times the mass of the Sun.
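The logic here is just escape velocity: a star seen moving at speed v at distance r from the cluster's center can stay bound only if enough mass sits inside its orbit. Here is a minimal back-of-envelope sketch, using made-up illustrative numbers rather than the team's actual measurements:

```python
# Back-of-envelope version of the argument (illustrative numbers only, not the
# paper's measurements): a star moving at speed v at radius r stays bound only
# if v < v_esc = sqrt(2GM/r), so seeing bound stars at speed v implies an
# enclosed mass of at least M = v**2 * r / (2 * G).

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
LIGHT_YEAR = 9.461e15  # meters

v = 60e3               # hypothetical stellar speed near the core, m/s
r = 0.1 * LIGHT_YEAR   # hypothetical distance from the putative black hole

m_min = v**2 * r / (2 * G)
print(f"Minimum enclosed mass: {m_min / M_SUN:,.0f} solar masses")
# With these made-up values, about 13,000 solar masses -- squarely in the
# intermediate-mass range the researchers argue for.
```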

Moving fast

The fast-moving stars reside in Omega Centauri, the largest globular cluster in the Milky Way. With an estimated 10 million stars, it's a crowded environment, but observations are aided by its relative proximity, at "only" 17,000 light-years away. Those observations have been hinting that there might be a central black hole within the globular cluster, but the evidence has not been decisive.


Why every quantum computer will need a powerful classical computer

A single logical qubit is built from a large collection of hardware qubits. (credit: at digit)

One of the more striking things about quantum computing is that the field, despite not having proven itself especially useful, has already spawned a collection of startups that are focused on building something other than qubits. It might be easy to dismiss this as opportunism—trying to cash in on the hype surrounding quantum computing. But it can be useful to look at the things these startups are targeting, because they can be an indication of hard problems in quantum computing that haven't yet been solved by any one of the big companies involved in that space—companies like Amazon, Google, IBM, or Intel.

In the case of a UK-based company called Riverlane, the unsolved piece that is being addressed is the huge amount of classical computations that are going to be necessary to make the quantum hardware work. Specifically, it's targeting the huge amount of data processing that will be needed for a key part of quantum error correction: recognizing when an error has occurred.
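The excerpt doesn't describe Riverlane's decoder itself, but the flavor of the task shows up even in the simplest error-detecting scheme there is. The following is a classical toy, a three-bit repetition code, not real quantum error correction: parity checks flag that an error occurred without reading out the protected bit, and a decoder turns those checks into a correction.

```python
# Classical toy illustrating the decoder's job (not Riverlane's design, and a
# classical code rather than a true quantum one): a 3-bit repetition code.
# Parity checks ("syndromes") reveal that an error happened without reading
# the protected bit; the decoder turns syndromes into a correction.

import random

def encode(bit):
    """Protect one logical bit by storing three copies."""
    return [bit, bit, bit]

def apply_noise(codeword, p):
    """Flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in codeword]

def syndromes(codeword):
    """Parities of neighboring bits; (0, 0) means 'no error detected'."""
    return (codeword[0] ^ codeword[1], codeword[1] ^ codeword[2])

def decode(codeword):
    """Majority vote recovers the logical bit if at most one bit flipped."""
    return int(sum(codeword) >= 2)

word = apply_noise(encode(1), p=0.1)
print("syndromes:", syndromes(word), "-> decoded bit:", decode(word))
```

A real decoder faces the same task with parities arriving continuously from thousands of qubits, which is where the classical data-processing load comes from.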

Error detection vs. the data

All qubits are fragile, tending to lose their state during operations, or simply over time. No matter what the technology—cold atoms, superconducting transmons, whatever—these error rates put a hard limit on the amount of computation that can be done before an error is inevitable. That rules out running almost any useful computation directly on existing hardware qubits.
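To see why the limit is hard, treat each operation as failing independently with probability p; the chance a circuit of N operations runs error-free is then (1 - p)^N. A quick check of that formula (an idealized model; real noise is messier):

```python
# Idealized model of why error rates cap circuit size: if each operation fails
# independently with probability p, a circuit of N operations runs error-free
# with probability (1 - p)**N, which collapses quickly as N grows.

for p in (1e-2, 1e-3):
    for n in (100, 1_000, 10_000):
        print(f"p={p:g}  N={n:>6}  success ~ {(1 - p)**n:.2e}")
```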


ITER fusion reactor to see further delays, with operations pushed to 2034

One of the components of the reactor during leak testing. (credit: ITER)

On Tuesday, the people managing the ITER experimental fusion reactor announced that a combination of delays and altered priorities meant that its first-of-its-kind hardware wouldn't see plasma until 2036, with the full-energy deuterium-tritium fusion pushed back to 2039. The latter represents a four-year delay relative to the previous roadmap. While the former is also a delay, it's due in part to changing priorities.

COVID and construction delays

ITER is an attempt to build a fusion reactor that's capable of sustaining plasmas that allow it to operate well beyond the break-even point, where the energy released by fusion reactions significantly exceeds the energy required to create the conditions that enable those reactions. It's meant to hit that milestone by scaling up a well-understood design called a tokamak.
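"Well beyond the break-even point" is usually expressed as the fusion gain Q: the ratio of fusion power released to the heating power injected into the plasma. Q = 1 is break-even, and ITER's design target is Q ≥ 10, i.e., 500 MW of fusion power from 50 MW of heating. In code form, the arithmetic is trivial:

```python
# Break-even in numbers: the fusion gain Q is fusion power out over heating
# power in. Q = 1 is break-even; ITER's design target is Q >= 10,
# i.e., 500 MW of fusion power from 50 MW of plasma heating.

def fusion_gain(p_fusion_mw: float, p_heating_mw: float) -> float:
    return p_fusion_mw / p_heating_mw

print(fusion_gain(500, 50))  # -> 10.0
```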

But the project has been plagued by delays and cost overruns nearly from its start. At early stages, many of these stemmed from design changes necessitated by an improved understanding of plasmas held at extreme pressures and temperatures, which came from better modeling capabilities and from observing the behavior of plasmas in smaller reactors.


High-altitude cave used by Tibetan Buddhists yields a Denisovan fossil

The Baishiya Karst Cave, where the recently analyzed samples were obtained. (credit: Dongju Zhang’s group (Lanzhou University))

For well over a century, we had the opportunity to study Neanderthals—their bones, the items they left behind, their distribution across Eurasia. So, when we finally obtained the sequence of their genome and discovered that we share a genetic legacy with them, it was easy to place the discoveries into context. By contrast, we had no idea Denisovans existed until sequencing DNA from a small finger bone revealed that yet another relative of modern humans had roamed Asia in the recent past.

Since then, we've learned little more. The frequency of their DNA in modern human populations suggests that they were likely concentrated in East Asia. But we've only discovered fragments of bone and a few teeth, so we can't even make very informed guesses as to what they might have looked like. On Wednesday, an international group of researchers described finds from a cave on the Tibetan Plateau that had been occupied by Denisovans, which tell us a bit more about these relatives: what they ate. And that appears to be anything they could get their hands on.

The Baishiya Karst Cave

The finds come from a site called the Baishiya Karst Cave, which is perched on a cliff on the northeast of the Tibetan Plateau. It's located at a high altitude (over 3,000 meters or nearly 11,000 feet) but borders a high open plain, as you can see in the picture below.


The Earth heated up when its day was 22 hours long

(credit: Roman Studio)

Because most things about Earth change so slowly, it's difficult to imagine them being any different in the past. But Earth's rotation has been slowing due to tidal interactions with the Moon, meaning that days were considerably shorter in the past. It's easy to think that a 22-hour day wouldn't be all that different, but that turns out not to be entirely true.

For example, some modeling has indicated that certain day lengths would be in resonance with other effects caused by the planet's rotation, potentially offsetting the drag caused by the tides. Now, a new paper looks at how these resonances could affect the climate. The results suggest that rainfall would shift to the morning and evening, leaving midday skies largely cloud-free. The resulting Earth would be considerably warmer.
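For a sense of the timescale involved, assume the present-day slowing rate of roughly 1.8 milliseconds per century held constant. That's a big assumption, since the resonances just described can stall the slowdown, but it puts a 22-hour day on the order of 400 million years in the past:

```python
# Timescale sanity check, assuming today's tidal slowing rate of roughly
# 1.8 ms per century held constant -- a big assumption, since the resonances
# described above can stall the slowdown for long stretches.

MS_PER_CENTURY = 1.8              # approximate present-day slowing rate
two_hours_ms = 2 * 3600 * 1000    # how much shorter a 22-hour day is, in ms

centuries = two_hours_ms / MS_PER_CENTURY
print(f"~{centuries * 100 / 1e6:.0f} million years ago")  # ~400 million years
```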

On the Lamb

We're all pretty familiar with the fact that the daytime Sun warms up the air. And those of us who remember high school chemistry will recall that a gas that is warmed will expand. So, it shouldn't be a surprise to hear that the Earth's atmosphere expands due to warming on its day side and contracts back again as it cools (these lag the daytime peak in sunlight). These differences provide something a bit like a handle that the gravitational pulls of the Sun and Moon can grab onto, exerting additional forces on the atmosphere. This complicated network of forces churns our atmosphere, helping shape the planet's weather.


Bipartisan consensus in favor of renewable power is ending

Solar panels on a green grassy field, with blue sky in the background. (credit: tigerstrawberry)

One of the most striking things about the explosion of renewable power that's happening in the US is that much of it is going on in states governed by politicians who don't believe in the problem wind and solar are meant to address. Acceptance of the evidence for climate change tends to be lowest among Republicans, yet many of the states where renewable power has boomed—wind in Wyoming and Iowa, solar in Texas—are governed by Republicans.

That's partly because, up until about 2020, there was a strong bipartisan consensus in favor of expanding wind and solar power, with support above 75 percent among both parties. Since then, however, support among Republicans has dropped dramatically, approaching 50 percent, according to polling data released this week.

Renewables enjoyed solid Republican support until recently. (credit: Pew Research)

To a certain extent, none of this should be surprising. The current leader of the Republican Party has been saying that wind turbines cause cancer and offshore wind is killing whales. And conservative-backed groups have been spreading misinformation in order to drum up opposition to solar power facilities.


Supreme Court issues stay on EPA’s ozone plan, despite blistering dissent

Ozone-producing chemicals come from a variety of sources and don't respect state borders. (credit: John Edward Linden)

On Tuesday, a slim majority of the US Supreme Court issued an emergency ruling that places a stay on rules developed by the Environmental Protection Agency, meant to limit the spread of ozone-generating pollutants across state lines. Because it was handled on an emergency basis, the decision was made without any evidence gathered during lower court proceedings. As a result, the justices don't even agree on the nature of the regulations the EPA has proposed, leading to a blistering dissent from Justice Amy Coney Barrett, who was joined by the court's three liberal justices.

Bad neighbors

The rule at issue arose from the EPA's regular process of revisiting existing limits in light of changes in public health information and pollution-control technology. In this case, the focus was on ozone-producing chemicals; in 2015, the EPA chose to lower the limit on ozone from 75 to 70 parts per billion.

Once these standards are set, states are required to submit plans that fulfill two purposes. One is to limit pollution within the state itself; the second involves pollution controls that will limit the exposure in states that are downwind of the pollution sources. The EPA is required to evaluate these plans; if they are deemed insufficient, the EPA can require the states to follow a federal plan devised by the EPA.


DNA-based bacterial parasite uses completely new DNA-editing method

Top row: individual steps in the reaction process. Bottom row: cartoon diagram of the top, showing the position of each DNA and RNA strand. (credit: Hiraizumi et al.)

While CRISPR is probably the most prominent gene-editing technology, there are others, some developed before and since. And people have been developing CRISPR variants to perform more specialized functions, like altering specific bases. In all of these cases, researchers are trying to balance a number of competing factors: convenience, flexibility, specificity and precision for the editing, low error rates, and so on.

So, having additional options for editing can be a good thing, enabling new ways of balancing those different needs. On Wednesday, a pair of papers in Nature describe a DNA-based parasite that moves itself around bacterial genomes through a mechanism that hasn't been previously described. It's nowhere near ready for use in humans, but it may have some distinctive features that make it worth further development.

Going mobile

Mobile genetic elements, commonly called transposons, are quite common in many species—they make up nearly half the sequences in the human genome, for example. They are indeed mobile, showing up in new locations throughout the genome, sometimes by cutting themselves out and hopping to new locations, other times by sending a copy out to a new place in the genome. For any of this to work, they need to have an enzyme that cuts DNA and specifically recognizes the right transposon sequence to insert into the cut.
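As a toy illustration of the two modes just described (made-up sequences, purely illustrative, with none of the real enzymology), "cut and paste" excises the element before reinserting it, while "copy and paste" leaves the original in place:

```python
# Toy model of the two transposition modes just described (made-up sequences,
# purely illustrative): "cut and paste" excises the element before reinserting
# it; "copy and paste" leaves the original in place, so the element multiplies.

TRANSPOSON = "TGCA"

def cut_and_paste(genome: str, site: int) -> str:
    """Excise the transposon, then reinsert it at a new site."""
    host = genome.replace(TRANSPOSON, "", 1)
    return host[:site] + TRANSPOSON + host[site:]

def copy_and_paste(genome: str, site: int) -> str:
    """Insert a fresh copy at a new site, leaving the original behind."""
    return genome[:site] + TRANSPOSON + genome[site:]

genome = "AAAA" + TRANSPOSON + "CCCC"
print(cut_and_paste(genome, 2))   # AATGCAAACCCC     -- still one copy, new spot
print(copy_and_paste(genome, 2))  # AATGCAAATGCACCCC -- now two copies
```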


Congress passes bill to jumpstart new nuclear power tech

A nuclear reactor and two cooling towers on a body of water, with a late-evening glow in the sky. (credit: hrui)

Earlier this week, the US Senate passed what's being called the ADVANCE Act, for Accelerating Deployment of Versatile, Advanced Nuclear for Clean Energy. Among a number of other changes, the bill would attempt to streamline permitting for newer reactor technology and offer cash incentives for the first companies that build new plants that rely on one of a handful of different technologies. It enjoyed broad bipartisan support both in the House and Senate and now heads to President Biden for his signature.

Given Biden's penchant for promoting his bipartisan credentials, it's likely to be signed into law. But the biggest hurdles nuclear power faces are all economic, rather than regulatory, and the bill provides very little in the way of direct funding that could help overcome those barriers.

Incentives

For reasons that will be clear only to congressional staffers, the Senate version of the bill was attached to an amendment to the Federal Fire Prevention and Control Act. Nevertheless, it passed by a margin of 88-2, indicating widespread (and potentially veto-proof) support. Having passed the House already, there's nothing left but the president's signature.


Researchers describe how to tell if ChatGPT is confabulating

(credit: Aurich Lawson | Getty Images)

It's one of the world's worst-kept secrets that large language models give blatantly false answers to queries and do so with a confidence that's indistinguishable from when they get things right. There are a number of reasons for this. The AI could have been trained on misinformation; the answer could require an extrapolation from facts that the LLM isn't capable of making; or some aspect of the LLM's training might have incentivized a falsehood.

But perhaps the simplest explanation is that an LLM doesn't recognize what constitutes a correct answer but is compelled to provide one. So it simply makes something up, a habit that has been termed confabulation.

Figuring out when an LLM is making something up would obviously have tremendous value, given how quickly people have started relying on them for everything from college essays to job applications. Now, researchers from the University of Oxford say they've found a relatively simple way to determine when LLMs appear to be confabulating that works with all popular models and across a broad range of subjects. And, in doing so, they develop evidence that most of the alternative facts LLMs provide are a product of confabulation.
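The excerpt doesn't spell out the method, but one common family of approaches works by sampling the model several times on the same question, grouping answers that mean the same thing, and scoring the uncertainty of the resulting groups; a model that keeps changing its story scores high. The sketch below follows that general idea and is not necessarily the Oxford team's exact recipe; the `same_meaning` checker is a hypothetical stand-in for a real semantic-equivalence test.

```python
# Hedged sketch of the general idea (not necessarily the Oxford team's exact
# method): ask the model the same question several times, cluster answers that
# mean the same thing, and compute the entropy of the clusters. A model that
# keeps changing its story yields high entropy, a sign of confabulation.

import math

def semantic_entropy(answers, same_meaning):
    """Entropy over meaning-clusters. `same_meaning(a, b)` is a hypothetical
    stand-in for a real semantic-equivalence check (e.g., an entailment model)."""
    clusters = []
    for answer in answers:
        for cluster in clusters:
            if same_meaning(answer, cluster[0]):
                cluster.append(answer)
                break
        else:
            clusters.append([answer])
    n = len(answers)
    return 0.0 - sum((len(c) / n) * math.log2(len(c) / n) for c in clusters)

# Trivial stand-in: exact string equality counts as "same meaning."
consistent = ["Paris", "Paris", "Paris", "Paris", "Paris"]
flaky = ["Paris", "Lyon", "Marseille", "Paris", "Nice"]
print(semantic_entropy(consistent, str.__eq__))  # 0.0  -- confident
print(semantic_entropy(flaky, str.__eq__))       # ~1.9 -- likely confabulating
```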


When did humans start social knowledge accumulation?

Two worked pieces of stone, one an axe head and one a scraper. (credit: IURII BUKHTA)

A key aspect of humans' evolutionary success is the fact that we don't have to learn how to do things from scratch. Our societies have developed various ways—from formal education to YouTube videos—to convey what others have learned. This makes learning how to do things far easier than learning by doing, and it gives us more space to experiment; we can learn to build new things or handle tasks more efficiently, then pass that knowledge on to others.

Some of our closer relatives, like chimps and bonobos, learn from their fellow species-members, but they don't seem to engage in this iterative process of improvement—they don't, in technical terms, have a cumulative culture in which new technologies are built on past knowledge. So, when did humans develop this ability?

Based on a new analysis of stone toolmaking, two researchers are arguing that the ability is relatively recent, dating to just 600,000 years ago. That's roughly the same time our ancestors and the Neanderthals went their separate ways.

