The Summit 1 is not peak e-mountain bike, but it’s a great all-rounder

Image: A blue hardtail mountain bike leaning against a grey stone wall. (credit: John Timmer)

As I mentioned in another recent review, I've been checking out electric hardtail mountain bikes lately. Their relative simplicity compared to full-suspension models tends to allow companies to hit a lower price point without sacrificing much in terms of component quality, potentially opening up mountain biking to people who might not otherwise consider it. The first e-hardtail I checked out, Aventon's Ramblas, fits this description to a T, offering a solid trail riding experience at a price that's competitive with similar offerings from major manufacturers.

Velotric's Summit 1 has a slightly different take on the equation. The company has made a few compromises that allowed it to bring the price down to just under $2,000, significantly lower than much of the competition. The result is a bike that's a bit of a step down on more challenging trails. But it can still do about 90 percent of what most alternatives offer, and it's probably a better all-around bicycle for people who also intend to use it for commuting or errand-running.

Making the Summit

Velotric is another e-bike-only company, and we've generally been impressed by its products, which offer a fair bit of value for their price. The Summit 1 seems to be a reworking of its T-series bikes (which also impressed us) into mountain bike form. You get a similar app experience and integration of the bike into Apple's Find My system, though the company has ditched the thumbprint reader that served as a security measure on those models. Velotric has also done some nice work adapting its packaging to smooth out the assembly process, placing different parts in labeled sub-boxes.

Read 19 remaining paragraphs | Comments

US solar production soars by 25 percent in just one year

Image: A lone construction worker amid a sea of solar panels. (credit: Vithun Khamsong)

With the plunging price of photovoltaics, the construction of solar plants has boomed in the US. Last year, for example, the US Energy Information Administration (EIA) expected that over half of the new generating capacity would be solar, with a lot of it coming online at the very end of the year for tax reasons. Yesterday, the EIA released electricity generation numbers for the first five months of 2024, and that construction boom has seemingly made itself felt: generation by solar power has shot up by 25 percent compared to just one year earlier.

The EIA breaks down solar production according to the size of the plant. Large grid-scale facilities have their production tracked directly, giving the EIA hard numbers. For smaller installations, like rooftop solar on residential and commercial buildings, the agency has to estimate the amount produced, since the hardware often sits behind the customer's meter and so only shows up as lower-than-expected consumption.

In terms of utility-scale production, the first five months of 2024 saw it rise by 29 percent compared to the same period in the year prior. Small-scale solar was "only" up by 18 percent, with the combined number rising by 25.3 percent.
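Those two growth rates also imply how last year's solar output was split between plant sizes. Here's a minimal back-of-the-envelope sketch in Python, using only the percentages quoted above (no other data assumed):

    # If utility-scale solar grew 29% and small-scale grew 18%, while the
    # combined total grew 25.3%, then the utility-scale share u of last
    # year's solar generation satisfies: 1.29*u + 1.18*(1 - u) = 1.253
    utility_growth = 0.29
    small_growth = 0.18
    combined_growth = 0.253

    u = (combined_growth - small_growth) / (utility_growth - small_growth)
    print(f"Implied utility-scale share of last year's solar output: {u:.0%}")
    # -> roughly 66%, i.e. utility-scale plants supplied about two-thirds of
    #    US solar generation in the first five months of 2023.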

Read 7 remaining paragraphs | Comments

Webb directly images giant exoplanet that isn’t where it should be

Image: Epsilon Indi A at two wavelengths, with the position of its host star indicated by an asterisk. (credit: T. Müller (MPIA/HdA), E. Matthews (MPIA))

We have a couple of techniques that allow us to infer the presence of an exoplanet based on its effects on the light coming from its host star. But there's an alternative approach that sometimes works: imaging the planet directly. It's much more limited, since the planet has to be pretty big and orbiting far enough away from its star to keep the light coming from the planet from being swamped by the far more intense starlight.

Still, it has been done. Massive exoplanets have been captured relatively shortly after their formation, when the heat generated by the collapse of material into the planet causes them to glow in the infrared. But the Webb telescope is far more sensitive than any infrared observatory we've ever built, and it has managed to image a relatively nearby exoplanet that's roughly as old as the ones in our Solar System.

Looking directly at a planet

What do you need to directly image a planet that's orbiting a star light-years away? The first thing is a bit of hardware called a coronagraph attached to your telescope. This is responsible for blocking the light from the star the planet is orbiting; without it, that light will swamp any other sources in the exosolar system. Even with a good coronagraph, you need the planets to be orbiting at a significant distance from the star so that they're cleanly separated from the signal being blocked by the coronagraph.
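To get a feel for the separation requirement, here's a minimal sketch using the standard small-angle relation (apparent separation in arcseconds is roughly the orbital distance in AU divided by the system's distance in parsecs). The specific numbers are illustrative assumptions, not values from the study:

    # Why direct imaging favors big, widely separated planets around nearby
    # stars. Illustrative numbers only -- not taken from the Webb observations.
    orbital_distance_au = 15.0    # assumed planet-star separation
    system_distance_pc = 3.6      # roughly Epsilon Indi's distance (~11.9 light-years)

    separation_arcsec = orbital_distance_au / system_distance_pc
    print(f"Apparent separation: {separation_arcsec:.1f} arcseconds")
    # A few arcseconds sits comfortably outside a coronagraph's inner working
    # angle; a planet at 1 AU around the same star would appear at only about
    # 0.3 arcseconds and be far harder to separate from the starlight.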

Read 9 remaining paragraphs | Comments

Appeals Court denies stay to states trying to block EPA’s carbon limits

Image: Cooling towers emitting steam, viewed from above. (credit: Bernhardt Lang)

On Friday, the US Court of Appeals for the DC Circuit denied a request to put a hold on recently formulated rules that would limit carbon emissions from fossil fuel power plants. The request, made as part of a case that sees 25 states squaring off against the EPA, would have put the federal government's plan on hold while the case continued. Instead, the EPA will be allowed to continue the process of putting its rules into effect, and the larger case will be heard on an accelerated schedule.

Here we go again

The EPA's efforts to regulate carbon emissions from power plants go back all the way to the second Bush administration, when a group of states successfully sued the EPA to force it to regulate greenhouse gas emissions. This led to a formal endangerment finding regarding greenhouse gases during the Obama administration, something that remained unchallenged even during Donald Trump's term in office.

Obama tried to regulate emissions through the Clean Power Plan, but his second term came to an end before this plan had cleared court hurdles, allowing the Trump administration to formulate a replacement that did far less than the Clean Power Plan. This took place against a backdrop of accelerated displacement of coal by natural gas and renewables that had already surpassed the changes envisioned under the Clean Power Plan.

Read 6 remaining paragraphs | Comments

Model mixes AI and physics to do global forecasts

Image: Some of the atmospheric circulation seen during NeuralGCM runs. (credit: Google)

Right now, the world's best weather forecast model is a General Circulation Model, or GCM, put together by the European Centre for Medium-Range Weather Forecasts. A GCM is in part based on code that calculates the physics of various atmospheric processes that we understand well. For a lot of the rest, GCMs rely on what's termed "parameterization," which attempts to use empirically determined relationships to approximate what's going on with processes where we don't fully understand the physics.

Lately, GCMs have faced some competition from machine-learning techniques, which train AI systems to recognize patterns in meteorological data and use those to predict the conditions that will result over the next few days. Their forecasts, however, tend to get a bit vague after more than a few days and can't deal with the sort of long-term factors that need to be considered when GCMs are used to study climate change.

On Monday, a team from Google's AI group and the European Centre for Medium-Range Weather Forecasts is announcing NeuralGCM, a system that mixes physics-based atmospheric circulation with AI parameterization of other meteorological influences. NeuralGCM is computationally efficient and performs very well in weather forecast benchmarks. Strikingly, it can also produce reasonable-looking output for runs that cover decades, potentially allowing it to address some climate-relevant questions. While it can't handle a lot of what we use climate models for, there are some obvious routes for potential improvements.
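The division of labor described above can be sketched in a few lines of Python. This is only a toy illustration of the hybrid idea, not NeuralGCM's actual code or API; all names, shapes, and the stand-in "physics" are assumptions:

    import numpy as np

    def physics_step(state: np.ndarray, dt: float) -> np.ndarray:
        """Stand-in for a dynamical core computing well-understood physics."""
        return state + dt * 0.1 * (np.roll(state, 1) - state)

    def learned_correction(state: np.ndarray, weights: np.ndarray) -> np.ndarray:
        """Stand-in for a neural network replacing hand-tuned parameterizations."""
        return np.tanh(state @ weights)

    def hybrid_step(state, weights, dt=0.01):
        # Physics handles the resolved circulation; the learned term fills in
        # processes (clouds, convection, etc.) whose physics isn't fully known.
        return physics_step(state, dt) + dt * learned_correction(state, weights)

    rng = np.random.default_rng(0)
    state = rng.standard_normal(16)             # toy atmospheric state vector
    weights = rng.standard_normal((16, 16)) * 0.1
    for _ in range(100):                        # roll the model forward in time
        state = hybrid_step(state, weights)
    print(state[:4])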

Read 16 remaining paragraphs | Comments

Aventon, a major e-bike maker, tries its hand with a hardtail

Image: Aventon's Ramblas hardtail mountain bike. (credit: John Timmer)

Full suspension mountain bikes are complicated beasts, with sections of the frame that pivot and a shock absorber to moderate that pivot. These parts help limit the bumps that reach your body and keep your rear tire in contact with the trail across all sorts of terrain and obstacles. The complexity and additional parts, however, boost the costs of full suspension bikes considerably, a situation that only gets worse when you electrify things.

As a result, some of the electric mountain bikes we've looked at are either very expensive or make a few too many compromises to bring the price down. Even a bike that compromises with middle-of-the-road hardware costs in the area of $5,000.

But there's one easy way to lower the price considerably: lose the full suspension. The electric "hardtails" from major manufacturers typically cost considerably less than a full suspension bike with similar components. And because the engineering demands are considerably lower than in a full suspension bike, it's easier for some of the smaller e-bike companies to put together a solid offering.

Read 17 remaining paragraphs | Comments

Researchers build ultralight drone that flies with onboard solar

Image: The CoulombFly, from top to bottom: a propeller, a cylindrical electrostatic motor, a stalk, and a flat slab carrying solar panels and electronics. (credit: Nature)

On Wednesday, researchers reported that they had developed a drone they're calling the CoulombFly, which is capable of self-powered hovering for as long as the Sun is shining. The drone, which is shaped like no aerial vehicle you've ever seen before, combines solar cells, a voltage converter, and an electrostatic motor to drive a helicopter-like propeller—with all components having been optimized for a balance of efficiency and light weight.

Before people get excited about buying one, the list of caveats is extensive. There's no onboard control hardware, and the drone isn't capable of directed flight anyway, meaning it would drift on the breeze if ever set loose outdoors. Lots of the components appear quite fragile, as well. However, the design can be miniaturized, and the researchers built a version that weighs only 9 milligrams.

Built around a motor

One key to this development was the researchers' recognition that most drones use electromagnetic motors, which involve lots of metal coils that add significant weight to any system. So, the team behind the work decided to focus on developing a lightweight electrostatic motor. These rely on charge attraction and repulsion to power the motor, as opposed to magnetic interactions.

Read 12 remaining paragraphs | Comments

Much of Neanderthal genetic diversity came from modern humans

Image: A large, brown-colored skull seen in profile against a black background. (credit: Halamka)

The basic outline of the interactions between modern humans and Neanderthals is now well established. The two came in contact as modern humans began their major expansion out of Africa, which occurred roughly 60,000 years ago. Humans picked up some Neanderthal DNA through interbreeding, while the Neanderthal population, always fairly small, was swept away by the waves of new arrivals.

But there are some aspects of this big-picture view that don't entirely line up with the data. While it nicely explains the fact that Neanderthal sequences are far more common in non-African populations, it doesn't account for the fact that every African population we've looked at has some DNA that matches up with Neanderthal DNA.

A study published on Thursday argues that much of this match came about because an early modern human population also left Africa and interbred with Neanderthals. But in this case, the result was to introduce modern human DNA to the Neanderthal population. The study shows that this DNA accounts for a lot of Neanderthals' genetic diversity, suggesting that their population was even smaller than earlier estimates had suggested.

Read 14 remaining paragraphs | Comments

Frozen mammoth skin retained its chromosome structure

Image: Artist's depiction of a large mammoth with brown fur and huge, curving tusks in an icy tundra environment. (credit: LEONELLO CALVETTI/SCIENCE PHOTO LIBRARY)

One of the challenges of working with ancient DNA samples is that damage accumulates over time, breaking up the structure of the double helix into ever smaller fragments. In the samples we've worked with, these fragments scatter and mix with contaminants, making genome reconstruction a major technical challenge.

But a dramatic paper released on Thursday shows that this isn't always true. Damage does create progressively smaller fragments of DNA over time. But, if they're trapped in the right sort of material, they'll stay right where they are, essentially preserving some key features of ancient chromosomes even as the underlying DNA decays. Researchers have now used that to detail the chromosome structure of mammoths, with some implications for how these mammals regulated some key genes.

DNA meets Hi-C

The backbone of DNA's double helix consists of alternating sugars and phosphates, chemically linked together (the bases of DNA are chemically linked to these sugars). Damage from things like radiation can break these chemical linkages, with fragmentation increasing over time. When samples reach the age of something like a Neanderthal, very few fragments are longer than 100 base pairs. Since chromosomes are millions of base pairs long, it was thought that this would inevitably destroy their structure, as many of the fragments would simply diffuse away.
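A toy simulation makes the scale of that fragmentation concrete. This is only an illustration of random backbone breaks accumulating over time, not the paper's model; the chromosome length and break count are arbitrary assumptions:

    import random

    chromosome_length = 1_000_000       # bases (real chromosomes are far longer)
    n_breaks = 20_000                   # breaks accumulated over time (assumed)

    random.seed(0)
    breaks = sorted(random.sample(range(1, chromosome_length), n_breaks))
    edges = [0] + breaks + [chromosome_length]
    fragments = [b - a for a, b in zip(edges, edges[1:])]

    mean_len = sum(fragments) / len(fragments)
    short = sum(f <= 100 for f in fragments) / len(fragments)
    print(f"Mean fragment length: {mean_len:.0f} bases")
    print(f"Fraction of fragments <= 100 bases: {short:.0%}")
    # With enough breaks, most fragments end up shorter than 100 bases -- yet,
    # as the new work shows, fragments trapped in the right material stay put,
    # preserving chromosome-scale structure even as the DNA itself decays.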

Read 18 remaining paragraphs | Comments

Nearby star cluster houses unusually large black hole

Image: From left to right, zooming in from the globular cluster to the site of its central black hole. (credit: ESA/Hubble & NASA, M. Häberle)

Supermassive black holes appear to reside at the center of every galaxy and to have done so since galaxies formed early in the history of the Universe. Currently, however, we can't entirely explain their existence, since it's difficult to understand how they could have grown to the supermassive cutoff as quickly as they did.

A possible bit of evidence was recently found by using about 20 years of data from the Hubble Space Telescope. The data comes from a globular cluster of stars that's thought to be the remains of a dwarf galaxy and shows that a group of stars near the cluster's core are moving so fast that they should have been ejected from it entirely. That implies that something massive is keeping them there, which the researchers argue is a rare intermediate-mass black hole, weighing in at over 8,000 times the mass of the Sun.

Moving fast

The fast-moving stars reside in Omega Centauri, the largest globular cluster in the Milky Way. With an estimated 10 million stars, it's a crowded environment, but observations are aided by its relative proximity, at "only" 17,000 light-years away. Those observations have been hinting that there might be a central black hole within the globular cluster, but the evidence has not been decisive.
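The logic behind that inference is essentially an escape-velocity argument, which can be sketched in a few lines. The black hole mass below echoes the "over 8,000 solar masses" figure above; the distance is an illustrative assumption:

    import math

    G = 6.674e-11                       # gravitational constant, m^3 kg^-1 s^-2
    M_SUN = 1.989e30                    # kg
    LIGHT_YEAR = 9.461e15               # m

    black_hole_mass = 8_000 * M_SUN     # roughly the mass the researchers infer
    distance = 0.1 * LIGHT_YEAR         # assumed distance of a fast-moving star

    v_escape = math.sqrt(2 * G * black_hole_mass / distance)
    print(f"Escape velocity at that distance: {v_escape / 1000:.0f} km/s")
    # Stars moving faster than the escape velocity set by the cluster's
    # ordinary stars alone should have been flung out long ago; adding an
    # unseen central mass raises the escape velocity enough to keep them bound.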

Read 11 remaining paragraphs | Comments

Why every quantum computer will need a powerful classical computer

Image: A single logical qubit is built from a large collection of hardware qubits. (credit: at digit)

One of the more striking things about quantum computing is that the field, despite not having proven itself especially useful, has already spawned a collection of startups that are focused on building something other than qubits. It might be easy to dismiss this as opportunism—trying to cash in on the hype surrounding quantum computing. But it can be useful to look at the things these startups are targeting, because they can be an indication of hard problems in quantum computing that haven't yet been solved by any one of the big companies involved in that space—companies like Amazon, Google, IBM, or Intel.

In the case of a UK-based company called Riverlane, the unsolved piece being addressed is the huge amount of classical computation that will be necessary to make the quantum hardware work. Specifically, it's targeting the data processing needed for a key part of quantum error correction: recognizing when an error has occurred.

Error detection vs. the data

All qubits are fragile, tending to lose their state during operations or simply over time. No matter what the technology—cold atoms, superconducting transmons, whatever—these error rates put a hard limit on the amount of computation that can be done before an error is inevitable. That rules out running almost any useful computation directly on existing hardware qubits.
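To see why that classical workload exists at all, here's a minimal sketch of syndrome decoding using a simple three-bit repetition code. It's purely illustrative; real machines will use far larger codes (such as surface codes), and nothing here reflects Riverlane's actual design:

    from typing import List, Optional, Tuple

    def measure_syndrome(bits: List[int]) -> Tuple[int, int]:
        """Parity checks between neighboring bits; a 1 flags a disagreement."""
        return (bits[0] ^ bits[1], bits[1] ^ bits[2])

    def decode(syndrome: Tuple[int, int]) -> Optional[int]:
        """Classical decoder: map each syndrome to the bit that likely flipped."""
        return {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}[syndrome]

    # A logical 0 encoded across three physical bits, with an error on bit 1.
    noisy = [0, 1, 0]
    flipped = decode(measure_syndrome(noisy))
    if flipped is not None:
        noisy[flipped] ^= 1             # apply the correction
    print(noisy)                        # -> [0, 0, 0]
    # Real hardware has to run this sort of decoding across thousands of
    # qubits, every error-correction cycle, fast enough to keep pace with the
    # quantum processor, which is where the classical processing burden lies.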

Read 13 remaining paragraphs | Comments

ITER fusion reactor to see further delays, with operations pushed to 2034

Image: One of the reactor's components during leak testing. (credit: ITER)

On Tuesday, the people managing the ITER experimental fusion reactor announced that a combination of delays and altered priorities meant that its first-of-its-kind hardware wouldn't see plasma until 2036, with the full-energy deuterium-tritium fusion pushed back to 2039. The latter represents a four-year delay relative to the previous roadmap. While the former is also a delay, it's due in part to changing priorities.

COVID and construction delays

ITER is an attempt to build a fusion reactor that's capable of sustaining plasmas that allow it to operate well beyond the break-even point, where the energy released by fusion reactions significantly exceeds the energy required to create the conditions that enable those reactions. It's meant to hit that milestone by scaling up a well-understood design called a tokamak.
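That break-even condition is usually expressed as the fusion gain factor Q, the ratio of fusion power released to heating power supplied. The numbers below reflect ITER's widely cited design target, which isn't stated in the excerpt above, so treat them as context rather than part of the announcement:

    def fusion_gain(p_fusion_mw: float, p_heating_mw: float) -> float:
        """Q = fusion power released / external heating power supplied."""
        return p_fusion_mw / p_heating_mw

    # ITER's design target: about 500 MW of fusion power from roughly 50 MW
    # of plasma heating, i.e. Q >= 10. Q = 1 is break-even.
    print(fusion_gain(500, 50))         # -> 10.0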

But the project has been plagued by delays and cost overruns nearly from its start. In the early stages, many of these stemmed from design changes necessitated by an improved understanding of plasmas held at extreme pressures and temperatures, driven by better modeling capabilities and by studies of plasma behavior in smaller reactors.

Read 7 remaining paragraphs | Comments

High-altitude cave used by Tibetan Buddhists yields a Denisovan fossil

Image: The Baishiya Karst Cave, where the recently analyzed samples were obtained. (credit: Dongju Zhang's group, Lanzhou University)

For well over a century, we had the opportunity to study Neanderthals—their bones, the items they left behind, their distribution across Eurasia. So, when we finally obtained the sequence of their genome and discovered that we share a genetic legacy with them, it was easy to place the discoveries into context. By contrast, we had no idea Denisovans existed until sequencing DNA from a small finger bone revealed that yet another relative of modern humans had roamed Asia in the recent past.

Since then, we've learned little more. The frequency of their DNA in modern human populations suggests that they were likely concentrated in East Asia. But we've only discovered fragments of bone and a few teeth since then, so we can't even make very informed guesses as to what they might have looked like. On Wednesday, an international group of researchers described finds from a cave on the Tibetan Plateau that had been occupied by Denisovans, which tell us a bit more about these relatives: what they ate. And that appears to be anything they could get their hands on.

The Baishiya Karst Cave

The finds come from a site called the Baishiya Karst Cave, which is perched on a cliff at the northeastern edge of the Tibetan Plateau. It's located at a high altitude (over 3,000 meters, or nearly 11,000 feet) but borders a high open plain, as you can see in the picture below.

Read 14 remaining paragraphs | Comments

The Earth heated up when its day was 22 hours long

Image credit: Roman Studio

Because most things about Earth change so slowly, it's difficult to imagine them being any different in the past. But Earth's rotation has been slowing due to tidal interactions with the Moon, meaning that days were considerably shorter in the past. It's easy to think that a 22-hour day wouldn't be all that different, but that turns out not to be entirely true.

For example, some modeling has indicated that certain day lengths will be in resonance with other effects caused by the planet's rotation, which can potentially offset the drag caused by the tides. Now, a new paper looks at how these resonances could affect the climate. The results suggest that the resonance would shift rainfall to the morning and evening while leaving midday skies largely cloud-free. The resulting Earth would be considerably warmer.

On the Lamb

We're all pretty familiar with the fact that the daytime Sun warms up the air. And those of us who remember high school chemistry will recall that a gas that is warmed will expand. So, it shouldn't be a surprise to hear that the Earth's atmosphere expands due to warming on its day side and contracts back again as it cools (these lag the daytime peak in sunlight). These differences provide something a bit like a handle that the gravitational pulls of the Sun and Moon can grab onto, exerting additional forces on the atmosphere. This complicated network of forces churns our atmosphere, helping shape the planet's weather.
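The resonance at the heart of the paper behaves like any driven oscillator: the response spikes when the forcing period (half the day length, for the twice-daily thermal tide) approaches the atmosphere's natural oscillation period. Here's a minimal sketch with made-up parameters, not the paper's values:

    import math

    def response_amplitude(forcing_period_h, natural_period_h, damping=0.05):
        """Relative amplitude of a damped oscillator driven at a given period."""
        w = 2 * math.pi / forcing_period_h
        w0 = 2 * math.pi / natural_period_h
        return 1.0 / math.sqrt((w0**2 - w**2)**2 + (damping * w)**2)

    natural = 11.0   # assumed natural period of the atmospheric oscillation, hours
    for day_length in (20, 22, 24, 26):
        tide_period = day_length / 2            # semidiurnal (twice-a-day) tide
        amp = response_amplitude(tide_period, natural)
        print(f"{day_length}-hour day: relative response {amp:6.1f}")
    # With an 11-hour natural period, the response peaks near a 22-hour day,
    # which is the kind of amplification the resonance argument relies on.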

Read 13 remaining paragraphs | Comments
