โŒ

Normal view

There are new articles available, click to refresh the page.

FreeDOS Founder Jim Hall: After 30 Years, What I've Learned About Open Source Community

1 July 2024 at 07:34
In 1994, college student Jim Hall created FreeDOS (in response to Microsoft's plan to gradually phase out MS-DOS). After celebrating its 30th anniversary last week, Hall wrote a new article Saturday for OpenSource.net: "What I've learned about Open Source community over 30 years." Lessons include "every Open Source project needs a website," but also "consider other ways to raise awareness about your Open Source software project." ("In the FreeDOS Project, we've found that posting videos to our YouTube channel is an excellent way to help people learn about FreeDOS... The more information you can share about your Open Source project, the more people will find it familiar and want to try it out.") But the larger lesson is that "Open Source projects must be grounded in community." Without open doors for new ideas and ongoing development, even the most well-intentioned project becomes a stagnant echo chamber... Maintain open lines of communication... This can take many forms, including an email list, discussion board, or some other discussion forum. Other forums where people can ask more general "Help me" questions are okay but try to keep all discussions about project development on your official discussion channel. The last of its seven points stresses that "An Open Source project isn't really Open Source without source code that everyone can download, study, use, modify and share" (urging careful selection for your project's licensing). But the first point emphasizes that "It's more than just code," and Hall ends his article by attributing FreeDOS's three-decade run to "the great developers and users in our community." In celebrating FreeDOS, we are celebrating everyone who has created programs, fixed bugs, added features, translated messages, written documentation, shared articles, or contributed in some other way to the FreeDOS Project... Here's looking forward to more years to come! Jim Hall is also Slashdot reader #2,985, and back in 2000 he answered questions from Slashdot's readers — just six years after starting the project. "Jim isn't rich or famous," wrote RobLimo, "just an old-fashioned open source contributor who helped start a humble but useful project back in 1994 and still works on it as much as he can." As the years piled up, Slashdot ran posts celebrating FreeDOS's 10th, 15th, and 20th anniversaries. And then for FreeDOS's 25th, Hall returned to Slashdot to answer more questions from Slashdot readers...

Read more of this story at Slashdot.

Many Carbon Capture Projects Are Now Launching

1 July 2024 at 03:34
The Los Angeles Times reports that "multiple projects seeking to remove carbon dioxide from the air have been launched across Los Angeles County": When completed, Project Monarch and its wastewater component, Pure Water Antelope Valley, will purify up to 4.5 million gallons of water each day and capture 25,000 tons of atmospheric CO2 each year. (The typical gasoline-powered automobile spews 4.6 tons of carbon each year, according to the Environmental Protection Agency.)... But the Palmdale project isn't the only new carbon-capture development in L.A. County. On Friday, officials from CarbonCapture Inc. gathered in Long Beach to introduce the first commercial-scale U.S. direct air capture, or DAC, system designed for mass production. The unit, which resembles a shipping container, can remove more than 500 tons of atmospheric CO2 per year... The L.A.-based company also announced that it will mass-produce up to 4,000 of its DAC modules annually at a new facility in Mesa, Arizona. It joins similar efforts from L.A.-based Captura, which is working to remove CO2 from the upper ocean; L.A.-based Avnos, which produces water while capturing carbon; and L.A.-based Equatic, which is working to remove atmospheric CO2 using the ocean... [Equatic's] San Pedro facility pumps seawater through a series of electric plates that separate the water into hydrogen and oxygen as well as acidic and alkaline streams of liquid. The alkaline, or base, stream is exposed to the atmosphere, where it mineralizes CO2 into carbonates that are then dissolved and discharged back into the ocean for permanent storage, operators say. Additionally, the hydrogen produced by the process is carbon-negative, making it a source of renewable energy that can be used to fuel the CO2 removal process or sold to other users, said Edward Sanders, chief operating officer at Equatic. Equatic announced this month that it will partner with a Canadian carbon removal project developer, Deep Sky, to build North America's first commercial-scale ocean-based CO2 removal plant in Quebec, following the success of its effort in Los Angeles as well as another facility in Singapore. While the San Pedro facility can capture about 40 tons of CO2 per year, the Quebec facility will capture about 100,000 tons per year, Sanders said. Meanwhile, two new projects by direct air capture company Heirloom were announced this week in Louisiana. Those projects are "expected to remove hundreds of thousands of tons of carbon dioxide from the air per year," according to the Associated Press, "and store it deep underground," part of "a slew of carbon removal and storage projects that have been announced in Louisiana." Heirloom estimates that they will eventually remove 320,000 tons of carbon dioxide each year... The company uses limestone, a natural absorbent, to extract carbon dioxide from the air. Heirloom's technology reduces the time it takes to absorb carbon dioxide in nature from years to just three days, according to the company's press release. The carbon dioxide is then removed from the limestone material and stored permanently underground. In May America's Energy Department also announced $3.5 billion in funding for its carbon-capture program — four large-scale, regional direct air capture hubs "that each comprise a network of carbon dioxide removal projects..."
The hubs will have the capacity to capture and then permanently store at least one million metric tons of CO2 from the atmosphere annually, either from a single unit or from multiple interconnected units. And Shell Canada has a pair of carbon capture projects in Alberta it expects to have operational toward the end of 2028, according to the CBC: The Polaris project is designed to capture about 650,000 tonnes of carbon dioxide annually from the Scotford complex. That works out to approximately 40 per cent of Scotford's direct CO2 emissions from the refinery and 22 per cent of its emissions from the chemicals complex.

Read more of this story at Slashdot.

An Asteroid Just Passed Within 180,000 Miles of Earth

30 June 2024 at 23:34
An anonymous reader shared this report from The Hill: An asteroid the size of a football stadium threaded the needle between Earth and the moon Saturday morning — the second of two astronomical near misses in three days. Near miss, in this case, is a relative term: Saturday's asteroid, 2024 MK, came within 180,000 miles of Earth. On Thursday, meanwhile, asteroid 2011 UL21 flew within 4 million miles. But the Saturday passage of 2024 MK — which scientists discovered only two weeks ago — coincides with a sobering reminder of threats from space. Sunday is Asteroid Day, the anniversary of the 1908 explosion of a rock from space above a Russian town — the sort of danger that, astronomers warn, is always lurking as the Earth hurtles through space... In 2013, for instance, an asteroid about 62 feet across that broke apart nearly 20 miles above Siberia released 30 times as much energy as the atomic bomb that hit Hiroshima. While most of the impact energy was absorbed by the atmosphere, the detonation triggered a shock wave that blew out windows and injured more than a thousand people. The article points out that if Saturday's asteroid had hit Earth, the impact would have had "the equivalent impact energy in the hundreds of megatons approaching a gigaton," Peter Brown of Canada's Western University told the Canadian Broadcasting Corporation. (For comparison, most hydrogen bombs are in the 50-megaton range.) Brown said "It's the sort of thing that if it hit the east coast of the U.S., you would have catastrophic effects over most of the eastern seaboard. But it's not big enough to affect the whole world." Meanwhile, the article adds that last Thursday's asteroid — "while it was comfortably far out in space" — was the size of Mt. Everest. "At 1.5 miles in diameter, that asteroid was about a quarter the size of the asteroid that struck the earth 65 million years ago, wiping out all dinosaurs that walked, as well as the majority of life on earth." But the risk of a collision like that "is very, very low." NASA has estimated that a civilization-ending event (like the collision of an asteroid the size of Thursday's with the Earth) should only happen every few million years. And such an impact from an asteroid half a mile in diameter or bigger will be almost impossible for a very long time, according to findings published last year in The Astronomical Journal. NASA's catalog of large and dangerous objects like 2011 UL21 is now 95 percent complete, MIT Technology Review reported.
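
For a rough sense of where estimates like "hundreds of megatons" come from, here is a hedged back-of-the-envelope kinetic-energy calculation in Python. The diameter, density, and speed below are illustrative assumptions for a stadium-sized stony asteroid, not measured values for 2024 MK.

```python
import math

# Illustrative assumptions only (not measured values for 2024 MK):
diameter_m = 150.0       # a "football stadium"-sized body
density_kg_m3 = 3000.0   # typical stony-asteroid density
speed_m_s = 15_000.0     # a plausible Earth-encounter speed (~15 km/s)

radius_m = diameter_m / 2
volume_m3 = (4.0 / 3.0) * math.pi * radius_m ** 3
mass_kg = density_kg_m3 * volume_m3              # roughly 5e9 kg

kinetic_energy_j = 0.5 * mass_kg * speed_m_s ** 2
megatons_tnt = kinetic_energy_j / 4.184e15       # 1 Mt TNT = 4.184e15 J

print(f"mass ~ {mass_kg:.1e} kg")
print(f"impact energy ~ {kinetic_energy_j:.1e} J (~{megatons_tnt:.0f} Mt of TNT)")
```

With these particular numbers the result is on the order of 140 megatons of TNT; a somewhat larger diameter or a faster encounter speed pushes the figure toward the gigaton end of the quoted range.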

Read more of this story at Slashdot.

William Gibson's 'Neuromancer' to Become a Series on Apple TV+

30 June 2024 at 21:34
It's been adapted into a graphic novel, a videogame, a radio play, and an opera, according to Wikipedia — which also describes years of trying to adapt Neuromancer into a movie. "The landmark 1984 cyberpunk novel has been on Hollywood's wishlist for decades," writes Gizmodo, "with multiple filmmakers attempting to bring it to the big screen." (Back in 2010, Slashdot's CmdrTaco even posted an update with the headline "Neuromancer Movie In Your Future?" with a 2011 story promising the movie deal was "moving forward....") But now Deadline reports it's becoming a 10-episode series on Apple TV+ (co-produced by Apple Studios) starring Callum Turner and Brianna Middleton: Created for television by Graham Roland and JD Dillard, Neuromancer follows a damaged, top-rung super-hacker named Case (Turner) who is thrust into a web of digital espionage and high-stakes crime with his partner Molly (Middleton), a razor-girl assassin with mirrored eyes, aiming to pull a heist on a corporate dynasty with untold secrets. More from Gizmodo: "We're incredibly excited to be bringing this iconic property to Apple TV+," Roland and Dillard said in a statement. "Since we became friends nearly 10 years ago, we've looked for something to team up on, so this collaboration marks a dream come true. Neuromancer has inspired so much of the science fiction that's come after it and we're looking forward to bringing television audiences into Gibson's definitive 'cyberpunk' world." The novel launched Gibson's "Sprawl" trilogy of novels (building on the dystopia in his 1982 short story "Burning Chrome"), also resurrecting the "Molly Millions" character from Johnny Mnemonic — an even earlier short story from 1981...

Read more of this story at Slashdot.


Caching Is Key, and SIEVE Is Better Than LRU

30 June 2024 at 19:40
USENIX, the long-running OS/networking research group, also publishes a magazine called ;login:. Today the magazine's editor — security consultant Rik Farrow — stopped by Slashdot to share some new research. rikfarrow writes: Caching means using faster memory to store frequently requested data, and the most commonly used algorithm for determining which items to discard when the cache is full is Least Recently Used [or "LRU"]. These researchers have come up with a more efficient and scalable method that uses just a few lines of code to convert LRU to SIEVE. Just like a sieve, it sifts through objects (using a pointer called a "hand") to "filter out unpopular objects and retain the popular ones," with popularity based on a single bit that tracks whether a cached object has been visited: As the "hand" moves from the tail (the oldest object) to the head (the newest object), objects that have not been visited are evicted... During the subsequent rounds of sifting, if objects that survived previous rounds remain popular, they will stay in the cache. In such a case, since most old objects are not evicted, the eviction hand quickly moves past the old popular objects to the queue positions close to the head. This allows newly inserted objects to be quickly assessed and evicted, putting greater eviction pressure on unpopular items (such as "one-hit wonders") than LRU-based eviction algorithms. It's an example of "lazy promotion and quick demotion": popular objects get retained with minimal effort, while quick demotion is "critical because most objects are not reused before eviction." After evaluating 1,559 traces (covering 247,017 million requests to 14,852 million objects), they found SIEVE reduces the miss ratio (when needed data isn't in the cache) by more than 42% on 10% of the traces, with a mean of 21%, when compared to FIFO. (And it was also faster and more scalable than LRU.) "SIEVE not only achieves better efficiency, higher throughput, and better scalability, but it is also very simple."
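
The researchers pitch SIEVE as a few-line change to an LRU/FIFO cache. As a concrete illustration of the description above, here is a minimal sketch in Python: a FIFO-ordered list, one "visited" bit per object, and a "hand" that sweeps from the tail (oldest) toward the head (newest) at eviction time. The class and method names are illustrative; this is not the authors' reference implementation.

```python
class _Node:
    __slots__ = ("key", "value", "visited", "prev", "next")

    def __init__(self, key, value):
        self.key, self.value = key, value
        self.visited = False          # the single "popularity" bit
        self.prev = self.next = None  # prev = toward head (newer), next = toward tail (older)


class SieveCache:
    """Minimal SIEVE sketch: FIFO queue + visited bit + sifting hand."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.table = {}    # key -> node
        self.head = None   # newest object
        self.tail = None   # oldest object
        self.hand = None   # eviction pointer, persists between evictions

    def get(self, key):
        node = self.table.get(key)
        if node is None:
            return None
        node.visited = True            # lazy promotion: no queue reordering on a hit
        return node.value

    def put(self, key, value):
        node = self.table.get(key)
        if node is not None:
            node.value = value
            node.visited = True
            return
        if len(self.table) >= self.capacity:
            self._evict()
        node = _Node(key, value)       # insert new objects at the head (newest end)
        node.next = self.head
        if self.head:
            self.head.prev = node
        self.head = node
        if self.tail is None:
            self.tail = node
        self.table[key] = node

    def _evict(self):
        # Quick demotion: sweep from the hand (or the tail) toward the head,
        # clearing visited bits, and evict the first object never re-accessed.
        obj = self.hand or self.tail
        while obj.visited:
            obj.visited = False
            obj = obj.prev or self.tail   # wrap around to the oldest object
        self.hand = obj.prev              # hand keeps its position for next time
        self._unlink(obj)
        del self.table[obj.key]

    def _unlink(self, node):
        if node.prev: node.prev.next = node.next
        else: self.head = node.next
        if node.next: node.next.prev = node.prev
        else: self.tail = node.prev
```

Note that a hit only flips a bit, so reads never rearrange the queue; all the sifting happens at eviction time. That is where the claimed throughput and scalability advantage over LRU comes from, since LRU must move every accessed object to the head of its list.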

Read more of this story at Slashdot.

Boeing Fraud Violated Fatal MAX Crash Settlement, Says Justice Department, Seeking Guilty Plea on Criminal Charges

30 June 2024 at 17:28
America's Justice Department "is pushing for Boeing to plead guilty to a criminal charge," reports Reuters, "after finding the planemaker violated a settlement over fatal 737 MAX crashes in 2018 and 2019 that killed 346 people, two people familiar with the matter said on Sunday." Boeing previously paid $2.5 billion as part of the deal with prosecutors that granted the company immunity from criminal prosecution over a fraud conspiracy charge related to the 737 MAX's flawed design. Boeing had to abide by the terms of the deferred prosecution agreement for a three-year period that ended on Jan. 7. Prosecutors would then have been poised to ask a judge to dismiss the fraud conspiracy charge. But in May, the Justice Department found Boeing breached the agreement, exposing the company to prosecution. A guilty plea could "carry implications for Boeing's ability to enter into government contracts," the article points out, "such as those with the U.S. military that make up a significant portion of its revenue..." The proposal would require Boeing to plead guilty to conspiring to defraud the U.S. Federal Aviation Administration in connection with the fatal crashes, the sources said. The proposed agreement also includes a $487.2 million financial penalty, only half of which Boeing would be required to pay, they added. That is because prosecutors are giving the company credit for a payment it made as part of the previous settlement related to the fatal crashes of the Lion Air and Ethiopian Airlines flights. Boeing could also likely be forced to pay restitution under the proposal's terms, the amount of which will be at a judge's discretion, the sources said. The offer also contemplates subjecting Boeing to three years of probation, the people said. The plea deal would also require Boeing's board to meet with victims' relatives and impose an independent monitor to audit the company's safety and compliance practices for three years, they said. "Should Boeing refuse to plead guilty, prosecutors plan to take the company to trial, they said..." the article points out. "Justice Department officials revealed their decision to victims' family members during a call earlier on Sunday."

Read more of this story at Slashdot.

A 'Safe' Chemical in Plastic Bottles Could Reduce Insulin Responsiveness, Increase Diabetes Risk

30 June 2024 at 16:19
A new study "has found direct evidence linking a key chemical ingredient of plastic bottles to a higher risk of type 2 diabetes," reports the Independent: The study, published in the journal Diabetes, found that the chemical BPA used to make food and drink packages, including plastic water bottles, can reduce sensitivity to the hormone insulin which regulates the body's sugar metabolism. The findings, to be presented at the 2024 Scientific Sessions of the American Diabetes Association, call for the US Environmental Protection Agency to reconsider the safe limits for exposure to BPA in bottles and food containers. Previous studies have already shown that the chemical Bisphenol A (BPA) used to make plastic and epoxy resins could disrupt hormones in humans. While research has linked BPA to diabetes, no previous study has directly assessed if administration of this chemical to humans increases this risk in adults. The researchers administered the dosage considered safe by America's FDA to about 20 individuals — and discovered they became less responsive to insulin after 4 days. The article includes this warning from the researchers: "These results suggest that maybe the U.S. EPA safe dose should be reconsidered and that healthcare providers could suggest these changes to patients." Thanks to Slashdot reader Bruce66423 for sharing the news.

Read more of this story at Slashdot.

Ask Slashdot: What's the Best Way to Charge Your Smartphone Battery?

30 June 2024 at 14:48
To stop their smartphone battery from swelling, long-time Slashdot reader shanen bought a Samsung Galaxy with a "restrictive charging option." But what setting should they use? The way this battery protection option worked was to stop charging the phone at 85%. That left me enough charge for my normal daily travels, which rarely took the phone below 50%, and the battery remained unswollen after a year, which included a month of quite heavy tethering, too. Unfortunately... After a recent upgrade, now my Galaxy has three options for the battery where it had two. The 85% option is still there, but it has been lowered to 80%. I've been using that for now and it still seems good enough. However, my main concern is with the best option to maximize the overall lifespan of the smartphone... The other old option says something about using AI to control the battery charging, but I don't trust it and think it is just the old approach that causes phones to die quickly... The new third option is the one that interests me. This seems to be a kind of flutter charge where the phone will charge to 100% and then stop until it has dropped to 95% before charging again, even if it remains plugged in. This sounds attractive and would give me more battery insurance when I'm traveling, but maybe it reduces the overall lifetime of the phone? They tried getting answers from Samsung, but "I think I have been flagged as a low-profit customer." And of course, this raises several other questions. (Are other smartphones better? Have iPhones solved the battery-swelling issue?) And most importantly: is there a way to charge batteries without reducing their lifespan? Share your own thoughts and experiences in the comments. What's the best way to charge your smartphone battery?

Read more of this story at Slashdot.

Chinese Space Company's Static Rocket Test Ends In Premature Launch, Huge Explosion

30 June 2024 at 13:34
Commercial space efforts continue around the world, as the Chinese company Space Pioneer fired up a partially-fueled rocket engine Sunday for a short-duration test of its reusable rocket on the ground. But Space News reports that the test "ended in catastrophic failure and a dramatic explosion." "Amateur footage captured by Gongyi citizens and posted on Chinese social media shows the nine-engine test stage igniting and then, exceptionally, taking off." Hold-down clamps and other structures are typically used to securely keep stages in place. The stage is seen climbing into the sky before halting, apparently with its engines shutting off, and returning to Earth. The stage impacted the ground around 50 seconds after it took off, apparently with much of its kerosene-liquid oxygen propellant remaining, causing a large explosion. The Tianlong-3 first stage would likely fire for a number of minutes on an orbital flight. Space Pioneer was conducting its test as a buildup to an orbital launch of the Tianlong-3, which is benchmarked against the SpaceX Falcon 9, in the coming months. The company announced earlier this month that it has secured $207 million in new funding. Shanghai-based digital newspaper The Paper reported Henan officials as saying there were no casualties reported. Space Pioneer issued its own statement later, stating there was a structural failure at the connection between the rocket body and the test bench. The rocket's onboard computer automatically shut down the engines and the rocket fell 1.5 kilometers southwest. It reiterated earlier reports that no casualties were found. The company said the test produced 820 tons of thrust. The article speculates on whether the event will delay the development of the rocket — or the planned launches for a Chinese megaconstellation of satellites. "Space Pioneer says it will conduct an analysis and restart testing with new hardware as soon as possible."

Read more of this story at Slashdot.

Is AI's Demand for Energy Really 'Insatiable'?

30 June 2024 at 12:34
Bloomberg and The Washington Post "claim AI power usage is dire," writes Slashdot reader NoWayNoShapeNoForm. But Ars Technica "begs to disagree with those speculations." From Ars Technica's article: The high-profile pieces lean heavily on recent projections from Goldman Sachs and the International Energy Agency (IEA) to cast AI's "insatiable" demand for energy as an almost apocalyptic threat to our power infrastructure. The Post piece even cites anonymous "some [people]" in reporting that "some worry whether there will be enough electricity to meet [the power demands] from any source." Digging into the best available numbers and projections available, though, it's hard to see AI's current and near-future environmental impact in such a dire light... While the headline focus of both Bloomberg and The Washington Post's recent pieces is on artificial intelligence, the actual numbers and projections cited in both pieces overwhelmingly focus on the energy used by Internet "data centers" as a whole... Bloomberg asks one source directly "why data centers were suddenly sucking up so much power" and gets back a blunt answer: "It's AI... It's 10 to 15 times the amount of electricity." Unfortunately for Bloomberg, that quote is followed almost immediately by a chart that heavily undercuts the AI alarmism. That chart shows worldwide data center energy usage growing at a remarkably steady pace from about 100 TWh in 2012 to around 350 TWh in 2024. The vast majority of that energy usage growth came before 2022, when the launch of tools like Dall-E and ChatGPT largely set off the industry's current mania for generative AI. If you squint at Bloomberg's graph, you can almost see the growth in energy usage slowing down a bit since that momentous year for generative AI. Ars Technica first cites Dutch researcher Alex de Vries's estimate that in a few years the AI sector could use between 85 and 134 TWh of power. But another study estimated in 2018 that PC gaming already accounted for 75 TWh of electricity use per year, while "the IEA estimates crypto mining ate up 110 TWh of electricity in 2022." More to the point, de Vries' AI energy estimates are only a small fraction of the 620 to 1,050 TWh that data centers as a whole are projected to use by 2026, according to the IEA's recent report. The vast majority of all that data center power will still be going to more mundane Internet infrastructure that we all take for granted (and which is not nearly as sexy of a headline bogeyman as "AI"). The future is also hard to predict, the article concludes. "If customers don't respond to the hype by actually spending significant money on generative AI at some point, the tech-marketing machine will largely move on, as it did very recently with the metaverse and NFTs..."

Read more of this story at Slashdot.

New Linux 'Screen of Death' Options: Black - or a Monochrome Tux Logo

30 June 2024 at 11:34
It's analogous to the "Blue Screen of Death" that Windows displays for critical errors, Phoronix wrote. To enable error messages for things like a kernel panic, Linux 6.10 introduced a new panic handler infrastructure for "Direct Rendering Manager" (or DRM) drivers. Phoronix also published a follow-up from Red Hat engineer Javier Martinez Canillas (who was involved in the new DRM Panic infrastructure). After complaints that his recent Linux "Blue Screen of Death" showcase looked too much like Microsoft Windows... Javier showed that a black screen of death is possible if so desired... After all, it's all open-source and thus can be customized to your heart's content. And now the panic handler is getting even more new features, Phoronix reported Friday: With the code in Linux 6.10 when DRM Panic is triggered, an ASCII art version of Linux's mascot, Tux the penguin, is rendered as part of the display. With Linux 6.11 it will also be able to handle displaying a monochrome image as the logo. If ASCII art on error messages doesn't satisfy your tastes in 2024+, the DRM Panic code will be able to support a monochrome graphical logo that leverages the Linux kernel's boot-up logo support. The ASCII art penguin will still be used when no graphical logo is found or when the existing "LOGO" Kconfig option is disabled. (Those Tux logo assets being here.) This monochrome logo support in the DRM Panic handler was sent out as part of this week's drm-misc-next pull request ahead of the Linux 6.11 merge window in July. This week's drm-misc-next material also includes TTM memory management improvements, various fixes to the smaller Direct Rendering Manager drivers, and also the previously talked about monochrome TV support for the Raspberry Pi. Long-time Slashdot reader unixbhaskar thinks the new option "will certainly satisfy the modern people... But it is not as eye candy as people think... Moreover, it is monochrome, so certainly not resource-hungry. Plus, if all else fails, the ASCII art logo is still there to show!"

Read more of this story at Slashdot.

New Undersea Power Cables Could Carry Green Energy From Country to Country

30 June 2024 at 10:34
What if six high-voltage power cables stretched across the bottom of the Atlantic Ocean — each over 2,000 miles long? CNN reports that a group of entrepreneurs "wants to build what would be the world's largest subsea energy interconnector between continents, linking Europe and North America... to connect places like the United Kingdom's west with eastern Canada, and potentially New York with western France... "The Europe-US cables could send 6 gigawatts of energy in both directions at the speed of light, said Laurent Segalen, founder of the London-based Megawatt-X renewable energy firm, who is also part of the trio proposing the transatlantic interconnector. That's equivalent to what six large-scale nuclear power plants can generate, transmitted in near-real time." The interconnector would send renewable energy both east and west, taking advantage of the sun's diurnal journey across the sky. "When the sun is at its zenith, we probably have more power in Europe than we can really use," said Simon Ludlam, founder and CEO of Etchea Energy, and one of the trio of Europeans leading the project. "We've got wind and we've also got too much solar. That's a good time to send it to a demand center, like the East Coast of the United States. Five, six hours later, it's the zenith in the East Coast, and obviously, we in Europe have come back for dinner, and we get the reverse flow," he added. The transatlantic interconnector is still a proposal, but networks of green energy cables are starting to sprawl across the world's sea beds. They are fast becoming part of a global climate solution, transmitting large amounts of renewable energy to countries struggling to make the green transition alone. But they are also forging new relations that are reshaping the geopolitical map and shifting some of the world's energy wars down to the depths of the ocean... Already, energy cables run between several countries in Europe, most of them allied neighbors. Not all of them carry renewable power exclusively — that's sometimes determined by what makes up each country's energy grid — but new ones are typically being built for a green energy future. The UK, where land space for power plants is limited, is already connected with Belgium, Norway, the Netherlands and Denmark under the sea. It has signed up to a solar and wind link with Morocco to take advantage of the North African country's many hours of sunlight and strong trade winds that run across the equator. Similar proposals are popping up around the globe. A project called Sun Cable seeks to send solar power from sunny Australia, where land is abundant, to the Southeast Asian nation of Singapore, which also has plenty of sun but very little room for solar farms. India and Saudi Arabia plan to link their respective power grids via the Arabian Sea, part of a broader economic corridor plan to connect Asia, the Middle East and Europe.

Read more of this story at Slashdot.

Fuel From Water? Visiting a Texas 'Green Hydrogen' Plant

30 June 2024 at 07:34
It transforms water into fuel — one of the first fuel plants in the world to do so. The Washington Post visits a facility in Corpus Christi, Texas using renewable energy to produce "green" hydrogen. The plant feeds water through machines that pull out its hydrogen atoms... [T]he hydrogen is chemically transformed into diesel for delivery trucks. This process could represent the biggest change in how fuel for planes, ships, trains and trucks is made since the first internal combustion engine fired up in the 19th century... Turning hydrogen into liquid fuel could help slash planet-warming pollution from heavy vehicles, cutting a key source of emissions that contribute to climate change. But to fulfill that promise, companies will have to build massive numbers of wind turbines and solar panels to power the energy-hungry process. Regulators will have to make sure hydrogen production doesn't siphon green energy that could go towards cleaning up other sources of global warming gases, such as homes or factories. Although cars and light trucks are shifting to electric motors, other forms of transport will likely rely on some kind of liquid fuel for the foreseeable future. Batteries are too heavy for planes and too bulky for ships. Extended charging times could be an obstacle for long-haul trucks, and some rail lines may be too expensive to electrify. Together, these vehicles represent roughly half of emissions from transportation, the fourth-biggest source of greenhouse gases. To wean machines off oil, companies like Infinium, the owner of this plant, are starting to churn out hydrogen-based fuels that — in the best case — produce close to net zero emissions. They could also pave the way for a new technology, hydrogen fuel cells, to power planes, ships and trucks in the second half of this century. For now, these fuels are expensive and almost no one makes them, so the U.S. government, businesses and philanthropists including Bill Gates are investing billions of dollars to build up a hydrogen industry that could eventually cut some of the most stubborn, hard-to-remove carbon pollution. Most scenarios for how the world could avoid the worst effects of climate change envision hydrogen cleaning up emissions in transportation, as well as in fertilizer production and steel and chemical refining. But if they're not made with dedicated renewable energy, hydrogen-based fuels could generate even more pollution than regular diesel, creating a wasteful boondoggle that sets the world back in the fight against climate change. Their potential comes down to the way plants like this produce them... Only about 40 percent of the power on the [Texas] electric grid is from renewables, with the rest coming from natural gas and coal, according to state data. That grid energy is what flows through the power line into the Infinium plant. "One day, heavy transportation may shift to fuel cells that run on pure hydrogen and emit only water vapor from their tailpipes," the article points out. But to accommodate today's carbon-burning vehicles, Infinium produces "chemical copies of existing fuels made with crude oil" by combining captured carbon with green hydrogen. "A truck running on diesel made from hydrogen using only renewable electricity would create 89 percent fewer greenhouse gas emissions over the course of its lifetime than a truck burning diesel made from petroleum, according to a 2022 analysis from the European nonprofit Transport & Environment."

Read more of this story at Slashdot.

NASA's Commercial Spacesuit Program Just Hit a Major Snag

30 June 2024 at 03:34
Slashdot reader Required Snark shared this article from Ars Technica: Almost exactly two years ago, as it prepared for the next generation of human spaceflight, NASA chose a pair of private companies to design and develop new spacesuits. These were to be new spacesuits that would allow astronauts both to perform spacewalks outside the International Space Station and to walk on the Moon as part of the Artemis program. Now, that plan appears to be in trouble, with one of the spacesuit providers — Collins Aerospace — expected to back out, Ars has learned. It's a blow for NASA, because the space agency really needs modern spacesuits. NASA's Apollo-era suits have long been retired. The current suits used for spacewalks in low-Earth orbit are four decades old. "These new capabilities will allow us to continue on the International Space Station and allows us to do the Artemis program and continue on to Mars," said the director of Johnson Space Center, Vanessa Wyche, during a celebratory news conference in Houston two years ago. The two winning teams were led by Collins Aerospace and Axiom Space, respectively. They were eligible for task orders worth up to $3.5 billion — in essence NASA would rent the use of these suits for a couple of decades. Since then, NASA has designated Axiom to work primarily on a suit for the Moon and the Artemis Program, and Collins with developing a suit for operations in-orbit, such as space station servicing... The agency has been experiencing periodic problems with the maintenance of the suits built decades ago, known as the Extravehicular Mobility Unit, which made its debut in the 1980s. NASA has acknowledged the suit has exceeded its planned design lifetime. Just this Monday, the agency had to halt a spacewalk after the airlock had been de-pressurized and the hatch opened due to a water leak in the service and cooling umbilical unit of Tracy Dyson's spacesuit. As a result of this problem, NASA will likely only be able to conduct a single spacewalk this summer, after initially planning three, to complete work outside the International Space Station. Collins designed the original Apollo suits, according to the article. But a person familiar with the situation told Ars Technica that "Collins has admitted they have drastically underperformed and have overspent" on their work, "culminating in a request to be taken off the contract or renegotiate the scope and their budget." Ironically, the top post on the company's Twitter/X account is still a repost of NASA's February announcement that they're "getting a next-generation spacesuit" developed by Collins Aerospace, and saying that the company "recently completed a key NASA design milestone aboard a commercial microgravity aircraft." NASA's post said they needed the suit "In order to advance NASA's spacewalking capabilities in low Earth orbit and to support continued maintenance and operations at the Space Station."

Read more of this story at Slashdot.

Will a US Supreme Court Ruling Put Net Neutrality at Risk?

29 June 2024 at 23:34
Today the Wall Street Journal reported that restoring net neutrality to America is "on shakier legal footing after a Supreme Court decision on Friday shifted power away from federal agencies." "It's hard to overstate the impact that this ruling could have on the regulatory landscape in the United States going forward," said Leah Malone, a lawyer at Simpson Thacher & Bartlett. "This could really bind U.S. agencies in their efforts to write new rules." Now that [the "Chevron deference"] is gone, the Federal Communications Commission is expected to have a harder time reviving net neutrality — a set of policies barring internet-service providers from assigning priority to certain web traffic... The Federal Communications Commission reclassified internet providers as public utilities under the Communications Act. There are pending court cases challenging the FCC's reinterpretation of that 1934 law, and the demise of Chevron deference heightens the odds of the agency losing in court, some legal experts said. "Chevron's thumb on the scale in favor of the agencies was crucial to their chances of success," said Geoffrey Manne, president of the International Center for Law and Economics. "Now that that's gone, their claims are significantly weaker." Other federal agencies could also be affected, according to the article. The ruling could also make it harder for America's Environmental Protection Agency to crack down on power-plant pollution. And the Federal Trade Commission could face more trouble in court defending its recent ban on noncompete agreements. Lawyer Daniel Jarcho tells the Journal that the Court's decision "will unquestionably lead to more litigation challenging federal agency actions, and more losses for federal agencies." Friday a White House press secretary issued a statement calling the court's decision "deeply troubling," and arguing that the court had "decided in the favor of special interests".

Read more of this story at Slashdot.

Threads Expands Fediverse Beta, Letting Users See Replies (and Likes) on Other Fediverse Sites like Mastodon

29 June 2024 at 21:34
An anonymous Slashdot reader shared this report from the Verge: Threads will now let people like and see replies to their Threads posts that appear on other federated social media platforms, the company announced on Tuesday. Previously, if you made a post on Threads that was syndicated to another platform like Mastodon, you wouldn't be able to see responses to that post while still inside Threads. That meant you'd have to bounce back and forth between the platforms to stay up-to-date on replies... [I]n a screenshot, Meta notes that you can't reply to replies "yet," so it sounds like that feature will arrive in the future. "Threads is Meta's first app built to be compatible with the fediverse..." according to a Meta blog post. "Our vision is that people using other fediverse-compatible servers will be able to follow and interact with people on Threads without having a Threads profile, and vice versa, connecting communities..." [If you turn on "sharing"...] "Developers can build new types of features and user experiences that can easily plug into other open social networks, accelerating the pace of innovation and experimentation." And this week Instagram/Threads top executive Adam Mosseri posted that Threads is "also expanding the availability of the fediverse beta experience to more than 100 countries, and hope to roll it out everywhere soon."

Read more of this story at Slashdot.


Could We Lower The Carbon Footprint of Data Centers By Launching Them Into Space?

29 June 2024 at 18:34
The Wall Street Journal reports that a European initiative studying the feasibility of data centers in space "has found that the project could be economically viable" — while reducing the data center's carbon footprint. And they add that according to coordinator Thales Alenia Space, the project "could also generate a return on investment of several billion euros between now and 2050." The study — dubbed Ascend, short for Advanced Space Cloud for European Net zero emission and Data sovereignty — was funded by the European Union and sought to compare the environmental impacts of space-based and Earth-based data centers, the company said. Moving forward, the company plans to consolidate and optimize its results. Space data centers would be powered by solar energy outside the Earth's atmosphere, aiming to contribute to the European Union's goal of achieving carbon neutrality by 2050, the project coordinator said... Space data centers wouldn't require water to cool them, the company said. The 16-month study came to a "very encouraging" conclusion, project manager Damien Dumestier told CNBC. With some caveats... The facilities that the study explored launching into space would orbit at an altitude of around 1,400 kilometers (869.9 miles) — about three times the altitude of the International Space Station. Dumestier explained that ASCEND would aim to deploy 13 space data center building blocks with a total capacity of 10 megawatts in 2036, in order to achieve the starting point for cloud service commercialization... The study found that, in order to significantly reduce CO2 emissions, a new type of launcher that is 10 times less emissive would need to be developed. ArianeGroup, one of the 12 companies participating in the study, is working to speed up the development of such reusable and eco-friendly launchers. The target is to have the first eco-launcher ready by 2035 and then to allow for 15 years of deployment in order to have the huge capacity required to make the project feasible, said Dumestier... Michael Winterson, managing director of the European Data Centre Association, acknowledges that a space data center would benefit from increased efficiency from solar power without the interruption of weather patterns — but the center would require significant amounts of rocket fuel to keep it in orbit. Winterson estimates that even a small 1 megawatt center in low earth orbit would need around 280,000 kilograms of rocket fuel per year at a cost of around $140 million in 2030 — a calculation based on a significant decrease in launch costs, which has yet to take place. "There will be specialist services that will be suited to this idea, but it will in no way be a market replacement," said Winterson. "Applications that might be well served would be very specific, such as military/surveillance, broadcasting, telecommunications and financial trading services. All other services would not competitively run from space," he added in emailed comments. [Merima Dzanic, head of strategy and operations at the Danish Data Center Industry Association] also signaled some skepticism around security risks, noting, "Space is being increasingly politicised and weaponized amongst the different countries. So obviously, there is a security implications on what type of data you send out there." It's not the only study looking at the potential of orbital data centers, notes CNBC.
"Microsoft, which has previously trialed the use of a subsea data center that was positioned 117 feet deep on the seafloor, is collaborating with companies such as Loft Orbital to explore the challenges in executing AI and computing in space." The article also points out that the total global electricity consumption from data centers could exceed 1,000 terawatt-hours in 2026. "That's roughly equivalent to the electricity consumption of Japan, according to the International Energy Agency."

Read more of this story at Slashdot.

Get Ready For Nuclear Clocks

29 June 2024 at 17:34
Long-time Slashdot reader jrronimo says JILA physicist Jun Ye's group "has made a breakthrough towards the next stage of precision timekeeping." From their paper recently published to arXiv: Optical atomic clocks use electronic energy levels to precisely keep track of time. A clock based on nuclear energy levels promises a next-generation platform for precision metrology and fundamental physics studies.... These results mark the start of nuclear-based solid-state optical clocks and demonstrate the first comparison of nuclear and atomic clocks for fundamental physics studies. This work represents a confluence of precision metrology, ultrafast strong field physics, nuclear physics, and fundamental physics.

Read more of this story at Slashdot.

Citing 'Crisis' in Local Reporting, Associated Press Creates Sister Organization to Seek Grants

29 June 2024 at 16:34
Founded in 1846, the not-for-profit Associated Press distributes its news stories to other news outlets. But are free online sites putting those outlets at risk? This week the Associated Press wrote that a "crisis" in local and state news reporting "shows little signs of abating" — and that it's now setting up "a sister organization that will seek to raise money" for those outlets. The organization, which will have a board of directors independent of the AP, will solicit philanthropic spending to boost this news coverage, both within the AP and through outside organizations, the news outlet said Tuesday. "We feel we have to lean in at this point, not pull back," said Daisy Veerasingham, the AP's president and CEO. "But the supporting mechanism — the local newspaper market that used to support this — can't afford to do that anymore." Veerasingham said she's been encouraged by preliminary talks with some funders who have expressed concern about the state of local journalism... The local news industry has collapsed over the past two decades, with the number of journalists working in newspapers dropping from 75,000 to 31,000 in 2022, according to Northwestern University. More than half of the nation's counties have no local news outlets or only one. The AP's CEO offered this succinct summary of their goal. "We want to add new products and services to help the industry."

Read more of this story at Slashdot.

90 Workers Given a Choice: Relocate Across the US, or Leave the Company

29 June 2024 at 15:34
"The outdoor-apparel brand Patagonia has given 90 U.S. employees a choice," reports Business Insider: "tell the company by Friday that you're willing to relocate or leave your job." [Alternate URL here.] The employees all work in customer services, known at Patagonia as the customer-experience, or CX, team, and have been allowed to work remotely to field calls and inquiries. These workers received a text and email Tuesday morning about an "important" meeting... Two company executives, Amy Velligan and Bruce Old, told staff in a 15-minute video meeting that the team would be moving to a new "hub" model. CX employees are now expected to live within 60 miles of one of seven "hubs" โ€” Atlanta; Salt Lake City; Reno, Nevada; Dallas; Austin; Chicago; or Pittsburgh. Workers were offered $4,000 toward relocation costs and extra paid time off. Those willing to relocate were told to do so by September 30. If CX staff are not willing to live near a hub city, they must leave the company. They were given 72 hours, until Friday, to confirm their decision... Access to company laptops and phones was shut off later that day until employees either agreed to relocate or said they wanted the severance, one affected CX worker said... Both employees who spoke to Business Insider believed this was because Patagonia didn't want to handle the increased demands of employees in states with higher costs of living. "We've been asking for raises for a long time, and they keep telling us that your wage is based on a Reno cost of living and where you choose to live is on you." According to the article, "The company hopes to bring staff together at the hubs at least once every six weeks for in-person training, company gatherings, or 'Activism Hours'." A company spokesperson described the changes as "crucial for us to build a vibrant team culture," and said there were workers who had been complaining about feeling disconnected. Though there may be another motive: "The reality is that our CX team has been running at 200% to 300% overstaffed for much of this year," she added. "While we hoped to reach the needed staffing levels through attrition, those numbers were very low, and retention remained high." One affected worker told Business Insider that the company's proposal "was very factual. If you don't live in these seven metro areas, you either need to move there or give us your stuff and hit the brick. If we don't respond by Friday, they will assume that we have chosen the severance package and we'll start that process." One worker added that the severance package they received was generous... Thanks to Slashdot reader NoWayNoShapeNoForm for sharing the article.

Read more of this story at Slashdot.

Lego Bricks Made From Meteorite Dust 3D Printed by Europe's Space Agency

29 June 2024 at 14:34
Lego teamed up with the European Space Agency to make Lego pieces from actual meteorite dust, writes Engadget. "It's a proof of concept to show how astronauts could use moondust to build lunar structures." Consider the sheer amount of energy and money required to haul up building materials from Earth to the Moon. It would be a game changer to, instead, build everything from pre-existing lunar materials. There's a layer of rock and mineral deposits at the surface of the Moon, which is called lunar regolith... However, there isn't too much lunar regolith here on Earth for folks to experiment with. ESA scientists made their own regolith by grinding up a really old meteorite. [4.5 billion years old, according to Lego's site, and discovered in Africa in 2000.] The dust from this meteorite was turned into a mixture that was used to 3D print the Lego pieces. Voila. Moon bricks. They click together just like regular Lego bricks, though they only come in one color (space gray, obviously). "The result is amazing," says ESA Science Officer Aidan Cowley on the Lego site (though "the bricks may look a little rougher than usual. Importantly the clutch power still works, enabling us to play and test our designs.") "Nobody has built a structure on the Moon," Cowley said in an ESA statement. "So it was great to have the flexibility to try out all kinds of designs and building techniques with our space bricks." And the bricks will also be "helping to inspire the next generation of space engineers," according to the ESA's announcement — since they'll be on display in select Lego stores in the U.S., Canada, the U.K., Spain, France, Germany, the Netherlands, and Australia through September 20th.

Read more of this story at Slashdot.

Linux Foundation Announces Intent to Form LF Decentralized Trust

29 June 2024 at 13:34
This week the Linux Foundation announced a new organization for decentralized systems and technologies, with an aim of "fostering innovation and collaboration" in both their development and deployment. It will build on existing Linux Foundation blockchain and digital identity projects, according to the announcement, while supporting "a rapidly growing decentralized technology landscape." To foster this broader ecosystem, LF Decentralized Trust will encompass the growing portfolio of Hyperledger projects and host new open source software, communities, standards, and specifications that are critical to the macro shift toward decentralized systems of distributed trust.... LF Decentralized Trust's expanded project and member ecosystem will be essential both to emerging tokenized asset classes and networks and to modernizing the core infrastructure for finance, trade, government, healthcare, and more. LF Decentralized Trust will serve as a neutral home for the open development of a broad range of ledger, identity, security, interoperability, scale, implementation, and related technologies... LF Decentralized Trust will also include new directed funding models that will drive strategic investments by members into individual projects and project resources. "With LF Decentralized Trust, we're expanding our commitment to open source innovation by embracing a wider array of decentralized technologies," said Jim Zemlin, Executive Director of the Linux Foundation. "This new, elevated foundation will enable the community to build a more robust ecosystem that drives forward transparency, security, and efficiency in global infrastructure." "After eight years of advancing the development of blockchain, decentralized identity and related technologies via the Hyperledger community, the time has come to broaden our effort and impact," said Daniela Barbosa, General Manager, Blockchain and Identity, the Linux Foundation. "Ledgers and ledger technologies are but one component of the decentralized systems that will underpin a digital-first global economy. LF Decentralized Trust is where we will gather and grow an expanded community and portfolio of technologies to deliver the transparency, reliability, security and efficiency needed to successfully upgrade critical systems around the world." The announcement includes quotes of support from numerous companies including Oracle, Siemens, Visa, Accenture, Citi, and Hitachi. Some highlights: "The formation of the LF Decentralized Trust reflects the growing demand for open source resources that are critical to the management and functionality of decentralized systems." — CEO of Digital Asset "The adoption of decentralized infrastructure is at an inflection point, reflecting the increasing demand from both enterprises and consumers for more secure and transparent digital transactions. As the industry leader for onchain data, blockchain abstraction, and interoperability, we're excited to see the formation of the LF Decentralized Trust and to expand our collaboration with leading financial institutions on advancing tokenized assets and the onchain economy at large." — CMO at Chainlink Labs. "As a founding member of the Hyperledger Foundation, and given our unique position in the financial markets, we recognize the vast potential for open-source innovation and decentralized technologies when it comes to reducing risk, increasing resiliency and improving security.
The expansion of Hyperledger Foundation into LF Decentralized Trust represents an exciting opportunity to continue expanding these groundbreaking technologies." — a managing director at DTCC

Read more of this story at Slashdot.

Colorado's Universal Basic Income Experiment Gets Surprising Results

29 June 2024 at 12:34
In November of 2022, "More than 800 people were selected to participate in the Denver Basic Income Project," reports the Colorado Sun, "while they were living on the streets, in shelters, on friends' couches or in vehicles." One group received $1,000 a month, according to the article, while a second group received $6,500 in the first month, and then $500 for the next 11 months. (And a "control" group received $50 a month.) Amazingly, about 45% of participants in all three groups "were living in a house or apartment that they rented or owned" by the study's 10-month check-in point, according to the research. The number of nights spent in shelters among participants in the first and second groups decreased by half. And participants in those two groups reported an increase in full-time work, while the control group reported decreased full-time employment. The project also saved tax dollars, according to the report. Researchers tallied an estimated $589,214 in savings on public services, including ambulance rides, visits to hospital emergency departments, jail stays and shelter nights... The study, which began in November 2022 with payments to the first group of participants, has been extended for an additional eight months, until September, and organizers are attempting to raise money to extend it further.

Read more of this story at Slashdot.

Japan Achieves 402 Tb/s Data Rate - Using Current Fiber Technology

29 June 2024 at 11:34
Tom's Hardware reports that Japan's National Institute of Information and Communications Technology (working with the Aston Institute of Photonic Technologies and Nokia Bell Labs) set a 402 terabits per second data transfer record — over commercially available optical fiber cables. The NICT and its partners were able to transmit signals through 1,505 channels over 50 km (about 31 miles) of optic fiber cable for this experiment. It used six types of amplifiers and an optical gain equalizer that taps into the unused 37 THz bandwidth to enable the 402 Tb/s transfer speed. One of the amplifiers this was demonstrated with is a thulium-based doped fiber amplifier, which uses C-band or C+L band systems. Additionally, semiconductor optical amplifiers and Raman amplifiers were used, which achieved a 256 Tb/s data rate across almost 20 THz. Other amplifiers used in this exercise provided a cumulative bandwidth of 25 THz for up to a 119 Tb/s data rate. As a result, its maximum achievable result surpassed the previous data rate capacity by over 25 percent and increased transmission bandwidth by 35 percent. "This is achievable with currently available technology used by internet service providers..." the article points out. "With 'beyond 5G' potential speeds achievable through commercially available cables, it will likely further a new generation of internet services."
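
Two quick figures can be derived from the numbers quoted above, assuming the roughly 37 THz figure represents the total optical bandwidth in use (the experiment's exact band allocation may differ): the aggregate spectral efficiency and the average per-channel rate.

```python
# Back-of-the-envelope figures from the numbers quoted above; treating ~37 THz
# as the total occupied optical bandwidth is our assumption, not NICT's.
total_rate_tbps = 402      # aggregate data rate, in terabits per second
bandwidth_thz = 37         # quoted optical bandwidth, in THz
channels = 1505            # number of wavelength channels

spectral_efficiency = total_rate_tbps / bandwidth_thz       # ~10.9 bit/s/Hz
per_channel_gbps = total_rate_tbps * 1000 / channels        # ~267 Gb/s each

print(f"~{spectral_efficiency:.1f} bit/s/Hz aggregate")
print(f"~{per_channel_gbps:.0f} Gb/s per wavelength channel")
```

These are averages across bands that used very different amplifier technologies, so individual channel rates would vary.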

Read more of this story at Slashdot.

Are 'Immortal Stars' Feasting on Dark Matter in the Milky Way's Core?

29 June 2024 at 10:34
"Stars very close to the center of our galaxy could be fueled by dark matter in perpetuity," writes Gizmodo, "according to a team of astronomers who recently studied the distant light sources." The group of stars, known as S-cluster stars, is just three light-years from the center of the Milky Way (for reference, we are about 26,000 light-years from the center of our galaxy, which hosts a supermassive black hole at its core). The stars are surprisingly young for their galactic neighborhood, yet they don't look like stars that simply migrated to this part of the Milky Way after forming in another location... As reported by Space.com, the research team posits that these weird stars may be accreting dark matter, which they then use as fuel to keep burning. Since models estimate there is plenty of dark matter near the galaxy's core, the stars are "forever young," as study lead author Isabelle John, an astrophysicist at the Kavli Institute for Particle Astrophysics and Cosmology told Space.com. Effectively, the stars have a long, long way to go before they start running low on fuel. The team's paper is currently hosted on the preprint server arXiv, meaning it has not yet gone through the process of peer review. Dark matter is only "seen" through its effects on other objects, the article points out โ€” leading to lots of theories as to where it's actually located. "Earlier this year, a different team of researchers proposed that neutron stars โ€” extremely dense stellar remnants โ€” could actually be a source of dark matter. Last July, yet another team suggested that the Webb Telescope had detected stars that were powered by dark matter."

Read more of this story at Slashdot.

Amazon Labor Union, Airplane Hub Workers Ally with Teamsters Organizing Workers Nationwide

24 June 2024 at 07:34
Two prominent unions are teaming up to challenge Amazon, reports the New York Times — "after years of organizing Amazon workers and pressuring the company to bargain over wages and working conditions." Members of the Amazon Labor Union "overwhelmingly chose to affiliate with the 1.3-million-member International Brotherhood of Teamsters" in a vote last Monday. While the Amazon Labor Union (or ALU) is the only union formally representing Amazon warehouse workers anywhere in America after an election in 2022, "it has yet to begin bargaining with Amazon, which continues to contest the election outcome." Leaders of both unions said the affiliation agreement would put them in a better position to challenge Amazon and would provide the Amazon Labor Union with more money and staff support... The Teamsters are ramping up their efforts to organize Amazon workers nationwide. The union voted to create an Amazon division in 2021, and Teamsters President Sean O'Brien was elected that year partly on a platform of making inroads at the company. The Teamsters told the ALU that they had allocated $8 million to support organizing at Amazon, according to ALU President Christian Smalls, and that the larger union was prepared to tap its more than $300 million strike and defense fund to aid in the effort... The Teamsters also recently reached an affiliation agreement with workers organizing at Amazon's largest airplane hub in the United States, a Kentucky facility known as KCVG. Experts have said unionizing KCVG could give workers substantial leverage because Amazon relies heavily on the hub to meet its one- and two-day shipping goals. Their agreement with the Teamsters says the Amazon Labor Union will also "lend its expertise to assist in organizing other Amazon facilities" across America, according to the article.

Read more of this story at Slashdot.

Slashdot Asks: What Do You Remember About the Web in 1994?

24 June 2024 at 03:34
"The Short Happy Reign of the CD-ROM" was just one article in a Fast Company series called 1994 Week. As the week rolled along they also re-visited Yahoo, Netscape, and how the U.S. Congress "forced the videogame industry to grow up." But another article argues that it's in web pages from 1994 that "you can start to see in those weird, formative years some surprising signs of what the web would be, and what it could be." It's hard to say precisely when the tipping point was. Many point to September '93, when AOL users first flooded Usenet. But the web entered a new phase the following year. According to an MIT study, at the start of 1994, there were just 623 web servers. By year's end, it was estimated there were at least 10,000, hosting new sites including Yahoo!, the White House, the Library of Congress, Snopes, the BBC, sex.com, and something called The Amazing FishCam. The number of servers globally was doubling every two months. No one had seen growth quite like that before. According to a press release announcing the start of the World Wide Web Foundation that October, this network of pages "was widely considered to be the fastest-growing network phenomenon of all time." As the year began, Web pages were by and large personal and intimate, made by research institutions, communities, or individuals, not companies or brands. Many pages embodied the spirit, or extended the presence, of newsgroups on Usenet, or "User's Net." (Snopes and the Internet Movie Database, which landed on the Web in 1993, began as crowd-sourced projects on Usenet.) But a number of big companies, including Microsoft, Sun, Apple, IBM, and Wells Fargo, established their first modest Web outposts in 1994, a hint of the shopping malls and content farms and slop factories and strip mines to come. 1994 also marked the start of banner ads and online transactions (a CD, pizzas), and the birth of spam and phishing... [B]ack in '94, the salesmen and oilmen and land-grabbers and developers had barely arrived. In the calm before the storm, the Web was still weird, unruly, unpredictable, and fascinating to look at and get lost in. People around the world weren't just writing and illustrating these pages, they were coding and designing them. For the most part, the design was non-design. With a few eye-popping exceptions, formatting and layout choices were simple, haphazard, personal, and โ€” in contrast to most of today's web โ€” irrepressibly charming. There were no table layouts yet; cascading style sheets, though first proposed in October 1994 by Norwegian programmer Hรฅkon Wium Lie, wouldn't arrive until December 1996... The highways and megalopolises would come later, courtesy of some of the world's biggest corporations and increasingly peopled by bots, but in 1994 the internet was still intimate, made by and for individuals... Soon, many people would add "under construction" signs to their Web pages, like a friendly request to pardon our dust. It was a reminder that someone was working on it โ€” another indication of the craft and care that was going into this never-ending quilt of knowledge. The article includes screenshots of Netscape in action from browser-emulating site OldWeb.Today (albeit without using a 14.4 kbps modems). "Look in and think about how and why this web grew the way it did, and what could have been. Or try to imagine what life was like when the web wasn't worldwide yet, and no one knew what it really was." 
Slashdot reader tedlistens calls it "a trip down memory lane," offering "some telling glimpses of the future, and some lessons for it too." The article revisits 1994 sites like Global Network Navigator, Time-Warner's Pathfinder, and Wired's online site HotWired as well as 30-year-old versions of the home pages for Wells Fargo and Microsoft. What did they miss? Share your own memories in the comments. What do you remember about the web in 1994?

Read more of this story at Slashdot.

Amazon Retaliated After Employee Walkout Over Return-to-Office Policy, Says NLRB

23 June 2024 at 23:34
America's National Labor Relations Board "has filed a complaint against Amazon..." reports the Verge, "that alleges the company 'unlawfully disciplined and terminated an employee' after they assisted in organizing walkouts last May in protest of Amazon's new return-to-work [three days per week] directives, issued early last year." [T]housands of Amazon employees signed petitions against the new mandate and staged a walkout several months later. Despite the protests and pushback, according to a report by Insider, in a meeting in early August 2023, Amazon CEO Andy Jassy reaffirmed the company's commitment to employees returning to the office for the majority of the week. The NLRB complaint alleges Amazon "interrogated" employees about the walkout using its internal Chime system. The employee was first put on a performance improvement plan by Amazon following their organizing efforts for the walkout and later "offered a severance payment of nine weeks' salary if the employee signed a severance agreement and global release in exchange for their resignation." According to the NLRB's lawyers, all of that was because the employee engaged in organizing, and the retaliation was intended to discourage "...protected, concerted activities...." The NLRB's general counsel is seeking several different forms of remediation from Amazon, including reimbursement for the employee's "financial harms and search-for-work and work related expenses," a letter of apology, and a "Notice to Employees" that must be physically posted at the company's facilities across the country, distributed electronically, and read by an Amazon rep at a recorded videoconference. Amazon says its actions were entirely unrelated to the worker's activism against its return-to-work policies. An Amazon spokesperson told the Verge that instead, the employee "consistently underperformed over a period of nearly a year and repeatedly failed to deliver on projects she was assigned. Despite extensive support and coaching, the former employee was unable to improve her performance and chose to leave the company."

Read more of this story at Slashdot.

Framework Laptop 13 is Getting a Drop-In RISC-V Mainboard Option

23 June 2024 at 21:34
An anonymous reader shared this report from the OMG Ubuntu blog: Those of you who own a Framework Laptop 13 — consider me jealous, btw — or are considering buying one in the near future may be interested to know that a RISC-V motherboard option is in the works. DeepComputing, the company behind the recently-announced Ubuntu RISC-V laptop, is working with Framework Computer Inc, the company behind the popular, modular, and Linux-friendly Framework laptops, on a RISC-V mainboard. This is a new announcement; the component itself is in early development, and there's no tentative price tag or pre-order date pencilled in... [T]he Framework RISC-V mainboard will use soldered memory and non-upgradeable eMMC storage (though it can boot from microSD cards). It will 'drop into' any Framework Laptop 13 chassis (or Cooler Master Mainboard Case), per Framework's modular ethos... Framework mentions DeepComputing is "working closely with the teams at Canonical and Red Hat to ensure Linux support is solid through Ubuntu and Fedora", which is great news, and cements Canonical's seriousness about supporting Ubuntu on RISC-V. "We want to be clear that in this generation, it is focused primarily on enabling developers, tinkerers, and hobbyists to start testing and creating on RISC-V," says Framework's announcement. "The peripheral set and performance aren't yet competitive with our Intel and AMD-powered Framework Laptop Mainboards." They're calling the Mainboard "a huge milestone both for expanding the breadth of the Framework ecosystem and for making RISC-V more accessible than ever... DeepComputing is demoing an early prototype of this Mainboard in a Framework Laptop 13 at the RISC-V Summit Europe next week, and we'll be sharing more as this program progresses." And their announcement included two additional updates: "Just like we did for Framework Laptop 16 last week, today we're sharing open source CAD for the Framework Laptop 13 shell, enabling development of skins, cases, and accessories." "We now have Framework Laptop 13 Factory Seconds systems available with British English and German keyboards, making entering the ecosystem more affordable than ever." "We're eager to continue growing a new Consumer Electronics industry that is grounded in open access, repairability, and customization at every level."

Read more of this story at Slashdot.

Why Washington's Mount Rainier Still Makes Volcanologists Worry

23 June 2024 at 19:33
It's been 1,000 years since there was a significant volcanic eruption from Mount Rainier, CNN reminds readers. It's a full 60 miles from Tacoma, Washington — and 90 miles from Seattle. Yet "more than Hawaii's bubbling lava fields or Yellowstone's sprawling supervolcano, it's Mount Rainier that has many U.S. volcanologists worried." "Mount Rainier keeps me up at night because it poses such a great threat to the surrounding communities," said Jess Phoenix, a volcanologist and ambassador for the Union of Concerned Scientists, on an episode of CNN's series "Violent Earth With Liev Schreiber." The sleeping giant's destructive potential lies not with fiery flows of lava, which, in the event of an eruption, would be unlikely to extend more than a few miles beyond the boundary of Mount Rainier National Park in the Pacific Northwest. And the majority of volcanic ash would likely dissipate downwind to the east away from population centers, according to the US Geological Survey. Instead, many scientists fear the prospect of a lahar — a swiftly moving slurry of water and volcanic rock originating from ice or snow rapidly melted by an eruption that picks up debris as it flows through valleys and drainage channels. "The thing that makes Mount Rainier tough is that it is so tall, and it's covered with ice and snow, and so if there is any kind of eruptive activity, hot stuff ... will melt the cold stuff and a lot of water will start coming down," said Seth Moran, a research seismologist at USGS Cascades Volcano Observatory in Vancouver, Washington. "And there are tens, if not hundreds of thousands of people who live in areas that potentially could be impacted by a large lahar, and it could happen quite quickly." The deadliest lahar in recent memory was in November 1985 when Colombia's Nevado del Ruiz volcano erupted. Just a couple hours after the eruption started, a river of mud, rocks, lava and icy water swept over the town of Armero, killing over 23,000 people in a matter of minutes... Bradley Pitcher, a volcanologist and lecturer in Earth and environmental sciences at Columbia University, said in an episode of CNN's "Violent Earth" that Mount Rainier has about eight times the amount of glaciers and snow as Nevado del Ruiz had when it erupted. "There's the potential to have a much more catastrophic mudflow...." Lahars typically occur during volcanic eruptions but also can be caused by landslides and earthquakes. Geologists have found evidence that at least 11 large lahars from Mount Rainier have reached into the surrounding area, known as the Puget Lowlands, in the past 6,000 years, Moran said. Two major U.S. cities — Tacoma and South Seattle — "are built on 100-foot-thick (30.5-meter) ancient mudflows from eruptions of Mount Rainier," the volcanologist said on CNN's "Violent Earth" series. CNN's article adds that the US Geological Survey already set up a lahar detection system at Mount Rainier in 1998, "which since 2017 has been upgraded and expanded. About 20 sites on the volcano's slopes and the two paths identified as most at risk of a lahar now feature broadband seismometers that transmit real-time data and other sensors including trip wires, infrasound sensors, web cameras and GPS receivers."

Read more of this story at Slashdot.

Apple Might Partner with Meta on AI

23 June 2024 at 18:33
Earlier this month Apple announced a partnership with OpenAI to bring ChatGPT to Siri. "Now, the Wall Street Journal reports that Apple and Facebook's parent company Meta are in talks around a similar deal," according to TechCrunch: A deal with Meta could make Apple less reliant on a single partner, while also providing validation for Meta's generative AI tech. The Journal reports that Apple isn't offering to pay for these partnerships; instead, Apple provides distribution to AI partners who can then sell premium subscriptions... Apple has said it will ask for users' permission before sharing any questions and data with ChatGPT. Presumably, any integration with Meta would work similarly.

Read more of this story at Slashdot.

Michigan Lawmakers Advance Bill Requiring All Public High Schools To At Least Offer CS

23 June 2024 at 17:33
Michigan's House of Representatives passed a bill requiring all the state's public high schools to offer a computer science course by the start of the 2027-28 school year. (The bill now goes to the Senate, according to a report from Chalkbeat Detroit.) Long-time Slashdot reader theodp writes: Michigan is also removing the requirement for CS teacher endorsements in 2026, paving the way for CS courses to be taught in 2027 by teachers who have "demonstrated strong computer science skills" but do not hold a CS endorsement. Michigan's easing of CS teaching requirements comes in the same year that New York State will begin requiring credentials for all CS teachers. With lobbyist Julia Wynn from the tech giant-backed nonprofit Code.org sitting at her side, Michigan State Rep. Carol Glanville introduced the CS bill (HB5649) to the House in May (hearing video, 16:20). "This is not a graduation requirement," Glanville emphasized in her testimony. Code.org's Wynn called the Bill "an important first step" — after all, Code.org's goal is "to require all students to take CS to earn a HS diploma" — noting that Code.org has also been closely collaborating with Michigan's Education department "on the language and the Bill since inception." Wynn went on to inform lawmakers that "even just attending a high school that offers computer science delivers concrete employment and earnings benefits for students," citing a recent Brookings Institution article that also noted "30 states have adopted a key part of Code.org Advocacy Coalition's policy recommendations, which require all high schools to offer CS coursework, while eight states (and counting) have gone a step further in requiring all students to take CS as a high school graduation requirement." Minutes from the hearing report that other parties submitting cards in support of HB 5649 included Amazon (a $3+ million Code.org Platinum Supporter) and AWS (a Code.org In-Kind Supporter), as well as College Board (which offers the AP CS A and CSP exams) and TechNet (which notes its "teams at the federal and state levels advocate with policymakers on behalf of our member companies").

Read more of this story at Slashdot.

Longtime Linux Wireless Developer Passes Away. RIP Larry Finger

23 June 2024 at 16:33
Slashdot reader unixbhaskar shared this report from Phoronix: Larry Finger, who has contributed to the Linux kernel since 2005 and has seen more than 1,500 kernel patches upstreamed into the mainline Linux kernel, has sadly passed away. His wife shared the news of Larry Finger's passing this weekend on the linux-wireless mailing list in a brief statement. Reactions are being shared around the internet. LWN writes: The LWN Kernel Source Database shows that Finger contributed to 94 releases in the (Git era) kernel history, starting with 2.6.16 — 1,464 commits in total. He will be missed... In part due to his contributions, Linux wireless hardware support has come a long way over the past two decades. Larry was a frequent contributor to the Linux Wireless and Linux Kernel mailing lists. (Here's a 2006 discussion he had about Git with Linus Torvalds.) Larry also answered 54 Linux questions on Quora, and in 2005 wrote three articles for Linux Journal. And Larry's GitHub profile shows 122 contributions to open source projects just in 2024. In Reddit's Linux forum, one commenter wrote, "He was 84 years old and was still writing code. What a legend. May he rest in peace."

Read more of this story at Slashdot.

OpenAI's 'Media Manager' Mocked, Amid Accusations of Robbing Creative Professionals

23 June 2024 at 15:16
"Amid the hype surrounding Apple's new deal with OpenAI, one issue has been largely papered over," argues the Executive Director of America's writers' advocacy group, the Authors Guild. OpenAI's foundational models "are, and have always been, built atop the theft of creative professionals' work." [L]ast month the company quietly announced Media Manager, scheduled for release in 2025. A tool purportedly designed to allow creators and content owners to control how their work is used, Media Manager is really a shameless attempt to evade responsibility for the theft of artists' intellectual property that OpenAI is already profiting from. OpenAI says this tool would allow creators to identify their work and choose whether to exclude it from AI training processes. But this does nothing to address the fact that the company built its foundational models using authors' and other creators' works without consent, compensation or control over how OpenAI users will be able to imitate the artists' styles to create new works. As it's described, Media Manager puts the burden on creators to protect their work and fails to address the company's past legal and ethical transgressions. This overture is like having your valuables stolen from your home and then hearing the thief say, "Don't worry, I'll give you a chance to opt out of future burglaries ... next year...." AI companies often argue that it would be impossible for them to license all the content that they need and that doing so would bring progress to a grinding halt. This is simply untrue. OpenAI has signed a succession of licensing agreements with publishers large and small. While the exact terms of these agreements are rarely released to the public, the compensation estimates pale in comparison with the vast outlays for computing power and energy that the company readily spends. Payments to authors would have minimal effects on AI companies' war chests, but receiving royalties for AI training use would be a meaningful new revenue stream for a profession that's already suffering... We cannot trust tech companies that swear their innovations are so important that they do not need to pay for one of the main ingredients — other people's creative works. The "better future" we are being sold by OpenAI and others is, in fact, a dystopia. It's time for creative professionals to stand together, demand what we are owed and determine our own futures. The Authors Guild (and 17 other plaintiffs) are now in an ongoing lawsuit against OpenAI and Microsoft. And the Guild's executive director notes that there's also "a class action filed by visual artists against Stability AI, Runway AI, Midjourney and Deviant Art, a lawsuit by music publishers against Anthropic for infringement of song lyrics, and suits in the U.S. and U.K. brought by Getty Images against Stability AI for copyright infringement of photographs." They conclude that "The best chance for the wider community of artists is to band together."

Read more of this story at Slashdot.

Tuesday SpaceX Launches a NOAA Satellite to Improve Weather Forecasts for Earth and Space

23 June 2024 at 13:59
Tuesday a SpaceX Falcon Heavy rocket will launch a special satellite — a state-of-the-art weather-watcher from America's National Oceanic and Atmospheric Administration. It will complete a series of four GOES-R satellite launches that began in 2016. Space.com drills down into how these satellites have changed weather forecasts: More than seven years later, with three of the four satellites in the series orbiting the Earth, scientists and researchers say they are pleased with the results and how the advanced technology has been a game changer. "I think it has really lived up to its hype in thunderstorm forecasting. Meteorologists can see the convection evolve in near real-time and this gives them enhanced insight on storm development and severity, making for better warnings," John Cintineo, a researcher from NOAA's National Severe Storms Laboratory, told Space.com in an email. "Not only does the GOES-R series provide observations where radar coverage is lacking, but it often provides a robust signal before radar, such as when a storm is strengthening or weakening. I'm sure there have been many other improvements in forecasts and environmental monitoring over the last decade, but this is where I have most clearly seen improvement," Cintineo said. In addition to helping predict severe thunderstorms, each satellite has collected images and data on heavy rain events that could trigger flooding, detected low clouds and fog as they form, and has made significant improvements to forecasts and services used during hurricane season. "GOES provides our hurricane forecasters with faster, more accurate and detailed data that is critical for estimating a storm's intensity, including cloud top cooling, convective structures, specific features of a hurricane's eye, upper-level wind speeds, and lightning activity," Ken Graham, director of NOAA's National Weather Service, told Space.com in an email. Instruments such as the Advanced Baseline Imager have three times the spectral channels, four times the image quality, and five times the imaging speed of the previous GOES satellites. The Geostationary Lightning Mapper, the first instrument of its kind in orbit, flies on the GOES-R series and lets scientists view lightning 24/7, including strikes that make contact with the ground and cloud-to-cloud lightning. "GOES-U and the GOES-R series of satellites provides scientists and forecasters weather surveillance of the entire western hemisphere, at unprecedented spatial and temporal scales," Cintineo said. "Data from these satellites are helping researchers develop new tools and methods to address problems such as lightning prediction, sea-spray identification (sea-spray is dangerous for mariners), severe weather warnings, and accurate cloud motion estimation. The instruments from GOES-R also help improve forecasts from global and regional numerical weather models, through improved data assimilation." The final satellite, launching Tuesday, includes a new sensor — the Compact Coronagraph — "that will monitor weather outside of Earth's atmosphere, keeping an eye on what space weather events are happening that could impact our planet," according to the article. "It will be the first near real-time operational coronagraph that we have access to," Rob Steenburgh, a space scientist at NOAA's Space Weather Prediction Center, told Space.com on the phone. "That's a huge leap for us because up until now, we've always depended on a research coronagraph instrument on a spacecraft that was launched quite a long time ago."

Read more of this story at Slashdot.

Foundation Honoring 'Star Trek' Creator Offers $1M Prize for AI Startup Benefiting Humanity

23 June 2024 at 12:34
The Roddenberry Foundation — named for Star Trek creator Gene Roddenberry — "announced Tuesday that this year's biennial award would focus on artificial intelligence that benefits humanity," reports the Los Angeles Times: Lior Ipp, chief executive of the foundation, told The Times there's a growing recognition that AI is becoming more ubiquitous and will affect all aspects of our lives. "We are trying to ... catalyze folks to think about what AI looks like if it's used for good," Ipp said, "and what it means to use AI responsibly, ethically and toward solving some of the thorny global challenges that exist in the world...." Ipp said the foundation shares the broad concern about AI and sees the award as a means to potentially contribute to creating those guardrails... Inspiration for the theme was also borne out of the applications the foundation received last time around. Ipp said the prize, which is "issue-agnostic" but focused on early-stage tech, produced compelling uses of AI and machine learning in agriculture, healthcare, biotech and education. "So," he said, "we sort of decided to double down this year on specifically AI and machine learning...." Though the foundation isn't prioritizing a particular issue, the application states that it is looking for ideas that have the potential to push the needle on one or more of the United Nations' 17 sustainable development goals, which include eliminating poverty and hunger as well as boosting climate action and protecting life on land and underwater. The Foundation's most recent winner was Sweden-based Elypta, according to the article, "which Ipp said is using liquid biopsies, such as a blood test, to detect cancer early." "We believe that building a better future requires a spirit of curiosity, a willingness to push boundaries, and the courage to think big," said Rod Roddenberry, co-founder of the Roddenberry Foundation. "The Prize will provide a significant boost to AI pioneers leading these efforts." According to the Foundation's announcement, the Prize "embodies the Roddenberry philosophy's promise of a future in which technology and human ingenuity enable everyone — regardless of background — to thrive." "By empowering entrepreneurs to dream bigger and innovate valiantly, the Roddenberry Prize seeks to catalyze the development of AI solutions that promote abundance and well-being for all."

Read more of this story at Slashdot.

EFF: New License Plate Reader Vulnerabilities Prove The Tech Itself is a Public Safety Threat

23 June 2024 at 11:34
Automated license plate readers "pose risks to public safety," argues the EFF, "that may outweigh the crimes they are attempting to address in the first place." When law enforcement uses automated license plate readers (ALPRs) to document the comings and goings of every driver on the road, regardless of a nexus to a crime, it results in gargantuan databases of sensitive information, and few agencies are equipped, staffed, or trained to harden their systems against quickly evolving cybersecurity threats. The Cybersecurity and Infrastructure Security Agency (CISA), a component of the U.S. Department of Homeland Security, released an advisory last week that should be a wake up call to the thousands of local government agencies around the country that use ALPRs to surveil the travel patterns of their residents by scanning their license plates and "fingerprinting" their vehicles. The bulletin outlines seven vulnerabilities in Motorola Solutions' Vigilant ALPRs, including missing encryption and insufficiently protected credentials... Unlike location data a person shares with, say, GPS-based navigation app Waze, ALPRs collect and store this information without consent and there is very little a person can do to have this information purged from these systems... Because drivers don't have control over ALPR data, the onus for protecting the data lies with the police and sheriffs who operate the surveillance and the vendors that provide the technology. It's a general tenet of cybersecurity that you should not collect and retain more personal data than you are capable of protecting. Perhaps ironically, a Motorola Solutions cybersecurity specialist wrote an article in Police Chief magazine this month noting that public safety agencies "are often challenged when it comes to recruiting and retaining experienced cybersecurity personnel," even though "the potential for harm from external factors is substantial." That partially explains why more than 125 law enforcement agencies reported a data breach or cyberattacks between 2012 and 2020, according to research by former EFF intern Madison Vialpando. The Motorola Solutions article claims that ransomware attacks "targeting U.S. public safety organizations increased by 142 percent" in 2023. Yet, the temptation to "collect it all" continues to overshadow the responsibility to "protect it all." What makes the latest CISA disclosure even more outrageous is it is at least the third time in the last decade that major security vulnerabilities have been found in ALPRs... If there's one positive thing we can say about the latest Vigilant vulnerability disclosures, it's that for once a government agency identified and reported the vulnerabilities before they could do damage... The Michigan Cyber Command center found a total of seven vulnerabilities in Vigilant devices, two of which were medium severity and five of which were high severity vulnerabilities... But a data breach isn't the only way that ALPR data can be leaked or abused. In 2022, an officer in the Kechi (Kansas) Police Department accessed ALPR data shared with his department by the Wichita Police Department to stalk his wife. The article concludes that public safety agencies should "collect only the data they need for actual criminal investigations. They must never store more data than they adequately protect within their limited resources, or they must keep the public safe from data breaches by not collecting the data at all."

Read more of this story at Slashdot.

Our Brains React Differently to Deepfake Voices, Researchers Find

23 June 2024 at 10:34
"University of Zurich researchers have discovered that our brains process natural human voices and "deepfake" voices differently," writes Slashdot reader jenningsthecat. From the University's announcement: The researchers first used psychoacoustical methods to test how well human voice identity is preserved in deepfake voices. To do this, they recorded the voices of four male speakers and then used a conversion algorithm to generate deepfake voices. In the main experiment, 25 participants listened to multiple voices and were asked to decide whether or not the identities of two voices were the same. Participants either had to match the identity of two natural voices, or of one natural and one deepfake voice. The deepfakes were correctly identified in two thirds of cases. "This illustrates that current deepfake voices might not perfectly mimic an identity, but do have the potential to deceive people," says Claudia Roswandowitz, first author and a postdoc at the Department of Computational Linguistics. The researchers then used imaging techniques to examine which brain regions responded differently to deepfake voices compared to natural voices. They successfully identified two regions that were able to recognize the fake voices: the nucleus accumbens and the auditory cortex. "The nucleus accumbens is a crucial part of the brain's reward system. It was less active when participants were tasked with matching the identity between deepfakes and natural voices," says Claudia Roswandowitz. In contrast, the nucleus accumbens showed much more activity when it came to comparing two natural voices. The complete paper appears in Nature.

Read more of this story at Slashdot.

Multiple AI Companies Ignore Robots.Txt Files, Scrape Web Content, Says Licensing Firm

23 June 2024 at 07:34
Multiple AI companies are ignoring Robots.txt files meant to block the scraping of web content for generative AI systems, reports Reuters — citing a warning sent to publishers by content licensing startup TollBit. TollBit, an early-stage startup, is positioning itself as a matchmaker between content-hungry AI companies and publishers open to striking licensing deals with them. The company tracks AI traffic to the publishers' websites and uses analytics to help both sides settle on fees to be paid for the use of different types of content... It says it had 50 websites live as of May, though it has not named them. According to the TollBit letter, the AI search engine Perplexity is not the only offender that appears to be ignoring robots.txt. TollBit said its analytics indicate "numerous" AI agents are bypassing the protocol, a standard tool used by publishers to indicate which parts of their sites can be crawled. "What this means in practical terms is that AI agents from multiple sources (not just one company) are opting to bypass the robots.txt protocol to retrieve content from sites," TollBit wrote. "The more publisher logs we ingest, the more this pattern emerges." The article includes this quote from the president of the News Media Alliance (a trade group representing over 2,200 U.S.-based publishers). "Without the ability to opt out of massive scraping, we cannot monetize our valuable content and pay journalists. This could seriously harm our industry." Reuters also notes another threat facing news sites: Publishers have been raising the alarm about news summaries in particular since Google rolled out a product last year that uses AI to create summaries in response to some search queries. If publishers want to prevent their content from being used by Google's AI to help generate those summaries, they must use the same tool that would also prevent them from appearing in Google search results, rendering them virtually invisible on the web.
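For readers who haven't looked at the mechanism in question: robots.txt is a purely voluntary convention. A well-behaved crawler fetches the file and honors its rules, but nothing technically prevents a crawler from ignoring it, which is exactly the behavior TollBit describes. Here is a minimal sketch of the compliant path using Python's standard library; the bot name and URLs are made-up placeholders:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical crawler name and site, for illustration only.
USER_AGENT = "ExampleAIBot"
ROBOTS_URL = "https://example.com/robots.txt"
TARGET_URL = "https://example.com/articles/some-story"

rp = RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()  # fetch and parse the publisher's robots.txt

# A compliant crawler checks this before every request; a non-compliant one simply doesn't.
if rp.can_fetch(USER_AGENT, TARGET_URL):
    print("robots.txt permits this agent to crawl the URL")
else:
    print("robots.txt asks this agent not to crawl the URL")
```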

Read more of this story at Slashdot.

America's Used EV Price Crash Keeps Getting Deeper

23 June 2024 at 03:34
Long-time Slashdot reader schwit1 shares CNBC's report on the U.S. car market: Back in February, used electric vehicle prices dipped below used gasoline-powered vehicle prices for the first time ever, and the pricing cliff keeps getting steeper as car buyers reject any "premium" tag formerly associated with EVs. The decline has been dramatic over the past year. In June 2023, average used EV prices were over 25% higher than used gas car prices, but by May, used EVs were on average 8% lower than the average price for a used gasoline-powered car in the U.S. In dollar terms, the gap widened from $265 in February to $2,657 in May, according to an analysis of 2.2 million one- to five-year-old used cars conducted by iSeeCars. Over the past year, gasoline-powered used vehicle prices have declined by between 3% and 7%, while electric vehicle prices have decreased by 30% to 39%. "It's clear used car shoppers will no longer pay a premium for electric vehicles," iSeeCars executive analyst Karl Brauer stated in an iSeeCars report published last week. Electric power is now a detractor in the consumer's mind, with EVs "less desirable" and therefore less valuable than traditional cars, he said. The article notes there's been a price war among EV manufacturers — and that newer EV models might be more attractive due to "longer ranges and improved battery life with temperature control for charging." But CNBC also notes a silver lining. "As more EVs enter the used market at lower prices, the EV market does become available to a wider market of potential first-time EV owners."

Read more of this story at Slashdot.

Launch of Chinese-French Satellite Scattered Debris Over Populated Area

23 June 2024 at 00:34
"A Chinese launch of the joint Sino-French SVOM mission to study Gamma-ray bursts early Saturday saw toxic rocket debris fall over a populated area..." writes Space News: SVOM is a collaboration between the China National Space Administration (CNSA) and France's Centre national d'รฉtudes spatiales (CNES). The mission will look for high-energy electromagnetic radiation from these events in the X-ray and gamma-ray ranges using two French and two Chinese-developed science payloads... Studying gamma-ray bursts, thought to be caused by the death of massive stars or collisions between stars, could provide answers to key questions in astrophysics. This includes the death of stars and the creation of black holes. However the launch of SVOM also created an explosion of its own closer to home.A video posted on Chinese social media site Sina Weibo appears to show a rocket booster falling on a populated area with people running for cover. The booster fell to Earth near Guiding County, Qiandongnan Prefecture in Guizhou province, according to another post... A number of comments on the video noted the danger posed by the hypergolic propellant from the Long March rocket... The Long March 2C uses a toxic, hypergolic mix of nitrogen tetroxide and unsymmetrical dimethylhydrazine (UDMH). Reddish-brown gas or smoke from the booster could be indicative of nitrogen tetroxide, while a yellowish gas could be caused by hydrazine fuel mixing with air. Contact with either remaining fuel or oxidizer from the rocket stage could be very harmful to individuals. "Falling rocket debris is a common issue with China's launches from its three inland launch sites..." the article points out. "Authorities are understood to issue warnings and evacuation notices for areas calculated to be at risk from launch debris, reducing the risk of injuries.

Read more of this story at Slashdot.

Open Source ChatGPT Clone 'LibreChat' Lets You Use Multiple AI Services - While Owning Your Data

22 June 2024 at 10:34
Slashdot reader DevNull127 writes: A free and open source ChatGPT clone — named LibreChat — lets its users choose which AI model to use, "to harness the capabilities of cutting-edge language models from multiple providers in a unified interface". This means LibreChat includes OpenAI's models, but also others — both open-source and closed-source — and its website promises "seamless integration" with AI services from OpenAI, Azure, Anthropic, and Google — as well as GPT-4, Gemini Vision, and many others. ("Every AI in one place," explains LibreChat's home page.) Plugins even let you make requests to DALL-E or Stable Diffusion for image generation. (LibreChat also offers a database that tracks "conversation state" — making it possible to switch to a different AI model in mid-conversation...) Released under the MIT License, LibreChat has become "an open source success story," according to this article, representing "the passionate community that's actively creating an ecosystem of open source AI tools." And its creator, Danny Avila, says in some cases it finally lets users own their own data, "which is a dying human right, a luxury in the internet age and even more so with the age of LLM's." Avila says he was inspired by the day ChatGPT leaked the chat history of some of its users back in March of 2023 — and LibreChat is "inherently completely private". From the article: With locally-hosted LLMs, Avila sees users finally getting "an opportunity to withhold training data from Big Tech, which many trade at the cost of convenience." In this world, LibreChat "is naturally attractive as it can run exclusively on open-source technologies, database and all, completely 'air-gapped.'" Even with remote AI services insisting they won't use transient data for training, "local models are already quite capable," Avila notes, "and will become more capable in general over time." And they're also compatible with LibreChat...
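The core idea behind that "conversation state" database is that the chat history lives on your side, so the backing model can be swapped at any turn. The snippet below illustrates the pattern only; it is not LibreChat's actual code or any vendor's API, and the provider functions are hypothetical stand-ins:

```python
from typing import Callable, Dict, List

Message = Dict[str, str]                   # e.g. {"role": "user", "content": "..."}
Provider = Callable[[List[Message]], str]  # takes the full history, returns a reply

class Conversation:
    """Keeps chat history locally, so the backing model/provider can change at any turn."""

    def __init__(self) -> None:
        self.history: List[Message] = []

    def send(self, text: str, provider: Provider) -> str:
        self.history.append({"role": "user", "content": text})
        reply = provider(self.history)  # whichever backend is chosen sees the same local state
        self.history.append({"role": "assistant", "content": reply})
        return reply

# Hypothetical stand-ins for real backends (a hosted API, a local model, etc.).
def hosted_model(history: List[Message]) -> str:
    return "reply from a remote API"

def local_model(history: List[Message]) -> str:
    return "reply from a locally hosted model"

chat = Conversation()
chat.send("Summarize this document.", provider=hosted_model)
chat.send("Now rephrase that more formally.", provider=local_model)  # switched mid-conversation
```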

Read more of this story at Slashdot.

Walmart Announces Electronic Shelf Labels They Can Change Remotely

22 June 2024 at 21:34
Walmart "became the latest retailer to announce it's replacing the price stickers in its aisles with electronic shelf labels," reports NPR. "The new labels allow employees to change prices as often as every ten seconds." "If it's hot outside, we can raise the price of water and ice cream. If there's something that's close to the expiration date, we can lower the price โ€” that's the good news," said Phil Lempert, a grocery industry analyst... The ability to easily change prices wasn't mentioned in Walmart's announcement that 2,300 stores will have the digitized shelf labels by 2026. Daniela Boscan, who participated in Walmart's pilot of the labels in Texas, said the label's key benefits are "increased productivity and reduced walking time," plus quicker restocking of shelves... As higher wages make labor more expensive, retailers big and small can benefit from the increased productivity that digitized shelf labels enable, said Santiago Gallino, a professor specializing in retail management at the University of Pennsylvania's Wharton School. "The bottom line, at least when I talk to retailers, is the calculation of the amount of labor that they're going to save by incorporating this. And in that sense, I don't think that this is something that only large corporations like Walmart or Target can benefit from," Gallino said. "I think that smaller chains can also see the potential benefit of it." Indeed, Walmart's announcement calls the tech "a win" for both customers and their workers, arguing that updating prices with a mobile app means "reducing the need to walk around the store to change paper tags by hand and giving us more time to support customers in the store." Professor Gallino tells NPR he doesn't think Walmart will suddenly change prices โ€” though he does think Walmart will use it to keep their offline and online prices identical. The article also points out you can already find electronic shelf labels at other major grocers inlcuding Amazon Fresh stores and Whole Foods โ€” and that digitized shelf labels "are even more common in stores across Europe." Another feature of electronic shelf labels is their product descriptions. [Grocery analyst] Lempert notes that barcodes on the new labels can provide useful details other than the price. "They can actually be used where you take your mobile device and you scan it and it can give you more information about the product โ€” whether it's the sourcing of the product, whether it's gluten free, whether it's keto friendly. That's really the promise of what these shelf tags can do," Lempert said. Thanks to long-time Slashdot reader loveandpeace for sharing the article.

Read more of this story at Slashdot.

Data Dump of Patient Records Possible After UK Hospital Breach

22 June 2024 at 18:34
An anonymous reader shared this report from the Associated Press: An investigation into a ransomware attack earlier this month on London hospitals by the Russian group Qilin could take weeks to complete, the country's state-run National Health Service said Friday, as concerns grow over a reported data dump of patient records. Hundreds of operations and appointments are still being canceled more than two weeks after the June 3 attack on NHS provider Synnovis, which provides pathology services primarily in southeast London... NHS England said Friday that it has been "made aware" that data connected to the attack have been published online. According to the BBC, Qilin shared almost 400GB of data, including patient names, dates of birth and descriptions of blood tests, on their darknet site and Telegram channel... According to Saturday's edition of the Guardian newspaper, records covering 300 million patient interactions, including the results of blood tests for HIV and cancer, were stolen during the attack. A website and helpline have been set up for patients affected.

Read more of this story at Slashdot.

Red Hat's RHEL-Based In-Vehicle OS Attains Milestone Safety Certification

22 June 2024 at 17:37
In 2022, Red Hat announced plans to extend RHEL to the automotive industry through Red Hat In-Vehicle Operating System (providing automakers with an open and functionally-safe platform). And this week Red Hat announced it achieved ISO 26262 ASIL-B certification from exida for the Linux math library (libm.so glibc) — a fundamental component of that Red Hat In-Vehicle Operating System. From Red Hat's announcement: This milestone underscores Red Hat's pioneering role in obtaining continuous and comprehensive Safety Element out of Context certification for Linux in automotive... This certification demonstrates that the engineering of the math library components individually and as a whole meet or exceed stringent functional safety standards, ensuring substantial reliability and performance for the automotive industry. The certification of the math library is a significant milestone that strengthens the confidence in Linux as a viable platform of choice for safety related automotive applications of the future... By working with the broader open source community, Red Hat can make use of the rigorous testing and analysis performed by Linux maintainers, collaborating across upstream communities to deliver open standards-based solutions. This approach enhances long-term maintainability and limits vendor lock-in, providing greater transparency and performance. Red Hat In-Vehicle Operating System is poised to offer a safety certified Linux-based operating system capable of concurrently supporting multiple safety and non-safety related applications in a single instance. These applications include advanced driver-assistance systems (ADAS), digital cockpit, infotainment, body control, telematics, artificial intelligence (AI) models and more. Red Hat is also working with key industry leaders to deliver pre-tested, pre-integrated software solutions, accelerating the route to market for SDV concepts. "Red Hat is fully committed to attaining continuous and comprehensive safety certification of Linux natively for automotive applications," according to the announcement, "and has the industry's largest pool of Linux maintainers and contributors committed to this initiative..." Or, as Network World puts it, "The phrase 'open source for the open road' is now being used to describe the inevitable fit between the character of Linux and the need for highly customizable code in all sorts of automotive equipment."
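For context, the component being certified is the ordinary C math library that virtually every Linux process links against. A quick way to poke at it directly is shown below; this is a sketch that assumes a Linux system with glibc, where the library is typically exposed as libm.so.6, and it has nothing to do with Red Hat's certification artifacts themselves:

```python
import ctypes
import ctypes.util

# Locate and load the system math library (glibc's libm); Linux/glibc assumed.
libm_path = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(libm_path)

# Declare the C signature of cos(): double cos(double)
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

# Call straight into libm -- the same class of component covered by the certification.
print(libm.cos(0.0))  # prints 1.0
```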

Read more of this story at Slashdot.

Linux Foundation's 'Open Source Security Foundation' Launches New Threat Intelligence Mailing List

22 June 2024 at 16:34
The Linux Foundation's "Open Source Security Foundation" (or OpenSSF) is a cross-industry forum to "secure the development, maintenance, and consumption of the open source software". And now the OpenSSF has launched a new mailing list "which aims to monitor the threat landscape of open-source project vulnerabilities," reports I Programmer, "in order to provide real time alerts to anyone subscribed." The Record explains its origins: OpenSSF General Manager Omkhar Arasaratnam said that at a recent open source event, members of the community ran a tabletop exercise where they simulated a security incident involving the discovery of a zero-day vulnerability. They worked their way through the open source ecosystem โ€” from cloud providers to maintainers to end users โ€” clearly defining how the discovery of a vulnerability would be dealt with from top to bottom. But one of the places where they found a gap is in the dissemination of information widely. "What we lack within the open source community is a place in which we can convene to distribute indicators of compromise (IOCs) and threats, tactics and procedures (TTPs) in a way that will allow the community to identify threats when our packages are under attack," Arasaratnam said... "[W]e're going to be standing up a mailing list for which we can share this information throughout the community and there can be discussion of things that are being seen. And that's one of the ways that we're responding to this gap that we saw...." The Siren mailing list will encourage public discussions on security flaws, concepts, and practices in the open source community with individuals who are not typically engaged in traditional upstream communication channels... Members of the Siren email list will get real-time updates about emerging threats that may be relevant to their projects... OpenSSF has created a signup page for those interested and urged others to share the email list to other open source community members... OpenSSF ecyosystem strategist Christopher Robinson (also security communications director for Intel) told the site he expects government agencies and security researchers to be involved in the effort. And he issued this joint statement with OpenSSF ecosystem strategist Bennett Pursell: By leveraging the collective knowledge and expertise of the open source community and other security experts, the OpenSSF Siren empowers projects of all sizes to bolster their cybersecurity defenses and increase their overall awareness of malicious activities. Whether you're a developer, maintainer, or security enthusiast, your participation is vital in safeguarding the integrity of open source software. In less than a month, the mailing list has already grown to over 800 members...

Read more of this story at Slashdot.

Microsoft Admits No Guarantee of Sovereignty For UK Policing Data

22 June 2024 at 15:34
An anonymous reader shared this report from Computer Weekly: Microsoft has admitted to Scottish policing bodies that it cannot guarantee the sovereignty of UK policing data hosted on its hyperscale public cloud infrastructure, despite its systems being deployed throughout the criminal justice sector. According to correspondence released by the Scottish Police Authority (SPA) under freedom of information (FOI) rules, Microsoft is unable to guarantee that data uploaded to a key Police Scotland IT system — the Digital Evidence Sharing Capability (DESC) — will remain in the UK as required by law. While the correspondence has not been released in full, the disclosure reveals that data hosted in Microsoft's hyperscale public cloud infrastructure is regularly transferred and processed overseas; that the data processing agreement in place for the DESC did not cover UK-specific data protection requirements; and that while the company has the ability to make technical changes to ensure data protection compliance, it is only making these changes for DESC partners and not other policing bodies because "no one else had asked". The correspondence also contains acknowledgements from Microsoft that international data transfers are inherent to its public cloud architecture. As a result, the issues identified with the Scottish Police will equally apply to all UK government users, many of whom face similar regulatory limitations on the offshoring of data. The recipient of the FOI disclosures, Owen Sayers — an independent security consultant and enterprise architect with over 20 years' experience in delivering national policing systems — concluded it is now clear that UK policing data has been travelling overseas and "the statements from Microsoft make clear that they 100% cannot comply with UK data protection law".

Read more of this story at Slashdot.

Big Tech's AI Datacenters Demand Electricity. Are They Increasing Use of Fossil Fuels?

22 June 2024 at 14:34
The artificial intelligence revolution will demand more electricity, warns the Washington Post. "Much more..." They warn that the "voracious" electricity consumption of AI is driving an expansion of fossil fuel use in America — "including delaying the retirement of some coal-fired plants." As the tech giants compete in a global AI arms race, a frenzy of data center construction is sweeping the country. Some computing campuses require as much energy as a modest-sized city, turning tech firms that promised to lead the way into a clean energy future into some of the world's most insatiable guzzlers of power. Their projected energy needs are so huge, some worry whether there will be enough electricity to meet them from any source... A ChatGPT-powered search, according to the International Energy Agency, consumes almost 10 times the amount of electricity as a search on Google. One large data center complex in Iowa owned by Meta burns the annual equivalent of the power used by 7 million laptops running eight hours every day, based on data shared publicly by the company... [Tech companies] argue advancing AI now could prove more beneficial to the environment than curbing electricity consumption. They say AI is already being harnessed to make the power grid smarter, speed up innovation of new nuclear technologies and track emissions.... "If we work together, we can unlock AI's game-changing abilities to help create the net zero, climate resilient and nature positive world that we so urgently need," Microsoft said in a statement. The tech giants say they buy enough wind, solar or geothermal power every time a big data center comes online to cancel out its emissions. But critics see a shell game with these contracts: The companies are operating off the same power grid as everyone else, while claiming for themselves much of the finite amount of green energy. Utilities are then backfilling those purchases with fossil fuel expansions, regulatory filings show... heavily polluting fossil fuel plants that become necessary to stabilize the power grid overall because of these purchases, making sure everyone has enough electricity. The article quotes a project director at the nonprofit Data & Society (which tracks the effects of AI) who accuses the tech industry of using "fuzzy math" in its climate claims. "Coal plants are being reinvigorated because of the AI boom," they tell the Washington Post. "This should be alarming to anyone who cares about the environment." The article also summarizes a recent Goldman Sachs analysis, which predicted data centers would use 8% of America's total electricity by 2030, with 60% of that usage coming "from a vast expansion in the burning of natural gas. The new emissions created would be comparable to that of putting 15.7 million additional gas-powered cars on the road." "We all want to be cleaner," Brian Bird, president of NorthWestern Energy, a utility serving Montana, South Dakota and Nebraska, told a recent gathering of data center executives in Washington, D.C. "But you guys aren't going to wait 10 years ... My only choice today, other than keeping coal plants open longer than all of us want, is natural gas. And so you're going to see a lot of natural gas build out in this country." Big Tech responded by "going all in on experimental clean-energy projects that have long odds of success anytime soon," the article concludes. 
"In addition to fusion, they are hoping to generate power through such futuristic schemes as small nuclear reactors hooked to individual computing centers and machinery that taps geothermal energy by boring 10,000 feet into the Earth's crust..." Some experts point to these developments in arguing the electricity needs of the tech companies will speed up the energy transition away from fossil fuels rather than undermine it. "Companies like this that make aggressive climate commitments have historically accelerated deployment of clean electricity," said Melissa Lott, a professor at the Climate School at Columbia University.

Read more of this story at Slashdot.

Systemd 256.1 Addresses Complaint That 'systemd-tmpfiles' Could Unexpectedly Delete Your /home Directory

22 June 2024 at 13:34
"A good portion of my home directory got deleted," complained a bug report for systemd filed last week. It requested an update to a flag for the systemd-tmpfiles tool which cleans up files and directories: "a huge warning next to --purge. This option is dangerous, so it should be made clear that it's dangerous." The Register explains: As long as five years ago, systemd-tmpfiles had moved on past managing only temporary files โ€” as its name might suggest to the unwary. Now it manages all sorts of files created on the fly ... such as things like users' home directories. If you invoke the systemd-tmpfiles --purge command without specifying that very important config file which tells it which files to handle, version 256 will merrily purge your entire home directory. The bug report first drew a cool response from systemd developer Luca Boccassi of Microsoft: So an option that is literally documented as saying "all files and directories created by a tmpfiles.d/ entry will be deleted", that you knew nothing about, sounded like a "good idea"? Did you even go and look what tmpfiles.d entries you had beforehand? Maybe don't just run random commands that you know nothing about, while ignoring what the documentation tells you? Just a thought eh But the report then triggered "much discussion," reports Phoronix. Some excerpts: Lennart Poettering: "I think we should fail --purge if no config file is specified on the command line. I see no world where an invocation without one would make sense, and it would have caught the problem here." Red Hat open source developer Zbigniew Jร„(TM)drzejewski-Szmek: "We need to rethink how --purge works. The principle of not ever destroying user data is paramount. There can be commands which do remove user data, but they need to be minimized and guarded." Systemd contributor Betonhaus: "Having a function that declares irreplaceable files โ€” such as the contents of a home directory โ€” to be temporary files that can be easily purged, is at best poor user interfacing design and at worst a severe design flaw." But in the end, Phoronix writes, systemd-tmpfiles behavior "is now improved upon." "Merged Wednesday was this patch that now makes systemd-tmpfiles accept a configuration file when running purge. That way the user must knowingly supply the configuration file(s) to which files they would ultimately like removed. The documentation has also been improved upon to make the behavior more clear." Thanks to long-time Slashdot reader slack_justyb for sharing the news.

Read more of this story at Slashdot.

โŒ
โŒ