
Engineers are Building the Hottest Geothermal Power Plant on Earth - Next to a US Volcano

"On the slopes of an Oregon volcano, engineers are building the hottest geothermal power plant on Earth," reports the Washington Post: The plant will tap into the infernal energy of Newberry Volcano, "one of the largest and most hazardous active volcanoes in the United States," according to the U.S. Geological Survey. It has already reached temperatures of 629 degrees Fahrenheit, making it one of the hottest geothermal sites in the world, and next year it will start selling electricity to nearby homes and businesses. But the start-up behind the project, Mazama Energy, wants to crank the temperature even higher — north of 750 degrees — and become the first to make electricity from what industry insiders call "superhot rock." Enthusiasts say that could usher in a new era of geothermal power, transforming the always-on clean energy source from a minor player to a major force in the world's electricity systems. "Geothermal has been mostly inconsequential," said Vinod Khosla, a venture capitalist and one of Mazama Energy's biggest financial backers. "To do consequential geothermal that matters at the scale of tens or hundreds of gigawatts for the country, and many times that globally, you really need to solve these high temperatures." Today, geothermal produces less than 1 percent of the world's electricity. But tapping into superhot rock, along with other technological advances, could boost that share to 8 percent by 2050, according to the International Energy Agency (IEA). Geothermal using superhot temperatures could theoretically generate 150 times more electricity than the world uses, according to the IEA. "We believe this is the most direct path to driving down the cost of geothermal and making it possible across the globe," said Terra Rogers, program director for superhot rock geothermal at the Clean Air Task Force, an environmentalist think tank. "The [technological] gaps are within reason. These are engineering iterations, not breakthroughs." 
The Newberry Volcano project combines two big trends that could make geothermal energy cheaper and more widely available. First, Mazama Energy is bringing its own water to the volcano, using a method called "enhanced geothermal energy"... [O]ver the past few decades, pioneering projects have started to make energy from hot dry rocks by cracking the stone and pumping in water to make steam, borrowing fracking techniques developed by the oil and gas industry... The Newberry project also taps into hotter rock than any previous enhanced geothermal project. But even Newberry's 629 degrees fall short of the superhot threshold of 705 degrees or above. At that temperature, and under a lot of pressure, water becomes "supercritical" and starts acting like something between a liquid and a gas. Supercritical water holds lots of heat like a liquid, but it flows with the ease of a gas — combining the best of both worlds for generating electricity... [Sriram Vasantharajan, Mazama's CEO] said Mazama will dig new wells to reach temperatures above 750 degrees next year. Alongside an active volcano, the company expects to hit that temperature less than three miles beneath the surface. But elsewhere, geothermal developers might have to dig as deep as 12 miles. While Mazama plans to generate 15 megawatts of electricity next year, it hopes to eventually increase that to 200 megawatts. (And the company's CEO said it could theoretically generate five gigawatts of power.) But more importantly, successful projects "motivate other players to get into the market," according to a senior geothermal research analyst at energy consultancy Wood Mackenzie, who predicted to the Washington Post "a ripple effect" where "we'll start seeing more companies get the financial support to kick off their own pilots."

Read more of this story at Slashdot.

  •  

How the Internet Rewired Work - and What That Tells Us About AI's Likely Impact

"The internet did transform work — but not the way 1998 thought..." argues the Wall Street Journal. "The internet slipped inside almost every job and rewired how work got done." So while the number of single-task jobs like travel agent dropped, most jobs "are bundles of judgment, coordination and hands-on work," and instead the internet brought "the quiet transformation of nearly every job in the economy... Today, just 10% of workers make minimal use of the internet on the job — roles like butcher and carpet installer." [T]he bigger story has been additive. In 1998, few could conceive of social media — let alone 65,000 social-media managers — and 200,000 information-security analysts would have sounded absurd when data still lived on floppy disks... Marketing shifted from campaign bursts to always-on funnels and A/B testing. Clinics embedded e-prescribing and patient portals, reshaping front-office and clinical handoffs. The steps, owners and metrics shifted. Only then did the backbone scale: We went from server closets wedged next to the mop sink to data centers and cloud regions, from lone system administrators to fulfillment networks, cybersecurity and compliance. That is where many unexpected jobs appeared. Networked machines and web-enabled software quietly transformed back offices as much as our on-screen lives. Similarly, as e-commerce took off, internet-enabled logistics rewired planning roles — logisticians, transportation and distribution managers — and unlocked a surge in last-mile work. The build-out didn't just hire coders; it hired coordinators, pickers, packers and drivers. It spawned hundreds of thousands of warehouse and delivery jobs — the largest pockets of internet-driven job growth, and yet few had them on their 1998 bingo card... Today, the share of workers in professional and managerial occupations has more than doubled since the dawn of the digital era. So what does that tell us about AI? 
Our mental model often defaults to an industrial image — John Henry versus the steam drill — where jobs are one dominant task, and automation maps one-to-one: Automate the task, eliminate the job. The internet revealed a different reality: Modern roles are bundles. Technologies typically hit routine tasks first, then workflows, and only later reshape jobs, with second-order hiring around the backbone. That complexity is what made disruption slower and more subtle than anyone predicted. AI fits that pattern more than it breaks it... [LLMs] can draft briefs, summarize medical notes and answer queries. Those are tasks — important ones — but still parts of larger roles. They don't manage risk, hold accountability, reassure anxious clients or integrate messy context across teams. Expect a rebalanced division of labor: The technical layer gets faster and cheaper; the human layer shifts toward supervision, coordination, complex judgment, relationship work and exception handling. What to expect from AI, then, is messy, uneven reshuffling in stages. Some roles will contract sharply — and those contractions will affect real people. But many occupations will be rewired in quieter ways. Productivity gains will unlock new demand and create work that didn't exist, alongside a build-out around data, safety, compliance and infrastructure. AI is unprecedented; so was the internet. The real risk is timing: overestimating job losses, underestimating the long, quiet rewiring already under way, and overlooking the jobs created in the backbone. That was the internet's lesson. It's likely to be AI's as well.


  •  

Microsoft Warns Its Windows AI Feature Brings Data Theft and Malware Risks, and 'Occasionally May Hallucinate'

"Copilot Actions on Windows 11" is currently available in Insider builds (version 26220.7262) as part of Copilot Labs, according to a recent report, "and is off by default, requiring admin access to set it up." But maybe it's off for a good reason...besides the fact that it can access any apps installed on your system: In a support document, Microsoft admits that features like Copilot Actions introduce "novel security risks." They warn about cross-prompt injection (XPIA), where malicious content in documents or UI elements can override the AI's instructions. The result? "Unintended actions like data exfiltration or malware installation." Yeah, you read that right. Microsoft is shipping a feature that could be tricked into installing malware on your system. Microsoft's own warning hits hard: "We recommend that you only enable this feature if you understand the security implications." When you try to enable these experimental features, Windows shows you a warning dialog that you have to acknowledge. ["This feature is still being tested and may impact the performance or security of your device."] Even with these warnings, the level of access Copilot Actions demands is concerning. When you enable the feature, it gets read and write access to your Documents, Downloads, Desktop, Pictures, Videos, and Music folders... Microsoft says they are implementing safeguards. All actions are logged, users must approve data access requests, the feature operates in isolated workspaces, and the system uses audit logs to track activity. But you are still giving an AI system that can "hallucinate and produce unexpected outputs" (Microsoft's words, not mine) full access to your personal files. To address this, Ars Technica notes, Microsoft added this helpful warning to its support document this week. "As these capabilities are introduced, AI models still face functional limitations in terms of how they behave and occasionally may hallucinate and produce unexpected outputs." 
But Microsoft didn't describe "what actions they should take to prevent their devices from being compromised. I asked Microsoft to provide these details, and the company declined..."


  •  

Amazon's AI-Powered IDE Kiro Helps Vibe Coders with 'Spec Mode'

A promotional video for Amazon's Kiro software development system took a unique approach, writes GeekWire. "Instead of product diagrams or keynote slides, a crew from Seattle's Packrat creative studio used action figures on a miniature set to create a stop-motion sequence..." "Can the software development hero conquer the 'AI Slop Monster' to uncover the gleaming, fully functional robot buried beneath the coding chaos?" Kiro (pronounced KEE-ro) is Amazon's effort to rethink how developers use AI. It's an integrated development environment that attempts to tame the wild world of vibe coding... But rather than simply generating code from prompts [in "vibe mode"], Kiro breaks down requests into formal specifications, design documents, and task lists [in "spec mode"]. This spec-driven development approach aims to solve a fundamental problem with vibe coding: AI can quickly generate prototypes, but without structure or documentation, that code becomes unmaintainable... The market for AI-powered development tools is booming. Gartner expects AI code assistants to become ubiquitous, forecasting that 90% of enterprise software engineers will use them by 2028, up from less than 14% in early 2024... Amazon launched Kiro in preview in July, to a strong response. Positive early reviews were tempered by frustration from users unable to gain access. Capacity constraints have since been resolved, and Amazon says more than 250,000 developers used Kiro in the first three months... Now, the company is taking Kiro out of preview into general availability, rolling out new features and opening the tool more broadly to development teams and companies... During the preview period, Kiro handled more than 300 million requests and processed trillions of tokens as developers explored its capabilities, according to stats provided by the company. Rackspace used Kiro to complete what they estimated as 52 weeks of software modernization in three weeks, according to Amazon executives. 
SmugMug and Flickr are among other companies espousing the virtues of Kiro's spec-driven development approach. Early users are posting in glowing terms about the efficiencies they're seeing from adopting the tool... startups in most countries can apply for up to 100 free Pro+ seats for a year's worth of Kiro credits. Kiro offers property-based testing "to verify that generated code actually does what developers specified," according to the article — plus a checkpointing system that "lets developers roll back changes or retrace an agent's steps when an idea goes sideways..." "And yes, they've been using Kiro to build Kiro, which has allowed them to move much faster."


  •  

Did Bitcoin Play a Role in Thursday's Stock Sell-Off?

A week ago Bitcoin was at $93,714. Saturday it dropped to $85,300. Late Thursday, market researcher Ed Yardeni blamed some of Thursday's stock market sell-off on "the ongoing plunge in bitcoin's price," reports Fortune: "There has been a strong correlation between it and the price of TQQQ, an ETF that seeks to achieve daily investment results that correspond to three times (3x) the daily performance of the Nasdaq-100 Index," [Yardeni wrote in a note]. Yardeni blamed bitcoin's slide on the GENIUS Act, which was enacted on July 18, saying that the regulatory framework it established for stablecoins eliminated bitcoin's transactional role in the monetary system. "It's possible that the rout in bitcoin is forcing some investors to sell stocks that they own," he added... Traders who used leverage to make crypto bets would need to liquidate positions in the event of margin calls. Steve Sosnick, chief strategist at Interactive Brokers, also said bitcoin could swing the entire stock market, pointing out that it's become a proxy for speculation. "As a long-time systematic trader, it tells me that algorithms are acting upon the relationship between stocks and bitcoin," he wrote in a note on Thursday.


  •  

PHP 8.5 Brings Long-Awaited Pipe Operator, Adds New URI Tools

"PHP 8.5 landed on Thursday with a long-awaited pipe operator and a new standards-compliant URI parser," reports the Register, "marking one of the scripting language's more substantial updates..." The pipe operator allows function calls to be chained together, which avoids the extraneous variables and nested statements that might otherwise be involved. Pipes tend to make code more readable than other ways to implement serial operations. Anyone familiar with the Unix/Linux command line or programming languages like R, F#, Clojure, or Elixir may have used the pipe operator. In JavaScript, aka ECMAScript, a pipe operator has been proposed, though there are alternatives like method chaining. Another significant addition is the URI extension, which allows developers to parse and modify URIs and URLs based on both the RFC 3986 and the WHATWG URL standards. Parsing URIs and URLs — reading them and breaking them down into their different parts — is a rather common task for web-oriented applications. Yet prior versions of PHP didn't include a standards-compliant parser in the standard library. As noted by software developer Tim Düsterhus, the parse_url() function that dates back to PHP 4 doesn't follow any standard and comes with a warning that it should not be used with untrusted or malformed URLs. Other noteworthy additions to the language include: Clone With, for updating properties more efficiently; the #[\NoDiscard] attribute, for warning when a return value goes unused; the ability to use static closures and first-class callables in constant expressions; and persistent cURL handles that can be shared across multiple PHP requests.
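Both headline features fit in a few lines. Here's a minimal sketch: the pipe example follows the syntax from the accepted pipe-operator RFC, while the `Uri\Rfc3986\Uri` class name and its getters are taken from the URI RFC proposal, so treat those exact names as assumptions to verify against the 8.5 release:

```php
<?php
// The old way: nested calls read inside-out, or need throwaway variables.
$nested = str_replace(' ', '-', strtolower(trim("  Hello PHP 8.5  ")));

// PHP 8.5 pipe operator: the value on the left becomes the single
// argument of the callable on the right, so the steps read top to bottom.
$piped = "  Hello PHP 8.5  "
    |> trim(...)                                    // "Hello PHP 8.5"
    |> strtolower(...)                              // "hello php 8.5"
    |> fn (string $s) => str_replace(' ', '-', $s); // "hello-php-8.5"

var_dump($piped === $nested); // bool(true)

// The new URI extension: a standards-compliant alternative to parse_url().
$uri = new Uri\Rfc3986\Uri('https://example.com:8080/docs?page=2#intro');
echo $uri->getHost(), "\n"; // example.com
echo $uri->getPort(), "\n"; // 8080
```

Unlike `parse_url()`, the URI classes validate input against the relevant standard, which is what makes them suitable for untrusted URLs.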


  •  

'The Strange and Totally Real Plan to Blot Out the Sun and Reverse Global Warming'

In a 2023 pitch to investors, a "well-financed, highly credentialed" startup named Stardust aimed for a "gradual temperature reduction demonstration" in 2027, according to a massive new 9,600-word article from Politico. ("Annually dispersing ~1 million tons of sun-reflecting particles," says one slide. "Equivalent to ~1% extra cloud coverage.") "Another page told potential investors Stardust had already run low-altitude experiments using 'test particles'," the article notes: [P]ublic records and interviews with more than three dozen scientists, investors, legal experts and others familiar with the company reveal an organization advancing rapidly to the brink of being able to press "go" on its planet-cooling plans. Meanwhile, Stardust is seeking U.S. government contracts and quietly building an influence machine in Washington to lobby lawmakers and officials in the Trump administration on the need for a regulatory framework that it says is necessary to gain public approval for full-scale deployment.... The presentation also included revenue projections and a series of opportunities for venture capitalists to recoup their investments. Stardust planned to sign "government contracts," said a slide with the company's logo next to an American flag, and consider a "potential acquisition" by 2028. By 2030, the deck foresaw a "large-scale demonstration" of Stardust's system. At that point, the company claimed it would already be bringing in $200 million per year from its government contracts and eyeing an initial public offering, if it hadn't been sold already. The article notes that for "a widening circle of researchers and government officials, Stardust's perceived failures to be transparent about its work and technology have triggered a larger conversation about what kind of international governance framework will be needed to regulate a new generation of climate technologies." 
(Since currently Stardust and its backers "have no legal obligations to adhere to strenuous safety principles or to submit themselves to the public view.") In October Politico spoke to Stardust CEO Yanai Yedvab, a former nuclear physicist who was once deputy chief scientist at the Israeli Atomic Energy Commission. Stardust "was ready to announce the $60 million it had raised from 13 new investors," the article points out, "far larger than any previous investment in solar geoengineering." [Yedvab] was delighted, he said, not by the money, but what it meant for the project. "We are, like, few years away from having the technology ready to a level that decisions can be taken" — meaning that deployment was still on track to potentially begin on the timeline laid out in the 2023 pitch deck. The money raised was enough to start "outdoor contained experiments" as soon as April, Yedvab said. These would test how their particles performed inside a plane flying at stratospheric heights, some 11 miles above the Earth's surface... The key thing, he insisted, was the particle was "safe." It would not damage the ozone layer and, when the particles fall back to Earth, they could be absorbed back into the biosphere, he said. Though it's impossible to know this is true until the company releases its formula. Yedvab said this round of testing would make Stardust's technology ready to begin a staged process of full-scale, global deployment before the decade is over — as long as the company can secure a government client. To start, they would only try to stabilize global temperatures — in other words fly enough particles into the sky to counteract the steady rise in greenhouse gas levels — which would initially take a fleet of 100 planes. This raises the question: should the world attempt solar geoengineering? That the global temperature would drop is not in question. Britain's Royal Society... said in a report issued in early November that there was little doubt it would be effective. 
They did not endorse its use, but said that, given the growing interest in this field, there was good reason to be better informed about the side effects... [T]hat doesn't mean it can't have broad benefits when weighed against deleterious climate change, according to Ben Kravitz, a professor of earth and atmospheric sciences at Indiana University who has closely studied the potential effects of solar geoengineering. "There would be some winners and some losers. But in general, some amount of ... stratospheric aerosol injection would likely benefit a whole lot of people, probably most people," he said. Other scientists are far more cautious. The Royal Society report listed a range of potential negative side effects that climate models had displayed, including drought in sub-Saharan Africa. In accompanying documents, it also warned of more intense hurricanes in the North Atlantic and winter droughts in the Mediterranean. But the picture remains partial, meaning there is no way yet to have an informed debate over how useful or not solar geoengineering could be... And then there's the problem of trying to stop. Because an abrupt end to geoengineering, with all the carbon still in the atmosphere, would cause the temperature to soar suddenly upward with unknown, but likely disastrous, effects... Once the technology is deployed, the entire world would be dependent on it for however long it takes to reduce the trillion or more tons of excess carbon dioxide in the atmosphere to a safe level... Stardust claims to have solved many technical and safety challenges, especially related to the environmental impacts of the particle, which they say would not harm nature or people. But researchers say the company's current lack of transparency makes it impossible to trust. Thanks to long-time Slashdot reader fjo3 for sharing the article.


  •  

Meta Plans New AI-Powered 'Morning Brief' Drawn From Facebook and 'External Sources'

Meta "is testing a new product that would give Facebook users a personalized daily briefing powered by the company's generative AI technology," reports the Washington Post. They cite records they've reviewed showing that Meta "would analyze Facebook content and external sources to push custom updates to its users." The company plans to test the product with a small group of Facebook users in select cities such as New York and San Francisco, according to a person familiar with the project who spoke on the condition of anonymity to discuss private company matters... Meta's foray into pushing updates for consumers follows years of controversy over its relationship with publishers. The tech company has waffled between prominently featuring content from mainstream news sources on Facebook to pulling news links altogether as regulators pushed the tech giant to pay publishers for content on its platforms. More recently, publishers have sued Meta, alleging it infringed on their copyrighted works to train its AI models.


  •  

Are Astronomers Wrong About Dark Energy?

An anonymous reader shared this report from CNN: The universe's expansion might not be accelerating but slowing down, a new study suggests. If confirmed, the finding would upend decades of established astronomical assumptions and rewrite our understanding of dark energy, the elusive force that counters the inward pull of gravity in our universe... Last year, a consortium of hundreds of researchers using data from the Dark Energy Spectroscopic Instrument (DESI) in Arizona developed the largest ever 3D map of the universe. The observations hinted that dark energy may be weakening over time, indicating that the universe's rate of expansion could eventually slow. Now, a study published November 6 in the journal Monthly Notices of the Royal Astronomical Society provides further evidence that dark energy might not be pushing on the universe with the same strength it used to. The DESI project's findings last year represented "a major, major paradigm change ... and our result, in some sense, agrees well with that," said Young-Wook Lee, a professor of astrophysics at Yonsei University in South Korea and lead researcher for the new study.... To reach their conclusions, the researchers analyzed a sample of 300 galaxies containing Type Ia supernovas and posited that the dimming of distant exploding stars was not only due to their moving farther away from Earth, but also due to the progenitor star's age... [Study coauthor Junhyuk Son, a doctoral candidate in astronomy at Yonsei University, said] "we found that their luminosity actually depends on the age of the stars that produce them — younger progenitors yield slightly dimmer supernovae, while older ones are brighter." Son said the team has a high statistical confidence — 99.99% — about this age-brightness relation, allowing them to use Type Ia supernovas more accurately than before to assess the universe's expansion... 
Eventually, if the expansion continues to slow down, the universe could begin to contract, ending in what astronomers imagine may be the opposite of the big bang — the big crunch. "That is certainly a possibility," Lee said. "Even two years ago, the Big Crunch was out of the question. But we need more work to see whether it could actually happen." The new research proposes a radical revision of accepted knowledge, so, understandably, it is being met with skepticism. "This study rests on a flawed premise," Adam Riess, a professor of physics and astronomy at the Johns Hopkins University and one of the recipients of the 2011 Nobel Prize in physics, said in an email. "It suggests supernovae have aged with the Universe, yet observations show the opposite — today's supernovae occur where young stars form. The same idea was proposed years ago and refuted then, and there appears to be nothing new in this version." Lee, however, said Riess' claim is incorrect. "Even in the present-day Universe, Type Ia supernovae are found just as frequently in old, quiescent elliptical galaxies as in young, star-forming ones — which clearly shows that this comment is mistaken. The so-called paper that 'refuted' our earlier result relied on deeply flawed data with enormous uncertainties," he said, adding that the age-brightness correlation has been independently confirmed by two separate teams in the United States and China... "Extraordinary claims require extraordinary evidence," Dragan Huterer, a professor of physics at the University of Michigan in Ann Arbor, said in an email, noting that he does not feel the new research "rises to the threshold to overturn the currently favored model...." The new Vera C. Rubin Observatory, which started operating this year, is set to help settle the debate with the early 2026 launch of the Legacy Survey of Space and Time, an ultrawide and ultra-high-definition time-lapse record of the universe made by scanning the entire sky every few nights over 10 years to capture a compilation of asteroids and comets, exploding stars, and distant galaxies as they change.


  •  

Britain Sets New Record, Generating Enough Wind Power for 22 Million Homes

An anonymous reader shared this report from Sky News: A new wind record has been set for Britain, with enough electricity generated from turbines to power 22 million homes, the system operator has said. The mark of 22,711 megawatts (MW) was set at 7.30pm on 11 November... enough to keep around three-quarters of British homes powered, the National Energy System Operator (Neso) said. The country had experienced windy conditions, particularly in the north of England and Scotland... Neso has predicted that Britain could hit another milestone in the months ahead by running the electricity grid for a period entirely with zero carbon power, renewables and nuclear... Neso said wind power is now the largest source of electricity generation for the UK, and the government wants to generate almost all of the UK's electricity from low-carbon sources by 2030. "Wind accounted for 55.7 per cent of Britain's electricity mix at the time..." reports The Times: Gas provided only 12.5 per cent of the mix, with 11.3 per cent coming from imports over subsea power cables, 8 per cent from nuclear reactors, 8 per cent from biomass plants, 1.4 per cent from hydroelectric plants and 1.1 per cent from storage. Britain has about 32 gigawatts of wind farms installed, approximately half of that onshore and half offshore, according to the Wind Energy Database from the wind industry body Renewable UK. That includes five of the world's biggest offshore wind farms. The government is seeking to double onshore wind and quadruple offshore wind power by 2030 as part of its plan for clean energy.... Jane Cooper, deputy chief executive of Renewable UK, said: "On a cold, dark November evening, wind was generating enough electricity to power 80 per cent of British homes when we needed it most."


  •  

Analyzing 47,000 ChatGPT Conversations Shows Echo Chambers, Sensitive Data - and Unpredictable Medical Advice

For nearly three years OpenAI has touted ChatGPT as a "revolutionary" (and work-transforming) productivity tool, reports the Washington Post. But after analyzing 47,000 ChatGPT conversations, the Post found that users "are overwhelmingly turning to the chatbot for advice and companionship, not productivity tasks." The Post analyzed a collection of thousands of publicly shared ChatGPT conversations from June 2024 to August 2025. While ChatGPT conversations are private by default, the conversations analyzed were made public by users who created shareable links to their chats that were later preserved in the Internet Archive and downloaded by The Post. It is possible that some people didn't know their conversations would become publicly preserved online. This unique data gives us a glimpse into an otherwise black box... Overall, about 10 percent of the chats appeared to show people talking about their emotions, role-playing, or seeking social interactions with the chatbot. Some users shared highly private and sensitive information with the chatbot, such as information about their family in the course of seeking legal advice. People also sent ChatGPT hundreds of unique email addresses and dozens of phone numbers in the conversations... Lee Rainie, director of the Imagining the Digital Future Center at Elon University, said that it appears ChatGPT "is trained to further or deepen the relationship." In some of the conversations analyzed, the chatbot matched users' viewpoints and created a personalized echo chamber, sometimes endorsing falsehoods and conspiracy theories. Four of ChatGPT's answers about health problems got a failing score from a chair of medicine at the University of California, San Francisco, the Post points out. But four other answers earned a perfect score.


  •  

780,000 Windows Users Downloaded Linux Distro Zorin OS in the Last 5 Weeks

In October Zorin OS claimed it had 100,000 downloads in a little over two days following Microsoft's end of support for Windows 10. And one month later, Zorin OS developers now claim that 780,000 people downloaded it from a Windows computer in the space of a month, according to the tech news site XDA Developers. In a post on the Zorin blog, the developers of the operating system Zorin OS 18 announced that they've managed to accrue one million downloads of the operating system in a single month [since its launch on October 14]. While this is plenty impressive by itself, the developers go on to reveal that, out of that million, 78% of the downloads came from a Windows machine. That means that at least 780,000 people on Windows gave Zorin OS 18 a download... [I]t's easy to see why: the developers put a heavy emphasis on making their system the perfect home for ex-Windows users.


  •  

Physicists Reveal a New Quantum State Where Electrons Run Wild

ScienceDaily reports: Electrons can freeze into strange geometric crystals and then melt back into liquid-like motion under the right quantum conditions. Researchers identified how to tune these transitions and even discovered a bizarre "pinball" state where some electrons stay locked in place while others dart around freely. Their simulations help explain how these phases form and how they might be harnessed for advanced quantum technologies... When electrons settle into these rigid arrangements, the material undergoes a shift in its state of matter and stops conducting electricity. Instead of acting like a metal, it behaves as an insulator. This unusual behavior provides scientists with valuable insight into how electrons interact and has opened the door to advances in quantum computing, high-performance superconductors used in energy and medical imaging, innovative lighting systems, and extremely precise atomic clocks... [Florida State University assistant professor Cyprian Lewandowski said] "Here, it turns out there are other quantum knobs we can play with to manipulate states of matter, which can lead to impressive advances in experimental research."

Read more of this story at Slashdot.

  •  

Tiny 'Micro-Robots' in your Bloodstream Could Deliver Drugs with Greater Precision

The Washington Post reports: Scientists in Switzerland have created a robot the size of a grain of sand that is controlled by magnets and can deliver drugs to a precise location in the human body, a breakthrough aimed at reducing the severe side effects that stop many medicines from advancing in clinical trials... "I think surgeons are going to look at this," [said Bradley J. Nelson, an author of the paper in Science describing the discovery and a professor of robotics and intelligent systems at ETH Zurich]. "I'm sure they're going to have a lot of ideas on how to use" the microrobot. The capsule, which is steered by magnets, might also be useful in treating aneurysms, very aggressive brain cancers, and abnormal connections between arteries and veins known as arteriovenous malformations, Nelson said. The capsules have been tested successfully in pigs, which have similar vasculature to humans, and in silicone models of the blood vessels in humans and animals... Nelson said drug-ferrying microrobots of this kind may be three to five years from being tested in clinical trials. The problem faced by many drugs under development is that they spread throughout the body instead of going only to the area in need... A major cause of side effects in patients is medications traveling to parts of the body that don't need them. The capsules developed in Switzerland, however, can be maneuvered into precise locations by a surgeon using a tool not that different from a PlayStation controller. The navigation system involves six electromagnetic coils positioned around the patient, each about 8 to 10 inches in diameter... The capsules are made of materials that have been found safe for people in other medical tools... When the capsule reaches its destination in the body, "we can trigger the capsule to dissolve," Nelson said.

Read more of this story at Slashdot.

  •  

How Should the Linux Kernel Handle AI-Generated Contributions?

Linux kernel maintainers "are grappling with how to integrate AI-generated contributions without compromising the project's integrity," reports WebProNews: The latest push comes from a proposal by Sasha Levin, a prominent kernel developer at NVIDIA, who has outlined guidelines for tool-generated submissions. Posted to the kernel mailing list, these guidelines aim to standardize how AI-assisted patches are handled. According to Phoronix, the v3 iteration of the proposal [posted by Intel engineer Dave Hansen] emphasizes transparency and accountability, requiring developers to disclose AI involvement in their contributions. This move reflects broader industry concerns about the quality and copyright implications of machine-generated code. Linus Torvalds, the creator of Linux, has weighed in on the debate, advocating for treating AI tools no differently than traditional coding aids. As reported by heise online, Torvalds sees no need for special copyright treatment for AI contributions, stating that they should be viewed as extensions of the developer's work. This perspective aligns with the kernel's pragmatic approach to innovation. The proposal, initially put forward by Levin in July 2025, includes a 'Co-developed-by' tag for AI-assisted patches, ensuring credit and traceability. OSTechNix details how tools like GitHub Copilot and Claude are specifically addressed, with configurations to guide their use in kernel development... ZDNET warns that without official policy, AI could 'creep' into the kernel and cause chaos... The New Stack provides insight into how AI is already assisting kernel maintainers with mundane tasks. According to The New Stack, large language models (LLMs) are being used like 'novice interns' for drudgery work, freeing up experienced developers for complex problems... The Linux kernel's approach could set precedents for other open-source projects. With AI integration accelerating, projects like those in the Linux Foundation are watching closely... 
Recent kernel releases, such as 6.17.7, include performance improvements that indirectly support AI applications, as noted in Linux Compatible.
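The disclosure mechanism described above can be sketched as a commit message carrying the proposed trailer. This is an illustration only: the subject line, names, and the exact way the tool is identified are hypothetical, and the final format is whatever the merged guidelines specify.

```
subsystem: short description of the change

Longer explanation of what the patch does and why it is needed.

Co-developed-by: Claude [AI coding assistant]
Signed-off-by: Jane Developer <jane@example.org>
```

Note that under the existing kernel convention a human Co-developed-by is paired with that person's own Signed-off-by; part of what the proposal settles is how an AI tool, which cannot certify the Developer Certificate of Origin, is credited without signing off.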

Read more of this story at Slashdot.

  •  

Bitcoin Erases Year's Gain as Crypto Bear Market Deepens

"Just a little more than a month after reaching an all-time high, Bitcoin has erased the more than 30% gain registered since the start of the year..." reports Bloomberg: The dominant cryptocurrency fell below US$93,714 on Sunday, pushing the price beneath the closing level reached at the end of last year, when financial markets were rallying following President Donald Trump's election victory. Bitcoin soared to a record US$126,251 on Oct 6, only to begin tumbling four days later after unexpected comments on tariffs by Trump sent markets into a tailspin worldwide. "The general market is risk-off," said Matthew Hougan, the San Francisco-based chief investment officer for Bitwise Asset Management. "Crypto was the canary in the coal mine for that, it was the first to flinch." Over the past month, many of the biggest buyers — from exchange-traded fund allocators to corporate treasuries — have quietly stepped back, depriving the market of the flow-driven support that helped propel the token to records earlier this year. For much of the year, institutions were the backbone of Bitcoin's legitimacy and its price. ETFs as a cohort took in more than US$25 billion, according to Bloomberg data, pushing assets as high as roughly US$169 billion. Their steady allocation flows helped reframe the asset as a portfolio diversifier — a hedge against inflation, monetary debasement and political disarray. But that narrative — always tenuous — is fraying afresh, leaving the market exposed to something quieter but no less destabilising: disengagement. "The selloff is a confluence of profit-taking by LTHs [long-term holders], institutional outflows, macro uncertainty, and leveraged longs getting wiped out," said Jake Kennis, senior research analyst at Nansen. "What is clear is that the market has temporarily chosen a downward direction after a long period of consolidation/ranging..." 
Boom and bust cycles have been a constant since Bitcoin burst into the mainstream consciousness with a more than 13,000% surge in 2017, only to be followed by a plunge of almost 75% the following year... Bitcoin has whipsawed investors through the year, dropping to as low as US$74,400 in April as Trump unveiled his tariffs, before rebounding to record highs ahead of the latest retreat... The market downturn has been even tougher on smaller, less liquid tokens that traders often gravitate toward because of their higher volatility and typical outperformance during rallies. A MarketVector index tracking the bottom half of the largest 100 digital assets is down around 60% this year.
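The percentages in the excerpt can be checked directly from the two price levels it cites. A quick sketch; the dollar figures are the article's, the helper name is mine:

```rust
/// Percent change from `from` to `to` (negative for a decline).
fn pct_change(from: f64, to: f64) -> f64 {
    (to - from) / from * 100.0
}

fn main() {
    let last_year_close = 93_714.0; // level Bitcoin fell below on Sunday
    let record_high = 126_251.0;    // Oct 6 all-time high

    // The year's gain that has now been erased: about +34.7%...
    let gain = pct_change(last_year_close, record_high);
    // ...and the slide from the record back to that level: about -25.8%.
    let drop = pct_change(record_high, last_year_close);

    println!("gain erased: {gain:.1}%, drop from peak: {drop:.1}%");
    assert!(gain > 30.0); // consistent with "more than 30% gain registered"
    assert!((drop + 25.8).abs() < 0.1);
}
```

So erasing the year's entire gain required only about a 26% fall from the record, because the gain and the drawdown are measured from different bases.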

Read more of this story at Slashdot.

  •  

More Tech Moguls Want to Build Data Centers in Outer Space

"To be clear, the current economics of space-based data centers don't make sense," writes the Wall Street Journal. "But they could in the future, perhaps as soon as a decade or so from now, according to an analysis by Phil Metzger, a research professor at the University of Central Florida and formerly of the National Aeronautics and Space Administration." "Space enthusiasts (like me) have long sought a business case to enable human migration beyond our home world," he posted on X amid the new hype. "I think AI servers in space is the first real business case that will lead to many more...." The argument essentially boils down to the belief that AI's needs are eventually going to grow so great that we need to move to outer space. There the sun's power can be more efficiently harvested. In space, the sun's rays can be direct and constant for solar panels to collect — no clouds, no rainstorms, no nighttime. Demands for cooling could also be cut because of the vacuum of space. Plus, there aren't those pesky regulations that executives like to complain about, slowing construction of new power plants to meet the data-center needs. In space, no one can hear the Nimbys scream. "We will be able to beat the cost of terrestrial data centers in space in the next couple of decades," Jeff Bezos said at a tech conference last month. "Space will end up being one of the places that keeps making Earth better." It's still early days. At Alphabet, Google's plans sound almost conservative. The search-engine company in recent days announced Project Suncatcher, which it describes as a moonshot project to scale machine learning in space. It plans to launch two prototype satellites by early 2027 to test its hardware in orbit. "Like any moonshot, it's going to require us to solve a lot of complex engineering challenges," Google CEO Sundar Pichai posted on social media. Nvidia, too, has announced a partnership with startup Starcloud to work on space-based data centers. 
Not to be outdone, Elon Musk has been painting his own updated vision for the heavens... in recent weeks he has been talking more about how he can use his spaceships to deploy new versions of his solar-powered Starlink satellites equipped with high-speed lasers to build out in-space data centers. On Friday, Musk reiterated that those AI satellites would be able to generate 100 gigawatts of solar power, roughly a quarter, he said, of what the U.S. consumes on average in a year. "We have a plan mapped out to do it," he told investor Ron Baron during an event. "It gets crazy." Previously, he has suggested he was four to five years away from that ability. He's also touted even wilder ideas, saying on X that 100 terawatts a year "is possible from a lunar base producing solar-powered AI satellites locally and accelerating them to escape velocity with a mass driver." Simply put, he's suggesting a moon base will crank out satellites and throw them into orbit with a catapult. And those satellites' solar panels would generate 100,000 gigawatts a year. "I think we'll see intelligence continue to scale all the way up to where...most of the power of the sun is harnessed for compute," Musk told a tech conference in September.
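Musk's "roughly a quarter" claim can be sanity-checked by converting 100 gigawatts of constant output into annual energy. The U.S. consumption figure of about 4,100 TWh per year is my own rough assumption, not from the article:

```rust
/// Annual energy (TWh) delivered by a constant power draw in gigawatts.
fn gw_to_twh_per_year(gw: f64) -> f64 {
    gw * 8_760.0 / 1_000.0 // 8,760 hours per year; GWh -> TWh
}

fn main() {
    let satellite_twh = gw_to_twh_per_year(100.0); // 876 TWh/yr
    let us_twh = 4_100.0; // assumed U.S. annual electricity consumption
    let share = satellite_twh / us_twh;
    println!("{satellite_twh} TWh/yr, about {:.0}% of U.S. use", share * 100.0);
    assert!((satellite_twh - 876.0).abs() < 1e-9);
    assert!(share > 0.20 && share < 0.25); // "roughly a quarter" holds up
}
```

On those assumptions, 100 GW running continuously is about 21% of U.S. electricity consumption, so the "roughly a quarter" framing is in the right ballpark.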

Read more of this story at Slashdot.

  •  

Microsoft Executives Discuss How AI Will Change Windows, Programming -- and Society

"Windows is evolving into an agentic OS," Microsoft's president of Windows Pavan Davuluri posted on X.com, "connecting devices, cloud, and AI to unlock intelligent productivity and secure work anywhere." But former Uber software engineer and engineering manager Gergely Orosz was unimpressed. "Can't see any reason for software engineers to choose Windows with this weird direction they are doubling down on. So odd because Microsoft has building dev tools in their DNA... their OS doesn't look like anything a builder who wants OS control could choose. Mac or Linux it is for devs." Davuluri "has since disabled replies on his original post..." notes the blog Windows Central, "which some people viewed as an attempt to shut out negative feedback." But he also replied to that comment... Davuluri says "we care deeply about developers. We know we have work to do on the experience, both on the everyday usability, from inconsistent dialogs to power user experiences. When we meet as a team, we discuss these pain points and others in detail, because we want developers to choose Windows..." The good news is Davuluri has confirmed that Microsoft is listening, and is aware of the backlash it's receiving over the company's obsession with AI in Windows 11. That doesn't mean the company is going to stop with adding AI to Windows, but it does mean we can also expect Microsoft to focus on the other things that matter too, such as stability and power user enhancements. Elsewhere on X.com, Microsoft CEO Satya Nadella shared his own thoughts on "the net benefit of the AI platform wave." The Times of India reports: Nadella said tech companies should focus on building AI systems that create more value for the people and businesses using them, not just for the companies that make the technology. 
He cited Bill Gates to make the same point: "A platform is when the economic value of everybody that uses it exceeds the value of the company that creates it." Tesla CEO Elon Musk responded to Nadella's post with a facepalm emoji. Nadella said this idea matters even more during the current AI boom, where many firms risk giving away too much of their own value to big tech platforms. "The real question is how to empower every company out there to build their own AI-native capabilities," he wrote. Nadella pointed to Microsoft's partnership with OpenAI as an example of the industry moving beyond a zero-sum mindset... [He also cited Microsoft's "work to bring AMD into the fleet."] More from Satya Nadella's post: Thanks to AI, the [coding] category itself has expanded and may ultimately become one of the largest software categories. I don't ever recall any analyst ever asking me about how much revenue Visual Studio makes! But now everyone is excited about AI coding tools. This is another aspect of positive sum, when the category itself is redefined and the pie becomes 10x what it was! With GitHub Copilot we compete for our share and with GitHub and Agent HQ we also provide a platform for others. Of course, the real test of this era won't be when another tech company breaks a valuation record. It will be when the overall economy and society themselves reach new heights. When a pharma company uses AI in silico to bring a new therapy to market in one year instead of twelve. When a manufacturer uses AI to redesign a supply chain overnight. When a teacher personalizes lessons for every student. When a farmer predicts and prevents crop failure. That's when we'll know the system is working. Let us move beyond zero-sum thinking and the winner-take-all hype and focus instead on building broad capabilities that harness the power of this technology to achieve local success in each firm, which then leads to broad economic growth and societal benefits. 
And every firm needs to make sure they have control of their own destiny and sovereignty vs just a press release with a Tech/AI company or worse leak all their value through what may seem like a partnership, except it's extractive in terms of value exchange in the long run.

Read more of this story at Slashdot.

  •  

Chinese Astronauts Return From Their Space Station After Delay Blamed on Space Debris Damage

"Three Chinese astronauts returned from their nation's space station Friday," reports the Associated Press, "after more than a week's delay because the return capsule they had planned to use was damaged, likely from being hit by space debris." The team left their Shenzhou-20 spacecraft in orbit and came back using the recently arrived Shenzhou-21, which had ferried a three-person replacement crew to the station, China's Manned Space Agency said. The original return plan was scrapped because a window in the Shenzhou-20 capsule had tiny cracks, most likely caused by impact from space debris, the space agency said Friday... Their return was delayed for nine days, and their 204-day stay in space was the longest for any astronaut at China's space station... China developed the Tiangong space station after the country was excluded from the International Space Station over U.S. national security concerns. China's space program is controlled by its military.

Read more of this story at Slashdot.

  •  

Rust in Android: More Memory Safety, Fewer Revisions, Fewer Rollbacks, Shorter Reviews

Android's security team published a blog post this week about their experience using Rust. Its title? "Move fast and fix things." Last year, we wrote about why a memory safety strategy that focuses on vulnerability prevention in new code quickly yields durable and compounding gains. This year we look at how this approach isn't just fixing things, but helping us move faster. The 2025 data continues to validate the approach, with memory safety vulnerabilities falling below 20% of total vulnerabilities for the first time. We adopted Rust for its security and are seeing a 1000x reduction in memory safety vulnerability density compared to Android's C and C++ code. But the biggest surprise was Rust's impact on software delivery. With Rust changes having a 4x lower rollback rate and spending 25% less time in code review, the safer path is now also the faster one... Data shows that Rust code requires fewer revisions. This trend has been consistent since 2023. Rust changes of a similar size need about 20% fewer revisions than their C++ counterparts... In a self-reported survey from 2022, Google software engineers reported that Rust is both easier to review and more likely to be correct. The hard data on rollback rates and review times validates those impressions. Historically, security improvements often came at a cost. More security meant more process, slower performance, or delayed features, forcing trade-offs between security and other product goals. The shift to Rust is different: we are significantly improving security and key development efficiency and product stability metrics. With Rust support now mature for building Android system services and libraries, we are focused on bringing its security and productivity advantages elsewhere. Android's 6.12 Linux kernel is our first kernel with Rust support enabled and our first production Rust driver. 
More exciting projects are underway, such as our ongoing collaboration with Arm and Collabora on a Rust-based kernel-mode GPU driver. [They've also been deploying Rust in firmware for years, and Rust "is ensuring memory safety from the ground up in several security-critical Google applications," including Chromium's parsers for PNG, JSON, and web fonts.] 2025 was the first year more lines of Rust code were added to Android than lines of C++ code...
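The class of bug behind those vulnerability-density numbers can be illustrated with a toy example (my own sketch, not Android code): in C++, a pointer into a container can silently dangle after the container is mutated, while Rust's borrow checker rejects the equivalent pattern at compile time.

```rust
// Find the first even number, returning a reference whose lifetime is
// tied to the input slice. The compiler tracks that borrow everywhere.
fn first_even(values: &[i32]) -> Option<&i32> {
    values.iter().find(|&&v| v % 2 == 0)
}

fn main() {
    let mut values = vec![1, 3, 4, 5];
    let found = first_even(&values);
    // values.push(6); // compile error: cannot borrow `values` as mutable
    //                 // while `found` (an immutable borrow) is still live.
    //                 // The C++ analogue is a latent use-after-free,
    //                 // caught here at build time instead of in production.
    assert_eq!(found, Some(&4)); // `found` is used here, keeping the borrow live
    values.push(6); // fine: the borrow ended after its last use above
    assert_eq!(values.len(), 5);
}
```

Because these errors surface in the developer's build rather than in review or after shipping, this mechanism is consistent with the fewer-revisions and lower-rollback numbers the post reports.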

Read more of this story at Slashdot.

  •