Polar Bears Are Rewiring Their Own Genetics to Survive a Warming Climate

"Polar bears are still sadly expected to go extinct this century, with two-thirds of the population gone by 2050," says the lead researcher on a new study from the University of East Anglia in Britain. But their research also suggests polar bears "are rapidly rewiring their own genetics in a bid to survive," reports NBC News, in "the first documented case of rising temperatures driving genetic change in a mammal."

"I believe our work really does offer a glimmer of hope — a window of opportunity for us to reduce our carbon emissions to slow down the rate of climate change and to give these bears more time to adapt to these stark changes in their habitats," [the lead author of the study told NBC News].

Building on earlier University of Washington research, [lead researcher] Godden's team analyzed blood samples from polar bears in northeastern and southeastern Greenland. In the slightly warmer south, they found that genes linked to heat stress, aging and metabolism behaved differently from those in northern bears. "Essentially this means that different groups of bears are having different sections of their DNA changed at different rates, and this activity seems linked to their specific environment and climate," Godden said in a university press release. She said this shows, for the first time, that a unique group of one species has been forced to "rewrite their own DNA," adding that this process can be considered "a desperate survival mechanism against melting sea ice...."

Researchers say warming ocean temperatures have reduced the vital sea ice platforms the bears use to hunt seals, leading to isolation and food scarcity. This led to genetic changes as the animals' digestive systems adapt to a diet of plants and low fats in the absence of prey, Godden told NBC News.

Read more of this story at Slashdot.

America Adds 11.7 GW of New Solar Capacity in Q3 - Third Largest Quarter on Record

America's solar industry "just delivered another huge quarter," reports Electrek, "installing 11.7 gigawatts (GW) of new capacity in Q3 2025. That makes it the third-largest quarter on record and pushes total solar additions this year past 30 GW..."

According to the new "US Solar Market Insight Q4 2025" report from Solar Energy Industries Association (SEIA) and Wood Mackenzie, 85% of all new power added to the grid during the first nine months of the Trump administration came from solar and storage.

And here's the twist: Most of that growth — 73% — happened in red [Republican-leaning] states. Eight of the top 10 states for new installations fall into that category, including Texas, Indiana, Florida, Arizona, Ohio, Utah, Kentucky, and Arkansas...

Two new solar module factories opened this year in Louisiana and South Carolina, adding a combined 4.7 GW of capacity. That brings the total new U.S. module manufacturing capacity added in 2025 to 17.7 GW. With a new wafer facility coming online in Michigan in Q3, the U.S. can now produce every major component of the solar module supply chain...

SEIA also noted that, following an analysis of EIA data, it found that more than 73 GW of solar projects across the U.S. are stuck in permitting limbo and at risk of politically motivated delays or cancellations.

Purdue University Approves New AI Requirement For All Undergrads

Nonprofit Code.org released its 2025 State of AI & Computer Science Education report this week with a state-by-state analysis of school policies, complaining that "0 out of 50 states require AI+CS for graduation."

But meanwhile, at the college level, "Purdue University will begin requiring that all of its undergraduate students demonstrate basic competency in AI," writes former college president Michael Nietzel, "starting with freshmen who enter the university in 2026." The new "AI working competency" graduation requirement was approved by the university's Board of Trustees at its meeting on December 12...

The requirement will be embedded into every undergraduate program at Purdue, but it won't be done in a "one-size-fits-all" manner. Instead, the Board is delegating authority to the provost, who will work with the deans of all the academic colleges to develop discipline-specific criteria and proficiency standards for the new campus-wide requirement. [Purdue president] Chiang said students will have to demonstrate a working competence through projects that are tailored to the goals of individual programs. The intent is not to require students to take more credit hours, but to integrate the new AI expectation into existing academic requirements...

While the news release claimed that Purdue may be the first school to establish such a requirement, at least one other university has introduced its own institution-wide expectation that all its graduates acquire basic AI skills. Earlier this year, The Ohio State University launched an AI Fluency initiative, infusing basic AI education into core undergraduate requirements and majors, with the goal of helping students understand and use AI tools — no matter their major.
Purdue wants its new initiative to help graduates:

— Understand and use the latest AI tools effectively in their chosen fields, including being able to identify the key strengths and limits of AI technologies;

— Recognize and communicate clearly about AI, including developing and defending decisions informed by AI, as well as recognizing the influence and consequences of AI in decision-making;

— Adapt to and work with future AI developments effectively.

Repeal Section 230 and Its Platform Protections, Urges New Bipartisan US Bill

U.S. Senator Sheldon Whitehouse said Friday he was moving to file a bipartisan bill to repeal Section 230 of America's Communications Decency Act. "The law prevents most civil suits against users or services that are based on what others say," explains an EFF blog post. "Experts argue that a repeal of Section 230 could kill free speech on the internet," writes LiveMint — though America's last two presidents both supported a repeal:

During his first presidency, U.S. President Donald Trump called to repeal the law and signed an executive order attempting to curb some of its protections, though it was challenged in court. Subsequently, former President Joe Biden also voiced his opinion against the law.

An EFF blog post explains the case for Section 230:

Congress passed this bipartisan legislation because it recognized that promoting more user speech online outweighed potential harms. When harmful speech takes place, it's the speaker that should be held responsible, not the service that hosts the speech... Without Section 230, the Internet is different. In Canada and Australia, courts have allowed operators of online discussion groups to be punished for things their users have said. That has reduced the amount of user speech online, particularly on controversial subjects. In non-democratic countries, governments can directly censor the internet, controlling the speech of platforms and users.

If the law makes us liable for the speech of others, the biggest platforms would likely become locked-down and heavily censored. The next great websites and apps won't even get started, because they'll face overwhelming legal risk to host users' speech.

But "I strongly believe that Section 230 has long outlived its use," Senator Whitehouse said this week, calling Section 230 "a real vessel for evil that needs to come to an end."
"The laws that Section 230 protect these big platforms from are very often laws that go back to the common law of England, that we inherited when this country was initially founded. I mean, these are long-lasting, well-tested, important legal constraints that have — they've met the test of time, not by the year or by the decade, but by the century.

"And yet because of this crazy Section 230, these ancient and highly respected doctrines just don't reach these people. And it really makes no sense, that if you're an internet platform you get treated one way; you do the exact same thing and you're a publisher, you get treated a completely different way.

"And so I think that the time has come.... It really makes no sense... [Testimony before the committee] shows how alone and stranded people are when they don't have the chance to even get justice. It's bad enough to have to live through the tragedy... But to be told by a law of Congress, you can't get justice because of the platform — not because the law is wrong, not because the rule is wrong, not because this is anything new — simply because the wrong type of entity created this harm."

Time Magazine's 'Person of the Year': the Architects of AI

Time magazine used its 98th annual "Person of the Year" cover to "recognize a force that has dominated the year's headlines, for better or for worse. For delivering the age of thinking machines, for wowing and worrying humanity, for transforming the present and transcending the possible, the Architects of AI are TIME's 2025 Person of the Year."

One cover illustration shows eight AI executives sitting precariously on a beam high above the city, while Time's 6,700-word article promises "the story of how AI changed our world in 2025, in new and exciting and sometimes frightening ways. It is the story of how [Nvidia CEO] Huang and other tech titans grabbed the wheel of history, developing technology and making decisions that are reshaping the information landscape, the climate, and our livelihoods."

Time describes them betting on "one of the biggest physical infrastructure projects of all time," mentioning all the usual worries — datacenters' energy consumption, chatbot psychosis, predictions of "wiping out huge numbers of jobs" and the possibility of an AI stock market bubble. (Although "The drumbeat of warning that advanced AI could kill us all has mostly quieted.") But it also notes AI's potential to jumpstart innovation (and economic productivity):

This year, the debate about how to wield AI responsibly gave way to a sprint to deploy it as fast as possible. "Every industry needs it, every company uses it, and every nation needs to build it," Huang tells TIME in a 75-minute interview in November, two days after announcing that Nvidia, the world's first $5 trillion company, had once again smashed Wall Street's earnings expectations. "This is the single most impactful technology of our time..."

The risk-averse are no longer in the driver's seat. Thanks to Huang, Son, Altman, and other AI titans, humanity is now flying down the highway, all gas no brakes, toward a highly automated and highly uncertain future.
Perhaps Trump said it best, speaking directly to Huang with a jovial laugh in the U.K. in September: "I don't know what you're doing here. I hope you're right."

Trump Ban on Wind Energy Permits 'Unlawful', Court Rules

A January order blocking wind energy projects in America has now been vacated by a U.S. judge and declared unlawful, reports the Associated Press:

[Judge Saris of the U.S. district court for the district of Massachusetts] ruled in favor of a coalition of state attorneys general from 17 states and Washington DC, led by Letitia James, New York's attorney general, that challenged President Trump's day one order that paused leasing and permitting for wind energy projects...

The coalition that opposed Trump's order argued that Trump does not have the authority to halt project permitting, and that doing so jeopardizes the states' economies, energy mix, public health and climate goals. The coalition includes Arizona, California, Colorado, Connecticut, Delaware, Illinois, Maine, Maryland, Massachusetts, Michigan, Minnesota, New Jersey, New Mexico, New York, Oregon, Rhode Island, Washington state and Washington DC. They say they have invested hundreds of millions of dollars collectively to develop wind energy and even more on upgrading transmission lines to bring wind energy to the electrical grid...

Wind is the United States' largest source of renewable energy, providing about 10% of the electricity generated in the nation, according to the American Clean Power Association.

But the BBC quotes Timothy Fox, managing director at the Washington, DC-based research firm ClearView Energy Partners, as saying he doesn't expect the ruling to reinvigorate the industry: "It's more symbolic than substantive," he said. "All the court is saying is ... you need to go back to work and consider these applications. What does that really mean?" he said. Officials could still deny permits or bog applications down in lengthy reviews, he noted.

New Rule Forbids GNOME Shell Extensions Made Using AI-Generated Code

An anonymous reader shared this report from Phoronix: Due to the growing number of GNOME Shell extensions looking to appear on extensions.gnome.org that were generated using AI, it's now prohibited. The new rule in their guidelines notes that AI-generated code will be explicitly rejected:

"Extensions must not be AI-generated. While it is not prohibited to use AI as a learning aid or a development tool (i.e. code completions), extension developers should be able to justify and explain the code they submit, within reason. Submissions with large amounts of unnecessary code, inconsistent code style, imaginary API usage, comments serving as LLM prompts, or other indications of AI-generated output will be rejected."

In a blog post, GNOME developer Javad Rahmatzadeh explains that "Some devs are using AI without understanding the code..."

Is the R Programming Language Surging in Popularity?

The R programming language "is sometimes frowned upon by 'traditional' software engineers," says the CEO of software quality services vendor Tiobe, "due to its unconventional syntax and limited scalability for large production systems." But he says it "continues to thrive at universities and in research-driven industries," and "for domain experts, it remains a powerful and elegant tool." Yet it's now gaining more popularity as statistics and large-scale data visualization become important (a trend he also sees reflected in the rise of Wolfram/Mathematica).

That's according to December's edition of his TIOBE Index, which attempts to rank the popularity of programming languages based on search-engine results for courses, third-party vendors, and skilled engineers. InfoWorld explains:

In the December 2025 index, published December 7, R ranks 10th with a 1.96% rating. R has cracked the Tiobe index's top 10 before, such as in April 2020 and July 2020, but not in recent years. The rival Pypl Popularity of Programming Language Index, meanwhile, has R ranked fifth this month with a 5.84% share.

"Programming language R is known for fitting statisticians and data scientists like a glove," said Paul Jansen, CEO of software quality services vendor Tiobe, in a bulletin accompanying the December index... Although data science rival Python has eclipsed R in terms of general adoption, Jansen said R has carved out a solid and enduring niche, excelling at rapid experimentation, statistical modeling, and exploratory data analysis. "We have seen many Tiobe index top 10 entrants rising and falling," Jansen wrote. "It will be interesting to see whether R can maintain its current position."

"Python remains ahead at 23.64%," notes TechRepublic, "while the familiar chase group behind it holds steady for the moment. The real movement comes deeper in the list, where SQL edges upward, R rises to the top 10, and Delphi/Object Pascal slips away...
SQL climbs from tenth to eighth at 2.10%, adding a small +0.11% that's enough to move it upward in a tightly packed section of the table. Perl holds ninth at 1.97%, strengthened by a +1.33% gain that extends its late-year resurgence."

It's interesting to see how TIOBE's rankings compare with PYPL's (which ranks languages based solely on how often language tutorials are searched on Google):

 #   TIOBE          PYPL
 1   Python         Python
 2   C              C/C++
 3   C++            Objective-C
 4   Java           Java
 5   C#             R
 6   JavaScript     JavaScript
 7   Visual Basic   Swift
 8   SQL            C#
 9   Perl           PHP
10   R              Rust

Despite their different methodologies, both lists put Python at #1, Java at #4, and JavaScript at #6.

System76 Launches First Stable Release of COSMIC Desktop and Pop!_OS 24.04 LTS

This week System76 launched the first stable release of its Rust-based COSMIC desktop environment. Announced in 2021, it's designed for all GNU/Linux distributions — and it's shipping with Pop!_OS 24.04 LTS (based on Ubuntu 24.04 LTS). An anonymous reader shared this report from 9to5Linux:

Previous Pop!_OS releases used a version of the COSMIC desktop that was based on the GNOME desktop environment. However, System76 wanted to create a new desktop environment from scratch while keeping the same familiar interface and user experience built for efficiency and fun. This means that some GNOME apps have been replaced by COSMIC apps, including COSMIC Files instead of Nautilus (Files), COSMIC Terminal instead of GNOME Terminal, COSMIC Text Editor instead of GNOME Text Editor, and COSMIC Media Player instead of Totem (Video Player). Also, the Pop!_Shop graphical package manager used in previous Pop!_OS releases has now been replaced by a new app called COSMIC Store.

"If you're ambitious enough, or maybe just crazy enough, there eventually comes a time when you realize you've reached the limits of current potential, and must create something completely new if you're to go further..." explains System76 founder/CEO Carl Richell:

For twenty years we have shipped Linux computers. For seven years we've built the Pop!_OS Linux distribution. Three years ago it became clear we had reached the limit of our current potential and had to create something new. Today, we break through that limit with the release of Pop!_OS 24.04 LTS with the COSMIC Desktop Environment. Today is special not only in that it's the culmination of over three years of work, but even more so in that System76 has built a complete desktop environment for the open source community... I hope you love what we've built for you. Now go out there and create. Push the limits, make incredible things, and have fun doing it!

'Free Software Awards' Winners Announced: Andy Wingo, Alx Sa, Govdirectory

This week the Free Software Foundation honored Andy Wingo, Alx Sa, and Govdirectory with this year's annual Free Software Awards (given to community members and groups making "significant" contributions to software freedom):

Andy Wingo is one of the co-maintainers of GNU Guile, the official extension language of the GNU operating system and the Scheme "backbone" of GNU Guix. Upon receiving the award, he stated: "Since I learned about free software, the vision of a world in which hackers freely share and build on each others' work has been a profound inspiration to me, and I am humbled by this recognition of my small efforts in the context of the Guile Scheme implementation. I thank my co-maintainer, Ludovic Courtès, for his comradery over the years: we are just building on the work of the past maintainers of Guile, and I hope that we live long enough to congratulate its many future maintainers."

The 2024 Award for Outstanding New Free Software Contributor went to Alx Sa for work on the GNU Image Manipulation Program (GIMP). When asked to comment, Alx responded: "I am honored to receive this recognition! I started contributing to the GNU Image Manipulation Program as a way to return the favor because of all the cool things it's allowed me to do. Thanks to the help and mentorship of amazing people like Jehan Pagès, Jacob Boerema, Liam Quin, and so many others, I hope I've been able to help other people do some cool new things, too."

Govdirectory was presented with this year's Award for Projects of Social Benefit, given to a project or team responsible for applying free software, or the ideas of the free software movement, to intentionally and significantly benefit society. Govdirectory provides a collaborative and fact-checked listing of government addresses, phone numbers, websites, and social media accounts, all of which can be viewed with free software and under a free license, allowing people to always reach their representatives in freedom...
The FSF plans to further highlight the Free Software Award winners in a series of events scheduled for the new year to celebrate their contributions to free software.

Applets Are Officially Going, But Java In the Browser Is Better Than Ever

"The entire java.applet package has been removed from JDK 26, which will release in March 2026," notes Inside Java. But long-time Slashdot reader AirHog links to this blog post reminding us that "Applets Are Officially Gone, But Java In The Browser Is Better Than Ever."

This brings to an official end the era of applets, which began in 1996. However, for years it has been possible to build modern, interactive web pages in Java without needing applets or plugins. TeaVM provides fast, performant, and lightweight tooling to transpile Java to run natively in the browser...

TeaVM, at its heart, transpiles Java code into JavaScript (or, these days, WASM). However, in order for Java code to be useful for web apps, much more is required, and TeaVM delivers. It includes a minifier, to shrink the generated code and obfuscate the intent, to complicate reverse-engineering. It has a tree-shaker to eliminate unused methods and classes, keeping your app download compact. It packages your code into a single file for easy distribution and inclusion in your HTML page. It also includes wrappers for all popular browser APIs, so you can invoke them from your Java code easily, with full IDE assistance and auto-correct.

The blog post also touts Flavour, an open-source framework "for coding, packaging, and optimizing single-page apps implemented in Java... a full front-end toolkit with templates, routing, components, and more" to "build your modern single-page app using 100% Java."

Startup Successfully Uses AI to Find New Geothermal Energy Reservoirs

A Utah-based startup announced last week it used AI to locate a 250-degree Fahrenheit geothermal reservoir, reports CNN. It'll start producing electricity in three to five years, the company estimates — and at least one geologist believes AI could be an exciting "gamechanger" for the geothermal industry.

[Startup Zanskar Geothermal & Minerals] named it "Big Blind," because this kind of site — which has no visual indication of its existence, no hot springs or geysers above ground, and no history of geothermal exploration — is known as a "blind" system. It's the first industry-discovered blind site in more than three decades, said Carl Hoiland, co-founder and CEO of Zanskar. "The idea that geothermal is tapped out has been the narrative for decades," but that's far from the case, he told CNN. He believes there are many more hidden sites across the Western U.S.

Geothermal energy is a potential gamechanger. It offers the tantalizing prospect of a huge source of clean energy to meet burgeoning demand. It's near limitless, produces scarcely any climate pollution, and is constantly available, unlike wind and solar, which are cheap but rely on the sun shining and the wind blowing. The problem, however, has been how to find and scale it. It requires a specific geology: underground reservoirs of hot water or steam, along with porous rocks that allow the water to move through them, heat up, and be brought to the surface where it can power turbines...

The AI models Zanskar uses are fed information on where blind systems already exist. This data is plentiful as, over the last century and more, humans have accidentally stumbled on many around the world while drilling for other resources such as oil and gas. The models then scour huge amounts of data — everything from rock composition to magnetic fields — to find patterns that point to the existence of geothermal reserves. AI models have "gotten really good over the last 10 years at being able to pull those types of signals out of noise," Hoiland said...

Zanskar's discovery "is very significant," said James Faulds, a professor of geosciences at the Nevada Bureau of Mines and Geology.... Estimates suggest over three-quarters of US geothermal resources are blind, Faulds told CNN. "Refining methods to find such systems has the potential to unleash many tens and perhaps hundreds of gigawatts in the western US alone," he said... Big Blind is the company's first blind site discovery, but it's the third site it has drilled and hit commercial resources. "We expect dozens, to eventually hundreds, of new sites to be coming to market," Hoiland said.... Hoiland says Zanskar's work shows conventional geothermal still has huge untapped potential.

Thanks to long-time Slashdot reader schwit1 for sharing the article.

Firefox Survey Finds Only 16% Feel In Control of Their Privacy Choices Online

Choosing your browser "is one of the most important digital decisions you can make, shaping how you experience the web, protect your data, and express yourself online," says the Firefox blog. They've urged readers to "take a stand for independence and control in your digital life." But they also recently polled 8,000 adults in France, Germany, the UK and the U.S. on "how they navigate choice and control both online and offline" (attending in-person events in Chicago, Berlin, LA, Munich, San Diego, and Stuttgart):

The survey, conducted by research agency YouGov, showcases a tension between people's desire to have control over their data and digital privacy, and the reality of the internet today — a reality defined by Big Tech platforms that make it difficult for people to exercise meaningful choice online:

— Only 16% feel in control of their privacy choices (highest in Germany at 21%)

— 24% feel it's "too late" because Big Tech already has too much control or knows too much about them. And 36% said the feeling of Big Tech companies knowing too much about them is frustrating — highest among respondents in the U.S. (43%) and the UK (40%)

— Practices respondents said frustrated them were Big Tech using their data to train AI without their permission (38%) and tracking their data without asking (47%; highest in the U.S. — 55% and lowest in France — 39%)

And from our existing research on browser choice, we know more about how defaults that are hard to change and confusing settings can bury alternatives, limiting people's ability to choose for themselves — the real problem that fuels these dynamics. Taken together, our new and existing insights could also explain why, when asked which actions feel like the strongest expressions of their independence online, choosing not to share their data (44%) was among the top three responses in each country (46% in the UK; 45% in the U.S.; 44% in France; 39% in Germany)...
We also see a powerful signal in how people think about choosing the communities and platforms they join — for 29% of respondents, this was one of their top three expressions of independence online.

"For Firefox, community has always been at the heart of what we do," says their VP of Global Marketing, "and we'll keep fighting to put real choice and control back in people's hands so the web once again feels like it belongs to the communities that shape it." At TwitchCon in San Diego, Firefox even launched a satirical new online card game with a privacy theme called Data War.

The World's Electric Car Sales Have Spiked 21% So Far in 2025

Electrek reports:

EV and battery supply chain research specialists Benchmark Mineral Intelligence reports that 2.0 million electric vehicles were sold globally in November 2025, bringing global EV sales to 18.5 million units year-to-date. That's a 21% increase compared to the same period in 2024. Europe was the clear growth leader in November, while North America continued to lag following the expiration of US EV tax credits. China, meanwhile, remains the world's largest EV market by a wide margin.

Europe's EV market jumped 36% year-over-year in November 2025, with BEV sales up 35% and plug-in hybrid (PHEV) sales rising 39%. That brings Europe's total EV sales to 3.8 million units for the year so far, up 33% compared to January-November 2024...

In North America, EV sales in the US did tick up month-over-month in November, following a sharp October drop after federal tax credits expired on September 30, 2025. Brands including Kia (up 30%), Hyundai (up 20%), Honda (up 11%), and Subaru (232 Solterra sales versus just 13 the month before) all saw gains, but overall volumes remain below levels when the federal tax credit was still available... [North America shows a -1% drop in EV sales from January to November 2025 vs. January to November 2024]

Year-to-date, EV sales in China are up 19%, with 11.6 million units sold. One of the biggest headlines out of China is exports. BYD reported a record 131,935 EV exports in November, blowing past its previous high of around 90,000 units set in June. BYD sales in Europe have jumped more than fourfold this year to around 200,000 vehicles, doubled in Southeast Asia, and climbed by more than 50% in South America...

"Overall, EV demand remains resilient, supported by expanding model ranges and sustained policy incentives worldwide," said Rho Motion data manager Charles Lester.
Beyond China, Europe, and North America, the rest of the world saw a 48% spike in EV sales in 2025 vs the same 11 months in 2024, representing 1.5 million EVs sold. "The takeaway: EV demand continues to grow worldwide," the article adds, "but policy support — or the lack thereof — is increasingly shaping where this growth shows up."

How a 23-Year-Old in 1975 Built the World's First Handheld Digital Camera

In 1975, 23-year-old electrical engineer Steve Sasson joined Kodak. And in a new interview with the BBC, he remembers that he'd found the whole photographic process "really annoying.... I wanted to build a camera with no moving parts. Now that was just to annoy the mechanical engineers..."

"You take your picture, you have to wait a long time, you have to fiddle with these chemicals. Well, you know, I was raised on Star Trek, and all the good ideas come from Star Trek. So I said what if we could just do it all electronically...?"

Researchers at Bell Labs in the US had, in 1969, created a type of integrated circuit called a charge-coupled device (CCD). An electric charge could be stored on a metal-oxide semiconductor (MOS), and could be passed from one MOS to another. Its creators believed it might one day be used as part of an imaging device — though they hadn't worked out how that might happen.

The CCD, nevertheless, was quickly developed. By 1974, the US microchip company Fairchild Semiconductors had built the first commercial CCD, measuring just 100 x 100 pixels — the tiny electronic samples taken of an original image. The new device's ability to capture an image was only theoretical — no-one had, as yet, tried to take an image and display it. (NASA, it turned out, was also looking at this technology, but not for consumer cameras....)

The CCD circuit responded to light but could only form an image if Sasson was somehow able to attach a lens to it. He could then convert the light into digital information — a blizzard of 1s and 0s — but there was just one problem: money.

"I had no money to build this thing. Nobody told me to build it, and I certainly couldn't demand any money for it," he says. "I basically stole all the parts, I was in Kodak and the apparatus division, which had a lot of parts. I stole the optical assembly from an XL movie camera downstairs in a used parts bin. I was just walking by, you see it, and you take it, you know."
He was also able to source an analogue to digital converter from a $12 (about £5 in 1974) digital voltmeter, rather than spending hundreds on the part. "I could manage to get all these parts without anybody really noticing," he says.... The bulky device needed a way to store the information the CCD was capturing, so Sasson used an audio cassette deck. But he also needed a way to view the image once it was saved on the magnetic tape. "We had to build a playback unit," Sasson says. "And, again, nobody asked me to do that either. So all I got to do is the reverse of what I did with the camera, and then I have to turn that digital pattern into an NTSC television signal." NTSC (National Television System Committee) was the conversion standard used by American TV sets. Sasson had to turn only 100 lines of digital code captured by the camera into the 400 lines that would form a television signal. The solution was a Motorola microprocessor, and by December 1975, the camera and its playback unit were complete, the article points out. With his colleague Jim Schueckler, Sasson had spent more than a year putting together the "increasingly bulky" device, that "looked like an oversized toaster." The camera had a shutter that would take an image at about 1/20th of a second, and — if everything worked as it should — the cassette tape would start to move as the camera transferred the stored information from its CCD [which took 23 seconds]. "It took about 23 seconds to play it back, and then about eight seconds to reconfigure it to make it look like a television signal, and send it to the TV set that I stole from another lab...." In 1978, Kodak was granted the first patent for a digital camera. It was Sasson's first invention. The patent is thought to have earned Eastman Kodak billions in licensing and infringement payments by the time they sold the rights to it, fearing bankruptcy, in 2012... 
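The line-count mismatch Sasson faced can be sketched in code. The article doesn't describe the exact algorithm his Motorola microprocessor used to map the camera's 100 captured lines onto a roughly 400-line TV signal; simple line replication (each captured line repeated four times) is one plausible approach, shown here purely as an illustration.

```rust
/// Upscale a captured frame's scan lines by simple replication: each
/// captured line is repeated `factor` times in the output signal.
/// (Illustrative only -- not the actual 1975 conversion logic.)
fn replicate_lines(captured: &[u8], factor: usize) -> Vec<u8> {
    captured
        .iter()
        .flat_map(|&line| std::iter::repeat(line).take(factor))
        .collect()
}

fn main() {
    // 100 captured lines, each represented here by a single sample value.
    let captured: Vec<u8> = (0..100).map(|i| i as u8).collect();

    // Repeat each line 4x to fill a 400-line television signal.
    let tv_signal = replicate_lines(&captured, 4);

    assert_eq!(tv_signal.len(), 400);
    assert_eq!(&tv_signal[0..4], &[0, 0, 0, 0]); // first line appears 4 times
}
```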
As for Sasson, he never worked on anything other than the digital technology he had helped to create until he retired from Eastman Kodak in 2009. Thanks to long-time Slashdot reader sinij for sharing the article.

Read more of this story at Slashdot.

  •  

More of America's Coal-Fired Power Plants Cease Operations

New England's last coal-fired power plant "has ceased operations three years ahead of its planned retirement date," reports the New Hampshire Bulletin. "The closure of the New Hampshire facility paves the way for its owner to press ahead with an initiative to transform the site into a clean energy complex including solar panels and battery storage systems." "The end of coal is real, and it is here," said Catherine Corkery, chapter director for Sierra Club New Hampshire. "We're really excited about the next chapter...." The closure in New Hampshire — so far undisputed by the federal government — demonstrates that prolonging operations at some facilities just doesn't make economic sense for their owners. "Coal has been incredibly challenged in the New England market for over a decade," said Dan Dolan, president of the New England Power Generators Association. Merrimack Station, a 438-megawatt power plant, came online in the 1960s and provided baseload power to the New England region for decades. Gradually, though, natural gas — which is cheaper and more efficient — took over the regional market... Additionally, solar power production accelerated from 2010 on, lowering demand on the grid during the day and creating more evening peaks. Coal plants take longer to ramp up production than other sources, and are therefore less economical for these shorter bursts of demand, Dolan said. In recent years, Merrimack operated only a few weeks annually. In 2024, the plant generated just 0.22% of the region's electricity. It wasn't making enough money to justify continued operations, observers said. The closure "is emblematic of the transition that has been occurring in the generation fleet in New England for many years," Dolan said. "The combination of all those factors has meant that coal facilities are no longer economic in this market." 
Meanwhile Los Angeles — America's second-largest city — confirmed that the last coal-fired power plant supplying its electricity stopped operations just before Thanksgiving, reports the Utah News Dispatch: Advocates from the Sierra Club highlighted in a news release that shutting down the units had no impact on customers, and questioned who should "shoulder the cost of keeping an obsolete coal facility on standby...." Before ceasing operations, the coal units had been working at low capacities for several years because the agency's users hadn't been calling on the power [said John Ward, spokesperson for Intermountain Power Agency]. The coal-powered units "had a combined capacity of around 1,800 megawatts when fully operational," notes Electrek, "and as recently as 2024, they still supplied around 11% of LA's electricity. The plant sits in Utah's Great Basin region and powered Southern California for decades." Now, for the first time, none of California's power comes from coal. There's a political hiccup with IPP, though: the Republican-controlled Utah Legislature blocked the Intermountain Power Agency from fully retiring the coal units this year, ordering that they can't be disconnected or decommissioned. But despite that mandate, no buyers have stepped forward to keep the outdated coal units online. The Los Angeles Department of Water and Power (LADWP) is transitioning to newly built, hydrogen-capable generating units at the same IPP location, part of a modernization effort called IPP Renewed. These new units currently run on natural gas, but they're designed to burn a blend of natural gas and up to 30% green hydrogen, and eventually 100% green hydrogen. LADWP plans to start adding green hydrogen to the fuel mix in 2026. "With the plant now idled but legally required to remain connected, serious questions remain about who will shoulder the cost of keeping an obsolete coal facility on standby," says the Sierra Club. 
One of the natural gas units started commercial operations last October, with the second starting later this month, IPP spokesperson John Ward told the Utah News Dispatch.

Read more of this story at Slashdot.

  •  

Rust in Linux's Kernel 'is No Longer Experimental'

Steven J. Vaughan-Nichols files this report from Tokyo: At the invitation-only Linux Kernel Maintainers Summit here, the top Linux maintainers decided, as Jonathan Corbet, Linux kernel developer, put it, "The consensus among the assembled developers is that Rust in the kernel is no longer experimental — it is now a core part of the kernel and is here to stay. So the 'experimental' tag will be coming off." As Linux kernel maintainer Steven Rostedt told me, "There was zero pushback." This has been a long time coming. This shift caps five years of sometimes-fierce debate over whether the memory-safe language belonged alongside C at the heart of the world's most widely deployed open source operating system... It all began when Alex Gaynor and Geoffrey Thomas at the 2019 Linux Security Summit said that about two-thirds of Linux kernel vulnerabilities come from memory safety issues. Rust, in theory, could avoid these through its inherently safer application programming interfaces (APIs)... In those early days, the plan was not to rewrite Linux in Rust; it still isn't, but to adopt it selectively where it can provide the most security benefit without destabilizing mature C code. In short, new drivers, subsystems, and helper libraries would be the first targets... Despite the fuss, more and more programs were ported to Rust. By April 2025, the Linux kernel contained about 34 million lines of C code, with only 25 thousand lines written in Rust. At the same time, more and more drivers and higher-level utilities were being written in Rust. For instance, the Debian Linux distro developers announced that going forward, Rust would be a required dependency in its foundational Advanced Package Tool (APT). This change doesn't mean everyone will need to use Rust. C is not going anywhere. Still, as several maintainers told me, they expect to see many more drivers being written in Rust. 
In particular, Rust looks especially attractive for "leaf" drivers (network, storage, NVMe, etc.), where the Rust-for-Linux bindings expose safe wrappers over kernel C APIs. Nevertheless, for would-be kernel and systems programmers, Rust's new status in Linux hints at a career path that blends deep understanding of C with fluency in Rust's safety guarantees. This combination may define the next generation of low-level development work.
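The memory-safety advantage the maintainers cite can be illustrated with a toy example. This is not Rust-for-Linux kernel code — the "driver buffer" below is an invented stand-in — but it shows the class of guarantee Rust provides: out-of-bounds reads are caught by checked accessors instead of becoming the undefined behavior they would be in C, and ownership rules make use-after-free a compile error rather than a vulnerability.

```rust
/// Read one "register" from a shadow copy of device memory.
/// In C, indexing past the end of the buffer is undefined behavior;
/// here the checked accessor returns None instead of reading stray memory.
fn read_register(buf: &[u32], index: usize) -> Option<u32> {
    buf.get(index).copied()
}

fn main() {
    // A hypothetical shadow of some device's memory-mapped registers.
    let mmio_shadow = vec![0xDEAD_BEEF_u32, 0x0000_00FF, 0x1234_5678];

    assert_eq!(read_register(&mmio_shadow, 1), Some(0x0000_00FF));
    // An out-of-bounds read is caught, not silently corrupted:
    assert_eq!(read_register(&mmio_shadow, 10), None);

    // Ownership: once the buffer is moved, the compiler rejects any
    // further use of the old binding, ruling out use-after-free.
    let _owned = mmio_shadow;
    println!("all checks passed");
}
```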

Read more of this story at Slashdot.

  •  

Idaho Lab Produces World's First Molten Salt Fuel for Nuclear Reactors

America's Energy Department runs a research lab in Idaho — and this week announced successful results from a ground-breaking experiment. "This is the first time in history that chloride-based molten salt fuel has been produced for a fast reactor," says Bill Phillips, the lab's technical lead for salt synthesis. He calls it "a major milestone for American innovation and a clear signal of our national commitment to advanced nuclear energy." Unlike traditional reactors that use solid fuel rods and water as a coolant, most molten salt reactors rely on liquid fuel — a mixture of salts containing fissile material. This design allows for higher operating temperatures, better fuel efficiency, and enhanced safety. It also opens the door to new applications, including compact nuclear systems for ships and remote installations. "The Molten Chloride Fast Reactor represents a paradigm shift in the nuclear fuel cycle, and the Molten Chloride Reactor Experiment (MCRE) will directly inform the commercialization of that reactor," said Jeff Latkowski, senior vice president of TerraPower and program director for the Molten Chloride Fast Reactor. "Working with world-leading organizations such as INL to successfully synthesize this unique new fuel demonstrates how real progress in Gen IV nuclear is being made together." "The implications for the maritime industry are significant," said Don Wood, senior technical advisor for MCRE. "Molten salt reactors could provide ships with highly efficient, low-maintenance nuclear power, reducing emissions and enabling long-range, uninterrupted travel. The technology could spark the rise of a new nuclear sector — one that is mobile, scalable and globally transformative." More details from America's Energy Department: MCRE will require a total of 72 to 75 batches of fuel salt to go critical, making it the largest fuel production effort at INL since the operations of Experimental Breeder Reactor-II more than 30 years ago. 
The full-scale demonstration of the new fuel salt synthesis line for MCRE was made possible by a breakthrough in 2024. After years of testing, the team found the right recipe to convert 95 percent of uranium metal feedstock into 18 kilograms of uranium chloride fuel salt in only a few hours — a process that previously took more than a week to complete... After delivering the first batch of fuel salt this fall, the team anticipates delivering four additional batches by March of 2026. MCRE is anticipated to run in 2028 for approximately six months at INL in the Laboratory for Operation and Testing (LOTUS) in the United States test bed. "With the first batch of fuel salt successfully created at INL, researchers will now conduct testing to better understand the physics of the process, with a goal of moving the process to a commercial scale over the next decade," says Cowboy State Daily. Thanks to long-time Slashdot reader schwit1 for sharing the article.
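A back-of-the-envelope check helps put the quoted figures in perspective. Assuming each of MCRE's 72 to 75 batches matches the 18-kilogram demonstration run described above (the article does not state the per-batch mass explicitly), the total fuel salt required to go critical would come to roughly 1.3 tonnes:

```rust
/// Total fuel-salt mass for a given number of batches, assuming a
/// uniform per-batch yield (an assumption -- the article only gives
/// the 18 kg figure for the demonstration run).
fn total_fuel_kg(batches: u32, kg_per_batch: u32) -> u32 {
    batches * kg_per_batch
}

fn main() {
    let low = total_fuel_kg(72, 18);  // lower bound, in kg
    let high = total_fuel_kg(75, 18); // upper bound, in kg
    assert_eq!(low, 1296);
    assert_eq!(high, 1350);
    println!("estimated total fuel salt: {low}-{high} kg (~1.3 tonnes)");
}
```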

Read more of this story at Slashdot.

  •  

Was the Airbus A320 Recall Caused By Cosmic Rays?

What triggered that Airbus emergency software recall? The BBC reports that Airbus's initial investigation into an aircraft's sudden drop in altitude linked it "to a malfunction in one of the aircraft's computers that controls moving parts on the aircraft's wings and tail." But that malfunction "seems to have been triggered by cosmic radiation bombarding the Earth on the day of the flight..." The BBC believes radiation from space "could become a growing problem as ever more microchips run our lives." What Airbus says occurred on that JetBlue flight from Cancun to New Jersey was a phenomenon called a single-event upset, or bit flip. As the BBC has previously reported, these computer errors occur when high-speed subatomic particles from outer space, such as protons, smash into atoms in our planet's atmosphere. This can cause a cascade of particles to rain down through our atmosphere, like throwing marbles across a table. In rare cases, those fast-moving neutrons can strike computer electronics and disrupt tiny bits of data stored in the computer's memory, switching that bit — often represented as a 0 or 1 — from one state to another. "That can cause your electronics to behave in ways you weren't expecting," says Matthew Owens, professor of space physics at the University of Reading in the UK. Satellites are particularly affected by this phenomenon, he says. "For space hardware we see this quite frequently." This is because the neutron flux — a measure of neutron radiation — rises the higher up in the atmosphere you go, increasing the chance of a strike hitting sensitive parts of the computer equipment on board. Aircraft are more vulnerable to this problem than computer equipment on the ground, although bit flips do occur at ground level, too. The increasing reliance on computers in fly-by-wire systems in aircraft, which use electronics rather than mechanical systems to control the plane in the air, also means the risk posed by bit flips when they do occur is higher... 
Airbus told the BBC that it tested multiple scenarios when attempting to determine what happened to the 30 October 2025 JetBlue flight. In this case also, the company ruled out various possibilities except that of a bit flip. It is hard to attribute the incident to this for sure, however, because careering neutrons leave no trace of their activity behind, says Owens... [Airbus's software update] works by inducing "rapid refreshing of the corrupted parameter so it has no time to have effect on the flight controls", Airbus says. This is, in essence, a way of continually sanitising computer data on these aircraft to try and ensure that any errors don't end up actually impacting a flight... As computer chips have become smaller, they have also become more vulnerable to bit flips because the energy required to corrupt tiny packets of data has got lower over time. Plus, more and more microchips are being loaded into products and vehicles, potentially increasing the chance that a bit flip could cause havoc. If nothing else, the JetBlue incident will focus minds across many industries on the risk posed to our modern, microchip-dependent lives from cosmic radiation that originates far beyond our planet. Airbus said their analysis revealed "intense solar radiation" could corrupt data "critical to the functioning of flight control." But that explanation "has left some space weather scientists scratching their heads," adds the BBC. Space.com explains: Solar radiation levels on Oct. 30 were unremarkable and nowhere near levels that could affect aircraft electronics, Clive Dyer, a space weather and radiation expert at University of Surrey in the U.K., told Space.com. Instead, Dyer, who has studied effects of solar radiation on aircraft electronics for decades, thinks the onboard computer of the affected jet could have been struck by a cosmic ray, a stream of high-energy particles from a distant star explosion that may have travelled millions of years before reaching Earth. 
"[Cosmic rays] can interact with modern microelectronics and change the state of a circuit," Dyer said. "They can cause a simple bit flip, like a 0 to 1 or 1 to 0. They can mess up information and make things go wrong. But they can cause hardware failures too, when they induce a current in an electronic device and burn it out."
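The mechanism described above — a single flipped bit, countered by rapidly refreshing the corrupted value — can be modeled in a few lines. This is a toy sketch, not Airbus's actual flight-control software: the "trusted" parameter and the refresh function are invented stand-ins for whatever the real update does.

```rust
/// Model a single-event upset: flip one bit of a stored word,
/// as a cosmic-ray-induced neutron strike might.
fn seu(word: u32, bit: u8) -> u32 {
    word ^ (1u32 << bit)
}

/// Mitigation sketch: rewrite the working copy from a trusted source,
/// so a flipped bit never persists long enough to affect anything.
fn refresh(working: &mut u32, trusted: u32) {
    *working = trusted;
}

fn main() {
    let trusted = 0b0000_1010_u32; // a known-good parameter
    let mut working = trusted;

    working = seu(working, 3);    // a strike flips bit 3
    assert_ne!(working, trusted); // the stored value is now corrupted

    refresh(&mut working, trusted); // rapid refresh overwrites the error
    assert_eq!(working, trusted);
}
```

Note that a second strike on the same bit would restore the original value (XOR is its own inverse), which is part of why these events leave no forensic trace.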

Read more of this story at Slashdot.

  •  

All of Russia's Porsches Were Bricked By a Mysterious Satellite Outage

An anonymous reader shared this report from Autoblog: Imagine walking out to your car, pressing the start button, and getting absolutely nothing. No crank, no lights on the dash, nothing. That's exactly what happened to hundreds of Porsche owners in Russia last week. The issue is with the Vehicle Tracking System, a satellite-based security system that's supposed to protect against theft. Instead, it turned these Porsches into driveway ornaments. The issue was first reported at the end of November, with owners reporting identical symptoms of their cars refusing to start or shutting down soon after ignition. Russia's largest dealership group, Rolf, confirmed that the problem stems from a complete loss of satellite connectivity to the VTS. When it loses its connection, it interprets the outage as a potential theft attempt and automatically activates the engine immobilizer. The issue affects all models and engine types, meaning any Porsche equipped with the system could potentially disable itself without warning. The malfunction impacts Porsche models dating back to 2013 that have the factory VTS installed... When the VTS connection drops, the anti-theft protocol kicks in, cutting fuel delivery and locking down the engine completely.
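The reported failure mode boils down to a design choice: the system cannot distinguish a region-wide satellite outage from a theft attempt, so it immobilizes the car in both cases. The actual VTS logic is proprietary; this toy state check just models the behavior the article describes.

```rust
#[derive(Debug, PartialEq)]
enum EngineState {
    Running,
    Immobilized,
}

/// Hypothetical sketch of the reported anti-theft check: any loss of
/// satellite connectivity is treated as a possible theft, engaging the
/// immobilizer and cutting fuel delivery.
fn vts_check(satellite_link_up: bool) -> EngineState {
    if satellite_link_up {
        EngineState::Running
    } else {
        EngineState::Immobilized
    }
}

fn main() {
    assert_eq!(vts_check(true), EngineState::Running);
    // A satellite outage bricks every connected car at once:
    assert_eq!(vts_check(false), EngineState::Immobilized);
}
```

A fail-operational design would instead require additional evidence of theft (or a timeout with driver authentication) before immobilizing, so that a shared upstream outage cannot disable an entire fleet.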

Read more of this story at Slashdot.

  •