
Thoughts About the Evolution of Mainstream Macroeconomics Over the Last 40 Years

Abstract of a paper featured on NBER: This year marks the 40th anniversary of the NBER Macro Annual Conference, founded in 1986. This paper reviews the evolution of mainstream macroeconomics since then. It presents my views, informed by a survey of a number of researchers who have made important contributions to the field. I develop two main arguments. The first is that, starting from strikingly different positions, there has been substantial convergence, in terms of methodology, architecture, and main mechanisms. Methodology: Explicit micro foundations, explicit treatment of distortions, with, at the same time, an increased willingness to deviate from rational expectations, neoclassical utility and profit maximization. Architecture: The wide acceptance of nominal rigidities as an essential distortion, although with mixed feelings. Mechanisms: The wide nature of the shocks to both the demand and the supply side. The second is that this convergence has been, for the most part, good convergence, i.e. the creation of a generally accepted conceptual and analytical structure, a core to which additional distortions can be added, allowing for discussions and integration of new ideas and evidence, rather than fights about basic methodology. Not everything is right however, with too much emphasis on general equilibrium implications from the start, rather than, first, on partial equilibrium analysis of the phenomenon at hand.

Read more of this story at Slashdot.

  •  

Danes Are Finally Going Nuclear. They Have To, Because of All Their Renewables

"The Danish government plans to evaluate the prospect of beginning a nuclear power programme," reports the Telegraph, noting that this week Denmark lifted a nuclear power ban imposed 40 years ago. Unlike its neighbours in Sweden and Germany, Denmark has never had a civil nuclear power programme. It has only ever had three small research reactors, the last of which closed in 2001. Most of the renewed interest in nuclear seen around the world stems from the expected growth in electricity demand from AI data centres, but Denmark is different. The Danes are concerned about possible blackouts similar to the one that struck Iberia recently. Like Spain and Portugal, Denmark is heavily dependent on weather-based renewable energy which is not very compatible with the way power grids operate... ["The spinning turbines found in fossil-fuelled energy systems provide inertia and act as a shock absorber to stabilise the grid during sudden changes in supply or demand," explains a diagram in the article, while solar and wind energy provide no inertia.] The Danish government is worried about how it will continue to decarbonise its power grid if it closes all of its fossil fuel generators leaving minimal inertia. There are only three realistic routes to decarbonisation that maintain physical inertia on the grid: hydropower, geothermal energy and nuclear. Hydro and geothermal depend on geographic and geological features that not every country possesses. While renewable energy proponents argue that new types of inverters could provide synthetic inertia, trials have so far not been particularly successful and there are economic challenges that are difficult to resolve. Denmark is realising that in the absence of large-scale hydroelectric or geothermal energy, it may have little choice other than to re-visit nuclear power if it is to maintain a stable, low carbon electricity grid. Thanks to long-time Slashdot reader schwit1 for sharing the news.

Read more of this story at Slashdot.

  •  

EV Sales Keep Growing In the US, Represent 20% of Global Car Sales and Half in China

"Despite many obstacles — and what you may read elsewhere — electric-vehicle sales continue to grow at a healthy pace in the U.S. market," Cox Automotive reported this week. "Roughly 7.5% of total new-vehicle sales in the first quarter were electric vehicles, an increase from 7% a year earlier." An anonymous reader shared this analysis from Autoweek: "Despite a cloud of uncertainty around future EV interest and potential economic headwinds hanging over the automotive industry, consumer demand for electric vehicles has remained stable," according to the J.D. Power 2025 US Electric Vehicle Consideration Study released yesterday. Specifically, the study showed that 24% of vehicle shoppers in the U.S. say they are "very likely" to consider purchasing an EV and 35% say they are "somewhat likely," both of which figures remain unchanged from a year ago... Globally the numbers are even more pro-EV. Electric car sales exceeded 17 million globally in 2024, reaching a sales share of more than 20%, according to a report issued this week by the International Energy Agency. "Just the additional 3.5 million electric cars sold in 2024 compared with the previous year is more than the total number of electric cars sold worldwide in 2020," the IEA said. China, which has mandated increases in EV sales, is the leader in getting electric vehicles on the road, with electric cars accounting for almost half of all Chinese car sales in 2024, the IEA said. "The over 11 million electric cars sold in China last year were more than global sales just 2 years earlier. As a result of continued strong growth, 1 in 10 cars on Chinese roads is now electric." Interesting figures on U.S. EV sales from the article: 2024 EV sales rose 7.3% from 2023, according to Cox Automotive data. "Last year American consumers purchased 1.3 million electric vehicles, which was a new record, according to data from KBB. "Sales have never stopped growing, and the percentage of new cars sold powered purely by gasoline continues to slip.

Read more of this story at Slashdot.

  •  

Since 2022 Nuclear Fusion Breakthrough, US Researchers Have More Than Doubled Its Power Output

TechCrunch reports: The world's only net-positive fusion experiment has been steadily ramping up the amount of power it produces, TechCrunch has learned. In recent attempts, the team at the U.S. Department of Energy's National Ignition Facility (NIF) increased the yield of the experiment, first to 5.2 megajoules and then again to 8.6 megajoules, according to a source with knowledge of the experiment. The new results are significant improvements over the historic experiment in 2022, which was the first controlled fusion reaction to generate more energy than it consumed. The 2022 shot generated 3.15 megajoules, a small bump over the 2.05 megajoules that the lasers delivered to the BB-sized fuel pellet. None of the shots to date have been effective enough to feed electrons back into the grid, let alone to offset the energy required to power the entire facility — the facility wasn't designed to do that. The first net-positive shot, for example, required 300 megajoules to power the laser system alone. But they are continuing proof that controlled nuclear fusion is more than hypothetical.
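
A quick back-of-the-envelope check, using only the figures quoted above, shows why "net positive" refers to the energy delivered to the target rather than to the facility as a whole (a rough sketch, not an official NIF accounting):

```rust
// Rough gain figures for the 2022 NIF shot, using only numbers quoted above.
// "Target gain" compares fusion yield to laser energy delivered to the pellet;
// "facility gain" compares it to the ~300 MJ drawn to power the laser system.
fn main() {
    let yield_mj = 3.15;           // fusion output of the 2022 shot
    let laser_on_target_mj = 2.05; // laser energy delivered to the fuel pellet
    let wall_plug_mj = 300.0;      // energy to power the laser system for that shot

    println!("target gain   ~= {:.2}", yield_mj / laser_on_target_mj); // about 1.5
    println!("facility gain ~= {:.3}", yield_mj / wall_plug_mj);       // about 0.01, i.e. ~1%
    // The newer 5.2 MJ and 8.6 MJ shots raise the numerator, but the article
    // doesn't give their laser energies, so those gains aren't computed here.
}
```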

Read more of this story at Slashdot.

  •  

Why We're Unlikely to Get Artificial General Intelligence Any Time Soon

OpenAI CEO Sam Altman believes Artificial General Intelligence could arrive within the next few years. But the speculations of some technologists "are getting ahead of reality," writes the New York Times, adding that many scientists "say no one will reach AGI without a new idea — something beyond the powerful neural networks that merely find patterns in data. That new idea could arrive tomorrow. But even then, the industry would need years to develop it." "The technology we're building today is not sufficient to get there," said Nick Frosst, a founder of the AI startup Cohere who previously worked as a researcher at Google and studied under the most revered AI researcher of the last 50 years. "What we are building now are things that take in words and predict the next most likely word, or they take in pixels and predict the next most likely pixel. That's very different from what you and I do." In a recent survey of the Association for the Advancement of Artificial Intelligence, a 40-year-old academic society that includes some of the most respected researchers in the field, more than three-quarters of respondents said the methods used to build today's technology were unlikely to lead to AGI. Opinions differ in part because scientists cannot even agree on a way of defining human intelligence, arguing endlessly over the merits and flaws of IQ tests and other benchmarks. Comparing our own brains to machines is even more subjective. This means that identifying AGI is essentially a matter of opinion.... And scientists have no hard evidence that today's technologies are capable of performing even some of the simpler things the brain can do, like recognizing irony or feeling empathy. Claims of AGI's imminent arrival are based on statistical extrapolations — and wishful thinking. According to various benchmark tests, today's technologies are improving at a consistent rate in some notable areas, like math and computer programming. But these tests describe only a small part of what people can do. Humans know how to deal with a chaotic and constantly changing world. Machines struggle to master the unexpected — the challenges, small and large, that do not look like what has happened in the past. Humans can dream up ideas that the world has never seen. Machines typically repeat or enhance what they have seen before. That is why Frosst and other sceptics say pushing machines to human-level intelligence will require at least one big idea that the world's technologists have not yet dreamed up. There is no way of knowing how long that will take. "A system that's better than humans in one way will not necessarily be better in other ways," Harvard University cognitive scientist Steven Pinker said. "There's just no such thing as an automatic, omniscient, omnipotent solver of every problem, including ones we haven't even thought of yet. There's a temptation to engage in a kind of magical thinking. But these systems are not miracles. They are very impressive gadgets." While Google's AlphaGo could beat humans in a game with "a small, limited set of rules," the article points out that the real world "is bounded only by the laws of physics. Modelling the entirety of the real world is well beyond today's machines, so how can anyone be sure that AGI — let alone superintelligence — is just around the corner?" And they offer this alternative perspective from Matteo Pasquinelli, a professor of the philosophy of science at Ca' Foscari University in Venice, Italy. 
"AI needs us: living beings, producing constantly, feeding the machine. It needs the originality of our ideas and our lives."

Read more of this story at Slashdot.

  •  

Bungie Blames Stolen 'Marathon' Art On Former Developer

An anonymous reader shared this report from Kotaku: One of the most striking things about Bungie's Marathon is its presentation. The sci-fi extraction shooter combines bleak settings with bright colors in a way that makes it feel a bit like a sneaker promo meets Ghost in the Shell, or as designer Jeremy Skoog put it, "Y2K Cyberpunk mixed with Acid Graphic Design Posters." But it now looks like at least a few of the visual design elements that appeared in the recent alpha test were lifted from eight-year-old work by an outside artist. "The Marathon alpha released recently and its environments are covered with assets lifted from poster designs I made in 2017," Bluesky user antire.al posted on Thursday. She shared two images showing elements of her work and where they appeared in Marathon's gameplay, including a rotated version of her own logo. A poster full of small repeating icon patterns also seems to be all but recreated in Marathon's press kit ARG and website... Bungie has responded and blamed the incident on a former employee. The studio says it's reaching out to the artist in question and conducting a full review of its in-game assets for Marathon ["and implementing stricter checks to document all artist contributions."] "We immediately investigated a concern regarding unauthorized use of artist decals in Marathon and confirmed that a former Bungie artist included these in a texture sheet that was ultimately used in-game," the studio posted on X. "As a matter of policy, we do not use the work of artists without their permission..." their X post emphasizes. "We value the creativity and dedication of all artists who contribute to our games, and we are committed to doing right by them. Thank you for bringing this to our attention."

Read more of this story at Slashdot.

  •  

'The People Stuck Using Ancient Windows Computers'

The BBC visits "the strange, stubborn world of obsolete Windows machines." Even if you're a diehard Apple user, you're probably interacting with Windows systems on a regular basis. When you're pulling cash out, for example, chances are you're using a computer that's downright geriatric by technology standards. (Microsoft declined to comment for this article.) "Many ATMs still operate on legacy Windows systems, including Windows XP and even Windows NT," which launched in 1993, says Elvis Montiero, an ATM field technician based in Newark, New Jersey in the US. "The challenge with upgrading these machines lies in the high costs associated with hardware compatibility, regulatory compliance and the need to rewrite proprietary ATM software," he says. Microsoft ended official support for Windows XP in 2014, but Montiero says many ATMs still rely on these primordial systems thanks to their reliability, stability and integration with banking infrastructure. And applicants answering a job listing for an IT systems administrator for Germany's railway service "were expected to have expertise with Windows 3.11 and MS-DOS — systems released 32 and 44 years ago, respectively. In certain parts of Germany, commuting depends on operating systems that are older than many passengers." It's not just German transit, either. The trains in San Francisco's Muni Metro light railway, for example, won't start up in the morning until someone sticks a floppy disk into the computer that loads DOS software on the railway's Automatic Train Control System (ATCS). Last year, the San Francisco Municipal Transportation Agency (SFMTA) announced its plans to retire this system over the coming decade, but today the floppy disks live on. Apple is "really aggressive about deprecating old products," M. Scott Ford, a software developer who specialises in updating legacy systems, tells the BBC. "But Microsoft took the approach of letting organisations leverage the hardware they already have and chasing them for software licenses instead. They also tend to have a really long window for supporting that software." And so you get things like two enormous LightJet printers in San Diego powered by servers running Windows 2000, says photographic printer John Watts: Long out of production, the few remaining LightJets rely on the Windows operating systems that were around when these printers were sold. "A while back we looked into upgrading one of the computers to Windows Vista. By the time we added up the money it would take to buy new licenses for all the software it was going to cost $50,000 or $60,000 [£38,000 to £45,000]," Watts says. "I can't stand Windows machines," he says, "but I'm stuck with them...." In some cases, however, old computers are a labour of love. In the US, Dene Grigar, director of the Electronic Literature Lab at Washington State University, Vancouver, spends her days in a room full of vintage (and fully functional) computers dating back to 1977... She's not just interested in early, experimental e-books. Her laboratory collects everything from video games to Instagram zines.... Grigar's Electronic Literature Lab maintains 61 computers to showcase the hundreds of electronic works and thousands of files in the collection, which she keeps in pristine condition. Grigar says they're still looking for a PC that reads five-and-a-quarter-inch floppy disks.

Read more of this story at Slashdot.

  •  

Why Two Amazon Drones Crashed at a Test Facility in December

While Amazon won FAA approval to fly beyond an operator's visual line of sight, "the program remains a work in progress," reports Bloomberg: A pair of Amazon.com Inc. package delivery drones were flying through a light rain in mid-December when, within minutes of one another, they both committed robot suicide... [S]ome 217 feet (66 meters) in the air [at a drone testing facility], the aircraft cut power to its six propellers, fell to the ground and was destroyed. Four minutes later and 183 feet over the taxiway, a second Prime Air drone did the same thing. Not long after the incidents, Amazon paused its experimental drone flights to tweak the aircraft software but said the crashes weren't the "primary reason" for halting the program. Now, five months after the twin crashes, a more detailed explanation of what happened is starting to emerge. Faulty readings from lidar sensors made the drones think they had landed, prompting the software to shut down the propellers, according to National Transportation Safety Board documents reviewed by Bloomberg. The sensors failed after a software update made them more susceptible to being confused by rain, the NTSB said. Amazon also removed a backup sensor that had been present on earlier iterations, according to the article — though an Amazon spokesperson said the company had found ways to replicate the removed sensors. But Bloomberg notes Amazon's drone efforts have faced "technical challenges and crashes, including one in 2021 that set a field ablaze at the company's testing facility in Pendleton, Oregon." Deliveries are currently limited to College Station, Texas, and greater Phoenix, with plans to expand to Kansas City, Missouri, the Dallas area and San Antonio, as well as the UK and Italy. Starting with a craft that looked like a hobbyist drone — and was vulnerable to even modest gusts of wind — Amazon went through dozens of designs to toughen the vehicle and ultimately make it capable of carting about 5 pounds, giving it the capability to transport items typically ordered from its warehouses. Engineers settled on a six-propeller design that takes off vertically before cruising like a plane. The first model to make regular customer deliveries, the MK27, was succeeded last year by the MK30, which flies at about 67 miles an hour and can deliver packages up to 7.5 miles from its launch point. The craft takes off, flies and lands autonomously.
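
The NTSB's description, a single altitude source fooled by rain triggering a propeller shutdown, maps onto a familiar design trade-off. The sketch below is hypothetical (the struct, field names, and thresholds are invented for illustration and are not Amazon's flight software); it shows how requiring corroboration from an independent measurement keeps one bad reading from cutting power at altitude:

```rust
// Hypothetical sketch (not Amazon's flight code): declaring "landed" from a
// single lidar reading is fragile when rain scatters the beam. Requiring an
// independent source to agree before the propellers are shut down trades a
// slightly later cutoff for never killing power at 200 feet.
struct Sensors {
    lidar_altitude_m: f64,   // can read near zero when rain confuses the lidar
    baro_altitude_m: f64,    // independent barometric estimate
    vertical_speed_mps: f64, // near zero when actually resting on the ground
}

fn landed_single_source(s: &Sensors) -> bool {
    s.lidar_altitude_m < 0.1
}

fn landed_with_corroboration(s: &Sensors) -> bool {
    s.lidar_altitude_m < 0.1
        && s.baro_altitude_m < 1.0
        && s.vertical_speed_mps.abs() < 0.2
}

fn main() {
    // A faulty lidar reading while still descending at ~66 m altitude:
    let in_flight = Sensors { lidar_altitude_m: 0.0, baro_altitude_m: 66.0, vertical_speed_mps: -1.5 };
    assert!(landed_single_source(&in_flight));       // would cut the propellers mid-air
    assert!(!landed_with_corroboration(&in_flight)); // corroboration rejects the bogus reading
}
```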

Read more of this story at Slashdot.

  •  

When a Company Does Job Interviews with a Malfunctioning AI - and Then Rejects You

IBM laid off "a couple hundred" HR workers and replaced them with AI agents. "It's becoming a huge thing," says Mike Peditto, a Chicago-area consultant with 15 years of experience advising companies on hiring practices. He tells Slate "I do think we're heading to where this will be pretty commonplace." Although A.I. job interviews have been happening since at least 2023, the trend has received a surge of attention in recent weeks thanks to several viral TikTok videos in which users share videos of their A.I. bots glitching. Although some of the videos were fakes posted by a creator whose bio warns that his content is "all satire," some are authentic — like that of Kendiana Colin, a 20-year-old student at Ohio State University who had to interact with an A.I. bot after she applied for a summer job at a stretching studio outside Columbus. In a clip she posted online earlier this month, Colin can be seen conducting a video interview with a smiling white brunette named Alex, who can't seem to stop saying the phrase "vertical-bar Pilates" in an endless loop... Representatives at Apriora, the startup company founded in 2023 whose software Colin was forced to engage with, did not respond to a request for comment. But founder Aaron Wang told Forbes last year that the software allowed companies to screen more talent for less money... (Apriora's website claims that the technology can help companies "hire 87 percent faster" and "interview 93 percent cheaper," but it's not clear where those stats come from or what they actually mean.) Colin (first interviewed by 404 Media) calls the experience dehumanizing — wondering why they were told to dress professionally, since "They had me going the extra mile just to talk to a robot." And after the interview, the robot — and the company — then ghosted them with no future contact. "It was very disrespectful and a waste of time." Houston resident Leo Humphries also "donned a suit and tie in anticipation for an interview" in which the virtual recruiter immediately got stuck repeating the same phrase. Although Humphries tried in vain to alert the bot that it was broken, the interview ended only when the A.I. program thanked him for "answering the questions" and offering "great information" — despite his not being able to provide a single response. In a subsequent video, Humphries said that within an hour he had received an email, addressed to someone else, that thanked him for sharing his "wonderful energy and personality" but let him know that the company would be moving forward with other candidates.

Read more of this story at Slashdot.

  •  

'Rust is So Good You Can Get Paid $20K to Make It as Fast as C'

The Prossimo project (funded by the nonprofit Internet Security Research Group) seeks to "move the Internet's security-sensitive software infrastructure to memory safe code." Two years ago the Prossimo project announced it had begun work on rav1d, a safer, high-performance AV1 decoder written in Rust. According to a new update: We partnered with Immunant to do the engineering work. By September of 2024 rav1d was basically complete and we learned a lot during the process. Today rav1d works well — it passes all the same tests as the dav1d decoder it is based on, which is written in C. It's possible to build and run Chromium with it. There's just one problem — it's not quite as fast as the C version... Our Rust-based rav1d decoder is currently about 5% slower than the C-based dav1d decoder (the exact amount differs a bit depending on the benchmark, input, and platform). This is enough of a difference to be a problem for potential adopters, and, frankly, it just bothers us. The development team worked hard to get it to performance parity. We brought in a couple of other contractors who have experience with optimizing things like this. We wrote about the optimization work we did. However, we were still unable to get to performance parity and, to be frank again, we aren't really sure what to do next. After racking our brains for options, we decided to offer a bounty pool of $20,000 for getting rav1d to performance parity with dav1d. Hopefully folks out there can help get rav1d performance advanced to where it needs to be, and ideally we and the Rust community will also learn something about how Rust performance stacks up against C. This drew a snarky response from FFmpeg, the framework that powers audio and video processing for everyone from VLC to Twitch. "Rust is so good you can get paid $20k to make it as fast as C," they posted to their 68,300 followers on X.com. Thanks to the It's FOSS blog for spotting the announcement.
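
As a concrete illustration of the kind of micro-optimization this bounty is about (a generic sketch, not code from rav1d or dav1d): in hot per-pixel loops, indexed slice access keeps bounds checks unless the compiler can prove them away, while the equivalent iterator form usually compiles to a check-free, auto-vectorized loop.

```rust
// Generic example of a Rust-vs-C gap in hot loops (not rav1d code):
// indexing into slices whose lengths the compiler can't relate keeps a bounds
// check (and a potential panic branch) per access, which can block vectorization.
pub fn add_sat_indexed(a: &[u8], b: &[u8], out: &mut [u8]) {
    for i in 0..out.len() {
        out[i] = a[i].saturating_add(b[i]); // a[i] and b[i] keep their checks
    }
}

// The iterator form states the shared length up front, so there is nothing
// left to check and the loop typically auto-vectorizes like the C version.
pub fn add_sat_zipped(a: &[u8], b: &[u8], out: &mut [u8]) {
    for (o, (x, y)) in out.iter_mut().zip(a.iter().zip(b.iter())) {
        *o = x.saturating_add(*y);
    }
}
```

Decoders like dav1d also lean heavily on hand-written SIMD assembly for the heaviest kernels, so the remaining few percent tends to hide in surrounding "glue" paths like the one sketched above, which is part of what makes the last stretch to parity hard.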

Read more of this story at Slashdot.

  •  

Taiwan Shuts Down Its Last Nuclear Reactor

The only nuclear power plant still operating in Taiwan was shut down on Saturday, reports Japan's public media organization NHK: People in Taiwan have grown increasingly concerned about nuclear safety in recent years, especially after the 2011 nuclear disaster in Fukushima, northeastern Japan... Taiwan's energy authorities plan to focus more on thermoelectricity fueled by liquefied natural gas. They aim to source 20 percent of all electricity from renewables such as wind and solar power next year. AFP notes that nuclear power once provided more than half of Taiwan's energy, with three plants operating six reactors across an island that's 394 km (245 mi) long and 144 km (89 mi) wide. So the new move to close Taiwan's last reactor is "fuelling concerns over the self-ruled island's reliance on imported energy and vulnerability to a Chinese blockade" — though Taiwan's president insists the missing nuclear energy can be replaced by new units in LNG and coal-fired plants: The island, which targets net-zero emissions by 2050, depends almost entirely on imported fossil fuel to power its homes, factories and critical semiconductor chip industry. President Lai Ching-te's Democratic Progressive Party has long vowed to phase out nuclear power, while the main opposition Kuomintang (KMT) party says continued supply is needed for energy security... [The Ma'anshan Nuclear Power Plant] has operated for 40 years in a region popular with tourists and which is now dotted with wind turbines and solar panels. More renewable energy is planned at the site, where state-owned Taipower plans to build a solar power station capable of supplying an estimated 15,000 households annually. But while nuclear only accounted for 4.2 percent of Taiwan's power supply last year, some fear Ma'anshan's closure risks an energy crunch.... Most of Taiwan's power is fossil fuel-based, with liquefied natural gas (LNG) accounting for 42.4 percent and coal 39.3 percent last year. Renewable energy made up 11.6 percent, well short of the government's target of 20 percent by 2025. Solar has faced opposition from communities worried about panels occupying valuable land, while rules requiring locally made parts in wind turbines have slowed their deployment. Taiwan's break-up with nuclear is at odds with global and regional trends. Even Japan aims for nuclear to account for 20-22 percent of its electricity by 2030, up from well under 10 percent now. And nuclear power became South Korea's largest source of electricity in 2024, accounting for 31.7 percent of the country's total power generation, and reaching its highest level in 18 years, according to government data.... And Lai acknowledged recently he would not rule out a return to nuclear one day. "Whether or not we will use nuclear power in the future depends on three foundations which include nuclear safety, a solution to nuclear waste, and successful social dialogue," he said. DW notes there are over 100,000 barrels of nuclear waste on Taiwan's easternmost island "despite multiple attempts to remove them... At one point, Taiwan signed a deal with North Korea so they could send barrels of nuclear waste to store there, but it did not work out due to a lack of storage facilities in the North and strong opposition from South Korea... "Many countries across the world have similar problems and are scrambling to identify sites for a permanent underground repository for nuclear fuel. Finland has become the world's first nation to build one." 
Thanks to long-time Slashdot reader AmiMoJo for sharing the news.

Read more of this story at Slashdot.

  •  

Firefox Announces Same-Day Update After Two Minor Pwn2Own Exploits

During this year's annual Pwn2Own contest, two researchers from Palo Alto Networks demonstrated an out-of-bounds write vulnerability in Mozilla Firefox, reports Cyber Security News, "earning $50,000 and 5 Master of Pwn points." And the next day another participant used an integer overflow to exploit Mozilla Firefox (renderer only). But Mozilla's security blog reminds users that a sandbox escape would be required to break out from a tab to gain wider system access "due to Firefox's robust security architecture" — and that "neither participating group was able to escape our sandbox..." We have verbal confirmation that this is attributed to the recent architectural improvements to our Firefox sandbox which have neutered a wide range of such attacks. This continues to build confidence in Firefox's strong security posture. Even though neither attack could escape their sandbox, "Out of abundance of caution, we just released new Firefox versions... all within the same day of the second exploit announcement." (Last year Mozilla responded to an exploitable security bug within 21 hours, they point out, even winning an award as the fastest to patch.) The new updated versions are Firefox 138.0.4, Firefox ESR 128.10.1, Firefox ESR 115.23.1 and Firefox for Android. "Despite the limited impact of these attacks, all users and administrators are advised to update Firefox as soon as possible...." To review and fix the reported exploits a diverse team of people from all across the world and in various roles (engineering, QA, release management, security and many more) rushed to work. We tested and released a new version of Firefox for all of our supported platforms, operating systems, and configurations with rapid speed.... Our work does not end here. We continue to use opportunities like this to improve our incident response. We will also continue to study the reports to identify new hardening features and security improvements to keep all of our Firefox users across the globe protected.
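
For context on the second entry, here is a generic sketch of the integer-overflow bug class (not the actual Firefox bug): a size computation that silently wraps produces an undersized buffer, and later writes based on the original count land out of bounds; checked arithmetic surfaces the overflow instead.

```rust
// Generic sketch of the bug class (not the actual Firefox vulnerability):
// a 32-bit size computation that wraps yields a buffer far smaller than the
// data later written into it, turning an integer overflow into an
// out-of-bounds write in unchecked code. Checked arithmetic refuses instead.
fn alloc_bytes(count: u32, elem_size: u32) -> Option<Vec<u8>> {
    let total = count.checked_mul(elem_size)?; // None instead of silently wrapping
    Some(vec![0u8; total as usize])
}

fn main() {
    // 0x2000_0000 elements of 8 bytes each is 2^32, which wraps a u32 to 0.
    assert_eq!(0x2000_0000u32.wrapping_mul(8), 0);
    assert!(alloc_bytes(0x2000_0000, 8).is_none()); // overflow caught
    assert!(alloc_bytes(1024, 8).is_some());        // 8 KiB, fine
}
```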

Read more of this story at Slashdot.

  •  

OSU's Open Source Lab Eyes Infrastructure Upgrades and Sustainability After Recent Funding Success

It's a nonprofit that provides hosting for the Linux Foundation, the Apache Software Foundation, Drupal, Firefox, and 160 other projects — delivering nearly 430 terabytes of information every month. (It's currently hosting Debian, Fedora, and Gentoo Linux.) But hosting only provides about 20% of its income, with the rest coming from individual and corporate donors (including Google and IBM). "Over the past several years, we have been operating at a deficit due to a decline in corporate donations," the Open Source Lab's director announced in late April. It's part of the CS/electrical engineering department at Oregon State University, and while the department "has generously filled this gap, recent changes in university funding makes our current funding model no longer sustainable. Unless we secure $250,000 in committed funds, the OSL will shut down later this year." But "Thankfully, the call for support worked, paving the way for the OSU Open Source Lab to look ahead, into what the future holds for them," reports the blog It's FOSS. "Following our OSL Future post, the community response has been incredible!" posted director Lance Albertson. "Thanks to your amazing support, our team is funded for the next year. This is a huge relief and lets us focus on building a truly self-sustaining OSL." To get there, we're tackling two big interconnected goals: 1. Finding a new, cost-effective physical home for our core infrastructure, ideally with more modern hardware. 2. Securing multi-year funding commitments to cover all our operations, including potential new infrastructure costs and hardware refreshes. Our current data center is over 20 years old and needs to be replaced soon. With Oregon State University evaluating the future of this facility, it's very likely we'll need to relocate in the near future. While migrating to the State of Oregon's data center is one option, it comes with significant new costs. This makes finding free or very low-cost hosting (ideally between Eugene and Portland for ~13-20 racks) a huge opportunity for our long-term sustainability. More power-efficient hardware would also help us shrink our footprint. Speaking of hardware, refreshing some of our older gear during a move would be a game-changer. We don't need brand new, but even a few-generations-old refurbished systems would boost performance and efficiency. (Huge thanks to the Yocto Project and Intel for a recent hardware donation that showed just how impactful this is!) The dream? A data center partner donating space and cycled-out hardware. Our overall infrastructure strategy is flexible. We're enhancing our OpenStack/Ceph platforms and exploring public cloud credits and other donated compute capacity. But whatever the resource, it needs to fit our goals and come with multi-year commitments for stability. And, a physical space still offers unique value, especially the invaluable hands-on data center experience for our students.... [O]ur big focus this next year is locking in ongoing support — think annualized pledges, different kinds of regular income, and other recurring help. This is vital, especially with potential new data center costs and hardware needs. Getting this right means we can stop worrying about short-term funding and plan for the future: investing in our tech and people, growing our awesome student programs, and serving the FOSS community. We're looking for partners, big and small, who get why foundational open source infrastructure matters and want to help us build this sustainable future together. 
The It's FOSS blog adds that "With these prerequisites in place, the OSUOSL intends to expand their student program, strengthen their managed services portfolio for open source projects, introduce modern tooling like Kubernetes and Terraform, and encourage more community volunteers to actively contribute." Thanks to long-time Slashdot reader I'm just joshin for suggesting the story.

Read more of this story at Slashdot.

  •  

YouTube Announces Gemini AI Feature to Target Ads When Viewers are Most Engaged

A new YouTube tool will let advertisers use Google's Gemini AI model to target ads to viewers when they're most engaged, reports CNBC: Peak Points has the potential to enable more impressions and a higher click-through rate on YouTube, a primary metric that determines how creators earn money on the video platform... Peak Points is currently in a pilot program and will be rolling out over the rest of the year. The product "aims to benefit advertisers by using a tactic that aims to grab users' attention right when they're most invested in the content," reports TechCrunch: This approach appears to be similar to a strategy called emotion-based targeting, where advertisers place ads that align with the emotions evoked by the video. It's believed that when viewers experience heightened emotional states, it leads to better recall of the ads. However, viewers may find these interruptions frustrating, especially when they're deeply engaged in the emotional arc of a video and want the ad to be over quickly to resume watching. In related news, YouTube announced another ad format that may be more appealing to users. The platform debuted a shoppable product feed where users can browse and purchase items during an ad.

Read more of this story at Slashdot.

  •  

9 Months Later, Microsoft Finally Fixes Linux Dual-Booting Bug

Last August a Microsoft security update broke dual-booting Windows 11 and Linux systems, remembers the blog Neowin. Distros like Debian, Ubuntu, Linux Mint, Zorin OS, and Puppy Linux were all affected, and "a couple of days later, Microsoft provided a slightly lengthy workaround that involved tweaking around with policies and the Registry in order to fix the problem." The update "was meant to address a GRUB bootloader vulnerability that allowed malicious actors to bypass Secure Boot's safety mechanisms," notes the It's FOSS blog. "Luckily, there's now a proper fix for this, as Microsoft has quietly released a new patch on May 13, 2025, addressing the issue nine months after it was first reported... Meanwhile, many dual-boot users were left with borked setups, having to use workarounds or disable Secure Boot altogether."

Read more of this story at Slashdot.

  •  

Ask Slashdot: Would You Consider a Low-Latency JavaScript Runtime For Your Workflow?

Amazon's AWS Labs has created LLRT, an experimental, lightweight JavaScript runtime designed to address the growing demand for fast and efficient serverless applications. Slashdot reader BitterEpic wants to know what you think of it: Traditional JavaScript runtimes like Node.js rely on garbage collection, which can introduce unpredictable pauses and slow down performance, especially during cold starts in serverless environments like AWS Lambda. LLRT's manual memory management, courtesy of Rust, eliminates this issue, leading to smoother, more predictable performance. LLRT also has a runtime under 2MB, a huge reduction compared to the 100MB+ typically required by Node.js. This lightweight design means lower memory usage, better scalability, and reduced operational costs. Without the overhead of garbage collection, LLRT has faster cold start times and can initialize in milliseconds—perfect for latency-sensitive applications where every millisecond counts. For JavaScript developers, LLRT offers the best of both worlds: rapid development with JavaScript's flexibility, combined with Rust's performance. This means faster, more scalable applications without the usual memory bloat and cold start issues. Still in beta, LLRT promises to be a major step forward for serverless JavaScript applications. By combining Rust's performance with JavaScript's flexibility, it opens new possibilities for building high-performance, low-latency applications. If it continues to evolve, LLRT could become a core offering in AWS Lambda, potentially changing how we approach serverless JavaScript development. Would you consider JavaScript as the core of your future workflow? Or maybe you would prefer to go lower level with QuickJS?

Read more of this story at Slashdot.

  •  

Google Restores Nextcloud Users' File Access on Android

An anonymous reader shared this report from Ars Technica: Nextcloud, a host-your-own cloud platform that wants to help you "regain control over your data," has had to tell its Android-using customers for months now that they cannot upload files from their phone to their own servers. Months of emails and explanations to Google's Play Store representatives have yielded no changes, Nextcloud said in a blog post. That blog post — and media coverage of it — seem to have moved the needle. In an update to the post, Nextcloud wrote that as of May 15, Google has offered to restore full file access permissions. "We are preparing a test release first (expected tonight) and a final update with all functionality restored. If no issues occur, the update will hopefully be out early next week," the Nextcloud team wrote.... [Nextcloud] told The Register that it had more than 800,000 Android users. The company's blog post goes further than pinpointing technical and support hurdles. "It is a clear example of Big Tech gatekeeping smaller software vendors, making the products of their competitors worse or unable to provide the same services as the giants themselves sell," Nextcloud's post states. "Big Tech is scared that small players like Nextcloud will disrupt them, like they once disrupted other companies. So they try to shut the door." Nextcloud is one of the leaders of an antitrust-minded movement against Microsoft's various integrated apps and services, having filed a complaint against the firm in 2021.

Read more of this story at Slashdot.

  •  

Stack Overflow Seeks Realignment 'To Support the Builders of the Future in an AI World'

"The world has changed," writes Stack Overflow's blog. "Fast. Artificial intelligence is reshaping how we build, learn, and solve problems. Software development looks dramatically different than it did even a few years ago — and the pace of change is only accelerating." And they believe their brand "at times" lost "fidelity and clarity. It's very much been always added to and not been thought of holistically. So, it's time for our brand to evolve too," they write, hoping to articulate a perspective "forged in the fires of community, powered by collaboration, shaped by AI, and driven by people." The developer news site DevClass notes the change happens "as the number of posts to its site continues a dramatic decline thanks to AI-driven alternatives." According to a quick query on the official data explorer, the sum of questions and answers posted in April 2025 was down by over 64 percent from the same month in 2024, and plunged more than 90 percent from April 2020, when traffic was near its peak... Although declining traffic is a sign of Stack Overflow's reduced significance in the developer community, the company's business is not equally affected so far. Stack Exchange is a business owned by investment company Prosus, and the Stack Exchange products include private versions of its site (Stack Overflow for Teams) as well as advertising and recruitment. According to the Prosus financial results, in the six months ended September 2024, Stack Overflow increased its revenue and reduced its losses. The company's search for a new direction though confirms that the fast-disappearing developer engagement with Stack Overflow poses an existential challenge to the organization. DevClass says Stack Overflow's parent company "is casting about for new ways to provide value (and drive business) in this context..." The company has already experimented with various new services, via its Labs research department, including an AI Answer Assistant and Question Assistant, as well as a revamped jobs site in association with recruitment site Indeed, Discussions for technical debate, and extensions for GitHub Copilot, Slack, and Visual Studio Code. From the official announcement on Stack Overflow's blog: This rebrand isn't just a fresh coat of paint. It's a realignment with our purpose: to support the builders of the future in an AI world — with clarity, speed, and humanity. It's about showing up in a way that reflects who we are today, and where we're headed tomorrow. "We have appointed an internal steering group and we have engaged with an external expert partner in this area to help bring about the required change," notes a post in Stack Exchange's "meta" area. This isn't just about a visual update or marketing exercise — it's going to bring about a shift in how we present ourselves to the world which you will feel everywhere from the design to the copywriting, so that we can better achieve our goals and shared mission. As the emergence of AI has called into question the role of Stack Overflow and the Stack Exchange Network, one of the desired outputs of the rebrand process is to clarify our place in the world. We've done work toward this already — our recent community AMA is an example of this — but we want to ensure that this comes across in our brand and identity as well. We want the community to be involved and have a strong voice in the process of renewing and refreshing our brand. Remember, Stack Overflow started with a public discussion about what to name it! 
And in another post two months ago, Stack Exchange said it is exploring early ideas for expanding beyond the "single lane" Q&A highway. Our goal right now is to better understand the problems, opportunities, and needs before deciding on any specific changes... The vision is to potentially enable:
- A slower lane, with high-quality durable knowledge that takes time to create and curate, like questions and answers.
- A medium lane, for more flexible engagement, with features like Discussions or more flexible Stack Exchanges, where users can explore ideas or share opinions.
- A fast lane for quick, real-time interaction, with features like Chat that can bring the community together to discuss topics instantly.
With this in mind, we're seeking your feedback on the current state of Chat, what's most important to you, and how you see Chat fitting into the future. In a post in Stack Exchange's "meta" area, brand design director David Longworth says the "tension mentioned between Stack Overflow and Stack Exchange" is probably the most relevant to the rebranding. But he posted later that "There's a lot of people behind the scenes on this who care deeply about getting this right! Thank you on behalf of myself and the team."

Read more of this story at Slashdot.

  •  

Intel Struggles To Reverse AMD's Share Gains In x86 CPU Market

An anonymous reader shared this report from CRN: CPU-tracking firm Mercury Research reported on Thursday that Intel's x86 CPU market share grew 0.3 points sequentially to 75.6 percent against AMD's 24.4 percent in the first quarter. However, AMD managed to increase its market share by 3.6 points year over year. These figures only captured the server, laptop and desktop CPU segments. When including IoT and semicustom products, AMD grew its x86 market share sequentially by 1.5 points and year over year by 0.9 points to 27.1 percent against Intel's 72.9 percent... AMD managed to gain ground on Intel in the desktop and server segments sequentially and year over year. But it was in the laptop segment where Intel eked out a sequential share gain, even though rival AMD ended up finishing the first quarter with a higher share of shipments than what it had a year ago... While AMD mostly came out on top in the first quarter, [Mercury Research President Dean] McCarron said ARM's estimated CPU share against x86 products crossed into the double digits for the first time, growing 2.3 points sequentially to 11.9 percent. This was mainly due to a "surge" of Nvidia's Grace CPUs for servers and a large increase of Arm CPU shipments for Chromebooks. Meanwhile, PC Gamer reports that ARM's share of the PC processor market "grew to 13.6% in the first quarter of 2025 from 10.8% in the fourth quarter of 2024." And they note it's still only rumored that an Arm-based chip from AMD will be available as soon as next year. [I]f one of the two big players in x86 does release a mainstream Arm chip for the PC, that will be very significant. If it comes at about the same time as Nvidia's rumoured Arm chip for the PC, well, momentum really will be building and questioning x86's dominance will be wholly justified.

Read more of this story at Slashdot.

  •  

Is the Altruistic OpenAI Gone?

"The altruistic OpenAI is gone, if it ever existed," argues a new article in the Atlantic, based on interviews with more than 90 current and former employees, including executives. It notes that shortly before Altman's ouster (and rehiring) he was "seemingly trying to circumvent safety processes for expediency," with OpenAI co-founder/chief scientist Ilya telling three board members "I don't think Sam is the guy who should have the finger on the button for AGI." (The board had already discovered Altman "had not been forthcoming with them about a range of issues" including a breach in the Deployment Safety Board's protocols.) Adapted from the upcoming book, Empire of AI, the article first revisits the summer of 2023, when Sutskever ("the brain behind the large language models that helped build ChatGPT") met with a group of new researchers: Sutskever had long believed that artificial general intelligence, or AGI, was inevitable — now, as things accelerated in the generative-AI industry, he believed AGI's arrival was imminent, according to Geoff Hinton, an AI pioneer who was his Ph.D. adviser and mentor, and another person familiar with Sutskever's thinking.... To people around him, Sutskever seemed consumed by thoughts of this impending civilizational transformation. What would the world look like when a supreme AGI emerged and surpassed humanity? And what responsibility did OpenAI have to ensure an end state of extraordinary prosperity, not extraordinary suffering? By then, Sutskever, who had previously dedicated most of his time to advancing AI capabilities, had started to focus half of his time on AI safety. He appeared to people around him as both boomer and doomer: more excited and afraid than ever before of what was to come. That day, during the meeting with the new researchers, he laid out a plan. "Once we all get into the bunker — " he began, according to a researcher who was present. "I'm sorry," the researcher interrupted, "the bunker?" "We're definitely going to build a bunker before we release AGI," Sutskever replied. Such a powerful technology would surely become an object of intense desire for governments globally. The core scientists working on the technology would need to be protected. "Of course," he added, "it's going to be optional whether you want to get into the bunker." Two other sources I spoke with confirmed that Sutskever commonly mentioned such a bunker. "There is a group of people — Ilya being one of them — who believe that building AGI will bring about a rapture," the researcher told me. "Literally, a rapture...." But by the middle of 2023 — around the time he began speaking more regularly about the idea of a bunker — Sutskever was no longer just preoccupied by the possible cataclysmic shifts of AGI and superintelligence, according to sources familiar with his thinking. He was consumed by another anxiety: the erosion of his faith that OpenAI could even keep up its technical advancements to reach AGI, or bear that responsibility with Altman as its leader. Sutskever felt Altman's pattern of behavior was undermining the two pillars of OpenAI's mission, the sources said: It was slowing down research progress and eroding any chance at making sound AI-safety decisions. "For a brief moment, OpenAI's future was an open question. It might have taken a path away from aggressive commercialization and Altman. But this is not what happened," the article concludes. Instead there was "a lack of clarity from the board about their reasons for firing Altman." 
There was fear about a failure to realize their potential (and some employees feared losing a chance to sell millions of dollars' worth of their equity). "Faced with the possibility of OpenAI falling apart, Sutskever's resolve immediately started to crack... He began to plead with his fellow board members to reconsider their position on Altman." And in the end "Altman would come back; there was no other way to save OpenAI." To me, the drama highlighted one of the most urgent questions of our generation: How do we govern artificial intelligence? With AI on track to rewire a great many other crucial functions in society, that question is really asking: How do we ensure that we'll make our future better, not worse? The events of November 2023 illustrated in the clearest terms just how much a power struggle among a tiny handful of Silicon Valley elites is currently shaping the future of this technology. And the scorecard of this centralized approach to AI development is deeply troubling. OpenAI today has become everything that it said it would not be.... The author believes OpenAI "has grown ever more secretive, not only cutting off access to its own research but shifting norms across the industry to no longer share meaningful technical details about AI models..." "At the same time, more and more doubts have risen about the true economic value of generative AI, including a growing body of studies that have shown that the technology is not translating into productivity gains for most workers, while it's also eroding their critical thinking."

Read more of this story at Slashdot.

  •