'Rust is So Good You Can Get Paid $20K to Make It as Fast as C'

The Prossimo project (funded by the nonprofit Internet Security Research Group) seeks to "move the Internet's security-sensitive software infrastructure to memory safe code." Two years ago the project announced it had begun work on rav1d, a safer, high-performance AV1 decoder written in Rust. According to a new update:

We partnered with Immunant to do the engineering work. By September of 2024 rav1d was basically complete, and we learned a lot during the process. Today rav1d works well — it passes all the same tests as the dav1d decoder it is based on, which is written in C. It's possible to build and run Chromium with it. There's just one problem — it's not quite as fast as the C version...

Our Rust-based rav1d decoder is currently about 5% slower than the C-based dav1d decoder (the exact amount differs a bit depending on the benchmark, input, and platform). This is enough of a difference to be a problem for potential adopters and, frankly, it just bothers us. The development team worked hard to get it to performance parity. We brought in a couple of other contractors who have experience with optimizing things like this. We wrote about the optimization work we did. However, we were still unable to get to performance parity and, to be frank again, we aren't really sure what to do next. After racking our brains for options, we decided to offer a bounty pool of $20,000 for getting rav1d to performance parity with dav1d. Hopefully folks out there can help get rav1d performance advanced to where it needs to be, and ideally we and the Rust community will also learn something about how Rust performance stacks up against C.

This drew a snarky response from FFmpeg, the framework that powers audio and video processing for everyone from VLC to Twitch. "Rust is so good you can get paid $20k to make it as fast as C," they posted to their 68,300 followers on X.com. Thanks to the It's FOSS blog for spotting the announcement.

Read more of this story at Slashdot.

  •  

Ask Slashdot: Would You Consider a Low-Latency JavaScript Runtime For Your Workflow?

Amazon's AWS Labs has created LLRT, an experimental, lightweight JavaScript runtime designed to address the growing demand for fast and efficient serverless applications. Slashdot reader BitterEpic wants to know what you think of it:

Traditional JavaScript runtimes like Node.js rely on garbage collection, which can introduce unpredictable pauses and slow down performance, especially during cold starts in serverless environments like AWS Lambda. LLRT's manual memory management, courtesy of Rust, eliminates this issue, leading to smoother, more predictable performance. LLRT also has a runtime under 2MB, a huge reduction compared to the 100MB+ typically required by Node.js. This lightweight design means lower memory usage, better scalability, and reduced operational costs. Without the overhead of garbage collection, LLRT has faster cold start times and can initialize in milliseconds — perfect for latency-sensitive applications where every millisecond counts.

For JavaScript developers, LLRT offers the best of both worlds: rapid development with JavaScript's flexibility, combined with Rust's performance. This means faster, more scalable applications without the usual memory bloat and cold start issues. Still in beta, LLRT promises to be a major step forward for serverless JavaScript applications. By combining Rust's performance with JavaScript's flexibility, it opens new possibilities for building high-performance, low-latency applications. If it continues to evolve, LLRT could become a core offering in AWS Lambda, potentially changing how we approach serverless JavaScript development.

Would you consider JavaScript as the core of your future workflow? Or maybe you would prefer to go lower-level with QuickJS?

  •  

Stack Overflow Seeks Realignment 'To Support the Builders of the Future in an AI World'

"The world has changed," writes Stack Overflow's blog. "Fast. Artificial intelligence is reshaping how we build, learn, and solve problems. Software development looks dramatically different than it did even a few years ago — and the pace of change is only accelerating." And they believe their brand "at times" lost "fidelity and clarity. It's very much been always added to and not been thought of holistically. So, it's time for our brand to evolve too," they write, hoping to articulate a perspective "forged in the fires of community, powered by collaboration, shaped by AI, and driven by people."

The developer news site DevClass notes the change happens "as the number of posts to its site continues a dramatic decline thanks to AI-driven alternatives."

According to a quick query on the official data explorer, the sum of questions and answers posted in April 2025 was down by over 64 percent from the same month in 2024, and plunged more than 90 percent from April 2020, when traffic was near its peak... Although declining traffic is a sign of Stack Overflow's reduced significance in the developer community, the company's business is not equally affected so far. Stack Exchange is a business owned by investment company Prosus, and the Stack Exchange products include private versions of its site (Stack Overflow for Teams) as well as advertising and recruitment. According to the Prosus financial results, in the six months ended September 2024, Stack Overflow increased its revenue and reduced its losses. The company's search for a new direction, though, confirms that the fast-disappearing developer engagement with Stack Overflow poses an existential challenge to the organization.

DevClass says Stack Overflow's parent company "is casting about for new ways to provide value (and drive business) in this context..."
The company has already experimented with various new services, via its Labs research department, including an AI Answer Assistant and Question Assistant, as well as a revamped jobs site in association with recruitment site Indeed, Discussions for technical debate, and extensions for GitHub Copilot, Slack, and Visual Studio Code.

From the official announcement on Stack Overflow's blog: This rebrand isn't just a fresh coat of paint. It's a realignment with our purpose: to support the builders of the future in an AI world — with clarity, speed, and humanity. It's about showing up in a way that reflects who we are today, and where we're headed tomorrow.

"We have appointed an internal steering group and we have engaged with an external expert partner in this area to help bring about the required change," notes a post in Stack Exchange's "meta" area. This isn't just about a visual update or marketing exercise — it's going to bring about a shift in how we present ourselves to the world which you will feel everywhere from the design to the copywriting, so that we can better achieve our goals and shared mission. As the emergence of AI has called into question the role of Stack Overflow and the Stack Exchange Network, one of the desired outputs of the rebrand process is to clarify our place in the world. We've done work toward this already — our recent community AMA is an example of this — but we want to ensure that this comes across in our brand and identity as well. We want the community to be involved and have a strong voice in the process of renewing and refreshing our brand. Remember, Stack Overflow started with a public discussion about what to name it!

And in another post two months ago: Stack Exchange is exploring early ideas for expanding beyond the "single lane" Q&A highway. Our goal right now is to better understand the problems, opportunities, and needs before deciding on any specific changes...
The vision is to potentially enable:

- A slower lane, with high-quality durable knowledge that takes time to create and curate, like questions and answers.
- A medium lane, for more flexible engagement, with features like Discussions or more flexible Stack Exchanges, where users can explore ideas or share opinions.
- A fast lane for quick, real-time interaction, with features like Chat that can bring the community together to discuss topics instantly.

With this in mind, we're seeking your feedback on the current state of Chat, what's most important to you, and how you see Chat fitting into the future.

In a post in Stack Exchange's "meta" area, brand design director David Longworth says the "tension mentioned between Stack Overflow and Stack Exchange" is probably the most relevant to the rebranding. But he posted later that "There's a lot of people behind the scenes on this who care deeply about getting this right! Thank you on behalf of myself and the team."

  •  

Curl Warns GitHub About 'Malicious Unicode' Security Issue

A Curl contributor replaced an ASCII letter with a Unicode alternative in a pull request, writes Curl lead developer/founder Daniel Stenberg, and not a single human reviewer on the team (or any of their CI jobs) noticed. The change "looked identical to the ASCII version, so it was not possible to visually spot this..." The impact of changing one or more letters in a URL can of course be devastating depending on conditions...

[W]e have implemented checks to help us poor humans spot things like this. To detect malicious Unicode. We have added a CI job that scans all files and validates every UTF-8 sequence in the git repository. In the curl git repository most files and most content are plain old ASCII, so we can "easily" whitelist a small set of UTF-8 sequences and some specific files; the rest of the files are simply not allowed to use UTF-8 at all, as they will then fail the CI job and turn up red. In order to drive this change home, we went through all the test files in the curl repository and made sure that all the UTF-8 occurrences were instead replaced by other kinds of escape sequences and similar. Some of them were also used more or less by mistake and could easily be replaced by their ASCII counterparts.

The next time someone tries this stunt on us it could be someone with less good intentions, but now ideally our CI will tell us... We want and strive to be proactive and tighten everything before malicious people exploit some weakness somewhere, but security remains this never-ending race where we can only do the best we can, while the other side works in silence and might at some future point attack us in new creative ways we had not anticipated. That future unknown attack is a tricky thing.

In the original blog post Stenberg complained he got "barely no responses" from GitHub (joking "perhaps they are all just too busy implementing the next AI feature we don't want.") But hours later he posted an update.
"GitHub has told me they have raised this as a security issue internally and they are working on a fix."
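The CI check Stenberg describes can be sketched in a few lines: walk the files under review and flag any byte above 0x7F in a file that isn't on an explicit allowlist. A minimal sketch in Python (not curl's actual implementation; the allowlist entry is invented):

```python
from pathlib import Path

# Hypothetical allowlist: files permitted to contain UTF-8 sequences.
ALLOWLIST = {"docs/THANKS.md"}

def non_ascii_lines(path: Path):
    """Yield (line_number, decoded_line) for lines containing bytes above 0x7F."""
    for lineno, raw in enumerate(path.read_bytes().splitlines(), start=1):
        if any(b > 0x7F for b in raw):
            yield lineno, raw.decode("utf-8", errors="replace")

def scan(paths):
    """Return a list of (path, line_number, line) violations for a CI job."""
    failures = []
    for p in map(Path, paths):
        if p.as_posix() in ALLOWLIST:
            continue
        failures += [(p.as_posix(), n, line) for n, line in non_ascii_lines(p)]
    return failures
```

In a CI job, any non-empty result fails the build. Swapping an ASCII "a" for a Cyrillic "а" (U+0430) produces a multi-byte UTF-8 sequence, so this byte-level check catches the substitution even though the two glyphs are visually identical to a reviewer.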

  •  

Rust Creator Graydon Hoare Thanks Its Many Stakeholders - and Mozilla - on Rust's 10th Anniversary

Thursday was Rust's 10-year anniversary for its first stable release. "To say I'm surprised by its trajectory would be a vast understatement," writes Rust's original creator Graydon Hoare. "I can only thank, congratulate, and celebrate everyone involved... In my view, Rust is a story about a large community of stakeholders coming together to design, build, maintain, and expand shared technical infrastructure." It's a story with many actors:

- The population of developers the language serves who express their needs and constraints through discussion, debate, testing, and bug reports arising from their experience writing libraries and applications.
- The language designers and implementers who work to satisfy those needs and constraints while wrestling with the unexpected consequences of each decision.
- The authors, educators, speakers, translators, illustrators, and others who work to expand the set of people able to use the infrastructure and work on the infrastructure.
- The institutions investing in the project who provide the long-term funding and support necessary to sustain all this work over decades.

All these actors have a common interest in infrastructure. Rather than just "systems programming", Hoare sees Rust as a tool for building infrastructure itself, "the robust and reliable necessities that enable us to get our work done" — a wide range that includes everything from embedded and IoT systems to multi-core systems. So the story of "Rust's initial implementation, its sustained investment, and its remarkable resonance and uptake all happened because the world needs robust and reliable infrastructure, and the infrastructure we had was not up to the task." Put simply: it failed too often, in spectacular and expensive ways. Crashes and downtime in the best cases, and security vulnerabilities in the worst.
Efficient "infrastructure-building" languages existed but they were very hard to use, and nearly impossible to use safely, especially when writing concurrent code. This produced an infrastructure deficit many people felt, if not everyone could name, and it was growing worse by the year as we placed ever-greater demands on computers to work in ever more challenging environments... We were stuck with the tools we had because building better tools like Rust was going to require an extraordinary investment of time, effort, and money. The bootstrap Rust compiler I initially wrote was just a few tens of thousands of lines of code; that was nearing the limits of what an unfunded solo hobby project can typically accomplish. Mozilla's decision to invest in Rust in 2009 immediately quadrupled the size of the team — it created a team in the first place — and then doubled it again, and again in subsequent years. Mozilla sustained this very unusual, very improbable investment in Rust from 2009-2020, as well as funding an entire browser engine written in Rust — Servo — from 2012 onwards, which served as a crucial testbed for Rust language features. Rust and Servo had multiple contributors at Samsung, Hoare acknowledges, and Amazon, Facebook, Google, Microsoft, Huawei, and others "hired key developers and contributed hardware and management resources to its ongoing development." Rust itself "sits atop LLVM" (developed by researchers at UIUC and later funded by Apple, Qualcomm, Google, ARM, Huawei, and many other organizations), while Rust's safe memory model "derives directly from decades of research in academia, as well as academic-industrial projects like Cyclone, built by AT&T Bell Labs and Cornell." And there were contributions from "interns, researchers, and professors at top academic research programming-language departments, including CMU, NEU, IU, MPI-SWS, and many others." 
JetBrains and the Rust-Analyzer OpenCollective essentially paid for two additional interactive-incremental reimplementations of the Rust frontend to provide language services to IDEs — critical tools for productive, day-to-day programming. Hundreds of companies and other institutions contributed time and money to evaluate Rust for production, write Rust programs, test them, file bugs related to them, and pay their staff to fix or improve any shortcomings they found. Last but very much not least: Rust has had thousands and thousands of volunteers donating years of their labor to the project. While it might seem tempting to think this is all "free", it's being paid for! Just less visibly than if it were part of a corporate budget. All this investment, despite the long time horizon, paid off. We're all better for it. He looks ahead with hope for a future with new contributors, "steady and diversified streams of support," and continued reliability and compatibility (including "investment in ever-greater reliability technology, including the many emerging formal methods projects built on Rust.") And he closes by saying Rust's "sustained, controlled, and frankly astonishing throughput of work" has "set a new standard for what good tools, good processes, and reliable infrastructure software should be like. "Everyone involved should be proud of what they've built."

  •  

Over 3,200 Cursor Users Infected by Malicious Credential-Stealing npm Packages

Cybersecurity researchers have flagged three malicious npm packages that target the macOS version of AI-powered code-editing tool Cursor, reports The Hacker News:

"Disguised as developer tools offering 'the cheapest Cursor API,' these packages steal user credentials, fetch an encrypted payload from threat actor-controlled infrastructure, overwrite Cursor's main.js file, and disable auto-updates to maintain persistence," Socket researcher Kirill Boychenko said. All three packages continue to be available for download from the npm registry. "Aiide-cur" was first published on February 14, 2025... In total, the three packages have been downloaded over 3,200 times to date...

The findings point to an emerging trend where threat actors are using rogue npm packages as a way to introduce malicious modifications to other legitimate libraries or software already installed on developer systems...

"By operating inside a legitimate parent process — an IDE or shared library — the malicious logic inherits the application's trust, maintains persistence even after the offending package is removed, and automatically gains whatever privileges that software holds, from API tokens and signing keys to outbound network access," Socket told The Hacker News. "This campaign highlights a growing supply chain threat, with threat actors increasingly using malicious patches to compromise trusted local software," Boychenko said. The npm packages "restart the application so that the patched code takes effect," letting the threat actor "execute arbitrary code within the context of the platform."
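Because the payload works by overwriting an application file (Cursor's main.js) on disk, one generic defense is a file-integrity check: record a cryptographic hash of the file and alert whenever it changes outside a legitimate update. A minimal sketch (the workflow is illustrative, not a feature of Cursor or npm):

```python
import hashlib
from pathlib import Path

def file_digest(path) -> str:
    """SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def verify(path, expected: str) -> bool:
    """True if the file still matches the recorded baseline digest."""
    return file_digest(path) == expected
```

A baseline digest would be recorded after each official update; a mismatch at any other time signals that something (such as a malicious package's post-install patch) has modified the file.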

  •  

'Who Needs Rust's Borrow-Checking Compiler Nanny? C++ Devs Aren't Helpless'

"When Rust developers think of us C++ folks, they picture a cursed bloodline," writes professional game developer Mamadou Babaei (also a *nix enthusiast who contributes to the FreeBSD Ports collection). "To them, every line of C++ we write is like playing Russian Roulette — except all six chambers are loaded with undefined behavior." But you know what? We don't need a compiler nanny. No borrow checker. No lifetimes. No ownership models. No black magic. Not even Valgrind is required. Just raw pointers, raw determination, and a bit of questionable sanity. He's created a video on "how to hunt down memory leaks like you were born with a pointer in one hand and a debugger in the other." (It involves using a memory leak tracker — specifically, Visual Studio's _CrtDumpMemoryLeaks, which according to its documentation "dumps all the memory blocks in the debug heap when a memory leak has occurred," identifying the offending lines and pointers.) "If that sounds unreasonably dangerous — and incredibly fun... let's dive into the deep end of the heap." "The method is so easy, it renders Rust's memory model (lifetimes, ownership) and the borrow checker useless!" writes Slashdot reader NuLL3rr0r. Does anybody agree with him? Share your own experiences and reactions in the comments. And how do you feel about Rust's "borrow-checking compiler nanny"?

  •  

What Happens If AI Coding Keeps Improving?

Fast Company's "AI Decoded" newsletter makes the case that the first "killer app" for generative AI... is coding. Tools like Cursor and Windsurf can now complete software projects with minimal input or oversight from human engineers... Naveen Rao, chief AI officer at Databricks, estimates that coding accounts for half of all large language model usage today. A 2024 GitHub survey found that over 97% of developers have used AI coding tools at work, with 30% to 40% of organizations actively encouraging their adoption.... Microsoft CEO Satya Nadella recently said AI now writes up to 30% of the company's code. Google CEO Sundar Pichai echoed that sentiment, noting more than 30% of new code at Google is AI-generated. The soaring valuations of AI coding startups underscore the momentum. Anysphere's Cursor just raised $900 million at a $9 billion valuation — up from $2.5 billion earlier this year. Meanwhile, OpenAI acquired Windsurf (formerly Codeium) for $3 billion. And the tools are improving fast. OpenAI's chief product officer, Kevin Weil, explained in a recent interview that just five months ago, the company's best model ranked around one-millionth on a well-known benchmark for competitive coders — not great, but still in the top two or three percentile. Today, OpenAI's top model, o3, ranks as the 175th best competitive coder in the world on that same test. The rapid leap in performance suggests an AI coding assistant could soon claim the number-one spot. "Forever after that point computers will be better than humans at writing code," he said... Google DeepMind research scientist Nikolay Savinov said in a recent interview that AI coding tools will soon support 10 million-token context windows — and eventually, 100 million. With that kind of memory, an AI tool could absorb vast amounts of human instruction and even analyze an entire company's existing codebase for guidance on how to build and optimize new systems. 
"I imagine that we will very soon get to superhuman coding AI systems that will be totally unrivaled, the new tool for every coder in the world," Savinov said.

  •  

Developer Tries Resurrecting 47-Year-Old 'Apple Pascal' (and its p-System) in Rust

Long-time Slashdot reader mbessey (a Mac/iOS developer) writes: As we're coming up on the 50th anniversary of the first release of UCSD Pascal, I thought it would be interesting to poke around in it a bit, and work on some tools to bring this "portable operating system" back to life on modern hardware, in a modern language (Rust). Wikipedia describes UCSD Pascal as "a version that ran on a custom operating system that could be ported to different platforms. A key platform was the Apple II, where it saw widespread use as Apple Pascal. This led to Pascal becoming the primary high-level language used for development in the Apple Lisa, and later, the Macintosh. Parts of the original Macintosh operating system were hand-translated into Motorola 68000 assembly language from the Pascal source code."

mbessey is chronicling their new project in a series of blog posts which begins here: The p-System was not the first portable byte-code interpreter and compiler system — that idea goes very far back, at least to the origins of the Pascal language itself. But it was arguably one of the most successful early versions of the idea and served as an inspiration for future portable software systems (including Java's bytecode, and Infocom's Z-machine).

And they've already gotten UCSD Pascal running in an emulator and built some tools (in Rust) to transfer files to disk images. Now they're working towards writing a p-machine emulator in Rust, which they can then port to "something other than the Mac. Ideally, something small — like an Arduino or Raspberry Pi Pico."
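The core of any p-machine-style design is small: the compiler emits a portable instruction set, and only the interpreter loop has to be rewritten for each host. A toy stack-machine sketch of the idea (the opcodes are invented for illustration, not actual UCSD p-codes):

```python
def run(program):
    """Execute a list of (opcode, *operands) tuples on a value stack."""
    stack, pc = [], 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "push":          # push a literal operand
            stack.append(args[0])
        elif op == "add":         # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":         # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "halt":
            break
        else:
            raise ValueError(f"unknown opcode: {op}")
        pc += 1
    return stack[-1] if stack else None
```

Porting this "machine" to an Arduino or a Pi Pico means reimplementing just this loop; every compiled program then runs unchanged, which is exactly the portability trick the p-System pioneered and Java's bytecode and Infocom's Z-machine later relied on.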

  •  

Tech Leaders Launch Campaign To Make CS and AI a Graduation Requirement

"Our future won't be handed to us," says the young narrator in a new ad from the nonprofit Code.org. "We will build it." "But how can we when the education we need is still just an elective?" says another young voice... The ad goes on to tout the power "to create with computer science and AI — the skills transforming every industry..." and ends by saying "This isn't radical. It's what education is supposed to do. Make computer science and AI a graduation requirement." There's also a hard-hitting new web site, which urges people to sign a letter of support (already signed by executives from top tech companies including Microsoft, Dropbox, AMD, Meta, Blue Origin, and Palantir — and by Steve Ballmer, who is listed as the chairman of the L.A. Clippers basketball team). Long-time Slashdot reader theodp says the letter ran in the New York Times, while this campaign will officially kick off Monday... Code.org teased the new Unlock8 campaign last month on social media as it celebrated a new Executive Order that makes K–12 AI literacy a U.S. priority, which it called a big win for CS & AI education, adding, "We've been building to this moment." The move to make CS and AI a graduation requirement is a marked reversal of Code.org's early days, when it offered Congressional testimony on behalf of itself and tech-led Computing in the Core reassuring lawmakers that: "Making computer science courses 'count' would not require schools to offer computer science or students to study it; it would simply allow existing computer science courses to satisfy a requirement that already exists."

  •  

Microsoft CEO Says Up To 30% of the Company's Code Was Written by AI

Microsoft CEO Satya Nadella said that 20%-30% of code inside the company's repositories was "written by software" -- meaning AI -- during a fireside chat with Meta CEO Mark Zuckerberg at Meta's LlamaCon conference on Tuesday. From a report: Nadella gave the figure after Zuckerberg asked roughly how much of Microsoft's code is AI-generated today. The Microsoft CEO said the company was seeing mixed results in AI-generated code across different languages, with more progress in Python and less in C++.

  •  

AI-Generated Code Creates Major Security Risk Through 'Package Hallucinations'

A new study [PDF] reveals AI-generated code frequently references non-existent third-party libraries, creating opportunities for supply-chain attacks. Researchers analyzed 576,000 code samples from 16 popular large language models and found 19.7% of package dependencies -- 440,445 in total -- were "hallucinated." These non-existent dependencies exacerbate dependency confusion attacks, where malicious packages with identical names to legitimate ones can infiltrate software. Open source models hallucinated at nearly 22%, compared to 5% for commercial models. "Once the attacker publishes a package under the hallucinated name, containing some malicious code, they rely on the model suggesting that name to unsuspecting users," said lead researcher Joseph Spracklen. Alarmingly, 43% of hallucinations repeated across multiple queries, making them predictable targets.
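A straightforward mitigation is to vet every LLM-suggested dependency against a known-good index (a lockfile, a private mirror, or a registry snapshot) before installing anything. A sketch with an invented allowlist standing in for such an index:

```python
# Stand-in for a real registry index or curated lockfile (hypothetical names).
KNOWN_PACKAGES = {"requests", "numpy", "flask"}

def vet_dependencies(suggested):
    """Split LLM-suggested package names into (known, suspect) lists."""
    known = [p for p in suggested if p in KNOWN_PACKAGES]
    suspect = [p for p in suggested if p not in KNOWN_PACKAGES]
    return known, suspect
```

Because 43% of hallucinated names repeat across queries, attackers can pre-register them and wait; refusing to install anything outside the vetted index closes that window, at the cost of maintaining the index.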

  •  

AI Tackles Aging COBOL Systems as Legacy Code Expertise Dwindles

US government agencies and Fortune 500 companies are turning to AI to modernize mission-critical systems built on COBOL, a programming language dating back to the late 1950s. The US Social Security Administration plans a three-year, $1 billion AI-assisted upgrade of its legacy COBOL codebase [alternative source], according to Bloomberg. Treasury Secretary Scott Bessent has repeatedly stressed the need to overhaul government systems running on COBOL. As experienced programmers retire, organizations face growing challenges maintaining these systems that power everything from banking applications to pension disbursements. Engineers now use tools like ChatGPT and IBM's watsonX to interpret COBOL code, create documentation, and translate it to modern languages.

  •  

Figma Sent a Cease-and-Desist Letter To Lovable Over the Term 'Dev Mode'

An anonymous reader quotes a report from TechCrunch: Figma has sent a cease-and-desist letter to popular no-code AI startup Lovable, Figma confirmed to TechCrunch. The letter tells Lovable to stop using the term "Dev Mode" for a new product feature. Figma, which also has a feature called Dev Mode, successfully trademarked that term last year, according to the U.S. Patent and Trademark Office.

What's wild is that "dev mode" is a common term used in many products that cater to software programmers. It's like an edit mode. Software products from giant companies (Apple's iOS, Google's Chrome, Microsoft's Xbox) have features formally called "developer mode" that then get nicknamed "dev mode" in reference materials. Even "dev mode" itself is commonly used. For instance, Atlassian used it in products that pre-date Figma's trademark by years. And it's a common feature name in countless open source software projects.

Figma tells TechCrunch that its trademark refers only to the shortcut "Dev Mode" -- not the full term "developer mode." Still, it's a bit like trademarking the term "bug" to refer to "debugging." Since Figma wants to own the term, it has little choice but to send cease-and-desist letters. (The letter, as many on X pointed out, was very polite, too.) If Figma doesn't defend the term, it could be absorbed as a generic term and the trademark becomes unenforceable.

  •  

You Should Still Learn To Code, Says GitHub CEO

You should still learn to code, says GitHub's CEO. And you should start as soon as possible. From a report: "I strongly believe that every kid, every child, should learn coding," Thomas Dohmke said in a recent podcast interview with EO. "We should actually teach them coding in school, in the same way that we teach them physics and geography and literacy and math and what-not." Coding, he added, is one such fundamental skill -- and the only reason it's not part of the curriculum is because it took "us too long to actually realize that." Dohmke, who's been a programmer since the 90s, said he's never seen "anything more exciting" than the current moment in engineering -- the advent of AI, he believes, has made the field that much easier to break into, and is poised to make software more ubiquitous than ever. "It's so much easier to get into software development. You can just write a prompt into Copilot or ChatGPT or similar tools, and it will likely write you a basic webpage, or a small application, a game in Python," Dohmke said. "And so, AI makes software development so much more accessible for anyone who wants to learn coding." AI, Dohmke said, helps to "realize the dream" of bringing an idea to life, meaning that fewer projects will end up dead in the water, and smaller teams of developers will be enabled to tackle larger-scale projects. Dohmke said he believes it makes the overall process of creation more efficient. "You see some of the early signs of that, where very small startups -- sometimes five developers and some of them actually only one developer -- believe they can become million, if not billion dollar businesses by leveraging all the AI agents that are available to them," he added.

  •  

AI Models Still Struggle To Debug Software, Microsoft Study Shows

Some of the best AI models today still struggle to resolve software bugs that wouldn't trip up experienced devs. TechCrunch: A new study from Microsoft Research, Microsoft's R&D division, reveals that models, including Anthropic's Claude 3.7 Sonnet and OpenAI's o3-mini, fail to debug many issues in a software development benchmark called SWE-bench Lite. The results are a sobering reminder that, despite bold pronouncements from companies like OpenAI, AI is still no match for human experts in domains such as coding. The study's co-authors tested nine different models as the backbone for a "single prompt-based agent" that had access to a number of debugging tools, including a Python debugger. They tasked this agent with solving a curated set of 300 software debugging tasks from SWE-bench Lite. According to the co-authors, even when equipped with stronger and more recent models, their agent rarely completed more than half of the debugging tasks successfully. Claude 3.7 Sonnet had the highest average success rate (48.4%), followed by OpenAI's o1 (30.2%), and o3-mini (22.1%).


'There is No Vibe Engineering'

Software engineer Sergey Tselovalnikov weighs in on the new hype: The term caught on and Twitter quickly flooded with posts about how AI has radically transformed coding and will soon replace all software engineers. While AI undeniably impacts the way we write code, it hasn't fundamentally changed our role as engineers. Allow me to explain. [...] Vibe coding is interacting with the codebase via prompts. As the implementation is hidden from the "vibe coder", all the engineering concerns will inevitably get ignored. Many of the concerns are hard to express in a prompt, and many of them are hard to verify by only inspecting the final artifact. Historically, all engineering practices have tried to shift all those concerns left -- to the earlier stages of development when they're cheap to address. Yet with vibe coding, they're shifted very far to the right -- when addressing them is expensive. The question of whether an AI system can perform the complete engineering cycle and build and evolve software the same way a human can remains open. However, there are no signs of it being able to do so at this point, and if it one day happens, it won't have anything to do with vibe coding -- at least the way it's defined today. [...] It is possible that there'll be a future where software is built from vibe-coded blocks, but the work of designing software able to evolve and scale doesn't go away. That's not vibe engineering -- that's just engineering, even if the coding part of it will look a bit different.


'No Longer Think You Should Learn To Code,' Says CEO of AI Coding Startup

Learning to code has sort of become pointless as AI increasingly dominates programming tasks, said Replit founder and chief executive Amjad Masad. "I no longer think you should learn to code," Masad wrote on X. The statement comes as major tech executives report significant AI inroads into software development. Google CEO Sundar Pichai recently revealed that 25% of new code at the tech giant is AI-generated, though still reviewed by engineers. Furthermore, Anthropic CEO Dario Amodei predicted AI could generate up to 90% of all code within six months. Masad called this shift a "bittersweet realization" after spending years popularizing coding through open-source work, Codecademy, and Replit -- a platform that now uses AI to help users build apps and websites. Instead of syntax-focused programming skills, Masad recommends learning "how to think, how to break down problems... how to communicate clearly, with humans and with machines."


How Rust Finally Got a Specification - Thanks to a Consultancy's Open-Source Donation

As Rust approaches its 10th anniversary, "there is an important piece of documentation missing that many other languages provide," notes the Rust Foundation. While there's documentation and tutorials — there's no official language specification: In December 2022, an RFC was submitted to encourage the Rust Project to begin working on a specification. After much discussion, the RFC was approved in July 2023, and work began. Initially, the Rust Project specification team (t-spec) were interested in creating the document from scratch using the Rust Reference as a guiding marker. However, the team knew there was already an external Rust specification that was being used successfully for compiler qualification purposes — the FLS. Thank Berlin-based Ferrous Systems, a Rust consultancy that assembled that description "some years ago," according to a post on the Rust blog: They've since been faithfully maintaining and updating this document for new versions of Rust, and they've successfully used it to qualify toolchains based on Rust for use in safety-critical industries. [The Rust Foundation notes it's part of the consultancy's "Ferrocene" Rust compiler/toolchain.] Seeing this success, others have also begun to rely on the FLS for their own qualification efforts when building with Rust. The Rust Foundation explains: The FLS provides a structured and detailed reference for Rust's syntax, semantics, and behavior, serving as a foundation for verification, compliance, and standardization efforts. Since Rust did not have an official language specification back then, nor a plan to write one, the FLS represented a major step toward describing Rust in a way that aligns with industry requirements, particularly in high-assurance domains. And the Rust Project is "passionate about shipping high quality tools that enable people to build reliable software at scale," adds the Rust blog. So... 
It's in that light that we're pleased to announce that we'll be adopting the FLS into the Rust Project as part of our ongoing specification efforts. This adoption is being made possible by the gracious donation of the FLS by Ferrous Systems. We're grateful to them for the work they've done in assembling the FLS, in making it fit for qualification purposes, in promoting its use and the use of Rust generally in safety-critical industries, and now, for working with us to take the next step and to bring the FLS into the Project. With this adoption, we look forward to better integrating the FLS with the processes of the Project and to providing ongoing and increased assurances to all those who use Rust in safety-critical industries and, in particular, to those who use the FLS as part of their qualification efforts. More from the Rust Foundation: The t-spec team wanted to avoid potential confusion from having two highly visible Rust specifications in the industry and so decided it would be worthwhile to try to integrate the FLS with the Rust Reference to create the official Rust Project specification. They approached Ferrous Systems, which agreed to contribute its FLS to the Rust Project and allow the Rust Project to take over its development and management... This generous donation will provide a clearer path to delivering an official Rust specification. It will also empower the Rust Project to oversee its ongoing evolution, providing confidence to companies and individuals already relying on the FLS, and marking a major milestone for the Rust ecosystem. "I really appreciate Ferrous taking this step to provide their specification to the Rust Project," said Joel Marcey, Director of Technology at the Rust Foundation and member of the t-spec team. "They have already done a massive amount of legwork...." This effort will provide others who require a Rust specification with an official, authoritative reference for their work with the Rust programming language... 
This is an exciting outcome. A heartfelt thank you to the Ferrous Systems team for their invaluable contribution! Marcey said the move allows the team "to supercharge our progress in the delivery of an official Rust specification." And the co-founder of Ferrous Systems, Felix Gilcher, also sounded excited. "We originally created the Ferrocene Language Specification to provide a structured and reliable description of Rust for the certification of the Ferrocene compiler. As an open source-first company, contributing the FLS to the Rust Project is a logical step toward fostering the development of a unified, community-driven specification that benefits all Rust users."


DOGE To Rewrite SSA Codebase In 'Months'

Longtime Slashdot reader frank_adrian314159 writes: According to an article in Wired, Elon Musk has appointed a team of technologists from DOGE to "rewrite the code that runs the SSA in months." This codebase has over 60 million lines of COBOL and handles record keeping for all American workers and payments for all Social Security recipients. Given that the code has to track the byzantine regulations dealing with Social Security, it's no wonder that the codebase is this large. What is in question though is whether a small team can rewrite this code "in months." After all, what could possibly go wrong? "The project is being organized by Elon Musk lieutenant Steve Davis ... and aims to migrate all SSA systems off COBOL ... and onto a more modern replacement like Java within a scheduled tight timeframe of a few months," notes Wired. "Under any circumstances, a migration of this size and scale would be a massive undertaking, experts tell WIRED, but the expedited deadline runs the risk of obstructing payments to the more than 65 million people in the US currently receiving Social Security benefits." In 2017, SSA announced a plan to modernize its core systems with a timeline of around five years. However, the work was "pivoted away" because of the pandemic.
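One concrete reason experts consider a rushed COBOL-to-Java migration risky: COBOL money fields (e.g. `PIC 9(7)V99`) use fixed-point decimal arithmetic, and a hurried port that reaches for binary floating point silently changes results. The sketch below illustrates the pitfall in Python standing in for the Java target (where `BigDecimal` plays the role of `decimal.Decimal`); the payment amounts are made up for illustration.

```python
# COBOL PIC ...V99 fields are fixed-point decimal; binary floats are not.
# A naive rewrite that sums amounts as floats drifts by tiny fractions of a
# cent, while decimal arithmetic reproduces COBOL's exact results.
from decimal import Decimal

payments = ["0.10"] * 3  # three ten-cent adjustments, as stored in records

float_total = sum(float(p) for p in payments)
decimal_total = sum(Decimal(p) for p in payments)

print(float_total == 0.3)                # False: 0.30000000000000004
print(decimal_total == Decimal("0.30"))  # True: exact, like COBOL fixed-point
```

At the scale of benefit payments to tens of millions of recipients, accumulated rounding discrepancies like this are exactly the kind of defect that only careful, unhurried reconciliation against the legacy system would catch.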
