
Thomas E. Kurtz, Co-Inventor of BASIC, Dies At 96

Slashdot readers damn_registrars and GFS666 share the news of the passing of Thomas E. Kurtz, co-inventor of the BASIC programming language back in the 1960s. He was 96. Hackaday reports: The origins of BASIC lie in the Dartmouth Timesharing System, which, like similar timesharing operating systems of the day, was designed to allow the resources of a single computer to be shared across many terminals. In this case the computer was at Dartmouth College, and BASIC was designed to be a language with which software could be written by average students who perhaps didn't have a computing background. In the decade that followed it proved ideal for the new microcomputers, and few were the home computers of the era that didn't boot into some form of BASIC interpreter. Kurtz continued his work as a distinguished academic and educator until his retirement in 1993, but throughout he remained the guiding hand of the language.

Read more of this story at Slashdot.

The Rust Foundation Wants to Improve Rust and C++ Interoperability

The goal? "Make C++ and Rust interoperability easily accessible and approachable to the widest possible audience." And the Rust Foundation's "Interop Initiative" is specifically focused on the goal of interoperability "within the same executable," through either inline embedding that allows "integrated compilation", or foreign function interfaces.

To that end, a statement addressing "the challenges and opportunities in C++ and Rust interoperability" was announced this week by the Rust Foundation. Pointing out that the "Interop Initiative" was launched in February 2024 with a $1M contribution from Google, it now "proposes a collaborative, problem-space approach engaging key stakeholders from both language communities. Rather than prescribing specific solutions, this problem statement serves as a foundation for community input and participation in shaping both the strategic direction and tactical implementation of improved C++/Rust interoperability."

Their official problem statement outlines three "key strategic approaches":

- Improve existing tools and address tactical issues to reduce interoperability friction and risk in the short term.
- Build consensus around long-term goals requiring changes to Rust itself and develop the tactical approaches to begin pursuing them.
- Engage with the C++ community and committee to improve the quality of interoperation for both languages to help realize the mutual goals of safety and performance.

And it argues that interoperability "is essential to pursuing safety and performance which is maintainable and scalable":

A significant amount of development has gone into libraries to facilitate interoperability with both C and C++, but from the language and compiler level, the situation remains largely unchanged from the early days of Rust. As the desire to integrate Rust into more C++ codebases increases, the value of making C++/Rust interoperability safer, easier, and more efficient is rapidly increasing.

While each language takes a different overall approach, both view safety as an essential concern in modern systems. Both Rust and C++ have language- and standard-library-level facilities to improve safety in seemingly compatible ways, but significant benefits are lost when transiting the foreign function interface (FFI) boundary using the C ABI... The consequence of this increased cost to interoperate means both C++ and Rust codebases are less able to access valuable code that already exists in the other language, and the ability to transition system components from one language to another is reduced outside of existing C-like interface boundaries. Ultimately, this reduction in freedom leads to worse outcomes for all users since technologists are less free to choose the most effective solutions.
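
To make the "C ABI boundary" point concrete, here is a minimal sketch in plain C (an illustration only, not code from the problem statement): whatever rich slice or span types the Rust and C++ sides use internally, today they must both flatten to a raw pointer plus a length at the FFI boundary, and the ownership, lifetime, and bounds guarantees attached to those types do not survive the crossing. The function name and shape below are hypothetical.

```c
/* checksum_abi.c -- a sketch of the C-ABI boundary both languages fall back
 * to today (illustrative only; not taken from the problem statement). */
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* A Rust `&[u8]` or a C++ `std::span<const uint8_t>` arrives here as a bare
 * pointer plus a length; lifetime and bounds information is gone. In a real
 * mixed build the definition might live on the Rust side, exported roughly as:
 *   #[no_mangle] pub extern "C" fn checksum(data: *const u8, len: usize) -> u32
 * while the C++ side would declare the same signature inside extern "C" { }. */
uint32_t checksum(const uint8_t *data, size_t len) {
    uint32_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum += data[i];
    return sum;
}

int main(void) {
    const uint8_t buf[] = {1, 2, 3, 4};
    printf("checksum = %u\n", checksum(buf, sizeof buf)); /* prints 10 */
    return 0;
}
```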

Read more of this story at Slashdot.

On 15th Anniversary, Go Programming Language Rises in Popularity

The Tiobe index tries to track the popularity of programming languages by counting the number of search results for the language's name followed by the word "programming" (on 25 different search engines). And this month there were some surprises...

By TIOBE's reckoning, compared to a year ago PHP has now fallen from #7 to #12, while Delphi/Object Pascal shot up five spots from #16 to #11. In that same year, Fortran jumped from #12 to #8 — while both Visual Basic and SQL dropped down a single rank. Toward the top of the list, C actually fell from the #2 spot over the last 12 months to the #4 spot. And Go just reached the #7 rank in TIOBE's ranking of programming language popularity — "an all time high for Go," according to TIOBE CEO Paul Jansen. In this month's note, he explains what he thinks is unusual about this — starting by saying that Go programs are both fast and easy in many ways — easy to deploy, easy to learn, and easy to understand:

Python for instance is easy to learn but not fast, and deployment for larger Python programs is fragile due to dependencies on all kind of versioned libraries in the environment. If compared to Rust for instance (another contender for a top position), Go is a tiny bit slower, but the Go programs are much easier to understand. The next hurdle for Go in the TIOBE index is JavaScript at position #6. That will be a tough one to pass. JavaScript is ubiquitous in software development, although for larger JavaScript systems we see a shift to TypeScript nowadays.

"If annual trends continue this way, Go will bypass JavaScript within 3 years," TIOBE's CEO predicts. (Adding "Let's see what the future has in store for Go...") Although the Go team actually has specific plans for the future, according to a blog post this week celebrating Go's 15th anniversary:

We're working on making Go better for AI — and AI better for Go — by enhancing Go's capabilities in AI infrastructure, applications, and developer assistance. Go is a great language for building production systems, and we want it to be a great language for building production AI systems, too... For AI applications, we will continue building out first-class support for Go in popular AI SDKs, including LangChainGo and Genkit. And from its very beginning, Go aimed to improve the end-to-end software engineering process, so naturally we're looking at bringing the latest tools and techniques from AI to bear on reducing developer toil, leaving more time for the fun stuff — like actually programming!

TIOBE's top 10 programming language rankings for the month of November:

1. Python
2. C++
3. Java
4. C
5. C#
6. JavaScript
7. Go
8. Fortran
9. Visual Basic
10. SQL

Read more of this story at Slashdot.

OpenMP 6.0 Released

Phoronix's Michael Larabel reports: The OpenMP Architecture Review Board announced from SC24 that OpenMP 6.0 is now available as a major upgrade to the OpenMP specification for parallel programming in C / C++ / Fortran. A big emphasis of OpenMP 6.0 is making it easier for developers to embrace. OpenMP 6.0 aims to make it easier to support parallel programming in new applications, easier to adapt to new use-cases, and to give developers more fine-grained control. OpenMP 6.0 simplifies task programming with support for task execution by free-agent threads, allows recording of task graphs for efficient replay, and makes other improvements. OpenMP 6.0 also brings support for array syntax applications, better control over memory allocations and accessibility, easier writing of asynchronous data transfers, and other improvements for enhanced device support / offloading. There is also easier programming of loop transformations, support for induction, support for C23 / Fortran 2023 / C++23, greater user control of storage resources and memory spaces, and other improvements.
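
For readers new to OpenMP, here is a minimal sketch of the directive-based model that the 6.0 specification extends. This is ordinary, long-standing OpenMP as supported by existing compilers, not a demonstration of any new 6.0 feature:

```c
/* omp_sum.c -- a minimal OpenMP worksharing example.
 * Build with an OpenMP-enabled compiler, e.g.: cc -fopenmp omp_sum.c */
#include <omp.h>
#include <stdio.h>

int main(void) {
    const int n = 1000000;
    double sum = 0.0;

    /* Split the loop iterations across a team of threads; the reduction
       clause gives each thread a private partial sum and combines them
       into `sum` when the loop finishes. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < n; i++)
        sum += 1.0 / (i + 1);

    printf("max threads: %d, sum = %f\n", omp_get_max_threads(), sum);
    return 0;
}
```

Without the -fopenmp flag the pragma is ignored and the program still runs correctly, just serially — which is part of why the directive-based approach has been easy for existing codebases to adopt.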

Read more of this story at Slashdot.

The Ultimate in Debugging

Mark Rainey: Engineers are currently debugging why the Voyager 1 spacecraft, which is 15 billion miles away, turned off its main radio and switched to a backup radio that hasn't been used in over forty years! I've had some tricky debugging issues in the past, including finding compiler bugs and debugging code with no debugger that had been burnt into PROM packs for terminals; however, I have huge admiration for the engineers maintaining the operation of Voyager 1. Recently they sent a command to the craft that caused it to shut off its main radio transmitter, seemingly in an effort to preserve power and protect against faults. This prompted it to switch over to the backup radio transmitter, which uses less power. Now that they have regained communication, they are trying to determine the cause on hardware that is nearly 50 years old. Any communication takes days. When you think you have a difficult issue to debug, spare a thought for this team.

Read more of this story at Slashdot.

Will We Care About Frameworks in the Future?

Paul Kinlan, who leads the Chrome and the Open Web Developer Relations team at Google, asks and answers the question (with a no): Frameworks are abstractions over a platform designed for people and teams to accelerate their team's new work and maintenance while improving the consistency and quality of the projects. They also frequently force a certain type of structure and architecture onto your code base. This isn't a bad thing; team productivity is an important aspect of any software. I'm of the belief that software development is entering a radical shift that is currently driven by agents like Replit's, and there is a world where a person never actually has to manipulate code directly anymore. As I was making broad and sweeping changes to the functionality of the applications by throwing the Agent a couple of prompts here and there, the software didn't seem to care that there was repetition in the code across multiple views; it didn't care about shared logic, extensibility or inheritability of components... it just implemented what it needed to do, and it did it as vanilla as it could. I was just left wondering if there will be a need for frameworks in the future. Do the architecture patterns we've learnt over the years matter? Will new patterns for software architecture appear that favour LLM management?

Read more of this story at Slashdot.

Google Research Chief Says Learning To Code 'as Important as Ever'

Google's head of research Yossi Matias maintains that learning to code remains "as important as ever" despite AI's growing role in software development. While AI tools have reduced coding time for some developers -- with Alphabet CEO Sundar Pichai noting that AI now generates a quarter of all new code at the company -- Matias stressed that human engineers still review and approve AI-generated code. The Google executive, who also serves as a company VP, acknowledged that junior professionals have faced challenges gaining experience as AI handles entry-level tasks. Google has launched initiatives to support early-career employees through this transition. Matias compared coding literacy to basic mathematics, arguing it provides crucial understanding of technology regardless of career path.

Read more of this story at Slashdot.

The Team Behind GitHub's 'Atom' IDE Builds a Cross-Platform, AI-Optional 'Zed Editor'

Nathan Sobo "joined GitHub in late 2011 to build the Atom text editor," according to an online biography, "and he led the Atom team until 2018." Max Brunsfeld joined the Atom team in 2013, and "While driving Atom towards its 1.0 launch during the day, Max spent nights and weekends building Tree-sitter, a blazing-fast and expressive incremental parsing framework that currently powers all code analysis at GitHub."

Last year they teamed up with Antonio Scandurra (another Atom alumnus) to launch a new startup called Zed (which in 2023 raised $10 million, according to TechCrunch). And today the open source blog It's FOSS checks in on their open-source code editor — "Zed Editor". Mainly written in Rust, it supports running from the CLI, diagnosing project-wide errors, split panes, and markdown previews:

By default, any added content is treated as plain text. I used the language switcher to change it to Rust so that I would get proper syntax highlighting, indentation, error detection, and other useful language-specific functions. The switch highlighted all the Rust elements correctly, and I then focused on Zed Editor's user interface. The overall feel of the editor was minimal, with all the important options being laid out nicely.

[Its status bar] had some interesting panels. The first one I checked was the Terminal Panel, which, as the name suggests, lets you run commands, scripts, and facilitates interaction with system files or processes directly from within the editor. I then moved to the Assistant Panel, which is home to various large language models that can be integrated into Zed Editor. There are options like Anthropic, GitHub Copilot Chat, Ollama, OpenAI, and Google AI... The Zed Editor team has also recently introduced Zed AI in collaboration with Anthropic for assisting with coding, allowing for code generation, advanced context-powered interactions, and more...

The real-time collaboration features on Zed Editor are quite appealing too. To check them out, I had to log in with my GitHub account. After logging in, the Collab Panel opened up, and I could see many channels from the official Zed community. I could chat with others, add collaborators to existing projects, join a call with the option to share my screen and track other collaborators' cursors, add new contacts, and carry out many other collaborative tasks. One can also use extensions and themes to extend what Zed Editor can do. There are some nice pre-installed themes as well.

Read more of this story at Slashdot.

Rust Foundation Shares Draft of New, Simpler Trademark Policy

"The Rust trademark policy has been updated and a new draft is available to view," announced the Rust Foundation this week. The last proposed trademark policy (in April of 2023) was criticized by open source advocate Bruce Perens in The Register as "far awry of fair use which is legally permitted." The Rust Foundation says this new version has "incorporated a number of suggestions from the Rust community," in a blog post that summarizes the feedback and enumerates specific ways it's been addressed: 1. We primarily plan to lean on community reports for enforcement and have no intention of spending our limited resources policing the work of small creators. 2. We have removed the non-legal language summary and instead have clarified wording throughout as best we can while keeping the policy valid. 3. The Rust trademark does not cover use of the word "Rust" in general and instead pertains to its use in relevant technical settings. 4. We have updated the logo usage policy. Color modifications are allowed. 5. The non-endorsement rule is about managing perception of official affiliation with the Foundation and Rust Project, and is thus subjective. 6. We removed restrictions on the use of "Rust" and "Cargo" in package names. The crates prefixes "rust-" and "cargo-" are no longer reserved to the Rust Project. 7. We will usually allow the community to use the marks on limited merchandise (more details in the updated draft).... [T]he central purpose of these updates is to empower all Rustaceans to engage with the Rust language ecosystem more confidently. As a final step in this process, we invite you to review the updated policy and share any blocking concerns you might have... Thank you to everyone who weighed in with helpful suggestions on the initial trademark policy draft we shared. The level of engagement and passion within the Rust community is inspiring to all of us at the Rust Foundation. The tech news site Heise Online writes "It is noticeable that the language is much clearer and dispenses with a lot of legal jargon," in a piece which argues the new draft "should calm the waves and create clarity." The new draft is not only formulated more simply, but is also significantly shorter. Some restrictions have been softened in the new rules or have disappeared completely... Meanwhile, the Foundation has also adapted its logo so that it is clear which logo stands for the programming language and which for the Foundation. The use of the name Rust is explicitly permitted to identify projects that are either written in the programming language or are compatible with it... Before the new trademark rules come into force, the Rust Foundation is collecting feedback on the current draft. The web form is open until November 20, 2024.

Read more of this story at Slashdot.

Python Overtakes JavaScript on GitHub, Annual Survey Finds

GitHub released its annual "State of the Octoverse" report this week. And while "Systems programming languages, like Rust, are also on the rise... Python, JavaScript, TypeScript, and Java remain the most widely used languages on GitHub." In fact, "In 2024, Python overtook JavaScript as the most popular language on GitHub." They also report usage of Jupyter Notebooks "skyrocketed" with a 92% jump in usage, which along with Python's rise seems to underscore "the surge in data science and machine learning on GitHub..."

We're also seeing increased interest in AI agents and smaller models that require less computational power, reflecting a shift across the industry as more people focus on new use cases for AI... While the United States leads in contributions to generative AI projects on GitHub, we see more absolute activity outside the United States. In 2024, there was a 59% surge in the number of contributions to generative AI projects on GitHub and a 98% increase in the number of projects overall — and many of those contributions came from places like India, Germany, Japan, and Singapore... Notable growth is occurring in India, which is expected to have the world's largest developer population on GitHub by 2028, as well as across Africa and Latin America... [W]e have seen greater growth outside the United States every year since 2013 — and that trend has sped up over the past few years.

Last year they'd projected India would have the most developers on GitHub by 2027, but now believe it will happen a year later. This year's top 10:

1. United States
2. India
3. China
4. Brazil
5. United Kingdom
6. Russia
7. Germany
8. Indonesia
9. Japan
10. Canada

(Interestingly, the UK's population ranks #21 among countries of the world, while Germany ranks #19, and Canada ranks #36.)

GitHub's announcement argues the rise of non-English, high-population regions "is notable given that it is happening at the same time as the proliferation of generative AI tools, which are increasingly enabling developers to engage with code in their natural language." And they offer one more data point:

GitHub's For Good First Issue is a curated list of Digital Public Goods that need contributors, connecting those projects with people who want to address a societal challenge and promote sustainable development... Significantly, 34% of contributors to the top 10 For Good Issue projects... made their first contribution after signing up for GitHub Copilot.

There are now 518 million projects on GitHub — with a year-over-year growth of 25%...

Read more of this story at Slashdot.

More Than a Quarter of New Code At Google Is Generated By AI

Google has integrated AI deeply across its operations, with over 25% of its new code generated by AI. CEO Sundar Pichai announced the milestone during the company's third quarter 2024 earnings call. The Verge reports: AI is helping Google make money as well. Alphabet reported $88.3 billion in revenue for the quarter, with Google Services (which includes Search) revenue of $76.5 billion, up 13 percent year-over-year, and Google Cloud (which includes its AI infrastructure products for other companies) revenue of $11.4 billion, up 35 percent year-over-year. Operating incomes were also strong. Google Services hit $30.9 billion, up from $23.9 billion last year, and Google Cloud hit $1.95 billion, significantly up from last year's $270 million. "In Search, our new AI features are expanding what people can search for and how they search for it," CEO Sundar Pichai says in a statement. "In Cloud, our AI solutions are helping drive deeper product adoption with existing customers, attract new customers and win larger deals. And YouTube's total ads and subscription revenues surpassed $50 billion over the past four quarters for the first time."

Read more of this story at Slashdot.

An Alternative to Rewriting Memory-Unsafe Code in Rust: the 'Safe C++ Extensions' Proposal

"After two years of being beaten with the memory-safety stick, the C++ community has published a proposal to help developers write less vulnerable code," reports the Register. "The Safe C++ Extensions proposal aims to address the vulnerable programming language's Achilles' heel, the challenge of ensuring that code is free of memory safety bugs..." Acknowledging the now deafening chorus of calls to adopt memory safe programming languages, developers Sean Baxter, creator of the Circle compiler, and Christian Mazakas, from the C++ Alliance, argue that while Rust is the only popular systems level programming language without garbage collection that provides rigorous memory safety, migrating C++ code to Rust poses problems. "Rust lacks function overloading, templates, inheritance and exceptions," they explain in the proposal. "C++ lacks traits, relocation and borrow checking. These discrepancies are responsible for an impedance mismatch when interfacing the two languages. Most code generators for inter-language bindings aren't able to represent features of one language in terms of the features of another." Though DARPA is trying to develop better automated C++ to Rust conversion tools, Baxter and Mazakas argue telling veteran C++ developers to learn Rust isn't an answer... The Safe C++ project adds new technology for ensuring memory safety, Baxter explained, and isn't just a reiteration of best practices. "Safe C++ prevents users from writing unsound code," he said. "This includes compile-time intelligence like borrow checking to prevent use-after-free bugs and initialization analysis for type safety." Baxter said that rewriting a project in a different programming language is costly, so the aim here is to make memory safety more accessible by providing the same soundness guarantees as Rust at a lower cost. "With Safe C++, existing code continues to work as always," he explained. "Stakeholders have more control for incrementally opting in to safety." The next step, Baxter said, involves greater participation from industry to help realize the Safe C++ project. "The foundations are in: We have fantastic borrow checking and initialization analysis which underpin the soundness guarantees," he said. "The next step is to comprehensively visit all of C++'s features and specify memory-safe versions of them. It's a big effort, but given the importance of reducing C++ security vulnerabilities, it's an effort worth making."

Read more of this story at Slashdot.

'Running Clang in the Browser Using WebAssembly'

This week the (MIT-licensed) WebAssembly runtime Wasmer announced "a major milestone in making any software run with WebAssembly." The announcement's headline? Running Clang in the browser using WebAssembly...

Thanks to the newest release of Wasmer (4.4) and the Wasmer JS SDK (0.8.0) you can now run [compiler front-end] clang anywhere Wasmer runs! This allows compiling C programs from virtually anywhere. Including Javascript and your preferred browser! (we tested Chrome, Safari and Firefox and everything is working like a charm)...

- You can compile C code to WebAssembly easily just using the Wasmer CLI: no toolchains or complex installations needed, install Wasmer and you are ready to go...!
- You can compile C projects directly from JavaScript...!
- We expect online IDEs to start adopting the SDK to allow their users to compile and run C programs in the browser....

Do you want to use clang in your Javascript project? Thanks to our newly released Wasmer JS SDK you can do it easily, in both the browser and Node.js/Bun etc... Wasmer's clang can even optimize the file for you automatically using wasm-opt under the hood (Clang automatically detects if wasm-opt is used, and it will be automatically called when optimizing the file). Imagine using Emscripten without needing its toolchain installed — or even better, imagine running Emscripten in the browser.

The announcement looks to a future of compiling native Python libraries, when "any project depending on LLVM can now be easily compiled to WebAssembly..." "This is the beginning of an awesome journey, we can't wait to see what you create next with this."
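
For context, the input in question is just an ordinary, self-contained C source file like the sketch below; per the announcement, a file like this can now be handed to clang running under Wasmer, whether from the CLI, a browser tab, or Node.js/Bun via the JS SDK, with no local toolchain installed. The exact CLI and SDK invocations are documented by Wasmer and aren't reproduced here.

```c
/* fib.c -- a small, dependency-free C program of the kind the announcement
 * describes compiling to WebAssembly with clang running under Wasmer. */
#include <stdio.h>

static unsigned long fib(unsigned n) {
    unsigned long a = 0, b = 1;
    while (n--) {           /* iterate instead of recursing */
        unsigned long next = a + b;
        a = b;
        b = next;
    }
    return a;
}

int main(void) {
    for (unsigned i = 1; i <= 10; i++)
        printf("fib(%u) = %lu\n", i, fib(i));
    return 0;
}
```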

Read more of this story at Slashdot.

Are AI Coding Assistants Really Saving Developers Time?

Uplevel provides insights from coding and collaboration data, according to a recent report from CIO magazine — and recently they measured "the time to merge code into a repository [and] the number of pull requests merged" for about 800 developers over a three-month period (comparing the statistics to the previous three months). Their study "found no significant improvements for developers" using Microsoft's AI-powered coding assistant tool Copilot, according to the article (shared by Slashdot reader snydeq):

Use of GitHub Copilot also introduced 41% more bugs, according to the study... In addition to measuring productivity, the Uplevel study looked at factors in developer burnout, and it found that GitHub Copilot hasn't helped there, either. The amount of working time spent outside of standard hours decreased for both the control group and the test group using the coding tool, but it decreased more when the developers weren't using Copilot.

An Uplevel product manager/data analyst acknowledged to the magazine that there may be other ways to measure developer productivity — but they still consider their metrics solid. "We heard that people are ending up being more reviewers for this code than in the past... You just have to keep a close eye on what is being generated; does it do the thing that you're expecting it to do?"

The article also quotes the CEO of software development firm Gehtsoft, who says they didn't see major productivity gains from LLM-based coding assistants — but did see them introducing errors into code. With different prompts generating different code sections, "It becomes increasingly more challenging to understand and debug the AI-generated code, and troubleshooting becomes so resource-intensive that it is easier to rewrite the code from scratch than fix it."

On the other hand, cloud services provider Innovative Solutions saw significant productivity gains from coding assistants like Claude Dev and GitHub Copilot. And Slashdot reader destined2fail1990 says that while large/complex code bases may not see big gains, "I have seen a notable increase in productivity from using Cursor, the AI powered IDE."

Yes, you have to review all the code that it generates, why wouldn't you? But often times it just works. It removes the tedious tasks like querying databases, writing model code, writing forms and processing forms, and a lot more. Some forms can have hundreds of fields and processing those fields along with doing checks for valid input is time consuming, but can be automated effectively using AI.

This prompted an interesting discussion on the original story submission. Slashdot reader bleedingobvious responded:

Cursor/Claude are great BUT the code produced is almost never great quality. Even given these tools, the junior/intern teams still cannot outpace the senior devs. Great for learning, maybe, but the productivity angle not quite there.... yet. It's damned close, though. Give it 3-6 months.

And Slashdot reader abEeyore posted:

I suspect that the results are quite a bit more nuanced than that. I expect that it is, even outside of the mentioned code review, a shift in where and how the time is spent, and not necessarily in how much time is spent.

Agree? Disagree? Share your own experiences in the comments. And are developers really saving time with AI coding assistants?

Read more of this story at Slashdot.

'Compile and Run C in JavaScript', Promises Bun

The JavaScript runtime Bun is a Node.js/Deno alternative (that's also a bundler/test runner/package manager). And Bun 1.1.28 now includes experimental support for compiling and running native C from JavaScript, according to this report from The New Stack:

"From compression to cryptography to networking to the web browser you're reading this on, the world runs on C," wrote Jarred Sumner, creator of Bun. "If it's not written in C, it speaks the C ABI (C++, Rust, Zig, etc.) and is available as a C library. C and the C ABI are the past, present, and future of systems programming." This is a low-boilerplate way to use C libraries and system libraries from JavaScript, he said, adding that this feature allows the same project that runs JavaScript to also run C without a separate build step...

"It's good for glue code that binds C or C-like libraries to JavaScript. Sometimes, you want to use a C library or system API from JavaScript, and that library was never meant to be used from JavaScript," Sumner added. It's currently possible to achieve this by compiling to WebAssembly or writing an N-API (napi) addon or V8 C++ API library addon, the team explained. But both are suboptimal... WebAssembly can do this but its isolated memory model comes with serious tradeoffs, the team wrote, including an inability to make system calls and a requirement to clone everything. "Modern processors support about 280 TB of addressable memory (48 bits). WebAssembly is 32-bit and can only access its own memory," Sumner wrote. "That means by default, passing strings and binary data between JavaScript and WebAssembly must clone every time. For many projects, this negates any performance gain from leveraging WebAssembly."

The latest version of Bun, released Friday, builds on this by adding N-API (napi) support to cc [Bun's C compiler, which uses TinyCC to compile the C code]. "This makes it easier to return JavaScript strings, objects, arrays and other non-primitive values from C code," wrote Sumner. "You can continue to use types like int, float, double to send & receive primitive values from C code, but now you can also use N-API types! Also, this works when using dlopen to load shared libraries with bun:ffi (such as Rust or C++ libraries with C ABI exports)....

"TinyCC compiles to decently performant C, but it won't do advanced optimizations that Clang or GCC does like autovectorization or very specialized CPU instructions," Sumner wrote. "You probably won't get much of a performance gain from micro-optimizing small parts of your codebase through C, but happy to be proven wrong!"
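
As a rough illustration of the workflow being described, here is the C side of such a module, with the JavaScript side sketched in a comment. The cc import from bun:ffi and the TinyCC-based compilation come from the report above; the exact shape of the symbol declaration shown in the comment is an approximation and may differ in detail from Bun's actual API.

```c
/* add.c -- a tiny C library of the sort Bun's experimental `cc` feature
 * compiles and loads directly from JavaScript, with no separate build step.
 *
 * The JavaScript side would look roughly like this (approximate, not an
 * exact reproduction of Bun's API):
 *
 *   import { cc } from "bun:ffi";
 *   const { symbols: { add } } = cc({
 *     source: "./add.c",
 *     symbols: { add: { args: ["i32", "i32"], returns: "i32" } },
 *   });
 *   console.log(add(2, 3)); // 5
 */
int add(int a, int b) {
    return a + b;
}
```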

Read more of this story at Slashdot.
