Gen Z Officially Worse At Passwords Than 80-Year-Olds

A NordPass analysis found that Gen Z is actually worse at password security than older generations, with "12345" topping their list while "123456" dominates among everyone else. The Register reports: And while there were a few more "skibidis" among the Zoomer dataset compared to those who came before them, the trends were largely similar. Variants on "123456" were among the most common for all age groups, with that exact string proving to be the most common among all users -- the sixth time in seven years it has held the undesirable crown. Some of the more adventurous would stretch to "1234567," while budding cryptologists shored up their accounts by adding an 8 or even a 9 to the mix. However, according to Security.org's password security checker, a computer could crack any of these instantly. Most attackers would not even need to expend the resources required to crack the password: given how commonly used these strings are, they could simply spray a list of known passwords at an authentication API and secure a quick win.
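The arithmetic behind "instantly" is easy to check. Here is a minimal sketch in Python of the two tests a defender can run -- membership in a common-password list and raw keyspace size; the word list below is a tiny illustrative stand-in, not NordPass's dataset:

```python
import math

# Tiny illustrative stand-in for a common-password list; real lists
# (NordPass's rankings, HaveIBeenPwned, etc.) contain millions of entries.
COMMON_PASSWORDS = {"12345", "123456", "1234567", "12345678", "123456789",
                    "password", "qwerty"}

def audit(password: str) -> str:
    # Test 1: spray-list membership. An attacker trying these against an
    # authentication API needs a handful of requests, not a brute-force search.
    if password.lower() in COMMON_PASSWORDS:
        return "rejected: on a common-password list"
    # Test 2: keyspace size. Digits-only passwords have 10**n candidates;
    # at ~10 billion offline guesses per second, short ones fall instantly.
    alphabet = 10 if password.isdigit() else 95  # 95 printable ASCII chars
    keyspace = alphabet ** len(password)
    return f"keyspace 2^{math.log2(keyspace):.1f}, ~{keyspace / 1e10:.1e}s to exhaust offline"

print(audit("12345"))       # rejected: on a common-password list
print(audit("1234567890"))  # keyspace 2^33.2, ~1.0e+00s to exhaust offline
```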

Read more of this story at Slashdot.

  •  

Cloud-Native Computing Is Poised To Explode

An anonymous reader quotes a report from ZDNet: At KubeCon North America 2025 in Atlanta, the Cloud Native Computing Foundation (CNCF)'s leaders predicted an enormous surge in cloud-native computing, driven by the explosive growth of AI inference workloads. How much growth? They're predicting hundreds of billions of dollars in spending over the next 18 months. [...] Cloud-native computing and AI inference come together when AI is no longer a separate track from cloud-native computing: AI workloads, particularly inference tasks, are fueling a new era in which intelligent applications require scalable and reliable infrastructure. That era is unfolding because, said [CNCF Executive Director Jonathan Bryce], "AI is moving from a few 'Training supercomputers' to widespread 'Enterprise Inference.' This is fundamentally a cloud-native problem. You, the platform engineers, are the ones who will build the open-source platforms that unlock enterprise AI." "Cloud native and AI-native development are merging, and it's really an incredible place we're in right now," said CNCF CTO Chris Aniszczyk. The data backs up this opinion. For example, Google has reported that its internal inference jobs recently processed 1.33 quadrillion tokens per month, up from 980 trillion just months before. [...] Aniszczyk added that cloud-native projects, especially Kubernetes, are adapting to serve inference workloads at scale: "Kubernetes is obviously one of the leading examples; as of the last release, the dynamic resource allocation feature enables GPU and TPU hardware abstraction in a Kubernetes context." To better meet the demand, the CNCF announced the Certified Kubernetes AI Conformance Program, which aims to make AI workloads as portable and reliable as traditional cloud-native applications. "As AI moves into production, teams need a consistent infrastructure they can rely on," Aniszczyk stated during his keynote. "This initiative will create shared guardrails to ensure AI workloads behave predictably across environments. It builds on the same community-driven standards process we've used with Kubernetes to help bring consistency as AI adoption scales." What all this effort means for business is that AI inference spending on cloud-native infrastructure and services will reach into the hundreds of billions within the next 18 months, as CNCF leaders predict enterprises will race to stand up reliable, cost-effective AI services.
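For a sense of what "GPU and TPU hardware abstraction" replaces: before dynamic resource allocation (DRA), accelerators surfaced in Kubernetes only as opaque counted resources published by vendor device plugins. A minimal sketch with the official kubernetes Python client, assuming a reachable cluster and the NVIDIA device plugin's nvidia.com/gpu resource name; DRA supersedes this coarse counting with structured, parameterized claims:

```python
from kubernetes import client, config

# Load credentials from ~/.kube/config; inside a pod you would call
# config.load_incluster_config() instead.
config.load_kube_config()
v1 = client.CoreV1Api()

# Device plugins advertise accelerators as opaque counts in each node's
# allocatable map, e.g. {"nvidia.com/gpu": "8"}. The scheduler can count
# them but knows nothing about memory, partitioning, or topology -- the
# gap DRA's ResourceClaims are designed to close.
for node in v1.list_node().items:
    allocatable = node.status.allocatable or {}
    gpus = allocatable.get("nvidia.com/gpu", "0")
    print(f"{node.metadata.name}: {gpus} allocatable GPU(s)")
```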

Read more of this story at Slashdot.

  •  

Red Hat Losing Another Prominent Linux Kernel Engineer

Another highly influential Linux kernel engineer, David Hildenbrand, is leaving Red Hat after a decade of major contributions to memory management, virtualization, and VirtIO. His recent kernel patch updates his maintainer info to a kernel.org address, signaling his departure. He hasn't yet said where he's headed next. Phoronix reports: David Hildenbrand serves as a reviewer for the HugeTLB code, s390 KVM code, and memory management reclaim code. He also serves as an upstream maintainer for the Linux kernel's core memory management code, Get User Pages (GUP) memory management code, kernel samepage merging (KSM), reverse mapping (RMAP), transparent hugepage (THP), memory advice (MADVISE), VirtIO memory driver, and VirtIO balloon driver. Hildenbrand had been employed by Red Hat for the past decade in Munich, working on QEMU/KVM virtualization, Linux kernel memory management, VirtIO, and related low-level areas. In 2025 alone, he has authored or been mentioned on more than one thousand mainline Linux kernel patches.

Read more of this story at Slashdot.

  •  

Blender 5.0 Released

Blender 5.0 has been released with major upgrades including HDR and wide-gamut color support on Linux via Wayland/Vulkan, significant theme and UI improvements, new color-space tools, revamped curve and geometry features, and expanded hardware requirements. 9to5Linux reports: Blender 5.0 also introduces a working color space for Blend files, a new AgX HDR view, a new Convert to Display compositor node, new Rec.2100-PQ and Rec.2100-HLG displays that can be used for color grading for HDR video export, and new ACES 1.3 and 2.0 views as an alternative to AgX and Filmic. A new "Jump Time by Delta" operator for jumping forward/backward in time by a user-specified delta has been introduced as well, along with revamped Curve drawing, which better supports the new Curves object type and all of its features, and a new Geometry Attribute constraint. Also new is a "Cylinder" option for the curve display type that allows rendering thicker curves without the flat ribbon appearance, support for the Zstd (Zstandard) fast lossless compression algorithm for point caches, as well as a new "Curve Data" panel in edit mode that allows tweaking built-in curve attribute values. A full list of changes can be found here. You can download it from the official website.
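Both of those additions are scriptable through Blender's Python API. A rough bpy sketch of what the new operator does to the playhead, plus the color-management switch (run inside Blender; the exact enum identifier for the 5.0 AgX HDR view is an assumption here):

```python
import bpy

scene = bpy.context.scene

# Jump the playhead by a user-specified delta -- the behavior the new
# "Jump Time by Delta" operator exposes in the UI.
delta = 24  # frames; negative values jump backward
scene.frame_set(scene.frame_current + delta)

# Switch the view transform for color management; "AgX" has existed since
# Blender 4.0. The identifier of the new HDR variant in 5.0 is assumed.
scene.view_settings.view_transform = 'AgX'
```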

Read more of this story at Slashdot.

  •  

Report Claims That Apple Has Yet Again Put the Mac Pro 'On the Back Burner'

An anonymous reader quotes a report from Ars Technica: Apple's Power Mac and Mac Pro towers used to be the company's primary workstations, but it has been years since they were updated with the same regularity as the MacBook Air or MacBook Pro. The Mac Pro has seen just four hardware updates in the last 15 years, and that's counting a 2012 refresh that was mostly identical to the 2010 version. Long-suffering Mac Pro buyers may have taken heart when Apple finally added an M2 Ultra processor to the tower in mid-2023, making it one of the very last Macs to switch from Intel to Apple Silicon -- surely this would mean that the computer would at least be updated once every year or two, like the Mac Studio has been? But Bloomberg's Mark Gurman says that Mac Pro buyers shouldn't get their hopes up for new hardware in 2026. Gurman says that the tower is "on the back burner" at Apple and that the company is "focused on a new Mac Studio" for the next-generation M5 Ultra chip that is in the works. As we reported earlier this year, Apple doesn't have plans to design or release an M4 Ultra, and the Mac Studio refresh from this spring included an M3 Ultra alongside the M4 Max. Note that Gurman carefully stops short of saying we definitely won't see a Mac Pro update next year -- the emphasis on the Mac Studio merely "suggests the Mac Pro won't be updated in 2026 in a significant way," and internal sources tell him "Apple has largely written off the Mac Pro." The current Mac Pro does still use the M2 Ultra rather than the M3 Ultra, which indicates that Apple doesn't see the need to update its high-end desktop every time it releases a suitable chip. But all of Apple's other desktops -- the iMac, the Mac mini, and the Studio -- have each skipped a silicon generation once since the M1 came out in 2020.

Read more of this story at Slashdot.

  •  

ACLU and EFF Sue a City Blanketed With Flock Surveillance Cameras

An anonymous reader shares a report: Lawyers from the American Civil Liberties Union (ACLU) and Electronic Frontier Foundation (EFF) sued the city of San Jose, California over its deployment of Flock's license plate-reading surveillance cameras, claiming that the city's nearly 500 cameras create a pervasive database of residents' movements in a surveillance network that is essentially impossible to avoid. The lawsuit was filed on behalf of the Services, Immigrant Rights & Education Network and the Council on American-Islamic Relations, California, and claims that the surveillance violates California's constitution and its privacy laws. The lawsuit seeks to require police to get a warrant in order to search Flock's license plate system. The lawsuit is one of the highest-profile cases challenging Flock; a similar lawsuit in Norfolk, Virginia seeks to get Flock's network shut down in that city altogether. "San Jose's ALPR [automatic license plate reader] program stands apart in its invasiveness," ACLU of Northern California and EFF lawyers wrote in the lawsuit. "While many California agencies run ALPR systems, few retain the locations of drivers for an entire year like San Jose. Further, it is difficult for most residents of San Jose to get to work, pick up their kids, or obtain medical care without driving, and the City has blanketed its roads with nearly 500 ALPRs."

Read more of this story at Slashdot.

  •  

Klarna Says AI Drive Has Helped Halve Staff Numbers and Boost Pay

Klarna has claimed that AI-related savings have allowed the buy now, pay later company to increase staff salaries by nearly 60%, but hinted it could slash more jobs after nearly halving its workforce over the past three years. From a report: Chief executive Sebastian Siemiatkowski said headcount had dropped from 5,527 to 2,907 since 2022, mostly as a result of natural attrition, with departing staff replaced by technology rather than by new hires. The figures add to the impact of an internal artificial intelligence programme, which has steadily reduced the company's use of outsourced workers, including those in customer service, with technology now carrying out the work of 853 full-time staff, up from 700 earlier this year. It meant the company, which was founded in Sweden in 2005, had managed to increase revenues by 108% while keeping operating costs flat. Siemiatkowski told analysts on an earnings call on Tuesday that it was "pretty remarkable, and unheard of as a number, among businesses."
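Taken at face value, the two headline claims are mutually consistent: a near-halved headcount can absorb a 60% pay rise and still leave the wage bill well below its 2022 level. A back-of-envelope check in Python, ignoring composition effects and the outsourced workers:

```python
# Klarna's reported figures.
headcount_2022, headcount_now = 5_527, 2_907
pay_rise = 0.60  # salaries up ~60%

reduction = 1 - headcount_now / headcount_2022
wage_bill_ratio = (headcount_now / headcount_2022) * (1 + pay_rise)

print(f"headcount down {reduction:.1%}")                    # down 47.4%
print(f"wage bill at ~{wage_bill_ratio:.0%} of 2022 level") # ~84%
```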

Read more of this story at Slashdot.

  •  

Oracle is Already Underwater On Its 'Astonishing' $300B OpenAI Deal

An anonymous reader shares a report: It's too soon to be talking about the Curse of OpenAI, but we're going to anyway. Since September 10, when Oracle announced a $300 billion deal with the chatbot maker, its stock has shed $315 billion in market value. OK, yes, it's a gross simplification to just look at market cap. But equivalents to Oracle shares are little changed over the same period (Nasdaq Composite, Microsoft, Dow Jones US Software Index), so the $15 billion loss figure [figure updated with stock price] is not entirely wrong. Oracle's "astonishing quarter" really has cost it nearly as much as one General Motors, or two Kraft Heinz.

Read more of this story at Slashdot.

  •  

Talking To Windows' Copilot AI Makes a Computer Feel Incompetent

Microsoft's Copilot AI assistant in Windows 11 fails to replicate the capabilities shown in the company's TV advertisements. The Verge tested Copilot Vision over a week using the same prompts featured in ads airing during NFL games. When asked to identify a HyperX QuadCast 2S microphone visible in a YouTube video -- a task successfully completed in Microsoft's ad -- Copilot gave multiple incorrect answers. The assistant identified the microphone as a first-generation HyperX QuadCast, then as a Shure SM7b on two other occasions. Copilot couldn't identify the Saturn V rocket from a PowerPoint presentation despite the words "Saturn V" appearing on screen. When asked about a cave image from Microsoft's ad, Copilot gave inconsistent responses. About a third of the time it provided directions to find the photo in File Explorer. On two occasions it explained how to launch Google Chrome. Four times it offered advice about booking flights to Belize. The cave is Rio Secreto in Playa del Carmen, Mexico. Microsoft spokesperson Blake Manfre said "Copilot Actions on Windows, which can take actions on local files, is not yet available." He described it as "an opt-in experimental feature that will be coming soon to Windows Insiders in Copilot Labs, starting with a narrow set of use cases while we optimize model performance and learn." Copilot cannot toggle basic Windows settings like dark mode. When asked to analyze a benchmark table in Google Sheets, it "constantly misread clear-as-day scores both in the spreadsheet and in the on-page review."

Read more of this story at Slashdot.

  •  

[Deal] CORSAIR K70 RGB PRO keyboard (MX Speed) at €99.99 delivered

FNAC and Darty are kicking off their Black Friday offers with an excellent price on the CORSAIR K70 RGB PRO, a mechanical gaming keyboard. This is the version equipped with Cherry MX Speed switches, which are linear but actuate at just 1.2 mm...

  •  

IRS Accessed Massive Database of Americans' Flights Without a Warrant

An anonymous reader shares a report: The IRS accessed a database of hundreds of millions of travel records, which show when and where a specific person flew and the credit card they used, without obtaining a warrant, according to a letter signed by a bipartisan group of lawmakers and shared with 404 Media. The country's major airlines, including Delta, United Airlines, American Airlines, and Southwest, funnel customer records to a data broker they co-own called the Airlines Reporting Corporation (ARC), which then sells access to people's travel data to government agencies. The IRS case in the letter is the clearest example yet of how agencies are searching this massive trove of travel data without a search warrant, court order, or similar legal mechanism. Instead, because the data is sold commercially, agencies can simply buy access. In the letter, addressed to nine major airlines, the lawmakers urge them to shut down the data-selling program. (Update: After this piece was published, ARC said it already planned to shut down the program.) "Disclosures made by the IRS to Senator Wyden confirm that it did not follow federal law and its own policies in purchasing airline data from ARC," the letter reads. The letter says the IRS "confirmed that it did not conduct a legal review to determine if the purchase of Americans' travel data requires a warrant."

Read more of this story at Slashdot.

  •  

Federal Judge Rules Meta's Instagram and WhatsApp Purchases Did Not Stifle Competition

A federal judge ruled Tuesday that Meta did not illegally stifle competition when it acquired Instagram and WhatsApp. The decision marks Big Tech's first major victory against antitrust enforcement that began during President Donald Trump's first term. The U.S. Federal Trade Commission had sought to force Meta to sell or restructure the platforms to restore competition among social media networks. Meta argued it faced competitive pressure from TikTok, YouTube, and Apple's messaging app.

Read more of this story at Slashdot.

  •  

Fund Managers Warn AI Investment Boom Has Gone Too Far

A majority of global fund managers think companies are overinvesting, as market anxiety grows about the sustainability of the AI spending boom. From a report: A net 20 per cent of fund managers surveyed this month by Bank of America said companies were spending too much on their investments -- that is, the share saying "too much" exceeded the share saying "too little" by 20 percentage points -- the first time this has been the majority view in data running back to 2005. "This jump is driven by concerns over the magnitude and financing of the AI capex boom," said BofA analysts. The surge in investment to develop AI infrastructure has been a dominant theme in the record rally in US tech stocks this year -- with chipmaker Nvidia becoming the world's first $5tn company last month -- but growing concerns about the sustainability of this spending have caused a pullback on Wall Street in recent weeks.

Read more of this story at Slashdot.

  •  

OpCore and EDF are negotiating a €4 billion datacenter in Seine-et-Marne

Burning coals

OpCore, the joint venture between the iliad Group and funds managed by InfraVia Capital Partners, is set to take over one of EDF's industrial sites to develop a datacenter representing an electrical capacity of several hundred MW. The chosen site is EDF's former coal-fired power plant at Montereau-Vallée-de-la-Seine.

EDF and OpCore announced on Monday that they are entering negotiations ahead of building a large-capacity datacenter on one of the turnkey industrial sites offered by the utility. The former coal plant known as Montereau, in Seine-et-Marne, has been selected for what promises to be a large-scale complex. The site actually straddles the communes of Vernou-La-Celle-sur-Seine and La Grande Paroisse.

The two partners describe an infrastructure equivalent to "several hundred megawatts" of electrical capacity, with an overall budget for OpCore of around €4 billion.

As a reminder, OpCore (formerly Scaleway Datacenter) has been, since this spring, a joint venture between the iliad group (parent company of the operator Free) and the French infrastructure fund InfraVia.

Up to 700 MW across 20 hectares

In their joint press release, issued on the day of an event dedicated to investment in France organized by the Élysée (Choose France, but in an edition aimed at French players), the two partners remain deliberately vague about the figures. The cat was let out of the bag in late July, however, by James Chéron, the mayor of Montereau-Fault-Yonne, who stated at the time that the project involved a maximum electrical capacity of 700 MW and a ground footprint of roughly 20 hectares.
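Taken together, the mayor's figures and OpCore's budget allow some rough orders of magnitude; this is a back-of-envelope reading of the announced numbers, nothing more:

```python
# Figures from the announcement and from the mayor of Montereau-Fault-Yonne.
budget_eur = 4e9   # OpCore's stated envelope, ~4 billion euros
power_mw = 700     # maximum electrical capacity
area_ha = 20       # ground footprint

print(f"~{budget_eur / power_mw / 1e6:.1f} M euros per MW")  # ~5.7 M euros per MW
print(f"~{power_mw / area_ha:.0f} MW per hectare")           # ~35 MW per hectare
```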

Presented as "one of the largest computing centers in Europe," the project "will be a major asset in supporting the emergence of a French and European industry of excellence in artificial intelligence, and will strengthen the continent's digital sovereignty," the two partners boast. The precise purpose and the terms under which the computing resources will be marketed are not discussed at this stage, which does not stop EDF and OpCore from asserting that the datacenter "will create several hundred local jobs, direct and indirect."

"The investment runs into the billions of euros, and our Seine valley can be proud to host this tremendous ambition, which will contribute to building and strengthening a cluster alongside the Renardières research center and the Campus Énergie Durable," said the mayor of Montereau.

A virtuous project?

The press release was accompanied by a press conference held on site, during which the project's backers defended its environmental credentials. According to Jean-Louis Thiériot, member of parliament for Seine-et-Marne, building the datacenter would involve no additional land take, since the buildings would be erected on the footprint of the former coal-fired plant.

The backers also reportedly described closed-loop cooling with no external water withdrawals, reuse of waste heat for local needs, and the use of low-carbon electricity. "It is now up to the communities of communes, which are responsible for economic activity, to grow a favorable ecosystem with companies in the sector," the MP wants to believe. For now, the economic activities closest to the site are an aggregates supplier and a grain storage facility.

The former thermal power plant, chosen for the datacenter, sits immediately beside the Seine, next to the Chesnoy electrical substation (Next screenshot, OpenStreetMap map)

The datacenter is slated to come online starting in 2027, most likely in phases. Beyond EDF's backing, the site also benefits from the "fast track" grid-connection scheme, championed by the government and approved last May by the French energy regulator (CRE), which makes it possible "to request an accelerated connection without limitation to network capacity."

The Montereau site, which today hosts two gas-fired combustion turbines, is connected "to the national power grid through the Chesnoy substation (400 kV/225 kV/63 kV)," according to Wikipedia.

OpCore's datacenter will thus sit some thirty kilometers from another, much-publicized project with a budget ten times larger: the famous Campus IA, whose public consultation has just concluded around Fouju, also in Seine-et-Marne.

  •  

Blender 5.0 Released With Better Vulkan Support, HDR On Wayland

It's Blender 5.0 release day! Blender 5.0 is a big step forward for this open-source 3D modeling software, with better Vulkan viewport support across different GPUs/drivers, HDR support when using Vulkan and Wayland on Linux, and other very nice refinements for this popular cross-platform software package...

  •  

Google Launches Gemini 3, Its 'Most Intelligent' AI Model Yet

Google released Gemini 3 on Tuesday, launching its latest AI model with a breakthrough score of 1501 Elo on the LMArena Leaderboard alongside state-of-the-art performance across multiple benchmarks, including 91.9% on GPQA Diamond for PhD-level reasoning and 37.5% on Humanity's Last Exam without tool usage. The model is available starting today in the Gemini app, AI Mode in Search for Google AI Pro subscribers, Google AI Studio, Vertex AI, and the newly launched Google Antigravity agentic development platform. Third-party platforms including Cursor, GitHub, JetBrains, Manus, and Replit are also gaining access. Separately, Google said AI Overviews now reach 2 billion users every month, and the Gemini app has topped 650 million monthly users.
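For developers, the model is reachable through the same Gemini API surface as its predecessors. A minimal sketch with the google-genai Python SDK; the model identifier used below is an assumption for illustration, so check Google AI Studio for the published name:

```python
from google import genai

# The client reads GEMINI_API_KEY from the environment by default;
# pass api_key="..." explicitly otherwise.
client = genai.Client()

# "gemini-3-pro-preview" is an assumed identifier for illustration only.
response = client.models.generate_content(
    model="gemini-3-pro-preview",
    contents="In two sentences, what is dynamic resource allocation in Kubernetes?",
)
print(response.text)
```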

Read more of this story at Slashdot.

  •  

Microsoft is Adding an 'Experimental Agentic Features' Toggle To Windows 11

Microsoft has rolled out a new preview build for Windows 11 Insiders in the Dev and Beta Channels this week that introduces a new toggle called "experimental agentic features," which can be enabled or disabled in the Windows Settings app. From a report: According to Microsoft, this new toggle is designed to "allow agents to use new Windows agentic features." The company says the feature will work with AI-powered apps, which "help you automate everyday tasks -- like organizing files, scheduling meetings, or sending emails -- so you can spend less time on busy work and more time on what matters most. One powerful way apps are implementing AI today is by interacting with your apps and your files, using vision and advanced reasoning to click, type and scroll like a human would." The description in the Windows Settings app reads: "When this setting is on, agents can use Windows agentic features." Features such as the recently announced Copilot Actions for Windows will take advantage of this new experimental capability.

Read more of this story at Slashdot.

  •  

Radeon RX 9060 non-XT: not such a bad card, but tucked away in prebuilt PCs

Although it isn't available here, the Radeon RX 9060 exists, so we might as well talk about it. Launched in August 2025, it's a SKU absent from the DIY market but present in a few rare prebuilt PCs (as reported two items above, not in France, at least according to our research)... [Read more]
  •