Today — June 26, 2024
Main feed

HYBA GZ1 XL gas barbecue

June 20, 2024 at 06:37
€29.70 - Carrefour

Fancy a small barbecue for this summer? ;)

The discount is applied instantly at checkout, so it's a great deal!

Gas barbecue on a cart:

  • Dimensions: 61 x 40 cm
  • 3 independently adjustable stainless-steel burners
  • Total output: 9 kW
  • Enameled-steel cooking grate
  • Piezo ignition
  • Built-in grease collector
  • Spice basket
  • Large side shelf for food preparation
  • Lid thermometer for oven-style cooking
  • Quick and easy assembly
  • Storage space for the gas bottle

Topaz Photo AI is now $20 off for a limited time

By: PR admin
June 26, 2024 at 01:41

Topaz Photo AI is now $20 off until July 2. Topaz recently released Photo AI version 3.0.4 with bug fixes and UI updates:

  • Updated status indicator
  • Other UI/UX tweaks
  • Fixed crop, upscale to specific pixel dimensions, and face recovery causing disjointed face output
  • Fixed crop and upscale causing the preview area to shrink
  • Added version number to the title bar
  • New frameless window implementation on Windows to fix black window area issue and allow hovering over the maximize button to show the snap options
  • Lensfun Update

Via NikonRumors


2024 French legislative elections: key takeaways from the debate between Gabriel Attal, Jordan Bardella and Manuel Bompard

Five days before the first round of the election, the prime minister, the president of the RN and the coordinator of La France insoumise, representing the Nouveau Front populaire, clashed on a range of topics.


Researchers Upend AI Status Quo By Eliminating Matrix Multiplication In LLMs

By: BeauHD
June 26, 2024 at 00:50
Researchers from UC Santa Cruz, UC Davis, LuxiTech, and Soochow University have developed a new method to run AI language models more efficiently by eliminating matrix multiplication, potentially reducing the environmental impact and operational costs of AI systems. Ars Technica's Benj Edwards reports:

Matrix multiplication (often abbreviated to "MatMul") is at the center of most neural network computational tasks today, and GPUs are particularly good at executing the math quickly because they can perform large numbers of multiplication operations in parallel. [...] In the new paper, titled "Scalable MatMul-free Language Modeling," the researchers describe creating a custom 2.7 billion parameter model without using MatMul that achieves performance similar to conventional large language models (LLMs). They also demonstrate running a 1.3 billion parameter model at 23.8 tokens per second on a GPU accelerated by a custom-programmed FPGA chip that uses about 13 watts of power (not counting the GPU's power draw). The implication is that a more efficient FPGA "paves the way for the development of more efficient and hardware-friendly architectures," they write.

The paper doesn't provide power estimates for conventional LLMs, but this post from UC Santa Cruz estimates about 700 watts for a conventional model. However, in our experience, you can run a 2.7B parameter version of Llama 2 competently on a home PC with an RTX 3060 (which uses about 200 watts peak) powered by a 500-watt power supply. So, if you could theoretically run an LLM entirely in only 13 watts on an FPGA (without a GPU), that would be a 38-fold decrease in power usage.

The technique has not yet been peer-reviewed, but the researchers -- Rui-Jie Zhu, Yu Zhang, Ethan Sifferman, Tyler Sheaves, Yiqiao Wang, Dustin Richmond, Peng Zhou, and Jason Eshraghian -- claim that their work challenges the prevailing paradigm that matrix multiplication operations are indispensable for building high-performing language models. They argue that their approach could make large language models more accessible, efficient, and sustainable, particularly for deployment on resource-constrained hardware like smartphones. [...]

The researchers say that scaling laws observed in their experiments suggest that the MatMul-free LM may also outperform traditional LLMs at very large scales. They project that their approach could theoretically intersect with and surpass the performance of standard LLMs at scales around 10^23 FLOPS, which is roughly equivalent to the training compute required for models like Meta's Llama-3 8B or Llama-2 70B. However, the authors note that their work has limitations: the MatMul-free LM has not been tested on extremely large-scale models (e.g., 100 billion-plus parameters) due to computational constraints. They call for institutions with larger resources to invest in scaling up and further developing this lightweight approach to language modeling.
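
The core trick, in the spirit of the BitNet-style ternary quantization the paper builds on, is constraining weights to {-1, 0, +1}: every dot product inside a "matrix multiplication" then collapses into signed additions, with no multiplies at all. Here is a minimal NumPy sketch of that idea; the function name and shapes are illustrative, not taken from the paper's code:

```python
import numpy as np

def ternary_linear(x, w):
    """MatMul-free 'linear layer': with every weight in {-1, 0, +1},
    each output unit is just a signed sum of inputs -- no multiplications."""
    d_in, d_out = w.shape
    y = np.zeros(d_out)
    for j in range(d_out):
        for i in range(d_in):
            if w[i, j] == 1:
                y[j] += x[i]      # weight +1: add the input
            elif w[i, j] == -1:
                y[j] -= x[i]      # weight -1: subtract the input
            # weight 0: contributes nothing
    return y

rng = np.random.default_rng(0)
x = rng.normal(size=8)                    # toy activations
w = rng.integers(-1, 2, size=(8, 4))      # random ternary weights
assert np.allclose(ternary_linear(x, w), x @ w)  # matches the matmul it replaces
```

The assertion confirms the signed-sum version reproduces x @ w exactly; the efficiency win comes in hardware, where adders are far cheaper in silicon than multiply units, which is what makes a 13-watt FPGA implementation plausible.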


MTV News Website Goes Dark, Archives Pulled Offline

By: BeauHD
June 26, 2024 at 00:10
MTVNews.com has been shut down, with more than two decades' worth of content no longer available. "Content on its sister site, CMT.com, seems to have met a similar fate," adds Variety. From the report:

In 2023, MTV News was shuttered amid the financial woes of parent company Paramount Global. As of Monday, trying to access MTV News articles on mtvnews.com or mtv.com/news redirected visitors to the main MTV website. The now-unavailable content includes decades of music journalism comprising thousands of articles and interviews with countless major artists, dating back to the site's launch in 1996. Perhaps the most significant loss is MTV News' vast hip-hop archives, particularly its weekly "Mixtape Monday" column, which ran for nearly a decade in the 2000s and 2010s and featured interviews, reviews and more with many artists, producers and others early in their careers.

"So, mtvnews.com no longer exists. Eight years of my life are gone without a trace," Patrick Hosken, former music editor for MTV News, wrote on X. "All because it didn't fit some executives' bottom lines. Infuriating is too small a word."

"sickening (derogatory) to see the entire @mtvnews archive wiped from the internet," Crystal Bell, culture editor at Mashable and one-time entertainment director of MTV News, posted on X. "decades of music history gone... including some very early k-pop stories."

"This is disgraceful. They've completely wiped the MTV News archive," longtime Rolling Stone senior writer Brian Hiatt commented. "Decades of pop culture history research material gone, and why?"

The report notes that some MTV News articles may be available via internet archiving services like the Wayback Machine. However, many older articles aren't available there.
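
As a practical footnote, checking whether a dead article survives in the Wayback Machine can be scripted against the Internet Archive's public availability endpoint. A minimal sketch; the helper name and example URL are illustrative assumptions:

```python
import json
import urllib.parse
import urllib.request

def wayback_snapshot(url):
    """Ask the Wayback Machine's availability API for the closest
    archived snapshot of `url`; return its archive URL, or None."""
    api = ("https://archive.org/wayback/available?url="
           + urllib.parse.quote(url, safe=""))
    with urllib.request.urlopen(api, timeout=10) as resp:
        data = json.load(resp)
    closest = data.get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest and closest.get("available") else None

# Example: look for an archived copy of the old MTV News front page.
print(wayback_snapshot("http://www.mtv.com/news/"))
```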

