China censors WhatsApp, Telegram, and Signal

April 19, 2024 at 08:12

Already blocked since 2017, WhatsApp is now nowhere to be found in China, by order of the government. The Chinese authorities do not want to risk the app being used with a VPN. Other Western messaging services have met the same fate.

Meta AI and Llama 3: understanding Facebook and Instagram's strategy to dethrone ChatGPT

April 18, 2024 at 16:31

Meta AI, a chatbot available only in English for now, is becoming even more capable thanks to the new Llama 3 language model, whose first two versions (with 8 billion and 70 billion parameters) were unveiled today. Meta's goal is to overtake OpenAI and Google, leveraging its 3 billion users worldwide.

Samsung Unveils 10.7Gbps LPDDR5X Memory - The Fastest Yet

April 17, 2024 at 14:00

Samsung today announced that it has developed an even faster generation of LPDDR5X memory that is set to top out at LPDDR5X-10700 speeds. The updated memory is slated to offer 25% better performance and 30% greater capacity compared to existing mobile DRAM devices from the company. The new chips also appear to be tangibly faster than Micron's LPDDR5X memory and SK hynix's LPDDR5T chips.

Samsung's forthcoming LPDDR5X devices feature a data transfer rate of 10.7 GT/s as well as maximum capacity per stack of 32 GB. This allows Samsung's clients to equip their latest smartphones or laptops with 32 GB of low-power memory using just one DRAM package, which greatly simplifies their designs. Samsung says that 32 GB of memory will be particularly beneficial for on-device AI applications.
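
For a sense of scale, the per-package peak bandwidth follows from the per-pin data rate and the bus width. A minimal sketch in Python is below; the 64-bit package bus width is an assumption chosen for illustration, not a figure from Samsung's announcement.

```python
# Back-of-the-envelope peak bandwidth for an LPDDR5X-10700 package.
# Assumption (not from the announcement): the package exposes a 64-bit bus,
# a common width for a single LPDDR package in phones and laptops.

data_rate_gtps = 10.7   # transfers per second per pin (GT/s)
bus_width_bits = 64     # assumed package interface width

peak_gbps = data_rate_gtps * bus_width_bits / 8  # bits -> bytes
print(f"~{peak_gbps:.1f} GB/s peak per package")  # ~85.6 GB/s
```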

Samsung is using its latest-generation 12nm-class DRAM process technology to make its LPDDR5X-10700 devices, which allows the company to achieve the smallest LPDDR device size in the industry, the memory maker said.

In terms of power efficiency, Samsung claims that it has integrated multiple new power-saving features into the new LPDDR5X devices. These include an optimized power variation system that adjusts energy consumption based on workload, and expanded intervals for low-power mode that extend the periods of energy saving. These innovations collectively enhance power efficiency by 25% compared to earlier versions, benefiting mobile platforms by extending battery life, the company said.

“As demand for low-power, high-performance memory increases, LPDDR DRAM is expected to expand its applications from mainly mobile to other areas that traditionally require higher performance and reliability such as PCs, accelerators, servers and automobiles,” said YongCheol Bae, Executive Vice President of Memory Product Planning of the Memory Business at Samsung Electronics. “Samsung will continue to innovate and deliver optimized products for the upcoming on-device AI era through close collaboration with customers.”

Samsung plans to initiate mass production of the 10.7 GT/s LPDDR5X DRAM in the second half of this year. This follows a series of compatibility tests with mobile application processors and device manufacturers to ensure seamless integration into future products.

SK hynix to Build $3.87 Billion Memory Packaging Fab in the U.S. for HBM4 and Beyond

April 5, 2024 at 11:00

SK hynix this week announced plans to build its advanced memory packaging facility in West Lafayette, Indiana. The move can be considered a milestone for both the memory maker and the U.S., as this is the first advanced memory packaging facility in the country and the company's first significant manufacturing operation in America. The facility will be used to build next-generation high-bandwidth memory (HBM) stacks when it begins operations in 2028. SK hynix has also agreed to work on R&D projects with Purdue University.

"We are excited to become the first in the industry to build a state-of-the-art advanced packaging facility for AI products in the United States that will help strengthen supply-chain resilience and develop a local semiconductor ecosystem," said SK hynix CEO Kwak Noh-Jung.

One of the Most Advanced Chip Packaging Facilities Ever

The facility will handle assembly of HBM known good stacked dies (KGSDs), which consist of multiple memory devices stacked on a base die. Furthermore, it will be used to develop next generations of HBM and will therefore house a packaging R&D line. However, the plant will not make the DRAM dies themselves and will likely source them from SK hynix's fabs in South Korea.

The plant will require SK hynix to invest $3.87 billion, which will make it one of the most advanced semiconductor packaging facilities in the world. Meanwhile, SK hynix held the investment agreement ceremony with representatives from the state of Indiana, Purdue University, and the U.S. government, which points to the parties financially involved in the project, though this week's event did not disclose whether SK hynix will receive any money from the U.S. government under the CHIPS Act or other funding initiatives.

The cost of the facility significantly exceeds that of packaging facilities built by other major players in the industry, such as ASE Group, Intel, and TSMC, which highlights how significant an investment this is for SK hynix. In fact, $3.87 billion is higher than the advanced packaging CapEx budgets of Intel, TSMC, and Samsung in 2023, based on estimates from Yole Intelligence.

Given that the fab comes online in 2028, based on SK hynix's product roadmap we'd expect that it will be used at least in part to assemble HBM4 and HBM4E stacks. Notably, since HBM4 and HBM4E stacks are set to feature a 2048-bit interface, their packaging process will be considerably more complex than the existing 1024-bit HBM3/HBM3E packaging and will require more advanced tools, which is why the facility is poised to be more expensive than some existing advanced packaging facilities. Due to the extremely complex 2048-bit interface, many chip designers planning to use HBM4/HBM4E are expected to integrate the memory directly onto their processors using hybrid bonding rather than silicon interposers. Unfortunately, it is unclear whether the SK hynix facility will be able to offer such a service.

HBM is mainly used for AI and HPC applications, so it is strategically important to have its production in the U.S. Meanwhile, actual memory dies will still need to be made elsewhere, at dedicated DRAM fabs.

Purdue University Collaboration

In addition to support set to be provided by state and local governments, SK hynix chose to establish its new facility in West Lafayette, Indiana, in order to collaborate with Purdue University and Purdue's Birck Nanotechnology Center on R&D projects, including advanced packaging and heterogeneous integration.

SK hynix intends to work in partnership with Purdue University and Ivy Tech Community College to create training programs and multidisciplinary degree courses aimed at nurturing a skilled workforce and establishing a consistent stream of emerging talent for its advanced memory packaging facility and R&D operations.

"SK hynix is the global pioneer and dominant market leader in memory chips for AI," Purdue University President Mung Chiang said. "This transformational investment reflects our state and university's tremendous strength in semiconductors, hardware AI, and hard tech corridor. It is also a monumental moment for completing the supply chain of digital economy in our country through chips advanced packaging. Located at Purdue Research Park, the largest facility of its kind at a U.S. university will grow and succeed through innovation."

Deepfakes: where the millions of doctored sexual images come from

March 19, 2024 at 16:35

As more and more people, overwhelmingly women, fall victim to sexually explicit deepfakes, policing them on the web has become impossible given how widely the tools have proliferated. Numerama looked into these face-swapping "factory" channels.

SK Hynix Starts Mass Production of HBM3E: 9.2 GT/s

March 19, 2024 at 13:30

SK Hynix said that it had started volume production of its HBM3E memory and would supply it to a customer in late March. The South Korean company is the second DRAM producer to announce mass production of HBM3E, so the market for ultra-high-performance memory will have some competition, which is good for companies that plan to use HBM3E.

According to specifications, SK Hynix's HBM3E known good stack dies (KGSDs) feature data transfer rates up to 9.2 GT/s, a 1024-bit interface, and a bandwidth of 1.18 TB/s, which is massively higher than the 6.4 GT/s and 819 GB/s offered by HBM3. The company does not say whether it mass produces 8Hi 24GB HBM3E memory modules or 12Hi 36GB HBM3E devices, but it will likely begin its HBM3E ramp from lower-capacity products as they are easier to make.
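
The quoted numbers follow directly from the per-pin data rate and the 1024-bit interface width; a quick sanity check in Python:

```python
# Per-stack bandwidth = per-pin data rate (GT/s) * interface width (bits) / 8.

def stack_bandwidth_gbps(data_rate_gtps: float, interface_bits: int = 1024) -> float:
    return data_rate_gtps * interface_bits / 8

print(stack_bandwidth_gbps(9.2))  # HBM3E: 1177.6 GB/s, i.e. ~1.18 TB/s
print(stack_bandwidth_gbps(6.4))  # HBM3:   819.2 GB/s
```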

We already know that SK Hynix's HBM3E stacks employ the company's advanced Mass Reflow Molded Underfill (MR-MUF) technology, which promises to improve heat dissipation by 10%. This technology involves the use of an enhanced underfill between DRAM layers, which not only improves heat dissipation but also reduces the thickness of HBM stacks. As a result, 12-Hi HBM stacks can be constructed that are the same height as 8-Hi modules. However, this does not necessarily imply that the stacks currently in mass production are 12-Hi HBM3E stacks.

Although the memory maker does not officially confirm this, SK Hynix's 24GB HBM3E stacks will arrive just in time to address NVIDIA's Blackwell accelerator family for artificial intelligence and high-performance computing applications.

"With the success story of the HBM business and the strong partnership with customers that it has built for years, SK Hynix will cement its position as the total AI memory provider," said Sungsoo Ryu, Head of HBM business at SK Hynix. As a result, NVIDIA will have access to HBM3E memory from multiple suppliers with both Micron and SK Hynix.

Meanwhile, AMD recently confirmed that it was looking forward to expanding its Instinct MI300-series lineup for AI and HPC applications with higher-performance memory configurations, so SK Hynix's HBM3E memory could also be used for this.

"My darlings, I have big news to announce": what is Royaltiz, the platform promoted by Cyril Hanouna?

By: Aurore Gayte
March 18, 2024 at 17:15

Cyril Hanouna promoted Royaltiz, a site that lets people invest in celebrities, on X (formerly Twitter). The company's financial mechanisms have been criticized by some users, but Royaltiz defends its business and maintains that it does not break the law.

How to tell if you have been blocked on Instagram?

You no longer see a particular person's posts in your Instagram feed? They no longer reply to your private messages? It is quite likely that they have blocked you so you can no longer see their account. But how can you be sure?

Facebook, Messenger, and Instagram (Meta) are down? You are not the only ones

By: Marie Turcan
March 5, 2024 at 15:35

Meta's apps (Facebook, Messenger, Instagram) stopped working on March 5, 2024. The outage appears to be global: users are finding themselves logged out of their Facebook and Messenger accounts and unable to log back in.

How to delete an Instagram account?

By: Aurore Gayte
January 15, 2024 at 15:36

Instagram is over for you. You have made up your mind to leave the social network, but you do not know how to delete an Instagram account? Numerama explains step by step how to remove the app from all your devices (Android and iOS phones as well as computers), without forgetting to download your photos and your data.
