
Thunderobot unveils a MiniPC with a Steam Machine look

I think this is going to be a trend in the months ahead: MiniPCs in the format of Valve's Steam Machine. Thunderobot is announcing this trend without many technical details for now. It must be said that Valve's own unveiling is quite recent.

Thunderobot is a Chinese brand that rarely ventures beyond its borders, barely spilling over into neighboring regions. Yet its new MiniPC, shown at CES 2026 in a cube-like format reminiscent of the Steam Machine, seems to be one of the avenues many manufacturers are considering for 2026.

Equipped with a very capable AMD Ryzen AI Max+ 395 processor and its Radeon 8060S graphics, the machine apparently has a lot in common with the Steam Machine. In particular, a huge heatsink on its upper section and a heat-exhaust design that works the same way as Valve's device.

Thunderobot

The connectivity, however, is far more extensive: the Thunderobot station offers the full set of ports expected of a high-end MiniPC. Two USB 3.2 Type-A ports on the front, one USB 3.2 Type-C, a 3.5 mm combo audio jack, and an SDXC card reader.

On the back: four more USB 3.2 Type-A ports, two HDMI outputs, one DisplayPort, two Ethernet ports, and a Kensington-style lock slot.

No other technical information is available. No details on RAM, storage, or networking. No precise information on the capabilities of the various connectors either.

Similarities, and many differences, between Valve and Thunderobot

The front panel does not feature an LED system like Valve's clever touch on the Steam Machine. But it is possible that the power button, surrounded by what appears to be a transparent area, plays the same role. Let's hope Thunderobot does not once again play the AI card and ship only versions fitted with the usual 128 GB of RAM. Not that the public objects to having plenty of memory; it's simply that this much RAM is priced rather prohibitively at the moment.

I really wouldn't be surprised to see a myriad of mini machines of this kind arrive in the coming quarters. Not necessarily fitted with Strix Halo chips, but adopting this distinctive design. Hard to say to what extent this type of device could be granted a SteamOS license.

Valve announces the Steam Machine, its gaming mini machine

Source : VideoCardz

Thunderobot unveils a MiniPC with a Steam Machine look © MiniMachines.net. 2025

  •  

Samsung's Rolling Ballie Robot Indefinitely Shelved After Delays

Samsung Electronics has once again sidelined Ballie, a long-anticipated robot that was first announced six years ago but never released. Bloomberg News: The device -- designed to roll and roam throughout the home -- is completely absent from this week's CES, the biggest electronics trade show. And though Samsung said last year that Ballie was nearly ready for a retail release, the product is now unlikely to resurface soon. In an emailed statement, Samsung referred to Ballie as an "active innovation platform" within the company, rather than a forthcoming consumer device. "After multiple years of real-world testing, it continues to inform how Samsung designs spatially aware, context-driven experiences, particularly in areas like smart home intelligence, ambient AI and privacy-by-design," a Samsung spokesperson said in the statement.

Read more of this story at Slashdot.

  •  

Hyundai and Boston Dynamics Unveil Humanoid Robot Atlas At CES

At CES 2026 today, Hyundai and Boston Dynamics publicly demonstrated their humanoid robot Atlas, showing off fluid movement and announcing plans to deploy a production version in Hyundai's EV factory by 2028. NBC News reports: "For the first time ever in public, please welcome Atlas to the stage," said Boston Dynamics' Zachary Jackowski as a life-sized robot with two arms and two legs picked itself up from the floor at a Las Vegas hotel ballroom. It then fluidly walked around the stage for several minutes, sometimes waving to the crowd and swiveling its head like an owl. An engineer remotely piloted the robot from nearby for the purpose of the demonstration, though in real life Atlas will move around on its own, said Jackowski, the company's general manager for humanoid robots. [...] Hyundai also announced a new partnership with Google's DeepMind, which will supply its artificial intelligence technology to Boston Dynamics robots. It's a return to a familiar partnership for Google, which bought Boston Dynamics in 2013 before selling it to Japanese tech giant SoftBank several years later. Hyundai acquired it from SoftBank in 2021. [...] At the end of Monday's live Atlas demonstration, which appeared flawless, the humanoid prototype swung its arms in a theatrical gesture to introduce a static model of the new product version of Atlas, which looked slightly different and was blue in color. "I think the question comes back to what are the use cases and where is the applicability of the technology," said Alex Panas, a partner at consultancy McKinsey who helped lead a CES robotics panel that attracted hundreds of people earlier in the day. "In some cases, it may look more humanoid. In some cases, it may not." Either way, Panas said, "the software, the chipsets, the communication, all the other pieces of the technology are coming together, and they will create new applications." You can watch a video of the demonstration on YouTube.

Read more of this story at Slashdot.

  •  

Researchers Make 'Neuromorphic' Artificial Skin For Robots

An anonymous reader quotes a report from Ars Technica: The nervous system does an astonishing job of tracking sensory information, and does so using signals that would drive many computer scientists insane: a noisy stream of activity spikes that may be transmitted to hundreds of additional neurons, where they are integrated with similar spike trains coming from still other neurons. Now, researchers have used spiking circuitry to build an artificial robotic skin, adopting some of the principles of how signals from our sensory neurons are transmitted and integrated. While the system relies on a few decidedly not-neural features, it has the advantage that we have chips that can run neural networks using spiking signals, which would allow this system to integrate smoothly with some energy-efficient hardware to run AI-based control software. [...] These trains of spikes can convey information in four ways: through the shape of an individual pulse, through its magnitude, through the duration of the spike, and through the frequency of the spikes. Spike frequency is the most commonly used means of conveying information in biological systems, and the researchers use it to convey the pressure experienced by a sensor. The remaining forms of information are used to create something akin to a bar code that helps identify which sensor the reading came from. In addition to registering the pressure, the researchers had each sensor send an "I'm still here" signal at regular time intervals. Failure to receive this would be an indication that something has gone wrong with a sensor. The spiking signals allow the next layer of the system to identify any pressure being experienced by the skin, as well as where it originated. This layer can also do basic evaluation of the sensory input: "Pressure-initiated raw pulses from the pulse generator accumulated in the signal cache center until a predefined pain threshold is surpassed, activating a pain signal." 
This can allow the equivalent of basic reflex reactions that don't involve higher-level control systems. For example, the researchers set up a robotic arm covered with their artificial skin, and got it to move the arm whenever it experiences pressure that can cause damage. The second layer also combines and filters signals from the skin before sending the information on to the arm's controller, which is the equivalent of the brain in this situation. So, the same system caused a robotic face to change expressions based on how much pressure its arm was sensing. [...] The skin is designed to be assembled from a collection of segments that can snap together using magnetic interlocks. These automatically link up any necessary wiring, and each segment of skin broadcasts a unique identity code. So, if the system identifies damage, it's relatively easy for an operator to pop out the damaged segment and replace it with fresh hardware, and then update any data that links the new segment's ID with its location. The researchers call their development a neuromorphic robotic e-skin, or NRE-skin. "Neuromorphic" as a term is a bit vague, with some people using it to mean a technology that directly follows the principles used by the nervous system. That's definitely not this skin. Instead, it uses "neuromorphic" far more loosely, with the operation of the nervous system acting as an inspiration for the system. The findings have been published in the journal PNAS.
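The encoding scheme the article describes — pressure mapped to spike frequency, a per-sensor identity code, a periodic "I'm still here" heartbeat, and a pain signal once accumulated pulses cross a threshold — can be sketched in a few lines. This is a minimal illustrative simulation, not the researchers' actual implementation; the class name, thresholds, and event format are all assumptions.

```python
# Illustrative sketch of the spike-encoding ideas from the article:
# rate coding (pressure -> pulse count), a heartbeat for liveness,
# and a "pain" reflex once accumulated pulses pass a threshold.
# All names and numeric values here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class SpikingSensor:
    sensor_id: int
    pain_threshold: int = 50      # accumulated pulses before "pain" fires
    heartbeat_every: int = 100    # timesteps between liveness spikes
    _accumulated: int = field(default=0, init=False)

    def step(self, t: int, pressure: float) -> list:
        """Advance one timestep; return the spike events emitted."""
        events = []
        # Heartbeat: a periodic "I'm still here" spike; a missing
        # heartbeat would indicate a failed sensor segment.
        if t % self.heartbeat_every == 0:
            events.append(f"heartbeat:{self.sensor_id}")
        # Rate coding: pulse count proportional to applied pressure.
        pulses = int(pressure * 10)
        events.extend(f"pulse:{self.sensor_id}" for _ in range(pulses))
        # Pain: raw pulses accumulate until a threshold is surpassed,
        # triggering a reflex-level signal without involving the "brain".
        self._accumulated += pulses
        if self._accumulated > self.pain_threshold:
            events.append(f"pain:{self.sensor_id}")
            self._accumulated = 0  # reset after the reflex fires
        return events

sensor = SpikingSensor(sensor_id=7)
log = []
for t in range(6):
    log.extend(sensor.step(t, pressure=1.0))  # sustained strong pressure
print(log.count("pulse:7"), "pain:7" in log)  # -> 60 True
```

Tagging each event with the sensor's ID plays the role of the "bar code" the researchers describe, letting a downstream layer localize where on the skin the pressure originated.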

Read more of this story at Slashdot.

  •  

Researchers Show Some Robots Can Be Hijacked Just Through Spoken Commands

An anonymous Slashdot reader shared this story from Interesting Engineering: Cybersecurity specialists from the research group DARKNAVY have demonstrated how modern humanoid robots can be compromised and weaponised through weaknesses in their AI-driven control systems. In a controlled test, the team demonstrated that a commercially available humanoid robot could be hijacked with nothing more than spoken commands, exposing how voice-based interaction can serve as an attack vector rather than a safeguard, reports Yicaiglobal... Using short-range wireless communication, the hijacked machine transmitted the exploit to another robot that was not connected to the network. Within minutes, this second robot was also taken over, demonstrating how a single breach could cascade through a group of machines. To underline the real-world implications, the researchers issued a hostile command during the demonstration. The robot advanced toward a mannequin on stage and struck it, illustrating the potential for physical harm.

Read more of this story at Slashdot.
