Reading view

AI-Trained Surgical Robot Removes Pig Gallbladders Without Any Human Help

An anonymous reader quotes a report from The Guardian: Automated surgery could be trialled on humans within a decade, say researchers, after an AI-trained robot armed with tools to cut, clip and grab soft tissue successfully removed pig gallbladders without human help. The robot surgeons were schooled on video footage of human medics conducting operations using organs taken from dead pigs. In an apparent research breakthrough, a team led by experts at Johns Hopkins University in Baltimore conducted eight operations on pig organs with a 100% success rate. [...] The technology allowing robots to handle complex soft tissues such as gallbladders, which release bile to aid digestion, is rooted in the same type of computerized neural networks that underpin widely used artificial intelligence tools such as ChatGPT or Google Gemini. The surgical robots were slightly slower than human doctors, but they were less jerky and plotted shorter trajectories between tasks. The robots were also able to repeatedly correct mistakes as they went along, ask for different tools and adapt to anatomical variation, according to a peer-reviewed paper published in the journal Science Robotics. The authors from Johns Hopkins, Stanford and Columbia universities called it "a milestone toward clinical deployment of autonomous surgical systems." [...] In the Johns Hopkins trial, the robots took just over five minutes to carry out the operation, which required 17 steps, including cutting the gallbladder away from its connection to the liver, applying six clips in a specific order and removing the organ. The robots on average corrected course without any human help six times in each operation. "We were able to perform a surgical procedure with a really high level of autonomy," said Axel Krieger, assistant professor of mechanical engineering at Johns Hopkins. "In prior work, we were able to do some surgical tasks like suturing. What we've done here is really a full procedure. We have done this on eight gallbladders, where the robot was able to perform precisely the clipping and cutting step of gallbladder removal without any human intervention. So I think it's a really big landmark study that such a difficult soft tissue surgery is possible to do autonomously." Currently, nearly all of the NHS's 70,000 annual robotic surgeries are human-controlled, but the UK plans to expand robot-assisted procedures to 90% within the next decade.
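
The approach described, training a control policy on video of human demonstrations, is broadly known as imitation learning. As a rough, self-contained illustration of the idea only (the paper's actual architecture is not described in this summary, and every name and tensor shape below is an assumption), a behavior-cloning loop in PyTorch might look like this:

```python
# Minimal behavior-cloning sketch (illustrative only, not the paper's model).
# A policy network maps camera frames to instrument actions and is trained
# to imitate actions recorded from human demonstrations.
import torch
import torch.nn as nn

class SurgicalPolicy(nn.Module):
    def __init__(self, action_dim=7):  # assumed: 6-DoF tool pose + gripper
        super().__init__()
        self.encoder = nn.Sequential(  # toy encoder for 64x64 RGB frames
            nn.Conv2d(3, 32, 5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, 5, stride=2), nn.ReLU(),
            nn.Flatten(),
        )
        self.head = nn.Sequential(nn.LazyLinear(256), nn.ReLU(),
                                  nn.Linear(256, action_dim))

    def forward(self, frames):
        return self.head(self.encoder(frames))

policy = SurgicalPolicy()
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)

# Stand-ins for real demonstration data: video frames paired with the
# human surgeon's recorded actions at each timestep.
demo_frames = torch.randn(32, 3, 64, 64)
demo_actions = torch.randn(32, 7)

for step in range(100):
    predicted = policy(demo_frames)
    loss = nn.functional.mse_loss(predicted, demo_actions)  # imitate the demos
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

A real system would condition on much richer inputs (stereo video, kinematics, language) and emit low-level instrument commands, but the core supervision, matching the demonstrator's actions, is the same.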

Read more of this story at Slashdot.


Hugging Face Launches $299 Robot That Could Disrupt Entire Robotics Industry

An anonymous reader quotes a report from VentureBeat: Hugging Face, the $4.5 billion artificial intelligence platform that has become the GitHub of machine learning, announced Tuesday the launch of Reachy Mini, a $299 desktop robot designed to bring AI-powered robotics to millions of developers worldwide. The 11-inch humanoid companion represents the company's boldest move yet to democratize robotics development and challenge the industry's traditional closed-source, high-cost model. The announcement comes as Hugging Face crosses a significant milestone of 10 million AI builders using its platform, with CEO Clement Delangue revealing in an exclusive interview that "more and more of them are building in relation to robotics." The compact robot, which can sit on any desk next to a laptop, addresses what Delangue calls a fundamental barrier in robotics development: accessibility. "One of the challenges with robotics is that you know you can't just build on your laptop. You need to have some sort of robotics partner to help in your building, and most people won't be able to buy $70,000 robots," Delangue explained, referring to traditional industrial robotics systems and even newer humanoid robots like Tesla's Optimus, which is expected to cost $20,000-$30,000. Reachy Mini emerges from Hugging Face's April acquisition of French robotics startup Pollen Robotics, marking the company's most significant hardware expansion since its founding. The robot represents the first consumer product to integrate natively with the Hugging Face Hub, allowing developers to access thousands of pre-built AI models and share robotics applications through the platform's "Spaces" feature. [...] Reachy Mini packs sophisticated capabilities into its compact form factor. The robot features six degrees of freedom in its moving head, full body rotation, animated antennas, a wide-angle camera, multiple microphones, and a 5-watt speaker. The wireless version includes a Raspberry Pi 5 computer and battery, making it fully autonomous. The robot ships as a DIY kit and can be programmed in Python, with JavaScript and Scratch support planned. Pre-installed demonstration applications include face and hand tracking, smart companion features, and dancing moves. Developers can create and share new applications through Hugging Face's Spaces platform, potentially creating what Delangue envisions as "thousands, tens of thousands, millions of apps." Reachy Mini's $299 price point could significantly transform robotics education and research. "Universities, coding bootcamps, and individual learners could use the platform to explore robotics concepts without requiring expensive laboratory equipment," reports VentureBeat. "The open-source nature enables educational institutions to modify hardware and software to suit specific curricula. Students could progress from basic programming exercises to sophisticated AI applications using the same platform, potentially accelerating robotics education and workforce development." "... For the first time, a major AI platform is betting that the future of robotics belongs not in corporate research labs, but in the hands of millions of individual developers armed with affordable, open-source tools."
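
The report says Reachy Mini can be programmed in Python. For a feel of what a high-level script might look like, here is a purely hypothetical sketch; the `reachy_mini` module and every method on it are assumed names invented for illustration, not Hugging Face's published API:

```python
# Purely hypothetical usage sketch -- the `reachy_mini` module and every
# method below are assumed names, not Hugging Face's documented API.
import time

from reachy_mini import ReachyMini  # assumed package and class name

robot = ReachyMini()  # assumed: connect to the desktop robot

# Sweep the 6-DoF head side to side while animating the antennas, the kind
# of "smart companion" behavior the preinstalled demo apps are said to show.
for yaw_degrees in (-30, 0, 30, 0):
    robot.head.look_at(yaw=yaw_degrees, pitch=0)  # assumed method
    robot.antennas.wiggle()                       # assumed method
    time.sleep(1.0)

robot.speaker.say("Hello from Reachy Mini!")      # assumed TTS helper
```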

Read more of this story at Slashdot.


Swarms of Tiny Nose Robots Could Clear Infected Sinuses, Researchers Say

An anonymous reader quotes a report from The Guardian: Swarms of tiny robots, each no larger than a speck of dust, could be deployed to cure stubborn infected sinuses before being blown out through the nose into a tissue, researchers have claimed. The micro-robots are a fraction of the width of a human hair and have been inserted successfully into animal sinuses in pre-clinical trials by researchers at universities in China and Hong Kong. Swarms are injected into the sinus cavity via a duct threaded through the nostril and guided by electromagnetism to their target, where they can be made to heat up and catalyze chemical reactions that wipe out bacterial infections. There are hopes the precisely targeted technology could eventually reduce reliance on antibiotics and other generalized medicines. [...] The latest breakthrough, based on animal rather than human trials, involves magnetic particles "doped" with copper atoms, which clinicians insert with a catheter and guide to their target under a magnetic field. The swarms can be heated up by reacting to light from an optical fibre that is also inserted into the body as part of the therapy. This allows the micro-robots to loosen up and penetrate the viscous pus that forms a barrier to the infection site. The light source also prompts the micro-robots to disrupt bacterial cell walls and release reactive oxygen species that kill the bacteria. The study, published in Science Robotics, showed the robots were capable of eradicating bacteria from pig sinuses and could clear infections in live rabbits with "no obvious tissue damage." The researchers have produced a model of how the technology could work on a human being, with the robot swarms being deployed in operating theatre conditions, allowing doctors to track their progress using X-rays. Future applications could include tackling bacterial infections of the respiratory tract, stomach, intestine, bladder and urethra, they suggested. "Our proposed micro-robotic therapeutic platform offers the advantages of non-invasiveness, minimal resistance, and drug-free intervention," they said.

Read more of this story at Slashdot.


Google Rolls Out New Gemini Model That Can Run On Robots Locally

Google DeepMind has launched Gemini Robotics On-Device, a new vision-language-action model that enables robots to perform complex tasks locally, without internet connectivity. TechCrunch reports: Building on the company's previous Gemini Robotics model, released in March, Gemini Robotics On-Device can control a robot's movements. Developers can control and fine-tune the model to suit various needs using natural language prompts. In benchmarks, Google claims the model performs at a level close to the cloud-based Gemini Robotics model. The company says it outperforms other on-device models in general benchmarks, though it didn't name those models. In a demo, the company showed robots running this local model doing things like unzipping bags and folding clothes. Google says that while the model was trained for ALOHA robots, it later adapted it to work on a bi-arm Franka FR3 robot and the Apollo humanoid robot by Apptronik. Google claims the bi-arm Franka FR3 successfully tackled scenarios and objects it hadn't "seen" before, like doing assembly on an industrial belt. Google DeepMind is also releasing a Gemini Robotics SDK. The company said developers can train robots on new tasks by showing them 50 to 100 demonstrations in the MuJoCo physics simulator.
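
Neither TechCrunch nor this summary documents the SDK's actual interface, so the following is only a schematic of what demonstration-based fine-tuning could look like; the module, class, and method names are all assumptions made up for illustration:

```python
# Schematic only: every module, class, and method name here is an assumption
# invented for illustration, not Google's actual Gemini Robotics SDK.
from gemini_robotics_sdk import OnDeviceModel, MujocoRecorder  # assumed

model = OnDeviceModel.load("gemini-robotics-on-device")  # assumed checkpoint id

# Record on the order of 50-100 teleoperated demonstrations of the new task
# in the MuJoCo physics simulator, as the report describes.
recorder = MujocoRecorder(task="fold_shirt")  # assumed helper
demos = [recorder.record_episode() for _ in range(60)]

# Fine-tune the local model on the demonstrations, conditioned on a
# natural-language description of the task.
model.finetune(demonstrations=demos, instruction="Fold the shirt neatly")
model.save("fold_shirt_policy")
```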

Read more of this story at Slashdot.


Scientists Built a Badminton-Playing Robot With AI-Powered Skills

An anonymous reader quotes a report from Ars Technica: The robot built by [Yuntao Ma and his team at ETH Zurich] was called ANYmal and resembled a miniature giraffe that plays badminton by holding a racket in its teeth. It was a quadruped platform developed by ANYbotics, an ETH Zurich spinoff company that mainly builds robots for the oil and gas industries. "It was an industry-grade robot," Ma said. The robot had elastic actuators in its legs, weighed roughly 50 kilograms, and was half a meter wide and under a meter long. On top of the robot, Ma's team fitted an arm with several degrees of freedom produced by another ETH Zurich spinoff called Duatic. This is what would hold and swing a badminton racket. Shuttlecock tracking and sensing the environment were done with a stereoscopic camera. "We've been working to integrate the hardware for five years," Ma said. Along with the hardware, his team was also working on the robot's brain. State-of-the-art robots usually use model-based control optimization, a time-consuming, sophisticated approach that relies on a mathematical model of the robot's dynamics and environment. "In recent years, though, the approach based on reinforcement learning algorithms became more popular," Ma told Ars. "Instead of building advanced models, we simulated the robot in a simulated world and let it learn to move on its own." In ANYmal's case, this simulated world was a badminton court where its digital alter ego was chasing after shuttlecocks with a racket. The training was divided into repeatable units, each of which required that the robot predict the shuttlecock's trajectory and hit it with a racket six times in a row. During this training, like a true sportsman, the robot also got to know its physical limits and learned to work around them. The idea behind training the control algorithms was to develop visuo-motor skills similar to those of human badminton players. The robot was supposed to move around the court, anticipating where the shuttlecock might go next and positioning its whole body, using all available degrees of freedom, for a swing that would mean a good return. This is why balancing perception and movement played such an important role. The training procedure included a perception model based on real camera data, which taught the robot to keep the shuttlecock in its field of view while accounting for the noise and resulting object-tracking errors. By the end of training, the robot had learned to position itself on the court. It figured out that the best strategy after a successful return is to move back to the center and toward the backline, which is something human players do. It even came up with a trick where it stood on its hind legs to see the incoming shuttlecock better. It also learned fall avoidance and determined how much risk was reasonable to take given its limited speed. The robot did not attempt impossible plays that would create the potential for serious damage -- it was committed, but not suicidal. But when it finally played humans, it turned out ANYmal, as a badminton player, was amateur at best. The findings have been published in the journal Science Robotics. You can watch a video of the four-legged robot playing badminton on YouTube.
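
The training setup described, letting the robot learn by trial and error in a simulated world, is the standard reinforcement-learning loop: observe, act, collect reward, update the policy. The team's badminton environment isn't public, so the minimal sketch below uses Gymnasium's stock Pendulum-v1 task purely as a runnable stand-in; the loop structure, not the task, is what carries over:

```python
# Generic reinforcement-learning loop of the kind described: act in a
# simulated world, collect reward, improve the policy. Pendulum-v1 is a
# runnable stand-in; the team's badminton court environment is not public.
import gymnasium as gym

env = gym.make("Pendulum-v1")  # stand-in for the simulated badminton court

observation, info = env.reset(seed=0)
for step in range(1000):
    action = env.action_space.sample()  # a trained policy would act here
    observation, reward, terminated, truncated, info = env.step(action)
    # In real training (e.g. PPO), (observation, action, reward) transitions
    # are stored and used to update the policy network between episodes.
    if terminated or truncated:
        observation, info = env.reset()
env.close()
```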

Read more of this story at Slashdot.


Amazon Prepares To Test Humanoid Robots for Delivering Packages

Amazon is developing software for humanoid robots that could eventually replace hundreds of thousands of delivery workers, The Information reports [non-paywalled source]. The company is building a "humanoid park" obstacle course at its San Francisco office to test robots that would ride in the back of Amazon's Rivian electric vans and deliver packages to customers, the report said. The indoor testing facility, roughly the size of a coffee shop, will house a Rivian van and serve as a controlled environment before Amazon takes the robots on "field trips" to deliver real packages on actual streets. This summer, Amazon plans to test multiple humanoid models, including a $16,000 unit from China-based Unitree that has gained popularity among robotics developers, the report said. The initiative represents Amazon's most ambitious robotics project yet, extending beyond its existing warehouse automation to tackle the significantly more complex challenge of outdoor package delivery. Amazon currently operates more than 20,000 Rivian vehicles for deliveries and plans to expand its electric fleet to 100,000 vehicles by 2030.

Read more of this story at Slashdot.


Hugging Face Introduces Two Open-Source Robot Designs

An anonymous reader quotes a report from SiliconANGLE: Hugging Face has open-sourced the blueprints of two internally developed robots called HopeJR and Reachy Mini. The company debuted the machines on Thursday. Hugging Face is backed by more than $390 million in funding from Nvidia Corp., IBM Corp. and other investors. It operates a GitHub-like platform for sharing open-source artificial intelligence projects. It says its platform hosts more than 1 million AI models, hundreds of thousands of datasets and various other technical assets. The company started prioritizing robotics last year after launching LeRobot, a section of its platform dedicated to autonomous machines. The portal provides access to AI models for powering robots and datasets that can be used to train those models. Hugging Face released its first hardware blueprint, a robotic arm design called the SO-100, late last year. The SO-100 was developed in partnership with a startup called The Robot Studio. Hugging Face also collaborated with the company on the HopeJR, the first new robot that debuted this week. According to TechCrunch, it's a humanoid robot that can perform 66 movements, including walking. HopeJR is equipped with a pair of robotic arms that can be remotely controlled by a human wearing specialized, chip-equipped gloves; the arms replicate the movements made by the wearer of the gloves. A demo video shared by Hugging Face showed that the robot can shake hands, point to a specific text snippet on a piece of paper and perform other tasks. Hugging Face's other new robot, the Reachy Mini, likewise features an open-source design. It's based on technology that the company obtained through the acquisition of a venture-backed startup called Pollen Robotics earlier this year. Reachy Mini is a turtle-like robot that comes in a rectangular case. Its main mechanical feature is a retractable neck that allows it to follow the user with its head or withdraw into the case. The case, which is stationary, is compact and lightweight enough to be placed on a desk. Hugging Face will offer pre-assembled versions of its open-source Reachy Mini and HopeJR robots for $250 and $3,000, respectively, with the first units starting to ship by the end of the year.

Read more of this story at Slashdot.


Robot Industry Split Over That Humanoid Look

An anonymous reader quotes a report from Axios: Advanced robots don't necessarily need to look like C-3PO from "Star Wars" or George Jetson's maid Rosie, despite all the hype over humanoids from Wall Street and Big Tech. In fact, some of the biggest skeptics about human-shaped robots come from within the robotics industry itself. [...] The most productive -- and profitable -- bots are the ones that can do single tasks cheaply and efficiently. "If you look at where robots are really bringing value in a manufacturing environment, it is combining industrial or collaborative robots with mobility," ABB managing director Ali Raja tells Axios. "I don't see that there are any real practical applications where humanoids are bringing in a lot of value." "The reason we have two legs is because whether Darwin or God or whoever made us, we have to figure out how to traverse an infinite number of things," like climbing a mountain or riding a bike, explains Michael Cicco, CEO of Fanuc America Corp. "When you get into the factory, even if it's a million things, it's still a finite number of things that you need to do." Human-shaped robots are over-engineered solutions to most factory chores that could be better solved by putting a robot arm on a wheeled base, he said. "The thing about humanoids is not that it's a human factor. It's that it's more dynamically stable," counters Melonee Wise, chief product officer at Agility Robotics, which is developing a humanoid robot called Digit. When humans grab something heavy, they can shift their weight for better balance. The same is true for a humanoid, she said. Using a robotic arm on a mobile base to pick up something heavy, "it's like I'm a little teapot and you become very unstable," she said, bending at the waist.

Read more of this story at Slashdot.


Student's Robot Obliterates 4x4 Rubik's Cube World Record

An anonymous reader quotes a report from the BBC: A student's robot has beaten the world record for solving a four-by-four Rubik's cube -- by 33 seconds. Matthew Pidden, a 22-year-old University of Bristol student, built and trained the "Revenger" over 15 weeks for his computer science bachelor's degree. The robot solved the cube in 45.305 seconds, obliterating the world record of 1 minute 18 seconds. However, the human record for solving the cube is 15.71 seconds. Mr Pidden's robot uses dual webcams to scan the cube, a custom mechanism to manipulate the faces, and a fully self-built solving algorithm to generate efficient solutions. The student now plans to study for a master's degree in robotics at Imperial College London.
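
The report doesn't detail Pidden's self-built algorithm, but a common strategy for 4x4 solvers is "reduction": pair up the centers and edges until the cube behaves like a 3x3, then hand off to a standard two-phase 3x3 solver. The sketch below illustrates only that hand-off; `scan_cube` and `reduce_to_3x3` are hypothetical helpers standing in for the robot-specific parts, while `kociemba.solve()` is a real Python library call:

```python
# Illustrative 4x4 pipeline only: `scan_cube` and `reduce_to_3x3` are
# hypothetical helpers standing in for the robot's webcam scanning and
# reduction moves; kociemba.solve() is a real call that takes a
# 54-character facelet string describing a 3x3 cube.
import kociemba

def solve_4x4(scan_cube, reduce_to_3x3):
    state_4x4 = scan_cube()  # e.g. built from the dual webcam images
    facelets_3x3, prefix_moves = reduce_to_3x3(state_4x4)
    suffix_moves = kociemba.solve(facelets_3x3)  # two-phase 3x3 solution
    return prefix_moves + " " + suffix_moves
```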

Read more of this story at Slashdot.
