AIs are starting to learn like human babies by grasping and poking objects

Researchers at Carnegie Mellon University are teaching robots to learn by touch, much like human babies. The experiment could one day allow artificial intelligence to learn about the physical environment through senses, including touch. The development draws robotics and AI closer together, paving the way for unified applications in factories, automated goods delivery, or household assistance. The research is set out in a paper by CMU students Lerrel Pinto, Dhiraj Gandhi, and Yuanfeng Han, and professors Yong-Lae Park and Abhinav Gupta. The paper, titled “The Curious Robot: Learning Visual Representations via Physical Interactions” (PDF), describes the experiment’s goal: to use physical robotic interactions to teach an AI to recognize objects.

Poking and pushing

The research is a departure from established AI practices, which often use an algorithm to…
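The core idea, learning visual features from the outcomes of the robot's own interactions rather than from human-labeled images, can be sketched in miniature. The sketch below is illustrative only and is not the paper's actual architecture or data: it trains a small shared encoder plus a "grasp success" head on synthetic interaction labels, standing in for the kind of self-generated supervision the article describes.

```python
# Hypothetical sketch: learn a shared visual representation from labels
# the robot generates itself (here, simulated grasp outcomes), instead
# of human annotations. All names, sizes, and data are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy "images": 100 samples, each a 16-dimensional pixel vector.
X = rng.normal(size=(100, 16))

# Interaction-derived label: did a grasp at this view succeed?
# Synthesized from a hidden linear rule so the task is learnable.
w_true = rng.normal(size=16)
grasp = (X @ w_true > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Shared encoder + grasp head, trained jointly by gradient descent.
W_enc = rng.normal(scale=0.1, size=(16, 8))   # shared representation
w_grasp = rng.normal(scale=0.1, size=8)       # grasp-success head
lr = 0.5
for _ in range(300):
    H = X @ W_enc                    # shared features for this batch
    p = sigmoid(H @ w_grasp)         # predicted grasp-success probability
    err = p - grasp                  # logistic-loss gradient signal
    w_grasp -= lr * H.T @ err / len(X)
    W_enc -= lr * X.T @ np.outer(err, w_grasp) / len(X)

# Training accuracy on the self-generated labels.
acc = ((sigmoid((X @ W_enc) @ w_grasp) > 0.5) == (grasp > 0.5)).mean()
print(f"grasp-prediction accuracy: {acc:.2f}")
```

In the paper's full setup the shared encoder is a convolutional network and several interaction tasks (grasping, pushing, poking) supervise it at once; the point of the toy version is only that the training signal comes from physical outcomes, not hand-labeled data.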
