MIT makes an AI that can predict what the future looks and sounds like

We take anticipation for granted, but for robots, predicting what happens next is very difficult. If you walk down the street and see two people meeting in front of a café, you know they'll shake hands, or even hug, a few seconds later, depending on how close their relationship is. We humans sense this sort of thing effortlessly, and now some robots can too.

Left: the still frame given to the algorithm, which had to predict what happened next. Right: the subsequent frames from the video. Credit: MIT CSAIL

MIT used deep-learning algorithms, neural networks that teach computers to find patterns by themselves in an ocean of data, to build an artificial intelligence that can predict what action will occur next from nothing but a still frame. Each of these…
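At its core, the idea is a classifier: a network takes a single still frame as input and outputs a probability for each plausible next action (such as the handshake or hug mentioned above). The sketch below is a toy illustration of that input-to-output shape only, with a random linear layer standing in for MIT's trained deep network; the action labels and all function names here are illustrative assumptions, not the actual model.

```python
import numpy as np

# Illustrative action classes; the real MIT model was trained on video,
# not on a random matrix like this stand-in.
ACTIONS = ["hug", "handshake", "high-five", "kiss"]

def softmax(logits):
    """Numerically stable softmax: turns raw scores into probabilities."""
    shifted = logits - logits.max()
    exp = np.exp(shifted)
    return exp / exp.sum()

def predict_next_action(frame, weights):
    """Flatten the frame, apply a linear layer, return class probabilities."""
    logits = weights @ frame.ravel()
    return softmax(logits)

rng = np.random.default_rng(0)
frame = rng.random((64, 64, 3))                        # stand-in for a still video frame
weights = rng.normal(size=(len(ACTIONS), frame.size)) * 0.01  # stand-in for learned weights

probs = predict_next_action(frame, weights)
print(dict(zip(ACTIONS, probs.round(3))))
```

With trained convolutional weights in place of the random matrix, the highest-probability class would be the system's guess at what happens a few seconds after the frame.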


Link to Full Article: MIT makes an AI that can predict what the future looks and sounds like