Teaching Artificial Intelligence To Search, Target Ads Based On Memory

Intelligent automation will become the next tool for online advertising. A team of researchers at Google recently demonstrated an artificially intelligent system that could reliably identify a mountain-unicycling video, illustrating the potential of recurrent neural networks (RNNs), which use an internal memory to process sequences of data, a capability with clear implications for ad targeting.

The system could identify the mountain unicycling because it remembered events and objects from earlier in the video. As it examined each frame, the technology drew on the frames it had already viewed. RNNs can recognize complex moving images, automatically generate detailed captions for online photos and videos, and improve online translation services.
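The frame-by-frame memory described above can be sketched as a toy recurrent step. This is a minimal illustration assuming NumPy and made-up dimensions; the names (`W_xh`, `W_hh`, `rnn_step`) are illustrative, not Google's implementation, and a real video classifier would use a far larger, trained network.

```python
import numpy as np

rng = np.random.default_rng(0)

input_size, hidden_size = 4, 3
# Illustrative, untrained weights: one matrix maps the current input into the
# hidden state, the other carries the previous hidden state forward (the "memory").
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

def rnn_step(x, h_prev):
    """Combine the current frame's features with the remembered state."""
    return np.tanh(W_xh @ x + W_hh @ h_prev + b_h)

# Process a short "video": each row stands in for one frame's feature vector.
frames = rng.normal(size=(5, input_size))
h = np.zeros(hidden_size)  # empty memory before the first frame
for x in frames:
    h = rnn_step(x, h)  # the state accumulates information across frames

print(h.shape)  # the final state summarizes the whole sequence
```

Because the same state `h` is threaded through every step, what the network "saw" in early frames can influence how it interprets later ones, which is what lets an RNN recognize an activity that unfolds over time rather than in a single image.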

Google deployed an RNN in 2013 for voice activity detection, and again in 2014 to automatically search for, identify, and describe objects and content in pictures. The company also said recently that it has improved text-to-speech using artificial intelligence. The improvements should transform nonsensical transcripts of Google voicemails into clear, readable transcriptions, a service currently available only in the United States through Project Fi.

I have no doubt that the advertising industry will soon see search marketers working with recurrent neural network models built into retargeting platforms. Such models would have a short-term memory through artificial intelligence, removing the need for pixels or any other type of Web site coding required today for tracking.

“We are working, and have implemented, all kinds of ways to integrate without using Web site coding such as pixels and javascript,” said James Green, CEO of Magnetic, which supports search retargeting. “These include integrating with CRM systems, so we can use e-mail address for targeting, and direct integration with e-commerce platforms.”

While automation is key, marketers who want to buy media programmatically must work with technology that matches data “harvested with cookies and device IDs,” which Magnetic does. “Then the algos take over,” Green said. “We haven’t found that neural networks are better than more traditional techniques, but you never know.”

RNN technology, which has been around for a while, continues to see a revival among established players and startups. Nnaisense was recently founded by Jürgen Schmidhuber, who assisted in the development of “modern” RNNs, and four researchers who work alongside him at the Swiss AI lab, Istituto Dalle Molle di Studi sull’Intelligenza Artificiale (IDSIA), according to one report.

Nnaisense’s mission, per the company’s Web site, is to “build large-scale neural network solutions for superhuman perception and intelligent automation, with the ultimate goal of marketing general-purpose neural network-based Artificial Intelligences.”

Schmidhuber helped create a certain type of RNN called LSTM, or Long Short-Term Memory, which has influenced recent AI research at Google, Microsoft, and IBM, among others. The company's backers include quite a few people with many years of experience in “deep learning” who are working in the same direction.

When you see the words “deep learning,” think of it as a form of artificial intelligence that includes RNNs. We will be hearing much more about the technology in the coming year.
