Google AI project apes memory, programs (sort of) like a human

Neural Turing Machines attempt to emulate the brain's short-term memory

An artificial intelligence concept illustration.

The mission of Google's DeepMind Technologies startup is to "solve intelligence." Now, researchers there have developed an artificial intelligence system that can mimic some of the brain's memory skills and even program like a human.

The researchers developed a kind of neural network that can use external memory, allowing it to learn and perform tasks based on stored data.

Neural networks are made up of interconnected computational "neurons." While conventional neural networks have lacked readable and writeable memory, they have been widely used in machine learning and pattern-recognition applications such as computer vision and speech recognition.
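
For a rough sense of what that means in code, a conventional feed-forward network is just layers of weighted sums passed through nonlinearities: everything it "knows" is baked into fixed weights, and there is no separate memory it can read or write at run time. The sketch below is a generic illustration, not any DeepMind model; the layer sizes and random weights are arbitrary.

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    """A conventional two-layer network: weighted sums passed through nonlinearities.

    Everything the network 'knows' lives in its fixed weights; there is no
    external memory it can read from or write to while it runs.
    """
    hidden = np.tanh(W1 @ x + b1)         # hidden layer of "neurons"
    return np.tanh(W2 @ hidden + b2)      # output layer

rng = np.random.default_rng(0)
x = rng.standard_normal(4)                            # a 4-feature input
W1, b1 = rng.standard_normal((8, 4)), np.zeros(8)     # 8 hidden units
W2, b2 = rng.standard_normal((2, 8)), np.zeros(2)     # 2 outputs
print(forward(x, W1, b1, W2, b2))
```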

The so-called Neural Turing Machine (NTM) that DeepMind researchers have been working on combines a neural network controller with a memory bank, giving it the ability to learn to store and retrieve information.
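
The paper describes that memory bank as a matrix the controller addresses by content: it emits a key vector, weights each memory slot by how similar the slot is to the key, and then reads or writes through those weights. The sketch below is a loose NumPy illustration of that idea, not DeepMind's implementation; the function names, the simplified erase-and-add write, and the toy usage at the end are assumptions made for clarity.

```python
import numpy as np

# Loose sketch of NTM-style content addressing (simplified relative to the paper):
# the controller emits a key, memory rows are weighted by similarity to it,
# and reads and writes blend across rows according to those weights.

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def content_address(memory, key, beta):
    """Weight each memory slot by cosine similarity to the key; beta sharpens focus."""
    sims = memory @ key / (np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8)
    return softmax(beta * sims)

def read(memory, weights):
    """Return a blend of memory rows, weighted by the addressing vector."""
    return weights @ memory

def write(memory, weights, erase, add):
    """Erase then add, scaled per slot by the addressing weights."""
    memory = memory * (1 - np.outer(weights, erase))
    return memory + np.outer(weights, add)

# Toy usage: store a pattern in slot 2, then retrieve it by content.
M = np.zeros((8, 4))                      # 8 memory slots, each 4 numbers wide
pattern = np.array([1., 2., 3., 4.])
w_write = np.eye(8)[2]                    # hard location focus, just for the demo
M = write(M, w_write, erase=np.ones(4), add=pattern)
w_read = content_address(M, pattern, beta=10.0)
print(read(M, w_read))                    # ~[1. 2. 3. 4.]
```

Because the addressing weights are smooth values rather than hard indices, every step stays differentiable, which is what lets the system learn where to store and fetch information from examples.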

The system's name refers to computer pioneer Alan Turing's formulation of computers as machines having working memory for storage and retrieval of data.

The researchers put the NTM through a series of tests including tasks such as copying and sorting blocks of data. Compared to a conventional neural net, the NTM was able to learn faster and copy longer data sequences with fewer errors. They found that its approach to the problem was comparable to that of a human programmer working in a low-level programming language.
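
The copy task is simple to state: the network sees a sequence of random bit vectors followed by a delimiter, and must then reproduce the sequence from memory alone. As an illustration of what such training examples look like (the sequence length, vector width and extra delimiter channel below are arbitrary choices, not the paper's exact setup), one might generate input/target pairs like this:

```python
import numpy as np

def copy_task_example(seq_len=5, width=8, seed=0):
    """Build one input/target pair for a copy task (illustrative sizes only).

    Input: `seq_len` random bit vectors, then a delimiter flag on an extra channel.
    Target: blank during presentation, then the same bit vectors, which the model
    must reproduce from whatever it stored in memory.
    """
    rng = np.random.default_rng(seed)
    bits = rng.integers(0, 2, size=(seq_len, width)).astype(float)

    inputs = np.zeros((2 * seq_len + 1, width + 1))
    inputs[:seq_len, :width] = bits
    inputs[seq_len, width] = 1.0          # delimiter: "now reproduce the sequence"

    targets = np.zeros((2 * seq_len + 1, width))
    targets[seq_len + 1:, :] = bits
    return inputs, targets

x, y = copy_task_example(seq_len=3, width=4)
print(x.shape, y.shape)                   # (7, 5) (7, 4)
```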

The NTM "can infer simple algorithms such as copying, sorting and associative recall from input and output examples," DeepMind's Alex Graves, Greg Wayne and Ivo Danihelka wrote in a research paper available on the arXiv repository.

"Our experiments demonstrate that it is capable of learning simple algorithms from example data and of using these algorithms to generalize well outside its training regime."

A spokesman for Google declined to provide more information about the project, saying only that the research is "quite a few layers down from practical applications."

In a 2013 paper, Graves and colleagues showed how they had used a technique known as deep reinforcement learning to get DeepMind software to learn to play seven classic Atari 2600 video games, some better than a human expert, with the only input being information visible on the game screen.
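
Deep reinforcement learning trains a network to estimate the long-term value of each possible action directly from raw observations and rewards. Stripped of the "deep" part, the underlying Q-learning update can be sketched on a toy problem; the tiny chain environment and constants below are illustrative stand-ins, not DeepMind's Atari setup.

```python
import numpy as np

# Tabular Q-learning on a tiny chain environment: the idea behind deep
# reinforcement learning, where a neural network replaces the table and
# screen pixels replace the state index. Sizes and constants are arbitrary.
n_states, n_actions = 5, 2                # actions: 0 = step left, 1 = step right
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.2         # learning rate, discount, exploration rate
rng = np.random.default_rng(0)

for episode in range(500):
    s = 0
    while s != n_states - 1:              # reaching the rightmost state ends the episode
        a = rng.integers(n_actions) if rng.random() < eps else int(Q[s].argmax())
        s_next = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        r = 1.0 if s_next == n_states - 1 else 0.0
        # Q-learning update: nudge the value toward reward plus discounted future value.
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        s = s_next

print(Q[:-1].argmax(axis=1))              # learned policy: step right in every non-terminal state
```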

Google confirmed earlier this year that it had acquired London-based DeepMind Technologies, founded in 2011 as an artificial intelligence company. The move is expected to play a major role in advancing the search giant's research into robotics, self-driving cars and smart-home technologies.

More recently, DeepMind co-founder Demis Hassabis wrote in a blog post that Google is partnering with artificial intelligence researchers from Oxford University to study topics including image recognition and natural language understanding.
