Five things you need to know about AI: Cognitive and neural and deep, oh my!

When it comes to tech lingo, not all terms are created equal

There's never any shortage of buzzwords in the IT world, but when it comes to AI, they can be hard to tell apart. There's artificial intelligence, but then there's also machine intelligence. There's machine learning, but there's also deep learning. What's the difference? Here are five things you need to understand.

1. AI is basically an umbrella term for it all

Artificial intelligence refers to "a broad set of methods, algorithms and technologies that make software 'smart' in a way that may seem human-like to an outside observer," said Lynne Parker, director of the division of Information and Intelligent Systems for the National Science Foundation.

Machine learning, computer vision, natural language processing, robotics and related topics are all part of AI, in other words.

2. Machine intelligence = AI

"Some people may come up with distinctions between the two, but there is not a universal view that the two terms mean anything different," Parker said.

There may actually be a regional preference at work behind the origins of the separate terms. "Machine intelligence" has a more "down-to-earth engineering sensibility" that's been popular in Europe, whereas "artificial intelligence" has a slightly more "science-fiction feel" that's made it more popular in the U.S., said Thomas Dietterich, a professor at Oregon State University and president of the Association for the Advancement of Artificial Intelligence. In Canada, it's often been called "computational intelligence," he added.

3. Machine learning is also a blanket term covering multiple technologies

As a part of AI, machine learning refers to a wide variety of algorithms and methodologies that enable software to improve its performance over time as it obtains more data. That includes both neural networks and deep learning (see below).

"Fundamentally, all of machine learning is about recognizing trends from data or recognizing the categories that the data fit in so that when the software is presented with new data, it can make proper predictions," Parker explained.

As an example, think about the task of recognizing someone's face. "I have no idea how I recognize my wife’s face," Dietterich said. "This makes it very difficult to guess how to program a computer to do it."

By learning from examples, machine learning provides a way to do that. "This is 'programming by input-output examples' rather than by coding," Dietterich said.
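A minimal sketch of what "programming by input-output examples" looks like in practice, using a plain nearest-neighbor classifier (one of the techniques listed below). The feature vectors and labels here are invented for illustration; a real system would learn from far more examples.

```python
def nearest_neighbor(examples, query):
    """Predict a label by finding the closest known example (1-NN)."""
    def distance(a, b):
        # Squared Euclidean distance between two feature vectors
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(examples, key=lambda ex: distance(ex[0], query))
    return best[1]

# Input-output examples: (feature vector, label) -- no rules are coded by hand
examples = [
    ((1.0, 1.0), "cat"),
    ((1.2, 0.9), "cat"),
    ((5.0, 5.2), "dog"),
    ((4.8, 5.5), "dog"),
]

print(nearest_neighbor(examples, (1.1, 1.0)))  # -> cat
print(nearest_neighbor(examples, (5.1, 5.0)))  # -> dog
```

The program never contains a rule for what makes a "cat"; its behavior comes entirely from the examples it was given, which is the core idea Dietterich describes.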

Commonly used machine-learning techniques include neural networks, support vector machines, decision trees, Bayesian belief networks, k-nearest neighbors, self-organizing maps, case-based reasoning, instance-based learning, hidden Markov models and "lots of regression techniques," Parker said.

4. Neural networks are a type of machine learning, and deep learning refers to one particular kind

Neural networks -- also known as "artificial" neural networks -- are one type of machine learning that's loosely based on how neurons work in the brain, though "the actual similarity is very minor," Parker said.

There are many kinds of neural networks, but in general they consist of systems of nodes with weighted interconnections among them. Nodes, also known as "neurons," are arranged in multiple layers, including an input layer where the data is fed into the system; an output layer where the answer is given; and one or more hidden layers, where the learning takes place. Typically, neural networks learn by updating the weights of their interconnections, Parker said.

Deep learning refers to what's sometimes called a "deep neural network," or one that includes a large system of neurons arranged in several hidden layers. A "shallow" neural network, by contrast, will typically have just one or two hidden layers.
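The layer structure described above can be sketched in a few lines of code. This is a toy forward pass through a "shallow" network with one hidden layer; the weights are made-up numbers for illustration (learning would adjust them), and a "deep" network would simply stack more hidden layers in the list.

```python
import math

def sigmoid(x):
    # Squashes any number into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights):
    """One layer: each neuron weights its inputs, sums them, then squashes."""
    return [sigmoid(sum(w * x for w, x in zip(neuron, inputs)))
            for neuron in weights]

def forward(inputs, layers):
    # Feed the data through each layer in turn
    for weights in layers:
        inputs = layer(inputs, weights)
    return inputs

# Input layer (2 values) -> hidden layer (3 neurons) -> output layer (1 neuron)
hidden = [[0.5, -0.6], [0.1, 0.8], [-0.3, 0.2]]
output = [[1.0, -1.0, 0.5]]
print(forward([1.0, 0.0], [hidden, output]))
```

Making this network "deep" just means inserting several more hidden layers between input and output; the forward pass itself is unchanged.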

"The idea behind deep learning is not new, but it has been popularized more recently because we now have lots of data and fast processors that can achieve successful results on hard problems," Parker said.

5. Cognitive computing: It's complicated

Cognitive computing is another subfield under the AI umbrella, but it's not as easily defined. In fact, it's a bit controversial.

Essentially, cognitive computing "refers to computing that is focused on reasoning and understanding at a higher level, often in a manner that is analogous to human cognition -- or at least inspired by human cognition," Parker said. Typically, it deals with symbolic and conceptual information rather than just pure data or sensor streams, with the aim of making high-level decisions in complex situations.

Cognitive systems often make use of a variety of machine-learning techniques, but cognitive computing is not a machine-learning method per se. Instead, "it is often a complete architecture of multiple AI subsystems that work together," Parker said.

"This is a subset of AI that deals with cognitive behaviors we associate with 'thinking' as opposed to perception and motor control," Dietterich said.

Whether cognitive computing is a true category of AI or simply a popular buzzword isn't entirely clear, however.

"'Cognitive' is marketing malarkey," said Tom Austin, a vice president and fellow at Gartner, in an email. "It implies machines think. Nonsense. Bad assumptions lead to bad conclusions."
