Five things you need to know about AI: Cognitive and neural and deep, oh my!

When it comes to tech lingo, not all terms are created equal

There's never any shortage of buzzwords in the IT world, but when it comes to AI, they can be hard to tell apart. There's artificial intelligence, but then there's also machine intelligence. There's machine learning, but there's also deep learning. What's the difference? Here are five things you need to understand.

1. AI is basically an umbrella term for it all

Artificial intelligence refers to "a broad set of methods, algorithms and technologies that make software 'smart' in a way that may seem human-like to an outside observer," said Lynne Parker, director of the division of Information and Intelligent Systems for the National Science Foundation.

Machine learning, computer vision, natural language processing, robotics and related topics are all part of AI, in other words.

2. Machine intelligence = AI

"Some people may come up with distinctions between the two, but there is not a universal view that the two terms mean anything different," Parker said.

There may actually be a regional preference at work behind the origins of the separate terms. "Machine intelligence" has a more "down-to-earth engineering sensibility" that's been popular in Europe, whereas "artificial intelligence" has a slightly more "science-fiction feel" that's made it more popular in the U.S., said Thomas Dietterich, a professor at Oregon State University and president of the Association for the Advancement of Artificial Intelligence. In Canada, it's often been called "computational intelligence," he added.

3. Machine learning is also a blanket term covering multiple technologies

As a part of AI, machine learning refers to a wide variety of algorithms and methodologies that enable software to improve its performance over time as it obtains more data. That includes both neural networks and deep learning (see below).

"Fundamentally, all of machine learning is about recognizing trends from data or recognizing the categories that the data fit in so that when the software is presented with new data, it can make proper predictions," Parker explained.

As an example, think about the task of recognizing someone's face. "I have no idea how I recognize my wife’s face," Dietterich said. "This makes it very difficult to guess how to program a computer to do it."

By learning from examples, machine learning provides a way to do that. "This is 'programming by input-output examples' rather than by coding," Dietterich said.

Commonly used machine-learning techniques include neural networks, support vector machines, decision trees, Bayesian belief networks, k-nearest neighbors, self-organizing maps, case-based reasoning, instance-based learning, hidden Markov models and "lots of regression techniques," Parker said.
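One of the techniques Parker lists, k-nearest neighbors, is simple enough to illustrate Dietterich's "programming by input-output examples" in a few lines of Python. This is a minimal sketch with invented toy data, not a production implementation: the algorithm never gets explicit rules, only labeled examples, and it classifies a new point by majority vote among its closest neighbors.

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest labeled examples."""
    # Sort the (point, label) pairs by Euclidean distance to the query point
    neighbors = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy "input-output examples": points on a plane, labeled by cluster
train = [((1, 1), "red"), ((1, 2), "red"), ((2, 1), "red"),
         ((8, 8), "blue"), ((8, 9), "blue"), ((9, 8), "blue")]

print(knn_predict(train, (2, 2)))  # near the red cluster -> "red"
print(knn_predict(train, (9, 9)))  # near the blue cluster -> "blue"
```

No rule about what makes a point "red" was ever written down; the prediction comes entirely from the examples, which is the essence of what Parker describes as recognizing the categories the data fit in.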

4. Neural networks are a type of machine learning, and deep learning refers to one particular kind

Neural networks -- also known as "artificial" neural networks -- are one type of machine learning that's loosely based on how neurons work in the brain, though "the actual similarity is very minor," Parker said.

There are many kinds of neural networks, but in general they consist of systems of nodes with weighted interconnections among them. Nodes, also known as "neurons," are arranged in multiple layers, including an input layer where the data is fed into the system; an output layer where the answer is given; and one or more hidden layers, which is where the learning takes place. Typically, neural networks learn by updating the weights of their interconnections, Parker said.

Deep learning refers to what's sometimes called a "deep neural network," or one that includes a large system of neurons arranged in several hidden layers. A "shallow" neural network, by contrast, will typically have just one or two hidden layers.
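The layered structure Parker describes can be sketched in plain Python. This is a toy illustration with made-up weights: each neuron computes a weighted sum of the previous layer's outputs and squashes it through an activation function, and a "deep" network differs from a "shallow" one only in how many hidden layers are stacked. The training step that actually updates the weights is omitted here.

```python
import math

def sigmoid(x):
    """A common squashing activation: maps any real number into (0, 1)."""
    return 1 / (1 + math.exp(-x))

def layer(inputs, weights):
    """One layer: each neuron takes a weighted sum of all inputs, then activates."""
    return [sigmoid(sum(w * x for w, x in zip(neuron, inputs))) for neuron in weights]

def forward(inputs, layers):
    """Feed data through every layer in turn; the last layer's output is the answer."""
    for weights in layers:
        inputs = layer(inputs, weights)
    return inputs

# A "shallow" network: a single hidden layer between input and output (weights invented)
shallow = [
    [[0.5, -0.2], [0.3, 0.8]],   # hidden layer: 2 neurons, each with 2 input weights
    [[1.0, -1.0]],               # output layer: 1 neuron
]
print(forward([1.0, 0.5], shallow))

# A "deep" network is the same idea with more hidden layers stacked up
deep = shallow[:1] * 3 + shallow[1:]
print(forward([1.0, 0.5], deep))
```

Real deep-learning systems differ mainly in scale and in the learning procedure (backpropagation adjusting millions of weights), but the forward pass through weighted, layered neurons is the same shape as this sketch.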

"The idea behind deep learning is not new, but it has been popularized more recently because we now have lots of data and fast processors that can achieve successful results on hard problems," Parker said.

5. Cognitive computing: It's complicated

Cognitive computing is another subfield under the AI umbrella, but it's not as easily defined. In fact, it's a bit controversial.

Essentially, cognitive computing "refers to computing that is focused on reasoning and understanding at a higher level, often in a manner that is analogous to human cognition -- or at least inspired by human cognition," Parker said. Typically, it deals with symbolic and conceptual information rather than just pure data or sensor streams, with the aim of making high-level decisions in complex situations.

Cognitive systems often make use of a variety of machine-learning techniques, but cognitive computing is not a machine-learning method per se. Instead, "it is often a complete architecture of multiple AI subsystems that work together," Parker said.

"This is a subset of AI that deals with cognitive behaviors we associate with 'thinking' as opposed to perception and motor control," Dietterich said.

Whether cognitive computing is a true category of AI or simply a popular buzzword isn't entirely clear, however.

"'Cognitive' is marketing malarkey," said Tom Austin, a vice president and fellow at Gartner, in an email. "It implies machines think. Nonsense. Bad assumptions lead to bad conclusions."
