Data Scientists Frustrated by Data Variety, Find Hadoop Limiting

A survey of data scientists finds that a majority of them believe their work has grown more difficult.

Companies are focusing more and more attention on building out big data analytics capabilities, and data scientists are feeling the pressure.

In a study of more than 100 data scientists released this week, Paradigm4, creator of the open source computational database management system SciDB, found that 71 percent of data scientists believe their jobs have grown more difficult as a result of the growing variety of data sources, not just growing data volume.

Notably, only 48 percent of respondents said they had used Hadoop or Spark for their work, and 76 percent felt that Hadoop is too slow, takes too much effort to program, or has other limitations.

"The increasing variety of data sources is forcing data scientists into shortcuts that leave data and money on the table," says Marilyn Matz, CEO of Paradigm4. "The focus on the volume of data hides the real challenge of analytics today. Only by addressing the challenge of utilizing diverse types of data will we be able to unlock the enormous potential of analytics."

Even with the challenges surrounding the Hadoop platform, something has to give. About half of the survey respondents (49 percent) said they're finding it difficult to fit their data into relational database tables. Fifty-nine percent of respondents said their organizations are already using complex analytics -- math functions like covariance, clustering, machine learning, principal components analysis and graph operations, as opposed to 'basic analytics' like business intelligence reporting -- to analyze their data.

Another 15 percent plan to begin using complex analytics in the next year, and 16 percent anticipate doing so within the next two years. Only 4 percent of respondents said their organizations have no plans to use complex analytics.
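
To make the distinction concrete, here is a minimal sketch in Python with NumPy (the survey does not name any particular tool), contrasting a basic, report-style aggregation with a complex analytic such as covariance and principal components analysis. The dataset and column meanings are hypothetical.

    import numpy as np

    # Hypothetical dataset: rows are customers, columns are monthly spend
    # in four product categories.
    rng = np.random.default_rng(0)
    spend = rng.gamma(shape=2.0, scale=50.0, size=(1000, 4))

    # "Basic analytics" (BI-style reporting): a simple per-column aggregate.
    avg_spend_per_category = spend.mean(axis=0)
    print("Average spend per category:", avg_spend_per_category)

    # "Complex analytics": covariance and principal components analysis,
    # which consider all columns jointly rather than one at a time.
    centered = spend - spend.mean(axis=0)
    cov = np.cov(centered, rowvar=False)             # covariance matrix
    eigenvalues, eigenvectors = np.linalg.eigh(cov)  # PCA via eigendecomposition
    top_component = eigenvectors[:, -1]              # direction of largest variance
    print("Leading principal component:", top_component)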

Paradigm4 believes this means that the "low-hanging fruit" of big data has already been picked, and data scientists will have to step up their game to extract additional value.

"The move from simple to complex analytics on big data presages an emerging need for analytics that scale beyond single server memory limits and handle sparsity, missing values and mixed sampling frequencies appropriately," Paradigm4 writes in the report. "These complex analytics methods can also provide data scientists with unsupervised and assumption-free approaches, letting all the data speak for itself."

Sometimes Hadoop Isn't Enough

Paradigm4 also believes Hadoop has been unrealistically hyped as a universal, disruptive big data solution, noting that it is not a viable solution for some use cases that require complex analytics. Basic analytics, Paradigm4 says, are "embarrassingly parallel" (sometimes referred to as "data parallel"), while complex analytics are not.

Embarrassingly parallel problems can be separated into multiple independent sub-problems that run in parallel; there is little or no dependency between the tasks, so no single task needs access to all the data at once. This is the approach Hadoop MapReduce uses to crunch data. Analytics jobs that are not embarrassingly parallel, like many complex analytics problems, require using and sharing all the data at once and communicating intermediate results among processes.
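
As a rough illustration (a Python sketch for this article, not anything taken from Paradigm4's report or from Hadoop itself): an embarrassingly parallel job can be split into chunks that are processed independently and then combined, whereas an iterative computation such as power iteration for a leading eigenvector must repeatedly share an intermediate result across all of the data.

    import numpy as np
    from multiprocessing import Pool

    # Hypothetical data: one million values split into independent chunks.
    data = np.arange(1_000_000, dtype=np.float64)
    chunks = np.array_split(data, 8)

    def chunk_sum(chunk):
        # Each task touches only its own chunk -- no coordination needed.
        return chunk.sum()

    if __name__ == "__main__":
        # Embarrassingly parallel: map over chunks, then reduce the partial sums.
        with Pool(processes=4) as pool:
            total = sum(pool.map(chunk_sum, chunks))
        print("Total:", total)

        # Not embarrassingly parallel: power iteration for the dominant
        # eigenvector of a covariance matrix. Every iteration needs the full
        # matrix plus the intermediate vector from the previous iteration,
        # so parallel workers would have to exchange results at each step.
        rng = np.random.default_rng(0)
        samples = rng.normal(size=(500, 20))
        cov = np.cov(samples, rowvar=False)
        v = rng.normal(size=cov.shape[0])
        for _ in range(100):
            v = cov @ v              # uses all the data
            v /= np.linalg.norm(v)   # shared intermediate result
        print("Dominant eigenvector estimate:", v[:5])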

Twenty-two percent of the data scientists surveyed said Hadoop and Spark were not well-suited to their analytics. Paradigm4 also found that 35 percent of data scientists who tried Hadoop or Spark have stopped using it.

Paradigm4's survey of 111 U.S. data scientists was fielded by the independent research firm Innovation Enterprise from March 27 to April 23, 2014. Paradigm4 also compiled an infographic of its survey results.
