Data Scientists Frustrated by Data Variety, Find Hadoop Limiting

A survey of data scientists finds that a majority of them believe their work has grown more difficult.

Companies are focusing more and more attention on building out big data analytics capabilities, and data scientists are feeling the pressure.

In a study of more than 100 data scientists released this week, Paradigm4, creator of open source computational database management system SciDB, found that 71 percent of data scientists believe their jobs have grown more difficult as a result of a multiplying variety of data sources, not just data volume.

Notably, only 48 percent of respondents said they had used Hadoop or Spark for their work, and 76 percent said Hadoop is too slow, requires too much programming effort, or has other limitations.

"The increasing variety of data sources is forcing data scientists into shortcuts that leave data and money on the table," says Marilyn Matz, CEO of Paradigm4. "The focus on the volume of data hides the real challenge of analytics today. Only by addressing the challenge of utilizing diverse types of data will we be able to unlock the enormous potential of analytics."

Even with the challenges surrounding the Hadoop platform, something has to give. About half of the survey respondents (49 percent) said they're finding it difficult to fit their data into relational database tables. Fifty-nine percent of respondents said their organizations are already using complex analytics -- math functions like covariance, clustering, machine learning, principal components analysis and graph operations, as opposed to 'basic analytics' like business intelligence reporting -- to analyze their data.

Another 15 percent plan to begin using complex analytics in the next year, and 16 percent anticipate using complex analytics within the next two years. Only 4 percent of respondents said their organizations have no plans to use complex analytics.

Paradigm4 believes this means that the "low hanging fruit" of big data has been exploited and data scientists will have to step up their game to extract additional value.

"The move from simple to complex analytics on big data presages an emerging need for analytics that scale beyond single server memory limits and handle sparsity, missing values and mixed sampling frequencies appropriately," Paradigm4 writes in the report. "These complex analytics methods can also provide data scientists with unsupervised and assumption-free approaches, letting all the data speak for itself."

Sometimes Hadoop Isn't Enough

Paradigm4 also believes Hadoop has been unrealistically hyped as a universal, disruptive big data solution, noting that it is not a viable solution for some use cases that require complex analytics. Basic analytics, Paradigm4 says, are "embarrassingly parallel" (sometimes referred to as "data parallel"), while complex analytics are not.

Embarrassingly parallel problems can be separated into multiple independent sub-problems that run in parallel -- there is little or no dependency between the tasks, so no single task needs access to all the data at once. This is the approach Hadoop MapReduce uses to crunch data. Analytics jobs that are not embarrassingly parallel, like many complex analytics problems, require using and sharing all the data at once and communicating intermediate results among processes.
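To make the distinction concrete, here is a minimal sketch of an embarrassingly parallel job in the MapReduce style -- a word count using Python's standard `multiprocessing` module rather than Hadoop itself. The chunking and function names are illustrative, not part of any Hadoop API; the point is that each "map" task runs on its chunk with no knowledge of the others, and only the final "reduce" step combines results.

```python
from multiprocessing import Pool

def word_count(chunk):
    # "Map" step: each chunk is processed independently --
    # no task needs to see any other chunk's data.
    counts = {}
    for word in chunk.split():
        counts[word] = counts.get(word, 0) + 1
    return counts

def merge(partials):
    # "Reduce" step: combine the independent partial results.
    total = {}
    for counts in partials:
        for word, n in counts.items():
            total[word] = total.get(word, 0) + n
    return total

if __name__ == "__main__":
    chunks = ["big data big analytics", "data science data"]
    with Pool(2) as pool:
        partials = pool.map(word_count, chunks)
    print(merge(partials))
```

By contrast, an algorithm such as principal components analysis cannot be split this cleanly: intermediate results (e.g., a covariance matrix over all the data) must be shared among processes, which is the kind of workload the survey respondents found a poor fit for plain MapReduce.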

Twenty-two percent of the data scientists surveyed said Hadoop and Spark were not well-suited to their analytics. Paradigm4 also found that 35 percent of data scientists who tried Hadoop or Spark have stopped using it.

Paradigm4's survey of 111 U.S. data scientists was fielded by independent research firm Innovation Enterprise from March 27 to April 23, 2014. Paradigm4 also compiled an infographic of the survey results.
