Big data has arrived as a big business initiative. But the hip, experimental, ad hoc veneer of blending data streams to surface bold discoveries belies a massive cultural and technological undertaking not every organisation is ready for.
Without a strategic plan that includes coherent goals, strong data governance, rigorous processes for ensuring data accuracy, and the right mentality and people, big data initiatives can easily end up being a big-time liability rather than a valuable asset.
Following are five strategic tips for avoiding big data failure. In many cases, the advice pertains to any data management project, regardless of the size of the data set. But the advent of massive data stores has brought with it a particular set of pitfalls. Here's how to increase the chances that your organisation's urge to mix large data pools from disparate sources is a success.
Big data success tip No. 1: Make big data a central business tenet
Rearden Commerce CTO Phil Steitz succinctly sums up the single most important driver of big data success: you must integrate analytics and data-driven decision making into the core of your business strategy.
"If 'big data' is just a buzzword internally, it becomes a solution looking for a problem," Steitz says.
For Rearden Commerce, whose e-commerce platform leverages big data and other resources to optimise the exchange of goods, services, and information between buyers and sellers, the key concept is "absolute relevance": putting the right commercial opportunity in front of the right economic agent at the right time.
"It is an example of this kind of thinking originating and centrally driving strategy at the top of the house," Steitz says.
Part of this approach includes developing a small, high-powered team of data scientists, semantic analysts, and big data engineers, then opening a sustained, two-way dialogue between that team and forward-thinking decision makers in the business, Steitz says.
"The biggest challenge in really getting value out of contemporary analytics and semantic analysis technologies is that the technologists who can really bring out what is possible need to be deeply engaged with business leaders who 'get it' and can help winnow out what is really valuable," Steitz says.
Another key success factor in making big data a part of the overall business strategy is effective management of data partnerships.
"Really optimising customer experience and economic value in today's world inevitably requires sharing data across enterprises," Steitz says. "Naive approaches to this - 'just send us your full transaction file nightly' - fail miserably for operational as well as privacy and security reasons."
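One alternative to the nightly full-file dump Steitz criticises is to share only the fields a partner actually needs, with raw identifiers pseudonymised before anything leaves the enterprise. A minimal sketch, assuming an invented field whitelist and salt (none of this reflects Rearden's actual scheme):

```python
import hashlib
import json

SHARED_FIELDS = {"sku", "quantity", "price", "timestamp"}  # hypothetical whitelist
SALT = b"rotate-me-per-partner"  # illustrative; manage real secrets properly

def pseudonymise(customer_id: str) -> str:
    """Replace a raw customer ID with a salted one-way hash."""
    return hashlib.sha256(SALT + customer_id.encode()).hexdigest()[:16]

def partner_extract(transactions):
    """Emit only whitelisted fields plus a pseudonymous customer key."""
    for tx in transactions:
        record = {k: v for k, v in tx.items() if k in SHARED_FIELDS}
        record["customer_key"] = pseudonymise(tx["customer_id"])
        yield record

txs = [{"customer_id": "C-1001", "sku": "A7", "quantity": 2,
        "price": 19.99, "timestamp": "2012-06-01T10:00:00Z",
        "card_number": "4111-1111-1111-1111"}]
out = list(partner_extract(txs))
print(json.dumps(out[0], sort_keys=True))  # card number and raw ID never leave
```

The whitelist approach fails closed: a new sensitive column added upstream is excluded by default rather than leaked by default.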
Big data success tip No. 2: Data governance is essential
Big data projects bring with them significant concerns over security, privacy, and regulatory compliance. Nowhere is this a more sensitive issue than in the health care industry.
Health care provider Beth Israel Deaconess Medical Center is one organisation becoming increasingly involved in big data, as it works with electronic medical records, new health care reimbursement models, and the vast amounts of clinical and claims data that have been collected over the years. Data governance will play a key role.
"There will be a lot of pressure put on health IT organisations to turn the data around rapidly," says Bill Gillis, CIO of Beth Israel Deaconess.
Having strong governance in place lets organisations make sure the data is accurate and tells the clinical story needed to provide high-quality, improved care.
"It's critical that the 'tyranny of the urgent' not win over," Gillis says. "Having governance in place up front can help avoid that pitfall and keep things on track."
Of course, security and privacy are a big part of this.
"Given the uncertainties that surround new big data, for the important brands the privacy and security bar is so high that the protections afforded for this new data are higher than most other traditional external decision data," says Charles Stryker, chairman and CEO of Venture Development Center, a consulting firm that has provided big data advice for companies such as AOL, Cisco, First Data, and Yahoo. "No major brand wants to test the limits of where the privacy and security line falls," Stryker says.
From a project's outset, companies need to consider data provenance (the metadata that describes the source of the data) and make appropriate pedigree decisions (confidence in the data) when using this data in any big data solution, says Louis Chabot, senior technical adviser and big data lead at technology and management consulting firm DRC, which has helped government agencies implement big data projects.
"Maintaining data provenance metadata and pedigree-based decision making is not something you 'bolt on' after the fact," Chabot says. "It is an integral part of the initiative that must be designed and included from the outset." When appropriate, Chabot says, specialised techniques such as digital signatures should be used to protect provenance from accidental and/or malicious tampering.
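As a concrete illustration of the tamper protection Chabot describes, the sketch below binds a record to its provenance metadata with an HMAC (a keyed hash; a true digital signature would use asymmetric keys, for example via a library such as `cryptography`). The key and field names are illustrative assumptions:

```python
import hashlib
import hmac
import json

KEY = b"provenance-signing-key"  # illustrative; use a managed key in practice

def sign_provenance(record: dict, provenance: dict) -> dict:
    """Attach provenance metadata and a MAC binding it to the record."""
    payload = json.dumps({"record": record, "provenance": provenance},
                         sort_keys=True).encode()
    tag = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    return {"record": record, "provenance": provenance, "mac": tag}

def verify_provenance(envelope: dict) -> bool:
    """Recompute the MAC; any change to record or provenance fails the check."""
    payload = json.dumps({"record": envelope["record"],
                          "provenance": envelope["provenance"]},
                         sort_keys=True).encode()
    expected = hmac.new(KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["mac"])

env = sign_provenance({"patient": "anon-42", "value": 7.1},
                      {"source": "lab-feed", "ingested": "2012-06-01"})
assert verify_provenance(env)          # untouched envelope verifies
env["provenance"]["source"] = "spoofed"
assert not verify_provenance(env)      # tampered provenance is detected
```

Because the MAC covers record and provenance together, a consumer can make pedigree-based decisions knowing the claimed source has not been altered since signing.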
Organisations also need to respect data privacy laws and regulations. "Various techniques such as anonymisation of the data, stripping out elements of the data, and restricting distribution [and] usage of the data can be used" so that organisations are in compliance with security and privacy regulations, Chabot says.
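The techniques Chabot lists can be sketched in a few lines: remove direct identifiers outright and coarsen quasi-identifiers so individuals are harder to re-identify. The field names and coarsening rules below are invented for illustration and are not drawn from any specific regulation:

```python
def anonymise(record: dict) -> dict:
    """Strip direct identifiers and coarsen quasi-identifiers.
    Rules here are illustrative assumptions, not a compliance recipe."""
    out = dict(record)
    for field in ("name", "ssn", "email"):   # direct identifiers: remove
        out.pop(field, None)
    if "zip" in out:                         # coarsen to a 3-digit prefix
        out["zip"] = out["zip"][:3] + "**"
    if "age" in out:                         # bucket into 10-year bands
        decade = (out["age"] // 10) * 10
        out["age"] = f"{decade}-{decade + 9}"
    return out

rec = {"name": "Jane Doe", "ssn": "123-45-6789", "zip": "02215",
       "age": 47, "diagnosis": "J45"}
print(anonymise(rec))  # {'zip': '022**', 'age': '40-49', 'diagnosis': 'J45'}
```

Real de-identification standards go further (k-anonymity, suppression thresholds), but the shape of the transformation is the same: delete what you must, generalise what you can.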
Big data success tip No. 3: Don't shortchange data accuracy
Recent research from Aberdeen Group stresses yet another litmus test for big data success: data accuracy.
According to the report, best-in-class companies (as determined by Aberdeen metrics) reported an organisational data accuracy goal of 94 per cent and needed only a 1 per cent improvement to meet it. Industry-average companies reported a goal of 91 per cent but needed an 18 per cent improvement in their data management methodologies to achieve it, while 'laggards' reported a goal of 80 per cent and needed a 40 per cent improvement in their current performance to reach that.
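Accuracy figures like those above presuppose agreed validation rules to measure against. A minimal sketch of how such a percentage might be computed (the rules here are invented for illustration):

```python
def accuracy(records, rules):
    """Fraction of records that pass every validation rule."""
    passed = sum(1 for r in records if all(rule(r) for rule in rules))
    return passed / len(records)

rules = [
    lambda r: r.get("email", "").count("@") == 1,       # well-formed email
    lambda r: r.get("country") in {"UK", "US", "DE"},   # reference-data check
]
records = [
    {"email": "a@example.com", "country": "UK"},
    {"email": "bad-address",   "country": "UK"},
    {"email": "c@example.com", "country": "FR"},
    {"email": "d@example.com", "country": "US"},
]
print(f"{accuracy(records, rules):.0%}")  # 2 of 4 records pass: 50%
```

The hard part is not the arithmetic but agreeing on the rules; two teams with different rule sets will report different "accuracy" for the same data.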
Here, data cleansing and mastering are critical to big data success. "Contrary to some beliefs, this requirement does not go away," says Joe Caserta, founder and CEO of Caserta Concepts, a data management and big data consulting firm. "If the big data paradigm is to become the new corporate analytics platform, it must be able to align customers, products, employees, locations and so on, regardless of the data source."
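The alignment Caserta describes is typically done by building a normalised match key so that the same customer, spelled differently in different systems, lands under one master record. A hedged sketch, assuming invented normalisation rules and field names:

```python
import re
from collections import defaultdict

def master_key(record: dict) -> str:
    """Build a normalised match key from name and postcode.
    These normalisation rules are illustrative assumptions."""
    name = re.sub(r"[^a-z]", "", record["name"].lower())
    postcode = record["postcode"].replace(" ", "").upper()
    return f"{name}|{postcode}"

def align(*sources):
    """Group records from any number of sources under one master key."""
    mastered = defaultdict(list)
    for source_name, records in sources:
        for r in records:
            mastered[master_key(r)].append((source_name, r))
    return mastered

crm = [{"name": "J. Smith", "postcode": "sw1a 1aa"}]
web = [{"name": "jsmith", "postcode": "SW1A1AA"}]
groups = align(("crm", crm), ("web", web))
print(len(groups))  # both records land under a single master key
```

Production master data management adds fuzzy matching and survivorship rules, but the principle is the same: one canonical key, regardless of the data source.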
In addition, known data quality issues that have long jeopardised the credibility of data analyses will have the same impact on big data analytics if not properly addressed, he says.
On a typical big data project, data management is often "deprioritised" by development staff and can go unresolved, DRC's Chabot notes. Effective data management involves ensuring mature techniques - process and automation - are put in place to address model management, metadata management, reference data management, master data management, vocabulary management, data quality management, and data inventory management, he says.
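The "process and automation" Chabot calls for can be as simple as a named suite of checks run over every batch, with failures counted per check so problems are visible rather than quietly deprioritised. A sketch under invented check names and rules:

```python
def quality_report(records, checks):
    """Run named checks over a batch and count failures per check.
    Check names and rules are illustrative assumptions."""
    failures = {name: 0 for name in checks}
    for r in records:
        for name, check in checks.items():
            if not check(r):
                failures[name] += 1
    return failures

checks = {
    "id_present": lambda r: bool(r.get("id")),
    "known_currency": lambda r: r.get("currency") in {"GBP", "USD", "EUR"},
}
batch = [{"id": "1", "currency": "GBP"},
         {"id": "",  "currency": "GBP"},
         {"id": "3", "currency": "XXX"}]
print(quality_report(batch, checks))  # {'id_present': 1, 'known_currency': 1}
```

Wiring a report like this into the ingest pipeline turns data quality from an afterthought into a routine, measurable step.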