When I interact with CIOs steering agile enterprises, they discuss plans, somewhat coyly, to drive business processes with algorithms and Artificial Intelligence (AI). These enterprises are automating tasks that only humans could perform in the past. Better yet, the solutions they deploy through AI operate at a scale humans struggle to manage. AI can enable enterprises to accomplish in days tasks that now take months because of that scale. Even better, it can perform with human-like or better accuracy, and do so consistently.
Enterprises are shifting from the idealized notion of big data as a data-first initiative to attempting practical, business-focused use cases. AI and Machine Learning (ML) are key ingredients for big data success, and expertise in these areas has become critical.
These trailblazer organizations are raising the bar for their competitors, and are aiming to take the value of data to a new level: executing tasks that only humans could perform in the past. They are leveraging hybrid, flexible architectures like Integrated Big Data as a Service (IBDaaS) to give their functions and processes the agility required to venture into the world of prescriptive analytics, and as a prerequisite for a smart, AI-led enterprise.
However, these organizations are a small percentage of the business landscape. In contrast, many other organizations that ambitiously kick-started big data projects as technology projects rather than business initiatives are finding it hard to change course and recoup their investments.
Why Do Big Data Investments Fail?
Big data is largely about gaining insight, understanding the business, and developing foresight from a wide variety of data within and outside the business. Yet CIOs commonly complain that they struggle to get value from their big data investments: "we have enormous amounts of data, yet we are not getting much from it," and "we have poured in millions, and are yet to see any tangible return," are all too familiar refrains.
Companies have invested in new data architectures and tools in the hope that big data will enhance their profitability. However, after spending over $140 billion in 2017, only 37 percent of these projects delivered some measure of success.
Big data can include a variety of data types and can be structured or poly-structured. Structured data includes high-volume transaction data such as retail transaction data and pharmaceutical drug test data. Engineers tend to build solutions on structured data for quick success. But such short-sightedness can have a considerable impact on an organization’s success with its data.
Poly-structured data is more difficult to process and includes semi-structured data like XML and HTML, and unstructured data like text, image, rich media, and possibly graph data. For smart organizations, data variety, and the value extracted from that variety, take priority over volume when looking for insights.
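To make the distinction concrete, here is a minimal Python sketch contrasting the two; the retail rows and the XML snippet are invented for illustration. Structured data maps straight onto a table, while semi-structured data carries its schema inside the markup and must be parsed first:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Structured data: a fixed schema, trivially tabular (hypothetical retail rows).
structured = "sku,qty,price\nA100,2,9.99\nB200,1,24.50\n"
rows = list(csv.DictReader(io.StringIO(structured)))

# Semi-structured data: the schema lives in the markup itself (hypothetical XML).
semi = "<orders><order sku='A100' qty='2'/><order sku='B200' qty='1'/></orders>"
orders = [dict(elem.attrib) for elem in ET.fromstring(semi)]

print(rows[0]["sku"], orders[1]["qty"])
```

Both end up as the same dictionaries, but only after format-specific parsing; unstructured text, images, and rich media demand far heavier processing still.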
Big data engineers often fail to focus on deriving business value from data. Instead, they focus on building data pipelines. If organizations want to extract value from their data, they must focus on a business-first rather than a data-first view of data.
Data engineers must widen their horizons and focus on accelerating the solving of problems for the Chief Marketing Officer (CMO) and Chief Digital Officer (CDO), rather than only those of the CIO function. Organizations must hardwire their big data investments to impact in the market and to the organization's market value.
Big Data Automation
Big data automation attempts to do a lot more with a lot less, and smart organizations will deploy their limited talented resources to build anything that has customer-perceived value, enables the business, and creates differentiation for the organization.
Big data engineers today spend most of their time performing routine tasks. Most of these tasks are ideal candidates for automation.
Enterprises are planning to integrate the various data layers so they can reuse data across different applications. But existing architectures are too inefficient to fulfill this goal. Some organizations have begun using intelligent data grids and flexible architectures to integrate the data layer.
Intelligent data grids offer a common platform for applications to quickly read and write data between them, creating a single source that can be used to perform any task.
A CIO friend was of the opinion that the approach to big data must go beyond data analysis: it must translate a business need into problem prediction, resource optimization, and task execution. Out-of-the-box algorithms can automate big data tasks. Data management architectures must adopt a meshed design based on a leaderless, actor-state model, in which each node performs independently with high fault tolerance and self-healing capability. This gives them the efficiency and agility required to perform big data tasks.
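The actor-state idea can be sketched in a few lines of Python. This is a toy, not a production data grid: each node owns private state, processes messages from a mailbox, and survives a bad message without crashing, a minimal stand-in for the fault tolerance and self-healing described above. All names here are invented for illustration:

```python
import queue
import threading

class Actor:
    """Toy actor node: private state, a mailbox, and survive-on-failure
    behaviour (a sketch of fault tolerance, not a real mesh node)."""

    def __init__(self, handler, initial_state):
        self.handler = handler
        self.state = initial_state
        self.mailbox = queue.Queue()
        self.thread = threading.Thread(target=self._run, daemon=True)
        self.thread.start()

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:          # shutdown signal
                return
            try:
                self.state = self.handler(self.state, msg)
            except Exception:
                pass                  # a real node would log and recover; state survives

    def send(self, msg):
        self.mailbox.put(msg)

# Hypothetical usage: a counting node that shrugs off a malformed message.
counter = Actor(lambda state, msg: state + int(msg), 0)
for msg in ["1", "oops", "2"]:       # "oops" fails without killing the node
    counter.send(msg)
counter.send(None)
counter.thread.join()
print(counter.state)
```

Because no node coordinates the others, losing or restarting one actor leaves the rest of the mesh running, which is the property the leaderless design is after.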
Towards Prescriptive Analytics
If descriptive analytics was the foundation of business intelligence, and predictive analytics is the current objective for big data, then prescriptive analytics will be its future: taking decisions and learning automatically from their effects. A popular example of prescriptive analytics is predicting the breakdown of a machine and applying the fix preventively.
Prescriptive analytics uses optimization and simulation algorithms, machine learning, and artificial intelligence concepts to understand the impact of future decisions, and uses scenarios to advise on possible outcomes, with a self-learning loop on the effects of those decisions. With prescriptive analytics, it is now possible to peek into the future and take actions today to shape it. For example, an automotive manufacturer can forecast an increase in raw material prices and hedge by buying raw material now, lowering future price risk and thereby protecting the bottom line and ensuring efficient operations.
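The hedging example reduces to a predict-then-prescribe loop. The sketch below is deliberately naive, assuming a trivial trend forecast and an invented 5 percent threshold, and it omits the self-learning feedback on outcomes for brevity:

```python
# Toy prescriptive step: forecast the next price, then recommend an action.
def predict_price(history):
    # Naive trend forecast: extend the last observed price change.
    return history[-1] + (history[-1] - history[-2])

def prescribe(history, threshold=1.05):
    """Recommend 'hedge' if the forecast exceeds today's price by the
    (assumed) 5 percent threshold, else 'wait'."""
    forecast = predict_price(history)
    return "hedge" if forecast > history[-1] * threshold else "wait"

raw_material_prices = [100.0, 104.0, 110.0]   # invented price history
action = prescribe(raw_material_prices)
print(action)
```

A real system would replace the one-line forecast with a proper model and would feed the observed effect of each decision back into the next one, which is the learning loop the text describes.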
The progression from analytics to AI is natural and pragmatic, and it spans the whole AI life cycle, from data collection to governance to predictions, prescriptions, and automation. Faced with an incessant increase in data, we need new systems that learn and adapt, and AI offers that hope.
Is AI the Emerging Normal?
For too long now, companies have focused on the limited ends of data: generating reports, dashboards, and interactive graphics, all about data representation rather than value extraction.
An organization that consumes huge amounts of energy for its operations can now automate the purchase of power when it is cheap (during non-peak hours) and store it for future use—all this, without human intervention.
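A rule-based version of that power purchase can be sketched in a few lines; the prices and the buy-below-average rule are invented stand-ins for a real tariff forecast:

```python
# Hypothetical rule: buy and store energy whenever the spot price falls
# below the day's average (a stand-in for "non-peak hours are cheap").
def plan_purchases(hourly_prices, budget_mwh):
    average = sum(hourly_prices) / len(hourly_prices)
    schedule = {}
    for hour, price in enumerate(hourly_prices):
        if price < average and budget_mwh > 0:
            schedule[hour] = min(1.0, budget_mwh)   # buy up to 1 MWh per hour
            budget_mwh -= schedule[hour]
    return schedule

spot_prices = [30, 28, 25, 40, 55, 60, 35, 27]      # assumed $/MWh over 8 hours
plan = plan_purchases(spot_prices, budget_mwh=3.0)
print(plan)
```

Even this trivial policy runs with no human in the loop; a prescriptive system would additionally learn from how each day's schedule actually performed.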
This is about the ability to solve business problems that could not be solved before, and to perform tasks that only humans could perform earlier.
Artificial Intelligence also has another application now: achieving a faster rate of data exploration. Algorithms can perform data analysis automatically and uncover relationships hidden in the data. Artificial intelligence can automate the preparation of data for analysis, perform big data analysis to discover correlations, make predictions, and automate decision making.
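Automated correlation discovery, the simplest of these capabilities, can be illustrated with a short Python scan over every pair of columns; the column names, values, and the 0.8 cutoff are all invented for the example:

```python
from itertools import combinations
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical columns; the scan flags every strongly correlated pair.
data = {
    "ad_spend": [10, 20, 30, 40, 50],
    "visits":   [12, 24, 33, 41, 55],
    "returns":  [5, 3, 6, 2, 4],
}
strong = {(a, b): round(pearson(data[a], data[b]), 2)
          for a, b in combinations(data, 2)
          if abs(pearson(data[a], data[b])) > 0.8}
print(strong)
```

Here only the ad_spend/visits pair survives the cutoff; at scale, the same exhaustive scan is exactly the kind of routine exploration worth handing to a machine.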
To successfully integrate big data in your organization, a flexible platform that addresses a wide range of business and data scenarios is necessary. Pre-built applications for specific big data use cases fail because data differs from one organization to another, and one size does not fit all. Organizations must adopt hybrid and flexible architectures that enable adding new data sources on the fly; that allow adding new metadata and new algorithms.
Big data projects involve four major tasks: data preparation, machine learning, applying domain knowledge, and result interpretation. “How about a robust framework,” I thought, “that makes these processes agile, automates the tasks, and leaves you with the insight you need to make informed business decisions?”
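The four tasks chain together naturally, which is what makes them automatable. The sketch below is a deliberately tiny pipeline with invented stand-ins for each stage, not any particular framework's API:

```python
# A toy four-stage pipeline mirroring the tasks named above.
def prepare(raw):
    """Data preparation: drop missing records."""
    return [record for record in raw if record is not None]

def learn(clean):
    """Machine learning stand-in: a mean is our 'model' here."""
    return sum(clean) / len(clean)

def apply_domain_rules(score, floor=0):
    """Domain knowledge: clamp the score to a business-sensible floor."""
    return max(score, floor)

def interpret(score):
    """Result interpretation: turn a number into a decision."""
    return "act" if score > 10 else "monitor"

def run_pipeline(raw):
    return interpret(apply_domain_rules(learn(prepare(raw))))

print(run_pipeline([12, None, 18, 9]))
```

A robust framework would let each stage be swapped out independently while the orchestration, retries, and logging stay automated, leaving only the interpretation to the analyst.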
If your enterprise needs a scalable and extremely efficient data integration platform, consider implementing a service-oriented architecture of the future. IBDaaS is a prime example of such an architecture, offering sophisticated big data automation, and it can integrate with your existing data and infrastructure.
IBDaaS enables organizations to achieve faster results by automating the whole data plumbing process, eliminating a huge cost overhead. It then orchestrates the whole gamut of tasks, including preparing the data, de-sensitizing it, and generating the output of your choice. Data is made available in a common data lake whose metadata eliminates duplication when multiple applications access it, creating a near-instant ability to share common data. This enhances data governance and security.
Peek into the Future
IBDaaS is a future-proofed architecture that automates and orchestrates your analytics operations, essentially creating an ‘insights as a service’ operation. This frees data scientists to focus on solving critical business problems, for instance, fine-tuning the entire route plan of a logistics operation to cut fuel consumption by 30 percent annually, instead of poring over data anomalies.
At iFusion Analytics, we have developed a true IBDaaS solution that eases the pain organizations endure in discovering new insights and better understanding what their data says about the future of their business. Its hybrid and flexible framework makes it easy to add any component, and empowers organizations to stay at the forefront through AI.