Five Data-Driven Steps Deeper into Your Big Data Lake
Big data lakes have created a lot of change, a lot of angst, and a lot of opportunity. Among those opportunities is a different way of looking at, and creating, insights for transforming your business. Done correctly, this increases the speed and agility with which you measure your performance and course-correct to adapt to the fast-changing business conditions around you. Here are five ways to maximize the value of your data assets.
1. Load the Data First
Instead of modeling the data first, load the data first, then model based on the content and meaning of the data that matters most for the decision at hand. This switch adds power, saves time, and helps you make faster, more focused, smarter decisions.
Breaking data barriers (blending multiple sources, mining for new insights, analyzing sets and correlations, and linking internal and external data environments to create new insights, new analytic views, and new business opportunities) maximizes value in the shortest possible delivery time. Adopting a “Data First” approach also surfaces potential cost reductions: augmenting your traditional platforms with the big data lake reduces dependence on higher-cost software and platforms over time and allows a ‘brute-force’ approach to crunching data at efficiencies of scale never possible before. Combine that with machine learning tools and you shift the heavy-lifting paradigm by letting the database do the bulk of the analysis and propose new insights to you.
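The “load first, model later” idea is often called schema-on-read. As a minimal sketch (the record fields and function names here are illustrative, not from any particular product): raw records land in the lake untouched, and a lightweight model is applied only when a specific question is asked.

```python
import json

# Raw, unmodeled records land in the lake as-is ("load first").
# Note that records need not share a schema: one has an extra field.
raw_records = [
    '{"customer": "acme", "amount": 120.0, "region": "EU"}',
    '{"customer": "zeta", "amount": 75.5, "region": "US", "channel": "web"}',
    '{"customer": "acme", "amount": 30.0, "region": "EU"}',
]

def load_raw(lines):
    """Ingest everything without imposing an upfront schema."""
    return [json.loads(line) for line in lines]

def model_for_question(records, fields):
    """Schema-on-read: project out only the fields the decision
    at hand actually needs, tolerating missing values."""
    return [{f: r.get(f) for f in fields} for r in records]

lake = load_raw(raw_records)

# Today's question: revenue by region.
view = model_for_question(lake, ["region", "amount"])
revenue = {}
for row in view:
    revenue[row["region"]] = revenue.get(row["region"], 0.0) + row["amount"]
print(revenue)  # {'EU': 150.0, 'US': 75.5}
```

Tomorrow's question can reuse the same raw lake with a different projection, with no re-ingestion and no schema migration, which is the agility the “Data First” approach is after.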
2. Opportunities in Storing Everything
By not discarding or ‘filtering’ data due to performance and cost concerns, you can maximize the depth and breadth of your analytics and increase their accuracy.
Data discovery becomes easier when data is more accessible, turnaround is faster, and user-friendly tools let users manage the environment in a self-service model, decreasing the need for ongoing IT support and maintenance. Data should be versioned, curated, and tagged so that users can readily attach confidence levels to the data as they use it. “Fit-for-purpose” acceptable-use policies should ensure that proper controls are in place, so that users can mine this data for new insights that add value while contributing to the overall ecosystem.
On-demand and real-time analytics are within reach with these new processing paradigms.
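The versioning-and-tagging discipline above can be sketched as a minimal catalog entry. This is an illustrative toy, not a real catalog API; the score weights and the “verified-source” tag are arbitrary assumptions chosen to show how curation and tags can roll up into a confidence level a self-service user can filter on.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetEntry:
    """A minimal catalog record: version, curation state, and tags
    let users judge how much to trust a dataset before mining it."""
    name: str
    version: int
    curated: bool
    tags: set = field(default_factory=set)

    def confidence(self) -> int:
        # Toy scoring on a 0-100 scale; the weights are assumptions,
        # not a standard. Curated, verified data ranks highest.
        score = 50
        if self.curated:
            score += 30
        if "verified-source" in self.tags:
            score += 20
        return score

catalog = [
    DatasetEntry("clickstream_raw", 3, False, {"external"}),
    DatasetEntry("sales_curated", 7, True, {"verified-source", "finance"}),
]

# A self-service user filters by confidence before running analysis.
trusted = [d.name for d in catalog if d.confidence() >= 80]
print(trusted)  # ['sales_curated']
```

The point of the design is that governance metadata travels with the data: nothing is filtered out of the lake, but every consumer can see, per version, how much curation stands behind what they are querying.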
3. Move the Data Warehouse to a New Neighborhood
In today’s environment, accounting for every question in one model is impossible. New data sources are emerging too fast, and new questions are sprouting up even faster. A highly engineered environment that ingests only the data it needs up front will struggle to adapt to rapidly changing requirements.
Augmenting the data warehouse with agile analytical exploratory environments allows a business to leverage both environments successfully while mitigating cost and risk.
4. Worry Less and Execute Fast
Gone are the days when every expense needed to be justified over a five-year depreciation cycle and win the approval of the board of directors. Big data is accessible within the spending limits of most departments. Think of your data lake as an enterprise platform, find a department interested in taking the journey with you, and go do it. Most organizations see a quick benefit and ROI from adopting the technology, which has proven both cost-effective and impactful. Sometimes it is OK to break the rules; this is one of those cases.
5. Take the Best, Leave the Rest Behind
Best of breed is back. Leverage your current investments while keeping an eye toward the future of data management. It is big, it is fast, it is bold, but most of all it is smart. You cannot afford to be left behind in the war over who has the most accurate and current information to run their business.