It takes years – sometimes a lifetime – to perfect certain skills in life: hitting a jump shot off the dribble, nailing that double high C on the trumpet, parallel parking a Ford Expedition. Malcolm Gladwell wrote a book, “Outliers,” discussing the amount of work – 10,000 hours – required to perfect a skill (while the exactness of 10,000 hours has come under debate, the underlying point stands: people must invest considerable time and effort to master a skill). But once we get comfortable with something we feel we have mastered, we become reluctant to change. We are reluctant to unlearn what we’ve taken so long to master.
Changing your point of release on a jump shot or your embouchure for playing lead trumpet is dang hard! Why? Because it is harder to unlearn than it is to learn. It is harder to un-wire all those synaptic connections and deep memories than it was to wire them in the first place. It’s not just a case of thinking faster, smaller or cheaper; it necessitates thinking differently.
For example, why did it take professional basketball so long to understand the game-changing potential of the 3-point shot? The 3-point shot was added to the NBA during the 1979-1980 season, but for decades it was more a novelty than a serious game strategy. Pat Riley, the legendary coach of the 3-pointer’s first decade in the league (NBA Championships in 1982, 1985, 1987 and 1988), called it a “gimmick.” Larry Bird, one of that era’s top players, said: “I really don’t like it.”
It’s only been within the past 3 years that the “economics of the 3-point shot” have changed the fundamentals of how to win an NBA Championship (see Figure 1).
NBA coaches and general managers just didn’t comprehend the “economics of the 3-point shot” and how it could turn a good shooter into a dominant player; that a 40% 3-point shooting percentage is equivalent to a 60% 2-point shooting percentage from a points-per-attempt perspective. The economics of the 3-point shot (coupled with rapid ball movement to create uncontested 3-point shots) weren’t fully exploited until the 2015-2016 season by the Golden State Warriors. Their success over the past 3 seasons (3 trips to the NBA Finals with 2 championships) shows how much the game of basketball has changed.
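The equivalence above is simple expected-value arithmetic, which a few lines of Python make concrete (the function names here are illustrative, not from any real analytics library):

```python
# Expected points per shot attempt = shooting percentage x point value.
def expected_points(shooting_pct: float, point_value: int) -> float:
    return shooting_pct * point_value

# A 40% 3-point shooter produces the same expected points per attempt
# as a 60% 2-point shooter: 0.40 * 3 = 0.60 * 2 = 1.2 points.
three_pt = expected_points(0.40, 3)
two_pt = expected_points(0.60, 2)
assert abs(three_pt - two_pt) < 1e-9

# Break-even 3-point percentage for a given 2-point percentage:
# solve p3 * 3 = p2 * 2, so p3 = (2/3) * p2.
def breakeven_3pt_pct(two_pt_pct: float) -> float:
    return two_pt_pct * 2 / 3

# A 33.3% 3-point shooter already matches a 50% 2-point shooter.
print(f"{breakeven_3pt_pct(0.50):.3f}")  # 0.333
```

This is why a merely "good" 3-point shooter can out-produce an efficient 2-point scorer: the extra point per make lowers the accuracy bar by a full third.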
Sometimes it’s necessary to unlearn long-held beliefs (i.e., 2-point shooting in a predominantly isolation offense) in order to learn new, more powerful, game-changing beliefs (i.e., 3-point shooting in a rapid ball movement offense).
Sticking with our NBA example, Phil Jackson is considered one of the greatest NBA coaches, with 11 NBA World Championships coaching the Chicago Bulls and the Los Angeles Lakers. Phil Jackson mastered the “Triangle Offense” that played to the strengths of the then dominant players Michael Jordan (Chicago Bulls) and Kobe Bryant (Los Angeles Lakers) to win those 11 titles.
However, the game passed Phil Jackson by as the economics of the 3-point shot changed how to win. Jackson’s tried-and-true “Triangle Offense” failed with the New York Knicks, leading to the team’s dramatic under-performance and ultimately his firing. It serves as a stark reminder of how important it is to be ready to unlearn old skills in order to move forward.
And what holds true for sports, holds even more so for technology and business.
The Challenge of Unlearning
For the first two decades of my career, I worked to perfect the art of data warehousing. I was fortunate to be at Metaphor Computers in the 1980s, where we refined the art of dimensional modeling and star schemas. I spent many years perfecting my star schema and dimensional modeling skills with data warehouse luminaries like Ralph Kimball, Margy Ross, Warren Thornthwaite, and Bob Becker. It became ingrained in every customer conversation; I’d build the star schema and the conformed dimensions in my head as the client explained their data analysis requirements.
Then Yahoo happened to me and soon everything that I held as absolute truth was turned upside down. I was thrown into a brave new world of analytics based upon petabytes of semi-structured and unstructured data, hundreds of millions of customers with 70 to 80 dimensions and hundreds of metrics, and the need to make campaign decisions in fractions of a second. There was no way that my batch “slice and dice” business intelligence and highly structured data warehouse approach was going to work in this brave new world of real-time, predictive and prescriptive analytics.
I struggled to unlearn ingrained data warehousing concepts in order to embrace this new real-time, predictive and prescriptive world. And this is one of the biggest challenges facing IT leaders today – how to unlearn what they’ve held as gospel and embrace what is new and different. And nowhere do I see that challenge more evident than when I’m discussing Data Science and the Data Lake.
Embracing the “Art of Failure” and the Data Science Process
Nowadays, Chief Information Officers (CIOs) are being asked to lead the digital transformation from a batch world that uses data and analytics to monitor the business to a real-time world that exploits internal and external, structured and unstructured data to predict what is likely to happen and prescribe recommendations. To power this transition, CIOs must embrace a new approach for deriving customer, product, and operational insights – the Data Science Process (see Figure 2).
The Data Science Process is about exploring, experimenting, and testing new data sources and analytic tools quickly – failing fast but learning faster. It requires business leaders to get comfortable with “good enough,” and with failing enough times before becoming comfortable with the analytic results. Predictions never come with 100% accuracy. As Yogi Berra is famously credited with saying, “It’s tough to make predictions, especially about the future.”
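The fail-fast-but-learn-faster loop can be sketched in a few lines of Python. Everything here is illustrative – the candidate names, the `evaluate` stand-in, and the `GOOD_ENOUGH` threshold are hypothetical, not a real data science framework – but it captures the shape of the process: run many cheap experiments, keep what clears a “good enough” bar, and learn from everything that doesn’t.

```python
import random

GOOD_ENOUGH = 0.75  # accept imperfect predictions past this accuracy bar

def evaluate(candidate: str) -> float:
    """Stand-in for training and scoring a model; returns accuracy in [0.5, 0.9]."""
    random.seed(candidate)          # deterministic per candidate, for illustration
    return random.uniform(0.5, 0.9)

# Run many cheap experiments over candidate model / data-source combinations.
candidates = ["model_A", "model_B", "model_C", "model_D"]
results = {c: evaluate(c) for c in candidates}

# Keep everything that is "good enough"; the failures still teach us
# which data sources and features are NOT predictive.
survivors = {c: score for c, score in results.items() if score >= GOOD_ENOUGH}
best = max(results, key=results.get)
print(f"best candidate: {best} ({results[best]:.2f}); survivors: {sorted(survivors)}")
```

The design point is that the loop optimizes for iteration speed, not per-experiment perfection: a failed candidate that is discarded in hours is cheaper than a “perfect” model delivered in months.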
This highly iterative, fail-fast-but-learn-faster process is the heart of digital transformation – to uncover new customer, product, and operational insights that can optimize key business and operational processes, mitigate regulatory and compliance risks, uncover new revenue streams and create a more compelling, more prescriptive customer engagement. And the platform that is enabling digital transformation is the Data Lake.
The Power of the Data Lake
The data lake exploits the “economics of big data”: coupling commodity, low-cost servers and storage with open source tools and technologies makes it 50x to 100x cheaper to store, manage and analyze data than traditional, proprietary data warehousing technologies. However, it’s not just cost that makes the data lake a more compelling platform than the data warehouse. The data lake also provides a new way to power the business, based upon new data and analytics capabilities, agility, speed, and flexibility (see Table 1).
Table 1: Data Warehouse versus Data Lake
The data lake supports the unique requirements of the data science team: to rapidly ingest, explore, and experiment with new data sources and analytic tools, and to test and refine the resulting analytic models.
The data science team needs to be able to perform this cycle in hours or days, not weeks or months. The data warehouse cannot support these data science requirements. The data warehouse cannot rapidly explore the internal and external, structured and unstructured data sources. The data warehouse cannot leverage the growing field of deep learning/machine learning/artificial intelligence tools to quantify cause-and-effect. Thinking that the data lake is “cold storage for our data warehouse” – as one data warehouse expert told me – misses the bigger opportunity. That’s yesterday’s “Triangle Offense” thinking. The world has changed, and just like how the game of basketball is being changed by the “economics of the 3-point shot,” business models are being changed by the “economics of big data.”
But a data lake is more than just a technology stack. To truly exploit the economic potential of the organization’s data, the data lake must come with data management services covering data accuracy, quality, security, completeness and governance. See “Data Lake Plumbers: Operationalizing the Data Lake” for more details (see Figure 3).
If the data lake is only going to be used as just another data repository, then go ahead and toss your data into your unmanageable gaggle of data warehouses and data marts.
BUT if you are looking to exploit the unique characteristics of data and analytics – assets that never deplete, never wear out and can be used across an infinite number of use cases at zero marginal cost – then the data lake is your “collaborative value creation” platform. The data lake becomes the platform that supports the capture, refinement, protection and re-use of your data and analytic assets across the organization.
But you must be ready to unlearn what you have held as the gospel truth with respect to data and analytics; to be ready to throw away what you have mastered in order to embrace new concepts, technologies, and approaches. It’s challenging, but the economics of big data are too compelling to ignore. In the end, the transition will be enlightening and rewarding. I know, because I have made that journey.