Enabling the Autonomous Enterprise

This post was contributed by Ron Craig, Senior Principal Product Marketing Director.

Background – data overflow

The ability of enterprises to generate data is increasingly outpacing their ability to realize real value from that data. As a result, opportunities for innovation, driven by customer, market, and competitive intelligence, are being left on the table. And given that only a subset of that avalanche of data is being put to good use, it’s entirely possible that reliance on inadequate data is leading to bad decisions.

A key source of this problem is that the productivity of human beings simply hasn’t kept pace with the technologies we have developed to help improve our business processes. IDC has predicted that by 2025, 6 billion consumers will have one data interaction every 18 seconds. At that point, the volume of global data will be 175ZB (175,000,000,000,000,000,000,000 bytes), and ~30% of that will be real-time data – a 6X increase vs. 2018. The exponential increase in effort required to clean, arrange, secure, maintain and process the data from all those customers means less effort can be dedicated to insights. As a consequence, enterprises are not truly seeing the benefits of their own success in becoming data companies.

Abstraction as a game-changer

So what’s needed in response? Sometimes it’s good to look to other areas for inspiration, and innovation in the semiconductor industry provides some useful insights. That industry has, since its early days, had to deal with the fact that user productivity struggles to keep pace with advances in technology, and it has surmounted that problem with innovations that address the productivity limitations head-on.

Digital designs – the creations behind everything from the silicon chip that runs the timer on a microwave oven all the way up to the supercomputers used to forecast the weather – are at their essence built from logical components known as gates. These logic gates perform fairly routine Boolean operations, effectively allowing decisions or calculations to be made based on sets of inputs and propagating those results in real time. Chip designers working at the gate level could be expected to produce verified designs (effectively combinations of connected gates) at a rate of ~50 gates per day – a productivity level that has remained fairly constant over time.

The processors in today’s high-end cellphones may contain around 100 million gates, so a team of 100 chip designers working at the gate level would take roughly 80 years (at about 250 working days per year) to put such a chip together. In reality, though, such chips are now often developed in two years or less, thanks to innovations introduced into the chip design flow over the last twenty years. For the purposes of this blog, and since it provides a useful analogy, the innovation we’ll focus on is the hardware description language (HDL). An HDL works much like a software programming language, allowing the chip designer to describe logic in terms of what it does rather than how it’s built, freeing the designer from the details of how that logic is implemented in hardware. HDL-based design goes hand in hand with automated synthesis algorithms, which translate those higher-level descriptions into equivalent gates that perform the same function and can ultimately be realized in silicon.
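Real HDLs, such as VHDL and Verilog, are outside the scope of this post, but a minimal Python sketch (purely illustrative, not an actual HDL) can show the abstraction gap the analogy rests on: the same one-bit adder described gate by gate, and then described simply by what it does. The function names here are invented for the example.

```python
# Illustrative only: contrasting a gate-level description of a 1-bit adder
# with a behavioral one, to show the abstraction an HDL plus synthesis provides.

def xor_gate(a, b):
    return a ^ b

def and_gate(a, b):
    return a & b

def or_gate(a, b):
    return a | b

def full_adder_gate_level(a, b, carry_in):
    """The 'how': the adder spelled out as explicit gates and wires."""
    partial = xor_gate(a, b)
    total = xor_gate(partial, carry_in)
    carry_out = or_gate(and_gate(a, b), and_gate(partial, carry_in))
    return total, carry_out

def full_adder_behavioral(a, b, carry_in):
    """The 'what': the same function described by intent; in a real flow,
    a synthesis tool (not the designer) picks the gates that implement it."""
    result = a + b + carry_in
    return result & 1, (result >> 1) & 1

# Both descriptions agree for every combination of inputs.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert full_adder_gate_level(a, b, c) == full_adder_behavioral(a, b, c)
```

The point of the sketch: the behavioral description captures the designer’s intent, while deriving the gate-level version is exactly the kind of mechanical work that automation can handle reliably and repeatably.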

As a result of these innovations, the semiconductor industry has enabled designers to keep up with the capacity of silicon chips by allowing them to be less and less concerned about the lower level implementation details of the chips they are designing, and put their focus on what those chips actually do. Chip designers take care of the ‘what’, where they can bring their creativity and experience to bear, and automation takes care of the ‘how’ in a reliable and repeatable fashion.

Oracle Autonomous Database – Automating a path to innovation

The semiconductor industry’s experience provides a useful blueprint for the path the data industry must also take: automation is the key to unlocking the potential of today’s data, just as design automation allowed chip designers to fully exploit the capacity of silicon. Across a range of industries, corporations differentiate themselves by what they do with the data they generate and collect, not by the effort they expend to manage and secure that data. To have maximum impact, database experts need to be able to maximize their focus on what their data is telling them (the ‘what’), and rely on automation to keep that data available and secure (the ‘how’).

In a recent Oracle user survey, 95% of respondents noted that they are having difficulty keeping up with the growth in their data, and the majority of data managers reported performing multiple major database updates per year. The survey also noted that, beyond simply keeping the database up and running, significant manual effort continues to be dedicated to general troubleshooting and tuning, backup/recovery tasks, and provisioning to handle usage peaks and troughs.

Data security also stands out as an area that can benefit significantly from automation, not only because automation can reduce manual effort, but because it can reduce risk. In an age where managers of on-premises database deployments must continuously juggle the urgency of patch installation with the cost of the downtime needed to install those patches, it comes as no surprise that a recent Verizon survey noted that 85% of successful data breaches exploited vulnerabilities for which patches had been available for up to a year before the attack occurred. It makes far more sense to instead let Oracle Autonomous Database apply security patches automatically, with no downtime.
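To make the contrast with manual administration concrete, here is a minimal sketch, assuming the OCI Python SDK (the oci package), of provisioning an Autonomous Database with auto-scaling enabled. The compartment OCID, database name and password are placeholders rather than real values; once the database exists, patching, tuning and backups are handled by the service rather than by the DBA.

```python
# A minimal sketch using the OCI Python SDK ("oci") to provision an
# Autonomous Database. All OCIDs, names and the password below are
# placeholders; real values come from your own tenancy.
import oci

config = oci.config.from_file()  # reads credentials from ~/.oci/config
db_client = oci.database.DatabaseClient(config)

details = oci.database.models.CreateAutonomousDatabaseDetails(
    compartment_id="ocid1.compartment.oc1..exampleuniqueid",  # placeholder
    db_name="salesadb",                                        # placeholder
    display_name="sales-autonomous-db",
    cpu_core_count=1,
    data_storage_size_in_tbs=1,
    admin_password="ChangeMe#12345",                           # placeholder
    is_auto_scaling_enabled=True,  # let the service absorb usage peaks and troughs
)

# The service, not the DBA, takes care of patching, tuning and backups
# for the database created here.
response = db_client.create_autonomous_database(details)
print(response.data.lifecycle_state)  # e.g. "PROVISIONING"
```

The point is not the specific API call but what is absent from it: there is no patching schedule, downtime window or backup script for the administrator to own.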

In total, these automated capabilities can reduce administrative costs by up to 80%, meaning that the Autonomous Enterprise taking advantage of these advances can dedicate significantly more effort to innovation.

Coming back to our semiconductor analogy, innovations in how design is done didn’t make chip designers less necessary; rather, they made designers significantly more productive and enabled them to make more innovative use of advances in technology. We expect Oracle Autonomous Database to have the same impact for DBAs and data managers in the Autonomous Enterprise.

Learn more at Oracle Open World 2019

To learn more about enterprises that have already become autonomous, visit the sessions below at the 2019 Oracle Open World event –

Drop Tank: A Cloud Journey Case Study, Tuesday September 17, 11:15AM – 12:00PM

Oracle Autonomous Data Warehouse: Customer Panel, Tuesday September 17, 1:45PM – 2:30PM

Oracle Autonomous Transaction Processing Dedicated Deployment: The End User’s Experience, Tuesday September 17, 5:15PM – 6:00PM

Managing One Of the Largest IoT Systems in the World With Autonomous Technologies, Wednesday September 18, 9:00AM – 9:45AM
