HPE: From data anarchy to data value

A post by Stefan Brock, Industry Partner at Hewlett Packard Enterprise

A data-centric architecture decouples the data from the heterogeneous application environment, providing a key prerequisite for data-driven business. While this is a complex transformation, it can be implemented in incremental steps based on a set of common design principles.

The platform giants have demonstrated how data-driven business works. They obviously had an advantage – they were able to design their IT architectures based on data-centric principles right from the start.

For example, in 2002 Jeff Bezos issued his well-known API mandate in an internal memo: from now on, all data exchange between applications and services must take place via service interfaces, without exception and regardless of the technologies used; anyone who does not comply with this rule will be fired.
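To make the principle concrete, here is a minimal sketch of such a service interface in Python. The framework (Flask), the endpoint path and the sample data are illustrative assumptions, not part of any Amazon or HPE implementation; the point is simply that consumers reach data only through a published API, never through the data store behind it.

```python
# Minimal sketch of a service interface: data is exposed only via an API,
# never via direct access to the owning application's data store.
# Flask, the /orders endpoint and the sample data are illustrative assumptions.
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stands in for the service's private data store; other teams never touch it directly.
_ORDERS = {
    "1001": {"customer": "ACME", "total": 250.0},
    "1002": {"customer": "Globex", "total": 99.9},
}

@app.route("/orders/<order_id>")
def get_order(order_id: str):
    """Return a single order as JSON, the only sanctioned way to read this data."""
    order = _ORDERS.get(order_id)
    if order is None:
        abort(404)
    return jsonify(order)

if __name__ == "__main__":
    app.run(port=8080)
```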

The situation is quite different in the vast majority of companies. They house a zoo of data silos which are held together by scripts, interfaces and point-to-point connections. In the long run, such spaghetti architectures are not only expensive, but also a serious obstacle to a company’s data value creation strategy.

But even for these companies there is a viable path towards a data-centric architecture. Its key features can be easily summarized. In essence, it decouples the data from the applications by channeling it via a central data hub – which in turn is based on streaming technologies, various database architectures and a data lake. Each application is both producer and consumer of a common real-time data stream. And all of this is embedded in an overarching data governance framework.
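As a rough illustration of the producer/consumer pattern on such a hub, the following sketch uses the kafka-python client. The broker address, topic name, consumer group and event schema are assumptions made for the example; any comparable streaming platform would serve the same role.

```python
# Sketch of an application acting as both producer and consumer on a central
# data hub. Broker address, topic name and event schema are assumptions.
import json
from kafka import KafkaProducer, KafkaConsumer

BROKER = "datahub.example.com:9092"   # hypothetical hub endpoint
TOPIC = "customer-events"             # hypothetical shared topic

# Producer side: publish a business event to the hub instead of writing
# directly into another application's database.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda event: json.dumps(event).encode("utf-8"),
)
producer.send(TOPIC, {"type": "address_changed", "customer_id": "42"})
producer.flush()

# Consumer side: the same (or any other) application subscribes to the
# shared stream rather than polling point-to-point interfaces.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    group_id="crm-sync",              # hypothetical consumer group
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)
for message in consumer:
    print(message.value["type"], message.value["customer_id"])
```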

The introduction of a data-centric architecture ultimately serves the goal of creating a "digital twin" of the entire data pool of a company or ecosystem. However, a big-bang approach is usually unrealistic for companies with complex heterogeneous IT environments.

Incremental approaches, by contrast, have proven successful. Lighthouse projects create a magnetic effect, so that the target architecture continuously spreads throughout the company, provided the approach is based on common design principles.

Once the data streams have been channeled via a data hub, a large number of use cases can be implemented quickly on this foundation, as the sketch after this paragraph illustrates. But the path there is not easy, and it is becoming harder as application landscapes grow more complex and data volumes grow rapidly.
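To hint at what such a use case can look like once the hub is in place, here is a sketch that consumes the same hypothetical stream from the earlier example and maintains a simple real-time count per event type. Names and schema are again assumptions, not a reference implementation.

```python
# Sketch of a downstream use case on the data hub: a running count of events
# per type, computed from the shared stream. Names and schema are assumptions.
import json
from collections import Counter
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "customer-events",                        # hypothetical shared topic
    bootstrap_servers="datahub.example.com:9092",
    group_id="realtime-dashboard",            # hypothetical consumer group
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

counts = Counter()
for message in consumer:
    counts[message.value["type"]] += 1
    # A real dashboard would push this to a UI; printing keeps the sketch short.
    print(dict(counts))
```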

As with all strategic IT initiatives, top management support and solid governance are needed to overcome these difficulties, but agile methodologies for architecture management, development and operations are particularly important. That way, the path matches the goal: both are about overcoming silos, so that the organization deals less with itself and more with its customers.

Now that I've outlined our view and approach, I'm keen to hear your perspective. What is your strategy for unifying data access and management? Which technologies and platforms have proven effective? And how do you collaborate with lines of business and company management? I look forward to discussing this with you at upcoming CIOmove events. See you there!