When Digital Transformation Polarizes the Organization
Part Three of Four: How to Control Your Future.
By Kurt Jonckheer
Chief Executive Officer
Some mistakes go unnoticed with no consequences whatsoever. Some mistakes cost lives.
In the world of IT, the situation is no different. Mistakes are going to happen as a matter of course, and the vast majority of these are routine and easily correctable. On the other hand, as organizations increasingly need to rely on emerging technologies and cloud-based solutions to survive, the wrong mistake at the wrong time in the wrong environment can literally put an organization out of business.
Regardless of the different domains that come into play, and the perceived list of priorities, the following items need to be controlled at all costs. Failure to do so can be a game-ending mistake.
Control your own destiny.
Sometimes situations occur that make dependency on an outside vendor seem like the right move. In fact, as IT systems become more and more integrated, it’s virtually impossible for any enterprise to function without a strong vendor ecosystem.
That said, when planning the NextGen organization, its future-proof architecture, its services, and its roadmaps, keeping your core specialties and added value under your own control is crucial. Giving them away is tantamount to putting your future in someone else's hands.
Where the situation gets particularly muddled, however, is in deciding what an organization’s streaming, event-driven solutions should specifically entail. This is because data for reporting, billing, intelligence analysis, cost controls, and value-added services gets intermingled and often buried within multiple vendor systems.
Part of the digital transformation process should be to become less vendor dependent, to break existing silos and implement new tools, processes, and partner ecosystems, and to stimulate the organizational DNA to drive innovation and acceleration. Consequently, to avoid vendor lock-in and reduce outside dependencies, organizations often pursue a hybrid partner strategy. The same holds true for IT systems, software solutions, and cloud infrastructures.
Yet a hybrid strategy alone isn't enough to protect a company's proprietary offerings, or its future. Organizations must also keep focusing on their primary skills, and on retaining and repositioning resources, while accepting that software development will increasingly become part of their core business.
Future exit strategies need to stay in place, as do performance-control procedures, scalability requirements, proactive customer insights, and customer linkage. A long-term roadmap is needed on multiple levels, covering internal and external stakeholder needs, resources, migrations, end-of-life exercises, security requirements, architectural visions, and implementation rules.
It's imperative to address what must remain within the core control of the organization versus what can structurally be outsourced. This entire process should be carried out internally, with as little involvement from outside vendors as possible, and it should happen ahead of time rather than during transition periods or, even worse, as an afterthought.
Log, monitor, and control your costs.
Give 100 colleagues access to the new world of APIs, proprietary microservices, and undocumented data sets, and you might as well give them the corporate credit card.
That's why the need to log, monitor, and control costs is hardly news to anyone. Yet with so many complexities involved in moving to the cloud, along with a growing number of proprietary, consumption-driven offerings from vendors, controlling these seemingly routine processes is far more critical than it used to be.
When consumption forecasts for storage, compute, and bandwidth are ignored or improperly estimated, what sounds like pennies per GB per month is almost certain to lead to an ROI or OPEX disaster. In fact, it's bound to happen when unanticipated volumes are activated on an hourly basis throughout the organization.
The situation becomes even more problematic when the organization can't articulate or quantify its use-case types and volumes, the number of lift-and-shift migrations it needs, the work required on hybrid infrastructures and cross-departmental processes in multiple environments, or the shift from traditional CAPEX to OPEX budgets.
Unless consumption-driven costs are logged, monitored, and made clearly visible at every level of the company, nasty surprises are inevitable.
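The "pennies per GB" trap is easy to illustrate with simple arithmetic. The sketch below uses hypothetical per-unit rates; real cloud pricing is tiered, regional, and vendor-specific, so treat the numbers and parameter names as placeholders, not as any vendor's actual price list.

```python
# Minimal consumption-cost forecast. All rates are assumed for
# illustration only; substitute your vendor's actual tiered pricing.

def monthly_cost(storage_gb: float, egress_gb: float, compute_hours: float,
                 storage_rate: float = 0.023,       # $/GB/month (assumed)
                 egress_rate: float = 0.09,         # $/GB transferred (assumed)
                 compute_rate: float = 0.10) -> float:  # $/hour (assumed)
    """Estimate one month of consumption-driven spend for one team."""
    return (storage_gb * storage_rate
            + egress_gb * egress_rate
            + compute_hours * compute_rate)

# One team's modest workload looks cheap in isolation...
one_team = monthly_cost(storage_gb=500, egress_gb=200, compute_hours=720)
print(f"One team:  ${one_team:,.2f}/month")

# ...but 100 teams activating similar workloads is a budget line item.
print(f"100 teams: ${100 * one_team:,.2f}/month")
```

The point of exposing the calculation, rather than the unit rates, is that the multiplication across teams and months is where the surprise lives.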
Control your own data and respect the data governance of your customer.
Data control is the cornerstone of every digital transformation initiative:
- Unless data is accessible, it’s game over.
- Unless data is secure, it becomes a liability.
- Unless data is manageable, it becomes unaffordable.
- Unless data is processed and understood, it’s useless.
- Unless data is streaming, it’s outdated.
- Unless data is geographically identified and regulatory-compliant, it’s a smoking gun.
- Unless data is accessible now, it defeats its very purpose to begin with.
If even one of these conditions is missing, you have already shot yourself in the foot. Streamlining, transforming, and moving to your organization's NextGen solutions requires careful data-pipeline planning across every hybrid infrastructure.
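The checklist above can be enforced mechanically as a gate in a data pipeline, rather than left as a slide-deck principle. The sketch below assumes hypothetical metadata field names (`endpoint`, `encrypted`, `owner`, `schema`, `region`); a real implementation would map these to whatever your data catalog actually records.

```python
# Sketch of a data-governance gate. Field names and the EU-region list
# are illustrative assumptions, not a standard or a vendor API.

REQUIRED_CHECKS = {
    "accessible": lambda d: d.get("endpoint") is not None,   # reachable at all
    "secure":     lambda d: d.get("encrypted") is True,      # not a liability
    "manageable": lambda d: d.get("owner") is not None,      # someone accountable
    "understood": lambda d: d.get("schema") is not None,     # processed, not raw
    "compliant":  lambda d: d.get("region") in ("eu-west", "eu-central"),
}

def governance_violations(dataset: dict) -> list[str]:
    """Return the checklist items a dataset fails; empty means it may ship."""
    return [name for name, check in REQUIRED_CHECKS.items()
            if not check(dataset)]

dataset = {"endpoint": "s3://corp/sales", "encrypted": True,
           "owner": "billing-team", "schema": "v2", "region": "us-east"}
print(governance_violations(dataset))  # → ['compliant']
```

A pipeline that refuses to promote any dataset with a non-empty violation list turns "privacy by design" from a slogan into a build step.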
Current US laws, like the Patriot Act and the CLOUD Act, as well as the EU-US Privacy Shield arrangements, simply cannot guarantee EU data governance as it's presently instituted under the GDPR. As a result, Europe needs to identify and invest in its own cloud infrastructure, transatlantic privacy agreements, and the alignment of data-ownership laws.
Fortunately, European initiatives like GAIA-X and the IDSA have recently been launched in a move to create a data-sovereignty framework. In parallel, they aim to establish regulation supporting the formation of EU-owned datacenter nodes as an alternative to the existing global offerings.
Data ownership and end-user data control, along with concepts like data purses, data-dignity mechanisms, and data lineage, all need to be taken into account when creating new, data-driven organizations. Privacy by design can no longer be an afterthought. It needs to reside at the core of every platform, stack, application, and algorithm design.
While it is certainly a monumental task, it is nevertheless possible. Retrofitting in an attempt to fix game-over mistakes, by contrast, is a waste of time.