How things might change

Having had to worry about overloaded threads and overloaded operators in other contexts, one finds the picture above a reminder of a great deal.  Rather than delay the adoption of a model to await incremental insights or trend maturation, the graphic was simply loaded with what different perspectives had on offer.  One is reminded of an advertising campaign based on ‘just do it’.

As a kind of prescient cloud native architecture, a great deal was condensed into a single manifestation of, well, manifests.  Now that we have a way to implement manifests and load threads and operators at the flick of a script, why not presume we can build something as richly augmented with value as the picture that helps us recall past flirtations with similar topics?

And that is the question.  Can we both see into, and see beyond, our past well enough to pull off what the current level of software tooling truly anticipates?


Certainly, this work has at least danced into the fray on behalf of those who might find it a bit interesting.  Getting more serious requires that the interesting features of those threads and operators take more of a key role in the orchestration of solutions.  We can start here with an interpretation of how the baseline for that picture can provide the reference architecture for cloud native evolution.

The Open Group’s IT4IT offer set includes a run-time design for DevOps, DevSecOps, and the development of a model-based platform engineering program.  If we believe that software can drive the ‘as Code’ trend further and further in support of digital transformation, then software that helps build software becomes software that determines what software we can build from a given set of principles.  Certainly, there’s the 12-factor app as a principle.  Certainly, cyber-physical systems are a principle.  And with cloud native offerings as a principle, we can use Kubernetes as the foundation upon which to construct the manifest of future change.  Happenstance, such happenstance.
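To make the ‘software that helps build software’ idea concrete, here is a minimal sketch: a script that emits a Kubernetes Deployment manifest rather than having one written by hand.  The application name and container image are illustrative placeholders, not recommendations, and since kubectl accepts JSON as readily as YAML, the standard library is enough.

```python
import json


def deployment_manifest(name: str, image: str, replicas: int = 2) -> dict:
    """Build a minimal Kubernetes Deployment manifest as plain data.

    The name and image are illustrative placeholders; a real generator
    would also layer in labels, policy, and security context 'as Code'.
    """
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [{"name": name, "image": image}],
                },
            },
        },
    }


if __name__ == "__main__":
    # kubectl apply -f accepts JSON manifests as well as YAML ones.
    print(json.dumps(deployment_manifest("demo", "nginx:1.25"), indent=2))
```

The point of the sketch is only that the manifest becomes an output of software, so the generating script, not the file, is the thing a team versions, reviews, and secures.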

Maybe it’s just easier to see things coming together as we see more.  The more we see of cloud native, the more we see of ourselves in our work.  Or, the more we see our work in ourselves.  We can start from data-driven.  We can start from modernization.  We can start from software bills of materials.  We can start from digital twins.  With those as starting places, nearly any vision from any perspective can be woven into the mashup of design tooling available.  That is what we will show in the next graphic.

Data Driven view from: The data-driven enterprise of 2025 | McKinsey; accessed 4 April 2023

Digital Twin view from: Digital Twins Capabilities Periodic Table; published by the Digital Twin Consortium


Pure data organizations, and we will see more of them, will want to move from the lower left around the graphic clockwise to the upper right.  Pure process organizations will want to move from the lower left counter-clockwise to the upper right.  Most organizations are going to move from their as-built environments to their as-envisioned versions by adopting some data-driven trends and some process-driven trends.  Process-driven trends are more and more going to become digital twins of manual processes, or of human-to-machine interactions which have no prior existence.

In order to provide a template for best practices when moving clockwise or counter-clockwise from the lower left, analogies based on medical policies and procedures are helpful.  The medical field has templates for all phases of market orchestration because of the critical importance of patient health and the need to constrain costs wherever possible.  As a result of these twin governance guardrails, Meaningful Use, Accountable Care (here, Accountable Distribution), Personalized Health, and Population Health have all become practices with sufficient definition to support adoption by analogy.  Here, Personalized Health becomes Personalized Experience and Population Health becomes Demographic Demand.

What this model provides is a way to surround an embedded ‘as Code’ generator with stakeholder value at every stage of investment.  Typically, investment return is delayed until the end of some long spell of effort, expense, and trial and error.  By adopting the 12-factor app and the use of minimum viable products at each stage, phase, or level of development, we can accrue value to the stakeholders in a more agile manner.
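As one concrete 12-factor habit, factor III keeps configuration in the environment, so each stage of a minimum viable product can be released without code changes.  A minimal sketch follows, with `DATABASE_URL` and `FEATURE_FLAGS` as hypothetical variable names for illustration only:

```python
import os


def load_config(env=None):
    """Read configuration from the environment (12-factor, factor III).

    DATABASE_URL and FEATURE_FLAGS are hypothetical names for this sketch;
    the point is that each deployment stage supplies its own values, so the
    same build moves from MVP to production untouched.
    """
    env = os.environ if env is None else env
    return {
        # Fall back to a local development default when the stage sets nothing.
        "database_url": env.get("DATABASE_URL", "sqlite:///dev.db"),
        # Comma-separated flag list; empty string means no flags enabled.
        "feature_flags": [f for f in env.get("FEATURE_FLAGS", "").split(",") if f],
    }


if __name__ == "__main__":
    staged = {"DATABASE_URL": "postgres://prod/db", "FEATURE_FLAGS": "beta,audit"}
    print(load_config(staged))
```

Because the build artifact never changes between stages, value can accrue to stakeholders at each stage rather than only at the end.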

The key to any of this is first to enable teams to define what is truly needed and will be used for an extensive amount of time, and then to make that need secure within the process of its creation.  We have the opportunity to apply several techniques analogous to the McKinsey data portfolio study and the Digital Twin Consortium’s Capabilities Periodic Table designs.

Those include the use of the Cyber-Physical Systems model and the Open Group’s IT4IT portfolio, including a reuse of the first graphic in this Note.  Architecture, development, security, and operations teams all need their own version of scalable, efficient, and extensible experiences.


The breadth of the Cloud Native Computing Foundation’s catalog is a source of sources for replicating design patterns, supply chain logistics, manufacturing excellence, and responsible self-service.  These principles and techniques enable us to fill in the central part of the second graphic, and to develop a road map to Composable Edge 4.0.

