Here at OBASHI, a point we make regularly is that, as governments, businesses and individuals, we are increasingly reliant on flows of data in nearly all aspects of our daily lives. This means it is becoming ever more essential that standards for these flows of data are agreed and regulated.
Why would standards for data flows be a good idea?
Think about your daily interactions with other critical flows: Oil & Gas, Electricity and Water. Do you hesitate to ignite a gas ring on the cooker? Are you scared of electrocution when you switch on a television? Do you fear being infected by the water when you wash your hands?
No. This is because you are subtly aware that, over decades, these flows have been engineered using standards which minimise the risk of them causing you harm. The same confident mindset is present in businesses, even though interactions with, and between, these flows are exponentially more complex.
Fundamentally, what such standards provide are statutory requirements on how things should be safely put together to enable the different kinds of flow.
Standards require, for example, that the diameter of oil rig pipes be measured to extremely fine tolerances; that electricity cables be insulated with appropriate materials; and that water purity be tested at regular intervals.
- Offshore oil & gas platforms standards (EU)
- Distribution of electricity on construction and building sites (UK)
- Water Quality Standards (US)
Under such standards every system and asset change is viewed and analysed, with flow at the heart of the decision making process.
A key point about standards is that they reduce risk, and thereby help create trust. That is why standards can be good for business – where there is trust people are more likely to buy your products or use your services. And, of course, the converse is true.
This brings us to trust and flows of data.
In these adolescent years of the digital age, how much trust is there in today’s data-flow-reliant businesses?
Convincingly answering such a broad question is difficult, but it is clear that confidence in the business systems that enable data flow in today’s organisations is being undermined by an increasing number of outages, glitches and cybersecurity breaches.
A key explanation for these ongoing problems is that although today’s businesses rely on flows of data, there are currently no standards governing how things are put together to enable data flow through and between businesses.
At a time of dramatic increases in organisational connectivity and complexity, there is not enough clarity on how the individual resources of the business – people, process and technology – are put together to enable each flow of data.
This lack of clarity makes it difficult for organisations to make the best informed decisions on mitigating business risks related to data flow. As a result, businesses are vulnerable to interruptions to flows of data, whether caused by accident or by design.
Today’s business systems are therefore insufficiently resilient, and as a consequence consumers are becoming well acquainted with ‘data flow disasters’. These are events like regular bank glitches halting financial services for hours, customer data being stolen remotely from retailers, and transportation systems grinding to a halt because of software errors.
Cybersecurity incidents in particular are focussing minds in governments and businesses on the risks associated with IT.
For example, the “Framework for Improving Critical Infrastructure Cybersecurity” was published recently by the U.S. National Institute of Standards and Technology.
Created after a year-long consultation between the public and private sectors in America, the document, ‘sets out voluntary standards and guidelines that organisations can use to assess and improve cyber security policies’, and explains that,
“Cybersecurity threats exploit the increased complexity and connectivity of critical infrastructure systems…cybersecurity risk affects a company’s bottom line. It can drive up costs and impact revenue. It can harm an organization’s ability to innovate and to gain and maintain customers”
What is especially interesting about this breakthrough document is that in its “Framework Core – Identify function” section, under ‘Asset Management’ it recommends that,
“The data, personnel, devices, systems, and facilities that enable the organization to achieve business purposes are identified and managed consistent with their relative importance to business objectives and the organization’s risk strategy.” [p.20]
And further, crucially, that
“Organizational communication and data flows are mapped”
In effect, what the new US Government framework makes clear is that effective cybersecurity is an impossible task without understanding the flow of data.
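What does mapping a data flow look like in practice? One way to think about it is as a directed graph: each edge records that data passes from one business asset to another, and from that map you can trace everything downstream of any single asset. The sketch below is a minimal illustration of the idea only – the asset names and flows are invented, and real mapping methods (including OBASHI’s own) are far richer than this.

```python
# A toy data-flow map held as a directed graph. All asset names and
# edges here are hypothetical, chosen purely for illustration.
from collections import defaultdict

# Each edge says "data moves from this asset to that asset".
edges = [
    ("Sales Clerk", "Order Entry App"),
    ("Order Entry App", "Orders Database"),
    ("Orders Database", "Reporting Server"),
    ("Reporting Server", "Finance Team"),
]

# Build an adjacency list: asset -> assets it sends data to.
graph = defaultdict(list)
for src, dst in edges:
    graph[src].append(dst)

def trace_flow(start):
    """Return every asset reachable from `start` – i.e. everything a
    disruption at `start` could affect downstream."""
    seen, stack = [], [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.append(node)
            stack.extend(graph[node])
    return seen

print(trace_flow("Order Entry App"))
# Shows that the Orders Database, Reporting Server and, ultimately,
# the Finance Team all depend on the Order Entry App.
```

Even a sketch this simple makes the cybersecurity point concrete: once flows are mapped, the blast radius of an outage or breach at any one asset becomes a question you can actually answer.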