Dr Oishee Kundu is a Research Associate at the Discribe Hub, a social science-led digital security programme funded by UKRI’s Digital Security by Design initiative.

The inherent insecurity of interconnected digital environments has been known since computers were first networked in the 1970s. But who is responsible for security in the digital world?

The digital world has since expanded and become ubiquitous, intrinsically linked with almost all aspects of our personal and professional lives. We rely on digital devices and the internet to communicate, conduct financial transactions, look for information, learn something new, or even find love. In his book ‘What is cybersecurity for?’, Tim Stevens writes that “the task of cybersecurity is to help manage that vulnerability [which emerges from our dependence on networked digital devices] in such a way as to capitalize on the benefits of digital transformation, rather than to buckle under its more problematic aspects.”

In light of this, cybersecurity has typically relied on a combination of three types of solution: physical, i.e., the physical security of devices and systems; technical, such as authentication and access control; and human-centric, such as employee training and public awareness campaigns. However, it is difficult to judge whether such an approach is sufficient or sustainable in the face of the relentless expansion of the digital world and the growing threat of cyberattacks from state and non-state actors.

We are all on the frontline in a cyberwar.

Lessons from the past

How have societies responded to technologies which carry existential threats? In 1955, the UK government commissioned a report on the possible impact of nuclear conflict. It made many recommendations – building shelters, stockpiling food and raw materials, geographically dispersing industries – but none were acted upon, as they were deemed “too expensive”. At the same time, funding was found for bombs and submarines, in line with the policy of deterrence through mutually assured destruction. In the face of growing insecurity, people were expected to look out for themselves, as historian Florence Sutcliffe-Braithwaite suggests. Hence the proliferation of ‘Protect and Survive’ booklets, private companies selling survival kits and shelters, and so on.

The cyber-equivalent of this is a societal investment in AI and quantum computing whilst ignoring questions of governance and security in digital systems and associated technologies. Perhaps the historical precedent of prioritising technological progress over people makes it difficult to create regulations or programmes of work that protect people? However, there is an ongoing UKRI programme on the digital security challenge which aims to tackle security issues at the design level and create a more secure hardware and software ecosystem. Digital Security by Design began in 2020 as an Industrial Strategy Challenge Fund project and has been building and testing a prototype processor (the Morello board) which incorporates research conducted at the University of Cambridge on new ways of designing hardware with compartmentalisation capabilities (an approach termed “CHERI”). The objective is to address a whole class of vulnerabilities in the digital world at the hardware design or architecture stage itself.

But how to take an innovation from page to stage?

If you build it, they will come…?

Changing a foundational technology like the processors which go into digital devices requires a degree of market coordination that appears difficult to achieve. The first barrier to adoption appears to be the nature of cybersecurity itself. In economics, the term “credence goods” describes products and services whose qualities cannot be observed, even after consumption or purchase. We are rarely aware of the cyberattacks mitigated by our phone’s security features, but we do notice when the battery drains quickly or the phone operates more slowly. If consumers are unwilling to pay for a technology whose value they cannot see, suppliers will be unwilling to build it into their products.

Secondly, in the case of the Morello board, we must also confront the highly specialised (and fragmented) nature of the semiconductor industry, in which a few companies dominate a complex network of global outsourcing and interdependencies to produce chips. Taiwan Semiconductor Manufacturing Co. fabricates more than 90% of the most advanced chips, using machinery produced primarily by just five companies around the world. The key industrial partner in the Digital Security by Design programme – Arm Limited – has built the Morello board, but Arm is in the business of designing chips, not manufacturing them. The market failure might persist if a manufacturer does not emerge: manufacturing at scale will only be viable if there is strong demand for secure-by-design chips, but demand itself will only emerge if Morello boards are manufactured at scale (a “chicken and egg” problem, some may say).

The third problem (and this is by no means an exhaustive list!) is the business model for digital devices and, subsequently, for the processors which power them. Gone are the days when there were few networked systems and thus a few large customers – particularly the defence department – who could drive demand for a technological transition. In 2025, smartphones are expected to be the leading end market for the global semiconductor industry, and it is this fragmented user base which is yet to signal to the wider industrial ecosystem that it would like memory safety built into its devices. No single organisation can make a difference on its own, and governments across the world are trying to draw attention to the importance of secure by design and of addressing memory safety.

Imagining secure digital futures

The 1970 Ware report suggested designing cybersecurity from the ground up and involving all parts of the ecosystem – hardware, software, operators, users, and so on. The 2022 UK Cyber Strategy mentions “a whole-of-society approach to cyber”. However, truly participatory approaches are still underdeveloped, and perhaps that is what is missing in creating a mindset shift and transitioning to secure digital futures. At its most basic, social engagement could be a simple marketing exercise for new technology (in this case, secure-by-design or memory-safe chips), but a more meaningful exercise would involve societal actors imagining and designing secure digital futures themselves. Participatory research may hold the key to the development and diffusion of the next generation of secure digital technology, and we must all get involved in this process.

After all, if you are using a digital device (such as the one on which you are reading this), you too are on the frontline of a cyberwar.

All articles posted on this blog give the views of the author(s), and not the position of the IPR, nor of the University of Bath.

Posted in: AI, Culture and policy, Data, politics and policy, Emerging technologies, Science and research policy, Security and defence
