2000 – 2020: The 20th Year of Developing and Offering Cyber Risk Quantification – THE STORY
In 1999, cyber threats were known, but were seen simply as an I.T. department responsibility to “solve” through its opaque and (to Board Members!) uninteresting ways.
This retrospective view, from 1999 to the present, outlines the challenges and shifts we have experienced within the domain of cyber risk quantification.
The technologies we developed at the time were fit for purpose within the very same segments, and in the same manner, as they are in 2020. Twenty years of supposed “evolution” should be tempered by the simple fact that direct and regulatory financial impact has been the force of change.
Many entities still rely upon “autopsy” risk management, and whilst Deep Learning and AI, coupled with big data and 5G, will create opportunities, they will also exponentially increase the potential for threats. Will we experience the same cycle of hindsight cyber and data risk management as in the past two decades?
We continue our 20 years of research and development to improve global cyber risk management.
Dr. Phillip King-Wilson
Managing Director, Quantar Solutions Limited
1999 The Beginning
During 1999, two security developers, playing around with network taps in an out-of-hours session, discovered a method of covertly acquiring network traffic for security analysis. A third member of the group approached a friend, who was working as an international director of cyber programs for an international group of insurance and banking companies, to ask his opinion on what this could be used for.
Working in the international banking and insurance sectors, defining future internet strategies for the group, the friend's reply was that it could be used to quantify the risks posed to a corporation's computer network. It could be used to enhance security, to justify I.T. security budgets, and to underwrite cyber insurance covering such risks; for banks, it could be used under Basel II to achieve reduced retained risk capital levels (the advanced measurement approach of the Regulator, the BIS, required banks to quantify all operational risks, including I.T. risks).
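To illustrate the kind of quantification the Basel II advanced measurement approach called for, the following is a minimal sketch of a loss distribution approach, not Quantar's actual models: annual event counts are drawn from a Poisson distribution, per-event severities from a lognormal, and a high quantile of the simulated aggregate annual loss is read off as a capital estimate. All parameter values here are hypothetical.

```python
import math
import random

def sample_poisson(rng, lam):
    # Knuth's method: count uniform draws until their product falls below e^-lam
    threshold = math.exp(-lam)
    count, product = 0, 1.0
    while True:
        product *= rng.random()
        if product <= threshold:
            return count
        count += 1

def simulate_annual_losses(n_years=50_000, freq_mean=12,
                           sev_mu=9.0, sev_sigma=1.5, seed=7):
    """Aggregate annual loss: Poisson event count, lognormal severity per event."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_years):
        events = sample_poisson(rng, freq_mean)
        totals.append(sum(rng.lognormvariate(sev_mu, sev_sigma)
                          for _ in range(events)))
    return totals

def capital_at(totals, quantile=0.999):
    """Read a high quantile of the simulated loss distribution as a capital figure."""
    ordered = sorted(totals)
    return ordered[int(quantile * len(ordered)) - 1]

losses = simulate_annual_losses()
print(f"mean annual loss: {sum(losses) / len(losses):,.0f}")
print(f"99.9% capital estimate: {capital_at(losses):,.0f}")
```

The gap between the mean loss and the 99.9% quantile is what drives the retained-capital argument: the fatter the severity tail, the more a bank gains from being able to model it credibly rather than hold a blunt regulatory charge.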
The problem in 1999 was that there were no real cyber insurance policies available. I.T. security was viewed by Boards as a cost centre, and Bank Boards were occupied with the mainstream exposures of market and capital risk rather than the then-minor I.T. risks. Quantifying and revealing risk exposure were viewed as negatives, creating an accountability and publicity problem for executives if a cyber-attack succeeded.
2000 – 2004 A Solution Seeking a Problem?
Despite the seeming setback of there being no market, meetings were held with top-level companies identified as potentially interested in the technology. There was a sustained belief that, because cyber risks were so deeply impactful across such a broad scope of business operations, there must be a use for the invention.
In March 2000, meetings with a major German reinsurer, Gothaer Re, in Cologne, revealed a need for accurate cyber risk valuation. Corporate clients were increasingly requesting cyber exposure cover, but the absence of such a product made fulfilment difficult. Gothaer Re provided such cover through the general insurance pool and used the German I.T. security company Secunet AG as part of the underwriting and claims process. This was the first indication of a use for the technology.
With further meetings held during 2001 with global companies SAS Software, Munich Re, Swiss Re, Willis, Marsh and Oracle, albeit under NDAs, it was clear that the invention needed protection. A patent was drafted and filed in January 2002 and granted in September 2005. The time and financial burden of patent prosecution to allowance took its toll on the team, and only one of the original four continued, away from the original company.
With the general realization by corporations of the true impact of a successful cyber-attack during the mid-2000s, refinements and additions to the original technology were made. The original network threat component was developed to ensure that the threat data could not be compromised, since dependency upon it for quantification required accuracy and integrity for it to be of use.
Data analytics were therefore developed to utilise the acquired proprietary network traffic of each client. Technology and software analysis and R&D were contracted out to a U.K. university commercial arm, Loughborough University Enterprises Limited. This research entailed detailed analysis of items such as packet drop-rates per technology type, to ensure that what was modelled by the developing front-end technology would be usable by a number of sectors.
Data Analytics and Predictive Modelling Development 2005-2009
Drawing from experience within the international re/insurance and banking sectors, suitable consultants were identified to develop algorithms for the front-end systems between 2005 and 2009. As with the use of a well-respected university for R&D, and given the global absence of company experience in the cyber risk quantification segment in 2005, contracts for the development of the first models were placed with experts in military simulation development whose work was used by a number of military entities. The models were delivered for field trials with a major European insurance group in 2006.
Upon completion of the first front-end products, a secondary product line was developed in conjunction with a multi-award-winning actuarial consultancy based in the global insurance capital, London, U.K. The algorithms were reviewed and new predictive models were created for implementation into the second set of front-end products. These were developed from the output models into software versions by a company renowned for risk management within the banking and financial services sector, whose tools have been used since the post-financial-crash period by the Royal Bank of Scotland and the Abu Dhabi Investment Authority, among others, for portfolio risk quantification.
Market Testing & Product Launch 2009 – 2010
With the market for cyber insurance still undeveloped, and coverage limited to E&O and risk-financing types of products from the likes of CNA, Marsh and Swiss Re, the decision was taken to test different markets with the products.
As a result of feedback from exhibition stand visitors, further enhancements and user-friendly features were developed for the products. These included better threat trend analysis, red/amber/green warning “traffic lights” for ease of understanding, and risk threshold settings. Inputs were taken from the likes of the FBI, CERT UAE, Emirati ministers and Northrop Grumman.
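The traffic-light idea can be sketched as a simple threshold mapping. The thresholds and score scale below are hypothetical placeholders, not the products' actual settings:

```python
def rag_status(risk_score, amber_threshold=40.0, red_threshold=70.0):
    """Map a numeric risk score onto a red/amber/green traffic-light band."""
    if risk_score >= red_threshold:
        return "RED"
    if risk_score >= amber_threshold:
        return "AMBER"
    return "GREEN"

# A user-configurable risk threshold setting simply shifts the band boundaries.
print(rag_status(25))                        # below both thresholds
print(rag_status(55))                        # between the thresholds
print(rag_status(90, red_threshold=85.0))    # above a tightened red threshold
```

The value of the presentation layer is that a Board member reads a colour band, while the underlying quantification remains available for the analyst.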
During the period of product development and market testing, further patents to protect the front-end applications were created by the CEO and filed in 2005. These had prolonged prosecution periods, finally resulting in allowances from 2015 onwards, with continuations being maintained to prevent competitors engineering around them – a strategy that continues today, maintaining the original patent priority dates over any competitor patents filed since then.
In 2008-9, the reach of cyber risk was deep and broad, covering underwriting, I.T. security, business continuity and reputational risk management. Exhibitions were used to identify demand and included the Business Continuity Institute (BCI) conference in London in November 2009, with Quantar as a bronze event sponsor; the International Security and National Resilience (ISNR) conference in Abu Dhabi in March 2010; and the Risk and Insurance Management Society (RIMS) event in Boston, U.S. in April 2010.
The patent prosecution was prolonged because of the fundamental risk management methodologies embodied within the front-end applications' specifications and claims. Mapping dependencies between systems, business processes and categories is fundamental to the management of operational risks. Adding a cyber component at the time of filing in 2005 was ahead of its time, and although it required extensive capital investment and personal commitment, these patents now protect the products regardless of patent law and rule developments since filing.
Following on from the ISNR in Abu Dhabi, the company was invited to present to various government and commercial entities in Kuwait in 2011. These included the National Bank of Kuwait (NBK), the National Guard and the Central Agency for Information Technology (CAIT).
It was clear from the discussions that cyber risk quantification required inputs demanding a certain level of operational maturity to function, and 2011 was still early for many organizations at home and abroad. As a result, a focus upon new model development was undertaken in 2012-13, with the company also commencing divestment and joint venture discussions while relocating to the U.S.
The Emerging Cognition of Cyber and Regulation 2011 Onwards
Although hacks and data breaches have had financial impact since the earliest days of computer networks, the growing ubiquity of the public internet gave rise, increasingly from 2010, to highly publicised attacks, in particular the Stuxnet attack, which forced governments and corporate heads to recognise that cyber had become truly weaponised.
The shift from cyber being perceived as “an I.T. security problem” to a Board problem took only a short period, following the growth in frequency of data losses and reputational impact. It was most probably the Sony hack of 2014, demonstrating the power of remote attacks to inflict substantial losses on a major U.S. corporation, that created the paradigm shift in corporate perception.
|Year|Example Hacks 2010-Present|
|2010|Stuxnet virus attack on Iranian nuclear program: the first strike by computer?|
|2011|June hack of Washington Post netted 1.27M user account details|
|2012|Utah Department of Health: records of 780,000 Medicaid patients stolen by a hacker from Eastern Europe|
|2013|Target to pay $18.5M for 2013 data breach|
|2014|Sony Pictures hack cost the movie studio at least $15 million|
|2015|IRS: thieves may have stolen info from 220,000 additional tax accounts|
|2016|Yahoo hack: 1bn accounts compromised|
|2017|Home Depot to pay $27.25M in latest data breach settlement|
|2018|Hackers stole data of PM Lee and 1.5 million patients in ‘major cyber-attack’ on SingHealth|
|2019|British Airways faces record £183m fine for data breach|
The types and geographic locations of targets, the types of attack and the types of losses have now placed data, and its actual financial, strategic and personal value, at the head of the list of what corporations need to identify and protect.
Increasing data losses and the use of personal data resulted in a change in European Commission attitudes to the value and privacy that should be attached to personally identifiable information. The General Data Protection Regulation of 2016 (in force 2018) may be cited as a regulatory change with global reach, but other regulations on a per-country basis have also acted as trigger points for companies to seek compliance.
The major difference between today's increasingly data-regulated world and prior laws is that the onus is now upon the data user to irrefutably demonstrate compliance. A data breach is now the trigger for data regulators to investigate and demand proof of both the intent to comply and actual compliance.
Our original line of thought was to use our products to assist in management of risk and in meeting criteria for banking regulations emanating from 2004 (Basel II).
GRC then seemed a good fit, since it was a trend pushed in the business media and compliance was increasingly visible as a requirement.
In 2012, the International Organization for Standardization (ISO) published the 22301 standard for business continuity. Since process dependency upon I.T. systems falls within this domain, so too did our products.
As financial losses from cyber-attacks, in particular cyber-extortion, have grown, the demand for insurance coverage has similarly increased. Since underwriting cyber risk remains problematic and the CyCalc Informed Insurance® products still serve this market segment as originally envisaged, the market has become more attractive for others to enter.
Our competitors have joined the cyber risk quantification bandwagon, especially since 2016, seeking markets and segments within them in the same way that we have from the outset in 2000. We remain, however, the company with the patents and priority dates that count, with published works in academic journals on modelling cyber risk (for example, a 2018 publication using epidemiological models for cyber), and with the real-world experience of multiple markets and geographies from two decades of cyber risk quantification.
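As a rough illustration of how epidemiological models transfer to cyber (the cited 2018 publication's actual models are not reproduced here), the classic SIR compartment model can describe malware spreading through a host population: susceptible hosts become infected in proportion to contact with infected hosts, and infected hosts are cleaned up at a fixed rate. All parameter values below are invented for the sketch.

```python
def sir_step(s, i, r, beta, gamma, dt=1.0):
    """One forward-Euler step of the SIR model (susceptible, infected, recovered)."""
    new_infections = beta * s * i * dt   # infection pressure: contacts between S and I
    new_recoveries = gamma * i * dt      # remediation/clean-up of infected hosts
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

def simulate(n_hosts=10_000, initial_infected=10,
             beta=3e-5, gamma=0.1, steps=200):
    """Run the SIR dynamics and return the trajectory of (S, I, R) states."""
    s, i, r = float(n_hosts - initial_infected), float(initial_infected), 0.0
    history = [(s, i, r)]
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma)
        history.append((s, i, r))
    return history

trajectory = simulate()
peak_infected = max(i for _, i, _ in trajectory)
print(f"peak simultaneously infected hosts: {peak_infected:,.0f}")
```

The attraction of this family of models for risk quantification is that the peak infected count and the final recovered count map naturally onto worst-case simultaneous outage and total remediation cost.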
Our Top 10 Competitors in Cyber Risk Quantification
|Company Name|CRQ Launch|CRQ Patents|
|Dell RSA Archer (supplies RSA Archer CRQ)|2018|No|
|Secure Systems Innovation Corp (supplies Unisys TrustCheck)|||
|Cyence (now Guidewire)|2016|Not CRQ|
|Nehemia Security|2015|Not CRQ|
|Bay Dynamics|2001|Not CRQ|
Management consultancy McKinsey has claimed that data is the new oil. In the same way that the extraction and use of that fossil fuel are increasingly controlled and prohibited, so too will be the acquisition, combining and use of data. The current and future trend is towards data governance and master data management, and we are already engaged in these domains.
Our commitment to leading research, to cyber risk quantification software product development, and to providing consultancy in emerging technologies such as Deep Learning, AI, Big Data and Distributed Ledger has been proven over the past 20 years and remains the mission of our company.