Megabytes and algorithms in the insurance industry
Contacts: Karim Derrick
This article was authored by Joe Cunningham, Product Manager, London.
As society transitions from the oil-driven era to the data-driven age (also known as the ‘Fourth Industrial Revolution’), global industries are being shaped by rapid digital transformation. It was estimated that, by the end of 2022, 60% of global GDP would depend on digital technologies.
Examples of digital technologies include mobile technology, connected devices (the Internet of Things) and APIs, which allow the integration of software and the exchange of data. Adopting new digital technologies is only one aspect of digital transformation; it also affects processes, structure, customer service and management, and influences the way we think, make decisions and solve problems.
The increasing availability of data has fuelled growth in data science appointments. In the US alone, data science roles have risen 650% since 2012, and, coupled with increasing computational power, data scientists have been busy creating sets of instructions that use data to solve a problem or complete a task. We refer to these as algorithms.
Algorithms are used for calculating, processing and automating human reasoning. They can be simple or incredibly complex. Artificial intelligence (AI) is the application of these algorithms.
What type of AI is more suitable for the insurance industry?
Any AI system whose operations are not visible to the user can be termed ‘black box’: an impenetrable system whose decisions cannot be explained.
This type of AI is generally unsuitable for application in insurance and legal services, where decision making is often audited and regulated. Kennedys and Kennedys IQ recently articulated their position on AI regulation within the market.
In contrast, users of ‘white box’ AI are provided with an output, an explanation of how the algorithm reached its decision, and a breakdown of the process followed. Transparency is key: white box AI is explainable by design, and it has vast potential to improve processes and assist decision making.
New technologies for old challenges
White box AI will help insurers with the problems of tomorrow by embracing the unstructured data of the past. The portfolio management challenges facing underwriting and claims teams become all too apparent in so-called ‘black swan’ events such as the pandemic and property cladding – from understanding developments in the law and clause interpretation to aiding underwriters with risk diversification strategies.
Unexpected events often result in the creation of voluminous spreadsheets, because data capture is still largely a manual task. Data requirements are only understood in a time of ‘crisis’, which creates many opportunities for accidental data loss, error, inconsistency and a lack of real-time insight. The next generation of technology will enable insurers to efficiently look at the past, in different contexts, by creating new datasets to understand how the challenges of today affect them and their clients.
Intangible risks such as reputation are harder to identify, manage and mitigate. Actuaries are seeking new datasets to help develop their pricing and reserving models. Claims teams are often tasked with understanding a portfolio of claims beyond transactional day-to-day claims handling. Underwriters are looking to understand their clients beyond what is supplied in a loss run or claims history. The growth of digital transformation, the data generated within organisations and the increasing amount of publicly available data together provide an opportunity for the next generation of insurance technology to challenge the status quo.
In 2022, it was estimated that 97 zettabytes of data was created globally.
What is a zettabyte?
A zettabyte is a unit of measurement used to describe a computer or other device’s storage capacity. One zettabyte is approximately equal to 1,000 exabytes, or 1 billion terabytes; 1 terabyte is equal to 1,000 gigabytes. Most modern laptops start at a capacity of 256 gigabytes.
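To put these units in perspective, a short sketch (in Python, purely for illustration) converts between the scales mentioned above, using the 256-gigabyte laptop from the text as a yardstick:

```python
GB_PER_TB = 1_000          # 1 terabyte = 1,000 gigabytes
TB_PER_ZB = 1_000_000_000  # 1 zettabyte = 1 billion terabytes

laptop_gb = 256            # entry-level laptop capacity quoted above

# How many such laptops would a single zettabyte fill?
laptops_per_zb = (TB_PER_ZB * GB_PER_TB) // laptop_gb
print(f"{laptops_per_zb:,} laptops per zettabyte")  # 3,906,250,000
```

In other words, one zettabyte is the combined storage of nearly four billion entry-level laptops.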
By 2025, this number is predicted to exceed 181 zettabytes.
These are overwhelming numbers, representing an increase of almost 90% between 2022 and 2025. Ten years ago, the volume of data created, captured, copied and consumed worldwide was estimated to be just 9 zettabytes. This illustrates the exponential trajectory of the last decade, which shows no sign of slowing down.
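The headline figures above can be checked with a little arithmetic; a minimal Python sketch, using only the volumes and years quoted in the text:

```python
data_2022_zb = 97   # zettabytes created globally in 2022 (estimate)
data_2025_zb = 181  # predicted volume by 2025

# Total growth over the three-year period
growth = (data_2025_zb - data_2022_zb) / data_2022_zb
print(f"2022 to 2025: {growth:.0%} increase")  # about 87%

# Equivalent compound annual growth rate
cagr = (data_2025_zb / data_2022_zb) ** (1 / 3) - 1
print(f"Annualised: {cagr:.0%} per year")  # about 23%
```

That is, the predicted jump from 97 to 181 zettabytes corresponds to data volumes growing by roughly a quarter every year.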
Data is, and will be, everywhere, ready to be utilised in ethical, practical and sensible ways. The use of technology in the insurance sector will be an evolution, not a revolution, and is not something to be viewed with trepidation.
Within the insurance sector, AI has often been associated with personal lines or confined to actuarial science. However, by appreciating the distinction between black box and white box AI, it can be seen how the latter can assist in meeting the demands and challenges throughout many other segments of the insurance lifecycle, be it general or specialty insurance.
In public discourse, little effort is made to define what is meant by AI. Coupled with negative sentiment around the threat AI poses to jobs and the issue of bias, this results in an entire field being misunderstood and incorrectly associated only with technology that adopts ‘black box’ algorithms. Yet a consequence of digitisation is the generation of data, which presents an opportunity for insurers to understand their risks and clients better than ever before.