Insurers’ exploration of distributed ledger technology (DLT), commonly referred to as blockchain, continues to expand. Last month, AIG announced a partnership with IBM and Standard Chartered Bank P.L.C. to test a “smart contract” insurance policy. The Blockchain Insurance Industry Initiative, B3i, formed last year, recently expanded to 16 members, including Munich Reinsurance Co. and Swiss Re Ltd., as well as Aegon N.V., Allianz S.E., Assicurazioni Generali S.p.A., Hannover Re S.E., Liberty Mutual Insurance Co., XL Catlin, and Zurich Insurance Group Ltd.
Industry analysts suggest that insurers can use DLT in reinsurance contracts, wholesale insurance products, claims management, reserve calculation, automated notice of claims and losses, evolving underwriting models, and fraud detection. DLT could increase administrative efficiency and reduce administrative overhead, improve underwriting accuracy and claims management efficiency, and provide access to new insurance markets.
These benefits, if realized, could substantially increase insurance company profitability. DLT could profoundly impact the bottom line of early-adopter insurance companies, even if those companies only scratch the surface of DLT’s potential uses.
DLT’s emergence poses both opportunity and risk for policyholders. Savvy policyholders should:
- Plan for potential claims automation.
- Assess the future value of source-level data for risk assessment.
- Develop strategies for controlling access to risk assessment data to maximize coverage benefits.
How Does DLT Benefit the Insurance Industry?
At its most fundamental, DLT is a customizable peer-to-peer digital ledger. DLT allows users to efficiently share information at a previously agreed-upon level of automation. Before implementation, users agree on DLT’s key features, such as information sources, transfer methods, triggering events for information or payment transfer, and rules for recordkeeping on the ledger.
DLT allows users to add or remove functionality based on their needs. Insurers may not require immutability (a key characteristic of the bitcoin blockchain), in which past entries, once added to the ledger, can never be changed.
Bitcoin requires immutability to operate as a currency. Without it, users could spend the same bitcoin more than once, much like forwarding an email. In contrast, a permissioned blockchain privately operated between a group of insurers and reinsurers may not require immutability.
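The double-spend point above turns on how a blockchain enforces immutability: each entry incorporates a hash of the entry before it, so altering any past record invalidates every record that follows. The sketch below illustrates that mechanism only; the entry format and helper names are invented for this example and do not describe any insurer’s actual ledger.

```python
import hashlib
import json

def entry_hash(entry: dict, prev_hash: str) -> str:
    """Hash this entry together with the previous entry's hash,
    chaining each record to its predecessor."""
    payload = json.dumps(entry, sort_keys=True).encode() + prev_hash.encode()
    return hashlib.sha256(payload).hexdigest()

def build_ledger(entries: list[dict]) -> list[dict]:
    """Append entries to a hash-chained ledger (illustrative only)."""
    ledger, prev = [], "0" * 64
    for e in entries:
        h = entry_hash(e, prev)
        ledger.append({"entry": e, "prev": prev, "hash": h})
        prev = h
    return ledger

def verify(ledger: list[dict]) -> bool:
    """Recompute every hash; any tampering with a past entry
    breaks the chain from that point forward."""
    prev = "0" * 64
    for block in ledger:
        if block["prev"] != prev or entry_hash(block["entry"], prev) != block["hash"]:
            return False
        prev = block["hash"]
    return True
```

A permissioned ledger among known insurers and reinsurers could relax this property, for example by allowing a quorum of participants to amend past entries.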
Insurers may customize DLT to use an “oracle” to trigger a blockchain event. An oracle is a mutually agreed-upon source of data (usually from a trusted third party). For example, a “smart” property policy could use National Earthquake Information Center (NEIC) data as an automated trigger to invoke or ignore an earthquake exclusion in a property loss claim. Similarly, a property policy could use a fire alarm as an automated trigger for notice of a fire-related insurance claim.
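As a rough illustration of how an oracle-driven trigger might be encoded, the sketch below checks hypothetical seismic readings against negotiated thresholds. The `QuakeReading` record, the magnitude and distance thresholds, and the 72-hour window are all invented placeholders for terms the parties would agree on in advance, not actual NEIC data fields or policy language.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class QuakeReading:
    """Hypothetical oracle record for one seismic event."""
    magnitude: float
    timestamp: datetime
    distance_km: float  # distance from the insured property

def earthquake_exclusion_applies(
    loss_time: datetime,
    readings: list[QuakeReading],
    min_magnitude: float = 4.0,        # illustrative negotiated threshold
    max_distance_km: float = 100.0,    # illustrative negotiated threshold
    window: timedelta = timedelta(hours=72),  # illustrative lookback window
) -> bool:
    """Return True if any qualifying quake occurred near the property
    within the agreed window before the loss."""
    return any(
        r.magnitude >= min_magnitude
        and r.distance_km <= max_distance_km
        and timedelta(0) <= loss_time - r.timestamp <= window
        for r in readings
    )
```

The point is that every parameter in such a trigger is a negotiable policy term: a lower magnitude threshold or a longer window widens the exclusion’s automated reach.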
While DLT’s insurance applications are still in their infancy, insurers will likely customize their DLT applications to accomplish four goals:
- Efficiently gather and store information from policyholders (for both applications and claims).
- Automate portions of claims management.
- Automate and improve recordkeeping and data access.
- Share necessary information with reinsurers.
By customizing DLT, insurers hope to improve risk assessment, reduce overhead, and improve reinsurance outcomes by laying off appropriate levels of risk. The primary benefits to insurers are improved data gathering, data utilization, and operational efficiency, each of which should translate into better earnings or lower expenses.
How Does DLT Impact Policyholders?
DLT will likely impact policyholders through claims automation and data commodification. Policyholders should properly plan and negotiate placement of “smart” insurance policies to minimize the adverse impacts of both automation and commodification.
Policyholders Should Carefully Consider the Impact of Claims Automation
Implementation of claims automation without proper policyholder controls in place could wrest critical decisions from policyholders:
- Should we submit this claim given current business conditions?
- How should we describe the underlying event?
- When did the operative event actually occur?
- Does an exclusion potentially bar coverage?
Under the existing claims paradigm, policyholders can assess an event, determine the best strategy to manage and report the claim (in strict compliance with the policy’s notice requirements), and present the claim in a manner that maximizes the potential coverage available while accurately depicting the triggering event. This reporting process can range from simple and formulaic for some types of more routine claims to complex and customized for unique or substantial claims.
Negotiating automated triggers for reporting and processing certain claim information necessarily removes an element of policyholder control from claims reporting. While automated processing could provide benefits (for example, automated claims notice mitigates the risk of a claim being denied for late notice), presenting certain information without context could increase the risk of a claim denial for other reasons.
For example, consider a cyberinsurance claims oracle that uses a mutually agreed-upon set of network events to detect a distributed denial of service (DDoS) attack: a flood of requests that overwhelms the policyholder’s consumer web portal, causing lost sales and potentially giving rise to a business interruption claim. While automated reporting may benefit insureds, both philosophical risk management questions and technical questions about the oracle framework could affect claims management under automated DDoS reporting.
From a philosophical perspective, the parties must resolve several questions before automating the trigger:
- Should the policyholder report every DDoS event, even if the resulting business interruption may not cause a covered loss (either because the DDoS event did not actually impact aggregate sales or because the lost sales did not exceed the deductible)?
- If so, what information should be sent as part of the automated process? Should the notice be limited to the fact of the DDoS event, or should it include additional network information that may be relevant to the insurance company’s investigation of the potential claim?
- Should the notification trigger a delayed sales impact report at a specific time following the DDoS event?
- May the insurer rely on notification of a DDoS event in subsequent renewal and premium decisions even if the event did not result in a claim?
- And perhaps most importantly, what benefit do policyholders obtain in exchange for automated reporting that they could not obtain at a lower institutional cost?
From a technical perspective, the traffic monitoring protocol that detects a DDoS event and automatically reports the claim should fire only when network performance is degraded enough to cause lost sales, while minimizing false positives from unusually high legitimate traffic. Policyholders should negotiate these technical details ahead of time to limit their adverse impact on the subsequent administration of a claim.
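One way to express the kind of technical detail described above is a detection rule that reports only when both request volume and latency remain elevated across several consecutive samples, suppressing false positives from brief legitimate traffic spikes. The thresholds, sample format, and function name below are hypothetical placeholders for terms the parties would negotiate, not any vendor’s actual monitoring protocol.

```python
def should_report_ddos(
    samples: list[tuple[float, float]],  # (requests/sec, page latency in ms)
    rate_threshold: float,               # negotiated request-rate trigger
    latency_threshold_ms: float,         # negotiated latency trigger
    min_consecutive: int = 3,            # samples required before reporting
) -> bool:
    """Report only when request rate AND latency are both elevated for
    several consecutive samples; a single spike resets the streak."""
    streak = 0
    for rate, latency_ms in samples:
        if rate >= rate_threshold and latency_ms >= latency_threshold_ms:
            streak += 1
            if streak >= min_consecutive:
                return True
        else:
            streak = 0
    return False
```

Requiring both conditions, and requiring them to persist, is the negotiation lever: looser thresholds mean earlier automated notice but more reported non-events for the insurer to consider at renewal.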
Policyholders Should Control Data Commodification
Adopting DLT applications could allow insurers to gather, store, and analyze policyholder data to a far greater degree than they currently do. Insurers already obtain a significant amount of data from their policyholders over the course of what is often a multiyear, collaborative business relationship. The development of DLT applications combined with the proliferation of always-on, always-connected devices (the Internet of Things or IoT) could substantially increase the depth and scope of the data that insurance companies obtain from their policyholders.
Insurance is and always has been a data-driven enterprise, but even modern insurance frameworks require substantial manual data entry and duplication of effort across departments. Despite the substantial information that insurance companies obtain from their policyholders, much of the analytic value of this data is currently lost to this administrative inefficiency.
With proper data management protocols and access to new and better sources of data from the IoT, insurers will have access to unparalleled databases to refine and improve their underwriting and risk assessment models. While improving risk assessment could benefit some policyholders in the form of reduced premiums, policyholders should carefully consider the type of data shared with insurers, the format of the shared data, and when that data is shared. Policyholders who negotiate restricted access to source data and instead present analytical results during renewal may obtain lower premiums than those companies that simply allow insurers continuous access to source-level data from key sources.
For example, consider commercial auto insurance. Insurers calculate premium rates based on several factors, including the type and number of automobiles in the fleet, the number of authorized users, the location of the automobiles, the usage of those automobiles, and the level of desired coverage.
With the advent of DLT, insurers could supplement this basic underwriting information with automatically reported driving monitoring data and offer dynamic premium pricing based on daily fleet usage. A sophisticated commercial policyholder might obtain lower premiums by using the same devices, gathering and analyzing the data itself, and reporting previously agreed-upon outputs to its insurer, rather than allowing the insurer access to the source-level driving data.
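A minimal sketch of the reporting posture described above, assuming the policyholder keeps raw trip records in-house and transmits only previously agreed-upon summary outputs. The field names and metrics are invented for illustration and are not any insurer’s actual reporting format.

```python
def summarize_fleet_day(trips: list[dict]) -> dict:
    """Reduce raw trip records (kept in-house) to the agreed-upon
    outputs shared with the insurer. Each trip dict carries
    hypothetical fields: 'miles', 'hard_brakes', 'night_miles'."""
    total_miles = sum(t["miles"] for t in trips)
    if total_miles == 0:
        return {"total_miles": 0.0, "hard_brakes_per_100mi": 0.0,
                "night_mile_share": 0.0}
    return {
        # Only these aggregates leave the policyholder's systems;
        # trip-level timestamps, routes, and drivers do not.
        "total_miles": round(total_miles, 1),
        "hard_brakes_per_100mi": round(
            100 * sum(t["hard_brakes"] for t in trips) / total_miles, 2),
        "night_mile_share": round(
            sum(t["night_miles"] for t in trips) / total_miles, 3),
    }
```

The design choice is the point: the insurer receives metrics sufficient for dynamic pricing, while the source-level driving data, and its analytic value, stays with the policyholder.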
Even if the policyholder permits its insurer to access the source data, the policyholder should consider restrictions on the storage and use of that data. For example, may the underwriter use some or all of that driving data during renewal?
Driving is simply one example of DLT in action (and perhaps not even a long-lived example, given the advent of autonomous vehicles). The IoT could make similar data reporting and analysis available for large portions of commercial and industrial systems. The commercial value of this data cannot be overstated.
DLT applications allow insurers to more efficiently gather and store data for later use, but policyholders should control the type and form of data shared with the insurer before trading terabytes of data for the promise of reduced premiums.