Best's Review




At Large
Dedication to Data

Insurers must use data more efficiently or be left behind.
  • Stephen Catlin
  • May 2019


It is frequently said that the property/casualty insurance industry has a “data problem.” I don't think that's necessarily true.

The way I see it, the industry has access to an immense amount of data: risk-specific information from policyholders, historical data and generic data found in the public domain. The amount of information we can capture is ever-growing. Importantly, we as an industry are finally capturing significant amounts of data electronically at source, so that we don't have to keep inputting the same information over and over again. Multiple data entry is cumbersome, expensive and error-prone. Much more work needs to be done to make single-entry data the norm, but we're starting to get there.

The so-called data problem we face as insurers today therefore has less to do with the information itself than with what I call “process.” It's no use having the best information possible if a company lacks the tools to analyze and use it efficiently. In other words, I believe what we really have is a process problem.

Some insurers have invested significant amounts of money in data processing over the past decade, albeit with varying degrees of success. Much of that money has been spent fixing problems with historical data or addressing issues that deliver no real business benefit. The companies that have succeeded most likely adopted a well-developed, forward-looking data strategy. Those that have come up short most likely followed a piecemeal plan and ended up spending considerable sums on disappointing results that fail to give underwriters useful insights.

The one thing I learned as the CEO of an insurer is that getting the most out of the data available to a company is very expensive and time-consuming, and requires top-notch expertise. I also found that this investment is essential. An insurer that is unwilling to invest appropriately, in data systems and in the best people to manage them, will probably be disappointed with the end result and wind up being left behind.

Advances in the use of underwriting algorithms have led to more robust pricing for both personal lines and commercial risks. It is clear that algorithms can significantly reduce the time and the cost to underwrite smaller risks while improving the quality of underwriting decisions.
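
To make the idea concrete, here is a minimal, hypothetical sketch in Python of how an algorithm might price a small commercial risk from a handful of rating factors. The class, the base rate and the loadings are illustrative assumptions, not an actual rating model.

    from dataclasses import dataclass

    @dataclass
    class SmallRisk:
        sum_insured: float      # total insured value
        industry_factor: float  # hypothetical trade-hazard multiplier, e.g. 0.9 to 1.5
        claims_last_5y: int     # number of claims in the past five years

    BASE_RATE = 0.004  # assumed premium per unit of insured value

    def price(risk: SmallRisk) -> float:
        """Return an indicative premium from a few multiplicative rating factors."""
        premium = risk.sum_insured * BASE_RATE  # start from the base rate
        premium *= risk.industry_factor         # adjust for the hazard of the trade
        # Load for recent loss experience, capped so that one bad year
        # does not make the risk uninsurable.
        premium *= min(1.0 + 0.1 * risk.claims_last_5y, 1.5)
        return round(premium, 2)

    print(price(SmallRisk(sum_insured=250_000, industry_factor=1.2, claims_last_5y=1)))  # 1320.0

Real rating engines use far more factors, calibrated against historical loss data, but the principle is the same: consistent, instant, data-driven pricing for small risks.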

For large risks whose pricing cannot be entrusted entirely to a model, access to the best information possible remains one of the keys to successful underwriting. While human judgment is essential when pricing a truly complex account, well-designed algorithms can give the underwriter superior analysis, which leads to a speedier process as well as a fairer price. And a fair price benefits the client, the intermediary and the insurer alike.

Human talent always will be a key differentiator in the insurance industry. Employing high-quality people who embrace new ways of doing business and providing them with the appropriate analytical tools to get the most out of a company's data will result in a winning combination, particularly for complex business.


Best’s Review contributor Stephen Catlin is the founder of Catlin Group and former executive deputy chairman of XL Catlin. He is a member of the International Insurance Society’s Insurance Hall of Fame. He can be reached at bestreviewcomment@ambest.com.


