Data-driven insight can be a game changer: the actuarial function in service of underwriting decision-making

Blog -- 08 November 2024

Author: Marketing


Verisk Specialty Business Solutions recently hosted a roundtable in its London office, led by Taha Ahmad, director of pricing solutions at Verisk SBS, and Farid Ibrahim, lead pricing actuary at Everest Reinsurance. The session, entitled “The future of specialty pricing methodology”, brought together 20 actuarial leaders for an informal discussion of the challenges and opportunities facing the pricing function.

Join us as we delve into the key takeaways and insights from the event.

The results of the pre-event survey sent to participants kicked off proceedings. Participants were most eager to discuss the issue of data quality and reliability and how it informs specialty pricing. They found limited data availability to be a significant barrier to the adoption of advanced actuarial techniques, such as AI or machine learning. Other frequently mentioned constraints were the time and cost of implementation and a lack of actuarial expertise in advanced methods.

When it comes to methodology itself, participants shared that they most commonly use classic techniques such as exposure rating, experience/burning cost and frequency/severity. Stochastic modelling and generalised linear models were also mentioned by a minority. Farid summarised the findings: “methodology, ones we commonly utilise, hasn’t changed that much. What has changed is delivery: how you execute it, how you connect to your systems, how you feed back that data and use it to continuously improve on your base rates, on the loss cost”.
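For readers less familiar with these terms, here is a minimal sketch, in Python and with purely hypothetical figures, of how a burning cost estimate and a frequency/severity loss cost are typically calculated. The variable names and numbers are illustrative only and do not come from the roundtable.

```python
# Minimal illustration of two classic loss cost estimates (hypothetical figures).
claim_amounts = [120_000, 45_000, 310_000, 80_000]  # historical claims, illustrative only
exposure_units = 500                                 # e.g. insured units per policy year
policy_years = 4

# Burning cost: total historical losses spread over the exposure base.
burning_cost = sum(claim_amounts) / (exposure_units * policy_years)

# Frequency/severity: expected claim count times expected claim size, per unit of exposure.
frequency = len(claim_amounts) / (exposure_units * policy_years)  # claims per unit-year
severity = sum(claim_amounts) / len(claim_amounts)                # average claim size
loss_cost = frequency * severity                                  # matches burning_cost here

print(f"burning cost: {burning_cost:.2f}, frequency/severity loss cost: {loss_cost:.2f}")
```

In practice both quantities would be adjusted for inflation, exposure trends and large or latent losses, which is where the data quality concerns raised by participants come into play.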

One participant reflected the general feeling of the room: “I don’t have huge concerns about methodology, more generally. My biggest concern is the data with which to build the models and the usage of those models.” Another participant added that pricing actuaries are “doing a more critical examination now that they have increased data with which to do it. The level of calibration and the depth of review are increasing. The role of a pricing actuary isn’t just to deliver a technical premium or a benchmark premium but to service the underwriting community, to give a series of data-driven insight to help inform them to make as good a choice as they can make”.

Taha recommended that pricing actuaries ask themselves whether they are reacting to the needs of the market and delving deeper into underwriting. That way, they would not only fulfil their role as providers of guidance on technical pricing, but also help underwriters get an “adult view of the risk” so they can understand the loss profile and structure a product that brings the best possible returns.

Farid acknowledged that data-driven insight can be a game changer for operations, while Taha introduced the dilemma of data granularity. He said: “what we haven’t cracked as a market is loss data, we have become more profitable, we have lots of data available, but are we able to know what was the cause of a particular loss? How can we break losses down and group them?”. One participant commented that their main challenge was accurately pinpointing the location of a loss and accurately matching claims and exposure data by peril and location, as the current setup requires “going through a process where, when you record everything, you need to select from a drop down menu: pick the location, pick the peril, pick the property, and you can’t make a mistake.” He added that they are designing a new data structuring and capturing process to resolve this mismatch between claims and pricing, which will eventually be able to incorporate machine learning models to reduce accidental mistakes.

Another participant shared that they have been exploring generative AI to improve the quality of their data, helping to identify and fill gaps, and to support claims causation and root cause analysis. They commented: “I have not seen anybody say they’ve been able to do it across classes; it requires a lot of work, isolated, on a particular area or subset of business. Pricing actuaries are not prompt engineers, so you have to rely on your wider capability - your data science team if you have them”.

Farid said that a big part of pricing transformation journeys is aligning definitions: “a premium is a premium is a premium”. He explained that for a pricing transformation to be successful, the models used by underwriting and pricing need to be integrated: “if we are going to start connecting different systems they need to have a consistency across the board. In order to have that feedback loop a roadmap is the first thing I want to see; you start with the pricing model? What’s next? If everything is built in isolation it causes a lot of problems. Take for example currency rates, if you don’t establish the source from the start, it can create significant differences”.

In conclusion:

  • The longstanding issue of data quality persists, with the lack of granularity in claims data posing a significant challenge.
  • AI and generative AI technologies are increasingly being employed to clean and unify data.
  • Traditional pricing methods remain widely used and relevant.
  • Pricing teams are increasingly focused on supporting underwriters in making insight-driven risk selections.
  • There is great potential for added value by connecting pricing functions with broader systems.

Taha Ahmad will be attending the GIRO Conference on 18-20 November. If you’d like to continue the conversation, don’t hesitate to arrange a meeting.

Related Product

Rulebook

Pricing, underwriting and distribution for even the most complex classes of business.