In 2020, Deloitte's report, The four trends that define insurance, showed that the future of the insurance marketplace is going to look significantly different.
Life and Property & Casualty insurers, for example, estimated that 93% of their volume already came from propositions that were not offered five years ago. New propositions were expected to keep rising, with nearly a quarter of investment spend in insurance allocated to new product development. Many expected data-driven innovations and digital technologies to change the landscape of insurance services yet further.
The pandemic has, if anything, reaffirmed and accelerated this change. Deloitte's newly published 2021 Insurance Outlook focuses on how firms can speed up their recovery, and one of its key messages is:
“The need to accelerate digitization and enhance virtual operations turned headwinds into tailwinds at many insurers, driving faster action to deliver within the coming year what might originally have been three-to-five-year transformation plans.”
But if you’re in insurance, or the wider finance sector, how can you harness the value of data to ensure those digital transformation initiatives are successful?
Adapt your customer thinking
Customer demands are always evolving and, over the last year in particular, they have moved far from where they used to be. People are spending much more time at home, partly through enforced restrictions and partly because they don't feel comfortable mixing in shared spaces. Remote working has also proved a success, with many companies declaring that their employees can continue working from home when the pandemic is over. That, in turn, is changing the way people consume all sorts of products and services, and insurance is no exception.
This is where data can be used to increase the speed at which insurers adapt, pivot and meet that change in customer thinking. Insurers need to ensure they're designing, creating and adapting products and services that increase the value they offer to their customers. Data is a key part of that, so you need to consider the use cases you want to fulfil, and then how the data, and the technology that manages it, supports their delivery.
If your customers are now working from home, for example, they may well be questioning why they have a car insurance policy that covers them for 10,000 miles a year. They might now have valuable IT equipment in their home office – is it covered by their house insurance? There are concerns about health insurance and, when we can travel freely again, what’s covered when we take a trip overseas.
DevOps has a part to play here because it lets you move from idea to value as quickly, and as safely, as possible by automating many parts of the application and database development process, and releasing features and improvements in small iterations. Rather than spending months developing a new product or service and going for a big bang release, it encourages teams to release as soon as possible and use customer feedback to improve continuously.
You could, for example, take a single use case out of all of the products and services you're delivering to your clients and say 'We're going to modernize this part of it'. By getting early feedback at every stage, companies can innovate more effectively, keeping and improving the features customers find valuable and changing those that are less popular.
That can be complicated, depending on the legacy technology in place, but what you're really trying to do is decouple the complexity and make it easier to move faster and smarter. You need to innovate with products and services, as well as internally, and work with your people to break down the silos that prevent them from communicating.
Tooling can help achieve that. Making sure that you’ve got all of the right technology and components in place to support the processes that you want to run, and the idea you’re trying to get to market, is the building block to being successful.
Adapt your application thinking
A huge part of what I've just written also applies to the applications you already have in play, because they may well need to change too.
An interesting example is the motor insurance industry, which was turned upside down in 2020. I was talking to an insurer recently and they told me that they're seeing far more demand for temporary car insurance. That's understandable because the move to remote working has changed people's relationship with travel, and the industry needs to innovate and change as well.
A lot of insurance policy systems are based on a 12-month renewal process, with systems built over many years to support the acquisition of new customers with insurance premiums that are renewed annually. What if customers only want one day of motor insurance because they’ve sold the car they weren’t really using any more and now they’re borrowing a friend’s car? What if they want insurance they can increase and reduce on a weekly or monthly basis, depending on whether they’ll be in the office or not?
Those are very different use cases, and the systems need to change to support them. Essentially, you have to price and fulfil those policies far more dynamically, whereas previously you did so annually, at a much lower frequency. This kind of change is likely to be seen right across every part of the insurance industry, with Deloitte's 2021 Outlook stating:
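As a minimal sketch of what that shift implies, a flexible policy might price cover pro rata by the day rather than as a fixed annual premium. The premium, loading factor and function below are all invented for illustration; a real rating engine would use many more factors:

```python
from datetime import date

# Hypothetical figures, for illustration only.
ANNUAL_PREMIUM = 600.00   # annual premium for this risk
SHORT_TERM_LOADING = 2.5  # loading applied to very short-term cover

def flexible_premium(start: date, end: date) -> float:
    """Price cover pro rata by the day, with a loading for short terms."""
    days = (end - start).days + 1  # inclusive of both dates
    daily_rate = ANNUAL_PREMIUM / 365
    # Very short policies carry proportionally higher admin and risk
    # cost, so apply a flat loading -- a real rater would be far more
    # granular than this.
    loading = SHORT_TERM_LOADING if days < 30 else 1.0
    return round(days * daily_rate * loading, 2)

# One day of cover for a borrowed car, versus a 30-day policy
one_day = flexible_premium(date(2021, 3, 1), date(2021, 3, 1))
one_month = flexible_premium(date(2021, 3, 1), date(2021, 3, 30))
```

The point is not the numbers but the shape of the system: pricing becomes a function you call per request, at whatever frequency the customer needs, rather than a once-a-year batch event.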
“Insurers will likely be called upon to revise siloed thinking given blurring lines between personal and commercial auto, homeowners, and workers’ compensation. Similar challenges confront life insurers, with real-time data availability perhaps transitioning carriers into “insurers for life,” focused increasingly on maintaining wellness.”
The Outlook points out that generating continuous innovation in insurance policies, sales strategies, operations, and customer experience could turn out to be the biggest differentiator in 2021 and beyond.
This poses a challenge, however, because many of the applications in place were created years ago, based around a one-product, renewed-annually approach. With so much business logic and complexity already in place, it doesn't make sense to build something brand new. You could, however, still use this as an opportunity to innovate by, for example, introducing or enhancing web-based application interfaces so that end users can consume more new services directly, rather than talking to a broker over the phone.
Adapt your system thinking
Another big shift we're seeing in the technology arena is the move to the cloud, which provides further opportunities to innovate. Even before the pandemic, organizations were looking to the cloud to see how they could benefit from elasticity of consumption. Deloitte's 2021 Outlook showed that while remote working has moved cybersecurity to the top spot in terms of investment priorities, cloud comes a close second, with 59% of insurers expecting to spend more on cloud computing and storage in 2021.
That shouldn't really come as a surprise, given the uncertainties of 2020. With on-premises servers, you have to predict the consumption requirements upfront in terms of memory, storage and compute power. Compare that to cloud services, where you can consume what you need from day one. You don't need to worry about how all of the infrastructure is provisioned because it will be there for you when you demand it. And if you've got a workload that has high seasonal demand and then periods of low utilization, you can design your application and technology platform to burst up to more compute, memory and storage when required, and then come back down to lower levels of consumption when you don't need it.
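A back-of-the-envelope comparison makes the elasticity argument concrete. All figures below are hypothetical: on-premises capacity must be provisioned for the annual peak all year round, while elastic capacity tracks actual demand, even at a higher unit price:

```python
# Hypothetical monthly compute demand for a seasonal workload (arbitrary units).
monthly_demand = [20, 20, 25, 30, 80, 95, 90, 60, 30, 25, 20, 20]

FIXED_UNIT_COST = 1.0    # cost per provisioned unit per month (on-premises)
ELASTIC_UNIT_COST = 1.3  # cost per consumed unit per month (cloud premium)

# On-premises capacity is sized for the peak month and paid for all year.
fixed_cost = max(monthly_demand) * FIXED_UNIT_COST * len(monthly_demand)

# Elastic capacity scales up and down with what is actually consumed.
elastic_cost = sum(monthly_demand) * ELASTIC_UNIT_COST
```

With this demand profile, paying a premium per unit still costs far less overall, because you stop paying for idle peak capacity eleven months of the year. The more seasonal the workload, the stronger the effect.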
That creates a very efficient operating model and also provides an opportunity to use it as the data platform for a new product or service, while maintaining legacy applications on-premises. That way, you can get started much more quickly, build a DevOps process in from day one, and use the latest, fastest, most efficient technology to start innovating.
One final factor also comes into play here: when you're developing products or services that use databases, whether on-premises or in the cloud, monitoring those databases is equally important. Whether you're monitoring down to the hardware level, or tracking Database Transaction Units (DTUs), waits or deadlocks, it really doesn't matter where your environment sits.
You still need to know what the workloads are, and when they go up or down, and what your resource requirements are now, and what they’re likely to be in the future. That way, you can make informed decisions on what your consumption requirements are, and whether you should use on-premises resources or move some requirements to the cloud.
It also allows you to gain one thing that is absolutely key if you want to develop features and services faster and release more frequently: insights. You'll be able to see what impact releases have on performance, and get feedback into how developers are writing queries, what the query plans look like, and how they can be improved.
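As a simple illustration of the kind of insight monitoring can feed back into a DevOps process, the sketch below compares query timings captured before and after a release and flags a regression. The timings, function name and threshold are all invented for the example:

```python
from statistics import mean

# Hypothetical query durations (ms) sampled by monitoring around a release.
before = [120, 115, 130, 125, 118]
after = [180, 175, 190, 185, 178]

def release_impact(before_ms, after_ms, threshold=0.10):
    """Return the relative change in mean duration, and whether it
    exceeds the regression threshold (10% by default)."""
    change = (mean(after_ms) - mean(before_ms)) / mean(before_ms)
    return change, change > threshold

change, regressed = release_impact(before, after)
```

A check like this, wired into the release pipeline, turns raw monitoring data into an automatic signal that a deployment has hurt query performance, rather than leaving it to be discovered by customers.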
That way, you can ensure that when you include the database in DevOps, you gain all of the advantages it promises while maintaining performance levels.
You can find out more about how the database can be included in DevOps by visiting our solution pages.
James Boother is the Sales and Marketing Director at Coeo, a Microsoft Gold Partner that partners with companies to predict their future through the effective use of data. Their Microsoft data platform and analytics specialists have vast experience of working with clients across their entire data journey – from modernization and migration to artificial intelligence and beyond.