Dave Connors 06-Aug-2018 16:33:14 6 min read

The uncertainty that surrounds technology strategy

Uncertainty

One thing that often comes up, whether we are talking informally to market contacts, or engaged with clients and prospects, is the level of uncertainty around technology strategy. 

 

AI and Machine Learning, Digitalisation, Blockchain, Disruption – everyone is familiar with these phrases. They know they should have an opinion on them, and that they should be “doing something” – but what? In the London Market this includes the TOM: whether and how much to engage with the initial phase, how to push further take-up, and whether there should be another phase and, if so, what it should look like. 

 

At TINtech in June this year, Eurobase hosted a stream on Lloyd’s, the London Market and the adoption of modernisation. Our discussions were very much led by these themes, not only in our stream but across all the others too. However, this goes way beyond the streams and interactions of a single conference. So many of our engagements, whether formal processes or quick catch-ups, include conversations along the lines of “we’re looking at...”, “we’re starting to explore...” or “we have begun to consider...”, all in line with the buzzword du jour. 

 

What I’m picking up on are three areas of general uncertainty: 

 

What technology? 

Is AI/Machine Learning really a game-changer? Is “Data Science” just a new term for something the insurance industry has been doing for years? What value will it add over existing data analytics? Is it an enhancement or a replacement? Is it expensive? Is it something only the big players need to do? Again, a lot will come down to use cases. There is a difference between a “data science” problem that can be resolved through pure statistical analysis, and a “machine learning” problem, where intelligently applying past trends, and watching them evolve, leads to more accurate forecasting and earlier highlighting of anomalies. This is definitely not the preserve of big companies, nor does it have to be prohibitively expensive. Anyone can benefit from it and, with the right approach and consultancy, anyone can do it. 
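By way of illustration only (a minimal sketch with made-up figures, not data from this article): a purely statistical treatment summarises a dataset once, while even the simplest “learning” approach re-estimates its baseline as each new observation arrives, and flags values that drift far from what the past implied.

```python
import statistics

# Hypothetical monthly claims totals (illustrative numbers only)
claims = [100, 104, 99, 103, 250, 108, 102]

# "Data science" style: a one-off statistical summary of the whole book
mean = statistics.mean(claims)
stdev = statistics.stdev(claims)

# Simplest "learning" style: build a running baseline from past
# observations only, and flag values that sit far outside it
def flag_anomalies(series, threshold=2.0):
    anomalies = []
    for i in range(2, len(series)):
        history = series[:i]                 # only what was known at the time
        mu = statistics.mean(history)
        sigma = statistics.stdev(history)
        if sigma and abs(series[i] - mu) > threshold * sigma:
            anomalies.append(i)
    return anomalies

print(flag_anomalies(claims))  # → [4] (the 250 spike)
```

Real machine-learning models are far richer than this, but the shape of the distinction holds: the second approach updates what it “expects” as the data evolves, which is what enables earlier anomaly detection and forecasting.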

 

What about Blockchain? What issues of scale remain? Which Blockchain platform? What are the pluses and minuses of the differing platforms, or of public vs permissioned Blockchains? Is there going to be a VHS/Betamax issue between the differing platforms? Is a start-up platform’s ICO something I should be interested in? Some of these questions have easy answers, some are more subjective (even B3i has changed platform), and some perhaps don’t have clear answers. This uncertainty is certainly holding people back, along with continued doubt over compelling use cases. “A blockchain for…” is InsurTech’s version of the generic start-up’s “A Tinder for…” 

 

Application or Microservice? 

An extension of the “what technology” question, this relates to technology architecture and landscape. Everyone is comfortable with the world of the “application”: piecing together discrete systems, each with its own UI, application logic and data store. These integrate to differing degrees on a point-to-point basis, or may be modules of a single system, and are underpinned by a shared Enterprise Data Warehouse. 

 

Sometimes a problem (or a benefit identified from a different type of processing or analysis) might be relatively significant, but not justify the cost, risk and upheaval of changing an entire application. The development of microservices – essentially APIs that perform one discrete task and can be entirely decoupled from both data store and UI – provides a solution to this, but also adds another layer of uncertainty. One doubt is how well the existing landscape will work with them. Can “legacy” but perfectly functional systems pass data to, or receive it from, the microservices? Is the required data available in the organisation, and where? Is our architecture future-proof? What’s the least disruptive path from a legacy platform to a modern architecture? 
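To make “one discrete task, decoupled from data store and UI” concrete, here is a minimal sketch (the service, its rates and its endpoint are all hypothetical, invented for illustration): a currency-conversion microservice that owns no screens and no shared database. Any system, legacy or modern, that can make an HTTP call can use it, because the caller passes in everything the service needs.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative rates only; a real service would source these externally
RATES = {"GBP": 1.0, "USD": 1.27, "EUR": 1.17}

def convert(amount, from_ccy, to_ccy):
    """The one discrete task this service performs."""
    return round(amount * RATES[to_ccy] / RATES[from_ccy], 2)

class ConvertHandler(BaseHTTPRequestHandler):
    """Thin HTTP wrapper: no UI, no data store, just the task."""
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        body = json.loads(self.rfile.read(length))
        result = convert(body["amount"], body["from"], body["to"])
        payload = json.dumps({"converted": result}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload)

# To expose the service:
#   HTTPServer(("localhost", 8080), ConvertHandler).serve_forever()
```

The design point is the separation: the business logic (`convert`) has no knowledge of HTTP, and the handler has no knowledge of rates, so either can be replaced, or called directly by a legacy batch job, without touching the other.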

 

In-house or Vendor (Established or InsurTech)? 

Especially when it comes to new microservices, there seems to be a lot of uncertainty around execution – specifically, whether to use an existing vendor, a new (but established) vendor, or an InsurTech. 

 

Doubts here run along the lines of: my existing vendor will give me an expensive change request to their system, and other established vendors will just try to sell me theirs. Either is a sledgehammer to crack a nut, and won’t necessarily deliver all the benefits we identified for a microservice, or move our overall architecture forward. InsurTechs might not be able to clear all the procurement hoops of a larger organisation, or may be looking for an investor partner rather than a pure customer, which won’t suit everyone. 

 

So where does this leave us? 

The truth is there is no one-size-fits-all answer to any of these questions. Each will depend on the specific scenario: the organisation, its existing landscape, and its appetite for risk and investment. 

 

Where Eurobase is clear is that this is as much a challenge for incumbent software providers as for incumbent insurance providers. We are justifiably proud of our synergy2 system and continue to invest in ever-richer functionality alongside technological relevance: established doesn’t have to mean legacy. But we accept that a new PAS is not the answer to every problem. We are committed to innovation, and are investing to deliver solutions such as Machine Learning-based analytics and forecasting, independent of underlying systems. 

 

We have embraced the challenge of uncertainty – have you?