Automating Data Intake Process in Commercial Underwriting - Web
Contents
1. Data Intake Problem in Commercial Insurance
2. Metamodel-driven Solution
3. Business Benefits
[Diagram: the current manual data intake process between the underwriter and internal data sources.]
Clearly, the turnaround time and quality can be significantly improved if the current manual process is replaced by a more automated data intake process. To summarize, the data intake process must:

- Handle inconsistent input files in multiple formats
- Map external codes to internal codes
- Improve data quality by scrubbing the data
- Run rules to flag risks for underwriting scrutiny
- Enrich missing/incomplete data
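As an illustration of the last two requirements, a "flag for underwriting scrutiny" rule can be sketched as a simple predicate over a standardized submission record. The field names and the referral threshold below are illustrative assumptions, not part of any actual product.

```python
# Illustrative risk-flagging rule; field names and the threshold
# are assumptions for the sketch, not a real underwriting standard.
def flag_for_scrutiny(record):
    """Return the reasons (if any) this submission needs underwriter review."""
    reasons = []
    if record.get("total_insured_value", 0) > 50_000_000:
        reasons.append("TIV above referral threshold")
    if not record.get("construction_code"):
        reasons.append("missing construction code")
    return reasons
```

Rules like this run after scrubbing and enrichment, so they operate on clean, standardized fields rather than on the raw broker input.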
Those who have tried to automate the process know that the solution to this problem is not simple, due to the variety of input formats, the different types of data for different products, and varied enrichment requirements. The solution needs to be highly configurable, metamodel-driven, and automated, with provision for manual intervention as required.
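To make "metamodel-driven" concrete, the sketch below shows what a metamodel entry for one hypothetical broker feed might look like if stored as a document in a NoSQL store. Every name, field, and code in it is an illustrative assumption.

```python
# Hypothetical metamodel document for one input format, as it might be
# stored in a NoSQL document store. All names and codes are illustrative.
BROKER_A_METAMODEL = {
    "format_id": "broker_a_csv_v1",
    "fields": {
        # input column -> internal field name and type
        "Insured Name": {"target": "insured_name", "type": "str"},
        "Constr": {"target": "construction_code", "type": "code"},
        "TIV": {"target": "total_insured_value", "type": "float"},
    },
    "code_maps": {
        # external construction codes -> internal code values
        "construction_code": {"FR": "1", "JM": "2", "NC": "3"},
    },
    "enrichment_services": ["geocode_location", "peril_lookup"],
}
```

Because each input format gets its own document like this, onboarding a new broker feed becomes a configuration change rather than a code change, which is the sense in which the solution is "highly configurable".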
[Architecture diagram: Input files feed a data processing pipeline driven by a highly configurable Metamodeler backed by a 'No SQL' store. First-time mappings, third-party/internal services, and business rules are applied, with manual intervention through a user interface where required. The output is standardized and enriched data.]
The heart of the solution is a Metamodel, which means a model of a model. It stores the information on how every input file should be interpreted, and how it should be mapped to internal formats and codes. Due to the varying number of data elements, formats, and mapping rules, the best way to store this Metamodel is to leverage 'No SQL' databases. The Importer is a batch routine that imports ongoing feeds by leveraging the Metamodel information. The Translator uses the translation engine to convert the input data elements into standardized formats that can be used internally. The Code Mapper maps incoming codes to industry standards such as ACORD, or to internal standards. This functionality may require manual intervention if the code mapping rules are not defined for all the codes.

The Enrichment station allows configuring third-party or internal services that should be invoked to enrich the data. For example, you may get specific peril data for every location from third parties and append it to the data. It also runs the configured rules to flag specific risks for a deeper evaluation. Once the standardized and enriched data is available, it is sent to an underwriting workstation for further processing. The better-quality, standardized data also enables the use of analytical models, thereby providing quality insights to underwriters.
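A minimal sketch of how the Translator and Code Mapper might apply such metamodel information to a single raw record is shown below. Codes without a mapping rule are flagged for manual intervention rather than guessed. All structures, field names, and codes here are assumptions for illustration, not the actual implementation.

```python
# Minimal sketch: translate one raw record using a metamodel document and
# flag any codes with no mapping rule for manual intervention.
# All field names, codes, and structures are illustrative assumptions.
def translate_record(raw, metamodel):
    out, needs_review = {}, []
    for src, spec in metamodel["fields"].items():
        value = raw.get(src)
        target = spec["target"]
        if spec["type"] == "float":
            value = float(value)
        elif spec["type"] == "code":
            mapping = metamodel["code_maps"].get(target, {})
            if value in mapping:
                value = mapping[value]
            else:
                needs_review.append((target, value))  # route to manual queue
        out[target] = value
    return out, needs_review

metamodel = {
    "fields": {
        "Constr": {"target": "construction_code", "type": "code"},
        "TIV": {"target": "total_insured_value", "type": "float"},
    },
    "code_maps": {"construction_code": {"FR": "1", "JM": "2"}},
}
record, review = translate_record({"Constr": "FR", "TIV": "2500000"}, metamodel)
```

The key design point is that `translate_record` contains no format-specific logic; everything that varies by broker or product lives in the metamodel document, so the same routine serves every feed.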
Business Benefits
The solution provides the following business benefits:

- Reduction in the overall cycle time by as much as 50% (calculated for large market property risks)
- Improved decision-making due to better-quality, enriched data that can be leveraged for advanced analytics
LTI (NSE: LTI, BSE: 540005) is a global technology consulting and digital solutions company helping more than 350 clients succeed in a converging world. With operations in 30 countries, we go the extra mile for our clients and accelerate their digital transformation with LTI's Mosaic platform, enabling their mobile, social, analytics, IoT, and cloud journeys. Founded in 1997 as a subsidiary of Larsen & Toubro Limited, our unique heritage gives us unrivaled real-world expertise to solve the most complex challenges of enterprises across all industries. Each day, our team of more than 28,000 LTItes enables our clients to improve the effectiveness of their business and technology operations, and deliver value to their customers, employees, and shareholders. Find more at www.Lntinfotech.com or follow us at @LTI_Global