Why Insurance Companies Need A Data Warehouse
To The Editor:
In the May 14 issue, Patrick Kassebaum submitted his analysis of why insurance companies do not need data warehousing and bemoaned the lack of return on investment (ROI) in their development and use. To intelligently respond to his premise, some brief history on data warehousing and business intelligence is in order.
What is a data warehouse? Bill Inmon, widely regarded as the originator of the term "data warehouse" and recognized as its leading advocate, defines a data warehouse as "a collection of integrated, subject-oriented database(s) designed to support the DSS function, where each unit of data is relevant to some moment in time. The data warehouse contains atomic and lightly summarized data." A data mart is a subject matter or department-oriented version of a warehouse. It is smaller and very focused.
The universe of data warehousing has attracted a whole new community of users who heretofore could not access corporate data without difficulty and suspicion. The centralization of data into a common repository, together with the use of common business models and business rules, has gone a long way toward ensuring that users of data warehouses now see a single version of the truth. As a result, insurers using data warehouses are making better business decisions faster, because they now have timely and accurate information, not just data from a database. For insurers, this means settling claims faster, detecting and preventing fraud, performing more accurate analysis of their book of business, introducing new products faster, and moving into new areas, such as e-business (selling policies online and providing online account information) and investments.
In Kassebaum's article, he refers to "expensive data management projects that never get completed, or that cannot possibly deliver the hoped-for benefits." A data warehouse, and the business intelligence that ultimately results from its implementation, is not a finite project. The methodology usually employed is based on an iterative approach in which the warehouse grows incrementally. In the beginning, the iterations are longer and more complex and the ROI small. As the user community embraces the warehouse, their analytic demands grow and additional data is added. The iterations become shorter each time and the ROI more significant. Ultimately, the warehouse becomes the way the enterprise does its business each day.
Kassebaum also remarks that "the goal of a central data repository interfacing seamlessly with all of a company's legacy systems often becomes a mirage on an ever-receding horizon." With the growth of data warehouse development methodologies, reusable and repeatable processes have been developed that make this task far less daunting and prohibitive than Kassebaum states it is. Moreover, companies have developed products of advanced design that allow changes in business structure to be reflected in the data warehouse without the costly software changes associated with altering the database schema. This dramatically reduces the total cost of ownership of the data warehouse.
In addition, the emergence of Application Service Providers (ASPs) has made a significant impact on the insurance industry. These providers facilitate data capture and data integration with a carrier's home office systems and run applications (the carrier's, and some they may provide) on their own facilities. Complementing this trend is the addition of analytic capabilities to the ASP's product offering. By bringing together the concept of ASPs and data warehousing science, these providers deliver to their clients a comprehensive set of data on the behavior of the carrier's distribution channels and the performance of specific insurance products. Beyond the compelling economic advantages ASPs provide, the client receives business intelligence that is developed, supported, and upgraded by the ASP.
Insurers continue to chase this "elusive vision," as Kassebaum calls data warehouses, because it works and because their competition is doing it. Kassebaum states that it is harmful that data warehousing "mandates the homogenization of customer data originally created by different functions to serve their own particular needs." It is only harmful if the different departments, such as Underwriting, Claims, and Finance, are operating without a common business model or set of business rules. Granted, each of these departments has its own analytic requirements. But when the leadership asks for reports from these departments and they provide different numbers for similar data elements, there is a problem. Undoubtedly, those departments used business rules or algorithms that were better suited to themselves than to the company as a whole. If so, then the concept of a "single version of the truth" becomes an elusive, if not unattainable, target.
To avoid "Balkanization" and turf wars when an insurer standardizes its data across the company, two things are essential. One, the leadership of the company must understand business intelligence, embrace it, and then push it downward. This sounds cliché, but it is critical nonetheless. Two, data governance must be established. Kassebaum states that much of the richness of the source systems' data is lost after it is scrubbed and placed in the warehouse. A data governance group, board, department, or whatever term you want to apply to this function, will agree on the definition, algorithm, and usage of each data element in the data warehouse. This function, essential in large and small organizations alike, is absolutely necessary to ensure that a single version of the truth exists in the company's system of record: the data warehouse.