December 2021

The Case for a Data and Analytics Ethics Framework in Insurance

In Brief

RGA has launched a Data and Analytics Ethical Framework, a set of enterprise-wide guiding principles for data utilization. Chief Data and Analytics Officer Doug Knowling explains the framework, delving into its guiding principles and the problems it seeks to address.

Data-driven models and machines may not know right from wrong, but the humans who build and use them do. Recent high-profile revelations about data misuse in business and government only reinforce this point and are fueling calls for greater governance and privacy protections. Insurers and reinsurers are increasingly reliant on data to manage risk, make decisions, and develop products, yet, unlike medical research, which has been shaped by extensive clinical trials, the risks and rewards of analyzing big, often non-public, databases are only just beginning to become clear.

That's why RGA launched a Data and Analytics Ethics Oversight Board and charged it with developing and implementing a new Data and Analytics Ethical Framework, a set of enterprise-wide guiding principles for data and analytics utilization.

RGA met with Doug Knowling, Chief Data and Analytics Officer, to discuss the importance of creating such a framework. He argues that the insurance industry can play a powerful role in curbing the potential for harm in data utilization and harnessing its power for good through sound data and analytics governance.


How does RGA view data usage?

The ethical acquisition and analysis of data can be a powerful force for good; information has the power to advance our industry and society by enabling insurers to gain a much deeper understanding of risk and make protection more affordable and accessible. I would say that the insurance industry has always been data-centric, yet the volume, variety, and velocity of data available have grown exponentially. Willis Towers Watson recently found that the finance sector used more big data in 2020 than at any time in history. Our task now is to ensure that information is not misused: That's the motivation behind the Data and Analytics Ethical Framework.

Surveys suggest public concern about data use is growing. Why is that? 

Every time any of us interacts with a device, we reveal metadata that could be translated into insight. Recent controversies have highlighted the ease with which personal data can be collected, used, shared, and even sold without our knowledge. Even when information is released with technical consent, many of us may not understand the implications or the extent of that permission.

Reinsurance 鈥 and insurance 鈥 are built on trust that we will keep the promises we make. Using data ethically and responsibly is critical to maintaining, and expanding, that trust. That means going beyond what is required to doing what we know is right. It means understanding that just because we can do something doesn't mean we should. It means being able to look in the mirror every day and know that we have exceeded the high expectations of our clients, regulators, society, and ourselves. The industry needs an ethical structure, a framework, to guide these decisions.


But why call it a framework rather than a set of rules?

It's an overused analogy, but providing data privacy and protection is a journey, not a destination. We can't know all the potential uses of information; data and analytics are in a constant state of evolution. Technology is advancing rapidly, and new opportunities are emerging all the time. We need guiding principles, rather than a static set of rules, that can help us evaluate each case. We also need an oversight board made up of a diverse group of experts who can think critically about context and circumstances.


It's helpful to think about data and analytics ethics as a set of nested circles around an immovable core. In the broadest circle, we can place potential business opportunities that involve the use of data. Nested within is a circle of actionable ideas, that is, what we can do now through available database access, expertise, and current technical capabilities.

But just because we can act doesn't mean we should. There's another, smaller circle that includes what is possible within various regulatory and legal contexts. And even then, within that circle of opportunity, we place an even smaller sphere that includes only potential applications that not only comply with the rules but also reflect our values as an organization. This isn't purely about compliance; it's about integrity.

RGA established an oversight board to bring together a diversity of perspectives and to determine what should remain in that smallest circle: the core that encompasses how we use data.

The framework seems to be organized into two categories, covering data use and operations. Can you explain?

We feel these are two separate but interrelated areas. First, there is the question of whether we are using data correctly. We must ask ourselves whether we are adequately prioritizing fairness and equity. Beyond the use of data are the actual operating practices we must consider. The question there is not why but how: How do we ensure privacy and transparency?

Let's take that first issue of fairness and equity. What do we mean by that?

More than complying with a set of rules, the industry needs to create a culture of continuous oversight and analysis to ensure that usage of data attributes does not lead to unfair decisions. Some attributes are very clear-cut. For example, RGA does not knowingly process information or make decisions based on race, religion, sexual orientation, ethnicity, or political beliefs. At times, however, the issue is less straightforward: Data attributes can be correlated, but not directly related, in ways that can lead to biased decision-making, something we call proxy discrimination.


Explain proxy discrimination.

As an industry, we build models to rate risk and guide underwriters, and those models are based on data. Sometimes certain variables can become correlated or connected in ways the insurer or reinsurer did not intend, or often did not even recognize. In these cases, a single data attribute becomes a proxy for another in ways that can create the perception of discrimination or result in actual bias. It is critical that the industry guard against this development.

How? By detecting and mitigating bias throughout the data solution lifecycle, looking at outcomes critically, and being flexible enough to adjust models accordingly. There may be cases where it pays to be less accurate but more inclusive. And we should always prioritize communication and transparency around what drives our decision-making.
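To make this concrete, here is a minimal sketch, in Python, of the kind of check a data team might run when screening for proxy discrimination. It is illustrative only, not RGA's actual tooling: it measures how strongly a candidate feature is associated with a protected attribute held out for auditing, and compares model outcomes across groups. The column names (zip_code, protected_group, approved) are hypothetical.

```python
# Illustrative only: a simple proxy-discrimination screen, not RGA's tooling.
# Column names ("zip_code", "protected_group", "approved") are hypothetical.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

def cramers_v(x: pd.Series, y: pd.Series) -> float:
    """Association between two categorical variables (0 = none, 1 = perfect)."""
    table = pd.crosstab(x, y)
    if min(table.shape) < 2:
        return 0.0
    chi2 = chi2_contingency(table)[0]
    n = table.to_numpy().sum()
    return float(np.sqrt(chi2 / (n * (min(table.shape) - 1))))

def outcome_rates(df: pd.DataFrame, outcome: str, protected: str) -> pd.Series:
    """Outcome rate per protected group: a quick disparity check on model results."""
    return df.groupby(protected)[outcome].mean()

# Usage (hypothetical data):
# df = pd.read_csv("applications.csv")
# if cramers_v(df["zip_code"], df["protected_group"]) > 0.3:
#     print("zip_code may be acting as a proxy; review it before using it in the model")
# print(outcome_rates(df, "approved", "protected_group"))
```

A screen like this is a prompt for human review, not a verdict: a high association score flags a feature for the kind of critical, case-by-case evaluation the framework calls for.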

Now this framework is not just about collecting data. It also touches on how data is applied.

How we use data is arguably even more important than who we collect it from. We have established principles to ensure that we do more than just pay lip service to best practices, and we have implemented the right mechanisms to govern data use. It's all about meeting a high standard of integrity and care in data transmission, security, storage, use, and monitoring. RGA is focused, for example, on protecting individual privacy by de-personalizing data sets, whenever possible, and maintaining secure processes and information systems.
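As one hedged illustration of what de-personalizing a data set can involve, the sketch below drops direct identifiers and replaces a record key with a salted hash, so rows can still be linked for analysis without exposing whom they belong to. It is a minimal example, not RGA's production process, and the column names are hypothetical.

```python
# A minimal de-personalization sketch; not RGA's production process.
# Column names ("name", "address", "date_of_birth", "policy_id") are hypothetical.
import hashlib
import pandas as pd

DIRECT_IDENTIFIERS = ["name", "address", "date_of_birth"]

def pseudonymize(df: pd.DataFrame, key_column: str, salt: str) -> pd.DataFrame:
    """Drop direct identifiers and replace the record key with a salted SHA-256 hash."""
    out = df.drop(columns=[c for c in DIRECT_IDENTIFIERS if c in df.columns])
    out[key_column] = out[key_column].astype(str).map(
        lambda value: hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    )
    return out

# Usage (hypothetical data); keep the salt in a secrets store, never alongside the data.
# safe_df = pseudonymize(raw_df, key_column="policy_id", salt=SECRET_SALT)
```

Pseudonymization is only one layer; as noted below, it sits alongside access restrictions and other checks and balances.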

We limit who has access; we restrict how information can be accessed; and we employ a multi-layered series of checks and balances to ensure data is made available strictly within the bounds of our agreements.

It's worth mentioning, too, that as a global company, we need to be consistent and true to our values everywhere we operate. We also need to hold our partners, such as third-party data providers, to the same ethical standards. From my perspective, and certainly from the perspective of many of our clients, we don't own the data that is offered to us for the purpose of providing insurance. We safeguard it on behalf of our clients. And we will never license or sell that data. That's a hard boundary, and our clients and regulators should expect nothing less.

What happens if a third party misuses data?

Any insurer or reinsurer should be prepared to do the right thing. This means walking away from any relationship when it becomes clear that data practices are not ethical or appropriate. The chain of protection around data is only as strong as the weakest link. All insurers have an obligation to be fully transparent about how data is being processed and applied. As an industry, we are guardians of a rich store of highly sensitive information, and we must be prepared to step forward and take this immense responsibility seriously.



Meet the Authors & Experts

Doug Knowling
Senior Vice President and Chief Data and Analytics Officer (ret.)