Protecting consumer privacy in data analytics requires exceptional data governance.
Here's how we do it.
Leslie Arnold
When Ken Meiser is not volunteering as an ambulance crew member and first responder in his hometown of Lavallette, New Jersey, he is thinking about how best to protect consumers' data privacy in information and analytics services.
Meiser’s job at LexisNexis Risk Solutions, the global information and analytics company, is to act as a bridge between product development and compliance for the company's Business Services industry sector. When you are in the business of analyzing personal information and developing analytics to protect consumers or government end-users from identity theft or online scams, he explains, faultless product design and strong data governance are key to building trustworthy products that meet customer needs, respect the rights of individuals, protect society, and go beyond the minimum requirements of the law.
Handled with care, the use of personal information in analytics tools can be of great benefit to consumers. In fact, much of the modern economy relies on data and analytics. Using data analytics to verify that individuals are who they say they are protects consumers against online fraud and other crime: without these “know your customer” checks, much of online commerce would be at risk.
Government agencies use data analytics to combat cybercrime, bribery, corruption, human trafficking, money laundering and global terrorism. Local law enforcement relies on it to keep communities safe. Consumers get fair prices on insurance policies and obtain reimbursements faster. And "credit invisibles" who lack sufficient credit history to build a credit score can access credit thanks to non-traditional risk scoring tools that leverage data analytics.
Thanks to the processing of personal information and the use of analytics, consumers, businesses and public agencies alike can operate online more securely and efficiently.
At the same time, however, social and political anxiety has been heightened by the misuse of personal information by bad actors. Instances of stolen and leaked data are a growing concern, with the Identity Theft Resource Center reporting an all-time record of 3,205 data breaches in the United States in 2023, affecting more than 350 million victims and hundreds of companies.
Built-in biases and other errors in data analytics tools have also brought calls for more transparency and accountability. Some U.S. states have responded to public concern about misuse by, and general mistrust of, 'data brokers' with new regulation. At the national level, Congress has been debating national frameworks for data privacy.
“Organizations must understand that data is a trusted responsibility, not just a revenue generator,” says Vivienne Artz, a data protection and data governance expert who co-leads the data privacy experts group of the Global Coalition to Fight Financial Crime.
Meiser agrees. LexisNexis Risk Solutions differentiates itself by its use of data for purposes that are beneficial if not essential to the economy, society and government end-users, and it takes its responsibility for the proper collection, use and sale of personal information extremely seriously. Indeed, he argues, LexisNexis Risk Solutions has industry-leading data governance policies and practices in place and is a company that meets or exceeds regulatory requirements.
Good governance for the entire data lifecycle
Artz says good governance must embrace the entire data lifecycle, from the selection of data and creation of datasets to solve problems, to what is done with personal data, who it is shared with, whether it is incorporated into other products and uses, and finally, how personal data is deleted or disposed of. Transparency and accountability are key.
At LexisNexis Risk Solutions, good governance starts by checking data for quality and accuracy, and this is a responsibility the company does not take lightly. Annette Gaines, Deputy General Counsel and Compliance Officer, says that in the vetting process, the data is tested and analyzed by data content and legal experts, as well as technological specialists. “When we analyze sources, we also give them a risk rating that will inform how often we continue to vet them through the life of our relationship with them,” she says. The firm assigns risk ratings from one to three, and due diligence on the data is performed every one, three or five years, depending on the rating.
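The cadence rule Gaines describes amounts to a simple mapping from risk rating to review interval. As an illustrative sketch only (the article does not say which end of the one-to-three scale is the riskiest, so the ordering below is an assumption):

```python
# Hypothetical sketch of the vetting-cadence rule described in the article.
# Assumption: rating 1 = highest risk (reviewed most often), 3 = lowest.
VETTING_INTERVAL_YEARS = {1: 1, 2: 3, 3: 5}  # risk rating -> years between reviews

def next_review_year(last_review_year, risk_rating):
    """Return the year a data source is next due for due diligence."""
    return last_review_year + VETTING_INTERVAL_YEARS[risk_rating]

print(next_review_year(2024, 2))  # a rating-2 source reviewed in 2024 -> 2027
```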
Rick Gardner, a global data protection officer at LexisNexis Risk Solutions and a member of Gaines’ team, adds: “Once a potential source has been identified, the data provider will have to answer a number of questions from a privacy standpoint, information security standpoint, and a business operations standpoint. A lot of the things we’ll be looking for from that source will be things that we expect of ourselves. Are they acting appropriately and ethically in the marketplace? Do they have appropriate governance structures in place? Do they have a compliance management system (CMS) and a privacy compliance program in place? Are they tracking changing laws? Do they have the right levels of data security and do they have resilience policies in place? We ask the data contributor to show us how they are doing these things, and we hold them to the same high standards we hold ourselves.”
The guardrails for the use of personal data
LexisNexis Risk Solutions takes a number of steps to ensure personal information is protected and used responsibly. Gaines says any sensitive personal information processed in LexisNexis Risk Solutions products is subject to a strong series of controls – affecting how it is handled internally and limiting who can access it externally.
This duty of care is set out in LexisNexis Risk Solutions’ data privacy principles that apply to all its products and services.
“We put ourselves in the shoes of the individuals who might be affected,” says Gaines. “Just because we have access to information doesn’t mean it should be provided.”
To protect personal information, LexisNexis Risk Solutions clients undergo a vetting process similar to the one for data providers. The firm must be satisfied that the client's need for a product is legitimate and that appropriate data privacy, security and governance processes are in place.
In addition, the firm limits the use of its products for certain permissible purposes only. “Before clients sign up, we have conversations to help them understand our expectations in terms of the appropriate usage of our data,” says Gaines. “We want to ensure that personal information is used responsibly and in a disciplined way.”
For example, LexisNexis Risk Solutions makes its LexisNexis ThreatMetrix solution available to banking and insurance clients to carry out identity checks and prevent fraud. But if a client were to use ThreatMetrix to assess a customer’s creditworthiness, they would be in breach of their contract. LexisNexis Risk Solutions would be made aware as it regularly checks how clients use its products. “Clients come through our system and we have teams that are proactively monitoring transactions. We capture elements of a transaction, we run a search and we pull all of that data,” Gaines says.
When a client is not using LexisNexis Risk Solutions products appropriately, Gaines’ team might suggest a different solution and offer training on the correct use of a product. “We remind them of our expectations, the product, the use case,” says Gaines. “We have and we will cut customers off if what they are seeking to do is not compatible with our products.”
Additionally, multiple entities, from customers to government agencies, audit the firm's practices annually.
Monitoring appropriate usage is not just essential for protecting data privacy – it is also a vital differentiator for LexisNexis Risk Solutions.
“There are bad and unethical actors that are just selling data to the highest bidder,” says Gardner. “That's not who we are.”
Building a responsible data analytics tool
Data analytics helps clients make better decisions. And because these decisions might have real-life consequences, care and thought must go into building the models that drive them. The privacy principles at RELX, the parent company that owns LexisNexis Risk Solutions, require product developers to embed privacy protections into the design of all products, services and business practices. RELX has also published guidelines for the responsible use of artificial intelligence, which ask designers to consider the real-world impact on people; to prevent the creation or reinforcement of unfair bias; to be able to explain how products work; to create accountability through human oversight; and to respect privacy and champion robust data governance.
Meiser works closely with the data scientists to embed both data privacy and responsible AI principles in product design.
First, they select the data required for the problem they are trying to solve. LexisNexis Risk Solutions mandates that model builders only select the minimum amount of information they need for a particular use, and then ensure it is used solely for that purpose.
Second, the models are tested not only for accuracy and reliability, but for bias – to ensure fairness for consumers. For example, a credit-scoring tool such as LexisNexis RiskView, or Fraud Intelligence, a predictive tool to help financial organizations detect fraudulent new account applications, cannot include information, such as zip codes, that could be used as proxies for lower income individuals.
“What you look for in a model is that you don’t have any data that could potentially be used to discriminate against protected classes of people,” Meiser says. “And then you test the model to see if overt discrimination is present in the results.” If the model produces different results for different groups, then a second battery of tests is conducted to quantify these discrepancies and compare them to observable differences in the real world. The LexisNexis Risk Solutions Fraud Intelligence tool, for example, might flag more seniors as victims of identity theft because that group is simply more vulnerable to fraud. Going beyond differential-impact tests to quantify occurrence rates within the affected classes, says Meiser, is one example of how LexisNexis Risk Solutions goes above and beyond regulatory requirements to ensure its products are ethical and fair and produce accurate results.
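The article does not describe LexisNexis Risk Solutions' internal test suite, but a first-pass screen of the kind Meiser describes, checking whether a model's outcomes differ across groups before running deeper tests, is often done by comparing outcome rates per group against the highest-rated group. The sketch below is purely illustrative, using made-up data and the common "four-fifths" threshold as the flagging rule:

```python
# Hypothetical sketch of a first-pass differential-impact screen:
# compare a model's positive-outcome rate for each group against the
# best-performing group, and flag any group whose ratio falls below a
# threshold (0.8, the "four-fifths" rule, is a common first screen).
# Flagged groups would then get the deeper battery of tests described.

def outcome_rates(records):
    """records: iterable of (group, outcome) pairs; outcome is 0 or 1."""
    totals, positives = {}, {}
    for group, outcome in records:
        totals[group] = totals.get(group, 0) + 1
        positives[group] = positives.get(group, 0) + outcome
    return {g: positives[g] / totals[g] for g in totals}

def disparate_impact_flags(records, threshold=0.8):
    """Return {group: ratio} for groups below threshold * best rate."""
    rates = outcome_rates(records)
    best = max(rates.values())
    return {g: r / best for g, r in rates.items() if r / best < threshold}

# Illustrative data: group A approved 50/100, group B approved 30/100.
records = ([("A", 1)] * 50 + [("A", 0)] * 50 +
           [("B", 1)] * 30 + [("B", 0)] * 70)
print(disparate_impact_flags(records))  # {'B': 0.6} -> warrants follow-up tests
```

A flag here is not itself evidence of unfairness; as the article notes, the next step is to quantify the discrepancy and compare it to observable differences in the real world.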
Building a model can take months, with testing and adjustments throughout. When the product is ready, it is reviewed by an independent panel of experts, who quiz the developers about how it was built, the information that was used, how the model operates, and its impact upon testing.
Explainability is important, Meiser says, because banks and other industries must in turn explain to state or federal regulators how data analytics is being used in their decision making. He expects the demand for explainability to grow, and says this is another example of where LexisNexis Risk Solutions practice is ahead of the industry.
Working to increase transparency
Data analytics has reduced the time and hassle for a consumer to engage in many different transactions, such as applying for a mortgage or getting a life insurance quote. But when a bank's or insurance company's decision goes against a consumer, or just doesn't seem right, it can be frustrating to get answers.
Under the U.S. Fair Credit Reporting Act (FCRA), individuals have the right to examine information that is held about them and that has been used to inform an adverse action notice. This is a letter that informs a customer that they have been denied credit, employment, insurance, or other benefits based on information provided. It indicates which consumer reporting agency or source of information was used, and how to contact them.
LexisNexis Risk Solutions operates a consumer portal that gives individuals free access, upon request, to the personal information it holds about them. If consumers believe that data contained in a LexisNexis Risk Solutions report is incorrect, they can ask the firm to correct it. The firm employs over 200 staff dedicated to responding to consumer inquiries. The portal handles some 100,000 requests a year.
Regardless of regulations and minimum legal requirements, personal information is of immense importance to the people to whom it relates. Companies that process personal information have a special responsibility to take the necessary steps to keep it safe and to ensure it is used in legal and transparent ways. Clear and solid data privacy, security and governance practices are vital to retain the trust of society and regulators in an age when data has come to govern so much of our lives.
Amid calls for tighter regulation, Artz, the data privacy expert, says there is an opportunity for the sector to come together to improve both perception and practice in the industry. “It’s about culture and trust,” she says. “There are organizations that only want to comply with the law, and there are those that exceed it and want to do more.
That’s where best practices can drive progress.”