By Trevor Clawson

A payment to a marriage guidance counselling service appears on your monthly credit card bill, and a few weeks later you receive a letter from the bank informing you that your credit limit has been reduced. You renew your car insurance policy with a company that you’ve been with for several years, only to find that a friend of the same age, with a similar car and driving record, has been offered a much lower renewal quote by the same company – the only difference being that he has been a regular user of price comparison sites in the weeks and months before seeking new cover.

Welcome to the world of big data, in which an ever-increasing range of personal information is not only collected on a daily basis but also processed, analysed and fed into the algorithms that determine whether or not we can take out a loan, access credit or buy an insurance product at a competitive price.

And this week, the Financial Conduct Authority (FCA) warned that financial services companies – notably banks and insurers – may face a regulatory clampdown unless they are open and transparent about the way they use the personal information they hold on customers.

Speaking at an event organised by the Reuters news agency, FCA chairman Charles Randall warned there was a risk that consumers might become “prisoners of technology” unless they are made aware of how their data is being used.

Slaves to the Software

As Randall pointed out, the financial products we buy and the prices we pay for them are increasingly governed by algorithms – or to put it another way, by the software programs that analyse what we do and say, both on and offline.

“Algorithms decide which insurance and savings products you are offered on price comparison websites. Whether your job qualifies you for a mortgage. Perhaps, whether you are interviewed for your job in the first place,” said Randall.

Randall acknowledged that there is a genuine upside to data-driven decision making, citing Artificial Intelligence-driven tools that have enabled insurers to provide coverage to individuals who might otherwise have been turned away. But he also warned that profiling of customers using analytics tools could create a kind of apartheid in the financial services marketplace.


There were a number of media reports earlier this year claiming that price comparison websites quoted significantly higher car insurance premiums for people with names suggesting they are members of ethnic minorities, he said, referring to research by the BBC’s You and Yours programme. “If that’s true, are they being treated as people, or as numbers – as mere data points?”

The danger is that apparently neutral algorithms are in fact programmed to reflect the preconception that certain groups within society are inherently riskier than others. In other cases, an algorithm might simply underpin a commercial strategy that does not necessarily reward loyalty on the part of customers.

“It’s well known that the FCA is concerned about firms using their predictions of customers’ propensity to shop around in imposing significant price rises on customers who do not,” Randall added.

The Role of Social Media

But it’s not simply a case of financial services businesses applying algorithmic magic to their own data – or, in the case of credit scores, to industry-wide information. Increasingly, financial services companies see an opportunity to build a 360-degree view of the customer by drawing on sources such as social media activity or browsing history.

“This is now reaching further than simply using historical data such as credit reports and borrowing history,” says Phoebe Griffits, marketing manager at KIS Bridging Loans. “Some companies are also looking into social media platforms to reach into their customers’ personal lives – hobbies, work life, habits and social interactions – and this information is being used to create a ‘personality profile’ of the customer to determine the risk of lending to them, and thus the price they will pay for the product or service.”

Griffits cites Admiral Insurance, which in 2016 created a product that used Facebook profiles to assess the risk profiles of younger drivers. The move was blocked by Facebook, but Griffits says other companies have followed suit.

“This use of social media intelligence wasn’t a one-off by Admiral. It is playing a growing role in other areas of the financial sector,” she says.

Data as an Enabler

Again, there is an upside, in that data can be an enabler. Decisions on credit have traditionally been made on the back of credit scores that rely on a history of handling debt well. Those who have had little access to credit – such as the young – therefore don’t have much of a record and can find it difficult to get a loan. By using non-traditional sources of data – including social media – lenders arguably have more scope to offer credit.

“FinTech firms are already using transactional data to identify creditworthy loan and mortgage customers who would be excluded from these products by traditional credit scoring risk assessments because of irregular earnings histories or long periods of renting,” said Randall.

The Ethical Dimension

Equally, challenger financial services businesses have an opportunity to put clear water between themselves and their competitors by communicating ethical and transparent personal information policies.

For instance, insurance app Kinsu (a contraction of Kind Insurance) has opted not to go down the data-driven pricing route. “Datasets are powerful,” says founder Chris Sharpe. “But you tend to get winners and losers. What we do is offer everyone something close to the average price.”

And according to Griffits, companies that are open and transparent about data use can reap benefits in terms of customer trust.

“There are a lot of companies who are transparent and honest about the data they hold and how they use it,” she says. “Because of this, I believe these companies have a huge advantage over those who aren’t so transparent because they gain their customers’ trust. I also believe that regulations will be made stricter in the future, cracking down on those companies who aren’t adhering to them now.”

And that may well be the case. Under GDPR (Europe’s General Data Protection Regulation) businesses are already required to seek the consent of their customers over data use, but the FCA chair hinted that more regulation might be needed.

“Should all businesses have a data charter? Should these be developed through voluntary codes of practice? Will the industry take the lead, or should they be a regulatory requirement?” he asked.

As analytics and AI technology become ever more sophisticated, these questions are likely to be asked more frequently by policymakers.