Big Tech Financial Firms Ahead of Regulators | Barry Eichengreen


In 2009, in the midst of the global financial crisis, Paul Volcker, the former chairman of the Federal Reserve, famously observed that the only socially productive financial innovation of the previous 20 years was the ATM. One wonders what Volcker would make of today’s tsunami of digital financial innovations, from mobile payment platforms to online banking and peer-to-peer lending.

Volcker could rest assured on one point: like the humble ATM, many of these innovations deliver tangible benefits in the form of lower transaction costs. But as a critic of large financial firms, Volcker would likely also be concerned about the entry of very large tech companies into the industry. Their names are as familiar as their services are ubiquitous: Amazon in the United States, the messaging company Kakao in South Korea, the online auction and e-commerce platform Mercado Libre in Latin America, and the Chinese tech giants Alibaba and Tencent.

These entities now do virtually everything related to finance. Amazon provides loans to small and medium-sized businesses. Kakao offers a full range of banking services. Alibaba’s Ant Financial and Tencent’s WeChat offer a cornucopia of financial products, and have grown so rapidly that they have recently become the target of a Chinese government crackdown.

The challenges for regulators are obvious. When a single company channels payments for the majority of a country’s population, as M-Pesa does in Kenya, its failure could bring the entire economy to a standstill. Regulators must therefore be highly attentive to operational risks. They also need to worry about protecting customer data: not just financial data, but also the other personal data to which big tech companies have access.

Additionally, large tech companies, with their ability to collect and analyze data on consumer preferences, are better placed to exploit their customers’ behavioral biases. If those biases lead some borrowers to take on excessive risk, Big Tech will have little reason to care when it is merely providing technology and expertise to a partner bank. This moral hazard is why Chinese regulators now require the country’s big tech firms to fund, from their own balance sheets, 30% of any loan extended through co-lending partnerships.
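To see what that retention rule implies in practice, here is a minimal sketch in Python of how a 30% requirement splits a co-originated loan between the platform and its partner bank. The function name and the loan amount are hypothetical, invented purely for illustration; this is not any regulator’s specification.

```python
# Illustrative sketch only: the 30% co-lending requirement described above,
# expressed as simple arithmetic. Names and figures are hypothetical.

CO_LENDING_SHARE = 0.30  # share the tech platform must fund from its own balance sheet


def split_co_loan(loan_amount: float, platform_share: float = CO_LENDING_SHARE):
    """Return (platform_funded, bank_funded) amounts for a co-originated loan."""
    platform_funded = loan_amount * platform_share
    bank_funded = loan_amount - platform_funded
    return platform_funded, bank_funded


if __name__ == "__main__":
    platform, bank = split_co_loan(1_000_000)  # a hypothetical 1,000,000-unit loan
    print(f"Platform retains {platform:,.0f}; partner bank funds {bank:,.0f}")
```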

Governments also have laws and regulations to prevent providers of financial products from discriminating on the basis of race, gender, ethnicity, and religion. The challenge here is to distinguish between price discrimination based on group characteristics and price discrimination based on risk.

Traditionally, regulators have asked credit providers to list the variables that form the basis of their lending decisions, so that they can determine whether any of those variables are prohibited group characteristics. They have also required lenders to specify the weights attached to those variables, so that they can establish whether lending decisions are uncorrelated with ethnic or racial characteristics once conditioned on those other measures. But as AI-powered algorithms from big tech companies replace loan officers, the variables and weights will change continually as new data points arrive. It is not clear that regulators can keep pace.
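As a rough illustration of the kind of check described above, and of why a continually retrained model complicates it, the sketch below fits a credit model on declared risk variables, reports its weights, and then tests whether decisions remain correlated with a protected attribute after conditioning on those variables. All data is synthetic, the variable names are invented, and this is not any regulator’s actual methodology.

```python
# Rough sketch, not a regulatory procedure: fit a credit model on declared risk
# variables, inspect its weights, and check whether decisions are still
# associated with a protected attribute after conditioning on those variables.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

income = rng.normal(50, 15, n)          # declared risk variable (synthetic)
debt_ratio = rng.uniform(0.0, 0.8, n)   # declared risk variable (synthetic)
group = rng.integers(0, 2, n)           # protected attribute (not a permitted input)

# Synthetic repayment behaviour that depends only on the risk variables.
p_repay = 1 / (1 + np.exp(-(0.05 * income - 3.0 * debt_ratio)))
repaid = rng.binomial(1, p_repay)

X = np.column_stack([income, debt_ratio])
model = LogisticRegression(max_iter=1000).fit(X, repaid)
approved = model.predict(X)             # the lending "decision"

# The weights a lender would traditionally disclose to the regulator.
print("weights on declared variables:", model.coef_)

# Conditional check: is the decision still associated with the protected
# attribute once the declared variables are accounted for?
residual = approved - model.predict_proba(X)[:, 1]
print("correlation of residual decision with protected attribute:",
      np.corrcoef(residual, group)[0, 1])
```

In this synthetic setup the correlation should be close to zero; the point is that if the model and its weights are retrained every time new data arrives, the regulator would have to repeat this kind of check continuously.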

In addition, the source of bias in an algorithmic process can vary. The data used to train the algorithm may be biased. Alternatively, the training itself may be biased, with the AI algorithm “learning” to use the data in a biased way. Given the black-box nature of these processes, pinpointing the source of the problem is rarely straightforward.

Finally, there are risks to competition. Banks and fintechs rely on cloud computing services operated by big tech companies, making them dependent on their most formidable competitors. Big tech firms can also cross-subsidize their financial activities, which account for only a small share of their business. And by offering a range of interconnected services, they can prevent their customers from switching providers.

Regulators have responded with open banking rules requiring financial companies to share their customer data with third parties when customers consent. They have authorized the use of application programming interfaces that allow third-party providers to connect directly to financial platforms to obtain customer data.
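By way of illustration, a consented data pull through an open-banking-style API might look something like the sketch below. The endpoint, token, and response fields are entirely hypothetical; real open banking regimes (such as the EU’s PSD2) specify their own authentication flows and data schemas.

```python
# Hypothetical sketch of a third-party provider fetching customer data through
# an open-banking-style API using the customer's consent token. The URL,
# headers, and response fields are invented for illustration only.
import requests

CONSENT_TOKEN = "example-consent-token"                  # issued after the customer grants consent
API_BASE = "https://bank.example.com/open-banking/v1"    # placeholder endpoint, not a real service


def fetch_account_transactions(account_id: str) -> list[dict]:
    """Retrieve transactions for an account the customer has agreed to share."""
    response = requests.get(
        f"{API_BASE}/accounts/{account_id}/transactions",
        headers={"Authorization": f"Bearer {CONSENT_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("transactions", [])


if __name__ == "__main__":
    for tx in fetch_account_transactions("acct-123"):
        print(tx.get("date"), tx.get("amount"), tx.get("description"))
```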

It is not obvious that this is sufficient. Big tech firms can use their platforms to generate vast amounts of customer data, use it to train their AI algorithms, and identify high-quality loans more efficiently than competitors lacking the same information. Customers may be able to move their financial data to another bank or fintech, but what about their non-financial data? What about the algorithm that has been trained on their data and the data of other customers? Without these, digital banks and fintechs will not be able to price and target their services as effectively as the big tech firms. The problems of consumer foreclosure and market dominance will not be overcome.

In an old parable about banks and regulators, the banks are greyhounds: they run very fast. The regulators are bloodhounds, slower of foot but faithful to the trail. In the era of the platform economy, the bloodhounds are going to have to pick up the pace. Given that only three central banks report having dedicated fintech departments, there is reason to worry that they will lose the scent.

Barry Eichengreen is a professor of economics at the University of California at Berkeley and a former senior policy adviser at the IMF.

© Project Syndicate
