Wednesday, November 29, 2023

Yes, An Algorithm Can Enable Your Bank To Evict You From All Your Accounts

Can an algorithm trigger your bank to close down all your accounts, including your bank cards? According to author David Ward ('The Loop', p. 247), citing work by Prof. Michele Gilman (who teaches law at the University of Baltimore):

"Increasingly she's found that algorithms are at the center of her cases, denying citizens their rights and benefits.  In a 2020 paper for Data & Society she offered aa guide to algorithms now being deployed to make decisions in literally dozens of areas of American life, from credit scores to public benefits to housing.  These algorithms often make life changing decisions without consulting the people whose lives they are changing."  

Such is the case with a banking algorithm (one that flags "suspicious activity") that has been canceling accounts for people and businesses since at least 2017-18. Thomson Reuters reports that banks filed more than 1.8 million suspicious activity reports (SARs) in 2022, a nearly 50% increase in just two years. That figure is now on track to hit nearly 2 million this year.

The people caught in the SAR trap often find chaos and confusion in their finances as a result. Sometimes the bank is alert and sympathetic enough to dispatch a letter telling the forlorn customer that it is closing all their checking and savings accounts and canceling all their cards. Any explanation, if there is one, is usually sketchy, offering few or no details or reasons.

At the other end are customers who never get any letter or alert at all. Instead, they suddenly discover their accounts no longer work - often while paying for groceries at the supermarket, at the rental car counter, the hotel front desk or the ATM. When they call their banks, the representatives show concern at first ("So sorry, we will look into it"). But then there comes a pause and a shift in tone, at least according to a NY Times account of the SAR phenomenon (Nov. 10, 'Why Banks Are Suddenly Closing Customer Accounts', by Ron Lieber and Tara Siegel Bernard). The customer is then told: "Per your account agreement we can close your account for any reason at any time."

The Times' piece is quick to point out this isn't just a pro forma move to dump a grifter, say a person who's bounced one too many checks. No, what has occurred is that a "vast security apparatus has kicked into gear" to track what you've been doing - namely how many withdrawals you've made and how much each time, not to mention deposits and their dates. If these trip the algo's flags, a SAR gets triggered. The goal here is to crack down on money laundering, terrorism, human trafficking and the sort of financial crimes George Santos has been accused of committing (as elaborated in the House Ethics Report).
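To see how such a trap gets sprung, here is a minimal, hypothetical sketch (in Python) of the kind of threshold rules a transaction-monitoring system might apply. Everything in it - the field names, dollar cutoffs and baseline - is an illustrative assumption, not the actual logic used by Citi or any other bank:

from dataclasses import dataclass
from datetime import date

@dataclass
class Transaction:
    when: date
    kind: str      # "withdrawal" or "deposit"
    amount: float  # USD

def flag_for_review(history: list[Transaction]) -> list[str]:
    """Return reasons this account might be routed to a human SAR reviewer."""
    reasons = []
    cash_out = [t for t in history if t.kind == "withdrawal"]

    # Rule 1: repeated large withdrawals just under the $10,000
    # currency-reporting threshold (a "structuring" pattern).
    near_threshold = [t for t in cash_out if 7_000 <= t.amount < 10_000]
    if len(near_threshold) >= 3:
        reasons.append("repeated withdrawals just below $10,000")

    # Rule 2: "out of character" volume -- total cash out far above an
    # assumed historical baseline for this account.
    typical_monthly_cash = 1_500  # illustrative baseline, not a real figure
    if sum(t.amount for t in cash_out) > 10 * typical_monthly_cash:
        reasons.append("cash volume far above historical baseline")

    return reasons

Note that a customer pulling out $7k to $12k at a time to pay a contractor - as in the Times case discussed below - would trip both of these toy rules even though nothing remotely criminal had occurred.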

But often bank evictions occur because ordinary customers are simply making transactions the algo regards as "out of character." These algo-generated alerts are then reviewed by human bank employees, who already know they must file a SAR if they see actions or behavior that might violate a given law. The problem is that the algo's "out of character" baseline casts too wide a net, and a hundred times more innocent people get captured than bad guys. In other words, like a bad cancer test, the system generates too many false positives. But unlike a bad cancer test, the false positives behind a SAR and the follow-up eviction generally can't be taken back or repaired.
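The "bad cancer test" comparison is at bottom a base-rate problem, and a quick back-of-envelope calculation shows why. The numbers below are illustrative assumptions (not figures from the Times, Thomson Reuters, or any bank), chosen only to show how swamped the true positives become when genuine financial crime is rare:

# Toy base-rate calculation; every number here is an illustrative assumption.
accounts       = 1_000_000   # accounts screened by the algo
bad_rate       = 0.0005      # assume 0.05% involve genuine financial crime
sensitivity    = 0.90        # chance the algo flags a truly bad account
false_pos_rate = 0.05        # chance it flags a perfectly innocent account

bad  = accounts * bad_rate              # 500 bad actors
good = accounts - bad                   # 999,500 innocent customers

true_flags  = bad * sensitivity         # ~450 real hits
false_flags = good * false_pos_rate     # ~49,975 innocent customers flagged

print(f"Innocent flags per real hit: {false_flags / true_flags:.0f}")
# -> roughly 111, i.e. on the order of a hundred innocent customers
#    flagged for every actual bad actor

Under those assumed rates the reviewers' queues fill up with ordinary people, which is exactly the pattern the Times describes.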

In one case the Times' piece examined, a man doing a house purchase in NY withdrew large chunks of money ($7k to $12k at a time) to pay his contractor. He was then surprised when the bank (Citi) called to ask why he was making such repeated large withdrawals. He told the Times he just assumed the bank had called to make sure no one was stealing his money. But the next thing he knew, he was evicted: no more accounts, or cards, for him. When he actually visited the branch to inquire, the frustrated manager probably "said more than he was supposed to," telling him in essence: "Don't ask me, ask the computer that flagged you."

And therein lies the rub with any AI-directed algorithm chained to a logic loop that forgoes nuance or internal checks. As author Ward explains ('The Loop'), the brain has a fast, intuitive processing system (System 1) which solves many problems with graceful ease but can also be lured into error, and a slower, more effortful logic module (System 2) which can grind out the right answer when it must. Yet, as Ward notes, the brain prefers to let System 1 do all the work and arrive at the answers. This is exactly the problem with the banking algo: its decision-making has been 'lured' into a simplistic System 1 loop that basically defeats its own purpose.

How to escape loop-based algorithmic errors born of excessive scrutiny? Federal banking regulation would have to be changed first, to at least try to cut down on false SAR positives triggered by routine customer activity.

Failing that, humans (i.e. bank employees) need to intervene to look beneath the algo's 'hood' and see what actually triggered the specific SAR - and, if it turns out to have been overreactive, to cancel the eviction. Citizens ought not be subject to the decisions of an impervious deus ex machina incapable of genuine logical reasoning or of processing causal nuance.

