The New York State Department of Financial Services in the US has announced plans to investigate the algorithm behind Apple Card.

The move comes after software entrepreneur David Heinemeier Hansson claimed on Twitter that the algorithm gave him a credit limit 20 times higher than his wife's.

Hansson alleged that the disparity persisted despite the couple filing joint tax returns.

US-based financial services company Goldman Sachs, the issuing bank for the card, provides the algorithm, which is used to assess an applicant's creditworthiness.

The regulator said that Goldman Sachs would be investigated for potential gender discrimination.

Responding to the allegations, Goldman Sachs noted that the algorithm does not consider gender, race, age or marital status while reviewing an application.

The company also offered to reevaluate credit limits for cardholders who received a lower limit than expected.

In a statement, Goldman Sachs retail bank CEO Carey Halio said: “We have not and never will make decisions based on factors like gender. In fact, we do not know your gender or marital status during the Apple Card application process.”
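Halio's defence rests on the model never seeing gender as an input. Fairness researchers caution, however, that excluding a protected attribute does not by itself rule out skewed outcomes, because other inputs can act as proxies for it. The following minimal sketch, using entirely synthetic data and hypothetical feature names (not Goldman's actual model), illustrates the effect:

```python
# Minimal sketch: a model trained WITHOUT a protected attribute can still
# produce outcomes correlated with it via a proxy feature. All data and
# feature names here are synthetic and hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute (never shown to the model): 0 = group A, 1 = group B.
gender = rng.integers(0, 2, n)

# Income is identical across groups by construction.
income = rng.normal(80_000, 20_000, n)

# Proxy: in this synthetic world, group B has a shorter *individual* credit
# history, e.g. because past accounts were held jointly.
individual_history_years = rng.normal(15, 3, n) - 6 * gender

# The "true" limits in the training labels were set partly from that
# history, so the skew is baked into the data the model learns from.
limit = 0.2 * income + 1_500 * individual_history_years + rng.normal(0, 2_000, n)

# Train on income and history only -- gender is never an input feature.
X = np.column_stack([income, individual_history_years])
model = LinearRegression().fit(X, limit)

pred = model.predict(X)
print(f"mean predicted limit, group A: {pred[gender == 0].mean():,.0f}")
print(f"mean predicted limit, group B: {pred[gender == 1].mean():,.0f}")
# The gap persists even though the model never saw the protected attribute.
```

The point of the sketch is narrow: "we do not know your gender" and "limits do not differ by gender" are separate claims, which is why regulators examine outcomes as well as inputs.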

Apple launched the credit card in the US in partnership with Goldman Sachs in August this year. The product uses machine learning and Apple Maps to label transactions with merchant names and locations.

Apple noted that the transactions are not stored in its databases.
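Apple has not published how this labelling works. Purely as an illustration, one plausible shape for such a pipeline is to normalise the raw card-network descriptor and fuzzy-match it against a places index; the names and data below are hypothetical, with a small dictionary standing in for an Apple Maps lookup:

```python
# Hypothetical sketch of transaction labelling: clean up a raw card-network
# descriptor, then fuzzy-match it against a local places index. This stands
# in for whatever Apple actually does, which is not public.
import re
from difflib import SequenceMatcher

# Stand-in for a places lookup (made-up entries).
PLACES = {
    "blue bottle coffee": "Blue Bottle Coffee, 300 Webster St, Oakland",
    "whole foods market": "Whole Foods Market, 399 4th St, San Francisco",
}

def normalise(descriptor: str) -> str:
    # Strip store numbers and long digit runs from the raw descriptor.
    s = re.sub(r"[#*]\w+|\d{3,}", " ", descriptor.lower())
    return re.sub(r"\s+", " ", s).strip()

def label(descriptor: str) -> str:
    cleaned = normalise(descriptor)
    # Pick the closest known merchant name above a similarity threshold.
    best = max(PLACES, key=lambda name: SequenceMatcher(None, cleaned, name).ratio())
    if SequenceMatcher(None, cleaned, best).ratio() > 0.6:
        return PLACES[best]
    return descriptor  # fall back to the raw descriptor

print(label("BLUEBOTTLE COFFEE #0042 OAKLAND"))
# -> Blue Bottle Coffee, 300 Webster St, Oakland
```

Running a lookup like this on-device would be consistent with Apple's statement that transactions are not stored in its databases.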

Wider concerns

The issue has proved controversial, with industry professionals expressing wider concerns. Professor Hani Hagras, Chief Science Officer at Temenos, argues that Apple's 'sexist' credit card is the tip of the iceberg of a much bigger problem facing AI.

He said: “We see examples of bias in ‘opaque box’ AI systems every day and the more banks, retailers and other organisations use ‘opaque box’ AI, the more bias in AI decision making is going to happen.

“It is a huge governance issue that lies at the heart of the tech transformation and AI revolution. Apple’s ‘sexist’ credit card is the thin end of the wedge and the scenario of bias reinforcing bias in opaque AI systems is the pressing problem that has to be fixed if those organisations using AI are to avoid significant governance issues and potential legal action while gaining the trust and confidence of the end user.

“By explaining how and why decisions are made, XAI helps consumers understand what they need to do to get a different outcome (e.g. turn a rejected mortgage application into an acceptance) and so can help consumers take appropriate action, and banks and other institutions to offer more suitable products.”
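Hagras does not name a specific technique, but XAI (explainable AI) typically means surfacing per-feature contributions to a decision. A minimal sketch, with a hypothetical logistic-regression-style approval model and made-up coefficients, shows the kind of explanation he describes:

```python
# Minimal sketch of an XAI-style explanation: an interpretable credit model
# whose per-feature contributions tell a rejected applicant what most hurt
# their application. Model, data and coefficients are all hypothetical.
import numpy as np

FEATURES = ["income_k", "debt_ratio", "credit_history_years"]

# Hypothetical coefficients of a trained approval model.
WEIGHTS = np.array([0.04, -6.0, 0.15])
BIAS = -1.0

def explain(applicant: np.ndarray) -> None:
    contributions = WEIGHTS * applicant
    score = contributions.sum() + BIAS
    print("approved" if score > 0 else "rejected", f"(score {score:+.2f})")
    # Rank features from most negative to most positive contribution.
    for name, c in sorted(zip(FEATURES, contributions), key=lambda t: t[1]):
        print(f"  {name:>22}: {c:+.2f}")

explain(np.array([60.0, 0.55, 4.0]))  # rejected: debt_ratio dominates
explain(np.array([60.0, 0.25, 4.0]))  # approved after reducing debt
```

In the first call the ranked contributions show debt_ratio driving the rejection, so the applicant can see that reducing debt, rather than marginally raising income, is what turns the rejection into an acceptance in the second call.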