Is Apple’s New Credit Card Sexist? Apple’s Co-Founder Steve Wozniak Thinks So

Apple Card (Credit: Apple)

Something appears to be seriously amiss in the algorithm that Goldman Sachs uses to process applications for Apple's new credit card, the Apple Card, leading to allegations that Apple and Goldman are discriminating against applicants based on gender.

High-profile Twitter user David Heinemeier Hansson, known as the founder of Basecamp and the creator of Ruby on Rails, posted a series of tweets last week revealing that he and his wife received dramatically different credit limits for their individual Apple Cards: Hansson's limit was 20 times his wife's, despite, as he notes, the couple having filed joint tax returns, living in a community-property state, and having been married for a long time.

To make matters worse, Hansson noted that his wife actually has a higher credit score than he does, and that customer service at Goldman and Apple was of little help. While his wife eventually received a "VIP bump" matching her credit limit to his, this appears to have been the result of public outcry rather than any kind of internal review. As Hansson put it in an interview, "As soon as this became a PR issue, they immediately bumped up her credit limit without asking for any additional documentation."

What's even more interesting is that Hansson isn't the only one to have encountered this. None other than Steve Wozniak, Apple's legendary co-founder, reports that he and his wife ran into the same issue. Like the Hanssons, the Wozniaks keep fully shared finances, with no separate bank or credit card accounts and no separate assets between them.

While neither Hansson nor Woz believes that Apple or Goldman set out to be discriminatory, that doesn't change the ultimate outcome, which was made worse by how difficult the problem was to correct.

Goldman and Apple are delegating credit assessment to a black box. It’s not a gender-discrimination intent but it is a gender-discrimination outcome.

David Heinemeier Hansson

Both ultimately blame the algorithms. As Woz points out, “It’s big tech in 2019.”

An Investigation Triggered

The tweets, which have now gone viral, have also attracted the attention of the New York Department of Financial Services, which, according to Bloomberg, has launched an investigation into whether Goldman Sachs is violating New York state law. The question is not whether Apple or Goldman intended to discriminate; under the law, it's the results that count.

Any algorithm that, intentionally or not, results in discriminatory treatment of women or any other protected class of people violates New York law.

Linda Lacewell, Superintendent, New York Department of Financial Services

One of the key problems with the Apple Card is that, unlike many credit card issuers, Apple doesn't offer the kind of joint accounts that many families take advantage of. This means that spouses who each want an Apple Card must submit their own individual applications, and as a result it's possible for two family members to "receive significantly different credit decisions," according to a Goldman spokesperson. However, Goldman categorically denies that gender plays any role whatsoever in its algorithms.

In all cases, we have not and will not make decisions based on factors like gender.

Spokesperson for Goldman Sachs

The problem, however, as Hansson points out, is the biases that may be inherent in machine-learning algorithms that are used to make decisions like these — biases that companies like Goldman may not even be aware of, especially when “no one can explain how this decision was made.”
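To see how such an outcome can arise without anyone intending it, consider a minimal, hypothetical sketch. This is not Goldman's actual model, whose details are not public; the features, proportions, and labels below are invented purely for illustration. The classifier never sees gender as an input, yet it still produces gendered outcomes, because another feature acts as a proxy for gender and the training labels carry historical bias.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical illustration only: not Goldman's model. We train a
# high-credit-limit classifier that never sees gender, but does see a
# feature that happens to correlate with it.
rng = np.random.default_rng(0)
n = 10_000
gender = rng.integers(0, 2, n)  # 0 = man, 1 = woman; NEVER a model input

# Assumed proxy: whether income is reported solely in the applicant's
# own name, made (artificially) more common for the men in this sample.
sole_income = rng.random(n) < np.where(gender == 0, 0.8, 0.3)
credit_score = rng.normal(700, 50, n)  # roughly equal across groups

# Historical approvals favored sole-income applicants, so bias is baked
# into the training labels themselves.
high_limit = (credit_score + 60 * sole_income + rng.normal(0, 30, n)) > 720

# Features: normalized credit score and the proxy. No gender column.
X = np.column_stack([(credit_score - 700) / 50, sole_income])
model = LogisticRegression().fit(X, high_limit)

pred = model.predict(X)
print("High-limit rate, men:  ", pred[gender == 0].mean())
print("High-limit rate, women:", pred[gender == 1].mean())
# The gap persists even though gender never enters the model: the proxy
# feature carries the bias for it, and nothing in the learned weights
# explains why any individual decision came out the way it did.
```

The point of the sketch is that deleting the gender column, which is essentially what Goldman says it has done, doesn't by itself guarantee an unbiased outcome: if the training data reflects historically biased decisions, correlated features can smuggle the bias back in.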

It's effectively a darker side of artificial intelligence that's becoming apparent today. In fact, the issue has already drawn the attention of Congress, with Bloomberg noting that the House Financial Services Committee has heard evidence of algorithmic decision-making in financial services in which researchers found that machine-learning algorithms had developed their own biases, even though their designers had no intent to discriminate.

