No, the Apple Card Does Not Discriminate Against Women

Path to Apple Card Credit: Apple

Not long after Apple launched its revolutionary Apple Card two years ago, some high-profile users began questioning whether the algorithms employed to process applications by Apple’s financial partner, Goldman Sachs, were discriminating against applicants based on gender.

The alarm bell was first sounded by David Heinemeier Hansson, founder of Basecamp and one of the most vocal opponents of Apple’s App Store policies. In late 2019, Hansson bluntly declared the Apple Card a sexist program, noting that he was approved with 20 times the credit limit his wife received, despite the couple filing joint tax returns, living in a community-property state, and the fact that she actually has a higher credit score than he does.

Despite Hansson’s typical antagonism toward Apple’s policies, however, he conceded that he didn’t believe this was a case of deliberate gender discrimination, but rather the fault of the “black box” algorithms used to process credit approvals. In fact, it appears that Hansson’s wife only received a “VIP bump” to match his credit limit after the PR storm began.

More importantly, however, this wasn’t just an isolated incident on Hansson’s part, as legendary Apple co-founder Steve Wozniak soon weighed in on Hansson’s tweet to add that he and his wife had run into the same problem.

Woz appeared to be a bit more mellow about the whole issue, however, shrugging it off as just the nature of “big tech in 2019.”

The Investigation

It doesn’t appear that Hansson, Woz, or anybody else ever pushed for an investigation, but that didn’t stop their tweets from getting the attention of the state of New York’s Department of Financial Services.

Since Goldman Sachs is based in New York City, the bank falls under that state’s jurisdiction, and the department promptly opened an investigation to determine whether Goldman was in violation of state laws against discrimination.

At the time, Goldman categorically denied that gender played any role at all in its algorithms. However, the New York Department of Financial Services said that it’s ultimately the results that count, rather than the intent.

In other words, even if gender bias was purely unintentional, if it was present in the algorithms in any way at all, Goldman could be held responsible for discriminatory treatment.

In fact, even Hansson acknowledged that this may not be something Goldman or Apple is even aware of. Rather, he posited that such biases may simply be inherent in the machine-learning algorithms being used.

Since the Apple Card doesn’t allow joint accounts, spouses who want their own Apple Card are required to make their own individual applications, which could highlight a problem that simply isn’t encountered with most other credit card providers.

As with most government investigations, it took some time for the Department of Financial Services to analyze all the data, but in the end it determined that Goldman Sachs did not use discriminatory practices when deciding whether to extend credit to prospective Apple Card customers.

The Department’s exhaustive review of documentation and data provided by the Bank and Apple, along with numerous interviews of consumers who complained of possible discrimination, did not produce evidence of deliberate or disparate impact discrimination, but showed deficiencies in customer service and transparency.

Report on Apple Card Investigation, New York State Department of Financial Services

The regulator analyzed underwriting data for almost 400,000 New York Apple Card applicants, without finding a single violation of fair lending laws. It did note that the Apple Card offered some unique approaches to credit approval, such as offering “second look” evaluations based on an applicant’s Apple customer data. However, this avenue was only offered to those who might have otherwise been declined due to limited or no credit history.

The results were reported this week by Bloomberg, which notes that while the investigation found no fair lending violations, it did fault Goldman for “deficiencies in customer service” and a “perceived lack of transparency” that likely led to the accusations in the first place by undermining consumer trust. “The problems might have been prevented by better management of the product’s roll-out,” the regulator noted in its report.

Although the Bank was able to explain, at the request of the Department, the credit decisions for all of the individuals who filed complaints, lack of transparency to the complainants themselves in this case seemed to produce confusion that could have been mitigated.

Report on Apple Card Investigation, New York State Department of Financial Services

Notably, however, the agency added that there remains a larger systemic problem throughout the U.S. when it comes to bias and access to credit, not only for women but also for other minorities, who “continue to qualify generally for less credit and at higher cost than White men.”

It is not unlawfully discriminatory for a lender to consider income, assets, credit history, and similar factors to predict likelihood of default of an applicant. However, because these same variables often reflect the nation’s long history of racial and gender discrimination, even the exclusive consideration of such financial characteristics does not prevent that history of discrimination from affecting credit scores and, consequently, access to credit.

Report on Apple Card Investigation, New York State Department of Financial Services

However, as the report notes, these problems are not so much the result of bias in credit-approval algorithms as a function of social inequalities in the underlying system used to assign credit scores in the first place.
