The FTC rang in the New Year with a report, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues. The report wrestles with the implications of using big data analytics to target and make decisions about consumers. Some of the FTC’s observations are common sense, but others require more careful analysis. For example, the FTC discusses how it considers whether a company is subject to the Fair Credit Reporting Act, how even statistically sound models might produce results that have a discriminatory impact, and what kinds of activities the FTC might consider “unfair,” “deceptive,” and even “unethical.”
The Backdrop: The report comes over a year after the FTC engaged with industry participants and consumers about the risks and benefits of using “big data.” The report explained, “The proliferation of smartphones, social networks, cloud computing, and more powerful predictive analytic techniques have enabled the collection, analysis, use, and storage of data in a way that was not possible just a few years ago.” The FTC’s goal was to examine the use of “big data analytics techniques” by companies “such as financial institutions, online and brick and mortar retailers, lead generators, and service providers” to “categorize consumers and make predictions about their behavior.” While acknowledging that data collection, compilation, consolidation, and analysis all play into the equation, the FTC focused on companies’ use of that data.
The Good: The good news is that the FTC continues to recognize that big data provides “numerous opportunities for improvements in society,” including allowing companies to “target educational, credit, healthcare, and employment opportunities to low-income and underserved populations” who otherwise might miss out on opportunities generated by traditional data sources alone. The FTC also stayed away from recommending further legislation or regulation to deal with the new issues big data poses. Instead, it provided further explanation and guidance about how existing laws and regulations apply and emphasized that it will continue to enforce them.
The Bad (and the Complicated): At the same time, the FTC reiterated concerns that “inaccuracies and biases” in data “might lead to detrimental effects for low-income and underserved populations.” In particular, it examined “concerns that companies could use big data to exclude low-income and underserved communities from credit and employment opportunities.” It focused on three sets of applicable laws.
- Fair Credit Reporting Act: The FTC emphasized that the FCRA is not limited to traditional consumer reporting agencies, credit bureaus, and lenders subject to the statute. It also applies to “data brokers,” particularly if those companies “advertise their services for eligibility purposes.” That’s an interesting statement because the statute doesn’t specifically mention advertisement of services as a factor in determining whether the FCRA applies. The FTC emphasized, though, that whether a company uses “traditional” credit scoring models or “non-traditional” consumer characteristics like zip code, social media usage, or shopping history, the FTC uses the same methods and standards in applying the statute. It conducts a “fact-specific analysis” to determine whether any given practice is subject to or violates the FCRA. As such, it emphasized that “companies should be mindful of the law when using big data analytics to make FCRA-covered eligibility determinations.”
- Equal Opportunity Laws: While companies should consider numerous equal opportunity laws, including the ECOA, Title VII, ADA, ADEA, and FHA, the FTC report focuses on the ECOA. The report embraces the much-debated disparate impact doctrine. As the FTC explained, “Disparate treatment occurs when a creditor treats an applicant differently based on a protected characteristic,” such as refusing to “lend to single persons or offer[ing] less favorable terms to them than married persons.” “Disparate impact occurs when a company employs facially neutral policies or practices that have a disproportionate adverse effect or impact on a protected class, unless those practices or policies further a legitimate business need that cannot reasonably be achieved by means that are less disparate in their impact.” The FTC specifically called out the making of credit decisions based on a consumer’s zip code as a practice that “may have a disparate impact on particular ethnic groups because certain ethnic groups are concentrated in particular zip codes.” The report also provides specific advice on advertising and pre-screening using big data. As with the FCRA, the FTC believes that the application of equal opportunity laws is a “case‑specific inquiry,” and advises that “companies should proceed with caution when their practices could result in disparate treatment or have a demonstrable disparate impact based on protected characteristics.”
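The report’s description of disparate impact lends itself to a simple quantitative screen. Below is a minimal sketch, using hypothetical applicant counts, of the “four-fifths rule” from EEOC guidance, which flags a facially neutral practice when one group’s selection rate falls below 80% of the highest group’s rate. (The four-fifths rule is a screening heuristic from employment-selection guidance, not a legal test, and it is not drawn from the FTC report itself.)

```python
def adverse_impact_ratio(selected, total):
    """Each group's selection rate divided by the highest group's rate."""
    rates = {g: selected[g] / total[g] for g in total}
    top = max(rates.values())
    return {g: rate / top for g, rate in rates.items()}

# Hypothetical outcomes of a facially neutral screening model:
selected = {"group_a": 48, "group_b": 24}
applied = {"group_a": 100, "group_b": 100}

ratios = adverse_impact_ratio(selected, applied)
# Flag any group whose ratio falls below the four-fifths (0.8) threshold.
flagged = {g: r for g, r in ratios.items() if r < 0.8}
```

A ratio below 0.8 does not establish a violation; as the FTC emphasizes, the legal analysis is case-specific. But a check like this is a cheap way to notice when a model’s outputs deserve a closer look.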
- UDAP: The FTC, of course, also enforces Section 5 of the FTC Act, which prohibits unfair or deceptive acts or practices. While the statute is broad, the FTC specifically called out four kinds of practices as problematic: (a) violating “any material promises to consumers—whether that promise is to refrain from sharing data with third parties, to provide consumers with choices about sharing, or to safeguard consumers’ personal information;” (b) failing “to disclose material information to consumers”; (c) failing to “reasonably secure consumers’ data”; and (d) selling “big data analytics products to customers if they know or have reason to know that those customers will use the products for fraudulent or discriminatory purposes.”
The Consequences: The stakes are real. In FTC enforcement actions, for example, the agency can recover a penalty of up to $3,500 per FCRA violation. Even before issuing the report, the FTC had used its enforcement authority to go after the alleged misuse of consumer data. This past October, the FTC entered into a $2.95 million settlement with a phone carrier that allegedly failed to properly tell customers with lower credit scores that they’d been placed into a special, more expensive program. The FTC has also recently gone after data brokers for allegedly improperly selling consumer data. Enforcement around consumer data has been a priority for the FTC and other agencies, and will remain so.
What to Do? In addition to carefully considering whether, and if so how, the FCRA, ECOA, and other applicable laws apply, companies using big data should consider the quality of that data. The FTC provided some specific ideas that it “encourages” companies to consider.
- Representative Data: “Companies should consider whether their data sets are missing information about certain populations, and take steps to address issues of underrepresentation and overrepresentation.”
- Biased Data: “Companies should consider whether biases are being incorporated at both the collection and analytics stages . . . For example, if a company has a big data algorithm that only considers applicants from ‘top tier’ colleges to help them make hiring decisions, they may be incorporating previous biases in college admission decisions.”
- Accuracy: Big data isn’t magic. The FTC re-emphasized that traditional statistical methods should be applied to test the strength of models using big data, and noted that “while big data is very good at detecting correlations, it does not explain which correlations are meaningful.” While the FTC isn’t particularly concerned with how this plays out in advertising, it is very much concerned with the use of analytics in making credit and employment decisions, for example, in flagging individuals as “risky” based on social media history.
- Ethical and Fairness Concerns: Finally, the FTC encouraged companies to think more broadly about how they’re using big data, both to avoid unfairness and enhance current practices. For example, the FTC cited with approval the use of recruitment tools that objectively match companies with qualified candidates, “but also ensure that those candidates are not limited to particular gender, racial, and experiential backgrounds.”