When one of my mom’s inquisitive friends asks what I do, the “technology/antitrust/competition policy” spiel used to function as a real conversation killer/potential sleep aid. Lately, not so much. Platform competition is having its moment. The Federal Trade Commission and Department of Justice both have bold, visionary leaders at the helm, pursuing blockbuster antitrust cases against Google and Facebook. June saw the House Judiciary Committee introduce a bipartisan package of legislation to revitalize digital competition that seemingly covers everything — interoperability, nondiscrimination, line-of-business restrictions, antitrust enforcement funding, even judicial venue selection. But one aspect of the debate risks being left out: consumer protection. As we build out regulations for digital platforms, we must account for both the competition and the consumer protection problems that they can pose.
The Federal Trade Commission, for example, has one bureau focused on consumer protection and another focused on competition. While it can be tempting to think of competition and consumer protection as two separate issues, the interplay between the two means they are each maximally effective when taken in tandem. Perhaps this is why new Chair Lina Khan has worked to break down the barriers between the two bureaus and streamline their work. In one sense, competition and consumer protection go hand-in-hand. A monopoly untroubled by competitors, knowing that its customers have nowhere else to go, can exploit them freely with little consequence for its bottom line. But if the same company faces competition, with fickle customers ready to jump ship to a rival, it has every incentive to protect those customers and innovate if it wants to stay on top.
However, reality (as in so many other areas) is far more complex. For one, a “regulated monopoly” might give what seems to be splendid service while hiding the monopoly costs inflicted on consumers. Indeed, this was an argument AT&T made against its breakup in the 20th century. At first glance, the then-monopoly telecommunications provider was reliable and universal. But in reality, users were suffering from a lack of innovation and choice. Modern analogs might be today’s dominant digital platforms, which seemingly provide a positive experience to consumers while in fact hiding ways in which they can harm them. For example, Amazon and Google now pepper users with more ads than ever before, and once-clear distinctions between ads and organic search results are becoming increasingly blurred. Advertising is a legitimate business model, but a consumer protection problem arises when the distinction between an ad and a search result isn’t clear. Most consumers probably aren’t going to spot a purposefully de-emphasized “ad” or “sponsored” label, and will instead assume that results are ranked by neutral criteria (best price, highest reviews). Platforms have the right to place ads where they want, but they shouldn’t be allowed to mislead consumers about what is and is not paid content.
In other ways, competition might actually increase the incentive to exploit consumers. The problem of dark patterns, user interfaces strategically designed to manipulate users into platform-friendly behaviors, only arises because there are competitive alternatives enticing a consumer to leave a service. Dark patterns can cajole a user into staying with an inferior option when an unbiased, transaction-cost-free look at the entire competitive market might lead to a different choice. In effect, dark patterns raise switching costs, which keeps consumers from finding the option that is truly best for them.
Lessons from Europe
As the United States embarks on its consumer protection odyssey, there’s no need to start from scratch. Regulators can and should learn from other jurisdictions, emulating what works while discarding or reforming what doesn’t. One potential source is the European Commission’s proposed Digital Services Act (DSA). It should be noted at the outset that some aspects of the reform package are not advisable, particularly those that could limit reasonable tools for sharing information or free expression. Additionally, US regulators must account for the EU’s different governmental system and legal tradition when adapting its provisions to the American context. They must also strike the proper balance between a strong enforcement regime and one that doesn’t chill or kill potential innovative competitors. Still, some particular provisions of the DSA merit a look by US reformers.
Under the DSA, digital platforms are required to identify and be able to trace their business users (other companies that use or rely on the platform), which, according to the European Commission drafters, should cut down on illicit goods. Outside researchers are explicitly given access to key platform data to facilitate research and create third-party accountability. Larger platforms (with greater resources) face additional requirements, from risk management assessments to developing clear and transparent codes of conduct. This regime drives home a key point in consumer protection: We should rely on government regulation, acting on behalf of the public interest, rather than on the companies at issue themselves. Most online businesses are not inherently evil; they are merely seeking to maximize their profits. But we can’t expect that overriding incentive to perfectly align with protecting consumers or maximizing competition. Government-imposed regulation also avoids the danger of self-regulating companies weaponizing rules meant to protect consumers in order to hurt competitors.
Thankfully, there’s already a blueprint for strengthening consumer protection in the DSA. The European consumer rights group BEUC has released an excellent white paper with proposed changes to the DSA. At the outset, the paper notes that consumer protection and product safety are not explicitly laid out as objectives of the legislative package. The group also advocates for a positive liability framework (rather than just exceptions to general liability, as currently laid out), as well as stronger enforcement mechanisms beyond mere fines. While content moderation can be an important part of consumer protection, would-be regulators need to think more broadly. Issues like online trader traceability, forced arbitration, and online advertising deserve more attention from regulators.
The Problems of Digital Platforms
Consumer protection should be at the forefront of policymakers’ minds because the lack of direct connection between a user and a platform operator can create a massive power imbalance. Much of what a platform does is opaque to a user. Left unchecked, platforms can be expected to exploit that power imbalance to pad their bottom lines. If you’re unhappy with the service at a brick-and-mortar store, you can “speak to a manager” to hopefully resolve your concerns. You can also directly see the product you’re buying, and many times even physically hold it and test out its functionality. It’s much harder to speak to a digital platform’s manager. You can’t reach through the screen and try on your purchase, and its low quality might not be readily apparent. Although online retail can be efficient, I’m sure you’ve had to return shoes and clothes that don’t fit properly or look different from their on-screen presentation. Many times a user is forced to rely on customer reviews, which can be faked or manipulated to bury negative feedback. Here, the answer is not really pure competition but a hassle-free return policy, so consumers aren’t stuck with goods they don’t really want. This is another opportunity for consumer protection law to step in and help consumers.
It is also important to think about the degree of control a platform has over its users, especially in comparison to a traditional brick-and-mortar retailer. If you’re running a physical store, there are things you can do to maximize your profit at consumers’ expense. Shelf layouts are very intentional, and the racks shoppers are forced to peruse while waiting in line are stocked with high-margin items.
But that level of control pales in comparison to what a digital platform can do. Unbound by the need to physically stock items, digital platforms can morph their websites in ways that maximize profits. Platforms can personalize homepages so that consumers only see products they have a high likelihood of buying (which are rarely the most cost-effective options). Amazon can put its preferred choice in the “Buy Box” and make it so the average customer cannot find any alternatives. Prominent red notifications and infinite scrolling make it easy to spend hours perusing your Facebook or Instagram feeds.
The Solution? Competition + Consumer Protection
While healthy competition can solve a lot of platform ills, competition policy is not a panacea. In markets with massive players and few entrants, the customer relationship can be less sacred. Think about how platforms make it easy to sign up and hard to quit. Website interfaces with prominent, shiny buttons are more than happy to take your money. But when trying to leave a platform, one is met with a bevy of “are you sure?” messages, dire warnings, and hard-to-find instructions. Subscriptions and “free trials” take advantage of consumer inertia, saddling unwary consumers with ongoing, unwanted credit card charges. This behavior was recently catalogued in a report on “dark patterns” in Amazon Prime signups.
There’s also the issue of defective goods, made all the more pervasive by the virtual distance between buyer and seller. If a consumer receives a counterfeit when they thought they were buying a particular brand name, or a product that does not work as intended, can they hold the platform (such as Amazon) responsible? The Third Circuit Court of Appeals recently tackled this question in Oberdorf v. Amazon.com, Inc., holding that Amazon could be held liable as the “seller” of a defective dog leash. After a retractable dog leash malfunctioned and blinded a woman, she attempted to sue the leash’s seller. When the seller couldn’t be located, she sued Amazon. Amazon argued that it was not a “seller” under Pennsylvania law, but the court disagreed. In particular, the court found that Amazon was the only party from whom the consumer could seek redress, that imposing liability would create overall safety incentives, and that Amazon was able to bear liability costs efficiently.
While the court’s result is encouraging, it is far from settled law. Clear, pro-consumer product liability is exactly the type of thing that a consumer protection-oriented agency could accomplish through rulemaking. Online marketplaces like Amazon need to be incentivized to protect consumers from dangerous or counterfeit goods, and a liability regime is a good way to do that.
It should be noted that consumer protection pressure is already having an effect on Amazon. Last month, the company announced that it would begin to directly compensate consumers harmed by defective products sold on the Amazon Marketplace. Amazon still treats the seller as the first point of contact, but has agreed to step in and cover the claim if the seller cannot be found to provide proper restitution. While this is a laudable step forward, company promises can evaporate whenever convenient, and therefore are no substitute for enforceable consumer protection rules.
Digital platforms are by no means the first novel product to present unique competition and consumer protection concerns, so we should look to history for answers. New technological and scientific advances have repeatedly been met with their own specialized regulators and regulations, and digital platforms should be no different. Telecommunications gave us the Federal Communications Commission, pharmaceutical advances gave us the Food and Drug Administration, and digital platforms now call for a regulator of their own.
However, it is pivotal that this regulator have dual missions of competition and consumer protection. We cannot expect consumer protection to emerge from a myopic focus on competition. Authorizing legislation must explicitly spell out the consumer protection aims of the new agency and allow for agency rulemaking to tackle the consumer protection problems inherent in platform business models. The most important thing for would-be tech reformers is to keep consumer protection in mind as they craft competition rules. Consumer protection does not come naturally to any company, but strong regulation can make companies pay attention to consumers. Public Knowledge looks forward to continuing this conversation as efforts to rein in Big Tech reach their crescendo.