Is Your Rent an Antitrust Violation?
If you rent your home, there’s a good chance your landlord uses RealPage to set your monthly payment. The company describes itself as merely helping landlords set the most profitable price. But a series of lawsuits says it’s something else: an AI-enabled price-fixing conspiracy.
The classic image of price-fixing involves the executives of rival companies gathering behind closed doors and secretly agreeing to charge the same inflated price for whatever they’re selling. This type of collusion is one of the gravest sins you can commit against a free-market economy; the late Justice Antonin Scalia once called price-fixing the “supreme evil” of antitrust law. Agreeing to fix prices is punishable with up to 10 years in prison and a $100 million fine.
But, as the RealPage example suggests, technology may offer a workaround. Instead of getting together with your rivals and agreeing not to compete on price, you can all independently rely on a third party to set your prices for you. Property owners feed RealPage’s “property management software” their data, including unit prices and vacancy rates, and the algorithm—which also knows what competitors are charging—spits out a rent recommendation. If enough landlords use it, the result could look the same as a traditional price-fixing cartel: lockstep price increases instead of price competition, no secret handshake or clandestine meeting needed.
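RealPage hasn’t published the details of its model, so the sketch below is a deliberately simplified, hypothetical illustration of the general mechanism: pool the rents that competitors are charging, benchmark against the upper end of that range, and recommend a price adjusted for vacancies. The function name, the percentile, and the vacancy discount are all invented for the example; the point is only that the recommendation depends directly on rivals’ prices.

    # Hypothetical illustration only, not RealPage's actual model: a toy rule
    # that pools competitor rents and nudges a unit's price toward them.
    from statistics import quantiles

    def recommend_rent(current_rent: float,
                       competitor_rents: list[float],
                       vacancy_rate: float) -> float:
        """Recommend a rent anchored to the upper range of nearby competitors,
        easing off only when vacancies are high."""
        # Benchmark on the 75th percentile of what competitors are charging.
        benchmark = quantiles(competitor_rents, n=4)[2]
        # Move toward the benchmark, discounted when units are sitting empty.
        target = max(current_rent, benchmark) * (1.0 - 0.5 * vacancy_rate)
        return round(target, 2)

    # A landlord charging $1,900 in a market where rivals charge $2,000-$2,200
    # is told to raise the rent, because the rule can see everyone's prices.
    print(recommend_rent(1900, [2000, 2050, 2100, 2200], vacancy_rate=0.05))

Run by one landlord, a rule like this is just advice; run by most of the landlords in a market, it starts to look like the lockstep pricing the lawsuits describe.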
Without price competition, businesses lose their incentive to innovate and lower costs, and consumers get stuck with high prices and no alternatives. Algorithmic price-fixing appears to be spreading to more and more industries. And existing laws may not be equipped to stop it.
In 2017, Maureen Ohlhausen, then the acting chair of the Federal Trade Commission, gave a speech to antitrust lawyers warning about the rise of algorithmic collusion. “Is it okay for a guy named Bob to collect confidential price strategy information from all the participants in a market and then tell everybody how they should price?” she asked. “If it isn’t okay for a guy named Bob to do it, then it probably isn’t okay for an algorithm to do it either.”
The many lawsuits against RealPage differ in their details, but all make the same central argument: RealPage is Bob. According to one estimate, in more than 40 housing markets across the United States, 30 to 60 percent of multifamily-building units are priced using RealPage. The plaintiffs suing RealPage, including the Arizona and Washington, D.C., attorneys general, argue that this has enabled a critical mass of landlords to raise rents in concert, making an existing housing affordability crisis even worse. (In a statement, RealPage has responded that the share of landlords using its services is far lower, about 7 percent nationwide. RealPage’s estimate includes all rental properties, whereas the lawsuits focus on multifamily-building units.)
According to the lawsuits, RealPage’s clients act more like collaborators than competitors. Landlords hand over highly confidential information to RealPage, and many of them recruit their rivals to use the service. “Those kinds of behaviors raise a big red flag,” Maurice Stucke, a law professor at the University of Tennessee and a former antitrust attorney at the Department of Justice, told me. When companies are operating in a highly competitive market, he said, they typically go to great lengths to protect any sensitive information that could give their rivals an edge.
The lawsuits also argue that RealPage pressures landlords to comply with its pricing suggestions—something that would make no sense if the company were merely being paid to offer individualized advice. In an interview with ProPublica, Jeffrey Roper, who helped develop one of RealPage’s main software tools, acknowledged that one of the greatest threats to a landlord’s profits is when nearby properties set prices too low. “If you have idiots undervaluing, it costs the whole system,” he said. RealPage thus makes it hard for customers to override its recommendations, according to the lawsuits, allegedly even requiring a written justification and explicit approval from RealPage staff. Former employees have said that failure to comply with the company’s recommendations could result in clients being kicked off the service. “This, to me, is the biggest giveaway,” Lee Hepner, an antitrust lawyer at the American Economic Liberties Project, an anti-monopoly organization, told me. “Enforced compliance is the hallmark feature of any cartel.”
The company disputes this description, claiming that it simply offers “bespoke pricing recommendations” and lacks “any power” to set prices. “RealPage customers make their own pricing decisions, and acceptance rates of RealPage’s pricing recommendations have been greatly exaggerated,” the company says.
In December, a Tennessee judge rejected RealPage’s motion to have a class-action lawsuit against it dismissed, allowing the case to go forward. It would be a mistake, however, to conclude from that example that the legal system has the algorithmic price-fixing problem under control. RealPage could still prevail at trial, and in any case, it isn’t alone. Its main competitor, Yardi, is involved in a similar lawsuit. One of RealPage’s subsidiaries, a service called Rainmaker, faces multiple legal challenges for allegedly facilitating price-fixing in the hotel industry. (Yardi and Rainmaker deny wrongdoing.) Similar complaints have been brought against companies in industries as varied as health insurance, tire manufacturing, and meat processing. But winning these cases is proving difficult.
The challenge is this: Under existing antitrust law, showing that companies A and B used algorithm C to raise prices isn’t enough; you need to show that there was some kind of agreement between companies A and B, and you need to allege specific facts suggesting that such an agreement exists before you can formally request evidence of it. This dynamic can place plaintiffs in a catch-22: Plausibly alleging the existence of a price-fixing agreement is hard to do without access to evidence such as private emails, internal documents, or the algorithm itself. But plaintiffs typically can’t uncover those kinds of materials until they are given the legal power to request evidence in discovery. “It’s like trying to fit a square peg in a round hole,” Richard Powers, a former deputy assistant attorney general in the DOJ antitrust division, told me. “It makes the job really hard.”
In the case of RealPage, the plaintiffs were able to make the peg fit. But in May, a Nevada judge dismissed a similar case against a group of Las Vegas hotels that used Rainmaker, concluding that there wasn’t enough evidence of a price-fixing agreement, because the hotels involved hadn’t shared confidential information with one another and weren’t required to accept Rainmaker’s recommendations, even though they allegedly did so about 90 percent of the time. “The rulings so far have set the bar very high,” Kenneth Racowski, a litigation attorney at Holland & Knight, told me. The RealPage case “was able to clear that bar, but it might prove to be the exception.”
And cases like RealPage and Rainmaker may be the easy ones. In a series of papers, Stucke and his fellow antitrust scholar Ariel Ezrachi have outlined ways in which algorithms could fix prices that would be even more difficult to prevent or prosecute—including situations in which an algorithm learns to fix prices without its creators or users intending it to. Something similar could occur even if companies used different third-party algorithms to set prices. They point to a recent study of German gas stations, which found that when one major player adopted a pricing algorithm, its margins didn’t budge, but when two major players adopted different pricing algorithms, the margins for both increased by 28 percent. “In situations like these, the algorithms themselves actually learn to collude with each other,” Stucke told me. “That could make it possible to fix prices at a scale that we’ve never seen.”
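The algorithms in that study are proprietary, but a toy simulation, again entirely hypothetical, shows how the dynamic can emerge: two sellers each run a simple, independent “match the rival’s price and occasionally probe higher” rule, never exchange a word, and still drift toward higher prices. The starting price, the ceiling, and the probe probability below are invented for the illustration.

    # Hypothetical illustration, not the algorithms from the study: two sellers
    # run independent "match and probe" pricing rules with no communication.
    import random

    random.seed(0)
    START, CEILING = 1.20, 2.00  # illustrative prices per liter

    def next_price(own_last: float, rival_last: float) -> float:
        """Match the higher of the two observed prices, and occasionally
        probe a small increase to see whether the rival follows."""
        price = max(own_last, rival_last)
        if random.random() < 0.2:      # occasional upward probe
            price = min(price * 1.02, CEILING)
        return round(price, 2)

    a, b = START, START
    for week in range(52):             # a year of weekly repricing
        a, b = next_price(a, b), next_price(b, a)

    print(f"After one year: seller A charges {a:.2f}, seller B charges {b:.2f}")
    # Both end up well above the starting price, with no agreement anywhere.

Because each rule reacts only to publicly observable prices, there is no email to subpoena and no meeting to expose, which is exactly the gap Stucke and Ezrachi worry existing law cannot reach.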
None of the situations Stucke and Ezrachi describe involve an explicit agreement, making them almost impossible to prosecute under existing antitrust laws. Price-fixing, in other words, has entered the algorithmic age, but the laws designed to prevent it have not kept up. Powers said he believes existing antitrust laws cover algorithmic collusion—but he worried that he might be wrong. “That’s the thing that kept me up at night,” he said about his tenure at the Department of Justice. “The worry that all 100-plus years of case law on price-fixing could be circumvented by technology.”
Earlier this year, a handful of Senate Democrats led by Amy Klobuchar introduced a bill that would update existing laws to automatically presume a price-fixing agreement whenever “competitors share competitively sensitive information through a pricing algorithm to raise prices.” That bill, like so much congressional legislation, is unlikely to become law anytime soon. Local governments might have to take the lead. Last week, San Francisco passed a first-of-its-kind ordinance banning “both the sale and use of software which combines non-public competitor data to set, recommend or advise on rents and occupancy levels.”
Whether other jurisdictions follow suit remains to be seen. In the meantime, more and more companies are figuring out ways to use algorithms to set prices. If these tools really do enable de facto price-fixing, and manage to escape legal scrutiny, the result could be a kind of pricing dystopia in which competition to create better products and lower prices would be replaced by coordination to keep prices high and profits flowing. That would mean permanently higher costs for consumers—like an inflation nightmare that never ends. More profoundly, it would undermine the incentives that keep economies growing and living standards rising. The basic premise of free-market capitalism is that prices are set through open competition, not by a central planner. That goes for algorithmic central planners too.