In recent court filings and public statements, the Department of Justice Antitrust Division (“DOJ” or “the Division”) has taken the position that price fixing accomplished through algorithmic software is per se illegal under the antitrust laws. These statements make clear that DOJ views algorithms as the new frontier of antitrust enforcement and is willing to pursue cases involving this rapidly developing technology.

DOJ Statement of Interest

On November 15, 2023, DOJ filed a Statement of Interest in Tennessee federal court (“DOJ’s SOI”) that explicitly outlines the government’s theory of per se liability for algorithmic price fixing. DOJ filed the SOI in a consolidated class action brought by a proposed class of apartment and student housing lessees alleging that 34 property managers, owners, operators, and lessors conspired with a property management software company to artificially inflate lease prices above competitive levels. Plaintiffs allege that the pricing software gathered real-time pricing and vacancy data from lessors, which it used to make pricing and vacancy recommendations that the lessors allegedly agreed to adhere to, with the understanding that their competitors would do the same. Plaintiffs assert that the alleged conduct amounts to a per se violation of the federal antitrust laws, as well as of various state antitrust and consumer protection statutes.

DOJ’s SOI supports plaintiffs’ assertion that the defendants engaged in per se unlawful price fixing by allegedly agreeing to adhere to recommendations generated from pooled algorithmic pricing data. While DOJ acknowledged that “not every use of an algorithm to set price qualifies as a per se violation of Section 1 of the Sherman Act,” it argued that such use is per se unlawful when “competitors knowingly combine their sensitive, nonpublic pricing and supply information in an algorithm that they rely upon in making pricing decisions, with the knowledge and expectation that other competitors will do the same.”

DOJ’s commitment to pursuing per se algorithmic price fixing cases

The SOI was presaged by recent comments from Ryan Tansey, Chief of the Division’s Criminal I Section, at the ABA 2023 Antitrust Fall Forum focused on Artificial Intelligence (“AI”), held on November 9, 2023. Tansey said that algorithms are expanding the range of markets the Division will investigate criminally, noting that such investigations are being conducted in markets that, absent algorithms, regulators would not typically have probed. Tansey explained that criminal price fixing charges will always “depend on the facts,” warning that, as technological advances occur, “concerted actions will take more sophisticated forms.” In the case of algorithms, which Tansey contends make it “vastly easier for competitors to collude in large, decentralized markets,” the Division will look at “the intent of the people who are implementing the algorithm and using the algorithm,” including what they are trying to get out of it and whether they know that others using the algorithm are using it with the same intent.

In a statement before Congress on November 14, 2023, DOJ Assistant Attorney General Jonathan Kanter explained that the Division is committed to “building the technical and substantive infrastructure to address artificial intelligence and other complex digital tools.” These efforts include hiring data scientists for the first time to ensure that the Division is able to understand the impact of this technology on competition.

Implications for businesses

DOJ’s enforcement position has significant implications for businesses that develop or use algorithmic pricing models, and it will likely have a direct impact on the shape of antitrust enforcement as it pertains to fast-developing AI technologies. Companies should therefore invest in antitrust compliance with respect to AI and algorithms, since DOJ intends to hold companies accountable for how their algorithms are used. Companies will need to understand and monitor the tools they are using, including what data are being shared and with whom, to avoid potential antitrust exposure.