On 14 June 2017, the OECD published a Note from the EU on Algorithms and Collusion (DAF/COMP/WD(2017)12 - here) (the EU Note). An updated background note on Algorithms and Collusion was published by the OECD Secretariat on 9 June 2017 (DAF/COMP(2017)4 - here). At the same time, the Antitrust Division of the U.S. Department of Justice (DOJ) and the U.S. Federal Trade Commission (FTC) also published a policy paper on the impact of algorithms and the US approach (DAF/COMP/WD(2017)41 - here) (the DOJ/FTC Paper).
Luckily, there is no sign of any conceptual change in how pricing algorithms are to be analysed under EU and US competition law. They will continue to be analysed under the existing concepts, distinguishing between vertical and horizontal relations, and between unlawful "explicit collusion" and lawful "intelligent adaptation" (also called "tacit collusion").
To briefly summarise, an algorithm is decision-making software turning digital inputs into digital outputs (potentially self-learning, i.e. amending its own rules depending on past experience). For obvious reasons, the web-based economy creates an unprecedented level of price transparency, and more and more companies are using algorithms to adapt their prices to quickly changing market conditions - almost in real time. The use of algorithms makes the traditional information sharing/price fixing cartelist look outdated. However, the key question is whether and under what conditions the competition authorities might view the use of algorithms as a competition law offence. The good news is that they generally won't. The bad news is that any technical improvement of your self-learning algorithms can make you cross the Rubicon and expose you to significant liability.
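To make the summary above concrete, a pricing algorithm of the kind discussed here can be sketched in a deliberately simplified form (the rule and parameter names below are purely hypothetical illustrations, not any real vendor's product):

```python
def reprice(competitor_prices, margin_floor):
    """Toy pricing algorithm: digital inputs (observed competitor
    prices) in, a digital output (a new own price) out.  It matches
    the cheapest observed competitor, but never drops below a
    configured margin floor."""
    target = min(competitor_prices)
    return max(target, margin_floor)
```

A self-learning variant would, in addition, adjust its own rules (here, for example, the margin floor) based on how the market responded to past outputs.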
The EU Note
In a vertical context, the EU Note points out that algorithms greatly facilitate the monitoring of resellers who are unwilling to respect the resale price recommendations of the supplier. However, as with traditional "offline" standards of EU competition law (national law may be stricter), it is mainly the subsequent retaliation or "pressure" that turns the unilateral monitoring into unlawful resale price maintenance (RPM). The Note does have some slight ambiguity in the language: "If algorithm-enabled price monitoring allows a supplier to pressure a retailer to stick to a 'recommended' price, the supplier would be actually turning that 'recommended' price into a fixed resale price (RPM)." (paragraph 15 of the EU Note). This suggests that RPM might already occur where the algorithm "allows" a supplier to "pressure" a retailer. However, that appears to be a stretch. Typically it would be the exertion of pressure as a result of the price monitoring that would trigger the RPM. Merely monitoring or knowing that recommended prices are not being adhered to would typically not be viewed as RPM.
In a horizontal context, the EU Note distinguishes four scenarios: monitoring, implementing, engaging in, and implied (tacit) collusion.
Monitoring: Where algorithms are used to monitor already agreed prices, it is the preceding agreement which breaches the law, not the implementation. However, monitoring could well be seen as an aggravating circumstance, increasing potential fines. As stated in the EU Guidelines on Vertical Restraints, a price monitoring system makes price fixing more effective (para 48 of the 2010 Guidelines).
Implementing: Similarly, where algorithms are used to implement pre-existing explicit collusion, there is nothing new. The EU Note refers to the European Court of Justice's Eturas judgment - case C-74/14 (paragraph 25), where Lithuanian travel agents used common software and the system administrator proposed (in a circular it sent to all the travel agents using the system) to implement a rule that would automatically limit the discounts granted (albeit the travel agents retained the ability to change the default). Irrespective of whether the travel agents had actually read the circular, presumed awareness of the proposed rule was held to be a sufficient basis for the 'concerted practice'. From an evidential perspective this case raises significant concerns: the software provider imposed the discount cap as a default rule in the system, and there was no evidence of any bilateral discussions or agreement on this default discount.
Engaging: Competition law is clearly triggered where algorithms are used to engage in explicit collusion, e.g. where competitors agree on particular repricing parameters or pricing algorithm strategies (directly or in a tripartite "hub-and-spoke" scenario). The EU Note (in paragraph 27) expressly refers to price signaling in this context, i.e. where companies signal future price increases early on, intended for competitors rather than for customers (as was the case in the Container Shipping case of 2016). More importantly, the EU Note evokes "yet another scenario": where the algorithms themselves reach something akin to an agreement, the firms using them would remain liable for their behavior (paragraph 28). Conceptually this is clear cut but may prove difficult in practice to defend against. How do you ensure technically that your algorithm does not outsmart you and 'decide' to collude with others?
Implied/Tacit collusion: Last but not least, in relation to (lawful) tacit collusion, the EU Note raises the possibility of taking "an expanded interpretation of the notion of 'communication', in order to bring cases of algorithm-enabled price matching within the scope of Article 101" (paragraph 33). The EU Note acknowledges (in paragraph 34) that short of price signaling, tacit collusion remains and should remain lawful, but also warns that "at this stage, one cannot fully rule out the possibility that more creative and novel types of interactions could in certain situations meet the definition of 'communication'". Again, a technical advancement may result in crossing the boundary into unlawful conduct.
The DOJ/FTC Paper
In relation to the US, the DOJ/FTC Paper addresses the application and limits of US antitrust analysis to business conduct involving technologically advanced tools such as pricing algorithms.
It points out that "[c]omputer-determined pricing may be susceptible to coordination, just as human-determined pricing can be." (paragraph 11) In common with the EU Note, the DOJ/FTC Paper points out that algorithmic pricing may similarly be a mechanism for implementing a collusive agreement between individuals (similar to Implementing in the EU Note above), and that if competing firms agree to use pricing algorithms in a particular way, then the conduct would clearly amount to an anticompetitive agreement between the competitors (similar to "Engaging" in the EU Note above), both set out in paragraph 13 of the DOJ/FTC Paper.
The DOJ/FTC Paper also specifically identifies the different scenarios in which algorithmic pricing could implement coordinated anticompetitive price changes, citing the Trod Ltd case (discussed below) as an example. However, it also acknowledges that “[a]bsent concerted action, independent adoption of the same or similar pricing algorithms is unlikely to lead to antitrust liability even if it makes interdependent pricing more likely." (paragraph 18) For example, the DOJ/FTC Paper notes that "if multiple competing firms unknowingly purchase the same software to set prices, and that software uses identical algorithms, this may effectively align the pricing strategies of all the market participants, even though they have reached no agreement.” (also at paragraph 18) Thus, while the US antitrust authorities argue that current antitrust rules should be applied to companies that employ pricing algorithms, the authorities also acknowledge that they need to have evidence of collusion before taking any action.
While the EU Commission has yet to take a case specifically related to the use of algorithms, its e-commerce sector inquiry (here) found that 53% of respondents tracked online prices of competitors, "two thirds of which use automatic software to adjust their own prices based on the observed prices of competitors". The Commission has already noted that the "wide-scale use of such software may in some situations, depending on the market conditions, raise competition concerns" (Final Report, paragraph 13). The Commission has noted that it will use the findings from the inquiry to step up EU antitrust enforcement in European e-commerce markets.
Other competition authorities have already taken action against algorithm users, notably in the US and UK (both cases relating to the same online poster sales price fixing agreement), as well as the Lithuanian Eturas case, set out below.
In the US, the DOJ has taken two cases where software was found to have contributed to the anticompetitive conduct.
In 1994, the DOJ found that the use of a jointly owned computerized online booking system, the Airline Tariff Publishing Company, provided the airlines not only with the means to disseminate fare information to the public, but also to engage in private dialogues on fares: "certain features of the system enabled the airlines to reach overt price-fixing agreements" and "facilitate[d] pervasive coordination of airline fares short of price fixing" (paragraph 12).
In 2015 it found that Trod Ltd (doing business as Buy 4 Less, Buy For Less and Buy-For-Less-Online) had agreed to fix the prices of certain posters sold online through Amazon Marketplace by means of specific pricing algorithms whose code was written to implement the agreement. This was achieved by one of the competitors programming its algorithm to find the lowest price offered by a third party for a particular poster, and then set its poster price just below that. The other price fixer then set its algorithm to match the price set by the first party. Charges were also leveled against the two individuals involved, including the director and owner of Trod Ltd. Assistant Attorney General Bill Baer of the DOJ’s Antitrust Division stated that this case represented “the Division’s first criminal prosecution against a conspiracy specifically targeting e-commerce,” and that the Division “will not tolerate anticompetitive conduct, whether it occurs in a smoke-filled room or over the Internet using complex pricing algorithms.”
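The leader/follower mechanics described in the Trod case can be sketched, purely for illustration, as follows (this is a hypothetical reconstruction of the logic described in the charging documents, not the actual code used by the parties):

```python
def leader_price(third_party_prices, undercut=0.01):
    """The first conspirator's algorithm: find the lowest price offered
    by any third party for the poster, and price just below it."""
    return min(third_party_prices) - undercut

def follower_price(leaders_price):
    """The co-conspirator's algorithm: simply match the leader's price,
    so the two 'competitors' never undercut each other."""
    return leaders_price

# With third parties at 12.99, 11.50 and 13.25, the leader prices just
# below 11.50 and the follower matches it exactly.
p1 = leader_price([12.99, 11.50, 13.25])
p2 = follower_price(p1)
```

The point the DOJ made is that the anticompetitive element is the prior agreement on these rules, not the software itself: the same matching behaviour arrived at independently would be interdependent pricing, not a conspiracy.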
In the UK, the Competition and Markets Authority (CMA) also pursued Trod Ltd and GB eye Ltd, which had agreed not to undercut each other's prices for posters and frames sold on Amazon's UK website, and which then implemented this agreement through the use of automated repricing software. It should be noted that in the case the CMA had email evidence of discussions between the parties, with the automated repricing software subsequently configured to give effect to their agreement (para 3.62 of the CMA Decision). In addition to fines, the CMA case also resulted in a five-year director disqualification for the director of Trod Ltd.
In Lithuania, the Lithuanian competition authority fined the online booking provider Eturas and 29 travel agents €1.5 million after Eturas (the booking platform) sent a system message (alert) to all agents (users) informing them that discounts in the platform would be technically limited to 3%. The limit followed a questionnaire sent to users asking them what discount limit they wanted to see implemented on the platform. The Court of Justice of the EU (following a reference from the Lithuanian Supreme Court) stated that if the users had received the alert and opened it, then they were presumed to have participated in the concerted action, unless they could prove they had distanced themselves from the infringing conduct. However, if the users were not aware of the alert, and could prove this, then the mere implementation of the measure was not sufficient to impute liability to them (paragraph 41 of Case C-74/14 - Eturas).
Conclusions and practical steps
There is a significant risk that self-learning algorithms will develop a capacity to reach a logical understanding and/or agreement with competitors' algorithms, in which event the users would remain liable for any damage caused by their technologies.
As a practical measure, it would be prudent for algorithm developers and users to maintain a clear audit trail of all the steps taken during the development of the algorithm, noting in particular the decision-making process of the algorithm (the decision tree) and any changes that are made to the algorithm during its use. It would also be prudent to ensure that the input parameters (source data) used by the algorithm are set by the user, and that the default settings are not used. Consideration should also be given to whether the same algorithm is being used by your competitors. If so, the fewer the better.
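One minimal way to keep such an audit trail is to log every pricing decision together with the inputs and user-set parameters that produced it. The sketch below is illustrative only (the function and field names are hypothetical, not a prescribed compliance standard):

```python
import json
from datetime import datetime, timezone

def log_pricing_decision(logfile, inputs, parameters, decision):
    """Append one pricing decision to an audit log (one JSON record
    per line), recording the source data the algorithm saw, the
    user-set parameters in force, and the price it produced, so the
    algorithm's behaviour can later be reconstructed."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "inputs": inputs,          # source data the algorithm saw
        "parameters": parameters,  # user-set (not default) configuration
        "decision": decision,      # price the algorithm produced
    }
    with open(logfile, "a") as f:
        f.write(json.dumps(record) + "\n")
```

Any change to the algorithm's rules or parameters would then show up in the log as a change in the recorded configuration, giving the audit trail suggested above.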
As a company using any algorithms, it is essential to read the small print and understand exactly how the system works, in particular any automatic rules setting default prices/rebates/margins, and to remain vigilant to any changes made to the system.