GCR Digital Markets Guide – Fourth Edition
Competition authorities zero in on antitrust risks of algorithmic pricing
The fourth edition of the Digital Markets Guide – edited by Claire Jeffs of Slaughter and May and Andrew Lacy and Arman Oruc of Goodwin Procter – provides detailed analysis of the digital economy, combined with practical and timely guidance for both practitioners and enforcers trying to navigate this fast-moving environment. It examines both the current state of the law and the direction of travel for the most important jurisdictions in which international businesses operate. The guide draws on the wisdom and expertise of distinguished practitioners globally and brings together unparalleled proficiency in the field to provide essential guidance – for all competition professionals – on subjects as diverse as how pricing algorithms intersect with competition law and antitrust enforcement in certain tech mergers.
Competition authorities zero in on antitrust risks of algorithmic pricing
Alejandro Guerrero Perez, Ombline Ancelin, David Trapp and Andrea Pomana[1]
Simmons & Simmons LLP
Summary
INTRODUCTION
NEW LEGISLATIVE INITIATIVES IN THE TECH SPACE (THE DSA, DMA AND AI ACT)
CONCLUSION
INTRODUCTION
In the past few years, the international and EU competition law community has turned its attention to the rise of artificial intelligence (AI) and, in particular, algorithmic pricing, its impact on competition and the search for an appropriate response under antitrust laws.[2] Worldwide, companies are increasingly using algorithms and AI to power their operations, from product development to manufacturing and marketing. Product pricing and consumer targeting are no exception to this trend. The exponential improvement of these technologies, and the very specific impact of algorithm-related conduct on consumer behaviour, competition and even personal data protection, are drawing increasing attention from regulators all over the world.[3]
The use of these technologies is still nascent in most markets, however, and enforcers have had little opportunity so far to identify competition issues. There are very few precedents finding anticompetitive pricing algorithms in breach of EU competition rules, but this has not precluded models and studies from being developed and published on the matter.
Much of the analysis carried out has focused on assessing possible theories of harm of pricing algorithms, and their respective fit within the definitions of Articles 101 and 102 of the Treaty on the Functioning of the European Union (TFEU). The theories of harm most commonly identified regarding pricing algorithms and EU competition law include: (1) algorithmic collusion (Article 101 TFEU); and (2) algorithmic unilateral exclusionary conduct (predatory pricing, rebates) and exploitative conduct (excessive pricing, unfair trading practices and price discrimination) (Article 102 TFEU). We may also add to this list the regulatory limitations on certain algorithmic pricing practices introduced by the new Digital Markets Act (DMA), Digital Services Act (DSA) and Artificial Intelligence Act (AI Act). A number of provisions in these recent legislative initiatives have a strong competition policy component (particularly in the case of the DMA) and may curb the unfettered use of pricing algorithms.
The evidence, models and studies carried out on algorithmic pricing do not unequivocally point to systematic anticompetitive effects, particularly in real economy conditions. Indeed, as highlighted by the Organisation for Economic Co-operation and Development (OECD), the magnitude of the threat of algorithmic collusion by autonomous self-learning algorithms is still disputed in the academic literature and there are few known cases.[4] With regard to certain unilateral pricing algorithm practices, such as price discrimination, it is still unclear to what extent this practice is deployed by companies.
Against this backdrop, a more reasonable and proportionate approach would be for policymakers and regulators to gather more information on real-life cases of anticompetitive algorithmic pricing, in order not to put the proverbial cart before the horse and hinder a technological development that undoubtedly also has societal and consumer benefits.
Algorithms are sequences of automated operations and processes that transform an input into an output; pricing algorithms are those algorithms whose output is the determination of the price of a product.

Studies distinguish between pricing algorithms that monitor other companies' prices (price monitoring algorithms), those that recommend or automatically set a price based on other companies' prices and/or market conditions such as demand (dynamic pricing algorithms), and those that tailor prices to specific individuals based on their features (personalised pricing algorithms).[5]
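These three categories can be sketched in a few lines of toy code. Everything below – the function names, numbers and pricing rules – is an invented, illustrative assumption, not any vendor's actual algorithm.

```python
# Illustrative sketch of the three categories of pricing algorithm described
# above. All rules and figures are hypothetical assumptions.

def monitor_prices(rival_prices):
    """Price monitoring algorithm: observe competitors' current prices."""
    return min(rival_prices), max(rival_prices)

def dynamic_price(rival_prices, demand_index, floor=5.0):
    """Dynamic pricing algorithm: set a price from rivals' prices and demand.
    Here (an assumption): slightly undercut the cheapest rival, scaled by demand."""
    lowest, _ = monitor_prices(rival_prices)
    return max(floor, lowest * 0.99 * demand_index)

def personalised_price(base_price, willingness_to_pay_estimate):
    """Personalised pricing algorithm: tailor the price to an estimate of an
    individual customer's willingness to pay (never below base price here)."""
    return max(base_price, 0.9 * willingness_to_pay_estimate)

rivals = [10.0, 12.0, 11.5]
print(monitor_prices(rivals))                                   # (10.0, 12.0)
print(dynamic_price(rivals, demand_index=1.2))                  # 11.88
print(personalised_price(8.0, willingness_to_pay_estimate=15.0))  # 13.5
```

The sketch also shows why the categories blur in practice: the dynamic rule is built directly on the monitoring rule's output.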
With regard to price monitoring algorithms and dynamic pricing algorithms, the European Commission found in its 2017 E-Commerce Sector Inquiry that almost a third of all retailers analysed effectively tracked the online prices of competitors using automatic software programs, and subsequently adjusted their own prices based on those of their competitors.[6] The OECD Background Note has further emphasised that firms operating online frequently use monitoring and dynamic pricing algorithms. While personalised pricing does not seem to be as widespread, the increasing availability of customer data and the development of technology make personalised pricing ever more feasible and, therefore, exposed to legislative and regulatory action.
The main concern raised by pricing algorithms is their potential ability to facilitate coordinated conduct, resulting in higher prices.[9] Competition authorities and the OECD have identified three main ways in which pricing algorithms lead to collusion.[10]
Algorithmic pricing may also be used to support collusive agreements entered into by companies placed at different levels of the supply chain. For example, the European Commission's investigation into consumer electronics (Asus, Denon & Marantz, Philips and Pioneer) showed how technology could be used to monitor retailers' alignment with prices dictated in a vertical relationship.[13]
The closest real-life investigation that one might find in relation to hub-and-spoke algorithmic collusion is the Eturas case. In this case, the Lithuanian Competition Council considered that travel agencies using the Eturas booking system had entered into a tacit agreement or concerted practice, having learned that the system would limit their discounts to a maximum of 3 per cent.[15] Although the facts of the investigation do not show that the travel agencies were in contact with one another, the EU Court of Justice ruled that knowledge of the full terms of the 3 per cent discount cap (including the assumption that all travel agencies would be subject to the same restriction) could amount to a tacit agreement.
Since at least 2015, academics have identified the potential for algorithmic autonomous tacit collusion.[17] More recently, economists have started to publish an increasing number of papers explaining the models created and the outcomes analysed in certain scenarios of algorithmic pricing.[18] In particular, this literature suggests that algorithmic collusion is possible without communication or even without making competitors' prices an input of the pricing algorithm. In these cases, algorithmic autonomous tacit collusion is modelled using reinforcement learning algorithms (i.e., algorithms that learn through autonomous trial and error exploration).[19] A number of authors have used Q-learning reinforcement learning algorithms,[20] concluding that these algorithms learn to set supra-competitive prices without communicating with each other.[21]
Other authors have even concluded that pricing algorithms can soften competition by undermining competitors' incentives to undercut prices, as any price reduction by a company would immediately be met by an equivalent cut in price from its competitors.[22] In these markets, prices could remain above competitive levels, even in the absence of explicit collusion.
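The reinforcement learning setup discussed in this literature can be illustrated with a minimal tabular Q-learning sketch. The price grid, the toy demand rule and the learning parameters below are invented, simplified assumptions in the spirit of the published models, not a reproduction of any paper's experiment.

```python
import random

# Toy repeated pricing game: one firm learns prices by Q-learning while its
# rival prices randomly. All parameters are illustrative assumptions.
PRICES = [1.0, 1.5, 2.0]  # discrete grid; 1.0 ~ competitive, 2.0 ~ monopoly

def profit(own, rival):
    """Toy profit: the cheaper firm captures more of a unit-demand market."""
    share = 0.8 if own < rival else (0.5 if own == rival else 0.2)
    return own * share

def q_update(q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """One tabular Q-learning update:
    Q(s,a) <- Q(s,a) + alpha * (reward + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(q[(next_state, a)] for a in range(len(PRICES)))
    q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])

# A state is the pair of last-period prices; with no communication, each agent
# observes only prices and its own profit.
states = [(i, j) for i in range(len(PRICES)) for j in range(len(PRICES))]
q_table = {(s, a): 0.0 for s in states for a in range(len(PRICES))}

random.seed(0)
state = (0, 0)
for _ in range(1000):
    # epsilon-greedy: mostly exploit the current Q-values, sometimes explore
    if random.random() < 0.1:
        action = random.randrange(len(PRICES))
    else:
        action = max(range(len(PRICES)), key=lambda a: q_table[(state, a)])
    rival_action = random.randrange(len(PRICES))
    reward = profit(PRICES[action], PRICES[rival_action])
    next_state = (action, rival_action)
    q_update(q_table, state, action, reward, next_state)
    state = next_state
```

In the published models both firms learn simultaneously; the mechanism is the same update rule shown here, and the reported finding is that the learned policies can settle on supra-competitive price pairs.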
While some of these papers show the tendency of independently used pricing algorithms to reach tacit collusive strategies automatically, the risks that these strategies materialise in real economy conditions are still unclear. Pricing algorithms have been adopted progressively, but are not yet universal or homogeneous across all competitors in a given sector. Even if that were the case, specific market conditions would need to be present for the simultaneous use of pricing algorithms to lead to appreciable anticompetitive outcomes (e.g., a low or limited number of market actors, equal levels of vertical integration, etc.). Finally, even if companies in a given market used self-learning pricing algorithms, there is no conclusive evidence that algorithmic autonomous tacit collusion is a significant issue (e.g., convergence to a collusive equilibrium may be slow and often unsuccessful).[23] Thus, while some authors consider that continued research and enforcement efforts regarding tacit collusion of algorithmic pricing may be unwarranted,[24] others invite regulators and academia to further investigate possible scenarios of tacit collusion in real-life environments.[25]
The identification of algorithmic autonomous tacit collusion presents not only technical and economic hurdles, but also legal ones. Even if a competition authority were to identify a potential case of tacit collusion, the current state of the law could make such a practice irreproachable in the absence of explicit communication or contact among the companies using such autonomous algorithms. Indeed, as demonstrated in the case law in Anic Partecipazioni, Hüls and T-Mobile Netherlands,[26] the 'concerted practices' doctrine requires, among others, at least one instance of 'direct or indirect contact between such operators, the object or effect whereof is either to influence the conduct on the market of an actual or potential competitor or to disclose to such a competitor the course of conduct which they themselves have decided to adopt or contemplate adopting on the market'.[27]
The OECD and other regulators are aware of what are perceived as shortcomings of the existing legislation, and several calls to action have been made for policy changes[28] to address this potential enforcement gap.[29] From the enforcer's perspective, existing competition law may not be sufficient to capture algorithmic autonomous tacit collusion, and the OECD has tabled the possibility of 'changing the definition of "agreement" and "concerted practice" to move away from being defined by "act of reciprocal communication between firms" or "meeting the minds"'.[30]
Some scholars have also discussed what remedies could be considered to limit the potential negative effects of algorithmic pricing tacit collusion. A first, market-based remedy involves using consumer algorithms to counterbalance the algorithmic collusion taking place on the supplier side.[31] Alternative remedies put forward rely on intervention from regulators, including the following: (1) turning to merger control to block or remedy transactions giving rise to situations that stimulate algorithmic pricing; (2) introducing disruptive algorithms to alter the market conditions in a way that disincentivises tacit collusion (i.e., by charging lower, more competitive prices); and (3) creating an artificial time lag, reducing the speed at which pricing algorithms can address novel market conditions.[32]
In the absence of a coherent and firm approach from the antitrust community, legislators around the globe are moving to adopt laws that prevent or regulate the use of algorithmic pricing in specific sectors. For example, in the US, two House representatives introduced the Preventing Algorithmic Facilitation of Rental Housing Cartels Act on 6 June 2024, which would prohibit digital price-fixing by landlords. In the EU, it was revealed in July 2024 that the European Commission's justice and consumer department had recently launched a new workstream related to AI in contracting, concerning specifically scenarios where machines make decisions without explicit human consent.
Algorithmic pricing can be analysed not only in the context of actual or potential collusive conduct, but also as unilateral conduct aimed at excluding competitors from the market through pricing strategies, or at exploiting customers by imposing unfair prices.

Algorithmic pricing may be qualified as an abuse of dominance under Article 102 TFEU if the algorithm is operated by a dominant entity. The dominant undertaking's intention is immaterial for the competitive analysis: the authority has to demonstrate only that the algorithmic pricing is capable of producing anticompetitive effects.[33]
EXCLUSIONARY CONDUCT
A dominant firm can use pricing algorithms to pursue anticompetitive exclusionary strategies, primarily via predatory pricing and rebates.[34] These exclusionary objectives can be achieved by means of personalised pricing and algorithmic targeting.[35] The reason for this is that dominant firms seeking to engage in more typical exclusionary anticompetitive conduct (and unable to price-discriminate) would generally face the challenge of applying a single price across the market, which could act as a constraint upon the profitability and/or duration of the anticompetitive behaviour. Personalised pricing and algorithmic targeting can weaken this constraint, as they enable dominant entities to adjust the pricing strategy to targeted customers or categories of customers.
PREDATORY PRICING
Dominant companies can use algorithms to conduct predatory pricing strategies that target marginal customers. For example, a dominant company can identify a customer at risk of switching, or a competitor's customer that is price-sensitive (i.e., a marginal customer), and target it with below-cost price cuts in order to retain it or to steal it from the competitor. In this way, a dominant company would use the predatory pricing strategy to achieve anticompetitive foreclosure of rivals.

Algorithmic targeting and pricing can lower the costs of predation for the dominant company, as they allow it to avoid both losses on inframarginal customers (i.e., customers not at risk of switching, or competitors' customers that are not price-sensitive) and the need to charge excessively low prices across the whole market. This can lead to extended durations of predatory pricing strategies (as foregone profits, or even losses, may be more sustainable) and reduces the need for the dominant company to recoup its losses, making the overall predatory pricing strategy more feasible.[36]
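A back-of-the-envelope calculation illustrates why targeting lowers the cost of predation. All figures below are invented for illustration.

```python
# Invented figures: uniform predation versus algorithmically targeted predation.
cost = 10.0               # unit cost
normal_price = 14.0       # pre-predation price
predatory_price = 8.0     # below-cost price
marginal = 1_000          # customers at risk of switching (contestable demand)
inframarginal = 9_000     # customers not at risk of switching

# Uniform predation: the below-cost price must be offered to everyone.
uniform_sacrifice = (normal_price - predatory_price) * (marginal + inframarginal)

# Targeted predation: only marginal customers receive the below-cost price;
# inframarginal customers keep paying the normal, profitable price.
targeted_sacrifice = (normal_price - predatory_price) * marginal

print(uniform_sacrifice)   # 60000.0 in forgone revenue per period
print(targeted_sacrifice)  # 6000.0 -> a tenth of the cost of uniform predation
```

On these assumed numbers, targeting cuts the sacrifice of predation by a factor of ten, which is the intuition behind the extended-duration and reduced-recoupment points above.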
REBATES
According to the OECD,[37] algorithmic targeting allows dominant firms to adopt a new form of rebate that combines the most compelling elements of standardised rebates[38] – that is, application to large groups of customers (e.g., larger rebates for marginal customers and lower (or no) rebates for inframarginal customers) – and personalised rebates[39] – that is, maximisation of profits across customers (e.g., by targeting transactions where competition is fiercest).[40] Accordingly, algorithmic targeting mitigates the limitations of standardised rebates (e.g., lack of profit maximisation for selected customers), as well as those of personalised rebates (e.g., offering personalised prices to a large and diverse set of customers may increase transaction costs and undermine profitability). Algorithmic targeting can therefore facilitate the use of rebates by dominant firms to prevent customers from switching to rivals, resulting in anticompetitive foreclosure of the latter.
Algorithmic pricing and targeting also have an impact on the assessment of price-based exclusionary conduct by competition authorities.

First, as the price-cost test determines whether a dominant firm is charging a below-cost price, algorithmic pricing and targeting have an effect on the prices that are considered for this analysis. Reason dictates that the actual costs and prices applied to the products sold and to the contestable part of demand should be used for comparison purposes, as opposed to, for example, average prices prevailing in the market.[41]
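The point can be shown numerically. With the invented figures below, a price-cost test run on market-wide averages would miss below-cost pricing that is plainly visible on the contestable segment.

```python
# Hypothetical transaction data illustrating the choice of price benchmark
# in a price-cost test. All figures are assumptions.
unit_cost = 10.0
# (price charged, units sold, contestable?)
transactions = [
    (14.0, 9_000, False),  # inframarginal customers at the normal price
    (8.0, 1_000, True),    # targeted marginal customers at a below-cost price
]

total_revenue = sum(p * q for p, q, _ in transactions)
total_units = sum(q for _, q, _ in transactions)
average_price = total_revenue / total_units

contestable = [(p, q) for p, q, c in transactions if c]
contestable_price = sum(p * q for p, q in contestable) / sum(q for _, q in contestable)

print(average_price)      # 13.4 -> above cost: an average-price test sees nothing
print(contestable_price)  # 8.0  -> below the 10.0 unit cost on contestable demand
```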
Second, algorithmic pricing and targeting may require rethinking the scope of the as-efficient competitor (AEC) test. To date, the AEC test has been applied solely to the analysis of the ability of companies to reduce costs and prices to match the strategy of a dominant company. However, algorithmic pricing and targeting introduce a new level of sophistication in the sale of products that might only be available to incumbents or long-time players that have collected substantial customer data.[42] It remains to be confirmed to what extent this consideration is relevant, as stakeholders participating in some market studies have considered algorithm performance a key driver of generative AI performance, greater than the volume of data used to train AI models.[43] If data were indeed considered relevant and key for the analysis, this would raise the question whether the (in)ability of smaller competitors to collect or replicate such data sets for algorithmic targeting and pricing should be factored in any way into the AEC test.
Finally, one may wonder how dominant companies may use algorithmic pricing and targeting safely, or what remedies may be sought by authorities where the use of algorithms is found to be abusive. The OECD has pointed to a number of behavioural remedies that could address personalised pricing and algorithmic targeting, including: (1) restricting the amount of personal data collected by the dominant undertaking; (2) obliging the dominant undertaking to share customers' data with rivals; (3) disclosing the personalised pricing strategy and corresponding parameters to users; and (4) offering users a right to opt out of personalised pricing.[44] From an EU competition law perspective, these potential remedies may be considered far-reaching and raise some proportionality questions. Furthermore, other remedies (including those mentioned above to avoid algorithmic collusion) could be considered, such as limiting the frequency with which the dominant firm's algorithm changes or adjusts prices.[45]
The following types of exploitative conduct are typically distinguished: excessive pricing (e.g., unfair purchase or sales prices), unfair trading conditions (e.g., unilaterally imposing other unfair trading conditions) and price discrimination (e.g., applying dissimilar conditions to equivalent transactions).[47] In the next sections, we focus on two of these exploitative conducts: excessive pricing and price discrimination.
EXCESSIVE PRICING
Exploitative abuse cases, and by extension decisions on excessive pricing, are very scarce, even in jurisdictions where authorities are allowed to investigate this type of behaviour (e.g., the EU).[48] Despite the lack of precedent, a strategy of algorithmic excessive pricing could still – at least in principle – be found to be implemented by a dominant player. A key challenge in pursuing exploitative abuse cases is to determine whether the conduct is 'excessive' or 'unfair', as there is often not a clear boundary.[49] A case-by-case analysis, which takes into consideration the particularities of each relevant market, does not seem to provide solid ground on which to devise a generally applicable test.
PRICE DISCRIMINATION
Pricing algorithms can be used for personalised pricing and algorithmic targeting. Personalised pricing involves tailoring prices for different consumers based on information about personal characteristics or behaviour.[50] Algorithmic targeting, on the other hand, allows the firm to price differently for marginal and inframarginal customers (i.e., a distinction is made between 'core' and 'fringe' customers).[51] Companies can thus differentiate (and, in a way, discriminate) between the prices displayed to different categories of customers.

The broader welfare implications of personalised advertising are also not entirely clear. Studies have shown that, while consumer surplus falls under personalised pricing (as explained in more detail below), more than 60 per cent of customers benefit from a more personalised approach.[52] Be that as it may, the OECD has concluded that there is not much evidence of widespread personalised pricing yet,[53] and while this may change in the coming years, it will not be without facing practical challenges.
Behavioural economics suggests that, while consumers may accept third-degree price discrimination (e.g., lower prices for the elderly and children), they do not favour personalised pricing. This is primarily due to a perceived lack of fairness in different terms being applied to equivalent transactions, as well as a lack of transparency (as the pricing decision-making would not be fully understood).[54] Therefore, companies may either refrain from adopting personalised pricing strategies to protect their reputation, or be less forthcoming and open when they do use personalised pricing. A company's ability to successfully price-discriminate – while losing only a fraction of marginal customers – could in itself prove that it is able to behave to an appreciable extent independently of its customers and competitors.[55]
Some studies have also analysed whether all price discrimination, or only specific types of it, could be considered abusive.[56] For example, a price-discriminating monopolist could apply lower prices to consumers that have a lower willingness to pay, and higher prices to customers with a higher willingness to pay. This could result in a redistribution of consumer welfare but, by setting prices at a level that is closer to each consumer's willingness to pay, part of that consumer surplus would be transferred to the monopolist.[57]
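A toy welfare computation makes this transfer concrete. The willingness-to-pay values, cost and candidate uniform price below are invented assumptions.

```python
# Invented numbers: how personalised pricing can shift consumer surplus to a
# monopolist while also serving customers who were previously priced out.
willingness_to_pay = [20.0, 15.0, 12.0, 8.0]  # one buyer per value
unit_cost = 6.0

# Uniform pricing: suppose the chosen single price is 12.0, so the buyer with
# a willingness to pay of 8.0 is priced out of the market.
uniform_price = 12.0
served = [w for w in willingness_to_pay if w >= uniform_price]
uniform_profit = (uniform_price - unit_cost) * len(served)
uniform_consumer_surplus = sum(w - uniform_price for w in served)

# Perfect personalised pricing: each buyer pays exactly their willingness to
# pay, every buyer above cost is served, and consumer surplus falls to zero.
personalised_profit = sum(w - unit_cost for w in willingness_to_pay if w > unit_cost)
personalised_consumer_surplus = 0.0

print(uniform_profit, uniform_consumer_surplus)            # 18.0 11.0
print(personalised_profit, personalised_consumer_surplus)  # 31.0 0.0
```

On these assumptions the entire consumer surplus (plus the gains from newly served buyers) accrues to the monopolist, which is the redistribution the text describes.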
harm; and (5) identifying the precise source of discrimination in light of possible remedies.[58]

In the EU, the Court of Justice has indicated that the use of Article 102 TFEU to sanction exploitative abuses would require a competition authority to meet an elevated burden of proof.[59] Applied to personalised pricing, this jurisprudence would require an authority to show that: (1) price discrimination is applied on a recurring basis; (2) the algorithm consistently discriminates between groups of consumers; (3) there is a lack of objective justifications (e.g., consumer welfare); and (4) the relevant counterfactual shows a negative impact on consumer welfare.[60]
NEW LEGISLATIVE INITIATIVES IN THE TECH SPACE (THE DSA, DMA AND AI ACT)
Although some of the new EU regulations dealing with issues in the digital sphere do not fall under the remit of competition authorities, it is beyond doubt that these initiatives affect the way in which companies in the EU may deal with data, algorithms, AI and the display of information to consumers. Accordingly, algorithmic pricing may also be subject to this legislation.
THE DMA AND THE RESTRICTION OF USE OF BUSINESS USERS’ NON-PUBLIC DATA
The main DMA provision of relevance to algorithmic pricing is Article 6(2), which prohibits gatekeepers that are in a competitive relationship with business users from using non-public data generated by those business users in the context of their use of the relevant core platform services.

The DMA goes on to specify that the concept of 'non-public data' includes any aggregated and non-aggregated data generated by business users that originates from the commercial activities of business users or their customers (including click, search, view and voice data) on the relevant core platform services or on services provided together with the relevant core platform services.
Given the reliance of algorithmic targeting and pricing on customer and pricing data, this prohibition is liable to significantly restrict the ability of gatekeepers to operate algorithms in this way. It must be noted, however, that only a handful of tech companies have been designated as gatekeepers for core platform services.[61] This restriction will therefore not apply directly to the multitude of smaller players in the digital sphere that may also apply algorithmic targeting for their own products and services.
Article 25 and Article 27 of the DSA, respectively prohibiting 'dark patterns' and regulating 'recommender systems', seem particularly relevant in this context. For example, if algorithmic pricing or targeting were coupled with other display strategies capable of nudging consumers into taking decisions against their best interests, Digital Services Coordinators could have grounds to act in order to prevent such practices.[62] In the case
On the face of it, the hardcore prohibition enshrined in Article 5(1)(c) of the EU AI Act, and in particular the widely formulated second test (i.e., 'unfavourable', 'unjustified'), seems liable to restrict the activities of companies using AI systems to engage in algorithmic targeting. Indeed, as explained above, algorithmic pricing implies a distinction in pricing between different groups of customers. It is foreseeable that this prohibition may be invoked against companies targeting a specific customer group (e.g., customers openly displaying a higher willingness to pay) to which a higher social score is subsequently attached. When this involves discarding customer segments that do not exhibit this type of behaviour, and granting those segments a lower score, the prohibition may kick in.
CONCLUSION
While algorithmic pricing can enhance dynamic efficiency and benefit consumer welfare, it also harbours risks of anticompetitive behaviour.

Algorithmic pricing can also enable unilateral conduct by dominant entities, potentially abusing their market power under Article 102 TFEU. This includes predatory pricing strategies aimed at undercutting competitors and the use of advanced rebate tactics to discourage customer switching. However, assessing such practices is challenging due to the complexities involved in determining whether prices are below cost and what their impact is on consumer welfare.
Last, despite the challenges of determining and predicting the extent of anticompetitive practices in this evolving field, the EU's recent legislative initiatives in the tech sector – the DSA, DMA and AI Act – show an aim to utilise all available tools. The DMA, for instance, seeks to restrict the use of non-public data generated by gatekeepers' business users in the context of their use of the relevant core platform services, potentially curbing the influence of gatekeepers in algorithmic pricing. While the DSA primarily addresses transparency and fairness in online platform practices rather than pricing directly, it still affects how algorithmic pricing strategies are disclosed to consumers. Similarly, the AI Act introduces prohibitions against AI systems unfairly ranking individuals based on social behaviour, posing implementation challenges for companies using AI in algorithmic targeting.
ENDNOTES
[1]
Alejandro Guerrero, Ombline Ancelin, David Trapp and Andrea Pomana are partners at Simmons & Simmons LLP. The authors thank Jonathan Saké for his contributions to the chapter.
[2]
For the most recent example of this, see the 'Joint Statement on Competition in Generative AI Foundation Models and AI Products', issued by the European Commission, the UK Competition & Markets Authority, the US Department of Justice and the Federal Trade Commission on 23 July 2024, and available at https://competition-policy.ec.europa.eu/document/download/79948846-4605-4c3a-94a6-044e344acc33_en?filename=20240723_competition_in_generative_AI_joint_statement_COMP-CMA-DOJ-FTC.pdf.
[3]
See, e.g., the 23 July 2024 decision of the US Federal Trade Commission to open a probe into personalised pricing. See also the statements of Ravneet Kaur, chair of the Competition Commission of India, on 28 June 2024: 'Dark patterns can influence the conduct of digital companies . . . [and condition] how people can . . . make certain choices . . . that could have some competition implications.'
[4]
See OECD, 'Algorithmic Competition', OECD Competition Policy Roundtable Background Note, 2023 (OECD Background Note), available at www.oecd.org/daf/competition/algorithmic-competition-2023.pdf.
[5]
See OECD Background Note, Section 2.3. See also Seele, P, et al. (2021), 'Mapping the ethicality of algorithmic pricing: A review of dynamic and personalized pricing', Journal of Business Ethics 170, pp. 697–719; and Gautier, A, A Ittoo and P Van Cleynenbreugel (2020), 'AI algorithms, price discrimination and collusion: a technological, economic and legal perspective', European Journal of Law and Economics 50(3), pp. 405–435.
[6]
See European Commission, 'Staff Working Document Accompanying the document Report from the Commission to the Council and the European Parliament – Final report on the E-commerce Sector Inquiry', SWD (2017) 154 final, para. 314.
[7]
See OECD Background Note, Section 1.
[8]
See, e.g., the study on 'Algorithms and Competition', prepared by the Autorité de la concurrence and the Bundeskartellamt in November 2019, available at www.autoritedelaconcurrence.fr/sites/default/files/Algorithms_and_Competition_Working-Paper.pdf.
[9]
Lee, C, 'The Landscape of Pricing and Algorithmic Pricing', 2020, Yusof Ishak Institute Economics Working Paper, p. 27.
[10]
See OECD Background Note, Section 3.1; CMA, 'Algorithms: How they can reduce competition and harm consumers', 2021, p. 30; Li, S, C Xie and E Feyler, 'Algorithms & Antitrust: An overview of EU and national case law', 2021, Concurrences e-Competitions Algorithms & Antitrust Article No. 102334.
[11]
See CMA, 'Online sales of posters and frames', 2015, available at www.gov.uk/cma-cases/online-sales-of-discretionary-consumer-products.
[12]
See CNMC, 'The CNMC fines several companies 1.25 million for imposing minimum commissions in the real estate brokerage market', 2021, available at www.cnmc.es/sites/default/files/editor_contenidos/Notas%20de%20prensa/2021/20211209_NP_Sancionador_Proptech_eng.pdf.
[13]
See European Commission, 'Antitrust: Commission fines four consumer electronics manufacturers for fixing online resale prices', 2018, available at https://ec.europa.eu/commission/presscorner/detail/en/IP_18_4601.
[14]
See OECD Background Note, Section 3.1; Assad, S, Clark, R, Ershov, D, and Xu, L, 'Algorithmic pricing and competition: Empirical evidence from the German retail gasoline market', 2020.
[15]
See Case C-74/14, Eturas, EU:C:2016:42.
[16]
See OECD Background Note, Section 3.1; Calvano, E, Calzolari, G, Denicolo, V, and Pastorello, S, 'Artificial Intelligence, Algorithmic Pricing and Collusion', 2019.
[17]
See Ezrachi, A and M Stucke, 'Virtual Competition: The Promise and Perils of the Algorithm-Driven Economy', 2016, Harvard University Press; Ezrachi, A and M Stucke, 'Artificial Intelligence and Collusion: When Computers Inhibit Competition', 2015, University of Tennessee, Legal Studies Research Paper Series 267; and Mehra, S, 'Antitrust and the robo-seller: Competition in the time of algorithms', 2016, Minnesota Law Review 204.
[18]
See Calvano, E, Calzolari, G, Denicolo, V, and Pastorello, S, 'Artificial Intelligence, Algorithmic Pricing and Collusion', 2019; and Assad, S et al., 'Autonomous algorithmic collusion: Economic research and policy implications', 2021, Oxford Review of Economic Policy 37(3), pp. 459–478.
[19]
See Gautier, A, A Ittoo and P Van Cleynenbreugel, “AI algorithms, price discrimination and
collusion: a technological, economic and legal perspective”, 2020, European Journal of Law
and Economics 50(3), pp. 405–435.
[20]
Q-learning is a type of reinforcement learning that will find the optimal
course of action based on the agent’s current state. For more background on
Q-learning and other reinforcement learning algorithms, see Ittoo, A and Petit, N,
“Algorithmic Pricing and Tacit Collusion: A Technological Perspective”, 2017, available at
https://orbi.uliege.be/bitstream/2268/218873/1/SSRN-id3046405.pdf.
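For readers who want to see the mechanics behind footnote [20], the following Python sketch shows the core Q-learning update rule, Q(s, a) ← Q(s, a) + α(r + γ·max Q(s′, ·) − Q(s, a)), applied to a hypothetical two-price game. The states, actions and payoff numbers are invented for illustration and are not drawn from the cited paper.

```python
# Minimal Q-learning sketch on an invented pricing toy (illustrative only).
# State = rival's last price; actions: 0 = low price, 1 = high price.
import random

random.seed(0)

ACTIONS = [0, 1]
alpha, gamma, eps = 0.1, 0.9, 0.2  # learning rate, discount, exploration

# Hypothetical per-period profits, keyed by (my action, rival's last price).
# Numbers are chosen so that the high price is always the better reply.
PAYOFF = {(0, 0): 1.0, (0, 1): 0.0, (1, 0): 2.0, (1, 1): 3.0}

Q = {(s, a): 0.0 for s in ACTIONS for a in ACTIONS}

state = 0
for _ in range(5000):
    # epsilon-greedy: mostly exploit the current Q-values, sometimes explore
    if random.random() < eps:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: Q[(state, a)])
    reward = PAYOFF[(action, state)]
    next_state = action  # in this toy, the rival simply mirrors our last price
    # Q-learning update: move Q(s, a) toward reward + discounted best continuation
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
    state = next_state

# The learned greedy policy: which price to charge given the rival's last price
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in ACTIONS}
print(policy)
```

In this toy the high price strictly dominates, so the learned greedy policy charges high in every state; the richer simulated markets studied in the papers cited at footnotes [16] and [21] are what make autonomously learned supracompetitive pricing an open question.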
[21]
See Calvano, E, Calzolari, G, Denicolò, V, and Pastorello, S, “Artificial Intelligence,
Algorithmic Pricing and Collusion”, 2019; Ballestero, G, “Collusion and artificial intelligence: a
computational experiment with sequential pricing algorithms under stochastic costs”, 2021;
and Klein, T, “Autonomous algorithmic collusion: Q-learning under sequential pricing”, 2021,
The RAND Journal of Economics 52(3), pp. 538–558.
[22]
See Brown, Z and MacKay, A, “Competition in pricing algorithms”, 2021, National Bureau
of Economic Research.
[23]
See den Boer, A, Meylahn, J, and Schinkel, M, “Artificial collusion: Examining
supracompetitive pricing by Q-learning algorithms”, 2022, Amsterdam Law School Research
Paper 2022-25.
[24]
See Schrepel, T, “The Fundamental Unimportance of Algorithmic Collusion for Antitrust
Law”, 2020.
[25]
See Calvano, E, Calzolari, G, Denicolò, V, and Pastorello, S, “Artificial Intelligence,
Algorithmic Pricing and Collusion”, 2019.
[26]
See Case C-49/92 P Commission v. Anic Partecipazioni [1999] ECR I-4125; Case C-199/92 P
Hüls v. Commission [1999] ECR I-4287; Case C-8/08 T-Mobile Netherlands BV EU:C:2009:343.
[27]
See Case 40/73 Suiker Unie v Commission [1975] ECR 1663, para. 174.
[28]
See OECD, “Algorithms and Collusion – Background Note by the Secretariat”, 2017, pp.
36–39, available at https://one.oecd.org/document/DAF/COMP(2017)4/en/pdf.
[29]
See OECD Background Note, footnote 30.
[30]
See OECD Background Note, Section 3.1; Caforio, V, “Algorithmic Tacit Collusion: A
Regulatory Approach”, 2023, Competition Law Review.
[31]
The algorithms would thus be operated by consumers or third parties, aggregated into
buyer groups that provide the buyer power needed to disincentivise suppliers from
maintaining high price levels. See Gal, M, “Limiting Algorithmic Coordination”, 2023,
Berkeley Technology Law Journal, pp. 30–34.
[32]
See Gal, M, “Limiting Algorithmic Coordination”, 2023, Berkeley Technology Law Journal,
pp. 34–47.
[33]
See, for example, Case C-377/20, Servizio Elettrico Nazionale and Others, EU:C:2022:379.
[34]
The OECD’s Background Note also refers to tying and bundling as a potential third
category of exclusionary conduct that can be driven or facilitated by algorithms.
[35]
See Cheng, T, and Nowag, J, “Algorithmic Predation and Exclusion”, 2022,
LundLawCompWP 1/2022.
[36]
See Cheng, T, and Nowag, J, “Algorithmic Predation and Exclusion”, 2022,
LundLawCompWP 1/2022, pp. 7–8; Leslie, C, “Predatory Pricing Algorithms”, 2023, New York
University Law Review, p. 80.
[37]
See OECD Background Note, Section 3.2.
[38]
Standardised rebates apply uniformly to all customers (e.g., Post Danmark II).
[39]
Personalised rebates differ depending on the individual customer or transaction
(e.g., Intel). See Cheng, T, and Nowag, J, “Algorithmic Predation and Exclusion”, 2022,
LundLawCompWP 1/2022, p. 8.
[40]
See Cheng, T, and Nowag, J, “Algorithmic Predation and Exclusion”, 2022,
LundLawCompWP 1/2022, p. 8.
[41]
ibid., pp. 21–27.
[42]
ibid., pp. 32–33.
[43]
See, e.g., Autorité de la concurrence, “Avis 24-A-05 du 28 juin 2024 relatif au
fonctionnement concurrentiel du secteur de l’intelligence artificielle générative”
(opinion on the competitive functioning of the generative AI sector), available at
www.autoritedelaconcurrence.fr/fr/avis/relatif-au-fonctionnement-concurrent
iel-du-secteur-de-lintelligence-artificielle-generative.
[44]
See Botta, M, and Wiedemann, K, “To discriminate or not to discriminate? Personalised
pricing in online markets as exploitative abuse of dominance”, 2020, European Journal of Law
and Economics, pp. 396–397.
[45]
Another thought-provoking but perhaps less feasible policy response to personalised
pricing concerns the use of personalised law (e.g., personalised price caps). For more
information, see Bar-Gill, O, “Algorithmic Price Discrimination When Demand Is a Function
of Both Preferences and (Mis)perceptions”, 2018, University of Chicago Law Review.
[46]
See OECD, “Personalised Pricing in the Digital Era – Background Note by the Secretariat”,
2018, p. 27.
[47]
See Botta, M, and Wiedemann, K, “To discriminate or not to discriminate? Personalised
pricing in online markets as exploitative abuse of dominance”, 2020, European Journal of Law
and Economics, pp. 465–466.
[48]
See Motta, M, “Self-preferencing and foreclosure in digital markets: Theories of harm for
abuse cases”, 2022, p. 16.
[49]
Robertson, V, “Algorithmic pricing – A Competition Law Perspective on Personalised
Prices”, 2023, Graz Law Working Paper Series, p. 10.
[50]
See OECD, “Personalised Pricing in the Digital Era – Background Note by the Secretariat”,
2018, p. 8.
[51]
See OECD Background Note, Section 3.2.
[52]
Dubé, J-P and Misra, S, “Personalized Pricing and Consumer Welfare”, 2022, Journal of
Political Economy.
[53]
See OECD Background Note, Section 2.3.
[54]
See Botta, M, and Wiedemann, K, “To discriminate or not to discriminate? Personalised
pricing in online markets as exploitative abuse of dominance”, 2020, European Journal of Law
and Economics, pp. 388–400.
[55]
See Case 27/76 United Brands Company and United Brands Continentaal v. Commission
[1978] ECR 207, para. 65; Case 85/76 Hoffmann-La Roche & Co v. Commission [1979] ECR
461, para. 38.
[56]
See Botta, M, and Wiedemann, K, “To discriminate or not to discriminate? Personalised
pricing in online markets as exploitative abuse of dominance”, 2020, European Journal of Law
and Economics, pp. 388–400.
[57]
ibid., p. 400.
[58]
OECD, “Personalised Pricing in the Digital Era – Background Note by the Secretariat”, 2018,
p. 30.
[59]
See Case C-525/16, Serviços de Comunicações e Multimédia SA (MEO), EU:C:2018:270.
[60]
See Botta, M, and Wiedemann, K, “To discriminate or not to discriminate? Personalised
pricing in online markets as exploitative abuse of dominance”, 2020, European Journal of Law
and Economics, pp. 392–394.
[61]
At the time of writing, Alphabet, Amazon, Apple, Booking, ByteDance, Meta and Microsoft
have been designated as gatekeepers for specific core platform services.
[62]
For background, Digital Services Coordinators are national authorities responsible for
helping the Commission monitor and enforce obligations under the Digital Services Act.