
Algorithmic Collusion: Corporate Accountability and the Application of Art. 101 TFEU / Giacalone, Maria. - In: EUROPEAN PAPERS. - ISSN 2499-8249. - (2025). [10.15166/2499-8249/798]

Algorithmic Collusion: Corporate Accountability and the Application of Art. 101 TFEU

Maria Giacalone
2025

Abstract

The increasing reliance on artificial intelligence (AI) to set market prices, especially in digital markets, has raised the threat of algorithmic collusion, where pricing algorithms align market behaviour among competitors without explicit human agreement. This Insight examines the implications of such practices, highlighting the relevance of the approach taken in the recent European AI Act, which regulates the development and deployment of AI systems. While pricing algorithms can improve market efficiency by responding quickly to fluctuations in supply and demand, they also raise concerns about potential anti-competitive effects. Two scenarios are discussed, the Predictable Agent and the Digital Eye, in which algorithms operate under human control or autonomously, respectively. In the Predictable Agent scenario, the conduct of the algorithm can be attributed to the undertaking, potentially giving rise to liability under art. 101 TFEU. Conversely, the Digital Eye scenario challenges the accountability of the undertaking for the conduct of the algorithm, owing to the algorithm's autonomy. To address these issues, the Insight discusses the principle of “compliance by design”, emphasised in the AI Act, which requires undertakings to ensure that their algorithms comply with antitrust rules. This is complemented by “outcome visibility”, which in turn requires undertakings to correct any anti-competitive outcomes, even where algorithms have been programmed in accordance with guidelines. Together, these measures seek to balance the benefits of AI with the need to prevent collusive practices in digital markets.
algorithmic collusion – European competition law – art. 101 TFEU – AI Act – concerted practice – compliance by design
01 Journal publication::01a Journal article
Files attached to this product
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1755263
Warning! The displayed data have not been validated by the university.
