Regulation for Algorithmic Collusion
This week, Chenhao Zhang from Northwestern University visited ITCS and gave a talk on Regulation of Algorithmic Collusion, based on his ongoing collaboration with Prof. Jason Hartline. Here's some background on the topic, a summary of the talk and their work (hopefully), and some discussion afterwards.

Regulation of Algorithmic Collusion

ABSTRACT

Consider sellers in a competitive market that use algorithms to adapt their prices from data that they collect. In such a context it is plausible that algorithms could arrive at prices that are higher than the competitive prices, and this may benefit sellers at the expense of consumers (i.e., the buyers in the market). This paper gives a definition of plausible algorithmic non-collusion for pricing algorithms. The definition allows a regulator to empirically audit algorithms by applying a statistical test to the data that they collect. Algorithms that are good, i.e., approximately optimize prices to market conditions, can be augmented to contain the data sufficient to pass the audit. Algorithms that have colluded on, e.g., supra-competitive prices cannot pass the audit. The definition allows sellers to possess useful side information that may be correlated with supply and demand and could affect the prices used by good algorithms. The paper provides an analysis of the statistical complexity of such an audit, i.e., how much data is sufficient for the test of non-collusion to be accurate. ...
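To get a feel for what such an empirical audit might look like, here is a toy sketch of a regret-style test: the seller passes if its average revenue is within an epsilon fraction of the revenue of its best unilateral response to observed rival prices. This is my own illustration, not the paper's actual statistical test; the linear demand model, the `audit_non_collusion` function, and the epsilon threshold are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_demand(own_price, rival_price, a=10.0, b=2.0, c=1.0):
    # Toy residual demand: higher own price lowers demand,
    # higher rival price raises it. Purely illustrative.
    return max(a - b * own_price + c * rival_price, 0.0)

def best_response_revenue(rival_price, price_grid):
    # Revenue of the best unilateral price against a fixed rival price.
    return max(p * linear_demand(p, rival_price) for p in price_grid)

def audit_non_collusion(own_prices, rival_prices, epsilon=0.05):
    # Hypothetical regret-style audit: pass iff average realized revenue
    # is within an epsilon fraction of the average best-response revenue.
    # A seller sustaining supra-competitive prices forgoes profitable
    # unilateral undercuts, so its regret is large and it fails.
    price_grid = np.linspace(0.0, 10.0, 201)
    actual = np.mean([p * linear_demand(p, q)
                      for p, q in zip(own_prices, rival_prices)])
    benchmark = np.mean([best_response_revenue(q, price_grid)
                         for q in rival_prices])
    regret = (benchmark - actual) / benchmark
    return regret <= epsilon, regret

rivals = rng.uniform(2.0, 4.0, size=200)

# A seller that approximately best-responds each round passes the audit.
grid = np.linspace(0.0, 10.0, 201)
competitive = np.array([max(grid, key=lambda p: p * linear_demand(p, q))
                        for q in rivals])
print(audit_non_collusion(competitive, rivals))  # (True, ~0)

# A seller holding a fixed supra-competitive price fails: undercutting
# would raise its own revenue, so its measured regret is large.
colluding = np.full_like(rivals, 5.0)
print(audit_non_collusion(colluding, rivals))    # (False, large regret)
```

The key property the sketch mimics is the one the abstract describes: an algorithm that approximately optimizes against market conditions passes, while collusive prices, which are only sustainable by ignoring profitable unilateral deviations, cannot.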