Imagine the following game. It is summer on Lemonade Island, and you need to make some cash. You decide to set up a lemonade stand on the beach (which goes all around the island), as do two others. There are twelve places to set up around the island like the numbers on a clock. Your price is fixed, and all people go to the nearest lemonade stand.
The game is repeated. Every night, everyone moves simultaneously under cover of darkness, and in the morning their locations are fixed. There is no cost to move. After 100 days of summer, the game is over. The utility of the repeated game is the sum of the utilities of the single-shot games.
At the beginning of a single summer (100 rounds of the game played by three players), the number of customers at each of the twelve positions is selected independently at random: either 6, 12, or 18 customers at each position. The distance between positions is measured around the circle, so that 6 o'clock is a distance of 3 from both 9 o'clock and 3 o'clock. Each customer goes to the nearest lemonade stand; if multiple lemonade stands are equidistant from a position, its customers divide themselves evenly among them. Each customer is worth 1 utility.
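The rules above pin down the payoff of a single round exactly, so they can be sketched directly in code. This is a minimal illustration (the function and variable names are my own, not part of any official implementation): distance is measured the short way around the twelve-position circle, and customers at a position equidistant from several stands split evenly among them.

```python
def circle_distance(a, b, n=12):
    """Distance between positions a and b, measured around the circle."""
    d = abs(a - b) % n
    return min(d, n - d)

def round_utilities(stands, customers):
    """Utility earned by each stand in a single round.

    stands    -- list of stand positions (0..11), one per player
    customers -- list of 12 customer counts (each 6, 12, or 18)
    """
    utils = [0.0] * len(stands)
    for pos, count in enumerate(customers):
        dists = [circle_distance(pos, s) for s in stands]
        best = min(dists)
        nearest = [i for i, d in enumerate(dists) if d == best]
        for i in nearest:
            # Equidistant stands split this position's customers evenly.
            utils[i] += count / len(nearest)
    return utils

# Example: two stands share 12 o'clock (position 0), one sits opposite
# at 6 o'clock, with 12 customers at every position.
print(round_utilities([0, 0, 6], [12] * 12))  # → [38.0, 38.0, 68.0]
```

Note how the lone stand at 6 o'clock does better than either of the two stands crowded at 12 o'clock: it keeps its whole side of the island to itself, while they split theirs.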
"Listen, here's the thing, if you can't spot the sucker in your first half hour at the table, then you are the sucker." - Rounders
The game has almost no structure. Probability theory and the theory of Nash equilibria do not single out a solution to this game; can you? And yet it is frighteningly hard: state-of-the-art online algorithms fail when up against a simple strategy.