Three armed bandits are in a gunfight with each other. The gunfight proceeds in a sequence of rounds. In each round, each bandit chooses one of the bandits to target, and all three shoot simultaneously. Each bandit (while alive) shoots once per round. Each bandit has a certain accuracy, and the probability of hitting or missing a target is independent of the target (i.e., it doesn't matter who the bandit is shooting), and also independent of all other shots.
Each bandit knows the others' (possibly randomized) strategies. Any randomness used by a bandit is hidden from the others (i.e., the bandits cannot correlate their random choices: the distribution of actions that defines each bandit's strategy is fixed independently of the other bandits' realized actions).
The gunfight ends when at most one bandit remains alive. A bandit "wins" the gunfight if he survives through the last round: either he is the last bandit remaining, or all remaining bandits are killed simultaneously in the final round. In the degenerate case where all three bandits have 0 accuracy (and thus the gunfight never ends), all three bandits are declared winners.
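The rules above can be sketched as a Monte Carlo simulation. This is only an illustration, not the equilibrium analysis: the accuracies and the `cyclic` targeting strategy below are arbitrary assumptions, and the round cap stands in for the never-ending degenerate case.

```python
import random

def simulate(accuracies, strategy, rng, max_rounds=10_000):
    """Play one gunfight.

    accuracies[i] -- bandit i's probability of hitting any target.
    strategy(shooter, alive) -- index of the bandit that `shooter` targets.
    Returns the set of winners.
    """
    alive = {0, 1, 2}
    while len(alive) > 1:
        # Everyone still alive picks a target; all shots land (or miss)
        # simultaneously and independently.
        shots = {i: strategy(i, alive) for i in alive}
        hit = {t for i, t in shots.items() if rng.random() < accuracies[i]}
        alive -= hit
        if not alive:
            # All remaining bandits were killed in the same round:
            # by the rules, every one of them counts as a winner.
            return set(shots)
        max_rounds -= 1
        if max_rounds == 0:
            break  # stand-in for the never-ending 0-accuracy case
    return alive

def cyclic(shooter, alive):
    # Arbitrary illustrative strategy (not claimed to be optimal):
    # shoot the next living bandit going around the circle.
    for k in (1, 2):
        target = (shooter + k) % 3
        if target in alive:
            return target

# Estimate each bandit's win probability under this assumed strategy.
rng = random.Random(0)
accuracies = (0.9, 0.6, 0.3)  # assumed accuracies, for illustration only
wins = [0, 0, 0]
trials = 20_000
for _ in range(trials):
    for w in simulate(accuracies, cyclic, rng):
        wins[w] += 1
print([w / trials for w in wins])
```

Such a simulator only evaluates fixed strategy profiles; finding the Nash equilibrium still requires reasoning about what each bandit would do given the others' strategies.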
What is the Nash equilibrium?
(Thanks to Vasu for helping come up with the name.)