Where to stand when playing darts?
Article in a scientific journal, 2021
This paper analyzes the question of where one should stand when playing darts. If one stands at distance $d > 0$ and aims at $\alpha \in \mathbb{R}^n$, then the dart (modelled by a random vector $X$ in $\mathbb{R}^n$) hits a random point given by $\alpha + dX$. Next, given a payoff function $f$, one considers $\sup_{\alpha} E f(\alpha + dX)$ and asks whether this is decreasing in $d$; i.e., whether it is always better to stand closer to the target rather than farther away. Perhaps surprisingly, this is not always the case, and understanding when it does or does not occur is the purpose of this paper. We show that if $X$ has a so-called self-decomposable distribution, then it is always better to stand closer, for any payoff function. This class includes all stable distributions as well as many more. On the other hand, if the payoff function is $\cos(x)$, then it is always better to stand closer if and only if the characteristic function $|\varphi_X(t)|$ is decreasing on $[0, \infty)$. We then show that if $X$ has at least two point masses, then with the payoff $\cos(x)$ it is not always better to stand closer. If $X$ has a single point mass, one can find a different payoff function exhibiting this phenomenon. Another large class of darts $X$ for which there are bounded continuous payoff functions under which it is not always better to stand closer is the class of distributions with compact support. This is obtained by using the fact that the Fourier transform of such a distribution has a zero in the complex plane; the argument works whenever the Fourier transform has a complex zero. Finally, we analyze whether the property of it being better to stand closer is closed under convolution and/or limits.
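The cos-payoff criterion in the abstract can be checked numerically. Since E cos(alpha + dX) = Re(e^{i alpha} E e^{i d X}), the supremum over alpha equals |phi_X(d)|, the modulus of the characteristic function at d. The sketch below (an illustration, not code from the paper) uses a hypothetical dart with two point masses, X = ±1 with probability 1/2 each, for which |phi_X(d)| = |cos(d)| is not monotone, so standing closer is not always better:

```python
import numpy as np

def best_expected_payoff(d, n_samples=200_000, seed=0):
    """Estimate sup_alpha E cos(alpha + d*X) by Monte Carlo over a grid of alpha,
    for the two-point-mass dart X = +1 or -1 with probability 1/2 each."""
    rng = np.random.default_rng(seed)
    x = rng.choice([-1.0, 1.0], size=n_samples)
    alphas = np.linspace(0.0, 2.0 * np.pi, 400)
    return max(np.cos(a + d * x).mean() for a in alphas)

# Exact value for this dart: |phi_X(d)| = |cos(d)|, which oscillates in d.
for d in [0.5, np.pi / 2, np.pi]:
    est = best_expected_payoff(d)
    print(f"d = {d:.3f}: simulated {est:.3f}, exact |cos(d)| = {abs(np.cos(d)):.3f}")
```

At distance d = pi/2 the best achievable expected payoff is near 0, while at the larger distance d = pi it is near 1: standing farther away is strictly better, matching the abstract's claim that two point masses defeat monotonicity for the payoff cos(x).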