OPEN
This is open, and cannot be resolved with a finite computation.
Let $L(r)$ be the least constant with the following property: if $P$ is a finite set of points in $\mathbb{R}^2$ and $A\subset (0,\infty)$ is a set of size $r$, and $G$ is the graph with vertex set $P$ in which two points are joined by an edge if and only if their distance is a member of $A$, then $\chi(G)\leq L(r)$.
Estimate $L(r)$. In particular, is it true that $L(r)\leq r^{O(1)}$?
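Equivalently (a reformulation for concreteness, not taken from the original statement), $L(r)$ is the supremum of the chromatic numbers arising from such configurations:
$$L(r) = \sup\{\chi(G_{P,A}) : P\subset\mathbb{R}^2\ \text{finite},\ A\subset(0,\infty),\ |A|=r\},$$
where $G_{P,A}$ denotes the graph on $P$ whose edges are the pairs at a distance belonging to $A$.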
The case $r=1$ is the Hadwiger–Nelson problem, for which it is known that $5\leq L(1)\leq 7$.
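To make the quantity being bounded concrete, here is a minimal sketch (in Python; the helper names are my own and not from the source) that builds the distance graph for a given finite point set and distance set, and computes its chromatic number by exhaustive search. $L(r)$ is then the largest such value achievable over all configurations with $|A|=r$; the brute-force search below is only feasible for very small examples.

```python
from itertools import combinations, product
from math import dist, isclose

def build_distance_graph(points, A, tol=1e-9):
    """Edges join pairs of points whose Euclidean distance lies in A (up to tolerance)."""
    n = len(points)
    edges = [
        (i, j)
        for i, j in combinations(range(n), 2)
        if any(isclose(dist(points[i], points[j]), a, abs_tol=tol) for a in A)
    ]
    return n, edges

def chromatic_number(n, edges):
    """Smallest k admitting a proper k-colouring (exhaustive search; tiny n only)."""
    for k in range(1, n + 1):
        for colouring in product(range(k), repeat=n):
            if all(colouring[i] != colouring[j] for i, j in edges):
                return k
    return n

# Example: the unit equilateral triangle with A = {1} forces 3 colours,
# so this single configuration already shows L(1) >= 3 (the known bound is L(1) >= 5).
triangle = [(0.0, 0.0), (1.0, 0.0), (0.5, 3**0.5 / 2)]
n, edges = build_distance_graph(triangle, A={1.0})
print(chromatic_number(n, edges))  # prints 3
```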
See also [508], [704], and [705].