The error occurred when using hstack with an array of shape (700, 33) and data type float64. The problem is not with the system's memory settings: after running

$ echo 1 > /proc/sys/vm/overcommit_memory

the system is in "always overcommit" mode and will grant an allocation no matter how large it is (within 64-bit memory addressing, at least). The failing allocation needs only 180 KiB of space, so the overcommit policy cannot be the root cause of the problem.
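The 180 KiB figure can be verified directly from the shape and dtype in the error message (a minimal check, independent of the original DataFrame):

```python
import numpy as np

# Size of a (700, 33) float64 array: 700 * 33 elements, 8 bytes each
n_bytes = 700 * 33 * np.dtype(np.float64).itemsize
print(n_bytes)         # 184800 bytes
print(n_bytes / 1024)  # ~180.5 KiB, matching the "180. KiB" in the traceback
```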
Reproducing code example:
import numpy as np
np.hstack(columns_from_the_dataframe)
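Since `columns_from_the_dataframe` is not defined above, here is a self-contained sketch of the same call; the list of 33 column vectors of 700 float64 values is a hypothetical stand-in for the columns pulled from the original DataFrame:

```python
import numpy as np

# Hypothetical stand-in for columns_from_the_dataframe:
# 33 column vectors of 700 float64 values each
columns_from_the_dataframe = [np.random.rand(700, 1) for _ in range(33)]

# hstack concatenates along axis 1, producing the (700, 33) array
# that triggers the MemoryError in the report
result = np.hstack(columns_from_the_dataframe)
print(result.shape, result.dtype)  # (700, 33) float64
```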
Error message:
Traceback (most recent call last):
File "<__array_function__ internals>", line 5, in hstack
File "/usr/local/lib/python3.8/dist-packages/numpy/core/shape_base.py", line 346, in hstack
return _nx.concatenate(arrs, 1)
File "<__array_function__ internals>", line 5, in concatenate
MemoryError: Unable to allocate 180. KiB for an array with shape (700, 33) and data type float64
NumPy/Python version information:
numpy-1.19.3