Python forward compatibility & numpy
At the time of writing, the most recent version of Python is 3.14.2. If you use this version, then this innocuous-looking sequence of commands will result in a seriously broken environment:
poetry env use 3.14
poetry add opencv-python
poetry install
That is all! Poetry will install numpy (2.2.6) and opencv-python (4.12.0.88), but unfortunately this version of numpy does not work correctly on Python 3.14.
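To confirm what actually ended up in the environment, a quick check from inside it can be used; this is just a small sketch, assuming opencv-python and numpy were installed as above:

import sys
import numpy
import cv2  # provided by the opencv-python package

# On the affected setup this reports Python 3.14.x together with a
# numpy older than 2.3, which is the problematic combination.
print("Python:", sys.version.split()[0])
print("numpy:", numpy.__version__)
print("opencv-python:", cv2.__version__)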
The bug is illustrated by this program:
import numpy

N = 128
x = numpy.ones(N) * 2
X, Y = numpy.meshgrid(x, x)
A = numpy.array((X, Y))
print(A[-1, -1, -1])

def f1(a):
    dx = a.copy()
    # This line appears to erroneously change dx in place!
    _ = dx**2
    return dx

# Expect dx to be an unchanged copy of a, so this should also print 2:
print(f1(A)[-1, -1, -1])
The program above will incorrectly print “2” and “4”. In a correctly working environment it prints “2” and “2”! I posted a question about this on the numpy GitHub, which led to an interesting discussion.
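For completeness, the same reproduction can be written as a self-checking assertion; this is a minimal sketch of that idea, assuming the behaviour described above (it passes on a correct environment and raises on the broken one):

import numpy

N = 128
x = numpy.ones(N) * 2
X, Y = numpy.meshgrid(x, x)
A = numpy.array((X, Y))

before = A.copy()
dx = A.copy()
_ = dx**2  # a throw-away computation that must not touch dx
assert numpy.array_equal(dx, before), "the copy was modified in place"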
Immediate Cause
The issue is that numpy >=2.0,<2.3 is not compatible with Python >= 3.14! The fix for Python 3.14 was introduced in https://github.com/numpy/numpy/pull/28748. Versions of numpy before 2.3 do not work with Python 3.14, failing in the very subtle way shown above.
But opencv-python currently requires numpy < 2.3.
And Python packaging has no effective way to cap the supported Python version, as pointed out by https://github.com/seberg with a link to https://discuss.python.org/t/requires-python-upper-limits/12663. Therefore Poetry will install a version of numpy that is broken in this situation.
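Until the dependency constraints line up, one stop-gap is to fail fast when the broken combination is detected at startup; the version boundaries below are an assumption based on the fix referenced above (numpy >= 2.3 needed on Python >= 3.14):

import sys
import numpy

# Hypothetical guard: refuse to run numpy < 2.3 on Python >= 3.14,
# the combination shown to misbehave above.
numpy_major_minor = tuple(int(p) for p in numpy.__version__.split(".")[:2])
if sys.version_info >= (3, 14) and numpy_major_minor < (2, 3):
    raise RuntimeError(
        f"numpy {numpy.__version__} is known to misbehave on Python "
        f"{sys.version.split()[0]}; upgrade numpy to >= 2.3 or use an "
        "older Python interpreter"
    )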
Forward compatibility
The bigger issue is that Python does not maintain good forward compatibility for compiled extensions. Versions of numpy that were released before Python 3.14 are broken on this version of Python; in other words, there is no forward compatibility between Python 3.13 and Python 3.14.
Luckily, the error I came across was already known and corrected in the most recent numpy (even if I could not use it because of the opencv-python constraint). But this type of error is in general difficult to identify, because it is caused by a new, erroneous side effect which, additionally, only occurs for certain data sizes. A very tricky combination!