Instead of recomputing the left sample index and the fractional offset
within that interval for each coordinate every time, do it once when we
preprocess the input coordinates.
One line fewer, faster, and arguably easier to read.
No behavior change.
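
A minimal sketch of the idea, with hypothetical names (this is not the actual code): the bracket for each input coordinate is computed once during preprocessing and then reused, instead of being re-derived for every lookup.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch: precompute, per input coordinate, the index of the
// sample to its left and the fractional offset inside that interval.
struct Bracket {
    size_t left_index { 0 }; // sample to the left of the coordinate
    float fraction { 0.0f }; // offset within [left_index, left_index + 1], in [0, 1]
};

static std::vector<Bracket> preprocess(std::vector<float> const& coordinates, size_t sample_count)
{
    std::vector<Bracket> brackets;
    brackets.reserve(coordinates.size());
    for (float c : coordinates) {
        // Assumes c is already normalized to [0, 1] and sample_count >= 2.
        float scaled = c * static_cast<float>(sample_count - 1);
        size_t left = static_cast<size_t>(scaled);
        brackets.push_back({ left, scaled - static_cast<float>(left) });
    }
    return brackets;
}
```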
Using `min()` to guarantee the left index is never equal to `size() - 1`,
even for an interpolation value of 1.0, takes less code and is arguably
easier to understand as well.
No behavior change.
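
A sketch of the clamping, with `std::min` standing in for whatever `min()` the codebase provides: clamping the left index to `size() - 2` means an interpolation value of exactly 1.0 still leaves a valid right neighbor at `left + 1`.

```cpp
#include <algorithm>
#include <cstddef>

// Sketch only: with the clamp, the left index can never be the last sample,
// so interpolating between left and left + 1 is always valid, even when the
// interpolation parameter t is exactly 1.0. Assumes sample_count >= 2.
static size_t left_sample_index(float t, size_t sample_count)
{
    float scaled = t * static_cast<float>(sample_count - 1);
    return std::min(static_cast<size_t>(scaled), sample_count - 2);
}
```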
I didn't find example code for this, and the AI assistant did very poorly
on it as well, so I had to write it all by myself!
It could probably be made much more efficient, but the overall shape is
roughly fine, I think.
* SampledFunction now keeps the StreamObject it gets data from alive
(doesn't matter too much in practice, but does matter in the test,
where nothing else keeps the stream alive).
* If a sample position is an integer, we would previously sample that
  position twice and then divide by zero when interpolating. Make sure the
  two sample positions are 1 unit apart (sketched below).
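
A hedged sketch of the second point (names are made up, not the project's API): taking the right sample at `floor(x) + 1` instead of `ceil(x)` keeps the two sample positions exactly one unit apart, so the interpolation weight's denominator can no longer be zero when `x` is an integer.

```cpp
#include <cmath>
#include <functional>

// Sketch of the fix: always sample 1 unit apart. Previously, an integral x
// gave left == right, so the weight (x - left) / (right - left) divided by zero.
static float interpolate_sample(float x, std::function<float(float)> const& sample_at)
{
    float left = std::floor(x);
    float right = left + 1.0f;                  // always exactly 1 unit to the right
    float weight = (x - left) / (right - left); // denominator is now always 1
    return (1.0f - weight) * sample_at(left) + weight * sample_at(right);
}
```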