I have an object that is moving towards target points, and I want to determine when it reaches or passes through them. I receive points in real time, which represent the current position of the object. The position is a 6-D vector. The data can be downloaded from here.
Initial approach
For each received point:
- Calculate the distance between the current point and the target point.
- As the object approaches the target, the distance decreases. If the current distance is greater than the previous distance, I consider the object to have reached the target. That is good enough for my purposes. (A minimal sketch of this check is shown after this list.)
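To make the approach concrete, here is a minimal Python sketch of the check described above, assuming the incoming positions are NumPy-compatible 6-D vectors. The function name and structure are illustrative, not from the original post:

```python
import numpy as np

def detect_arrival(positions, target):
    """Naive arrival check: return the index of the first sample where
    the distance to the target stops decreasing.

    positions: iterable of 6-D position vectors received in real time
    target:    the 6-D target point
    """
    target = np.asarray(target, dtype=float)
    prev_dist = None
    for i, p in enumerate(positions):
        dist = np.linalg.norm(np.asarray(p, dtype=float) - target)
        # If the distance grew compared with the previous sample,
        # assume the object just reached (or passed) the target.
        if prev_dist is not None and dist > prev_dist:
            return i
        prev_dist = dist
    return None  # target was never reached in this stream
```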
Problem
The problem is that the measured position sometimes fluctuates. The distance can increase and then decrease again (noise) even though the object has not yet reached the target. Because of this fluctuation, my algorithm produces a false positive whenever the current distance happens to be greater than the previous one.
The image shows false positives and true positives. When the real upward trend begins, I know that the object has reached the target point. My simple starting idea is to get rid of the false positives by applying a smoothing filter and fitting a curve. After that, I can find where the positive slope begins, which tells me when the object reached the target.
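As one way to prototype this idea, here is a sketch that smooths the buffered distance series with a Savitzky-Golay filter (one common choice of smoothing filter; the original post does not name one) and then looks for a sustained upward trend. The `window`, `polyorder`, and `min_rise` parameters are placeholder values to tune, not values from the original post:

```python
import numpy as np
from scipy.signal import savgol_filter

def arrival_index(distances, window=11, polyorder=2, min_rise=3):
    """Smooth the distance-to-target series and report the index where
    a sustained upward trend starts (i.e. the presumed arrival).

    distances: 1-D array of distances to the target over time
    window:    odd window length for the Savitzky-Golay filter
    polyorder: polynomial order of the local fit (must be < window)
    min_rise:  number of consecutive positive slopes required before
               declaring a real upward trend rather than noise
    """
    d = np.asarray(distances, dtype=float)
    # The Savitzky-Golay filter fits a low-order polynomial in a
    # sliding window, suppressing noise while preserving the trend.
    smooth = savgol_filter(d, window_length=window, polyorder=polyorder)
    slope = np.diff(smooth)
    run = 0
    for i, s in enumerate(slope):
        run = run + 1 if s > 0 else 0
        if run >= min_rise:
            return i - min_rise + 1  # index where the rise began
    return None  # no sustained upward trend found
```

Requiring several consecutive positive slopes (rather than a single one) is what distinguishes this from the naive check: an isolated noisy bump no longer triggers a detection.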
Should I try this approach, or are there better recommendations, including choices of smoothing filter, curve fitting, etc.?
