The idea is to take a moving variance over the data, and once the variance drops below a certain threshold, say 1e-6, that's when I know my data has converged. However, the problem with this method is that the detected convergence point depends on both the window size (the number of data points within the window) and the cut-off value. Specifically, the convergence point keeps moving later as I increase the window length, and earlier as I increase the cut-off value. I have been trying for the past few weeks to overcome this problem but can't come up with a solution.
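For reference, here is a minimal sketch of the moving-variance check I mean (the function name, window length, and test signal are my own choices for illustration, not anything standard):

```python
import numpy as np

def converged_index(data, window=50, tol=1e-6):
    """Return the index of the first point at which the variance of the
    trailing `window` data points drops below `tol`, or None if the
    series never satisfies the criterion."""
    data = np.asarray(data, dtype=float)
    for i in range(window, len(data) + 1):
        if np.var(data[i - window:i]) < tol:
            return i - 1  # last index inside the first "converged" window
    return None

# Example: a signal decaying toward a constant value of 5.
t = np.arange(500)
signal = 5.0 + np.exp(-t / 20.0)

idx_small = converged_index(signal, window=20, tol=1e-6)
idx_large = converged_index(signal, window=100, tol=1e-6)
```

Running this on the decaying signal shows exactly the sensitivity described above: the larger window includes older, more-varying points, so `idx_large` lands noticeably later than `idx_small`, even though the underlying signal and cut-off are identical.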

Is there a way to determine the best combination of these two parameters? Or has anyone else run into this problem and could lend some tips? This is for a research project, so there is no previously established converged value I can compare against; otherwise that would be the easiest approach. Alternatively, has anyone used a better method for determining convergence? Thank you for taking the time to help!