An interesting problem came up the other day. I have two data streams, call them A and B, that are output continuously (with different values).
In the ideal world, A and B are exact mirrors of each other: if A increases by x percent, then B decreases by x percent, and vice versa.
However, in my world, they are not perfect mirrors. Sometimes A increases by X percent while B decreases by only X - 0.Y percent, and vice versa when B increases. The difference is small but detectable.
I need to detect when the percent change of one stream differs from the mirrored percent change of the other by more than 0.3 percentage points. For example, say A = 30 and B = 100, and the next data points are A = 33 (a 10 percent increase) and B = 89.5 (a 10.5 percent decrease). I need to know that this has happened, since the gap between the two changes is 0.5 points, which is greater than 0.3.
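To make that arithmetic concrete, here is a minimal Python sketch of the check (the function name is my own; the values and the 0.3 threshold come from the example above):

```python
def divergence_pct(prev_a, cur_a, prev_b, cur_b):
    """Percentage-point gap between A's change and B's mirrored change."""
    pct_a = (cur_a - prev_a) / prev_a * 100   # +10.0 in the example
    pct_b = (cur_b - prev_b) / prev_b * 100   # -10.5 in the example
    # In the ideal world pct_a == -pct_b, so their sum should be zero.
    return abs(pct_a + pct_b)

gap = divergence_pct(30, 33, 100, 89.5)
print(gap > 0.3)  # prints True: the 0.5-point gap exceeds the threshold
```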
Now the problem would be easy if I knew their original values. However, this is a continuous data stream and I don't have a reference point. I don't need an absolutely accurate answer, just a relative one.
For example, as the data streams in, how can I tell "hey, buddy, they are going out of sync, watch out" without knowing what they once were?
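One observation that may help: since the comparison is between step-to-step percent changes, you never need the original values, only the previous sample of each stream. A hedged sketch of such a rolling monitor (the class and method names are my own invention):

```python
class MirrorMonitor:
    """Flags samples where A's and B's percent changes stop mirroring.

    Keeps only the previous sample of each stream, so no absolute
    reference point is needed.
    """

    def __init__(self, threshold_pct=0.3):
        self.threshold = threshold_pct
        self.prev_a = None
        self.prev_b = None

    def update(self, a, b):
        """Feed one (a, b) pair; return True if they diverged too far."""
        diverged = False
        if self.prev_a is not None:
            pct_a = (a - self.prev_a) / self.prev_a * 100
            pct_b = (b - self.prev_b) / self.prev_b * 100
            # Ideally pct_a == -pct_b, so any nonzero sum is drift.
            diverged = abs(pct_a + pct_b) > self.threshold
        self.prev_a, self.prev_b = a, b
        return diverged

mon = MirrorMonitor()
print(mon.update(30, 100))    # first sample, nothing to compare: False
print(mon.update(33, 89.5))   # 10% up vs 10.5% down: True
```

The first sample is only stored, not judged, because a percent change needs two points; after that, each new pair is compared against the immediately preceding one.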