I’m looking to eliminate or reduce the digital signal’s unstable behavior using a node or group of nodes that averages the signal readings over a given time interval before sending the average to the LCD. I tried using the following node to achieve this, with no success. I even had a clock node in there at one point too.
I’ve downloaded it and look forward to giving it a try soon. I’m looking for a procedure for determining the Q, R, and P values. Is there any documentation outlining the steps to take to dial the node in? Is there a step-by-step process for determining these values? What do you do with the outputs Perror, Serror, and Eerror?
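For orientation (this is only a textbook sketch of a one-dimensional Kalman filter, not necessarily how this particular node is implemented), the three values play roughly these roles:

```cpp
// Minimal 1-D Kalman filter sketch, for intuition only.
// Q: process noise covariance  - how fast the true signal is allowed to drift.
// R: measurement noise covariance - how noisy each individual reading is.
// P: estimate error covariance - maintained by the filter itself; the initial
//    value mainly affects how quickly the estimate settles after startup.
struct Kalman1D {
  float q, r, p, x;  // x is the current estimate

  Kalman1D(float q_, float r_, float p0, float x0)
      : q(q_), r(r_), p(p0), x(x0) {}

  float update(float measurement) {
    p += q;                      // predict: uncertainty grows by Q
    float k = p / (p + r);       // Kalman gain: how much to trust the reading
    x += k * (measurement - x);  // correct the estimate toward the reading
    p *= (1.0f - k);             // uncertainty shrinks after the correction
    return x;
  }
};
```

With that picture, a common empirical approach is to estimate R from the observed noise variance of the raw readings, start with a small Q, and increase Q until the output follows real changes quickly enough without becoming noisy again. I can’t say what the node reports on Perror, Serror, and Eerror without checking its source.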
Where have I gone wrong with this patch? I was trying to send a digital value to “running-avg” every 200 milliseconds to store in the queue. This would continue until 15 values were collected. The average from the 15 values would then be displayed in the “watch”. So, every 3 seconds the “AVG” value would update (0.2 x 15). What am I missing?
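For reference, the behaviour I was expecting corresponds roughly to this plain Arduino sketch (just an illustration of the intent; `readSignal()` is a hypothetical stand-in for whatever feeds running-avg):

```cpp
// Collect 15 samples at 200 ms intervals, then output their average,
// so the displayed value updates once every 3 seconds.
const unsigned long SAMPLE_INTERVAL_MS = 200;
const int SAMPLE_COUNT = 15;

float samples[SAMPLE_COUNT];
int sampleIndex = 0;
unsigned long lastSample = 0;

float readSignal() {
  return analogRead(A0);  // placeholder for the actual signal source
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  unsigned long now = millis();
  if (now - lastSample >= SAMPLE_INTERVAL_MS) {
    lastSample = now;
    samples[sampleIndex++] = readSignal();
    if (sampleIndex == SAMPLE_COUNT) {        // 15 samples = 3 s
      float sum = 0;
      for (int i = 0; i < SAMPLE_COUNT; i++) sum += samples[i];
      Serial.println(sum / SAMPLE_COUNT);     // "AVG" updates every 3 s
      sampleIndex = 0;                        // start the next batch
    }
  }
}
```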
You have to grab the little tab on the right side of the running-avg node and expand it so there are 15 dummy pins if you want the average of 15 values. The dummy pin is not used (hence the name); giving it a value does nothing. It exists only to take advantage of XOD's variadic node feature.
Running-avg's AVG output should have a value immediately on startup. Since the values initialize to zero, the initial AVG will be 0. After 200 ms, it will average 14 zeros and the new value. After 15 × 200 ms you will start getting valid data (15 real samples averaged, with no default zeros), and it will then update every 200 ms with the average of the last 15 values. It is a running average: every time a new value is read, it drops the oldest value and averages the last 15 samples.
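In code terms, the behaviour described above is a sliding window, something like this (an illustrative sketch, not the node's actual source):

```cpp
// Illustrative sliding-window (running) average: the buffer starts as zeros,
// each new sample replaces the oldest one, and the output updates on every
// sample with the average of the last 15 slots.
const int WINDOW = 15;

float buf[WINDOW] = {0};  // all zeros at startup, like the node's defaults
int oldest = 0;
float sum = 0;

float runningAvg(float newValue) {
  sum -= buf[oldest];        // drop the oldest value from the total
  buf[oldest] = newValue;    // overwrite it with the new sample
  sum += newValue;
  oldest = (oldest + 1) % WINDOW;
  return sum / WINDOW;       // average of the last 15 slots
}
```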
As cesar said, the library includes an example for each of its nodes (utils-example-avg for the running-avg example).