Calculates the short-time averaged RMS series.
STARMS(series, intv)
series | - | An input series. |
intv | - | Optional. A real number, the duration in seconds of each RMS segment. Defaults to 1.0 second. |
A series, the short-time averaged RMS of the input.
W1: gsin(1000, .01, 1)
W2: starms(W1)
W2 consists of a 10-point series where each point has the value 0.707107, the RMS value of each 1 second segment of W1.
W4: starms(W1, 0.1)
W4 consists of a 100-point series where the values now vary, since the RMS value of W1 changes from one 0.1 second segment to the next.
The number of points in each RMS segment is

segsize = int(interval / deltax(s))

where s is the input series and deltax(s) is its sample spacing.
The segments are non-overlapping.
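The segmentation described above can be sketched in Python. This is a minimal illustration of the technique, not the actual STARMS implementation: it assumes the series is a plain list of samples, computes segsize = int(interval / deltax), and drops any trailing partial segment (an assumption; the reference does not specify how a remainder is handled).

```python
import math

def starms(series, deltax, interval=1.0):
    # Short-time averaged RMS over non-overlapping segments.
    # series: list of sample values; deltax: sample spacing in seconds.
    segsize = int(interval / deltax)       # points per segment
    nsegs = len(series) // segsize         # trailing partial segment dropped (assumption)
    out = []
    for k in range(nsegs):
        seg = series[k * segsize:(k + 1) * segsize]
        out.append(math.sqrt(sum(v * v for v in seg) / segsize))
    return out

# Stand-in for gsin(1000, .01, 1): a 1 Hz sine, 1000 points at 0.01 s spacing.
w1 = [math.sin(2 * math.pi * n * 0.01) for n in range(1000)]

w2 = starms(w1, 0.01, 1.0)   # 10 segments; each holds one full cycle, RMS = 0.707107
w4 = starms(w1, 0.01, 0.1)   # 100 segments; each covers a tenth of a cycle, so RMS varies
```

With a 1.0 second interval each segment spans exactly one cycle of the 1 Hz sine, so every RMS value is 1/sqrt(2); with a 0.1 second interval each segment sees a different portion of the cycle, which is why the W4 values vary.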