Something seems inconsistent between the statistical channel option that calculates standard deviation and the ones that report minimum and maximum.
The DT800 occasionally reports a standard deviation of 0 even when the minimum and maximum values are not equal. With this simple program, most records summarize a sample of about 8 or 9 readings, but occasionally as few as 2 or 3.
In some of those rare cases the standard deviation is given as zero even though, judging from the min and max, it should be nonzero in its last three digits or so.
If there are only two readings, they must be the minimum and maximum values, so the standard deviation can be calculated independently, which I did for the first example line from the dataset.
BEGIN"DUMMY" RS RA300T 1V(FF5,AV)(SD)(MX)(MN)(NUM) LOGON END
Here are four example lines from the resulting dataset of over 200,000 lines. The "0.00000" a bit right of center is the standard deviation.
In the first example, the population standard deviation "sigma" is 0.00100398 and the sample standard deviation "s" is 0.00141984, assuming the minimum and maximum are the two raw readings.
D,083147,"DUMMY",2010/09/14,22:44:41,0.448608,1;A,0,29.72379,0.00000,29.72480,29.72279,2.00000;0095;11BC
D,083147,"DUMMY",2010/09/15,02:24:24,0.359497,1;A,0,30.79051,0.00000,30.79734,30.78182,3.00000;0095;A883
D,083147,"DUMMY",2010/09/15,04:01:39,0.643554,1;A,0,30.78642,0.00000,30.78894,30.78240,3.00000;0095;5ADA
D,083147,"DUMMY",2010/09/15,07:27:12,0.953491,1;A,0,30.60210,0.00000,30.60373,30.60047,2.00000;0095;B241
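The independent check for the two-reading case can be sketched in Python. This is my own reconstruction, not anything the logger runs: the field layout (average, SD, max, min, count between the semicolons) is assumed from the channel options in the program, and because it works from the rounded values printed in the record, the last digits may differ slightly from the figures quoted above.

```python
import math
import statistics

# First example record from the dataset. Field layout is assumed from the
# channel options (AV)(SD)(MX)(MN)(NUM): average, SD, max, min, count.
record = ('D,083147,"DUMMY",2010/09/14,22:44:41,0.448608,'
          '1;A,0,29.72379,0.00000,29.72480,29.72279,2.00000;0095;11BC')

# The channel data sits between the first and second semicolons.
fields = record.split(';')[1].split(',')
avg, sd, mx, mn, num = (float(x) for x in fields[2:7])

# With only two readings, they must be the minimum and the maximum,
# so the standard deviation can be recomputed from those alone.
assert int(num) == 2
sigma = statistics.pstdev([mn, mx])  # population SD: |max - min| / 2
s = statistics.stdev([mn, mx])       # sample SD: |max - min| / sqrt(2)

print(f"logger SD = {sd}, recomputed sigma = {sigma:.8f}, s = {s:.8f}")
```

Running this against the first record shows a clearly nonzero sigma from the printed min and max, while the logger's own SD field is 0.00000.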
Of course the standard deviation of two or three numbers isn't very useful, but how is it being calculated?