It would be great to have the option to get decimated (and/or interpolated) data from Citadel, up to a specified maximum number of points over a time interval, with these conditions:
1) Buffer only the decimated/interpolated data, instead of reading all the raw data and then decimating. The goal is to retrieve data from a large time period without filling up memory! (That's why this is best done down in the Citadel code, not up in LabVIEW.) See the sketch after this list for one way the buffering could work.
2) Use the same kind of smart decimation that the LV graph uses (maybe borrow that code). For example, if you display 100,000 values on an XY graph that are all = 10 except one single value = 100, the graph will show that spike no matter how small the graph is, i.e. no matter how much it has to decimate the data to fit it into relatively few pixels. It would be important to keep min, max, and NaN (break) values.
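To make the idea concrete, here is a minimal sketch of a streaming min/max decimator that satisfies both conditions. This is not Citadel's actual API or the LV graph's code; the sample source is whatever low-level read call Citadel would use internally, and all names here are hypothetical. The key point is that memory scales with the requested number of output points, not with the raw sample count, while per-bucket min, max, and NaN breaks are preserved.

```python
import math

def decimate_minmax(samples, t_start, t_stop, max_points):
    """Reduce an iterable of (t, v) samples to at most ~2*max_points,
    preserving each time bucket's min, max, and any NaN (break) values.
    Memory is O(max_points), independent of how many raw samples stream by.
    Hypothetical sketch -- not Citadel's real interface."""
    assert t_stop > t_start and max_points > 0
    n_buckets = max_points
    width = (t_stop - t_start) / n_buckets
    # Each bucket lazily holds [min, max, saw_nan]; raw data is never stored.
    buckets = [None] * n_buckets

    for t, v in samples:                 # 'samples' can be a generator
        if not (t_start <= t < t_stop):
            continue
        i = min(int((t - t_start) / width), n_buckets - 1)
        b = buckets[i]
        if math.isnan(v):
            if b is None:
                buckets[i] = [math.nan, math.nan, True]
            else:
                b[2] = True
        elif b is None:
            buckets[i] = [v, v, False]
        elif math.isnan(b[0]):           # bucket so far held only NaNs
            b[0] = b[1] = v
        else:
            b[0] = min(b[0], v)
            b[1] = max(b[1], v)

    out = []
    for i, b in enumerate(buckets):
        if b is None:
            continue                     # no data in this bucket
        t_mid = t_start + (i + 0.5) * width
        lo, hi, saw_nan = b
        if not math.isnan(lo):
            out.append((t_mid, lo))
            if hi != lo:
                out.append((t_mid, hi))  # the spike survives decimation
        if saw_nan:
            out.append((t_mid, math.nan))  # keep the plot break
    return out
```

Feeding this a generator that yields raw samples one at a time means only the bucket array ever lives in memory, which is condition 1; and a single value of 100 among 100,000 tens survives as its bucket's max, which is condition 2. (A full implementation would also keep each bucket's first/last samples so the min and max come out in their original order.)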