Chart x scale issues

I am using LabVIEW 7.1.
I am experiencing some issues with the chart x scale while doing analog data acquisition. The scale automatically shows me only 1000 points. Since I use a scan rate of 1000, I conveniently use an x-scale multiplier of 0.001 to see seconds on my x scale instead of data points. So far so good.
The problem is that the span of the x scale is never more than 1023 points. I would like to see 5 or 10 full seconds in view, but however I try to change it, the number instantly snaps back. As the attached picture shows, you can also set a minimum and a maximum, but that doesn't let me set a wider span either.
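As a quick sanity check of the numbers involved (a Python sketch, since the arithmetic is language-independent; the 10 s target span is just the example from above):

scan_rate = 1000               # scans per second (acquisition rate)
multiplier = 1.0 / scan_rate   # x-scale multiplier: 0.001 makes the axis read in seconds

desired_span_s = 10                           # seconds we want visible at once
points_needed = desired_span_s * scan_rate    # 10 * 1000 = 10000 points
default_history = 1024                        # default chart history length

print(points_needed, "points needed vs", default_history, "kept by default")
# 10000 points needed vs 1024 kept by default, i.e. only ~1.02 s visible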
Additionally, the chart seems to lag: when I shake my accelerometer, it takes a second or two before the motion shows on the chart. This is not due to buffer size, because I keep that between 100 and 500 points (at most 0.5 seconds).
See my block diagram attached. I am using control refnums to put everything in a subVI, but I don't believe that causes my problems.
Thanks for any help.

Aart-Jan
Message 1 of 5
Hi,

You can right-click on the chart and set the chart history length to more than 1024 points, which is the default.
Another option is to use a waveform graph control, which has no history buffer and will show as much data as you wire to it.

Toshi
Message 2 of 5
I am used to answering my own questions, so here is part of the answer: the 1023-point limit (really 1024) can very easily be changed by right-clicking on the chart and choosing Chart History Length. Did that, and I can see 10 seconds now.

I also found a way to make the chart update instantly, but it only confuses me further. The smoothest combination seems to be the following settings: 500 scans/s, a loop delay of 50 ms, and 25 scans to read per loop. In this configuration the chart suddenly updates instantly. Furthermore, the backlog stays at 0 without the CPU climbing to 100% load (which I normally see when AI Read.vi has to wait for scans). So why is there suddenly such a big lag at a scan rate of 1000 when the backlog is minimal? I feel I am missing something here.
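As a check on why that particular combination is smooth (a small Python sketch using only the figures quoted above):

scan_rate = 500        # scans/s acquired by the board
scans_per_read = 25    # scans consumed per loop iteration
loop_delay_ms = 50     # wait per iteration

loops_per_s = 1000 / loop_delay_ms          # 20 iterations/s, ignoring processing time
read_rate = scans_per_read * loops_per_s    # 25 * 20 = 500 scans/s

# The read rate exactly matches the acquisition rate, so the FIFO backlog
# stays near zero and the chart keeps up with the incoming signal.
print(read_rate == scan_rate)   # True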

Aart-Jan
Message 3 of 5
Good to see you have answered your own question 🙂

Your system's behavior is OK.
What is happening is that you are sampling 500 scans/s and trying to retrieve 500 scans/s.
But in fact you retrieve this:

25 scans/read * (1000 ms / (50 ms delay + processing time)) <= 500 scans/s

Ideally the processing time would be 0, but in real life it is more than 0, so you end up reading fewer points than the board is acquiring.
I would recommend increasing the number of points to read per loop.
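To make that concrete (a sketch; the 10 ms processing time is an assumed figure for illustration, the real value depends on the diagram):

scan_rate = 500        # scans/s being acquired
loop_delay_ms = 50
processing_ms = 10     # assumption for illustration

for scans_per_read in (25, 30, 50):
    read_rate = scans_per_read * 1000 / (loop_delay_ms + processing_ms)
    if read_rate >= scan_rate:
        status = "keeps up"
    else:
        status = f"backlog grows by {scan_rate - read_rate:.0f} scans/s"
    print(f"{scans_per_read} scans/read -> {read_rate:.0f} scans/s ({status})")

# 25 scans/read -> 417 scans/s (backlog grows by 83 scans/s)
# 30 scans/read -> 500 scans/s (keeps up)
# 50 scans/read -> 833 scans/s (keeps up; AI Read simply waits for data)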

Your eye only needs about 20 frames/s to see an image moving continuously.

Hope that helps
Message 4 of 5
Your point about processing delay is good, but it doesn't make all my questions disappear. Consider the following:
The eye will notice refresh rates slower than 20 frames per second, so the total loop time (including processing delay) should be less than 50 ms. This implies one should make the delay small, so that a loop iteration finishes as soon as enough data points are available in the FIFO buffer and the delay has expired. The backlog indicates whether I am lagging behind in processing the scans, so it should stay at zero or close to it. When I do this, I still encounter the lag of a couple of seconds on screen that I do not experience at 500 scans/s.
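Following that reasoning at the higher rate (a quick check with the numbers from this thread):

scan_rate = 1000             # scans/s
max_loop_period_s = 1 / 20   # 50 ms per iteration for a 20 frames/s refresh
min_scans_per_read = scan_rate * max_loop_period_s
print(min_scans_per_read)    # 50.0 -> read at least 50 scans per loop to keep up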

On top of all this I see the following odd behaviour:
The backlog number quickly increases and then decreases again. This means there are many loop iterations in which no (or not enough) scans are read. After a second or less the backlog quickly drops back down. Then there is a moment of nothing, making the graph stand still! Then the cycle starts over. Through all of this, the graph still lags.
I am really confused.
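A toy model of how a small per-iteration deficit turns into seconds of visible lag (illustrative only, not the actual DAQ driver behaviour; the 25 scans/read size and 12 ms processing cost are assumed figures):

scan_rate = 1000          # scans/s
scans_per_read = 25       # assumed per-loop read size
loop_delay_s = 0.050
processing_s = 0.012      # assumed per-iteration processing cost

t = 0.0
consumed = 0
for i in range(100):
    t += loop_delay_s + processing_s    # one loop iteration elapses
    consumed += scans_per_read          # scans drawn on the chart so far
    backlog = int(t * scan_rate) - consumed

lag_s = backlog / scan_rate
print(f"after {t:.1f} s: backlog = {backlog} scans, chart lags {lag_s:.1f} s")
# after 6.2 s: backlog = 3700 scans, chart lags about 3.7 s behind the signal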

Aart-Jan
Message 5 of 5