LabVIEW


Anyone know why this VI won't work?


@lorc34 wrote:

I'm just wondering how it plots 1 million points over 1000 pixels, i.e. 1000 points per pixel. Isn't it only possible to plot 1 point per pixel?

 

Regardless, I will have to do some kind of data averaging for the plot, while writing the full data to a CSV file.

 

It's just that if there's a brief spike in the level, it would be very high while the rest of the data stays very low, so the average would come out much lower than the spike actually was.
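One common way around the spike-flattening problem is min/max decimation: for each pixel-wide bucket, keep both the minimum and maximum sample instead of the mean. This is a minimal Python sketch (not LabVIEW code) of the idea; the function name and bucket count are made up for illustration:

```python
# Min/max decimation: for each bucket of samples, keep both extremes,
# so a short spike survives where a plain average would flatten it.
def minmax_decimate(samples, n_buckets):
    bucket = max(1, len(samples) // n_buckets)
    out = []
    for i in range(0, len(samples), bucket):
        chunk = samples[i:i + bucket]
        out.append(min(chunk))   # keep the lowest value in this bucket
        out.append(max(chunk))   # keep the highest value in this bucket
    return out

data = [0.0] * 1000
data[500] = 5.0                      # one spike in otherwise flat data
reduced = minmax_decimate(data, 10)  # 1000 points -> 20 plot points
print(max(reduced))                  # 5.0 -- the spike is preserved
```

You can plot the reduced array while still streaming every raw sample to the CSV file, so nothing is lost on disk.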

 

Anyway, that is fine. Thank you, I now understand why my program was overloading (too large an array).

 

Thanks for the help!


Please read this: https://www.ni.com/docs/en-US/bundle/labview/page/memory-management-for-large-data-sets.html

 

[Attached screenshot: santo_13_1-1738166264361.png]

 

Santhosh
Soliton Technologies

Message 11 of 13

AFAIK, LabVIEW does the same thing that other graph, chart, and picture indicators do to make their images look good at many different scales and resolutions. It resamples to give you the best-looking image within a few constraints, so that it doesn't present information that doesn't really exist and lead you to misinterpret your data. This has been getting better over the years. I'm an oldster and remember a time when you couldn't resize an image (good ol' bitmaps!) without ending up with jagged lines and broken aspect ratios. Now you can stretch and morph images and their edges will still look clean and crisp.

 

algorithm - Reducing a graph's datapoints while maintaining its main features - Stack Overflow

Managing Large Data Sets in LabVIEW - NI Community

LabVIEW Pro Dev & Measurement Studio Pro (VS Pro) 2019
Message 12 of 13

You have some bad math earlier in this thread. Your sample code is sampling at 1000 Hz, not 10 Hz. Each reading is a Double, so 8 bytes per sample.

 

That's 8,000 bytes per second, 480,000 bytes per minute, 28,800,000 bytes per hour, or 288,000,000 bytes per 10 hours. 288 MB worth of data is a lot, but not enough to totally hose your computer. And your code is extremely simple, so you don't have a TON of duplicated data (though it IS duplicated at least a few times, and since you don't initialize your shift register you'll retain old data between runs).
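The arithmetic above can be checked in a few lines (plain Python, just to verify the figures; the rate and sample size are taken from the thread):

```python
# Memory footprint of the acquisition described above:
# 1000 S/s of 8-byte doubles (LabVIEW DBL), accumulated for 10 hours.
sample_rate_hz = 1000
bytes_per_sample = 8

per_second = sample_rate_hz * bytes_per_sample  # 8,000 B/s
per_minute = per_second * 60                    # 480,000 B/min
per_hour = per_minute * 60                      # 28,800,000 B/h
per_10_hours = per_hour * 10                    # 288,000,000 B (~288 MB)
print(per_10_hours)                             # 288000000
```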

 

If you truly only need 10 Hz data, just take a little care to adjust your code to only record data at 10 Hz. I'd also recommend not letting DAQmx Read "freewheel" like you're doing, as it will loop way too fast for part of the time.

 

Also, the help here: https://www.ni.com/docs/en-US/bundle/ni-daqmx/page/buffersize.html

 

indicates your current settings will give you a buffer of 10 kS, which should take 10 seconds to fill up. Your error code isn't a buffer overflow; it's a "your acquisition has stopped" error, so I'd look more at USB settings than buffer sizes (though see below... I don't know that this is the real error).

 

That said, try these mods: set Samples per Read to 100 in DAQmx Read, then average all of those points before adding them to your display. That will give you a net 10 Hz data rate and will dramatically reduce your memory footprint.
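The block-averaging step works out like this (a Python sketch of the arithmetic, not LabVIEW code; the list stands in for one DAQmx Read of 1 kHz data):

```python
# Reading 100 samples per DAQmx Read at 1000 S/s and averaging each
# block yields one plotted point every 100 ms, i.e. a net 10 Hz rate.
def block_average(samples, block_size=100):
    return [sum(samples[i:i + block_size]) / block_size
            for i in range(0, len(samples), block_size)]

one_second = list(range(1000))   # stand-in for 1 s of 1 kHz readings
points = block_average(one_second)
print(len(points))               # 10 display points per second
```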

 

Also, and probably more importantly, add an Or to the Stop terminal and wire in the output of DAQmx Read's Error Out terminal. Right now, your code won't stop if DAQmx Read returns an error. When you finally click Stop, you'll only get the last error: your error wire leaves the loop through a "Last Value" tunnel, not a shift register, so if Read errors out and stops the task, the next call to Read will return a different error and the original error is effectively thrown away.
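The stop logic being suggested, in pseudocode form (Python standing in for the block diagram; `read_fn` and `stop_requested` are hypothetical stand-ins for DAQmx Read and the Stop button):

```python
# Loop-stop logic: OR the user's stop request with the read error so
# the loop exits as soon as the acquisition fails, instead of
# discarding errors until the user finally clicks Stop.
def acquisition_loop(read_fn, stop_requested):
    error = None
    while True:
        data, error = read_fn()
        if error is not None or stop_requested():
            break          # stop on EITHER condition, like the Or gate
    return error           # the first error is reported, not lost

reads = iter([([1.0], None), ([], "task stopped")])
err = acquisition_loop(lambda: next(reads), lambda: False)
print(err)  # "task stopped"
```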

Message 13 of 13