
Passing Data between cRIO targets

Hi All,

 

I have two CompactRIO chassis, each collecting data at 800 samples per second across 52 channels.

 

I need to transfer all the data from the cRIOs to a network-accessible storage (NAS) device.

 

So far I have tried using FTP, transferring data files from cRIO A to cRIO B and then from B to the NAS device.  This did not work: the files coming from cRIO A always, without fail, ended up corrupted, generating error 116 when I try to read them through LV.  (I don't know how else to check whether the files are corrupt other than through LV.)  The files generated on cRIO B, however, always opened fine, even though the saving method was identical across the two cRIOs.
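In case it's useful to anyone: one thing I've been told is worth ruling out is the FTP transfer running in ASCII mode rather than binary mode, since ASCII mode rewrites bytes and will reliably corrupt binary files.  And one way to check a file for corruption outside of LV is to compare checksums of the original and the transferred copy, e.g. with a few lines of Python (the file names here are made up):

    import hashlib

    def md5_of(path, chunk_size=1 << 20):
        # Hash in 1 MB chunks so big data files never need to fit in memory.
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    # Made-up names: the original file from cRIO A and the copy on the NAS.
    same = md5_of("run001_crioA.bin") == md5_of("run001_nas.bin")
    print("byte-identical" if same else "corrupted in transit")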

 

I am now using network streams to stream the data from cRIO A to cRIO B.  So far I have written the streaming code and am displaying the data, system memory information and CPU loads on the front panel, i.e. no save functions as yet.  What I find is that the memory usage on both cRIOs increases until one or the other crashes; it seems to be a bit hit-or-miss which one goes, but generally it is cRIO A.  From that point onwards the memory usage on cRIO B (assuming it is A that has crashed) remains constant, despite B continuously generating data.

 

The data from cRIO A comes into cRIO B and is transferred into a queue.  Ultimately the intention is for the data in the queue to be saved to a file, but for the moment I am simply displaying it on the front panel.  I would have assumed that this would cause the queue, which is being flushed, to be emptied, the data to be discarded, and the memory purged...  Obviously I am wrong.
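For reference, the shape I'm aiming for on cRIO B is the usual producer/consumer pattern: one loop reads from the network stream and enqueues, a second loop dequeues for display (and eventually for saving).  A rough sketch in Python rather than G, with a bounded queue so a slow consumer blocks the reader instead of letting memory run away; the receive and display functions are placeholders, not anything from the attached project:

    import queue
    import threading
    import time

    def receive_block():
        # Placeholder for the network-stream read: one 52 x 800 block per second.
        time.sleep(1)
        return [[0.0] * 800 for _ in range(52)]

    def display(block):
        # Placeholder for the front-panel update (later, the file save).
        print("got a block of", len(block), "channels")

    data_q = queue.Queue(maxsize=10)  # bounded: ~10 s of data at one block per second

    def reader():
        while True:
            data_q.put(receive_block())  # blocks when full, so memory can't grow unchecked

    def consumer():
        while True:
            block = data_q.get()  # dequeuing frees the element for garbage collection
            display(block)

    threading.Thread(target=reader, daemon=True).start()
    threading.Thread(target=consumer, daemon=True).start()
    time.sleep(5)  # let it run for a few blocks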

 

I have attached the project; any help would be gratefully received.

 

Dom

 

(PS: I started doing this in July last year; it's getting embarrassing now.)

Message 1 of 5

An Update,

 

I have tried moving the whole network stream inside a while loop, my thinking being that, since the endpoints would constantly be created and destroyed, it might free up some system resources.

 

Alas, all that seemed to happen was that the cRIOs were unable to hold the connection for any length of time, and therefore didn't transfer anything.
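Which I take as confirmation that the endpoints are meant to be created once, outside the loop, with only the reads and writes happening inside it.  The same shape sketched with plain TCP sockets rather than network streams (the host, port and data here are made up):

    import socket

    HOST, PORT = "crio-b.local", 6341  # made-up address and port

    def next_block():
        # Placeholder for one second of acquired data: 52 x 800 singles as raw bytes.
        return b"\x00" * (52 * 800 * 4)

    with socket.create_connection((HOST, PORT)) as conn:  # connect once, before the loop
        for _ in range(60):             # stands in for the acquisition loop
            conn.sendall(next_block())  # per iteration, only read/write -- no reconnect
    # the connection is torn down once, after the loop finishes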

 

Dominic

Message 2 of 5

Ok,

 

So it seems my buffer size was set way too high.

 

I am passing clusters of data straight from a data queue into the network streams.  My thinking was along the following lines:

 

I have an array of dimensions 52 x 800 containing singles, generated once every second.  Even though it was being passed as a cluster from the data queue, I assumed I needed to set my buffer size to 41,600 elements.

 

After interrogating my memory usage with the RT Get Memory Usage tool, I found that I had a memory leak: after a short while the cRIO would crash.

 

So, after initially thinking it was something to do with the data queues, I spent ages fiddling with the queue structure, obviously to no avail.  Eventually I put two and two together, read this tutorial, and discovered the info about the buffer size: I hadn't realised that the whole array would be counted as a single element.  Combine that with the fact that buffer elements are not overwritten until everything in the buffer has been read (which is what gives network streams their "lossless" transfer ability), and I was effectively trying to fill 41,600 elements of about 163 KB (I think) each, at one element per second.

 

No wonder it crashed after a while; the cRIO doesn't have 6.6 GB of memory available for such things!

 

My buffer size is now 10, which gives me 10 seconds of redundancy in case of a network outage, and that seems to have solved one of my problems.
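For anyone who wants to sanity-check the arithmetic (a single is 4 bytes):

    channels, rate, bytes_per_single = 52, 800, 4

    bytes_per_element = channels * rate * bytes_per_single
    print(bytes_per_element)   # 166,400 bytes, i.e. the ~163 KB per element

    old_buffer = 41600 * bytes_per_element
    print(old_buffer / 2**30)  # ~6.4 GiB -- the "6.6 gig" above, give or take units

    new_buffer = 10 * bytes_per_element
    print(new_buffer / 2**20)  # ~1.6 MiB -- ten seconds of headroom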

 

Fun and games!!!!

Message 3 of 5

Hi,

 

I'm glad you solved the issue you were having, and thank you for posting the solution.  Post back if you need any more help!

 

Lewis Gear CLA
LabVIEW UAV

Message 4 of 5

No worries,

 

I am kind of viewing this board as some sort of self-help tool at the moment, where you can rant away at no one and hopefully find a solution in the crazed ramblings...

Message 5 of 5