Synchronisation of Data from Vision IMAQ and niScope

Solved!

Hi!

 

My task:

- Take images with a camera (Photonfocus MV-D1024E-160) connected to an NI frame grabber (PCIe-1427) at around 300 fps in free-running mode

- Measure signals with a PCI-5922 digitizer at 500 kS/s

- Store all the data on the disk

 

Now I have to find the time window of the digitizer signal that corresponds exactly to the moment each image was taken.

What would be your approach to doing something like that? (I use Vision and niScope.)

 

I made some first attempts using:

- Synchronize Timed Structures Starts VI

- Use the Global Start/End Time of a Timed Loop as time stamps

- There is also a microsecond status line stored in every image (which I could use for timing)

 

Last but not least:

- How would you implement the data storage?

- I made some attempts at saving the image data and the digitizer data in separate binary files (would TDMS be the better option?)

- Would you use queues?

 

I'm grateful for any conceptual hints! Just write if you need more detailed information!

 

Thanks

Tobi

Message 1 of 3

Hi,

 

some tips for your work:

 

 

- How would you implement the data storage?

- I made some attempts at saving the image data and the digitizer data in separate binary files (would TDMS be the better option?)

 

"NI-TDMS-Dateiformat": https://www.ni.com/en/support/documentation/supplemental/06/the-ni-tdms-file-format.html 

-> this gives you a better organisation of the data, and you can easily open the files again in LabVIEW and DIAdem
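
As a rough text illustration of that group/channel organisation (G code can't be pasted here, so this is a Python sketch using the third-party npTDMS package; the group, channel and property names below are only placeholders, not anything from your VIs):

import numpy as np
from nptdms import TdmsWriter, RootObject, GroupObject, ChannelObject, TdmsFile

samples = np.zeros(2000)        # stand-in for one fetched block of digitizer data

root = RootObject(properties={"operator": "Tobi"})
group = GroupObject("digitizer", properties={"sample_rate_Hz": 500e3})
channel = ChannelObject("digitizer", "ch0", samples,
                        properties={"points_per_fetch": 2000})

with TdmsWriter("scope_data.tdms") as writer:      # one file, organised as group -> channel
    writer.write_segment([root, group, channel])

data = TdmsFile.read("scope_data.tdms")["digitizer"]["ch0"][:]   # read back as a NumPy array

In LabVIEW itself the equivalent is the TDMS Open/Write/Close functions, and DIAdem reads the same file directly.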

 

 

- Would you use queues?

"Understanding Core LabVIEW Design Patterns": http://zone.ni.com/wv/app/doc/p/id/wv-2322 

See this webcast to decide whether you want or need to use queues.
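
As a text-only illustration of the producer/consumer pattern from that webcast (the real thing is of course two LabVIEW loops connected by a queue; this Python sketch only mirrors the structure, with a bounded queue and a stop sentinel standing in for the queue reference and a stop notifier):

import queue
import threading

data_q = queue.Queue(maxsize=100)   # bounded queue gives back-pressure if the disk falls behind
STOP = object()                     # sentinel that plays the role of a stop notifier

def producer():
    # acquisition loop: fetch a block, attach a timestamp, enqueue it
    for i in range(10):                              # stand-in for the acquisition loop
        data_q.put({"t0_ns": i, "data": [0] * 2000})
    data_q.put(STOP)

def consumer():
    # storage loop: dequeue blocks and write them to disk
    while True:
        item = data_q.get()
        if item is STOP:
            break
        # ... write item["data"] to file, keyed by item["t0_ns"] ...

threads = [threading.Thread(target=producer), threading.Thread(target=consumer)]
for t in threads:
    t.start()
for t in threads:
    t.join()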

 

 

Best regards

Suse

______________________________
Certified LabVIEW Developer (CLD)
Message 2 of 3
Solution accepted by topic author wickiwau

Hi Suse

Thanks a lot for the hints. After a lot of effort, I arrived at the following solution:

I'd be happy if you or someone else has comments on it (this stuff is pretty advanced for me):

 

Two independent VIs running at the same time (see attachments):

A) image consumer/producer

B) Digitizer consumer/producer

In addition to the data, I also save the Global Start Time from the Timed Loop for each dataset (afterwards I can relate the images to the digitizer signals using that timestamp, as sketched below).
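
In pseudo-code the matching step is only a bit of arithmetic. A minimal Python sketch, assuming both the image timestamp and the scope's start time come from the same time base and are in nanoseconds (the sample rate and window size are placeholders for my real settings, and the function name is made up):

SAMPLE_RATE = 500_000    # S/s of the PCI-5922
WINDOW = 1_000           # samples to keep on each side of the image instant

def digitizer_window(t_image_ns, t_scope_start_ns, scope_samples):
    # elapsed time since the scope's Global Start Time -> sample index
    dt_s = (t_image_ns - t_scope_start_ns) * 1e-9
    center = int(round(dt_s * SAMPLE_RATE))
    lo = max(center - WINDOW, 0)
    hi = min(center + WINDOW, len(scope_samples))
    return scope_samples[lo:hi]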

 

A) Image Consumer/Producer

Attachment A: After initialisation of the image acquisition (upper left corner), I have a producer loop (on top) which builds a cluster of the 8-bit image data array and the global timestamp of the Timed Loop. The consumer loop (bottom) dequeues the cluster and saves every image as a binary file with the nanosecond timestamp as the filename, roughly as in the sketch after this paragraph. Like this I'm able to stream images from the camera to the hard disk at about 150 fps at full resolution (1024 x 1024 pixels) and even faster at lower resolution (with a ROI).
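
A minimal Python sketch of what the consumer does per dequeued element (just a stand-in for the LabVIEW binary file functions; the folder and function name are placeholders):

import os
import numpy as np

def save_frame(frame_u8, t_ns, folder="frames"):
    # dump one 1024 x 1024 8-bit image; the nanosecond timestamp doubles as the filename
    os.makedirs(folder, exist_ok=True)
    path = os.path.join(folder, f"{t_ns}.bin")
    np.asarray(frame_u8, dtype=np.uint8).tofile(path)   # raw pixels, no header
    return path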

 

B) Digitizer Consumer/Producer

Attachment B1: After initialisation of NI-SCOPE, I fetch the digitizer data (points per fetch = 2000 at the moment) and build, as in A, a cluster with the Global Start Time (and the Global End Time for testing as well), which I enqueue in the producer loop.

Attachment B2: I dequeue the data in the consumer loop and write (stream) it to a TDMS file, roughly as in the sketch below.
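
Roughly, the consumer does the equivalent of this sketch (Python with the third-party npTDMS package as a stand-in for the TDMS functions; each fetched block is appended as a new segment, and the block's Global Start Time goes into a parallel channel so it stays aligned with the data; the dequeue_block callback is made up for the example):

import numpy as np
from nptdms import TdmsWriter, ChannelObject

def stream_to_tdms(dequeue_block, path="digitizer.tdms"):
    # dequeue_block() is assumed to return (start_time_ns, samples), or None when the stop notifier fires
    with TdmsWriter(path) as writer:
        while True:
            block = dequeue_block()
            if block is None:
                break
            t0_ns, samples = block
            writer.write_segment([
                ChannelObject("scope", "ch0", np.asarray(samples)),
                ChannelObject("scope", "t0_ns", np.array([t0_ns], dtype=np.int64)),
            ])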

 

To stop all the loops after acquisition and storage, I use a notifier.

 

Questions:

- Where do you see bottlenecks in this design (what could slow it down)?

- What could be done in a more efficient way?

- How precise is the Global End Time stamp (I want to find out exactly which image belongs to which digitizer signal)? I didn't find much information about these Global Times (how are they "produced" within LabVIEW?).

- Other comments, review on it?

 

Thanks again!

Tobi

 

 

Message 3 of 3