Machine Vision

How to do a simultaneous snap from two GigE cameras?

Solved!

I have 2 Basler Scout 780-54gm cameras (monochrome GigE) that I am triggering externally at 10 Hz. This frame rate is modest and the resulting data rate is not particularly high. I would like to get images from both cameras from the same trigger pulse. I have an Intel Pro/1000 NIC and I am using the NI driver that shows up as National Instruments GigE Vision Adapter. When I use the Basler Pylon Viewer (the software that comes with the cameras) I can see perfect images from both cameras at the same time at 10 Hz. This tells me that the network, NIC and drivers can handle the data rate with no problem. I have jumbo frames enabled.

When I try to acquire images into LabVIEW 8.6 with the IMAQdx VIs, though, I run into problems. I can acquire from each camera alone, but I can't get both at the same time cleanly: the images are shot through with black horizontal lines that appear and move erratically. I have attached a VI that shows the problem. I configure and start both acquisitions, then enter a while loop that reads the images. If I run the acquisitions sequentially instead, starting and stopping them inside the loop and making sure that one stops before the other starts, I get high-quality images, but they are not from the same trigger pulses, and the overhead of starting and stopping the acquisitions slows things down so that I can't get an image from every trigger.
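(For reference, the flow described above corresponds roughly to the following NI-IMAQdx C-API sequence. This is only a sketch of what the attached VI does; camera names, buffer counts, and error handling are placeholders, and the attached VI remains the actual reproduction case.)

/* Rough C-API equivalent of the attached VI: open, configure, and start both
 * cameras, then read a frame from each inside a loop. Error checking omitted. */
#include <stdio.h>
#include "niimaqdx.h"   /* NI-IMAQdx C API */
#include "nivision.h"   /* NI Vision image type */

int main(void)
{
    IMAQdxSession cam[2];
    Image *img[2];
    const char *names[2] = { "cam0", "cam1" };   /* names as shown in MAX (placeholders) */

    for (int i = 0; i < 2; i++) {
        IMAQdxOpenCamera(names[i], IMAQdxCameraControlModeController, &cam[i]);
        IMAQdxConfigureAcquisition(cam[i], 1, 10);       /* continuous, 10 buffers */
        img[i] = imaqCreateImage(IMAQ_IMAGE_U8, 0);      /* 8-bit monochrome */
        IMAQdxStartAcquisition(cam[i]);
    }

    /* Cameras are triggered externally at 10 Hz; grab the next buffer from
       each camera on every pass of the loop. */
    for (int frame = 0; frame < 100; frame++)
        for (int i = 0; i < 2; i++)
            IMAQdxGetImage(cam[i], img[i], IMAQdxBufferNumberModeNext, 0);

    for (int i = 0; i < 2; i++) {
        IMAQdxStopAcquisition(cam[i]);
        IMAQdxCloseCamera(cam[i]);
        imaqDispose(img[i]);
    }
    return 0;
}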

  My guess is that this is a problem with the way LabVIEW is managing memory, resulting in corruption during simultaneous acquisitions. Are there any settings I can change to cure this problem? I have experimented with everything I know about (number of buffers, etc.), but with no success yet.

 

Rich

 

P.S. I posted this in the LabVIEW forum a couple of days ago but got no responses. Sorry if you see this twice.

Message 1 of 4
Solution
Accepted by Richard2950

Richard,

 

It sounds like you are simply experiencing packet loss. You could wire up a property node to query Lost Packets to confirm this. The reason this is happening is that although the average data rate of both cameras is well below the limit of the NIC, each camera likely transfers each image at a peak bandwidth close to the full 1000 Mbit/s. When the image transfers happen to overlap, you'll be sending 2000 Mbit/s down a 1000 Mbit/s pipe and losing packets.
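(To put numbers on that: a roughly 780 x 580, 8-bit mono frame is about 3.6 Mbit, so at 10 Hz each camera averages only ~36 Mbit/s, yet each individual frame goes out at close to the full line rate. A quick way to confirm the diagnosis programmatically is to read the lost-packet counter; the attribute path below is an assumption about how the attribute tree is named, so verify it in MAX or with IMAQdxEnumerateAttributes.)

/* Sketch: read the lost-packet counter for a session to confirm packet loss.
 * The attribute path is assumed -- check the exact name in MAX. */
#include <stdio.h>
#include "niimaqdx.h"

#define LOST_PACKETS_ATTR "StatusInformation::LostPacketCount"   /* assumed path */

void printLostPackets(IMAQdxSession cam, const char *label)
{
    uInt32 lost = 0;
    IMAQdxGetAttribute(cam, LOST_PACKETS_ATTR, IMAQdxValueTypeU32, &lost);
    printf("%s: %u lost packets\n", label, (unsigned)lost);
}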

 

One easy way to solve this is to set the Acquisition Attributes->Advanced Ethernet->Peak Bandwidth Used attribute to 500 Mbit/s for both cameras. This should ensure that each camera never uses more than half the link bandwidth. There are also other options, such as staggering the transfer delays, but those would require more tuning as you change other parameters.
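(In attribute terms, rather than through the LabVIEW property node, the same change would look roughly like the sketch below. The exact attribute path, value type, and units are assumptions, so confirm them against the attribute tree in MAX before relying on this.)

/* Sketch: cap each camera's peak bandwidth at ~500 Mbit/s so that two
 * overlapping frame transfers cannot exceed the 1 Gbit/s link.
 * Attribute path, value type, and units are assumptions -- verify in MAX. */
#include "niimaqdx.h"

#define PEAK_BW_ATTR \
    "AcquisitionAttributes::AdvancedEthernet::BandwidthControl::DesiredPeakBandwidth"

void limitPeakBandwidth(IMAQdxSession cam0, IMAQdxSession cam1)
{
    /* value assumed to be in Mbit/s */
    IMAQdxSetAttribute(cam0, PEAK_BW_ATTR, IMAQdxValueTypeF64, 500.0);
    IMAQdxSetAttribute(cam1, PEAK_BW_ATTR, IMAQdxValueTypeF64, 500.0);
}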

 

I'm not sure why the Basler software doesn't show this, although their default bandwidth setting might just be lower, at the expense of added latency in delivering the images.

 

Hope this helps,
Eric

Message 2 of 4

BlueCheese, your suggestions indirectly helped me to solve the problem. I could not find the "Acquisition Attributes->Advanced Ethernet->Peak Bandwidth Used" attribute that you described. Using MAX I did find "Camera Attributes->Transport Layer->Bandwidth Assigned", but it is read-only; the description says it represents a combination of the packet size and the inter-packet delay, and it was set to 125 MBytes/s (1 Gbit/s). Based on that, I reduced the packet size from 9016 to 2048 and my problem went away.
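(For anyone following the same route: with smaller packets each frame is spread over many more packets and inter-packet gaps, which effectively lowers the per-camera peak bandwidth. Programmatically the change would look roughly like the sketch below; "PacketSize" is a standard GigE Vision attribute, but treat the exact IMAQdx path as an assumption and verify it in MAX.)

/* Sketch: drop the GigE Vision packet size from jumbo (9016) to 2048 bytes
 * to lower each camera's effective peak bandwidth.
 * The attribute path is assumed -- verify it in MAX for your camera/driver. */
#include "niimaqdx.h"

#define PACKET_SIZE_ATTR "AcquisitionAttributes::PacketSize"   /* assumed path */

void reducePacketSize(IMAQdxSession cam)
{
    IMAQdxSetAttribute(cam, PACKET_SIZE_ATTR, IMAQdxValueTypeU32, (uInt32)2048);
}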


Thanks,

Richard

Message 3 of 4
I found the setting you described in the property node for the IMAQdx session, so now I can set it inside my VI and be sure that nobody has messed with it.
Message 4 of 4