05-10-2013 03:07 PM
I have been trying to get a Basler camera working with LabVIEW and MAX again. I say again, because it used to work. I am not quite sure what happened, other than upgrading to 2012 SP1 and the new Basler driver 3.2. I went through and followed Basler's instructions for the driver install and the network card settings. Please check out the screenshot. The weird thing for me is that it will show a steady image for three or four seconds, and then it starts losing packets and never recovers. Why would it be able to keep up for a few seconds and then have the problem? I would assume that if the setup wasn't right, it would never get a full image. When I had this issue before, I was able to mess with the timing of the resend messages, and that solved the issue when running LabVIEW 2011; but that doesn't seem to work now.
The screenshot is from MAX, but it acts the same in my LabVIEW app. When I run the Basler pylon viewer, it doesn't have any of these issues. Any ideas? Thanks.
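For what it's worth, I believe the resend timing I adjusted back then was under the advanced Ethernet / resend parameters in the IMAQdx attribute tree. Going from memory, a rough C-API sketch of that tweak is below; the exact attribute paths and units are approximate and may differ by driver version, so treat them as placeholders and check the attribute tree in MAX.

/* Sketch of the resend-timing tweak that helped me under LabVIEW 2011.
   Attribute paths and units are from memory (assumptions), and "cam0" is
   just whatever name MAX gave the camera. */
#include <NIIMAQdx.h>

int main(void)
{
    IMAQdxSession session;

    IMAQdxOpenCamera("cam0", IMAQdxCameraControlModeController, &session);

    /* Allow more time before a packet is declared missing and a resend is
       requested (units assumed to be milliseconds). */
    IMAQdxSetAttribute(session,
        "AcquisitionAttributes::AdvancedEthernet::ResendParameters::MissingPacketTimeout",
        IMAQdxValueTypeU32, (uInt32)20);

    /* Allow more resend attempts per missing packet. */
    IMAQdxSetAttribute(session,
        "AcquisitionAttributes::AdvancedEthernet::ResendParameters::MaxResendsPerPacket",
        IMAQdxValueTypeU32, (uInt32)50);

    IMAQdxCloseCamera(session);
    return 0;
}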
05-11-2013 04:02 AM
Are you using the NI High Performance Driver for your Intel NIC?
Please read the following KB:
Troubleshooting GigE Vision Cameras
Andrey.
05-11-2013 06:32 AM
I looked over that document. I am not having any of those issues. I am using the Intel driver Basler recommends, and I have jumbo frames maxed out. It gets images for a few seconds, and then it starts losing packets, hence the lines in the images.
05-11-2013 11:54 PM
@biographie wrote:
I am using the Intel driver Basler recommends,
Try the Intel driver NI recommends, not the one Basler does.
05-14-2013 10:56 PM
A few additional comments:
- You are using the standard Intel driver rather than the NI High Performance Driver
- You are using a PCI-based Pro/1000 GT rather than a PCI Express card. Unless you happen to have this card in a 64-bit/66 MHz slot (fairly rare), you are likely going to run into bandwidth issues. PCI has only 133 MB/sec of bandwidth shared among all the slots, while Gigabit Ethernet can consume 125 MB/sec in each direction (250 MB/sec combined), not including the additional overhead the card needs for bookkeeping of the data.
I suggest replacing the Pro/1000 GT with a PCIe Intel NIC instead (such as the NI PCIe-8231). You may also try adjusting the "Peak Bandwidth Desired" attribute in the acquisition attributes to throttle the camera to something under 80 MB/sec, which should be much more robust on the limited bandwidth of PCI.
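If it helps, here is roughly what that looks like from the NI-IMAQdx C API. This is a minimal sketch: the attribute path is my reading of what MAX labels "Peak Bandwidth Desired", and "cam0" is a placeholder, so confirm the exact name and units in the MAX attribute tree for your camera.

/* Sketch: limit the camera's streaming bandwidth through NI-IMAQdx.
   The attribute path below is an assumption based on the "Peak Bandwidth
   Desired" entry shown in MAX -- verify it in the attribute tree. */
#include <stdio.h>
#include <NIIMAQdx.h>

int main(void)
{
    IMAQdxSession session;
    IMAQdxError err;

    /* "cam0" is whatever name MAX assigned to the Basler camera */
    err = IMAQdxOpenCamera("cam0", IMAQdxCameraControlModeController, &session);
    if (err != IMAQdxErrorSuccess) {
        printf("Open failed: 0x%x\n", (unsigned)err);
        return 1;
    }

    /* Throttle streaming to ~80 MB/s so the PCI Pro/1000 GT can keep up
       (value and units assumed to match what MAX displays). */
    err = IMAQdxSetAttribute(session,
        "AcquisitionAttributes::AdvancedEthernet::BandwidthControl::DesiredPeakBandwidth",
        IMAQdxValueTypeF64, 80.0);
    if (err != IMAQdxErrorSuccess)
        printf("Set attribute failed: 0x%x\n", (unsigned)err);

    IMAQdxCloseCamera(session);
    return 0;
}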
Eric
05-15-2013 01:01 AM
Hi
Another thing to check is your cabling. Be sure to use at least Cat 5e and try a different cable.
Regards
05-22-2013 01:14 AM
Hi,
I use Basler GigE cameras with the Pro/1000 and the National Instruments GigE Vision driver.
Normally I use 2 or more cameras.
I test whether the hardware and driver are working well with MAX.
For each camera I start a new instance of MAX and let the camera run in Grab.
The cameras should all run at the maximum frames/second the supplier specifies.
I have done this in 8.6 and LV2012 DS2.
I let it run for a weekend. For each camera about 6,000,000 images are taken (30 frames/sec).
It should give zero Resend Packets Requested, zero Resend Packets Received, and a zero Lost Packet Count.
If there are any, then you have to look at the driver or hardware.
I install Vision Acquisition and do not change anything.
See pictures attached.
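If you would rather script the same check than watch MAX, something along these lines against the IMAQdx C API should work. The StatusInformation attribute names are my best guess at the counters MAX displays, so verify them in the attribute tree before relying on the numbers.

/* Sketch: continuous grab while polling the packet statistics MAX shows.
   The StatusInformation attribute paths are assumptions -- check the exact
   names in the MAX attribute tree for your camera. */
#include <stdio.h>
#include <nivision.h>
#include <NIIMAQdx.h>

int main(void)
{
    IMAQdxSession session;
    Image *image = imaqCreateImage(IMAQ_IMAGE_U8, 0);
    uInt32 bufNum, lostPackets = 0, resendsRequested = 0;

    IMAQdxOpenCamera("cam0", IMAQdxCameraControlModeController, &session);
    IMAQdxConfigureGrab(session);          /* continuous acquisition, like Grab in MAX */

    for (long i = 0; i < 6000000; i++) {   /* roughly a weekend at 30 frames/sec */
        IMAQdxGrab(session, image, 1, &bufNum);

        if (i % 1000 == 0) {
            IMAQdxGetAttribute(session, "StatusInformation::LostPacketCount",
                               IMAQdxValueTypeU32, &lostPackets);
            IMAQdxGetAttribute(session, "StatusInformation::RequestedResendPackets",
                               IMAQdxValueTypeU32, &resendsRequested);
            printf("frame %ld: lost packets %u, resends requested %u\n",
                   i, (unsigned)lostPackets, (unsigned)resendsRequested);
        }
    }

    IMAQdxCloseCamera(session);
    imaqDispose(image);
    return 0;
}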
Notes:
- I never installed the driver from Basler.
- When using a software trigger instead of a continuous grab there will be some Resend Packets Requested. There should not be a lot of them, and still NO lost packets.
Toine