09-06-2011 01:36 PM
Hi,
I have multiple PCIe-1433 framegrabbers in my possession, and I have a routine that sends 1KB packets of calibration data to the camera, one packet at a time. Usually after a few sends, a timeout error occurs because not enough data is received by the camera (the camera responds with a character after each full packet is received). The strange thing is that the same code works properly with non-1433 framegrabber boards (PCI-1428 and PCIe-1429) on this computer, although with those boards I am not able to take advantage of the faster baud rates.
I believe the problem is related to the combination of the PCIe-1433 framegrabber AND the computer model. I've tried the same code on two Dell Precision T3500s, and both allow only a few packets to be sent before a transmission error occurs. When I try the same code and the same 1433 board on a Dell Precision T3400, the code works fine and I am able to send many thousands of packets to the camera with no errors.
What may be causing this issue? I've tried switching Camera Link cables and had the same problem.
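For context, the send loop looks roughly like this (a minimal Python sketch of the protocol only; the `transport` object and its `write`/`read` methods are hypothetical stand-ins for the actual NI-IMAQ serial calls, not the real API):

```python
# Sketch of the packet/ACK routine described above. Only the control
# flow is meant to match; the transport interface is hypothetical.

PACKET_SIZE = 1024  # bytes per packet, as dictated by the camera

def send_calibration(transport, data):
    """Send `data` in 1KB packets, waiting for a one-byte ACK after each."""
    for offset in range(0, len(data), PACKET_SIZE):
        packet = data[offset:offset + PACKET_SIZE]
        transport.write(packet)
        ack = transport.read(1)          # camera replies with one character
        if ack != b"0":                  # '0' = OK, ready for next packet
            raise IOError("camera reported error or timed out: %r" % ack)
```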
Thanks,
Bruce
09-07-2011 10:11 AM
At what baud rate are you trying to talk to the camera? What kind of camera is it? The fact that it works on one computer and not another suggests that it might be a software timing issue, perhaps related to how fast the software runs on the different machines. After all, the PCIe-1433 hardware is the same in both computers.
I would suggest slowing down the data transfer by using a lower baud rate, or perhaps sending the data a little at a time with small delays in between. I've seen cameras whose firmware can't keep up with the selected baud rate, even though it's supposed to support that rate; when the camera's firmware can't keep up, bytes get dropped.
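For example, the pacing idea could look something like this (a Python sketch with a hypothetical `transport` object; only the throttling pattern matters here, not the API names):

```python
import time

def send_throttled(transport, data, chunk=64, delay_s=0.005):
    """Send `data` in small chunks with a short pause between them,
    giving slow camera firmware time to drain its receive buffer."""
    sent = 0
    for offset in range(0, len(data), chunk):
        transport.write(data[offset:offset + chunk])
        time.sleep(delay_s)              # pacing delay between chunks
        sent += 1
    return sent                          # number of chunks written
```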
09-07-2011 10:41 AM
Thanks for taking time to work through this problem.
I've tried many different baud rates (57600, 115200, and 460800), and the same problem occurs after a few 1K packets are sent (the packet size is set by the camera). I even see the problem after only a few packets when I put a breakpoint in my code, stopping it after each packet is sent.
Another strange thing I see: if the code doesn't get a response after waiting a second, it sends a character to the camera so that the camera "thinks" it has received all the data. It is at this point that the code reads a value of 0x80 (decimal 128), which is not one of the characters the camera is supposed to send back. I am expecting either an ASCII '0' for no error (ready for next packet), '1' for an error, 'u' to resend the packet, or 'q' to stop sending packets. The 0x80 value is never expected; I even asked the guy who wrote the camera code, and he said this is not possible.
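To summarize, the expected handling of the camera's one-byte replies looks like this (a Python sketch; the function name is mine, and the mysterious 0x80 deliberately falls into the "unexpected" branch):

```python
def classify_response(byte_value):
    """Map the camera's one-byte reply to an action, per the protocol above."""
    actions = {
        ord("0"): "ok",       # no error, ready for the next packet
        ord("1"): "error",    # camera reported an error
        ord("u"): "resend",   # resend the last packet
        ord("q"): "stop",     # stop sending packets
    }
    # Anything else (such as 0x80) is outside the documented protocol.
    return actions.get(byte_value, "unexpected")
```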
I am able to reproduce the problem with multiple types of cameras, and as mentioned earlier, the problem isn't seen when using the board in a different computer. The cameras I am using are Goodrich H- and J-type cameras (which I use often, since I work for Goodrich and help develop software to test and calibrate these cameras).
Thanks,
Bruce
09-07-2011 10:46 AM - edited 09-07-2011 10:49 AM
Bruce,
I have a couple of questions in regard to your setup:
I look forward to your reply.
Larry H.
09-07-2011 11:16 AM
Bruce,
Also,
1. What size PCIe slot are you placing the 1433 card into in each computer? Are they X1, X4, or X8?
2. Do you have the same version of the IMAQ driver installed on each computer? If so, what version of the driver is this?
Regards,
Larry H.
09-14-2011 11:06 AM
Bruce, if you're still monitoring this thread...
Here are two other questions that might help figure this out:
1. Which version of NI-IMAQ are you using (if you haven't done this before, go to MAX and click the drop-down arrow next to Software under My System)? The latest NI-IMAQ version to affect the 1433 firmware was 4.6.0 (released with Vision Acquisition Software March 2011). If you see the timeouts with the latest version but not with earlier versions (or vice versa), this could be a clue as to the root of the problem.
2. How often are these 1kB packets being sent? Once per second? More often? If you reduce the frequency of 1kB packets significantly (like one per minute) do you still get the timeouts?
-Daniel
09-14-2011 12:18 PM
I'm still here; it's just been very busy at work, and I haven't been able to spend a lot of time investigating.
I am currently using IMAQ 4.5; I'll see about upgrading the IMAQ drivers (I have 4.6.1). I just don't usually install the latest drivers immediately, because then the other computers that use the software (there are quite a few) would also have to be updated, which is a pain for me.
The 1KB packets are being sent very fast, basically as fast as the baud rate allows, with little or no delay between packets. The transmission error doesn't seem to be affected by the baud rate, but as you suggested, I could check whether the problem goes away when there's a delay between packets (although this is not a solution!). The calibration data for one camera comprises tens of thousands of packets, so the total transmission usually takes about 30 minutes without delays.
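For what it's worth, the 30-minute figure is consistent with the raw serial math, assuming 10 bits on the wire per byte (start + 8 data + stop bits) and ignoring ACK turnaround:

```python
def transfer_time_s(num_packets, baud, packet_bytes=1024, bits_per_byte=10):
    """Estimated serial transfer time in seconds, ignoring per-packet
    ACK latency (10 bits/byte assumes 8N1 framing)."""
    return num_packets * packet_bytes * bits_per_byte / baud

# e.g. 20,000 packets at 115200 baud:
#   20000 * 1024 * 10 / 115200 ≈ 1778 s, i.e. about 30 minutes
```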
Bruce
09-19-2011 11:38 AM
I don't think that you'll find any difference in the serial port behavior between IMAQ 4.5 and 4.6.1.
Another question... You say 1kB packets... Is that 1000 bytes or 1024? (I've had some trouble getting more than 1024 to work in the past.)
09-19-2011 03:33 PM - edited 09-19-2011 03:38 PM
I send 1024 bytes at a time. This value is specified by the camera.
As a quick test to see if there were any firmware updates for the 1433 (which was suggested earlier), I put the board in the computer I use for non-production experimentation (it has IMAQ 4.6.1 on it). The software reported that the firmware on the 1433 was already up to date.
Thanks everyone for your continued help.
Bruce
09-23-2011 08:52 AM
Are you using Windows XP? Vista? 7? 32-bit or 64-bit?
-Daniel