Machine Vision


How to detect and fix when camera link signal is out of sync?

I'm using an NI-1428 with a Camera Link camera. I'm triggering the camera from a separate microcontroller, and we are having issues with some strange extra triggers being supplied to the camera. We can't quite figure out why these extra pulses are being output, and in the meantime we'd like to address the problem they are causing: the camera is going out of sync.

 

The image shifts up by 1-2 pixels and right by 1-2 pixels whenever this noise is seen. The shift compounds with each errant pulse, so the sync just gets worse and worse. Without turning off the triggering or the camera, we can clear the sync offset by stopping and restarting acquisition. Is there any way to detect this sync shift and/or fix it without having to stop and restart acquisition? I realize that I need to fix the errant pulses, but that may take longer than having the NI board correct itself.

Message 1 of 12

Hello middletongard,

 

Would you be able to elaborate on the shifting you are seeing?  Does this mean there are two pixels from the previous image shifted onto the next image?  Would you be able to post a screenshot of the shifted image?

 

Also, what is the source of the noise on the trigger lines?  What are the circumstances surrounding the errant pulses that make them difficult to eliminate?

 

David A

Message 2 of 12

Attached - noshift.png shows the sensor output of a good frame. shift.png shows the camera looking at the exact same location, but it's been allowed to run for a while with the triggering arrangement. The shifting can be reset by stopping and starting acquisition. I'm wondering if the sync in the framegrabber can be reset without stopping and starting acquisition, and if there is a way to detect the misaligned frames programmatically from the frame grabber.

 

The source of the errant pulses isn't entirely understood... basically, we're using a microcontroller dev board to generate pulses from an encoder input, but there seem to be timing issues within the controller. Unfortunately, we need to show a prototype in about a week, and the extra frames sent by the pulses wouldn't be a huge issue if it weren't for the shift of the sensor data. We may not have time to fix this issue on the microcontroller side before the showing.

 

 

 

 

Message 3 of 12

middletongard,

 

There won't be a way to detect the misalignment, since the frame grabber wouldn't have acquired the frame unless the sync lines informed it to do so.  You could create a camera file that interprets the sync lines differently, but camera file creation is far more work than actually fixing the issue causing the sync problems.

 

Rather, if the syncing issue creates a predictable offset, you could use functions within the Vision Development Module to combine the two images.  Using Image to Array would allow you to manipulate the image data at a low level.  You would need to manipulate the array to take a portion of one image and append it onto another to recreate the original image.

 

David A.

 

 

Message 4 of 12

I've found more information about the problem I'm having. It appears that the stuttering actually happens repeatably at two frequencies.

 

When triggering the camera both externally and by timed pulses from the 1428, I can consistently cause the frames to go out of sync by triggering at ~30.5 Hz and ~60.9 Hz. This happens with both of the 1428 frame grabbers we have. Is there any way this issue could be caused by something in the frame grabber?

Message 5 of 12

And one more thing - sometimes the issue shows itself as a simple frame shift, but other times it seems that all data in the ring buffer is marked as "valid", and my software will report hundreds of frames per second when I'm only triggering at 60.9 fps. I can tell that old data is making its way to the screen, because I'm using a waterfall display to show the change in intensity of certain bands of the camera over time. I will turn off the light in front of the camera, and dark frames will appear on the screen; then 50 light frames will dump out onto the screen before it becomes dark again. Why does the frame grabber dump old data to my software?

Message 6 of 12

Hi middletongard,

 

What kind of shielding do you have on your trigger lines? Because this issue is happening around 60 Hz (and 30 Hz), I am wondering if it is a noise issue from lighting or something along those lines. Also, how long a Camera Link cable are you using, and are you in Full, Medium, or Base mode?

 

For the random data dump in your waterfall display, how are you clearing the buffer on your camera? 

 

 

Mychal F

Applications Engineer
National Instruments
Message 7 of 12

Hi Mychal,

 

The triggering is done using one of the Camera Link control lines and is generated by the 1428. I've had a colleague in Finland test his camera, and he was able to reproduce the issue, even though the mains line frequency there is different. We are using Base Camera Link with a 10 m cable.

 

From what I know, the camera does not have a buffer; only the frame grabber has one. The camera simply sends out one frame at a time as instructed. I'm not sure I understand this question.

Message 8 of 12

Hi middletongard,

 

I'm sorry, I misspoke in my last post; I did mean the buffer on the frame grabber. How are you clearing it?

 

How are you reading these extra pulses? Are you using an oscilloscope? 

 

Without triggering, are you able to acquire a normal image with this CameraLink camera? 

 

Mychal F

Applications Engineer
National Instruments
Message 9 of 12

Hi Mychal,

 

It turns out that the cause of the issue is not extra pulses; it's simply that at certain frequencies the camera's sync gets messed up. The camera can only be triggered externally - either from the Camera Link cable, or from another connector on the camera. It works at most frequencies except for integer divisions of 60.9 Hz (30.45, 20.3, etc.). We've put a band-aid on this issue in the microcontroller software, but I'm still curious how old data is making its way to the display, so that I might at least be able to prevent that from happening in special cases.

 

I'm acquiring using a ring buffer, and release the buffer using imgSessionReleaseBuffer. I use imgSessionExamineBuffer to obtain the next buffer, and always increment the desired buffer number by 1.

 

Here's my acquisition loop code:

 

 
    // The thread stops when *stop goes TRUE or an error occurs.
    while (!*stop && !error)
    {
        error = imgSessionExamineBuffer(pCamera->Sid, pCamera->BufNum,
                                        &currBufNum, &bufAddr);

        if (!*stop && (error == IMG_ERR_GOOD))
        {
            NewFrame(pCamera->BufNum, (void *)bufAddr);

            // Release the buffer back into the ring.
            errChk(imgSessionReleaseBuffer(pCamera->Sid));

            // Ask for the next buffer on the following iteration.
            pCamera->BufNum++;
        }
        else if (error == IMG_ERR_TIMEOUT)
        {
            // A timeout just means no new frame yet; keep waiting.
            error = 0;
        }
        else
        {
            errChk(error);
        }
    }
Message 10 of 12