Machine Vision


Unsigned 16-bit image distorted

Hi,

 

I am trying to connect a thermal camera to LabVIEW using NI-IMAQdx.  The camera is a FLIR A315, which outputs an unsigned 16-bit format and is connected to the computer over GigE Vision.  The unsigned 16-bit data should represent a temperature in Kelvin.  When I use the software that came with the camera the image is correct, but when I use Measurement & Automation Explorer (MAX) the image is distorted (see my attachments).  A histogram of the data shows values between 30000 and 53000, whereas I expect them all to be right around 30000.  In MAX I have selected the 16-bit format with pixel signedness set to unsigned.  The format is read out as Mono16.  Does anyone have any suggestions about what I might be doing wrong?

 

Thanks for your help,

Bryan 


I would suggest taking a look at this forum thread here; I believe it covers the same issue. You can also try the steps below, which FLIR has given us in the past. The document they refer to is attached.

 

1. Page 14; Temperature Range of your target. This hardware setting is called a "Case." Query the one you need and set it. Temperatures are in Kelvin. 

2. Page 15; Object Parameters. These allow the camera to calculate an accurate temperature for your target. 

3. Page 17; NUC Mode. Automatic is best, but you don't want the camera to NUC while you are measuring your target. You can set it to manual, but you must NUC at least once every 30 minutes or you can damage the camera. 

4. Page 18; IRFormat. Set to Temperature Linear 0.1K. This is a 16-bit data stream where you get a digital count value per pixel representing tenths of a Kelvin. For example, a digital value of 3730 equates to 373.0 K (about 100 °C).
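
If it helps to sanity-check the numbers outside of LabVIEW, here is a minimal sketch (plain NumPy, not the IMAQdx API) of what the Temperature Linear 0.1K counts look like once you have the raw Mono16 pixel values, plus one common way to window a 16-bit image for display. The sample pixel values and the percentile-based windowing are illustrative assumptions, not something from the FLIR document.

import numpy as np

# Example frame: in practice this is the Mono16 pixel array your
# acquisition returns (dtype uint16). The values here are made up.
counts = np.array([[2950, 2960],
                   [3730, 3005]], dtype=np.uint16)

# With IRFormat = Temperature Linear 0.1K, each count is 0.1 K.
kelvin = counts * 0.1
celsius = kelvin - 273.15
print(kelvin)    # 3730 counts -> 373.0 K
print(celsius)   # 373.0 K -> 99.85 degC (about 100 degC)

# A generic 16-bit viewer that maps the full 0-65535 range to the screen
# makes a narrow band of temperature counts look flat or washed out, so
# window the image to the data's own range before display.
lo, hi = np.percentile(counts, (1, 99))
display_8bit = np.clip((counts - lo) / max(hi - lo, 1.0) * 255.0, 0, 255).astype(np.uint8)

The display scaling at the end is only about viewing; the temperature values themselves come straight from the count-to-Kelvin conversion above, so a raw Mono16 stream can be perfectly good data even when it looks wrong in a generic viewer.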

 

 

Cameron T
Applications Engineer
National Instruments