06-17-2011 02:45 PM
Hello,
I am using an Edmund Optics monochrome CCD detector to measure an interferogram. In LabVIEW, the camera's output is a U8 array.
I currently have two methods of retrieving the array. One is to choose a row of the array and save just that; the other is to convert the array to an image (using the ArrayToColorImage VI) and save the image. I am running into an issue: the two methods give different results when they should be identical. I am saving the images as high-quality TIFFs, so there shouldn't be any compression issues. The problem, it seems, is the ArrayToColorImage VI.
For instance, using a monochromatic source I measure the interferogram using both methods. When I take the FFT of the row saved straight from the array, I get a single peak, as I should. When I use the ArrayToColorImage method, I get multiple peaks, which I shouldn't for a monochromatic source.
It seems that the ArrayToColorImage VI is adding noise to the data and corrupting it. Shouldn't it just be a 1:1 mapping from array to image? Any thoughts or suggestions?
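(For anyone wanting to reproduce the comparison outside LabVIEW, here is a minimal diagnostic sketch in Python. It assumes hypothetical file names "row_raw.csv" for the row saved straight from the U8 array and "frame_color.tif" for the TIFF written after the color conversion, and an arbitrary row index of 100; adjust these to match your own files. If the conversion really were a clean 1:1 mapping, one color channel of the TIFF row should match the raw row exactly, and both spectra should show the same single peak.)

```python
import numpy as np
from PIL import Image

# Row saved directly from the U8 array (hypothetical CSV dump).
row_raw = np.loadtxt("row_raw.csv", delimiter=",", dtype=np.uint8)

# TIFF written after the array-to-color-image conversion.
# Assumed to load as (height, width, 3) for an RGB image.
img = np.array(Image.open("frame_color.tif"))
row_from_image = img[100, :, 0]  # same row index as row_raw, red channel only

print("Rows identical:", np.array_equal(row_raw, row_from_image))

# A monochromatic fringe pattern should give one dominant spectral peak.
spec_raw = np.abs(np.fft.rfft(row_raw - row_raw.mean()))
spec_img = np.abs(np.fft.rfft(row_from_image - row_from_image.mean()))
print("Peak bin (raw):  ", np.argmax(spec_raw))
print("Peak bin (image):", np.argmax(spec_img))
```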
06-20-2011 11:05 AM
I'm guessing it has something to do with the source data type and how LabVIEW places it into the new data type. For instance, I could imagine a case where the grayscale image is 8-bit, with values between 0 and 255, and it is then converted to a color image where grayscale is represented as a 24-bit RGB value. In that case a gray value of, say, 145 would be represented as 9539985 (the bytes 145-145-145 packed into one 24-bit value).
This is just a guess, but it would make sense.
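(If that guess is right, the packing itself is lossless and the original 8-bit value can be recovered from any single channel. Here is a short sketch of the arithmetic described above, just to illustrate the idea; it is not the LabVIEW VI's actual code path.)

```python
# A gray U8 value g replicated into R, G and B packs to g*65536 + g*256 + g.
g = 145
packed = (g << 16) | (g << 8) | g
print(hex(packed), packed)   # 0x919191, 9539985

# Pulling one channel back out recovers the original 8-bit value.
red = (packed >> 16) & 0xFF
print(red)                   # 145
```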