05-22-2014 09:18 AM
Dear users,
can one (re)define the colorbox to have a 16-bit output? Currently, the colorbox has a U32 output.
Cheers,
Solved! Go to Solution.
05-22-2014 09:23 AM - edited 05-22-2014 09:24 AM
LabVIEW uses an RGB color scheme: 0xFF0000 is red, 0x00FF00 is green, etc. You really need 24 bits to hold a color, so no, you cannot make a 16-bit colorbox.
05-22-2014 09:24 AM
What do you mean?
How do you want a 32 bit color to map to a 16 bit value?
05-22-2014 09:29 AM
@RavensFan wrote:
What do you mean?
How do you want a 32 bit color to map to a 16 bit value?
I do not need a 32-bit colormap. I am happy with a colorbox of lower "resolution", but the output must be coded in 16 bits (U16). I have a display that accepts U16, where the RGB values must have the following ranges: red 0..31, green 0..63, blue 0..31.
05-22-2014 09:31 AM
@JÞB wrote:
LabVIEW uses an RGB color scheme: 0xFF0000 is red, 0x00FF00 is green, etc. You really need 24 bits to hold a color, so no, you cannot make a 16-bit colorbox.
I see. So if I want to use the colorbox (for user-friendly definition of colors in the GUI), I have to make a translator for it that transforms the U32 into the equivalent U16 (5 bits for red, 6 bits for green, and 5 bits for blue).
05-22-2014 09:43 AM
Break the U32 apart into individual U8s. I'd turn each of those into an array of bits, use Array Subset to pick out the most significant bits you want of each color, build the combined array, and typecast it back to a U16.
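The VIs themselves are graphical, but the logic above can be sketched in text. Here is an illustrative Python version (function name and shift-based bit extraction are my own; the LabVIEW solution would use Split Number / Array Subset / Type Cast instead):

```python
def rgb888_to_rgb565(color_u32):
    """Pack a 24-bit RGB colorbox value (0x00RRGGBB) into a 16-bit
    565 value by keeping only the most significant bits of each channel."""
    r = (color_u32 >> 16) & 0xFF
    g = (color_u32 >> 8) & 0xFF
    b = color_u32 & 0xFF
    # keep the top 5 bits of red, top 6 of green, top 5 of blue
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)
```

For example, pure red 0xFF0000 becomes 0xF800, i.e. red = 31 in the top 5 bits.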
05-22-2014 10:35 AM - edited 05-22-2014 10:35 AM
This would be a good candidate for an XControl.
As an indicator, it could show a colorbox of the converted color. As a control, it could show a cluster of three U8 controls (R, G, B), each with its own defined and coerced input range. The datatype of the XControl would be U16.
05-23-2014 03:32 AM - edited 05-23-2014 03:43 AM
Thank you for your ideas. Here is a solution based on RavensFan's suggestion. Additionally, I learned that the display's specifications are a bit different: the color is defined as BGR in U16 (5-6-5)...
the "in: RGB(U32) out BGR(U16)" VI:
And it's usage:
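The VI screenshots aren't visible here, but based on the description ("in: RGB (U32), out: BGR (U16)"), the conversion only differs from plain RGB565 in the channel order. An illustrative Python sketch (names and bit positions are my reading of the stated spec, not the actual VI):

```python
def rgb_u32_to_bgr565(color_u32):
    """Convert a LabVIEW colorbox value (0x00RRGGBB) to BGR 565:
    blue in the top 5 bits, green in the middle 6, red in the low 5."""
    r = (color_u32 >> 16) & 0xFF
    g = (color_u32 >> 8) & 0xFF
    b = color_u32 & 0xFF
    return ((b >> 3) << 11) | ((g >> 2) << 5) | (r >> 3)
```

With this layout, pure blue 0x0000FF maps to 0xF800 and pure red 0xFF0000 maps to 0x001F.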
05-23-2014 06:44 AM
Without digging into all the code, that looks like what I was thinking of.
There are probably ways to code it so it uses less code and/or is more efficient.
Altenbach made a good point about having it also output a LabVIEW colorbox showing what the color will look like after conversion. I would do that by taking the original U32 and ANDing it with a mask that has 1s in the high-order bits you're keeping in each byte and 0s in the low-order bits you're discarding. Then feed that masked U32 into a colorbox indicator, and also through all the bit manipulations that pack it into the U16.
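As a sketch of that masking idea (the mask constant follows from keeping 5/6/5 most significant bits; the function name is just for illustration):

```python
def quantized_preview(color_u32):
    """Zero out the low-order bits that the 565 conversion discards,
    so a colorbox indicator shows the color as it will actually appear."""
    # 0xF8 keeps 5 MSBs of red and blue, 0xFC keeps 6 MSBs of green
    return color_u32 & 0x00F8FCF8
```

Feeding the result to a colorbox indicator previews the quantized color, while the same value goes through the packing logic unchanged.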
05-23-2014 09:21 AM
I am lost in terms like "XControl", "XControl as indicator", "XControl as control". I've never used them before. So far, I am happy with my subVI, but I'd be glad to learn something new, of course. So, if anybody thinks it can be done more efficiently, please let the thread know 🙂
Thank you once again for your ideas.