11-19-2008 03:08 AM
Hi
I am trying to convert an array of U8 into an array of 10-bit values (take the first 8 bits, then add the next 2 bits of the following byte; then the remaining 6 bits of that byte with the next 4 bits, and so on). The RAM we are using is 8-bit, but the CCD image is actually 10-bit. I have attached a couple of JPEGs to show what I have done, and I was wondering whether there is an expert around who can say if this is the most efficient way of doing it. I have a large array to convert and would like to make it as quick as possible.
Thanks for any help, Gary
11-19-2008 03:30 AM
Hi, Gary,
I don't fully understand why you need such packing. The most efficient way is to convert U8 to U16, where only 10 bits of each 16-bit word are used and the remaining 6 bits stay zero.
Andrey.
11-19-2008 03:56 AM
Hi
I have an interface that writes to an 8-bit RAM block. I need to take that data and make it 10-bit, because the CCD is 10-bit: the first 8 bits plus the first 2 bits of the next byte make up the first CCD pixel, the remaining 6 bits of the second byte plus 4 bits of the third byte make up the second pixel, and so on. So five bytes make four words, with only the first 10 bits of each word used.
Hope this makes sense. Gary
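The layout described above (five bytes yielding four 10-bit pixels) can be sketched in text form with a bit accumulator. This is a Python sketch, not LabVIEW, and it assumes MSB-first bit order within each byte, which the thread doesn't explicitly state:

```python
def unpack_10bit(data):
    """Unpack a bytes-like object into a list of 10-bit pixel values.

    Assumes MSB-first packing: pixel 0 is byte 0's 8 bits followed by
    the top 2 bits of byte 1, and so on (5 bytes -> 4 pixels).
    """
    pixels = []
    acc = 0        # bit accumulator
    nbits = 0      # number of valid bits currently held in acc
    for byte in data:
        acc = (acc << 8) | byte
        nbits += 8
        while nbits >= 10:
            nbits -= 10
            pixels.append((acc >> nbits) & 0x3FF)  # take the top 10 bits
            acc &= (1 << nbits) - 1                # keep only the leftover bits
    return pixels

# Five bytes of all-ones unpack to four full-scale 10-bit pixels.
print(unpack_10bit(bytes([0xFF, 0xFF, 0xFF, 0xFF, 0xFF])))  # [1023, 1023, 1023, 1023]
```

Working a whole byte into the accumulator at a time avoids per-bit loops, which matters for a large array.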
11-19-2008 04:10 AM
11-19-2008 05:48 AM
Hi
Thanks for that. I was wondering if converting the U8 input array into a Boolean (bit) array and then taking 10 bits at a time would be better? (I'm not sure how easy it is to convert a large array into bits.)
Cheers, Gary
11-19-2008 10:10 AM
11-19-2008 10:13 AM - edited 11-19-2008 10:16 AM
Gary,
Reshaping an array of bits is quite easy, and (imho) the block-diagram code is much easier to understand.
For arrays of a different size you will just have to calculate the first dimension for the resize operation.
Regards
Anke
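For readers without the attached block diagram, the reshape idea might look like this in text form (a Python sketch of the same logic, again assuming MSB-first bit order): expand every byte into its 8 bits, reshape the flat bit stream into rows of 10, and rebuild each row as an integer.

```python
def unpack_10bit_reshape(data):
    """Unpack bytes into 10-bit values via an explicit bit array.

    Mirrors the reshape approach: byte array -> flat bit array ->
    rows of 10 bits -> integers. Incomplete trailing groups are dropped.
    """
    # Flatten each byte to its 8 bits, most significant bit first.
    bits = [(byte >> (7 - i)) & 1 for byte in data for i in range(8)]
    n = len(bits) // 10  # number of complete 10-bit groups
    return [
        int("".join(map(str, bits[k * 10:(k + 1) * 10])), 2)
        for k in range(n)
    ]

print(unpack_10bit_reshape(bytes([0xFF, 0xFF, 0xFF, 0xFF, 0xFF])))  # [1023, 1023, 1023, 1023]
```

Building a per-bit list is easy to read but costs more memory and time than shifting whole bytes through an accumulator, so for a very large array the accumulator style is likely faster.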
11-20-2008 04:40 AM