Writing, reading, and splitting strings of binary

Hi,

I am a relatively inexperienced LabVIEW user, and I'm stuck on a particular task.

In our project, we have a DAQ system that will record data on an MMC card and later upload that data in binary format via RS232 to LabVIEW. As it reads the file, LabVIEW will take chunks of 12 bits from the data stream for processing. My task is to build VIs that read the stream and separate it into 12-bit chunks. At the moment, for test purposes, I have a VI that generates a string of bits from an ASCII string and then separates this into chunks of 12 bits ("Disecting ASCII-binary string", see attached). I have then tried to save the long string of bits generated from the ASCII in a file ("Writing the file") and read it back in using a modified version of my first VI ("Reading and Disecting string from file"). However, the VI doesn't work: I just get blanks in the string fields.
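For reference, the chunking I'm after amounts to something like the following (a rough Python sketch of the idea only, not the LabVIEW VI; the example string is made up):

# Build a string of '0'/'1' characters from an ASCII string, then cut it
# into 12-character chunks and show each chunk's value.
def ascii_to_bit_string(text):
    # 8-bit binary form of each character, concatenated
    return "".join(format(ord(c), "08b") for c in text)

def split_into_chunks(bits, size=12):
    # fixed-size chunks; the last one may be shorter
    return [bits[i:i + size] for i in range(0, len(bits), size)]

bits = ascii_to_bit_string("AB")          # '0100000101000010'
for chunk in split_into_chunks(bits):
    print(chunk, "=", int(chunk, 2))      # each chunk as text and as a number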

Your help as to why this happens and what I can do to fix it would be much appreciated.

Dominic
Message 1 of 5
You had two basic problems in ReadingAndDisectingString.vi.
1. The Read File function's Line Mode input is unwired, so it defaults to False (not in Line Mode). When not in Line Mode, Read File reads the number of bytes wired to Count, but Count is also unwired, so it reads 0 bytes.
2. The EOF function isn't doing anything (and you don't want it to). Just delete this function.
I like using the higher level file functions in ...\vi.lib\Utility\File.llb. See the updated version of your file.
Message 2 of 5
On second thought, you could use EOF to tell you how many bytes are in the file and pass that value to the Count input of the Read File function (leaving Line Mode unwired and False). Just add a wire from the EOF Offset output to the Read File Count input.
You don't really have a binary file; you have an ASCII file containing a string of ASCII 1's and 0's. That's why I thought of Line Mode.
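In text terms, that pattern is just "find out how big the file is, then read exactly that many bytes". A minimal Python sketch of the same idea (the file name here is made up):

import os

path = "bitstring.txt"            # hypothetical file written by the other VI
size = os.path.getsize(path)      # plays the role of the EOF function's offset output
with open(path, "rb") as f:
    data = f.read(size)           # plays the role of wiring that size to Read File's Count
print(len(data), "bytes read")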
Message 3 of 5
Thanks for the prompt reply. I've had another look and have attached my latest attempts. Are these using binary? If so, then I think they may have solved my problem.

Thanks again for your help.

Dominic
Message 4 of 5
OK, now you're talking binary!
But you don't need to cast numbers to strings to write binary. I'd go back to the Write Binary File.vi and Read Binary File.vi examples that ship with LabVIEW. To find these examples, from any LabVIEW window, go to Help >> Find Examples >> Search and enter "binary files" under "Type a keyword to find". We got off track on these earlier because I thought you wanted to write ASCII 1's and 0's.
LabVIEW doesn't have a 12-bit data type, so you need at least a 16-bit integer (I16) to hold a 12-bit value. (A text sketch of the resulting write/read round trip follows the numbered steps below.)
1. Using the Write Binary File.vi example, replace the For loop (and its random number generator) with your input array of I16.
2. Using the Read Binary File.vi example:
2.1 Change the constant 8 (bytes/dbl) to 2 (bytes/I16).
2.2 Change the representation of the constant "type dbl" to I16. (Right-click on constant and select Representation >> I16).
2.3 Change the representation of the indicators "data from file" and "array from file" to I16.
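Those two examples boil down to the following round trip (a rough Python sketch, not the shipped VIs; the file name and the big-endian byte order are assumptions, big-endian being the usual default for LabVIEW's file functions):

import struct

values = [100, 2047, 0, 4095]                    # 12-bit data held in 16-bit integers

# Write the array as raw 16-bit values (">h" = big-endian I16, assumed here).
with open("data.bin", "wb") as f:                # hypothetical file name
    f.write(struct.pack(">%dh" % len(values), *values))

# Read the file back and unpack it as 16-bit values, 2 bytes per I16.
with open("data.bin", "rb") as f:
    raw = f.read()
read_back = struct.unpack(">%dh" % (len(raw) // 2), raw)
print(read_back)                                 # (100, 2047, 0, 4095)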
If you still want to use your own VIs (they will work, but they do a lot more conversion than you need), do the following. I'd still suggest starting from the binary file examples, though; they'll be a lot cleaner.
1. In Write_I8_Binary_File_try_again.vi, change the To Byte Integer function (I8) to the To Word Integer function (I16). A byte (8 bits) isn't big enough to hold 12 bits.
2. In ReadingAndDisectingStringUpdatedAgain.vi (a text sketch of these steps follows the list):
2.1 Change the three occurrences of the numeric constant 12 to 2. Now that the data is binary, each value takes 2 bytes, not 12: the 12 bits of data are held in an I16, so the file uses 16 bits, or 2 bytes, per value.
2.2 Undo some of the changes I suggested before when I thought you wanted an ASCII file.
2.2.1 Insert an EOF function between the Open and Read functions.
2.2.2 Wire the Offset output of EOF to the Count input of Read.
2.2.3 Delete the True constant wired to the Line Mode input of Read, so it defaults to False. With a binary file, you don't want to read in Line Mode; you want to read a set number of bytes (wired to Count).
2.3 Inside your loop, put a Type Cast function. Wire the binary string to its x input and an I16 constant to its type input.
2.4 Wire the Type Cast output out of the loop to an array indicator to display the full 12-bit numbers.
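The steps above amount to reading the whole file, splitting it into 2-byte pieces, and interpreting each piece as a 16-bit integer. A rough Python sketch of that pattern (the file name and big-endian byte order are assumptions again):

import struct

with open("data.bin", "rb") as f:                # hypothetical file name
    raw = f.read()

values = []
for i in range(0, len(raw), 2):                  # 2 bytes per value, not 12
    chunk = raw[i:i + 2]
    values.append(struct.unpack(">h", chunk)[0]) # the job Type Cast does on the diagram
print(values)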
P.S. If you really want to minimize the file size, you can pack two 12-bit numbers (24 bits total) into 3 bytes instead of the 4 bytes required by two I16s. But that involves a lot of manipulation on both ends: you need bit-wise And to mask off the upper and lower bits, logical shifts to move bits into place, bit-wise Or to combine the nybbles, and so on. It's possible, but probably not something you need to do, and it will make the files harder for you to create and harder for you (or anybody else) to read.
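If you did go that route, the packing works out to something like this (a Python sketch for illustration only; both values are assumed to fit in 12 bits):

def pack12(a, b):
    # Pack two 12-bit values into 3 bytes: aaaaaaaa aaaabbbb bbbbbbbb.
    return bytes([(a >> 4) & 0xFF,
                  ((a & 0x0F) << 4) | ((b >> 8) & 0x0F),
                  b & 0xFF])

def unpack12(three):
    # Recover the two 12-bit values with shifts, masks, and Ors.
    a = (three[0] << 4) | (three[1] >> 4)
    b = ((three[1] & 0x0F) << 8) | three[2]
    return a, b

packed = pack12(0xABC, 0x123)
print(packed.hex())      # 'abc123'
print(unpack12(packed))  # (2748, 291)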
Message 5 of 5