LabVIEW

HDF Files to/from LabVIEW

NI provides a nice HDF5 LabVIEW toolkit with their Soft Front Panel
software. However, I could not find support for 2-dimensional arrays. Does
anyone know how to convert 2-dimensional arrays between LabVIEW and HDF5
format?

- Neal Pederson, np@vicontrols.com
Message 1 of 8 (6,185 Views)
What is the name of the toolkit you are referring to? I've been trying to get some information on it, but I haven't been able to find much about it. Once I find it, I might be able to help you out with this.
J.R. Allen
Message 2 of 8
"National Instruments' Scope Soft Front Panel version 1.5 contains support
for HDF5. The Scope Soft Front Panel is used to control all high speed
digitizers from National Instruments. Included in the distribution are
LabVIEW VIs to read and write the HDF5 files used by the Soft Front Panel.
The distribution can be downloaded from http://ni.com. Questions regarding
this product should be directed to the support group at National
Instruments."

Go to:
http://digital.ni.com/softlib.nsf/websearch/A3A6792E6B19774186256BF9005190F2?opendocument&node=132060_US,
log in, and download scopesfp151.exe (9.16 MB). I believe it is part of this
package. It should load LabVIEW VIs into the InstrLib palette for HDF5 file
I/O.

The installer that I downloaded several months ago is called sftFile.exe and
is about 3 MB. If the above doesn't work, e-mail me and I can send it to you.

Message 3 of 8
There is no support for 2D arrays because the toolset was designed to complement the digitizer and function generator products. 2D support is relatively easy to create. However, the answer depends upon whether you want to use HDF5 in general, or the Hierarchical Waveform Storage (HWS) structure used by soft front panels. We will cover the first case first, since it is easier.

The HDF5 base API is C. The data type for both read and write functions is almost always void*, meaning you can wire anything into it. HDF5 determines how to read/write data by looking at the type and dataspace inputs. The LabVIEW toolset is a very thin layer over the C API, so the HDF5 documentation (available at the NCSA website http://hdf.ncsa.uiuc.edu/HDF5/) should give you all you need. Attached to this reply is a zipped LL...
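The C-API pattern described above (a void* data buffer, with the datatype and dataspace inputs telling HDF5 how to interpret it) can be sketched outside LabVIEW. Below is a minimal Python/h5py equivalent of the same H5Fcreate/H5Dcreate/H5Dwrite/H5Dread sequence for a 2D I32 array; h5py and the file/dataset names are my own illustrative choices, not part of the toolset:

```python
import numpy as np
import h5py  # assumption: h5py used here only to mirror the HDF5 C API calls

# H5Fcreate -> H5Dcreate -> H5Dwrite, as in the C API. h5py derives the
# HDF5 datatype and dataspace from the NumPy array instead of taking a
# void* buffer plus explicit type/space handles.
data = np.arange(12, dtype=np.int32).reshape(3, 4)

with h5py.File("example2d.h5", "w") as f:
    f.create_dataset("array2d", data=data)

# Read it back (H5Dread): the 2D shape and I32 type round-trip intact.
with h5py.File("example2d.h5", "r") as f:
    readback = f["array2d"][:]
```

The point of the sketch is that 2D support needs nothing special: the rank lives entirely in the dataspace, so the same write call handles 1D and 2D data.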




Message 4 of 8
Thank you very much for your help! I was able to save a 2D array by using
your "H5D Create-Write 2D I32 array.vi" plus the "H5F Open-Create-Replace
File.vi" and "H5Fclose.vi" VIs from the toolset since I did not see these
VIs in your library. However, I am still having some problems and have a
few questions:

1. Sometimes it works fine and sometimes I get the following error: "An
exception occurred within the external code called by a Call Library Node.
This may have corrupted LabVIEW's memory..." Do you know what may be
causing this? Might it have something to do with my combining of your
library and the toolset "F" VIs? If you have a library that includes all H5
VIs that I would need I would prefer it to the toolset.

2. Can you define the "chunk size" for me? I appear to get fairly good
results (low size and low errors) with 1024 but this does not make a lot of
sense.

3. If I try to save (add) data to the same file and data name I get an
error. Eventually I plan on saving a 4096x4096 array of U16 data so it
would take a lot of memory to buffer and save all of the data at one time.
It would be a lot more efficient to save several rows of data at a time. Is
this possible or does all the data have to be saved at once?

4. As I mentioned, my end goal is to save a 4096x4096 array of U16 data (it
is an x-ray image file). I think I can figure out how to convert your VIs
to U16 from the notes you gave me. Please advise if there is anything
additional I should know.

Thanks again, Neal Pederson, np@vicontrols.com

Message 5 of 8
1. The error you mention is fairly common when working with HDF5 (or any other C API) from LabVIEW. It usually means you have a mismatch between the LabVIEW data allocation and the HDF5 data allocation, causing the HDF5 routines to go into memory that is not allocated. The toolset file open VI will not cause this problem. It is just a wrapper around the simple H5Fopen VI to make it easier to use. The underlying HDF5 VIs are available in the toolset in the HDF5 directory. We are currently working to extend the breadth of functionality of this toolset. You can usually create any HDF5 function you want by copying one of the current ones and modifying the call library node to the function you are interested in. The only place you will have problems is when the function you are interested in uses a C convention that LabVIEW does not support, like a function pointer. You will then need to write a C wrapper for it, put it in a DLL, and call that from LabVIEW. That is the purpose of the HDF5_2_LV.dll that ships with the toolkit.

2. Chunk size is used in extendible data sets. When created, a dataset can be flagged as either extendible or fixed in size. If extendible, it can be extended at any time in the future and can be compressed. The chunk size is the size of each piece of data in the data set. The chunk is an indivisible unit that is read from and written to disk atomically. There are a lot of performance issues with chunk size; check out the HDF5 documentation for a good discussion of this issue. You should get no errors for any chunk size, but you will see performance differences. Remember that chunk size is multi-dimensional: if you have a two-dimensional data set, your chunk size must also be two-dimensional. Your data size on disk will always be a multiple of your chunk size, even if you put in less data than the chunk holds, although compression changes this somewhat. For large data sets, a chunk size of 65,000 points seems to give the fastest performance on Win32 machines.
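As a sketch of those chunking rules (again using h5py to stand in for the C calls H5Pset_chunk/H5Dcreate; the file and dataset names are hypothetical):

```python
import numpy as np
import h5py  # assumption: h5py illustrating H5Pset_chunk / H5Dcreate

# An extendible 2D dataset: maxshape=(None, 4096) makes the row dimension
# unlimited, which requires a chunk shape -- and the chunk shape must be
# 2D, matching the dataset rank. Disk space is allocated in whole chunks.
with h5py.File("chunked.h5", "w") as f:
    dset = f.create_dataset(
        "image",
        shape=(0, 4096),        # start empty, extend later
        maxshape=(None, 4096),  # unlimited number of rows
        chunks=(8, 4096),       # 2D chunk: 8 rows x 4096 columns
        dtype=np.uint16,
    )
    chunk_shape = dset.chunks
```

Choosing the chunk's column count equal to the dataset width means each appended block of rows maps onto whole chunks, which avoids read-modify-write cycles during streaming.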

3. HDF5 was designed to stream data to disk, so the capability exists. In LV6i, it was faster to stream to disk using HDF5 1.4.1 than using the LV primitives, although the difference was minimal. Both LV and HDF5 have improved since then, so I cannot give any definitive numbers now. Once again, check out the HDF5 documentation for how to stream to disk. The basic idea is to open an extendible dataset, then loop on the following instructions:
a) Extend the dataset to the new size
b) Query the dataset for its dataspace (it changed when you extended its size)
c) Select a hyperslab of the dataspace that corresponds to the data you want to write
d) Create the memory dataspace of the data you want to write. This will probably be a constant in your case, but for general streaming to disk where the number of points changes each time you write, it will not be.
e) Write the new data to disk
f) Close the disk dataspace.
Make sure you clean up all dataspaces, datatypes, etc. when you are done.
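The steps above can be sketched compactly in h5py (a Python illustration, not the LabVIEW VIs; names and sizes are placeholders). `resize` performs step (a), and the slice assignment carries out the hyperslab selection and write of steps (b) through (f) internally:

```python
import numpy as np
import h5py  # assumption: h5py sketch of the extend/select/write loop

ROWS_PER_BLOCK, COLS, BLOCKS = 8, 4096, 4  # 4 blocks here; 512 would build 4096x4096

with h5py.File("stream.h5", "w") as f:
    dset = f.create_dataset("image", shape=(0, COLS), maxshape=(None, COLS),
                            chunks=(ROWS_PER_BLOCK, COLS), dtype=np.uint16)
    for i in range(BLOCKS):
        block = np.full((ROWS_PER_BLOCK, COLS), i, dtype=np.uint16)
        dset.resize(dset.shape[0] + ROWS_PER_BLOCK, axis=0)  # (a) extend the dataset
        dset[-ROWS_PER_BLOCK:, :] = block                    # (b)-(f) hyperslab write

with h5py.File("stream.h5", "r") as f:
    final_shape = f["image"].shape
```

Each pass appends one block of rows, so only one block ever sits in memory at a time, which is exactly what makes the 4096x4096 case tractable.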

4. If you are trying to save an image file, you may be better off using the HDF5 image API. Check out the High Level APIs on the HDF5 homepage. HDF5 is very powerful, but very low-level and somewhat user-unfriendly. Don't be surprised if it takes you a couple of weeks to get the hang of using it. Using it from LabVIEW is even more difficult, since LabVIEW is a 32-bit environment and HDF5 is a 64-bit environment. The techniques used in the toolset show how to handle this problem. One final tip: use H5close.vi from the HDF5\Utilities subdirectory to shut the HDF5 system down when things are not working. Since HDF5 has a persistent process that remains active until all references are closed, a single memory error will cause errors forever until the system is closed down and restarted. It will restart automatically at the first call to the DLL.

Damien Gray
Senior Software Engineer
National Instruments
Message 6 of 8
Damien,

My error was caused by not setting both dimensions of the chunk size
appropriately. I can now save 2D 4096x4096 U16 arrays but have not been
able to incrementally save data to the array.

My goal is to save sets of 8 x 4096 U16 arrays until the final array size is
4096 x 4096. I followed your steps as best I could but could not figure out
how to do the following steps with the functions in the LabVIEW libraries:
a) Extend the dataset to the new size
d) Create the memory dataspace of the data you want to write.

The attached test VI and U16 2D writer VI work with the library you sent me
plus the toolkit library. I am able to replace existing data if I set DU64
Start to the beginning of the array, but I am not able to add new data to the
end of the array.

Please assist if you are able, Neal Pederson, np@vicontrols.com



[Attachment HDF2D_Neal.zip, see below]
Message 7 of 8
As you mention in the notes on the VI block diagram, you were failing to extend the dataset before attempting to write more data to it. This resulted in an attempt to write a 4096x4096 array into an 8x4096 dataspace. The attached VI shows the corrections needed. I took some of your error checking out to make it clearer. Hopefully this will work for you.

As a side note, your chunk size and the size of the data blocks you stream to the HDF5 file do not have to be the same, although they are in this example. The HDF5 system takes care of double buffering them for you. This allows you to tune both the generating chunk size (size of data pieces you save to disk) and the save chunk size (HDF5 dataset chunk size) to your application.

Good luck.

Damien Gray
Senior Software Engineer
National Instruments
Message 8 of 8