LabVIEW

How to dynamically allocate memory

I am using a custom-built PCI DAQ card hardwired to a particular address.
For acquiring the data I used inport.vi and outport.vi, but they are slow.
So I wrote a C program, built a DLL, and planned to call it from LabVIEW. But I am
facing the following problems on a Win98 system.

1. In the Visual C++ IDE, under Project -> Settings -> C/C++ -> Code Generation,
if I select Debug Multithreaded DLL, I get the message "unable to
load secondary DLL" when I open the VI. But when I open the VI on a Win2000
system, the DLL loads and there is no error. If I select
Debug Multithreaded instead, the DLL also loads on Win98 systems.
I don't understand the difference between these two options. Can anyone please
explain the significance of these choices?

2. In my C program, I wanted to use dynamic memory allocation via the "malloc"
function. When I use this function, the following error message appears when
the VI is executed: "An exception has occurred within the external code called
by LabVIEW Call Library node". I have attached my C program to this post. Can
anyone help me with how to use dynamic memory allocation? Should I look at
CINs?

Thanks & Regards,
Srini
Message 1 of 10
Dear Srini,

For 1:
I suppose you built the DLL on the Win2000 system. If you use the Debug Multithreaded DLL or Debug Singlethreaded DLL option, your DLL links dynamically against the debug versions of the C runtime DLLs, so those debug runtime DLLs must be installed on your target machine. They are installed together with Visual Studio on the development machine, which is why it works there. With the Debug Multithreaded or Debug Singlethreaded option, all the runtime functions are statically linked into your DLL, so it loads without any extra DLLs. That is also why the DLL file is larger with this option.

For 2:
Passing a buffer allocated by your malloc call back to LabVIEW does not work. Read the "Using External Code in LabVIEW" manual from the LabVIEW Bookshelf (it should be installed with LabVIEW); it has a chapter on how to allocate a buffer so that LabVIEW can use it.
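A minimal sketch of the caller-allocated-buffer pattern: LabVIEW allocates the array (e.g. with Initialize Array) and passes it to the DLL as an "Array Data Pointer" through the Call Library Function Node, so no malloc'd pointer ever crosses the DLL boundary. The function name and the stubbed register read are hypothetical, not from Srini's attached program.

```c
#include <stddef.h>
#include <stdint.h>

/* Exported from the DLL. LabVIEW allocates 'buffer' on the diagram and
   passes it as an "Array Data Pointer"; 'length' is the element count.
   The DLL only fills memory it was given, so LabVIEW remains the owner
   of the buffer and nothing has to be freed on either side. */
int32_t AcquireSamples(uint16_t *buffer, int32_t length)
{
    if (buffer == NULL || length <= 0)
        return -1;                      /* invalid arguments */

    for (int32_t i = 0; i < length; i++)
    {
        /* a real driver would read the card's data register here,
           e.g. buffer[i] = _inpw(BASE_ADDRESS); stubbed for the sketch */
        buffer[i] = (uint16_t)i;
    }
    return 0;                           /* success */
}
```

On the diagram, wire an array of the required size from Initialize Array into the Call Library Function Node terminal configured as Array Data Pointer, together with the array length.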
Waldemar

Using 7.1.1, 8.5.1, 8.6.1, 2009 on XP and RT
Don't forget to give Kudos to good answers and/or questions
Message 2 of 10
Check out one of our examples on how to pass data back and forth between LabVIEW and a DLL - including allocating your own memory.

http://sine.ni.com/apps/we/niepd_web_display.display_epd4?p_guid=B45EACE3E78F56A4E034080020E74861
Message 3 of 10
Hi,
Thanks for your reply. You were right: the file size was larger with Debug Multithreaded.
I have managed to get my program working by building a DLL and calling it with the Call Library Function Node. I haven't used the malloc function, and the acquisition is reasonably fast. I was able to do this under Win98, but I actually want to run the program under Win2000. I believe we can't directly access ports on Win2000 systems using _inp() and _outp() in C. Can you help me access ports on Win2000 from C? The LabVIEW inport.vi and outport.vi are very slow at reading data from the port, so I prefer to use the C program.
Message 4 of 10
I would recommend using the NI-VISA driver as it handles the memory mapping of your PCI device to allow for very fast access. Check out

http://zone.ni.com/devzone/conceptd.nsf/webmain/ADF3152837E2B4A486256B5600642AC7

on how to set it up for this.

I would also recommend that you first try the LabVIEW VISA nodes to do the peeks and pokes, to see if they give acceptable performance. The native LabVIEW register I/O VIs are designed to be very easy to use, not for applications that make a lot of calls. VISA is a little more complicated (but much, much easier than writing a Windows device driver) and gives you much better performance. VISA supports both C and LabVIEW, and you may find that calling VISA from LabVIEW already gives you acceptable performance.
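If you do end up on the C side, the VISA register access calls look roughly like this. This is a sketch only: it assumes NI-VISA is installed (visa.h and the VISA import library), and the resource name, BAR space constant, map size, and register offset are placeholders you would take from MAX and from your card's register map.

```c
#include <stdio.h>
#include "visa.h"    /* NI-VISA header; requires the NI-VISA SDK */

int main(void)
{
    ViSession rm, vi;
    ViAddr    mapped;
    ViUInt16  value;

    if (viOpenDefaultRM(&rm) < VI_SUCCESS)
        return 1;

    /* resource name exactly as MAX shows it, e.g. "PCI0::3::INSTR" */
    if (viOpen(rm, "PCI0::3::INSTR", VI_NULL, VI_NULL, &vi) < VI_SUCCESS) {
        viClose(rm);
        return 1;
    }

    /* map the card's first base-address register region into this
       process, then read registers directly -- this is the fast path */
    if (viMapAddress(vi, VI_PXI_BAR0_SPACE, 0, 0x100,
                     VI_FALSE, VI_NULL, &mapped) >= VI_SUCCESS) {
        viPeek16(vi, mapped, &value);   /* 16-bit read at offset 0 */
        printf("register 0 = 0x%04X\n", value);
        viUnmapAddress(vi);
    }

    viClose(vi);
    viClose(rm);
    return 0;
}
```

The same sequence exists on the LabVIEW palette as VISA Open, VISA Map Address, VISA Peek 16, VISA Unmap Address, and VISA Close, so you can prototype it there first.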

Let me know if this works for you.
Message 5 of 10
Do I have to buy NI-VISA separately?
Also, I would like to know whether an application I develop with LabVIEW under Win98/2000 can be run on Linux.
Message 6 of 10
NI-VISA is part of the Device Drivers CD that came with your LabVIEW. If you chose not to install it, run the installer again to add it. You should then find the VISA subpalette under the Instrument I/O palette.

You can run a LabVIEW application built on one OS on any other supported OS when
1) you have the development version on that platform, so LabVIEW can recompile your VIs, and
2) you have used nothing platform-dependent, such as Registry VIs, DLLs, CINs, Bluetooth, ActiveX, .NET, AppleEvents, and so on. Check the help for any function you are not sure about to see whether it is supported on only one OS.

Message Edited by waldemar.hersacher on 03-22-2005 02:18 PM

Waldemar

Message 7 of 10
I tried to use the Peek and Poke VIs. The first problem is that I am not able to find the VISA DDW (Driver Development Wizard), so I don't know what inputs I should wire or how to access my PCI card. What should be given for the address space? I also don't know what to wire to the Resource name control. I went to the PXI entry under Resource name and saw lots of options there, like Manufacturer name,
Manufacturer ID, Slot, Model name, Model code, etc. Since the card was made by us for our own purposes, I don't know how to get many of these parameters. Can you help me with a sample program? It would be of great help.
Message 8 of 10
You need the DDW (Driver Development Wizard) to get started, so you might want to upgrade your VISA version. Try upgrading to this version:

http://digital.ni.com/softlib.nsf/websearch/F20288844692F8D086256EE6006983D8?opendocument

You'll find the DDW under Start -> National Instruments -> VISA -> VISA Driver Development Wizard. The only things you really need are the Manufacturer ID and Model Code; both are values hardcoded into your card, so you should be able to get them from your hardware or firmware engineers. Most likely you don't have subsystem IDs, but they should be able to tell you that as well.

If your card generates interrupts, it is going to be more complicated, but the instructions on the web page I gave you walk you through it. If you aren't going to be generating interrupts, you can skip that whole section.

Once you are done, the PCI card shows up in the MAX configuration screen, which shows you exactly what your VISA resource name is (such as PCI0::3::INSTR). You can also launch the VISA Interactive Control (VISAIC) from MAX to interactively peek and poke your card. When you are ready to program it in LabVIEW, just wire the resource name into Peek and Poke.
Message 9 of 10
My PCI card is hardwired to a particular address, so it doesn't have these IDs. Is there any way I can access it through VISA to speed up the acquisition, so that I don't have to write a driver in C for the Win2000 system?
Message 10 of 10