microsecond timer

Hello,
 
I am using LabVIEW 7.1 and the DAQ PCI-16E-1 card.
I want to measure 2 signals with microsecond resolution (5 us would also be OK). Then I want to start a timer that also works with microsecond resolution, starting at the edge of signal 1 and stopping at the edge of signal 2.
 
The problem I have is that the real-time counter, which should work with microseconds, only works with milliseconds. Because this function is made for FPGA targets I also installed a LabVIEW FPGA package, but it still counts in ms, and I can't even switch the execution target.
 
The second problem I have is with the input. I am getting a signal in as an array (with a resolution of about 4 us). When I want to work with the data stream, the following functions only take one value of the array, and the whole LabVIEW program runs very slowly (about 2 Hz).
 
Has anybody experienced something like this and knows what to do?
 
Thanks for your support.
 
Thomas
 
My program is in the attachment.
Message 1 of 2
Hello Thomas,

Although I did not debug your code in detail, I have some remarks concerning the program.
  • The counter block you are using is not accessing the hardware of your PCI board. To be able to use the FPGA-specific code you need the FPGA hardware as well. With Windows timing (which is what the VI appears to use in your situation) the clock rate is limited to 1 ms. This is expected behaviour.
  • If you want to do measurements with sub-millisecond timing you need to do this in hardware. Your DAQ card has 2 general-purpose counters that can be used for that. Unfortunately the counters on the E-Series devices do not support the two-edge separation measurement, which seems to be exactly what you need (the newer M-Series boards do support this function), so you will have to think of a different solution, which might include some external signal conditioning.
  • If the edges you want to measure the difference between are on the analog signals you acquire, I would do the calculation in software. I don't know what the edges look like, but since you are acquiring the signals fast enough, a calculation like this should be possible: find the edge in signal 1 -> get its time -> find the edge in signal 2 -> get its time -> calculate the time difference.
  • Concerning the speed of your analysis, I can only say that you are making heavy use of external code and local variables. Both of these elements have an impact on the overall performance of the application. As far as I can see, you could replace all of them with native LabVIEW code. Don't use locals and code nodes more often than absolutely necessary. LabVIEW is a dataflow-oriented programming environment, and the functions and variables you are using contradict this paradigm. That is why the compiler can't do its best.
  • I would recommend that you have a look at some examples to see how data acquisition, analysis and presentation can be realized. Take a standard example such as Acq&Graph Voltage-Int Clk.vi and start developing your application from there. You will be surprised by the performance gain.
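The software edge-separation approach described above (find edge in signal 1, find edge in signal 2, subtract their times) can be sketched in text form as well. The following is a minimal Python illustration, not LabVIEW code; the threshold value, the rising-edge definition, and the function names are my own assumptions for the example:

```python
import numpy as np

def first_rising_edge(samples, threshold):
    """Return the index of the first sample that crosses `threshold` upward."""
    above = np.asarray(samples) >= threshold
    # A rising edge is a transition from below the threshold to at-or-above it.
    crossings = np.flatnonzero(~above[:-1] & above[1:]) + 1
    if crossings.size == 0:
        raise ValueError("no rising edge found")
    return int(crossings[0])

def edge_separation(signal1, signal2, sample_interval_s, threshold=2.5):
    """Time from the first edge in signal1 to the first edge in signal2."""
    i1 = first_rising_edge(signal1, threshold)
    i2 = first_rising_edge(signal2, threshold)
    return (i2 - i1) * sample_interval_s

# Example: at a 4 us sample interval, edges 25 samples apart are 100 us apart.
sig1 = [0.0] * 10 + [5.0] * 40   # edge at sample 10
sig2 = [0.0] * 35 + [5.0] * 15   # edge at sample 35
dt = edge_separation(sig1, sig2, sample_interval_s=4e-6)
```

Note that the resolution of this method is limited to the sample interval (here 4 us), which matches the timing you can achieve with the acquisition alone.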
regards
Ingo Schumacher
Systems Engineering Manager CEER
National Instruments Germany
Message 2 of 2