Multifunction DAQ


How to measure the integrating time of an integrator

Solved!
Go to solution

Hi, all,

 

Sorry to disturb you. I have designed an analog integrator that needs to be characterized. First, I want to apply a known signal and then measure the integration time. (I have a PCI-6251 DAQ board to do that.)

 

To do that, I propose to build an analog-input sub-VI to measure the output voltage of the integrator and, at the same time, use a counter to record the time. When the output voltage of the integrator reaches a specific level, I will stop the integration and stop the counter. Finally, the integration time can be obtained.

 

However, I am having trouble stopping the counter when the voltage reaches the required level. Can I use a trigger to do that? If so, how?

Can anyone give me some hints, or examples?

 

For your convenience, I have also made a plot to explain the process, shown below:

 

[Attachment: untitled.JPG — sketch of the integrator output ramping up to the reference level Vref while the counter runs]

According to the plot, when the integration process starts, the counter has to be started as well. Then, when the analog voltage reaches the required level (Vref), we have to stop the counter and return the measured time. At the same time, a digital signal should stop the integration as well.

 

Please give me some hints on how to make such a program, especially on how to stop the counter when the voltage reaches the required level; this is actually my biggest problem. The second problem is the synchronization of the counter and the integrator. I have seen the example in LabVIEW, and I will try to test it. However, if you have any suggestions, please help me. Thank you in advance.

 

Cheers,

Tingxuejh

Message 1 of 17

Hi tingxuejh,

 

Sounds interesting. If we are talking about using a consistent sample clock (like the on-board AI Sample Clock), it should be easier to just count samples of the input to timestamp the data. For example, if acquiring at 1 MHz, every sample of the AI clock corresponds to 1 µs. If the threshold is crossed on, say, the 97th sample, then we would know the integration time was 97 µs. If you want sub-microsecond resolution, you can always interpolate and get pretty close to the actual value, since it looks like the input is essentially linear.
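
Not LabVIEW, but to make the sample-counting arithmetic concrete, here is a minimal Python sketch of the threshold-plus-interpolation idea. Everything in it is illustrative: the array handed to the function stands in for whatever you read from the AI task, and the rate and threshold are made-up values chosen so the crossing lands near the 97 µs figure above.

import numpy as np

def integration_time(samples, sample_rate_hz, v_ref):
    """Estimate when the integrator output first crosses v_ref, in seconds."""
    dt = 1.0 / sample_rate_hz
    above = np.nonzero(samples >= v_ref)[0]
    if above.size == 0:
        return None                          # threshold never reached
    i = above[0]
    if i == 0:
        return 0.0
    # Linear interpolation between the last sample below the threshold and
    # the first sample at/above it gives sub-sample resolution, which works
    # well here because the integrator output is essentially a ramp.
    frac = (v_ref - samples[i - 1]) / (samples[i] - samples[i - 1])
    return (i - 1 + frac) * dt

# Made-up data: a 10 kV/s ramp sampled at 1 MHz, threshold at 0.97 V.
rate = 1_000_000
t = np.arange(0, 200e-6, 1.0 / rate)
ramp = 10_000.0 * t
print(integration_time(ramp, rate, 0.97))    # ~97e-6 s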

 

 

As you mentioned, we could use a counter to do this as well. The digital control signal you are looking for is called the Analog Comparison Event (see the M Series User Manual). In order to use this signal, you must have an analog trigger configured. If going this route, I would configure the counter to perform a pulse-width measurement on the Analog Comparison Event signal.
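
I don't have a text-based LabVIEW equivalent, but as a rough sketch of that configuration using the NI-DAQmx Python API (the nidaqmx package): the device name 'Dev1', the 2 V trigger level, and the channel/counter choices are all assumptions, and whether the Analog Comparison Event stays asserted long enough for a clean width measurement depends on how the analog trigger is set up, so treat this as a starting point rather than a finished solution.

import nidaqmx
from nidaqmx.constants import AcquisitionType, Slope, TimeUnits

with nidaqmx.Task() as ai_task, nidaqmx.Task() as ci_task:
    # An analog trigger must be configured on an AI task; that is what
    # makes the Analog Comparison Event signal exist on the device.
    ai_task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
    ai_task.timing.cfg_samp_clk_timing(500_000,
                                       sample_mode=AcquisitionType.FINITE,
                                       samps_per_chan=50_000)
    ai_task.triggers.start_trigger.cfg_anlg_edge_start_trig(
        trigger_source="Dev1/ai0",
        trigger_slope=Slope.RISING,
        trigger_level=2.0)                   # hypothetical Vref

    # Counter pulse-width measurement with its input routed to the
    # Analog Comparison Event terminal.
    ci_chan = ci_task.ci_channels.add_ci_pulse_width_chan(
        "Dev1/ctr0", min_val=1e-6, max_val=1.0, units=TimeUnits.SECONDS)
    ci_chan.ci_pulse_width_term = "/Dev1/AnalogComparisonEvent"

    ci_task.start()
    ai_task.start()
    width = ci_task.read(timeout=10.0)       # seconds the event line was high
    print(f"measured width: {width * 1e6:.1f} us")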

 

 

Best Regards,

John

John Passiak
Message 2 of 17

Hi, John,

 

Thank you for your information. The first solution you mentioned is absolutely correct. However, I have some problems using it (my fault, since I'm new to DAQ). For instance, when I configured the DAQ Assistant to acquire data at a sampling rate of 500 kHz and used a "Write to File" VI to record the data, the buffer easily overflowed. I think this is because the file-writing process is relatively slow. Could you tell me how you usually solve such a problem? The picture below is the block diagram I created. I tried increasing the buffer size, but it did not seem to help.

 

[Attachment: untitled1.JPG — block diagram with the DAQ Assistant and the Write to File VI]

 

About the second solution, I also need your help. I know how to configure an analog trigger, but I don't know how to configure the counter to perform a pulse-width measurement on the Analog Comparison Event signal, since there is no output terminal for this signal. Can you show me an example?

 

Apart from that, as you can see from the earlier picture, I also need a digital control signal to enable and disable the integration process. My idea is to make use of this event signal; the problem, however, is still where to find it.

 

I'm sorry to disturb you so much, but I really need the help. Thank you.

 

Cheers,

Tingxuejh

Message 3 of 17

Hi Tingxuejh,

 

The write-to-file process is likely slowing the loop down to the point that it can't keep up with the incoming data. At higher rates I usually write to a binary file instead of converting all of the data to ASCII as it comes in (the Express VI can be configured either way, so I can't tell for sure which you are doing: LVM files are ASCII and TDMS files are binary). You will also probably want to use lower-level functions, as in the following example (the Express VIs are good for getting started but do not offer the most control over execution). The Producer/Consumer architecture might also prove helpful in this case:

Using Producer/Consumer Architecture for DAQmx Read and Write to File
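
For what it's worth, here is the same producer/consumer idea written out as a small Python sketch rather than LabVIEW, just to show the structure. The read_chunk() function is a stand-in for whatever actually pulls a block of samples from the DAQmx task; the point is that the acquisition loop only reads and hands data off through a queue, while a separate loop does the slower file writing in raw binary rather than ASCII.

import queue
import threading
import numpy as np

SAMPLE_RATE = 500_000          # 500 kHz, as in the block diagram above
CHUNK = SAMPLE_RATE // 10      # read ~100 ms of data per loop iteration
data_q = queue.Queue()

def read_chunk():
    """Placeholder for the DAQmx read; returns one chunk of float64 samples."""
    return np.random.rand(CHUNK)           # stand-in data

def producer(n_chunks):
    # Acquisition loop: do nothing here except read and hand off the data.
    for _ in range(n_chunks):
        data_q.put(read_chunk())
    data_q.put(None)                        # sentinel: acquisition finished

def consumer(path):
    # Logging loop: write raw binary (float64), much faster than ASCII.
    with open(path, "wb") as f:
        while True:
            chunk = data_q.get()
            if chunk is None:
                break
            chunk.astype("<f8").tofile(f)   # little-endian float64

t_prod = threading.Thread(target=producer, args=(50,))
t_cons = threading.Thread(target=consumer, args=("acquisition.bin",))
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()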

 

You'll want to read enough samples per loop so that you can keep up with the data.  I would stick with the AI clock rate method, but here is an example of what the Analog Comparison Event counting might look like using DAQmx code.

 

 

As for the pulse output, it sounds like you want to start it at an arbitrary time and stop it when the integrator output has reached a certain voltage. What are the timing requirements for this signal? Does it need to stop exactly when the limit is reached, or is a software-timed solution acceptable?

 


Best Regards,

John

John Passiak
Message 4 of 17

Hi, John,

 

I tried the producer/consumer structure, and it works fine; I think this solves my problem. Thank you. However, I have some difficulty recovering the data from binary back into decimal form. Because the measured data contains not only the voltage but also the timing information, I don't really know how to convert it. I tried the "Read Binary" VI from the LabVIEW examples, but the result looks strange.

 

About the second solution, it's also nice. I will try to use it later.

 

As for the pulse signal, I want to start it exactly when the counter starts counting. The stopping time is not that strict, though; you are right that it can be done by software control. Still, I'd like to know how to make it stop exactly when the limit has been reached. If you have time, could you also explain that? Thank you in advance.

 

Cheers,

Tingxuejh

Message 5 of 17

Hi, John,

 

I think I made a mistake. A software-timed solution for the digital control signal is not possible in my case, because the integration is relatively fast: the integration time is on the order of 90 µs, and software control of the digital signal would be too slow. So I still think I need a hardware solution that stops the integration exactly when the counter stops counting.

 

I think we can also make use of the Analog Comparison Event signal. Could you give me some hints on how to do that? Thank you in advance.

 

Cheers,

Tingxuejh

Message 6 of 17

Hi Tingxuejh,

 

This ended up a little trickier than I first thought, but I think I have a solution. With this method, you can configure the amount of time between consecutive acquisitions (to allow time for your integrator to settle back down to 0 V). Have a look at the following timing diagram and LabVIEW code and see if this makes sense for what you are trying to do. You will need to set the appropriate device name (my DAQ device is called '6251').

 

[Attachment: Timing Diagram.PNG — timing diagram for the proposed measurement]

 

 

I hope this helps!  I think it's very close to what you need to do.

John Passiak
Message 7 of 17

Hi, John,

 

I think this one is going to work, but I haven't been able to test it yet, because my LabVIEW is 8.0 and the program is written in 8.2. I will try to find LabVIEW 8.2 soon and let you know if this solution really works. Thank you.

 

At the moment I have one question. As you mentioned, I can use a producer/consumer loop to sample the data at a very high rate and save it in binary format. I tried that and it works. However, I have some problems converting the binary data back to decimal, especially because the data is in waveform format, which contains both the measured values and the timing information. I tried code similar to the "Read Binary" VI in the Example Finder, but the results are a bit strange. How do you usually recover the data?

 

Sorry for disturbing you so often.

 

Cheers,

Tingxuejh

Message 8 of 17

Hi Tingxuejh,

 

I typically use the TDMS format. A TDMS file is a binary file with header information that helps you read the data back.
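
As an aside, if you do go with TDMS, those headers mean you don't have to remember the data type or layout yourself when reading the file back outside LabVIEW. For example, the third-party npTDMS package for Python (my own suggestion, not something from this thread) can walk the groups and channels by name:

from nptdms import TdmsFile                  # pip install npTDMS

tdms = TdmsFile.read("acquisition.tdms")     # hypothetical file name
for group in tdms.groups():
    for channel in group.channels():
        data = channel[:]                    # samples as a NumPy array
        print(group.name, channel.name, len(data))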

 

In general, when writing to a binary file you need to make sure that you read the data back as the same type. Common errors include reading the data as little-endian vs. big-endian, indexing or not indexing the array size, using different data types, etc. You can always break the waveform out into its separate components using the Get Waveform Components function in LabVIEW, then just write the array and ignore the timing information.
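
To illustrate the type-matching point with the raw-binary route, here is a small Python sketch. It assumes the file holds nothing but a flat array of little-endian float64 voltages (the waveform's Y array with the timing discarded, as in the producer/consumer sketch earlier); if your writer prepends the array size or uses a different type, the read side has to match that exactly.

import numpy as np

# Write side: keep only the voltage array (the equivalent of taking the Y
# component from Get Waveform Components) as little-endian float64.
voltages = np.linspace(0.0, 5.0, 1000)       # stand-in data
voltages.astype("<f8").tofile("integrator.bin")

# Read side: the dtype must match what was written, byte for byte.
# Reading "<f8" data back as ">f8" (wrong endianness) or "<f4" (wrong
# width) produces exactly the kind of "strange" values described above.
recovered = np.fromfile("integrator.bin", dtype="<f8")

# The timing is not in the file; rebuild it from the known sample rate.
dt = 1.0 / 500_000                           # e.g. 500 kHz acquisition
t = np.arange(recovered.size) * dt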

 

 

8.2 was as far back as I could save the code in LabVIEW 2009. If you have any problems finding LV 8.2, I can track down a system with an earlier version of LabVIEW so I can save back to 8.0 if necessary.

 

 

-John

John Passiak
Message 9 of 17
Also, here's the code attached as a .png so you can take a look without LabVIEW 8.2 if you need to.
John Passiak
Message 10 of 17