LabVIEW


DAQ Assistant: Resetting Buffer

Hello,

 

Is it possible to reset the buffer while running continuously so that the DAQ Assistant can read/write a new set of data points?  My application basically triggers without using the triggering option in the DAQ Assistant.  I am using a 9172, so I can only have one DAQ Assistant in my program. 

 

Right now, I can hit my external "trigger" and the current set of data points will be written to a file.  I want to be able to hit my external "trigger" and have the data points after that event saved to the file, rather than the group of points in the buffer around the "trigger."  This way I can avoid losing data if the trigger goes off at the end of a group of data points. 
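The difference between the two behaviors can be sketched in plain Python (no NI hardware involved; every name here is an invented stand-in for a LabVIEW block, not a real API):

```python
import random

def acquire_chunk(n=100):
    """Stand-in for one DAQ Assistant read: n simulated samples."""
    return [random.random() for _ in range(n)]

def capture_after_trigger(chunks=3):
    """Desired behavior: ignore the pre-trigger buffer and acquire
    fresh chunks only once the trigger has fired."""
    data = []
    for _ in range(chunks):
        data.extend(acquire_chunk())
    return data

# With the original approach, whatever is in the buffer when the
# trigger fires gets saved. If the trigger lands near the end of a
# 100-sample chunk, almost all of the saved data is pre-trigger.
buffer_at_trigger = acquire_chunk()
trigger_index = 95
post_trigger_kept = buffer_at_trigger[trigger_index:]
print(len(post_trigger_kept))   # only 5 samples follow the trigger

# Capturing after the trigger instead guarantees every saved sample
# comes after the event.
print(len(capture_after_trigger(3)))  # 300 post-trigger samples
```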

 

 

Message 1 of 8

Hey Bentz,

 

I've got a few questions to try to clarify what you are doing and what you are trying to accomplish.

 

First, when you say you want to clear the buffer, do you mean you want to clear the samples stored on the module's buffer?

 

Second, I was looking at your code and was wondering what the second while loop is for. Is it just to read back your data from the file and evaluate it?

 

 

I believe I understand your issue: based on your code, it looks like whatever samples are in the device buffer are saved whenever that manual trigger comes in. However, you want all of the saved samples to be samples taken after that trigger is received?

 

I believe you have the right idea, but your logic is in the wrong place. If you want the samples to be acquired and written after the trigger is sent, then I recommend moving that trigger case structure to the beginning of the DAQ Assistant and placing an error constant inside. Then you can wire that error out from the case structure into the Error In of the DAQ Assistant. When the trigger is sent, the error wire will fire along with it, and since the error wire controls dataflow, it will cause the DAQ Assistant to start. The DAQ Assistant will then take its samples and they will be written to file, but this will all happen AFTER the trigger is received instead of happening continuously. To monitor that case, all you need to do is put a While loop around it with a Wait function (for your CPU's sake), and that should result in you acquiring samples only after the trigger has been received.
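In text form, the poll-then-acquire structure described above looks roughly like this (a hypothetical Python sketch of the dataflow only; the function names stand in for LabVIEW blocks and are not real APIs):

```python
import time

def trigger_pressed(elapsed, fire_at=0.05):
    """Hypothetical stand-in for reading the manual trigger control."""
    return elapsed >= fire_at

def daq_assistant_read():
    """Stand-in for the DAQ Assistant acquiring one set of samples."""
    return list(range(10))

def write_to_file(samples):
    """Stand-in for Write To Measurement File; returns samples written."""
    return len(samples)

# The While loop polls the trigger, and the Wait spares the CPU.
# Dataflow only reaches the acquisition once the trigger has fired,
# so every sample written to file comes after the trigger.
start = time.monotonic()
while not trigger_pressed(time.monotonic() - start):
    time.sleep(0.01)  # the Wait function
written = write_to_file(daq_assistant_read())
print(written)  # 10
```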

 

Does this sound like what you want to do?

 

 

Micah M.
National Instruments
NIC AE Specialist - Test
Message 2 of 8

Thank you for the reply 🙂

 

This is indeed what I was trying to do, but I feel I was overcomplicating it way too much XD  I changed my hardware around so that the DAQ could accept a digital input signal and used that as a trigger in the DAQ Assistant settings.  Instead of using a while loop, I just set the "timeout" to -1 so it runs until the trigger goes off and the data is captured.  This program does everything I wanted the other one to do, and it is much simpler.  It was also within my ability to write with my limited LabVIEW knowledge. 

 

There is one more thing I need help with that is beyond my current experience with LabVIEW, however.  Every time I've run the program, I have needed to re-calibrate all the strain gauges in the DAQ Assistant.  This isn't really a big deal for me, but I want to make this program easy to use for someone who has never used LabVIEW before.  I was thinking of adding a "calibrate strain gauges" button on the front panel, but I don't know how to make that button actually work.  I'm reading up on it now and looking through the help examples, but any help would be greatly appreciated.

 

Message 3 of 8

More info:  To calibrate the strain gauges, I open the DAQ Assistant, select the appropriate channels, and select "calibrate" under the devices tab.  I'm using an NI 9236 (quarter-bridge analog input) module on an NI 9172.  The only thing that seems to change during calibration is the offset voltage of the channels.  The gage factor, gage resistance, etc. all stay constant.  All this data is saved in the DAQ Assistant.  However, I constantly have to re-calibrate because the offset voltage keeps changing for some reason. 

 

Is there a way to replace the DAQ Assistant so that this could work?

Message 4 of 8

It sounds like you're doing a self-calibrate on the device after the DAQmx Assistant finishes running. The DAQmx palette has a Self Calibrate VI that you can call after running the task (see attached picture). The VI is located in Hardware I/O » DAQmx » Advanced » Calibration. Tie the error wires together and then create a constant for the device name and it should calibrate the device after running the task. 

Jake H | Product Manager
Message 5 of 8

Hi Jake,

 

I read some tutorials online and rewrote my program from scratch.  It has the same functionality, but unfortunately I still have problems calibrating it.  I think what is happening now is that the trigger is causing it to calibrate, but it doesn't calibrate before the data is captured and saved to the file.  I want to be able to hit the calibrate button, see that it's calibrated, and then hit my trigger to capture data.

 

If I run the program a second time, the gauges have been calibrated and my waveforms line up with 0V. 

 

I think I have to set up a reference trigger to capture the data and run the read continuously, but I'm not sure.

 

Unfortunately, I won't be able to work on it again until Monday.

 

 

Message 6 of 8

I thought you were trying to perform a self-calibrate on the device, not use shunt calibration and offset nulling to set up your bridge. Are you able to run this community example successfully? 

Jake H | Product Manager
Message 7 of 8

Yes, I am able to run that example.  That is the same example I referenced when building my block diagram.

 

I think the main calibration that was necessary was the offset nulling.  Every time I would open and close the program, the Strain Gauge waveforms would no longer be centered around zero. 

 

Here is my program now.  This one actually calibrates before the trigger goes off.  All I did was remove the condition around the calibration blocks.  I think this makes the program calibrate before it starts waiting for the trigger. 
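The ordering that finally worked can be summarized in a small sketch (hypothetical Python standing in for the block diagram; none of these names are real NI APIs). Removing the condition around the calibration is equivalent to moving the calibration outside, and before, the wait-for-trigger step:

```python
def offset_null():
    """Stand-in for the offset-nulling calibration step."""
    return 0.0  # the corrected channel offset

def wait_for_trigger(events):
    """Consume simulated events until the trigger arrives."""
    return any(e == "trigger" for e in events)

def acquire():
    """Stand-in for the triggered acquisition."""
    return [0.0, 0.1, 0.2]

# Calibration sits outside (before) the trigger condition, so it
# always runs first; only afterwards does the program block waiting
# for the trigger, and only then does it acquire.
offset = offset_null()
armed = wait_for_trigger(["idle", "idle", "trigger"])
data = acquire() if armed else []
print(len(data))  # 3
```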

 

As for shunt calibration, I'm using an NI 9236, which has a built-in 100 kΩ shunt resistor that can be switched into and out of the quarter bridge.  This is controlled in software, so I saw no reason not to do that calibration as well. 

Message 8 of 8