Multifunction DAQ

HELP!!! Not taking samples at predetermined sample intervals - missing samples

"I am having problem sampling my data at a predetermined frequency (e.g. 400Hz) using the PCI-6036E.

I have written my VC++ code so that it goes through a list of events when triggered by the internal counter on board the 6036E. I have set up the trigger by making the counter produce a square wave and connecting it directly to PFI2/Convert (as given in the example code, AIonePointExtConv_Eseries.c). The square wave acts as the trigger for the analog input conversions. I then use the AI_Check function to determine the sample interval; the "readingAvailable" variable from AI_Check is the flag that determines the execution of the events that are to occur at the sample frequency.
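For reference, a rough sketch of the polling approach being described (not the attached code; the device and channel numbers are placeholders, and the counter square wave is assumed to be wired to PFI2 externally as in the shipping example):

/* Rough sketch only (not the attached code).  Assumes Traditional NI-DAQ,
   with the counter's square wave wired to PFI2/CONVERT externally, as in
   AIonePointExtConv_Eseries.c.  Device/channel numbers are placeholders. */
#include "nidaq.h"

#define DEVICE  1
#define CHANNEL 0
#define GAIN    1

void pollForSamples(void)
{
    i16 readingAvailable = 0;  /* set when a conversion has finished */
    i16 value = 0;             /* the binary reading, if available   */

    /* Select channel and gain; each external CONVERT pulse then starts
       one A/D conversion. */
    AI_Setup(DEVICE, CHANNEL, GAIN);

    for (;;) {
        AI_Check(DEVICE, &readingAvailable, &value);
        if (readingAvailable) {
            /* the "list of events" that must run each sample goes here;
               if the OS preempts this loop for too long, conversions
               can be missed */
        }
    }
}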

The problem I am having is that the computer is missing a couple of samples here and there. Normally it samples at whatever frequency I run it at, but at times it just seems to forget to sample for a while. Although the duration of this "forgetfulness" increases as I increase the sampling frequency, the phenomenon happens even when I run the system at 1 Hz! My machine is a 1.4 GHz machine running Windows 98.

Another question I have is how LabVIEW takes care of sampling. Does it use interrupts from the timer chip on the motherboard? Or does it do it the way I am doing it, by generating a sample frequency and using a trigger variable? If LabVIEW does it differently than I do, I would like to know how it is done, since I could try to incorporate the algorithm in VC++.

I have attached the VC++ files I am using.

Thank you for reading my loooooong post, and thanks for the advice!
Message 1 of 8
Hi Jejun,

From what I understand, you want to sample analog input data at a specified frequency once you get a trigger signal. I would simply use a standard analog input example without an external Convert Clock. If you want to sample at a specific rate, that is all you need to do. Or, if you do want to use the counter as the scan clock, you can use the Select_Signal call with Counter 0 Output as the source and Scan Start as the signal. You can find these options in the NI-DAQ C Function Reference located at Start >> Programs >> National Instruments >> NI-DAQ >> Help.
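For illustration, a minimal sketch of that Select_Signal routing, assuming the Traditional NI-DAQ header names; the device number is a placeholder, and the counter's 400 Hz pulse train still has to be programmed separately:

/* Sketch only: route Counter 0's output to the AI Scan Start signal, as
   described above.  Device number is a placeholder; the counter must
   still be programmed to put out the 400 Hz pulse train (GPCTR_* calls,
   not shown).  Constants come from nidaqcns.h. */
#include "nidaq.h"
#include "nidaqcns.h"

void useCounterAsScanClock(i16 device)
{
    Select_Signal(device, ND_IN_SCAN_START, ND_GPCTR0_OUTPUT, ND_LOW_TO_HIGH);
}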

The reason you might be missing samples is that your code depends on the operating system reading the data as it becomes available. At 400 Hz it is almost always going to be ready. However, since you are (probably) using Windows, the OS gives your program a time slice (it is a multitasking OS), at which point it does the AI_Check (which is false). The OS then gives processor time to another program, at which point you will occasionally miss the data point when it is ready. The OS eventually returns to your program and performs the AI_Check again, but that data is gone and you are already acquiring the next point. This might not be exact, but it is probably (roughly) what's going on.

If you know you are going to acquire a set number of samples (100, 10000, etc.) at a specified frequency (400 Hz), then use an AI Scan operation; you can use the internal timebase as the scan clock, or you can use a counter (as described above). What a scan does is acquire a block of data points (or acquire continuously) on one or more channels. Use DAQSingleBufTrig_E_Series to start on a trigger, or simply a DAQSingleBuf example. These are all shipping examples that should be in your NI-DAQ folders.
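As a rough sketch of the buffered approach on the internal timebase (single channel via the high-level DAQ_Op call; device, channel, and sample count are placeholders, and error checking is omitted):

/* Rough sketch: hardware-timed, buffered acquisition with the internal
   timebase via the high-level DAQ_Op call.  Single channel; device,
   channel, and sample count are placeholders; no error checking. */
#include "nidaq.h"

#define DEVICE    1
#define CHANNEL   0
#define GAIN      1
#define NUM_SAMPS 1000u   /* e.g. 2.5 s of data at 400 S/s */

void acquireBuffered(void)
{
    static i16 buffer[NUM_SAMPS];

    /* The board times all conversions at 400 S/s and DAQ_Op returns once
       every point is in the buffer, so OS scheduling cannot cause the
       hardware to miss a sample. */
    DAQ_Op(DEVICE, CHANNEL, GAIN, buffer, NUM_SAMPS, 400.0);
}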

Anyway, the point is not to use single-point acquisition when you are acquiring multiple points, as there are a multitude of ways to lose data points along the way.

LabVIEW uses calls similar to the standard NI-DAQ functions you are using.

Anyway, hope that helps.

Ron
Applications Engineer
National Instruments
Message 2 of 8
I don't think I was clear in explaining my question.

The operations I need to perform at every sampling instant are not only acquiring analog inputs but also reading digital inputs and writing digital outputs.

There is a list of functions I need to run at every sampling instant besides just collecting analog inputs, such as reading digital inputs and writing analog outputs. If I were to use the "SCAN" functions, I don't think I could do this. Is there any way I can get a list of functions to run synchronized at a predetermined frequency?

Thank you for the help!
Message 3 of 8
Hi Jejun,

It sounds like you will need to use DAQ Events to fire an interrupt each time you acquire an analog input sample. You can still configure your analog input as a buffered acquisition operating at 400 Hz as mentioned above, but you would also configure a DAQ Event that triggers an interrupt in which you can service the rest of your program's needs. The NI-DAQ function you want is Config_DAQ_Event_Message, and you want to configure it for message 1. Message 1 interrupts every multiple of N scans, where you should choose N to be 1. The NI-DAQ C Function Reference manual, located at Start >> Programs >> National Instruments >> NI-DAQ >> Help, should help with the description and use of the function.


What essentially happens with this function is that an interrupt is fired every time a scan (analog sample) is acquired. You can have that interrupt handled by a callback function.
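A rough sketch of that arrangement, assuming the signatures in the NI-DAQ C Function Reference; the device/channel/port numbers, the "AI0" channel string, the output voltage, and the callback name are placeholders, and error checking is omitted:

/* Rough sketch: buffered 400 Hz analog input plus a per-scan callback
   (DAQ Event 1 with N = 1) that services the digital and analog I/O.
   Placeholders throughout; no error checking; 32-bit build assumed. */
#include <windows.h>
#include "nidaq.h"

#define DEVICE   1
#define CHANNEL  0
#define GAIN     1
#define BUF_SIZE 4000u

static i16 aiBuffer[BUF_SIZE];

/* Prototype follows the tutorial linked below; NI-DAQ calls this once
   per scan because DAQTrigVal0 (N) is set to 1. */
void everyScanCallback(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    i32 digPattern = 0;

    DIG_In_Port(DEVICE, 0, &digPattern);  /* read digital port 0 (configure
                                             it for input elsewhere)        */
    AO_VWrite(DEVICE, 0, 2.5);            /* update analog output channel 0 */
    /* ...any other per-sample work goes here...                            */
}

void startClockedLoop(void)
{
    i16  timebase;
    u16  sampInterval;
    char chanStr[] = "AI0";

    /* Translate 400 S/s into a timebase/sample-interval pair. */
    DAQ_Rate(400.0, 0, &timebase, &sampInterval);

    /* DAQ Event 1: notify every N scans.  N = DAQTrigVal0 = 1; a zero
       window handle with callbackAddr set asks NI-DAQ to invoke the
       callback instead of posting a Windows message. */
    Config_DAQ_Event_Message(DEVICE, 1 /* add */, chanStr, 1 /* event 1 */,
                             1, 0, 0, 0, 0, 0, 0, (u32)everyScanCallback);

    DAQ_DB_Config(DEVICE, 1);              /* double-buffered (continuous) */
    DAQ_Start(DEVICE, CHANNEL, GAIN, aiBuffer, BUF_SIZE,
              timebase, sampInterval);     /* acquisition runs on the
                                              board's own clock            */
}

In this sketch the buffered acquisition provides the 400 Hz timing in hardware, and the per-scan callback is where the digital reads and analog writes would go.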

I've included a link to a good tutorial and to some sample code. Neither example uses DAQ Event 1, but they should get you on the right track.

An Overview of DAQ Events and Occurrences
http://zone.ni.com/devzone/conceptd.nsf/webmain/7B95597F3C6138F8862567EB006C9638?opendocument

DAQ Event 2 Example
http://venus.ni.com/stage/we/niepd_web_display.DISPLAY_EPD4?p_guid=B45EACE3EAD156A4E034080020E74861&p_node=DZ52309&p_submitted=N&p_rank=&p_answer=&p_source=Internal

Using DAQ Event 8 to Send a Message When a Digital Pattern is Matched in Visual C++
http://venus.ni.com/stage/we/niepd_web_display.DISPLAY_EPD4?p_guid=B45EACE3E22F56A4E034080020E74861&p_node=DZ52334&p_submitted=N&p_rank=&p_answer=&p_source=Internal

Hope that helps. Have a good day.


Ron
Applications Engineering
National Instruments"
Message 4 of 8
Ron,

I have been reading the documentation on using the Config_DAQ_Event_Message function. A question I have is: will I be able to use the Config_DAQ_Event_Message function in a console application?

According to "http://zone.ni.com/devzone/conceptd.nsf/webmain/7B95597F3C6138F8862567EB006C9638?opendocument", the callback function's declaration is "void myCallback(HWND hwnd, UINT message, WPARAM wparam, LPARAM lparam)". I have only programmed console/DOS C/C++ and I have never come across these parameter types (HWND, UINT, WPARAM, LPARAM), so I am assuming they are for programming Windows applications.

Will I be able to use Config_DAQ_Event_Message AND its callback features in console C/C++?

Thank you so much!

JJ

Message 5 of 8
Hi JJ,

Although I haven't used this function in DOS applications, I don't think you would have problems implementing it in DOS. In the discussion of those parameters, they are not Windows parameters but NI-DAQ parameters, so I believe they are independent of the operating system. Give it a try.

Ron
Message 6 of 8
Ron,

I was successful in generating the interrupts as you advised. Thank you.

However, I noticed another problem while running the completed code. On my oscilloscope I have the external clock AND a signal from AO that tells me when I am executing lines in the callback function. The problem I noticed is that the time interval between samples seems to INCREASE as I take more samples. For example, the sample intervals for the first few samples were SHORTER than the sample intervals for the last samples taken. It seems that there is a consistent increase in the latency between when the interrupt is fired and when the callback function is executed.

Could you give me advice on where I should look to solve this problem?

Thank you~
Message 7 of 8
Hi Jejun,

I'm not sure what could be causing that problem. It seems like something is slowing down the processor. From a data acquisition standpoint, when the event occurs, an interrupt is fired and the callback is called. Even if you had a memory leak, that should not interfere with the speed at which the callback is handled unless your memory is completely used up and you are now paging data between memory and the hard disk.

I would run a system monitor to see how your memory and CPU are handling things while you are running your program. Good luck.

Ron
Message 8 of 8