
Odd Simulated Device Behavior

LV 2010, PXI 8101 controller, NI-DAQ 9.2.2

Starting a new project using the PXI 1050 combo chassis: part PXI, part SCXI.

I'm trying out things to see how fast I can go.

 

I have a simple program which configures 8 LVDT channels at 1000 Hz sample rate and enters a loop:

  The loop simply reads the AVAILABLE SAMPLES PER CHAN property and then reads one sample, with zero timeout.

  I then check for a QUIT button and loop if not clicked.

 

For now, I am using SIMULATED DEVICES for the PXI 6251 board, the PXI 1050 chassis, and the SCXI 1540 LVDT board.

I have the real PXI box and a real PXI 6251, but I don't have the real SCXI 1540 (the client does).

Here's the code - the other frame of the loop just samples the QUIT button.

The VI at the left does CREATE AI CHANNEL-LVDT 8 times for 8 consecutive channels and appends each to the task.

 

I'm recording the value of AVAILABLE SAMPLES as a diagnostic.
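
In text form (the picture below shows the actual diagram), the loop is roughly this, sketched with the nidaqmx Python API; the channel names are made up, and a plain voltage channel stands in for the LVDT channel type:

```python
# Rough text equivalent of the diagram below, using the nidaqmx Python API.
# Channel names are made up, and add_ai_voltage_chan stands in for the
# "Create AI Channel (LVDT)" call used in the actual VI.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    for ch in range(8):                            # 8 consecutive channels
        task.ai_channels.add_ai_voltage_chan(f"SC1Mod1/ai{ch}")

    # 1000 Hz, continuous sampling, default buffer size
    task.timing.cfg_samp_clk_timing(rate=1000.0,
                                    sample_mode=AcquisitionType.CONTINUOUS)
    task.start()

    avail_log = []
    for _ in range(5000):                          # real VI loops until QUIT
        avail = task.in_stream.avail_samp_per_chan     # the diagnostic value
        avail_log.append(avail)
        if avail > 0:
            # one sample per channel, zero timeout (guarded here so an empty
            # buffer doesn't raise an error)
            task.read(number_of_samples_per_channel=1, timeout=0.0)
```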

 

Pic 1.PNG
What I see is this:

Pic 2.PNG

 

 

It looks like I joined something in midstream, and the first one is off.

The rest of it looks like somebody DUMPS 20 samples into the buffer periodically.

I pull one out, I pull one out, I pull one out, and it gets empty and stays empty until something dumps another 20 in there.

 

Sure enough, if I disable the READ part, and just log the SAMPLES AVAILABLE, it goes up in steps of 20:

 

 

Pic 3.PNG

 

 

Where does "20" come from?

 

If I change my sample rate to 100 Hz (not 1000), then it dumps 2 samples, not 20:

Pic 4.PNG

 

 

I would expect that second graph to show 0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1

 

It puts a sample in, and I take it out. 

 

I'm using whatever default buffer size DAQmx happens to give me.

 

I've used something similar on a different PXI box for years, but that was a real device.

 

Is this a bug / feature of the simulated thingy-do?

 

 

I stumbled around and found a "SCAN ENGINE" but I'm not sure what that is doing for / to me.  It didn't seem to have an effect on this issue.

 

Anybody have ideas?

Steve Bird
Culverson Software - Elegant software that is a pleasure to use.
Culverson.com


LinkedIn

Blog for (mostly LabVIEW) programmers: Tips And Tricks

Message 1 of 13

Here I changed the code to WAIT on a single sample.

I also recorded the value of a 100 nSec timer, AFTER each DAQmx READ.

The DUMP number went back to 10 (where does "10" come from?)

 

The timer values are around 200 ticks apart (20 uSec), until there's a jump of 100,000 ticks (10 mSec).

 

Where is the clog in the pipeline?

 

 

Pic 5.PNG
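
Roughly the same test in nidaqmx-Python terms, with time.perf_counter_ns() standing in for the 100 nSec tick counter (channel name is a placeholder):

```python
# Sketch of the "wait on one sample, timestamp after each read" test.
import time
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("SC1Mod1/ai0")   # placeholder name
    task.timing.cfg_samp_clk_timing(rate=1000.0,
                                    sample_mode=AcquisitionType.CONTINUOUS)
    task.start()

    stamps = []
    for _ in range(1000):
        # blocking read: waits until one sample per channel is available
        task.read(number_of_samples_per_channel=1, timeout=10.0)
        stamps.append(time.perf_counter_ns())

# gaps between consecutive reads, in uSec; a burst shows up as a run of
# tiny gaps followed by one ~10 mSec gap
gaps_us = [(b - a) / 1000.0 for a, b in zip(stamps, stamps[1:])]
```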

Steve Bird
Message 2 of 13

 

So... I stripped out all the other stuff, and changed the code to use the REAL 6251 board, not the simulated one.

I don't have a real LVDT board to go in it, but it should work for the test.

 

Here, I start a task with one channel, and start a timer.

 

The 0th time thru the loop, I read all available samples, just to clear out the buffer.

I'm recording AVAIL SAMPLES each time, BEFORE a DAQmx READ with ZERO timeout.

I'm also recording the difference in time (100 nSec ticks) between loops.
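
In nidaqmx-Python terms, this test is roughly as follows (placeholder channel name; a plain voltage channel, since there's no LVDT module on the real 6251):

```python
# Sketch of the real-device test: flush on the 0th pass, then log the
# available-samples count BEFORE each zero-timeout read, plus loop timing.
import time
import nidaqmx
from nidaqmx.constants import AcquisitionType, READ_ALL_AVAILABLE

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("PXI1Slot2/ai0")    # one channel
    task.timing.cfg_samp_clk_timing(rate=1000.0,
                                    sample_mode=AcquisitionType.CONTINUOUS)
    task.start()

    task.read(READ_ALL_AVAILABLE)            # 0th pass: flush the buffer
    avail_log, loop_ticks = [], []
    t_prev = time.perf_counter_ns()
    for _ in range(2000):
        avail_log.append(task.in_stream.avail_samp_per_chan)  # BEFORE the read
        if avail_log[-1] > 0:
            task.read(number_of_samples_per_channel=1, timeout=0.0)
        t_now = time.perf_counter_ns()
        loop_ticks.append((t_now - t_prev) / 100.0)   # loop-to-loop, 100 nSec ticks
        t_prev = t_now
```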

 

For the AVAIL SAMPLES, I would expect to see 0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1, and so forth.

 

I don't.

 

The AVAIL SAMPLES count gets TWO samples dumped in every now and then. I read one in one loop and read the other in the next loop.

 

The average loop time is 2750 ticks (275 uSec).  WHY?

 

When samples get dumped in, it's like the loop wakes up and processes them about 40 uSec apart.

But what is it doing in the meantime?

 

And why doesn't NI-DAQ tell me when a sample is ready?

 

Pic2 1.PNG

Pic2 2.PNG

Steve Bird
Message 3 of 13

So, I went back to the old system.  It's a different PXI box (PXI1042 with 8196 controller and PXI 6221 for ADIO, plus other cards).

 

I already had a timer set up there, so I stripped out all the code and did the same testing.

 

It's running at 100 Hz (not 1000) but that shouldn't matter.

 

I timed the loop-to-loop interval and it's dead on 10000 uSec (10 mSec), with a couple of glitches.

Pic3 1.PNG

 

I put the SAMPLES AVAILABLE property node ahead of the DAQ READ and recorded it too.

That turned out to be a constant zero, which is what I would expect, actually.

There's nothing for the CPU to do except wait on the sample, so it loops and finds no sample available and waits.

 

So, what's the difference?

 

1... 100 Hz vs. 1000 Hz - well, I tried the new system at 100 Hz and still saw the problem.

2... 6221 vs. 6251.  Hard to see how the hardware would clog up the software this way.

3... 8101 vs. 8196 - maybe not the hardware, but the OS on it?

4... The old system calculated a wait time (1 / Sample Rate), applied a fudge factor of 1.1, and used that as a timeout (see the tiny sketch after this list).

   Don't see how that can affect anything; I used a 10 sec timeout and it's not timing out at all, anyway.

 

5.... The old system did NOT have a SCXI chassis attached to the 6221.  The new one DOES.
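
For the record, the timeout in item 4 was just one sample period plus 10%:

```python
# The old system's DAQmx Read timeout: one sample period plus a 10% fudge factor.
sample_rate_hz = 100.0                      # the old system ran at 100 Hz
timeout_s = 1.1 * (1.0 / sample_rate_hz)    # = 0.011 s at 100 Hz
```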

 

I tried changing the new one to 100 Hz and a short but adequate timeout.  It never times out. No errors occur.

That eliminates #1 and #4 above.

 

It's still dumping TWO samples at once (or at 1000 Hz, it's dumping ....

 

Hmmm.  It's still dumping TWO.  Earlier it was dumping TEN or TWENTY.

 

Confusing.

 

Steve Bird
Message 4 of 13

Oh, right - it was the SIMULATOR that was dumping TEN and TWENTY.

Steve Bird
Message 5 of 13

I moved to a separate simulated PXI 6251 device WITHOUT an SCXI chassis, on the idea that maybe the SCXI attachment was affecting my results by trying to use the board.

Apparently not.

 

I changed to simulated PXI 6221 (to match the old system that works) and got no change.

 

The upper graph shows the number of samples available (Y) vs loop number.

The lower graph shows time between loops (Y - uSec) vs loop number.

The low numbers on the lower graph are around 16-17 uSec - the basic loop time, I suppose.

 

It still looks like the thing is dumping 10 samples into the buffer every 10-12 mSec.  WHY?

 

 

Pic 1.PNG

 

 

Here's the code:

Pic 2.PNG

 

I have set the SCAN ENGINE to 200 mSec and lower priority, to avoid possible interference.

 

Why doesn't it give me one sample every one mSec instead of 10 samples every 10 mSec?

 

 

Steve Bird
Message 6 of 13

Well, I've discovered something, but I'm not sure what it means.

 

Here's the code and the results.

 

I would expect the SAMPLES IN BUFFER to be continuous 0 (because it's waiting on one sample and then reading it).

I'm thinking it should get back to the READ long before the sample is ready.

 

Not so:

Pic 1.PNG
 
 
But here, I set the HARDWARE TIMED SINGLE POINT mode, and it works perfectly.
Pic 2.PNG
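
For reference, the only change from the continuous version is the sample mode; in nidaqmx-Python terms it would be something like this (placeholder channel name):

```python
# Same task, but hardware-timed single point instead of continuous sampling:
# one sample per channel per sample-clock tick, and each read blocks until
# that tick's sample arrives.
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("PXI1Slot2/ai0")   # placeholder name
    task.timing.cfg_samp_clk_timing(
        rate=1000.0,
        sample_mode=AcquisitionType.HW_TIMED_SINGLE_POINT)
    task.start()

    for _ in range(1000):
        sample = task.read(timeout=10.0)     # exactly one point per loop
```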
 
 
The weird thing is, I've got a project out there with it set to CONTINUOUS SAMPLES, and it seems to work fine.
 
I don't understand why it FAILS in CONTINUOUS mode.  I haven't seen any rules that say "You can't trust the READ to tell the truth".
 
But there it is.
Steve Bird
Message 7 of 13

So, stripping all the extraneous stuff out and using HW TIMED sampling, here's the code and the results:

Darn near perfect.

 

 

Pic 3.PNG

 

 

The SIMULATED device has also lost the weird dump-20-samples-in-at-one-time behavior, but can't keep up.

At 4000 Hz, it presents a sample every 2000 uSec (should be 250).

At 2000 Hz, it presents a sample every 2000 uSec (should be 500).

At 1000 Hz, it presents a sample every 4000 uSec (should be 1000).

At 500 Hz, it presents a sample every 4000 uSec (should be 2000).

At 250 Hz, it presents a sample every 6000 uSec (should be 4000).

At 100 Hz, it presents a sample every 12000 uSec (should be 10000).

At 10 Hz, it presents a sample every 101996 uSec (should be 100000).

 

I've no idea about this.  That's not just a matter of being unable to keep up.

 
 
Steve Bird
Message 8 of 13

Hi Steve, 

 

I am not sure if there is a specific question within your posts that you are hoping to have addressed. However, I read through the thread and wanted to see if I could clarify some things for you. 

 

As you can see, whenever you used the real devices, all of the timing and sample reading seemed to work correctly, as indicated by your last post. Most of the discrepancies seemed to appear when you were using simulated devices. Section 4 of this document may prove helpful. It discusses how timing is not going to be consistent when using simulated devices, since they do not have hardware timing, on-board buffers, etc.

 

If there are any specific questions you would like to have answered, feel free to let me know, and I will see what I can do to assist you.

 

Hopefully this is helpful.  

Best Regards,

Thomas B.
National Instruments
Applications Engineer
Message 9 of 13

@Thomas-B wrote:

As you can see, whenever you used the real devices all of the timing and sample reading seemed to work correctly

 

 

I disagree with that, although your definition of "correct" and mine may be different.

 

It's possible that NI-DAQ has evolved out from under me.

Once upon a time there were TWO choices: FINITE samples and CONTINUOUS samples.

 

Here's the basic question in a nutshell:

 

On a real or simulated device, if I use CONTINUOUS SAMPLES, then I see the AVAILABLE SAMPLES property jump up (to 10 or 20 or something on a simulated device, at least 2 on a real device), and DAQ READ will wait on this 10 or 20 or whatever.  It DOES NOT REPORT samples that should be there.
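
To put a number on it, this is the kind of polling diagnostic I mean (a nidaqmx-Python sketch with a placeholder channel name): watch AVAILABLE SAMPLES and log how big each jump is and how far apart the jumps land.

```python
# Polling diagnostic: watch the available-samples count and record the size
# of each jump and the time between jumps.
import time
import nidaqmx
from nidaqmx.constants import AcquisitionType

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("PXI1Slot2/ai0")   # placeholder name
    task.timing.cfg_samp_clk_timing(rate=1000.0,
                                    sample_mode=AcquisitionType.CONTINUOUS)
    task.start()

    jumps = []                  # (samples added, uSec since previous jump)
    prev_avail = 0
    t_prev = time.perf_counter_ns()
    t_end = t_prev + 2_000_000_000            # poll for ~2 seconds
    while time.perf_counter_ns() < t_end:
        avail = task.in_stream.avail_samp_per_chan
        if avail > prev_avail:
            t_now = time.perf_counter_ns()
            jumps.append((avail - prev_avail, (t_now - t_prev) / 1000.0))
            t_prev = t_now
            prev_avail = avail

# At 1000 Hz I'd expect jumps of 1 sample about 1000 uSec apart; what I
# actually see is jumps of 10-20 samples about 10 mSec apart.
print(jumps[:20])
```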

 

WHY?

Steve Bird
Message 10 of 13