High-Speed Digitizers


Is a digitizer something for me ?

I know the question will sound daft to you, but I can't find the answer myself.

 

For your reference, I'm currently using a DScopeIII for my audio-related measurements and that's okay, were it not that its A/D is limited to 192 kS/s, while our audio D/A converter takes 24/768 for input. Filtering is done in PC software (say, the upsampling of 16/44.1 to e.g. 24/705.6), and since this is a real interpolating filter I need to see beyond the 96 kHz I can see with the DScopeIII. This is how I came to the 5922 (and yes, I know about the decreasing bit depth at higher sample rates - no problem).

 

The stream to be measured goes like this :

WAV file on hdd -> Playback software (incl. filtering) -> D/A converter (no filtering) -> Digitizer.

Simple.

 

We need to examine periodic waveforms (but also sweeps), FFTs, and preferably THD+N.

The hot sauce (for us) is examining the behavior in the audio band (which would be DC up to 44100/2 Hz), BUT mainly the HF anomalies caused by the interpolating filter. This is where even the 15 MS/s the 5922 can do plays its nice role.

All still simple, if it can work like I intend. Point is :

 

I can't see through the hoopla of record lengths and continuous measuring (2 ch), and whether that can be done WITHOUT HASSLE. As it looks to me, a digitizer isn't really made for "on-line" (and real-time) measurements, and everything merely goes through a captured file on disk. So, what I need to see is a "scope" screen like any normal FFT analyzer would show it, and with the example of a sweep you'd see things pass by as they happen.

 

I have read about the necessity of pushing the in-PC data to disk fast enough so the whole thing keeps running, but

a. this seems to take programming of some sort (which I can do but don't like to do at all);

b. it would seem to make that "streaming" solution dependent on the measurement type (which is why I'd refuse to dive into programming again and again);

c. I don't need it at all, because I see no reason to examine disk files offline when the analysis is available in real time (this may not apply to all measurement types).

 

So the short question could be: will there be a "button" somewhere that allows continuous measurement and on-screen results with close-to-instant refresh times (depending on FFT depth, of course), with the explicit note that I don't need to save the results to disk anywhere (so: show the analysis on screen and throw out the record(s), as I seem to understand it)?

 

I'm pretty sure that you may hand me a tip here or there on how to achieve that by means of examples and small adjustments. But as said, that would lead to me adjusting each measurement type - I think.

I must emphasize that I may have gotten it all wrong, and that "of course" all I want is available in the base product. But if I am not wrong, I wonder why something so obvious to me has to be so difficult: looking continuously at what's happening with my data, which is IMO what a "scope function" is for. In this regard, any answer like "yeah, but you can look at your sweep results afterwards just the same!" is not a solution once you think of the real-time (playback software) filter adjustments, which obviously need immediate feedback (tuning).

 

A related subject seems to be triggering. I am used to pressing "Play" somewhere, and everything happens without further notice. Maybe this is a one-time setting and then it's no problem, but as far as I can see this is all again related to the continuous-measurement thing. I just have test signals - no explicit triggers.

 

I realize that I sound like I'm "blaming", but please try to read it as being disappointed (so far) and mostly blaming myself for my lack of knowledge in this field; possibly I am trying to be cheap in avoiding 30K++ audio analyzers, but I'd like to know what I will get for 10-13K (incl. some software), which is still a lot of money.

And hey, a search for "audio" in here shows zero hits. So am I on the wrong track altogether?

 

Thanks for honest answers !

Peter

 

 

 

Message 1 of 13

I use the 5922 and like that card.

See it as the hardware part. It will capture the data and store it onboard. You don't have to stream it to disk - just move it into your PC's memory, analyze the data and display the result. That is the software part 😉

What is your 'real time'? On a bench: half a second to a few seconds? In process control: less than a millisecond, or a couple of ms? I guess it's more the first one 🙂 so that card could be a nice thing.


Simple preprocessing tasks can already be done in the 5922 itself (filtering, etc.), but I think that's not the point here.

 

So to really use that card you will need software.

For the loop:

capture (incl. trigger), do some (more or less complex) processing (filter, window, DFT, ...), display

you can use SignalExpress (ready-made blocks you chain together and configure), store these setups and recall them for the task.

 

But to really use the power of that card you will need some more software ;), knowledge of the driver, and usually LabVIEW (or a text-based SDK) to create the tool you need to get the job done.

Or have someone doing it for you...

 

Since the 5922 is not a low-cost device, I'm sure an NI rep would be willing to visit you 😉

Prepare some tasks (hardware setups) and see how far you can get

 

Greetings from Germany
Henrik

LV since v3.1



Message 2 of 13

Thank you Henrik,

 

What is your 'real time'? On a bench: half a second to a few seconds? In process control: less than a millisecond, or a couple of ms? I guess it's more the first one 🙂 so that card could be a nice thing.

 

I'm afraid it's neither. Think "minutes" or even longer; in any case nothing that will fit in memory, onboard or otherwise. The first example is that real-time filter adjustment, but a second example is the possible "drifting" of an oscillator causing resonance (and that really can take a minute to occur). And I know, this will all be possible by examining the captured file on disk later, but that won't suit me, already because of the extra time involved. Think of spending a whole day with your nose behind the analyser's screen - that now turning into a multiple of that.

 

Thanks ...

Peter

 

Message 3 of 13

Hey Peter,

I think Henrik's question was aimed at this comment:

Will there be a "button" somewhere that allows continuous measurement and on-screen results with close-to-instant refresh times (depending on FFT depth, of course), with the explicit note that I don't need to save the results to disk anywhere (so: show the analysis on screen and throw out the record(s), as I seem to understand it)?


When you say "close to instant" I think Henrik was wondering if you had a number you could put to that (seconds, milliseconds, etc.)  For instance, if you want to watch the action as it's happening, your eye is only going to catch roughly 15-30 changes per second (refresh rate of ~15-30Hz), which your brain blurs together into motion.  

 

So is a refresh rate of 15-30 Hz your goal?

 

Other thoughts on your above statement:

- There is no "button" anywhere on the NI digitizer, but we have a Soft Front Panel that allows you to interact in the way you'd expect from a regular oscilloscope, and it shows up on your computer screen.  This has "buttons" you can click on, but will automatically assume to refresh as quickly as it can (albeit not as quickly as a program you write can refresh).  It uses built-in functions to provide, for instance, an FFT, and you can access those same functions from a program using the driver.

 

- Be careful in your use of the word "continuous" in the context of High-Speed digitizers - especially when saying "continuous streaming" as that has a pretty specific meaning in NI-speak.

 

- There's no requirement to save results anywhere (to a file on disk or otherwise). You can bring all the results into memory, graph them, bring new results into memory, graph them, etc., each time discarding results from the previous iteration. In fact, this is how most of our examples and the Soft Front Panel are natively set up - you have to do extra work if you want to save something to a file on disk.

 

 

We can use some math to explain how quickly data can transfer back off the digitizer. I'll try to take some of the nebulousness out of it. In general the limitation is more a limitation of the bus the digitizer is on (PCI, essentially), of the program you write, and of the digitizer settings.

 

Background

I just want to make sure certain details are understood upfront.

 

So, as it looks to me, a digitizer isn't really made for "on-line" (and real-time) measurements, and things go merely through a captured file on disk.

The NI 5922 digitizer is only a piece of hardware that runs on the PCI/PXI bus - the PXI bus is an instrumentation extension of the PCI bus, but it is the same protocol, so it has the same throughput capabilities. (You may understand the following already, but I repeat it for clarity for anyone else reading this.) Data can be transferred on the PCI bus (to your host, not necessarily a file on disk) at a theoretical maximum of 133 MB/s. Since PCI is a bus shared among other instruments (unlike PCIe), and for other reasons, you realistically may only see (rough number) 90-100 MB/s maximum.
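As a quick sanity check on those numbers (using only figures already quoted here: 16-bit samples, two channels at 15 MS/s, roughly 90 MB/s of usable PCI bandwidth), the sustained rate your setup would put on the bus sits comfortably below that budget:

```c
/* Back-of-the-envelope bus check using only the numbers quoted in this
 * thread: 16-bit samples, two channels at 15 MS/s, ~90 MB/s realistic PCI. */
#include <stdio.h>

int main(void)
{
    const double sample_rate      = 15e6;  /* S/s per channel          */
    const double bytes_per_sample = 2.0;   /* 16-bit transfers         */
    const int    channels         = 2;
    const double pci_realistic    = 90e6;  /* B/s, rough realistic PCI */

    double required = sample_rate * bytes_per_sample * channels;  /* B/s */

    printf("required sustained rate: %.0f MB/s\n", required / 1e6);   /* 60 */
    printf("realistic PCI budget   : %.0f MB/s\n", pci_realistic / 1e6);
    printf("fits on the bus        : %s\n",
           required <= pci_realistic ? "yes" : "no");
    return 0;
}
```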

 

I can't see through the hoopla of record lengths and continuous measuring (2ch) were that to be WITHOUT HASSLE.

You're correct that digitizers traditionally organize data into records. There's another way of streaming absolutely everything (without breaking up sampled data sets into records - see the Fetch Forever example provided with the NI-SCOPE driver); however, I always caution people to look carefully at that use case, and I would definitely caution you. It can cause more confusion than good, and it doesn't always work for a real-time update unless you really know what you're doing. So I'll show you how to think of it in terms of records, since I think what you really care about is seeing the real-time update. Here's a way to navigate the specifications to come up with the example numbers I gave above.

 

I wonder why it must go so difficult with the for me so obvious that I want to look continuously at what's happening with my data and what IMO a "scope function" is for.

Digitizers tend to be a different class of instrument than traditional benchtop oscilloscopes. Since the design is oriented more toward automated test, we don't always assume there's a user standing in front of a screen watching the data as it's acquired. We do have a Soft Front Panel that provides some basic troubleshooting to get started and ease into the automated programming side, but where a digitizer really shines is when we abstract the decision making away to an automated program looking at the data rather than a human looking at the monitor.

 

So, I am used to press "Play" somewhere and all happens without further notice. Maybe this is a one time setting and then it's no problem, but as far as I can see this is all again related to the continuous measurement thing. So, I just have test signals - no explicit triggers.

There are capabilities to start a digitizer acquisition on an analog or digital signal, on a software trigger (software command), or on an "immediate" basis (whenever the digitizer is ready). A traditional oscilloscope (used for probing and hand-testing) will often trigger whether you realize it or not. For instance, if a traditional scope is in "run" mode, the seamlessness of "auto-triggering" may happen without the user knowing. The digitizer, unlike the traditional oscilloscope, tends to assume the user will know exactly which parameters he/she wants to configure ahead of time.

 

(to be continued...)

National Instruments
Message 4 of 13

(continued...)

 

Software

 

You can use the NI-SCOPE Soft Front Panel (it installs with the NI-SCOPE driver) to see a "real-time update". This executable program provides a similar feel (real-time update) to a benchtop oscilloscope screen, but it requires user interaction to control the settings. The Soft Front Panel makes things easier for a new user by making assumptions, but it is not optimized for something like refresh rate. If you're looking for a piece of code that does something similar, read on.

 

If we write a program with the NI-SCOPE driver that calls the Fetch function (the function used to read data off the digitizer and bring it into the PC host's memory) in a continuous loop ("while" loop), we can accomplish what you're looking for.  

 

A) You could accomplish this in LabVIEW SignalExpress, as Henrik suggested, but I would caution that there are still assumptions made in SignalExpress about how you want your code to execute (and honestly, I'm not as familiar with those assumptions, but some of them may not allow you to optimize your refresh rate).

 

B) If you write the program in LabVIEW or C, you can control the digitizer more precisely by starting with a shipping example or online example. An appropriate shipping example for you would be Configured Acquisition; you can set the "trigger type" to Immediate, meaning that the digitizer triggers as soon as it's ready.

 

An alternative way to run that same example (which is technically a little more like how a benchtop oscilloscope in "run" mode behaves) is to leave the "trigger type" at Edge (or wherever your source signal is) and set the trigger modifier to "Auto Trigger". You can speed up or slow down the refresh by putting in a signal that will actually trigger the digitizer, or (if the signal isn't present) by decreasing the "timeout" input to 100 ms or so.

 

Continuing with option B, there are several ways you can optimize your code to improve "refresh performance". One important way for your application is to modify the Configured Acquisition example to use a "producer consumer" type architecture with two loops. (I'm sure this architecture isn't currently available through the Soft Front Panel, and I'm 99% sure you won't be able to do it in SignalExpress.) There are some streaming examples that show how to do this, but streaming isn't exactly your application, so I wouldn't recommend starting with those examples - just take a look at them for reference.

 

In a "producer consumer" setup, one loop (producer) that calls the fetch function to get data from the digitizer into memory, and stores it into a queue, and the other loop (consumer) which plots the data to your screen.  The benefit of the producer/consumer architecture for you is that the time the digitizer spends acquiring a record doesn't need to influence the time it take for you to refresh the screen (read on).

 

There's a generic LabVIEW template that shows you this kind of architecture: in LabVIEW, go to File>>New... and find it there.
[Screenshot: LabVIEW's File>>New... dialog showing the producer/consumer template]

 

The "producer consumer" type of setup can save you the time spent acquiring record time I have listed below.  If you are capturing short records at 15MS/s, this will be negligible compared to the other factors influencing your refresh rate.  If you are capturing longer records (or shorter records at lower sampling rates), then this factor could slow your refresh rate (see below for example numbers.)

 

Math       (Digitizer Settings, Bus Capability, Refresh Rate)

Your refresh rate is going to be governed by:

 

1 / (total time the setup takes for each iteration [s] )  = refresh rate [Hz]

 

The "total time the setup takes each iteration [s]} is going to be the addition of a few things:

Time spent acquiring record [s] (only if record isn't yet acquired, dictated by digitizer settings) + 

Time spent waiting for transfer on PCI/PXI bus [s] (dictated by bus and other instruments sharing bus) +

Time spent by program [s] (dictated by how well your program and CPU is set up to show you the data) 

total time the setup takes for each iteration [s]

 

Record Length [S] / Sampling Rate [S/s] = Time spent acquiring record [s]

 

Take, for example, a 1MSample record length at 15MS/s:

1MS / (15MS/s) = 0.0667s or 66.7ms

(since the channels are simultaneously sampled, this doesn't change when enabling a second channel.)

 

Since each sample is 2B (16-bit) at 15MS/s, each record will occupy about 2MB of memory if one channel is acquiring, 4MB if two channels are acquiring.  I'll continue the numbers with 4MB total per record.

 

Record Length [MB] / Transfer rate [MB/s] = Time spent waiting for transfer on PCI/PXI bus [s]

 

4MB / (90MB/s) = 0.0444... or 44ms

(I use 4MB since that is the example number I used for two channels)

 

Time spent by program [s] = a benchmarked value. This is not something you compute ahead of time unless you have a deployed LabVIEW Real-Time system (not the same operating system as Windows, so not something I expect you care about). Instead of calculating it, we can benchmark it. See Knowledgebase ID 125FRRHL or the example code "Benchmark Loop Iteration Time with Tick Count Timer". There are several other timer options you can search the forums for, too.
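If you end up benchmarking in C rather than LabVIEW, the idea is the same as those tick-count examples: time many iterations of your processing step and average. A minimal sketch, where scale_record() is just a stand-in for whatever you actually do per record:

```c
/* Benchmark the "time spent by program" term: time many iterations of the
 * per-record processing and average.  scale_record() is a placeholder. */
#include <stdio.h>
#include <time.h>

#define N 1000000            /* samples per record (example value) */
#define ITERATIONS 100

static double data[N];

/* stand-in for the real per-record processing (scale, FFT, plot, ...) */
static void scale_record(double *buf, int n, double gain)
{
    for (int i = 0; i < n; i++)
        buf[i] *= gain;
}

int main(void)
{
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);

    for (int i = 0; i < ITERATIONS; i++)
        scale_record(data, N, 1.0001);

    clock_gettime(CLOCK_MONOTONIC, &t1);
    double total_ms = (t1.tv_sec - t0.tv_sec) * 1e3 +
                      (t1.tv_nsec - t0.tv_nsec) / 1e6;
    printf("average time per iteration: %.3f ms\n", total_ms / ITERATIONS);
    return 0;
}
```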

 

So let's say you write your own program and start taking these factors into account.  If you use the parameters I gave you above, and the extra time spent by program is 0ms, that means your max refresh rate would be something like:

 

1 / (66.7ms + 44ms + 0ms) = 1 / (110.7ms) = ~9 Hz

 

Remember, that means you're not scaling the data, you're not plotting anything, you're not computing an FFT.

 

Let's start adding those processes in. Say that, using the links I gave you above, you find your processing takes 10ms to scale 1M samples, 100ms to compute a 1M-sample FFT, and 10ms to plot 1M samples. Then our equation changes to:

 

1 / (66.7ms + 44ms + 120ms) = 1 / (230.7ms) = ~4 Hz

 

If we take the suggestion above (using the producer consumer loops), then we can look at the most recent completed record in one loop and perform all the scaling, FFT, and plotting functions on it, while letting the digitizer initiate, acquire, and return the data in the other loop. The refresh rate is then limited by the slower of the two loops:

 

Producer loop:

1 / (66.7ms + 44ms + 0ms) = 1 / (110.7ms) = ~9 Hz


Consumer loop:

1 / (0ms + 0ms + 120ms) = 1 / (120ms) = ~8 Hz
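For reference, the same arithmetic in a few lines of C (these are just the example numbers from this post, not driver calls):

```c
/* Refresh-rate arithmetic: 1 MS records at 15 MS/s, two 16-bit channels,
 * ~90 MB/s PCI transfer, and 120 ms of scaling + FFT + plotting. */
#include <stdio.h>

int main(void)
{
    const double record_len   = 1e6;                   /* samples         */
    const double sample_rate  = 15e6;                  /* S/s             */
    const double record_bytes = 2.0 * 2 * record_len;  /* 2 B, 2 channels */
    const double bus_rate     = 90e6;                  /* B/s             */
    const double processing_s = 0.120;                 /* scale+FFT+plot  */

    double t_acquire  = record_len / sample_rate;      /* ~66.7 ms        */
    double t_transfer = record_bytes / bus_rate;       /* ~44 ms          */

    /* single loop: acquire, transfer and process happen in series */
    double single_loop_hz = 1.0 / (t_acquire + t_transfer + processing_s);

    /* producer/consumer: limited by the slower of the two loops */
    double producer_hz = 1.0 / (t_acquire + t_transfer);
    double consumer_hz = 1.0 / processing_s;
    double split_hz    = producer_hz < consumer_hz ? producer_hz : consumer_hz;

    printf("single loop       : %.1f Hz\n", single_loop_hz);  /* ~4.3 Hz */
    printf("producer/consumer : %.1f Hz\n", split_hz);        /* ~8.3 Hz */
    return 0;
}
```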

 

So now you can see there's some balance between how fast these two loops would execute:

 

If the Producer loop executes faster (in the above example numbers, it does) then over a long period of time we'll start backlogging data unless we write in the code to discard previous records and only look at the most recent.  

 

If the Consumer loop executes faster, then it will simply wait until the producer loop gives it more data to work with.  Therefore, the refresh rate is going to be throttled by the Producer loop.

 

 

 

So hopefully that clarifies the math and benchmarking behind how this works, and the influence of record sizes, bus speed, and program setup. To circle back to your question, I think a digitizer could be what you need (particularly if the 5922 specifications look like what you need), but you're right - there are a few tricks, given above, to getting it to operate the way you'd expect it to.

 

-Andrew

National Instruments
Message 5 of 13

Andrew, thank you so much for your super-extensive reply and answers to my questions. Maybe it could be a sticky, because possibly others wonder about the same things I do?

 

Btw, what a pity that this stuff depends so much on your Producer-Consumer loop example, because I'd have the motivation + skills to get it where I want it (for sure). But the point remains, of course: that would be fine if it were my job. And since this is supposed to be a tool to perform my job ...

Well, I can always change jobs. 😛

 

If the Producer loop executes faster (in the above example numbers, it does) then over a long period of time we'll start backlogging data unless we write in the code to discard previous records and only look at the most recent.  

 

If the Consumer loop executes faster, then it will simply wait until the producer loop gives it more data to work with.  Therefore, the refresh rate is going to be throttled by the Producer loop.

 

Maybe I should praise you for your calculations, which I think are all over the forum, and which are key to some of what we want to do, of course. The above is an example of that, and it is what I called "hoopla". Not meant negatively, but as something to take into account while being unable to see the implications. What you so nicely laid out is most clear (and obvious once it's understood), and it leaves me with a conclusion, for you to judge (further) please, because it is exactly that which could make me worry:

 

- When the acquisition (production) goes too fast for the processing (consumer), it IMO can't be the solution to discard acquired data. This will be obvious to you too, but could be important for others. When we are processing a nice (periodic) sine and regularly need to cut it, possibly the only thing which remains valid (looking at the screen) is the waveform itself. The cut part of a cycle once in a while will go unnoticed, and since we just throw out already-captured data it won't be related to messed-up triggering. Still, it could be that we see a phase shift on the screen - and yes, knowing that we are showing sines, that can be corrected in software (hey, that would be too much hoopla).

What would not be valid is something like an FFT and/or derived THD etc. A THD figure will "wobble" all the time.

Do notice that a THD figure wobbling or not is a measurement in itself. The oscillator resonance example I gave is a good example of this. No need to understand this, but when THD figures are not steady, one of the key measurements (for us) is killed.

 

- If the processing (consumer) is faster than the acquisition (producer), there's hardly a problem, unless we find the refresh rate not being "even" to be a problem. As I see it at this moment, this will only be somewhat disturbing, like looking at a refresh 10 times per second while actually one in 10 is skipped, and it isn't always evenly the tenth, but the 10th, 9th, 8th, 8th, 7th, 6th, 5th, 5th, 4th, ...

This by itself can be solved by means of an additional "thread" which makes the refreshing even. Ehm, hoopla perhaps? Not when it's your job - then it's just a nice challenge.

 

The above is how it is in my mind, but possibly it is not as bad as - or worse than - I described it.

 

In the same realm, I can't judge how consistently the "data stream" can be sustained. I read about circular buffers and all (and I sure understand that), but what needs to happen here looks much the same as the first part of your quoted text above - when one sample is missing, things go odd, and it looks to be up to me to organize that correctly. And mind you, Andrew, this is not going against your good outline, but derived from the many general outlines which must be there for a reason. Where it all applies (and which undoubtedly springs from my (changing) needs) - "we" can't tell.

So indeed, all you can do is explain to the best of your efforts, anticipating more or less explicit questions.

 

Andrew, now please try to remember that I'm telling you that I may look like I'm nagging, which I explicitly am not ...

 

What I tried to tell very indirectly in my OP is that the DScopeIII must work very much (if not 100%) the same;

The only difference would be that it is USB connected, but that really won't make the difference. So, we have a hardware device at the back end and the analysis is done in PC software. True, it samples at 192 kHz only, and going higher may limit refresh rates and all, but I am fine with that. The analysis is as "dot-net" as it is here, and I don't see much speed/efficiency difference, although there might be some (I don't know that, so I regard it as the same). The difference comes from the pre-cooked "normal scope" analysis, and I never need to worry about how to synchronize the available onboard memory with what the analysis software can do with it (say, your producer/consumer loops).

And so what it seems to come down to is that we seem to have a mighty nice device for analysis - only the software hasn't been made in full. I can do that myself all right, but why. Is it to cut the price ? Is it because nobody sees that it could have been a better product if only the software had been made ? Is it because a "digitizer" as such is for other objectives ? Let's stick to the latter then. But still.

 

 

Having said this, let's hop over to jitter analysis. $3000 and we can do that. But can we ?

Since I don't see a requirement for digital input, I must assume that analog input-only can do the job already. Just put in my sine of any frequency and cough up all the jitter specs (as described for the module). Hey, this is allowed to be analyzed offline ! haha.

Here's the scenario again :

 

WAV file on disk -> Software audio player -> D/A converter -> Analog-out to 5922 -> Jitter analysis.

(specs on the phase noise of the 5922 seem sufficient to me)

 

On a side note, and hopefully interesting for you, the D/A converter should show 200fs of RMS jitter inherently. This will be lower than the 5922 is able to show, but I'm sure this doesn't matter. Why ? Because the noise implied by the PC (so, I'm talking about the PCI version for sure) deteriorates that 200fs. That's nice, but now I want to see that. Do notice that the playback software is able to influence the in-DAC jitter (DAC is outboard). The software plays in the PC.

We might have a little problem, because the digitizer is also in a PC and as far as I can tell it will already influence its own jitter (specs). This may not be a real problem, as long as I can see the influence from the playback software. It is totally audible anyway, so whatever the effect is, it must be in the ns range (by my estimation).

 

Any remarks on this ?

 

When you tell me now that the 5922 (8MB) can just capture my D/A's analogue output and perform all the analysis as described for the module in a reliable fashion, I'll get myself a 5922 and report back about the FFT stuff later (read: this jitter is just another subject and I would be satisfied having this working alone).

Please note that (lack of knowledge) I don't see the relation between the WAV file's sample rate (44100), what the software makes of that and outputs towards the DAC (705600), and the sampling speed of the digitizer. I guess though that for the latter the question is "if it's not high enough, how do I obtain ps-range jitter specs", and that you might even tell me to obtain another digitizer with a higher sampling speed and lower bit depth.

 

It is all not *that* easy for someone like me !

haha

Peter

 

PS: Might it help, the PC we plan to use for this is an i7 hexacore (12 threads) @ 4GHz (2GHz per thread) and general latency is 80us (max 120us).

 

Message 6 of 13

Hi Peter,

 

And so what it seems to come down to is that we seem to have a mighty nice device for analysis - only the software hasn't been made in full. I can do that myself all right, but why. Is it to cut the price ? Is it because nobody sees that it could have been a better product if only the software had been made ? Is it because a "digitizer" as such is for other objectives ?

 

Some of these questions would be best asked of your sales representative, but I can try to give an overview in a personal message. It's a matter of philosophy, and of where and how NI believes it adds value to the industry and to customers. It's not something that's easy to express in a forum post.

 

- When the acquisition (production) goes too fast for the processing (consumer), it IMO can't be the solution to discard acquired data.


If that is the requirement for your application (I'm not sure it is), then you'll need to make sure the digitizer always acquires data at a rate slightly slower than what your system can graphically plot/display to you.  You can do this rather simply:

1. If triggers aren't arriving, and you want the digitizer to return data (the Auto Trigger is Enabled), then you can control the rate at which your Producer loop provides you data by changing the timeout value in the Fetch function.  A larger timeout value means the device will not Auto Trigger as frequently, so the Producer loop will iterate less quickly, and less data will come through.

2. If triggers are arriving and you don't want to throttle how many times your digitizer triggers, then you can use the Trigger Holdoff attribute.

 

Despite the above, I want to make sure you understand my recommendation about discarding data. I understood that your application requirement is to have a steady on-screen refresh of whatever data is currently at the input of your digitizer. To that end, you should really only care about the most recent record that your digitizer finishes. A discarded record would be a record that shows something that was present at the input of your digitizer several milliseconds (or tens or hundreds of milliseconds) in the past. The most current data is the data you would still show.

Let me put it another way - if you have two records available to plot, because your Consumer loop is running slower than your digitizer Producer loop, which record do you want to actually plot? The older record, or the most recent record? I'm suggesting discarding the oldest record to maintain the real-time update you expressed interest in.

 

- If the processing (consumer) is faster than the acquisition (producer), there's hardly a problem, unless we find the refresh rate not being "even" to be a problem.

You may find some unevenness, but at the rates I gave (based on my previous calculations of about 8-9 Hz) I doubt your eye would find it troubling. If it is absolutely important to keep an even refresh rate, you can use a Timed Loop in LabVIEW, or if you aren't using LabVIEW you can introduce a software-adjustable delay from one cycle to the next.
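If you end up in C instead of LabVIEW, that software-adjustable delay amounts to sleeping until an absolute deadline each cycle. A rough sketch, where display_latest_record() is a hypothetical placeholder for your plotting step:

```c
/* Fixed-period display loop: wake on a 100 ms schedule (10 Hz refresh)
 * regardless of how long the plotting takes, provided it fits in the period. */
#include <stdio.h>
#include <time.h>

static void display_latest_record(int frame)
{
    printf("frame %d\n", frame);   /* placeholder: plot the newest record */
}

int main(void)
{
    const long period_ns = 100L * 1000 * 1000;   /* 100 ms -> 10 Hz refresh */
    struct timespec next;
    clock_gettime(CLOCK_MONOTONIC, &next);

    for (int frame = 0; frame < 50; frame++) {
        display_latest_record(frame);

        /* advance the deadline by exactly one period, then sleep until it */
        next.tv_nsec += period_ns;
        if (next.tv_nsec >= 1000000000L) {
            next.tv_sec  += 1;
            next.tv_nsec -= 1000000000L;
        }
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
    }
    return 0;
}
```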

 

As you require a more and more deterministic (regular) update on your screen, you need to look into more complex implementations. For instance, LabVIEW Real-Time is the most complex implementation, as it accounts for every processor clock cycle when compiling the code you write. LabVIEW Real-Time deploys to a target, however, so there is some extra learning involved in getting that to work if you're used to running Windows.

 

What I tried to tell very indirectly in my OP is that the DScopeIII must work very much (if not 100%) the same;

Correct me if I'm wrong, but you're talking about this dScope Series III (not IIIE/A/A+) product?

 

- If you're looking for an experience where everything runs right out of the box with some basic measurements, that is what the Soft Front Panel aims to address.

- If you're looking for something more configurable, which requires some programming, then LabVIEW or C programming with the NI-SCOPE driver is what I'd recommend.
- The SignalExpress option is something between those two (limited programming, but limited options for optimization). I'm not sure if it is the kind of experience you're looking for based on that dScope instrument.

 

Since I don't see a requirement for digital input, I must assume that analog input-only can do the job already.

I'm not sure I fully understand this.  There are both digital and analog inputs to the 5922.

 

Just put in my sine of any frequency and cough up all the jitter specs (as described for the module).

There isn't anything in the NI-SCOPE driver software that provides jitter measurements (for instance, Rj, Dj, eye diagrams, etc.). All of the NI-SCOPE measurement functions happen in software (so they affect your loop speeds).

 

There is a separate Jitter Analysis Toolkit that is available (it is an add-on for LabVIEW).  Depending on what exactly you need for Jitter Analysis, you may be able to get suitable results using LabVIEW functions and not the Jitter Analysis Toolkit.  I'd suggest talking with your sales representative if you want to go into more depth on which functions need or do not need that toolkit.

 

On a side note, and hopefully interesting for you, the D/A converter should show 200fs of RMS jitter inherently. This will be lower than the 5922 is able to show, but I'm sure this doesn't matter. Why ? Because the noise implied by the PC (so, I'm talking about the PCI version for sure) deteriorates that 200fs. That's nice, but now I want to see that. Do notice that the playback software is able to influence the in-DAC jitter (DAC is outboard).

If I understand you correctly, then I agree, even though the digitizer itself has a higher integrated jitter specification (3 ps between 100 Hz and 1 MHz) than that 200 fs. Initially, with the DAC set at "no extra introduced jitter", you will see some baseline of jitter which will be the combination of all the elements. As you increase the introduced jitter on the DAC, you will see the measured jitter rise. There may be some "threshold" of minimum jitter you need to introduce in the DAC in order to see a noticeable effect on your output results.

 

Is that the sort of experience you're expecting?

 

Hopefully that answers your questions. In general, it sounds like your application could benefit from talking with your account manager at NI to arrange a sales visit. They may be able to help translate your system requirements into "NI speak" and, with the local applications engineers, figure out if what you're trying to do is feasible. You can call 1-866-ASK-MYNI for more details.

 

-Andrew

National Instruments
Message 7 of 13

Andrew, again thanks for your huge efforts.

Maybe for a community which is not especially your target, I will try to emphasize a few things which seem not to have come across clearly, possibly because I imply too much as "known facts" or otherwise have not been clear enough to begin with. There's also a small language problem of course, so apologies for that.

 

When we look at THD figures, in layman's terms they come from judging periodic waves (signals) over a certain length of time. For example, when we'd judge an 18 kHz sine BUT sampled at 44100 Hz, this is not a sine at all. It can't be, because of too few samples. However, when we look at this wave over the longer term, it resembles that sine; the analyser will recognize it's 18 kHz but will next see that it is heavily distorted - which obviously springs from just over 2 samples per cycle being available.

 

When we "reconstruct" this 18 KHz sine ("reconstruction" may be an audio term) it is upsampled and brickwall filtered and some more, and if all is right a now nice sine is there. This is done by means of virtually repositioning the sampling points, and with e.g. 16 times upsampling more sample points are available for that now.

The better this "filtering" the better the THD figure will be.

 

Ah, okay, we knew that.

 

When I'd loop such a signal from software and observe it for a minute, and the base signal (audio file) would last e.g. 10 seconds, I'd have to connect the last sample to the first sample properly. When not, I would see a "THD spike" at the connection point. Thus, when not properly connected, the first sample of the next 10 second run will imply the sine to be off and harmonic distortion will be the result. It happens at two adjacent samples only, but I'm not allowed to miss it. Why ? because otherwise I can't see that my connection is (made) wrong(ly).
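To make this "connection point" concrete with made-up numbers (a 1 kHz tone and a 65536-sample playback buffer at 44.1 kHz - illustration values only, not my actual test material), a few lines of C show the step that appears at the seam when one pass does not contain a whole number of cycles:

```c
/* When one loop/buffer pass does not hold a whole number of cycles, the first
 * sample of the next pass makes a step in the waveform, which the analyzer
 * sees as a burst of harmonics (a THD spike) once per pass.  The tone and
 * buffer length below are made-up illustration values. */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

int main(void)
{
    const double fs = 44100.0;   /* sample rate                      */
    const double f  = 1000.0;    /* test tone                        */
    const int    n  = 65536;     /* samples in one loop/buffer pass  */

    double cycles     = f * n / fs;                   /* cycles per pass          */
    double last_value = sin(2.0 * M_PI * f * (n - 1) / fs);
    double next_value = sin(0.0);                     /* pass restarts at phase 0 */
    double cont_value = sin(2.0 * M_PI * f * n / fs); /* a truly continuous sine  */

    printf("cycles per pass        : %.4f\n", cycles);
    printf("step at the seam       : %+.4f\n", next_value - last_value);
    printf("step if sine continued : %+.4f\n", cont_value - last_value);
    return 0;
}
```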

 

FYI: While the above is a virtual example, in audio this "connecting" happens all the time. This is just about the (circular etc.) buffers you are familiar with, and they need to connect right.

 

The above hopefully explains by a most simple example how real time judgement is necessary, including a fast enough refresh rate of the figures or graph, in order to see things not being OK. And of course, this can be watched offline just the same, but note that this can only be about "real time" audio playing, and while such connection points may happen once every few minutes, it will need playing for those minutes *and* doing so again for just as long when doing it offline. I have no time for that, hence the real-time need. This, apart from me being behind the knobs and changing things, which needs immediate insight into the result.

 

Assuming these "connection points" are clear, there can be no doubt that the analyser is not allowed to cut out parts, because that would imply a similar situation: a THD spike. And this every few tenths of a second.

It should also be clear now that it is quite useless to see only the most recent 10 seconds (as per your suggested possible solution); I'd miss my own (poorly programmed) anomalies.

 

Peter

 

 

Message 8 of 13

There is a separate Jitter Analysis Toolkit that is available (it is an add-on for LabVIEW).

 

Per my referring to the $3000 price, I thought you'd recognize I was talking about just that. That this is within LabVIEW I didn't know. OK.

 

>>Since I don't see a requirement for digital input, I must assume that analog input-only can do the job already.

>I'm not sure I fully understand this.  There are both digital and analog inputs to the 5922.

 

What I meant was that jitter analysis most often requires digital data for input. I did not see this as a requirement for the Jitter Analysis Toolkit, so I have to assume that analog input can do the job here.

Rephrased: I NEED that analog input, because it is about the analog output of the D/A converter. Plus, there is no digital input to the D/A converter other than the output of the playback software (btw, through USB).

 

To summarize: I'd like to do this jitter analysis with test signals in audio files, played by my software. Why ?

Because the software influences in-DAC jitter.

 

Initially, with the DAC set at "no extra introduced jitter", you will see some baseline of jitter which will be the combination of all the elements. As you increase the introduced jitter on the DAC, you will see the measured jitter rise. There may be some "threshold" of minimum jitter you need to introduce in the DAC in order to see a noticeable effect on your output results.

 

Is that the sort of experience you're expecting?

 

Exactly. And it would work, by my expectation that the software-induced jitter is much higher than we'd like. If that doesn't exceed that threshold then I'm out of luck. But then I expect to be able to see something, somewhere, after a year of puzzling. 😐

 

What remains is the answer to the question whether this can be achieved with playing files through my software. When this can be done I'll get myself a 5922 today, although it may need some further investigation on LabVIEW and the necessity of the Jitter Analysis Toolkit.

 

Thanks,

Peter

Message 9 of 13

Hi Peter,

 

Thanks for the background on up-sampling your 18kHz signal - that's interesting - I've heard of that sort of technique being used elsewhere, but not in the audio realm.

 

When I'd loop such a signal from software and observe it for a minute, and the base signal (audio file) would last e.g. 10 seconds...

So do you mean that you must acquire from your digitizer for 10s at 15MS/s continuously, without any gaps?  

 

Previously I didn't understand that.  I used an example record length of 1MS, which gave the result of time spent acquiring = 66.7ms.  If we use a time spent acquiring value of 10s, our record length is much longer:

 

Record Length [S] / Sampling Rate [S/s] = Time spent acquiring record [s]

Record Length [S] = Time spent acquiring record [s] * Sampling Rate [S/s] (algebra)

150MS = 10s * 15MS/s

 

You can see on p. 19 of the 5922 specifications that only up to 64 MS per channel is supported on the highest-memory 5922 option (the 256 MB/ch version). If we convert a 64 MS record length back to time spent acquiring, this means you can only acquire at 15 MS/s for 4.27 seconds (which is less than half of 10 s).

 

If you are able to lower your sampling rate to 10 MS/s or 5 MS/s on this 18 kHz tone, and you still use the maximum 64 MS available on a 256 MB/ch 5922 digitizer, then you will trade samples per cycle for a longer acquisition time. This trade-off follows the same math, so (for example) sampling at 10 MS/s instead of 15 MS/s is about 67% of the speed and provides about 33% fewer samples per cycle of the 18 kHz waveform, but you will be able to acquire for longer than 4.27 seconds (6.4 seconds).
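The same arithmetic in a couple of lines, using only the figures already quoted (64 MS per channel of onboard memory on the 256 MB/ch option):

```c
/* Gap-free acquisition time from onboard memory alone: 64 MS per channel
 * (256 MB/ch option) divided by the sampling rate. */
#include <stdio.h>

int main(void)
{
    const double max_record_samples = 64e6;          /* per channel */
    const double rates[] = { 15e6, 10e6, 5e6 };      /* S/s         */

    for (int i = 0; i < 3; i++)
        printf("%4.0f MS/s -> %5.2f s gap-free\n",
               rates[i] / 1e6, max_record_samples / rates[i]);

    /* and the record length that 10 s at 15 MS/s would actually need */
    printf("10 s at 15 MS/s needs %.0f MS per channel\n", 15e6 * 10.0 / 1e6);
    return 0;
}
```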

 

If you do need to exceed what is available in the onboard memory of the digitizer (maximum 64MS/channel) then in this configuration you need to use the Fetch While Acquiring example that I mentioned earlier.  Before talking about that, can you confirm whether you need a continuous stream of recorded data (no gaps) for 10s at 15MS/s?

 

I'd have to connect the last sample to the first sample properly. When not, I would see a "THD spike" at the connection point....I'm not allowed to miss it. Why ? because otherwise I can't see that my connection is (made) wrong(ly)....The above hopefully explains by a most simple example how real time judgement is necessary, including a fast enough refresh rate of the figures or graph, in order to see things not being OK.

 

Whatever you decide about how to configure your digitizer for acquiring the signal, you can transfer it (in chunks) to your host computer using the producer-consumer loops I mentioned before. You can then record it and re-connect the waveforms in software (from the last sample to the first sample of an audio playback, as you described).

 

On your host computer, if you are playing back 10s of data sampled at 15MS/s, and each sample is 2B, you will be post-processing hundreds of MB of data, which your host computer (not the digitizer) needs to be capable of handling.

 

Resolution [B/S] *Sample Rate [S/s] * Acquisition time [s] = Memory used [B]

 

2B/S * 15MS/s * 10s = 300MB

 

For two channels, this is 600MB.

 

Will you be trying to process this data all at once with LabVIEW and the Jitter Analysis Toolkit?  Or will you be looking at sections of the data at a time?

 

A 64-bit computer may be what you need to process this amount of data on your host quickly enough to provide a quick refresh/update. If you are trying to process large amounts of data at once, I'd recommend benchmarking this. You can download the JAT with LabVIEW for a 30-day trial and run some benchmarks on example data to find out how quickly the data is processed.

 

What I meant was that jitter analysis most often requires digital data for input. I did not see this as a requirement for the Jitter Analysis Toolkit, so I have to assume that analog input can do the job here.


I understand. No, the Jitter Analysis Toolkit does not require inputting digital waveforms to draw conclusions. Some sorts of jitter analysis (such as eye diagrams) might not make any sense unless your data is a digital waveform, but that doesn't mean you couldn't technically run the analysis on a non-digital (analog) waveform. To the JAT, it all comes in as the same kind of data (waveform data, see below).

 

To summarize: I'd like to do this jitter analysis with test signals in audio files, played by my software. Why ?

Because the software influences in-DAC jitter....What remains is the answer to the question whether this can be achieved with playing files through my software.

The Jitter Analysis Toolkit is a software package that runs stand-alone and separate from the 5922 digitizer altogether.  In other words, you could purchase a digitizer, OR you could purchase the JAT, OR you could purchase both.

 

The JAT natively accepts the waveform data type.  You can use the NI-SCOPE driver with any NI digitizer (including the NI 5922) to return this data type (simply use this specific version of the fetch function).

 

There are several waveform data type functions available for ease of manipulation in LabVIEW. If you need to convert from some other data type (e.g. you are using C to fetch data from the digitizer, where the C functions return an array) for analysis in LabVIEW (e.g. with the Jitter Analysis Toolkit), you can convert an array of data into a waveform data type. Since you can construct the waveform data type from any array of samples, this means you can also work with non-NI digitizers and still use the JAT.

 

Does this help explain?

National Instruments
Message 10 of 13