RF Measurement Devices


5660 - How does span determine phase noise?

I am writing a driver to make the 5660 function as a spectrum analyzer. In one of the manuals I found this statement:

span specifies the expected bandwidth of the RF input signal. You can specify a span value between 0 and 20 MHz. Span values affect phase noise and down converter tuning step size, as shown below:
Span setting    Phase noise    Tuning step size
<= 10 MHz       best           5 MHz
> 10 MHz        good           1 MHz

Default Value: 20 MHz

Note: The NI 5600 RF down converter module hardware always down converts a 20 MHz bandwidth. Software span settings are used to determine optimal phase noise/tuning step size combinations.
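The way I read that table in code (just my own sketch to check my understanding, not anything taken from the driver):

    # My reading of the manual's span table (a sketch, not driver code).
    def settings_for_span(span_hz):
        """Return (phase noise rating, tuning step in Hz) for a requested span."""
        if not 0 < span_hz <= 20e6:
            raise ValueError("span must be between 0 and 20 MHz")
        if span_hz <= 10e6:
            return "best", 5e6     # best phase noise, 5 MHz tuning step
        return "good", 1e6         # good phase noise, 1 MHz tuning step

    print(settings_for_span(5e6))    # ('best', 5000000.0)
    print(settings_for_span(15e6))   # ('good', 1000000.0)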


I am wondering how this happens and whether I should write my driver to take spans less than 10 MHz. Is the phase noise better with smaller spans? For example, to cover a span of 64 MHz, is it better to take seven 10 MHz spans or eight 8 MHz spans?

Terrill
Message 1 of 5
The PXI-5600 RF Downconverter module of the PXI-5660 RF Signal Analyzer does indeed always downconvert a 20 MHz band of frequencies from a desired RF to the IF output of 5-25 MHz. So if you request a center frequency of 100 MHz, the frequency span of 90-110 MHz will be downconverted to 5-25 MHz at the output of the PXI-5600.
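In other words (an idealized sketch that ignores spectral inversion and filter roll-off in the actual hardware):

    # Idealized sketch of the fixed 20 MHz downconversion described above.
    IF_BAND_HZ = (5e6, 25e6)     # the PXI-5600 IF output band
    BANDWIDTH_HZ = 20e6          # always downconverted, regardless of span

    def downconverted_rf_band(center_freq_hz):
        """Return the RF band that lands in the 5-25 MHz IF output."""
        return (center_freq_hz - BANDWIDTH_HZ / 2,
                center_freq_hz + BANDWIDTH_HZ / 2)

    # A 100 MHz center frequency brings 90-110 MHz down to 5-25 MHz:
    print(downconverted_rf_band(100e6), "->", IF_BAND_HZ)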

The span input has two functions. The first is to help set HW parameters, and the second is to actually filter the received data in software. In hardware, when the span is <= 10 MHz, the PXI-5600 is configured for the best possible phase noise performance and, as a result, tunes in 5 MHz increments. When the span is > 10 MHz, the PXI-5600 will have slightly reduced phase noise performance in order to tune in 1 MHz increments. In both cases, the data coming out of the PXI-5600 will still be 20 MHz wide and will be filtered to the requested span in software.
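Put loosely in code, the two roles of the span input look something like this (the nearest-step rounding is only my interpretation of what "tunes in N MHz increments" works out to; the real driver handles the details internally):

    # Loose sketch of the two roles of span.  The nearest-step rounding is
    # an interpretation of "tunes in N MHz increments", not the actual
    # driver algorithm.
    def tuned_center(requested_center_hz, span_hz):
        step_hz = 5e6 if span_hz <= 10e6 else 1e6       # from the span table
        return round(requested_center_hz / step_hz) * step_hz, step_hz

    center_hz, step_hz = tuned_center(97.3e6, span_hz=8e6)
    print(center_hz / 1e6, step_hz / 1e6)               # 95.0 MHz, 5.0 MHz

    # The hardware still delivers 20 MHz around the tuned center (85-105 MHz
    # here); software then keeps only the requested span around the
    # requested center, i.e. 93.3-101.3 MHz for an 8 MHz span at 97.3 MHz.
    print(97.3e6 - 4e6, 97.3e6 + 4e6)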
Message 2 of 5
"In hardware, when the span is <= 10 MHz, the PXI-5600 is configured for the best possible phase noise performance and, as a result, tunes in 5 MHz increments. When the span is > 10 MHz, the PXI-5600 will have slightly reduced phase noise performance in order to tune in 1 MHz increments."

That confused me. It tunes in larger increments for a smaller span? Is this a function of the software controlling the device? I am basically rewriting the multi-span example in segmented driver pieces. I think I need to segment a large span into segments that are 10 MHz or smaller to get the better phase noise. Can I force it to use the better phase noise settings on all spans? What does "tunes in xx MHz increments" mean?
Message 3 of 5
Why are you rewriting the multi-span example? NI-RFSA 1.5 installs a new set of LabVIEW driver VIs for the PXI-5660. They add multi-span capability automatically, such that requesting a 100 MHz span, for example, results in five 20 MHz chunks being acquired and concatenated for you. In addition, programming the PXI-5660 is far easier with these VIs in general.
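Conceptually, the segmentation works something like this (a simplified sketch of the idea, not the NI-RFSA implementation):

    # Simplified sketch of multi-span segmentation: cover a wide span with
    # 20 MHz chunks that the PXI-5600 acquires one at a time.
    import math

    CHUNK_HZ = 20e6

    def chunk_centers(center_hz, span_hz):
        """Return the center frequency of each 20 MHz chunk covering the span."""
        n_chunks = math.ceil(span_hz / CHUNK_HZ)
        start_hz = center_hz - span_hz / 2
        return [start_hz + CHUNK_HZ / 2 + i * CHUNK_HZ for i in range(n_chunks)]

    # A 100 MHz span centered at 1 GHz -> five chunks, 20 MHz apart:
    print([f / 1e6 for f in chunk_centers(1e9, 100e6)])
    # [960.0, 980.0, 1000.0, 1020.0, 1040.0]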

The VIs are installed into the Instrument I/O->Instrument Drivers sub-palette in LabVIEW.

As for the HW, yes - requesting a smaller span of frequencies (<= 10 MHz) causes the PXI-5600 to tune in larger increments (5 MHz). This is not a function of software; the software is written as a function of the HW. If you want to force the best phase noise, go into the code for the ni5660 Configure for Spectrum VI (NI-RFSA 1.5), find the code inside the FALSE case of the case structure (which handles multi-span), locate the niTuner Set Freq VI, and wire in an array of spans whose length equals the length of the frequency array and whose elements are all <= 10 MHz.
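As a text analogy to that wiring change (the real edit is made graphically in the VI; this is only to show the shape of the data):

    # The array wired into the niTuner Set Freq VI: one span value per chunk
    # center frequency, all <= 10 MHz, which forces the 5 MHz step / best
    # phase noise mode for every chunk.
    chunk_centers_hz = [960e6, 980e6, 1000e6, 1020e6, 1040e6]   # example values
    forced_spans_hz = [10e6] * len(chunk_centers_hz)            # same length, all <= 10 MHz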

This is not recommended and any problems incurred as a result of changing the default behavior of the instrument will not be supported.

You can install NI-RFSA 1.5 from here:
http://digital.ni.com/softlib.nsf/websearch/C1A9A160FA5AD89B86256ED8005E0761?opendocument&node=132060_US
Message 4 of 5
Andy,

To answer your question on why I am rewriting the code: it is because the supplied code is junk. An obvious example is that the NI 5660 driver VIs use a receiver info input, but the trigger VIs use an instrument handle input. The measurement code is just example code. My sales guy put me in touch with developers in Austin who have said that example code is exactly that, an example, and is not meant to be used as final code. A case in point is the models supplied with TestStand. I am assuming this is also the case with the code for this instrument, as I look at it and see plenty of bad coding practices. I have never had a problem getting support for instrument code that I have written. We also use a class structure with attributes, so all the information the code stores is not visible to anyone updating my driver. The NI code is sloppy and not properly documented. You would think NI could write good LabVIEW code.

Terrill
Message 5 of 5