Hi, guys! (and girls 😉 !!!)
Ok, I have a test system and I want to apply a calibration. I store the losses for the 3 signal generators in a text file, for a range of frequencies which I can set (example: 0 MHz to 1000 MHz in 100 MHz steps).
It works with no problem as long as I run my test with the same frequency settings, but not when I run with different ones (for example, 125 MHz). It clearly makes no sense to do the calibration for every possible frequency. How can I interpolate the value, telling LabVIEW to look for the nearest calibration points (both upper and lower) and interpolate between them?
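For anyone reading along, here is a minimal sketch of the math involved, written in Python for clarity (the calibration table values below are made up for illustration; the real ones would come from the text file):

```python
import numpy as np

# Hypothetical calibration table: frequency (MHz) -> path loss (dB),
# as it might be read from the calibration text file.
freqs_mhz = np.arange(0, 1001, 100)  # 0, 100, ..., 1000 MHz
losses_db = np.array([0.5, 0.6, 0.8, 1.0, 1.1, 1.3, 1.4, 1.6, 1.7, 1.9, 2.0])

def loss_at(freq_mhz: float) -> float:
    """Linearly interpolate the loss between the nearest lower and
    upper calibration points for a frequency inside the table range."""
    return float(np.interp(freq_mhz, freqs_mhz, losses_db))

# 125 MHz falls between the 100 MHz (0.6 dB) and 200 MHz (0.8 dB)
# points, so the interpolated loss is 0.6 + 0.25 * (0.8 - 0.6) = 0.65 dB.
print(loss_at(125.0))
```

In LabVIEW itself, the usual way to do this (if I remember the palette correctly) is Threshold 1D Array to find the fractional index of the requested frequency in the frequency array, wired into Interpolate 1D Array on the loss array, which gives the same linearly interpolated value.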
Thanks in advance!