Hey everyone, I'm trying to create an application that generates 10 ms wide current pulses in an electronic load. My program couldn't do this; the pulses it produced were roughly 66 ms too long. After a lot of optimization I halved that, but it's still far too much error.
So, as an experiment, I tried telling the load to go to 2 amps and then immediately back to 0, so I could measure the minimum possible turnaround time. Here's what I did:
I wrote a VI that initializes a VISA session with my hardware (a Kikusui PLZ1003WH electronic load, to be exact). Then, using the VISA Write function that's part of LabVIEW, I wire the string constant "ISET 2" to the "write buffer" input and the VISA output from my initialization function to the "VISA resource name" input. From there I wire the dup VISA resource name to a second VISA Write, along with the string constant "ISET 0" on its "write buffer" input.
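For anyone who doesn't read LabVIEW diagrams, the same two back-to-back writes look roughly like this in text form (a sketch using Python's pyvisa instead of LabVIEW; the `send_pulse` helper and its timing wrapper are mine, and any GPIB address you'd pass to `open_resource` is just a placeholder, not something from my actual setup):

```python
import time

def build_pulse_commands(amps):
    """Return the two command strings the VI sends: set the current, then zero it."""
    return [f"ISET {amps}", "ISET 0"]

def send_pulse(inst, amps):
    """Write both commands back to back and return the elapsed host-side time.

    `inst` can be any object with a .write(str) method. With pyvisa it would be
    something like (address is a placeholder):

        import pyvisa
        rm = pyvisa.ResourceManager()
        inst = rm.open_resource("GPIB0::1::INSTR")
    """
    start = time.perf_counter()
    for cmd in build_pulse_commands(amps):
        inst.write(cmd)
    return time.perf_counter() - start
```

The host-side timestamp only brackets the calls into the driver, of course; the 22 ms I'm seeing is measured on the scope at the load itself.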
The delay, measured on an oscilloscope, is a huge 22 ms. I've got a 233 MHz processor, so even if this program is 200 cycles long, that's a mere 858 nanoseconds. GPIB should be able to handle these 96 bits of information (plus whatever parity/error-checking overhead may be used, of course) in 64 microseconds. The cable is only five feet long, so the signal should get there in about 15.2 nanoseconds. So we're talking a mere 64 microseconds; that's all it should take. Even adding a factor of 10 for the internals of the load, it's not responding anywhere near as fast as I would expect.
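Here's my back-of-envelope budget spelled out, in case I've slipped a decimal somewhere (the 200-cycle program length and the cable propagation speed are my own assumptions, not measured values):

```python
# Back-of-envelope latency budget for the two GPIB writes.
cpu_hz = 233e6                    # 233 MHz processor
cycles = 200                      # assumed length of the program, in cycles
cpu_time = cycles / cpu_hz        # ~858 ns

gpib_time = 64e-6                 # my estimate for moving 96 bits over GPIB

cable_m = 5 * 0.3048              # five feet, in metres
v = 1e8                           # assumed signal speed in the cable (~c/3)
cable_time = cable_m / v          # ~15.2 ns

total = cpu_time + gpib_time + cable_time
print(f"CPU: {cpu_time * 1e9:.0f} ns, GPIB: {gpib_time * 1e6:.0f} us, "
      f"cable: {cable_time * 1e9:.1f} ns, total: {total * 1e6:.1f} us")
```

Even if every one of those assumptions is off by an order of magnitude, the total is still hundreds of times smaller than the 22 ms I'm measuring.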
Anybody have any idea why this is so terribly slow?