08-24-2009 10:14 AM
I was referring to the indicators labelled Start and End. They show little difference in the absolute time value on the front panel display. I guess that is correct, since the absolute time read anywhere at a given point should be the same; I think that is why the front panel displays nearly equal values.
Anyway, coming to the main problem: now that it is clear that the Keithley VI takes about 140 ms, what can be done? I had a similar issue with an Agilent 34907A, and in one of those threads I was told that the "PLC" value, which is a measure of integration time, can be lowered to make the response faster.
Is there any such provision, or any other setting, in the Keithley? People who are familiar with the Keithley SourceMeter, please help in this regard.
In the context help of the Read VI I get the following message:
"Using this VI will overwrite all trigger settings with settings for a single point taken immediately.
For control over triggering, use the low-level API."
So what is this API, and will using it improve the speed?
The driver I am using is the one available at this link:
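The 2400 does provide such a setting: the integration time is specified in power-line cycles (NPLC), from 0.01 up to 10, much like the "PLC" value on the Agilent. Below is a minimal sketch of what that looks like at the SCPI level, using PyVISA purely for illustration (the actual work in this thread is done with the LabVIEW driver); the GPIB address is an assumption and the exact command tree should be confirmed in the 2400 manual.

# Minimal sketch: lowering the integration time on a Keithley 2400.
# Assumes PyVISA and an instrument at GPIB address 24; command names are
# from memory of the 2400 SCPI set and should be verified in the manual.
import pyvisa

rm = pyvisa.ResourceManager()
smu = rm.open_resource("GPIB0::24::INSTR")

smu.write("*RST")                     # start from a known default state
smu.write(':SENS:FUNC "CURR"')        # measure current
smu.write(":SENS:CURR:NPLC 0.01")     # integration time in power-line cycles (0.01 to 10)
smu.write(":SOUR:FUNC VOLT")          # source voltage
smu.write(":SOUR:VOLT 0.0")
smu.write(":OUTP ON")

print(smu.query(":READ?"))            # one immediate reading

The equivalent setting in the LabVIEW driver should simply be the PLC (or NPLC) input on the configure-measurement VI.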
08-25-2009 09:12 AM
I managed to reduce the measurement time by reducing the integration time, i.e. by changing the value of the PLC parameter. By default it was 1; I changed it to 0.01, which is the lowest possible value.
Now the time required for a reading is only 60 ms.
But can this be reduced further? Will lower-level programming, or programming with the API commands, help reduce the measurement time?
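One way to see how much of that 60 ms is integration time and how much is bus and command overhead is to time the full round trip on the PC side. A rough sketch, again using raw SCPI over PyVISA only for illustration, and assuming the source is already configured and the output is on:

# Rough timing sketch: average round-trip time per reading at a given NPLC.
# Each query includes the GPIB write, the measurement itself and the read-back,
# so the result will not drop below the bus/driver overhead even at NPLC 0.01.
import time
import pyvisa

rm = pyvisa.ResourceManager()
smu = rm.open_resource("GPIB0::24::INSTR")   # assumed GPIB address

smu.write(":SENS:CURR:NPLC 0.01")            # lowest integration time on the 2400

n = 50
t0 = time.perf_counter()
for _ in range(n):
    smu.query(":READ?")                      # one triggered reading per query
t1 = time.perf_counter()

print("average time per reading: %.1f ms" % ((t1 - t0) / n * 1000.0))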
08-26-2009 08:43 AM
I tried 0.1 PLC and the average time for a single point was about 100 ms. However, I also tried 0.01 PLC, which is the lowest on the Keithley 2400, but the average time is still 100 ms; it does not come down any lower than that. I am using a GPIB cable.
Why is the speed not improving, and how can it be improved?
Also, please answer the previous post as well.
08-31-2009 08:04 AM
1. I contacted the local Keithley support regarding the problem I mentioned above (the low sampling rate).
They said the timer used in LabVIEW cannot be relied upon, as the computer clock may not be fast enough.
They have asked me to use the timestamp from the Keithley SourceMeter instead.
Is it true that the LabVIEW timer depends on the computer clock? Even so, I would guess it should not be off by hundreds of milliseconds.
2. Secondly, the support engineer suggested that I store the data in the Keithley instrument itself and then transfer it to the computer at a later stage.
3. If that is the case, how do I do this from LabVIEW?
Has anybody done this before?
Help in this regard is appreciated.
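On the timestamp suggestion: the 2400 can return its own timestamp with every reading, so each point is stamped by the instrument rather than by the PC. A hedged sketch of the SCPI side of that (command names are from memory and should be confirmed in the 2400 manual; the LabVIEW driver may expose the same choice as a reading-elements or timestamp option):

# Sketch: ask the 2400 to include its internal timestamp with each reading,
# so the time comes from the instrument, not from the LabVIEW/PC clock.
# Command names are approximate; verify them against the 2400 manual.
import pyvisa

rm = pyvisa.ResourceManager()
smu = rm.open_resource("GPIB0::24::INSTR")    # assumed GPIB address

smu.write(":FORM:ELEM VOLT,CURR,TIME")        # return voltage, current and timestamp
smu.write(":SYST:TIME:RES")                   # restart the timestamp at zero (if supported)

raw = smu.query(":READ?")                     # e.g. "+0.000000E+00,+1.234567E-03,+0.0123"
volt, curr, tstamp = (float(x) for x in raw.split(","))
print(volt, curr, tstamp)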
08-31-2009 10:35 PM
What are the corresponding VIs in the Keithley driver to store data in the buffer and retrieve it later?
09-01-2009 09:56 PM
Hello Siva,
The timer in LabVIEW would not be the best option to use as a timestamp for when the measurement is taken. That timestamp should reflect when the measurement is taken (from the SourceMeter), not when the measurement is read (from LabVIEW). But it is not the reason for the low sampling rate of your measurement. The low sampling rate was most likely caused by the GPIB data bus: communicating with the instrument requires a query and a read operation for every point, and each query and read takes some time for the instrument to complete. Hence your limited samples per second.
The second suggestion is a workaround for that limitation of the Keithley instrument and the data bus (GPIB) it uses. If the Keithley SourceMeter has instructions to take, say, 1000 samples per second on its own and to return multiple data points in one transfer, then you can use that mechanism: the Keithley takes multiple points and saves them in its internal memory (buffer), and LabVIEW only needs to do a query and read once every second, or once every 100 ms.
To do so in LabVIEW, you would need to know your instrument and what functionality it supports. I don't know whether the Keithley documentation describes an instruction set for this; you may need to refer to your user manual to find out. I have done source-meter measurements using a PXI-4130, which can go up to a 3 kS/s data rate over the PXI data bus.
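The 2400 does have such a mechanism: its internal reading buffer (the TRACe subsystem, up to 2500 readings) combined with a trigger count, so the instrument takes a burst of points on its own and the controller fetches the whole block in one transfer. Below is a rough sketch of the SCPI sequence, using PyVISA only for illustration and with command names to be double-checked in the 2400 reference manual; the buffer VIs in the LabVIEW driver should map onto the same commands. Source setup and :OUTP ON are assumed to have been done already.

# Sketch: let the 2400 take a burst of readings into its internal buffer and
# read them back in a single GPIB transfer, instead of one query per point.
# Command names are approximate; verify them against the 2400 reference manual.
import pyvisa

N = 100                                       # points per burst

rm = pyvisa.ResourceManager()
smu = rm.open_resource("GPIB0::24::INSTR")    # assumed GPIB address
smu.timeout = 10000                           # ms; the burst takes a while to finish

smu.write(":TRIG:COUN %d" % N)                # take N triggered readings per :INIT
smu.write(":TRAC:CLE")                        # clear the reading buffer
smu.write(":TRAC:POIN %d" % N)                # buffer size for this burst
smu.write(":TRAC:FEED SENS")                  # feed raw readings into the buffer
smu.write(":TRAC:FEED:CONT NEXT")             # fill the buffer until it is full

smu.write(":INIT")                            # start the burst
smu.query("*OPC?")                            # block until the burst has finished

data = smu.query(":TRAC:DATA?")               # one large comma-separated transfer
readings = [float(x) for x in data.split(",")]
print(len(readings), "values read back")

The same pattern should also help with the 100 ms-per-point ceiling seen earlier, since the per-point GPIB round trip disappears and only one transfer per burst remains.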
I hope that this helps.
James
09-01-2009 11:54 PM
I am currently using PC-based data acquisition over a GPIB cable, so what PC-based system are you suggesting?
By API programming I was referring to the SCPI commands specific to the instrument. I thought that using the standard driver, which has many layers, might slow down the communication.
09-02-2009 09:38 AM
siva0182 wrote: I am currently using PC-based data acquisition over a GPIB cable, so what PC-based system are you suggesting?
I'm suggesting that if your instruments were PC based (e.g. an NI DAQ card), you would get much greater performance.
siva0182 wrote: By API programming I was referring to the SCPI commands specific to the instrument. I thought that using the standard driver, which has many layers, might slow down the communication.
There are not many layers, and the overhead of calling a subVI is something you can barely measure.