Hallo Lynn,
thanks for your immediate reply.
My earlier description of the algorithm running on the chip may not have been exact, so let me add some detail:
Since the signal of interest is continuously changing, the chip is also continuously sampling, converting, and sending digital data back to the PC.
The LabVIEW program is expected to behave like an oscilloscope, except that at the beginning it sends the 'convert' command typed by the user to the chip, which starts the sampling, conversion, and data transmission.
Each sampled analog value results in a 10-bit digital value, contained in 2 bytes after conversion in the chip.
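To illustrate how a 10-bit value might sit inside the two bytes, here is a minimal Python sketch. It assumes the chip sends the high byte first, with the top 2 bits of the sample in the low bits of that byte; the actual byte order and bit layout depend on your chip's datasheet.

```python
def combine_bytes(high: int, low: int) -> int:
    """Combine a two-byte pair into one 10-bit sample.

    Assumes big-endian order: 'high' carries the upper 2 bits,
    'low' carries the lower 8 bits. Check the chip's datasheet.
    """
    return ((high & 0x03) << 8) | (low & 0xFF)

# Example: high byte 0x02, low byte 0xA5 -> 0b10_10100101 = 677
print(combine_bytes(0x02, 0xA5))  # -> 677
```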
From your suggestion, my understanding is that to get continuous results the 'convert' command has to be sent repeatedly. Is it possible to send the command only once at the beginning, and then have the incoming stream of such two-byte results correctly split, converted into decimal floating-point values, and displayed and saved?
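In case it helps to state the intent in pseudocode: the host-side logic I have in mind looks roughly like the sketch below (in Python rather than LabVIEW, just to describe the data flow). The byte order, the reference voltage `VREF`, and the function names here are all assumptions for illustration, not your actual setup.

```python
VREF = 5.0  # assumed ADC reference voltage; adjust to the real hardware

def parse_samples(stream: bytes) -> list[float]:
    """Split a raw byte stream into 2-byte pairs (high byte first, assumed)
    and scale each 10-bit value to a voltage between 0 and VREF."""
    samples = []
    for i in range(0, len(stream) - 1, 2):
        raw = ((stream[i] & 0x03) << 8) | stream[i + 1]
        samples.append(raw * VREF / 1023)
    return samples

# Example: bytes 0x03 0xFF -> 1023 -> 5.0 V, bytes 0x00 0x00 -> 0 -> 0.0 V
print(parse_samples(bytes([0x03, 0xFF, 0x00, 0x00])))  # -> [5.0, 0.0]
```

In LabVIEW terms, this would correspond to a single VISA Write of 'convert' before the loop, then a loop of VISA Reads feeding the same split-and-scale logic for display and logging.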
win2s
Message Edited by win2suse on 04-20-2007 08:48 AM