09-29-2020 07:09 AM
I have a LabVIEW program. In this program we first send an instruction code to a microcontroller CPU to make it output an analog value. The instruction code contains a digital value equivalent to the output we want. For example, there are 5 output values: 10 V, 0 V, -10 V, 4 mA and 20 mA. I send the instruction code to the microcontroller CPU using VISA Read, and then read the output values with a calibrator using GPIB Send and Receive. GPIB Send in LabVIEW requests the last isolated measurement reading from the microcontroller, and Receive returns the value. The values are read in groups, i.e. 10 readings of one value, then 10 readings of the next value, and so on.

The problem is that the first set of readings is always 0. If I try to read the 10 V value first, that whole set comes back as 0, and from the next set onwards I get proper values. If I instead output 0 V first (and by 0 V I mean I should get something like 0.12), that set is always 0 and the next sets are correct.

Can anyone help me with this? What could be the problem? Why does it behave like this? My supervisor doesn't know LabVIEW. I don't know the hardware that well, and since it is the same program everywhere, I am thinking it's a hardware problem and not software, but I need to clarify that before approaching him. Kindly do help me. Thank you.
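For reference, the sequence I follow in LabVIEW is roughly equivalent to this rough Python/PyVISA sketch. It is not my actual code (my code is a LabVIEW block diagram); the VISA resource names, instruction codes, settling delay and query command are all placeholders just to show the order of operations:

```python
# Rough sketch of my measurement sequence using PyVISA.
# Resource names, instruction codes and query strings are placeholders.
import time
import pyvisa

rm = pyvisa.ResourceManager()

# Session to the microcontroller CPU (placeholder resource name)
mcu = rm.open_resource("ASRL1::INSTR")
# GPIB session to the instrument that returns the last isolated
# measurement reading (placeholder address)
cal = rm.open_resource("GPIB0::22::INSTR")

# Placeholder instruction codes, one per requested output value
setpoints = {
    "10V":  "SET 10.0",
    "0V":   "SET 0.0",
    "-10V": "SET -10.0",
    "4mA":  "SET 0.004",
    "20mA": "SET 0.020",
}

for name, code in setpoints.items():
    mcu.write(code)      # send the instruction code for this output value
    time.sleep(0.5)      # guessed settling time before measuring

    readings = []
    for _ in range(10):  # 10 readings per output value
        # request the last isolated measurement reading (placeholder command)
        readings.append(float(cal.query("FETCH?")))
    print(name, readings)  # in my setup the first set is always all 0
```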
09-29-2020 07:56 AM
There are way too many pieces here with no details. If you really want help, you need to post your code: both the LabVIEW code and the microcontroller code. Additional information like model numbers, wiring, etc. would also help us help you.