LabVIEW

Writing with VISA while reading.

Solved!

I made the necessary changes in my original VI. The only one that was not applied was replacing 'Bytes at Port' with the proper VISA configuration. A selector is used instead of the case structure, and the graph now just checks whether the length of the answer is under 2; if it is, the value is not printed. This way, even a zero from the module will be printed, because its answer will have a greater length.
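To spell out that filter as a quick pseudocode sketch (the names here are stand-ins for the VI's wires, not anything from the actual code):

    if len(answer) >= 2:   # skip empty or truncated reads
        plot(answer)       # 'plot' stands in for the graph indicator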

Message 11 of 26


Hey DDragos,

 

from just looking at your code (I don't have a VISA instrument here), I have another idea about where your issue could come from. Your "Untitled 3 (SubVI)" writes the command to the instrument and then instantly checks how many bytes there are in the answer. I assume that your instrument needs some time to process the command and send the answer afterwards. So maybe your "Bytes at Port" sees 0 bytes, the "first" read operation won't read anything, but the next operation will...

 

Insert a delay between the write and the read and check if this helps.

 

There is also another trick to avoid "wrong" values: flush the read buffer before writing to the instrument. To do so, check how many "Bytes at Port" there are, read them, and simply ignore them. Then do your write operation.
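If a textual example helps, a minimal sketch of that flush trick with Python's pyvisa could look like this (the resource name and command string are placeholders, not from your VI):

    import pyvisa

    rm = pyvisa.ResourceManager()
    inst = rm.open_resource("ASRL1::INSTR")   # placeholder resource name

    # Check how many bytes are waiting, read them, and simply ignore them,
    # so stale replies cannot be mistaken for the answer to the next command.
    if inst.bytes_in_buffer > 0:
        inst.read_bytes(inst.bytes_in_buffer)

    inst.write("#01")   # placeholder command; only the fresh reply remains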


Ingo – LabVIEW 2013, 2014, 2015, 2016, 2017, 2018, NXG 2.0, 2.1, 3.0
CLADMSD
Message 13 of 26

Why are you ignoring crossrulz's warning not to use Bytes at Port, but instead to use a termination character (if your VISA device supports this, i.e. if it sends data as a variable-length string terminated by a "termination character", often \n)?  If your device does send a termination character, then failing to use it, and relying on Bytes at Port for an unreliable indication of when the message has been completely sent, will give you errors!
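For the record, here is roughly what the termination-character approach looks like as a Python/pyvisa sketch (assuming a \n terminator; the resource name and command are placeholders):

    import pyvisa

    rm = pyvisa.ResourceManager()
    inst = rm.open_resource("ASRL1::INSTR")   # placeholder resource name

    # With a termination character configured, VISA itself decides when a
    # message is complete -- no Bytes at Port, no guessing.
    inst.read_termination = "\n"
    inst.write_termination = "\n"

    inst.write("READ?")   # placeholder command
    reply = inst.read()   # returns one complete message, terminator stripped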

 

So if you insist on introducing known errors into your code, please don't ask us how to "fix" them.  At least try the "correct" way and if it doesn't fix your problem, show us the "code that fails", describe the VISA device (so we can check its documentation), and we'll all learn something.

 

Bob Schor

Message 14 of 26

Oops, you are absolutely right. I had become blind to this shortcoming, probably because I somehow expected it was already tackled after I read through the thread again, or because the last device I wrote a driver for did not send termination characters (annoying, indeed). Obviously I should not post here after a long day. But all that is no excuse for my previous post. Unfortunately, I cannot edit/remove the wrong section.


Ingo – LabVIEW 2013, 2014, 2015, 2016, 2017, 2018, NXG 2.0, 2.1, 3.0
CLADMSD
Message 15 of 26

@ikaiser wrote:

Insert a delay between the write and the read and check if this helps.


Shame on an NI employee for giving that kind of advice!!!  You are just delaying the issue instead of doing it correctly.  The correct method is to use the communication protocol.  In this case, it appears there is a termination character, so we should use that.


GCentral
There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
Message 16 of 26

@crossrulz wrote:

@ikaiser

You are just delaying the issue instead of doing it correctly.  The correct method is to use the communication protocol.  In this case, it appears there is a termination character, so we should use that.


Actually, I should face-palm myself. I gave you a kudos for posting "DO NOT USE THE BYTES AT PORT!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!" and then posted a similar hint that does not solve this issue.

 

But yes, you are right: the ADAM-4000 uses a carriage return as the termination character; at least, I found a manual stating so (page 4-2).
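In a pyvisa sketch, that configuration would amount to the following (the resource name is a placeholder, and "#01" is just an example ADAM-style read command; check your module address against the manual):

    import pyvisa

    inst = pyvisa.ResourceManager().open_resource("ASRL1::INSTR")
    inst.read_termination = "\r"    # carriage return, per the ADAM-4000 manual
    inst.write_termination = "\r"
    reply = inst.query("#01")       # query = write + terminated read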


Ingo – LabVIEW 2013, 2014, 2015, 2016, 2017, 2018, NXG 2.0, 2.1, 3.0
CLADMSD
Message 17 of 26

@crossrulz wrote:

@ikaiser wrote:

Insert a delay between the write and the read and check if this helps.


Shame on an NI employee for giving that kind of advice!!!  You are just delaying the issue instead of doing it correctly.  The correct method is to use the communication protocol.  In this case, it appears there is a termination character, so we should use that.


Are you talking about ikaiser? I don't see any indication that he's an NI employee.

Message 18 of 26

"blue" users are NI employees.

Paolo
-------------------
LV 7.1, 2011, 2017, 2019, 2021
Message 19 of 26
Solution
Accepted by topic author DDragos

@DDragos wrote:

No change, the system reacts the same way, with the error (-1073807253) starting from the READ block of VISA. Also, when I try to send the 'set' command, everything breaks and stops working.

 

Also, it looks like everything works fine when I am using Highlight Execution. Maybe a delay is necessary somewhere, or something like that. I implemented a small delay of 35 ms in my original VI so the system has time to read the data before sending the command again.


That particular error code indicates a framing error, meaning that your serial port received a byte from your device that wasn't formatted the way it expected, for example it was sent at a baud rate or with a parity setting different than your port was configured for. Could you have your port misconfigured? If the defaults used by the Configure Port function aren't correct for your application, you need to wire the correct ones to it.
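As a point of comparison, explicitly configuring those settings in a Python/pyvisa sketch would look like this (the values are illustrative only; use the ones from your device's documentation):

    import pyvisa
    from pyvisa import constants

    rm = pyvisa.ResourceManager()
    inst = rm.open_resource("ASRL1::INSTR")   # placeholder resource name

    # Each of these must match the instrument exactly; a mismatch
    # (especially baud rate or parity) is a classic cause of framing errors.
    inst.baud_rate = 9600
    inst.data_bits = 8
    inst.parity = constants.Parity.none
    inst.stop_bits = constants.StopBits.one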

 

The reason everything breaks after one error is because you're feeding the error out into the shift register rather than handling it, so it just wraps around to the left side of the loop and gets passed into the next iteration of the loop. Once any subVI in the loop encounters an error, it'll be endlessly recycled through the loop with no way to recover. At a minimum, use the error as a condition to end the loop.
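A rough textual analog of "use the error as a condition to end the loop" (sketch only; 'inst' is assumed configured as above, and "#01" is a placeholder command):

    import pyvisa

    while True:
        try:
            reply = inst.query("#01")
            print(reply)
        except pyvisa.errors.VisaIOError as err:
            print("Communication error, stopping:", err)
            break   # end the loop instead of recycling the error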

 

As has been said before, if your "set" command elicits a reply from your device but you don't read the reply or otherwise clear it out of the receive buffer, then it will still be there the next time you try to read the temperature from your device, and that's what you'll read out when you're trying to read temperature. You should be reading everything your device sends, even if it's just an acknowledgement. And as has been said repeatedly before, if your device sends a termination character, use it, and don't rely on delays or "bytes at port" to tell you when the response has been received.
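Putting those rules together, a query helper in the same hypothetical pyvisa sketch might look like this (both command strings are placeholders):

    def send_command(inst, cmd):
        # Write a command and always consume the device's reply -- even a
        # bare acknowledgement -- so nothing is left in the receive buffer.
        inst.write(cmd)
        return inst.read()   # relies on the configured termination character

    ack = send_command(inst, "$012")   # placeholder 'set'-style command
    temp = send_command(inst, "#01")   # placeholder 'read' command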

Message 20 of 26