
Visa read error 1073807339

I get error 1073807339 at the VISA Read.  This program works on my laptop but not on my test computer.  I have reinstalled NI-488.2, NI-VISA, NI-Serial, and NI-MAX.  My whole program works with GPIB and USB DAQ equipment.  I just cannot get this RS232 tachometer to work.

- checked the port settings and they match

- ran NI-MAX and it works

- tried a USB-serial adapter; it works on my computer but not on the target computer

- had previously put this file on another computer and it worked

- I added the *IDN? write because NI-MAX would give me something back when I used it, but my application works on my laptop without it

 

Any help is appreciated.  

Message 1 of 12

 1073807339 basically means "no response".  

 

Since you say it works on other PCs but not this one, it's almost certainly a problem with the PC setup or the connection to the device and not your LabVIEW code.

 

Can you use other software to communicate with this device on the test machine?  Software from the manufacturer, test panels in NI-MAX, PuTTY, etc?
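
Even a few lines of Python with PyVISA make a decent sanity check outside of LabVIEW.  This is only a sketch; the ASRL resource name, baud rate, and the *IDN? command are assumptions you would swap for your device's actual settings:

    import pyvisa

    rm = pyvisa.ResourceManager()
    # "ASRL1::INSTR" is a placeholder; use the COM port NI-MAX shows for the tachometer
    inst = rm.open_resource("ASRL1::INSTR")
    inst.baud_rate = 9600           # assumed; match the device's documented settings
    inst.timeout = 2000             # ms; a read that exceeds this raises the timeout error
    inst.write_termination = "\r"   # assumed carriage-return termination
    inst.read_termination = "\r"

    # Any reply at all proves the port, cabling, and driver stack are alive.
    print(inst.query("*IDN?"))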

 

I suspect in the end you'll find a hardware problem or a driver problem rather than a LabVIEW problem, given the information you've provided.  

 

 

Message 2 of 12

The test panel in NI Max worked.

Message 3 of 12

Did you send the correct termination character for your VISA Write?

 

You don't have a termination character in the command you write, but I do see that you have all those busy property nodes set up to send a newline character on writes, and the termination character on reads is also a newline character.

 

Get rid of all those property nodes and just use the VISA Configure Serial Port VI.  Make sure the termination character is enabled and that it is the correct one (line feed: decimal 10, hex 0A, or \n in '\' Codes display; or carriage return: decimal 13, hex 0D, or \r).

 

Then include the correct termination character in the string you are writing.  Set display mode to \code mode so it is obvious it is in there, and set the display format to show the radix.
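
For comparison, here is the same idea as a sketch in Python with PyVISA (the resource name and the command are placeholders): one termination setting on the session instead of scattered property nodes.

    import pyvisa

    rm = pyvisa.ResourceManager()
    tach = rm.open_resource("ASRL1::INSTR")   # placeholder resource name

    # One place to define the termination behavior for the whole session
    tach.write_termination = "\r"   # carriage return: decimal 13, hex 0D
    tach.read_termination = "\r"    # reads complete when this character arrives
    tach.timeout = 2000             # ms

    # PyVISA appends write_termination automatically, which plays the role of
    # putting the \r at the end of the string you write in LabVIEW.
    tach.write("*IDN?")             # assumed command; use the tachometer's real query
    print(tach.read())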

 

If it worked in MAX, it is probably because you hit Enter and that sent a carriage return.  Your code is currently set up to send a line feed character.

 

 

Message 4 of 12

After analyzing the return string byte by byte, I find that NI-MAX gives me bad data on the target computer and good data on my laptop.  The data I get with the development program is below for both the test computer and the laptop.  The start character is 0D.  The byte 2 value and the 80 80 bytes are not valid for this instrument.  I have since set up a 2nd computer and get the same reading on both of the computers where I am loading the executable.

The device only sends back 10 bytes.  It has a starting character of 0D and no termination character, so I just read 1 character at a time until I match 0D, then get 9 more bytes after the match.  I did deploy this code in another program early this year on another computer and it is working, so it seems like some setup issue.  I'll have to see what runtime version is on that computer.

 

1: Read Operation, test computer (bad data)
0D808040A04BCA0701FF
0D808040A04ACA0501FF
0D808040A0EB7D0080
0D808040A0EA7D0080
0D808040A0CB550100
0D808040A0CA550100
0D808040A0CB550100
0D808040A0CA550100
0D818040A00BA70500
0D818040A00A

1: Read Operation, laptop (good data)
0D010001014D2C280000
0D020001016D00E10000
0D020001014D00E10000
0D020001016D00E10000
0D020001014D00E10000
0D020001016D00E10000
0D020001014D1C1A0100
0D020001016D1C1A0100
0D020001014D1C1A0100
0D020001016D1C1A0100
0D020001014D1C1A0100
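
For reference, the read logic I described above is roughly this, sketched in Python with PyVISA (the resource name is a placeholder; the real code is LabVIEW):

    import pyvisa

    rm = pyvisa.ResourceManager()
    tach = rm.open_resource("ASRL1::INSTR")   # placeholder resource name
    tach.read_termination = None              # the device sends no termination character
    tach.timeout = 2000                       # ms

    # Hunt one byte at a time for the 0D start character...
    while tach.read_bytes(1) != b"\x0d":
        pass

    # ...then take the remaining 9 bytes of the 10-byte packet.
    packet = b"\x0d" + tach.read_bytes(9)
    print(packet.hex().upper())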

 

 

Message 5 of 12

Something is unclear.  Is your device sending binary data or ASCII data?  In the examples you gave us, is that a literal "0" and "D" and "A" that you see in a string control set to Normal display mode?  Or is it what you see when the string display is in Hex display mode?

 

If it is true binary data, then you need to disable the termination character; otherwise it is looking for the line feed character (hex 0A) that may never come.  Or it may come in the middle of a packet of data, since it would be a perfectly valid byte to receive in binary data, in which case your VISA Read will be truncated.
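
In PyVISA terms the binary-safe version is a fixed-length read with no termination character at all; a rough sketch with a placeholder resource name:

    import pyvisa

    rm = pyvisa.ResourceManager()
    dev = rm.open_resource("ASRL1::INSTR")    # placeholder resource name

    # For binary packets: no termination character, just fixed byte counts.
    # If the hex 0A terminator were left enabled, the read would end early
    # whenever 0A happened to show up as an ordinary data byte mid-packet.
    dev.read_termination = None
    dev.timeout = 2000                        # ms
    packet = dev.read_bytes(10)               # one full 10-byte packet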

Message 6 of 12

Sorry, this was a bit messy; it is nearly 10 years old with various updates because of LabVIEW changes.  I removed all the unnecessary sections as you mentioned.  I did not need a termination character because I only need the 9 bytes after I find a 0D byte, and this tachometer does not send a termination character.  I added the device protocol in the diagram.  This file, with the VI update, works on my development computer but not on the test computer.  Since I receive bad data in NI-MAX only on the test computers, it looks like some kind of driver issue.

 

I updated the NI-MAX and NI-VISA versions to match an older computer that is working with this code.

 

  • Windows 7 Enterprise SP1
  • LabVIEW 2016 f5 Run-Time Engine (32-bit)
  • NI-MAX 18.0
  • NI-488.2 17.0
  • NI-VISA 18.0
Message 7 of 12

If you don't need a termination character, then you need to wire a False constant into the 'Enable Termination Char' input at the top of the Serial Configure VI.  Right now it is unwired, which means it defaults to True.

 

That in itself wouldn't cause bad data bytes, just incomplete messages, or ones getting out of sync.

 

Double-check your parity settings; none of those are wired in the VI, so it will use the defaults of 8 data bits, 1 stop bit, and no parity.  And like you said, look for a driver issue, or perhaps some sort of flaky cable issue.
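
As a sketch (Python with PyVISA, placeholder resource name), pinning every setting explicitly instead of trusting defaults looks like this:

    import pyvisa
    from pyvisa.constants import Parity, StopBits

    rm = pyvisa.ResourceManager()
    tach = rm.open_resource("ASRL1::INSTR")   # placeholder resource name

    # Spell out every port setting so nothing rides on a default value.
    tach.baud_rate = 9600                     # assumed; use the tachometer's documented rate
    tach.data_bits = 8
    tach.stop_bits = StopBits.one
    tach.parity = Parity.none
    tach.read_termination = None              # no termination character (your False constant)
    tach.timeout = 2000                       # ms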

Message 8 of 12

I just added the False constant and still have the issue.  I checked that the port settings are 8 data bits, 1 stop bit, no parity, and no flow control.  I am using the same device and cable and just plug it into the RS232 port.  Looking at the NI-MAX software tab, all the software is now the same as on the one computer I deployed to a while back that works.  Note that I have two new computers I am trying to deploy to, and I am having the same issue on both.

Message 9 of 12

Wow, that's a terrible protocol then!

 

The leading byte is 0x0D, which is the same as the ASCII carriage return value commonly used as a termination char.  Further, bytes 3, 7, 8, 9, and 10 carry data that could also legitimately have a 0x0D value.  So 1 of 10 characters definitely has the value 0x0D, and 5 others *might*.

 

Your program never writes any command to the device, so I assume the device must always be streaming these 10-byte packets of data?  If so, you need a more sophisticated method for establishing your *framing* around those 10 bytes.

    When you find your first 0x0D character, that character alone can't tell you whether it's byte 1, 3, 7, 8, 9, or 10.  It could be any one of them.  Your code assumes you've found byte 1, but that's not at all guaranteed.  You'll need a more sophisticated algorithm to know for sure where you are within the 10-byte packet.

 

So again, the protocol allows for ambiguity, making it terrible.
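
For what it's worth, here is the flavor of resync logic I mean, sketched in Python with PyVISA.  The resource name is a placeholder, and the validity test is only an illustration built from your good captures; substitute whatever fields and checksum the device's manual actually fixes:

    import pyvisa

    rm = pyvisa.ResourceManager()
    tach = rm.open_resource("ASRL1::INSTR")   # placeholder resource name
    tach.read_termination = None              # binary stream, no terminator
    tach.timeout = 2000                       # ms

    PACKET_LEN = 10

    def looks_like_packet(candidate: bytes) -> bool:
        # Illustrative check only: in the good captures above, bytes 3-5 are
        # always 00 01 01.  Test every field the manual fixes, plus any checksum.
        return candidate[0] == 0x0D and candidate[2:5] == b"\x00\x01\x01"

    def read_packet() -> bytes:
        # Accumulate bytes and only accept a 10-byte window that passes the
        # validity check; a lone 0x0D is not proof of a packet boundary.
        buf = bytearray()
        while True:
            buf += tach.read_bytes(1)
            while buf and buf[0] != 0x0D:     # discard bytes until a candidate start
                del buf[0]
            if len(buf) >= PACKET_LEN:
                candidate = bytes(buf[:PACKET_LEN])
                if looks_like_packet(candidate):
                    return candidate
                del buf[0]                    # false start: that 0x0D was data; resync

    print(read_packet().hex().upper())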

 

 

-Kevin P

 

Message 10 of 12