
fast communication using visa / serial

Mark,

 

That kind of sounds like what I'm used to doing so far. I'll do some more research on the baud rate and timeouts, though. Thanks.

 

Everyone in this thread,

 

Is it possible that some devices do not send a terminating character? I tried using my normal port sniffer to watch the traffic from the device, and I don't see anything common at the end of its communications. The manual for this device doesn't mention a terminating character. It does say that, depending on which commands I send, it will send back either ASCII characters or binary, and I can already read both kinds of data fine. I'm just inexperienced with this terminating-character stuff that's supposed to help speed up my reads.

 

but now my device's battery is dead so I'm done until it charges back up.

 

thanks everyone!

Message 11 of 25

Yes, it is possible that the device is not sending a termination character. The reading method I suggested above is for when I not only don't know how much data will be coming back, but also don't know whether it will be terminated with a newline or some other termination character. It is a general-purpose read, and I leave it to the application to decode the response. So I use a raw read (no termination character).
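As a rough text-form sketch of that raw-read approach (the thread is about LabVIEW block diagrams, so this is just an illustration in Python; `read_chunk` is a hypothetical injected callable, e.g. pyserial's `Serial.read` with a short timeout configured, which returns an empty result when the device goes quiet):

```python
def read_raw_response(read_chunk, chunk_size=4096):
    """Accumulate raw bytes until a read returns nothing (timeout).

    `read_chunk(n)` is any callable returning up to n bytes, or b''
    once the device has stopped talking. No termination character is
    assumed; the caller decodes the response afterward.
    """
    response = bytearray()
    while True:
        chunk = read_chunk(chunk_size)
        if not chunk:          # timeout with no new data: response is complete
            break
        response.extend(chunk)
    return bytes(response)
```

The trade-off is that every read costs at least one timeout period at the end, which is why the earlier posts suggest keeping that timeout short.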



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 12 of 25

Oh, OK... so if the equipment doesn't send a terminating character, then there is no reason to use those termination-character settings, because they wouldn't do anything?

thanks,

Message 13 of 25

Jesse,

 

My experience has been that devices that send binary don't use termination characters.  Is it possible that you can expect a certain number of bytes to be returned based on the command you sent?

Message 14 of 25

 


@Wayne.C wrote:

Jesse,

 

My experience has been that devices that send binary don't use termination characters.  Is it possible that you can expect a certain number of bytes to be returned based on the command you sent?


 

This depends on how well behaved the device is. The printers that I work with can send back some very long responses and provide no idea how much data will be returned. In addition, sometimes the data is binary with no termination character, and sometimes it is a long ASCII dump with hundreds of lines (total number unknown), each with a line feed on the end.

If you know what the data will look like, then read accordingly. If you don't, the approach suggested above works quite nicely.



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

Message 15 of 25

Yes, my device's manual does give an exact count of the bytes that will come back when it sends binary. So I'll just tell it to read that many bytes whenever it knows to expect binary.
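A minimal sketch of that fixed-count style of read (Python illustration with a hypothetical `read_chunk` callable, since a single serial read may return fewer bytes than requested and must be looped until the documented count arrives):

```python
def read_exact(read_chunk, count):
    """Read exactly `count` bytes from a byte source.

    `read_chunk(n)` returns up to n bytes (b'' on timeout). Looping is
    needed because one read call may come back short of the full frame.
    """
    buf = bytearray()
    while len(buf) < count:
        chunk = read_chunk(count - len(buf))
        if not chunk:
            raise TimeoutError(
                f"expected {count} bytes, only received {len(buf)}")
        buf.extend(chunk)
    return bytes(buf)
```

Because the expected count is known up front, this returns as soon as the last byte arrives, with no end-of-response timeout penalty like the raw-read approach has.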

thanks!!

Message 16 of 25

Great discussion!

 

There are some good takeaways (or points of historical interest):

 

1) Electronically transmitted, binary-encoded serial data has been around for a long, long time (thank you, Samuel Morse!), but some methods are more "advanced" than others.

 

2) The "information age" has placed a premium on the exchange of data. Ergo, as information-exchange efficiency grows, the COST of transferring data decreases and the return on data investment grows exponentially (this is "the proof" of Moore's Law).

 

3) There are "obsolete" data transfer mechanisms. By obsolete I mean ones that were developed and commonly practised by the "cutting edge" corporations of their time. <history note ON> Way back in the 1940s, AT&T had a call volume that was approaching the limit of what telephony switch circuitry could do (x switches per second). The "switches" of the day were vacuum-tube triodes and mechanical relays, limited by electron transit time and by mechanical action plus inductive delay, respectively. The first transistor (a germanium point-contact device) was produced in 1947. You probably couldn't find one of these outside of a "you-see-em," as they too have been obsoleted by technology that is faster, more reliable, and cheaper to produce.

 

4) Ben mentioned a serial data stream in "spew mode" (elegantly put, and for those of us who have ever held a baby with digestive tract "issues," a very apt analogy lead-in). A baby vomiting is analogous to a mid-'50s serial protocol: spew, baby, spew; I might catch some of it. But wouldn't it be preferable to have structured exchanges? (Analogy: OK baby, puke over here when I'm ready.)

 

The seven-layer protocol model was designed less by intention than by necessity! <history OFF>

 

 


"Should be" isn't "Is" -Jay
Message 17 of 25

@Jeff Bohrer wrote:

Great discussion!

 

...

 

The seven-layer protocol model was designed less by intention than by necessity! <history OFF>

 

 


You date yourself with that ref. When was the last time you saw a true implementation of the 7-layer model? In my case it was with DEC circa 1980 in the VMS environment (but then again, DEC helped write the 7-layer model). There was a write-up in the back of the 2004 NI catalog about Foundation Fieldbus that referenced the 7-layer model, but even that was a hybrid.

 

Ben

Retired Senior Automation Systems Architect with Data Science Automation, LabVIEW Champion, Knight of NI
Message 18 of 25

You can try the "Detect Break Event.vi" example to learn how to use VISA serial events. Another useful one is here:

http://digital.ni.com/public.nsf/allkb/E393205297CFF1B386256DFA00738F4A

However, in the second case I am almost positive that VISA Enable Event should go before the loop, and VISA Discard Events and VISA Disable Event should go after the loop. Since the event configuration never changes, there is no reason to destroy and recreate it every cycle. It will still work as written, I think, but it won't be very efficient.

 

The VISA Close with the inscription "This is required" does seem to be in the right place, as stated.

 

Do not forget to check Bytes at Port on the property node. Place it after VISA Wait on Event and before VISA Read.
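The restructuring described above (enable once before the loop, tear down once after) can be sketched in text form. Since this is really about wiring on a LabVIEW block diagram, the Python below uses hypothetical counting stand-ins for the VISA event VIs just to show the intended call pattern:

```python
# Hypothetical stand-ins for the VISA event VIs; the counters let us
# verify the call pattern without any serial hardware attached.
calls = {"enable": 0, "wait": 0, "discard": 0, "disable": 0}

def visa_enable_event():
    calls["enable"] += 1

def visa_wait_on_event():
    calls["wait"] += 1

def visa_discard_events():
    calls["discard"] += 1

def visa_disable_event():
    calls["disable"] += 1

def acquisition(cycles):
    visa_enable_event()        # configure the event ONCE, before the loop
    for _ in range(cycles):
        visa_wait_on_event()   # per cycle: wait for the event, then read
    visa_discard_events()      # clean up ONCE, after the loop exits
    visa_disable_event()
```

Enabling and disabling inside the loop would still work, as the post says, but would pay the setup/teardown cost on every iteration for no benefit.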

 

Message 19 of 25

Hi Ben,

I think my situation is similar to yours. My device continuously sends data frames to my VI, but I don't know how to set the byte count in VISA Read to match the data frame. For instance, I want to get the data out of each frame; the first byte is 113 and the ending byte is 114, as I attach below.
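One way to handle a continuous stream like this, sketched in Python rather than LabVIEW and assuming the marker bytes 113 and 114 never occur inside the payload itself, is to read raw bytes into a rolling buffer and then pull out every complete frame between the markers:

```python
def extract_frames(buffer, start=113, end=114):
    """Split complete frames out of a raw byte buffer.

    Returns (frames, leftover): `frames` is a list of payloads found
    between a start byte (113) and the next end byte (114); `leftover`
    is any trailing partial frame to prepend to the next read.
    """
    frames = []
    while True:
        try:
            s = buffer.index(start)         # first start marker
            e = buffer.index(end, s + 1)    # matching end marker after it
        except ValueError:
            break                           # no complete frame remains
        frames.append(bytes(buffer[s + 1:e]))
        buffer = buffer[e + 1:]             # continue past this frame
    return frames, bytes(buffer)
```

The same idea maps onto the block diagram: read whatever bytes are at the port, append them to a shift-register string, and scan it for marker pairs, keeping any incomplete tail for the next iteration.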

Give me some advice, Ben.

Regards,

JoJa

 

Message 20 of 25