04-04-2023 01:03 AM - edited 04-04-2023 01:23 AM
@Jasper16 wrote:
Funny you mentioned this. I modified the code a bit to write to COM1 and put it in a producer loop. Then I created a second while loop (consumer) and made the 'read' operation to read from COM2. The data is then displayed properly. I have virtual COM ports where COM1 is paired to COM2.
If that is in response to the use of Bytes at Serial Port then consider this:
- Your computer has an Intel 12th-generation 16-core x64 CPU clocked at 3 GHz, with a serial port integrated in the motherboard chipset and connected through PCIe.
- Your external device likely has an ARM Cortex-M CPU with one or maybe two cores clocked at a few hundred MHz, and its UART may be attached through a "slow" I2C interface. That is if it is a recently designed device; it could also be an 8085 clocked at 8 MHz or an 8-bit Atmel CPU. In the time such a device needs to detect that there is a single character at its serial port, your computer can send an email, stream a video in the background, control an inverted pendulum on the side, and read the Bytes at Serial Port property before your device has even noticed the character. And the device still needs to read the entire message, process it, produce a response, and send that back before Bytes at Serial Port can see anything at all.
Now tell me why you think your computer is a good benchmark for testing whether serial port communication with your device will work when you rely on Bytes at Serial Port!
Yes, you could insert a delay between the Write and Bytes at Serial Port to make it work, and then come back and complain that reading from your device is terribly slow. The device may normally need 50 ms to respond, but for some commands it may take longer, and sometimes it may be in an internal recalibration cycle and unable to respond within the normal 50 ms. So you need to find the worst case through trial and error and use that as your delay.
Or you enable termination character detection, forget Bytes at Serial Port completely, and always receive the response as soon as it arrives.
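LabVIEW block diagrams can't be pasted here, so here is a minimal Python sketch of the two approaches under discussion. `SimulatedDevice` is a hypothetical in-memory stand-in for a slow instrument (not a real VISA or serial API); the point is only to show why a fixed delay must be tuned to the worst case, while a termination-character read returns as soon as the terminated reply is in.

```python
import time

class SimulatedDevice:
    """Hypothetical in-memory stand-in for a slow serial device.
    The reply only becomes visible after response_delay_s, mimicking a
    microcontroller that needs time to parse the command and answer."""
    def __init__(self, response: bytes, response_delay_s: float):
        self.response = response
        self.response_delay_s = response_delay_s
        self._sent_at = None

    def write(self, command: bytes) -> None:
        self._sent_at = time.monotonic()  # device starts working on its reply

    def bytes_at_port(self) -> int:
        # Nothing is visible until the device has finished responding.
        if self._sent_at is None or time.monotonic() - self._sent_at < self.response_delay_s:
            return 0
        return len(self.response)

    def read(self, n: int) -> bytes:
        data, self.response = self.response[:n], self.response[n:]
        return data

def read_with_fixed_delay(dev, worst_case_s):
    """The fragile pattern: sleep a guessed worst-case delay, then trust
    whatever Bytes at Port happens to report at that moment."""
    time.sleep(worst_case_s)
    return dev.read(dev.bytes_at_port())

def read_until_terminator(dev, terminator=b"\n", timeout_s=10.0):
    """The robust pattern: accumulate bytes until the termination
    character arrives (or the timeout expires)."""
    deadline = time.monotonic() + timeout_s
    buf = bytearray()
    while time.monotonic() < deadline:
        n = dev.bytes_at_port()
        if n:
            buf += dev.read(n)
            if terminator in buf:
                return bytes(buf)  # returns as soon as the reply is complete
        time.sleep(0.001)
    raise TimeoutError("no terminated response within timeout")
```

With a device that answers in 50 ms, `read_until_terminator` returns after roughly 50 ms regardless of how generous the timeout is, while `read_with_fixed_delay` always burns the full worst-case delay you guessed.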
04-04-2023 07:25 AM
@rolfk wrote:
Or you enable termination character detection and can forget Bytes at Serial Port completely and always receive the response as soon as it arrived.
Assuming your timeout is long enough. The default is 10 seconds. I would very rarely recommend anything less than 1 second.
The rest of your post is pure gold as an explanation of why Bytes At Port is 98.5% evil. I repeat: Bytes At Port has only one legitimate purpose, which is to see whether a message has at least started to come in. Bytes At Port should only be wired to the VISA Read in extremely limited situations. I think I have run into a situation where it was actually required exactly once, and I do A LOT of instrument communications.
04-05-2023 01:04 AM - edited 04-05-2023 01:06 AM
The timeout of course needs to be long enough, but so far 10 seconds has almost always been enough. Still, even if you make it 1000 s, the Read will still return as soon as it receives the termination character, so usually in 100 ms or less. The only problem with long timeouts arises if the device can get disconnected or hang itself up. Then your program sits there for the duration of the timeout, waiting for a response that will never come. But that can't be solved in a good way with Bytes at Serial Port either.
And it’s never a good idea to try to fix broken hardware in software!
04-05-2023 06:39 AM
@rolfk wrote:
And it’s never a good idea to try to fix broken hardware in software!
Yet, I'm asked to do that all the time. It typically does not turn out well (code turns into kluge on top of kluge).
04-05-2023 09:32 AM
@crossrulz wrote:
@rolfk wrote:
And it’s never a good idea to try to fix broken hardware in software!
Yet, I'm asked to do that all the time. It typically does not turn out well (code turns into kluge on top of kluge).
I try to tell them they need to solve the root cause, not treat the symptom, or it will get ugly. And it always does.
04-05-2023 09:49 AM
@billko wrote:
@crossrulz wrote:
@rolfk wrote:
And it’s never a good idea to try to fix broken hardware in software!
Yet, I'm asked to do that all the time. It typically does not turn out well (code turns into kluge on top of kluge).
I try to tell them they need to solve the root cause, not treat the symptom, or it will get ugly. And it always does.
It's like trying to build a house on an unstable foundation. You can invest millions in a super sturdy construction, but if the foundation eventually sinks into the ground anyway, your whole investment is wasted.