LabVIEW


Switching DTR signal in LabVIEW

Hello everybody. First of all, let me say this forum is an excellent aid for all LabVIEW users, and I have learned a lot here thanks to all of you.

I am designing software to capture data from a sensor network. The communication is done over a serial port with VISA. Each sensor is identified by a letter (A, B, C or D). When the user decides to receive data from a single sensor, he presses the switch button and LabVIEW 7.0 sends the letter that identifies the sensor. My problem is switching the DTR signal. I need this pulse to stay at a high logic level for exactly the time it takes to send the identification letter. The transmission speed is 19,200 baud (i.e. the DTR high pulse should last approximately 417 microseconds), and so far I have only managed to get this pulse to last 2 milliseconds minimum. This results in a delay of 1.58 milliseconds between the sending of each 2 bytes of data from a particular sensor. As you can see, this greatly reduces the data captured, since I lose three quarters of the time waiting for the DTR signal to go low.
So far I have switched DTR using the VISA property node (Line Modem Settings), but I can't reduce this time to run my application faster and collect the data in an appropriate way.
Any recommendation about all this? Is there any other way to control the change in the DTR state? Is it a computer problem?

Any response or suggestions will be welcome. Thank you for your help in advance.
Using: LabVIEW 7.0 Express 😄
Message 1 of 4

It's not really clear what you are trying to do, or more importantly why you are trying to do it.  It sounds like you already know how to use the property node of the serial port to assert the DTR line.  Correct?  It almost sounds like you want to use hardware flow control.  But hardware flow control just uses the DTR line (and the others) to control flow depending on the status of the buffer, whether it is too full to receive more data, etc.  It really has nothing to do with "hey, data is currently coming down the line at this very instant."  If you are trying to control the DTR line to do this, I think you have a few problems.

One, determining how long to assert the line based on the number of characters and baud rate (not too bad, mostly a math problem).

Two, synchronizing the setting of the DTR with the actual transmission of data.  It sounds like you want very tight timing.  I think this is difficult because I don't know how predictable the placement of the data in the serial buffer is relative to the time it actually goes out the line.  These are very low-level functions that would be part of the serial driver and the UART hardware.

Three, if you are doing all of this control by way of software timing, I don't think you can get any better than 1 msec timing due to the nature of the Windows operating system.  It will schedule the tasks of putting the data in the buffer and setting the DTR as it sees fit.  I don't think you can synchronize them in the sub-millisecond range.  The use of notifiers, rendezvouses, and structures like that may help.  But I think it is still very tight timing.

I think to have a chance of doing what you are describing, you would probably have to hard code all of these actions in an FPGA program which would not be trivial.

One other possibility might be to use a high speed digital output card where you can output a defined bit stream and simultaneously trigger another output that simulates the DTR signal on another digital output line.  I don't think this would be easy to do, but it might be possible.  It would mean simulating the entire serial protocol with start bits, stop bits, parity, intercharacter delay and all of that.

Perhaps you can fill in more details of your application and why you need to do what you are asking.  Others may have some better ideas of how to accomplish this task.

Message Edited by Ravens Fan on 10-26-2007 12:31 AM

Message 2 of 4
Hi Ravens Fan, thanks for your answer.
I do know how to use the property node of the serial port to assert the DTR line. My problem is that its state doesn't change as fast as I want. The minimum time I get is 2 ms, and that's using delays to try to make it faster. I was wondering if I could switch the DTR line some other way, because obviously this one is not working. Somebody told me I could use a C or assembler program to make it faster, but I don't know how to do that or whether it's even possible.
Anyway, I need to switch that line that fast because DTR controls the period of transmission/reception of the whole system (RTS would work just as well for me; I would only have to change the hardware I made. I can't use any other line, because those are the only ones available for transmission besides TxData). My microcontroller won't send its 2 bytes of data until it sees a high level (meaning it has to read the letter identifying which sensor has to send) followed by a low level (meaning it has to send its data).
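In case it helps, this is the handshake I mean, sketched in Python (I work in LabVIEW 7.0; text code is just easier to write in a post). `poll_sensor` is a made-up name, and `ser` would be any serial-port object that exposes the DTR line, for example pyserial's `serial.Serial`:

```python
# Sketch of the PC side of the handshake described above.
# 'ser' is any serial-port object with .dtr, .write, .flush and .read
# (e.g. pyserial's serial.Serial). Names here are just for illustration.
def poll_sensor(ser, letter):
    """Send one ID letter with DTR high, then read 2 data bytes with DTR low."""
    ser.dtr = True      # high level: the addressed micro reads the ID letter
    ser.write(letter)   # e.g. b"A", b"B", b"C" or b"D"
    ser.flush()         # wait until the byte has actually left the port
    ser.dtr = False     # low level: the selected micro sends its data
    return ser.read(2)  # the two data bytes from that sensor
```

In LabVIEW terms this is roughly: property node write (DTR high), VISA Write, wait for the write to complete, property node write (DTR low), VISA Read.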
I have thought about the second possibility you gave me, but using a 555 timer in monostable mode to generate the signal I want. That could be a solution, but I don't think a hardware fix would be best if I can solve this in LabVIEW.

I hope I explained it better this time. Thanks in advance.

PS: I tried RTS too, but I have the same problem.

Message edited by NavasCOL

Using: LabVIEW 7.0 Express 😄
Message 3 of 4
Like I said, I think the results you are getting are about the best you can do with a software solution in LabVIEW.  I don't think a C program would do any better, because whatever you compile is still running in a Windows environment, and I don't think you can control the data transmission and the state of the DTR through the driver in the sub-millisecond range.  What are your communication settings?  You said it takes 417 usec to send one letter.  That would be true if the byte were exactly 8 bits on the wire.  But with a setting of 8-N-1 you have 8 data bits plus 1 start and 1 stop bit, for 10 bits per byte.  That works out to 1,920 bytes per second, or about 521 usec per byte.  A 7-N-1 setting would be 9 bits per byte, or about 469 usec per byte.
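A quick way to sanity-check those numbers (Python just for the arithmetic; the function name is mine):

```python
# Time to send one character: 1 start bit + data bits + optional parity
# bit + stop bits, all clocked out at the configured baud rate.
def char_time_us(baud, data_bits=8, parity_bits=0, stop_bits=1):
    frame_bits = 1 + data_bits + parity_bits + stop_bits
    return frame_bits / baud * 1e6  # microseconds per character

print(round(char_time_us(19200)))               # 8-N-1: 10 bits -> 521 us
print(round(char_time_us(19200, data_bits=7)))  # 7-N-1:  9 bits -> 469 us
```

The 417 usec figure only holds if you count the 8 data bits alone (8 / 19200 = 416.7 usec) and ignore the start and stop bits.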
 
When comparing the delays to the time it takes to transmit a single byte, it does sound like a lot of wasted time.  If you were sending multiple bytes, the delay would be a less significant portion of the overall timing.  To me, it sounds like you are trying to build a really high-performance communication mechanism, and I don't think you can do that with Windows serial port drivers and software control.  It would be a decent amount of work, but making your own serial port out of digital input and output lines could work.  You would have an input line representing RxD, an output line representing TxD, and another output line representing DTR, all referenced to ground.  You would do a digital read and write based on a 19200 Hz sample clock (if that is possible; it might have to be some multiple of that).  Then you would create your own digital waveform where the start bit, data bits, and stop bit are generated.  This would be written to the one DO line and synchronized with another digital waveform, constantly high, that gets written to the other DO line.  For data coming in on the DI line, you would have to capture the digital waveform and figure out how to decode the start bit, data bits, and stop bit out of it.
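To make the waveform part concrete, here is a rough sketch of the encode/decode step for one 8-N-1 frame (Python, just to show the bit layout; a real bit-banged receiver would also have to find the start-bit edge and sample in the middle of each bit):

```python
# One 8-N-1 frame as a list of line levels, one entry per bit time.
# The line idles high; the start bit is 0; data goes out LSB first;
# the stop bit is 1.
def encode_frame(byte):
    bits = [0]                                   # start bit
    bits += [(byte >> i) & 1 for i in range(8)]  # 8 data bits, LSB first
    bits.append(1)                               # stop bit
    return bits

def decode_frame(bits):
    assert bits[0] == 0 and bits[9] == 1, "framing error"
    return sum(b << i for i, b in enumerate(bits[1:9]))

frame = encode_frame(ord("A"))  # 0x41
print(frame)
print(chr(decode_frame(frame)))
```

Each entry of `frame` would become one sample on the DO line at the 19200 Hz update rate.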
 
I have never done anything like this before, but I think it is possible, and it would be an interesting challenge.  LabVIEW may already have all the functions needed to generate and decode these waveforms.  If not, perhaps there are some add-on toolkits that could help.  Maybe someone else has done something like this and can share their insights.
Message 4 of 4