LabWindows/CVI


ComBreak duration time is changing from call to call

Hello DavidP,

in my setup there are no .h files in the folder C:\Program Files\National Instruments\...

I searched the complete contents of C:\MeasurementStudio for a header file containing the attribute "VI_ATTR_ASRL_BREAK_LEN", but nothing was found.

Is there a difference between your installed version of CVI and mine?

 

regards

Michael

Message 11 of 20

Our installations should be very similar as long as you used the default installation method for CVI 5.5.  I found the visa.h file in multiple locations on my machine. Try searching your computer for visa.h and be sure to include hidden folders.  If this doesn't find it, try looking in these locations:

C:\MeasurementStudio\cvi\include

C:\VXIPNP\WINNT\include

C:\VXIPNP\WINNT\NIvisa

C:\Program Files\National Instruments\Shared\CVI\include

My computer has the following NI software installed:
CVI 5.5

TestStand 4.0.1

NI-VISA 4.5

David Pratt
Group Manager for Application Engineering Specialists | Automated Test
NIC
Message 12 of 20

Hello DavidP,

I searched my computer for the file visa.h and found it at several locations.

In none of these files was the attribute "VI_ATTR_ASRL_BREAK_LEN" defined.

Meanwhile I found another way to generate a break signal with an accurate length, without using the ComBreak function.

I found the Windows SDK functions SetCommBreak() and ClearCommBreak(). With these two functions I was able to set and clear the break state on the communication line.

To time the break signal accurately, I wrote the following function:

 

void wait(double time_to_wait)
{
    double stop_time, start_time, run_timer;

    start_time = Timer();
    stop_time  = start_time + time_to_wait;
    run_timer  = Timer();
    while (run_timer <= stop_time)
    {
        run_timer = Timer();
    }
}

 

To use the functions SetCommBreak() and ClearCommBreak(), a handle is needed. This handle is provided by the CVI function GetSystemComHandle (int COM_Port, int *System_Handle).

Here is the function that generates the break signal:

void send_break(void)
{
    int iHandle;

    GetSystemComHandle (COM_PORT, &iHandle);
    SetCommBreak((HANDLE)iHandle);
    wait(0.001);
    ClearCommBreak((HANDLE)iHandle);
}

 

 

Last but not least, you have to install CVI RTE 9.0.  With the default installed RTE, the duration of the break signal will also change from call to call.

 

Nevertheless, thank you for the time you spent helping me.

 

best regards

diver

Message 13 of 20

Hey diver,

 

Glad to hear that you found another method to set a break time! I checked the visa.h file on my machine and did see the VI_ATTR_ASRL_BREAK_LEN attribute defined, so I think the issue might be a difference in the versions of CVI/VISA.

 

I would recommend making the following changes to your wait() function:

void wait(double time_to_wait)
{
    double stop_time;

    stop_time = Timer() + time_to_wait;
    while (Timer() <= stop_time)
    {
    }
}

 

This way no time is wasted updating local variables, so you should get a more accurate wait time.

Justin E
National Instruments R&D
Message 14 of 20

Hello Justin_E,

thank you very much for your recommendation.

I will replace the function.

Thanks a lot.

 

regards

diver

Message 15 of 20

I feel I have to point out that this approach is deeply flawed - it is like playing Russian Roulette! There are at least three reasons not to use this technique in a Windows user-level program:

 

  • The available resolution of the Timer() call is only 1 ms, so it could return at any time between 0 and 1 ms.
  • If Windows decides to interrupt your code in the middle of its time delay, and then goes off to swap memory or re-arrange its disk indexes or whatever else Windows likes to do every now and then, your delay will most likely be blown out of the water - it might not get a CPU time slice again for tens or hundreds of milliseconds. How lucky do you feel?
  • The delay is a busy wait, monopolising the CPU and not letting any other code run (except for the Windows interrupts, of course). Although this is insignificant for a 1 ms delay, if someone tried the same approach for a 10 second delay, the complaints would soon come thick and fast.
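None of this appears in the thread, but a common mitigation for the third objection is a hybrid wait: sleep for most of the interval (yielding the CPU to the scheduler) and spin only for the final couple of milliseconds. The sketch below assumes POSIX clock_gettime() and nanosleep(); on Windows one would reach for Sleep() and a high-resolution counter instead. All names here are illustrative.

```c
#define _POSIX_C_SOURCE 200809L
#include <time.h>

/* Monotonic wall-clock time in seconds. */
static double now_seconds(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

/* Hybrid wait: sleep for the bulk of the interval, then busy-wait
 * only the last ~2 ms so the CPU is not monopolised throughout. */
static void hybrid_wait(double seconds)
{
    const double spin_margin = 0.002;
    double deadline  = now_seconds() + seconds;
    double sleep_for = seconds - spin_margin;

    if (sleep_for > 0) {
        struct timespec req = { (time_t)sleep_for,
                                (long)((sleep_for - (time_t)sleep_for) * 1e9) };
        nanosleep(&req, NULL);   /* yields the CPU for most of the wait */
    }
    while (now_seconds() < deadline)
        ;                        /* short spin for the remainder */
}
```

This does not address the scheduling-preemption point - only a real-time OS or hardware timing can - but it removes most of the CPU cost for long delays.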

 

You really need a hardware solution, or a real-time operating system, to achieve precise delays like this.

 

JR

Message 16 of 20

Hello jr_2005,

the value you get by calling the Timer() function has a much higher resolution than 1 ms.

The function returns a double value with a high enough resolution to generate accurate time delays (even finer than 1 ms).

I monitored the output of a break signal (running in an asynchronous timer function) on the communication line (duration of 1 ms) with an oscilloscope, and the signal was very stable, independent of what was currently happening on the computer (starting and running additional programs).

I also tested my software, where the break signal is used for communication. The software ran for some days without interruption, and no communication problem occurred. That means the break times over the whole test were all in range.

I do not know where the value of Timer() comes from, but it is very accurate.

Maybe an NI engineer can tell us which time source the value is taken from.

Using the clock() function, the time delay varies between 0 and 1 ms, so it cannot be used to generate accurate time delays.
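The step size of clock() is easy to measure directly. This small probe (an illustration, not from the thread) spins until clock() returns a new value and reports the difference: on the older Windows C runtime discussed here the step was a full millisecond, which matches the 0 to 1 ms jitter described above, while on other platforms the step can be much finer.

```c
#include <time.h>

/* Spin until clock() returns a new value and report the step in
 * seconds. This granularity is what limits clock()-based delays. */
static double clock_step_seconds(void)
{
    clock_t a = clock();
    clock_t b;

    while ((b = clock()) == a)
        ;  /* wait for the next tick */

    return (double)(b - a) / CLOCKS_PER_SEC;
}
```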

 

 

diver

Message 17 of 20

Maybe NI should change their help on the topic, then. It clearly states a resolution of 1 ms.

 

JR

Message 18 of 20

As much as the Timer() approach seems to be working for diver, I agree with JR that in the past this would have been futile.  Windows on older hardware just would not support determinism below about 10 ms.  Newer multicore CPUs have helped a lot, but the accuracy of Timer(), while better than clock(), would not have been of much use for event synchronization in the 1 ms range.  The ability of the basic Windows OS to support 1 ms resolution over the long haul would also be highly suspect.  Since diver mentions that a change to CVI RTE 9.0 changed the behavior of the system, I suspect a low-level change in the CVI internals has occurred.  Is it possible CVI has picked up features/traits from the development of the real-time version of CVI?

Message 19 of 20

Hi JR,

 

The 9.0 help does mention that the resolution is 1 microsecond; this change took place recently. But note that even if the resolution is 1 microsecond, this doesn't mean that the accuracy (in Windows) is 1 microsecond. Obviously Windows is not a real-time OS, so it's not possible to deliver that kind of accuracy. Even 1 millisecond would be a stretch.
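The distinction between resolution and accuracy can be made concrete: ask the OS to sleep for exactly 1 ms and measure, with a fine-grained clock, how long the sleep actually took. The sketch below (an illustration, not from the thread) assumes POSIX nanosleep() and clock_gettime(); the clock can resolve sub-microsecond steps, yet the sleep routinely overshoots the request, because the scheduler, not the clock, decides when the process runs again.

```c
#define _POSIX_C_SOURCE 200809L
#include <time.h>

/* Request a 1 ms sleep and return the actually elapsed time in
 * seconds, measured with a monotonic high-resolution clock. */
static double measured_sleep_1ms(void)
{
    struct timespec req = { 0, 1000000L };  /* 1 ms request */
    struct timespec t0, t1;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    nanosleep(&req, NULL);
    clock_gettime(CLOCK_MONOTONIC, &t1);

    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}
```

On a loaded desktop system the overshoot can easily exceed the 1 ms request itself, which is exactly the resolution-versus-accuracy gap described above.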

 

Luis

Message 20 of 20