triangle pressure waveform with time - do I need to program with LabVIEW RT on my PC?


Hello, 

As you know, there is hardware, such as function generators, that outputs a voltage versus time at a frequency F. 

Here, rather than a voltage source, I have an instrument that outputs a constant pressure when it receives a "set P" command, and that reads the pressure back with "read P". 

I would like to create a time-dependent triangle pressure waveform with 1, 2 or 3 periods max (not infinite, unlike a function generator).

 

Currently, I am using an academic LabVIEW Development license on a PC (i9).  

I created an array of pressure values that corresponds to the triangle waveform that I want to supply.

The number of points in the waveform depends on the minimum time interval that the hardware can follow.

I need to control the loop timing to create this triangle pressure versus time. 

So basically, I made a standard while loop with an event case whose timeout is set to 1/F and that reads one index of the array on every iteration. 

If the loop only reads the array, performance is very good, down to a minimum of 1/F = 2 ms. However, if I add a sub-VI that sends the "set P" command, then I cannot keep that performance, because this instruction is slow. 

My hardware is the limiting factor. The number of points defining the triangle function depends on the achievable 1/F. I don't need a huge number of points to define a triangle function. 

Example: 10 Hz, 5 points define the triangle function, [0; t] = [0; 100] ms, and the dt interval is 20 ms. 
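To make that example concrete, here is a small sketch (in Python, since LabVIEW is graphical; the function name and signature are my own, not from the attached VI) of building such a setpoint array. With 5 points per 100 ms period the samples land at 0, 20, 40, 60 and 80 ms, so the peak falls between samples:

```python
def triangle_setpoints(f_hz, n_points, amplitude, n_periods=1):
    """Return (times_s, pressures): a triangle that ramps 0 -> amplitude -> 0
    over each period of 1/f_hz seconds, sampled with n_points per period."""
    period = 1.0 / f_hz
    dt = period / n_points                       # poster's example: 100 ms / 5 = 20 ms
    times, pressures = [], []
    for k in range(n_points * n_periods):
        phase = (k % n_points) / n_points        # 0 .. <1 within the current period
        # symmetric triangle: rises for the first half-period, falls for the second
        p = amplitude * (1.0 - abs(2.0 * phase - 1.0))
        times.append(k * dt)
        pressures.append(p)
    return times, pressures

# Poster's numbers: 10 Hz, 5 points, so dt = 20 ms
ts, ps = triangle_setpoints(10.0, 5, 100.0)
```

The array is precomputed once; the timing question is then only about how fast the loop can consume it.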

I can read the array every 20 ms, but my hardware cannot set the pressure every 20 ms, only every, let's say, 30 ms. 

But indeed, I would also like to read the pressure, plot a graph, etc., all of which are detrimental to respecting the time-based loop. 

 

I already had a look at this Timed Loop help: 

https://zone.ni.com/reference/en-XX/help/371361R-01/glang/timed_loop/

I think that I can assign one processor to setting the pressure, and another processor to reading P, displaying graphs, etc.

 

What should the strategy be for sending instructions to the hardware at precise time events?

Do I need the LabVIEW Real-Time Module? 

Any advice will be appreciated. 

Message 1 of 5

You showed us no LabVIEW code (meaning a LabVIEW 2019 VI, not a "picture of a block diagram"), did not tell us anything about the hardware you are using and how you communicate with it (is it a DAQmx-capable device?  VISA?  GPIB?), and did not clearly describe what you want to do.

 

I'm guessing that you are a beginner in LabVIEW, and have not learned the Principles of Data Flow that give LabVIEW the possibility of doing tasks in parallel (such as waveform generation and plotting data).

 

LabVIEW RT runs on hardware running a Real-Time Operating System (which Windows is not).  You can take a PC and reformat its C: drive to install such an RTOS, but that is probably not the recommended solution -- better to purchase a PXI or cRIO system.  You'll want to have a year or two of experience in LabVIEW development before tackling a task of this complexity (or find a knowledgeable colleague/professor and apprentice yourself).

 

Bob Schor

Message 2 of 5

Thank you Bob, your last paragraph is the most interesting. As for the other comments: I have only taken LabVIEW Core 2, and LabVIEW is not my daily job in academic research. However, I have quite some experience, having built about 8 setups in various research fields, mainly using GPIB or USB; all are working nicely.

 

But here, it is my first time playing with timing, and one instrument is limiting the performance of my program. I tried to explain things gently and synthetically, but I guess it came out a bit too simple. If you don't have experience with pressure control, I will just say that it is not a function generator, and it is not as fast as light or electrons. But I agree that I was not so clear in my description. 

  

--> I attached a part of my program, which is the construction of the waveform that I want to supply to my instrument while strictly respecting the dt interval. This corresponds (see below) to the event case "setWfm". 

 

My instrument is capricious: "set P" takes about 100 ms, and "read P" also takes about 100 ms or less; it is a bit random. The VIs driving the set/read functions are locked by the company. 

 

I will not attach my main program because it needs DLLs from Fluigent, and the instrument must be plugged in to be able to run the program without errors. 

 

Let me explain what my main program (not shown) does: 

I built a producer/consumer with event cases, and I use the consumer to display graphs and save data to a text file. It is the usual pattern. The Dequeue Element is set with a timeout of -1. 
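As an aside, that producer/consumer split can be sketched in textual form as two threads sharing a queue (Python here only as an analogue of the LabVIEW diagram; a blocking `get()` plays the role of Dequeue Element with timeout -1, and the sentinel is my own convention, not part of the poster's program):

```python
import queue
import threading

data_q = queue.Queue()   # analogue of the LabVIEW queue reference
SENTINEL = None          # tells the consumer to stop, like releasing the queue

def consumer(rows_out):
    """Consumer loop: blocking dequeue (LabVIEW timeout = -1), then
    display/save; here we just collect the rows."""
    while True:
        row = data_q.get()          # blocks forever until an element arrives
        if row is SENTINEL:
            break
        rows_out.append(row)        # stand-in for "plot graph, save to file"

rows = []
t = threading.Thread(target=consumer, args=(rows,))
t.start()
# Producer side: enqueue [time, set P value, read P value] rows as measured.
for i in range(3):
    data_q.put([i * 0.1, 50.0, 49.8])
data_q.put(SENTINEL)
t.join()
```

The key property is the same as in LabVIEW: the producer never waits on graphing or file I/O, so slow display work cannot disturb the acquisition cadence.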

 

Basically, the event structure uses an enum containing several cases: 

init, setWfm, DispWfm, Start, check counter, setP, readP, save, stop, exit, WaitEvent

 

  • Init: detect the instrument, set the counter to -1 (idle mode)
  • SetWfm: set the waveform (which uses the attached code)
  • DispWfm: display the resulting waveform that is going to be set over time
  • Start: when the "start" button is pressed, start reading the index of the Wfm and sending the settings to the instrument. The counter is incremented (+1) on each iteration.
  • CheckCounter: if the counter value is > the max number of points in the Wfm, then stop. Else, go to "WaitEvent".
  • SetP: read the array index (counter) and retrieve the corresponding amplitude of the Wfm, then set the pressure on the instrument.
  • ReadP: after setting P, send an instruction to the instrument to read P. In this event case, the data [time, set P value, read P value] are built into an array.
  • Save: the array of data is sent to Enqueue Element to be read in the consumer loop.
  • WaitEvent: check the Start button and the Exit button.

 

Here is maybe my problem: the timeout of WaitEvent is supposed to drive the loop duration (WaitEvent --> Start --> SetP --> ReadP --> Save --> WaitEvent), but this timeout value must be greater than the sum of the durations of SetP and ReadP, which is instrument-dependent. The other transitions between cases do not consume much time, a few hundred µs or less, because the instructions are minimized. 

 

From real measurements, when setting the timeout to 100 ms, I found that one loop iteration oscillates between 125 ms and 250 ms, randomly distributed. So I did not succeed in reproducing the simulated Wfm by setting timeout = dt, and I could not respect the simulated Wfm timing. 

But even when setting the timeout to 500 ms, the program still could not loop strictly in the timeout = dt = 500 ms duration. And I cannot understand this problem. 

 

So I think I need to reconsider the architecture of this program, in order to impose dt on the loop. I am thinking about using a Timed Loop for this purpose.

 


Message 3 of 5

Sounds a lot like the kind of programs I've been known to write.

  • You have a series of "Tasks" that involve interaction with the User.  This can include setting parameters, reading/writing configuration files, displaying data after it has been collected, responding to Button Presses to start/stop Acquisition, etc.
  • You have a Data Acquisition (or Control) task that has a defined time repetitive sequence ("Update every 100 msec").

The first Bullet Point suggests a State Machine architecture, something like a Queued Message Handler (or, if you are a fan of Channel Wires, as I am, a Channel Messenger Handler) that can manage the User interface and take you through an orderly series of "States".

 

The second Acquisition/Control/Timed Task suggests a Parallel Loop, perhaps started and stopped by asynchronous signals from the State Machine, that are "internally timed", either because they involve a DAQmx Read or Write function that is configured for Continuous Input (or Output) of N Points at S samples/sec (so each loop takes precisely N/S seconds) or because they are placed in a Timed Loop (not really ideal for Windows, but works great in a device running NI Linux Real-Time OS).
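A textual sketch of that second bullet -- a parallel loop that "self-clocks" and is only started and stopped from the State Machine -- might look like this in Python (illustrative only: `control_loop` and `send_fn` are made-up names, and in LabVIEW this would simply be a second loop on the block diagram signalled by a notifier or channel):

```python
import threading
import time

def control_loop(setpoints, dt, stop_flag, send_fn):
    """Self-clocked loop: one setpoint per dt, paced from absolute deadlines.
    The state machine only sets stop_flag; it never times this loop itself."""
    t0 = time.monotonic()
    for i, p in enumerate(setpoints):
        if stop_flag.is_set():       # asynchronous stop from the UI side
            break
        send_fn(p)                   # the slow "set P" call goes here
        remaining = t0 + (i + 1) * dt - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)    # absorb however long send_fn took

sent = []
stop = threading.Event()
worker = threading.Thread(
    target=control_loop, args=([0, 25, 50, 25, 0], 0.02, stop, sent.append))
worker.start()    # the "Start" state just launches the parallel loop
worker.join()     # in a real program the UI loop keeps running instead
```

Because the loop owns its own clock, UI work (graphs, file saves) in the other loop cannot stretch the control period -- which is exactly the isolation Bob describes.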

 

Can you "isolate" your DAQ tasks in this fashion so they will "self-clock" at the speed that your hardware requires?

 

Bob Schor

Message 4 of 5

Dear Bob, thank you for your comment. Here, I am not using any DAQ, because set P and read P are performed through the LabVIEW VIs provided by the company (here, Fluigent). I have experience with DAQ for ultra-fast data acquisition on another project, and the sampling over time was very nice. Here, however, I cannot use it. 

So I will try a Timed Loop to see if I can impose 200 ms on that loop (set/read), and then send the data to the queue and dequeue it in another parallel while loop (as I did already). The idea was to get some experience from people who have used Timed Loops, to see whether this structure allows the cadence/timing to be strictly respected. 

I will test next week, when the setup becomes available. 

Once again, thank you for your time in responding to my request and sharing your experience. I am a bit sorry that my previous post was not more precise. 

Message 5 of 5