Losing 75% of UDP data due to design/UI/references/charts (I think)

Solved!

Okay, please bear with me as I try to explain what's going on, since I can't share the code.

 

The system is OOP, and the Main VI launches several asynchronous VIs, one of which is a UDP wrapper/driver. On the front panel of Main.vi is a tab control, and one of its pages contains some charts. I pass references to these charts to the asynchronous UDP VI (and to all the other asynchronous VIs). There the UDP VI accumulates the data, processes it, and updates the charts accordingly. I am not using DVRs, but I am using In Place Element structures that get the reference, set up the property node, and write the value to the charts.

Now, I'm sure there are some poor design decisions in all that, and you're welcome to point them out. But the main problem is this: when I view the tab that contains the charts, data accumulation drops by about 75% (since it's UDP, the sender just keeps broadcasting away regardless). Click on a different tab and the data rate jumps back to "normal."

 

I've been pondering this for a few days now, and I thought I'd ask the pros here what you all thought. I understand that charts are backed by arrays; I limited the chart history length, and made it even smaller when I encountered this problem, and that didn't do much of anything. (As a side question: since the oldest value is dropped as the newest is added, is the array being completely rebuilt each time?) Would the monitors being limited to 60 Hz cause any issues? The PC does have a dedicated GPU. Or is my problem that I used references inside In Place Element structures, causing resource contention where the slower display/UI holds the reference and everything else waits for access, causing further delays?

 

I'm sure you're asking yourself why I am not buffering data. Because I wanted real-time data: the station requires the data immediately to make certain decisions and complete certain actions. So... any thoughts or solutions? Any questions for further clarification? Any questions that are more insults? Happy to hear them.

Message 1 of 11

Do not update the charts in your UDP reader. You are passing UI references into the data reader, so when you switch to the tab containing the charts, execution shifts into the UI thread and the reader is burdened with the time the UI updates take. Your data reader should read the data and then pass it to a display task via a queue, notifier, channel wire, or some other method. The asynchronous reader will then continue to read data at the desired rate, unaffected by UI updates and never forced to execute in the UI thread.
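Since LabVIEW diagrams can't be pasted as text, here is a rough Python sketch of that producer/consumer split: a thread stands in for the asynchronous display VI, a `queue.Queue` stands in for the LabVIEW queue, and the UDP receive is simulated with a pre-built list of packets (all names here are illustrative, not any LabVIEW API):

```python
import queue
import threading

def udp_reader(data_queue, packets):
    """Producer: read datagrams as fast as they arrive and enqueue them.
    The socket receive is simulated here with a pre-built list of packets."""
    for pkt in packets:
        data_queue.put(pkt)  # unbounded queue: the reader never blocks

def display_loop(data_queue, chart, stop_evt):
    """Consumer: pull data and "update the chart" (here: append to a list).
    All the slow UI work happens in this loop, never in the reader."""
    while not stop_evt.is_set() or not data_queue.empty():
        try:
            chart.append(data_queue.get(timeout=0.05))
        except queue.Empty:
            pass  # nothing to draw right now

q = queue.Queue()
chart = []
stop = threading.Event()
ui = threading.Thread(target=display_loop, args=(q, chart, stop))
ui.start()
udp_reader(q, list(range(100)))  # the "UDP VI" runs at full speed
stop.set()
ui.join()
print(len(chart))  # every simulated packet reached the display loop
```

The key point is the direction of coupling: the reader only ever enqueues, so a slow display can never stall the acquisition side.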



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
Message 2 of 11

This is what I had been suspecting, but I didn't realize it was such a significant issue.

So basically, I need a separate loop that just sits there reading the UI data queues and updating as necessary? But wouldn't this mean the queue fills faster than it can be dequeued while the charts are visible? So I guess I need a limit on the queue and just accept that if the front panel can't keep up, too bad? Updating the front panel just isn't as important as actually getting the data?
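That "limit the queue and accept the loss" idea can be sketched in Python (LabVIEW being graphical): a bounded buffer where the acquisition side never blocks and the oldest samples simply fall off. The `LossyDisplayBuffer` class is a hypothetical illustration, not a LabVIEW API:

```python
import threading
from collections import deque

class LossyDisplayBuffer:
    """Bounded buffer for UI updates (hypothetical helper): when full,
    the oldest sample is silently discarded so the acquisition side
    never blocks behind a slow front panel."""
    def __init__(self, maxlen):
        self._d = deque(maxlen=maxlen)  # maxlen: old items fall off the left
        self._lock = threading.Lock()

    def put(self, item):
        with self._lock:
            self._d.append(item)  # never blocks, never raises

    def drain(self):
        with self._lock:
            items = list(self._d)
            self._d.clear()
        return items

buf = LossyDisplayBuffer(maxlen=10)
for i in range(100):   # acquisition outruns the display 10:1
    buf.put(i)
newest = buf.drain()
print(newest)          # only the 10 most recent samples survive
```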

Message 3 of 11
Solution
Accepted by topic author DailyDose

@DailyDose wrote:

This is what I had been suspecting, but I didn't realize it was such a significant issue.

So basically, I need a separate loop that just sits there reading the UI data queues and updating as necessary? But wouldn't this mean the queue fills faster than it can be dequeued while the charts are visible? So I guess I need a limit on the queue and just accept that if the front panel can't keep up, too bad? Updating the front panel just isn't as important as actually getting the data?


In essence, yes: if your data rate is fast enough, you may not be able to display it all. Depending on what you are doing, this may not matter. If you are getting data every second and trying to graph/chart 24 hours of it, your display will never have enough resolution to actually show all of the data; some data is always lost on the display. Normally, large data sets are decimated and only a subset of the complete set is drawn. A notifier would also help here, since it holds only the most recent value rather than every value like a queue.
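The notifier idea can be sketched in Python (a hypothetical `Notifier` class, similar in spirit to LabVIEW's notifier primitives but not the actual API): a new send overwrites the previous value instead of queuing behind it, so the display always gets the freshest sample and nothing piles up.

```python
import threading

class Notifier:
    """Latest-value-only channel: a new send overwrites the previous
    value instead of queuing behind it (illustrative, not LabVIEW's API)."""
    def __init__(self):
        self._value = None
        self._fresh = threading.Event()
        self._lock = threading.Lock()

    def send(self, value):
        with self._lock:
            self._value = value
        self._fresh.set()  # wake any waiting display loop

    def wait(self, timeout=None):
        """Block until a value arrives; return the most recent one."""
        if not self._fresh.wait(timeout):
            return None  # timed out with nothing new
        with self._lock:
            self._fresh.clear()
            return self._value

n = Notifier()
for i in range(1000):  # burst of updates while the UI is busy elsewhere
    n.send(i)
latest = n.wait()
print(latest)  # the display sees only the newest value
```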

 

An approach I have taken for processing queues: rather than dequeuing a single element each time I go to process data, I flush the queue, which returns all pending elements, and then process them all at once. This adds some efficiency in some cases.
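In Python terms (since the LabVIEW diagram can't be shown), the flush operation is roughly a non-blocking loop that empties the queue in one call, analogous in spirit to LabVIEW's Flush Queue primitive:

```python
import queue

def flush_queue(q):
    """Return every element currently in the queue in one call —
    a rough analogue of LabVIEW's Flush Queue primitive."""
    items = []
    while True:
        try:
            items.append(q.get_nowait())
        except queue.Empty:
            return items  # queue drained

q = queue.Queue()
for i in range(5):     # several samples piled up since the last UI pass
    q.put(i)
batch = flush_queue(q)
print(batch)           # one chart update covers the whole batch
```

The display loop then performs a single chart write per batch instead of one write per sample, which is where the efficiency comes from.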



Mark Yedinak
Message 4 of 11

@Mark_Yedinak wrote:

An approach I have taken for processing queues: rather than dequeuing a single element each time I go to process data, I flush the queue, which returns all pending elements, and then process them all at once. This adds some efficiency in some cases.


That could maybe work.  I'll put it to the test and find out.  But yeah, if it can't all be displayed, then I guess it can't all be displayed.  

Message 5 of 11

So, a few weeks later: I have implemented a separate asynchronous VI that handles the front panel, with the data queued over to it, and I don't queue all the data. However, I still see the same massive loss of UDP packets when viewing certain charts, even when using Flush Queue. I actually see the same loss when I place probes in the code. It's crazy that this is happening. Is there a way to disable a chart's "real time" update? Because the probe behavior tells me something else, more LabVIEW-internal, is going on.

Message 6 of 11

At this point you would need to post your code, or at least the code that you use to read the data. Without seeing it, there isn't much of a chance of figuring out where your processing delay is.



Mark Yedinak
Message 7 of 11

Alright. Since posting the source code is unfortunately not possible (it would be hopelessly broken for anyone who tried to open it), here are a bunch of relevant screenshots...

 

Chart refs get passed into Object (Main.vi)

DailyDose_0-1659457017477.png

 

UDP driver gets called (Main.vi)

DailyDose_1-1659457051980.png

 

Get UDP Data

DailyDose_3-1659457256459.png

FP Queue FGV (this is where the data is queued; the "reference" is an enum selecting which chart to update with the data)

DailyDose_4-1659457538605.png

 

Post Process Data (processes/filters data into appropriate charts and groups)

DailyDose_5-1659457619423.png

 

Site - B REFLCT FGV

Data being sent to appropriate Queue/chart and DVR

DailyDose_6-1659457783872.png

 

Front Panel Driver Gets called (Main.vi)

DailyDose_2-1659457086182.png

Inside the Front Panel Driver (Each case is just the specific chart property node)

DailyDose_7-1659457964956.png

 

 

Hope this is enough to maybe see some massive (or subtle) flaw.

 

Message 8 of 11

Still not really enough to go on, but it appears that you are reading your UDP stream one datagram at a time. More likely than not, you are losing data because the socket's receive buffer is overflowing. What rate is the data being produced at? Why not read more per iteration so the UDP receive buffer stays small? And do you really need to chart every point? If the data is coming in fairly quickly, you don't need to chart every point — you only need enough points to match your screen resolution.
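The "read more per iteration" idea looks roughly like this in Python (a sketch, not the LabVIEW UDP Read VI: a non-blocking drain that empties the OS receive buffer in one pass, demonstrated over loopback with a small burst of datagrams):

```python
import socket
import time

def drain_socket(sock, max_packets=100):
    """Read up to max_packets datagrams in one pass, emptying the OS
    receive buffer quickly instead of taking one datagram per loop."""
    packets = []
    sock.setblocking(False)
    for _ in range(max_packets):
        try:
            data, _addr = sock.recvfrom(65535)
            packets.append(data)
        except BlockingIOError:
            break  # buffer is empty — go process what we have
    return packets

# Loopback demo: send a burst of datagrams, then collect them in one drain.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for i in range(20):
    tx.sendto(str(i).encode(), rx.getsockname())
time.sleep(0.1)            # give the burst time to land
pkts = drain_socket(rx)
print(len(pkts))           # one pass picks up the whole burst
tx.close()
rx.close()
```

The shorter the interval between reads (or the more datagrams taken per read), the less time data spends in the OS buffer where it can be silently dropped.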



Mark Yedinak
Message 9 of 11

This is something I wasn't entirely sure how to address when I first wrote this, and I debated with myself quite a bit. Yes, it takes one UDP packet and processes it at a time. So I'm hearing that the correct approach may be to put the read in a For Loop, acquire 100 packets or so, and then send them off for processing?

Message 10 of 11