
Dataflow Anomaly


Given your picture, wouldn't you expect Probe 1 to execute before Probe 2?  

 

Nobody here is expecting them to be synchronized.  Rather, they're expecting that order.  Instead, they're seeing Probe 2 execute before Probe 1.  That's where we're seeing confusion.

 

For anyone interested, this is filed under CAR 739508

Message 21 of 29

I think the idea is that dataflow only says that probe 1 becomes *eligible* to run before probe 2 does.  But it doesn't guarantee that it *will* run first.
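
A rough analogy in a text language may help (Python here, purely illustrative; LabVIEW diagrams are graphical, and this is not how its scheduler is actually implemented). Handing two pieces of work to a scheduler in a fixed order only makes them eligible in that order; it says nothing about which one starts, or finishes, first:

from concurrent.futures import ThreadPoolExecutor
import random
import time

def probe(name):
    # Simulate a probe update whose drawing takes a variable amount of time.
    time.sleep(random.uniform(0, 0.01))
    print(f"{name} displayed")

with ThreadPoolExecutor(max_workers=2) as pool:
    pool.submit(probe, "Probe 1")  # becomes eligible to run first
    pool.submit(probe, "Probe 2")  # becomes eligible to run second
# Run it a few times: "Probe 2 displayed" can easily appear before "Probe 1 displayed".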

 

 

-Kevin P

ALERT! LabVIEW's subscription-only policy came to an end (finally!). Unfortunately, pricing favors the captured and committed over new adopters -- so tread carefully.
Message 22 of 29

I'd argue that negates the value of debugging.  If the debugging displays were only "eligible" to run when the wire executes, but then ran as if they were parallel nodes with no shared dependencies, debugging as a whole would be a nightmare.

 

The order in which the debug visualizations appear should match the order in which the wires/functions execute.  When that doesn't happen, it hinders debugging in an unexpected way.

Message 23 of 29

@BoKnows wrote:

Given your picture, wouldn't you expect Probe 1 to execute before Probe 2?  

 


No, I wouldn't expect that.  Why on Earth would you expect that?  Where order of execution is not forced by dataflow, it is undefined.
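
For what it's worth, the same idea can be sketched in a text language (Python, an analogy only, not LabVIEW): execution order is defined exactly where one node consumes another node's output, and undefined everywhere else:

import asyncio
import random

async def node(name):
    await asyncio.sleep(random.uniform(0, 0.01))  # variable "execution time"
    print(f"{name} executed")
    return name

async def main():
    # Forced by dataflow: the downstream node consumes the upstream node's
    # output, so it cannot start until the upstream node has finished.
    upstream = await node("upstream node")
    await node(f"node wired to {upstream}")

    # Not forced: two branches share no wire, so either may execute first.
    await asyncio.gather(node("independent branch A"),
                         node("independent branch B"))

asyncio.run(main())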

"If you weren't supposed to push it, it wouldn't be a button."
Message 24 of 29

@BoKnows wrote:

I'd argue that negates the value of debugging.  If the debugging displays were only "eligible" to run when the wire executes, but then ran as if they were parallel nodes with no shared dependencies, debugging as a whole would be a nightmare.

The order in which the debug visualizations appear should match the order in which the wires/functions execute.  When that doesn't happen, it hinders debugging in an unexpected way.


It certainly does not negate "the value of debugging".  I think it is quite rare to need to know the exact time that data arrives at a wire.

And your assertion that this would lead to "debugging as a whole be[ing] a nightmare" is laughable.

"If you weren't supposed to push it, it wouldn't be a button."
Message 25 of 29

@paul_cardinale wrote:


It certainly does not negate "the value of debugging".  I think it is quite rare to need to know the exact time that data arrives at a wire.

And your assertion that this would lead to "debugging as a whole be[ing] a nightmare" is laughable.


It sounds like you're over-generalizing my point in a way that doesn't really make sense in context.

 

Again, I don't think anyone here (myself included) is concerned with the exact time a probe fires, or with synchronization.

 

If we view the "Probe 1" and "Probe 2" in your image as separate nodes that follow standard dataflow rules, that effectively makes ALL probes parallel: none of them depend on each other, so any of them COULD execute first, leading to a very non-deterministic order in which the debugging behavior displays.  As this thread has shown, that behavior also carries over to Highlight Execution and other debugging tools.  For example, set a breakpoint leading into the For Loop's conditional terminal.  Place another on each of the two error wires outside of the For Loop.  Run the VI and watch where it stops.

 

If that occurs, it means you cannot reasonably place breakpoints.  The VI will stop on that wire, sure.  But the part that executes afterwards may have already displayed first, meaning you won't get to watch what happens after your breakpoint.  Highlight Execution will light up wires in a seemingly random order, since none of them have direct dependencies as depicted.  And so on.
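
Here is a hypothetical sketch of that concern in Python (an analogy only, not a description of LabVIEW's internals): the events are generated in true execution order, but if each one is rendered by its own asynchronous task, the order you see on screen is not guaranteed to match:

import random
import threading
import time

def render(event):
    time.sleep(random.uniform(0, 0.01))  # variable draw/update latency
    print(f"displayed: {event}")

# True wire-execution order of the debugging events.
execution_order = ["Probe 1", "Probe 2", "breakpoint after the loop"]

# Each event is handed to its own display task, in execution order...
threads = [threading.Thread(target=render, args=(e,)) for e in execution_order]
for t in threads:
    t.start()
for t in threads:
    t.join()
# ...but the "displayed:" lines can come out in any order, which is exactly
# what would make breakpoints and Highlight Execution hard to follow.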

 

As you've called the notion that this is a nightmare "laughable," I'd love for you to share how you'd use debugging if the debugging visualizations didn't appear in execution order.  I'm honestly more than a little curious.

Message 26 of 29

Correctly ordered execution highlighting is extremely useful for the new programmer to get a feeling of how dataflow actually works, at least as a first approximation (it does not really tell you much about true parallelism, because highlighting always seems to occur in only one small area of the diagram at any given moment). So avoiding observations that contradict dataflow seems very important. Still, it does work correctly in nearly all typical scenarios, so fixing this oddity is probably not a super high priority unless it also fixes a more fundamental code problem.

Message 27 of 29

Just for kicks, if you leave the concatenating tunnel but remove the string array indicator, all is normal again.

aputman
Message 28 of 29

This surely is pretty weird if you look at it. I come from the world of OO, where we have things called debug symbols that get created when your code is built in a debug configuration. I believe this has to do with how LabVIEW creates its debug symbols (the visual nodes) as it continuously compiles, like we expect it to. It can throw you off when the code seems to work but your debug symbols appear all over the place. At least, this is what I believe could be the case. I have to thank the OP and the rest of you for your responses, as this is a pretty weird case and I learnt quite a lot!

Message 29 of 29