LabVIEW


Specifying an index for inserting elements in an existing queue

A notifier is typically used for a single writer/multiple readers.  But a new notification overwrites an old one that hasn't been read yet, meaning you can lose data.

 

If you want a FIFO for multiple instruments, maybe you should have a different queue for each instrument.  Each instrument would also have its own loop.



There are only two ways to tell somebody thanks: Kudos and Marked Solutions
Unofficial Forum Rules and Guidelines
"Not that we are sufficient in ourselves to claim anything as coming from us, but our sufficiency is from God" - 2 Corinthians 3:5
0 Kudos
Message 11 of 34
(1,581 Views)

Runjhun,

 

Sorry about the confusion.  It seems to be going both ways.

 

In (2) you indicate that multiple devices will send data. In the next paragraph you call it "single writer/multiple reader."  These seem to be in conflict.

 

- Since you are sending "configured data" via a cluster queue, I assume that all the clusters have the same structure and the values differ.

- Perhaps multiple queues will work for you: one queue for each device.  Each queue uses the same cluster datatype but talks to a different device. By naming the queues as part of the configuration process, you can ensure that the names are unique, thus keeping the data routed to the correct device.
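In text form, the one-named-queue-per-device idea might look like this sketch (Python's `queue.Queue` standing in for LabVIEW queue refnums; the device names and cluster contents are made up for illustration):

```python
import queue

# One FIFO per device, keyed by a unique name chosen at configuration time
# (analogous to calling "Obtain Queue" with a unique queue name per device).
device_queues = {name: queue.Queue() for name in ("PSU-1", "DMM-1", "Scope-1")}

def send_to_device(name, data_cluster):
    """Main module enqueues a data cluster for exactly one device."""
    device_queues[name].put(data_cluster)

def device_loop_step(name):
    """Each device's own loop dequeues only from its own queue."""
    return device_queues[name].get()

send_to_device("DMM-1", {"cmd": "read", "channel": 3})
print(device_loop_step("DMM-1"))  # only DMM-1's loop ever sees this cluster
```

Because each device only ever reads its own queue, the routing is decided by the writer, not by which reader happens to wake up first.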

 

Lynn

0 Kudos
Message 12 of 34
(1,580 Views)

Sorry, I misquoted that part.

Multiple devices will only access the data from the main module.

 

So now what I am doing is:

 

>> Main module is creating a queue and enqueuing the data.

>> Each device will open a separate reference to the same queue and dequeue the data. (I am not concerned about synchronization here.) So that means I satisfy the condition of having multiple queues.

 

But the problem statement is still the same. The devices are dequeuing similar data for a certain period of time, and then the next set of data for another period of time. I want the devices to process data from different data sets, which will be possible only if the data is enqueued in a non-traditional order.

 

Definitely, notifiers will cause data loss, which I cannot afford.

 

Reading all the replies, I think that what I want is not possible. I guess it'll be a limitation in my application.

0 Kudos
Message 13 of 34
(1,560 Views)

You can't have multiple tasks dequeuing data from the same queue. Passing the reference to your separate tasks still results in multiple readers on the queue; you have no way of knowing which device will be grabbing the data at any point in time. If you require multiple devices, you will need a queue for each device. If you need to broadcast the data to all of the devices, you can use user events and have each device register for the event. Then your master simply generates the event to pass the data. To repeat, YOU CANNOT DO WHAT YOU WANT WITH A SINGLE QUEUE!
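The difference Mark describes can be sketched in Python (this is only an analogy for LabVIEW user events, not actual LabVIEW behavior): each device registers its own mailbox, and "generating the event" delivers a copy to every registered device, so readers never compete for the same element.

```python
import queue

class Broadcaster:
    """Rough analogue of a LabVIEW user event: every registered
    listener receives every message (no competing dequeuers)."""
    def __init__(self):
        self._listeners = []

    def register(self):
        mailbox = queue.Queue()      # like "Register For Events"
        self._listeners.append(mailbox)
        return mailbox

    def generate(self, data):        # like "Generate User Event"
        for mailbox in self._listeners:
            mailbox.put(data)

bus = Broadcaster()
dev1, dev2 = bus.register(), bus.register()
bus.generate("block-a")
# Both devices receive "block-a"; with a single shared queue,
# only whichever reader ran first would get it.
```

With a single shared queue the element would go to exactly one (unpredictable) reader; with the broadcast, delivery is deterministic because each reader owns its mailbox.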



Mark Yedinak
Certified LabVIEW Architect
LabVIEW Champion

"Does anyone know where the love of God goes when the waves turn the minutes to hours?"
Wreck of the Edmund Fitzgerald - Gordon Lightfoot
0 Kudos
Message 14 of 34
(1,548 Views)

Let's forget about queues or any other specific data passing method for a moment. I want to make sure that everyone agrees on what is desired.

 

 

These are the assumptions that I have made based on your posts so far:

 

1. The data source (the main module?) generates blocks of data and puts them "somewhere" for the processing devices. The blocks of data may have various sizes.

2. Multiple devices process data obtained from "somewhere."  Each device processes one block of data at a time.

3. Any device can process any block of data.

4. The order of the blocks of data relative to the processing is not important.

5. It is important that all blocks of data are processed - no data loss.

6. The main module (data source) does not know or care which device processes which data block.

7. Each device has no knowledge of which other devices are processing which blocks of data.

8. Each device has no knowledge of how many other devices are actively processing data.

 

If these are not correct, please clarify where I have been mistaken.

 

Suppose the source has generated several data blocks: [data-a], [data-b], [data-c],...[data-m].

Suppose three devices are active: {device-1}, {device-2}, and {device-3}.

 

Now let {device-1} process [data-a], {device-2} process [data-b], and {device-3} process [data-c].  Now consider that [data-a] is much bigger than [data-b] or [data-c], so that {device-2} finishes before {device-1}. Does {device-2} immediately start processing [data-d], or does it wait until all devices are done so that they can take the data in order?
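The question above is really greedy versus lock-step consumption. A minimal event-driven simulation (Python; the block sizes and device names are invented) shows the greedy behavior, where each device grabs the next block the moment it is free:

```python
import heapq
import queue

# Blocks of differing sizes, in the order the source produced them.
blocks = queue.Queue()
for name, size in [("data-a", 10), ("data-b", 2), ("data-c", 3), ("data-d", 4)]:
    blocks.put((name, size))

# Min-heap of (time-when-free, device): a device takes the next block
# as soon as it finishes its current one, without waiting for the others.
finish = [(0, d) for d in ("device-1", "device-2", "device-3")]
heapq.heapify(finish)

log = []  # (start-time, device, block) assignments
while not blocks.empty():
    t, dev = heapq.heappop(finish)
    name, size = blocks.get()
    log.append((t, dev, name))
    heapq.heappush(finish, (t + size, dev))

print(log)
```

In this run {device-2} finishes [data-b] at t=2 and immediately starts [data-d] while {device-1} is still busy with [data-a]: the greedy answer to Lynn's question.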

 

Lynn

0 Kudos
Message 15 of 34
(1,536 Views)

Hi Lynn,

The understanding is perfect, but with the following small corrections/updates:

 

9. The source generates data block [data-a] at time <ta>.

10. There is a chance that a data block can have sub data sets [data-a1], [data-a2].

 

To answer your question: yes, {device-2} will not wait for any other device to finish, as the devices work independently of each other. If data is available, it will start processing. The sequence of processing different data blocks is not important, because I'll take care of that by attaching a timestamp token. There is no problem in any scenario you stated.

 

I am also not concerned with which device is processing what data; my sole aim is to send the data to the target device, and it can be sent from any device.

 

Now the problem statement occurs in the scenario where:

 

"Suppose at <ta>, {device-1} and {device-2} are processing [data-a1] and [data-a2]. This configuration brings out the limitation that [data-a2] cannot be processed by {device-1} or {device-2} until [data-a1] is completed."

 

I don't want this limitation; that's why I asked if there is a way that [data-a2] can somehow be added in between the blocks of [data-a1], so that the devices send both [data-a1] and [data-a2] in the same period (meaning a device sometimes sends [data-a1] and sometimes [data-a2], instead of processing only one sub-data block at a time).
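One way to get that effect without inserting at a queue index is to interleave the sub-blocks round-robin before (or while) enqueuing them. A sketch of the idea, with made-up chunk names:

```python
from itertools import chain, zip_longest

def interleave(*sub_blocks):
    """Round-robin merge of sub-data sets, so consumers alternate
    between [data-a1] and [data-a2] instead of draining [data-a1]
    completely before ever seeing [data-a2]."""
    _missing = object()  # sentinel padding for unequal lengths
    merged = chain.from_iterable(zip_longest(*sub_blocks, fillvalue=_missing))
    return [x for x in merged if x is not _missing]

a1 = ["a1-chunk1", "a1-chunk2", "a1-chunk3"]
a2 = ["a2-chunk1", "a2-chunk2"]
print(interleave(a1, a2))
# -> ['a1-chunk1', 'a2-chunk1', 'a1-chunk2', 'a2-chunk2', 'a1-chunk3']
```

The queue itself stays strictly FIFO; it is the enqueue order that is "non-traditional," which is something the main module fully controls.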

 

Runjhun.

0 Kudos
Message 16 of 34
(1,508 Views)

Runjhun,

 

Thank you for clarifying.

 

When you have sub data sets [data-a1], [data-a2], do you also have a data block [data-a]?  Do all of these have the same timestamp <ta>? Can some other data block [data-x] also have an identical timestamp <ta>?  Is there any flag or indicator in [data-a] or [data-a1] that tells the processing devices that sub data sets exist?  Can there be more than two sub data sets for any particular data block?

 

Lynn

0 Kudos
Message 17 of 34
(1,489 Views)

Just catching up on the thread.

 

From what I'm reading, Dynamic Events would be the correct transfer mechanism: one enqueuer, multiple dequeuers that can be dynamically assigned to act on (register for) or ignore (unregister from) the events "main" generates.  This does add some problems.  What if events are already queued behind a specific consumer's event structure when it unregisters for that event?  (I don't know how LabVIEW handles that: is the event still processed by that structure, since it was registered when the event was generated, or is it left for the next structure that registers?)  A test case could be made to find out.  Another possible problem is stale events that no structure is registered for.  What is the correct action to take: leave them, hoping some structure will eventually register, or flush them?  Third, what if two structures are registered for the same event?  Currently both consumers will process the event.  Is this the desired behavior, or do the event registrations need to be exclusive?
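The register/unregister questions can at least be stated concretely. Here is a toy model in Python (not LabVIEW semantics; in this model, unregistering discards any already-queued events, which is only one of the possible behaviors, and the actual LabVIEW behavior would need the test case mentioned above):

```python
import queue

class DynamicEvent:
    """Toy model of dynamic event registration. Assumption made here:
    unregistering a consumer discards its already-queued events.
    Whether real LabVIEW event structures behave this way is exactly
    the open question in the post above."""
    def __init__(self):
        self._registered = {}

    def register(self, consumer):
        self._registered[consumer] = queue.Queue()

    def unregister(self, consumer):
        # Queued-but-unhandled events for this consumer are dropped.
        self._registered.pop(consumer, None)

    def generate(self, event):
        # Every currently registered consumer gets a copy.
        for q in self._registered.values():
            q.put(event)

    def pending(self, consumer):
        q = self._registered.get(consumer)
        return [] if q is None else list(q.queue)

ev = DynamicEvent()
ev.register("consumer-A")
ev.register("consumer-B")
ev.generate("stop")
ev.unregister("consumer-B")        # its queued "stop" event is lost in this model
print(ev.pending("consumer-A"))    # ['stop']
print(ev.pending("consumer-B"))    # []
```

Writing the model forces each design question (stale events, double registration, unregister-with-pending-events) to get an explicit answer, even if LabVIEW's own answer turns out to differ.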

 

Still, it seems like the communications structure is overly complicated, but solvable.

 

 


"Should be" isn't "Is" -Jay
0 Kudos
Message 18 of 34
(1,478 Views)

Hi Lynn,

 

>>When you have sub data sets [data-a1], [data-a2], do you also have a data block [data-a]? Do all of these have the same timestamp <ta>?

Yes, [data-a1], [data-a2], .. are part of [data-a]. Let's not go into much more complexity; I'll just give a small example of how data blocks and sub-blocks are related.

Let's consider that [data-a] wants to send [info-i1]. Now the user has the option to select what info, and how much of it, he wants to send to the target device. So when the user selects multiple pieces of info, the sub-blocks come into the picture.

The sub-blocks will have the same timestamp associated with [data-a]; but instead of saying [data-a] sends [info-i1] and [info-i2], we redefined the terminology: [data-a1] sends [info-i1], [data-a2] sends [info-i2], and so on.

 

>>Can some other data block [data-x] also have an identical timestamp <ta>?

No.

 

>> Is there any flag or indicator in [data-a] or [data-a1] that tells the processing devices that sub data sets exist?

As you will have understood from the first part, a [data-a] with a multiple-info configuration is renamed to [data-a1], [data-a2], and so on. So the devices cannot differentiate between the two, as the structure of both is the same.

 

>>Can there be more than two sub data sets for any particular data block?

Yes. It depends upon the user configuration. If he selects n pieces of [info-i], then [data-a1] .. [data-an] can exist (where n can be any positive integer, but usually it'll be < 100).

 

Runjhun.

0 Kudos
Message 19 of 34
(1,466 Views)
After reading that, I'll stand by my first opinion: one queue won't do.

Unless you can get the dequeuers to ignore elements in the queue meant for another actor. Dynamic events.

Still, the data structure needs some thought. I bet a more elegant solution exists.

"Should be" isn't "Is" -Jay
0 Kudos
Message 20 of 34
(1,451 Views)