DIAdem


Parallel processing limitations

Hi,

 

I created a script that does multi-core processing on about 40 channels.

We are limited to 5 workers, so I just loop through them: as soon as a worker is done with one channel, it gets the next one. So ultimately I start 40 worker jobs, but of course not simultaneously.

 

To pass data, I store the time and value data in the array that is handed to the worker via the SetArgument and GetArgument methods.
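Roughly, the packing on the master side looks like this (a simplified sketch; the channel names and the element-wise ChnValGet loop are only for illustration, not my exact script):

' Master-side sketch (channel names and layout are illustrative): pack the
' time and value data of one channel pair into a single 2-column array,
' which is then handed to a worker via SetArgument.
Dim i, iNoValues, aData()
iNoValues = ChnLength("Example/Signal")            ' assumed channel name
ReDim aData(iNoValues - 1, 1)
For i = 1 To iNoValues                             ' DIAdem channel values are 1-based
  aData(i - 1, 0) = ChnValGet("Example/Time", i)   ' column 0: time stamps
  aData(i - 1, 1) = ChnValGet("Example/Signal", i) ' column 1: measured values
Next

The ChannelsToArray command quoted from the help below should do essentially this packing in a single call.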

 

This works well for short recordings, but with longer ones it somehow fails. My gut feeling is that I cannot pass large arrays to the workers.

Is this a known limitation?

 

I wanted to avoid saving/loading temporary data, as the help suggests (quoted below). Not very elegant IMO.

 

"Exchange data as an Array between master and worker. Use the commands ChannelsToArray and ArrayToChannels to exchange channel data. If you want to exchange large amounts of data, you can save them in a file."

 

Thanks,

Jacques

 

Message 1 of 2

Quick update:

I stopped transferring the time data in the SetArgument array. I now pass only the value data, which roughly halves the size of the SetArgument array.

For the time information, I pass the channel length and the wf_increment, and recreate an implicit time channel within the worker.
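The reconstruction inside the worker is essentially this (a simplified sketch; the function name is mine, and it assumes the time axis starts at zero, otherwise wf_start_offset would have to be passed along as well):

' Build an equidistant time array from the channel length and wf_increment,
' instead of receiving the full time array from the master.
Function BuildImplicitTime(iNoValues, dIncrement)
  Dim i, aTime()
  ReDim aTime(iNoValues - 1)
  For i = 0 To iNoValues - 1
    aTime(i) = i * dIncrement
  Next
  BuildImplicitTime = aTime
End Function

The two scalars are what I now receive through GetArgument in place of the full time array.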

 

By doing so, my longer recordings also work.

 

So... what is the limit on the size of the SetArgument / GetArgument arrays?

Knowing that limit, I could estimate how long my recordings can get before I run into problems again.
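Just to illustrate the kind of estimate I mean (the 128 MB figure is a pure placeholder, not the actual limit; the actual limit is exactly what I am asking about):

Dim dLimitBytes, dIncrement, dMaxSamples, dMaxSeconds
dLimitBytes = 128 * 1024 * 1024        ' hypothetical per-argument limit, placeholder only
dIncrement  = 0.001                    ' example wf_increment: 1 kHz sampling
dMaxSamples = dLimitBytes / 8          ' 8 bytes per double value
dMaxSeconds = dMaxSamples * dIncrement ' about 16777 s with these example numbers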

 

Thanks

Message 2 of 2