03-27-2006 11:34 AM
Hi Kevin,
thanks for your answer...
1. Is there a way to get rid of the duplicates without sorting? I sort the array (hoping NI implemented a nearly optimal sort 😉 ). Then I create a new array of the size of the original array and step through the sorted array, copying every element that is unique into the new array. As a last step I cut the new array after the last copied value. That way I avoid reshapes inside the loop (a rough sketch follows after point 2).
2. Floating point equality is not an issue in my case, as I use an incremental position counter that is polled far too often. (The whole problem came up when we found out that we need a higher sampling rate on other channels but still synchronous sampling. Before that we used the incremental position counter as the sample clock and there was no need to reduce data.) But for other raw data sets this is certainly an important point to keep in mind.
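Since I can't attach the diagram here, the logic of point 1 in rough Python-style pseudocode (the name unique_sorted and the Python form are only for illustration; the actual VI uses Sort 1D Array, a pre-allocated array and a final trim):

def unique_sorted(values):
    # Remove duplicates without growing an array inside the loop.
    data = sorted(values)              # Sort 1D Array
    out = [0.0] * len(data)            # new array, same size as the original
    count = 0
    for x in data:
        if count == 0 or x != out[count - 1]:
            out[count] = x             # copy only the unique elements
            count += 1
    return out[:count]                 # cut the array after the last copied value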
My main problem is the "Delete from Array" operations: I'm not sure whether they can be coded in a way that their benefit of reducing the search time outweighs their cost. Or even worse: if there is a break-even point, where is it?
Meanwhile I coded an algorithm to delete multiple indices from an array in one pass, but it seems there is still a little bug in it, so I couldn't compare its speed... work for tomorrow 😉 The idea is roughly the sketch below.
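The multi-index delete follows the same pattern; again only as Python-style pseudocode (delete_indices is a made-up name, the real VI gets the array and the list of indices to drop):

def delete_indices(values, indices):
    # Delete several indices in one pass instead of calling
    # "Delete from Array" once per index.
    skip = set(indices)
    out = [0.0] * len(values)          # pre-allocated output array
    count = 0
    for i, x in enumerate(values):
        if i not in skip:
            out[count] = x             # keep this element
            count += 1
    return out[:count]                 # trim to the number of kept elements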
Sören
03-28-2006 07:56 AM
Actually, in thinking about this a bit more, I realized that I'm going to need to do a fairly similar thing for an upcoming project. I've got a device that will cycle through its positional range many times, and I'll need to perform some averaged spatial frequency analysis between the position data and some analog data. In my app, I plan to perform all the analysis offline so I can settle for an effective method even if it isn't efficient. But today's "offline" becomes tomorrow's "can we look at that in real-time?" so I'll want to aim for decent performance where feasible.
So now that I've got a vested interest, what kind of performance requirements are you trying to meet? Is it sort of a fuzzy requirement like, "gee, it seems like I have to wait a long time after I hit the button before the results show up in the graph"?
General thoughts: the 1D Sort function in LV is indeed pretty good. Delete from Array can require an awful lot of memory management overhead -- search the forums for this and other articles.
I suspect that a better approach is to associate the X and Y data together first and then sort the associated data. Here the association would be a cluster with Pos as the 1st element and Force as the 2nd. More tidbits on sorting clusters can be found here. After the sort, there's no need to "Delete from Array" or search for indices -- the associated Force data is already bundled with the corresponding Pos data.
The final step should then be fairly straightforward using an auto-indexed For loop and some shift registers. When the Pos value is the same as in the previous iteration, increment a "# identical Positions" count and add Force to a "Sum of Forces". When it changes, calculate the results from the left-hand shift registers and reset the values for the right-hand shift registers. The results can go into a pre-allocated array using "Replace Array Subset"; the size of the pre-allocated array is the number of possible unique positions. Roughly, the whole chain looks like the sketch below.
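To make the idea concrete, here is the chain as Python-style pseudocode (just a sketch of the logic, names invented; in LabVIEW the pairs would be an array of Pos/Force clusters and the loop would use shift registers and Replace Array Subset):

def average_force_per_position(pos, force, max_unique_positions):
    # Bundle (Pos, Force), sort by Pos, then average Force per unique Pos.
    pairs = sorted(zip(pos, force))          # sorting clusters sorts by the 1st element (Pos)
    avg_pos = [0.0] * max_unique_positions   # pre-allocated result arrays
    avg_force = [0.0] * max_unique_positions
    n_out = 0

    current_pos = None
    n_same = 0        # "# identical Positions"
    force_sum = 0.0   # "Sum of Forces"

    for p, f in pairs:                       # auto-indexed For loop
        if p == current_pos:
            n_same += 1                      # same Pos as in the previous iteration
            force_sum += f
        else:
            if n_same > 0:                   # Pos changed: store the average
                avg_pos[n_out] = current_pos
                avg_force[n_out] = force_sum / n_same
                n_out += 1
            current_pos, n_same, force_sum = p, 1, f   # reset the "registers"
    if n_same > 0:                           # don't forget the last position
        avg_pos[n_out] = current_pos
        avg_force[n_out] = force_sum / n_same
        n_out += 1
    return avg_pos[:n_out], avg_force[:n_out]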
Post back with comments. I'm not yet ready to implement code for my app, and would be interested in the actual performance of different methods you try.
-Kevin P.
03-28-2006 11:00 AM
That was the way we originally worked. The method is absolutely OK if the resulting sample rate is high enough.
But in our case we need some more signals at a higher sampling rate than the encoder can provide, and this data should be synchronous to the position data. Sampling at different rates is not an option as we use an NI 6251 (at least not without using some sort of external clock).
With the VIs in preparation I can sample at the highest necessary sampling speed and get the other data along the way. Originally the problem of the different sampling rates was solved by running additional measurement cycles at different sampling rates. That worked, but it left us with n different measurement cycles and a synchronisation problem.
Now I hope to be able to drop the additional measurement cycles and still get better (and synchronous) results.
Sören