08-12-2013 05:48 PM
Hello,
We are using a CVI table control to display test steps as part of a custom test sequencer. We have varying lengths of test sequences. A row in the table control is basically a test step in a sequence.
Is it a reasonable approach to preallocate, at application startup, the maximum number of test steps we will encounter using InsertTableRows? (In our case that is around 60K rows.)
I'm assuming that performing DeleteTableRows followed by InsertTableRows whenever a new test sequence is loaded could cause performance issues, possibly due to memory fragmentation.
Thanks,
08-13-2013 12:21 PM
When you say custom test sequencer, are you talking about NI TestStand? I'm having a hard time understanding how the table rows correspond to different steps. Are they the input values going into different steps?
http://www.ni.com/white-paper/7959/en/
08-14-2013 11:56 AM
No, we are developing a test sequencer entirely written in CVI.
The application uses a CVI table control to hold test steps. The test steps are loaded into control rows upon operator selection and then columns in the table are updated during test execution (for example, measurements and results).
Originally we were deleting and inserting table rows whenever new tests are selected by the operator.
So, we tried preallocating, using InsertTableRows, enough rows to hold the maximum number we could expect. This seemed to fix the slow-down in table update speed we saw after a certain number of delete/insert cycles. However, if we preallocate too many rows, table updates become very slow.
08-16-2013 09:37 AM
Yes, that is the expected behaviour. It is better to add and delete rows as you go along rather than preallocating memory from the beginning.
08-19-2013 10:14 AM
OK, but we notice an apparent slow-down in table update speed after multiple cycles of insert/delete table row operations. We just wondered whether this slow-down could be due to memory fragmentation.
08-20-2013 04:56 PM
Can you try it with another application, or on another computer with more memory? Do you see the same behaviour?
09-12-2013 04:22 AM
Hello,
I have the same slow-down problem when I try to fill data into a CVI table.
If I add 50 (TBC) rows with 4 columns, the full data display seems correct.
If I add 1000 rows with 4 columns, displaying all the data takes several seconds (30 s to 1 min)!
I tried several computers (Core 2 Duo to i3, XP Pro to Windows 7 32- & 64-bit): same problem.
In the past, I had the same problem with LabVIEW.
An NI engineer pointed me to a panel property (Defer Panel Updates) that delays the refresh of the control.
That way, each new addition no longer triggered a refresh, so filling the table was much quicker.
http://zone.ni.com/reference/fr-XX/help/371361K-0114/lvprop/pnl_defer_pnl_updts/
http://digital.ni.com/public.nsf/allkb/547DFDA3D02FD0AE86257154006933F3
I will keep reading to find the equivalent CVI function.
09-12-2013 04:29 AM
Did you see this post?
09-12-2013 07:19 AM
Thanks a lot, I will check it.
09-13-2013 11:51 AM
Grrr (me again)!!!
Two callbacks are used in my project:
- one with quick filling: SetTableCellAttribute with ATTR_CTRL_VAL
- and the other (nearly identical, otherwise unmodified) with slow filling: SetTableCellVal
And it is written in the "Recall Function Panel".
Thanks again, Wolfman!!!
In the meantime, some vacation!!!
Hope this helps someone.