Okay, I'll try to be as clear as I can. I'm afraid that I don't speak German.
I have a piece of hardware on which I want to measure voltage drops at high temporal resolution (typically on the order of milliseconds). The voltages from the hardware are fed to a signal conditioner, which can apply one of four different gain settings; this in turn passes the signals on to the DAQ card. The signal conditioner gain is controlled by digital output lines on the same DAQ.
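Since a LabVIEW block diagram doesn't paste well into a post, here is a rough Python sketch (using NI's nidaqmx package) of the kind of gain-select write I mean. The device/line names, the gain values, the use of two DO lines, and the bit encoding are all my own assumptions for illustration, not details of the actual setup.

```python
# Illustrative sketch only: line names, gain values, and the 2-bit encoding
# are assumptions, not details taken from the hardware described above.
import nidaqmx
from nidaqmx.constants import LineGrouping

GAIN_BITS = {
    1:    [False, False],   # lowest conditioner gain
    10:   [True,  False],
    100:  [False, True],
    1000: [True,  True],    # highest conditioner gain
}

def set_conditioner_gain(do_task, gain):
    """Drive the DO lines that select the signal-conditioner gain."""
    do_task.write(GAIN_BITS[gain])

# Keep the DO task open for the life of the acquisition so a gain change
# is a single write rather than a task create/teardown.
do_task = nidaqmx.Task()
do_task.do_channels.add_do_chan("Dev1/port0/line0:1",
                                line_grouping=LineGrouping.CHAN_PER_LINE)
set_conditioner_gain(do_task, 10)
```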
Now, every so often over a long period (hours, days), an 'event' (a rapid change in voltage levels) may happen very quickly, over a period measured in tens of milliseconds. To record the event accurately, the gain on the signal conditioner must be changed to keep the signal at the DAQ card within the -10 V to +10 V range, and to fill that range as much as possible. This means that LabVIEW must sample the data AS IT COMES IN, decide whether the gain needs to be changed, and act accordingly.
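For anyone following along, this is the logic of the read/check loop in text form; again it's a minimal Python sketch rather than my actual LabVIEW code, and the thresholds, chunk size, sample rate, and gain steps are placeholder values I've made up for illustration.

```python
# Minimal sketch of the read/check/adjust idea, assuming the nidaqmx package,
# an assumed device "Dev1/ai0", and illustrative thresholds and gain steps.
import numpy as np
import nidaqmx
from nidaqmx.constants import AcquisitionType

GAINS = [1, 10, 100, 1000]   # assumed conditioner gain steps, low to high
UPPER = 9.0                  # back off before the +/-10 V rails are hit
LOWER = 0.8                  # step the gain up when the signal is this small
CHUNK = 100                  # samples read per loop iteration

def acquire_with_autorange(set_gain):
    """Continuously read small chunks and retune the conditioner gain.

    `set_gain` is whatever routine drives the DO lines (see the earlier sketch).
    """
    gain_idx = 0                         # start at the lowest gain to be safe
    set_gain(GAINS[gain_idx])
    with nidaqmx.Task() as ai:
        ai.ai_channels.add_ai_voltage_chan("Dev1/ai0",
                                           min_val=-10.0, max_val=10.0)
        ai.timing.cfg_samp_clk_timing(100_000,
                                      sample_mode=AcquisitionType.CONTINUOUS)
        while True:
            data = np.asarray(ai.read(number_of_samples_per_channel=CHUNK))
            peak = np.max(np.abs(data))
            if peak > UPPER and gain_idx > 0:
                gain_idx -= 1            # near the rails: reduce gain
                set_gain(GAINS[gain_idx])
            elif peak < LOWER and gain_idx < len(GAINS) - 1:
                gain_idx += 1            # signal too small: increase gain
                set_gain(GAINS[gain_idx])
```

The whole game is making one pass of that loop (read a chunk, take the peak, possibly write the DO lines) complete in a few milliseconds, which is exactly where I was getting stuck.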
The difficulty I was having was that I couldn't make the read/check loop fast enough. I think I've solved that now, and I might post how once I'm sure it's working.