I'm running a P3 933 MHz with Win2K and 128 MB of RAM. I'm trying to output data on a DIO32HS, 32 bits wide, at a total throughput of about 27.5 Mbit/s (~860,000 samples/second). PLAYBUFFERSIZE = 0x200000, and gPlaybuffer is an array of short. I'm double buffering like so:
    status = fread(gPlaybuffer, 1, PLAYBUFFERSIZE * 2, gStream);
    nistatus = DIG_Block_Out(1, 1, gPlaybuffer, PLAYBUFFERSIZE >> 1);

    while (!feof(gStream)) {
        status = fread(gMiddlebuffer, PLAYBUFFERSIZE, 1, gStream);
        nistatus = DIG_DB_HalfReady(1, 1, &halfready);
        DAQErrorHandler(nistatus, "DIG_DB_HalfReady");
        nistatus = DIG_DB_Transfer(1, 1, gMiddlebuffer, PLAYBUFFERSIZE >> 2);
        DAQErrorHandler(nistatus, "DIG_DB_Transfer");
    }
nistatus after the DIG_DB_HalfReady() call is frequently -10803. Sometimes it won't even succeed the first time (right after the first DIG_Block_Out). I find it highly improbable that a system this fast cannot sustain this data rate (the fread can't take THAT long!). The same code works fine under Win95/98.

Does anyone know why Win2K performance is worse than Win98's? Is it because my buffer is being paged out to virtual memory? Does anyone know how to prevent Win2K from paging that memory? Task Manager says I have 40 MB or so free (even with the swap file set to 20 MB), so I'm assuming I have 20 MB of physical RAM free. Are there file I/O calls that are faster than stdio in Win32? I'm running NI-DAQ 6.5 on Win2K and NI-DAQ 6.0 on the old Win98 machine; would that make a difference? Is there a way to increase the priority of the NI-DAQ driver?