Hello All
I have a PCI 6534 High Speed Digital I/O card that I am trying to use to generate a pattern output from port A and acquire some data on port C. I have connected port C, bit 0 to high and all the others to low, and I have also connected the two REQ pins together so that the REQ pulses from the pattern generation output drive the input as an external clock. I have set the timebase to 1 µs and the request interval to 10, to give a REQ pulse every 10 µs; the idea is that the pattern generation output produces a REQ pulse every 10 µs and each pulse causes an input read to occur. My code can be found in the attached Word file.
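For reference, the sequence of NI-DAQ calls is roughly as follows. This is a simplified sketch of what is in the attachment, not the code itself; the device number, group numbers and the reqSource/timebase codes are from memory and may not be exact, so please check them against the DIG_Block_PG_Config documentation.

    i16 status;
    static i16 outBuf[100];   /* pattern written out on port A (group 1) */
    static i16 inBuf[100];    /* data read in on port C (group 2)        */

    /* Group 1: one port wide, port A (port 0), configured as output */
    status = DIG_Grp_Config(1, 1, 1, 0, 1);
    /* Group 2: one port wide, port C (port 2), configured as input  */
    status = DIG_Grp_Config(1, 2, 1, 2, 0);

    /* Pattern generation on group 1 from the internal REQ:
       1 us timebase, request interval 10 -> one REQ pulse every 10 us
       (the timebase code for 1 us is my assumption)                  */
    status = DIG_Block_PG_Config(1, 1, 1, 0, 3, 10, 0);
    /* Pattern input on group 2 clocked by the external REQ line,
       which is wired to group 1's REQ pin                            */
    status = DIG_Block_PG_Config(1, 2, 1, 1, 0, 0, 0);

    /* Arm the input first, then start the output that supplies the REQ clock */
    status = DIG_Block_In(1, 2, inBuf, 100);
    status = DIG_Block_Out(1, 1, outBuf, 100);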
Initially I placed the DIG_Block_In call before the DIG_Block_Out and set both counts to 100. For a single run of the application this filled my input buffer array with 50 elements of value 257 (0x0101), which I think is what I would expect: each 16-bit buffer element takes two 8-bit samples, one in the lower byte and one in the upper byte, so a count of 100 fills 50 elements. OK so far.
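By way of illustration, this is roughly how I inspect the buffer afterwards (which byte of each element holds the earlier sample is an assumption on my part):

    /* Each 16-bit element packs two consecutive 8-bit samples, so a
       count of 100 fills 50 elements; with port C bit 0 tied high every
       sample is 0x01, so every element reads 0x0101 = 257.             */
    int i;
    for (i = 0; i < 50; i++) {
        unsigned char first  = (unsigned char)(inBuf[i] & 0xFF);        /* assumed earlier sample */
        unsigned char second = (unsigned char)((inBuf[i] >> 8) & 0xFF); /* assumed later sample   */
        /* count / check the samples here */
    }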
Now if I change the In count to 200 and leave the Out count at 100, I only fill 48 elements. I have no idea why this happens. In my final application I hope to increase the counts to nearer 2000, so this loss of elements becomes significant.
If I swap the DIG_Block_Out call to go before the DIG_Block_In, then with both counts set to 100 I get no data acquired at all. If I increase both counts to, say, 2000, I actually acquire 944 elements, 56 fewer than the 1000 I would expect. Why is this? Is it because the DIG_Block_Out call has already started generating REQ pulses before the DIG_Block_In call is initiated?
Does anybody know what is going on here? I had the same problem (in fact worse) with a PCMCIA 6533 card and thought it would be solved by the PCI 6534. Does anybody know how I can ensure that the correct number of data points is acquired every time I run this operation? I need to be sure that all of the desired data is being acquired, as my final application depends heavily on this.
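Should I be polling DIG_Block_Check until it reports zero points remaining before I inspect the buffer, something like the sketch below? (The polling loop is my assumption rather than copied from the attached code, and I suppose it would never finish if fewer REQ pulses arrive than the requested count.)

    u32 remaining = 1;

    /* Wait for the input block transfer to report no points remaining,
       then clear the block operation before reading inBuf              */
    while (remaining != 0)
        status = DIG_Block_Check(1, 2, &remaining);

    status = DIG_Block_Clear(1, 2);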
Any help would be greatly appreciated.
Jamie