Hey Menchar,
Great question, and it is good to get this nailed down before a purchase.
Most of our "digitizing" boards (MIO, DSA, Scopes, etc.) work differently than you might expect. A good tutorial on how this works is here:
http://zone.ni.com/devzone/conceptd.nsf/webmain/A423AC5664191A0986256F310055FFA5?opendocument
Let me summarize. If we have a certain number of bits (say 3 bits), then we take our range and divide it into 2^3 = 8 steps (codes 000 through 111). This three-bit code is what the hardware actually returns. When the driver gets this information back, it converts it to a voltage based on the current settings of the hardware.
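To make the 3-bit case concrete, here is a minimal sketch of the quantization step. The ±5 V range and the function name are just example choices for illustration, not the specs of any particular board:

```python
def quantize(voltage, bits=3, v_min=-5.0, v_max=5.0):
    """Map a voltage onto one of 2**bits equally spaced steps.

    The +/-5 V range here is a hypothetical example range.
    """
    steps = 2 ** bits                    # 8 steps for 3 bits
    step_size = (v_max - v_min) / steps  # volts per code
    code = int((voltage - v_min) / step_size)
    return min(max(code, 0), steps - 1)  # clamp to the valid codes

# A mid-range voltage (0 V on a +/-5 V range) lands in the middle code:
print(format(quantize(0.0), "03b"))  # prints "100"
```

Note that the code is just a step index within the range; nothing in it is a sign bit.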
Now apply this to your 14-bit question. You would have 16,384 (2^14) steps in your vertical range (00000000000000 through 11111111111111). This raw code is what is actually returned from the hardware and converted back to a voltage by the driver.
So as you can see, we never worry about a dedicated sign bit the way traditional thinking preconditions us to. The hardware simply returns a 14-bit code, and the driver takes care of the sign of the returned sample when it maps that code back into the voltage range.
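The driver-side direction can be sketched the same way: the unsigned code is scaled back into the range, and any negative voltages fall out of the range's offset rather than a sign bit. Again, the ±5 V range and function name are hypothetical examples:

```python
def code_to_voltage(code, bits=14, v_min=-5.0, v_max=5.0):
    """Convert a raw unsigned ADC code back to a voltage.

    The +/-5 V range is an example; a real driver would use the
    board's currently configured range. There is no sign bit:
    negative voltages come from the range's offset (v_min).
    """
    steps = 2 ** bits                    # 16,384 steps for 14 bits
    step_size = (v_max - v_min) / steps
    return v_min + code * step_size

# Code 0 is the bottom of the range; the mid-scale code is 0 V:
print(code_to_voltage(0))     # prints -5.0
print(code_to_voltage(8192))  # prints 0.0
```

The key point is visible in the last line: a negative voltage is produced by adding the (negative) bottom of the range, not by interpreting any bit of the code as a sign.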
Hope this helps and thank you for choosing National Instruments!
Sincerely,
Gavin Goodrich
Applications Engineer
National Instruments