03-27-2014 01:59 AM
Hi Everyone.
I have an Acer T231H touch monitor which has options like multitouch, swipe forward, swipe backward, and so on. I want to use these touch options in my LabVIEW application, for example using the swipe forward gesture to change tabs. How can I do it? Do I need the NI Touch Panel Module to do this?
Thanks,
Vishwas
07-22-2014 04:45 PM
Hi Vshu,
I don't know the full answer to this, but I can tell you that you do not want the Touch Panel Module. That module is designed to add support for NI touch panel targets from within LabVIEW. My assumption is that each swipe gesture either generates a Windows event that we may be able to monitor, or it is translated into "mouse" events. Either way, here are a couple of links I found after a few minutes of looking; there may be more out there.
https://decibel.ni.com/content/docs/DOC-17052
(Looks like it is using mouse events)
http://jessefreeman.com/articles/from-webkit-to-windows-8-touch-events/
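For what it's worth, on Windows 7 and later a swipe usually arrives as a WM_GESTURE message (a pan gesture, GID_PAN) on the window under your finger. LabVIEW does not expose that message directly, so you would probably need a small DLL called through a Call Library Function Node, or a subclassed window procedure, to get at it. The C sketch below is only an illustration of that idea in a plain Win32 window; the window class name, the 100-pixel threshold, and the printf reporting are placeholders, not anything LabVIEW-specific.

#define _WIN32_WINNT 0x0601   // need the Windows 7 definitions for WM_GESTURE
#include <windows.h>
#include <stdio.h>
#pragma comment(lib, "user32.lib")   // MSVC linker hint; harmless elsewhere

static LONG g_panStartX = 0;         // x position where the pan began

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_GESTURE) {
        GESTUREINFO gi = { 0 };
        gi.cbSize = sizeof(gi);
        if (GetGestureInfo((HGESTUREINFO)lParam, &gi)) {
            if (gi.dwID == GID_PAN) {
                if (gi.dwFlags & GF_BEGIN) {
                    g_panStartX = gi.ptsLocation.x;                  // finger went down here
                } else if (gi.dwFlags & GF_END) {
                    LONG dx = (LONG)gi.ptsLocation.x - g_panStartX;  // net horizontal travel
                    if (dx > 100)       printf("swipe right\n");     // threshold is arbitrary
                    else if (dx < -100) printf("swipe left\n");
                }
            }
            CloseGestureInfoHandle((HGESTUREINFO)lParam);
            return 0;
        }
    }
    if (msg == WM_DESTROY) { PostQuitMessage(0); return 0; }
    return DefWindowProcA(hwnd, msg, wParam, lParam);
}

int main(void)
{
    WNDCLASSA wc = { 0 };
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = GetModuleHandleA(NULL);
    wc.lpszClassName = "SwipeTestWindow";               // placeholder class name
    RegisterClassA(&wc);
    HWND hwnd = CreateWindowA("SwipeTestWindow", "Swipe test", WS_OVERLAPPEDWINDOW,
                              CW_USEDEFAULT, CW_USEDEFAULT, 400, 300,
                              NULL, NULL, wc.hInstance, NULL);
    ShowWindow(hwnd, SW_SHOW);

    MSG m;
    while (GetMessageA(&m, NULL, 0, 0) > 0) {
        TranslateMessage(&m);
        DispatchMessageA(&m);
    }
    return 0;
}

Once you can see the gesture in code like that, forwarding "swipe forward/backward" to a VI (for example via a user event or a queue) to switch tab pages is the easy part.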
07-23-2014 04:07 PM
Back before Windows 8, we had a behavioral task where we put a large Boolean button on the screen, lit it up, and asked a subject to touch it. We originally had a touch panel screen that we Velcroed to the top of the monitor -- this acted exactly like a mouse, with a touch being Mouse Down (and "untouch" being Mouse Up).
About a year later, we purchased a Dell All-in-One with a touch-sensitive screen, thinking "Oh, boy, this will be much easier". Wrong -- we had trouble because a "long touch" was interpreted by Windows as a right-click, whereas we wanted it to be simply "a very long Mouse Down". We did not want Windows "getting between us and the mouse".
I don't quite recall now how we fixed this (I may be able to find some notes). Fortunately, you seem to want the opposite: to let Windows interpret your touch according to its rules and return its own interpretation. The question, then, is whether LabVIEW "knows" about the various touch gestures that are now available (I don't know the answer to that ...).
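For reference (and not necessarily what we did back then), the documented way to keep Windows from turning a long touch into a right-click on a particular window is the "MicrosoftTabletPenServiceProperty" window property. Here is a minimal C sketch of that idea; the window title is just a placeholder, in practice you might instead pass the front panel's HWND in from LabVIEW through a Call Library Function Node, and the property is normally meant to be set when the window is created, so results on an already-open panel may vary.

#include <windows.h>

// These definitions come from the MSDN sample; they are not in the standard headers.
#define MICROSOFT_TABLETPENSERVICE_PROPERTY "MicrosoftTabletPenServiceProperty"
#define TABLET_DISABLE_PRESSANDHOLD 0x00000001

int main(void)
{
    // Placeholder title: find the front panel window we want to change.
    HWND panel = FindWindowA(NULL, "My Front Panel.vi");
    if (panel == NULL) return 1;

    // Register the property name, then mark the window so press-and-hold
    // is no longer translated into a right-click.
    GlobalAddAtomA(MICROSOFT_TABLETPENSERVICE_PROPERTY);
    SetPropA(panel, MICROSOFT_TABLETPENSERVICE_PROPERTY,
             (HANDLE)(UINT_PTR)TABLET_DISABLE_PRESSANDHOLD);
    return 0;
}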
BS