02-23-2017 10:02 AM
Has anybody tried to use WM_GESTURE in a VI? It seems easier than WM_TOUCH, but I really don't know how it works. I tried to use the code x0219 instead of x0240 (as in the example above), but I didn't receive any message (even when I use the x0240 code for the WM_TOUCH message).
Thank you for your help!
02-24-2017 03:59 AM
Before a window can receive WM_TOUCH (and by inference also WM_GESTURE) messages, you have to enable touch input for that window by calling
RegisterTouchWindow(HWND hwnd, ULONG ulFlags);
This is most likely the first Call Library Function Node after the Find Window VI in the picture in the first post, but it isn't documented there. But WM_GESTURE also contains a handle that most likely won't be valid anymore by the time you retrieve it from the event queue.
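For reference, a minimal C sketch of that registration call, assuming the LabVIEW panel's HWND has already been located (e.g. with FindWindow(), as in the first post). Note that, per Microsoft's documentation, a window registered this way receives raw WM_TOUCH messages *instead of* WM_GESTURE, so for gesture messages the window should be left unregistered. The function name `EnableTouchForPanel` is just an illustrative wrapper, not part of any real library:

```c
/* Sketch only; Windows 7+, links against user32.lib. */
#include <windows.h>

BOOL EnableTouchForPanel(HWND hwnd)
{
    /* Flags: 0 = default behavior. TWF_FINETOUCH requests finer
       touch resolution, TWF_WANTPALM disables palm rejection. */
    return RegisterTouchWindow(hwnd, 0);
}
```

When called from LabVIEW, the equivalent is a Call Library Function Node on user32.dll with the HWND from Find Window wired to the first parameter and 0 as the flags.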
02-24-2017 04:39 AM
Hello rolfk and thank you for your reply,
I tried to activate the touch input on my VI using user32.dll, but it doesn't work. I have attached an image showing the block diagram of my VI.
When you say that the handle won't be valid, could you explain what you mean? How can we ensure the handle is the right one? Why does it get modified?
I'm sorry, this is all really new to me!
Thank you
02-24-2017 06:40 AM
Handles are memory allocations. They need to be properly freed at some point.
The whole Windows message handling scheme relies on every message eventually being consumed by some consumer, and that consumer is then responsible for properly deallocating any resources contained in the message, including handles.
An intermediate observer such as the message hook procedure used in the Message Queue library is NOT supposed to consume a message, but to pass it on to the next hook through the CallNextHookEx() function. Not doing so basically makes any user interaction with the application impossible. You could argue that LabVIEW currently doesn't process the WM_TOUCH and WM_GESTURE messages anyhow, so nothing is lost if you modify the Message Queue library to not pass those messages along through CallNextHookEx(). But that is not only short-sighted thinking, it is also wrong for several reasons, unless you know exactly what you are doing.
Short-sighted because a future version of LabVIEW might actually support touch input processing in some way, and your library would then disable that completely. Wrong because the LabVIEW window might not be the only consumer of those touch messages. Ultimately, after the message is passed to the application (here LabVIEW), it is usually passed further to DefWindowProc() if the application didn't decide to process it completely. This function may decide to translate the message into something else (an older, less capable message) and send it back to the application to still produce some feedback, such as scrolling. It also ultimately deallocates any resources the message contains, including handles, as otherwise those handles would be leaked.
The problem is that by the time your LabVIEW application gets to retrieve the message from the event queue, the DefWindowProc() processing has in most cases already been done, and the handle that was copied into the event list is no longer valid.
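The safe pattern that follows from the above can be sketched as a WH_GETMESSAGE hook procedure that copies the gesture data out *immediately*, while the handle in lParam is still valid, and always passes the message on. This is a sketch under the assumption that the Message Queue library uses such a hook; `g_hook` and the "queue a copy" step are placeholders, not real library names:

```c
/* Sketch only; Windows 7+, requires _WIN32_WINNT >= 0x0601. */
#include <windows.h>

static HHOOK g_hook; /* assumed: set by SetWindowsHookEx(WH_GETMESSAGE, ...) */

static LRESULT CALLBACK GetMsgHook(int code, WPARAM wParam, LPARAM lParam)
{
    if (code == HC_ACTION) {
        MSG *msg = (MSG *)lParam;
        if (msg->message == WM_GESTURE) {
            GESTUREINFO gi = { 0 };
            gi.cbSize = sizeof(GESTUREINFO);
            /* Extract the data NOW: once DefWindowProc() has run,
               the HGESTUREINFO handle in lParam is freed. */
            if (GetGestureInfo((HGESTUREINFO)msg->lParam, &gi)) {
                /* ...queue a COPY of gi for LabVIEW here, never the handle... */
            }
            /* Do NOT call CloseGestureInfoHandle() here: a hook is only
               an observer; the final consumer / DefWindowProc() frees it. */
        }
    }
    /* Always pass the message on to the next hook in the chain. */
    return CallNextHookEx(g_hook, code, wParam, lParam);
}
```

The key design point is that the hook hands LabVIEW a by-value copy of the GESTUREINFO structure rather than the handle, which sidesteps the lifetime problem described above entirely.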
03-02-2017 06:10 AM
Thank you for your explanation. The fact is, there is a LabVIEW library that implements this, published by Aledyne Engineering:
http://sine.ni.com/nips/cds/view/p/lang/fr/nid/212474
So I know it's possible, but I don't know how...
I can't use this library because it requires a license for every workstation on which we want to install the application!
03-02-2017 09:15 AM - edited 03-02-2017 09:21 AM
If you really have a need for this, I'm sure they are very willing to negotiate a good deal with you for a volume runtime license. If you try to develop this on your own without very solid knowledge of C in general, and of Windows API programming in particular, you are not very likely to end up with a reliable, well-working solution.
I would estimate that developing this into a user-friendly library would take me several days of programming work. That is a several-thousand-dollar investment that I do not feel compelled to make as free community support.