Multifunction DAQ


Webcast: The Things I Wish I Would Have Known Before I Started Programming with NI-DAQmx

Hello all,

As some of you might know, NI develops informational webcasts on a regular basis; for instance, in 2007 we began producing an advanced NI-DAQmx session every quarter. Based on feedback from many of our users, one of the next advanced DAQ topics is going to be "The Things I Wish I Would Have Known Before I Started Programming with NI-DAQmx". The catch... all the tips and techniques presented must be user-submitted ideas.

We’ve already received several great ideas from our users that will help us produce this user-created webcast. Based on the strong community involvement on the forums, we thought we would open up the discussion to everyone.

So here’s your chance to submit the tips and techniques you’ve picked up along the way while using NI-DAQmx. If your tip makes it into the webcast, we’ll be sure to give you proper credit.

At this point, we really just want to brainstorm and throw ideas and comments out there. Later on we’ll formulate the final list and produce the live webcast.

Thank you all very much for your support and participation.

Jervin Justin
NI TestStand Product Manager
Message 1 of 24
Thanks Jervin,

I really like NI's webcasts and try to catch them live when I can.   DAQmx is complicated and there are many ways to accomplish similar tasks. 

The trick is learning which methods are appropriate for your application.  Webcasts are a great way of picking up these "tricks".

---------------------
Patrick Allen: FunctionalityUnlimited.ca
Message 2 of 24

One of the most important improvements in DAQmx is simulated devices. I write 10 to 20 applications a year, more than half of which use DAQmx, and without simulated devices I would not be able to support them. One improvement to simulated devices would be a signal editor: it would be great if the source were configurable, so applications could be debugged with realistic signals. Another important thing to understand about DAQmx is the class hierarchy; DAQmx is much more object-oriented in design than the older legacy DAQ code in LabVIEW.
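Simulated DAQmx devices produce a fixed sine wave on analog inputs, which is exactly why a configurable "signal editor" would help. Until something like that exists, a small software generator can stand in for a configurable source. The sketch below is plain Python; the function name and parameters are illustrative, not part of any NI API:

```python
import math
import random

def fake_signal(n=1000, fs=1000.0, freq=10.0, amp=1.0, noise=0.05):
    """Configurable stand-in for a simulated-device source: a sine wave
    at freq Hz, sampled at fs samples/s, with additive Gaussian noise."""
    return [amp * math.sin(2 * math.pi * freq * i / fs) + random.gauss(0.0, noise)
            for i in range(n)]

# Feed this into the same analysis code that normally consumes DAQmx reads,
# so algorithms can be debugged with realistic data before hardware arrives.
```

Seeding `random` first makes runs repeatable, which helps when chasing algorithm bugs.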

Paul

Paul Falkenstein
Coleman Technologies Inc.
CLA, CPI, AIA-Vision
LabVIEW 4.0-2013, RT, Vision, FPGA
Message 3 of 24
I totally agree with falkpl:
configurable simulated data input would be really helpful, not only to check our algorithms, but also to avoid "fake" hardware errors (monitored by AI or DI).
-----------------------------------------------------------------------------------------------------
... And here's where I keep assorted lengths of wires...
Message 4 of 24

Thanks for the feedback so far!!

pallen: We appreciate the kudos on the webcasts.

falkpl: I really like your idea of including a demo of simulated devices in the webcast! It's one of the tools I use all the time as well, especially when working with customers. Do you have any other ideas for the webcast?

We really take user suggestions to heart, and if you'd like, you can suggest configurable input/output on simulated devices (or any other feature you'd like to see in our products) at our Product Feedback webpage.

What stumped me the most when I first started using DAQmx were basic things like the difference between tasks, global tasks, physical channels and virtual channels. This never really stopped me from developing code, but in hindsight, if I had taken the time to understand this, it would have improved my program architecture. Did anyone else experience this when they started out?
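For readers who hit the same wall, here is one way to see the task / physical channel / virtual channel distinction in code. This sketch uses the nidaqmx Python API rather than LabVIEW; the device and channel names ("Dev1", "cabinet_temp") are placeholders, and the import is guarded so the structure can be read, and the file run, without the driver installed:

```python
try:
    import nidaqmx
    HAVE_DAQ = True
except ImportError:
    HAVE_DAQ = False  # no driver installed; the sketch is still readable

def build_task():
    """A task groups virtual channels plus timing and triggering.

    - physical channel: a real terminal on the device, e.g. "Dev1/ai0"
    - virtual channel:  a physical channel plus units, range, and scaling
    - task:             one or more virtual channels with shared timing
    """
    task = nidaqmx.Task("temperature_log")            # the task
    task.ai_channels.add_ai_voltage_chan(
        "Dev1/ai0",                                   # physical channel
        name_to_assign_to_channel="cabinet_temp",     # virtual channel name
        min_val=-5.0, max_val=5.0)                    # part of the virtual channel
    task.timing.cfg_samp_clk_timing(rate=1000.0)      # timing belongs to the task
    return task

if HAVE_DAQ:
    with build_task() as t:
        print(t.read())
```

Getting this mapping straight early on tends to pay off in program architecture, for the reasons described above.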

Thanks, and let's keep the ideas flowing!

Jervin Justin
NI TestStand Product Manager
Message 5 of 24
One tip I find helpful when I have DAQ communication issues is to check the test panels in MAX first, before troubleshooting my LabVIEW code.

Also, one "gotcha" that I wish someone had told me is how to switch between using DAQmx and Traditional DAQ drivers.
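MAX's test panels have no direct scripting equivalent, but a rough programmatic first check is possible with the nidaqmx Python package. This is a sketch only: "Dev1" is a placeholder, and the import is guarded so the file runs without the driver:

```python
try:
    import nidaqmx
    import nidaqmx.system
    HAVE_DAQ = True
except ImportError:
    HAVE_DAQ = False  # driver not installed on this machine

def check_device(name="Dev1"):
    """First sanity check, in the spirit of opening a MAX test panel:
    is the device visible to the driver, and does it pass self-test?"""
    system = nidaqmx.system.System.local()
    names = [d.name for d in system.devices]
    if name not in names:
        return f"{name} not found; driver sees: {names}"
    nidaqmx.system.Device(name).self_test_device()  # raises on failure
    return f"{name} passed self-test"
```

Like the test panels, this separates "the driver can't see the hardware" from "my LabVIEW code has a bug" before any application debugging starts.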
Message 6 of 24
This is a great idea.  No questions or comments at this time.  I have a couple of projects that use DAQmx.  It will be interesting to see the difference from Traditional DAQ.  I'm sure I'll have questions. 🙂
Message 7 of 24

JoeLabVIEW: I agree, it would be pretty handy to show an overview of what has changed in NI-DAQmx from Traditional DAQ, especially for users that are switching their applications to use NI-DAQmx.

FYI, this table gives a brief overview of the differences between the two:

Also, if you are interested, I would recommend the following Developer Zone tutorial that explains how to migrate existing VIs from Traditional DAQ to NI-DAQmx in LabVIEW. There are similar tutorials for other languages as well.
Developer Zone Tutorial: Transition from Traditional NI-DAQ to NI-DAQmx in LabVIEW

MichelleGator: I love the idea of demonstrating the test panels! That's one of the first troubleshooting steps I do when talking to customers, and it really helps narrow down whether their problems originate from software or hardware.

These ideas are great so far; I can't wait to hear more!

Jervin Justin
NI TestStand Product Manager
Message 8 of 24

Gabi1 wrote: I totally agree with falkpl: configurable simulated data input would be really helpful, not only to check our algorithms, but also to avoid "fake" hardware errors (monitored by AI or DI).

HOT DANG YES!!!
 
I just posted a question about this..   here!!!
 
I mean really...  There has got to be a file somewhere which contains the "fake" data, even if it's in binary.  How else do we fully test our code prior to having the actual card...  especially if that card is 1000 miles away 😉  😮  😞  (not sure which smiley is appropriate)
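One pattern that helps when the card is a thousand miles away: route all acquisition through one function that falls back to generated data when the hardware (or driver) is absent. A minimal sketch in Python, assuming the nidaqmx package on the hardware side; "Dev1", the function name, and the signal parameters are placeholders:

```python
import math
import random

def acquire(n=100, use_hardware=False):
    """Return n samples: from the DAQ card when present, otherwise from a
    software stand-in, so the rest of the program can be tested without
    the hardware in front of you."""
    if use_hardware:
        import nidaqmx  # only needed on the machine with the card
        with nidaqmx.Task() as task:
            task.ai_channels.add_ai_voltage_chan("Dev1/ai0")
            return task.read(number_of_samples_per_channel=n)
    # stand-in: 50 Hz sine sampled at 1 kS/s with a little noise
    return [math.sin(2 * math.pi * 50 * i / 1000.0) + random.gauss(0.0, 0.01)
            for i in range(n)]
```

Everything downstream of `acquire()` can then be fully exercised before the actual card ever arrives.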
Message 9 of 24
As I mentioned, there are many things I wish I knew about DAQmx, especially the limits of virtual devices created using MAX.
 
For instance, since they are simulations, how can I verify that what is implemented actually works?  An example is having a clock pulse train and a start pulse.  You'd want to make sure they are in sync and possibly wire this up to some virtual instrument like a scope.  Or having a "virtual power supply".  This is what I am curious about.
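To make the loopback question concrete: the sketch below starts a continuous pulse train on a counter using the nidaqmx Python API (the counter name is a placeholder, and the import is guarded). On a simulated device this runs without error but drives no real edges on any terminal, which is exactly the limitation being described; actually confirming that the clock and a start pulse are in sync needs real hardware or a scope.

```python
try:
    import nidaqmx
    from nidaqmx.constants import AcquisitionType
    HAVE_DAQ = True
except ImportError:
    HAVE_DAQ = False  # no driver installed; sketch only

def start_clock(counter="Dev1/ctr0", freq_hz=1000.0):
    """Start a continuous pulse train on the given counter.
    Route the counter's output to a PFI line and loop it back into an
    input task (or a real scope) to verify edges and timing."""
    task = nidaqmx.Task()
    task.co_channels.add_co_pulse_chan_freq(counter, freq=freq_hz, duty_cycle=0.5)
    task.timing.cfg_implicit_timing(sample_mode=AcquisitionType.CONTINUOUS)
    task.start()
    return task  # caller stops and closes the task when done
```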
 
Thanks
 
RayR
Message 10 of 24