I am trying to generate various simple waveforms in LabVIEW, which will
then be output to DAQ devices, amplifiers, etc. I would
like the user to be able to change the signal's amplitude and frequency
(and phase too, if possible) while the program is running.
I've put together the attached simplified file, which works fine as long
as I am willing to accept low frequencies (i.e., those that still
retain some resolution with a 0.1 s time delay). (Note that for
output devices I would simply tie into the "Simulate Signal" output.)
I would like to be able to use this VI at higher frequencies, but if I
change the 0.1 s time delay to accommodate them, the output timing goes
way off. For example, change the frequency to 1 Hz and the Time
Delay to 0.01 s, and you'll see that the output is much faster than 1
Hz. I assume this has to do with the relationship between the loop's
time delay and the samples-per-second and number-of-samples settings of
"Simulate Signal."
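To make the suspected relationship concrete, here is a small sketch (in Python, since I can't paste a block diagram as text) of the arithmetic I think is at play. The settings are hypothetical examples: 1000 samples per second and 100 samples per iteration, which is 0.1 s of signal generated per loop cycle. The output only plays at the intended rate when the loop delay equals that value; shortening the delay speeds the waveform up proportionally.

```python
# Assumed hypothetical Simulate Signal settings (not taken from the VI):
fs = 1000.0  # samples per second
n = 100      # samples generated per loop iteration

# Each loop iteration emits n / fs seconds' worth of waveform.
signal_time_per_iteration = n / fs  # 0.1 s of signal per iteration

for loop_delay in (0.1, 0.01):
    # If the loop runs faster than it emits signal time, the output
    # appears sped up by this factor.
    speedup = signal_time_per_iteration / loop_delay
    print(f"loop delay {loop_delay} s -> waveform plays at {speedup:g}x real time")
```

With a 0.01 s delay the loop pushes out 0.1 s of waveform every 0.01 s, i.e., ten times too fast, which matches the behavior described above. The fix would presumably be to keep the loop delay equal to (number of samples) / (samples per second), or to let the DAQ output's own sample clock pace the loop instead of a software timer.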
Is there a simple way to solve this problem using Simulate Signal, or
am I going to have to use a different method to create my waveforms?