08-11-2017 03:07 PM
Hi,
I'm wondering whether there are any LabVIEW implementations of an efficient optimization algorithm for "real-world" signals (i.e. a control problem rather than a pure math problem). The goal is to automatically adjust a set of physical parameters (e.g. voltages, motor positions, or the like) so as to maximize a measured "objective" signal. An intuitive example would be aligning a laser beam through a pinhole with a motorized mirror, where the feedback is provided by a photodiode or power meter behind the pinhole.
I thought about using one of the optimization VIs, but I suspect they would have an inherent problem with an objective function derived from a "real-world" measurement because of the noise and/or drift. Maybe it's possible to tune the step size and tolerance to get a reasonable result, but I haven't been able to work out a robust procedure so far.
I'd rather treat it as a control problem, with an approach similar to a PID controller, but instead of locking the "process variable" to a setpoint on a monotonic response curve, it should lock it to an extremum (which can drift over time). I'm not very familiar with control algorithms, so I don't even know whether this is possible at all. Does anyone know a suitable solution?
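To sketch the kind of behaviour I have in mind (in Python rather than LabVIEW, since I can't paste a block diagram here; set_param() and read_objective() are made-up stand-ins for the actual hardware I/O, and I've kept it one-dimensional for simplicity):

```python
def read_objective_averaged(read_objective, n=10):
    """Average several readings to suppress measurement noise."""
    return sum(read_objective() for _ in range(n)) / n

def hill_climb(set_param, read_objective, x0, step=0.01, iterations=100):
    """Naive hill climbing: take a trial step and keep it only if the
    averaged objective improved; otherwise undo it and reverse direction."""
    x = x0
    set_param(x)
    best = read_objective_averaged(read_objective)
    direction = +1
    for _ in range(iterations):
        x_new = x + direction * step
        set_param(x_new)
        value = read_objective_averaged(read_objective)
        if value > best:              # the step helped: keep going this way
            x, best = x_new, value
        else:                         # the step hurt: go back, turn around
            set_param(x)
            direction = -direction
    return x
```

The averaging suppresses the noise somewhat, but a fixed step size still seems fragile against drift, which is why I'm hoping for a more principled control-style approach.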
08-11-2017 03:16 PM
There is a PID toolkit that should come with LabVIEW Full or Professional.
https://knowledge.ni.com/KnowledgeArticleDetails?id=kA00Z0000019RTlSAM&l=en-US
08-11-2017 03:20 PM
I know, but a PID controller can't lock to an extremum: it needs a monotonic process response so that the sign of the error tells it which way to move, whereas around a maximum the slope is zero and changes sign...
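To illustrate the problem (same disclaimer as above: a Python stand-in with a toy response curve): with error = setpoint − measurement, the error has the same sign on both sides of the peak, so once the controller overshoots it just keeps driving away from it.

```python
def objective(x):
    """Toy beam-through-pinhole response, peaked at x = 3."""
    return 1.0 / (1.0 + (x - 3.0) ** 2)

x, setpoint, kp = 0.0, 1.0, 5.0       # setpoint = the (unknown!) peak value
for step in range(10):
    error = setpoint - objective(x)   # sign-blind: >= 0 on both sides of the peak
    x += kp * error                   # P action overshoots, then runs away
    print(f"step {step}: x = {x:7.3f}, power = {objective(x):.3f}")
```

(And in practice the peak value isn't even known in advance, so there is no setpoint to use in the first place.)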
08-12-2017 12:03 PM
There is also a Modulation Toolkit that might help you make a phase-locked loop implementation.
08-13-2017 12:00 AM
That's essentially a limited PID controller (using only P or PI), and the signal is not periodic, so I don't see how I could extract a phase from it to lock onto...
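Unless the idea is to make it periodic on purpose: if I deliberately dither the parameter with a small sinusoid, the objective does pick up a periodic component, and demodulating it (lock-in style) gives the local slope, which an integrator can then drive to zero. I believe this is called "extremum seeking" control. A rough sketch, with the same made-up I/O stand-ins as above:

```python
import math

def extremum_seek(set_param, read_objective, x0,
                  dither_amp=0.005, dither_freq=5.0,
                  gain=0.1, dt=0.01, steps=2000):
    """Dither-based extremum seeking (1-D sketch): superimpose a small
    sinusoidal dither on the parameter, demodulate the measured objective
    with the same sinusoid, low-pass filter the product (~ local gradient),
    and integrate the result to climb toward the maximum."""
    x = x0
    grad_est = 0.0
    for k in range(steps):
        s = math.sin(2 * math.pi * dither_freq * k * dt)
        set_param(x + dither_amp * s)
        y = read_objective()
        grad_est += 0.02 * (y * s - grad_est)   # demodulate + low-pass filter
        x += gain * grad_est * dt               # gradient ascent on the slope estimate
    return x
```

If the maximum drifts, a loop like this would keep tracking it, which sounds like exactly what I'm after; whether it copes with the actual noise levels is another question.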