06-23-2025 11:05 AM
Posting code would be one step, just please save back to LV 2020 or earlier. What's also going to be needed is a much more thorough description of the system, the behavior you're capturing, what's "wrong" about it, and some rationale for why you expect to see something different.
-Kevin P
06-25-2025 03:44 AM
Hi,
Thank you for your response.
I am currently working on a project to develop a DAQ system for real-time, multichannel dosimetry in a LINAC (medical linear accelerator) environment. While I have made significant progress under ideal (lab) conditions, I am encountering a transient behaviour in the data that is limiting my ability to move forward toward real-time clinical scenarios.
This transient behaviour appears consistently and is proving difficult to resolve. Despite gaining a lot of valuable insights from this forum — and I truly appreciate the support and guidance I've received so far — this issue remains a critical bottleneck in my workflow.
I’ve attached a screenshot of the setup and the relevant LabVIEW 2020 code in case it helps provide context.
Any suggestions or insights would be greatly appreciated.
Kind regards
Hasham
06-25-2025 07:56 AM
I reviewed the thread and even followed over to a linked one where I had tried to help you last fall, on mostly the same problem (I think). I'm still not sure I know what "transient behavior" you mean, what you expect different from that, and why.
In the plots up in msg #8, what I see is some noise superimposed on what looks like a "settling into steady state" response. If the ~10 mV worth of apparent settling is the transient you're talking about, the fact that it appears to be rather consistent and repeatable inclines me *not* to suspect the DAQ config as a root cause.
Is your DAQ system just passively observing and measuring things that are already in a steady continuous run state? Or does it (or you) initiate some kind of start-up conditions when you run the program?
I'm mulling a theory about a combination of the DAQ circuitry and the sensor's source impedance contributing to this "settling in" time. The DAQ hardware is a place where there's some degree of startup conditions each time you start the program. But I'm not an EE to be able to really get into the weeds with such a theory.
What happens if you keep collecting out to, say, 1000 samples rather than the 100 shown in those screencaps? Does the data show a pretty horizontal "settled" trend all the way out to 1000? Again, if so, I'm inclined to say that you're capturing something that's real, you just need to understand its cause well enough to eliminate or work around it.
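One quick way to check for that "settled" trend on a longer capture is to compare the fitted slope of the early samples against the tail. Here's a rough Python/numpy sketch of the idea using synthetic stand-in data (everything here -- the 10 mV exponential settle, the noise level, the sample counts -- is illustrative, not taken from your actual setup):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(1000)
# Synthetic stand-in data: a ~10 mV exponential settling transient plus noise.
samples = 2.500 + 0.010 * np.exp(-t / 50.0) + rng.normal(0.0, 0.001, 1000)

# Fit a line to the early samples and to the tail. If the tail is essentially
# flat ("settled") while the head drifts, you're capturing a real physical
# transient rather than a DAQ glitch.
head_slope = np.polyfit(t[:100], samples[:100], 1)[0]
tail_slope = np.polyfit(t[-500:], samples[-500:], 1)[0]
print(f"head slope: {head_slope:.2e} V/sample, tail slope: {tail_slope:.2e} V/sample")
```

With your real data you'd replace `samples` with the 1000-sample capture and see whether the tail slope is near zero.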
I'm also mulling a theory that your repeatedly retriggered 2-sample finite acquisition might not be the best approach. Why exactly 2 samples? And why approach the maximum available sample rate for those 2 samples?
My *guess* is that you really only want 1 sample but are stuck taking 2 because DAQmx requires at least 2 for finite acquisition. And then you use a max sample rate to try to "waste" as little time as possible getting the 2nd sample. But it's also the case that max sample rates are more demanding on the DAQ circuitry, possibly playing a role in this brief "settling in at startup" behavior.
A further troubleshooting thought occurs to me now. You've been at this a long time, so bear with me and take just a little more time to go down a different path for the sake of diagnosis. Start with a simple shipping example for continuous voltage acquisition. Configure it to use TDMS logging and capture 2 channels -- one of your sensors and the 360 Hz sine wave you want to use as an analog trigger. For starters, capture at 100 kHz and run for about 3 seconds.
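For reference, the same acquisition-plus-TDMS-logging configuration sketched with the nidaqmx Python API (device/channel names, rates, and the file path are all placeholders -- the equivalent settings apply in the LabVIEW shipping example):

```python
import nidaqmx
from nidaqmx.constants import AcquisitionType, LoggingMode, LoggingOperation

with nidaqmx.Task() as task:
    task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # your sensor (assumed name)
    task.ai_channels.add_ai_voltage_chan("Dev1/ai1")   # 360 Hz trigger reference
    task.timing.cfg_samp_clk_timing(100_000, sample_mode=AcquisitionType.CONTINUOUS)
    # Stream everything to disk while also making it available to read.
    task.in_stream.configure_logging(
        "capture.tdms",
        logging_mode=LoggingMode.LOG_AND_READ,
        operation=LoggingOperation.CREATE_OR_REPLACE,
    )
    task.start()
    # Pull ~3 seconds of data; it is simultaneously logged to capture.tdms.
    data = task.read(number_of_samples_per_channel=300_000, timeout=10.0)
```

This is hardware-dependent configuration, so treat it as a template to adapt rather than something to run as-is.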
Now you've got a data set that gives you a more continuous view of things. You can put together a fairly simple algorithm that searches through the sine wave data and identifies the points that *should* have produced a trigger event, which in turn identifies the sensor sample that *should* have been captured in response to that trigger.
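That search algorithm can be pretty simple. Here's one way to sketch it in Python/numpy with synthetic stand-in data (the rate, duration, threshold, and waveforms are all assumptions for illustration): find the rising-edge threshold crossings in the logged 360 Hz sine, then pick out the sensor sample at each crossing -- the sample a hardware analog trigger *should* have captured.

```python
import numpy as np

fs = 100_000          # assumed logging rate, Hz
dur = 0.05            # 50 ms of data for the example
t = np.arange(int(fs * dur)) / fs
trig_wave = np.sin(2 * np.pi * 360 * t)           # stand-in for the 360 Hz channel
sensor = 2.5 + 0.001 * np.sin(2 * np.pi * 5 * t)  # stand-in for the sensor channel

threshold = 0.0
above = trig_wave >= threshold
# Rising edges: sample is at/above threshold, previous sample was below it.
edges = np.flatnonzero(above[1:] & ~above[:-1]) + 1
captured = sensor[edges]   # the samples a real analog trigger would have grabbed

print(f"{len(edges)} trigger events found")
```

On your logged TDMS data you'd load the two channels in place of `trig_wave` and `sensor`, and `captured` becomes the reconstructed "one sample per trigger" record to compare against what your retriggered acquisition produces.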
Do you still see that same startup "settling in" behavior? What happens if you increase to 500 kHz sample rate? How about if you decrease to 25 kHz?
That's a very specific suggestion, but the general one is to start exploring this problem from different angles. Poke, prod, see what happens. Try to understand *why* it happens. Use that understanding to solve or work around the anomalies.

An extremely crude example might possibly be: accept that on startup, you need to ignore the first second or so worth of data as things "settle in". Only start believing data that comes after that. The rationale is that you're keeping the same data you'd have gotten from a perfect system if you had hit the start button 1 second later. Seen in that light, it might be totally acceptable to ignore that first second.
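In code, that crude work-around is just a slice (the sample rate, capture length, and settling time here are placeholders):

```python
import numpy as np

fs = 100_000                        # assumed sample rate, Hz
data = np.zeros(3 * fs)             # stand-in for a 3-second capture
settle_s = 1.0                      # how long to distrust at startup
trusted = data[int(settle_s * fs):] # keep only post-settling samples
print(trusted.size)                 # samples remaining after the discard
```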
-Kevin P