04-24-2023 09:31 AM - edited 04-24-2023 09:35 AM
So what does Python tell you if you feed this data to the FitExp2 function?
Once you have reasonable data that can be fit to a double exponential, we can compare results.
Yes, given some reasonable guesses, we can do a double exponential fit, but it is extremely fragile and I would not trust it!
The model is taken from my old code:
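The original model is a LabVIEW VI and is not reproduced here, but a rough Python sketch of a generic double-exponential fit (all parameter values, guesses, and the synthetic data below are my own placeholders, not the thread's actual model or data) might look like:

```python
import numpy as np
from scipy.optimize import curve_fit

def double_exp(x, a1, tau1, a2, tau2, c):
    # y = A1*exp(-x/tau1) + A2*exp(-x/tau2) + offset
    return a1 * np.exp(-x / tau1) + a2 * np.exp(-x / tau2) + c

# Synthetic stand-in for the measured data (the real data is attached to the thread)
x = np.linspace(0, 50, 60)
y = double_exp(x, 5.0, 1.5, 2.0, 20.0, 1.0)

# Initial guesses matter a lot for a model this ill-conditioned
p0 = [4.0, 1.0, 1.0, 15.0, 0.5]
popt, pcov = curve_fit(double_exp, x, y, p0=p0)
```

With clean data and reasonable starting guesses the fit converges; the fragility shows up as soon as noise is added.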
04-24-2023 10:39 PM - edited 04-24-2023 10:41 PM
I do really appreciate your great support on this case.
It seems my previous reply disappeared for some reason.
I found bugs in my previous raw-data processing; the corrected data is attached again. There is also one point where I confused myself, or rather where I do not have a good understanding of how to proceed from a mathematical point of view.
As you can see from the graph, the first 10 points are almost flat, and the next 3-5 points after that drop fast. My plan was to do a double-exponential fit on those points to get rid of the variation in the fast-dropping data, then combine the fitted values with the rest of the data as the final data set. After that, I would check whether a moving average or another fit clarifies the trend, and read the values off at the checkpoints. We have a lot of data of this kind to process; that is why I would like to use fitting to get reasonable values when reading them at different checkpoints.
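The plan described above (fit the fast-dropping segment, splice the fitted values back into the series, then smooth) could be sketched in Python roughly like this; the segment boundaries, model parameters, and window size are all placeholder assumptions, not values from the thread's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def double_exp(x, a1, tau1, a2, tau2, c):
    return a1 * np.exp(-x / tau1) + a2 * np.exp(-x / tau2) + c

# Placeholder data: flat start, then a two-exponential drop (indices are assumptions)
x = np.arange(40.0)
y = np.where(x < 10, 5.5,
             3.0 * np.exp(-(x - 10) / 1.5) + 1.5 * np.exp(-(x - 10) / 8.0) + 1.0)

# 1. Fit the double exponential over the dropping region only
drop = slice(10, 25)
p0 = [2.5, 1.0, 1.0, 6.0, 0.5]
popt, _ = curve_fit(double_exp, x[drop] - x[drop][0], y[drop], p0=p0, maxfev=20000)

# 2. Replace the noisy dropping points with the fitted curve
y_clean = y.copy()
y_clean[drop] = double_exp(x[drop] - x[drop][0], *popt)

# 3. Smooth the combined series with a simple moving average
win = 3
y_smooth = np.convolve(y_clean, np.ones(win) / win, mode="same")
```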
Your demo VI is exactly what I am looking for and want to learn from. Could you please share it for learning purposes? Thanks in advance.
04-25-2023 09:43 AM - edited 04-25-2023 10:20 AM
Your new data is equally bad. The noise is similar in size to the amplitude of the slower component, while the faster component is defined by only about two noisy points and is thus very, very ill-posed. The tiniest amount of noise can change A1 and Tau1 by orders of magnitude.
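A quick way to see this ill-posedness numerically (using synthetic data with my own parameter choices, not the attached data) is to fit the same underlying curve under two different small-noise realizations and compare the recovered fast-component parameters:

```python
import numpy as np
from scipy.optimize import curve_fit

def double_exp(x, a1, tau1, a2, tau2):
    return a1 * np.exp(-x / tau1) + a2 * np.exp(-x / tau2)

rng = np.random.default_rng(0)
# Fast component (tau1 = 0.8) is sampled by only a couple of points,
# similar to the situation in the attached data
x = np.arange(0.0, 30.0, 1.0)
y_true = double_exp(x, 10.0, 0.8, 2.0, 15.0)

p0 = [8.0, 1.0, 1.0, 10.0]
fits = []
for _ in range(2):
    y = y_true + rng.normal(0.0, 0.05, x.size)  # small added noise
    popt, _ = curve_fit(double_exp, x, y, p0=p0, maxfev=10000)
    fits.append(popt)

# Compare fits[0] and fits[1]: the fast parameters (a1, tau1) move much more
# between noise realizations than the slow ones, because so few points define them.
```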
The code was simplified from the link I gave you earlier; the model was even kept the same. I can attach it later.
04-25-2023 09:57 AM
Thanks again.
Indeed, it is really difficult to process this data, but it seems my friend managed it with Python. I tried to do something similar here but probably failed on the algorithm :). I am going to spend some time learning from that code and studying what the magic is there.
I learned a lot from this LabVIEW exercise, especially with your support. I will be back later once I understand his method for this data processing.
Best Regards.
04-25-2023 10:16 AM
Here's a very simplified draft. Let me know if you have any questions.
Note that the amplitude values would look significantly more reasonable if we truncated the first points and set the start of the valid data to x = 0 (not shown).
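In Python terms, that truncation and re-zeroing amounts to something like the following (the start index and the placeholder signal are my own assumptions):

```python
import numpy as np

x = np.arange(40.0)
y = np.exp(-(x - 10) / 5.0)  # placeholder signal, not the thread's data

start = 10                       # assumed index where the valid decay begins
x_valid = x[start:] - x[start]   # shift so the valid data begins at x = 0
y_valid = y[start:]

# With x starting at zero, the fitted amplitudes A1 and A2 are the values at
# t = 0 of the decay itself, instead of values back-extrapolated to the
# original origin, which can be unphysically huge for short time constants.
```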
04-25-2023 10:57 AM
@James_tt wrote:
But seems my friend made it with python.
Not sure what "it" is, but if he gave you precise results with that kind of data, don't believe anything. No algorithm in the world can turn dirt into gold!
LabVIEW has all the tools to determine correlations and estimates of parameter errors. (not shown).
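For comparison with the Python side of this thread, the analogous error estimate comes from the covariance matrix that scipy's curve_fit returns; everything below is a generic sketch with made-up data, not the thread's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def double_exp(x, a1, tau1, a2, tau2):
    return a1 * np.exp(-x / tau1) + a2 * np.exp(-x / tau2)

rng = np.random.default_rng(1)
x = np.linspace(0, 40, 80)
y = double_exp(x, 5.0, 1.2, 2.0, 12.0) + rng.normal(0.0, 0.05, x.size)

popt, pcov = curve_fit(double_exp, x, y, p0=[4.0, 1.0, 1.0, 10.0])
perr = np.sqrt(np.diag(pcov))        # 1-sigma uncertainty of each parameter
corr = pcov / np.outer(perr, perr)   # parameter correlation matrix

# Large perr relative to popt, or |corr| entries near 1, flag an
# ill-posed fit where the parameters are not individually trustworthy.
```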