LabVIEW


curve fitting - some hard stuff

I'm fitting experimental data with a Levenberg-Marquardt algorithm. A simplified version of the model equation is:

Y = P1 * { exp(K * (X - Y * P2)) - 1 }

with the free parameters P1 and P2. Note that this is an implicit formula, with Y on both sides of the '='. Most of the time the fit doesn't work, so here is a bunch of questions that may be more mathematical in nature. Stuck with LV 7.1, I need to code most of it on my own.

* The parameters differ by several orders of magnitude (up to E+15!), so I get severe numerical errors. Is it possible to 'normalize' the parameters?

* How do I apply boundaries to the parameters? Is it correct to coerce them after each iteration?

* How do I deal with the implicit formula? Currently I ignore the fact that Y occurs on the right-hand side of the equation (I use the measurement data directly). Is it straightforward to implement Lev-Marq for data points with F(x,y) = 0?

* In addition, the literature suggests using 'orthogonal distance regression' (it takes into account that both X and Y are 'noisy' signals). I haven't found any description of this method.

Thanks in advance,
Felix
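The normalization question has a standard answer: rescale the problem so the optimizer only ever sees numbers of order one, e.g. by fitting the logarithm of a tiny parameter and dividing the residuals by the data's overall magnitude. A minimal sketch in Python/SciPy (not LabVIEW 7.1; the toy model and all constants are illustrative assumptions, not the poster's confidential data):

```python
import numpy as np
from scipy.optimize import least_squares

# Toy model: y = p1 * (exp(p2 * x) - 1) with p1 ~ 1e-12 (values made up).
x = np.linspace(0.0, 1.0, 50)
p1_true, p2_true = 3.0e-12, 8.0
y = p1_true * (np.exp(p2_true * x) - 1.0)
scale = y.max()  # normalize residuals so they are O(1)

def residuals(q):
    # q = [log10(p1), p2]: the optimizer only ever sees O(1) numbers,
    # and p1 = 10**q[0] is positive by construction.
    p1, p2 = 10.0 ** q[0], q[1]
    return (p1 * (np.exp(p2 * x) - 1.0) - y) / scale

sol = least_squares(residuals, x0=[-10.0, 5.0])  # crude initial guess
p1_fit, p2_fit = 10.0 ** sol.x[0], sol.x[1]
```

Fitting log10(P1) also enforces P1 > 0 by construction, which partly answers the boundary question without having to coerce values after each iteration.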
Message 1 of 6

Hi Felix,

Can you attach your code? What do you want to do in general?

Regards,
Mary
NIG

Message 2 of 6

 

Do you have some typical data? What are typical parameter values? What are typical ranges for X and Y?
It would really help to upgrade to a newer LabVIEW version...

 

Message 3 of 6

Just realized that my post didn't get the correct formatting.

My code is actually working fine. Typical data is confidential, so I can't give you anything to play around with. The measurement data is really good (not noisy at all); the Y values are acquired with a large dynamic range, covering about 8 orders of magnitude.

 

Here are my old questions again; I hope they get formatted this time:

* The parameters differ by several orders of magnitude (up to E+15!), so I get severe numerical errors. Is it possible to 'normalize' the parameters?

* How do I apply boundaries to the parameters? Is it correct to coerce them after each iteration?

* How do I deal with the implicit formula? Currently I ignore the fact that Y occurs on the right-hand side of the equation (I use the measurement data directly). Is it straightforward to implement Lev-Marq for data points with F(x,y) = 0?

* In addition, the literature suggests using 'orthogonal distance regression' (it takes into account that both X and Y are 'noisy' signals). I haven't found any description of this method.
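For the implicit-formula bullet, one standard approach is to solve the model equation for Y numerically at every X inside the residual function, instead of substituting the measured Y on the right-hand side. A sketch in Python/SciPy, with made-up values for K, P1, and P2 (the real model and data are confidential):

```python
import numpy as np
from scipy.optimize import brentq, least_squares

K = 1.0  # assumed known; all constants here are made up

def y_model(x, p1, p2):
    # Solve y = p1*(exp(K*(x - p2*y)) - 1) for y at a single x.
    # For p1, p2, K > 0 the right-hand side decreases in y, so the
    # root is unique and bracketed by [0, RHS(y=0)].
    def F(y):
        return p1 * (np.exp(K * (x - p2 * y)) - 1.0) - y
    hi = p1 * (np.exp(K * x) - 1.0)
    if hi <= 0.0:
        return 0.0
    return brentq(F, 0.0, hi)

# Synthetic "measurement" generated from known parameters.
p1_true, p2_true = 2.0, 0.5
xs = np.linspace(0.1, 2.0, 25)
ys = np.array([y_model(xv, p1_true, p2_true) for xv in xs])

def residuals(p):
    return np.array([y_model(xv, p[0], p[1]) for xv in xs]) - ys

# Bounds keep p1, p2 positive so the bracket above stays valid.
fit = least_squares(residuals, x0=[1.0, 1.0],
                    bounds=([1e-6, 1e-6], [np.inf, np.inf]))
```

For the last bullet: SciPy's `scipy.odr` module wraps the ODRPACK implementation of orthogonal distance regression, which may at least serve as a description of the method even if it can't be called from LV 7.1.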

 

Here are my new insights: I am mainly dealing with numerical problems. The formula in my first post contains a really nasty part:

exp(K * (X - Y * P2))

where P2 is one of the parameters I'd like to obtain. If the initial guess isn't good, the term inside the brackets evaluates to several hundred (for some X-Y pairs), and exponentiating that gives values in the range of E+100. In this situation, any variation of the other parameters is buried under this value. So in some situations the fit terminates (not exactly matching the data, but somehow OK), yet my initial guess for the other parameter P1 didn't change at all during the fit (P1 is also about 15 orders of magnitude smaller than P2).

 

Have a nice weekend!

 

Felix

Message 4 of 6

Felix,

 

Your equation can be solved explicitly for X.  What happens if you try to fit the data to that equation (with the roles of X and Y exchanged)?

 

X = P2*Y + (1/K)*ln(Y/P1 + 1)

 

If the curve fitting will find suitable values for P1 and P2 for this equation, try them in the original equation.
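Lynn's suggestion can be sketched as follows (Python/SciPy with made-up constants and synthetic data, for illustration only): treat Y as the independent variable and fit the explicit function X(Y), which has no implicit term:

```python
import numpy as np
from scipy.optimize import curve_fit

K = 1.0  # assumed known; all constants are illustrative

def x_of_y(y, p1, p2):
    # The model solved explicitly for x, with y as independent variable.
    return p2 * y + np.log(y / p1 + 1.0) / K

# Synthetic data from known parameters (the real data is confidential).
p1_true, p2_true = 2.0, 0.5
ys = np.linspace(0.1, 5.0, 30)
xs = x_of_y(ys, p1_true, p2_true)

# Bounds keep p1 positive so the logarithm stays defined.
popt, _ = curve_fit(x_of_y, ys, xs, p0=[1.0, 1.0],
                    bounds=([1e-6, -10.0], [100.0, 10.0]))
p1_fit, p2_fit = popt
```

Note that swapping the axes also changes which variable carries the fit error, so the recovered P1 and P2 should be checked against the original equation, as Lynn says.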

 

Lynn

Message 5 of 6
Lynn, the complete model is more complex; I only posted the term that is causing the trouble. Felix
Message 6 of 6