LabVIEW


multivariate fitting issues

Need a hand: I have a dataset Y, composed of 256 points. Y is theoretically composed of contributions from 5 other data sets, i.e.
Y = a*Signal1 + b*Signal2 + c*Signal3 ... etc. I need to retrieve the coefficients a, b, c, .... I can easily do this in Mathcad simply by minimizing the sum of squares of the differences between Y and the weighted sum of the signals, with something like a downhill simplex or conjugate gradient minimization. However, I cannot seem to retrieve these coefficients properly in LabVIEW using the General Least Squares or the Levenberg-Marquardt algorithm. I have also tried a modified Levenberg-Marquardt algorithm written by Christian Altenbach, which calls a VI rather than requiring an input string. Thank you in advance for your help.
Chris
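For readers following along outside LabVIEW, the Mathcad-style approach described above (minimize the sum of squared differences with a downhill simplex) can be sketched in Python with NumPy/SciPy. The signal data here is synthetic and purely illustrative:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
signals = rng.normal(size=(5, 256))            # 5 basis signals, 256 points each
true_coeffs = np.array([1.5, -0.3, 2.0, 0.7, 0.1])
Y = true_coeffs @ signals                      # noiseless linear combination

# Sum-of-squares objective, minimized with downhill simplex (Nelder-Mead),
# mirroring the Mathcad approach described in the post.
def sse(c):
    return np.sum((Y - c @ signals) ** 2)

res = minimize(sse, x0=np.zeros(5), method="Nelder-Mead",
               options={"xatol": 1e-10, "fatol": 1e-10, "maxiter": 20000})
print(res.x)   # should land close to true_coeffs
```

Because the model is linear in the coefficients, a direct linear least-squares solve (as the replies below suggest) is faster and more robust than a general-purpose simplex search.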
Message 1 of 16


@bassbuckeye wrote:
Y = a*Signal1+b*Signal2 + c*Signal3....etc.  I need to retrieve the coefficients a,b,c.... 


This should be trivial using the "general LS fit". Just compose the "H" matrix by making a 2D array of all your signals. The rest should fall into place. 🙂

Let me know if you get stuck. (I have a nice example, but I am currently traveling, so I don't have access.)
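The H-matrix construction Christian describes can be sketched outside LabVIEW as follows: stack the signals into a 2D array (one column per signal) and solve the linear system. This is a NumPy sketch with illustrative data, not Christian's missing example:

```python
import numpy as np

rng = np.random.default_rng(1)
signals = rng.normal(size=(5, 256))
true_coeffs = np.array([2.0, 0.5, -1.0, 3.0, 0.25])
Y = true_coeffs @ signals

# The "H" matrix of the general LS fit: one column per signal,
# one row per data point (shape 256 x 5).
H = signals.T

# Linear least squares: solve H @ c ≈ Y for the coefficients c.
coeffs, *_ = np.linalg.lstsq(H, Y, rcond=None)
print(coeffs)
```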

Message 2 of 16

Actually, the General Least Squares fit gives very poor results. I have been very careful in constructing and checking the H matrix, yet the fit is still very poor. Even when fitting laboratory standards, which work exceptionally well in Mathcad, the fit fails to give values even close to the range I would expect. I am not sure why, but I suspect that chi-square is not the ideal function to minimize in this case. I would like to perform a fit by minimizing the sum of squares, but constructing the model string is not so simple for large models, and the time needed for the algorithm to converge is quite long, even with single-variable models. I am baffled as to why a normal least squares fit fails here, as the same algorithm is used in several earlier steps of my data processing. If anyone has an idea or suggestion, I would greatly appreciate it.

Chris

Message 3 of 16


@bassbuckeye wrote:

Actually, the General Least Squares gives very poor fits.  I have been very careful in constructing and checking the H matrix, yet the fit is still very poor. 


That really surprises me. What is your LabVIEW version? What algorithm do you use (try e.g. Givens instead of the default SVD)? How many signals do you have? How many points in each signal? Are some of the signals very similar? Do they differ dramatically in amplitude? Would you mind posting a set of signals and a typical linear combination to be fit?
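The two failure modes Christian probes for here, nearly identical signals and wildly different amplitudes, both show up as a large condition number of the H matrix. A quick diagnostic sketch (synthetic data for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
base = rng.normal(size=256)
# Two nearly identical signals plus one with a huge amplitude:
signals = np.stack([base,
                    base + 1e-6 * rng.normal(size=256),   # almost collinear
                    1e6 * rng.normal(size=256)])          # wildly different scale
H = signals.T

# A large condition number means the least-squares solution is
# numerically ill-determined, exactly the symptoms asked about above.
print(np.linalg.cond(H))
```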

 

Message 4 of 16

Thanks Christian, problem solved. One set of data values was coming from a subVI that I had rewritten, and I forgot to scale the values. This signal was so much larger than the others that it was confounding the fit, as you said. I don't like to mix subjects in these threads, but I did have a question concerning the Levenberg-Marquardt algorithm you wrote that references a VI (very elegant, by the way). Is it possible to constrain the fit so that the coefficients searched by the algorithm stay within predefined limits? If not, do you have any suggestions for alternative algorithms?
Thanks,
Chris
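The scaling problem described above (one signal orders of magnitude larger than the rest) can also be defused numerically by normalizing each column of H to unit norm before solving, then rescaling the coefficients afterwards. A sketch with made-up data:

```python
import numpy as np

rng = np.random.default_rng(3)
signals = rng.normal(size=(3, 256))
signals[0] *= 1e5                      # one unscaled signal dominates, as in the post
true_coeffs = np.array([2e-5, 1.0, -0.5])
Y = true_coeffs @ signals

H = signals.T
# Normalize each column to unit norm, fit, then undo the scaling.
norms = np.linalg.norm(H, axis=0)
c_scaled, *_ = np.linalg.lstsq(H / norms, Y, rcond=None)
coeffs = c_scaled / norms
print(coeffs)
```

This does not replace fixing the upstream scaling bug, but it keeps the solver well-conditioned when signal amplitudes legitimately differ.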

Message 5 of 16
Hi there. I also have the same problem. Is there any way to constrain the coefficients to some limits in the general least squares fitting, e.g. all coefficients >= 0?
 
many thanks,
jan 
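For the specific constraint jan asks about (all coefficients >= 0), there is a dedicated linear algorithm, non-negative least squares. Outside LabVIEW it is available directly, e.g. in SciPy; a minimal sketch with illustrative data:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(4)
H = rng.normal(size=(256, 4))
true_coeffs = np.array([1.0, 0.0, 2.5, 0.3])   # all >= 0
Y = H @ true_coeffs

# Non-negative least squares: minimizes ||H c - Y|| subject to c >= 0.
coeffs, residual = nnls(H, Y)
print(coeffs)
```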
Message 6 of 16
Message 7 of 16
I had success by reparameterizing the output coefficients prior to each call of the nonlinear fit. The equations for doing this can be found in:
McDonald, Roderick P. (1980). A simple comprehensive model for the analysis of covariance structures: Some remarks on applications. British Journal of Mathematical and Statistical Psychology, 33. You will most likely not be able to find this article in an electronic collection and will have to find it in the stacks. Reparameterization worked for me, but I am interested in this particular fitting routine from a purely utilitarian standpoint. This may cause some complications if you are interested in the statistics of the fit (I am not).
Message 8 of 16

I don't think there is an easy way using the general LS fit, the algorithm mostly discussed in this thread. 🙂

Constraints are not directly supported by the LabVIEW tools, but the answer is yes if you rewrite your model in terms of transformed parameters and use nonlinear least squares (Levenberg-Marquardt).

If you cannot find the article quoted above, the following link shows some typical cases:

http://v8doc.sas.com/sashtml/stat/chap19/sect41.htm

(I posted a similar link (now dead) very long ago: http://forums.ni.com/ni/board/message?board.id=170&message.id=152529#M152529)

 

Message Edited by altenbach on 10-12-2006 11:55 AM

Message 9 of 16
Oops, I guess after a few months I should have re-read my own post. lol To bcvan: I resolved my fitting issues by correcting two problems: 1. there was a degree of misalignment between the observed and theoretical data that I was fitting, and 2. one of the theoretical data sets was much larger than the others and would confound the fit. Fixing the second was easy; fixing the first was a little more difficult and was done by adjusting the alignment non-linearly and fitting linearly. If you find that misalignment is your problem, I have a VI you can look at to try to correct this.
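The "align non-linearly, fit linearly" recipe above can be sketched in two steps: estimate the shift by cross-correlation (the nonlinear part), then fit the amplitude on the aligned signal (the linear part). This is an illustrative sketch with a synthetic circularly shifted signal, not the poster's VI:

```python
import numpy as np

rng = np.random.default_rng(6)
template = np.sin(np.linspace(0, 8 * np.pi, 256))
shift = 5
observed = 3.0 * np.roll(template, shift)      # misaligned, scaled copy

# Step 1 (nonlinear part): estimate the shift by cross-correlation.
xcorr = [np.dot(observed, np.roll(template, s)) for s in range(-10, 11)]
best_shift = range(-10, 11)[int(np.argmax(xcorr))]

# Step 2 (linear part): fit the amplitude on the aligned template.
aligned = np.roll(template, best_shift)
a = np.dot(observed, aligned) / np.dot(aligned, aligned)
print(best_shift, a)
```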
Message 10 of 16