matlab,octave,pca,missing-data,least-squares

Thanks for all your help. I went through the references and was able to find the MATLAB code for the ALS algorithm in two of them. For anybody wondering, the source code can be found at these two links: 1) http://research.ics.aalto.fi/bayes/software/index.shtml 2) https://www.cs.nyu.edu/~roweis/code.html...

matlab,least-squares,nonlinear-optimization

My favorite is lsqcurvefit from the Optimization Toolbox. From the documentation you see that it requires:
- a function handle (fun)
- starting values for the parameters (x0)
- additional non-fitted parameters (xdata), which in your case do not exist
- data values (ydata)
Options can be set with optimset, where you can specify one of...
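For readers coming from Python, scipy.optimize.curve_fit plays a similar role to lsqcurvefit. A minimal sketch, with an invented exponential model and noise-free synthetic data (the model, parameter values, and starting point are all assumptions for illustration):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical model: y = a * exp(b * x)
def model(x, a, b):
    return a * np.exp(b * x)

xdata = np.linspace(0, 1, 50)
ydata = model(xdata, 2.0, -1.3)  # synthetic, noise-free data

# fun, x0, xdata, ydata map onto curve_fit's f, p0, xdata, ydata
popt, pcov = curve_fit(model, xdata, ydata, p0=[1.0, -1.0])
print(popt)  # should recover approximately a=2.0, b=-1.3
```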

python,numpy,matplotlib,scipy,least-squares

According to the documentation of scipy.linalg.lstsq (http://docs.scipy.org/doc/scipy-0.15.1/reference/generated/scipy.linalg.lstsq.html), the estimated coefficients should be stored in your variable C (the order corresponding to the columns in A). To print your equation with the estimated coefficients showing 2 digits after the decimal point:

print('f(x,y) = {:.2f}x^2+{:.2f}y^2+{:.2f}xy+{:.2f}x+{:.2f}y+{:.2f}'.format(C[4], C[5], C[3], C[1], C[2], C[0]))

or:

print('f(x,y) = {4:.2f}x^2+{5:.2f}y^2+{3:.2f}xy+{1:.2f}x+{2:.2f}y+{0:.2f}'.format(*C))...
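A self-contained sketch of the whole round trip, assuming the column ordering 1, x, y, xy, x^2, y^2 so that the indices match C[0]..C[5] above (the data and coefficients are invented):

```python
import numpy as np
from scipy.linalg import lstsq

rng = np.random.default_rng(0)
x = rng.random(50)
y = rng.random(50)

# Design matrix columns ordered 1, x, y, xy, x^2, y^2 to match C[0]..C[5]
A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
true_C = np.array([1.0, -2.0, 3.0, 0.5, 4.0, -1.5])  # invented coefficients
z = A @ true_C  # noise-free synthetic data

C, res, rank, sv = lstsq(A, z)
print('f(x,y) = {4:.2f}x^2+{5:.2f}y^2+{3:.2f}xy+{1:.2f}x+{2:.2f}y+{0:.2f}'.format(*C))
```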

python,linear-regression,least-squares

What makes you so sure your residuals are not normally distributed? One way to check this assumption is a Q-Q plot. From a pragmatic perspective, most people will just look at a scatterplot of their data to see whether the residuals are normally distributed. Often a violation of...
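A quick way to get the Q-Q numbers without drawing anything is scipy.stats.probplot; the residuals below are synthetic stand-ins for a model's actual residuals:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
residuals = rng.normal(size=200)  # stand-in for your model's residuals

# probplot returns theoretical quantiles vs ordered data, plus a line fit;
# r close to 1 suggests the residuals are approximately normal
(osm, osr), (slope, intercept, r) = stats.probplot(residuals, dist="norm")
print(r)
```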

matlab,math,curve-fitting,linear-programming,least-squares

You want to find an approximate solution x to A x = b in the least-squares sense, i.e. you want to minimize ||A x - b||^2 = x^T A^T A x + b^T b - 2 x^T A^T b. Disregarding the constant term b^T b and dividing by a factor of 2,...
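Setting the gradient of that quadratic to zero gives the normal equations A^T A x = A^T b. A quick numpy check that they yield the same minimizer as a library least-squares solver (A and b random, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.random((10, 3))  # overdetermined, full column rank (generically)
b = rng.random(10)

# Minimizer of ||Ax - b||^2 satisfies the normal equations A^T A x = A^T b
x_normal = np.linalg.solve(A.T @ A, A.T @ b)
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_normal, x_lstsq))  # True
```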

Rewrite the quantity to minimise as
||Xa - b||^2
= (definition of the Frobenius norm)
Tr{(Xa - b)(Xa - b)'}
= (expand the matrix product)
Tr{Xaa'X' - ba'X' - Xab' + bb'}
= (linearity of the trace operator)
Tr{Xaa'X'} - Tr{ba'X'} - Tr{Xab'} + Tr{bb'}
= (trace of transpose of...

python,numpy,scipy,linear-algebra,least-squares

Look at this:
A x = b
x^T A^T = b^T
where A^T indicates the transpose of A. Now define the symbols Ap = x^T, Xp = A^T and bp = b^T, and your problem becomes:
Ap Xp = bp
which is exactly in the form that you can treat with least squares...
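A small numpy sketch of this transpose trick: here the unknown is the matrix A itself, and A_true, x are invented so the system is consistent:

```python
import numpy as np

rng = np.random.default_rng(3)
A_true = rng.random((3, 3))
x = rng.random((3, 5))   # known inputs, stacked as columns
b = A_true @ x           # observations, so A x = b with A unknown

# Transpose: x^T A^T = b^T is a standard least-squares problem in A^T
At, *_ = np.linalg.lstsq(x.T, b.T, rcond=None)
A_est = At.T
print(np.allclose(A_est, A_true))  # True
```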

Always pass a data.frame to lm if you want to predict:

a <- mtcars$mpg
x <- data.matrix(cbind(mtcars$wt, mtcars$hp))
DF <- data.frame(a, x)
xTest <- x[2,] # We will use this for prediction later
fitCar <- lm(a ~ ., data = DF)
yPred <- predict(fitCar, newdata = data.frame(X1 = xTest[1], X2 =...

multidimensional-array,scipy,vectorization,linear-algebra,least-squares

You can gain some speed by making use of the stack-of-matrices feature of numpy.linalg routines. This doesn't yet work for numpy.linalg.lstsq, but numpy.linalg.svd does, so you can implement lstsq yourself:

import numpy as np

def stacked_lstsq(L, b, rcond=1e-10):
    """ Solve L x = b, via SVD least squares...

w needs to be 3x3, so use diag to construct w as a matrix with those values on the diagonal instead of using a vector:

x <- matrix(c(1,2,3,4,5,6), nrow=3, ncol=2, byrow=T)
xt <- t(x)
w <- diag(c(7,8,9))
xt %*% w %*% x
...

python,pandas,least-squares,statsmodels

statsmodels is not directly of any help here, at least not yet. I think your linearized non-linear least-squares optimization is essentially what scipy.optimize.leastsq does internally. It has several more user-friendly or extended wrappers, for example scipy.optimize.curve_fit or the lmfit package. statsmodels currently does not have a generic version...
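For reference, a minimal scipy.optimize.leastsq call; the linear model, data, and starting point below are invented for illustration:

```python
import numpy as np
from scipy.optimize import leastsq

# leastsq minimizes the sum of squares of a residual function
def residuals(params, x, y):
    a, b = params
    return y - (a * x + b)  # residuals of a hypothetical linear model

x = np.linspace(0, 10, 20)
y = 3.0 * x + 1.0  # noise-free synthetic data

params, flag = leastsq(residuals, x0=[1.0, 0.0], args=(x, y))
print(params)  # should recover approximately a=3.0, b=1.0
```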

python,algorithm,feature-detection,least-squares

(1, 1) is too far from the correct value; you most probably fall into some local optimum. Try starting from a point much closer to the real center. You can find it by first fitting a straight line to your cluster; then separate the points into two halves, according to which half...

r,optimization,linear-regression,least-squares

You are describing linear regression, which can be done with the lm function:

coefficients(lm(v ~ t(B) + 0))
#     t(B)1      t(B)2      t(B)3
# 0.2280676 -0.1505233  0.7431653
...

matlab,least-squares,polynomials

It sounds like you have the fitting toolbox and want to just remove a possible coefficient. If that's the case, here's a way to do it.

% What is the degree of the polynomial (cubic)
polyDegree = 3;
% What powers do you want to skip (x^2 and x)
skipPowers = [2,...

r,optimization,regression,least-squares

Use the fact that vec(AXA') = (A ⊗ A) vec(X), so:

k <- ncol(A)
AA1 <- kronecker(A, A)[, c(diag(k)) == 1]
lm.fit(AA1, c(V))
...
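The vec/Kronecker identity can be checked numerically. In numpy, R's column-major c() corresponds to flatten(order='F'); the matrices below are random, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.random((3, 3))
X = rng.random((3, 3))

# Column-major vec, matching R's c(): vec(A X A') = (A kron A) vec(X)
vec = lambda M: M.flatten(order='F')
lhs = vec(A @ X @ A.T)
rhs = np.kron(A, A) @ vec(X)
print(np.allclose(lhs, rhs))  # True
```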

python,numpy,matrix,weight,least-squares

I found another approach (using W as a diagonal matrix, and matrix products):

A = [[1,1,1,1],[1,1,1,1],[1,1,1,1],[1,1,1,1],[1,1,0,0]]
B = [1,1,1,1,1]
W = [1,2,3,4,5]
W = np.sqrt(np.diag(W))
Aw = np.dot(W, A)
Bw = np.dot(B, W)
X = np.linalg.lstsq(Aw, Bw)

Same values and same results....

python,scipy,curve-fitting,least-squares

Not really an answer, but my feeling is that it depends on the size of the problem. For small sizes (n=500), the extra time spent evaluating the Jacobian (with the supplied jac) probably doesn't pay off in the end. n=500, with jac: 100 loops, best of 3: 1.50 ms per loop...

matlab,math,curve-fitting,least-squares,best-fit-curve

If A is of full rank, i.e. the columns of A are linearly independent, the least-squares solution of an overdetermined system of linear equations A * x = b can be found by solving the normal equations (see Linear Least Squares):

x = inv(A' * A) * A' * b
...
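The same computation in Python, comparing the explicit normal-equations formula against np.linalg.lstsq (which is numerically preferred over forming inv(A'A)); the overdetermined system below is random, purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.random((8, 3))  # overdetermined, full column rank (generically)
b = rng.random(8)

x_normal = np.linalg.inv(A.T @ A) @ A.T @ b        # textbook normal equations
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)    # numerically preferred
print(np.allclose(x_normal, x_lstsq))  # True
```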

matlab,matrix,constraints,least-squares,inverse

First of all, we need more data about your problem: What's the model? Where are the measurements coming from? Still, a few notes on what I could figure out from your question: If you have bounds on the solution, you should use Constrained Least Squares. If you do it in MATLAB it...
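As a sketch of constrained least squares in Python: scipy.optimize.lsq_linear accepts box bounds on the solution (the system and the 0..1 bounds below are invented for illustration):

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(6)
A = rng.random((10, 3))
b = rng.random(10)

# Constrained least squares: minimize ||Ax - b|| subject to 0 <= x <= 1
res = lsq_linear(A, b, bounds=(0.0, 1.0))
print(res.x)  # every component lies within the bounds
```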

python,arrays,numpy,broadcast,least-squares

Reshape x to have shape (2, K), with the pairs of the pixel values in the columns. Call lstsq, and then restore the shape of the result. For example, here are A and x:

In [86]: A
Out[86]:
array([[ 1.,  5.],
       [ 4.,  0.]])

In [87]: x
Out[87]:
array([[[0, 1],...
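A sketch of that reshape, solve, restore pattern; the 4x4x2 image shape is invented, and since this A is invertible, lstsq recovers an exact solution:

```python
import numpy as np

rng = np.random.default_rng(7)
A = np.array([[1., 5.], [4., 0.]])
x = rng.random((4, 4, 2))        # hypothetical image of 2-vectors

orig_shape = x.shape
x2 = x.reshape(-1, 2).T          # shape (2, K): one pixel pair per column
sol, *_ = np.linalg.lstsq(A, x2, rcond=None)  # solves all K systems at once
result = sol.T.reshape(orig_shape)            # restore the original shape
print(np.allclose(A @ result.reshape(-1, 2).T, x2))  # True
```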