
# Least squares fit in python for 3d surface

I would like to fit my surface equation to some data. I already tried scipy.optimize.leastsq, but since I cannot specify bounds there, it gives me unusable results. I also tried scipy.optimize.least_squares, but it gives me an error:

```
ValueError: too many values to unpack
```

My equation is:

```
f(x,y,z) = (x - A + y - B)/2 + sqrt(((x - A - y + B)/2)^2 + C*z^2)
```

Parameters A, B, and C should be found so that the equation above is as close as possible to zero when the following points are used for x, y, z:

```
[
    [-0.071, -0.85, 0.401],
    [-0.138, -1.111, 0.494],
    [-0.317, -0.317, -0.317],
    [-0.351, -2.048, 0.848]
]
```

The bounds would be A > 0, B > 0, C > 1

How should I obtain such a fit? What is the best tool in Python to do that? I searched for examples of fitting 3D surfaces, but most function-fitting examples cover only line or plane fits.

This isn't strictly least squares, but how about something like this?

```python
import numpy as np
import scipy.optimize

coeff_0 = np.array([1, 1, 1])

data = np.array([
    [-0.071, -0.85, 0.401],
    [-0.138, -1.111, 0.494],
    [-0.317, -0.317, -0.317],
    [-0.351, -2.048, 0.848]
])

def objective(coeff, data):
    # Returns the mean squared loss.
    # We want the optimizer to choose A, B, C such that all values are close to zero.
    A, B, C = coeff
    x, y, z = data.T
    value = (x - A + y - B) / 2 + np.sqrt(((x - A - y + B) / 2) ** 2 + C * z ** 2)
    # L2 regularization keeps the coefficients from drifting to extreme values.
    reg = .01 * np.sqrt((coeff ** 2).sum())
    losses = (value + reg) ** 2
    return losses.mean()

result = scipy.optimize.minimize(objective, coeff_0, args=(data,))

coeff = result.x
A, B, C = result.x

loss = objective(result.x, data)
print('final loss = %r' % (loss,))
```
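One caveat: plain `minimize` with its default method ignores the bounds from the question. As a sketch (not the original answer's code), `scipy.optimize.minimize` can enforce simple box constraints directly via `method='L-BFGS-B'`, treating the strict inequalities A > 0, B > 0, C > 1 as inclusive box bounds:

```python
import numpy as np
import scipy.optimize

data = np.array([
    [-0.071, -0.85, 0.401],
    [-0.138, -1.111, 0.494],
    [-0.317, -0.317, -0.317],
    [-0.351, -2.048, 0.848],
])

def objective(coeff, data):
    # Mean squared value of the surface equation over all data points.
    A, B, C = coeff
    x, y, z = data.T
    value = (x - A + y - B) / 2 + np.sqrt(((x - A - y + B) / 2) ** 2 + C * z ** 2)
    return (value ** 2).mean()

# L-BFGS-B takes one (lower, upper) pair per parameter; None means unbounded.
bounds = [(0, None), (0, None), (1, None)]
result = scipy.optimize.minimize(objective, x0=[1, 1, 1], args=(data,),
                                 method='L-BFGS-B', bounds=bounds)
A, B, C = result.x
print('A = %.4f, B = %.4f, C = %.4f, loss = %.6g' % (A, B, C, result.fun))
```

The bounds are inclusive, so A or B may land exactly at 0 and C exactly at 1; if the strict inequalities matter, nudge the lower bounds up by a small epsilon.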

This solution is like throwing a sledgehammer at the problem. There is probably a way to use least squares to get a solution more efficiently, but if you're just looking for an answer, BFGS optimization will find you one.

I found in the documentation that all you need to do to adapt this to an actual least-squares fit is to specify a function that computes the residuals:

```python
def target_func(A, B, C, x, y, z):
    return (x - A + y - B) / 2 + np.sqrt(((x - A - y + B) / 2) ** 2 + C * z ** 2)

def residuals(coeff, data):
    # Returns the vector of residuals (not the squared loss);
    # least_squares squares and sums them internally.
    A, B, C = coeff
    x, y, z = data.T
    return target_func(A, B, C, x, y, z)

# Lower bounds A >= 0, B >= 0, C >= 1 match the constraints in the question.
lst_sqrs_result = scipy.optimize.least_squares(residuals, coeff_0, args=(data,),
                                               bounds=([0, 0, 1],
                                                       [np.inf, np.inf, np.inf]))
print('lst_sqrs_result = %r' % (lst_sqrs_result,))
```
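For completeness, here is a sketch of how one might pull the fitted coefficients and residuals out of the returned `OptimizeResult` (the fields `.x`, `.fun`, and `.cost` come from scipy's documented result object; the setup just repeats the data from above):

```python
import numpy as np
import scipy.optimize

data = np.array([
    [-0.071, -0.85, 0.401],
    [-0.138, -1.111, 0.494],
    [-0.317, -0.317, -0.317],
    [-0.351, -2.048, 0.848],
])

def residuals(coeff, data):
    # Residual of the surface equation at each data point.
    A, B, C = coeff
    x, y, z = data.T
    return (x - A + y - B) / 2 + np.sqrt(((x - A - y + B) / 2) ** 2 + C * z ** 2)

result = scipy.optimize.least_squares(residuals, [1, 1, 1], args=(data,),
                                      bounds=([0, 0, 1], [np.inf, np.inf, np.inf]))

A, B, C = result.x  # fitted coefficients
print('A = %.4f, B = %.4f, C = %.4f' % (A, B, C))
print('residuals at solution:', result.fun)
print('cost (0.5 * sum of squared residuals): %.6g' % result.cost)
```

Note that `result.cost` is half the sum of squared residuals, which is how `least_squares` defines its objective.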