The Powell Mathematica package implements Powell's hybrid (or "dogleg") method, described in M. J. D. Powell, "A hybrid method for nonlinear equations", in Numerical Methods for Nonlinear Algebraic Equations, Philip Rabinowitz, editor, chapter 6, pages 87-114, Gordon and Breach Science Publishers, New York, 1970. Only the case where the Jacobian is known analytically is implemented. Powell's paper describes an efficient mechanism for computing the Jacobian numerically, but that is not included.

There is one function: PowellHybrid[function, initialParameters, jacobianFunction]. The arguments are:

function -- A pure function or the name of a function. It will be called with a list of parameters and should return a list of values. There can be fewer values than parameters; in this case PowellHybrid will try to find some solution near the initial guess. If the function returns something that is not a list, it is considered a failure, and PowellHybrid will try to take a shorter step to stay in the valid region of parameter space.

initialParameters -- A list giving the initial guess. The length of this list gives the number of parameters.

jacobianFunction -- A pure function or the name of a function. It will be called with the same list of parameters that was just used to call the function and should return a matrix whose [[i,j]] element gives the first derivative of function i with respect to parameter j.

PowellHybrid takes the following options:

AccuracyGoal: The number of digits of accuracy sought. PowellHybrid attempts to make the sum of the squares of the functions less than 10^{-AccuracyGoal}. The default is half the machine precision.

MaxIterations: The maximum number of steps that PowellHybrid will take before giving up. Unsuccessful steps (i.e., trials where the sum of the squares of the functions increases) are not counted.
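As a sketch of typical usage, here is a small two-equation system solved with the three required arguments. The package context name in Needs, the particular system of equations, and the exact shape of PowellHybrid's return value are illustrative assumptions, not specified by this documentation:

```mathematica
(* Assumption: the package is loaded from a context named "Powell`". *)
Needs["Powell`"]

(* Example system: x^2 + y^2 == 4 and x y == 1, written as
   functions whose values should be driven to zero. *)
f[{x_, y_}] := {x^2 + y^2 - 4, x y - 1}

(* The Jacobian: element [[i, j]] is the derivative of
   function i with respect to parameter j. *)
jac[{x_, y_}] := {{2 x, 2 y}, {y, x}}

(* Start from the initial guess {1.0, 1.0}. *)
PowellHybrid[f, {1.0, 1.0}, jac]
```

Note that f takes a single list of parameters, not separate arguments, matching how PowellHybrid calls it.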
InitialStep: The length of the initial step (in the Euclidean norm on parameter space), or one of the special symbols PureNewton or PureGradient (the default) to indicate that the first step should use Newton's method or the gradient step only.

MaxStep: We can make a rough guess at the distance to the solution by dividing the sum of the squares of the functions by the norm of the gradient of that quantity. If instead of approaching a solution we are approaching a local minimum, the gradient will become small and the estimated distance large. If it is larger than MaxStep, PowellHybrid declares that it has found a minimum and gives up. The default is 10 to the power of half the machine precision.

MinFailStep: If the step computed by PowellHybrid takes us to a region of parameter space where the function fails, it will take shorter and shorter steps to try to find an improvement. But if the gradient direction leads to a bad region of parameter space, this can never work. When the step size is decreased below MinFailStep, PowellHybrid will give up.

MinStep: When repeated backing off in an attempt to find an improvement in the sum of the squares of the functions reduces the step size below MinStep, PowellHybrid will give up.

StepMonitorFunction: If you supply this, it is a function to be called with the list of parameters after each successful step.

Verbosity: How much information to print out as the process proceeds:
  0: no messages
  1: print a dot for each step taken
  2: one line for each step taken
  3: everything but the Jacobian
  4 (or Infinity): everything
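The options above can be combined in one call. The following sketch collects the parameter values visited after each successful step via StepMonitorFunction; the system being solved and the specific option values are illustrative assumptions:

```mathematica
(* Example system and Jacobian (assumed for illustration). *)
f[{x_, y_}] := {x^2 + y^2 - 4, x y - 1}
jac[{x_, y_}] := {{2 x, 2 y}, {y, x}}

(* Record the parameters after each successful step. *)
steps = {};

PowellHybrid[f, {1.0, 1.0}, jac,
  AccuracyGoal -> 8,                       (* sum of squares below 10^-8 *)
  MaxIterations -> 200,                    (* successful steps only *)
  InitialStep -> PureNewton,               (* first step by Newton's method *)
  StepMonitorFunction -> (AppendTo[steps, #] &),
  Verbosity -> 1]                          (* print a dot per step *)

(* steps now holds the trajectory through parameter space. *)
```

Because unsuccessful trials are not counted against MaxIterations and do not invoke StepMonitorFunction, the length of steps reflects only the accepted steps.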