Lesson 3
Optimization for Multivariable Functions in SciPy
Introduction to Optimization for Multivariable Functions

Welcome back to our course on optimization with SciPy. In previous lessons, you learned about defining functions in Python and explored the basics of optimization using SciPy. Today, we’ll dive into optimization for multivariable functions, an essential skill in many fields such as machine learning, engineering, and economics.

Understanding optimization and its real-world applications will help you solve complex problems more efficiently. Multivariable optimization deals with functions that have more than one input variable. We'll use practical examples to guide you through this concept.

Defining the Objective Function

An objective function is central to optimization. It's the function we aim to minimize or maximize. In multivariable optimization, the objective function may depend on several input variables.

Let's create a simple objective function. We'll use the function f(x, y) = x^2 + y^2 + 4·sin(x) + 4·cos(y). This function takes two input variables, x and y, and involves quadratic and trigonometric terms.

Python

import numpy as np

def objective_function(x):
    return x[0]**2 + x[1]**2 + 4*np.sin(x[0]) + 4*np.cos(x[1])

# Example input for x
example_x = [0, 0]
print("Objective function value for example x:", objective_function(example_x))  # 4.0
  • objective_function is defined to take a list x containing the two variables, x[0] and x[1].
  • x[0] represents x, and x[1] represents y.
  • It returns the sum of the quadratic terms and the sine and cosine terms.
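Before optimizing, it helps to confirm the function behaves as expected. The sketch below evaluates it at a couple of illustrative points (the points themselves are arbitrary choices, not part of the lesson): at the origin only the cosine term contributes, giving 4·cos(0) = 4.0, and nudging y toward values where cos(y) is negative lowers the result.

```python
import numpy as np

def objective_function(x):
    return x[0]**2 + x[1]**2 + 4*np.sin(x[0]) + 4*np.cos(x[1])

# At the origin, the squares and the sine term vanish: 4*cos(0) = 4.0
print(objective_function([0, 0]))  # 4.0

# At y = 2, cos(2) is negative, so the value drops below 4.0
print(objective_function([0, 2]))
```

Checks like this catch sign errors or index mix-ups before they get buried inside an optimizer run.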
Setting the Initial Guess

The initial guess is crucial in optimization as it influences both the convergence speed and the solution found. Let's set an initial guess for our problem.

Python
initial_guess = [1, 1]
  • Here, [1, 1] is our starting point for the variables x and y.
  • An initial guess can be based on prior knowledge or can be a simple starting point like the vector of ones we used here.
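Because our objective function is non-convex (the trigonometric terms create multiple local minima), the starting point can determine which minimum the optimizer finds. This sketch, using starting points chosen purely for illustration, compares two runs:

```python
import numpy as np
from scipy.optimize import minimize

def objective_function(x):
    return x[0]**2 + x[1]**2 + 4*np.sin(x[0]) + 4*np.cos(x[1])

# The function is symmetric in y around 0, with local minima near
# y = +1.8955 and y = -1.8955; the starting point decides which one is found.
results = []
for guess in ([1, 1], [-2, -2]):
    result = minimize(objective_function, guess)
    results.append(result)
    print(guess, "->", np.round(result.x, 4), "value:", round(result.fun, 4))
```

Both runs reach the same minimum value, but at mirror-image locations in y, which is exactly why the lesson stresses choosing (or trying) sensible initial guesses.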
Implementing the `minimize` Function

Now let's move on to using the minimize function from SciPy to find the minimum of our objective function.

Python

from scipy.optimize import minimize
import numpy as np

def objective_function(x):
    return x[0]**2 + x[1]**2 + 4*np.sin(x[0]) + 4*np.cos(x[1])

initial_guess = [1, 1]
result = minimize(objective_function, initial_guess)

print("Optimal [x, y]:", result.x)
print("Function value at minimum:", result.fun)
print("Number of iterations:", result.nit)
print("Successful:", result.success)

Output:

Plain text

Optimal [x, y]: [-1.02986652 1.89549428]
Function value at minimum: -0.051487585323559903
Number of iterations: 6
Successful: True

Explanation:

  • minimize is a versatile function for optimization tasks.
  • It takes the objective_function and initial_guess as inputs to start the optimization process.
  • The printed results give the optimal values for x and y, the minimum value of the objective function at the optimal solution, the number of iterations it took to reach the solution, and whether the optimization was successful.
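minimize also accepts a method argument to select the algorithm explicitly. As a minimal sketch, here is the same problem solved with Nelder-Mead, a derivative-free method (the default for an unconstrained problem like ours is the gradient-based BFGS):

```python
import numpy as np
from scipy.optimize import minimize

def objective_function(x):
    return x[0]**2 + x[1]**2 + 4*np.sin(x[0]) + 4*np.cos(x[1])

# Nelder-Mead searches using only function evaluations, no gradients,
# which can help when the function is noisy or non-smooth.
result = minimize(objective_function, [1, 1], method="Nelder-Mead")
print("Optimal [x, y]:", result.x)
print("Message:", result.message)
```

For this smooth function both methods land on essentially the same minimum; the choice of method matters more for harder problems, which later lessons can build on.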
Summary and Preparation for Practice

To summarize, you have learned how to define an objective function for multivariable optimization, set an initial guess, use SciPy's minimize function, and interpret the results. These are fundamental skills for tackling more complex optimization tasks.

Practice exercises will reinforce these concepts, so make sure to attempt them. Engaging with these exercises will further enhance your understanding and prepare you for real-world optimization challenges.

Congratulations on reaching this stage in the course, and thank you for your dedication. Your ability to optimize multivariable functions will be an invaluable tool in your future projects.

Enjoy this lesson? Now it's time to practice with Cosmo!
Practice is how you turn knowledge into actual skills.