Optimization

Optimization, covered in Chapter 5, is a little different from the other methods; instead of working with a dataset, it attempts to select values that minimize the output of a cost function. Chapter 5 showed several examples of cost functions, such as planning group travel using a combination of price and waiting time at the airport, assigning students to the most appropriate dorm, and optimizing the layout of a simple graph. Once the cost function was designed, the same algorithms could be used to solve these three different problems. Two algorithms were covered: simulated annealing and genetic algorithms.
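Of the two algorithms mentioned, simulated annealing is the simpler to sketch. The version below is a minimal illustration, not the chapter's exact code: `annealing_optimize`, its parameters, and the integer-stepped domain are all assumptions chosen for brevity. It shows the essential idea shared by both algorithms, that only the cost function knows anything about the problem.

```python
import math
import random

def annealing_optimize(domain, costf, T=10000.0, cool=0.95, step=1):
    # domain is a list of (min, max) ranges, one per variable.
    # Start from a random solution within those ranges.
    sol = [random.randint(lo, hi) for (lo, hi) in domain]
    while T > 0.1:
        # Pick one variable and nudge it by a small random amount,
        # clamping the result back into its allowed range
        i = random.randint(0, len(domain) - 1)
        neighbor = sol[:]
        neighbor[i] = max(domain[i][0],
                          min(domain[i][1],
                              neighbor[i] + random.randint(-step, step)))
        ea, eb = costf(sol), costf(neighbor)
        # Always accept an improvement; accept a worse solution with
        # probability e^(-(eb-ea)/T), which shrinks as the temperature cools
        if eb < ea or random.random() < math.exp(-(eb - ea) / T):
            sol = neighbor
        T *= cool
    return sol
```

Because the algorithm only ever calls `costf`, the same function works unchanged for travel planning, dorm assignment, or graph layout; only the cost function changes.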

The Cost Function

A cost function is any function that takes a guess at a solution and returns a value that is higher for worse solutions and lower for better ones. Optimization algorithms use this function to test candidate solutions and to search the space of possibilities for the best one. The cost functions you use with optimization often have many variables to consider, and it's not always clear which variable is the best one to change in order to improve the result. However, for illustration, consider a function with only one variable, defined as:

`y = 1/x * sin(x)`

Figure 12-18 shows the graph of this function.

Because the function has only one variable, it's easy to see from the graph where the lowest point is. We're using this for illustration so you ...
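To make this concrete, here is a small sketch that evaluates the cost function at many points and reports where it is lowest. A coarse grid search is used purely for illustration (it is not one of the chapter's algorithms), and the search range of 0 to 20 is an assumption.

```python
import math

def costf(x):
    # The single-variable cost function from the text: y = (1/x) * sin(x)
    return math.sin(x) / x

# Evaluate on a grid of 0.01-wide steps over (0, 20] and keep the
# x that gives the lowest cost
best_x = min((i * 0.01 for i in range(1, 2001)), key=costf)
print(round(best_x, 2))  # prints 4.49, near the function's global minimum
```

With only one variable, exhaustive evaluation like this is feasible; the point of the chapter's algorithms is to find good solutions when the search space is far too large to enumerate.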
