Section 9.2 Additional Optimal Solutions of LOPs using Julia
In Section 9.1, we explored the concept of uniqueness of optimal solutions in linear optimization problems (LOPs). We saw that some LOPs can have multiple optimal solutions. In this section, we will delve deeper into the topic by examining additional examples and methods for identifying and characterizing these multiple optimal solutions.
Subsection 9.2.1 Using Random Coefficients to Find Additional Solutions
Based on a paper (REF?), one method of finding additional optimal solutions is to modify the objective function slightly by adding small random perturbations to its coefficients. Altering the objective in this way can cause the solver to prefer one optimal solution over another. Repeating the process several times can therefore reveal a variety of optimal solutions to the original problem.
As we saw in Subsection 9.1.2, the solution to Problem 9.1.1 was found using Julia. However, since the technique Julia uses is deterministic, it will always return the same solution, even though we found two optimal solutions to this problem in Subsection 9.1.1.
In short, we will modify the objective function in the Julia code by adding small random values to its coefficients, and then re-solve the problem multiple times to see whether different optimal solutions arise. Here is the original
JuMP model:
using JuMP, HiGHS

m = Model(HiGHS.Optimizer)
@variable(m, x₁ ≥ 0)
@variable(m, x₂ ≥ 0)
@objective(m, Max, 2x₁+4x₂)
@constraint(m, c1, 4x₁+3x₂ ≤ 120)
@constraint(m, c2, x₁+2x₂ ≤ 40)
@constraint(m, c3, x₂ ≤ 16)
print(m)
We will replace the objective function with the following:
@objective(m, Max, (2 + rand(-0.01:0.001:0.01))*x₁ + (4 + rand(-0.01:0.001:0.01))*x₂)
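Here `rand` applied to a range selects one element of that range uniformly at random, so each coefficient is nudged by a multiple of 0.001 between \(-0.01\) and \(0.01\text{.}\) For example:

```julia
# rand on a range draws one of its elements uniformly at random.
rand(-0.01:0.001:0.01)   # one of -0.01, -0.009, …, 0.009, 0.01
```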
and then rerun the model with
set_silent(m)
optimize!(m)
is_solved_and_feasible(m)
and again printing out the solution with
value(x₁), value(x₂), objective_value(m)
The point returned should be close to one of the solutions found in Subsection 9.1.1. The small difference is most likely round-off error, common when using numerical techniques. Note, however, that an objective value such as 80.088 rather than 80 is not round-off error: it arises because we optimized a different (perturbed) objective function.
Avoiding round-off error is one of the reasons for using integer tableaus throughout the book. However, when solving large problems, using integers is impractical.
Once we have a point, like \((24,8)\text{,}\) we should evaluate the original objective function there and note that \(z=80\text{.}\)
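For example, evaluating the original objective at the two candidate points is a quick check in plain Julia:

```julia
# Original objective of Problem 9.1.1, evaluated at candidate optimal points.
z(x1, x2) = 2x1 + 4x2
z(24, 8)    # 80
z(8, 16)    # 80
```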
You should run the code above again (starting with the cell that has the line
m = Model(HiGHS.Optimizer)) and see if a different solution appears. If you get the same point as above, try again. After a few tries, I got the result (8.0, 16.0, 79.848), and this was the other solution we found earlier.
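Rather than rerunning the cells by hand, the perturb-and-resolve process can be automated with a short loop. The following is a sketch that rebuilds the model on each trial; the trial count of 20 and the rounding are arbitrary choices, not from the text:

```julia
using JuMP, HiGHS

# Sketch: re-solve the LOP with a freshly perturbed objective on each trial,
# collecting the distinct optimal vertices found.
solutions = Set{Tuple{Float64,Float64}}()
for trial in 1:20
    m = Model(HiGHS.Optimizer)
    set_silent(m)
    @variable(m, x₁ ≥ 0)
    @variable(m, x₂ ≥ 0)
    @objective(m, Max, (2 + rand(-0.01:0.001:0.01))*x₁ +
                       (4 + rand(-0.01:0.001:0.01))*x₂)
    @constraint(m, 4x₁ + 3x₂ ≤ 120)
    @constraint(m, x₁ + 2x₂ ≤ 40)
    @constraint(m, x₂ ≤ 16)
    optimize!(m)
    # Round to suppress floating-point noise before storing the vertex.
    push!(solutions, (round(value(x₁); digits=6), round(value(x₂); digits=6)))
end
solutions   # typically {(24.0, 8.0), (8.0, 16.0)}
```

Because the perturbations are tiny, each trial's optimum is one of the optimal vertices of the original problem, so the set collects exactly the alternative solutions.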
Subsection 9.2.2 Solutions of the 4-queens Problem
In Subsection 5.5.2, the \(n\)-queens problem was formulated as an integer linear optimization problem. We examine the uniqueness of its solutions here.
When \(n=4\text{,}\) running the code in Listing 5.5.2 results in the solution
4×4 Matrix{Int64}:
0 0 1 0
1 0 0 0
0 0 0 1
0 1 0 0
As we saw above, using the dual to determine how to find other solutions is not possible here, since all dual variables are zero. Instead, we will again use a new objective that is a small random perturbation of the original one.
If we replace the objective in the code in Listing 5.5.2 with
@objective(m2, Max, sum([rand(0.99:0.001:1.01)*x[i,j] for i=1:n, j=1:n]))
and rerun the model, we may get the same solution again, or we may get a different one, such as
4×4 Matrix{Int64}:
0 1 0 0
0 0 0 1
1 0 0 0
0 0 1 0
These are the only two solutions to the 4-queens problem.
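To check this claim computationally, one can repeat the perturbed solve many times and collect the distinct solution matrices. The sketch below reconstructs a standard \(n\)-queens ILP (one queen per row and column, at most one per diagonal); its details may differ from Listing 5.5.2, and the trial count of 20 is an arbitrary choice:

```julia
using JuMP, HiGHS

# Sketch: standard n-queens ILP with a randomly perturbed objective.
# The formulation (rows, columns, diagonals) may differ in detail from
# Listing 5.5.2.
function solve_perturbed(n)
    m2 = Model(HiGHS.Optimizer)
    set_silent(m2)
    @variable(m2, x[1:n, 1:n], Bin)
    @constraint(m2, [i = 1:n], sum(x[i, j] for j = 1:n) == 1)   # one per row
    @constraint(m2, [j = 1:n], sum(x[i, j] for i = 1:n) == 1)   # one per column
    for k in -(n-2):(n-2)       # ↘ diagonals with at least two squares
        @constraint(m2, sum(x[i, j] for i = 1:n, j = 1:n if i - j == k) ≤ 1)
    end
    for k in 3:(2n-1)           # ↗ diagonals with at least two squares
        @constraint(m2, sum(x[i, j] for i = 1:n, j = 1:n if i + j == k) ≤ 1)
    end
    @objective(m2, Max, sum(rand(0.99:0.001:1.01) * x[i, j] for i = 1:n, j = 1:n))
    optimize!(m2)
    return round.(Int, value.(x))
end

# Collect the distinct solutions found over several trials.
found = Set{Matrix{Int}}()
for trial in 1:20
    push!(found, solve_perturbed(4))
end
length(found)   # at most 2, by the claim above
```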
