
Section 5.4 Solving Linear Optimization Problems using Julia

Before continuing, make sure you have some grasp of the Julia language as covered in Section 5.1–Section 5.3, especially the proper syntax. Additionally, you should know how to install packages, as described in Section A.2.
If you haven’t already, make sure the packages JuMP and HiGHS are installed. The JuMP package allows optimization problems to be constructed by defining variables, constraints and an objective. It can build many types of problems that fall outside the realm of this book. However, JuMP only creates the problem; it does not do the work of optimizing it. For that, we need the HiGHS package.
The HiGHS package solves Linear Optimization Problems of the type we have covered in this book. It does the hard work of the algorithms discussed in this text, including the simplex method, cutting planes and branch-and-bound. You do not need to specify which method to use; depending on the type of problem constructed, the HiGHS package will apply the correct technique.
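If the packages are not yet installed, both can be added from within Julia itself using the built-in package manager (this is a one-time step; Section A.2 covers installation in more detail):

```julia
using Pkg                     # Julia's built-in package manager
Pkg.add(["JuMP", "HiGHS"])    # downloads and installs both packages
```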

Subsection 5.4.1 Setting up an LOP using JuMP

Before getting started, make sure that the two packages discussed above are loaded with
using JuMP, HiGHS
which does not return anything.
As noted above, the JuMP package is used to define a problem. With the use of a few macros, many problems are easily constructed in Julia. For example, let’s use Problem 2.2.2 and walk through the steps to set this up.
Step 1: Create a model
m = Model(HiGHS.Optimizer)
will create a JuMP model and assign the HiGHS.Optimizer as the solver.
We will always use this Model call throughout this book. However, if you are using JuMP for other kinds of solves, like a nonlinear minimization, then you will use a different package and its related Optimizer.
If the line above is the only expression in the cell, this returns the following:
A JuMP Model
├ solver: HiGHS
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none
This doesn’t have a lot of information because we haven’t added variables or constraints to the model. We will see this change as we develop the model.
Step 2: Add the variables
@variable(m, x₁ ≥ 0)
@variable(m, x₂ ≥ 0)
These lines add the two variables x₁ and x₂ to the model; both are nonnegative. The subscripts are Unicode characters that you can enter as \_1, then TAB inside a Jupyter notebook. Similarly, the \(\geq\) character is entered as \geq then TAB. Neither is necessary, but they make the constraints look as they do when written out mathematically. If both lines are placed in a single notebook cell, only \(x_2\) is echoed.
Step 3: Add the constraints
@constraint(m, 4x₁ + 3x₂ ≤ 120)
@constraint(m, x₁ + 2x₂ ≤ 40)
@constraint(m, x₂ ≤ 16)
and again, if all three are in a single cell, you’ll only see the last constraint echoed.
Step 4: Add the objective function
@objective(m, Max, x₁ + 3x₂)
Note that the second argument is Max, indicating that this is a maximization problem, and the third argument is the objective function. We have now added all of the elements of this problem (variables, constraints, objective). If you enter print(m) in a cell by itself, you can inspect the model:
Max x₁ + 3 x₂
Subject to
 4 x₁ + 3 x₂ ≤ 120
 x₁ + 2 x₂ ≤ 40
 x₂ ≤ 16
 x₁ ≥ 0
 x₂ ≥ 0
and this looks almost identical to the way we write down the LOP.
Step 5: Find the optimal value
To find the optimal value, we use the command
optimize!(m)
where it is important to include the ! at the end of the optimize! function.
As mentioned before, Julia has a convention that if a function modifies its arguments, rather than just returning something, its name ends with a ! to tip off the Julia coder that the arguments are changing.
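The same convention runs throughout Julia’s standard library. For example, sort returns a new array while sort! reorders its argument in place:

```julia
v = [3, 1, 2]
w = sort(v)    # w is a new sorted vector [1, 2, 3]; v is unchanged
sort!(v)       # reorders v itself; v is now [1, 2, 3]
```

In the same way, optimize! stores the results of the solve inside the model m rather than returning a solution object.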
This command prints the following solver log:
Running HiGHS 1.10.0 (git hash: fd8665394e): Copyright (c) 2025 HiGHS under MIT license terms
  LP   has 3 rows; 2 cols; 5 nonzeros
  Coefficient ranges:
    Matrix [1e+00, 4e+00]
    Cost   [1e+00, 3e+00]
    Bound  [0e+00, 0e+00]
    RHS    [2e+01, 1e+02]
  Presolving model
  2 rows, 2 cols, 4 nonzeros  0s
  2 rows, 2 cols, 4 nonzeros  0s
  Presolve : Reductions: rows 2(-1); columns 2(-0); elements 4(-1)
  Solving the presolved LP
  Using EKK dual simplex solver - serial
    Iteration        Objective     Infeasibilities num(sum)
            0    -9.9999494546e-01 Ph1: 2(5); Du: 1(0.999995) 0s
            2    -5.6000000000e+01 Pr: 0(0) 0s
  Solving the original LP from the solution after postsolve
  Model status        : Optimal
  Simplex   iterations: 2
  Objective value     :  5.6000000000e+01
  Relative P-D gap    :  0.0000000000e+00
  HiGHS run time      :          0.00
Generally, unless something goes wrong, this information isn’t helpful. Therefore, before running the optimize!(m) command, run set_silent(m) and you won’t get all of the output.
Step 6: Verify the solution
If you suppress the optimizer output (which is recommended), you want to make sure that a solution was actually reached. If you enter
is_solved_and_feasible(m)
and true is returned then all is well.
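Since the values returned by later commands are meaningless when the solve fails, one convenient pattern (a convention, not a JuMP requirement) is to assert the check, so a failed solve stops immediately with an error:

```julia
@assert is_solved_and_feasible(m)   # throws an AssertionError if the solve failed
```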
Step 7: Determine the variable values and the objective of the optimal solution
The last step is generally to get the values of the variables and the objective at the optimal solution. To get the value of a variable, use the value(VAR) command, where the argument is the variable you want. In this case, if we enter
value(x₁), value(xβ‚‚)
the result is (8.0, 16.0). The objective value is found with objective_value(m), which in this case returns 56.0. These are the same values that we got when we solved the problem “by hand”.

Note 5.4.1.

Although the steps are shown separately here, typically several are done together: set up the problem with Steps 1–4, then solve and verify with Steps 5 and 6, then get the variable and objective values.
We’ll see how to better approach this in the next example.
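For instance, the entire solution of the problem above can be collected into a few cells; the code below is exactly the commands from Steps 1–7:

```julia
using JuMP, HiGHS

# Steps 1-4: build the model in a single cell
m = Model(HiGHS.Optimizer)
@variable(m, x₁ ≥ 0)
@variable(m, x₂ ≥ 0)
@constraint(m, 4x₁ + 3x₂ ≤ 120)
@constraint(m, x₁ + 2x₂ ≤ 40)
@constraint(m, x₂ ≤ 16)
@objective(m, Max, x₁ + 3x₂)

# Steps 5-6: solve quietly and verify
set_silent(m)
optimize!(m)
is_solved_and_feasible(m)    # true

# Step 7: read off the solution
value(x₁), value(x₂)         # (8.0, 16.0)
objective_value(m)           # 56.0
```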

Example 5.4.2.

Use Julia with the JuMP and HiGHS packages to solve Problem 2.1.4.
Solution.
In this case, set up the model, variables, constraints and objective as follows.
m2 = Model(HiGHS.Optimizer)
@variable(m2, y₁ ≥ 0)
@variable(m2, y₂ ≥ 0)
@variable(m2, y₃ ≥ 0)
@constraint(m2, 5y₁ + 18y₂ + 10y₃ ≥ 54)
@constraint(m2, 7y₁ + 10y₂ + 24y₃ ≥ 60)
@constraint(m2, 4y₁ + 8y₂ + 12y₃ ≥ 45)
@constraint(m2, 4y₁ + 8y₂ + 12y₃ ≤ 70)
@objective(m2, Min, 600y₁ + 750y₂ + 500y₃)
print(m2)
which produces the model:
Min 600 y₁ + 750 y₂ + 500 y₃
Subject to
 5 y₁ + 18 y₂ + 10 y₃ ≥ 54
 7 y₁ + 10 y₂ + 24 y₃ ≥ 60
 4 y₁ + 8 y₂ + 12 y₃ ≥ 45
 4 y₁ + 8 y₂ + 12 y₃ ≤ 70
 y₁ ≥ 0
 y₂ ≥ 0
 y₃ ≥ 0
Next, we suppress the output from the optimizer, run it, and determine whether the problem was solved and feasible.
set_silent(m2)
optimize!(m2)
is_solved_and_feasible(m2)
and the last line returns true, indicating that an optimal solution was found. We can find the variable values with
value(y₁), value(y₂), value(y₃)
and the result is
(0.0, 1.4558823529411764, 2.7794117647058827)
and the objective value is found with objective_value(m2) which returns 2481.62.
This shows that Gary shouldn’t eat any salads, and should eat about 1.456 sandwiches and about 2.779 bagels. The total number of calories from these food items is 2481.62.

Subsection 5.4.2 Where’s the Simplex Method?

I know you are wondering what happened to the simplex method in this chapter. We spent a long chapter on how to create simplex tableaus and use them to step through to the solution. We then spent another chapter on integer problems with cutting planes and branch and bound, which use the simplex method under the hood. With Julia and the JuMP and HiGHS packages, it may seem unnecessary to learn all of that.
We won’t go through the details of how the HiGHS package finds the optimal solution to problems, but it is equivalent to using the simplex method together with cutting planes and branch and bound.
Understanding the simplex method is still important. The algorithms in the HiGHS package are based on the simplex method, as well as the integer techniques of Chapter 4. It is much like any other technology: to completely understand how to use software like this, you need to understand the underlying algorithms.
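To see the geometric idea that the simplex method exploits, here is a brute-force sketch (not what HiGHS actually does) for the two-variable problem from the previous subsection. The optimum of an LOP must occur at a corner of the feasible region, so we can enumerate the intersections of pairs of constraint boundaries, discard the infeasible ones, and keep the best:

```julia
using LinearAlgebra

# All constraints written as rows of A x ≤ b, including x₁ ≥ 0 and x₂ ≥ 0
A = [4 3; 1 2; 0 1; -1 0; 0 -1]
b = [120, 40, 16, 0, 0]
c = [1, 3]                    # objective: maximize c ⋅ x

function best_corner(A, b, c)
    best, best_x = -Inf, [0.0, 0.0]
    n = length(b)
    for i in 1:n, j in i+1:n
        M = A[[i, j], :]
        rank(M) == 2 || continue               # skip parallel boundary lines
        x = M \ b[[i, j]]                      # corner: the lines' intersection
        all(A * x .≤ b .+ 1e-9) || continue    # keep only feasible corners
        if c ⋅ x > best
            best, best_x = c ⋅ x, x
        end
    end
    return best, best_x
end

best_corner(A, b, c)    # (56.0, [8.0, 16.0]), matching the JuMP solution
```

Checking every pair of boundaries is hopeless for large problems; the simplex method is efficient precisely because it walks from corner to adjacent corner, improving the objective at each step instead of enumerating them all.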

Subsection 5.4.3 Infeasible and Unbounded Problems

In Section 3.4, we discussed how to determine if a problem is infeasible or unbounded. The JuMP and HiGHS packages can also determine this for you.
Let’s start with the unbounded problem in Problem 3.4.1. We can set this up using JuMP as follows:
m4 = Model(HiGHS.Optimizer)
@variable(m4, x₁ ≥ 0)
@variable(m4, x₂ ≥ 0)
@constraint(m4, x₁ - 2x₂ ≤ -6)
@constraint(m4, -5x₁ + 2x₂ ≤ -20)
@constraint(m4, -3x₁ - 5x₂ ≤ -75)
@objective(m4, Max, 5x₁ + 3x₂)
print(m4)
which returns:
Max 5 x₁ + 3 x₂
Subject to
 x₁ - 2 x₂ ≤ -6
 -5 x₁ + 2 x₂ ≤ -20
 -3 x₁ - 5 x₂ ≤ -75
 x₁ ≥ 0
 x₂ ≥ 0
and this looks like the problem in Problem 3.4.1. However, if we run the following code:
set_silent(m4)
optimize!(m4)
is_solved_and_feasible(m4)
the result is false. This is not surprising, because we know the problem is unbounded. To see a bit more about what is happening, we enter:
solution_summary(m4)
then we get the following information:
solution_summary(; result = 1, verbose = false)
├ solver_name          : HiGHS
├ Termination
│ ├ termination_status : DUAL_INFEASIBLE
│ ├ result_count       : 1
│ ├ raw_status         : kHighsModelStatusUnbounded
│ └ objective_bound    : 7.08065e+01
├ Solution (result = 1)
│ ├ primal_status        : INFEASIBILITY_CERTIFICATE
│ ├ dual_status          : INFEASIBLE_POINT
│ ├ objective_value      : 8.06452e-01
│ ├ dual_objective_value : 7.08065e+01
│ └ relative_gap         : 7.00000e+01
└ Work counters
  ├ solve_time (sec)   : 2.45887e-03
  ├ simplex_iterations : 2
  ├ barrier_iterations : 0
  └ node_count         : -1
This shows that the model was not solved and the termination status is DUAL_INFEASIBLE. We will discuss the details of Dual Problems later; at this point, you can translate DUAL_INFEASIBLE to mean that the problem is unbounded.
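If you only want the status rather than the full summary, JuMP’s termination_status function returns it directly:

```julia
termination_status(m4)   # DUAL_INFEASIBLE for the unbounded problem above
```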

Checkpoint 5.4.3.

Use the JuMP and HiGHS packages to set up and solve the problem in Problem 3.4.3.
Solution.
The problem in Problem 3.4.3 can be set up as follows:
m5 = Model(HiGHS.Optimizer)
@variable(m5, x₁ ≥ 0)
@variable(m5, x₂ ≥ 0)
@constraint(m5, -x₁ + 2x₂ ≤ 6)
@constraint(m5, 5x₁ - 2x₂ ≤ 20)
@constraint(m5, -3x₁ - 5x₂ ≤ -75)
@objective(m5, Max, 5x₁ + 3x₂)
print(m5)
and then attempt to find the solution with:
set_silent(m5)
optimize!(m5)
is_solved_and_feasible(m5)
However, this returns false. To determine the reason, we use:
solution_summary(m5)
The output is similar to the one above for the unbounded problem:
solution_summary(; result = 1, verbose = false)
├ solver_name          : HiGHS
├ Termination
│ ├ termination_status : INFEASIBLE
│ ├ result_count       : 1
│ ├ raw_status         : kHighsModelStatusInfeasible
│ └ objective_bound    : 0.00000e+00
├ Solution (result = 1)
│ ├ primal_status        : NO_SOLUTION
│ ├ dual_status          : INFEASIBILITY_CERTIFICATE
│ ├ objective_value      : 0.00000e+00
│ ├ dual_objective_value : -1.76364e+01
│ └ relative_gap         : Inf
└ Work counters
  ├ solve_time (sec)   : 2.22933e-03
  ├ simplex_iterations : 2
  ├ barrier_iterations : 0
  └ node_count         : -1
Note that the termination_status is INFEASIBLE; unlike the unbounded case, this status directly tells us that the problem has no feasible solution.