Lecture 37: Case Study#

Context - Dynamic-Stochastic Multi-Echelon Capacitated Location Routing Problem with Time-Windows (DS-ME-C-LRP-TW)#

The retail sector, traditionally dominated by brick-and-mortar stores and itinerant merchants, has seen a growing presence of digital marketplaces, driven by the rapid growth in internet access. This rise of e-commerce has bridged the gap between the consumer and the retailer, enabled demand consolidation and delivery route optimization, and facilitated access to vital products for otherwise disadvantaged communities, thus improving economic viability, environmental efficiency, and social equity in urban freight – three pillars of sustainability. This sustainable growth potential of e-commerce has drawn a growing number of e-retailers to compete for market share through increasingly consumer-focused delivery services. Such trends are particularly evident in quick commerce, wherein consumer demand is stochastic and dynamic. This requires e-retailers to model the strategic, tactical, and operational decisions associated with last-mile network design through the lens of a dynamic-stochastic multi-echelon capacitated location routing problem with time-windows (DS-ME-C-LRP-TW). To this end, it is pertinent that the e-retailer deploy sophisticated heuristics that can support decision-making under the stochastic and dynamic uncertainties of the delivery environment. In the next subsection, this chapter will discuss one such metaheuristic – the Adaptive Large Neighbourhood Search (ALNS) algorithm, detailing its solution landscape exploration and exploitation principles through a pseudo code.


Algorithm - Adaptive Large Neighbourhood Search (ALNS)#

The Adaptive Large Neighbourhood Search (ALNS) algorithm begins by setting the current solution – \(s\), and the best solution – \(s^*\), to the initial solution – \(s_o\). It also initialises a long-term memory of previously visited solutions in the form of a hash list to prevent redundant search. Further, given a set of removal operators – \(\Psi_r\), and a set of insertion operators – \(\Psi_i\), the algorithm assigns a unit weight to each operator – \(o_r \in \Psi_r\), \(o_i \in \Psi_i\), thus ensuring an unbiased search at the outset. Within this adaptive large neighbourhood search procedure, the algorithm also runs a simulated annealing mechanism that enables it to comprehensively explore the solution landscape. In the ALNS implementation here, the simulated annealing mechanism begins the search with a temperature that allows the algorithm to accept a solution up to \(\overline{\omega}\) times worse than the initial solution with a probability \(\overline{\tau}\). Note that, to address the vehicle routing problem at hand, the ALNS implementation here incorporates twelve removal operators, each combining one of three removal principles: random, related, and worst, with one of four removal scopes: customer, route, vehicle, and depot. Similarly, it incorporates five insertion operators based on three insertion principles: random, greedy, and regret insertion.
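
To illustrate, a minimal Python sketch of this initialisation is given below. The names `f` (objective function), `removal_ops`, and `insertion_ops`, as well as the use of a string hash for the hash list, are illustrative assumptions rather than part of the ALNS specification above.

```python
import math

def initialise(s0, f, removal_ops, insertion_ops, w_bar, tau_bar):
    """Sketch of the ALNS initialisation (illustrative names and structures)."""
    s, s_best = s0, s0                        # current and best solution start at s0
    H = {hash(str(s0))}                       # hash list of previously visited solutions
    # Start temperature: a solution w_bar times worse than s0 is accepted
    # with probability tau_bar, i.e. exp(-w_bar * f(s0) / T) = tau_bar.
    T = w_bar * f(s0) / math.log(1.0 / tau_bar)
    w_removal = [1.0] * len(removal_ops)      # unit weights ensure an unbiased start
    w_insertion = [1.0] * len(insertion_ops)
    return s, s_best, H, T, w_removal, w_insertion
```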

Hereafter, the ALNS algorithm runs \(j\) segments, each comprising \(n\) iterations. A segment begins with an update of operator selection probabilities based on the most recent operator weights, as well as a reset of operator counts and scores to 0. Thereafter, within each iteration of the segment, the ALNS algorithm randomly selects a removal and an insertion operator based on the operator selection probabilities, and subsequently updates their counts. It then deploys these removal and insertion operators on the current solution, thus destroying and rebuilding the solution to generate a new solution. Note that the scale of these operations is contingent on a threshold determined by the absolute limits on the number of solution elements – \(\underline{e}\), \(\overline{e}\), as well as the relative limits on the proportion of the solution – \(\underline{\mu}\), \(\overline{\mu}\), that can be destroyed and rebuilt.
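
The roulette-wheel operator selection and the computation of the removal/insertion size \(q\) may be sketched as follows; again, this is an illustrative Python sketch under the same assumptions as above, not the exact implementation.

```python
import math
import random

def select_operator(weights):
    """Roulette-wheel selection: draw an operator index with probability
    proportional to its current weight."""
    total = sum(weights)
    probabilities = [w / total for w in weights]
    return random.choices(range(len(weights)), weights=probabilities, k=1)[0]

def removal_size(solution_size, e_lo, e_hi, mu_lo, mu_hi):
    """Randomly set the number of solution elements to destroy and rebuild,
    bounded by the absolute and relative limits."""
    lam = random.random()                         # λ ~ U(0, 1)
    lo = min(e_lo, mu_lo * solution_size)         # lower limit on the operation size
    hi = min(e_hi, mu_hi * solution_size)         # upper limit on the operation size
    return math.floor((1 - lam) * lo + lam * hi)
```

For instance, `select_operator(w_removal)` would return the index of the chosen removal operator, whose count (and, later, score) would then be updated.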

Based on the objective function evaluations of the solutions – \(f\), the ALNS algorithm proceeds as follows. If the resulting new solution is better than the best solution, then the algorithm updates both the current and the best solution, and subsequently rewards the operators with a score of \(\sigma_1\). However, if the new solution is only better than the current solution, then the algorithm updates only the current solution; further, the operators receive a relatively lower score of \(\sigma_2\), yet only if the new solution is novel. Nonetheless, if the new solution is worse than the current solution, then the ALNS algorithm accepts it as the current solution with a small probability, determined by the Boltzmann function – \(p=\exp((f(s)-f(s'))/T)\). For this update, the operators receive an even lower score of \(\sigma_3\), but yet again, only if the new solution is novel.
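
The acceptance test and the tiered operator scoring can be sketched as below, assuming a minimisation objective `f`, the current temperature `T`, the hash list `H` of previously visited solutions, and scores `sigma1`, `sigma2`, `sigma3`; the novelty check via a string hash is an illustrative assumption.

```python
import math
import random

def accept_and_score(s, s_new, s_best, f, T, H, sigma1, sigma2, sigma3):
    """Metropolis-style acceptance with tiered operator scores; returns the
    updated current solution, best solution, and the score earned by the
    operators that generated s_new."""
    score = 0.0
    novel = hash(str(s_new)) not in H             # has this solution been visited before?
    if f(s_new) < f(s_best):                      # new best solution
        s, s_best, score = s_new, s_new, sigma1
    elif f(s_new) < f(s):                         # better than the current solution
        s = s_new
        if novel:
            score = sigma2
    elif random.random() < math.exp((f(s) - f(s_new)) / T):
        s = s_new                                 # accept a worse solution
        if novel:
            score = sigma3
    return s, s_best, score
```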

Finally, at the end of each iteration, the ALNS algorithm stores the current solution in the hash list and updates the search temperature based on the cooling schedule. In the implementation here, the search cools down at an exponential rate dictated by the cooling factor – \(\varphi\); nonetheless, the algorithm always maintains a minimum temperature that enables it to accept a solution up to \(\underline{\omega}\) times worse than the best solution with a probability \(\underline{\tau}\).
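
The corresponding cooling step, with its temperature floor, may be sketched as below; the names `phi`, `f_best`, `w_lo`, and `tau_lo` mirror \(\varphi\), \(f(s^*)\), \(\underline{\omega}\), and \(\underline{\tau}\), and are illustrative.

```python
import math

def cool(T, phi, f_best, w_lo, tau_lo):
    """Exponential cooling with a floor: the floor lets a solution w_lo times
    worse than the best be accepted with probability tau_lo."""
    T_min = w_lo * f_best / math.log(1.0 / tau_lo)
    return max(phi * T, T_min)
```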

At the end of each segment, the ALNS algorithm updates operator weights based on operator performance, i.e., operator scores normalised by operator counts. In particular, the algorithm reacts to operator performance in the current segment through a reaction factor – \(\rho\), while accounting for operator performance in previous segments through the complementary dissipation factor – \((1-\rho)\). Moreover, after every \(k\) segments, the algorithm resets the current solution to the best solution, thus reinforcing the focus on high-quality solutions. Further, before initiating the next segment, the algorithm refines the current solution through the local search operators – \(o_l \in \Psi_l\), applied iteratively, each for \(m\) iterations. Note that, to address the vehicle routing problem at hand, the ALNS implementation here includes a total of six local search operators, each combining one of three search principles: move, swap, and opt, with an intra- or inter-route search scope.
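
The adaptive weight update at the end of a segment, with reaction factor `rho`, may be sketched as below; the per-operator lists `weights`, `scores`, and `counts` are illustrative data structures.

```python
def update_weights(weights, scores, counts, rho):
    """Blend each operator's recent performance (score per use) with its
    historical weight through the reaction factor rho."""
    for idx in range(len(weights)):
        if counts[idx] > 0:                       # only update operators used in this segment
            weights[idx] = rho * scores[idx] / counts[idx] + (1 - rho) * weights[idx]
    return weights
```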

Finally, upon completing \(j \times n\) iterations, the algorithm returns the best solution.

Pseudo Code#

  1. Procedure \(\text{ALNS}(s_o, (j, k, n, m, \Psi_r, \Psi_i, \Psi_l, \sigma_1, \sigma_2, \sigma_3, \underline{e}, \overline{e}, \underline{\mu}, \overline{\mu}, \underline{\omega}, \overline{\omega}, \underline{\tau}, \overline{\tau}, \varphi, \rho))\)

  2. \(s ← s_o\)// initialise current solution \(s\) as the initial solution \(s_o\)

  3. \(s^* ← s\)// initialise best solution \(s^*\) as the current solution

  4. \(H ← \{h(s)\}\)// initialise hash list

  5. \(T ← \overline{\omega} f(s^*) / \ln(1 / \overline{\tau})\)// initialise temperature based on cooling schedule

  6. for \(o_r ∈ \Psi_r\) do// initialise removal operator weights to 1

  7. \(w_r ← 1\)

  8. end for

  9. for \(o_i ∈ \Psi_i\) do// initialise insertion operator weights to 1

  10. \(w_i ← 1\)

  11. end for

  12. \(u ← 1\)// initialise segment index to 1

  13. while \(u ≤ j\) do// repeat for \(j\) segments

  14. for \(o_r ∈ \Psi_r\) do

  15.   \(c_r ← 0\)// set removal operator count to 0

  16.   \(\pi_r ← 0\)// set removal operator score to 0

  17.   \(p_r ← w_r / \sum_{r ∈ Ψ_r} w_r\)// update removal operator probability

  18. end for

  19. for \(o_i ∈ \Psi_i\) do

  20.   \(c_i ← 0\)// set insertion operator count to 0

  21.   \(\pi_i ← 0\)// set insertion operator score to 0

  22.   \(p_i ← w_i / \sum_{i ∈ Ψ_i} w_i\)// update insertion operator probability

  23. end for

  24. \(v ← 1\)// initialise iteration index to 1

  25. while \(v ≤ n\) do// repeat for \(n\) iterations

  26.   \(o_r \xleftarrow{R} p_r\)// randomly select a removal operator

  27.   \(o_i \xleftarrow{R} p_i\)// randomly select an insertion operator

  28.   \(c_r ← c_r + 1\)// update removal operator count

  29.   \(c_i ← c_i + 1\)// update insertion operator count

  30.   \(\Lambda \sim U(0, 1)\)

  31.   \(\lambda \xleftarrow{R} \Lambda\)

  32.   \(q ← \lfloor (1 - \lambda) \min(\underline{e}, \underline{\mu} ||s||) + \lambda \min(\overline{e}, \overline{\mu} ||s||) \rfloor\)// set the size of removal/insertion operation

  33.   \(s' ← o_i(o_r(q, s))\)// remove and insert selected customer nodes

  34.   if \(f(s') < f(s^*)\) then// if the new solution is better than the best solution

  35.    \(s^* ← s'\)// update the best solution

  36.    \(s ← s'\)// update the current solution

  37.    \(\pi_r ← \pi_r + \sigma_1\)// update removal operator score by \(\sigma_1\)

  38.    \(\pi_i ← \pi_i + \sigma_1\)// update insertion operator score by \(\sigma_1\)

  39.   else if \(f(s') < f(s)\) then// else if the new solution is better than the current solution

  40.    \(s ← s'\)

  41.    if \(h(s) ∉ H\) then// if the solution does not exist in the hashed tabu list

  42.     \(\pi_r ← \pi_r + \sigma_2\)// update removal operator score by \(\sigma_2\)

  43.     \(\pi_i ← \pi_i + \sigma_2\)// update insertion operator score by \(\sigma_2\)

  44.    end if

  45.   else// else accept new solution with a small probability

  46.    \(\Lambda \sim U(0, 1)\)

  47.    \(\lambda \xleftarrow{R} \Lambda\)

  48.    if \(\lambda < \exp((f(s) - f(s')) / T)\) then

  49.     \(s ← s'\)// update current solution

  50.     if \(h(s) ∉ H\) then

  51.      \(\pi_r ← \pi_r + \sigma_3\)// update removal operator score by \(\sigma_3\)

  52.      \(\pi_i ← \pi_i + \sigma_3\)// update insertion operator score by \(\sigma_3\)

  53.     end if

  54.    end if

  55.   end if

  56.   \(H ← H ∪ \{h(s)\}\)// add current solution to hash list

  57.   \(T ← \max(\varphi T, \underline{\omega} f(s^*) / \ln(1 / \underline{\tau}))\)// update temperature based on the cooling schedule

  58.   \(v ← v + 1\)// update iteration index

  59. end while

  60. if \(f(s) < f(s^*)\) then// if the current solution is better than the best solution

  61.   \(s^* ← s\)// update the best solution to the current solution

  62. end if

  63. for \(o_r ∈ \Psi_r\) do// update removal operator weights

  64.   if \(c_r ≠ 0\) then

  65.    \(w_r ← \rho \pi_r / c_r + (1 - \rho) w_r\)

  66.   end if

  67. end for

  68. for \(o_i ∈ \Psi_i\) do// update insertion operator weights

  69.   if \(c_i ≠ 0\) then

  70.    \(w_i ← \rho \pi_i / c_i + (1 - \rho) w_i\)

  71.   end if

  72. end for

  73. if \(u \mod k = 0\) then// after every \(k\) segments, reset the current solution to the best solution

  74.   \(s ← s^*\)

  75. end if

  76. for \(o_l ∈ \Psi_l\) do// iteratively perform local search on the current solution

  77.   \(s ← o_l(s, m)\)

  78. end for

  79. \(H ← H ∪ \{h(s)\}\)// add the current solution to the hash list

  80. \(u ← u + 1\)// update segment index

  81. end while

  82. return \(s^*\)// return the best solution