User story: Outsmarting NP-hardness


Order batching problems in warehouses are NP-hard

In a warehouse, orders have to be picked by order pickers. These order pickers have to visit the storage locations of the items on their pick list and retrieve the requested number of items. By combining similar orders [1] on a single pick list, the travelling distance per order can be reduced. 
This increases efficiency and, thus, productivity. 

However, finding which orders should be combined is complex, and solving this ‘order batching’ problem is NP-hard [2]. Roughly speaking, every item added to an NP-hard problem multiplies the time needed to solve it exactly, so the solution time grows exponentially with the problem size rather than polynomially, as it does for problems in the class P. One peculiarity of such problems is that, as far as we know, there is no way to solve them efficiently.
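To give a feel for this growth (a rough illustration, not taken from the research itself): even a modest number of orders can be grouped into batches in an astronomical number of ways. The short Python sketch below counts these possible groupings, known as Bell numbers, while ignoring batch-capacity limits.

from math import comb
from functools import lru_cache

# Bell number B(n): the number of ways to partition n orders into
# non-empty batches (capacity limits ignored). Illustrative only.
@lru_cache(maxsize=None)
def bell(n: int) -> int:
    if n == 0:
        return 1
    # Recurrence: B(n) = sum over k of C(n-1, k) * B(k)
    return sum(comb(n - 1, k) * bell(k) for k in range(n))

for n in (5, 10, 15, 20):
    print(f"{n:>2} orders -> {bell(n):,} possible batchings")
# 20 orders already allow more than 5 * 10**13 different batchings.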

Furthermore, for every pick list, the best sequence in which to visit the item locations has to be found to reduce the required time even further. The combination of these problems results in long computational times.
If dynamic order arrivals are also included, the computations have to be performed multiple times ‘during the planning period’ (i.e., while operations are running) to incorporate these new orders into the current schedules. This increases the required computational time considerably.
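To give an idea of the routing subproblem, the following sketch builds a pick route with a simple nearest-neighbour rule on made-up (x, y) storage locations. It is illustrative only and not the routing policy used in the study.

from math import dist

def nearest_neighbour_route(depot, locations):
    # Order the storage locations of one pick list by repeatedly
    # visiting the closest location that has not been visited yet.
    route, current, remaining = [], depot, list(locations)
    while remaining:
        nxt = min(remaining, key=lambda loc: dist(current, loc))
        route.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return route

def route_length(depot, route):
    # Total travel distance: depot -> locations in route order -> depot.
    stops = [depot] + route + [depot]
    return sum(dist(a, b) for a, b in zip(stops, stops[1:]))

# Made-up item locations for a single pick list
depot = (0.0, 0.0)
items = [(2.0, 5.0), (8.0, 1.0), (3.0, 4.0), (7.0, 6.0)]
route = nearest_neighbour_route(depot, items)
print(route, round(route_length(depot, route), 2))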

Metaheuristic algorithms to the rescue

To solve large optimisation problems that are NP-hard, metaheuristic algorithms can be used.
A metaheuristic algorithm finds good solutions to difficult problems in reasonable computational times. Still, as the problem remains NP-hard, it does not guarantee finding the optimum, although the solution it returns is usually quite good.
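As a generic illustration of how such an algorithm works, the sketch below shows a minimal simulated-annealing skeleton in Python. It is not the algorithm developed at UHasselt, just the common pattern: repeatedly perturb a solution and occasionally accept a worse one, so the search can escape local optima.

import math
import random

def simulated_annealing(initial, neighbour, cost,
                        t_start=10.0, t_end=0.01, alpha=0.95, moves=200):
    # Generic skeleton: accept worse neighbours with a probability that
    # shrinks as the temperature cools, to escape local optima.
    current, current_cost = initial, cost(initial)
    best, best_cost = current, current_cost
    t = t_start
    while t > t_end:
        for _ in range(moves):
            candidate = neighbour(current)
            delta = cost(candidate) - current_cost
            if delta < 0 or random.random() < math.exp(-delta / t):
                current, current_cost = candidate, current_cost + delta
                if current_cost < best_cost:
                    best, best_cost = current, current_cost
        t *= alpha  # cool down
    return best, best_cost

For order batching, the initial solution could be any feasible grouping of orders into batches, the neighbour function could move one order to a different batch, and the cost function could be the total picking distance; these specifics are illustrative assumptions, not details from the paper.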

The research group ‘Logistics’ of Hasselt University (LOG) aims to achieve theoretical advances in modelling and solving real-life decision-making problems in logistics. 

Prof. dr. Kris Braekers (UHasselt): “Part of our research focuses on developing efficient heuristic algorithms. With these algorithms, we wish to solve complex combinatorial optimisation problems with transportation or logistics applications (e.g., vehicle routing problems).”

Ruben D’Haen (Logistics UHasselt): “Right now, the allocation of orders to batches is very simplistic in most warehouses. This implies that large efficiency gains can be obtained by using a more sophisticated algorithm that tries to group orders in a smart way. We developed a metaheuristic algorithm that solved the order batching problem in shorter computational times.  We also looked at how the batching of orders can be handled efficiently, even when new orders arrive during the planning period (i.e., while the company operates). This had not been studied yet in the literature for a setting as complex as ours.” 

After constructing an initial solution at the start of the working day, their new algorithm can efficiently incorporate newly arriving orders into the existing schedule throughout the day.
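A rough sketch of that idea is given below, with hypothetical helper names rather than the actual implementation: when a new order arrives, it is tried in every batch that still has capacity and the cheapest feasible insertion is kept.

def insert_new_order(order, batches, capacity, extra_cost):
    # Try the new order in every batch that still has room and keep the
    # cheapest feasible insertion; otherwise open a new batch.
    # extra_cost(batch, order) is a hypothetical function estimating the
    # additional travel and handling cost of adding the order to a batch.
    best_batch, best_delta = None, float("inf")
    for batch in batches:
        if len(batch) >= capacity:
            continue
        delta = extra_cost(batch, order)
        if delta < best_delta:
            best_batch, best_delta = batch, delta
    if best_batch is None:
        batches.append([order])
    else:
        best_batch.append(order)
    return batches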

Ruben D’Haen: “The challenge in this research was the combination of several problems (finding which orders should be combined and the best sequence of visiting the item locations). Every subproblem influences the others. We tried to integrate this into a smart algorithm solving all these problems simultaneously. The computational complexity was reduced from NP-hard (exponential) to polynomial complexity [3].”
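One way to picture that interaction (a simplified sketch, reusing the illustrative nearest-neighbour helpers from the routing example above): the quality of a batching decision can only be judged by routing every batch, so the subproblems have to be evaluated together.

def total_picking_distance(batches, item_locations, depot=(0.0, 0.0)):
    # Evaluate a batching solution by routing every batch: the batching
    # objective depends directly on the routing subproblem.
    total = 0.0
    for batch in batches:
        # Collect the storage locations of all items in this batch,
        # where each order is a list of item identifiers.
        locations = [item_locations[item] for order in batch for item in order]
        route = nearest_neighbour_route(depot, locations)
        total += route_length(depot, route)
    return total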

Supercomputing reduces tuning & testing time of the algorithm

Tuning the algorithm’s parameters to the problem at hand requires quite some testing. Furthermore, testing how the algorithm performs in various settings requires solving many different ‘instances’ [4] of the problem. By varying other factors in the experimental design, the impact of these factors on the solution quality can be studied, e.g., the effect of a larger set of orders, a different arrival pattern of the orders… This gives insight into what the effect of the real-life characteristics and circumstances of a particular warehouse could be on the efficiency of the algorithm.
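As a rough idea of the scale involved (hypothetical parameter names and values, not the actual experimental design): a full-factorial design over a few tuning parameters and instance factors quickly multiplies into hundreds of independent runs, which is exactly the kind of embarrassingly parallel workload a supercomputer handles well.

from itertools import product

# Hypothetical tuning grid and instance factors (illustrative values only)
start_temperatures = [1.0, 5.0, 10.0]
cooling_rates = [0.90, 0.95, 0.99]
order_counts = [200, 500, 1000]
arrival_patterns = ["uniform", "morning_peak", "evening_peak"]
replications = range(10)              # repeated runs per setting

runs = list(product(start_temperatures, cooling_rates,
                    order_counts, arrival_patterns, replications))
print(len(runs), "independent runs")  # 3 * 3 * 3 * 3 * 10 = 810

# Every run is independent, so on a cluster each combination can be
# submitted as a separate job and executed in parallel.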

Ruben D’Haen: “Combined, all the parameter tuning and testing done for our scientific paper, and afterwards for the company involved, took at least five days on a supercomputer. Performing these tests on my laptop would take 36 times longer, or over four months. Furthermore, using a supercomputer allows me to use my laptop in the meantime for other research activities, which would be impossible if the computations were performed on it.”

Bridging the gap between academia and practice 

Lotte Verdonck (Business Developer UHasselt): “We always valorise our results in cooperation with the industry. Using the supercomputing infrastructure makes it possible to bridge this gap between academia and practice as complex real-life problems can be solved in a reasonable time frame while providing a competitive advantage to the company involved.”

In this case, too, what started as theoretical research for a scientific paper gained a very practical component over time.
For the scientific paper, the UHasselt research group used data from an existing company to make the research more realistic.
Subsequently, the fine-tuned algorithm was applied to production data provided by the company.

Business & scientific impact 

Ruben D’Haen: “When looking at the company we are working with, we see that they currently operate using a very simple algorithm to group orders. The distances to be travelled in the warehouse by the company’s order pickers are also not optimised but rather follow simple rules of thumb. Large efficiency gains are obtained if our algorithm is used on their data. This way, the company can pick up all orders before their deadlines with fewer order pickers than currently required. This implies that the same work can be performed at a lower cost, making the company more competitive.” 

The company can obtain strategic/tactical insights to improve its operational efficiency in the future. This research also showed how dynamic order arrivals could be handled efficiently, extending the current state-of-the-art scientific research. 

The research of UHasselt shows that developing and using a metaheuristic algorithm is helpful in practical settings, even if it does not guarantee finding the optimal solution. The improvements over the very simplistic optimisation methods used in practice are large, allowing for significant efficiency gains, while computational times are kept sufficiently short.

[1] Two orders are similar if their item locations are close to each other.

[2] NP-hard is a complexity class: a problem is NP-hard if it is at least as hard as the hardest problems in the class NP, and no efficient (polynomial-time) algorithm is known for such problems.

[3] https://en.wikipedia.org/wiki/Time_complexity#Polynomial_time

[4] By instances, we mean artificially generated data sets following an experimental design based on probability distributions and parameter settings extracted from real data about the orders in the company.

More information

Ruben D'Haen

Ruben D'Haen obtained his Bachelor's and Master's degrees in Business Engineering (‘Handelsingenieur’) at Hasselt University in 2019. After his studies, he started working as a PhD student in the research group Logistics, also at Hasselt University; his research is now funded by the FWO (Research Foundation Flanders). His research focuses on the integration of warehousing and distribution decisions, although his first project was situated entirely in a warehousing context. There he studied the integration of several order picking planning problems (order batching, picker routing and batch scheduling) that arise in a warehouse, while accounting for dynamic order arrivals.

Prof. dr. Kris Braekers

Kris Braekers is an associate professor at the research group Logistics of Hasselt University. His research mainly focuses on the development of (heuristic) solution approaches for complex combinatorial optimisation problems in transport and logistics, including problems related to vehicle routing, warehouse operations and healthcare logistics.

Lotte Verdonck

Lotte Verdonck is business developer for the Faculty of Business Economics at Hasselt University. Her aim is to bridge the gap between academic research and industrial and societal challenges. The research fields she covers include Logistics, Data analytics & AI, Service marketing & customer behaviour, Family firms, Entrepreneurship, Diversity & inclusion, and Policy management & law economics.