Guest Commentary: Why Stochastic Transportation Modeling is Better for Strategic Planning

Most large shippers spend considerable time and expense collecting, analyzing and maintaining the data that drive daily transportation planning and execution. I refer to these data, when codified and integrated into the shipper’s transportation system, as the organization’s transportation policy. The policy comprises lanes, modal options, rates, carrier and fleet capacity and service levels, as well as a multitude of other decision variables and rules that must work in concert to drive daily decision making.

Because the transportation policy defines the framework within which all transportation decisions are made, it is imperative that it not be overly constricting. It must be flexible enough to give the transportation management system’s advanced optimization capabilities the latitude to find savings. It must also be built with an awareness of the variability found within virtually every supply chain. It is on this point that I believe there are opportunities to improve on existing processes.

Deterministic vs. Stochastic Strategic Planning
When developing the data that make up the transportation policy, most analysts still rely heavily on statistical averages. This approach ignores the inherent variability that underlies nearly every aspect of transportation. From order sizes to lane volumes to travel times to fuel prices, using a single, discrete value to represent each of these variables during strategic planning leads to a poorly designed transportation policy that will ultimately surface as inefficiencies in daily planning.

So, why do most organizations develop their policy in this manner? The simple answer is that averaging is easy to comprehend and easy to calculate. Additionally, averages provide a discrete, numeric value that can be used by the deterministic optimization tools often employed in transportation modeling. Unfortunately, averages rarely reflect reality. For example, if half the time a supplier ships 5,000 pounds and the other half it ships 40,000 pounds, the average is 22,500 pounds. But in reality the supplier never ships a 22,500-pound order. Therefore, a fundamental input used as the basis for the transportation policy is incorrect.
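The flaw-of-averages arithmetic can be made concrete with a small sketch. The LTL and truckload rates below are purely hypothetical assumptions, chosen only to show how a plan built on the 22,500-pound average diverges from the expected cost over the orders that actually ship:

```python
# Illustrative sketch (hypothetical rates): the cost a planner sees when
# modeling the average order weight vs. the true expected cost over the
# actual bimodal order distribution.

def ship_cost(pounds):
    # Assumed rate structure: per-pound LTL below 10,000 lb, flat TL above.
    if pounds < 10_000:
        return pounds * 0.25          # $0.25/lb LTL (hypothetical)
    return 1_200.0                    # flat truckload rate (hypothetical)

orders = [5_000, 40_000]              # half 5,000 lb, half 40,000 lb
avg_weight = sum(orders) / len(orders)           # 22,500 lb

cost_at_average = ship_cost(avg_weight)          # deterministic plan: every order "is" a TL
true_expected_cost = sum(ship_cost(w) for w in orders) / len(orders)

print(avg_weight)          # 22500.0
print(cost_at_average)     # 1200.0 -- the model never sees an LTL shipment
print(true_expected_cost)  # 1225.0 -- reality: half LTL, half TL
```

Under these assumed rates the deterministic plan underestimates cost and, worse, mis-predicts the mode of every single shipment.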

To be clear, for daily planning a deterministic planning tool is usually the most logical choice, since order quantities, rates, capacity and other values are typically known when the solution run is initiated. Strategic planning, however, should use a “stochastic” approach based on calculated probabilities, not the oxymoronic “forecasted certainties” implied by the deterministic approach. Applying stochastic principles is not new to supply chain planning. The concept of safety stock exists precisely because we cannot predict customer demand, lead times or order fill rates with complete certainty. The latest transportation modeling technology allows for combined stochastic optimization and simulation, which enables analysts to incorporate known variability into the development of the transportation policy.
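The safety-stock analogy can be sketched with the standard textbook formula: the buffer is sized by demand variability, not by average demand. The service factor, demand standard deviation and lead time below are assumed illustration values, not figures from this article:

```python
# Textbook safety-stock sketch: inventory buffered against demand
# variability over the replenishment lead time -- the same principle
# the article extends to transportation planning.
import math

z = 1.645                      # service-level factor, ~95% one-sided (assumed)
daily_demand_sigma = 120.0     # std dev of daily demand, in units (assumed)
lead_time_days = 9.0           # replenishment lead time (assumed)

# Safety stock scales with the *spread* of demand, not its mean.
safety_stock = z * daily_demand_sigma * math.sqrt(lead_time_days)
print(round(safety_stock))     # -> 592 units of buffer
```

Note that average demand never appears in the formula: if demand were perfectly predictable, the required buffer would be zero regardless of how large the average is.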

So, how does this all work in the real world?
Let’s take the example of prepaid-to-collect conversion. The decision is driven by the difference between the freight allowance provided by the supplier and the transport cost that would be incurred by the customer. Although there are many constants in this example that might lead one to a deterministic solution, prepaid-to-collect is a long-term, strategic decision that is subject to significant levels of variability over time. To make the right decision, this variability must be included in the analysis. Key elements that need to be modeled for variability include:

  • Order quantities and product mix
  • Order quantities of neighboring suppliers (i.e., Can LTL shipments be converted into multi-stop TL movements?) This is important because looking at each supplier independently will not locate the benefits associated with inbound multi-stop TL consolidation.
  • Freight rates
  • Fuel prices. If no fuel surcharge is charged back to the supplier, the customer is taking all the risk around fuel price volatility.
  • Order frequency
  • Transport lead times. Do I have time to perform consolidations or are the ordering patterns such that I have to regularly use direct and/or expedited freight?

The standard practice commonly employed today would be to gather a snapshot of “representative” data. The fallacy of this approach is that no representative data set that utilizes discrete values can accurately portray the variability that will naturally occur in a stochastic world. By combining optimization and simulation, the variability of these data is included in the model and will therefore lead to a transportation policy that maximizes savings over time while also reducing costly manual exception management.
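A minimal Monte Carlo sketch illustrates the difference: instead of scoring the prepaid-to-collect conversion once at average values, sample the variable inputs many times and examine the distribution of outcomes. All distributions, rates and the allowance figure below are hypothetical assumptions, and real models would sample many more of the variables listed above:

```python
# Minimal Monte Carlo sketch of a prepaid-to-collect decision.
# All rates and distributions are hypothetical assumptions.
import random

random.seed(42)                 # reproducible sampling

ALLOWANCE_PER_ORDER = 1_150.0   # supplier's freight allowance (assumed)

def collect_cost():
    """Sample one order's collect freight cost from assumed distributions."""
    weight = random.choice([5_000, 40_000])   # bimodal order size
    fuel = random.uniform(0.9, 1.3)           # fuel surcharge factor
    if weight < 10_000:
        base = weight * 0.25                  # LTL $/lb (assumed)
    else:
        base = 1_200.0                        # flat TL rate (assumed)
    return base * fuel

N = 10_000
costs = [collect_cost() for _ in range(N)]
mean_cost = sum(costs) / N
p_savings = sum(c < ALLOWANCE_PER_ORDER for c in costs) / N

print(f"mean collect cost: {mean_cost:.0f}")
print(f"share of orders where conversion saves money: {p_savings:.0%}")
```

The output is a probability, not a point estimate: a deterministic comparison of the allowance against the average cost would simply say "don't convert," while the simulation shows how often, and by how much, either answer holds as order sizes and fuel prices vary.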

In summary, most organizations take a deterministic approach when developing their transportation policy simply because it is easy and it is the way it has always been done. Long-term strategic planning, however, must account for known variability. Only by accounting for that variability can analysts create an operationally resilient and efficient transportation policy that reflects the reality of an ever-changing world.

Mike Mulqueen is a Senior Director in Manhattan Associates’ Product Management organization, where he is responsible for providing strategic direction for the company’s Transportation Management solutions. Mulqueen has more than 20 years’ experience developing, implementing and overseeing transportation and logistics systems. Prior to joining Manhattan Associates, he held transportation-focused leadership positions at Accenture, C&S Wholesale Grocers, UPS, and Manugistics. Mulqueen holds a BS degree from the University of Maryland, College Park and a Master of Engineering degree from the Massachusetts Institute of Technology.


  1. Mr. Mulqueen is spot on target with his comments. “The Flaw of Averages” by Dr. Sam Savage has a detailed explanation of this issue, and the value of the stochastic approach. And it is misleading to use the mean and look at +/- 10% as a sensitivity analysis. The reality is usually quite a different distribution range.

    When you have historic variance, Mulqueen describes quite well the benefits gained by simulation and stochastic optimization. When you are facing a future where your historic data may or may not reflect what could happen, a stochastic analysis is absolutely essential. In addition, there may be events that could shift the stochastic distribution in a dramatic fashion. These need to be included, with their probability of occurrence, in the analysis of your strategies.

    The resulting comparison of alternatives and sensitivity analysis will provide a level of insight into what is important that cannot be obtained with deterministic assumptions. You will gain the confidence to make fully informed decisions.
