My previous two posts focused on the importance of constraints in TMS algorithms and the issues around executing the routing algorithm’s recommendations in a dynamic, real-world environment. These are two very challenging issues that a TMS must be able to address. Customers’ requirements are complex, and unless you have spent years in the transportation industry and have built a TMS before, it is very difficult to anticipate these needs in the original design. But if designed correctly, a TMS should be able to support these needs – accurately, quickly, and intuitively.
- Accurately – Transportation is the point at which a company touches the outside world. Deliveries need to be on time and products need to be handled properly. When transportation decisions are not accurate, orders are often damaged or delayed, and money is wasted.
- Quickly – Transportation decisions typically need to be made and executed within hours; transportation professionals don’t have the luxury of analyzing data for days or weeks to decide how they will move freight.
- Intuitively – Transportation professionals want a TMS to figure out how to move their freight and do so with very little human intervention. For this to happen, a TMS must possess very sophisticated components that can quickly determine what to do while considering many complex variables that carry with them a lot of uncertainty.
There has been quite a bit of discussion in the industry lately about another key attribute of a great TMS: usability. As I read a recent post by Adrian Gonzalez discussing how Excel is the most popular TMS, I began to ponder why this is true. The point that resonated most with me was how intuitive Excel is and how comfortable most people are using it. Like Excel, I believe that a great TMS needs to be intuitive. As an extension of your business, your TMS should present the familiar screens and data you are accustomed to seeing, not add layers of complexity to the daily workflows already in place.
Additionally, we have been hearing about the importance of a single platform. In fact, I read a related article not long ago, and I could not agree more that the most common mistake is adding the algorithm as an afterthought. A good TMS (much less a great TMS) should not have its routing algorithm on a separate platform. The algorithm should not use different data as inputs. It should not require data to be moved to a different environment. And it should not be written and supported by a third party; the algorithm must be an integral part of the execution platform.
Why is this so important?
I have been doing this for more than 20 years, and I’ve never quite been able to come up with a quick and easy explanation for why this is so important; hopefully, as you continue to read, the importance of a single platform will become clearer. In my opinion, the major issue is that the inputs to the algorithm are constantly changing, so the data used can be stale by the time the model finishes. We don’t have the luxury of “stopping the world” while we run the model and execute its recommendations. Some data changes (even seemingly small ones) can drive major routing changes and become very expensive if not handled correctly. We need many decision support tools to address the countless types of changes that occur throughout the day, rather than expecting one large algorithm to produce a magical plan that can be executed as expected. These algorithms and their single platform must be able to support any type of operation as companies change the way they receive, fulfill, and ship orders over time. Here are a few examples that may resonate with you:
- A shipper receives orders throughout the day that are expected to ship that same day: new orders, newly released orders, newly manufactured orders. This shipper might have to send loads to the warehouse throughout the day, the warehouse needs to pick them constantly, and carriers arrive throughout the day. At no point during the day is the full set of orders that will ship both known and not yet shipped. Orders might come in at 2 PM to ship by 5 PM, after 100 orders have already gone out the door. These new orders need to be routed: some might go direct LTL or TL, others might be put together to form new multi-stop loads, and still others might be combined with LTLs or TLs that have not yet shipped to save money. An algorithm that routes only fresh orders cannot handle this scenario; it must be solved by a different algorithm, but one that is part of the single platform and knows which TLs and LTLs can still be changed (see the first sketch after this list).
- In the scenario above (and in others), a shipper might release routings to the warehouse throughout the day. Some TLs and LTLs go earlier in the day – as soon as they are set – while others go later, as appointments are scheduled, as carriers are found, or on the expectation that additional orders might come in to fill out a truck. Here, the single platform driven by the TMS must be able to control not only the execution of the moves but also the “releasing” of the moves. Only parts of the algorithm’s recommendation are released at a time, which is very hard to handle when planning and execution sit on separate systems.
- Most algorithms on separate platforms cannot share data with the TMS. This creates a real challenge, as the algorithms need enormous amounts of data, including appointments, hours of service, unloading times by location, special handling requirements by product, etc. Some systems are forced to keep two sets of supporting files for the same data elements. At best, these are hard to keep in sync and can therefore make a system barely usable.
- Most changes to a multi-stop load require at least part of the algorithm to be rerun. If the appointment for the first stop on a load moves from 9 AM to 7 AM, can the load still make the first stop on time? How does that change affect the times for the rest of the stops on the load? When one considers that the change could also involve reordering stops or adding a new stop, it’s clear that these are serious algorithms to run – very challenging indeed if the algorithm were not part of the TMS (see the second sketch below).
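To make the first scenario concrete, here is a minimal sketch of incremental consolidation. Everything in it – the Order and Load types, route_incrementally, the 44,000 lb weight limit – is a hypothetical illustration rather than any real TMS API, and a production engine would also check time windows, equipment, detours, and rates. The point is simply that the router must share the platform’s knowledge of which loads are still changeable:

```python
from dataclasses import dataclass

# Hypothetical, simplified types: real TMS orders and loads carry far
# more data (time windows, equipment, handling requirements, rates).

@dataclass
class Order:
    order_id: str
    destination: str
    weight_lbs: float

@dataclass
class Load:
    load_id: str
    stops: list          # destinations in planned stop order
    weight_lbs: float
    released: bool       # released loads may no longer be changed

MAX_WEIGHT_LBS = 44_000  # illustrative trailer weight limit

def route_incrementally(new_orders, open_loads):
    """Assign late-arriving orders, preferring existing mutable loads.

    Returns a dict of order_id -> load_id. Only loads the platform
    knows are still changeable (not yet released) are candidates.
    """
    assignments = {}
    next_id = len(open_loads) + 1
    for order in sorted(new_orders, key=lambda o: -o.weight_lbs):
        candidates = [
            ld for ld in open_loads
            if not ld.released
            and order.destination in ld.stops
            and ld.weight_lbs + order.weight_lbs <= MAX_WEIGHT_LBS
        ]
        if candidates:
            # Greedy: top off the fullest load that still fits; a real
            # solver would weigh rates, detours, and service times.
            chosen = max(candidates, key=lambda ld: ld.weight_lbs)
            chosen.weight_lbs += order.weight_lbs
        else:
            chosen = Load(f"NEW-{next_id}", [order.destination],
                          order.weight_lbs, released=False)
            open_loads.append(chosen)
            next_id += 1
        assignments[order.order_id] = chosen.load_id
    return assignments

# A 2 PM order rides on an existing, not-yet-released truckload:
loads = [Load("TL-7", ["Albany", "Buffalo"], 30_000, released=False)]
print(route_incrementally([Order("O-101", "Buffalo", 8_000)], loads))
# {'O-101': 'TL-7'}
```

A planner bolted onto a separate platform would have to guess at the equivalent of the `released` flag, which is exactly the stale-data problem described above.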
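To illustrate the appointment-change example, here is a similarly hedged sketch of the feasibility re-check such a change triggers. Again, recheck_schedule and the drive and unload times are hypothetical simplifications; a real algorithm would also weigh reordering stops, hours of service, and dock hours:

```python
from datetime import datetime, timedelta

def recheck_schedule(depart, stops):
    """Re-derive arrival times after an appointment change and return
    the stops the truck can no longer reach on time.

    `stops` is a list of (drive_minutes_from_previous_stop,
    unload_minutes, appointment_datetime) tuples.
    """
    problems = []
    clock = depart
    for i, (drive_min, unload_min, appt) in enumerate(stops, start=1):
        clock += timedelta(minutes=drive_min)      # arrive at stop i
        if clock > appt:
            problems.append((i, clock, appt))      # appointment missed
        # Wait for the appointment if early, then unload.
        clock = max(clock, appt) + timedelta(minutes=unload_min)
    return problems

# Stop 1's appointment moves from 9:00 AM to 7:00 AM. With a 6:30 AM
# departure and 90 minutes of driving, stop 1 is now missed, and the
# recomputed times ripple into every later stop on the load.
day = datetime(2014, 1, 15)
plan = [
    (90, 45, day.replace(hour=7)),    # first stop: appointment was hour=9
    (120, 30, day.replace(hour=11)),  # second stop
]
print(recheck_schedule(day.replace(hour=6, minute=30), plan))
```

This re-check is cheap when the schedule, appointments, and load data all live in one system; shipping the load back and forth to a separate optimization environment for every 7 AM-versus-9 AM question is not.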
The other day, I reflected on all these issues and it suddenly hit me: a great TMS must be designed by someone (or ideally, a team) that has done it successfully before, preferably more than once. A novice could never know enough to consider most of these issues at the beginning, which is when the system has to be designed correctly. Even someone who has designed a TMS once before, and who knows some of the issues involved, would still get many things wrong. It’s only after trying many different solutions that one can really get it right.
Over 20 years ago, when the first generation of TMSs was created, the industry had two classes of product: algorithmic products that handled complex routing, and rating products that did simple routing and payment. The market dictated the need to merge these two classes of products, which required new products to be written from scratch, as the data models of the earlier products could not support the new market requirements. About 10 years later, history repeated itself: it became obvious that the industry needed a much more flexible, feature-rich system to handle more modes and more geographies. Once again, new products had to be written – from scratch – driven by the limitations of the earlier data models.
Fast forward to today, where we see countless companies with unique and challenging transportation environments that don’t require all the flexibility of the top second-generation TMSs and thus can’t justify the cost of one. The only way the industry is going to change is to create products that are as sophisticated as the top second-generation TMSs but much less complicated to set up and use. This requires changing the way a TMS uses its data model, plus advanced designs created by highly experienced TMS specialists who can deliver the needed features while shielding users from unnecessary complexity. This new, third generation of TMSs will provide the needed single-platform, multi-algorithm capabilities, but without all the “bells and whistles” of the top second-gen TMSs, so it can appeal to a wider array of shippers across many different industries. Only then will we replace Excel as the TMS of choice.
Mitch Weseley is the CEO of 3Gtms. With 30 years in the industry, Mitch is widely regarded as the “father of the TMS industry,” having created six successful companies in technology and logistics, including Weseley Software and G-Log.