Demand sensing is a forecasting method that leverages new mathematical techniques and near real-time information to create an accurate forecast of demand, based on the current realities of the supply chain. Gartner, Inc. insight on demand sensing can be found in its report, "Supply Chain Strategy for Manufacturing Leaders: The Handbook for Becoming Demand Driven."
Traditionally, forecast accuracy relied on time series techniques that build a forecast from prior sales history, drawing on several years of data to reveal predictable seasonal patterns. However, past sales are frequently a poor predictor of future sales. Demand sensing is fundamentally different: it uses a much broader range of demand signals (including current data from the supply chain) and different mathematics to create a more accurate forecast that responds to real-world events such as market shifts, weather changes, natural disasters, and changes in consumer buying behavior.
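One way to picture the idea (an illustrative sketch only, not any vendor's algorithm): a demand-sensing system corrects a baseline statistical forecast using a near-real-time signal such as recent point-of-sale or channel-order data. The blending weight here is a hypothetical parameter; real systems estimate which signals are predictive from the data itself.

```python
# Illustrative sketch of demand sensing as a correction to a baseline
# forecast. All names, numbers and the fixed blending weight are
# assumptions for demonstration purposes.

def sense_demand(baseline, recent_signal, weight):
    """Blend a baseline time-series forecast with a short-term demand signal.

    weight in [0, 1]: how much trust to place in the near-term signal.
    """
    return [(1 - weight) * b + weight * s
            for b, s in zip(baseline, recent_signal)]

baseline = [100, 100, 100]   # forecast from historical seasonality
pos_rate = [130, 125, 120]   # implied demand from current POS data
print(sense_demand(baseline, pos_rate, weight=0.6))  # -> [118.0, 115.0, 112.0]
```

The point of the sketch is structural: the historical model supplies the long-range shape, while current supply-chain data pulls the near-term forecast toward what is actually happening in the market.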
The cornerstone of traditional forecasting is time series analysis built on the Fourier series, conceived by Joseph Fourier in 1822. Fourier-style statistical modeling uses a historical data series to create seasonal forecasts, and it set the course of forecasting for the next 125 years. In 1957, Charles Holt introduced exponential smoothing, later extended with Peter Winters into the Holt-Winters method, taking time series analysis to a new level. In the 1980s, low-cost computing paved the way for larger and more complex time-series models, and Moore's law continues to fuel the trend of increasingly sophisticated models in the pursuit of refining forecast accuracy.
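The building block of the Holt-Winters family is simple exponential smoothing, which can be sketched in a few lines (the full Holt-Winters method additionally tracks trend and seasonal components). The series values below are made up for illustration.

```python
# Minimal sketch of simple exponential smoothing, the core of the
# Holt-Winters family of time-series methods. Data is hypothetical.

def exponential_smoothing(series, alpha):
    """Return the smoothed level after processing the series.

    alpha in (0, 1]: higher alpha weights recent observations more heavily.
    The final level serves as the one-step-ahead forecast.
    """
    level = series[0]                            # initialise with first point
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level  # blend new obs with history
    return level

sales = [100, 102, 101, 105, 110, 108]  # hypothetical weekly shipments
print(round(exponential_smoothing(sales, alpha=0.3), 2))  # -> 105.48
```

Note the connection to the article's theme: every term in the smoothed level is a weighted echo of past sales, which is precisely why such models cannot react to events that history has never seen.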
There remains, however, a ceiling for time-series forecast accuracy, governed not by processing power and memory but by fundamental limitations imposed by information theory and by the fact that historical data does not reflect current events or market conditions. Information theory dictates that increasing model sophistication in pursuit of a "perfect fit" reaches a point where further refinement of the time-series analysis actually decreases forecast accuracy. Industry figures show that despite highly tuned models, forecast error remains a challenge. Even high-volume consumer packaged products with well-understood seasonality patterns established over decades continue to experience high near-term forecast error using sophisticated traditional time-series methods. A recent study encompassing $250 billion of trade from 17 multinational CPG companies, with shipments of 9 billion cases across 1.6 million item-location combinations, found that the top quintile of highest-volume products (1% of items representing 20% of total volume) had an average time-series forecast error of 43%, while the lowest quintile of slowest-moving products (85% of products representing 20% of total volume) had an average forecast error of 65%.
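The study does not specify how its error percentages are defined, but a common convention in CPG forecasting is the volume-weighted mean absolute percentage error (WMAPE): total absolute error divided by total actual volume. The sketch below shows that calculation with made-up numbers.

```python
# Hedged illustration of WMAPE, a typical forecast-error metric in CPG.
# The cited study's exact metric is not given; data here is invented.

def wmape(actuals, forecasts):
    """Sum of absolute errors divided by total actual volume."""
    abs_err = sum(abs(a - f) for a, f in zip(actuals, forecasts))
    return abs_err / sum(actuals)

actual   = [120, 80, 95, 150]   # hypothetical weekly case shipments
forecast = [100, 90, 110, 120]  # what the time-series model predicted
print(f"WMAPE = {wmape(actual, forecast):.1%}")  # -> WMAPE = 16.9%
```

Weighting by volume is what makes the quintile comparison in the study meaningful: a 65% error on slow movers involves far fewer cases than a 43% error on the highest-volume items.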
Furthermore, historical data series are by nature disconnected from current events that affect demand in unpredictable ways – a financial downturn or recovery, a spike in energy prices, an outbreak of swine flu, regional unrest, or a natural disaster. Even changing weather patterns such as cold snaps and heat waves push consumer demand away from historical patterns. It is therefore no surprise that time series models are ill-suited to volatile markets, especially during market downturns or upturns.
Breaking this ceiling requires the inclusion of current demand signals from throughout the supply chain and new mathematics to sort through the masses of data and determine what is predictive. There is no shortage of near real-time data collected by manufacturers across their supply chains, and the volume grows dramatically once retailer data is included. According to a McKinsey & Company report, “Manufacturers can improve their demand forecasting and supply planning by the improved use of their own data. But as we’ve seen in other domains, far more value can be unlocked when companies are able to integrate data from other sources including data from retailers, such as promotion data (e.g., items, prices, sales), launch data (e.g., specific items to be listed/delisted, ramp-down plans), and inventory data (e.g., stock levels per warehouse, sales per store). By taking into account data from across the value chain (potentially through collaborative supply chain management and planning), manufacturers can smooth spiky order patterns. The benefits of doing so will ripple through the value chain, helping manufacturers to use cash more effectively and to deliver a higher level of service. Best-in-class manufacturers are also accelerating the frequency of planning cycles to synchronize them with production cycles. Indeed, some manufacturers are using near-real-time data to adjust production.” The last sentence refers to demand sensing solutions. For more information see McKinsey Global Institute's report, "Big Data: The Next Frontier for Innovation, Competition and Productivity."
Despite its name, "demand sensing" amounts to using all currently available data across forecasting nodes (and across time) to forecast each of them. As such, it applies a variety of pattern-recognition techniques to the portion of the data amenable to this kind of analysis, and interpretations of the concept vary considerably among practitioners.
Lora Cecere, Partner, Altimeter Group, explains the process of using retailer data, also referred to as downstream data, to enhance demand sensing performance. “It is hard work. It is cross-functional. It is a new way of thinking. At the core, it challenges traditional paradigms. However, if you can cross these boundaries, companies find that the use of downstream data pays for itself in less than six weeks every six weeks, and companies that were good at the use of downstream data and sensing channel demand aligned and transformed their supply chains 5X faster than competition.” For more information see Lora Cecere's post on downstream data.