Delivering the right message with Explainable AI in machine learning

In today’s dynamic business environment, harnessing data from the supply chain can provide a competitive advantage, especially in the face of unpredictable consumer behaviour. This is particularly true when demand is driven by promotions, prices, or tenders, and you want to integrate these drivers into your forecasting process.

However, traditional forecasting methods, such as time series forecasting, often fall short of utilizing this data effectively. Planners then have to enrich the statistical forecast manually, which is time-consuming and introduces the risk of human bias.

Fortunately, machine learning offers a solution: by leveraging both historical and future data, it can identify demand drivers and enhance demand predictions. A challenge arises, however, from the inherent black-box nature of powerful machine learning models. This is where Explainable AI (XAI) methods come into play, offering transparency and transforming the black box into a glass box.
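As a minimal sketch of this idea, the example below trains a gradient-boosting model on synthetic demand data with three hypothetical drivers (promotion, price, tender) and then applies permutation importance, one simple XAI technique, to see which drivers the model actually relies on. The feature names and data are invented for illustration; in practice, techniques such as SHAP values can be substituted for the explanation step.

```python
# Sketch: explaining a demand-forecasting model with permutation importance.
# All feature names and data here are synthetic, for illustration only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
n = 500
# Hypothetical demand drivers: promotion flag, unit price, tender volume.
promo = rng.integers(0, 2, n)
price = rng.uniform(5.0, 15.0, n)
tender = rng.uniform(0.0, 100.0, n)
X = np.column_stack([promo, price, tender])
# Synthetic demand: promotions lift demand, higher prices lower it.
y = 200 + 80 * promo - 10 * price + 0.5 * tender + rng.normal(0, 5, n)

model = GradientBoostingRegressor(random_state=0).fit(X, y)

# Permutation importance: how much does shuffling each driver
# degrade the model's accuracy on the data?
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for name, imp in zip(["promotion", "price", "tender"], result.importances_mean):
    print(f"{name}: {imp:.3f}")
```

The output gives planners a simple, model-agnostic ranking of which demand drivers matter most, which is exactly the kind of "glass box" view that supports the conversations described below.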

Figure: SHAP values

Explainable AI (XAI)

During my research at EyeOn, I explored various Explainable AI (XAI) techniques and their alignment with stakeholders’ needs in the demand planning process. I discovered that explanations are not one-size-fits-all; they depend on the interaction between the user and the explanation tool. Different stakeholders have unique requirements, which reveals a significant gap between the need for explanations and the existing algorithm-focused approaches.

By examining stakeholder needs, I found that model developers seek explanations to improve and refine models, planners require a solid understanding to support their decision-making and persuade others, and general managers seek confirmation of effective model utilization. Recognizing these diverse needs is crucial for successful implementation and acceptance during the planning process. 

Overall, I enjoyed putting these techniques into practice and gaining hands-on experience. It was rewarding to uncover each technique’s strengths and weaknesses and to better understand different stakeholders’ needs and motivations. With the support of a welcoming and innovative team, I am optimistic about future opportunities and the positive impact we can achieve by leveraging this knowledge and these techniques.

Figure: Explainable AI research

Curious to Learn More? Get in Touch with Our Experts!

If you’re interested in learning more about these findings and their practical implications, or would like to engage in further discussion, feel free to contact one of our experts. Want to know how we apply this? Take a look at our driver-based forecasting framework. Together, we can continue advancing the field of explainable demand planning.
