AI & Technology

How overcoming fragmented data will decide AI’s future in supply chains

AI adoption in supply chain management is ramping up. It promises to help providers overcome major logistical and supply chain hurdles so they can be more proactive, with key benefits including more accurate demand and delivery forecasting, real-time monitoring, and smart inventory optimization, among others. 

However, most AI deployment strategies are failing to show consistent value. Fragmented data, not weak models, is the culprit. Research shows that poor data accessibility, siloed systems, and inconsistent formats are real barriers to AI deployment, limiting real-time visibility and model reliability. 

Logistics providers are rushing to deploy AI, but they will keep hitting roadblocks if they fail to address data fragmentation.

I spoke to Arindam Roy, Vice President, Client Partner at Straive, to discuss how overcoming fragmented data will decide AI’s future in supply chains.

When it comes to AI, why can’t the technology deliver when there are data disconnects?

Arindam Roy: Data is the foundation of AI. Any data disconnects will jeopardize the algorithm’s performance, snowballing into issues like hallucinations, tracking errors, false positives and negatives, and more. For logistics and operations, that leads to disastrous fallouts: backlogs, delayed or cancelled deliveries, failure to meet demand, and ballooning inventory, all of which ultimately harm reputation and customer relationships. 

This is a serious issue that is pervasive across the industry, especially as organizations ramp up investments in AI and digital transformation. A McKinsey survey, for instance, found that 93% of shippers are increasing their digital investments.

It’s also crucial to consider the sheer breadth and depth of data that providers have to stay on top of. Warehouse and transport management systems (WMS and TMS), enterprise resource planning (ERP), and application programming interfaces (APIs) together produce a huge amount of operational data. Even the smallest details that slip through the cracks create ripple effects that can hurt the entirety of the supply chain. 

AI is continuing to disrupt industries. How can we rethink data accountability and ownership here? 

Arindam Roy: As the Boston Consulting Group (BCG) notes, the best practice for successfully deploying AI is through an embedded end-to-end data and evidence-driven approach. This means visibility, accountability, and ownership are must-haves in any AI deployment strategy, and that starts with data management and governance. 


The cost of poor integration is being revealed in AI adoption and resilience planning, which are taking center stage as trade uncertainty only continues to increase. Most, if not all, organizations are well aware that the stakes are higher amid the ongoing economic, geopolitical, and trade uncertainty. Achilles’ heels in digital infrastructure, including AI deployment, make logistics providers even more vulnerable to these ongoing external threats. 

Inconsistent data formats and poor interoperability, as well as ownership ambiguity, mean organizations cannot achieve real-time insights. That negates much of the value of deploying AI in the first place. Moreover, without up-to-date inventory tracking, transportation monitoring, and accurate supplier data, companies can’t simulate disruption scenarios or test contingency plans effectively. This prevents scalability and much-needed continuous testing for improvement in digital transformation strategies. 

These have serious repercussions from a regulatory standpoint as rules and laws become more stringent around data handling, privacy, and security. Failing to comply with relevant regulations can lead to costly fines and reputational damage. The risks extend beyond compliance failures, too. Fragmented data essentially means holes in existing digital management systems that heighten threats of security breaches. 

Of course, in these situations, ownership and accountability are scrutinized. However, fragmented data often comes with opaque pipelines, meaning traceability and accountability are even more challenging to ensure. That prevents organizations from addressing and overcoming these risks. With these external threats, organizations are forced to rethink data accountability and pipeline management, particularly around who owns and validates information behind decision-making, in addition to data governance guardrails. 

How can companies best build data infrastructure for AI-ready logistics?

Arindam Roy: Fortunately, industry leaders are recognizing the importance of alleviating data woes, ensuring that data pipelines, guardrails, and governance plans are put into place. They are also realizing that success depends less on having the most advanced or sophisticated technology and more on the foundations, data above all, that underpin it.

Getting data infrastructure future-ready and resilient means overhauling data governance from the ground up. Interoperability, security, accountability, and visibility are non-negotiable priorities here. Systems should be designed with human-in-the-loop guardrails so teams can intervene swiftly and prevent cascading consequences.

Rather than pursuing a plug-and-play mentality where AI solutions are added in isolation, it’s crucial to take a holistic approach. That means designing integration and data pipelines that ensure visibility from end to end. 

The first step is to conduct a data audit. What are the key gaps and areas for improvement in existing data management and infrastructure? Where are workflows disrupted by poor interoperability and incompatibility between systems such as connected Internet of Things (IoT) devices, advanced analytics, and AI? At this point, it's also important to identify where data is sourced and who owns it. Any missing information in this regard needs to be flagged and addressed.

Additionally, pinpoint where data exchanges are not feasible and why. Causes can include poor formatting, missing data, permission and accessibility issues, or interference from third-party tools. It's vital to assess existing tech stacks to determine whether systems are truly interoperable or simply operating in isolation and undermining workflows as a result.
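The audit steps above can be partly automated. As a minimal sketch, the script below checks two hypothetical feeds, one from a WMS and one from a TMS, for missing required fields and incompatible date formats; the field names, sample records, and formats are illustrative assumptions, not any specific vendor's schema.

```python
from datetime import datetime

# Fields every shipment record is assumed to carry (illustrative).
REQUIRED_FIELDS = {"shipment_id", "origin", "destination", "eta"}

def audit_records(records, source_name, date_format="%Y-%m-%d"):
    """Flag missing required fields and unparseable dates in one feed."""
    issues = []
    for i, rec in enumerate(records):
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            issues.append(f"{source_name}[{i}]: missing {sorted(missing)}")
        eta = rec.get("eta")
        if eta is not None:
            try:
                datetime.strptime(eta, date_format)
            except ValueError:
                issues.append(f"{source_name}[{i}]: bad eta format {eta!r}")
    return issues

# Sample records: the TMS feed drops a field and uses a different date format,
# the kind of silent incompatibility an audit is meant to surface.
wms = [{"shipment_id": "S1", "origin": "DEL", "destination": "BOM",
        "eta": "2024-07-01"}]
tms = [{"shipment_id": "S1", "origin": "DEL", "eta": "01/07/2024"}]

for line in audit_records(wms, "WMS") + audit_records(tms, "TMS"):
    print(line)
```

In practice this kind of check would run against live exports from each system, with the required-field set and formats agreed on by the data owners identified in the audit.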

Without strong data foundations, AI-powered logistics will grind to a halt. As logistics providers navigate an increasingly uncertain landscape, AI will become indispensable, but it can only be an asset with the right data infrastructure in place.
