The March meetings of the Federal Maritime Commission’s (FMC) Maritime Data Initiative maintained the focus of previous sessions on data reliability and the consequences of bad data as it is used by different stakeholders across the supply chain.
The goal of the FMC initiative is to establish data standards and best practices for data access and transmission, which is essential not only for reliable and stable ocean transportation systems, but also for the global supply chain.
The weekly meetings, continuing through April, are examining global supply chain issues from the perspectives of different groups of stakeholders.
Comments from the March data initiative meetings are summarized below.
Data Reliability
The March 1 meeting focused on large aggregators, such as Flexport, Expeditors, and C.H. Robinson, and the discussion largely centered on data reliability.
Data reliability is one of the biggest problems in the supply chain, and when poor data quality creates real-world issues, it is the customers who are held financially accountable, even though they are not the ones creating the problems.
Over the last year, estimated time of arrival (ETA) data has been particularly difficult to trust, as vessels anchor for weeks outside a port before getting a berth and being able to unload their cargo. As vessel schedules change, problems cascade down the supply chain, so having reliable data to predict where a bottleneck will happen before it occurs is crucial.
Oftentimes, aggregators are “working in probabilities rather than in certainties.” They must determine the probability freight has moved based on how credible the source has been historically.
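For illustration only, the "probabilities rather than certainties" approach might be sketched as below. The weighting scheme, source accuracies, and function names are assumptions for demonstration, not any aggregator's actual method:

```python
# Hypothetical sketch: combine conflicting "freight moved" reports,
# weighting each report by the reporting source's historical accuracy.
# All names and numbers here are illustrative assumptions.

def probability_freight_moved(reports):
    """reports: list of (moved: bool, source_accuracy: float in 0..1).
    Returns an estimated probability that the freight actually moved."""
    if not reports:
        return 0.5  # no evidence: maximally uncertain
    # A report saying "moved" contributes its accuracy; a report saying
    # "not moved" contributes the chance that the source is wrong.
    weighted = sum(acc if moved else (1 - acc) for moved, acc in reports)
    return weighted / len(reports)

# Two historically reliable sources say the freight moved;
# one weaker source disagrees.
reports = [(True, 0.95), (True, 0.90), (False, 0.60)]
print(f"Estimated probability freight moved: "
      f"{probability_freight_moved(reports):.2f}")
```

Under these assumed accuracies, the two strong confirmations outweigh the single weak dissent, which is the behavior the speakers describe when judging "how credible the source has been historically."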
One speaker noted that it seems like they’re getting better at “war rooming” the data to be able to find the bottlenecks more precisely, but at the same time the challenges are becoming greater.
Labor and the Supply Chain
Speakers at the March 8 meeting, which centered on maritime labor, echoed the sentiments expressed the previous week, specifically noting that better vessel manifest data would improve efficiency and performance.
If a vessel type and voyage can be tracked for a long period, the stakeholder is able to look forward and know if there are going to be ebbs and flows in labor.
For example, with good data, a stakeholder will be able to tell whether an extra 100 workers will be needed on a Tuesday two weeks from now.
Validating the cargo with a ship’s manifest and then connecting that vessel to a specific port would facilitate more accurate predictions of labor costs, time forecasting, logistics and cost analysis.
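The kind of labor forecast described above, predicting whether extra workers will be needed on a given day from manifest-validated arrivals, might be sketched like this. The productivity figure, crew size, and schedule are assumptions for demonstration only:

```python
from datetime import date

# Illustrative assumption: container moves one worker handles per shift.
MOVES_PER_WORKER_SHIFT = 30

def extra_workers_needed(arrivals, day, base_crew=200):
    """Estimate extra workers needed beyond the base crew on a given day.
    arrivals: list of (arrival_date, manifest_container_count)."""
    moves = sum(count for d, count in arrivals if d == day)
    required = -(-moves // MOVES_PER_WORKER_SHIFT)  # ceiling division
    return max(0, required - base_crew)

# Hypothetical arrivals validated against each ship's manifest.
arrivals = [
    (date(2022, 4, 12), 4500),  # vessel A, containers per manifest
    (date(2022, 4, 12), 4600),  # vessel B
    (date(2022, 4, 13), 3000),  # vessel C
]
print(extra_workers_needed(arrivals, date(2022, 4, 12)))
```

With these assumed numbers, two heavy arrivals on the same day push the requirement roughly 100 workers above the base crew, the kind of two-weeks-out signal the speakers describe.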
One speaker stated that ports around the country are short on labor, and that to alleviate the shortage, stakeholders should advocate not just for the supply chain, but also for attracting more people to the workforce.
Multiple Data Sources
The March 15 meeting covered available technologies and platforms for accessing and transmitting data.
Data originates from many sources and in many formats, including physical documents, XML, PDF, application programming interfaces (APIs), and web-scraping services.
A representative from Lloyd’s List, a provider of maritime intelligence with a 300-year history, remarked that his company draws on 150 different sources. With so many sources in use at any one time, consistency and standardization of the data are important for pushing the data out effectively to the company’s clients.
The Lloyd’s List representative mentioned a study his company conducted to collect, as far in advance as possible, the available data on where a vessel is likely to arrive.
They found that in 2021, 57% of vessels were shipping with inaccurate or unclear destinations stated in their automatic identification system (AIS) transmission.
To be able to use data effectively and combat these issues, it is necessary to invest heavily in machine learning and advanced analytics.
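One reason the AIS destination field is so unreliable is that it is free text entered by crews, so the same port appears in many forms. A minimal rule-based normalizer, sketched below with an assumed alias table and hypothetical port codes, shows both the approach and why richer matching or machine learning is ultimately needed:

```python
# Illustrative sketch: normalize raw AIS destination strings against a
# small table of known ports. The alias table is an assumption for
# demonstration; real systems use far larger reference data.

KNOWN_PORTS = {
    "USLAX": {"LOS ANGELES", "LA", "LOSANGELES"},
    "USLGB": {"LONG BEACH", "LGB"},
    "USOAK": {"OAKLAND"},
}

def normalize_destination(raw):
    """Map a raw AIS destination string to a UN/LOCODE, or None."""
    cleaned = raw.strip().upper().replace(">", " ").replace(",", " ")
    text = " ".join(cleaned.split())
    for locode, aliases in KNOWN_PORTS.items():
        if text == locode or text in aliases:
            return locode
    return None  # unclear entry: needs richer matching or ML

print(normalize_destination("los angeles"))    # resolves via alias
print(normalize_destination("CN SHA > USLAX"))  # ambiguous multi-leg entry
```

Entries like "CN SHA > USLAX" defeat simple rules, which is consistent with the finding that a majority of 2021 transmissions carried inaccurate or unclear destinations.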
Setting Shipping Standards
The March 22 meeting revolved around international standards. A representative from the Digital Container Shipping Association (DCSA) said that the association has standards that are free to use online and accessible to anyone.
The DCSA makes standards through collaboration on both input and output sides by combining the many standards that already exist and discussing them with experts and customers to find what works best.
One speaker used the example of GPS trackers on shipping containers to illustrate how complicated standards can be in the shipping arena.
He noted that safety rules can be complex and elaborate, citing the batteries in GPS trackers, which can easily become a source of ignition and cause fires. Such situations can leave each individual device subject to a collection of 40 regulations. DCSA is trying to make such standards easier to implement.
Another speaker advised the FMC not to make digital standards a goal in themselves; instead, the goal should be increasing efficiency. It is important not to reinvent the wheel, but to reuse what already works through collaboration with other federal and state agencies, with international bodies, and with the private sector.
Good Data Essential to Efficiency of Ports
Marine terminal operators and how they use data to organize their ports were featured at the March 29 meeting. Stakeholders discussed the importance of receiving accurate and timely data, as it is key to everyday operations and the functioning of their cargo operating systems.
Good data allows ports to better organize their yards for efficiency. The system at one port allows a trucking company to pre-advise the port by creating a “record” explaining when its trucks will arrive at the facility and what their transaction will be. This pre-advise record allows the port to decide in advance how to manage that cargo when it comes through the gate.
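The pre-advise workflow might look something like the sketch below. The record fields, container number, and staging logic are hypothetical assumptions, not the actual port system's design:

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative sketch of a trucking "pre-advise" record and how a port
# might use it to plan a gate move in advance. Field names are assumptions.

@dataclass
class PreAdviseRecord:
    trucking_company: str
    container_id: str
    transaction: str          # e.g. "PICKUP" or "DROPOFF"
    expected_arrival: datetime

def plan_gate_move(record):
    """Decide in advance how to stage cargo for an arriving truck."""
    if record.transaction == "PICKUP":
        return (f"Stage {record.container_id} near the gate "
                f"before {record.expected_arrival:%H:%M}")
    return f"Reserve a stack slot for {record.container_id}"

rec = PreAdviseRecord("Acme Trucking", "MSCU1234567", "PICKUP",
                      datetime(2022, 4, 12, 9, 30))
print(plan_gate_move(rec))
```

Receiving this record before the truck arrives is what lets the terminal pre-position a container in a "peel and go" stack rather than digging for it at the gate.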
A speaker commented that better utilizing beneficial cargo owner (BCO) data would allow the port to be able to improve its organization of stacks and create a “peel and go” situation for individual shippers and cargo owners.
If cargo data were available in real time, ports would be better able to leverage it to forecast what is heading their way. This would increase efficiency and productivity, allowing ports to control the flow of cargo and potential congestion, which in turn would help reduce carbon emissions and idling in the terminal.
Federal Maritime Commissioner Carl Bentzel will continue to lead meetings with maritime and intermodal stakeholders through April. Initial findings from these meetings are expected to be presented at the FMC Maritime Transportation Summit, currently scheduled for June 1.
All meetings are open to the public, and the FMC has made available a new email address where stakeholders can communicate any concerns related to the topic of maritime data at email@example.com.
To see the dates and topics of the remaining initiative meetings, plus links for online viewing, visit the FMC Maritime Transportation Data Initiative website at www.fmc.gov.
CalChamber coverage of previous meetings is available here.
Information compiled by Nicole Ellis, CalChamber international affairs and media relations specialist.