In 1956, an American trucking entrepreneur named Malcom McLean revolutionized global trade. His brainchild, the standard shipping container, created a more efficient method for carrying goods from factory to ship to warehouse. Since then, instead of dockworkers unloading crates of all sizes and shapes, cranes lift McLean's uniform metal boxes—which double as trailers—directly onto vessels designed to hold them. The innovation quickened the loading process, which allowed for larger ships; reduced theft; cut the time cargo spent in port; enabled just-in-time production, which reduced the costs associated with large inventories; and more. A 2013 study by Daniel M. Bernhofen of American University, Zouheir El-Sahli of Lund University and Richard Kneller of the University of Nottingham concluded that standard shipping containers increased bilateral trade nearly 800 percent over two decades and were a bigger driver of globalization than all the trade agreements of the past 50 years combined.

What do shipping containers have to do with data? IT currently lacks the data equivalent of McLean's universal boxes. To be sure, standards exist. Web content is essentially XML (Extensible Markup Language)-based, and commonly used messaging technologies, such as Java Message Service, IBM MQ and TIBCO Rendezvous, allow diverse applications to interact with each other. Additionally, all database platforms ostensibly use SQL, the standard language for relational database management systems as dictated by the American National Standards Institute. But, over time, vendors have diluted the universality of SQL as they've interpreted the language differently, creating major compatibility challenges when migrating data from one application to another. These challenges burden IT departments facing pressure to protect data for security and compliance while also integrating information sources throughout their organizations. How can IT work efficiently if so much of the data is at odds, like the mismatched crates that once slowed global trade?
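To make the dialect problem concrete, consider one of the simplest tasks a database is asked to do: return the first ten rows of a result. The table and column names below are hypothetical, but the syntax differences are real, and they are exactly the kind of incompatibility that turns a migration between vendors into a non-trivial project.

```python
# Hypothetical illustration: the "same" query -- the ten most recent orders --
# must be phrased differently on several major SQL platforms.
DIALECT_QUERIES = {
    # Standard SQL:2008 syntax (supported by PostgreSQL, Db2, Oracle 12c+)
    "standard": "SELECT * FROM orders ORDER BY created_at DESC "
                "FETCH FIRST 10 ROWS ONLY",
    # MySQL / SQLite / PostgreSQL shorthand
    "mysql": "SELECT * FROM orders ORDER BY created_at DESC LIMIT 10",
    # Microsoft SQL Server
    "sqlserver": "SELECT TOP 10 * FROM orders ORDER BY created_at DESC",
    # Older Oracle (pre-12c): a subquery wrapped in a ROWNUM filter
    "oracle_legacy": (
        "SELECT * FROM (SELECT * FROM orders ORDER BY created_at DESC) "
        "WHERE ROWNUM <= 10"
    ),
}

def query_for(dialect: str) -> str:
    """Return the row-limiting query for a given dialect, or raise KeyError."""
    return DIALECT_QUERIES[dialect]
```

Four dialects, four incompatible spellings of one idea: every query an application embeds is another seam that may tear when the data moves to a different platform.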

To appreciate how badly standardization is needed, consider the cottage industry of data quality and cleansing that has emerged in recent years. If organizations had better-managed, more consistent data, that field would not be flourishing.

SAP, for example, extracts data through the Open Data Protocol, or OData. OData provides a neutral way of accessing data in the business suite, allowing clients to interact with SAP applications and consume data across multiple channels. Before this development, customers were forced to adopt one of SAP's proprietary application programming interfaces or protocols to retrieve data, which was no trivial task. The IT industry also will need to collaborate better to resolve security and privacy issues surrounding data, particularly given growing data diversity and the rise of the Internet of Things.
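Part of OData's appeal is that a query is just a URL built from a handful of standard, $-prefixed query options, so any HTTP client can consume the data. The sketch below shows the idea; the service root and the `SalesOrders` entity set are hypothetical placeholders, but `$select`, `$filter`, `$top` and `$format` are system query options defined by the OData specification itself.

```python
from urllib.parse import urlencode, quote

# Hypothetical service root -- a real deployment would point at the
# customer's own OData endpoint.
SERVICE_ROOT = "https://example.com/odata/"

def build_odata_url(entity_set, select=None, filter_expr=None, top=None):
    """Compose an OData query URL from standard system query options."""
    options = {}
    if select:
        options["$select"] = ",".join(select)       # limit returned fields
    if filter_expr:
        options["$filter"] = filter_expr            # server-side filtering
    if top is not None:
        options["$top"] = str(top)                  # limit row count
    options["$format"] = "json"                     # ask for a JSON payload
    # Keep "$" and "," readable; percent-encode spaces and everything else.
    return SERVICE_ROOT + entity_set + "?" + urlencode(
        options, safe="$,", quote_via=quote
    )

url = build_odata_url(
    "SalesOrders",
    select=["OrderID", "CustomerName"],
    filter_expr="GrossAmount gt 1000",
    top=5,
)
```

Because the whole request is an ordinary URL, the same query works from a browser, a mobile app or a server-side integration, which is precisely the channel neutrality the protocol was designed for.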

Necessity is driving these issues. Today, developers spend an inordinate amount of time trying to answer a compound question as they devise technological solutions: Where is the data, and how do we get it? With better standardization, problems become easier to resolve, saving businesses time and money. Additionally, a more orderly system could have huge benefits for consumers, who are currently awash in content on various platforms and devices but have no idea whether their information is being compromised, or whether it will be supported three or even 10 years from now. As they wait for standardization, organizations can protect and elevate their data by installing a chief data officer, a leader who focuses on the quality and management of information.

The shipping container was one of the most significant inventions of the 20th century, but the data equivalent of it could have an even greater economic impact. The global exchange of physical goods pales in comparison to the digital trade of information. That’s why SAP and other vendors must work hand in hand with global policy makers to build universal data shipping lanes, from continent to continent, for cloud or data center environments. Such cooperation could create seamless sharing and processing of company-owned data and avoid impediments to building the data equivalent of McLean’s shipping containers. If we can bring the same level of order and standardization to digital information that McLean did to global trade, then a remarkable new world will open for us.

This article is published in collaboration with The SAP Community Network. Publication does not imply endorsement of views by the World Economic Forum.


Author: Irfan Khan is CTO for SAP Global Customer Operations (GCO), a 25,000-person field organization.

Image: A container ship departs Burrard Inlet in Vancouver, British Columbia March 6, 2009. REUTERS/Andy Clark.