Rob Buckley – Freelance Journalist and Editor

Trading at Internet speed

New information standards, underpinned by faster, more flexible infrastructure, are set to revolutionise trade in goods and services


The recent emergence of web services and XML for messaging has provided a “standardised” system of sorts for collaborating companies. But XML’s flexibility means there are now many different ways of defining data – even within the same company, data may be described differently by different systems. OASIS, RosettaNet and other competing standards bodies have devised BPXML, ebXML and other mark-ups for business data that are both incompatible and complicated, which has led many companies to use their own cut-down versions of the standards when dealing with known suppliers.
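The flexibility problem is easy to illustrate: two suppliers can encode an identical order in structurally different XML. A minimal Python sketch (all element and field names here are invented for illustration) shows how each format needs its own parsing logic before the data can even be compared:

```python
import xml.etree.ElementTree as ET

# Two hypothetical messages carrying the same order, defined differently:
# one uses nested elements, the other attributes with different names.
supplier_a = '<order id="1001"><qty>25</qty><part>AX-7</part></order>'
supplier_b = '<purchaseOrder number="1001"><item sku="AX-7" quantity="25"/></purchaseOrder>'

def normalise(xml_text):
    """Reduce either format to a common (order_id, part, qty) tuple."""
    root = ET.fromstring(xml_text)
    if root.tag == "order":
        return (root.get("id"), root.findtext("part"), int(root.findtext("qty")))
    item = root.find("item")
    return (root.get("number"), item.get("sku"), int(item.get("quantity")))

# Both messages describe the same order, but neither parser works on
# the other's format - a new hand-written branch per supplier.
assert normalise(supplier_a) == normalise(supplier_b) == ("1001", "AX-7", 25)
```

Every new trading partner means another branch of bespoke parsing code, which is exactly the reconfiguration burden described below.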

Yet this prescriptive state of affairs – in which one company or group of companies dictates how a new company should interact with it – is still an obstacle to collaboration, as IT systems need to be reconfigured to output messages in particular ways. A new breed of IT suppliers is working on the problem, however. Magenta Technologies has developed analysis software that learns, using neural-network algorithms, how messages are composed. Users provide the software with examples of the messages they send and receive, and the software adapts, translating the messages so that other systems can use them without any software having to be reconfigured.
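Magenta has not published its algorithms, but the general idea of learning a translation from example messages can be sketched in a few lines of Python. This toy version simply matches example values to infer which incoming field feeds each canonical field – far cruder than a neural network, and every field name below is invented, but it conveys the principle:

```python
def learn_mapping(example_in, example_out):
    """Infer which incoming field feeds each canonical field by matching
    example values (a toy stand-in for a learned translation; it assumes
    the example's values are distinct)."""
    return {dst: next(src for src, v in example_in.items() if v == val)
            for dst, val in example_out.items()}

def translate(message, mapping):
    """Apply a learned field mapping to a new incoming message."""
    return {dst: message[src] for dst, src in mapping.items()}

# One example pair "trains" this toy learner: a supplier's own field
# names on the left, the canonical form on the right.
mapping = learn_mapping(
    {"kundennr": "C-42", "menge": "10"},
    {"customer": "C-42", "quantity": "10"})

# New messages in the supplier's format now translate automatically.
assert translate({"kundennr": "C-77", "menge": "3"}, mapping) == \
       {"customer": "C-77", "quantity": "3"}
```

The attraction of the approach is that the mapping is data, not code: when a partner changes its format, you supply fresh examples rather than rewriting integration software.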

OmPrompt has similar technology but provides it as a service – a new business model for the Internet age (see box, “ASP lives on as SaaS”) – and across a wide range of messaging formats, including EDI, fax, voice and email. The company says that it takes just a day of analysis to accept a new format into the system, compared to the 90 days typical of many message-analysis exercises. “It allows you to build rapidly,” says president and CEO Brian Bolam. “It makes you independent from any one supplier and you can change very rapidly. It’s non-invasive, technology agnostic and future proof – if you change the message format, it learns and adapts.”

But for true automated collaboration, simple message exchange is not enough: there needs to be a way for events to occur automatically in response to messages. That calls for business process management systems that can drive software without human intervention, and for languages that can describe business processes – such as BPEL, JBIL and UML – to gain widespread adoption and incorporation into standards.
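The kind of automation a process engine provides can be sketched, very loosely, as event-to-handler dispatch: a message arrives and a registered process step runs with no human in the loop. This is not BPEL itself, just an illustration of the pattern, and the event and field names are invented:

```python
# A minimal sketch of message-triggered process automation: handlers
# (the kind of steps a BPEL engine would orchestrate) fire automatically
# in response to typed messages.
handlers = {}

def on(event_type):
    """Register a function as the automatic response to an event type."""
    def register(fn):
        handlers[event_type] = fn
        return fn
    return register

@on("order_received")
def reserve_stock(msg):
    # In a real process this step would update an inventory system.
    return f"reserved {msg['qty']} x {msg['part']}"

def dispatch(msg):
    """Route an incoming message to its registered process step."""
    return handlers[msg["type"]](msg)

result = dispatch({"type": "order_received", "part": "AX-7", "qty": 25})
assert result == "reserved 25 x AX-7"
```

A process description language such as BPEL standardises exactly this wiring – which step runs in response to which message – so that it can be exchanged between engines rather than hard-coded.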

But truly automated collaboration that requires no human intervention may well be some way off: there will always be exceptions that a human being needs to make decisions about. BEA, which recently bought workflow management company Fuego, certainly believes so. Martin Perceval, senior European technology evangelist at BEA, says: “There are some things that can’t be made better by using a computer. There’s always going to be someone who needs to say yes or no.”

Certainly, the issue of trust remains big on most companies’ agendas when selecting new partners for collaboration. “There’s a world of a difference between calling on a company for a computing service versus calling on them to turn up with a truck full of parts for your supply chain,” argues Perceval.

Wariness of unknown potential business partners stalled the adoption of UDDI repositories for partner discovery when web services became a hot technology several years ago, and was also one of the factors in the initial failure of the ASP business model. It’s likely to remain a major inhibitor to many new Internet business models as well.

While companies are developing new business models for the Internet age, some of the obstacles to automated collaboration remain. New technology and standards are removing them, making the majority of business processes automatable. But the human factor – and distrust in particular – will remain an element of any business model for some time to come.

ASP lives on as SaaS

One of the most notorious business models of the dotcom boom was the application service provider (ASP) model. This asked companies to hand over management of their in-house applications to a third party, which would host the applications and deliver them to the companies over the Internet. In return, the new hosts would charge for usage, either via subscription or “per click”. Theoretically, this meant ASPs’ customers wouldn’t need the infrastructure or expertise to run their applications, and could benefit from reduced expenditure on licences as well. In practice, the pricing models weren’t flexible enough, reliability wasn’t high, few companies wanted to be early adopters, and many were resistant to the idea of trusting an external company with mission-critical applications and enterprise data.

