James Urquhart predicts the emergence of Flow and how the adoption of event-driven integration could help optimise the way organisations deliver solutions – but at what cost?
Editor’s note: This interview with James Urquhart was recorded for Coding Over Cocktails - a podcast by Toro Cloud.
What happens when the interfaces and protocols that we use to integrate our applications are used to consume real-time event streams and real-time indications of state change in third-party systems?
According to James Urquhart, this process is known as "Flow".
Urquhart is the author of "Flow Architectures: The Future of Streaming and Event-Driven Integration."
In it, he defines Flow as "event-driven integration between organisations using a standard interface and protocol specification."
During an interview at Toro Cloud’s Coding Over Cocktails podcast, Urquhart explains that Flow already occurs in stock-trading networks, healthcare systems, and manufacturing systems with sensors identifying problems to a central control system.
Although it is not yet widely adopted, he believes that Flow will eventually emerge as a new standard for optimising enterprise integration. As it does, real-time integration costs will drop drastically, allowing more data streams to be shared in new, creative ways.
"Access to real-time events without custom software development will become normal and expected. A World Wide Flow will emerge," he explains.
Mapping out Flow
Through Wardley mapping, Urquhart was able to see the opportunity in event-driven integration and its evolution. Part of this mapping was understanding that new technologies would rise, potentially replacing the old ones that cannot meet the capabilities of cross-organisation integration using event streams.
"If you look at how people are integrating, especially across organisation boundaries, they're integrating applications using event streams."
While a number of industries, such as healthcare and finance, are already integrating via event streams, Urquhart says these event-driven integrations remain highly tailored to each industry's needs. "The problem is, every single one of those use-cases that you go through have basically, bespoke, made-for-that purpose, interfaces and protocols that enable those things to talk to each other."
He tells us that this will evolve eventually, and protocols and interfaces for event-driven integration will soon become ubiquitous and standardised. "I believe that over the long term — we're talking a decade, maybe more — we will see a very large, massive, scale-connected environment where real-time data is being exchanged and consumed and generating new data across industries, across geographic locations and across the world."
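One concrete attempt at the kind of standard envelope Urquhart describes is the CNCF's CloudEvents specification, which defines a small set of context attributes every event must carry. The interview doesn't name it, so the sketch below is illustrative only; the `make_event` helper and the trade-event attribute values are made up for this example.

```python
import datetime
import json
import uuid

def make_event(event_type: str, source: str, data: dict) -> dict:
    """Build a minimal CloudEvents 1.0 envelope.

    The required context attributes are specversion, id, source and type;
    time, datacontenttype and data are optional and included here only
    for illustration.
    """
    return {
        "specversion": "1.0",
        "id": str(uuid.uuid4()),
        "source": source,
        "type": event_type,
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "datacontenttype": "application/json",
        "data": data,
    }

# Any consumer that understands the envelope can route or filter on the
# standard attributes without knowing the producer's internal format.
event = make_event("com.example.trade.executed", "/markets/example",
                   {"ticker": "XYZ", "price": 10.5})
print(json.dumps(event, indent=2))
```

The point of the envelope is exactly the one Urquhart makes: the payload can stay industry-specific while the wrapper stays universal, so two organisations can exchange events without bespoke interfaces.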
Are we prepared to go with the Flow?
The evolution of event-driven integration and the emergence of the World Wide Flow aren't far off, says Urquhart. "Unless there's some fundamental, either legal or physics thing that's not understood today that would block its evolution, over time it's pretty inevitable. And we're getting very close to the point where that's going to start to appear."
With Flow on the rise and the continuous improvement of technologies such as machine learning and AI, Urquhart warns that what he calls "clerk" jobs are potentially at risk, as they could easily be automated by software.
"Organizations are going to have to look at if it makes sense for them to try to process a growing velocity of data that's coming in through certain streams that they subscribe to in a human process way, or if they have to finally find ways to automate portions of compliance and automate portions of data reconciliation and data cleanup as they take a look at things."
However, there are ways businesses and organisations can prepare for and adapt to this. "The good news is because the interaction technologies are more or less commodities at this point, processing events at scale is a doable thing today with the technologies that are available," he claims.
One of the things Urquhart recommends looking into is microservices. He advises organisations to read up on stream processing using products already on the market, and to explore serverless computing, where opportunities to consume external streams instead of APIs are already being prototyped.
"A lot of Lambda today is triggered by a call to an API gateway. And you could just replace that with something like it's consuming a stream. And you could do that today as well. The big thing is if you build it in a way that the interfaces and protocols that you depend on in the processing of the system downstream allow you to have the ability to adapt that very quickly and cheaply as you move forward."
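A minimal sketch of the swap Urquhart describes, in Python. The `handler(event, context)` signature is AWS Lambda's standard Python convention, and the batched, base64-encoded record shape matches a Kinesis-style stream trigger; the downstream processing step is a placeholder, not anything from the interview.

```python
import base64
import json

def handler(event, context):
    """Stream-triggered Lambda sketch (assumed Kinesis-style event shape).

    Where an API Gateway trigger delivers a single request, a stream
    trigger delivers a batch of records whose payloads arrive
    base64-encoded under record["kinesis"]["data"].
    """
    results = []
    for record in event.get("Records", []):
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        results.append(payload)  # stand-in for real downstream processing
    return results
```

Keeping the trigger-specific decoding separate from the downstream processing is what buys the adaptability Urquhart is pointing at: if the interface or protocol upstream changes, only the thin decoding layer needs to change with it.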
Above all, Urquhart advises organisations to educate themselves and learn to build around the technologies and architectures available today.
"Keep in mind that those interfaces and protocols are what are going to evolve and change. Build with that in mind, but take advantage of the architectures that are available today, and the new architectures as they start to evolve where they make sense. And there's a number of books out there about event-driven microsystems, microservices and about service programming and things like that can help you, your organisation be ready when Flow appears."
You can catch more of Urquhart’s thoughts on event-driven integration and the processes and principles behind Flow in this episode of Coding Over Cocktails - a podcast by Toro Cloud.
This podcast series tackles issues faced by enterprises as they manage the process of digital transformation, application integration, low-code application development, data management, and business process automation. It’s available for streaming on most major podcast platforms, including Spotify, Apple, Google Podcasts, SoundCloud, and Stitcher.