Across almost all industries, software is playing an ever-more important role in making processes more efficient, and in some cases, it is replacing human-driven processes altogether. Silicon Valley investor Marc Andreessen famously argued that “software is eating the world” — making a clear case that all industries will be upended by software.
This software revolution is unavoidable, and companies that do not adapt will falter. Some of the most famous examples are Blockbuster and Borders Group — forced out of business by digital giants such as Netflix and Amazon.
As a result, a chorus of thought leaders and commentators has been repeating "every company is becoming a software company" over the past few years.
Jay Kreps of Confluent takes it a step further and argues that companies are not only becoming software companies, but instead, companies are becoming software:
Businesses are increasingly defined as software, as opposed to software merely being a tool that makes humans inside the business more productive. He uses an example of a loan approval process in a consumer bank. The image below illustrates how steps in this process have been almost entirely replaced by software:
With a change as fundamental as companies becoming software, will the old paradigms of how we have been developing software hold up? Kreps argues that the software world is undergoing a massive change:
“The purpose of an application, in this emerging world, is much less likely to be serving a UI to aid a human in carrying out the activity of the business, and far more likely to be triggering actions or reacting to other pieces of software to carry out the business directly.”
Most notably, this has a profound impact on traditional database architectures.
Traditional Database Architectures
Virtually all databases follow a model of data being stored passively in a database, awaiting commands to retrieve or modify it. This model has been driven by the fact that many applications are human-facing in which a user looks at a UI and initiates actions that are translated into database queries.
As Kreps writes: “it’s worth noting that of the applications that came to prominence with the rise of the RDBMS (CRM, HRIS, ERP, etc.), virtually all began life in the era of software as a human productivity aid. The CRM application made the sales team more effective, the HRIS made the HR team more effective, etc. These applications are all what software engineers call “CRUD” apps. They help users create, update, and delete records, and manage business logic on top of that process. Inherent in this is the assumption that there is a human on the other end driving and executing the business process. The goal of these applications is to show something to a human, who will look at the result, enter more data into the application, and then carry out some action in the larger company outside the scope of the software.”
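The "CRUD" pattern Kreps describes can be made concrete with a short sketch. The snippet below is an illustration only (using Python's built-in `sqlite3` and a hypothetical `loans` table, not any system from the article): data sits passively in the database, and nothing happens except when a human-driven application issues a command.

```python
import sqlite3

# A minimal sketch of the traditional "passive" database model:
# data sits in a table until a UI-driven application issues a command.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (id INTEGER PRIMARY KEY, applicant TEXT, status TEXT)")

# CREATE: a human fills in a form; the app translates it into an INSERT.
conn.execute("INSERT INTO loans (applicant, status) VALUES (?, ?)", ("Alice", "pending"))

# READ: the UI asks for the current state so a human can review it.
rows = conn.execute("SELECT applicant, status FROM loans").fetchall()

# UPDATE: the human decides; the app merely records the decision.
conn.execute("UPDATE loans SET status = ? WHERE applicant = ?", ("approved", "Alice"))

# Between commands the database initiates nothing: the assumption of a
# human "driving" the process is built into the interaction model itself.
```

The key point is not the SQL, but the shape of the interaction: every step waits for a person on the other end.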
This fundamental nature of applications caused companies to adopt them in bits and pieces — gradually expanding the set of business processes where a piece of software assists humans in carrying out tasks.
“But the data infrastructure itself had no notion of how to interconnect or react to things happening elsewhere in the company. This led to all types of ad hoc solutions built up around databases, including integration layers, ETL products, messaging systems, and lots and lots of special-purpose glue code that is the hallmark of large-scale software integration.”
This interconnection of databases led to point-to-point integrations that end up as a spaghetti-like mess inside of most companies today:
The Emergence of Event Streams
If the primary role of software is shifting from serving a UI to directly triggering actions or reacting to events from other pieces of software, and the traditional relational database cannot keep up with this new reality — what has to change?
Everything will revolve around events and event streams. Events are anything that happens in a business — a user login attempt, a product purchase, a completed job, etc. An event stream is a continually updating series of events, representing past and present events. Event streams enable companies to analyze the data that pertains to an event and respond to that event in real time.
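To make the idea concrete, here is a toy in-memory model of an event stream — an append-only log plus subscribers that react as events arrive. This is a sketch for illustration only; the `Event` and `EventStream` names are invented here, and a real platform such as Kafka is distributed, durable, and far more capable.

```python
from dataclasses import dataclass
from typing import Any, Callable, List

@dataclass
class Event:
    kind: str       # what happened, e.g. "user_login", "purchase"
    payload: Any    # the data describing the event

class EventStream:
    """A toy append-only event stream (illustration only, not Kafka)."""
    def __init__(self) -> None:
        self._log: List[Event] = []                      # past events, in order
        self._subscribers: List[Callable[[Event], None]] = []

    def subscribe(self, callback: Callable[[Event], None]) -> None:
        self._subscribers.append(callback)

    def publish(self, event: Event) -> None:
        self._log.append(event)        # the stream records the past...
        for cb in self._subscribers:   # ...and notifies reactors in the present.
            cb(event)

stream = EventStream()
purchases: List[Event] = []
# A subscriber that reacts only to purchase events:
stream.subscribe(lambda e: purchases.append(e) if e.kind == "purchase" else None)
stream.publish(Event("user_login", {"user": "alice"}))
stream.publish(Event("purchase", {"user": "alice", "sku": 42}))
```

Note the two roles in one structure: the log preserves what has happened, while subscribers respond to each new event the moment it occurs.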
Jay Kreps contrasts event streams to traditional databases:
“Event streams present a very different paradigm for thinking about data from traditional databases. A system built on events no longer passively stores a dataset and waits to receive commands from a UI-driven application. Instead, it is designed to support the flow of data throughout a business and the real-time reaction and processing that happens in response to each event that occurs in the business.”
Kreps uses a simple metaphor to explain how event streaming works. Imagine traveling in a car toward a destination. The traditional database is like a child in the back seat asking "Are we there yet?" — the only way to get an updated answer is to ask again. With event stream processing, the time of arrival becomes a continuous calculation, always in sync with the changing position of the car (a series of events).
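The metaphor can be sketched in a few lines. In this illustration (all names, speeds, and the `Trip` class are invented for the example), each position update is an event, and the ETA is recomputed on every event rather than on demand:

```python
DESTINATION_KM = 100  # assumed trip length for the illustration

class Trip:
    """Event-driven ETA: recomputed whenever a position event arrives."""
    def __init__(self, speed_kmh: float) -> None:
        self.speed_kmh = speed_kmh
        self.position_km = 0.0
        self.eta_hours = DESTINATION_KM / speed_kmh

    def on_position_event(self, new_position_km: float) -> None:
        # No one needs to ask "are we there yet?" — every position event
        # keeps the answer continuously in sync with the state of the car.
        self.position_km = new_position_km
        self.eta_hours = (DESTINATION_KM - new_position_km) / self.speed_kmh

trip = Trip(speed_kmh=50)
trip.on_position_event(25)   # ETA updated as the event arrives
trip.on_position_event(75)
```

Polling, by contrast, would mean repeatedly querying `trip.position_km` and doing the arithmetic each time — the child asking again and again.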
Software such as Apache Kafka, the open-source stream-processing software platform originally developed for use inside of LinkedIn, has grown out of this vision for event streaming platforms. It is now being used by thousands of companies and 60% of the Fortune 100 — a testament to how pervasive event streaming has become across industries.
In many of these companies, event stream processing was initially implemented to enable a single use case. Adoption then rapidly spread throughout the company to more and more use cases. Event streams are multi-reader — meaning an event stream can have multiple "subscribers" that process, react, or respond to it.
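The multi-reader property is what the sketch below tries to show — again a toy in-memory model, not a real platform; the `Log` and `Reader` classes and the "analytics"/"fraud" consumers are invented for the illustration. Each reader keeps its own offset into the same log, so new consumers can tap into an existing stream without disturbing the others:

```python
from typing import Any, List

class Log:
    """A shared append-only log (illustration only)."""
    def __init__(self) -> None:
        self.events: List[Any] = []

    def append(self, event: Any) -> None:
        self.events.append(event)

class Reader:
    """Each reader tracks its own position; readers never interfere."""
    def __init__(self, log: Log) -> None:
        self.log = log
        self.offset = 0

    def poll(self) -> List[Any]:
        new = self.log.events[self.offset:]   # everything since last poll
        self.offset = len(self.log.events)
        return new

log = Log()
analytics, fraud = Reader(log), Reader(log)   # two independent subscribers
log.append({"kind": "purchase", "amount": 99})
seen_by_analytics = analytics.poll()   # both see the same event...
seen_by_fraud = fraud.poll()           # ...without consuming it from each other
log.append({"kind": "purchase", "amount": 5})
late = analytics.poll()                # each reader resumes from its own offset
```

Because reading is non-destructive, adding a new application to the platform costs the existing ones nothing — which is exactly what drives the "virtuous cycle of adoption" described next.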
The adoption of event streaming platforms causes a positive feedback loop, or “a virtuous cycle of adoption” as Kreps puts it: “The first application brings with it critical data streams. New applications join the platform to get access to those data streams, and bring with them their own streams. Streams bring applications, which in turn bring more streams.”
“The core idea is that an event stream can be treated as a record of what has happened, and any system or application can tap into this in real time to react, respond, or process the data stream.”
“This has very significant implications. Internally, companies are often a spaghetti-mess of interconnected systems, with each application jury-rigged to every other. This is an incredibly costly, slow approach. Event streams offer an alternative: there can be a central platform supporting real-time processing, querying, and computation. Each application publishes the streams related to its part of the business and relies on other streams, in a fully decoupled manner.”
As companies are increasingly defined in software (i.e., whole business processes and business units are entirely software-driven), real-time data and real-time action become increasingly important, and the human role in business processes shrinks.
Event streaming platforms support all these outcomes, and that is why they are rapidly becoming a cornerstone of every company’s systems architecture.