How Events and APIs Enable Modern Real-Time Applications
Instead of just opening up closed-off systems via APIs, many businesses are looking to use event-based triggers as well to react to changes in real time.
Competitive pressures are driving the need for new thinking when it comes to developing applications that help a business react in real time. Being able to make real-time decisions based on events is at the heart of those efforts.
Any event, whether internally generated (a transaction, a state change, a database update) or externally generated by customer activity, calls for an action. But working with events introduces new technical requirements.
One way to understand these requirements is to compare event handling with the way API-based applications work. Applications often use APIs to form a one-to-one relationship between components. For instance, a mobile banking app uses an API to let a customer query a backend system for a bank balance: one query is sent, and one result is delivered. The components at each end of the session must both be online at the same time.
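That one-to-one, request/response pattern can be sketched in a few lines. This is a conceptual illustration in plain Python, not a real banking API; the account data and function name are made up for the example:

```python
# Conceptual sketch of a synchronous, one-to-one API interaction:
# one query goes in, one result comes back, and both sides must be
# available for the duration of the call.

BALANCES = {"acct-1001": 250.75}  # stand-in for a backend system of record

def get_balance(account_id: str) -> float:
    """Backend handler: answers exactly one query with one result."""
    return BALANCES[account_id]

# The mobile app blocks here until the backend responds.
balance = get_balance("acct-1001")
```

The caller can do nothing else until the answer arrives, which is exactly the coupling that event-based designs relax.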
Event-based applications break this one-to-one relationship, and they change the way the interactions work. Traditional applications require a push, an intervention that triggers the next action, so things happen sequentially: a customer asks for a bank balance, and the balance is returned. With event-based applications, systems react to events as they occur. A system does not have to wait for a response before taking another action.
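The event-based alternative can be sketched the same way: producers emit events without waiting on any consumer, and consumers react when events are delivered. This is a minimal in-memory publish/subscribe illustration, not any particular product's API:

```python
from collections import deque

# Minimal in-memory publish/subscribe sketch. The producer appends an
# event and returns immediately; consumers react later, when the
# event is dispatched. No request/response pairing exists.

events = deque()
subscribers = []

def subscribe(handler):
    subscribers.append(handler)

def publish(event):
    events.append(event)  # fire-and-forget: no reply is awaited

def dispatch():
    """Deliver each pending event to every subscriber."""
    while events:
        event = events.popleft()
        for handler in subscribers:
            handler(event)

log = []
subscribe(lambda e: log.append(("fraud-check", e["account"])))
subscribe(lambda e: log.append(("notify-customer", e["account"])))

publish({"type": "withdrawal", "account": "acct-1001"})
dispatch()  # both subscribers act on the same event, independently
```

Note that `publish` returns before any subscriber has run: the producer and the consumers are decoupled in time, which is the defining property of this style.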
Use cases abound
Event-driven applications can be used in a wide range of industries, including manufacturing, financial services, transportation, logistics, retail, and more.
Frequently, a single event is used by multiple applications for different purposes at different times. For example, if an airline passenger changes flights, that change impacts seat assignments on both the old and the new flights. If the trip was booked through a travel agency, the change might impact other aspects of the trip. A hotel reservation might need to be shifted from one night to another, and adjustments might need to be made to car rental and ground transportation services.
In retail, it is easy to understand how a single event like a purchase on a website must be shared with other systems, including inventory, tax calculation and collection, billing/payment processing, shipping, and more. The point to keep in mind is that multiple disparate systems, including some that may not be controlled by the enterprise, must all work together to give the customer a seamless experience.
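The retail fan-out described above can be made concrete with a short sketch. The handler names and event fields here are illustrative placeholders, not a real schema:

```python
# Sketch of one purchase event fanned out to several downstream systems.
# Each handler represents a disparate system consuming the same event
# for its own purpose.

updates = []

def inventory(event):
    updates.append(f"inventory: reserve {event['sku']}")

def billing(event):
    updates.append(f"billing: charge {event['amount']:.2f}")

def shipping(event):
    updates.append(f"shipping: dispatch to {event['address']}")

handlers = [inventory, billing, shipping]

purchase = {"sku": "SKU-42", "amount": 19.99, "address": "221B Baker St"}
for handler in handlers:
    handler(purchase)  # one event, multiple independent consumers
```

In a production system these handlers would live in separate services, some outside the enterprise, but the shape of the problem is the same: one event, many consumers.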
MQ vs. Kafka: Not necessarily one or the other
When working with events, the conduit, the software that sits between event producers and event consumers, must have special properties. The choice often comes down to two general categories of solution: message queuing, exemplified by IBM MQ, and event streaming, exemplified by Apache Kafka, the open-source distributed event streaming platform.
The two are often presented as competitive solutions. But in reality, they do different things and are designed for different uses.
Kafka is used to build real-time streaming data pipelines and real-time streaming applications. It enables a data pipeline to reliably process and move data from one system to another and allows a streaming application to consume streams of data.
IBM MQ supports the exchange of information between applications, systems, services, and files by sending and receiving message data via message queues, which simplifies the creation and maintenance of business applications. IBM MQ works with a broad range of computing platforms and can be deployed on-premises, in the cloud, or in hybrid environments. It supports a number of different APIs, including the Message Queue Interface (MQI), Java Message Service (JMS), REST, .NET, IBM MQ Light, and MQTT.
One differentiator, then, is that Kafka is oriented around a stream, or sequence, of events, whereas MQ is oriented around individual messages.
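That difference can be illustrated with a toy comparison in plain Python; this is a conceptual contrast of the two delivery models, not the real IBM MQ or Kafka APIs:

```python
from collections import deque

# Queue model (MQ-style): a message is delivered to one consumer,
# once, and is then removed from the queue.
queue = deque(["msg-1", "msg-2"])
delivered = queue.popleft()  # "msg-1" is consumed and gone

# Stream model (Kafka-style): an append-only log. Each consumer keeps
# its own offset, and events remain available to be read or replayed.
stream = ["evt-1", "evt-2", "evt-3"]

def read_from(offset):
    """Return everything at or after `offset`; nothing is removed."""
    return stream[offset:]

consumer_a = read_from(0)  # sees the whole sequence from the start
consumer_b = read_from(1)  # a later joiner can still read history
```

In the queue model, delivery is the end of a message's life; in the stream model, the log outlives any single read, which is what makes replay and multiple independent consumers natural in Kafka.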
Separation is the key
The way modern applications are developed results in independent elements working together as one. The decoupling of the various elements of a larger application is becoming the norm. APIs, IBM MQ, and Kafka serve as the glue between the elements. Each has its own purpose in different applications.
Businesses are making the different components available as services, often through APIs. However, instead of just opening up once closed-off systems via APIs, many businesses are looking to use event-based triggers as well to react to changes in real time.
This article originally appeared on rtinsights.com, where the full version is available.
Nastel Technologies is the global leader in Integration Infrastructure Management (i2M). It helps companies achieve flawless delivery of digital services powered by integration infrastructure, delivering tools for middleware management, monitoring, tracking, and analytics that detect anomalies, accelerate decisions, answer business-centric questions, and provide actionable guidance for decision-makers. It is particularly focused on IBM MQ, Apache Kafka, Solace, TIBCO EMS, and ACE/IIB, and also supports RabbitMQ, ActiveMQ, Blockchain, IoT, DataPower, MFT, IBM Cloud Pak for Integration, and many more.
The Nastel i2M Platform provides:
- Secure self-service configuration management with auditing for governance & compliance
- Message management for Application Development, Test, & Support
- Real-time performance monitoring, alerting, and remediation
- Business transaction tracking and IT message tracing
- AIOps and APM
- Automation for CI/CD DevOps
- Analytics for root cause analysis & Management Information (MI)
- Integration with ITSM/SIEM solutions including ServiceNow, Splunk, & AppDynamics