There are two distinct categories of realtime infrastructure: realtime API infrastructure and realtime app infrastructure.  This article focuses on realtime API infrastructure as it relates to building internal APIs that push data.

Definition

Realtime API infrastructure specifically allows developers to build realtime data push into their existing APIs.  Typically, you would not need to modify your existing API contracts, as the streaming server would serve as a proxy. The proxy design allows these services to fit nicely within an API stack: the proxy can inherit facilities from your REST API, such as authentication, logging, and throttling, and it can be combined with an API management system.  When WebSocket messages are proxied out as HTTP requests, the backend can handle them statelessly, and messages from a single connection can even be load balanced across a set of backend instances.
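
To make the proxy model concrete, here is a minimal sketch of a stateless backend endpoint running behind a GRIP-compatible realtime proxy such as Pushpin (covered below). The route, channel name, and port are illustrative; the Grip-* response headers follow the GRIP convention for telling the proxy to hold the client connection open and subscribe it to a channel, while the backend itself returns immediately.

    # Stateless backend endpoint behind a GRIP-compatible realtime proxy.
    # The proxy forwards the client's request as plain HTTP; the backend
    # answers with subscription instructions instead of holding the
    # connection itself.
    from flask import Flask, Response

    app = Flask(__name__)

    @app.route('/api/events')
    def subscribe():
        return Response(
            'Subscribed\n',
            headers={
                'Grip-Hold': 'stream',     # ask the proxy to keep the connection open
                'Grip-Channel': 'events',  # channel to bind this connection to
            },
            mimetype='text/plain',
        )

    if __name__ == '__main__':
        app.run(port=8080)

Because the backend only ever sees short-lived HTTP requests, it needs no special realtime capabilities of its own; the proxy fans each message published to a channel out to every held connection subscribed to it.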

Benefits of Realtime API Infrastructure

  • Lets you build realtime push into your own internal APIs
  • Works with existing API management systems
  • Does not lock you into a particular tech stack
  • Provides realtime capabilities throughout the entire stack
  • Usually proxy-based, with pub/sub or polling
  • Adds realtime to any API, regardless of backend language or database
  • Available as cloud or self-hosted API infrastructure
  • Can inherit facilities from your REST API, such as authentication, logging, and throttling

Use Cases

While the available platforms differ in how they work, here are some of the most typical use cases:

  • Microservices – In a microservice environment, a realtime API proxy makes it easy to listen for instant updates from other microservices without the need for a centralized message broker. Each microservice gets its own proxy instance, and microservices communicate with each other via your organization’s own API contracts rather than a vendor-specific mechanism (see the first sketch after this list).
  • Message Queue – If you have a lot of data to push, you may want to introduce an intermediate message queue. This way, backend processes can publish data once to the message queue, and the queue can relay the data via an adapter to one or more proxy instances. The realtime proxy is able to forward subscription information to such adapters, so that messages can be sent only to the proxy instances that have subscribers for a given channel (see the second sketch after this list).
  • API management – It’s possible to combine an API management system with a realtime proxy. Most API management systems work as proxy servers as well, which means all you need to do is chain the proxies together. Place the realtime proxy in the front, so that the API management system isn’t subjected to long-lived connections. Also, the realtime proxy can typically translate WebSocket protocol to HTTP, allowing the API management system to operate on the translated data.
  • Large scale CDN – Since realtime proxy instances don’t talk to each other and message delivery can be tiered, the proxy instances can be geographically distributed to create a realtime push CDN. Clients can connect to the nearest regional edge server, and events can radiate out from a data source to the edges.
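
The first sketch below illustrates the microservice case: a service pushes an event to its local proxy instance over plain HTTP, and any clients or services subscribed to that channel through the proxy receive it. The publish endpoint and payload shape follow Pushpin’s control API; the port, channel, and event fields are illustrative.

    # A microservice publishing an event through its local realtime proxy.
    import json
    import requests

    def publish_order_created(order_id: str) -> None:
        item = {
            'channel': 'orders',  # channel that interested clients subscribe to
            'formats': {
                'http-stream': {'content': json.dumps({'order_id': order_id}) + '\n'},
            },
        }
        # Pushpin's control API listens on port 5561 by default.
        resp = requests.post('http://localhost:5561/publish/',
                             json={'items': [item]})
        resp.raise_for_status()

    if __name__ == '__main__':
        publish_order_created('1234')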
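
The second sketch illustrates the message queue pattern: backend processes publish once to a broker, and a small adapter relays each message to the realtime proxy’s publish endpoint. Redis pub/sub stands in for the broker here, the subscription-forwarding optimization described above is omitted, and the URLs and channel names are illustrative.

    # Queue-to-proxy adapter: relays messages from a broker (Redis pub/sub
    # here) to the realtime proxy's publish endpoint.
    import requests
    import redis

    PROXY_PUBLISH_URL = 'http://localhost:5561/publish/'

    def run_adapter() -> None:
        sub = redis.Redis().pubsub()
        sub.subscribe('orders')
        for message in sub.listen():
            if message['type'] != 'message':
                continue  # skip subscribe confirmations
            content = message['data'].decode() + '\n'
            item = {'channel': 'orders',
                    'formats': {'http-stream': {'content': content}}}
            requests.post(PROXY_PUBLISH_URL,
                          json={'items': [item]}).raise_for_status()

    if __name__ == '__main__':
        run_adapter()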

Realtime API Infrastructure Solutions

Fanout / Pushpin

Streamdata.io

LiveResource