Real Time Push Messages From Server - Server Sent Event in Node - Js With Express - JS, Redis - by Ajinkya Rajput - Medium
SSE (Server-Sent Events) is a server push technology that enables a client to receive automatic updates from a server over a single HTTP connection.
Traditionally, to get updates from the server over HTTP, the client has to poll, repeatedly asking the server whether any updates are available (ref. image 1). This adds load on the server and is not real time. As the web moves toward event-driven architectures, SSE was introduced with exactly this idea. The idea is simple: a frontend/browser subscribes to a stream of events generated by the server and receives updates whenever a new event occurs (ref. image 2). The frontend no longer has to ask for updates, so delivery is real time and the load on the server is low. Once the frontend/client opens a connection with the server, it will not be closed until the frontend/client explicitly closes it.
Typical use cases:
1. "Notify me" alerts when a product is back in stock, as on an online shopping portal
2. News feeds
Pros:
SSEs are sent over traditional HTTP, so they do not require a special protocol or server implementation to work.
A simpler protocol than bidirectional alternatives such as WebSockets.
Cons:
SSE is unidirectional: the client can only receive; it cannot send data to the server over the same connection.
Over HTTP/1.1, browsers limit the number of open connections per domain (typically six), which constrains how many tabs can hold an SSE stream at once.
In this tutorial, we will cover the backend cases required to add SSE to a project:
1. Simple Publish
2. Multiple User Connections
3. Right To Privacy
4. Multi Instance Server Architecture
5. Network Settings/Configuration
Step 1: Open a terminal, change into the project directory, and install Express:
npm init -y
npm i express
index.js
node index.js
1. Simple Publish
Create one API for the live/stream connection. This connection will be used by the server to push notifications to the client.
In the API, once a request is received, send three headers to the client so the connection stays live, and store the res object, which will be used to send messages.
var socket
function socketConnectionRequest(req, res, next) {
  const headers = {
    'Content-Type': 'text/event-stream', // To tell the client it is an event stream
    'Cache-Control': 'no-cache',         // To tell the client not to cache the stream
    'Connection': 'keep-alive',          // To tell the client not to close the connection
  };
  res.writeHead(200, headers);
  res.write('data: Connection Established, We\'ll now start receiving messages from the server\n\n');
  socket = res
  console.log('New connection established')
}

module.exports = { socketConnectionRequest }
socket.js
index.js
Once a connection is established between server and client, use this connection to send messages to the client.
function publishMessageToConnectedSockets(data) {
  socket.write(`data: ${data}\n\n`); // an SSE message must end with a blank line
}
socket.js
index.js
node index.js
Open a second terminal and hit the SSE connection API with curl (use `curl -N` so curl does not buffer the stream).
Open another terminal (a third) and hit the send-message-to-client API:
curl http://localhost:3000/send-message-to-client
As soon as the request reaches the server, we'll see the `This event is triggered at` message in the second terminal. We can hit this API any number of times, and each time we'll receive the message in the second terminal.
2. Multiple User Connections
The above implementation does not support multiple user connections. If we hit the connection API from a new (fourth) terminal and then hit the send-message API, the messages in the second terminal stop and messages start arriving in the fourth terminal instead. Multiple users can't be handled because we overwrite the connection object on every new connection request. To overcome this, we have to store all connection objects in the Node.js application's memory, iterate over all of them on publish, and send to each. We also have to remove a connection when it is closed or lost.
var sockets = []
function socketConnectionRequest(req, res, next) {
  const headers = {
    'Content-Type': 'text/event-stream', // To tell the client it is an event stream
    'Cache-Control': 'no-cache',         // To tell the client not to cache the stream
    'Connection': 'keep-alive',          // To tell the client not to close the connection
  };
  res.writeHead(200, headers);

  const socketId = Symbol(Date.now())

  const socket = {
    socketId,
    res,
  }
  console.log(`New connection established:`, socketId)
  res.write('data: Connection Established, We\'ll now start receiving messages from the server\n\n')

  sockets.push(socket)
  req.on('close', function () {
    console.log(socketId, `Connection closed`)
    sockets = sockets.filter((socket) => socket.socketId !== socketId)
  })
}

function publishMessageToConnectedSockets(data) {
  sockets.forEach((socket) => {
    const { res } = socket
    res.write(`data: ${data}\n\n`)
  })
}
socket.js
To verify, hit the connection API from two terminals, then hit the send API from another terminal; both connection-API terminals should receive the messages.
Note: For simplicity we store the connections in an array, which is not efficient; in a real application, use a data structure suited to your requirements.
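As a sketch of that idea, a `Map` keyed by a connection id gives O(1) add and remove instead of filtering the whole array on every disconnect (all names here are illustrative):

```javascript
// A Map keyed by connection id makes add/remove O(1), instead of
// rebuilding the whole array on every disconnect.
const sockets = new Map()

function addSocket (res) {
  const socketId = `${Date.now()}-${Math.random()}` // illustrative unique id
  sockets.set(socketId, { res })
  return socketId
}

function removeSocket (socketId) {
  sockets.delete(socketId)
}

function publishToAll (data) {
  for (const { res } of sockets.values()) {
    res.write(`data: ${data}\n\n`)
  }
}
```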
3. Right To Privacy
With the above implementation there is still a privacy issue. Say a group of 10 users (group1) belongs to role1 and another 10 (group2) belongs to role2. While publishing, we must publish only to the users who belong to the targeted role.
To resolve this, we need a strategy: at connection time, store an array of rules (roles, user id, etc.) in Node.js memory alongside the connection object, and filter on those rules when publishing an event.
The expectation here is that the user is already authenticated and the request object carries the user object after authentication.
index.js
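The auth wiring in index.js did not survive extraction. As a stand-in sketch, a middleware can read the email header mentioned later and attach a user object of the shape socket.js expects (the user data, header name, and role names here are illustrative):

```javascript
// A stand-in auth middleware for the demo: it reads an `email` header and
// attaches a user object to the request. In a real application this would
// be JWT- or session-based; the user shape (id, roles) is what socket.js
// expects. All the data below is illustrative.
const users = [
  { id: 1, email: 'a@example.com', roles: ['employer'] },
  { id: 2, email: 'b@example.com', roles: ['candidate'] },
]

function fakeAuth (req, res, next) {
  const email = req.headers['email']
  req.user = users.find((u) => u.email === email)
  if (!req.user) return res.status(401).send('Unknown user')
  next()
}
```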
var sockets = []
function socketConnectionRequest(req, res, next) {
  const headers = {
    'Content-Type': 'text/event-stream', // To tell the client it is an event stream
    'Cache-Control': 'no-cache',         // To tell the client not to cache the stream
    'Connection': 'keep-alive',          // To tell the client not to close the connection
  };
  res.writeHead(200, headers);

  const socketId = Symbol(Date.now())
  const socket = {
    socketId,
    res,
    roles: req.user.roles,
    userId: req.user.id,
  }
  console.log(`New connection established for user id = ${req.user.id}, connection id =`, socketId)
  res.write('data: Connection Established, We\'ll now start receiving messages from the server\n\n')
  sockets.push(socket)
  req.on('close', function () {
    console.log(socketId, `Connection closed`)
    sockets = sockets.filter((socket) => socket.socketId !== socketId)
  })
}
socket.js
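The socket.js publish helpers used by those APIs did not survive extraction either; a plausible sketch, filtering the stored connections on the rules captured at connection time:

```javascript
// Connections populated by socketConnectionRequest, as above; each entry
// carries { socketId, res, roles, userId }.
let sockets = []

// Publish only to connections whose user holds the given role.
function publishMessageToConnectedSocketsByRole (data, role) {
  sockets
    .filter((socket) => socket.roles.includes(role))
    .forEach(({ res }) => res.write(`data: ${data}\n\n`))
}

// Publish only to connections belonging to the given user id.
function publishMessageToConnectedSocketsByUser (data, userId) {
  sockets
    .filter((socket) => socket.userId === userId)
    .forEach(({ res }) => res.write(`data: ${data}\n\n`))
}
```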
We have to add two APIs: 1. send a message to a role, and 2. send a message to a user.
index.js
Now restart the Node server. To make a connection, we pass the email as a header (in a real application this would be a JWT or an express-session cookie, possibly under a different header name):
curl http://localhost:3000/send-message-to-role?role=employer
curl http://localhost:3000/send-message-to-user?userId=1
Now we are done with all the single-server test cases. Let's consider a multi-server deployment: a load balancer routes requests to more than one Node.js server behind it, so a user's request can land on any of them. Suppose some users hold connections to Node.js server 1 and others to Node.js server 2. If a publish request arrives at server 1, all the users connected to server 1 receive the message, but the users connected to server 2 do not. See the following diagram.
One solution is a communication layer that relays published messages between all the Node.js servers. Redis is well known for publish/subscribe, so we can use Redis to let the servers communicate.
With Redis, every Node.js server subscribes to one common channel. When a publish event comes in, we no longer write directly to the connections (as in the previous steps); we publish it to that Redis channel. On receiving a message from the channel, each server iterates over the connections that frontends hold with that server and writes the message to each of them.
npm i redis
We need one Redis client connection to publish and another to subscribe. We will create these connections when the Node.js server starts.
redis-sse.js
As discussed, we'll publish a message to Redis first and, on receiving it back from Redis, publish it to the connections. So publishMessageToConnectedSockets, publishMessageToConnectedSocketsByRole, and publishMessageToConnectedSocketsByUser move to the new file. Since we are publishing to Redis, we have to serialize the data.
function publishMessageToConnectedSockets(data, { role, userId } = {}) {
  redisClientToPublishMessages.publish(CHANNEL_NAME_TO_PUBLISH_SSE_MESSAGES, JSON.stringify({ data, role, userId }))
}

function publishMessageToConnectedSocketsByRole(data, role) {
  publishMessageToConnectedSockets(data, { role })
}

function publishMessageToConnectedSocketsByUser(data, userId) {
  publishMessageToConnectedSockets(data, { userId })
}
redis-sse.js
function onReceiveMessageFromRedis(message) {
  const { data, role, userId } = JSON.parse(message)
  sockets.forEach((socket) => {
    if ((role && socket.roles.indexOf(role) === -1) || (userId && socket.userId !== userId)) {
      return
    }
    const { res } = socket
    res.write(`data: ${data}\n\n`)
  })
}
socket.js
index.js
5. Network Settings/Configuration
Since SSE needs to be real time, buffering anywhere on the path is a problem: the frontend may receive nothing for a long while and then, when the connection is closed, all the messages arrive at once. To solve this, check the configuration of the key network elements on the path (load balancers, reverse proxies, CDNs) and disable response buffering for the SSE endpoints on each of them.
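For example, if Nginx sits in front of the Node.js servers, buffering can be disabled for the stream location (a sketch; the upstream name and path are assumptions, adjust to your setup). Alternatively, the app can send an `X-Accel-Buffering: no` response header to tell Nginx to skip buffering for that response.

```nginx
location /live {
    proxy_pass http://node_upstream;   # assumed upstream name
    proxy_http_version 1.1;
    proxy_set_header Connection '';
    proxy_buffering off;              # deliver events immediately
    proxy_cache off;
}
```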