I have a Python server side which sends events to the client using SSE.
Here is an example of the Python code. It sends an 'action-status' and data which the JS has to handle:
import asyncio

from aiohttp_sse import sse_response

async def sse_updates(request):
    loop = request.app.loop
    async with sse_response(request) as resp:
        while True:
            # Tell the client to remove a queue token once it has been removed on the server.
            if request.app['sse_requests']['update_queue_vis_remove']:
                await resp.send("update-remove")
                request.app['sse_requests']['update_queue_vis_remove'] = False
            # Tell the client to append a queue token once it has been added on the server.
            if request.app['sse_requests']['update_queue_vis_append'][0]:
                await resp.send(f"update-append {request.app['sse_requests']['update_queue_vis_append'][1]} {request.app['sse_requests']['update_queue_vis_append'][2]}")
                request.app['sse_requests']['update_queue_vis_append'][0] = False
            # Tell the client to rewrite its list of redundant tokens (on the client side, of course).
            if request.app['sse_requests']['redundant_tokens_vis'][0]:
                await resp.send('update-redtokens ' + ''.join(token + ' ' for token in request.app['sse_requests']['redundant_tokens_vis'][1]))
                request.app['sse_requests']['redundant_tokens_vis'][0] = False
            await asyncio.sleep(0.1, loop=loop)
    return resp
And the JS script which handles the response:
evtSource = new EventSource("http://" + window.location.host + "/update")
evtSource.onmessage = function(e) {
    // Data from the server arrives as "<server-event-name> <data1> <data2> <data3> ..."
    let fetched_data = e.data.split(' ');
    // First option: a token has been removed on the server, so this has to be reflected on the client side.
    if(fetched_data[0] === "update-remove")
        displayQueueRemove();
    // Second option: a token was appended on the server and should also be shown to the user.
    else if(fetched_data[0] === "update-append")
        // fetched_data[1] - the token
        // fetched_data[2] - its (the token's) position
        displayQueueAdd(fetched_data[1], parseInt(fetched_data[2]));
    // Last option: the web page has been refreshed, so the data in redundant_tokens should be rewritten.
    else if (fetched_data[0] === "update-redtokens"){
        fetched_data.shift();
        // Variables for wrapping each token
        let tag;
        let text;
        // Wrap the tokens and store them in the array.
        for(let i = 0; i < fetched_data.length - 1; i++) {
            tag = document.createElement("div");
            text = document.createTextNode(fetched_data[i]);
            tag.appendChild(text);
            tag.setAttribute("class", "token-field");
            redundant_tokens[i] = tag;
        }
    }
}
The problem is that if I open two or more browser windows (sessions), only one of them catches the event and displays it. Moreover, there were cases when I triggered the update from one session but the response showed up in another one.
Is there a way to fix this while staying with SSE? (I was considering some other methods, but I'd like to try SSE first.)
I think your problem is synchronizing the data (app["sse_requests"]).
Depending on how you modify the data and who needs to be notified you might need to keep a list of clients (sessions).
For example if all clients need to be notified of all events then keep a list (or even better a set) of connected clients and create a periodic function (using create_task) in which to notify all of them.
If a client needs to only be notified of certain events then you need to identify that client using some sort of key in the request object.
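A minimal sketch of that broadcast idea, assuming aiohttp with aiohttp-sse; the app['sse_clients'] set, the broadcast() helper and the init() hook are made-up names that would replace the app['sse_requests'] flags:
import asyncio

from aiohttp_sse import sse_response

async def sse_updates(request):
    # Every connected session registers its own queue, so each client sees every event.
    queue = asyncio.Queue()
    request.app['sse_clients'].add(queue)
    try:
        async with sse_response(request) as resp:
            while True:
                event = await queue.get()
                await resp.send(event)
    finally:
        request.app['sse_clients'].discard(queue)
    return resp

def broadcast(app, message):
    # Call this wherever the server currently sets the app['sse_requests'] flags.
    for queue in app['sse_clients']:
        queue.put_nowait(message)

def init(app):
    app['sse_clients'] = set()
Because each window drains its own queue, the first client to wake up no longer consumes the shared flag before the other sessions get a chance to see it.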
I am learning JavaScript (Node.js - using the Pipedream platform). I have been writing scripts to help automate some little tasks in my day to day work.
I am creating one that generates a report on recent interactions with clients.
As part of this I am using axios to get "engagements" from the Hubspot API (basically a list of identifiers I will use in later requests).
The API returns paginated responses. I have encountered pagination previously and understand the principle behind it, but have never written a script to handle it. This is my first.
It works. But I feel it could be improved. Below I've commented how I've approached it.
The endpoint returns up to 100 values 'per page' along with a "hasMore":true flag and an "offset":987654321 value which can be passed as a query parameter in subsequent requests (if hasMore === true).
Example API response:
{"results":[1234,1235,1236],"hasMore":true,"offset":987654321}
My code:
import axios from 'axios';

//function to get each page of data
async function getAssoc(req){
    const options = {
        method: 'GET',
        url: `https://api.hubapi.com/${req}`,
        headers: {
            Authorization: `Bearer ${auths}`,
        },
    };
    return await axios(options);
}

//declare array in which to store all 'associations'
const assocs = [];
//store the ID that I get in an earlier step
const id = vid;
//declare variable in which to temporarily store each request response data
var resp;
//declare query parameter value, initially blank, but will be assigned a value upon subsequent iterations of do while
var offset = '';
do {
    //make request and store response in resp variable
    resp = await getAssoc(`crm-associations/v1/associations/${id}/HUBSPOT_DEFINED/9?offset=${offset}`);
    //push the results into my 'assocs' (associations) array
    resp.data.results.forEach(element => assocs.push(element));
    //store offset value for use in next iteration's request
    offset = resp.data.offset;
} while (resp.data.hasMore); //hasMore will be false when there's no more records to request

return assocs;
I feel it could be improved because:
The DO WHILE loop, I believe, is making sequential requests. Is parallel a better/faster/more efficient option? (EDIT Thanks @Evert - of course I cannot make parallel requests because of the offset!)
I'm re-assigning new values to vars instead of using consts, which seems simple and intuitive to my beginner's mind, but I don't know whether there's a better way in this instance.
I would welcome any feedback or suggestions on how I can improve this for my own learning.
Thank you in advance for your time and any assistance you can offer.
I want to create a basic chat system which allows the user to attach a file to a given message.
This is a problem with WebSockets because it is very difficult to send a file through them unless you convert it to binary data. That won't work for me, however, because I want to be able to interact with the files on the server, and since I have multiple file types (different images, videos and GIFs), I can't just decode the binary.
Here is my code:
Where I make the request to upload the message:
// this.state._new_message : {
// _message_type_id:int,
// _message:string || file,
// _user_id:int,
// _conversation_id:int
// } -> this is what a _new_message object looks like.
let fd = new FormData()
if(this.state._new_message._message_type_id !== 1){
    fd.append("file", this.state._new_message._message)
}
fd.append("message", JSON.stringify(this.state._new_message))
Axios.post(API_URL+"/api/chat/send_message", fd)
    .then(res => res.data)
    .then(res => {
        this.socket.emit("message", res._message)
    })
Where I handle the request:
def send_message(self, request):
    data = request.form["message"]
    data = json.loads(data)
    if data["_message_type_id"] != 1:
        data["_message"] = "Loading Message. Please Wait."
    query = fdb.prepare(data, "_messages") # Converts the data dict to a query string
    fdb.create(query["query_string"], query["query_params"]) # Interacts with Database
    _message = fdb.read("""SELECT * FROM _messages WHERE _id = LAST_INSERT_ID()""", (), False)
    if data["_message_type_id"] != 1:
        file = request.files["file"]
        # OpenCV magic... #
        path = "path/to/file"
        file.save(path)
        url_path = "url/path/to/file"
        fdb.update("""UPDATE _messages SET _message = %s WHERE _id = %s""", (url_path, _message["_id"])) # Interacts with Database
        _message["_message"] = url_path # If the message is not text, set it to the path of the file instead
    return jsonify({
        "_message" : _message
    })
This is where I listen for the event (server):
@socket.on("message")
def handleMessage(msg):
    print(msg)
    send(msg, broadcast=True)
    return None
This is where I listen for the event (client):
this.socket.on("message", data => {
    if(data == null || data._message_type_id == undefined) return
    if(this.props.chat.messages.messages !== null)
        if(this.props.chat.messages.messages.filter(message => parseInt(message._id) === parseInt(data._id)).length > 0) return
    this.props.addMessage(data)
})
From this code, you'd expect the message to be sent through HTTP, processed and sent back through HTTP, then the response to be emitted to the server through the socket, and the socket to then send that message to all the clients listening to it. But it doesn't work that way. Sometimes it does, other times it doesn't.
One thing I can confirm is that the messages are saved to the database and file system 100% of the time. I also always get a response from the HTTP request, but when I emit the event from the client and print the message on the server, I sometimes get None or the previous message logged to the console.
So when that message is sent to the client, my "error catching block" will see that it's either undefined or already exists, so it will return without adding it (because it shouldn't add it, of course).
Has anyone experienced this issue before? Do I have a wrong idea about something? Or is there something wrong with my code?
I figured it out.
After a bit of time analyzing my code, I noticed that I do not execute the following two queries in the same transaction:
fdb.create(query["query_string"], query["query_params"]) # Interacts with Database
_message = fdb.read("""SELECT * FROM _messages WHERE _id = LAST_INSERT_ID()""", (), False)
The fdb extension I developed is basically a high-level abstraction of SQLAlchemy which has a very loose transaction management system (auto open/close). So I had to explicitly tell it not to close the transaction until it had executed both queries.
As a result, I always get the latest insert, and not an insert which occurred beforehand (as was the case before I updated my code).
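For illustration, a rough sketch of what running both queries on one connection/transaction can look like with plain SQLAlchemy; the table and column names come from the question, the DSN and the create_and_fetch_message() helper are made up, and fdb's internals may of course differ:
from sqlalchemy import create_engine, text

engine = create_engine("mysql+pymysql://user:password@localhost/chat")  # hypothetical DSN

def create_and_fetch_message(params):
    # Run the INSERT and the LAST_INSERT_ID() lookup on the same connection,
    # inside a single transaction, so the SELECT sees this session's insert.
    with engine.begin() as conn:
        conn.execute(
            text("INSERT INTO _messages (_message_type_id, _message, _user_id, _conversation_id) "
                 "VALUES (:_message_type_id, :_message, :_user_id, :_conversation_id)"),
            params,
        )
        row = conn.execute(
            text("SELECT * FROM _messages WHERE _id = LAST_INSERT_ID()")
        ).mappings().one()
    return dict(row)
LAST_INSERT_ID() is scoped to the MySQL connection, so as long as the INSERT and the SELECT share one, an insert from a concurrent request can no longer slip in between them.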
Goes to show you that abstraction isn't always a good idea if you don't think of all the use cases in advance.
I'm trying to build a real-time notification system for users using Server-Sent Events in Python.
The issue I'm facing is that when I refresh the browser, the browser will hit the EventSource URL again, and as per the docs it should send event.lastEventId as part of the headers in the next request. I'm getting None every time I refresh the page.
<!DOCTYPE html>
<html>
  <head><title>SSE test</title></head>
  <body>
    <ul id="list"></ul>
    <script>
      const evtSource = new EventSource("/streams/events?token=abc");
      evtSource.onmessage = function(event) {
        console.log('event', event)
        console.log('event.lastEventId', event.lastEventId)
        const newElement = document.createElement("li");
        const eventList = document.getElementById("list");
        newElement.innerHTML = "message: " + event.data;
        eventList.appendChild(newElement);
      }
    </script>
  </body>
</html>
On the server side
import asyncio

from asyncio.queues import Queue
from sse_starlette.sse import EventSourceResponse
from starlette.requests import Request
from starlette.status import HTTP_200_OK

@event_router.get(
    "/streams/events",
    status_code=HTTP_200_OK,
    summary='',
    description='',
    response_description='')
async def event_stream(request: Request):
    return EventSourceResponse(send_events(request))

async def send_events(request: Request):
    try:
        key = request.query_params.get('token')
        last_id = request.headers.get('last-event-id')
        print('last_id ', last_id) # this value is always None
        connection = Queue()
        connections[key] = connection  # connections and RUNNING are defined elsewhere in the module
        while RUNNING:
            next_event = await connection.get()
            print('event', next_event)
            yield dict(data=next_event, id=next_event['id'])
            connection.task_done()
    except asyncio.CancelledError as error:
        pass
Now, as per every doc on SSE, when the client reconnects or refreshes the page it will send the last-event-id in the headers. I'm trying to read it using request.headers.get('last-event-id'), but this is always None.
Any pointers on how to get the last event ID would be helpful. Also, how would I make sure that I don't send the same events again once the user has seen them? My entire logic is based on the last-event-id received by the server, so if it's None after the client has read events with IDs 1 to 4, how would I make sure on the server that I don't send those back even though last-event-id is null for the user?
Browser screenshots (not reproduced here) show that the events are being received by the browser, e.g. {alpha: abc, id: 4}, and that each received event sets lastEventId correctly.
I think this is a wrong understanding. Where did you get the part that says "when client reconnects or refreshes the page it will send the last-event-id in headers"?
My understanding is that the last ID is sent on a reconnect. When you refresh the page, that is not seen as a broken connection and a reconnect; you are completely restarting the whole page and forming a brand-new SSE connection to the server from the browser.
Keep in mind that your SSE URL might include a certain request parameter, driven by something on the page. You could have two tabs open from the same "page" but with different items in each, which makes that request parameter differ between their SSE connections. They are two different SSE connections altogether; one does not affect the other, except that you can reach the browser's maximum connection limit if you have too many. In the same way, clicking refresh is like a new tab being created: it is a totally new connection.
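If resume-after-refresh is still needed, one common workaround (outside the SSE spec) is to have the page itself pass the last ID it has seen back to the server, for example as a query parameter it stores before the refresh, and fall back to it when the Last-Event-ID header is absent. A rough server-side sketch along the lines of the handler in the question; the last_id parameter and the range-based event source are made-up placeholders:
from fastapi import APIRouter
from sse_starlette.sse import EventSourceResponse
from starlette.requests import Request

event_router = APIRouter()

@event_router.get("/streams/events")
async def event_stream(request: Request):
    return EventSourceResponse(send_events(request))

async def send_events(request: Request):
    # The Last-Event-ID header is only set by the browser on automatic reconnects;
    # 'last_id' is a hypothetical query parameter the page appends itself on refresh.
    last_id = request.headers.get('last-event-id') or request.query_params.get('last_id')
    start = int(last_id) + 1 if last_id else 0
    for event_id in range(start, start + 5):  # stand-in for the real event queue
        yield dict(data=f"event {event_id}", id=event_id)
On the page side this would mean appending something like ?token=abc&last_id=<stored id> to the EventSource URL, rather than relying on the header.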
Goal: Front-end of application allows users to select files from their local machines, and send the file names to a server. The server then matches those file names to files located on the server. The server will then return a list of all matching files.
Issue: This works great if a user selects fewer than a few hundred files; otherwise it can cause long response times. I do not want to limit the number of files a user can select, and I don't want to have to worry about the HTTP requests timing out on the front-end.
Sample code so far:
//html on front-end to collect file information
<div>
    <input (change)="add_files($event)" type="file" multiple>
</div>
//function called from the front-end, which then calls the profile_service add_files function
//it passes along the $event object
add_files($event){
    this.profile_service.add_files($event).subscribe(
        data => console.log('request returned'),
        err => console.error(err),
        () => //update view function
    );
}
//The following two functions are in my profile_service which is dependency injected into my componenet
//formats the event object for the eventual query
add_files(event_obj){
    let file_arr = [];
    let file_obj = event_obj.target.files;
    for(let key in file_obj){
        if (file_obj.hasOwnProperty(key)){
            file_arr.push(file_obj[key]['name'])
        }
    }
    let query_obj = {files:file_arr};
    return this.save_files(query_obj)
}
//here is where the actual request to the back-end is made
save_files(query_obj){
    let payload = JSON.stringify(query_obj);
    let headers = new Headers();
    headers.append('Content-Type', 'application/json');
    return this.http.post('https://some_url/api/1.0/collection',payload,{headers:headers})
        .map((res:Response) => res.json())
}
Possible Solutions:
Process requests in batches. Re-write the code so that the profile-service is only called with 25 files at a time, and upon each response call profile-service again with the next 25 files. If this is the best solution, is there an elegant way to do this with observables? If not, I will use recursive callbacks which should work fine.
Have the endpoint return a generic response immediately, like "file matches being uploaded and saved to your profile". Since all the matching files are persisted to a DB on the backend, this would work, and then I could have the front-end query the DB every so often to get the current list of matching files. This seems ugly, but figured I'd throw it out there.
Any other solutions are welcome. Would be great to get a best-practice for handling this type of long-lasting query with angular2/observables in an elegant way.
I would recommend that you break up the number of files that you search for into manageable batches and then process more as results are returned, i.e. solution #1. The following is an untested but I think rather elegant way of accomplishing this:
add_files(event_obj){
    let file_arr = [];
    let file_obj = event_obj.target.files;
    for(let key in file_obj){
        if (file_obj.hasOwnProperty(key)){
            file_arr.push(file_obj[key]['name'])
        }
    }
    let self = this;
    let bufferedFiles = Observable.from(file_arr)
        .bufferCount(25); //Nice round number that you could play with
    return bufferedFiles
        //concatMap will make sure that each of your requests is not executed
        //until the previous one completes. Then all the data is merged into a single output
        .concatMap((arr) => {
            let payload = JSON.stringify({files: arr});
            let headers = new Headers();
            headers.append('Content-Type', 'application/json');
            //Use defer because http.post is eager;
            //this makes it only execute after subscription
            return Observable.defer(() =>
                self.http.post('https://some_url/api/1.0/collection', payload, {headers: headers}));
        }, (arr, resp) => resp.json());
}
concatMap will keep your server from executing more than whatever the size of your buffer is, by preventing new requests until the previous one has returned. You could also use mergeMap if you wanted them all to be executed in parallel, but it seems the server is the resource limitation in this case if I am not mistaken.
I'd suggest using WebSocket connections instead because they don't time out.
See also
- https://www.npmjs.com/package/angular2-websocket
- http://mmrath.com/post/websockets-with-angular2-and-spring-boot/
- http://www.html5rocks.com/de/tutorials/websockets/basics/
An alternative approach would be polling, where the client makes repeated requests in a defined interval to get the current processing state from the server.
To send multiple requests and wait for all of them to complete:
getAll(urls:any[]):Observable<any[]> {
    let observables = [];
    for(var i = 0; i < urls.length; i++) {
        observables.push(this.http.get(urls[i]));
    }
    return Observable.forkJoin(observables);
}

someMethod(server:string) {
    let urls = [
        `${server}/fileService?somedata=a`,
        `${server}/fileService?somedata=b`,
        `${server}/fileService?somedata=c`];
    this.getAll(urls).subscribe(
        (value) => processValue(value),
        (err) => processError(err),
        () => onDone());
}
As a side project at work, I am currently working on implementing a LiveChat Report web app. Basically, it will download all of our LiveChat data to .csv files for analysis, etc.
I have the entire thing working quite well; however, I am having a hard time wrapping my head around how to loop through requests based on the number of result pages returned. For example, I want certain information from each chat, but the LiveChat API only returns 25 chats per page. Unfortunately, I cannot hard-code a call for each page, since depending on the date range parameters the number of pages varies each time. I want to get all of these pages into one CSV, if possible.
My request looks like:
function chatListReport() {
    document.getElementById('chat_list_submit').addEventListener('click', function(event) {
        var req = new XMLHttpRequest();
        var params = {date_from:null, date_to:null, page:null};
        params.date_from = document.getElementById('date_from').value;
        params.date_to = document.getElementById('date_to').value;
        params.page = document.getElementById('page').value;
        req.onreadystatechange = function() { //when response received
            if (req.readyState == 4 && req.status == 200) {
                var response = (req.responseText);
                var final = "data:text/csv;charset=utf-8," + encodeURI(response);
                var link = document.createElement('a');
                link.setAttribute('href', final);
                link.setAttribute('download', 'ChatListSurveyReport.csv');
                link.click();
            }
        }
        req.open('POST', '/chat_list_report', true); //submit update via POST to server
        req.setRequestHeader('Content-Type', 'application/json'); //set request header
        req.send(JSON.stringify(params));
        event.preventDefault();
    });
}
My server side looks like this (using NPM liveChatAPI):
app.post('/chat_list_report', function(req, res){
    var params = req.body;
    api.chats.list(params, function(data){
        var headers = 'Chat Date,Agent,PostChat Survey Rating,Comments';
        var result = (data.chats || [])
            .filter(function(chat) {return chat.type === "chat"})
            .map(function(chat) {
                var postSurvey = chat.postchat_survey || [];
                return [
                    chat.ended.replace(",", ""),
                    chat.agents[0].display_name,
                    (postSurvey[0] || {}).value || "",
                    (postSurvey[1] || {}).value || "",
                ].join(',');
            });
        result.unshift(headers);
        res.send(result.join('\n'));
    });
});
I am able to return the number of pages; that is included in the returned JSON. So in short, is there a way to feed that page count back into the request, loop through the request that many times, and then create one CSV with all of the information? Right now I am limited to only 25 chats per single request.
Thanks! Any help is greatly appreciated.
We see two possible solutions:
1:
A. The front-end sends a request to the back-end, and the back-end returns the ID of a job, which the front-end must store on its side, for example in localStorage.
B. Meanwhile, the back-end works on the job on its own and tries to finish it as a complete task.
C. The front-end sends a request to the back-end every few seconds asking about the task, giving its ID. While the task is not complete, the back-end replies with 'in progress', and the front-end continues to enquire about the task until it is finished.
D. If the front-end enquires about the task and this time receives notification that the task is completed, the back-end returns a message like 'finished' together with the completed information.
E. The GUI should show: waiting for a response; then, on success, a link to download the file.
(A rough back-end sketch of this flow follows after the two options.)
---------OR----------
2:
A. The front-end sends a request to the back-end; the back-end registers the task and, after completing it, sends an e-mail with the completed data.
B. The front-end informs the user that the task has been added to the queue and that an e-mail with the results will be sent only once it is completed.
C. This requires an additional input on the front-end side (an e-mail address) and handling some sort of e-mail API, e.g. https://postmarkapp.com/
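For option 1, a minimal back-end sketch of the job registry; Flask, the in-memory jobs dict, the /jobs endpoints and the placeholder do_the_work() are all illustrative assumptions, not the poster's actual stack:
import threading
import uuid

from flask import Flask, jsonify

app = Flask(__name__)
jobs = {}  # job_id -> {"status": ..., "result": ...}

def do_the_work():
    # Placeholder for the real long-running processing (matching files, building the CSV, ...).
    return {"download_url": "/files/report.csv"}  # made-up result shape

def run_job(job_id):
    jobs[job_id]["result"] = do_the_work()
    jobs[job_id]["status"] = "finished"

@app.route("/jobs", methods=["POST"])
def create_job():
    job_id = str(uuid.uuid4())
    jobs[job_id] = {"status": "in-progress", "result": None}
    threading.Thread(target=run_job, args=(job_id,), daemon=True).start()
    return jsonify({"id": job_id})

@app.route("/jobs/<job_id>", methods=["GET"])
def job_status(job_id):
    # The front-end polls this every few seconds until "status" is "finished".
    return jsonify(jobs.get(job_id, {"status": "unknown"}))
The front-end stores the returned ID (e.g. in localStorage) and keeps polling GET /jobs/<id> until it receives the finished payload with the download link.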
Let me know how it works,
Cheers,
Adam