I am trying to implement a queue-based process system. I have a React component that sends a POST request to the server and, once the status changes in the backend, starts uploading files to S3. Each process contains multiple files (on the order of 50 to 100k files). On page refresh the upload process stops. This can also happen across multiple tabs, which is what the queue system is for. Suppose the user starts an upload process in one tab, then opens another tab and creates a second process, and then a third process in a third tab: the second one should start only after the first process is complete, the third after the second, and so on. For this, I keep a list of process ids in localStorage, and in a setInterval I check whether the first element of the array is the current process id; if it is, I continue with the uploading, otherwise I keep checking every 1 sec. Once a process has uploaded its files, I shift the array.
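Roughly, the current setup looks like this (a sketch only; QUEUE_KEY, createProcessId() and uploadAllFiles() are illustrative names, and uploadAllFiles() is assumed to return a promise):

```js
// Sketch only: QUEUE_KEY, createProcessId() and uploadAllFiles()
// are illustrative names, not an existing API.
var QUEUE_KEY = 'uploadQueue';
var processId = createProcessId(); // however this tab's process is identified

// Enqueue this tab's process id
var queue = JSON.parse(localStorage.getItem(QUEUE_KEY) || '[]');
queue.push(processId);
localStorage.setItem(QUEUE_KEY, JSON.stringify(queue));

// Every second, check whether we are at the head of the queue
var poll = setInterval(function () {
  var q = JSON.parse(localStorage.getItem(QUEUE_KEY) || '[]');
  if (q[0] !== processId) return; // not our turn yet, keep polling
  clearInterval(poll);
  uploadAllFiles().then(function () {
    // All files done: shift ourselves off so the next process can start
    var done = JSON.parse(localStorage.getItem(QUEUE_KEY) || '[]');
    done.shift();
    localStorage.setItem(QUEUE_KEY, JSON.stringify(done));
  });
}, 1000);
```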
My problem is that after a system crash there are orphan ids left in the array, and because of that my new processes get stuck. Is there any way to know whether any requests are still coming from a process, and if there are none, remove that id from the array in localStorage so that the next process id becomes the first in the array, and so on? That way I could remove the orphan process ids and start the next relevant process. I have to implement this using storage, but there is no way for me to know whether a browser crash happened. Also, if the user opens the browser back up, clicks on "restore pages", and comes back to the same page, this approach does not work; it only works if the user closed the tab or refreshed it.
The solution in this link does not work for multiple tabs. I was looking at service workers, but I am new to the concept. Will they help? Or does any other approach work in this case? I am trying to solve this problem on the client side. Is this possible, or should I use a combination of backend and frontend for this?
Related
So I have a website where an API gets called on a button click, and the API takes some time to process (as the data is large), making the user wait. I want the API call to keep running and complete even when the user goes to some other page or URL. Is there any solution for this? I am using AngularJS 1.1.1.
Currently the API call is simply aborted when the URL is changed.
button click and the api takes some time to process (as data is large), making the user wait.
You might want to read up on caching or Redis. Is this data 'fresh' each time? If so, caching won't help, and it might be a good time to optimize your code.
I want the api to keep running and complete even when the user goes to some other page or url
Either this is going to become a SPA (single-page application) or the work will need to move into a service worker (i.e., similar to how notifications work).
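If the service-worker route is taken, the rough idea is to hand the request off to the worker, which can outlive the page that started it. A minimal sketch, assuming a made-up /api/long-job endpoint and message shape (note this needs far more modern browsers than AngularJS 1.1.1 implies):

```js
// sw.js -- sketch only; '/api/long-job' and the message type are made up
self.addEventListener('message', function (event) {
  if (event.data && event.data.type === 'START_JOB') {
    // waitUntil keeps the worker alive until the fetch settles,
    // even if the page that sent the message navigates away
    event.waitUntil(
      fetch('/api/long-job', { method: 'POST' })
        .then(function (res) { return res.json(); })
        .catch(function () { /* job failed; log or retry */ })
    );
  }
});

// page script -- register the worker, then ask it to run the job
navigator.serviceWorker.register('/sw.js').then(function () {
  return navigator.serviceWorker.ready;
}).then(function (reg) {
  reg.active.postMessage({ type: 'START_JOB' });
});
```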
I am looking to build an in-house debugging system so we can see how users react to certain things.
What would be the best way to communicate all mouse clicks, moves, etc. back to the server?
One way I've thought of is to bind a handler on body for everything and just add each event to an array which is sent at page unload, but I figured this could seriously kill a browser if the user has decided to click everything in sight or has sat there at work for 4 hours just moving his mouse on the page.
Ideally I want to avoid web sockets.
I'm sure this has been done before so I'd love to know how it's been done.
Thanks
For those of you wondering, I used the answer found here (How do you log all events fired by an element in jQuery?)
as a wrapper, in combination with #hallleron's approach: storing the values in a string separated by |, firing off AJAX queries every 3.5 seconds, then setting the array back to null. On page unload, the AJAX query fires one final time.
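For reference, the batching could look roughly like this (a sketch; the /log endpoint, field names and CSRF_TOKEN variable are placeholders):

```js
// Sketch of the batching described above; '/log' and CSRF_TOKEN are placeholders
var eventBuffer = [];

// Record events from the whole page (in production you would throttle
// mousemove, or it will fire constantly)
$('body').on('click mousemove', function (e) {
  eventBuffer.push([e.type, e.pageX, e.pageY, Date.now()].join(','));
});

function flush() {
  if (!eventBuffer.length) return;
  var payload = eventBuffer.join('|'); // values separated by |
  eventBuffer = [];
  $.post('/log', { events: payload, _token: CSRF_TOKEN });
}

setInterval(flush, 3500);      // fire off AJAX every 3.5 seconds
$(window).on('unload', flush); // one final send on page unload
```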
I'm also considering making the unload script dynamically create an iframe with a dynamically generated form that auto-posts its contents, just in case the AJAX hangs for whatever reason.
All array strings use their own CSRF token and carry a randomly generated client-side ID, which is then hashed on the server side and used to check whether that payload has already been sent, just to stop any possible double AJAX requests.
On the server side, the data is stored in a MySQL table using the ARCHIVE engine and the INSERT DELAYED insert method.
Eventually I'll probably move the logging system to its own EC2/RDS group.
The reasoning behind all this is to be able to see the most popular features of the website and who is clicking where (say, if there are two home buttons, which one is more popular, etc.).
Hope this helps anyone else stuck in this predicament.
I am building a WebApp (ERP) and I need to display the people currently logged in and active on the page. I managed to get something pretty accurate by listening for mouse/keyboard events and periodically reporting to the DB.
I don't know how to mark people offline when they close the page. I tried using onbeforeunload, but it obviously fires when the user simply changes pages (clicks a link inside the ERP that points to another page in the ERP).
I then tried to use WebSockets, but the problem is the same: every time the page is reloaded, the WebSocket connection is closed.
So I can think of two ways:
Use WebSockets indeed, and replace all links with a call to a JavaScript function that would somehow tell the server that the user is about to change pages (so that the server doesn't mark them as offline). But that doesn't feel right, semantically speaking: links should be links; they simply point to another location.
Use either WebSockets or AJAX and never actually change page: links are replaced by a function that fetches the content and displays it on screen (updating the DOM with JavaScript). But again, it doesn't feel right either, because semantically the page would have no meaning and the URL would never change, so the user can't copy-paste a link to the page to refer to it, right?
So, is there a proper, clean way of doing this? Thanks for your help.
If each of your pages has a webSocket connection to your server, then on the server you can see when any given page is closed by seeing that the webSocket gets closed.
To avoid thinking that a user has left the site when they are just navigating from one page in your site to another, you simply need to add a delay server-side so that you only report that the user has left your site if there has been no webSocket connection from this user for some time period (probably at least a few seconds).
So, on your server, when you detect that the last webSocket connection for this user has been closed, you set a timer for some number of seconds. If the user opens another page on your site (either by navigating or by opening a new tab) before the timer goes off, you cancel the timer and count the user as still connected. If the timer goes off, then you know the user has been away from your site for whatever time period you picked (say 10 seconds), and to you this signifies that they have left the site. You can control how long that time period should be before you decide that, yes, they are gone.
All attempts at trying to "see" the user leaving your page IN the browser with Javascript are going to have holes in them because there are always ways for a web page to get closed without your client-side javascript having a chance to tell your server. The beauty of the webSocket solution is that the browser automatically and reliably tells your server when the page is now gone because it closes the webSocket and your server receives the notification that the socket has been closed.
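A minimal sketch of that grace-period timer on the server, assuming Node.js with the ws package; userIdFor() and markOffline() are hypothetical stand-ins for your own session and DB logic:

```js
var WebSocketServer = require('ws').Server;

var wss = new WebSocketServer({ port: 8080 });
var openSockets = {};   // userId -> number of open sockets
var offlineTimers = {}; // userId -> pending "user left" timer

wss.on('connection', function (socket, req) {
  var userId = userIdFor(req); // hypothetical: e.g. parsed from a session cookie

  // User came back before the grace period expired: cancel the timer
  if (offlineTimers[userId]) {
    clearTimeout(offlineTimers[userId]);
    delete offlineTimers[userId];
  }
  openSockets[userId] = (openSockets[userId] || 0) + 1;

  socket.on('close', function () {
    if (--openSockets[userId] > 0) return; // other pages still open
    // Last socket closed: wait 10s before declaring the user gone, so
    // normal page-to-page navigation doesn't count as leaving the site
    offlineTimers[userId] = setTimeout(function () {
      delete offlineTimers[userId];
      markOffline(userId); // hypothetical: update your DB/presence list
    }, 10000);
  });
});
```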
As I understand it, you want to count the users active on your website/pages.
Identify the user (an id that is ~99% unique can be computed):
http://valve.github.io/blog/2013/07/14/anonymous-browser-fingerprinting/ (you can use another library; there are a few).
On each page, send the computed id (fingerprintjs) to the server (via web socket/ajax): once at page load, meaning the user is navigating, and then periodically (every 60 sec, or a lower time frame if you choose), meaning the user is staying on the page.
On the server, keep a list of ids with an expiration date (60s), adding an entry when a new user shows up (stored in a database or session).
On your website, retrieve the count (via 60-sec ajax/websocket polling) of ids having timestamp >= server time minus, let's say, 120 sec.
Knowing whether the user is logged in, and on which page:
use an object sent to the server, e.g. {fingerprint: 123123124234, logged: true, page: 'home'}
Clearing the list, if you are not storing the users in a database:
a separate thread (server only) walks the list and destroys all nodes older than 10 min, or whatever your page session length is set to.
js timer: http://www.w3schools.com/jsref/met_win_setinterval.asp
Hope it's helpful; I did something similar, using a 5-min timer to tell the server the user is still on the page, plus a signal at page load.
It gives the count of users within a 60-sec frame, and can even show the names of the users present on a page :)
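The client side of that heartbeat could look something like this (a sketch assuming valve's fingerprint library linked above and jQuery for the ajax call; the /heartbeat endpoint is a placeholder):

```js
// Sketch only: Fingerprint comes from the library linked above,
// '/heartbeat' is a placeholder endpoint
var fingerprint = new Fingerprint().get();

function ping() {
  $.post('/heartbeat', {
    fingerprint: fingerprint,
    logged: true,            // whether the user is authenticated
    page: location.pathname  // which page they are on
  });
}

ping();                   // once at page load (user is navigating)
setInterval(ping, 60000); // then every 60s while they stay on the page
```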
Somebody already posted this kind of question.
Hope this helps:
Detect if user has closed ALL windows for a website?
I have a situation where I have a page with tabs to hold multiple buttons for various functions. Each tab is for a different set of functionality (e.g. customers, orders and admin).
The way it was originally designed, I load all of the tabs and all of their buttons. The buttons shown are dependent on who is logged on.
Additionally, if a user clicks on a function, it loads the code for that page, replacing the buttons in that tab as well as all of the code for the other tabs. I don't think this is very efficient, and I would guess it would be better to load the content of the tabs using AJAX.
What would be the best way to accomplish this behaviour and make the code more responsive? My thought is to store the HTML that goes inside the tab divs in session variables, so I would only need to generate it once when the user logs on and then serve it back based on the currently selected tab.
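For what it's worth, a minimal sketch of the AJAX approach with a simple in-page cache instead of session storage (the /tabs/ endpoint and markup hooks are illustrative):

```js
// Sketch only: '/tabs/', .tab-link and #tab-content are illustrative
var tabCache = {}; // in-page cache so each tab is fetched only once

$('.tab-link').on('click', function (e) {
  e.preventDefault();
  var tab = $(this).data('tab'); // e.g. "customers", "orders", "admin"

  if (tabCache[tab]) {
    $('#tab-content').html(tabCache[tab]);
    return;
  }
  $.get('/tabs/' + tab, function (html) {
    tabCache[tab] = html; // cache client-side instead of in the session
    $('#tab-content').html(html);
  });
});
```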
Before saving to the session, you must keep in mind that:
Session data is read all together at the start of every page load and written all together at the end of every page request. The session is also locked while it is read and written, so if you add too much data and create a delay reading and writing it, this can affect the users on your site. Imagine creating a big chunk of HTML page data of about 300 KB that you save in the session, then doing the same 10 times: that is 3 MB of session data that must be read and written all the time.
So it is like adding extra data that follows the user's session around.
If possible, it is better to build a custom cache in a database, keyed by the session id, loading an entry only when the user enters that page and saving it only when the user changes it.
On the other hand, if you do not have many users, need to make it quick, have no database connection, or a small custom cache is difficult to build, then use the session for one or two pieces of data like that.
All my research so far suggests this can't be done, but I'm hoping someone here has some cunning ideas.
I have a form on a website which allows users to bulk upload lots of URLs to add to a list on the server. There's quite a lot of server-side processing to do on each URL, so to avoid timeouts and to display progress, I've implemented the upload using jQuery to submit the URLs one at a time using ajax.
This is all working nicely. However, part of the processing on each URL is deduplicating it against the complete list. The ajax call returns a status indicating either a successful upload or a rejection due to duplication. As the upload progresses, I tell the user how many URLs have been rejected as duplicates (along with overall progress and ETA).
The problem now is how to give the user a complete list of the failed duplicate URLs. I've kept them in an array in my jQuery, and would like the user to be able to click on a link on the form to download a text file containing those URLs. Is this possible just using client-side processing?
The server-side processing basically handles a single keyword at a time. I'd rather not have to store the duplicates in a database table with some kind of session key which gets sent with every ajax call, and is then used at the end to generate the text file server-side (and then gets cleaned up some time later). I can see how to do this, but it seems very clunky and a bit 20th century.
I haven't used it myself yet, but Downloadify was built for exactly this purpose, I think.
Downloadify is a tiny JavaScript + Flash library that enables the generation and saving of files on the fly, in the browser, without server interaction.
It was created by Doug Neiner who is also pretty active on Stack Overflow.
It needs Flash 10 to work.
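A hedged sketch of wiring it up, with option names taken from the Downloadify README (the swf/image paths, the element id, and the duplicateUrls array are placeholders):

```js
// Sketch only: 'download-link', the asset paths and duplicateUrls
// are placeholders for your own page
Downloadify.create('download-link', {
  filename: 'duplicates.txt',
  data: function () {
    // the array of rejected URLs kept in your jQuery code
    return duplicateUrls.join('\n');
  },
  swf: 'media/downloadify.swf',        // path to the Flash movie
  downloadImage: 'images/download.png', // the button image
  width: 100,
  height: 30,
  transparent: true,
  append: false
});
```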