I am creating a complex social networking website that is a single page that never refreshes unless the user presses the browser's refresh button.
The issue here is that when I edit files and upload them to the server they don't take effect unless the user refreshes the browser.
How would I go about fixing this problem? Should I refresh the browser on a timed interval? Or should I poll the server every 10 minutes to check whether the browser should do a refresh?
Any suggestions?
Server
I would communicate the version number through whatever means you're already using for data transfer. Presumably that's some kind of API, but it may be sockets or whatever else.
Whatever the case, I would recommend that with each response - a tidy way is in the header, as suggested in comments by Kevin B - you transmit the current application version.
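For instance, with an Express-style API this can be a one-line middleware (a minimal sketch; the header name and where the version comes from are just examples):

```js
// Attach the current application version to every API response.
const express = require('express');
const { version } = require('./package.json'); // or an env var set at deploy time

const app = express();

app.use((req, res, next) => {
  res.set('X-App-Version', version); // the client inspects this on each response
  next();
});

app.get('/api/posts', (req, res) => {
  res.json([]); // the normal payload is untouched
});

app.listen(3000);
```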
Client
It is then up to the client to handle changes to the version number supplied. It will know from initial load and more recent requests what the version number has been up until this point. You might want to consider different behaviour depending on what the change in version is.
For example, if it is a patch number change, you might want to present to the user the option of reloading, like Outlook.com does. A feature change might do the same with a different message advertising the fact that new functionality is available, and a major version change may just disable the site and tell the user to reload to regain access.
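Sketched with an Axios-style response interceptor (assuming the X-App-Version header from above; the injected boot version and the banner behaviour are placeholders, not prescriptions):

```js
import axios from 'axios';

const BOOT_VERSION = window.__APP_VERSION__; // hypothetical: injected at initial load

function offerReload(message) {
  if (window.confirm(message + ' Reload now?')) window.location.reload();
}

axios.interceptors.response.use((response) => {
  const server = response.headers['x-app-version'];
  if (server && BOOT_VERSION && server !== BOOT_VERSION) {
    const [maj, min] = server.split('.');
    const [curMaj, curMin] = BOOT_VERSION.split('.');
    if (maj !== curMaj) {
      // Major change: disable the site until the user reloads
      document.body.innerHTML = 'The site has been updated. Please reload to regain access.';
    } else if (min !== curMin) {
      offerReload('New functionality is available.');
    } else {
      offerReload('A small update is available.'); // patch change
    }
  }
  return response;
});
```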
You'll notice that I've skated around automatic reloading. This is definitely not a technical issue so much as a UX one. Having a SPA reload with no warning (which may well result in data loss) is not the best and I'd advise against it, especially for patch version changes.
Edit
Of course, if you're not using any kind of API or other means of dynamically communicating data with the server, you will have to resort to polling an endpoint that will give you a version and then handle it on the client in the same way. Polling isn't super tidy, but it's certainly better - in my strong opinion - than reloading on a timer on the off chance that the application has updated in the interim.
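If you do end up polling, something like this is enough (the endpoint and interval are assumptions; handleVersionChange stands in for the same client-side logic described above):

```js
const POLL_MS = 5 * 60 * 1000;

async function checkVersion() {
  try {
    const res = await fetch('/version'); // assumed endpoint returning {"version": "1.4.3"}
    const { version } = await res.json();
    handleVersionChange(version); // hypothetical: your version-comparison handler
  } catch (e) {
    // A failed poll shouldn't break the app; just try again next interval.
  }
}

setInterval(checkVersion, POLL_MS);
```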
Are you talking about changing the client-side code of the app, or the content? You can have the client call the server for updated content using AJAX requests; one possibility would be to do so whenever the user changes state in the app or opens a page that loads a particular controller. If you are talking about changing the HTML or JavaScript, I believe the user would need to reload to get those updates.
Related
I use Axios for my AJAX requests in my Vue components and noticed that (after page refresh) requests to cached endpoints were being sent to the server instead of hitting the cache if they were called from the mounted hook. Every other request was hitting the browser cache correctly.
I had two (different) requests being dispatched from the mounted hook and when I moved the second request inside the .then() of the first request it started hitting the browser cache.
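Simplified, the change that made the second request hit the cache looked roughly like this (the endpoint names are made up):

```js
import axios from 'axios';

export default {
  mounted() {
    // Before: both dispatched from mounted() and both missed the cache
    // axios.get('/api/profile');
    // axios.get('/api/settings');

    // After: chaining the second request made it hit the browser cache
    axios.get('/api/profile').then(() => axios.get('/api/settings'));
  },
};
```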
What causes this behaviour? How can I have page data loaded from the cache inside the mounted hook?
Edit: This behaviour is seen in Firefox. Pasting the request URLs in a new tab loads the data from the browser cache correctly. I've still not found a solution to force the requests to respect the cache control in the Vue code inside the mounted hook.
tl;dr
To properly test client-side caching for a page, copy/paste the URL in a new tab instead of hitting F5.
Why?
What happens when a user hits Refresh is not entirely up to you, as a developer. It's also up to the browser manufacturer.
If you're looking for a way to forbid a browser from loading fresh data when it already has a version which, according to your server-side cache-control settings, is still valid, there isn't one.
It's up to the browser manufacturer to guess what the user really wants and to provide it. On average, when the guess is correct, their market share goes up. When wrong, it goes down. For obvious reasons, you have no say in it.
You should also consider that the vast majority of site owners do not know how to tweak cache-control and don't bother paying someone to do it for them, so they just go with the defaults. So, in order to keep their users happy, browser manufacturers need to guess correctly when the user wants the content refreshed, even if the server says it's fresh enough.
In other words, you can't enforce client side caching (but you can server-side!). You can only suggest it. If the browser manufacturer has reason to believe the user wants a "Clear cache and hard reload" but is not tech-savvy enough to perform it, they'll perform it on simple Refresh.
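For reference, the server-side suggestion looks like this (an Express sketch; the route and max-age value are arbitrary):

```js
const express = require('express');
const app = express();

app.get('/api/articles', (req, res) => {
  // A suggestion only: the browser may still revalidate or refetch on Refresh
  res.set('Cache-Control', 'public, max-age=3600');
  res.json({ articles: [] });
});

app.listen(3000);
```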
Chrome, based on large amounts of usage data, tries to guess what's the "primary cache" for a page and what is "secondary cache". It tries to balance providing fresh content (by wiping out primary cache for that page) with doing it in a timely manner (by not wiping out secondary cache). I believe any resource loaded by more than one page on the same domain is marked as "secondary" although I'd guess the algorithm is smarter than just that.
Firefox, as far as I know, doesn't have multiple types of refresh, so if the user hits refresh all cache for said page is gone.
I am a young developer, and I work on a site whose content is stored on Contentful. Currently, on every page load, the JavaScript retrieves the content from Contentful via the API.
The content of the site is not likely to change often, so I would like to cache it.
The site is hosted on Netlify.
So I thought I could fetch the content from Contentful during the Node build and store it in a "cache" that the JavaScript could use when loading the page. Then, when something is modified on Contentful, a webhook would trigger a rebuild on Netlify.
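Something like this build script is what I have in mind (using the official contentful JavaScript SDK; the output path is arbitrary):

```js
// build-content.js - run during the Netlify build, before bundling the site
const fs = require('fs');
const contentful = require('contentful');

const client = contentful.createClient({
  space: process.env.CONTENTFUL_SPACE_ID,
  accessToken: process.env.CONTENTFUL_DELIVERY_TOKEN,
});

client.getEntries().then((entries) => {
  // The front-end JavaScript would read this file instead of calling the API
  fs.writeFileSync('./public/content.json', JSON.stringify(entries.items));
});
```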
I don't know if my thinking is the right approach; thank you for your help and your answers.
Contentful actually has caching built into its service so you shouldn't need to do anything to get the benefits of caching on your website. Quoting from the Contentful Docs:
There are no limits enforced on requests that hit our CDN cache, i.e. the request doesn't count towards your rate limit and you can make an unlimited amount of cache hits. For requests that do hit the Contentful Delivery API, rate limits of 78 requests per second and 280800 requests per hour are enforced by default. Higher rate limits may apply depending on your current plan.
See https://www.contentful.com/developers/docs/references/content-delivery-api/#/introduction/api-rate-limits for full details
If you want to do additional caching on top of the Contentful API, you could utilize a Node library that'll do it for you. Something like APICache would work pretty well in this use case.
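A minimal sketch of wiring it up in Express (the route, cache duration, and fetchFromContentful helper are illustrative, not part of apicache):

```js
const express = require('express');
const apicache = require('apicache');

const app = express();
const cache = apicache.middleware;

// Serve the Contentful response from cache for 10 minutes at a time
app.get('/content', cache('10 minutes'), async (req, res) => {
  const data = await fetchFromContentful(); // hypothetical: your existing API call
  res.json(data);
});

app.listen(3000);
```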
If rebuilding the stack when new content is published, rather than rendering on page view, is important to you, I'd encourage you to take a look at static sites. Contentful has some great webhook support that you can use together with Netlify to help rebuild your site any time an author pushes new content. Check out this tutorial about using Gatsby for more details - https://www.contentful.com/blog/2018/02/28/contentful-gatsby-video-tutorials/
It seems to be better to cache the pages separately (instead of caching the whole site) and use a cron job to compare the cache of each page (maybe weekly) against the current version. If it is different, regenerate the cache for that page. Also, you might want to manually trigger that, possibly on deploys or in the rare event when there is a change on a given page.
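A rough sketch of that weekly comparison with node-cron (the schedule and the page helpers are invented for illustration):

```js
const cron = require('node-cron');

// Every Sunday at 03:00: compare each page's cache against a fresh render
cron.schedule('0 3 * * 0', async () => {
  for (const page of await listCachedPages()) {   // hypothetical helper
    const fresh = await renderPage(page.path);    // hypothetical helper
    if (fresh !== page.cachedHtml) {
      await writeCache(page.path, fresh);         // regenerate only stale pages
    }
  }
});
```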
Anyway, before you start on all this caching work you should check whether your site is anywhere near being overwhelmed by requests. If not, then caching can be postponed until later, which would be wise: if your site's nature changes over time and changes start occurring often, you might need a different cache, or even no cache at all.
I want to make an app with Ruby on Rails and jQuery that will allow multiple users to have the same page open at the same time, and if any of them makes a change to the page, adds a post, or deletes a post, all other users will see that change without having to reload the page.
Here on Stack Overflow, whenever another user gives me a point or removes a point on a post, it shows me the change without my having to refresh the page.
Same with the comments, if someone posts, I will see it without having to refresh.
Can anyone tell me where to get started with this?
I would rather not have to have the page reload every 30 seconds.
Any help would be appreciated.
Thank you in advance.
There are a few ways to go about this:
Websockets
Server Sent Events via ActionController::Live (Rails 4+)
Long Polling (outdated method at this point)
Between websockets and SSE I would go with the former: higher browser compatibility and the more mature technology of the two. If you're willing to pay for convenience, check out Pusher (solid free tier). Otherwise you might want to check out something like Faye (good intro at http://railscasts.com/episodes/260-messaging-with-faye).
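To give a flavour of the Faye route, the browser side boils down to a subscription (the channel name and message shape here are my own assumptions, not Faye requirements):

```js
// Assumes a Faye server mounted at /faye on port 9292
var client = new Faye.Client('http://example.com:9292/faye');

client.subscribe('/posts', function (message) {
  if (message.action === 'created') {
    $('#posts').prepend(message.html);        // jQuery, as in the question
  } else if (message.action === 'deleted') {
    $('#post-' + message.id).remove();
  }
});

// The Rails side publishes to the same '/posts' channel whenever a post
// is created or deleted (see the Railscasts episode above for that half).
```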
Say a link is sent to a user via email. If the user is already logged into the site in his/her browser, clicking the link takes him/her to the page. However, if he/she is not logged in, he/she should be asked to log in in order to access the page. Is there a way to achieve the above functionality using jQuery or JavaScript?
Yes. Build a back-end authentication system, using AJAX and whatever your server-side language is.
From there, develop a hypermedia-style of content-system, and a modular, "widget"-based application delivery model.
Within your hypermedia responses to login (plus passing whatever relevant path information was gained from the e-mail), either redirect the page to a new page (based on the linked response from the server), or download the widgets requested from the server (for whatever application you're displaying media in), and then stream in AJAX content (again, from a URL dictated by the server-response).
This is about as close as you're going to get to security, in terms of delivering things to the client, in real-time, with authentication.
If you were to load the reports/gallery/game/whatever, and put a div over it, and ask for users to log in, then smart users can just kill the div.
If you include the content, or include the application components (JS files), or even include the links to the JS files which will request and display the content, then clever people are again going to disassemble that, in 20 seconds, flat.
The only way I can see to do this is to have a common request-point, to touch the server, and conditionally load your application, based on "next-steps" URLs, passed to the client, based on successful authorization and/or successfully completing whatever the previous step was, plus doing authentication of some form on each request (REST-based tokens+nonces, or otherwise)...
This would keep the content (and any application-structure which might have vulnerabilities) from the client, until you can guarantee that the client has been properly authorized, and the entire application is running inside of multiple enclosed/sandboxed modules, with no direct access to one another, and only instance-based access to a shared-library.
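Stripped to its bones, that common request-point might look like this on the client (the endpoint names and response shape are invented for illustration):

```js
// Single entry point: ask the server what we're allowed to load next.
fetch('/api/session', { credentials: 'include' })
  .then((res) => res.json())
  .then((session) => {
    if (!session.authenticated) {
      // Not logged in: go to login, preserving the original target
      window.location = '/login?next=' + encodeURIComponent(location.pathname);
      return;
    }
    // Logged in: the server dictates the "next-step" URL to load
    return fetch(session.nextStepUrl, { credentials: 'include' })
      .then((res) => res.text())
      .then((html) => { document.getElementById('app').innerHTML = html; });
  });
```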
Is it worth the work?
Who knows.
Are we talking about a NORAD nuclear-launch iPhone app, which must run in JavaScript?
Then no, engineering this whole thing for the next six months isn't overboard.
And again, all of this security falls over as soon as one person leaves themselves logged-in, and leaves their phone on the table (biometric authentication as well, then?).
Are we talking about a gallery or discount offers that you want to put behind a login, so you know that only the invited people are using them?
Well, then an 18-month project to engineer, develop, debug and deploy a system like this is probably going to be overkill.
In this case, perhaps you can just do your best to prevent the average person from stealing your content or using your discounted prices, and accept that people who take the time to dig into and reverse-engineer everything are going to find a way to get what they want, 95 times out of 100.
In that case, perhaps just putting a login div over the top of the page IS what you're going to be looking for...
If you're dealing with, say a company back-end, or with company fiscals or end-user, private-data, or anything of the sort, then aside from meeting legal requirements for collection/display/storage, how much extra work you put into the security of the system depends on how much your company's willing to pay to do it.
If it makes you feel better, there are companies out there that pay $60,000-$150,000 a year, to use JS tracking/testing programs from Adobe. Those programs sit right there, on the webpage, most of the time, for anybody to see, as long as you know where to look.
So this isn't exactly an unknown problem.
Yes, it is. On authentication (login) you can store a "loggedIn" cookie which you delete at session end (logout or closing the browser). You can use that cookie to check whether somebody is logged in or not. If they're not logged in, you can display the login page and send the login request with AJAX. By the way, it is not good practice to use hybrid applications like that; it is better to use an SPA with a REST service, or to implement this on the server side.
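A minimal sketch of that cookie check (the element IDs, cookie name, and /login endpoint are assumptions):

```js
function isLoggedIn() {
  return document.cookie.split(';').some(function (c) {
    return c.trim().indexOf('loggedIn=') === 0;
  });
}

if (!isLoggedIn()) {
  $('#login-form').show(); // show the login overlay instead of the page
  $('#login-form').on('submit', function (e) {
    e.preventDefault();
    $.post('/login', $(this).serialize(), function () {
      location.reload(); // the server sets the loggedIn cookie on success
    });
  });
}
```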
I have an HTML5/JavaScript application in which multiple users can be viewing the same set of data at any given time. For the sake of a real-world example, let's say it's a calendar-type page.
So user1 has the browser open and is looking at the calendar page, and user2 is also on the calendar page. User2 makes a change to the calendar, and I'd like (as quickly as possible) for those changes to be recognized and refreshed on user1's screen. What is the best way to do this?
I'm thinking about having a MySQL table for active users that stores the page they are currently on and a timestamp of its last update, then using AJAX calls to ping the server every few seconds and check for an updated timestamp. If it's newer than what they have client-side, the new data gets sent and the page gets "reloaded." I am putting reloaded in quotes because the actual browser window will not be refreshed; a function will be called via JavaScript that reloads the page's content. Sort of the way Stack Overflow performs its update checks, but instead of telling the user the page has changed and providing a button for reload, it should happen automatically. If user1 is working away on the calendar, it seems it might be quite annoying for user2's screen to constantly be refreshing...
Is this a horrible idea? Is pinging the server with an AJAX request every few seconds going to cause major slowdowns? Is there a better way to do this? I would like the views on both users' sides to be real-time, because it's important that user1 not be able to update an element on the calendar page that user2 has already changed.
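Concretely, I picture the polling half looking something like this (the endpoint, interval, and reloadCalendar function are made up):

```js
var lastUpdated = window.initialTimestamp; // hypothetical: sent with the first render

setInterval(function () {
  $.getJSON('/calendar/last-modified', function (data) {
    if (data.timestamp > lastUpdated) {
      lastUpdated = data.timestamp;
      reloadCalendar(); // hypothetical: re-fetches and re-renders, no browser refresh
    }
  });
}, 5000);
```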
Update: based on some WebSockets research, it doesn't seem like a proper solution. First, it's not compatible with older browsers and I support IE8+, and second, I don't need real-time updates for all users on the site. The site is an account-based application, and an account can have multiple users. The data needs to sync between those users only. Any other recommendations would be great.
You need a realtime app for this. You should have a look at Socket.IO. Every time a user logs in, you make them listen for changes on the server. Then, when something changes on the server, every listening user is notified.
You can find examples on the official website: http://socket.io/
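A minimal sketch of that pattern, scoped to one account as the question's update asks for (the event names and room scheme are my own):

```js
// server.js
const io = require('socket.io')(3000);

io.on('connection', (socket) => {
  // A client announces which account's data it is viewing
  socket.on('join', (accountId) => socket.join('account-' + accountId));

  // When one client edits, notify the other clients of the same account
  socket.on('calendar:update', (change) => {
    socket.to('account-' + change.accountId).emit('calendar:update', change);
  });
});

// client.js (in the browser):
// const socket = io('http://localhost:3000');
// socket.emit('join', accountId);
// socket.on('calendar:update', (change) => applyChange(change)); // applyChange is yours
```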