For my web apps I'm always wondering which way is best to design a proper web application with data persistence. Right now I build a single HTML page every time, and all the content and data uploads are managed with jQuery AJAX requests, based on a RESTful model, to a remote server which takes care of the database. But in the end that sometimes means a lot of AJAX calls, and fetching large amounts of data can take a few seconds, which is not user-friendly.
Is there something like a guideline, or a standard way of designing web apps?
I've already looked over the Web Workers and WebSockets JavaScript APIs, but I've never used them yet. Has anybody tried them? Do they allow better performance than AJAX exchanges?
What is your way of developing web apps?
This isn't really the place for questions like this, but I will give you a few brief pointers.
AJAX requests shouldn't take long; if they are consistently slow then the problem is most likely your server-side code and inefficiencies there. WebSockets aren't going to give you any benefit over AJAX if your server is slow.
A common design is to load the minimal dataset required for the page to function, AJAXing any other required data to get the page responsive as quickly as possible.
Caching and pre-fetching are great ways to speed up your site. For instance, if you are running the same MySQL query over and over, run it once, put the results in a caching service like memcached or MongoDB with an expiration of an hour (or so), and serve the cached response; this will speed up your server response times. Pre-fetching is anticipating what your user is going to do next and loading that data in the background without any user interaction.
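To make that concrete, here is a minimal sketch of a time-based query cache in Node.js; the function names and the one-hour expiry are illustrative, and in practice you would back this with memcached or similar rather than an in-process Map:

```js
// Minimal time-based cache sketch (Node.js). runSlowQuery is whatever
// function actually hits the database; both names here are placeholders.
const cache = new Map();
const ONE_HOUR = 60 * 60 * 1000;

async function getCached(key, runSlowQuery) {
  const hit = cache.get(key);
  if (hit && Date.now() - hit.storedAt < ONE_HOUR) {
    return hit.value;                           // serve the cached result
  }
  const value = await runSlowQuery();           // only touch the DB on a miss/expiry
  cache.set(key, { value, storedAt: Date.now() });
  return value;
}
```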
Consider using localStorage or IndexedDB if your users are loading the same data repeatedly.
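A rough sketch of the localStorage idea, with a simple expiry stamp; the key name, endpoint, and one-hour lifetime are arbitrary choices:

```js
// Cache an AJAX response in localStorage so repeat visits skip the request.
// "dashboard-data", "/api/dashboard" and the one-hour lifetime are arbitrary.
function loadDashboardData(callback) {
  var cached = JSON.parse(localStorage.getItem("dashboard-data") || "null");
  if (cached && Date.now() - cached.storedAt < 60 * 60 * 1000) {
    callback(cached.value);                     // no network round trip
    return;
  }
  $.getJSON("/api/dashboard", function (value) {
    localStorage.setItem("dashboard-data",
      JSON.stringify({ value: value, storedAt: Date.now() }));
    callback(value);
  });
}
```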
I've inherited a site that uses Knockout.js and ASP.NET. The site runs decently after everything has loaded, but the initial load leaves a lot to be desired. Digging through the code, there are around 20 models, and each one calls an AJAX method to get data from the server on page load. Quite a bit of data is being queried from the database, which causes the performance issue: the server sends the JS, then the client sends and receives a large amount of data over 20 separate calls.
I want to handle all of the queries on the server side before I send anything to the client, and then load the JS models from that data. I am thinking about putting this data in a hidden div on the page as JSON and loading the models from there instead of making the AJAX calls.
My question is, is this best practice? Is there a better way to optimize this scenario?
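For reference, the inline-data idea usually looks something like the following; a JSON script block tends to be safer than a hidden div, since it avoids HTML-escaping issues. The element id and PageViewModel are illustrative:

```js
// Assumes the server rendered something like:
//   <script type="application/json" id="initial-data">{ "model1": { ... }, "model2": { ... } }</script>
// Then the 20 AJAX calls on page load are replaced by a single parse:
var initialData = JSON.parse(document.getElementById("initial-data").textContent);

// PageViewModel is a hypothetical view model built from the inlined data.
ko.applyBindings(new PageViewModel(initialData));
```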
If you inline the data from the 20 queries in the page response, the page response time can be significantly prolonged. It will result in the browser having to sit and wait on the previous page, or on a boring blank page.
However, if you keep the solution as-is, the user will initially get the page much faster, and the data will pop in when it is ready.
Although the total load time is probably going to be better with the data inlined, the perceived performance from the user's perspective is going to be worse. Here is a nice post on the subject: http://www.lukew.com/ff/entry.asp?1797
Another benefit is that you avoid a weakest-link problem where the page response time is dictated by the slowest query. That would be quite severe in query-timeout conditions.
Also be aware of error handling: if one query fails, you must still inline the successful queries and handle the failed one separately.
I would argue that it is much better to do the queries from the browser.
There are some techniques to consider if you want the 20 queries executed more efficiently. Consider using something like SignalR to send all queries over a single connection and have the results stream back over that same connection. I've used this technique previously with great success; it also enabled me to stream back cached results (from a server-side cache) before the up-to-date results from a slow backend service were returned.
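As a rough illustration only (the original answer predates the current client library), here is what streaming query results over one SignalR connection can look like with the @microsoft/signalr JavaScript client; the hub URL, method name, and applyToViewModel helper are all hypothetical:

```js
// Stream all query results back over one connection instead of 20 AJAX calls.
// "queryHub", "StreamModels" and applyToViewModel are hypothetical names.
var connection = new signalR.HubConnectionBuilder()
  .withUrl("/queryHub")
  .build();

connection.start().then(function () {
  connection.stream("StreamModels").subscribe({
    next: function (modelData) {
      // Each model's data arrives as soon as its query finishes,
      // so fast or cached results can render before the slow ones.
      applyToViewModel(modelData);
    },
    complete: function () { console.log("all models loaded"); },
    error: function (err) { console.error(err); }
  });
});
```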
I want to create a web application that displays data from a public API. I will use d3 (a JavaScript data-visualization library). I want to retrieve data from the API every ten minutes and update my page (say it shows traffic, or something). I have not built many web applications; how do I get the updates?
Should the JS on the client side use a timer to request updates from the server side of my application (perhaps written in Rails or Node.js)? The server would then make the API call and send a response asynchronously. Is this called a socket? I have read that HTML5 provides sockets.
Or, perhaps an AJAX request?
Or does the server side of my application set a timer, make the API call, and then "push" updates to the view? This seems wrong to me: there could be other views in this application, and the server shouldn't have to keep track of which view is active.
Is there a standard pattern for this type of web application? Any examples or tutorials greatly appreciated.
An AJAX request (XMLHttpRequest) is probably the way to go.
I have a very simple example of an XMLHttpRequest (with Java as the backend) here: https://stackoverflow.com/a/18028943/1468130
You could recreate a backend to receive HTTP GET requests in any other server-side language. Just echo back whatever data you retrieved, and xmlhttp.onload() will catch it.
Depending on how complex your data is, you may want to find a JSON library for your server-side language of choice, and serialize your data to JSON before echoing it back to your JS. Then you can use JavaScript's JSON.parse() method to convert your server data to an object that can easily be used by the client script.
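A bare-bones version of that flow might look like this; the /traffic endpoint and the updateChart function are made up for illustration:

```js
// Plain XMLHttpRequest: ask the server for JSON and parse it on arrival.
// The /traffic endpoint and updateChart function are hypothetical.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/traffic");
xhr.onload = function () {
  var data = JSON.parse(xhr.responseText);   // server echoed back JSON
  updateChart(data);                         // e.g. feed it to your d3 code
};
xhr.send();
```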
If you are using jQuery, it handles AJAX very smoothly, and using $.ajax() would probably be easier than plain-old XMLHttpRequest.
http://api.jquery.com/jQuery.ajax/
(There are examples throughout this page, mostly-concentrated at the bottom.)
It really annoys me how complicated so many of the AJAX tutorials are. At least with jQuery, it's pretty easy.
Basically, you just need to ask a script for something (initiate the request, send url parameters), and wait for the script to give you something back (trigger your onload() or jqxhr.done() functions, supplying those functions with a data parameter).
For your other questions:
Use JavaScript's setTimeout() or setInterval() to initiate an AJAX request every 600,000 milliseconds (ten minutes); see the sketch at the end of this answer. In the request's onload callback, handle your data and update the page appropriately.
The response will be asynchronous.
This isn't a socket.
"Pushing" probably isn't the way to go in this case.
If I understand correctly and this API is external, then your problem can be divided into two separate sub-problems:
1) Updating data at the server. The server should download data once per N minutes, so it should not be tied to customers' AJAX calls. If two customers come to the website at the same time, your server would make two API calls, which is not correct.
Instead, you should create a cron job on the server that calls the API and stores its result. That way your server only ever makes one call at a time and always has reasonably fresh information cached.
2) Updating data at clients. If the data in customers' browsers should be updated without refreshing the page, then you should use some sort of AJAX. It can make a request to your server once per X minutes to get fresh data, or use so-called long polling.
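As a sketch of that first sub-problem, here is the same idea expressed with a timer in a small Node/Express server instead of a real cron job; the external API URL and the port are placeholders, and fetch() assumes a recent Node version (otherwise swap in a library such as axios):

```js
// Refresh the external API on a timer and let every client read the same copy.
var express = require("express");
var app = express();
var latest = null;

function refreshFromApi() {
  fetch("https://api.example.com/traffic")         // placeholder external API
    .then(function (res) { return res.json(); })
    .then(function (data) { latest = data; })
    .catch(function (err) { console.error("API refresh failed", err); });
}

refreshFromApi();
setInterval(refreshFromApi, 10 * 60 * 1000);       // once per ten minutes

app.get("/api/traffic", function (req, res) {      // what the browsers poll
  res.json(latest);
});

app.listen(3000);
```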
I think the most effective way to implement a real-time web application is to use WebSockets to push changes from the server rather than polling from the client side. That way users see changes instantly, as soon as the server notifies them that new data is available. You can read more in the similar post.
I have tried using a Node.js package called socket.io to build a real-time virtual classroom. It works quite well for me.
Quick question:
I have a page that makes an AJAX call to my SQL server and receives an XML response. I parse the XML and display the relevant data in a table.
I have a button on the page that is used to display the table data in a simple line graph on a new page. Currently, I just re-query the database to re-get the data and create a new array object of that data for my graph.
The GET can take up to 2.5 seconds, with the final graph taking about 8-9 seconds to render. I am investigating alternatives to re-GETting the data.
So far I have considered:
localStorage (HTML5)
having PHP pass me the data instead of re-querying the DB
a jQuery plugin (DOMCache)
Any thoughts on this, or best practices?
Thanks for any input!
You might do best to just make sure the response is cached in your users' browsers. There are a variety of ways to make this happen (varying with the framework your server is running, the browsers your clients are using, etc.), but the long and short of it is that relying on caching saves you from jumping through performance hoops and modifying your codebase.
IE8+ is actually a kingpin in this area (much as I hate to admit it). Its aggressive caching of ajax responses is usually a serious pain in the arse, but in this case would prove valuable to your scenario.
Edit -
You mentioned SQL Server, so I'm making the assumption that you're running through an ASP.NET middle tier. If that's the case, here's an interesting article on caching ajax requests on the server and the client with the .NET framework.
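If you would rather skip the second GET entirely on the client, a rough sessionStorage sketch along the lines of your first option might look like this; the storage key, renderTable and drawGraph are illustrative:

```js
// On the table page: keep the raw response around after rendering the table.
$.get("/data.xml", function (xml, status, jqXHR) {
  renderTable(xml);                                   // existing table code
  sessionStorage.setItem("table-xml", jqXHR.responseText);
});

// On the graph page: reuse the stored response if present, otherwise re-GET.
var cached = sessionStorage.getItem("table-xml");
if (cached) {
  drawGraph($.parseXML(cached));                      // hypothetical graph function
} else {
  $.get("/data.xml", function (xml) { drawGraph(xml); });
}
```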
I put together an AJAX chat a while back with ASP.NET MVC and jQuery. The JavaScript would hit the server about every 7 seconds to check for new messages. Obviously this was horrible for performance as the chat grew to include more and more users; site traffic grew exponentially with so many requests going on. A user could leave the computer on all day without even being there, and they would still be making hits every 7 seconds.
Is there a better way to do this? I have heard of something called "push" but I haven't really been able to wrap my head around it. I think I just need pointed in the right direction.
1.) What is the best way to develop an AJAX chat and have it be scalable?
2.) What is push, and how would I use it with jQuery?
1.) What is the best way to develop an AJAX chat and have it be scalable?
I agree with #freakish about the complexity and potential lack of scaling of IIS.
However, there is a relatively new Microsoft option in the works called SignalR which could become a core part of ASP.NET. More details in this related SO Question:
AJAX Comet - Is there any solution Microsoft is working on or supports to allow it to be scalable?
2.) What is push, and how would I use it with jQuery?
Partially answered elsewhere, but it's a long-held persistent connection between the server and the client which means the server can instantly 'push' data to the client when it has new data available.
jQuery does support making AJAX requests, but the core library doesn't expose ways of doing HTTP long-polling or HTTP streaming. More information in this SO answer to 'Long Polling/HTTP Streaming General Questions'.
Server push is a technique that allows the server to push data back to the client without forcing the client to make many requests (like one every 7 seconds). It is not really a matter of JavaScript but rather of good server scripting. The upcoming HTML5 will make it simple thanks to server-sent events and/or WebSockets, which provide a true persistent connection between different machines.
But if you intend to make a webpage compatible with older browsers, then the most common technique is long polling. The client sends a request to the server, and the server does not respond until it has new data. When it does, the response is sent, and immediately after receiving the data the client makes a new request. In practice, however, this requires the server to be well written (for example, it has to maintain thousands of idle requests at the same time) and can become a rather big challenge for developers.
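A bare-bones long-polling loop on the client might look like the following; the /messages endpoint and the appendToChat function are hypothetical, and the server is assumed to hold each request open until it has something new:

```js
// Long polling: open the next request immediately after each response.
function poll() {
  $.ajax({
    url: "/messages",             // hypothetical endpoint that blocks until new data
    timeout: 30000,               // retry if the server holds the request too long
    success: function (messages) {
      appendToChat(messages);     // hypothetical UI update
      poll();                     // immediately ask again
    },
    error: function () {
      setTimeout(poll, 5000);     // back off briefly on errors/timeouts
    }
  });
}
poll();
```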
I hope this helps. :) Good luck!
The technique you should use is real-time, persistent, long-running connections over a web page using WebSockets. You can use this library.
Node.js is becoming quite popular for building something like this, and it supports socket connections, so you could push the data out only when there is a new message. But that would mean learning something completely new.
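For what it's worth, a minimal socket.io sketch of that push model looks roughly like this; the port, event name, and appendToChat function are arbitrary, and the client page also needs the socket.io client script included:

```js
// Server (Node.js): broadcast each incoming chat message to every client.
var io = require("socket.io")(3000);
io.on("connection", function (socket) {
  socket.on("chat message", function (msg) {
    io.emit("chat message", msg);         // push to everyone, no polling needed
  });
});

// Client (page must include the socket.io client script):
var socket = io("http://localhost:3000");
socket.on("chat message", function (msg) {
  appendToChat(msg);                      // hypothetical UI update
});
socket.emit("chat message", "hello");
```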
Another nice option would be to use MVC's OutputCacheAttribute with the SQL dependency option, so your AJAX responses could be cached and only regenerated when a new chat message appears. You would also want your controller to be an asynchronous controller to help reduce the load on IIS.
Enjoy, optimization is always fun and very time consuming!
I am creating a web application using JSP, Struts, EJB and Servlets. The application is a combined CRM and accounting package, so the database is very large. To make execution faster, I want to prevent round trips to the database.
For that purpose, what I want to do is create some temporary XML files on the client machine and use them whenever required. How can I do this, given that JavaScript does not permit it? Is there any way of doing it, or is there another solution I can adopt to make my application faster?
You do not have unfettered access to the client file system to create a temporary file on the client. The browser sandbox prevents this for very good reasons.
What you can do, perhaps, is make some creative use of caching in the browser. jQuery's data method is an example of this. TIBCO General Interface makes extensive use of a browser cache for XML data. Their code is open source and you could take a look to see how they've implemented their browser cache.
If the database is large and you are attempting to store large files, the browser is likely not going to be a great place for that data. If, however, the information you want to store is fairly small, using an in-browser cache may accomplish what you'd like.
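One simple form of such an in-browser cache is a plain object keyed by URL, so repeated requests during the page's lifetime never leave the browser; cachedGet and the callback style here are just illustrative:

```js
// Tiny in-memory cache for AJAX responses, keyed by URL.
var responseCache = {};

function cachedGet(url, callback) {
  if (Object.prototype.hasOwnProperty.call(responseCache, url)) {
    callback(responseCache[url]);   // served from memory, no round trip
    return;
  }
  $.get(url, function (data) {
    responseCache[url] = data;      // remember it for the rest of the session
    callback(data);
  });
}
```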
You should be caching on the web server.
As you've no doubt realised by now, there is a very limited set of things you can do on the client machine from a web app (e.g., write a cookie).
You can make your application use the browser plugin Google Gears, which gives you real client-side storage.
Apart from that, remember there is a large overhead for every single request. If needed, you can easily pack a few hundred kB into one response, but far-away users might only be able to execute a few requests per second. Try to keep the number of requests down, even if it means adding overhead in the form of more data.
@justkt Actually, there is no good reason not to allow a web application to store data. Indeed, the HTML5 specifications include a database similar to the one offered by Google Gears; browser support is just a bit too sporadic to rely on that feature.
If you absolutely want to cache it on the client, you can create the file on your server and have your web app retrieve it. That way the browser will fetch it and keep it in the client cache.
But keep in mind that this could be a pain for the client if the file is large enough.