I have an ASP.NET MVC application running on IIS, with two handlers: /DoWork (let's say it takes 10 minutes) and /ReportStatus (let's say it takes <1 s). DoWork does the work, while ReportStatus returns the progress of the work.
I wanted to run /DoWork asynchronously by firing a $.ajax request at it from JavaScript, and then monitor its progress by repeatedly querying /ReportStatus, also through an asynchronous $.ajax call wrapped in a function registered with window.setInterval. However, what I am seeing is that the long-running $.ajax on /DoWork blocks all the other queries on /ReportStatus until /DoWork finishes.
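In outline, the client side looks something like this (a simplified sketch of the pattern just described; the endpoint names are the ones above):

// Kick off the long-running job; this request stays open for ~10 minutes.
$.ajax({ url: "/DoWork" });

// Meanwhile, poll the status endpoint every few seconds.
var timer = window.setInterval(function () {
    $.ajax({ url: "/ReportStatus" }).done(function (progress) {
        console.log("progress:", progress);
        // clearInterval(timer) once the work reports completion
    });
}, 3000);

Yet while /DoWork is in flight, none of the /ReportStatus requests come back.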
How do I circumvent this? My guess is that it has to do with an IIS server setting, possibly one denying two active requests from the same host. Any ideas?
My first idea is to have /DoWork run the actual work on a background thread and return immediately. However, I would like to know if there are better options, as I want to keep the connection open for the duration of the /DoWork run.
Ended up using WebSockets as suggested by colecmc. Since I do not want to rely on Windows 8+ / Windows Server 2012+, I chose a lightweight implementation called Fleck (available on NuGet).
http://jxs.me/2011/05/28/csharp-websockets-with-fleck/
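For reference, the browser side of such a setup needs nothing beyond the native WebSocket API. A minimal sketch, assuming the Fleck server listens on port 8181 (the port used in Fleck's samples; adjust to your setup):

var socket = new WebSocket("ws://localhost:8181");

socket.onopen = function () {
    socket.send("start");                   // ask the server to begin the work
};

socket.onmessage = function (event) {
    console.log("progress:", event.data);   // server pushes progress updates
};

socket.onclose = function () {
    console.log("work finished or connection lost");
};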
Related
I am working on a CRM we inherited. Long story short: there is a button that calls a PHP script which should run in the background; we don't need to wait for the response.
request(idata + 'transferInTimeExec.php', {
    sync: false,
    preventCache: true,
    method: 'GET'
});
Now, transferInTimeExec.php takes an hour to run; it's a very complex script that deals with weekly timesheets for a recruitment company, processes them, does a lot of DB operations, etc.
I'm using Chrome. Every time I press the button to run it, it blocks all the XHR calls until it finishes. The CRM is "AJAX heavy", and while the script is running the user can't do anything; if they navigate to another subpage, no XHR requests will resolve until the process we started has finished. Even when I open a new browser tab and try to do something, it won't work. But if I open the CRM in another browser (Firefox) while the script is running, I can use the CRM.
In the Network tab, the first request is pending, and all the subsequent calls to a different AJAX endpoint wait (all have sync: false).
I even replaced the whole logic with a PHP sleep(30) to make it do nothing for 30 seconds before returning anything: same issue.
I tried plain-JavaScript XHR in an onclick on the button's HTML markup, rather than the Dojo methods: same issue.
I've done a brutal search/replace on the whole project, replacing sync: true with sync: false. Nothing changed.
I have run out of ideas; maybe someone here can help figure this out? Is there a global switch for sync/async? What else could it be, if not an AJAX issue?
Your script transferInTimeExec.php is probably using the session. When that's the case, other AJAX calls will not start; instead they wait for this AJAX call to finish, so as not to overwrite the session data. Setting the AJAX call to asynchronous does not change this behavior.
If the script takes an hour to run, it is a bad idea to call it via AJAX from the UI. You should set up a cron job, hourly or daily, and perform all the operations in the backend. Of course, you will need to make some adjustments to the script if it uses the session.
EDIT
You could call session_write_close(); at the top of your script. This tells PHP that the script will not write anything to the session, so other AJAX calls are free to use it. However, be careful not to write anything to the session after this, as those changes will not be saved.
I'm developing a web application and use jQuery to make asynchronous HTTP requests to my API. I have a detail view where you can see a lot of information about a specific object stored in the database. Because there is a lot of information, and the data is linked to other objects, I make several calls to my API to gather the information for my views.
In the detail view I have widgets of a sort that show the requested information. For these, I make about 5-7 HTTP GET requests to my API. In the debugger (both Safari and Firefox) I can see that some requests block other requests, and the page takes a long time until everything is loaded and shown to the user.
I make a request like this:
$.get("api/api.php?object=myobject&endpoint=someendpoint", function(data) {
// data is JSON formatted
$("#my-widget input").val(data["name"]);
});
And another one e.g. like this:
$.get("api/api.php?object=anotherobject&endpoint=anotherendpoint", function(data) {
// data is JSON formatted
$("#other-widget input").val(data["somekey"]);
});
If the first request takes a little longer to finish, it blocks the second request until the first one's callback has finished. But why? I thought these calls were asynchronous and non-blocking.
I want to build a fast web application for a company where the requests are only made inside the local network, so a request should take only about 10-50 ms (or even less). But the page takes about 10 seconds to show up with all the information.
Am I doing something wrong? Or is there a JavaScript framework that can be used for exactly this problem? Any help is appreciated!
EDIT: In the network panel, the requests have to wait some seconds before they are even fired, and once a request is fired, it takes a few more seconds until the response comes back.
If I call the URL directly in my browser or do a GET request using curl, it is a lot faster.
EDIT 2: Thanks @CBroe! The session file write lock was the problem. As long as the session file is locked, no other script that uses the same session can run until the previous script finishes. I just called session_write_close() immediately after session_start(), and everything runs a lot faster now.
Attention: use session_write_close() only if you don't need to write to the $_SESSION array. Reading is still possible after that, but not writing. (See this topic for further details: https://stackoverflow.com/a/50368260/1427878)
I have a main view function for my application. After logging in successfully, this main view is called and is expected to render the template.
But I have to perform some calculations in this view first [I check certain conditions about the user by making a Facebook Graph API request], so it takes 2-4 seconds to load.
How do I show a loading screen in the meantime, given that the template is rendered by the return statement and thus only appears when the processing is complete?
Should I make two views, one for showing the loading screen and one for the calculation, and keep making AJAX requests to the second view to check whether the process is complete?
You should indeed make two views: one that only returns the page showing the loading UI, and one that performs the long task.
The second view will be called using an AJAX request made from the "loading" page. The response from the AJAX request will notify your "loading" page that it is time to move on.
You need to make sure the AJAX request's duration won't exceed the timeout of your server (with ~10 seconds, you should be fine).
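A minimal sketch of the loading page's script under that scheme (the URLs are made up for illustration):

// Fire the long-running view from the loading page; its response
// arriving is the signal that the work is done.
$.get("/long-task/", function () {
    window.location = "/main/";   // hypothetical destination once the task is done
});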
You need to run your Graph API requests in a task executed asynchronously, allowing you to return an HttpResponse without waiting for the task to finish.
Celery will allow you to do just that.
You then need a way to notify your client that the asynchronous task has finished.
I see two ways to do that:
Making AJAX requests at regular intervals to a view that will check if the task is finished.
Using WebSockets.
The first approach is the simplest, but has the drawback of making a lot of useless requests and of being less responsive.
Using WebSockets, on the other hand, will require more configuration, as an external app is required (e.g. django-socketio or swampdragon).
If it is the only place where you need notifications from server to client, using WebSockets seems to be overkill.
I'm working on a simple chat implementation: a function makes an AJAX call that, on success, invokes setTimeout to call itself again, running every 30 seconds or so. This works fine, but I'd like more immediate notification when a message arrives. I'm seeing a lot of examples of long polling with jQuery code that looks something like this:
function poll() {
    $.ajax({
        data: { "foo": "bar" },
        url: "webservice.do",
        success: function(msg) {
            doSomething(msg);
        },
        complete: poll
    });
}
I understand how this works, but it will just keep sending requests to the server immediately, one after another. It seems to me there needs to be some logic on the server that holds the request until something has changed; otherwise a response is sent back right away, even if there is nothing new to report. Is this handled purely in JavaScript, or am I missing something that has to be implemented server-side? If it is handled on the server, is pausing server execution really a good idea? In your experience, what is a better way of handling this? Is my setTimeout() method sufficient, maybe just with a smaller timeout?
I know about WebSockets, but as they are not widely supported yet, I'd like to stick to current-gen techniques.
Do not pause the server execution... it will drain server resources if a lot of people try to chat...
Use the client side to manage the pause, as you did with setTimeout, but with a lower delay.
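For instance, reusing the parameters from the question (the 5-second delay is arbitrary):

// Chained setTimeout: the next poll is scheduled only after the previous
// response has arrived, so requests never pile up on a slow server.
function poll() {
    $.ajax({
        data: { "foo": "bar" },
        url: "webservice.do",
        success: function(msg) {
            doSomething(msg);
        },
        complete: function() {
            setTimeout(poll, 5000);   // arbitrary pause before re-polling
        }
    });
}
poll();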
You missed the "long" part of "long polling". It is incumbent on the server not to return until there's something interesting to say. See this article for more discussion.
You've identified the trade-off: holding connections open on the web server consumes HTTP connections (i.e. the response must block server-side), versus frequent "is there anything new?" requests consuming bandwidth. WebSockets may be an option if your browser base can support them (most "modern" browsers: http://caniuse.com/websockets).
There is no proper way to handle this purely on the JavaScript side with traditional AJAX polling; you will always have a lag at one end or the other if you try to throttle the number of requests being made. Take a look at a Node.js-based solution, or perhaps the Ajax Push Engine (www.ape-project.org), which is PHP based.
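To make the "don't return until there's something to say" idea concrete, here is a rough long-polling sketch in Node.js (mentioned above as one option; the URLs, port, and timeout are made up for illustration):

// Parked responses wait here until a message arrives or 25 s pass.
var http = require("http");
var waiting = [];

function publish(message) {
    waiting.forEach(function (res) {
        res.end(JSON.stringify({ msg: message }));
    });
    waiting = [];
}

http.createServer(function (req, res) {
    if (req.url === "/poll") {
        waiting.push(res);                    // hold the response open
        setTimeout(function () {              // time out after 25 s so the
            var i = waiting.indexOf(res);     // client can simply reconnect
            if (i !== -1) {
                waiting.splice(i, 1);
                res.end(JSON.stringify({ msg: null }));
            }
        }, 25000);
    } else {
        publish("something new");             // in a real chat: the posted message
        res.end("ok");
    }
}).listen(8080);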
I want to slow down my app by adding latency to AJAX requests. I have two options: doing it in JavaScript or doing it server-side. With JavaScript I could easily add a setTimeout around each of my requests, but there are about 30 different requests, and I'm wondering if there's a better way, with less code.
I want to slow down the AJAX requests server-side. What's the best way to do it? I'm using about 25 different .asmx web services (soon to be converted to WCF), and I'm wondering how to give all requests 1000 ms of latency.
My goal is to change as little code as possible, so that I can turn this feature on and off easily.
Thanks for your suggestions.
In case you're wondering why: I'm running on my local machine, and I'm about to do a user-testing session for which I need to simulate real AJAX requests. Without added latency, the AJAX requests complete almost instantaneously.
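For the JavaScript option, one way to avoid touching all 30 call sites is a single global wrapper around $.ajax. A rough sketch, with the caveat that the wrapped call returns a plain promise rather than the full jqXHR (so things like .abort() are lost):

(function ($) {
    var LATENCY_MS = 1000;        // set to 0 to switch the feature off
    var realAjax = $.ajax;

    $.ajax = function () {
        var args = arguments;
        var deferred = $.Deferred();
        setTimeout(function () {  // delay every request, then delegate
            realAjax.apply($, args).then(deferred.resolve, deferred.reject);
        }, LATENCY_MS);
        return deferred.promise();
    };
}(jQuery));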
You could add a
System.Threading.Thread.Sleep(1000)
in the Application_BeginRequest handler (in Global.asax), or wherever else you can intercept the request before doing the actual work.
You can hook a delay into the server-side code before it responds to the AJAX request. At least then your AJAX code's interaction with the latent response is authentic.
If you are using jQuery for the AJAX calls, you could go into the jQuery source file and add the latency there. Would that work?
Look for this line in the main jquery file:
ajax: function( url, options ) {
Add this code right after:
var ms = 1000;                 // wait time in milliseconds
ms += new Date().getTime();    // timestamp to busy-wait until
while (new Date() < ms) { }    // note: this blocks the whole UI thread for the duration
I would put an HTTP proxy in the path, one that you control and can make slow.
Since I know Perl best, I'd use something like http://metacpan.org/pod/HTTP::Proxy and add a filter method that does nothing but wait a second.
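The same idea sketched with Node's built-in http module, if Perl isn't handy (the ports are made up; point the browser at the proxy port instead of the app):

var http = require("http");

http.createServer(function (req, res) {
    setTimeout(function () {                  // the added one-second latency
        var upstream = http.request({
            host: "localhost",
            port: 8080,                       // the real application
            path: req.url,
            method: req.method,
            headers: req.headers
        }, function (proxied) {
            res.writeHead(proxied.statusCode, proxied.headers);
            proxied.pipe(res);
        });
        req.pipe(upstream);
    }, 1000);
}).listen(8081);                              // browse via this port instead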