*edit: I found a bug in my Celery task which leads to this problem, so it's unrelated to the piece of JavaScript code below. I suggest closing this question.
I have this JavaScript function running on a Flask web server, which checks the status of a Celery task. If the task has finished, it logs the result to the console; otherwise it rechecks after 2 seconds.
It runs fine as long as the Celery task finishes in under 6 minutes. However, once it hits the 6-minute mark, the function just stops running. The status of the Celery task is never updated again; even though I am sure the task is still running in the background, I don't receive any result from the JavaScript.
Is it a timeout setting in Flask? Any insights would be appreciated.
function update_progress(status_url) {
    // send GET request to status URL
    $.getJSON(status_url, function(data) {
        if (data['state'] != 'PROGRESS') {
            if ('result' in data) {
                console.log(data['result']);
            }
            else {
                // something unexpected happened
                console.log(data['state']);
            }
        }
        else {
            // rerun in 2 seconds
            setTimeout(function() {
                update_progress(status_url);
            }, 2000);
        }
    });
}
You could try using setInterval instead, assigning it to a variable:
const updateInterval = setInterval(function() {
    update_progress(status_url);
}, 2000);
Then, inside the update_progress function, remove the else branch and add clearInterval(updateInterval):
function update_progress(status_url) {
    // send GET request to status URL
    $.getJSON(status_url, function(data) {
        if (data['state'] == 'PROGRESS') {
            return;
        }
        clearInterval(updateInterval);
        if ('result' in data) {
            console.log(data['result']);
        }
        else {
            // something unexpected happened
            console.log(data['state']);
        }
    });
}
I have an Angular 4 project that uses an HTTP POST to send commands across to our backend. The issue is that sometimes a command can be sent out before the backend is fully up and running. Normally an "ERR_CONNECTION_TIMED_OUT" would occur, but our embedded browser, for whatever reason, holds onto the POST for an extremely long time (5 minutes) before giving us the error. Since a 5-minute wait is unacceptable, I need to come up with a way to re-send our HTTP POST if there isn't a response within roughly 15-30 seconds.
Here is what the current POST looks like:
this._http.post(this.sockclientURL, body, { headers: headers })
    .subscribe((res) => {
        let text = res.text();
        if (text.startsWith("ERROR")) {
            console.log("Sockclient Error.");
            if (this.sockclientErrorRetryCount < this.sockclientErrorRetryLimit) {
                console.log("Retrying in 3 seconds.");
                this.sockclientErrorRetryCount++;
                setTimeout(() => {
                    this.SendCommand(command, success, fail);
                }, 3000);
            }
            return;
        }
        else {
            this.sockclientErrorRetryCount = 0;
        }
        if (text == "N" || text.startsWith("N ")) {
            this._modalService.alert(this._nackLookup.convert(text));
            if (typeof fail == 'function') {
                fail(text);
            }
        }
        else {
            let deserializedCommand = command.deserialize(text);
            success(deserializedCommand);
            let repeatMillis: number = deserializedCommand.getRepeatMillis();
            if (repeatMillis && repeatMillis > 0) {
                setTimeout(() => {
                    this.SendCommand(command, success, fail);
                }, repeatMillis);
            }
        }
    },
    (err) => {
        console.log(err);
        let repeatMillis = 1000;
        setTimeout(() => {
            this.SendCommand(command, success, fail);
        }, repeatMillis);
    });
So to recap: I have some code in place to re-attempt the command if an error occurs, but our embedded browser holds onto its timeout error for several minutes. I need something that attempts a re-send after 15-30 seconds of no response.
As written, retries are executed immediately, without waiting for a delay. A better approach is to wait a bit before retrying and to abort after a given amount of time. Observables let you combine the retryWhen, delay and timeout operators to achieve this, as in the following snippet:
this._http.post(this.sockclientURL, body, { headers: headers })
    .retryWhen(errors => errors.delay(500))
    .timeout(2000, new Error('delay exceeded'))
    .map(res => res.json());
I'm not 100% sure it still works in Angular 4, but you should be able to do:
this._http
    .post(this.sockclientURL, body, { headers: headers })
    .timeout(15000, new Error('timeout exceeded')) // or 30000
    .subscribe((res) => { /* ... */ });
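Outside of RxJS, the same "give up after N seconds and try again" idea can be sketched with plain Promises. This is only an illustrative sketch, not the Angular API: withTimeout and sendWithRetry are hypothetical helpers, and sendRequest stands in for whatever actually issues the POST.

```javascript
// Hypothetical sketch: retry a request when no response arrives in time.
// `sendRequest` is any function returning a Promise (e.g. the real POST).
function withTimeout(promise, ms) {
    // Reject if `promise` does not settle within `ms` milliseconds.
    let timer;
    const deadline = new Promise(function (resolve, reject) {
        timer = setTimeout(function () { reject(new Error('timeout exceeded')); }, ms);
    });
    return Promise.race([promise, deadline]).finally(function () { clearTimeout(timer); });
}

async function sendWithRetry(sendRequest, timeoutMs, retries) {
    for (let attempt = 0; attempt <= retries; attempt++) {
        try {
            // Each attempt gets its own fresh request and its own deadline.
            return await withTimeout(sendRequest(), timeoutMs);
        } catch (err) {
            if (attempt === retries) throw err; // out of attempts: give up
        }
    }
}
```

With timeoutMs set to 15000 or 30000 this matches the 15-30 second window in the question. Note that a hung attempt is abandoned rather than cancelled, which is usually acceptable for idempotent commands.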
A little more information would be needed about the entire scope of the application; however, when I get myself into situations like this I normally look at the following avenues of approach:
Will a try/catch solve my problem? In your catch you could redirect elsewhere; if you don't catch anything, you will often get a bit of lag.
Is there a way to avoid the error altogether through user constraints?
Lastly, you may want to use setInterval() over setTimeout().
Can you provide more information as to the scope of the operation?
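On that last point, the practical difference between the two timers is that setTimeout fires once, while setInterval keeps firing until cleared. A minimal illustration:

```javascript
// setTimeout fires once; setInterval fires repeatedly until cleared.
const ticks = [];
const timer = setInterval(function () {
    ticks.push(Date.now());
    if (ticks.length === 3) clearInterval(timer); // stop after three ticks
}, 10);

setTimeout(function () {
    // by now the interval above has fired three times and stopped
    console.log('interval fired ' + ticks.length + ' times');
}, 100);
```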
In my app I make several nested AJAX calls to the LiquidPlanner API, which limits requests to 30 every 15 seconds. When I hit the limit, I want to set a timeout of sorts to stop sending requests to the API until the 15 seconds have elapsed. This will only ever be used by one person, so multiple clients are not a concern.
Upon hitting the rate limit the response is:
{
    "type": "Error",
    "error": "Throttled",
    "message": "34 requests, exceeds limit of 30 in 15 seconds. Try again in 7 seconds, or contact support#liquidplanner.com"
}
Here is some code, simplified for brevity:
$.getJSON('/dashboard/tasks/123', function(tasks) {
    $.each(tasks, function(t, task) {
        $.getJSON('/dashboard/project/987', function(project) {
            $.getJSON('/dashboard/checklist-items/382983', function(checklist_items) {
                // form some html here
            });
        });
    });
});
So at any point in this process I could hit the limit and need to wait until the timeout has completed.
I am also open to suggestions to better form the requests instead of nesting them.
Another solution that probably prevents hammering better is a queue. Be aware, however, that the order of requests could differ significantly with this method, and that only one request will ever run at a time (so total response time may increase significantly, depending on the use case).
//Keep track of queue
var queue = [];
//Keep track of last failed request
var failed_request = false;

function do_request(url, callback) {
    //Just add to queue
    queue.push({
        url: url,
        callback: callback
    });
    //If the queue was empty send it off
    if (queue.length === 1) attempt_fetch();
}

function attempt_fetch() {
    //If nothing to do just return
    if (queue.length === 0 && failed_request === false) return;
    //Get the url and callback from the failed request if any,
    var parms;
    if (failed_request !== false) {
        parms = failed_request;
    } else {
        //otherwise from the first queue element
        parms = queue.shift();
    }
    //Do request
    $.getJSON(parms.url, function(response) {
        //Detect throttling
        if (response.type === 'Error' && response.error === 'Throttled') {
            //Store the request
            failed_request = parms;
            //Call self in 15 seconds
            setTimeout(function() {
                attempt_fetch();
            }, 15000);
        } else {
            //Request went fine, let the next call pick from the queue
            failed_request = false;
            //Do your stuff
            parms.callback(response);
            //And send the next request
            attempt_fetch();
        }
    });
}
...your logic still remains largely unchanged:
do_request('/dashboard/tasks/123', function(tasks) {
    $.each(tasks, function(t, task) {
        do_request('/dashboard/project/987', function(project) {
            do_request('/dashboard/checklist-items/382983', function(checklist_items) {
                // form some html here
            });
        });
    });
});
Disclaimer: Still completely untested.
As far as design patterns for chaining multiple requests go, take a look at the chaining section of the following article: http://davidwalsh.name/write-javascript-promises . Basically, you could create a service that exposes a method for each type of request, with each method returning a promise, and then chain them together as needed.
As for your question about setting a timeout: given the information provided it is a bit difficult to advise you, but if that is absolutely all we have, I would create a request queue (a simple array that lets you push new requests onto the tail and pop them from the head). I would then execute the known requests in order and inspect each response. If the response is a throttling error, set a flag that the request executor honors; if it is successful, either queue additional requests or create the HTML output. This is probably not a great design, but it is all I can offer given the information provided.
Write a wrapper that will detect the rate-limited response:
//Keep track of state
var is_throttled = false;

function my_wrapper(url, callback) {
    //No need to try right now if already throttled
    if (is_throttled) {
        //Just call self in 15 seconds time
        setTimeout(function() {
            my_wrapper(url, callback);
        }, 15000);
        return;
    }
    //Get your stuff
    $.getJSON(url, function(response) {
        //Detect throttling
        if (response.type === 'Error' && response.error === 'Throttled') {
            /**
             * Let "others" know that we are throttled - the each-loop
             * (probably) makes this necessary, as it may send off
             * multiple requests at once... If there's more than a couple
             * you will probably need to find a way to also delay those,
             * otherwise you'll be hammering the server before realizing
             * that you are being limited
             */
            is_throttled = true;
            //Call self in 15 seconds
            setTimeout(function() {
                //Throttling is (hopefully) over now
                is_throttled = false;
                my_wrapper(url, callback);
            }, 15000);
        } else {
            //If not throttled, just call the callback with the data we have
            callback(response);
        }
    });
}
Then you should be able to rewrite your logic to:
my_wrapper('/dashboard/tasks/123', function(tasks) {
    $.each(tasks, function(t, task) {
        my_wrapper('/dashboard/project/987', function(project) {
            my_wrapper('/dashboard/checklist-items/382983', function(checklist_items) {
                // form some html here
            });
        });
    });
});
Disclaimer: Totally untested - my main concern is the scope of the url and callback... But it's probably easier for you to test.
I am developing a web interface for an Arduino, using Python. For automatic updates and display, I use JSON. I have run into a very interesting problem.
The following code sends the command to a Python function, if a command exists. Then, whether or not a command was sent to the function, the function checks for updates from the Arduino by calling another function.
Here is what I can't find any explanation for: in the first and only condition of the update() function, if I remove the line that says alert('hey');, the Python function is no longer called. But if I do write alert('hey'); after the JSON request, it works fine: the function is called and the Arduino gets the message.
Does anyone have an idea why?
function update(command=0) {
    // if a command is passed, send it
    if (command != 0) {
        $.getJSON('/action?command=' + command);
        alert('hey'); // if I remove this, the action function is not called. Why?
    }
    // read from the read function, no matter what
    $.getJSON('/read', {}, function(data) {
        if (data.state != 'failure' && data.content != '') {
            $('.notice').text(data.content);
            $('.notice').hide().fadeIn('slow');
            setTimeout(function () { $('.notice').fadeOut(1000); }, 1500);
        }
        setTimeout(update, 5000); // next update in 5 secs
    });
}
update(); // for the first call on page load
Are you checking for the results of the first command with the second? If so, I suspect alert('hey') pauses execution long enough for the first request to finish. Can you try making your read a callback of the first getJSON?
function update(command=0) {
    if (command != 0) {
        $.getJSON('/action?command=' + command, function() {
            read();
        });
    } else {
        read();
    }
}

function read() {
    $.getJSON('/read', {}, function(data) {
        if (data.state != 'failure' && data.content != '') {
            $('.notice').text(data.content);
            $('.notice').hide().fadeIn('slow');
            setTimeout(function () { $('.notice').fadeOut(1000); }, 1500);
        }
        setTimeout(update, 5000); // next update in 5 secs
    });
}
update(); // for the first call on page load
Here's a fiddle
Hi all, I have a getJSON call and am wondering how I can check its response (siteContents): if it is empty, or if it doesn't contain a required string (for example seasonEpisode=), then I want to call getJSON again. Can we call getJSON itself from within its own callback? My goal is to get a correct response from getJSON. Hope you guys can help. Thanks!
$.getJSON('http://www.mysite.com/doit.php?value=55', function(data) {
    //$('#output').html(data.contents);
    var siteContents = data.contents;
});
Try this:
var handler = function(data) {
    //$('#output').html(data.contents);
    var siteContents = data.contents;
    if (!siteContents) {
        $.getJSON('http://...', handler);
        return;
    }
    // handle siteContents
};
$.getJSON('http://...', handler);
edit: the above would spam the server with repeated attempts if siteContents stays empty, creating an infinite loop and high load. I would suggest two improvements:
1) Count how many consecutive empty-siteContents loops you have made, and cancel the loop with an error message (if appropriate) after some failure threshold (e.g. 20 attempts).
2) Do the iteration with setTimeout(function() { $.getJSON(...) }, delay), where delay is some number of milliseconds to wait between retries.
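Putting both improvements together, here is a minimal sketch (fetchUntilReady is a made-up name, and fetchJson stands in for $.getJSON, assumed here to return a Promise of the parsed response):

```javascript
// Sketch: retry a JSON fetch until it returns usable contents, with a
// capped number of attempts and a delay between retries.
function fetchUntilReady(fetchJson, url, maxAttempts, delayMs, onDone, onGiveUp) {
    let attempts = 0;
    function attempt() {
        attempts++;
        fetchJson(url).then(function (data) {
            if (data && data.contents) {
                onDone(data.contents); // got a usable response
            } else if (attempts < maxAttempts) {
                setTimeout(attempt, delayMs); // empty response: retry after a pause
            } else {
                onGiveUp(attempts); // failure threshold reached
            }
        });
    }
    attempt();
}
```

With maxAttempts = 20 and a delayMs of a few thousand milliseconds this implements both suggestions above without hammering the server.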
It sounds like the better question is why your server doesn't return the 'correct' response on the first try, or, as NuclearGhost points out, why it returns different responses for the same request.
But accomplishing what you're asking for requires recursion. You can't just do it in a loop, because the response is asynchronous. If you name the function, though, you can call it from the success handler, something like this:
function getJSONRecursive(maxRetries, count) {
    if (!count) count = 1;
    if (count > maxRetries) {
        alert('Giving up after ' + count + ' retries.');
        return;
    }
    $.getJSON('http://www.mysite.com/doit.php?', function(data) {
        if (!data || !data.contents || data.contents.indexOf('seasonEpisode') == -1) {
            getJSONRecursive(maxRetries, count + 1);
        } else {
            $('#output').html(data.contents);
        }
    });
}
Then invoke it like this:
getJSONRecursive(5);
I wouldn't recommend doing this without the count, because otherwise the requests will repeat forever if the correct response never comes back. If the situation you're avoiding is a server timeout or overload problem, then I would recommend putting the recursive call in a timeout, like so:
if (!data || !data.contents || data.contents.indexOf('seasonEpisode') == -1) {
    setTimeout(function() { getJSONRecursive(maxRetries, count + 1); }, 5000);
}
// etc.
This waits an extra 5 seconds between calls, and the extra time will help ensure your server doesn't get slammed and your getJSON calls don't burn through their retries too quickly.
I'm a novice-to-intermediate JavaScript/jQuery programmer, so concrete/executable examples would be very much appreciated.
My project requires using AJAX to poll a URL that returns JSON containing either content to be added to the DOM, or a message { "status" : "pending" } that indicates that the backend is still working on generating a JSON response with the content. The idea is that the first request to the URL triggers the backend to start building a JSON response (which is then cached), and subsequent calls check to see if this JSON is ready (in which case it's provided).
In my script, I need to poll this URL at 15-second intervals for up to 1 minute 30 seconds, and do the following:
If the AJAX request results in an error, terminate the script.
If the AJAX request results in success, and the JSON content contains { "status" : "pending" }, continue polling.
If the AJAX request results in success, and the JSON content contains usable content (i.e. any valid response other than { "status" : "pending" }), then display that content, stop polling and terminate the script.
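Those three rules could be sketched as a small poller. This is only an illustrative sketch, not a finished implementation: poll, requestJson, onContent and onError are hypothetical names, and requestJson stands in for the AJAX call, assumed to return a Promise.

```javascript
// Sketch of the polling rules above: poll every `intervalMs`, terminate on
// request error, deliver content on any response other than
// { status: 'pending' }, and give up after `maxPolls` attempts.
function poll(requestJson, intervalMs, maxPolls, onContent, onError) {
    let polls = 0;
    function tick() {
        polls++;
        requestJson().then(function (data) {
            if (data.status === 'pending') {
                if (polls < maxPolls) setTimeout(tick, intervalMs); // keep polling
                else onError(new Error('gave up after ' + polls + ' polls'));
            } else {
                onContent(data); // usable content: stop polling
            }
        }, onError); // request error: terminate
    }
    tick();
}
```

With intervalMs = 15000 and maxPolls = 6 this matches the 15-second/1:30 schedule described above.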
I've tried a few approaches with limited success, but I get the sense that they're all messier than they need to be. Here's a skeletal function I've used with success to make a single AJAX request at a time, which does its job if I get usable content from the JSON response:
// make the AJAX request
function ajax_request() {
    $.ajax({
        url: JSON_URL, // JSON_URL is a global variable
        dataType: 'json',
        error: function(xhr_data) {
            // terminate the script
        },
        success: function(xhr_data) {
            if (xhr_data.status == 'pending') {
                // continue polling
            } else {
                success(xhr_data);
            }
        },
        contentType: 'application/json'
    });
}
However, this function currently does nothing unless it receives a valid JSON response containing usable content.
I'm not sure what to do on the lines that are just comments. I suspect that another function should handle the polling, and call ajax_request() as needed, but I don't know the most elegant way for ajax_request() to communicate its results back to the polling function so that it can respond appropriately.
Any help is very much appreciated! Please let me know if I can provide any more information. Thanks!
You could use a simple timeout to recursively call ajax_request.
success: function(xhr_data) {
    console.log(xhr_data);
    if (xhr_data.status == 'pending') {
        setTimeout(function() { ajax_request(); }, 15000); // wait 15 seconds, then call ajax_request again
    } else {
        success(xhr_data);
    }
}
Stick a counter check around that line and you've got a max number of polls.
if (xhr_data.status == 'pending') {
    if (cnt < 6) {
        cnt++;
        setTimeout(function() { ajax_request(); }, 15000); // wait 15 seconds, then call ajax_request again
    }
}
You don't need to do anything in your error function unless you want to put up an alert or something. The simple fact that it errored will prevent the success function from being called and possibly triggering another poll.
Thank you very much for the function. It is a little bit buggy, but here is the fix: roosteronacid's answer doesn't stop after reaching 100%, because of incorrect usage of the clearInterval function.
Here is a working function:
$(function ()
{
    var statusElement = $("#status");
    // this function will run every 1000 ms until stopped with clearInterval()
    var i = setInterval(function ()
    {
        $.ajax(
        {
            success: function (json)
            {
                // progress from 1-100
                statusElement.text(json.progress + "%");
                // when the worker process is done (reached 100%), stop execution
                if (json.progress == 100) clearInterval(i);
            },
            error: function ()
            {
                // on error, stop execution
                clearInterval(i);
            }
        });
    }, 1000);
});
The clearInterval() function takes the interval ID as its parameter, and then everything works fine ;-)
Cheers
Nik
Off the top of my head:
$(function ()
{
    // cache a reference to speed up querying for the status element
    var statusElement = $("#status");
    // this function will run every 1000 ms until stopped with clearInterval()
    var i = setInterval(function ()
    {
        $.ajax(
        {
            success: function (json)
            {
                // progress from 1-100
                statusElement.text(json.progress + "%");
                // when the worker process is done (reached 100%), stop execution
                if (json.progress == 100) i.clearInterval();
            },
            error: function ()
            {
                // on error, stop execution
                i.clearInterval();
            }
        });
    }, 1000);
});
You can use JavaScript's setInterval function to reload the contents every 3 seconds.
var auto = $('#content'), refreshed_content;
refreshed_content = setInterval(function() {
    auto.fadeOut('slow').load("result.php").fadeIn("slow");
}, 3000);
For your reference: Auto refresh div content every 3 sec