I've written a few lines of code to tackle the following problem:
Get the TwitchTV UserID based on the username. The usernames are stored in an array. After hitting a button, a GET AJAX request is made for each username to fetch its UserID and push it into another array.
My problem is that if I hit the button again, the user IDs end up in the wrong order.
If I set async: false, it works.
Is this problem caused by the asynchronous nature of AJAX? What would be the approach for a valid fix? Should I use a callback? A hint in the right direction would be appreciated.
The comments in the code are for testing.
Code:
<script>
var users = ["ESL_SC2", "OgamingSC2", "cretetion", "freecodecamp", "storbeck", "habathcx", "RobotCaleb", "spicyjewlive"];
clientID = "?client_id=XXX";
userID = [];
userStatus = [];
for (var i = 0; i < users.length; i++) {
    idCall(users[i]);
}
function idCall(data) {
    $.ajax({
        type: "GET",
        url: "https://api.twitch.tv/kraken/users/" + data + clientID,
        async: false,
        cache: false,
        dataType: "jsonp",
        success: function (data) {
            console.log(data._id);
        },
        error: function (data) {
            console.log("error");
        }
    });
}
</script>
Use an array of request promises and update the DOM after all of them have resolved. The results will be in the same order as the original users array:
var requestPromises = users.map(idCall);

$.when.apply(null, requestPromises).then(function () {
    var dataArray = [].slice.call(arguments);
    // now loop over dataArray, which is in the same order as users
    dataArray.forEach(function (userData) {
        // do something with each userData object
    });
}).fail(function () {
    console.log("Something went wrong in at least one request");
});

function idCall(user) {
    // return the ajax promise
    return $.ajax({
        url: "https://api.twitch.tv/kraken/users/" + user + clientID,
        dataType: "jsonp"
    }).then(function (data) {
        // return the data resolved by the promise
        return data;
    });
}
What would be the approach for a valid fix?
I think #charlieftl's answer is a good one. I don't think I can really improve on it. I've mainly added this to try and explain your difficulties here. As I mentioned in a previous question you posted, you need to read and understand How do I return the response from an asynchronous call?.
If you had control of the server side, a simpler option would be to send the array to the server in order and use that to maintain the order, but you're using a third-party API, so this isn't really practical.
Another option would be to use a callback method:
var users = ["ESL_SC2", "OgamingSC2", "cretetion", "freecodecamp", "storbeck", "habathcx", "RobotCaleb", "spicyjewlive"];
var clientID = "?client_id=XXX";
var userID = [];
var userStatus = [];

idCall(users);

function idCall(users, x = 0) {
    if (x < users.length) {
        var data = users[x];
        $.ajax({
            type: "GET",
            url: "https://api.twitch.tv/kraken/users/" + data + clientID,
            cache: false,
            dataType: "jsonp",
            success: function (data) {
                console.log(data._id);
            },
            error: function (data) {
                console.log("error");
            }
        }).then(function () {
            // request the next user only after this one has completed;
            // note x + 1, not x++, which would pass the same index again
            idCall(users, x + 1);
        });
    }
}
though, like I said, charlie's answer is better. This will guarantee the order and won't lock the UI, but it's slow.
Is this problem because of Asynchronous AJAX?
Yes, or more precisely: asynchronous processing on the server plus response ordering.
When you first run this, open up the network panel of your debugging tool and watch the HTTP requests. What you'll see is a load of requests getting sent off at the same time but returning in different orders. You cannot guarantee the response order of an async request; it depends on server response time, load, etc.
Example:
These requests were sent in order A, B, C, D, E, F, G, but the responses were received in the order B, C, E, F, G, D, A. This is the nub of your problem.
Setting async to false does this:
Now no request is sent until the previous one has returned. So everything is sent A-G and returns A-G, but this is A LOT slower than the previous example (the graphs here aren't to scale; I struggled to capture a good async: false example, as I never use it, see the reasons below).
Never do this (pretty much ever). What will happen is that the UI thread becomes locked (JavaScript/browsers are mostly single-threaded) while waiting for the external call, and your web site will become unresponsive for long periods of time. Locking this thread doesn't just stop JavaScript, it stops the entire site: links don't work, hover-overs, everything, until the async call returns the thread to the UI.
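The scrambled-response behaviour is easy to reproduce without any network at all. The sketch below (plain JavaScript, no jQuery; the latencies object is a made-up stand-in for server response times) records the order the "responses" arrive in, and shows that Promise.all still hands back results in dispatch order:

```javascript
// Simulate dispatching several "requests" at once with fixed latencies,
// recording the order in which their "responses" arrive.
function simulateResponses(latencies) {
  var completionOrder = [];
  var requests = Object.keys(latencies).map(function (name) {
    return new Promise(function (resolve) {
      setTimeout(function () {
        completionOrder.push(name); // arrival order, not dispatch order
        resolve(name);
      }, latencies[name]);
    });
  });
  // Promise.all resolves with results in dispatch order, regardless of
  // which "response" came back first.
  return Promise.all(requests).then(function (dispatchOrder) {
    return { dispatchOrder: dispatchOrder, completionOrder: completionOrder };
  });
}
```

With latencies { A: 30, B: 10, C: 20 }, completionOrder comes back B, C, A while dispatchOrder stays A, B, C: exactly the mismatch you see in the network panel.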
Yes, it is because of the AJAX calls. One AJAX call might finish before another, hence the different order.
An obvious solution is to use jQuery promises, but another solution could be to sort your array after your AJAX calls have completed.
You can sort your array according to the username string like this:
data.sort();
If you have an array of objects, which you don't in the example code, you can sort it like this:
data.sort(function (a, b) { return (a.Name > b.Name) ? 1 : ((b.Name > a.Name) ? -1 : 0); } );
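The same object comparator can be written more directly with localeCompare, which already returns the negative/zero/positive number that sort expects (a minimal sketch; the Name property and sample data are assumptions matching the snippet above):

```javascript
// Hypothetical sample data with the assumed Name property.
var data = [{ Name: "storbeck" }, { Name: "cretetion" }, { Name: "habathcx" }];

// Sort objects alphabetically by their Name property.
data.sort(function (a, b) { return a.Name.localeCompare(b.Name); });
// data is now ordered: cretetion, habathcx, storbeck
```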
After going through the comments, I thought I'd give some more code to show a better idea of how to solve this.
<script>
var users = ["ESL_SC2", "OgamingSC2", "cretetion", "freecodecamp", "storbeck", "habathcx", "RobotCaleb", "spicyjewlive"];
var clientID = "?client_id=XXX";
var userID = [];
var userStatus = [];
var count = 0;

for (var i = 0; i < users.length; i++) {
    idCall(users[i]);
}

function idCall(data) {
    $.ajax({
        type: "GET",
        url: "https://api.twitch.tv/kraken/users/" + data + clientID,
        cache: false,
        dataType: "jsonp",
        success: function (data) {
            console.log(data._id);
            userID.push(data._id);
            count++;
            // sort once every request has completed
            if (count === users.length) {
                userID.sort();
            }
        },
        error: function (data) {
            console.log("error");
        }
    });
}
</script>
Related
I'm using an API, 'Have I been pwned?', which is rate limited: "limited to one per every 1500 milliseconds". I have looked at quite a few other questions on here, researched via Google and a couple of other forum sites, and tried myself to find a solution to this one.
Does the JavaScript function setInterval() really work for this kind of issue? Has anyone found a solution that effectively works? I'm kinda at my wit's end with this one, as
var url = "https://haveibeenpwned.com/api/v2/breachedaccount/";
var breach = Array();
setInterval($.ajax({
url: url,
type: 'GET',
dataType: 'JSON',
success: function(data) {
breach[] = data;
}), 15000);
Does not seem to work, especially since my current project stores the information for multiple email addresses. So, for example, if I store 4 email addresses in an array, I want to loop through them but wait 1500 ms before hitting the API again to query the next email address.
Any ideas, anyone? Or is there a NodeJS solution that might work, as I've been learning that recently too?
Thank you in advance
I would create a queue with requests and process them one by one.
function myRequestQueue()
{
    this.queue = [];
    this.timer = null;

    this.add = function (params)
    {
        this.queue.push(params);
    }

    this.processQueue = function ()
    {
        if (this.queue.length > 0)
        {
            $.ajax(this.queue.shift());
        }
    }

    // bind so that "this" still refers to the queue when the timer fires
    this.timer = setInterval(this.processQueue.bind(this), 1500);
}
Usage:
var myQ = new myRequestQueue(), breach = [];
for(var i=0;i<100;i++){
myQ.add({
url: url,
dataType: 'JSON',
success: function(data) { breach.push(data) }});
} // 100 requests
// expect them to be processed one by one every 1500 ms
You may want to add a callback for when the queue is emptied, or something similar, depending on your exact use case.
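If you'd rather avoid a timer that ticks forever, the same one-request-per-1500 ms behaviour can be sketched as a promise chain (plain JavaScript; handler stands in for the $.ajax call and delayMs for the 1500 ms gap, both assumptions for illustration):

```javascript
// Run handler(item) for each item in order, waiting delayMs after each
// call before starting the next one. Resolves when the chain finishes.
function processSequentially(items, handler, delayMs) {
  return items.reduce(function (chain, item) {
    return chain
      .then(function () { return handler(item); })
      .then(function () {
        // insert the rate-limit gap between requests
        return new Promise(function (resolve) {
          setTimeout(resolve, delayMs);
        });
      });
  }, Promise.resolve());
}
```

processSequentially(emails, checkEmail, 1500) would space the 'Have I been pwned?' lookups 1500 ms apart and resolve once the last one finishes.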
I want to get some data about places using the Google Places API.
Thing is, I want to get data from more than 1000 records, per city of the region I'm looking for.
I'm searching for pizzeria, and I want all the pizzerias in the region I've defined. So I have an array like this:
['Pizzeria+Paris','Pizzeria+Marseille','Pizzeria+Nice','Pizzeria+Toulouse']
My objective is to make a single request, then wait 3 seconds (or more), then process the second request. I'm using the Lodash library to help me iterate.
Here is my code:
function formatDetails(artisan){
var latitude = artisan.geometry.location.lat;
var longitude = artisan.geometry.location.lng;
var icon = artisan.icon;
var id = artisan.id;
var name = artisan.name;
var place_id = artisan.place_id;
var reference = artisan.reference;
var types = artisan.types.toString();
$('#details').append('<tr>'+
'<td>'+latitude+'</td>'+
'<td>'+longitude+'</td>'+
'<td>'+icon+'</td>'+
'<td>'+id+'</td>'+
'<td>'+name+'</td>'+
'<td>'+place_id+'</td>'+
'<td>'+reference+'</td>'+
'<td>'+types+'</td>'+
'</tr>');
}
var getData = function(query, value){
$.ajax({
url: query,
type: "GET",
crossDomain: true,
dataType: "json",
success: function(response) {
var artisan = response.results;
console.log(artisan);
for (var i = 0; i < artisan.length; i++){
formatDetails(artisan[i]);
setTimeout(function(){console.log('waiting1');},3000);
}
setTimeout(function(){console.log('waiting2');},3000);
},error: function(xhr, status) {
console.log(status);
},
async: false
});
}
$(document).ready(function(){
var places =
['Pizzeria+Paris','Pizzeria+Marseille','Pizzeria+Nice','Pizzeria+Toulouse'];
_.forEach(places, function(value, key) {
var proxy = 'https://cors-anywhere.herokuapp.com/';
var target_url = 'https://maps.googleapis.com/maps/api/place/textsearch/json?query='+value+'&key=AIzaSyAClTjhWq7aFGKHmUwxlNUVBzFpIKTkOrA';
var query = proxy + target_url;
getData(query, value);
});
});
I've tried a lot of solutions I found on Stack Overflow, but none of them worked, or I might have applied them incorrectly.
Thanks for your help!
The fact that $.ajax returns a Promise makes this quite simple.
Firstly, you want getData to return the $.ajax call, and also get rid of async: false:
var getData = function(query, value) {
return $.ajax({
url: query,
type: "GET",
crossDomain: true,
dataType: "json",
success: function(response) {
var artisan = response.results;
for (var i = 0; i < artisan.length; i++){
formatDetails(artisan[i]);
}
},error: function(xhr, status) {
console.log(status);
}
});
}
Then you can use Array.reduce to iterate through the array and chain the requests together, with a 3-second "delay" after each request.
Like so:
$(document).ready(function(){
var places = ['Pizzeria+Paris','Pizzeria+Marseille','Pizzeria+Nice','Pizzeria+Toulouse'];
places.reduce((promise, value) => {
var proxy = 'https://cors-anywhere.herokuapp.com/';
var target_url = 'https://maps.googleapis.com/maps/api/place/textsearch/json?query='+value+'&key=AIzaSyAClTjhWq7aFGKHmUwxlNUVBzFpIKTkOrA';
var query = proxy + target_url;
return promise.then(() => getData(query, value))
// return a promise that resolves after three seconds
.then(response => new Promise(resolve => setTimeout(resolve, 3000)));
}, Promise.resolve()) /* start reduce with a resolved promise to start the chain*/
.then(results => {
// all results available here
});
});
The most effective answer is the one above from #jaromandaX.
Nevertheless, I also found a workaround with Google Chrome, which will help you to not get your hands dirty with promises.
On Chrome:
1. Open the console.
2. Go to the Network tab.
3. Near the options "Preserve log" and "Disable cache", there is a dropdown (an arrow) labelled "No throttling".
4. Click on the arrow next to the label, then click "Add".
5. You will be able to set a download and upload speed and, most importantly, a delay between each request.
Kaboom, it worked with my initial code.
Nevertheless, I changed my code to follow the above answer, which is the better approach in terms of code quality, speed, etc.
Thanks
I have an ajax function hitting the Twitch API to find "Starcraft" streams.
$.ajax({
type: 'GET',
url: 'https://api.twitch.tv/kraken/search/streams?q=starcraft',
headers: {'Client-ID': 'xxx'},
success: function(json) {
console.log(json);
}});
This returns Object {_total: 108, _links: Object, streams: Array[9]}. I want the streams array to hold all 108 streams.
I've tried adding limit and offset to url like so:
https://api.twitch.tv/kraken/search/streams?limit=100&offset=0&q=starcraft, but this will obviously only work when there are fewer than 100 streams. Anyone familiar with the Twitch API: is there a limit=max kind of thing? If not, what is the workaround?
The following is based on my comment to your question and should get you started. I've worked on the assumption that offset is equivalent to page. Note that the code is untested and a starting point only.
Also make sure to handle a failure from the AJAX call so you don't get stuck in an infinite loop.
function GetAllStreams(callback) {
    var arrStreams = [];

    function fetchPage(offset) {
        $.ajax({
            type: 'GET',
            url: 'https://api.twitch.tv/kraken/search/streams?limit=100&offset=' + offset + '&q=starcraft',
            headers: {'Client-ID': 'xxx'},
            success: function (json) {
                arrStreams = arrStreams.concat(json.streams);
                console.log(json);
                console.log(arrStreams);
                if (arrStreams.length < json._total) {
                    // fetch the next page only after this one has returned;
                    // offset advances by the page size (limit=100)
                    fetchPage(offset + 100);
                } else {
                    callback(arrStreams);
                }
            }
        });
    }

    fetchPage(0);
}
I created a site which loads data from multiple sources via AJAX every few seconds. However, I experience some strange behavior. Here is the code:
function worker1() {
var currentUrl = 'aaa.php?var=1';
$.ajax({
cache: false,
url: currentUrl,
success: function(data) {
alert(data)
},
complete: function() {
setTimeout(worker1, 2000);
}
});
}
function worker2() {
var currentUrl = 'aaa.php?var=2';
$.ajax({
cache: false,
url: currentUrl,
success: function(data) {
alert(data)
},
complete: function() {
setTimeout(worker2, 2000);
}
});
}
The problem is that many times one of the workers returns NaN. If I change the frequency of the calls to, let's say, 2000 and 1900, then everything works and I get almost no NaN results. When the frequencies are the same, I get over 80% NaN results for one of the calls. It seems like the browser cannot handle two requests fired at the exact same time. I use only those two workers, so the browser shouldn't be overloaded with AJAX requests. Where is the problem?
Note that aaa.php works with a MySQL database and does some simple queries based on parameters in the URL.
All you need is $.each and the two parameter form of $.ajax
var urls = ['/url/one','/url/two', ....];
$.each(urls, function(i,u){
$.ajax(u,
{ type: 'POST',
data: {
answer_service: answer,
expertise_service: expertise,
email_service: email,
},
success: function (data) {
$(".anydivclass").text(data);
}
}
);
});
Note: the messages generated by the success callback will overwrite each other as shown. You'll probably want to use $('#divid').append() or similar in the success function.
Maybe don't use these workers and use promises instead, like below? I can't say anything about the errors being returned without looking at the server code. Below is working code for what it looks like you are trying to do.
This is a simple example but you could use different resolvers for each url with an object ({url:resolverFunc}) and then iterate using Object.keys.
var urls = [
'http://jsonplaceholder.typicode.com/users/1',
'http://jsonplaceholder.typicode.com/users/2',
'http://jsonplaceholder.typicode.com/users/3',
'http://jsonplaceholder.typicode.com/users/4',
'http://jsonplaceholder.typicode.com/users/5',
'http://jsonplaceholder.typicode.com/users/6',
'http://jsonplaceholder.typicode.com/users/7'
]
function multiGet(arr) {
    var promises = [];
    for (var i = 0, len = arr.length; i < len; i++) {
        promises.push($.get(arr[i])
            .then(function (res) {
                // Do something with each response
                console.log(res);
            })
        );
    }
    // $.when expects individual promises, not an array, so use apply
    return $.when.apply($, promises);
}
multiGet(urls);
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
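For comparison, the same fan-out can be written with native promises: Promise.all waits for every request and keeps the results in input order (a sketch; getJson is a hypothetical promise-returning request helper standing in for $.get):

```javascript
// Fetch all URLs concurrently; resolve with the responses in the same
// order as the input array, even if they complete out of order.
function multiGetNative(urls, getJson) {
  return Promise.all(urls.map(function (url) {
    return getJson(url);
  }));
}
```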
Sorry, my first language is not English; I am not sure if I am explaining my question properly.
My code has a main function containing two AJAX functions (each uses AJAX to hit the Foursquare API):
main(){
ajax1();
ajax2();
all other codes
}
The ajax2() function has to take the result of ajax1() as input and then return its own result (actually the result is pushed into a global array).
All the other code should run after the two AJAX functions have finished. I tried async: false but it is not working. My HTML file includes the newest jQuery like this:
<script type="text/javascript" src="http://code.jquery.com/jquery-latest.min.js" ></script>
I tried the jQuery $.when().done() function and the first AJAX call works. However, the second AJAX call is inside a for loop, and the for loop breaks the mechanism of $.when().done():
first ajax: in firstjson function
Second ajax: in transfer function
function firstjson(tmpName,tmpLoc,PhotoJson,foursq){
return $.ajax({
type: 'GET',
url: foursq,
dataType: 'jsonp',
success: function(json) {
for (i = 0; i < 3; i++) {
var resultname = json['response']['venues'][i].name;
var resultlocation = json['response']['venues'][i].location;
var resultlat = resultlocation.lat;
var resultlng = resultlocation.lng;
var tmpmarker = new google.maps.LatLng(resultlat,resultlng)
tmpName.push(resultname);
tmpLoc.push(tmpmarker);
var resultid = json['response']['venues'][i].id;
var tmpPhotoJason = 'https://api.foursquare.com/v2/venues/'+ resultid +'/photos?';
PhotoJson.push(tmpPhotoJason);
}
}
});
}
function transfer(PhotoJson,PhotoURL){
for (i = 0; i < 3; i++) {
return $.ajax({
type: 'GET',
url: PhotoJson[i],
dataType: 'jsonp',
success: function(json) {
resultphoto = json['response']['photos']['items'];
photoprefix = resultphoto[i].prefix;
photopresuffix = resultphoto[i].suffix;
photourl = photoprefix+"150x150" + photopresuffix;
PhotoURL.push(photourl);
}
});
}
}
$.when(firstjson(tmpName,tmpLoc,PhotoJson,foursq)).done(function(){
alert("test1");
$.when(transfer(PhotoJson,PhotoURL).done(function(){
console.log(PhotoURL);
all other codes!!!!
});
});
//PhotoURL is global array
So the first "when" works properly: alert("test1") runs after firstjson is done. However, the for loop inside the transfer function breaks the second "when". How can I fix this problem? I would appreciate any related information. Thanks!!!
This will execute ajax2 after ajax1
function anotherMethod(){
    // Here you do all that you want to do after the last $.ajax call
}

main(){
    firstjson(tmpName, tmpLoc, PhotoJson, foursq)
        .then(function () {
            // wrap in a function so transfer runs after firstjson resolves,
            // rather than being invoked immediately
            return transfer(PhotoJson, PhotoURL);
        })
        .then(anotherMethod);
}
As you are returning a promise from the first call with "return $.ajax...", you organize your code like this: in methods that make AJAX calls, you return the call, as you are doing now:
return $.ajax();
That returns a promise that you can chain. And you put what you want to do next in another method, so you call it in the last "then".
Non-Blocking Example
You should use non-blocking code. You can turn async off (async: false), but this can easily be done in a non-blocking manner using callback functions.
function main(){
$.ajax({ // First ajax call (ajax1)
url: "first/ajax/url",
type: "GET", // POST or GET
data: {}, // POST or GET data being passed to first URL
success: function(x){ // Callback when request is successfully returned
// x is your returned data
$.ajax({ // Second ajax call (ajax2)
url: "second/ajax/url",
type: "GET", // POST or GET
data: {
thatThing: x
}, // POST or GET data passed to second URL
success: function(y){
// y is your second returned data
// all other codes that use y should be here
}
});
}
})
}
This is the non-blocking approach: nest your functions within "success" callbacks. Nest ajax2 within ajax1's success callback to ensure that ajax2 is not executed before ajax1 has returned, and nest your "all other codes" inside ajax2's success callback so they are not executed until ajax2 has returned.
Blocking Example
If you absolutely must (please avoid at all costs), you can disable async, which will block all JavaScript code from executing until the AJAX call has returned. This may cause your browser to temporarily freeze until the AJAX request returns (depending on the browser).
function main(){
var x = ajax1();
var y = ajax2(x);
window["y"] = y; // push to global as you requested but why?
// All other codes that can now use y
}
function ajax1(){
var x;
$.ajax({
url: "first/ajax/url",
async: false,
type: "GET", // POST or GET,
data: {}, // POST or GET data being passed to first URL
success: function(r){x=r}
});
return x;
}
function ajax2(x){
var y;
$.ajax({
url: "second/ajax/url",
async: false,
type: "GET", // POST or GET,
data: {
thatThing: x
}, // POST or GET data being passed to second URL
success: function(r){y=r}
});
return y;
}
Once again I stress: try not to disable async; that will cause your code to block and is BAD code. If you absolutely 100% have to for some reason, then it can be done, but you should learn how to write non-blocking code using callbacks, as the first example does.
Social Network Example
Now I'll do an example: an AJAX call to get an array of your friends' IDs, then a series of AJAX calls to get each friend's profile. The first AJAX call gets the list, the second set gets the profiles and stores them, and when all profiles have been retrieved some other code can run.
For this example, the url https://api.site.com/{userID}/friends/ retrieves an Object with a list of friends IDs for a particular user, and https://api.site.com/{userID}/profile/ gets any users profile.
Obviously this is a simplified API: you would probably need to first establish a connection with an API key, get a token for the connection, and pass the token to the API URIs, but I think it still illustrates the point.
function getFriends(userID, callback){
$.ajax({
url: "https://api.site.com/"+userID+"/friends/",
success: function(x){
var counter = 0;
var profiles = [];
for(var i=0;i<x.friendIDs.length;i++){
$.ajax({
url: "https://api.site.com/"+x.friendIDs[i]+"/profile/",
success: function(profile){
profiles.push(profile);
counter++;
if(counter == x.friendIDs.length) callback(profiles);
}
});
}
}
});
}
getFriends("dustinpoissant", function(friends){
    // Do stuff with the 'friends' array here
});
})
This example is "Non-blocking", if this example were done in a "blocking" way then we would ask for 1 friends profile, then wait for its response, then request the next and wait and so on. If we had hundreds of friends you can see how this would take a very long time for all ajax calls to complete. In this example, which is non-blocking, all requests for profiles are made at the same time (within 1ms) and then can all be returned at almost exactly the same time and a counter is used to see if we have gotten responses from all the requests. This is way way way faster than using the blocking method especially if you have lots of friends.