Use data from two separate JSON files in one function - javascript

I have two local JSON files that I'm trying to access in a function in my main.js file. Everything works fine with just one JSON file, but I'm not sure how to incorporate the second. Ideally, something like data_set1 = $.getJSON("file1.json") would work perfectly, but I see similar questions have been asked repeatedly, and because the calls are asynchronous that isn't really possible (I don't completely understand all the answers to those questions).
This works as it is:
$.getJSON("data.json", function(json){
var data_points = [];
for (i = 0; i < json.length; i++){
data_points.push([json[i].name, json[i].age]);
}
$(function () {
//do stuff with data_points
but I don't know how to incorporate the second JSON call to make another list for the function at the end to use.

You can use jQuery's deferred objects.
var xFile, yFile;

var requestX = $.getJSON("data1.json", function(json){
    xFile = json;
});

var requestY = $.getJSON("data2.json", function(json){
    yFile = json;
});

$.when(requestX, requestY).then(function(){
    // do something;
    // this function only gets called when both requestX & requestY complete.
});
Check out the documentation for jQuery.when().
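Note that $.when also passes each request's result to the callback, so the outer variables aren't strictly needed; for $.ajax/$.getJSON requests each argument is an array of the form [data, statusText, jqXHR]. A minimal sketch of that variant:
var requestX = $.getJSON("data1.json");
var requestY = $.getJSON("data2.json");

$.when(requestX, requestY).then(function(resX, resY){
    var xFile = resX[0]; // [data, statusText, jqXHR] -> take the parsed JSON
    var yFile = resY[0];
    // do something with xFile and yFile
});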

An ajax request is ONE request for ONE resource. You can't fetch two different resources with the same request. You'd either need TWO requests:
var completed = 0;

$.getJSON('file1.json', function(json) {
    completed++;
    if (completed == 2) { all_done(); }
});

$.getJSON('file2.json', function(json) {
    completed++;
    if (completed == 2) { all_done(); }
});

function all_done(...) { ... }
Or simply combine your two json files into a single one on the server:
{"file1":{data from file one here}, "file2":{data from file two here}}
and access them as data.file1 and data.file2 in your code.
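If you go the combined-file route, a single request covers everything. A minimal sketch, assuming the merged file is served as combined.json (the filename is just an example):
$.getJSON("combined.json", function(data) {
    var first = data.file1;   // contents of the original file1.json
    var second = data.file2;  // contents of the original file2.json
    // do stuff with both
});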

You could just set a var to indicate the file is loaded and call the same function while checking the status. This might be a bad idea depending on the amount of data in the JSON.
var xFileLoaded = false, yFileLoaded = false;
var xFile, yFile;

$.getJSON("data1.json", function(json){
    xFile = json;
    xFileLoaded = true;
    doSomething();
});

$.getJSON("data2.json", function(json){
    yFile = json;
    yFileLoaded = true;
    doSomething();
});

function doSomething() {
    if(xFileLoaded && yFileLoaded) {
        // good to go
    }
}

Using something very similar to Marc B's solution, I would keep the files in an array and generalize things slightly:
var files = ["file1.json", "file2.json"];
var results = {};
var completed = 0;

function afterEach(fileName, json) {
    results[fileName] = json;        // store each response under its filename
    if (++completed >= files.length) {
        afterAll();                  // your "everything has loaded" function
    }
}

files.forEach(function (file) {
    $.getJSON(file, afterEach.bind(this, file));
});
This requests all the files in parallel, allowing the browser to choose how many connections to actually open. As each request finishes, its response (json) is stored in the results hash under the filename. Once all files have completed (assuming files doesn't change), the afterAll function is called.
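If you'd rather not keep a manual counter, roughly the same thing can be sketched with $.when (this variant is an addition of mine, not part of the original answer):
var files = ["file1.json", "file2.json"];
var results = {};

var requests = files.map(function (file) {
    return $.getJSON(file, function (json) {
        results[file] = json;
    });
});

// $.when takes individual deferreds, so spread the array with apply
$.when.apply($, requests).then(function () {
    afterAll(); // every response is now in results
});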

jQuery offers an elegant way to do this:
$.when(
    $.ajax({
        url: 'file1.json',
        success: function(data) {
            // Treatment
        }
    }),
    $.ajax({
        url: 'file2.json',
        success: function(data) {
            // Treatment
        }
    })
).then(function(){
    // Both calls are completed
});
Once both calls are complete, you can do whatever processing you need inside the .then() block.

Related

Use the data from 1st JSON and convert it to a variable and use that variable to call a 2nd JSON

I am new here, but I have a problem and need your help. In my code I call a JSON file:
var data = {};
var spectator;
var tmp = [];
var IDcatcher = [];
var summonerID = [];

$.getJSON(getUrl, function(data) {
    tmp = data;
    $.each(tmp, function(key){
        for (IDcatcher in tmp) {}
    });
    summonerID = tmp[IDcatcher].id;
});
This gives me the ID from the JSON, which is stored in the summonerID variable. Now I want to use this variable to complete the URL for the 2nd JSON:
var spectatorUrl = "link" + summonerID;
Now get the 2nd JSON
var Charcatcher = [];
var CharID = [];

$.getJSON(spectatorUrl, function(data) {
    tmp = data;
    $.each(tmp, function(key){
        for (Charcatcher in tmp) {}
    });
    CharID = tmp[Charcatcher].id;
});
My problem is that the 2nd JSON call doesn't get anything and returns nothing (obviously).
Help?
Can't I run two JSON requests at different times? If so, how can I do it or change my code?
Due to the asynchronous nature of JavaScript, if you have an AJAX request with a callback and some code following the request, JS fires off the AJAX request and continues with the rest of the code; it won't wait for the result of the AJAX request to come back.
Here's a simple demo -
function first() {
    setTimeout(() => {
        console.log("1");
    }, 2000);
    console.log("2");
}

first();
Look at the order of the console.log statements in the code, and then check the actual order in the console.
To solve your original problem, you can nest a $.getJSON() inside the first one; this ensures that summonerID is available when you fire off the second AJAX request.
$.getJSON(getUrl, function(data) {
    tmp = data;
    $.each(tmp, function(key){
        for (IDcatcher in tmp) {}
    });
    summonerID = tmp[IDcatcher].id;

    // second AJAX request
    var spectatorUrl = "link" + summonerID;
    $.getJSON(spectatorUrl, function(data) {
        // logic
    });
});
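If you'd rather avoid nesting, the same ordering can be expressed by chaining the promise that $.getJSON returns; this is a sketch rather than part of the answer above, reusing the same variable names:
$.getJSON(getUrl)
    .then(function(data) {
        tmp = data;
        $.each(tmp, function(key){
            for (IDcatcher in tmp) {}
        });
        summonerID = tmp[IDcatcher].id;
        // returning a jqXHR here makes the next .then wait for it
        return $.getJSON("link" + summonerID);
    })
    .then(function(data) {
        // data is the response of the second request
    });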

Multiple ajax calls fired simultaneously not working properly

I created a site which loads data from multiple sources via AJAX every few seconds. However, I experience some strange behavior. Here is the code:
function worker1() {
    var currentUrl = 'aaa.php?var=1';
    $.ajax({
        cache: false,
        url: currentUrl,
        success: function(data) {
            alert(data);
        },
        complete: function() {
            setTimeout(worker1, 2000);
        }
    });
}

function worker2() {
    var currentUrl = 'aaa.php?var=2';
    $.ajax({
        cache: false,
        url: currentUrl,
        success: function(data) {
            alert(data);
        },
        complete: function() {
            setTimeout(worker2, 2000);
        }
    });
}
The problem is that, many times, one of the workers returns NaN. If I change the intervals to, let's say, 2000 and 1900, then everything works fine and I get almost no NaN results. When the intervals are the same, I get over 80% NaN results for one of the calls. It seems like the browser cannot handle two requests fired at exactly the same time. I use only these two workers, so the browser shouldn't be overloaded with AJAX requests. Where is the problem?
Note that aaa.php works with a MySQL database and does some simple queries based on the parameters in the URL.
All you need is $.each and the two-parameter form of $.ajax:
var urls = ['/url/one', '/url/two' /* ... */];

$.each(urls, function(i, u){
    $.ajax(u, {
        type: 'POST',
        data: {
            answer_service: answer,
            expertise_service: expertise,
            email_service: email
        },
        success: function (data) {
            $(".anydivclass").text(data);
        }
    });
});
Note: the messages generated by the success callback will overwrite each other as shown. You'll probably want to use $('#divid').append() or similar in the success function.
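For example, a minimal variation of the success handler that keeps every response rather than replacing it (the #results id is just an illustration):
success: function (data) {
    // append each response instead of overwriting the previous one
    $("#results").append($("<p>").text(data));
}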
Maybe don't use these workers and use promises instead, like below? I can't say anything about the errors being returned without looking at the server code, but below is working code for what it looks like you are trying to do.
This is a simple example, but you could use a different resolver for each URL with an object ({url: resolverFunc}) and then iterate using Object.keys.
var urls = [
    'http://jsonplaceholder.typicode.com/users/1',
    'http://jsonplaceholder.typicode.com/users/2',
    'http://jsonplaceholder.typicode.com/users/3',
    'http://jsonplaceholder.typicode.com/users/4',
    'http://jsonplaceholder.typicode.com/users/5',
    'http://jsonplaceholder.typicode.com/users/6',
    'http://jsonplaceholder.typicode.com/users/7'
];

function multiGet(arr) {
    var promises = [];
    for (var i = 0, len = arr.length; i < len; i++) {
        promises.push($.get(arr[i])
            .then(function(res) {
                // Do something with each response
                console.log(res);
            })
        );
    }
    // $.when expects individual deferreds, not an array, so spread it with apply
    return $.when.apply($, promises);
}

multiGet(urls);
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
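Since multiGet returns the combined promise, a caller can also wait for every request to finish, for example:
multiGet(urls).then(function () {
    console.log('all requests finished');
});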

multiple calls to ajax simultaneously

I'm trying to make multiple calls to Ajax. I have fields such as a time interval and the number of calls to make within that time period. The problem is that, while making multiple calls to the same Ajax endpoint, there may be a chance of the data merging with other data that was sent earlier. I am not sure whether that can happen.
Here is my Ajax call:
callAjax = function (action, inObj, params) {
    var dataIn = inObj.data || {};
    var successFunc = inObj.success || function () {};
    var passOn = inObj.passOn || {};
    var myParams = {drape: 1, type: 'GET'};
    myParams.url = this.homeComingUrl;
    $.extend(myParams, params);
    var data = this.fillAction(action, dataIn);
    if (myParams.drape) { vidteq.utils.drapeSheer(action); }
    var that = this;
    var magicCall = $.ajax({
        url: myParams.url,
        type: myParams.type,
        data: data,
        success: function (response) {
            // TBD we need better error handling
            if (myParams.drape) { vidteq.utils.undrapeCurtain(action); }
            successFunc(response, passOn);
        },
        error: function (response) {
            if (myParams.drape) { vidteq.utils.undrapeCurtain(action); }
            that.gui.io.handleError(response);
        }
    });
}
saveEvents = function () {
    this.commitEditingEvent();
    var dataEvents = this.collectEventsToSave();
    //$('#calendar').fullCalendar('removeEvents');
    var that = this;
    if (vidteq.eTrainer == 1) {
        dataEvents = arguments[0];
    }
    if (!dataEvents.length) { alert("Nothing to save"); return; }
    this.callAjax('updateEvents', {
        data: { events: JSON.stringify(dataEvents) },
        success: function (response, passOn) {
            that.handleGetEvent(response, passOn);
        }
    }, {type: 'POST'});
}
This may not be required for understanding the problem.
If anybody can explain how Ajax handles multiple calls, it would be really helpful.
A few remarks first: on the first line, the anonymous function is assigned to an undeclared variable, so it is never saved or run the way you intend. And inside each function, what does this refer to? What is the context? Is this window, or do you call your functions with something like saveEvents.apply(jQuery)?
As for your question: when you make an XMLHttpRequest (which is what Ajax uses), callbacks fire as the request moves through its states ("server found", "request sent", "response loading", "response loaded", and so on). jQuery's Ajax wrapper makes these requests asynchronously, and you can have as many Ajax requests in flight at the same time as you like. The important part is the function you provide for the success case.
In that success function you receive the data, process it, and, if needed, fire another Ajax request from there, and so on. Chaining requests like this is how you work with the same resource step by step; it is all Ajax underneath, which in turn uses XMLHttpRequest.
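A minimal sketch of that idea, with hypothetical URLs: the second request is fired only from the success handler of the first, so it always has the first response available.
$.ajax({
    url: 'first.json',                          // hypothetical endpoint
    success: function (first) {
        $.ajax({
            url: 'second.json?id=' + first.id,  // depends on the first response
            success: function (second) {
                // both responses are available here
            }
        });
    }
});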
You need to set async: false in your Ajax call:
function isLoggedIn() {
    var isLoggedIn;
    $.ajax({
        async: false,
        // ...
        success: function(jsonData) {
            isLoggedIn = jsonData.LoggedIn;
        }
    });
    return isLoggedIn;
}
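Keep in mind that a synchronous request blocks the page until it completes. An asynchronous, callback-based sketch of the same check would look roughly like this (the /isLoggedIn URL is a placeholder, and LoggedIn is the field used in the snippet above):
function checkLoggedIn(callback) {
    $.ajax({
        url: '/isLoggedIn',   // hypothetical endpoint
        success: function (jsonData) {
            callback(jsonData.LoggedIn);
        }
    });
}

// usage: the result arrives in the callback instead of a return value
checkLoggedIn(function (loggedIn) {
    if (loggedIn) {
        // ...
    }
});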

ajax complete callback function is never called

I'm using Django.
I have the following code
var done_cancel_order = function(res, status) {
    alert("xpto");
};

var cancel_order = function() {
    data = {};
    var args = {
        type: "GET",
        url: "/exchange/cancel_order/" + this.id,
        data: data,
        complete: done_cancel_order
    };
    $.ajax(args);
    return false;
};
The function cancel_order is called when I press a button on the page. The URL, when accessed, does some things on the server side (which I can check are indeed done) and then returns JSON specifying whether or not the request was successful. You get:
{'status':200, 'message':'order canceled'}
The problem is that the callback is never called. I would like the callback to display to the user what was returned from the server, but even the first alert("xpto") inside the callback is never executed. Why is that?
EDIT:
I have checked that this code:
var cancel_order = function() {
    data = {};
    var args = {
        type: "GET",
        url: "/exchange/cancel_order/" + this.id,
        data: data,
        complete: function() { alert("xpto"); }
    };
    $.ajax(args);
    return false;
};
displays the same behavior as described above: everything goes great on the server side, but the callback isn't called.
Be sure nothing is messing with your debug tools (e.g. console.log); it may end up wrecking your JS code and delivering unexpected results.
Why don't you change it to this:
function done_cancel_order(res, status) {
    /* remains the same */
}
I hope this one works for you!
Or just simply:
complete: function() { alert("xpto"); }

Making functions wait until AJAX call is complete with jQuery

I'm trying to develop a class in JavaScript that I can use to easily access a load of data gathered by an AJAX request. The only problem is that I need to make the members of the class accessible only once the AJAX call is complete. Ideally, what I would like to end up with is something where I can call this in a script:
courses.getCourse('xyz').complete = function () {
    // do something with the code
}
And this will only fire after the AJAX call is complete and the data structures in the "class" are ready to be used. Ideally I don't want to have to create a .complete member for every function in the class.
Here is the "class" I am trying to make so far:
var model_courses = (function() {
    var cls = function () {
        var _storage = {}; // Used for storing course related info
        _storage.courses = {}; // Used for accessing courses directly
        _storage.references = new Array(); // Stores all available course IDs
        var _ready = 0;
        $.ajax({
            type: "GET",
            url: "data/courses.xml",
            dataType: "xml",
            success: function(xml) {
                $(xml).find("course").each(function() {
                    _storage.courses[$(this).attr('id')] = {
                        title : $(this).find('title').text(),
                        description : $(this).find('description').text(),
                        points : $(this).find('points').text()
                    }
                    _storage.references.push($(this).attr('id'))
                })
            }
        })
        console.log(_storage.courses)
    }

    cls.prototype = {
        getCourse: function (courseID) {
            console.log(cls._storage)
        },
        getCourses: function () {
            return _storage.courses
        },
        getReferences: function () {
            return _storage.references
        }
    }

    return cls
})()
At the moment getCourse fires before the AJAX request is complete, so obviously it has no data to access.
Any ideas will be greatly appreciated; I'm stuck on this one!
jQuery already handles this for you using deferred objects, unless I'm misunderstanding what you are looking for.
var courses = {
    getCourse: function (id) {
        return $.ajax({url: "getCourse.php", data: {id: id}});
    }
};

courses.getCourse("history").done(function(data){
    console.log(data);
});
I know this isn't exactly what you are looking for, I'm hoping it's enough to push you in the right direction. Deferred objects are awesome.
The following changes allow you to make the AJAX request just once, and you can call your function like:
courses.getCourse('xyz', function(course){
    // Use course here
});
Here are the changes
var model_courses = (function() {
    // This is what gets returned by the $.ajax call
    var xhr;

    var _storage = {}; // Used for storing course related info
    _storage.courses = {}; // Used for accessing courses directly
    _storage.references = []; // Stores all available course IDs

    var cls = function () {
        xhr = $.ajax({
            type: "GET",
            url: "data/courses.xml",
            dataType: "xml",
            success: function(xml) {
                $(xml).find("course").each(function() {
                    _storage.courses[$(this).attr('id')] = {
                        title : $(this).find('title').text(),
                        description : $(this).find('description').text(),
                        points : $(this).find('points').text()
                    }
                    _storage.references.push($(this).attr('id'))
                });
            }
        });
    }

    cls.prototype = {
        // Made changes here, you'd have to make the same
        // changes to getCourses and getReferences
        getCourse: function (courseID, callback) {
            if (xhr.readyState == 4) {
                callback(_storage.courses[courseID]);
            }
            else {
                xhr.done(function(){
                    callback(_storage.courses[courseID]);
                });
            }
        },
        getCourses: function () {
            return _storage.courses
        },
        getReferences: function () {
            return _storage.references
        }
    }

    return cls
})()
As a side note, your module pattern will not work very well if you need to instantiate two of these model_courses objects, since the storage objects are all shared in your self-invoking function's closure. You usually don't mix the module pattern with prototypes (returning a constructor from a module) unless you really know what you are doing, that is, unless you want the shared closure variables to work as static properties of your class.
This is what I would do if I were you (since you really want private variables):
function ModelCourses() {
    var storage = {
        courses: {},
        references: []
    };

    var xhr = $.ajax({
        type: "GET",
        url: "data/courses.xml",
        dataType: "xml",
        success: function(xml) {
            $(xml).find("course").each(function() {
                storage.courses[$(this).attr('id')] = {
                    title : $(this).find('title').text(),
                    description : $(this).find('description').text(),
                    points : $(this).find('points').text()
                }
                storage.references.push($(this).attr('id'))
            })
        }
    });

    this.getCourse = function(courseId, callback) {
        function getCourse() {
            callback(storage.courses[courseId]);
        }
        if (xhr.readyState == 4) {
            getCourse();
        }
        else {
            xhr.done(getCourse);
        }
    };
}
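A small usage sketch of that class (the 'xyz' course ID is just an example):
var courses = new ModelCourses();

// the callback runs immediately if the XML has already loaded,
// otherwise it runs once the AJAX request finishes
courses.getCourse('xyz', function (course) {
    console.log(course.title, course.points);
});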
In getStorage, either add a check to see if there is any data to pilfer (preferred), or make the "actual" method private and then publicize it once it has items it can access. (I would recommend the first, though, otherwise you'll get exceptions about calling a method that doesn't exist on an object.)
You can define a function getData that performs the ajax request and takes getCourse as a callback.
getData could store the result of the Ajax call locally and check that local store before performing the ajax call.
You could also use a private member so that the ajax call is only run once.
You might want to check underscore.js for some handy tools.
Here is a short code example:
cls.prototype.getData = function(callback) {
    /* perform ajax call or retrieve data from cache */
    callback();
}

cls.prototype.getCourse = function(id) {
    this.getData(function() {
        /* do something with the data and the id you passed */
    });
};
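A minimal sketch of what that getData might look like with a locally cached result; it assumes the same data/courses.xml endpoint used earlier, and the _data property is just an illustration:
cls.prototype.getData = function (callback) {
    var self = this;
    if (self._data) {            // already fetched: reuse the cached data
        callback(self._data);
        return;
    }
    $.ajax({
        type: "GET",
        url: "data/courses.xml",
        dataType: "xml",
        success: function (xml) {
            self._data = xml;    // cache for subsequent calls
            callback(xml);
        }
    });
};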
