How to make multiple HttpRequests with pagination cursors in Parse Cloud - javascript

I want to make an HTTP GET request to a URL, and the response will contain the URL of the next page. I have to continue this process until I get an empty "next" URL.
My code is as follows:
Parse.Cloud.define("myFunc", function (request, response) {
    Parse.Cloud.httpRequest({
        url: fb_url
    }).then(function(httpResponse) {
        next_url = httpResponse.data.next_url;
        /******************/
        // code to make another HttpRequest with next_url and iteratively
        // doing it till next_url is null
        response.success(httpResponse.text);
    }, function(httpResponse) {
        response.error("error " + httpResponse);
    });
});
I tried a lot of different ways, but all in vain. Can anyone tell me how I can make another HttpRequest with the next_url and keep doing it until next_url is null?

Wrap the HTTP invocation in a function that can be called recursively. This returns a chain of promises that keeps making requests until a null next URL is returned.
function keepGetting(url) {
    return Parse.Cloud.httpRequest({ url: url }).then(function(httpResponse) {
        var nextUrl = httpResponse.data.nextUrl;
        return (nextUrl === null) ? httpResponse : keepGetting(nextUrl);
    });
}
Parse.Cloud.define("myFunc", function (request, response) {
    // initialize fb_url somehow
    keepGetting(fb_url).then(function(result) {
        response.success(result);
    }, function(error) {
        response.error(error);
    });
});
(Careful: if the service takes too long or returns too many pages before a null next URL, your Parse cloud call will time out.)
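If you also need the data from every page rather than just the last response, a variation like the following can help. This is a rough sketch only, assuming the same httpResponse.data.nextUrl shape; the results accumulator and the maxPages cap are illustrative additions, not part of the original answer.
// Sketch: same recursive idea, but collects every page and stops after a
// maximum number of requests so the cloud function cannot run forever.
// `results` and `maxPages` are illustrative additions.
function keepGettingAll(url, results, maxPages) {
    results = results || [];
    maxPages = maxPages || 10;
    return Parse.Cloud.httpRequest({ url: url }).then(function(httpResponse) {
        results.push(httpResponse.data);
        var nextUrl = httpResponse.data.nextUrl;
        if (nextUrl === null || results.length >= maxPages) {
            return results;
        }
        return keepGettingAll(nextUrl, results, maxPages);
    });
}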

Related

Trying to make an AJAX call asynchronous in Javascript

I am trying to retrieve some data from neo4j for my web app. I have my code structured in the following manner:
When I click the button to retrieve the data, I call:
var childNodes = getAllChildNodes(uuid, ""); //uuid: node specific id in neo4j, second param not important
//do something with childNodes
....
In getAllChildNodes(),
function getAllChildNodes(uuid, filter) {
    /*
    prepare json data to send
    */
    var resultNodes = {};
    var successFunction = function(data) {
        // store data in resultNodes ...
        // do something with the data ...
    };
    var failFunction = function(xhr, status, error) {
        // if query fails
    };
    // call to API function
    try {
        getChildrenAPI(jsonData, successFunction, failFunction);
    } catch (err) {
        console.log(err);
    }
    return resultNodes;
}
In getChildrenAPI
function getChildrenAPI(jsonData, doneFunction, failFunction) {
    var request = $.ajax({
        method: 'POST',
        url: myurl,
        data: JSON.stringify(jsonData),
        dataType: 'json',
        contentType: 'application/json',
        cache: false,
        async: true
    });
    request.done(function (data) {
        doneFunction(data);
    });
    request.fail(function (xhr, status, error) {
        failFunction(xhr, status, error);
    });
}
The problem is that my childNodes variable does not get populated. When I inspected further, I found that in my getAllChildNodes() function, resultNodes is returned before the query data is stored by successFunction(). I thought this would be an async issue, so I made sure the AJAX call had its async property set to true, but that didn't solve it. Then I tried using async/await on getAllChildNodes(), but that didn't work either. So my question is: what am I doing wrong here? I'm still new to the idea of async, so this was the best I could do. If someone can help me with this, I would really appreciate it.
It seems that you misunderstood the problem. AJAX requests are asynchronous by default. What you want, as far as I can tell from your code, is to be able to use the result of the request in the code that follows the request. For that you would have to make it synchronous: you can set async to false, you can await, and so on. However, in most cases it's a terrible idea to make your requests synchronous. If you synchronize your request, nothing else will run and your page will hang while you wait.
What if a request lasts for 10 seconds? In that case your page is unresponsive for ten seconds if you synchronize the request.
What if you send 100 requests and on average they take 1 second? Then your page hangs for 100 seconds.
The best practice is to avoid synchronising your requests whenever possible and only do so when absolutely necessary. Instead, you will need to get used to callbacks, that is, functions executed once the request completes, and define the post-request behavior in them. You could also use promises or web workers, depending on your exact situation.
async function getAllChildNodes(uuid, filter) {
    /*
    prepare json data to send
    */
    var resultNodes = {};
    var successFunction = function(data) {
        // store data in resultNodes ...
        // do something with the data ...
    };
    var failFunction = function(error) {
        // if query fails
    };
    // call to API function
    try {
        var data = await $.ajax({
            method: 'POST',
            url: myurl,
            data: JSON.stringify(jsonData),
            dataType: 'json',
            contentType: 'application/json',
            cache: false,
            async: true
        });
        successFunction(data);
    } catch (err) {
        console.log(err);
        failFunction(err);
    }
    return resultNodes;
}
var childNodes = getAllChildNodes(uuid, "");
<script src="https://code.jquery.com/jquery-3.6.0.slim.min.js" integrity="sha256-u7e5khyithlIdTpu22PHhENmPcRdFiHRjhAuHcs05RI=" crossorigin="anonymous"></script>
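Note that because getAllChildNodes is now declared async, it returns a Promise, so the caller still has to await it (or call .then()) before childNodes is usable. A minimal usage sketch, where the onRetrieveClick wrapper is purely illustrative and not from the original post:
// Sketch: consuming the Promise returned by the async getAllChildNodes.
// onRetrieveClick is an illustrative wrapper around the original call.
async function onRetrieveClick() {
    var childNodes = await getAllChildNodes(uuid, "");
    // do something with childNodes ...
    console.log(childNodes);
}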
JavaScript is a single-threaded, non-blocking language, so it will not pause and wait for asynchronous work (AJAX, timeouts, reading a file, ...) to finish before moving on.
To get sync-like behavior, you have to create an async function (or use callbacks) that manages that asynchronous code.
I think you're looking for something like the following:
getAllChildNodes(uuid, "", function done(results) {
    // Results populated by the done callback.
    console.log(results);
});
The trick here is that you need to keep track of how many requests were kicked off and how many have finished.
So we change the definition of getAllChildNodes to call our doneCallback once all requests have been processed.
function getAllChildNodes(uuid, filter, doneCallback) {
    // How many calls do we need to make.
    const callsToMake = [1, 2, 3];
    // Track the results as each call completes.
    const results = [];
    const ajaxDoneCallbackCheck = function () {
        if (results.length === callsToMake.length) {
            doneCallback(results);
        }
    };
    const ajaxSuccessCallback = function (data) {
        results.push(data);
        ajaxDoneCallbackCheck();
    };
    const ajaxFailCallback = function (error) {
        results.push(error);
        ajaxDoneCallbackCheck();
    };
    // Iterate through ajax calls to make.
    for (const callToMake of callsToMake) {
        // Do ajax stuff.
        console.log('Request data');
        getChildrenAPI(ajaxSuccessCallback, ajaxFailCallback);
    }
}
Now the results need to be processed in our original done callback, like so:
getAllChildNodes(uuid, "", function done(results) {
    // Results populated by the done callback.
    console.log(results);
    // Iterate results.
    for (const result of results) {
        if (result instanceof Error) {
            console.error(result);
        } else {
            // Process or track result!
            console.log(result);
        }
    }
});
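As a side note, the same "wait until every request has finished" behavior can also be expressed with Promise.all instead of a manual counter. This is a rough sketch only, assuming jQuery 3+ (whose $.ajax return value supports .catch) and that jsonData and myurl are prepared as in the question; it is not from the answers above.
// Rough sketch: collect all requests as promises and resolve once every
// one has settled. Failed requests are kept in the results, mirroring
// the counter-based version above.
function getAllChildNodesPromise(uuid, filter, jsonData) {
    const callsToMake = [1, 2, 3]; // placeholder list of requests, as in the answer
    const requests = callsToMake.map(function () {
        return $.ajax({
            method: 'POST',
            url: myurl, // assumed to be defined, as in the question
            data: JSON.stringify(jsonData),
            dataType: 'json',
            contentType: 'application/json'
        }).catch(function (error) {
            return error; // keep failures in the results instead of aborting
        });
    });
    return Promise.all(requests);
}
Promise.all also resolves with the results in the same order the requests were created, which the counter-based version does not guarantee.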

How to send Synchronous Requests with $http of Angular.js

Hey guys, I know this issue has been posted a lot, but nothing has helped me, which is why I am asking this question. The issue is that I am facing a problem sending a synchronous request to PHP.
Here is my model function which sends the request.
State.pushData = function () {
    $http({
        method: 'POST',
        url: 'pushData.php?action=pushdata',
        data: {'status': 'push', 'email': State.campemail},
        headers: {'Content-Type': 'application/x-www-form-urlencoded'}
    }).success(function(response) {
        if (response.error) {
            console.log(response.error);
            return;
        }
        State.getCartData();
        State.selectedItems = [];
    });
}
This pushData function sends a POST request to the defined URL and fetches a response. The code is supposed to execute the State.getCartData() function on success of the request sent initially, but it is not working that way: both requests execute at once.
I have tried $http with the .post and .then methods, but with the same result, like this:
State.pushData = function () {
    $http.post('pushData.php?action=pushdata',
        {'status': 'push', 'email': State.campemail}
    ).then(function(response) {
        if (response.error) {
            console.log(response.error);
            return;
        }
        State.getCartData();
        State.selectedItems = [];
    });
}
I want the requests to be sequenced, so that getCartData() executes only after the pushData request completes. Please share your experience on this. Thanks in advance.
I got an answer to my question after some brainstorming. I return the $http promise from my model and call .then() on the returned promise. It works as I wanted: the second request is sent only once the first has completed successfully. Here is my model function:
State.pushData = function () {
    return $http.post('pushData.php?action=pushdata',
        {'status': 'push', 'email': State.campemail}
    );
}
In the above function I just send the POST request to the server and return the resulting promise to the controller, which handles it right after the model returns. Here is my controller function:
scope.pushIt = function() {
    var responseObj = State.pushData();
    responseObj.then(
        function() { // success callback
            /* Business logic */
            State.getCartData();
            State.selectedItems = [];
        },
        function() { // error callback
            /* Business logic */
        }
    );
}
The beauty of this approach is that you can chain as many .then() calls as you want; they will all execute one by one, in order.
scope.pushIt = function() {
    var responseObj = State.pushData();
    responseObj.then(
        function() { // success callback
            /* Business logic */
        },
        function() { // error callback
            /* Business logic */
        }
    ).then(
        function() { // success callback
            /* Business logic */
        },
        function() { // error callback
            /* Business logic */
        }
    );
}
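One detail worth knowing when chaining like this: whatever the first success callback returns is passed to the next .then(), and returning another $http promise makes the chain wait for that request as well. A sketch under the assumption (not confirmed in the post) that State.getCartData() also returns an $http promise:
// Sketch: passing work down the chain. Assumes State.getCartData()
// returns a promise; this is an assumption, not from the original post.
scope.pushIt = function() {
    State.pushData()
        .then(function() {
            return State.getCartData(); // the chain waits for this request
        })
        .then(function(cartResponse) {
            State.selectedItems = [];
        });
};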

How to successfully use the callback on an javascript function with an http request

Note: I have limited exp with js so correct me if my I'm completely wrong in how I'm describing this scenario.
I have two javascript files. I am calling a function on the first file (client side) which calls a function on the second file and uses the callback from the second file's function for the purposes of response.success/.error on the first file.
If that doesn't make sense here is some code:
Note: this is being done temporarily using Parse's cloud functions. Let me know if more information is needed regarding those but not sure if it's important.
First file:
Parse.Cloud.define("methodName", function(request, response) {
...
secondFile.myFunction(param1, {
stuff: request.params.stuff,
}, function (err, res) {
if (err) {
response.error(err);
} else {
response.success(res);// I'm assuming this is going to the hardcoded "yes." from httpRequest on second file's function
}
});
});
Second File:
myFunction: function(param1, properties, callback) {
    if (!param1) return callback(new Error("Helpful error message"));
    var headersForReq = {
        ...
    };
    var bodyForReq = ...; // the properties properly parsed
    Parse.Cloud.httpRequest({
        method: 'PUT',
        url: ...,
        headers: headersForReq,
        body: bodyForReq,
        success: function (httpResponse) {
            callback(null, 'yes'); // the hardcoded "yes" I referred to
        },
        error: function (httpResponse) {
            callback(httpResponse.status + httpResponse.error);
        }
    });
}
On the client, the code is treated as a success (errors aren't thrown or returned), but when I print out the value it comes across as (null), not "yes".
What's going on here? (Side note: httpRequest is currently not doing anything useful; it's hard to verify whether the request is being sent properly because it goes to a third-party API.)
I do know the second file's method is being called properly, though, so it's not a silly issue with module.exports or var secondFile = require('\path\secondFile').
I think you are just misusing the API.
Rewrite it in the style of the example in the docs:
https://parse.com/docs/js/api/classes/Parse.Cloud.html#methods_httpRequest
Parse.Cloud.httpRequest({
    method: 'PUT',
    url: ...,
    headers: headersForReq,
    body: bodyForReq
}).then(function (httpResponse) {
    callback(null, 'yes'); // the hardcoded "yes" I referred to
}, function (httpResponse) {
    callback(httpResponse.status + httpResponse.error);
});
I think the following will work, too:
Parse.Cloud.httpRequest({
    method: 'PUT',
    url: ...,
    headers: headersForReq,
    body: bodyForReq
}, {
    success: function (httpResponse) {
        callback(null, 'yes'); // the hardcoded "yes" I referred to
    },
    error: function (httpResponse) {
        callback(httpResponse.status + httpResponse.error);
    }
});
BTW, if you are using the open-source parse-server, you can use request or request-promise. These two npm packages are used by many people. (Parse.Promise is not an ES6-style promise.)
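For example, the same PUT could be written with request-promise roughly like this. A sketch only: the URL is a placeholder, and headersForReq/bodyForReq are the values already built in the second file.
// Rough sketch using request-promise instead of Parse.Cloud.httpRequest
// (only applies to the open-source parse-server / Node environment).
var rp = require('request-promise');

rp({
    method: 'PUT',
    uri: 'https://third-party.example.com/endpoint', // placeholder URL
    headers: headersForReq,
    body: bodyForReq,
    json: true // serialize the body and parse the response as JSON
}).then(function (res) {
    callback(null, 'yes');
}).catch(function (err) {
    callback(err);
});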

Turn several ajax requests into Observables with RxJS

I'm struggling with something, which I'm guessing means I've misunderstood and am doing something silly.
I have an observable and need to use it to create an object, send that to the server for processing, combine a result from the server with the object I sent, and then turn that into an observable. So what I want to do (I think) is something like:
var theNewObservable = my.observable.things.select(function(thing) {
    var dataToSend = generateMyJavascriptObjectFrom(thing);
    var promise = $.ajax({
        type: 'POST',
        url: 'http://somewhere.com',
        data: dataToSend
    }).promise();
    return rx.Observable.fromPromise(promise).subscribe(function(data, status, jqXHR) {
        var infoFromServer = jqXHR.getResponseHeader('custom-header-returned');
        // I'm wanting this to be the thing other code can subscribe to
        return { infoFromServer: dataToSend };
    }, function(err) {
        alert('PC LOAD LETTER!');
        console.error(err);
    });
});
theNewObservable.subscribe(function(combinedInfo) { console.log(combinedInfo); });
Where I'm expecting { infoFromServer: dataToSend }, I'm getting an AutoDetachObserver, and I can see that it has an onNext with the AJAX onSuccess signature, so I'm obviously doing something silly.
A couple of things that should help a bit:
1) The subscribe method is a terminal method: it is where the Observer attaches, so there should be no further data propagation after the subscribe.
2) The onNext handler passed to subscribe only receives a single value, so you will need to wrap all the message data in that one value.
Since jQuery's Promise will not behave well with this, you have two options. First, you can use the RX-DOM project for an Observable ajax version. Or you will need to wrap the promise method. If you further need to wait on the response, you should be using selectMany instead, which lets you fire off the promise, then await its return and map the response back to the original request.
var theNewObservable = my.observable.things
    // Preprocess this so that `selectMany` will use
    // dataToSend as the request object
    .map(function(thing) { return generateMyJavascriptObjectFrom(thing); })
    .selectMany(function(dataToSend) {
        var promise = $.ajax({
            type: 'POST',
            url: 'http://somewhere.com',
            data: dataToSend
        }).promise();
        // Rewrap this into a promise that RxJS can handle
        return promise.then(function(data, status, jqXHR) {
            return { data: data, status: status, jqXHR: jqXHR };
        });
    }, function(request, response) {
        return {
            infoFromServer: response.jqXHR.getResponseHeader('custom-header'),
            dataToSend: request
        };
    });
theNewObservable.subscribe(
    function(combinedInfo) {
        console.log(combinedInfo);
    },
    function(err) {
        alert('PC LOAD LETTER!');
        console.error(err);
    });

Using custom function as parameter for another

I'm currently dealing with refactoring my code, and trying to automate AJAX requests as follows:
The goal is to have a context-independent function to launch AJAX requests. The data gathered is handled differently based on the context.
This is my function:
function ajaxParameter(routeName, method, array, callback) {
    // Ajax request on silex route
    var URL = routeName;
    $.ajax({
        type: method,
        url: URL,
        beforeSend: function() {
            DOM.spinner.fadeIn('fast');
        }
    })
    .done(function(response) {
        DOM.spinner.fadeOut('fast');
        callback(response);
    })
    .fail(function(error) {
        var response = [];
        response.status = 0;
        response.message = "Request failed, error : " + error;
        callback(response);
    });
}
My problem essentially comes from the fact that my callback function is not defined.
I would like to call the function like this (example):
ajaxParameter(URL_base, 'POST', dataBase, function(response) {
    if (response.status == 1) {
        console.log('Request succeeded');
    }
    showMessage(response);
});
I thought of returning the response into a variable and dealing with it later, but if the request fails or is slow, this won't work (because the response will not have been set yet).
That version would allow me to benefit from .done() and .fail().
EDIT: So there is no mistake, I have changed my code a bit. The goal is to be able to deal with a callback function used in both the .done() and .fail() contexts (two separate functions would also work in my case, though).
As far as I can see there really is nothing wrong with your script. I've neatened it up a bit here, but it's essentially what you had before:
function ajaxParameter(url, method, data, callback) {
    $.ajax({
        type: method,
        url: url,
        data: data,
        beforeSend: function() {
            DOM.spinner.fadeIn('fast');
        }
    })
    .done(function (response) {
        DOM.spinner.fadeOut('fast');
        if (callback)
            callback(response);
    })
    .fail(function (error) {
        var response = [];
        response.status = 0;
        response.message = "Request failed, error : " + error;
        if (callback)
            callback(response);
    });
}
And now let's go and test it here on JSFiddle.
As you can see (using the JSFiddle AJAX API), it works. So the issue is probably with something else in your script. Are you sure the script you've posted here is the same one you are using in your development environment?
Regarding your error: be absolutely sure that you are passing the right arguments, in the right order, to your ajaxParameter function. Here's what I am passing in the fiddle (put together as a call in the sketch after this list):
the url endpoint (e.g http://example.com/)
the method (e.g 'post')
some data (e.g {foo:'bar'})
the callback (e.g function(response){ };)
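A minimal sketch of that call, using the illustrative values from the list above rather than the exact fiddle code:
// Sketch: the example call described above, with placeholder values.
ajaxParameter('http://example.com/', 'post', { foo: 'bar' }, function (response) {
    console.log(response);
});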
Do you mean something like this, passing the success and fail callbacks:
function ajaxParameter(routeName, method, array, success, failure) {
    // Ajax request on silex route
    var URL = routeName;
    $.ajax({
        type: method,
        url: URL,
        beforeSend: function () {
            DOM.spinner.fadeIn('fast');
        }
    }).done(function (response) {
        DOM.spinner.fadeOut('fast');
        success(response);
    }).fail(function (error) {
        var response = [];
        response.status = 0;
        response.message = "Request failed, error : " + error;
        failure(response);
    });
}
Called like:
ajaxParameter(
    URL_base,
    'POST',
    dataBase,
    function(response) {
        // success function
    },
    function(response) {
        // fail function
    }
);
