How to clean up source code with a nested callback-in-callback structure - javascript

For example, in this case I would like to call url1, then url2, then url3, step by step, each call waiting for the previous callback to finish.
This is my source code:
var param = {};
var url = "http://myapiserver.com/api/test";
var client = Ti.Network.createHTTPClient({
    onload : function(e) {
        Ti.API.info("first call is success!!");
        var param2 = {};
        var url2 = "http://myapiserver.com/api/test2";
        var client2 = Ti.Network.createHTTPClient({
            onload : function(e) {
                // call URL 3 here!!
                Ti.API.info("second call is success!!");
            },
            timeout : 3000
        });
        client2.open("GET", url2);
        client2.setRequestHeader('Content-type','charset=utf-8');
        client2.send(param2);
    },
    timeout : 3000
});
client.open("GET", url);
client.setRequestHeader('Content-type','charset=utf-8');
client.send(param);
This source code works, however, if you need more steps the nesting gets deeper and deeper and becomes difficult to maintain.
Is there a good way to clean up this kind of source code?

You should take advantage of the CommonJS format to have an http.js file that receives the parameters and callbacks. That way you avoid repeating the creation of the HTTPClient object over and over again.
Alloy includes underscore.js out of the box. If you want to run multiple async calls at the same time you can use something like http://underscorejs.org/#after
If you need to do subsequent calls you may want to organize your code in a way that avoids callback hell. There are plenty of docs already about different ways to address that; here is a sample from an old blog post.
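For example, a minimal sketch of such a module (the file name http.js and the option names used here are illustrative assumptions, not taken from the original code):
// http.js (CommonJS module, sketch only)
exports.request = function (options) {
    var client = Ti.Network.createHTTPClient({
        onload: function (e) {
            // hand the response text back to the caller
            options.onLoad && options.onLoad(this.responseText);
        },
        onerror: function (e) {
            options.onError && options.onError(e);
        },
        timeout: options.timeout || 3000
    });
    client.open(options.method || "GET", options.url);
    client.setRequestHeader('Content-type', 'charset=utf-8');
    client.send(options.params || {});
};
With that in place each step becomes a single call:
var http = require('http'); // or require('/http'), depending on your project layout
http.request({
    url: "http://myapiserver.com/api/test",
    params: {},
    onLoad: function (text) {
        Ti.API.info("first call is success!!");
        // kick off the second call here
    }
});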

You can try the following. It is readable and easy to maintain.
var createClients = function (connections) {
    var connection = connections.shift(),
        url = connection['url'],
        param = connection['param'],
        client = Ti.Network.createHTTPClient({
            onload: function (e) {
                Ti.API.info("Call is success!");
                connections[0] && createClients(connections);
            },
            timeout: 3000
        });
    client.open("GET", url);
    client.setRequestHeader('Content-type','charset=utf-8');
    client.send(param);
};
var connections = [
    {'url': "http://myapiserver.com/api/test1", 'param': {}},
    {'url': "http://myapiserver.com/api/test2", 'param': {}},
    {'url': "http://myapiserver.com/api/test3", 'param': {}}
];
createClients(connections);
Note that you will end up with an empty connections array (shift removes each item as it is processed).
Also, you can use connections to keep the HTTPClient instance for each connection, or extend each connection (for example) with some specific callbacks, as sketched below.
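A minimal sketch of that idea (the per-connection onload key is an assumption for illustration, not part of the answer above):
var createClients = function (connections) {
    var connection = connections.shift(),
        client = Ti.Network.createHTTPClient({
            onload: function (e) {
                // run the callback attached to this particular connection, if any
                connection.onload && connection.onload(this.responseText);
                connections[0] && createClients(connections);
            },
            timeout: 3000
        });
    client.open("GET", connection.url);
    client.setRequestHeader('Content-type','charset=utf-8');
    client.send(connection.param);
};
var connections = [
    {url: "http://myapiserver.com/api/test1", param: {}, onload: function (text) { Ti.API.info("first response: " + text); }},
    {url: "http://myapiserver.com/api/test2", param: {}}
];
createClients(connections);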

Related

Make a while loop delay repeating until ajax calls in it are complete

Before I explain what I want to do, here's an MCVE of what I'm coding:
$("#button").submit(function(event) {
event.preventDefault();
var formData = new FormData(this);
var myString = $('#textarea').val();
var myRegexp = /src="blob:([^'"]+)"/gm;
match = myRegexp.exec(myString);
var inProgress = 0;
while (match != null) {
var xhr = new XMLHttpRequest();
addr = match[1];
xhr.open('GET', 'blob:' + addr, true);
xhr.responseType = 'blob';
xhr.onload = function(e) {
if (this.status == 200) {
var myBlob = this.response;
var data = new FormData();
data.append('file', myBlob);
$.ajax({
url: "uploader.php",
type: 'POST',
data: data,
contentType: false,
processData: false,
beforeSend: function() {
inProgress++;
},
success: function(data) {
myString = myString.replace("blob:" + addr, data);
},
error: function() {
// error
},
complete: function() {
--inProgress;
}
});
} else {
// error
}
};
xhr.send();
match = myRegexp.exec(myString);
}
if (!inProgress) {
formData.set('textarea', myString);
submitForm(formData);
}
});
So, I have a text area and it contains an unknown number of BLOB objects. I first try to find these BLOB objects using regexp and then I upload them to the server using a PHP file called uploader.php. Once the file is uploaded, it will return the URL of the uploaded file and I want to replace the BLOB URL by the URL of the uploaded file in the text area and then submit the final content of the textarea to the server for further processing.
It turns out that when I run the code, only the last instance of the regexp is replaced by its uploaded URL. The others remain as they were. I suspect that this is because the while loop does not wait for the ajax requests to be complete. I had a similar problem when trying to submit the form and I solved it by following the suggestions in this answer but I don't know how to fix this issue this time.
Any idea is appreciated
Update: I tried to make ajax work synchronously but my browser said that it was deprecated and it didn't work.
It seems you don't really need it to be synchronous (and I can't see a case when it's better to make an async action look synchronous), but rather only need it to be sequential.
It is possible to make async actions sequential by using callbacks (which could also be rewritten with Promises, and in turn with async/await, but I'll keep it simple):
// myString is made global for simplicity
var myString;
function uploadBlob(myBlob, addr, callback) {
var data = new FormData();
data.append('file', myBlob);
$.ajax({
url: "uploader.php",
type: 'POST',
data: data,
contentType: false,
processData: false,
success: function(data) {
// file uploaded OK, replace the blob expr by the uploaded file url(data)
myString = myString.replace("blob:" + addr, data);
callback();
},
error: function() {
// error, the uploaded most likely failed, we leave myString alone
// or alternatively replace the blob expr by empty string
// because maybe we dont want to post blob in the final form submit
// uncomment if needed
// myString = myString.replace("blob:" + addr, "");
callback();
}
});
}
function getBlobAndUpload(addr, callback) {
var xhr = new XMLHttpRequest();
xhr.open('GET', 'blob:' + addr, true);
xhr.responseType = 'blob';
xhr.onload = function(e) {
if (this.status == 200) {
var myBlob = this.response;
uploadBlob(myBlob, addr, callback);
} else {
// error, but callback anyway to continue processing
callback();
}
};
xhr.send();
}
function processAddresses(addresses, callback, current) {
var index = current || 0;
// all addresses processed?
if (index >= addresses.length) {
// yes no more address, call the callback function
callback();
} else {
var addr = addresses[index];
// once the get/upload is done the next address will be processed
getBlobAndUpload(addr, function() {
processAddresses(addresses, callback, index + 1);
});
}
}
$("#button").submit(function(event) {
event.preventDefault();
var formData = new FormData(this);
var addresses = [];
// initialize both localString and myString to the content of the textArea
// localString will be used to extract addresses,
// while myString will be mutated during the upload process
var localString = myString = $('#textarea').val();
var myRegexp = /src="blob:([^'"]+)"/gm;
match = myRegexp.exec(localString);
// collect all addresses first
while (match != null) {
addr = match[1];
addresses.push(addr);
match = myRegexp.exec(localString);
}
// initiate sequential processing of all addresses, and
// pass the callback function triggering the form submit
processAddresses(addresses, function() {
// all the successfully uploaded blob exprs in myString should
// now be replaced by the remote file url (see the commented part
// in uploadBlob's error handler for a variation of this feature)
formData.set('textarea', myString);
submitForm(formData);
});
});
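As noted above, the same sequential flow can also be written with Promises and async/await. A minimal sketch, reusing getBlobAndUpload from the code above:
// wrap the callback-style helper in a Promise (sketch only)
function getBlobAndUploadAsync(addr) {
    return new Promise(function (resolve) {
        getBlobAndUpload(addr, resolve);
    });
}
async function processAddressesAsync(addresses) {
    for (var i = 0; i < addresses.length; i++) {
        // each get/upload finishes before the next address is processed
        await getBlobAndUploadAsync(addresses[i]);
    }
}
// inside the submit handler, instead of processAddresses(addresses, callback):
// processAddressesAsync(addresses).then(function () {
//     formData.set('textarea', myString);
//     submitForm(formData);
// });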
So, I said in the comments that you could use async/await, and gave links. Now I am going to try to show you how to work with Promises and XMLHttpRequest.
First thing: I would use my own 'library' (not really a library, just 3 new commands) called PromiseReq, which wraps XMLHttpRequest in Promises.
You would need two functions from it:
sendToServer(config, data) and getServerFile(config). They do what their names imply (sendToServer is not so good at the moment, but I will improve it later). They simply return Promises and work in a very easy way. Code # Github
BUT it was designed for my own use only, so it is not very flexible (although I hope to improve it sometime).
So we need to learn how to use Promises.
First you need to know what a Promise is and why we use it.
Then you can create one like this:
let pmq = new Promise((res,rej)=>{
// PROMISE BODY HERE
});
Here is the first warning: Promises made this way don't support return as a way to resolve the Promise! You have to use res()!
Some functions just return them (such as fetch()), and we can handle them right after calling the function.
Now pmq will be our promise.
You can use pmq.then(callback) to handle what will happen when res() is called somewhere in the promise body, and pmq.catch(callback) to handle what happens when rej() is called. Remember that .catch(cb) and .then(cb) return a Promise, so you can safely chain more than one .then(), add .catch() at the end, and it will handle rejection from every one of the .then()s.
For example:
pmq = fetch("file.txt");
pmq.then(e=>e.json()).then(console.log).catch(console.error);
There is a big note here:
the order in which these events will fire.
If, for example, rP() waits 1s, then logs "A", then resolves, this code:
let a = rP();
a.then(_=>console.log("B")).catch(console.error);
console.log("C");
will result in:
C
A
B
Because of this, async/await is needed to control the order.
To do so you have to make an async function with the async keyword.
let fn = async ()=>{}
Here is the second thing: async functions ALWAYS return a Promise. And that is the second way you can create a Promise; you just don't use res()/rej(), only return and throw.
Now, inside fn(), we can call:
let a = await rP().then(_=>console.log("B")).catch(console.error);
console.log("C");
and we will get our
A
B
C
Now, how to use it with XMLHttpRequest?
You need to create a new Promise with a simple XMLHttpRequest inside:
let xmpl = (type, path, data) => new Promise((res, rej) => {
let xhr = new XMLHttpRequest();
xhr.open(type, path, true); // true means it is an asynchronous call
//...
xhr.send(data);
});
And now when to resolve?
XMLHttpRequest has useful properties and events. The one that is best for our case is onreadystatechange.
You can use it like so:
xhr.onreadystatechange = _=>{
if(xhr.readyState === 4 && xhr.status === 200) // Everything went smoothly
res(xhr.responseText);
else if(xhr.readyState === 4 && xhr.status !== 200) // Something went wrong!
rej(xhr.status);
}
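Putting the skeleton and the handler together, the whole helper could look like this (a sketch using the xmpl name from above):
let xmpl = (type, path, data) => new Promise((res, rej) => {
    let xhr = new XMLHttpRequest();
    xhr.open(type, path, true); // asynchronous call
    xhr.onreadystatechange = _ => {
        if (xhr.readyState === 4 && xhr.status === 200)
            res(xhr.responseText); // everything went smoothly
        else if (xhr.readyState === 4 && xhr.status !== 200)
            rej(xhr.status); // something went wrong
    };
    xhr.send(data);
});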
And then to get data you can either
Async/Await
// INSIDE ASYNC FUNCTION
let resData = await xmpl("GET", "path.txt", null).catch(console.error);
.then()
let resData;
xmpl("GET", "path.txt", null).then(r=>{
resData = r;
// REST OF WHOLE FUNCTION TO ENSURE THAT resData HAS BEEN SET
})
.catch(console.error);
You can also send data with xmpl().
xmpl("POST", "observer.php", "Data to send to observer.php!")
.then(whatToDoAfterSendFN);
/*
to get such data in PHP you have to use
$post = file_get_contents('php://input');
*/
I know that this answer is a bit messy and stuff, but I didn't have any idea how to write it :P Sorry.

Multiple HTTP Requests to a Single File

Currently, I am attempting to call a single PHP file named 'sniffer.php'.
I am doing this asynchronously from JavaScript to a PHP file. The issue I am currently having is that I added a random sleep to the PHP code (to act like a page loading). The problem is that when two or more functions call that page, the second request still waits until the first one finishes before it starts. E.g. one sleeps for 5 seconds and the other for 6 seconds: the first one completes in 5 seconds and the next one finishes at 11 seconds. What I am looking for is that the first finishes in 5 seconds and the next one finishes one second after that. I am not sure if it's just sleep causing the issue or if the file is 'locked' because of the sleep.
Thanks for any help/feedback.
My PHP File looks like this:
$html = '';
$c = rand(2,10);
sleep($c);
$html .= $c;
echo json_encode(array('html'=>$html,'status'=>1));
exit;
My javascript class looks like this:
var path = '/';
var polling = {
add : function(name, obj) {
this[name] = new xAjax(obj);
return this;
}
};
function xAjax(options) {
var consti = {
};
var defaults = {
url: path + 'sniffer.php',
method: 'POST',
responseType: 'json',
async: true,
timeout: 30000,
success: function(response) {console.log(response);},
done: function() {},
beforeSend: function() {},
error: function(e) {console.log(e);},
abort: function() {}
};
var settings = Object.assign({}, defaults, options, consti);
var xhr = null;
this.run = function() {
xhr = new XMLHttpRequest();
settings.beforeSend();
xhr.responseType = settings.responseType;
xhr.open(settings.method, settings.url, settings.async);
xhr.timeout = settings.timeout;
xhr.onreadystatechange = function() {
if ( xhr.readyState == XMLHttpRequest.DONE ) {
if ( xhr.status === 200 ) {
settings.success(xhr.response);
} else {
settings.error(xhr.response);
}
settings.done();
xhr = null;
}
};
xhr.send();
return this;
};
this.abort = function() {
xhr.abort();
xhr = null;
settings.abort();
return this;
};
this.isRunning = function() {
return xhr != null;
};
this.set = function(options) {
settings = Object.assign({}, defaults, options, consti);
return this;
};
}
My creation/call to the sniffer.php:
polling.add('x');
polling.x.run();
polling.add('y');
polling.y.run();
This is happening because of sessions. It would happen with or without sleep, if the script takes time.
What happens when you start a session? PHP has to make sure the session data is current and that it will not change; it also has to make sure that the data it changes will be available to subsequent executions.
So if a script tries to open the session while it is open somewhere else, there's a lock, because the first script might very well change the session information still. Once the first script closes the session, the next script can get a hold of it and go on.
You can call session_write_close() to close the session for write, and thus remove the lock. While the session is closed, its value can still be accessed but it will be the value before any subsequent script changed anything (if your second script changes something before first script ends, it will not be known). Also, if you write new data to the session, it will not be saved...
From the documentation:
Session data is usually stored after your script terminated without the need to call session_write_close(), but as session data is locked to prevent concurrent writes only one script may operate on a session at any time.
Also it seems like you are not the only one:
You can have interesting fun debugging anything with sleep() in it if you have a session still active. For example, a page that makes an ajax request, where the ajax request polls a server-side event (and may not return immediately).
If the ajax function doesn't do session_write_close(), then your outer page will appear to hang, and opening other pages in new tabs will also stall.

Node.js request - handling multiple POST requests

I use the request library to communicate with other servers via API. But now I need to send multiple (10 or more) POST requests at the same time and move on only if all responses are correct. Usually the syntax looks a bit like this:
var options = {
url: "",
method: "POST",
header: {...},
body: {...}
};
request(options, function(err,response,body)
{
});
But now I've got an array of objects instead of a single options variable. Is there a way to do this? Or maybe there is another library able to handle the issue.
EDIT:
var arrayOfIds = [];
const requests = [];
for(var i in range){
var options = {} // here goes all bodies and headers to send
requests.push( // push a request to array dynamically
request(options, function(err,response,body){
if(!err && response.statusCode == 201){
arrayOfIds.push(body.id);
}
}));
}
Promise.all(requests)
.then(function(res){
console.log(arrayOfIds); // this is empty
});
There are several approaches to solve this:
async library, method parallel
Promise.all
To switch your requests to promises, use the request-promise module in addition to request. In code it will look like this:
const request = require('request-promise');
// Note, you don't assign callback here
const promises = [
request.post({...}),
request.post({...}),
request.post({...})
];
// And then you simply do Promise.all
Promise.all(promises).then(console.log);
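Applied to the code from the question, the ids can then be collected from the resolved values rather than inside per-request callbacks. A sketch, assuming as in the question that each successful response body carries an id (json: true makes request parse the body):
const request = require('request-promise');
const requests = [];
for (var i in range) {
    var options = { json: true /* plus the url, headers and body as before */ };
    requests.push(request.post(options));
}
Promise.all(requests).then(function (bodies) {
    // runs only after every request has resolved
    var arrayOfIds = bodies.map(function (body) { return body.id; });
    console.log(arrayOfIds); // no longer empty
});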

Node.js minimal function for parsing route

I have a Node.js / Express app working, that receives routes like so:
app.get('/resource/:res', someFunction);
app.get('/foo/bar/:id', someOtherFunction);
This is great and works fine.
I am also using Socket.IO, and want to have some server calls use websockets instead of traditional RESTful calls. However, I want to make it very clean and almost use the same syntax, preferably:
app.sio.get('/resource/:res', someFunction);
This will give a synthetic 'REST' interface to Socket.IO, where, from the programmer's perspective, he isn't doing anything different. Just flagging websockets: true from the client.
I can deal with all the details, such as a custom way to pass in the request verbs and parse them and so on; I don't have a problem with this. The only thing I am looking for is some function that can parse routes like Express does, and route them properly. For example,
// I don't know how to read the ':bar',
'foo/:bar'
// Or handle all complex routings, such as
'foo/:bar/and/:so/on'
I could dig in really deep and try to code this myself, or try to read through all of Express' source code and find where they do it, but I am sure it exists by itself. I just don't know where to find it.
UPDATE
robertklep provided a great answer which totally solved this for me. I adapted it into a full solution, which I posted in an answer below.
You can use the Express router class to do the heavy lifting:
var io = require('socket.io').listen(...);
var express = require('express');
var sioRouter = new express.Router();
sioRouter.get('/foo/:bar', function(socket, params) {
socket.emit('response', 'hello from /foo/' + params.bar);
});
io.sockets.on('connection', function(socket) {
socket.on('GET', function(url) {
// see if sioRouter has a route for this url:
var route = sioRouter.match('GET', url);
// if so, call its (first) callback (the route handler):
if (route && route.callbacks.length) {
route.callbacks[0](socket, route.params);
}
});
});
// client-side
var socket = io.connect();
socket.emit('GET', '/foo/helloworld');
You can obviously pass in extra data with the request and pass that to your route handlers as well (as an extra parameter for example).
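For instance, a small sketch of passing extra data through (the payload shape here is just an illustration):
// client-side: send a payload along with the url
socket.emit('GET', '/foo/helloworld', { token: 'abc' });
// server-side: forward the payload to the route handler as an extra argument
socket.on('GET', function(url, data) {
    var route = sioRouter.match('GET', url);
    if (route && route.callbacks.length) {
        route.callbacks[0](socket, route.params, data);
    }
});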
robertklep provided a great answer which totally solved this for me. I adapted it into a full solution, which is below in case others want to do something similar:
Node (server side):
// Extend Express' Router to a simple name
app.sio = new express.Router();
app.sio.socketio = require('socket.io').listen(server, { log: false });
// Map all sockets requests to HTTP verbs, which parse
// the request and pass it into a simple callback.
app.sio.socketio.sockets.on('connection', function (socket) {
var verbs = ['GET', 'POST', 'PUT', 'PATCH', 'DELETE'];
for (var i = 0; i < verbs.length; ++i) {
var go = function(verb) {
socket.on(verb, function (url, data) {
var route = app.sio.match(verb, url);
if (route && route.callbacks.length) {
var req = {url: url, params: route.params, data: data, socket:socket}
route.callbacks[0](req);
}
});
}(verbs[i]);
}
});
// Simplify Socket.IO's 'emit' function and liken
// it traditional Express routing.
app.sio.end = function(req, res) {
req.socket.emit('response', req.url, res);
}
// Here's an example of a simplified request now, which
// looks nearly just like a traditional Express request.
app.sio.get('/foo/:bar', function(req) {
app.sio.end(req, 'You said schnazzy was ' + req.data.schnazzy);
});
Client side:
// Instantiate Socket.IO
var socket = io.connect('http://xxxxxx');
socket.callbacks = {};
// Similar to the server side, map functions
// for each 'HTTP verb' request and handle the details.
var verbs = ['get', 'post', 'put', 'patch', 'delete'];
for (var i = 0; i < verbs.length; ++i) {
var go = function(verb) {
socket[verb] = function(url, data, cb) {
socket.emit(String(verb).toUpperCase(), url, data);
if (cb !== undefined) {
socket.callbacks[url] = cb;
}
}
}(verbs[i]);
}
// All server responses funnel to this function,
// which properly routes the data to the correct
// callback function declared in the original request.
socket.on('response', function (url, data) {
if (socket.callbacks[url] != undefined) {
socket.callbacks[url](data);
}
});
// Implementation example, params are:
// 1. 'REST' URL,
// 2. Data passed along,
// 3. Callback function that will trigger
// every time this particular request URL
// gets a response.
socket.get('/foo/bar', { schnazzy: true }, function(data){
console.log(data); // -> 'You said schnazzy was true'
});
Thanks for your help, robertklep!

Catch Facebook Access token on demand, but how?

I am building a Firefox extension and I'm using the Graph API. At the moment I catch the access token of each user when the browser starts, like:
https://stackoverflow.com/questions/10301146/facebook-login-within-a-firefox-add-on
This works fine but it is kind of stupid, because nobody will use the extension in every Firefox session. So what I'm trying to do is catch the access token, or more accurately, call the method Wladimir Palant recommends, on demand. My code looks like this, where getAccessToken() is the mentioned method.
onLoad: function (){
var NoteHandler = window.arguments[0];
var sjcl = NoteHandler.sjcl;
NoteHandler.getAccessToken();
decryptionDialog.noteHandler = NoteHandler;
decryptionDialog.sjcl = sjcl;
var currID = decryptionDialog.getID();
if(currID==""){
window.close();
return false;
}else{
http_request = new XMLHttpRequest();
http_request.open('Get', 'https://graph.facebook.com/'+currID+'/notes?access_token='+NoteHandler.token, false);
http_request.overrideMimeType("text/json");
http_request.send(null);
decryptionDialog.value = decryptionDialog.ResponseToArray(http_request.responseText);
....
But the problem is that while getAccessToken() is still waiting for the access token, the onLoad() method won't wait and goes on. Therefore NoteHandler.token is null when the request is sent. Does anyone have an idea? I'm relatively new to JavaScript.
You should rewrite this code to be asynchronous - it shouldn't assume that getAccessToken() will get the result immediately. Rather, there should be a callback parameter, a function to be called when the operation is done (it can be a closure). Something along these lines:
onLoad: function (){
var NoteHandler = window.arguments[0];
var sjcl = NoteHandler.sjcl;
NoteHandler.getAccessToken(function()
{
decryptionDialog.noteHandler = NoteHandler;
decryptionDialog.sjcl = sjcl;
...
http_request.open('Get', 'https://graph.facebook.com/'+currID+'/notes?access_token='+NoteHandler.token, false);
...
});
}
...
getAccessToken: function(callback) {
...
// All done - call the callback
callback();
}
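Since the graph request in the question is itself still synchronous (the false flag), it can be made asynchronous in the same style inside the callback passed to getAccessToken. A rough sketch (the onload handler is an assumption for illustration, not part of the original code):
NoteHandler.getAccessToken(function() {
    var currID = decryptionDialog.getID();
    var http_request = new XMLHttpRequest();
    http_request.open('GET', 'https://graph.facebook.com/' + currID +
        '/notes?access_token=' + NoteHandler.token, true); // true = asynchronous
    http_request.overrideMimeType("text/json");
    http_request.onload = function() {
        decryptionDialog.value =
            decryptionDialog.ResponseToArray(http_request.responseText);
        // continue with the rest of onLoad here
    };
    http_request.send(null);
});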
