The function is:
[1,2,3].map( function (item)
{
console.log(item);
//return 'something';
});
My expected behaviour is to get only 1 as output unless I uncomment
//return 'something'
But what I actually get is
1
2
3
What am I doing wrong?
UPDATE:
I am testing this with Node.js, and I really don't understand.
var async = require("async");
[1,2,3].map( function (item)
{
console.log(item);
//return 'something';
});
async.map([1,2,3], function (item,callback)
{
console.log(item);
//callback(null,true)
}, function (err,result)
{
console.log(result);
}
);
Both print the same output:
1
2
3
And I really would like to wait for a return or a callback before the next item is processed.
SOLVED
async.mapSeries([1,2,3], function (item,callback)
{
console.log(item);
//callback(null,true)
}, function (err,result)
{
console.log(result);
}
);
is the way to do it (note that callback must actually be invoked for mapSeries to move on to the next item).
Yes, map is synchronous.
It's a higher-order function that takes a function and applies it to each element of the given array.
Some people assume that because they pass a function as a parameter to map, it 'should' act like an event callback, but it doesn't. map simply applies that function to each element of the array, and only after it finishes does execution continue with the code after the map call.
As to your 'expected behavior' - it just doesn't work like you think ;)
"The map() method creates a new array with the results of calling a provided function on every element in this array."
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/map
The callback is called for each item, your logic runs, and its return value becomes the corresponding element of the new array.
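To illustrate, here is a minimal sketch (plain JavaScript, no libraries) showing that map runs synchronously and collects the return values:
var result = [1, 2, 3].map(function (item) {
    console.log('mapping', item); // runs immediately, one element at a time
    return item * 10;             // becomes the corresponding element of the new array
});
// This line only runs after map has visited every element.
console.log(result); // [10, 20, 30]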
I am new to Node.js and promises. I need to pass an extra argument into a callback function in my promise chain. It looks something like this:
var first = function(something) {
/* do something */
return something.toString();
}
var second = function(something, item) {
/* need to work with both the args */
}
And my promise chain looks like
function(item) {
/* the contents come from first callback and the item should be passed as an argument to the second callback */
fs.readFile(somefile).then(first).then(second)
}
I need to pass item as a parameter to second; can I do this without breaking my chain?
Please correct me if I am completely wrong.
Thanks
Wrap your second function in an anonymous function and pass the parameter in this way:
function(item) {
/* the contents come from first callback and the item should be passed as an argument to the second callback */
fs.readFile(somefile)
.then(first)
.then(firstResult => second(firstResult, item))
}
You can use Function.prototype.bind() to pass item to the second function:
fs.readFile(somefile).then(first)
    .then(second.bind(null, item))
Note that bind pre-fills item as the first argument, so second will be invoked as second(item, resultOfFirst); adjust its parameter order accordingly.
I am getting some JSON data from a remote URL, iterating over each item in the collection, and in the process creating new objects from the data and pushing them into a new array:
function getPoints(year) {
var heatmapdata = [];
var promise = $.getJSON('/api/GetHeatMapData?nodeid=#UmbracoContext.PageId&year=' + year);
promise.done(function (data) {
$.each(data, function (i, item) {
heatmapdata.push({ location: new google.maps.LatLng(item.Location.lat, item.Location.lng), weight: item.Weight });
});
});
return heatmapdata;
}
In my browser (Chrome), if I put a breakpoint on the return line, it says the array length is zero. In the browser console, if I call getPoints(2010), it returns []. If I put a breakpoint on the line below, I can see that items are being pushed into the array:
heatmapdata.push({ location: new google.maps.LatLng(item.Location.lat, item.Location.lng), weight: item.Weight });
I've been wrestling with this for a long time and I'm concerned I am missing something obvious.
As Dr. GIT alludes to, this is happening because $.getJSON is an asynchronous function.
When JavaScript encounters an asynchronous function, the steps inside it run in the background, so to speak, and JavaScript does not wait for those steps to complete before running the rest of the code in the block.
Basically, you can't use return with an asynchronous function.
Currently, your function is called, it fires off $.getJSON, and then it immediately returns heatmapdata, which is still an empty array. In the background, heatmapdata is being filled, but you don't do anything with it when it's finished.
There are several ways you could fix this. One would be to provide a callback function, as shown below, which will be called after your promise has been resolved. (You could also move the code into the .done function, use another promise, or use $.Deferred.)
function getPoints(year, callback) {
var heatmapdata = [];
var promise = $.getJSON('/api/GetHeatMapData?nodeid=#UmbracoContext.PageId&year=' + year);
promise.done(function (data) {
$.each(data, function (i, item) {
heatmapdata.push({
location: new google.maps.LatLng(item.Location.lat, item.Location.lng),
weight: item.Weight
});
});
callback(heatmapdata);
});
}
getPoints(year, function(heatmapdata ){
// do something with heatmapdata here
});
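Alternatively, following the parenthetical above about using another promise, you could return the promise itself and consume it with .then(). A minimal sketch under the same assumptions (same endpoint and data shape, and jQuery 1.8+ so the jqXHR's .then chains):
function getPoints(year) {
    // Return the promise produced by $.getJSON, mapped to the desired shape.
    return $.getJSON('/api/GetHeatMapData?nodeid=#UmbracoContext.PageId&year=' + year)
        .then(function (data) {
            return $.map(data, function (item) {
                return {
                    location: new google.maps.LatLng(item.Location.lat, item.Location.lng),
                    weight: item.Weight
                };
            });
        });
}
getPoints(2010).then(function (heatmapdata) {
    // do something with heatmapdata here
});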
I think it is because of the promise: the return statement runs before the promise has resolved.
I am working with a transactional framework in JavaScript, so I need to wait for the previous query to finish before I move on. For example...
// Writing it out explicitly like this won't work because the length is not static
var i = [1,2,3]
doSomething(i[0], function(){
    doSomething(i[1], function(){
        doSomething(i[2], function(){
            commitTransaction()
        })
    })
})
From this example I can't figure out a way to do this dynamically. It feels like a queue/recursion problem but I can't seem to crack it.
Does anyone else have an idea? I could also wrap these in promises, so that is an option as well, although it seems less synchronous.
Use async.eachSeries. Your code would translate to:
var transaction = {...};
async.eachSeries([1, 2, 3], function(value, callback) {
doSomething(value, transaction, callback);
}, function(err) {
if(err) throw err; // if there is any error in doSomething
commitTransaction(transaction);
});
I would suggest building a queue to do this. It takes the array, the generic callback function, and a final function to call back with. Basically, the best way to accomplish this is to let your functions expect to have values injected.
The core assumption is that the caller allows their callback function to have the current value and the next callback injected. That means we end up with a function I have named queueAll, which looks like this:
function queueAll(arr,cbIteration,final){
   // Seed the queue with the last step, which hands `final` to the iteration.
   var queue = [function(){ cbIteration(arr[arr.length-1],final) }];
   for(var i = arr.length-2; i > 0; i--){
      // Close over the current front of the queue as the "next" step.
      (function(next,i){
        queue.unshift(function(){ cbIteration(arr[i],next) });
      })(queue[0],i)
   }
   // Kick off with the first value; queue[0] is now the second step.
   cbIteration(arr[0],queue[0]);
}
It places the final call in the queue first, then iterates backwards, unshifting callback functions that close over the current value as well as over the front of the queue, which at that point is the next callback. It is fairly simple to use: pass it an array, a callback which expects values to be injected, and a final function.
In your case it would look like
queueAll(i,function(item,next){
doSomething(item,next);
},function(){
commitTransaction();
});
Stack Snippet Demo
//## <helper queue>
function queueAll(arr,cbIteration,final){
var queue = [function(){ cbIteration(arr[arr.length-1],final) }];
for(var i = arr.length-2; i > 0; i--){
(function(next,i){
queue.unshift(function(){ cbIteration(arr[i],next) });
})(queue[0],i)
}
cbIteration(arr[0],queue[0]);
}
//## </helper queue>
//## <user defined functions>
function doSomething(val,callback){
setTimeout(function(){
console.log(val);
callback();
},val*10);
}
function commitTransaction(){
console.log("commit");
}
//## </user defined functions>
//## <actual use>
var arr = [10,20,30];
queueAll(arr,function(item,next){
doSomething(item,next);
},function(){
commitTransaction();
});
//## </actual use>
Actually, I think promises are exactly what you're looking for. But for a traditional callback approach, consider the following:
var i = [1, 2, 3],
    index = 0,
    next = function () {
        /* stop once every value has been handled */
        if (index < i.length)
            doSomething(i[index++], next);
        else
            commitTransaction();
    };
next();
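For the promise route hinted at above, one common pattern is to wrap doSomething so it returns a promise and chain the calls with reduce, so each one waits for the previous. A sketch, assuming a native Promise is available and that doSomething's callback signals completion:
// Hypothetical promise wrapper around the callback-style doSomething.
function doSomethingAsync(value) {
    return new Promise(function (resolve) {
        doSomething(value, resolve);
    });
}

[1, 2, 3].reduce(function (chain, value) {
    return chain.then(function () {
        return doSomethingAsync(value);
    });
}, Promise.resolve()).then(function () {
    commitTransaction();
});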
I want to understand one thing about the async module in Node.js.
I have created a function that maps an object from a form to a model object and returns this object.
This object is a video with an array of tags.
My question is: where can I return the video? I know it should normally be inside the async callback function, but if I do that, the returned object is undefined.
Whereas if I return the video object at the end of the whole function, it works, but that isn't safe because I can't be sure the async call has finished...
By the way, I don't understand the callback function passed as an argument to async.each and called after video.products.push(tag);. What does this function do?
Regards
in my mapping.js :
exports.video = function(object) {
var video = new Video();
video.name = object.name;
video.products = [];
async.each(object.tags, function(tago, callback) {
tag = {
"name" : tago.name
}
video.products.push(tag);
callback();
} ,
function(err) {
if( err ) {
console.log('Error ' + err);
throw err;
}
logger.debug("into async" + video);
}
);
logger.debug("end function " );
//return video;
}
in my video.js :
var video = mapping.video(object);
logger.debug(video); // return undefined
The simple answer is that you can't - at least not via an easy or obvious approach. As its name suggests, async is a library for queuing up asynchronous function calls in the event loop. So your exports.video function simply kicks off a bunch of asynchronous functions, which execute one after another on an unpredictable time-frame, and then returns immediately. No matter where you try to return your video object within the function calls instantiated by async, the exports.video function will already have returned.
In this case it doesn't really seem like you need asynchronous function calls for what you're doing. I'd suggest that you replace your use of async with something like Underscore's each method, which executes synchronously, instead.
http://documentcloud.github.io/underscore/#each
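A minimal sketch of that synchronous approach, reusing the names from the question (with _.each standing in for the async.each loop):
var _ = require("underscore");

exports.video = function(object) {
    var video = new Video();
    video.name = object.name;
    video.products = [];
    _.each(object.tags, function(tago) {
        video.products.push({ name: tago.name });
    });
    // Safe to return here: _.each runs synchronously, so products is already filled.
    return video;
};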
You'd need to define a callback for your exports.video function, e.g.:
exports.video = function(object, callback) {
// video code (snip)...
async.each(object.tags,
function eachTag(tag, done) {
// code run for each tag object (snip)...
done();
},
function finished(err) {
// code run at the end (snip)...
callback(thingThatsReturned);
});
};
...and call it like this:
var videoUtils = require('videoUtils');
var tags = getTags();
videoUtils.video({ tags: tags }, function(thingThatsReturned) {
// do something with 'thingThatsReturned'
});
By the way, I don't understand the callback function passed as an argument
to async.each and called after video.products.push(tag);.
What does this function do?
The async.each function will call the 'eachTag' function above (2nd argument) for each item in your array. But because it's done asynchronously, and you might do something else async in the function (hit a database/api etc.), it needs to know when that function for that particular array item has finished. Calling done() tells async.each that the function has finished processing. Once all the functions are finished processing (they've all called done()), async.each will run the 'finished' function above (3rd argument).
This is pretty standard async stuff for Node.js, but it can be tricky to get one's head around at first. Hang in there :-)
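To make that concrete, here is a small sketch where each tag triggers something genuinely asynchronous (lookupTagInDb is hypothetical) and done() is only called once it completes; the final callback uses a node-style (err, result) signature here:
async.each(object.tags,
    function eachTag(tag, done) {
        // Hypothetical async lookup; done() fires in its callback.
        lookupTagInDb(tag.name, function(err, record) {
            if (err) { return done(err); }
            video.products.push({ name: record.name });
            done();
        });
    },
    function finished(err) {
        // Runs only after every tag has called done().
        callback(err, video);
    });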
Edit: It looks like your code isn't doing anything asynchronous. If it was, then the above code would be the way to do it, otherwise the following code would work better:
exports.video = function(object) {
// video code (snip)...
if (Array.isArray(object.tags)) {
object.tags.forEach(function eachTag(tag) {
// code run for each tag object (snip)...
});
}
return thingThatsReturned;
};
...and call it...
var videoUtils = require('videoUtils');
var tags = getTags();
var thingThatsReturned = videoUtils.video({ tags: tags });
I have a nested set of ajax calls, something like:
function getSubcategories(cat,callback) {
$.ajax({
url:'myurl.php',
data:'q='+cat,
dataType:'json',
success:function(result){ callback(result) }
});
}
function getSubcatElements(subcat,callback) {
$.ajax({
url:'myurl2.php',
data:'q='+subcat,
dataType:'json',
success:function(result){ callback(result) }
});
}
function organizeData(cat,callback) {
    getSubcategories(cat,function(res){
        totals=0;
        list=new Array();
        $.each(res['subcat'],function(key,val){
            getSubcatElements(val,function(items){
                $.each(items['collection'],function(key2,val2) {
                    list.push(val2['descriptor']);
                });
                totals+=items['count'];
                // If I shove "totals" and "list" into an object here to callback, it obviously gets called many times
            });
            // If I return an object here, it doesn't actually have counts from the asynchronous call above
        });
    });
}
function doStuff(cat) {
    organizeData(cat,function() {
        //stuff
    });
}
So I'm running a looped asynchronous query that's a child of another async query, and I want the final result of the child loop without being "lazy". Right now I have it just returning updated results, so the numbers change a few times, but I'd like to do it in one fell swoop.
It seems that the obvious place to do it would be to store the results in the async callback and return them after the $.each(), but JavaScript is insane and scoffs at things like obviousness. I feel like this should involve $.Deferred(), but the samples I found all seemed like they would trigger after the first iteration ...
(The functions are deliberately separated as there is sometimes reason to use only one or only the other).
Thanks in advance!
Right now, your approach is fine. I want to add the following changes to your code:
function organizeData(cat, callback) {
getSubcategories(cat, function(res) {
totals = 0;
list = new Array();
totalSubCatItem = res['subcat'].length;
currentSubCatItem = 0;
$.each(res['subcat'], function(key, val) {
getSubcatElements(val, function(items) {
$.each(items['collection'], function(key2, val2) {
list.push(val2['descriptor']);
});
totals += items['count'];
// If I shove "totals" and "list" into an object here to callback, obviously gets called many times
// Here the solution
currentSubCatItem++;
if(currentSubCatItem === totalSubCatItem){
callback(/** pass argument here **/)
}
});
// If I return an object here, it doesn't actually have counts from the asynchronous call above
});
})
}
function doStuff(cat) {
organizeData(cat, function( result) {
//stuff
console.log(result)
});
}
First, you should probably organize your database query on the server side so that it returns a single, multiplexed result rather than being called lots of times.
Barring that, and assuming you don't know how many sub-categories you're going to call until your category call returns, your best bet is to keep a counter that counts up every time you make a call and counts down every time a callback receives a result. Whenever a callback fires, decrements the counter, and the new count is zero, do your updates.
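A rough sketch of that counting approach, grafted onto the functions from the question (the shape of the object handed to the callback is just an assumption):
function organizeData(cat, callback) {
    getSubcategories(cat, function (res) {
        var totals = 0,
            list = [],
            pending = 0; // outstanding child requests

        $.each(res['subcat'], function (key, val) {
            pending++; // count up for each call we fire
            getSubcatElements(val, function (items) {
                $.each(items['collection'], function (key2, val2) {
                    list.push(val2['descriptor']);
                });
                totals += items['count'];
                pending--; // count down as each result arrives
                if (pending === 0) {
                    callback({ totals: totals, list: list });
                }
            });
        });
    });
}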