I'm trying to understand why, when I assign the results from an axios call to a variable, console logging that variable shows the complete array, yet logging its length returns zero.
As such, when I try to run a forEach on the results, there is no love to be had.
getNumberOfCollections() {
  let results = queries.getTable("Quality"); // imported function to grab an Airtable table
  console.log(results); // full array, i.e. ['bing', 'bong', 'boom']
  console.log(results.length); // 0
  results.forEach((result) => {
    // no love
  });
}
It is quite likely that the array is still empty at the moment you log it.
console.log(results); // full array, i.e. ['bing', 'bong', 'boom']
console.log(results.length); // 0
When console.log(results.length) runs, results.length evaluates to 0 right away, so it is effectively console.log(0), and that's why 0 is printed out.
When console.log(results) runs, the console holds a reference to the array and renders its contents later, by which time the array has been populated. (The console's rendering of objects is not synchronous; it shows the contents a little bit later on.)
You can try
console.log(JSON.stringify(results));
and you are likely to see an empty array, because JSON.stringify(results) evaluates the array immediately and turns it into a string at that moment, not later.
It looks like you are fetching some data. The correct way usually is by a callback or a promise's fulfillment handler:
fetch(" some url here ")
.then(response => response.json())
.then(data => console.log(data));
so you won't have the data until the callback or the "fulfillment handler" is invoked. If you console.log(results.length) at that point, you should get the correct length (the data is there).
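Applied to the original getNumberOfCollections, a minimal sketch (assuming queries.getTable returns a promise, as axios-based helpers typically do; getTable here is a stub standing in for the real function) would await the data before using it:

```javascript
// Stub standing in for queries.getTable: resolves to an array after a delay,
// much like an axios call would.
function getTable(name) {
  return new Promise((resolve) =>
    setTimeout(() => resolve(['bing', 'bong', 'boom']), 10)
  );
}

async function getNumberOfCollections() {
  const results = await getTable('Quality'); // wait for the data to arrive
  console.log(results.length); // the array is populated here
  results.forEach((result) => console.log(result)); // love to be had
  return results.length;
}

getNumberOfCollections().then((n) => console.log('count:', n));
```

Inside the async function, every line after the await runs only once the promise has resolved, so length and forEach both see the filled array.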
Related
I am supposed to access a subcollection called 'followers' for a mobile user; it contains the IDs of the followers.
Using each ID, I should get data about the follower from the mobile_user collection and add it to an array.
I can successfully iterate through the list of documents, but when using push it seems like I'm unable to return the full list of data back out of the loop.
Have a look at my current code:
Notice the two console logs: in the first one I can see the array getting filled with the information I want; in the second one the array is returned empty. I'm definitely missing whatever is needed for the array to be returned out of the loop.
I am fairly new to JS and any advice in the right direction would be appreciated.
const getFollowers = (data, context) => {
  let id = data.id
  const mobileUserRef = db.collection('mobile_user')
  return mobileUserRef.doc(id).collection('followers')
    .get()
    .then(function(doc) {
      var result = []
      doc.forEach(function(follower) {
        mobileUserRef.doc(follower.id).get()
          .then(function(followerdoc) {
            result.push({
              name: followerdoc.data().name
            })
            console.log(result)
          })
      })
      console.log(result)
      return result
    })
}
mobileUserRef.doc(follower.id).get() is asynchronous and returns immediately with a promise. The forEach loop will not wait for that promise to resolve before moving on to the next snapshot in the list. You should instead push each promise into an array, then use Promise.all on that array to wait for all the gets to complete before moving on. Then you can iterate those results and push them into another array to give to the caller.
See also:
How to use promise in forEach loop of array to populate an object
Node JS Promise.all and forEach
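A minimal sketch of that pattern, with a stubbed getFollowerDoc standing in for mobileUserRef.doc(id).get() (the Firestore calls themselves are assumed, not reproduced here):

```javascript
// Stub standing in for mobileUserRef.doc(id).get() -- resolves to a fake doc
// whose data() returns the follower's fields.
function getFollowerDoc(id) {
  return Promise.resolve({ data: () => ({ name: 'follower-' + id }) });
}

function getFollowers(followerIds) {
  // Map each id to a promise, then wait for all of them at once.
  const promises = followerIds.map((id) =>
    getFollowerDoc(id).then((doc) => ({ name: doc.data().name }))
  );
  // Promise.all resolves once every get() has completed, with results
  // in the same order as the ids.
  return Promise.all(promises);
}

getFollowers(['a', 'b']).then((result) => console.log(result));
```

Returning the Promise.all result from the .then callback means the caller's promise doesn't resolve until the array is fully populated, which is exactly what the original code was missing.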
I created this function in an Angular 4 app:
enrollmentCheck() {
  this.allCourses.forEach(course => {
    this._courses.getCurrentEnrolment(course.slug).subscribe(res => {
      if (res.length > 0) {
        this.enrolledCourses.push(res[0].course_id);
        console.log(this.enrolledCourses);
      }
    });
  });
  console.log(this.enrolledCourses);
}
It is supposed to iterate through an array of objects and check whether the user is enrolled in any of them.
The first bit works well; the subscription gives me the right data (res). I then need to store the property course_id in an array.
The first log (inside the loop), seems to work fine. I get
[1]
[1,2]
[1,2,5]
[1,2,5,7]
as outputs, one for each time the loop is executed.
Problem is that the second log (outside the loop), will output something like:
[
0: 1
1: 2
2: 5
3: 7
]
rather than
[1,2,5,7]
as I would like, for I will need to iterate through this array, and I cannot find a way to do it with the one I get.
Can anyone help? I apologise if this may seem a silly question to someone, but any help would be really appreciated.
Thanks,
M.
There are a few problems with your method. First of all, you're creating subscriptions inside a loop; that's a bad idea because you never complete them. Second, you're doing async operations inside the loop, so at the time the second console log runs, the data might not be there yet.
A better solution would be to use Observable.forkJoin to wait for all async requests and then map the data.
For example
enrollmentCheck() {
  Observable.forkJoin(
    this.allCourses.map(course => {
      return this._courses.getCurrentEnrolment(course.slug);
    })
  )
  .map(res => {
    return res
      .filter(enrollment => enrollment.length > 0)
      .map(enrollment => enrollment[0].course_id);
  })
  .subscribe(data => console.log(data));
}
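forkJoin behaves much like Promise.all: it waits for every inner observable to complete and emits a single array of results. A plain-promise sketch of the same shape (with getCurrentEnrolment stubbed out, since the real service isn't shown):

```javascript
// Stub for this._courses.getCurrentEnrolment(slug): resolves to an empty
// array when not enrolled, or [{ course_id }] when enrolled.
function getCurrentEnrolment(slug) {
  const enrolled = { algebra: 1, biology: 2 };
  return Promise.resolve(slug in enrolled ? [{ course_id: enrolled[slug] }] : []);
}

function enrollmentCheck(allCourses) {
  // One promise per course; Promise.all waits for all of them together.
  return Promise.all(allCourses.map((course) => getCurrentEnrolment(course.slug)))
    .then((res) =>
      res
        .filter((enrollment) => enrollment.length > 0)
        .map((enrollment) => enrollment[0].course_id)
    );
}

enrollmentCheck([{ slug: 'algebra' }, { slug: 'chemistry' }, { slug: 'biology' }])
  .then((ids) => console.log(ids)); // all ids arrive together, in order
```

The key difference from the original forEach version: nothing downstream runs until every request has completed, so the final array is guaranteed to be fully populated.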
Suppose there's an array called '_arr' from which I remove an item. Before removing it, though, I log it to the console. The problem is that the log shows the array as if the item had already been removed. I have reviewed the data system in the Polymer documentation and am still scratching my head.
Am I missing something on how data system works or I should be looking somewhere else for the cause?
EDIT: _arr is an array of strings and I am passing an event like:
this.fire('rmv-item' , {item: 'item content which is string'});
Here's the code
_removeItemFromArr: function(e) {
  const index = this._arr.indexOf(e.detail.item);
  console.log('array before removing item:', this._arr, index); // item doesn't exist
  if (index > -1) { this.splice('_arr', index, 1); }
  console.log('array after removing item:', this._arr, index); // item doesn't exist
},
The problem is that things are doing exactly what you say: the console logs the array. It most emphatically does not log the array "as it was at some specific time in the past"; it logs the array as it is when the log actually renders. And because that rendering is not synchronous, by the time the browser console writes the data out, you have already removed the item from the array, so what you see is what console.log sees when it finally kicks in.
If you want a true snapshot of what your array was "when you call log", don't log the array; log a copy of the array, which is guaranteed to be generated synchronously, using something like slice():
const index = this._arr.indexOf(e.detail.item);
console.log(`array before removing item at [${index}]: ${this._arr.slice()}`);
And job's a good'n.
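One caveat worth adding (my note, not part of the original answer): slice() makes a shallow copy, which is enough for an array of strings like _arr here. If the array held objects that get mutated later, the snapshot would need a deep copy, e.g.:

```javascript
const arr = [{ n: 1 }, { n: 2 }];

const shallow = arr.slice();                  // copies the array, not the objects
const deep = JSON.parse(JSON.stringify(arr)); // deep snapshot of plain data

arr[0].n = 99; // mutate after taking the copies

console.log(shallow[0].n); // 99 -- the shallow copy still sees the mutation
console.log(deep[0].n);    // 1  -- the deep snapshot is unaffected
```

For plain data the JSON round-trip is the simplest option; it won't preserve functions, undefined, or cyclic references.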
In my Parse backend I have an array that contains unique number codes, so users must not be able to get the same code twice. For that reason somewhere in a column of some table I am keeping an index to this array.
Now there is a very simple operation: users ask for a unique code. The cloud function increments the current value of the index and returns the value of the array at the new index. The problem is that, at first glance, the Parse JS API only performs the increment itself atomically, not the following read, since increment doesn't return a promise with the value that was set during THAT increment.
Now imagine the following scenario (pseudocode):
Field index has value 76, two users try to get the next code at the same time:
User1 -> increment('index') -> save -> then(obj1) -> return array[obj1.index]
User2 -> increment('index') -> save -> then(obj2) -> return array[obj2.index]
Now atomic increment will guarantee that after these 2 calls the index column will have value 78. But what about obj1 and obj2? If their value reading was not done atomically together with the increment operation, but was done through fetching after the increment was performed, then they both might have value 78! And the whole uniqueness logic will be broken.
Is there a way to get the atomic write operation result in Parse?
Increment does return the final value that was atomically incremented:
First the unit test to show how it is used:
fit('increment', (done) => {
  new Parse.Object('Counter')
    .set('value', 1)
    .save()
    .then(result => {
      console.log('just saved', JSON.stringify(result));
      return result
        .increment('value')
        .save();
    })
    .then(result => {
      console.log('incremented', JSON.stringify(result));
      expect(result.get('value')).toBe(2);
      done();
    })
    .catch(done.fail);
});
Behind the scenes, here's what's happening (if you're using Mongo; there's a similar path for Postgres):
In the MongoAdapter, the increment turns into a Mongo $inc operation, and the updated document is returned.
The Mongo documentation explaining $inc includes the note that "$inc is an atomic operation within a single document."
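To illustrate why getting the value back from the atomic operation itself matters (a plain-JS simulation of the race, not Parse code), compare increment-then-fetch with an increment that returns its own result:

```javascript
// Simulated counter with two APIs: an atomic increment that returns its own
// result, and a separate read -- like fetching the row after incrementing.
function makeCounter(start) {
  let value = start;
  return {
    incrementAndGet: () => ++value, // each caller gets its own distinct value
    read: () => value,              // whatever the counter holds right now
  };
}

// Two "users" increment first, then fetch afterwards: both can see 78.
const counter = makeCounter(76);
counter.incrementAndGet();
counter.incrementAndGet();
const fetched1 = counter.read(); // 78
const fetched2 = counter.read(); // 78 -- collision, uniqueness broken

// With the value taken from the increment itself, results are distinct.
const counter2 = makeCounter(76);
const got1 = counter2.incrementAndGet(); // 77
const got2 = counter2.incrementAndGet(); // 78

console.log(fetched1, fetched2, got1, got2);
```

This is exactly what the unit test above relies on: result.get('value') reads the value carried back by the save of the increment, not a separate fetch.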
I'm using node.js for a project, and I have this certain structure in my code which is causing problems. I have an array dateArr of sequential dates that contains 106 items. I have an array resultArr to hold resulting data. My code structure is like this:
function grabData(value, index, dateArr) {
  cassandra client execute query with value from dateArr {
    if (!err) {
      if (result has more than 0 rows) {
        process the query data
        push to resultArr
      }
      if (result is empty) {
        push empty set to resultArr
      }
    }
  }
}
dateArr.forEach(grabData);
I logged the size of resultArr after each iteration, and it appears that on some iterations nothing is being pushed to resultArr. The code completes with only 66 items stored in resultArr, when 106 should be stored, because the mapping between dateArr and resultArr is one to one.
"I logged the size of resultArr after each iteration"
When grabData is called, it starts a query against Cassandra. As Felix Kling wrote, your notation suggests an asynchronous function: it starts the request and returns immediately.
Because the function is asynchronous, you don't know when the query will be ready. That might even take very long, e.g. when the database is locked for a dump.
When you return from a grabData "iteration" and check resultArr, it contains exactly the values that have been returned so far. The fifth query might even complete before the third or fourth, so resultArr can hold the value of iteration n at some position m < n or o > n.
As long as you (or we) don't know how Cassandra operates, you cannot say when a query gets answered.
So when you check your result array, its length reflects the number of completed queries, not the number of iterations.
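One way around this (a sketch with a stubbed runQuery; the real cassandra-driver call is assumed, not shown) is to map each date to a promise and wait for all of them, which also preserves the one-to-one ordering between dateArr and resultArr:

```javascript
// Stub standing in for an async Cassandra query: resolves to rows,
// possibly empty, after a short delay.
function runQuery(date) {
  return new Promise((resolve) =>
    setTimeout(() => resolve(date.endsWith('01') ? [{ date }] : []), 5)
  );
}

function grabAll(dateArr) {
  // One promise per date; Promise.all keeps results in dateArr order,
  // pushing an empty set for dates that returned no rows.
  return Promise.all(
    dateArr.map((date) =>
      runQuery(date).then((rows) => (rows.length > 0 ? rows : []))
    )
  );
}

grabAll(['2017-01-01', '2017-01-02']).then((resultArr) => {
  console.log(resultArr.length); // always equals dateArr.length
});
```

Because Promise.all resolves only after every query has completed, resultArr is guaranteed to have one entry per date, regardless of the order in which the queries finish.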
Found the root cause: There is a hard limit when querying Cassandra using node.js. The query that I am trying to completely execute is too large. Breaking dateArr up into smaller chunks and querying using those smaller pieces solved the problem.