Counter not increasing in async map function - javascript

I am working with MongoDB and Node.js. I have an array of customers, and I have to create each one in the database.
const promises2 = customers.map(async customer => {
    if (!customer.customerId) {
        const counter = await Counter.findOne({ type: "Customer" });
        console.log({counter});
        const payload = {
            customerId: counter.sequence_value,
        };
        await Customer.create(payload);
        await Counter.findOneAndUpdate({ type: "Customer" }, { $inc: { sequence_value: 1 } });
    }
});
await Promise.all([...promises2]);
The issue is that the counter is not increasing each time; I am getting the same counter value in all the created customers. What is the problem here?
I found a similar issue reported elsewhere, but it doesn't have an answer.

The problem is that all the calls overlap. Since the first thing they each do is get the current counter, they all get the same counter, then try to use it. Fundamentally, you don't want to do this:
const counter = await Counter.findOne({ type: "Customer" });
// ...
await Counter.findOneAndUpdate({ type: "Customer" }, { $inc: { sequence_value: 1 } });
...because it creates a race condition: overlapping asynchronous operations can both get the same sequence value and then both issue an update to it.
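To see the race concretely, here is roughly how two overlapping calls can interleave (timing illustrative):
// Call A: findOne         -> reads sequence_value 5
// Call B: findOne         -> also reads 5 (A hasn't incremented yet)
// Call A: Customer.create -> customerId: 5
// Call B: Customer.create -> customerId: 5 (duplicate!)
// Call A: $inc            -> sequence_value becomes 6
// Call B: $inc            -> sequence_value becomes 7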
You want an atomic operation for incrementing and retrieving a new ID. I don't use MongoDB, but I think the findOneAndUpdate operation can do that for you if you add the returnNewDocument option (note that with Mongoose the equivalent option is new: true). If so, the minimal change would be to swap over to using that:
const promises2 = customers.map(async customer => {
    if (!customer.customerId) {
        const counter = await Counter.findOneAndUpdate(
            { type: "Customer" },
            { $inc: { sequence_value: 1 } },
            { returnNewDocument: true }
        );
        console.log({counter});
        const payload = {
            customerId: counter.sequence_value,
        };
        await Customer.create(payload);
    }
});
await Promise.all([...promises2]);
...but there's no reason to create an array and then immediately copy it; just use it directly:
await Promise.all(customers.map(async customer => {
    if (!customer.customerId) {
        const counter = await Counter.findOneAndUpdate(
            { type: "Customer" },
            { $inc: { sequence_value: 1 } },
            { returnNewDocument: true }
        );
        console.log({counter});
        const payload = {
            customerId: counter.sequence_value,
        };
        await Customer.create(payload);
    }
}));
The overall operation will fail if anything fails, and only the first failure is reported back to your code (the other operations then continue and succeed or fail as the case may be). If you want to know everything that happened (which is probably useful in this case), you can use allSettled instead of all:
// Gets an array of {status, value/reason} objects
const results = await Promise.allSettled(customers.map(async customer => {
    if (!customer.customerId) {
        const counter = await Counter.findOneAndUpdate(
            { type: "Customer" },
            { $inc: { sequence_value: 1 } },
            { returnNewDocument: true }
        );
        console.log({counter});
        const payload = {
            customerId: counter.sequence_value,
        };
        await Customer.create(payload);
    }
}));
const errors = results.filter(({status}) => status === "rejected").map(({reason}) => reason);
if (errors.length) {
    // Handle/report errors here
}
Promise.allSettled is new in ES2020, but easily polyfilled if needed.
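For instance, a minimal polyfill sketch (not production-hardened):
if (!Promise.allSettled) {
    Promise.allSettled = (promises) =>
        Promise.all(Array.from(promises, (p) =>
            Promise.resolve(p).then(
                (value) => ({ status: "fulfilled", value }),
                (reason) => ({ status: "rejected", reason })
            )
        ));
}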
If I'm mistaken about the above use of findOneAndUpdate in some way, I'm sure MongoDB gives you a way to get those IDs without a race condition. But in the worst case, you can pre-allocate the IDs instead, something like this:
// Allocate IDs (in series)
const ids = [];
for (const customer of customers) {
    if (customer.customerId) {
        // Keep `ids` aligned with `customers` by index
        ids.push(customer.customerId);
    } else {
        const counter = await Counter.findOne({ type: "Customer" });
        await Counter.findOneAndUpdate({ type: "Customer" }, { $inc: { sequence_value: 1 } });
        ids.push(counter.sequence_value);
    }
}
// Create customers (in parallel)
const results = await Promise.allSettled(customers.map(async (customer, index) => {
    if (customer.customerId) {
        return; // already exists, nothing to create
    }
    const customerId = ids[index];
    try {
        await Customer.create({
            customerId
        });
    } catch (e) {
        // Failed, remove the counter, but without allowing any error doing so to
        // shadow the error we're already handling
        try {
            await Counter.someDeleteMethodHere(/*...customerId...*/);
        } catch (e2) {
            // ...perhaps report `e2` here, but don't shadow `e`
        }
        throw e;
    }
}));
// Get just the errors
const errors = results.filter(({status}) => status === "rejected").map(({reason}) => reason);
if (errors.length) {
    // Handle/report errors here
}

Your map function is not returning a promise.
Try this:
const promises2 = customers.map((customer) => {
    return new Promise(async (resolve) => {
        if (!customer.customerId) {
            const counter = await Counter.findOne({ type: 'Customer' });
            console.log({ counter });
            const payload = {
                customerId: counter.sequence_value,
            };
            await Customer.create(payload);
            await Counter.findOneAndUpdate({ type: 'Customer' }, { $inc: { sequence_value: 1 } });
        }
        resolve();
    });
});
await Promise.all(promises2);
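As a side note, new Promise with an async executor is the explicit-promise-construction antipattern: an error thrown inside the executor won't reject the outer promise. Since an async map callback already returns a promise, the wrapper isn't needed at all; a minimal sketch (same body as above; note this still has the race condition described in the previous answer):
const promises2 = customers.map(async (customer) => {
    if (!customer.customerId) {
        // ...same lookup/create/update body as above, no manual resolve() needed
    }
});
await Promise.all(promises2);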


How to push data to firestore document?

Basically, what I want is: if a document doesn't exist, create a new one (that works fine now), but if the document does exist, push a new object onto the existing array.
I was able to get data from documents and console.log it, but I don't know how to push new objects to the existing document.
My Firebase structure looks like this:
favorites
    someUserID
        Videos [
            0: {
                name: SomeName
                url: SomeUrl
            },
            /* I would like to push new objects like this: */
            1: {
                name: data.name
                url: data.url
            },
        ]
This is my current code:
const { user } = UserAuth();
const UserID = user.uid;
const favoritesRef = doc(db, "favorites", UserID);
const test = async (data) => {
    try {
        await runTransaction(db, async (transaction) => {
            const sfDoc = await transaction.get(favoritesRef);
            if (!sfDoc.exists()) {
                setDoc(favoritesRef, {
                    Videos: [{name: data.name}]
                });
            }
            /* I got my document content here */
            const newFavorites = await getDoc(favoritesRef);
            console.log("Document data:", newFavorites.data());
            /* And would like to push new Data here */
            transaction.update(favoritesRef, { name: data.name});
        });
        console.log("Transaction successfully committed!");
    } catch (e) {
        console.log("Transaction failed: ", e);
    }
}
To update the array, Firestore now has functions that allow you to update an array without rewriting your code:
Update elements in an array
If your document contains an array field, you can use arrayUnion()
and arrayRemove() to add and remove elements. arrayUnion() adds
elements to an array but only elements not already present.
arrayRemove() removes all instances of each given element.
import { doc, updateDoc, arrayUnion, arrayRemove } from "firebase/firestore";

const washingtonRef = doc(db, "cities", "DC");

// Atomically add a new region to the "regions" array field.
await updateDoc(washingtonRef, {
    regions: arrayUnion("greater_virginia")
});

// Atomically remove a region from the "regions" array field.
await updateDoc(washingtonRef, {
    regions: arrayRemove("east_coast")
});
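Applied to the structure in the question, that could look something like this (a sketch, assuming the favoritesRef and data from the question; note arrayUnion only skips an object when every field matches an existing element exactly):
import { updateDoc, arrayUnion } from "firebase/firestore";

await updateDoc(favoritesRef, {
    Videos: arrayUnion({ name: data.name, url: data.url })
});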
Not sure if this helps, but what I usually do is add every user that signs in to an array.
const snapshot = await getDoc(doc(db, "allUsers", "list"));
const currentUsers = snapshot.data().users;

await setDoc(doc(db, "allUsers", "list"), {
    users: [...currentUsers, { name, uid: userId, avatar: photo }],
});
I first get the items that already exist in the list, then I create a new list that has every previous item plus the ones I'm adding. currentUsers is the current list in that case. Maybe you should try this instead of Videos: [{name: data.name}]:
setDoc(favoritesRef, {
    Videos: [...currentVideos, {name: data.name}]
})
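Be aware that this read-then-write approach can lose updates if two clients run it concurrently (the same race-condition shape as the counter question above). arrayUnion, shown in the previous answer, avoids the read entirely; a sketch for the users list, assuming the same document reference and that updateDoc/arrayUnion are imported from firebase/firestore:
await updateDoc(doc(db, "allUsers", "list"), {
    users: arrayUnion({ name, uid: userId, avatar: photo }),
});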
I just figured it out like this:
const { user } = UserAuth();
const UserID = user.uid;
const favoritesRef = doc(db, "favorites", UserID);
const test = async (data) => {
    try {
        await runTransaction(db, async (transaction) => {
            const sfDoc = await transaction.get(favoritesRef);
            if (!sfDoc.exists()) {
                await setDoc(favoritesRef, {
                    favs: [{
                        name: data.name,
                        ytb: data.ytb,
                        url: data.url
                    }]
                });
                return; // the favs array was just created, nothing else to do
            }
            const doesExists = sfDoc.data().favs.some((fav) => fav.name === data.name);
            console.log(doesExists);
            if (doesExists === true) {
                console.log("AlreadyExist");
            } else {
                const currentData = sfDoc.data().favs;
                transaction.update(favoritesRef, {
                    favs: [...currentData, {
                        name: data.name,
                        ytb: data.ytb,
                        url: data.url
                    }]
                });
            }
        });
        console.log("Transaction successfully committed!");
    } catch (e) {
        console.log("Transaction failed: ", e);
    }
}

JavaScript PromiseAll allSettled does not catch the rejected

I have an SNS Lambda function that returns void (https://docs.aws.amazon.com/lambda/latest/dg/with-sns.html). I'm publishing an event with an orderId and one message 'status: success'. In the SNS subscription Lambda, I check whether the 'orderId' exists in my database. If it already exists, I update the database; if it doesn't, I log a console error.
I created an integration test in which I pass a random 'uuid' that isn't a valid 'orderId', but it appears that my promise doesn't capture the 'rejected' results. It should log the console error 'failed to find order...'. I'm not sure where I'm going wrong. Also, my promise syntax looks complicated; is there a neater way to write it? Thank you in advance 🙏🏽
This is the SNS event handler, which listens for the published messages:
interface PromiseFulfilledResult<T> {
    status: "fulfilled" | "rejected";
    value: T;
}
const parseOrdersFromSns = (event: SNSEvent) => {
    try {
        return event.Records.flatMap((r) => JSON.parse(r.Sns.Message))
    } catch (error) {
        console.error('New order from SNS failed at parsing orders', { event }, error)
        return []
    }
}
export const handlerFn = async (event: SNSEvent): Promise<void> => {
    const orders = parseOrdersFromSns(event)
    if (orders.length === 0) return
    const existingOrdersPromiseResult = await Promise.allSettled(
        orders.map(
            async (o) => await findOrderStateNode(tagOrderStateId(o.orderId))
        )
    ); // This returns the data if the order exists; otherwise it resolves to undefined
    const existingOrders = existingOrdersPromiseResult // should return arrays of data
        .filter(({ status }) => status === "fulfilled")
        .map(
            (o) =>
                (
                    o as PromiseFulfilledResult<
                        TaggedDatabaseDocument<
                            OrderStateNode,
                            TaggedOrderStateId,
                            TaggedOrderStateId
                        >
                    >
                ).value
        );
    const failedOrders = existingOrdersPromiseResult.filter( // should stop the operation if the data exists
        ({ status }) => status === "rejected"
    );
    failedOrders.forEach((failure) => {
        console.error("failed to find order", { failure });
    });
    const updateOrder = await Promise.all(
        existingOrders.map((o) => {
            const existingOrderId = o?.pk as TaggedOrderStateId;
            console.log({ existingOrderId }); // Logs undefined
        })
    );
    return updateOrder;
};
This is my test suite:
describe('Creating and updating order', () => {
    integrationTest(
        'Creating and updating the order',
        async (correlationId: string) => {
            CorrelationIds.set('x-correlation-id', correlationId)
            const createdOrder = await createNewOrder(correlationId) // This creates a random order
            if (!createdOrder.id) {
                fail('order id is not defined')
            }
            const order = await getOrder(createdOrder.id)
            // Add new order to table
            await initializeOrderState([order])
            const existingOrder = await findOrderStateNode(tagOrderStateId(order.id))
            if (!existingOrder) fail(`Could not find existing order with this orderId: ${order.id}`)
            const event = {
                Records: [
                    {
                        Sns: {
                            Message: JSON.stringify([
                                {
                                    orderId: uuid(), // random order id
                                    roundName,
                                    startTime,
                                },
                                {
                                    orderId: order.id,
                                    roundName,
                                    startTime,
                                },
                                {
                                    orderId: uuid(),
                                    roundName,
                                    startTime,
                                },
                            ]),
                        },
                    },
                ],
            } as SNSEvent
            await SnsLambda(event)
            const updateOrderState = await findOrderStateNode(tagOrderStateId(order.id))
            expect(updateOrderState?.status).toEqual('success')
        },
    )
})

Not getting result in node js, mongo db using promise in loop

I am new to Node.js and MongoDB, and using promises in a loop is really confusing for a new developer. I need the final array or object that the last then() gives me as the result. Please correct this.
I have a controller function described below.
let League = require('../../model/league.model');
let Leaguetype = require('../../model/leagueType.model');
let Leaguecategories = require('../../model/leagueCategories.model');

let fetchLeague = async function (req, res, next) {
    let body = req.body;
    await mongo.findFromCollection(Leaguetype)
        .then(function(types) {
            return Promise.all(types.map(function(type) {
                return mongo.findFromCollection(Leaguecategories, {"league_type_id": type._id})
                    .then(function(categories) {
                        return Promise.all(categories.map(function(category) {
                            return mongo.findFromCollection(League, {"league_category_id": category._id})
                                .then(function(leagues) {
                                    return Promise.all(leagues.map(function(league) {
                                        return league;
                                    }))
                                    .then(function(league) {
                                        console.log(league);
                                    })
                                })
                        }))
                    });
            }))
        })
        .then(function(final) {
            console.log(final);
        })
        .catch(error => {
            console.log('no', error);
        })
}
The mongo.findFromCollection function looks like this:
findFromCollection = (model_name, query_obj = {}) => {
    return new Promise((resolve, reject) => {
        if (model_name !== undefined && model_name !== '') {
            model_name.find(query_obj, function (e, result) {
                if (!e) {
                    resolve(result)
                } else {
                    reject(e);
                }
            })
        } else {
            reject({ status: 104, message: `Invalid search.` });
        }
    })
}
And here is my model file:
var mongoose = require('mongoose');

const league_categories = new mongoose.Schema({
    name: {
        type: String,
        required: true
    },
    active: {
        type: String,
        required: true
    },
    create_date: {
        type: Date,
        required: true,
        default: Date.now
    },
    league_type_id: {
        type: String,
        required: true
    }
})

module.exports = mongoose.model('Leaguecategories', league_categories)
First, I recommend you stop using callbacks wherever you can; the style is a bit dated, and the code is much harder to read and maintain.
I re-wrote your code a little to look closer to what I'm used to. This doesn't mean this style is better; I just personally think it's easier to understand what's going on.
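As a side note, the manual Promise wrapper in findFromCollection is also unnecessary: with recent Mongoose versions, model_name.find(query_obj).exec() already returns a promise. A minimal sketch preserving the same guard:
findFromCollection = (model_name, query_obj = {}) => {
    if (model_name === undefined || model_name === '') {
        return Promise.reject({ status: 104, message: `Invalid search.` });
    }
    return model_name.find(query_obj).exec(); // exec() returns a real promise
}
With that out of the way, here is the rewritten controller: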
async function fetchLeague(req, res, next) {
    try {
        // get types
        let types = await Leaguetype.find({});
        // iterate over all types.
        let results = await Promise.all(types.map(async (type) => {
            let categories = await Leaguecategories.find({"league_type_id": type._id});
            return Promise.all(categories.map(async (category) => {
                return League.find({"league_category_id": category._id})
            }))
        }));
        // results is in the form of [ [ [ list of leagues ] * per category ] * per type ]
        // if a certain category or type did not have matches it will be an empty array.
        return results;
    } catch (error) {
        console.log('no', error);
        return []
    }
}
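If a single flat list of leagues is more convenient than the nested shape, it can be collapsed afterwards (assuming Node 11+ for Array.prototype.flat):
let results = await fetchLeague(req, res, next);
// [[ [leagues] per category ] per type] -> one flat array of leagues
let leagues = results.flat(2);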

rxjs subscribing late results to empty stream

I have the following piece of code. As is, with a couple of lines commented out, it works as expected. I subscribe to a stream, do some processing and stream the data to the client. However, if I uncomment the comments, my stream is always empty, i.e. count in getEntryQueryStream is always 0. I suspect it has to do with the fact that I subscribe late to the stream and thus miss all the values.
// a wrapper of the mongodb driver => returns rxjs streams
import * as imongo from 'imongo';
import * as Rx from 'rx';
import * as _ from 'lodash';
import {elasticClient} from '../helpers/elasticClient';

const {ObjectId} = imongo;

function searchElastic({query, sort}, limit) {
    const body = {
        size: 1,
        query,
        _source: { excludes: ['logbookType', 'editable', 'availabilityTag'] },
        sort
    };
    // keep the search results "scrollable" for 30 secs
    const scroll = '30s';
    let count = 0;
    return Rx.Observable
        .fromPromise(elasticClient.search({ index: 'data', body, scroll }))
        .concatMap(({_scroll_id, hits: {hits}}) => {
            const subject = new Rx.Subject();
            // subject needs to be subscribed to before adding new values
            // and therefore completing the stream => execute in next tick
            setImmediate(() => {
                if(hits.length) {
                    // initial data
                    subject.onNext(hits[0]._source);
                    // code that breaks
                    //if(limit && ++count === limit) {
                    //    subject.onCompleted();
                    //    return;
                    //}
                    const handleDoc = (err, res) => {
                        if(err) {
                            subject.onError(err);
                            return;
                        }
                        const {_scroll_id, hits: {hits}} = res;
                        if(!hits.length) {
                            subject.onCompleted();
                        } else {
                            subject.onNext(hits[0]._source);
                            // code that breaks
                            //if(limit && ++count === limit) {
                            //    subject.onCompleted();
                            //    return;
                            //}
                            setImmediate(() =>
                                elasticClient.scroll({scroll, scrollId: _scroll_id},
                                    handleDoc));
                        }
                    };
                    setImmediate(() =>
                        elasticClient.scroll({scroll, scrollId: _scroll_id},
                            handleDoc));
                } else {
                    subject.onCompleted();
                }
            });
            return subject.asObservable();
        });
}
function getElasticQuery(searchString, filter) {
    const query = _.cloneDeep(filter);
    query.query.filtered.filter.bool.must.push({
        query: {
            query_string: {
                query: searchString
            }
        }
    });
    return _.extend({}, query);
}
function fetchAncestors(ancestorIds, ancestors, format) {
    return imongo.find('session', 'sparse_data', {
        query: { _id: { $in: ancestorIds.map(x => ObjectId(x)) } },
        fields: { name: 1, type: 1 }
    })
    .map(entry => {
        entry.id = entry._id.toString();
        delete entry._id;
        return entry;
    })
    // we don't care about the results
    // but have to wait for stream to finish
    .defaultIfEmpty()
    .last();
}
function getEntryQueryStream(entriesQuery, query, limit) {
    const {parentSearchFilter, filter, format} = query;
    return searchElastic(entriesQuery, limit)
        .concatMap(entry => {
            const ancestors = entry.ancestors || [];
            // if no parents => doesn't match
            if(!ancestors.length) {
                return Rx.Observable.empty();
            }
            const parentsQuery = getElasticQuery(parentSearchFilter, filter);
            parentsQuery.query.filtered.filter.bool.must.push({
                terms: {
                    id: ancestors
                }
            });
            // fetch parent entries
            return searchElastic(parentsQuery)
                .count()
                .concatMap(count => {
                    // no parents match query
                    if(!count) {
                        return Rx.Observable.empty();
                    }
                    // fetch all other ancestors that weren't part of the query results
                    // and are still a string (id)
                    const restAncestorsToFetch = ancestors.filter(x => _.isString(x));
                    return fetchAncestors(restAncestorsToFetch, ancestors, format)
                        .concatMap(() => Rx.Observable.just(entry));
                });
        });
}
function executeQuery(query, res) {
    try {
        const stream = getEntryQueryStream(query);
        // stream is passed on to another function here where we subscribe to it like:
        // stream
        //     .map(x => whatever(x))
        //     .subscribe(
        //         x => res.write(x),
        //         err => console.error(err),
        //         () => res.end());
    } catch(e) {
        logger.error(e);
        res.status(500).json(e);
    }
}
I don't understand why those few lines of code break everything or how I could fix it.
Your use case is quite complex; you can start off by building up the searchElastic method with the pattern below:
Convert elasticClient.scroll to an observable first.
Set up the initial data for elasticClient.search().
When search resolves, you should get your scroll id.
The expand() operator lets you recursively execute the elasticClientScroll observable.
Use map to select the data you want to return.
Use takeWhile to decide when to complete the stream.
The end result is that once you call searchElastic(...).subscribe(), the stream will emit continuously until there's no more data to fetch.
Hope this structure is correct and can get you started.
function searchElastic({ query, sort }, limit) {
    const elasticClientScroll = Observable.fromCallback(elasticClient.scroll)
    let obj = {
        body: {
            size: 1,
            query,
            _source: { excludes: ['logbookType', 'editable', 'availabilityTag'] },
            sort
        },
        scroll: '30s'
    }
    return Observable.fromPromise(elasticClient.search({ index: 'data', body: obj.body, scroll: obj.scroll }))
        .expand(({ _scroll_id, hits: { hits } }) => {
            // guess there is more logic here .....
            // to update the scroll id or something
            return elasticClientScroll({ scroll: obj.scroll, scrollId: _scroll_id }).map(() =>
                // .. select the res you want to return
            )
        }).takeWhile(res => res.hits.length)
}
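A usage sketch of the resulting stream (using RxJS 4's subscribe(onNext, onError, onCompleted) signature; query, sort, and limit as defined by the caller):
searchElastic({ query, sort }, limit).subscribe(
    res => console.log(res),   // each emitted result
    err => console.error(err), // stream error
    () => console.log('done')  // stream completed
)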

Async/await and promises [duplicate]

This question already has answers here:
Using async/await with a forEach loop
(33 answers)
Closed 4 years ago.
I am having an issue using async/await with Node v8.1. It appears my issue is that I am not returning a promise from my async functions. This is causing the flow of the program to run out of order. I thought that by making a function async, the function automatically returns a promise, but that is not what I am running into.
I expect the program below to output:
Validating xlsx file...
(text from validateParsedXlsx)
Adding users to cognito...
(text from addUsersToCognito)
Adding users to dynamodb...
(text from addUsersToDynamodb)
Instead, I get:
Validating xlsx file...
Adding users to cognito...
Adding users to dynamodb...
(text from validateParsedXlsx)
(text from addUsersToCognito)
(text from addUsersToDynamodb)
The issue seems to be pretty obvious: validateParsedXlsx(), addUsersToCognito(), and addUsersToDynamodb() are not returning promises. Again, I thought that by using the async keyword, the function automatically took care of this.
Thanks for the help.
Here is my script:
const xlsx = require('xlsx');
const AWS = require('aws-sdk');
AWS.config.update({region: 'us-west-2'});
const documentClient = new AWS.DynamoDB.DocumentClient({convertEmptyValues: true});

async function main() {
    if (!process.argv[2]) {
        console.log('\nAbsolute filepath missing. Pass the absolute filepath in as command line argument.\n')
        process.exit(1);
    }
    const xlsxFilePath = process.argv[2];
    let parsedXlsx = [];
    try {
        parsedXlsx = parseXlsx(xlsxFilePath);
    } catch (error) {
        if (error.code === 'ENOENT') {
            console.log(`\nThe file path: ${process.argv[2]} cannot be resolved\n`)
        } else {
            console.log(error);
        }
    }
    console.log('\n\nValidating xlsx file...\n');
    await validateParsedXlsx(parsedXlsx);
    console.log('\n\nAdding users to cognito...\n');
    await addUsersToCognito(parsedXlsx);
    console.log('\n\nAdding users to dynamodb...\n');
    await addUsersToDynamodb(parsedXlsx);
}
function parseXlsx(filePath) {
    const workbook = xlsx.readFile(filePath);
    const sheetNameList = workbook.SheetNames;
    const parsedXlsxSheets = sheetNameList.map(function (y) {
        const worksheet = workbook.Sheets[y];
        const headers = {};
        const data = [];
        for (z in worksheet) {
            if (z[0] === '!') continue;
            // parse out the column, row, and value
            const col = z.substring(0, 1);
            const row = parseInt(z.substring(1));
            const value = worksheet[z].v;
            // store header names
            if (row == 1) {
                headers[col] = value;
                continue;
            }
            if (!data[row]) data[row] = {};
            data[row][headers[col]] = value;
        }
        // drop those first two rows which are empty
        data.shift();
        data.shift();
        return data;
    });
    return parsedXlsxSheets[0]
}
async function validateParsedXlsx(users) {
    let error = false;
    users.forEach(async (user, index) => {
        if (!user.email) {
            console.log(`User at row ${index + 2} doesn't have 'email' entry in xlsx file.`);
            error = true;
        }
        if (!user.displayName) {
            console.log(`User at row ${index + 2} doesn't have 'displayName' entry in xlsx file.`);
            error = true;
        }
        if (!user.serviceProviderId) {
            console.log(`User at row ${index + 2} doesn't have 'serviceProviderId' entry in xlsx file.`);
            error = true;
        } else {
            const params = {
                TableName: 'service-providers',
                Key: {
                    serviceProviderId: user.serviceProviderId
                }
            }
            const response = await documentClient.get(params).promise();
            if (!response.Item) {
                console.log(`User at row ${index + 2} does not have a valid serviceProviderId.`);
                error = true;
            } else {
                console.log(`User ${user.email} is valid, assigned to service provider: ${response.Item.displayName}`);
            }
        }
        if (error) {
            console.log(`Every user in xlsx file must have these attributes, spelled correctly: email, displayName, and serviceProviderId\n\nIn addition, make sure the serviceProviderId is correct by checking the service-providers dynamodb table.`);
            process.exit(1);
        }
    });
}
async function addUsersToCognito(users) {
    const cognitoIdentityServiceProvider = new AWS.CognitoIdentityServiceProvider();
    const results = await cognitoIdentityServiceProvider.listUserPools({MaxResults: 10}).promise();
    let serviceProviderUserPoolId = '';
    results.UserPools.forEach((userPool) => {
        if (userPool.Name === 'service-provider-users') {
            serviceProviderUserPoolId = userPool.Id;
        }
    });
    users.forEach(async (user) => {
        const params = {
            UserPoolId: serviceProviderUserPoolId,
            Username: user.email,
            DesiredDeliveryMediums: ['EMAIL'],
            TemporaryPassword: 'New_User1',
            UserAttributes: [
                {
                    Name: 'email',
                    Value: user.email
                },
                {
                    Name: 'custom:service_provider_id',
                    Value: user.serviceProviderId
                }
            ]
        }
        try {
            await cognitoIdentityServiceProvider.adminCreateUser(params).promise();
            console.log(`Added user ${user.email} to cognito user pool`);
        } catch (error) {
            if (error.code === 'UsernameExistsException') {
                console.log(`Username: ${user.email} already exists. No action taken.`);
            } else {
                console.log(error);
            }
        }
    });
}
async function addUsersToDynamodb(users) {
    users.forEach(async (user) => {
        const params = {
            TableName: 'service-provider-users',
            Item: {
                serviceProviderId: user.serviceProviderId,
                userId: user.email,
                displayName: user.displayName,
                isActive: false,
                role: 'BASIC'
            },
            ConditionExpression: 'attribute_not_exists(userId)'
        }
        try {
            await documentClient.put(params).promise();
            console.log(`Added user ${user.email} to dynamodb user table`);
        } catch (error) {
            if (error.code === 'ConditionalCheckFailedException') {
                console.log(`User ${user.email} already in the dynamodb table service-provider-users`);
            } else {
                console.log(error);
            }
        }
    });
}

main();
users.forEach(async (user, index) => {
That starts a few promise-returning actions but never awaits them. You may do:
await Promise.all(users.map(async (user, index) => {
    // ...
}));
...to execute them in parallel, or do this:
await users.reduce(async (chain, user, index) => {
    await chain;
    // ...
}, Promise.resolve());
...to execute them one after another.
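Or, often easiest to read, a plain for...of loop in an async function, which awaits each user in turn (handleUser here is a hypothetical helper wrapping the per-user body above):
for (const user of users) {
    await handleUser(user); // hypothetical: the body of the forEach callback
}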
PS: Using process.exit should be the very last option to end your program.
