My Lambda takes in an SQS message containing an ID and an address. It parses out those fields and updates the record associated with that ID in a DynamoDB table.
The parameters for this update implement the following logic:
1. Where the record has an ID equal to the ID sent by SQS
2. And where the SortKey has a value equal to "null" (note that this is specifically a String with a value of "null")
3. Update the address field with the new address
I'm seeing the following issues with this function:
The function is not updating the DynamoDB table.
I am not getting any kind of feedback from the update call. There are several console.logs that should execute but do not (see the try/catch/finally block after the update); the CloudWatch logs show that none of them ever reach the console. Something is very wrong here. The finally block not executing looks like undefined behavior; my only guess is that the call to DynamoDB is not being awaited.
I also need to implement the following functionality. This is for bonus points; if you have an idea of how to do it, please feel free to comment!
Right now the update only changes the address field from one value to another. Instead, I need the record to contain a set of addresses. To do this we need to implement the following logic:
If a set of addresses does not exist on the record, create a set with the address as the only element
If a set does exist on the student record, update that set with the address. Duplicate addresses should not be added
The code for this function is below. I've also attached the most recent CloudWatch log for this function and the record I am trying to update (the address field on the record was added manually). You'll notice that we aren't getting any console.logs after console.log("starting upload"), and that the promise has a state of "pending" when it is inspected. We also don't get any feedback from the DynamoDB update. Right now the function is not updating the record and not giving me any feedback about why it is failing to do so.
const util = require('util')
const aws = require('aws-sdk');
const docClient = new aws.DynamoDB.DocumentClient();
exports.handler = async(event) => {
event.Records.forEach(async record => {
const { body } = record;
const test = JSON.parse(body);
console.log(test);
const message = JSON.parse(test["Message"]);
console.log(message);
const id = message.id;
const name = message.Name;
const address = message.address;
console.log("parameters parsed");
console.log("record being processed is " + id);
const params = {
TableName: "My_Records",
Key: {
"ID": ":id",
"SortKey": ":sortKey"
},
//KeyConditionExpression: 'SortKey = :sortKey',
UpdateExpression: "set info.address = :address",
ExpressionAttributeValues: {
':id': id,
':address': address,
':sortKey': "null"
},
ReturnValues: "UPDATED_NEW"
};
console.log(params)
console.log("starting upload")
try {
let putObjectPromise = docClient.update(params).promise();
console.log(util.inspect(putObjectPromise, {showHidden: false, depth: null}))
putObjectPromise.then(function(data) {
console.log("UpdateItem succeeded:");
}).catch(function(err) {
console.log("Unable to update item. Error JSON:" + err);
}).finally(() =>
console.log("done with upload")
);
return putObjectPromise
}
catch (err) {
console.err(err)
}
});
};
CloudWatch log of the most recent execution of this function
INFO {
Type: 'Notification',
MessageId: 'ID',
TopicArn: 'ARN',
Subject: 'DB updated',
SignatureVersion: '1',
INFO { id: '11111111', Name: 'Jerms Macgee', address: '102 homeslice lane' }
INFO parameters parsed
INFO record being processed is 11111111
INFO {
TableName: 'my_table',
Key: { ID: ':id', SortKey: ':sortKey' },
UpdateExpression: 'set info.address = :address',
ExpressionAttributeValues: {
':id': '11111111',
':address': '102 homeslice lane',
':sortKey': 'null'
},
ReturnValues: 'UPDATED_NEW'
}
INFO starting upload
INFO Promise { <pending> }
END RequestId
And here's an example of the record I'd expect to be updated
{
  "address": "test",
  "SortKey": "null",
  "id": 11111111,
  "name": "James Mcgee"
}
The updated record should be
{
  "address": "102 homeslice lane",
  "SortKey": "null",
  "id": 11111111,
  "name": "James Mcgee"
}
And for bonus points I'd really like to do something like
{
  "address": {"102 homeslice lane"},
  "SortKey": "null",
  "id": 11111111,
  "name": "James Mcgee"
}
where address is a set that can hold more than one address for the record
First, forEach won't work with an async/await callback. Here's the example from https://codeburst.io/javascript-async-await-with-foreach-b6ba62bbf404:
const waitFor = (ms) => new Promise(r => setTimeout(r, ms));
[1, 2, 3].forEach(async (num) => {
await waitFor(50);
console.log(num);
});
console.log('Done');
If you run that, 'Done' is printed before any of the numbers, because forEach never waits for the async callbacks to finish.
Second, if you wrap a rejected promise in a try/catch block and that promise already has a .catch() attached, the catch block will never be executed:
const waitFor = (ms) => new Promise((resolve, reject) => {
setTimeout(() => {
reject(123);
}, ms)
});
try {
waitFor(2000).catch(e => { console.log(e) })
} catch (error) {
console.error('error');
}
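Putting both of those points together, here is a minimal sketch of one way to restructure the handler so each record is actually awaited and rejections land in the catch block (buildParams is a hypothetical helper that builds the update params from a record; this is a sketch, not a drop-in fix):
exports.handler = async (event) => {
    // for...of works with await, unlike forEach
    for (const record of event.Records) {
        try {
            // awaiting here means a rejected promise is thrown and caught below
            await docClient.update(buildParams(record)).promise();
            console.log("UpdateItem succeeded");
        } catch (err) {
            console.log("Unable to update item:", err);
        }
    }
};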
You're mixing up your async/promise stuff.
First, event.Records.forEach isn't going to wait for the async function you're passing to it to resolve; you can change it to:
await Promise.all(event.Records.map(async record => {
    // ... the rest of your function body
}));
This way your main handler function will actually wait for them all to resolve.
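For context, the overall shape of the handler would then be roughly the following (updateRecord is a hypothetical helper wrapping the parsing and docClient.update call from the question; just a sketch):
exports.handler = async (event) => {
    // map returns an array of promises; Promise.all makes the handler
    // wait until every record has been processed (or one of them fails).
    const results = await Promise.all(event.Records.map(async record => {
        const { body } = record;
        const message = JSON.parse(JSON.parse(body).Message);
        return updateRecord(message); // hypothetical helper calling docClient.update(...)
    }));
    console.log(`processed ${results.length} records`);
};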
Next, all this stuff:
try {
let putObjectPromise = docClient.update(params).promise();
console.log(util.inspect(putObjectPromise, {showHidden: false, depth: null}))
putObjectPromise.then(function(data) {
console.log("UpdateItem succeeded:");
}).catch(function(err) {
console.log("Unable to update item. Error JSON:" + err);
}).finally(() =>
console.log("done with upload")
);
return putObjectPromise
}
catch (err) {
console.err(err)
}
is mixing styles: you're using .then with callback functions, but you're inside an async function, so you can just await instead. For example:
try {
    const putObjectResponse = await docClient.update(params).promise();
    console.log("UpdateItem succeeded:");
    console.log(JSON.stringify(putObjectResponse));
}
catch (err) {
    console.log("Unable to update item. Error JSON:" + err);
    console.error(err);
}
console.log("done with upload")
By awaiting update(params).promise() the return value becomes what the promise resolves to, not the promise. If the promise rejects, it is thrown and caught in your catch block.
This also fixes your weird logging messages because you're now logging the resolved value from the promise rather than the promise itself.
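As for the bonus part of the question (a deduplicated set of addresses), one way to sketch it, assuming the attribute is stored as a DynamoDB string set (called addresses here so it doesn't clash with the existing plain-string address field): the ADD action creates the set if the attribute doesn't exist yet and silently ignores values that are already in it, which covers both requirements.
const params = {
    TableName: "My_Records",
    Key: { ID: id, SortKey: "null" },
    // ADD on a set attribute appends the element, creates the set if it is
    // missing, and does nothing if the value is already present
    UpdateExpression: "ADD addresses :addr",
    ExpressionAttributeValues: {
        ":addr": docClient.createSet([address])
    },
    ReturnValues: "UPDATED_NEW"
};
await docClient.update(params).promise();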
Related
I'm trying to build a Node.js script that pulls records from an Airtable base, checks a UPC list against the UPC Item DB API, writes the product description ("Title") and product image array from the API response to an object, and then updates the corresponding Airtable records with that pre-formatted object using the Airtable API. I can't link directly to the Airtable API docs for my base, but the "Update Record" payload should look like this:
{
record_id: 'myRecord',
fields: {
'Product Description': 'J.R. Watkins Gel Hand Soap, Lemon, 11 oz',
'Reconstituted UPC': '818570001330',
Images: [
'https://images.thdstatic.com/productImages/b3e507dc-2d4a-48d4-a469-51a34c454959/svn/j-r-watkins-hand-soaps-23051-64_1000.jpg',
'http://pics.drugstore.com/prodimg/332476/450.jpg',
]
}
}
var Airtable = require('airtable');
var base = new Airtable({apiKey: 'myKey'}).base('myBase');
var request = require('request');
// Function to slow the code down for easier console watching
function sleep(ms) {
return new Promise((resolve) => {
setTimeout(resolve, ms);
});
}
// Function to slow the code down for easier console watching
async function init(x) {
console.log(1);
await sleep(x*1000);
console.log(2);
}
// Big nasty async function
async function imagesToAirtable() {
///Run through the airtable list
/// create the UPC_list that will be updated and pushed to Airtable to update records
const upc_list = [];
/// Pull from Airtable and assign array to an object
const airtable_records = await base('BRAND')
.select( { maxRecords : 3 })
.all()
/// Troubleshooting console.logs
console.log(airtable_records.length);
console.log("Entering the FOR loop")
/// Loop through the list, append req'd fields to the UPC object, and call the UPCItemDB API
for (var i = 0 ; i< airtable_records.length ; i++) {
/// Push req'd fields to the UPC object
await upc_list.push(
{ record_id : airtable_records[i].get("Record ID"),
fields: {
"Product Description" : "",
"Reconstituted UPC": airtable_records[i].get("Reconstituted UPC"),
"Images": []
}
}
);
/// Troubleshooting console.logs
console.log(upc_list)
console.log("Break");
/// call API
await request.post({
uri: 'https://api.upcitemdb.com/prod/trial/lookup',
headers: {
"Content-Type": "application/json",
"key_type": "3scale"
},
gzip: true,
body: "{ \"upc\": \""+airtable_records[i].get("Reconstituted UPC")+"\" }",
}, /// appending values to upc_list object
function (err, resp, body) {
console.log("This is loop "+ i)
upc_list[i].fields["Images"] = JSON.parse(body).items[0].images
upc_list[i].fields["Product Description"] = JSON.parse(body).items[0].title
console.log(upc_list[i]);
}
)}
};
imagesToAirtable();
I haven't gotten to writing the Airtable "Update Record" piece yet because I can't get the API response written to the upc_list array.
I get an error message on the last run of the FOR loop. In this case, the first and second time through the loop work fine and update the upc_list object, but the third time, I get this error:
upc_list[i].fields["Images"] = JSON.parse(body).items[0].images
^
TypeError: Cannot read property 'fields' of undefined
I know this has to do with async/await, but I'm just not experienced enough at this point to understand what I need to do.
I also know that this big nasty async/await function should be written as individual functions that are then called from one single main() function, but I can't figure out how to make everything chain together properly with async/await. Tips on that would be welcome as well :)
I have tried separating the FOR loop into two FOR loops: the first for the initial append of the upc_list item, and the second for the API call and the append of the parsed response.
I was going to skip by this question until I saw this:
I also know that this big nasty async/await function should be written
into individual functions
You are so right about that. Let's do it!
// get records from any table, up to limit
// (the parameter is named table so it doesn't shadow the Airtable base object)
async function getRecords(table, limit) {
    return base(table)
        .select({ maxRecords: limit })
        .all();
}
// return a new UPC object from an airtable brand record
// note - nothing async is being done here
function upcFromBrandRecord(brand) {
return {
record_id: brand.get("Record ID"),
fields: {
"Product Description": "",
"Reconstituted UPC": brand.get("Reconstituted UPC"),
"Images": []
}
};
}
The request module you're using doesn't use promises. There's a promise-using variant, I believe, but without installing anything new, we can "promise-ify" the post method you're using.
async function requestPost(uri, headers, body) {
    return new Promise((resolve, reject) => {
        request.post({ uri: uri, headers, gzip: true, body },
            (err, resp, body) => {
                err ? reject(err) : resolve(body);
            }
        );
    });
}
Now we can write a particular one for your usage...
async function upcLookup(brand) {
const uri = 'https://api.upcitemdb.com/prod/trial/lookup';
const headers = {
"Content-Type": "application/json",
// probably need an api key in here
"key_type": "3scale"
};
const body = JSON.stringify({ upc: brand.get("Reconstituted UPC") });
const responseBody = await requestPost(uri, headers, body);
// not sure if you must parse, but copying the OP
return JSON.parse(responseBody);
}
For a given brand record, build a complete upc record by creating the structure and calling the upc api...
async function brandToUPC(brand) {
const result = upcFromBrandRecord(brand);
const upcData = await upcLookup(brand);
result.fields["Images"] = upcData.items[0].images;
result.fields["Product Description"] = upcData.items[0].title;
return result;
}
Now we have all the tools needed to write the OP function simply...
// big and nasty no more!
async function imagesToAirtable() {
try {
const airtable_records = await getRecords('BRAND', 3);
const promises = airtable_records.map(brandToUPC);
const upc_list = await Promise.all(promises); //edit: forgot await
console.log(upc_list);
} catch (err) {
// handle error here
}
}
That's it. Caveat: I haven't run this code, and I know little or nothing about the services you're using, or whether there was a bug hidden underneath the one you've been encountering, so it seems unlikely that this will run out of the box. What I hope I've done is demonstrate the value of decomposition for making nastiness disappear.
I am new to the JavaScript promise world and I wrote code that updates my database tables based on some params, but I want all these changes to be made at the same time. I did some research and came across the Promise.all API, and I wanted to know what I am doing wrong, or whether there is a better way to do this?
Your help will be most appreciated.
promise code:
try {
syncEntities = async () => {
let saveTakerOrder;
let tradePromises;
// if no makers in the trade or the taker was not completely filled put it in the orders table
saveTakerOrder = await dynamodb
.put({
TableName: process.env.ORDERS_TABLE_NAME,
Item:
makers.length === 0 ||
taker.initial_quantity !== taker.quantity_removed
? taker
: {
...taker,
status: "CLOSED",
orderRate: `${taker.side}#CLOSED#${taker.rate}`,
},
})
.promise();
// if makers exist
if (makers.length > 0) {
tradePromises = makers.map(async (maker) => {
let savedTrade;
let updatedOrder;
// trade entity model
const matchedTrade = {
id: `TRD${uuid()}`,
buy_order: taker.side === "BUY" ? taker.id : maker.id,
sell_order: taker.side === "SELL" ? taker.id : maker.id,
ticker: maker.ticker,
rate: maker.rate,
quantity: maker.quantity_removed,
createdAt: date.toISOString(),
};
// Save the trade to the trades table
savedTrade = await dynamodb
.put({
TableName: process.env.TRADES_TABLE_NAME,
Item: matchedTrade,
})
.promise();
// Update the order in the orders table if quantity is not filled ELSE close it if completely filled
if (maker.quantity_removed !== maker.initial_quantity) {
updatedOrder = await dynamodb
.update({
TableName: process.env.ORDERS_TABLE_NAME,
key: { id: maker.id },
UpdateExpression:
"set quantity_removed = :quantity_removed, quantity_remaining = :quantity_remaining",
ExpressionAttributeValues: {
":quantity_remaining": maker.quantity_remaining,
":quantity_removed": maker.quantity_removed,
},
})
.promise();
} else {
updatedOrder = await dynamodb
.update({
TableName: process.env.ORDERS_TABLE_NAME,
Key: { id: maker.id },
UpdateExpression: "set #status = :status, orderRate = :orderRate",
ExpressionAttributeValues: {
":status": "CLOSED",
orderRate: `${maker.side}#CLOSED#${maker.rate}`,
},
ExpressionAttributeNames: {
"#status": "status",
},
})
.promise();
}
return Promise.all([savedTrade, updatedOrder]);
});
}
return Promise.all([tradePromises, saveTakerOrder]);
};
await Promise.all([syncEntities]);
} catch (error) {
console.error(error);
throw new createError.InternalServerError(error);
}
Please can someone point out what I am doing wrong with this code or help me correct it? Thanks!
All you need to do is remove the awaits from the calls to the things you want to include in your Promise.all. The moment you await something it is going to resolve the promise before continuing. If you were to look at the object you get back from await dynamodb.update(...).promise(), for example, you'll notice that you have a DynamoDBUpdateResponse object (at least I think that's the type). But if you remove the await you'd have a Promise<DynamoDBUpdateResponse> object.
You can still get at the results of the promises after you call await Promise.all. Each item in the array will have a record in the resulting array.
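In other words, collect the un-awaited promises and await them together. A rough sketch of the pattern using the calls from the question (table names and items here are placeholders):
// Kick off both writes without awaiting them individually...
const putPromise = dynamodb.put({ TableName: ordersTable, Item: takerItem }).promise();
const updatePromise = dynamodb.update({
    TableName: ordersTable,
    Key: { id: makerId },
    UpdateExpression: "set #status = :status",
    ExpressionAttributeNames: { "#status": "status" },
    ExpressionAttributeValues: { ":status": "CLOSED" },
}).promise();

// ...then wait for them all at once; each entry in the result array
// corresponds to one of the promises, in order.
const [putResult, updateResult] = await Promise.all([putPromise, updatePromise]);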
I am developing the backend of an application using Node.js, Sequelize and a Postgres database.
When a course is registered, the user must specify which organizations, companies and teachers will be linked to it.
The organization IDs are passed to the backend as an array, and I am trying to check that the passed IDs actually exist.
What I've done so far is this:
const { organizations } = req.body;
const organizationsArray = organizations.map(async (organization) => {
const organizationExists = await Organization.findByPk(organization);
if (!organizationExists) {
return res
.status(400)
.json({ error: `Organization ${organization} does not exists!` });
}
return {
course_id: id,
organization_id: organization,
};
});
await CoursesOrganizations.bulkCreate(organizationsArray);
This link has the complete controller code; I believe it will facilitate understanding.
When !organizationExists is true, I correctly get the response saying the organization does not exist. The problem is when the organization does exist: then I get the following error message.
The Array.map() is returning an array of promises that you can resolve to an array using Promise.all(). Inside the map you should use throw new Error() to break out of the map - this error will be raised by Promise.all() and you can then catch it and return an error to the client (or swallow it, etc).
This is a corrected version of your pattern, resolving the Promise results.
const { organizations } = req.body;
try {
// use Promise.all to resolve the promises returned by the async callback function
const organizationsArray = await Promise.all(
// this will return an array of promises
organizations.map(async (organization) => {
const organizationExists = await Organization.findByPk(organization, {
attributes: ['id'], // we only need the ID
raw: true, // don't need Instances
});
if (!organizationExists) {
// don't send response inside the map, throw an Error to break out
throw new Error(`Organization ${organization} does not exists!`);
}
// it does exist so return/resolve the value for the promise
return {
course_id: id,
organization_id: organization,
};
})
);
// if we get here there were no errors, create the records
await CoursesOrganizations.bulkCreate(organizationsArray);
// return a success to the client
return res.json({ success: true });
} catch (err) {
// there was an error, return it to the client
return res.status(400).json({ error: err.message });
}
This is a refactored version that will be a bit faster by fetching all the Organizations in one query and then doing the checks/creating the Course inserts.
const { Op } = Sequelize;
const { organizations } = req.body;
try {
// get all Organization matches for the IDs
const organizationsArray = await Organization.findAll({
attributes: ['id'], // we only need the ID
where: {
id: {
[Op.in]: organizations, // WHERE id IN (organizations)
}
},
raw: true, // no need to create Instances
});
// create an array of the IDs we found
const foundIds = organizationsArray.map((org) => org.id);
// check to see if any of the IDs are missing from the results
if (foundIds.length !== organizations.length) {
// Use Array.reduce() to figure out which IDs are missing from the results
const missingIds = organizations.reduce((missingIds, orgId) => {
if (!foundIds.includes(orgId)){
missingIds.push(orgId);
}
return missingIds;
}, []); // initialized to empty array
throw new Error(`Unable to find Organization for: ${missingIds.join(', ')}`);
}
// now create an array of courses to create using the foundIds
const courses = foundIds.map((orgId) => {
return {
course_id: id,
organization_id: orgId,
};
});
// if we get here there were no errors, create the records
await CoursesOrganizations.bulkCreate(courses);
// return a success to the client
return res.json({ success: true });
} catch (err) {
// there was an error, return it to the client
return res.status(400).json({ error: err.message });
}
If you have an array of IDs and you want to check that they all exist, you should use the Op.in operator. This way you hit the DB only once and get all the records in a single query (instead of fetching them one by one in a loop); after you get these records you can compare lengths to determine whether they all exist or not.
const { Op } = require("sequelize");
let foundOrgs = await Organization.findAll({
where: {
id: {
[Op.in]: organizationsArray,
}
}
});
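The length check described above would then look something like this (organizationsArray being the array of requested IDs from the request body, with response handling as in the first answer):
// Every requested ID was found only if the counts match.
if (foundOrgs.length !== organizationsArray.length) {
    const foundIds = foundOrgs.map((org) => org.id);
    const missingIds = organizationsArray.filter((orgId) => !foundIds.includes(orgId));
    return res
        .status(400)
        .json({ error: `Organizations ${missingIds.join(', ')} do not exist!` });
}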
I am struggling to get my head around Promises. I think I understand the concept, but I am unable to get them to work on the backend.
I have read several Stack Overflow posts. I still see a few which are only months old, so I guess I am not the only one :)
Specifically, I need help on how I can pass the result of a resolved promise within my code. In the code below, I fetch a JSON file from the starwars api and want to write it onto a mongodb atlas collection.
I use axios.get, which returns a promise. I then resolve it using .then and then use insertOne on mongoDB collections.
On the frontend, e.g. in React, it works as expected: you change the state by calling setState inside the .then function.
I don't understand why it doesn't work in the backend.
Could you please tell me what I need to change so I can get it to write to mongoDB atlas?
var axios = require("axios");
const MongoClient = require("mongodb").MongoClient;
var db;
const getData = () => {
return axios
.get("https://swapi.co/api/people/1")
.then(response => {
if (!response.data) throw Error("No data found.");
console.log(JSON.stringify(response.data)); // This returns the data as expected.
return JSON.stringify(response.data);
})
.catch(error => {
console.log(error);
throw error;
});
};
console.log(getData()); // This returns Promise { <pending> }
const client = new MongoClient(process.env.MONGODB_URL, {
useNewUrlParser: true,
useUnifiedTopology: true
});
// Connect to database and insert default users into users collection
client.connect(err => {
console.log("Connected successfully to database");
let d = {
name: "Luke Skywalker",
height: "172",
mass: "77",
hair_color: "blond",
skin_color: "fair",
eye_color: "blue"
};
db = client.db(process.env.DB_NAME);
db.collection("macroData").insertOne(d); //this works
db.collection("macroData").insertOne(getData); // this doesn't work as it still appears to be a promise
});
getData() returns a Promise, as you are well aware, so you have to wait on that promise to resolve. A straightforward approach would be to perform the insert once the data is available:
client.connect(err => {
// ...
getData().then(data => {
db.collection('macroData').insertOne(data)
})
})
Or, if you can use async/await:
client.connect(async err => {
// ...
const data = await getData()
db.collection('macroData').insertOne(data)
})
Your MongoDB call needs to be within the axios promise; that way the resolved value can be used to feed your database. That's what held me up for a while...
The code below is for Nick's comment; I couldn't post the code in the comment as it was too long. You should have a database named test (or another appropriate name) here: let datab = client.db("test"). I think it comes by default when you create a MongoDB Atlas cluster.
If you change the user and password in the Mongo URL, you should be good to go.
Hope that helps. This creates a Star Wars entry under test.starWarsData.
let axios = require("axios");
let MongoClient = require("mongodb").MongoClient;
let mongoParams = { useNewUrlParser: true, useUnifiedTopology: true };
let mongoUrl =
"mongodb+srv://user:password#cluster0-hnc4i.azure.mongodb.net/test";
//try feeding just the object e, and see if it works, in case your axios error catching is not great.
let e = {
a: "this is a",
b: "this is b",
c: "this is c",
d: "this is d"
};
let newUrl = "https://swapi.co/api/people/1";
console.log(newUrl);
let client = new MongoClient(mongoUrl, mongoParams);
client.connect(err => {
if (err) {
console.log(err.message);
throw new Error("failed to connect");
}
let datab = client.db("test");
console.log("db connected");
try {
axios.get(newUrl).then(res => {
try {
datab.collection("starWarsData").insertOne(res.data);
console.log("insert succeeded");
} catch (err) {
console.log("insert failed");
console.log(err.message);
}
});
} catch (err) {
throw Error("axios get did not work");
}
});
I deployed a function with the following query:
admin.firestore().collection("fcm").where("devices",'array-contains', mobile).get().then((snapshots)=> {...});
This returns the following error from the Cloud Function Log:
msgTrigger: Function execution started
msgTrigger: Function returned undefined, expected Promise or value
msgTrigger: Function execution took 8429 ms, finished with status: 'ok'
msgTrigger: Unhandled rejection
msgTrigger: TypeError: Cannot read property 'Symbol(Symbol.iterator)' of undefined at admin.firestore.collection.where.get.then (/user_code/index.js:23:65) at process._tickDomainCallback (internal/process/next_tick.js:135:7)
Anyone please?
I've been fighting with the editor here for days; I decided to post my function code in chunks:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);
var msgData;
var mobile;
Second part:
exports.msgTrigger = functions.firestore
.document('Messages/{MessageID}')
.onCreate((snapshot, context) => {
msgData = snapshot.data();
mobile = msgData.mobile;
admin.firestore().collection("fcm").where("devices", 'array-contains', mobile).get().then((snapshots) => {
Third part:
var tokens = [];
if (snapshots.empty) {
console.log('No devices');
} else {
for (var token of snapshot.docs) {
tokens.push(token.data().token);
}
var payLoad = {
"notification": {
"title": "de " + msgData.name,
"body": "Alerta de Emergência!",
"sound": "default",
"icon": msgData.icon
},
"data": {
"remetente": +msgData.name,
"mensagem": "Alerta de Emergência!"
}
}
Fourth part:
return admin.messaging().sendToDevice(tokens, payLoad).then((response) => {
console.log("mensagens enviadas");
}).catch((err) => {
console.log("erro: " + err);
});
}
});
});
Firestore 0.8 is quite an old version (see https://cloud.google.com/nodejs/docs/reference/firestore/0.8.x/). It is only from version 0.16 that you can use the array-contains query operator (see https://github.com/googleapis/nodejs-firestore/releases/tag/v0.16.0), so you should update to the latest version.
I've also adapted your function code by first (and very importantly, see below) returning the promise returned by the first asynchronous task, and then re-organising your promise chaining in the if/then/else.
Does it execute correctly now??
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(); // Here changed, see https://firebase.google.com/docs/functions/beta-v1-diff#new_initialization_syntax_for_firebase-admin
exports.msgTrigger = functions.firestore
.document('Messages/{MessageID}')
.onCreate((snapshot, context) => {
const msgData = snapshot.data(); //No need to declare this outside of the Cloud Function, see https://www.youtube.com/watch?v=2mjfI0FYP7Y
const mobile = msgData.mobile;
return admin // <- HERE return
.firestore()
.collection('fcm')
.where('devices', 'array-contains', mobile)
.get()
.then(snapshots => {
let tokens = [];
if (snapshots.empty) {
console.log('No devices');
return null;
} else {
for (var token of snapshot.docs) {
tokens.push(token.data().token);
}
var payLoad = {
notification: {
title: 'de ' + msgData.name,
body: 'Alerta de Emergência!',
sound: 'default',
icon: msgData.icon
},
data: {
remetente: +msgData.name,
mensagem: 'Alerta de Emergência!'
}
};
return admin.messaging().sendToDevice(tokens, payLoad);
}
})
.catch(err => {
console.log('erro: ' + err);
return null;
});
});
Why is it important to return the promises returned by asynchronous tasks? Watch the 3 videos about "JavaScript Promises" from the official Firebase video series (https://firebase.google.com/docs/functions/video-series/) for the answer!!
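The short version, as a sketch (doSomething is a hypothetical continuation): if the promise is not returned, Cloud Functions treats the work as finished as soon as the function body returns and may terminate the instance before the asynchronous work completes.
// Not returned: the runtime may shut the instance down before the query finishes.
exports.bad = functions.firestore.document('Messages/{id}').onCreate((snapshot, context) => {
    admin.firestore().collection('fcm').get().then(doSomething); // fire-and-forget
});

// Returned: the runtime waits for the whole promise chain to settle.
exports.good = functions.firestore.document('Messages/{id}').onCreate((snapshot, context) => {
    return admin.firestore().collection('fcm').get().then(doSomething);
});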
Removing ".where('devices', 'array-contains', mobile)" gives the same error. I added msgData.name and msgData.mobile to console.log and they get printed, so the first part is fine:
1:31:05.140 AM  msgTrigger  Function execution started
1:31:11.289 AM  msgTrigger  Nome: Alan
1:31:11.291 AM  msgTrigger  mobile: 91983372845
1:31:11.291 AM  msgTrigger  erro: TypeError: Cannot read property 'Symbol(Symbol.iterator)' of undefined
1:31:11.297 AM  msgTrigger  Function execution took 6158 ms, finished with status: 'ok'
Ok, for what it's worth I finally found the culprit:
for (var token of snapshot.docs) {
snapshot should be snapshots. Yes, it's embarrassing, and it took nothing less than the excellent Firebase Support Team to point it out to me. I wish Android Studio could pick up this kind of silly typo in JS code.
I will still mark Renaud's answer, since he helped optimize my code and gave me some useful tips along the way.