Unit testing of HttpsCallable cloud functions - online mode - javascript

I am designing a backend API (for Android and iOS apps) around HttpsCallable cloud functions. It has become quite cumbersome to test them through the app, so I wish to switch to unit testing the functions (before production deployment) using the firebase-functions-test tool. I have been following this unit testing tutorial.
I am having some issues running the unit tests in online mode. Let me provide some details.
Here is my `package.json` content:
{
"name": "functions",
"description": "Cloud Functions for Firebase",
"scripts": {
"lint": "eslint .",
"serve": "firebase serve --only functions",
"shell": "firebase functions:shell",
"start": "npm run shell",
"deploy": "firebase deploy --only functions",
"logs": "firebase functions:log",
"test": "mocha --reporter spec"
},
"engines": {
"node": "8"
},
"dependencies": {
"#google-cloud/tasks": "^1.9.0",
"#googlemaps/google-maps-services-js": "^2.0.2",
"chai": "^4.2.0",
"firebase-admin": "^8.6.0",
"firebase-functions": "^3.3.0",
},
"devDependencies": {
"eslint": "^5.12.0",
"eslint-plugin-promise": "^4.0.1",
"firebase-functions-test": "^0.1.7",
"mocha": "^7.1.1"
},
"private": true
}
I am using Google APIs (Directions, Geocoding, etc.) from the Firebase backend, so to be able to access them while running my tests, I configured my tests, located at test/index.test.js, as recommended in the unit testing tutorial:
const firebaseConfig = {
apiKey: ...,
authDomain: ...,
databaseURL: "https://my-project.firebaseio.com",
projectId: "my-project",
storageBucket: "my-project.appspot.com",
messagingSenderId: ********,
appId: *********
};
const test = require('firebase-functions-test')(firebaseConfig
, 'path_to_json_keyfile/myproject.json');
Below is the sample test code. Note that all my callable functions return an HttpsError, and the test below simply checks the code field of the HttpsError; if it is 'ok', the test passes. The functions I am testing reside in a separate rides.js file, which index.js exports as exports.rides = require('./rides.js') (not shown below).
const chai = require('chai');
const assert = chai.assert;
describe('Cloud functions', () => {
let rideFunctions;
before(() => {
rideFunctions = require('../index.js');
});
after(() => {
test.cleanup();
});
describe('getRideEstimate', () => {
it('should return ok', async () => {
const data = {
riderType: 1,
pickup_lat: 37.0,
pickup_lng: -122,
dropoff_lat: 37.01,
dropoff_lng: -122.01,
scheduledTime: 1585939000,
ts: 1585939000,
tz_id: "uslax"
}
const context = {
auth: {
uid: AUTH_ID_FOR_USER_ACCOUNT
}
};
const wrappedGetRideEstimate = test.wrap(rideFunctions.rides.getRideEstimate);
let httpsError = await wrappedGetRideEstimate(data, context);
return assert.equal(httpsError.code, 'ok');
});
})
describe('testCallable', () => {
it('should return ok', async () => {
const data = {};
const context = {};
const wrappedTestCallable = test.wrap(rideFunctions.rides.testCallable);
let httpsError = await wrappedTestCallable(data, context);
return assert.equal(httpsError.code, 'ok');
})
})
})
Problem
My simple testCallable function of the form
exports.testCallable = functions.https.onCall((data, context) => {
console.log('testCallable');
return new functions.https.HttpsError('ok', '', '');
})
passes the test (as expected), but inexplicably it seems to be running in offline mode, as there is no record of it in the Cloud Functions logs in the Firebase Console. Also, if I disable my laptop's connectivity, the test result remains the same, suggesting that this function is somehow running in offline mode.
My getRideEstimate function, which calls the Google Directions API, returns a lengthy 400 error indicating some issue with forming the request to the Directions API. I don't know whether this error is related to the first problem, but since the Directions API call is embedded deeper inside getRideEstimate, it does suggest that the function is being invoked but the API call is failing.
Is there any other way to test HttpsCallable functions in online mode?

For me this works:
import * as firebaseSetup from 'firebase-functions-test';
export const testEnvironment = firebaseSetup({
databaseURL: "https://project-id.firebaseio.com",
projectId: 'project-id'
}, './service-account-project-id-firebase-adminsdk.json');
You can find a full YouTube tutorial here: https://youtu.be/UDMDpdu5-rE?t=1122

Related

Puppeteer Web Scraper runs on local machine only but fails to deploy as Google Cloud Function

I have built a simple scraper with Puppeteer which I can run locally on my machine, but when I deploy it as a Google Cloud function, it's not working. The only error message I get from the Google Cloud Logs is:
Function failed on loading user code. This is likely due to a bug in
the user code.
Here are the steps I follow to deploy the function; the code is further below.
(Note: I'm outlining the process of zipping the files; I have tried the Cloud Function inline editor as well but am receiving the same error.)
Run npm install
Run zip -r my-app.zip *
Create new Google Cloud function
-- Name 'newFunction'
-- Memory: 1gb
-- Runtime: Node.js 14
-- Entry point: scrapeFunction
Upload Zip
index.js
const puppeteer = require('puppeteer');
const { BigQuery } = require('@google-cloud/bigquery');
async function scrapeFunction() {
const browser = await puppeteer.launch({ args: ['--no-sandbox', '--disable-setuid-sandbox'] });
const page = await browser.newPage();
await page.goto('<URL>', {waitUntil: 'load', timeout: 0});
await page.waitForSelector('span.text');
const info = await page.evaluate(() => {
return document.querySelector('span.text').innerText;
});
console.log(info);
// Write results to BigQuery table
const bigqueryClient = new BigQuery();
const dataset = bigqueryClient.dataset('xxx');
const table = dataset.table('yyy');
const rows = [{ info: info }];
await table.insert(rows);
await browser.close();
}
scrapeFunction();
package.json
{
"name": "newFunction",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"keywords": [],
"author": "",
"license": "ISC",
"dependencies": {
"#google-cloud/bigquery": "^6.1.0",
"puppeteer": "^19.7.1"
}
}

How can I speed up Firebase functions in an Express app that are taking too long to load (more than 8 seconds most times)?

We have an application that uses a Node.js and Express backend powered by Firebase, but one problem we're facing is very long load times, even for simple queries. We refactored our endpoints into different files so that they don't all run at once, and we also have cron jobs, which I suspected would help with cold starts. However, I doubt cold starts are the cause, as all requests, including subsequent ones, are slow: they often take more than 8 seconds, which is poor performance for users to experience.
This is what we have in our package.json, so you can see the Firebase versions we are using:
{
"name": "functions",
"description": "Cloud Functions for Firebase",
"scripts": {
"serve": "firebase serve --only functions",
"shell": "firebase functions:shell",
"start": "npm run shell",
"deploy": "firebase deploy --only functions",
"logs": "firebase functions:log"
},
"engines": {
"node": "14"
},
"dependencies": {
"#google-cloud/storage": "^5.8.2",
"#sendgrid/mail": "^7.2.1",
"algoliasearch": "^4.3.0",
"bcrypt": "^5.1.0",
"busboy": "^0.3.1",
"cookie-parser": "^1.4.5",
"cors": "^2.8.5",
"dayjs": "^1.10.4",
"dotenv": "^8.2.0",
"easy-soap-request": "^4.1.3",
"express": "^4.17.1",
"firebase": "^7.15.5",
"firebase-admin": "^8.6.0",
"firebase-functions": "^3.23.0",
"fs-extra": "^9.0.1",
"jwt-decode": "^2.2.0",
"moment": "^2.29.1",
"request": "^2.88.2",
"sharp": "^0.25.4",
"sib-api-v3-sdk": "^8.4.2",
"uuid": "^8.2.0",
"xml-js": "^1.6.11"
},
"devDependencies": {
"firebase-functions-test": "^0.1.6"
},
"private": true
}
Below is the index.js file showing how we set everything up.
require("dotenv").config();
const functions = require("firebase-functions");
const express = require("express");
const app = express();
const cookieParser = require("cookie-parser");
const cors = require("cors");
app.use(cors());
app.use(cookieParser());
app.use(express.json());
const dashboardRoutes = require("./routes/dashboardRoutes");
const userRoutes = require("./routes/userRoutes");
const pagesRoutes = require("./routes/pagesRoutes");
const orderRoutes = require("./routes/orderRoutes");
const cartRoutes = require("./routes/cartRoutes");
const wishlistRoutes = require("./routes/wishlistRoutes");
const authRoutes = require("./routes/authRoutes");
const storeRoutes = require("./routes/storeRoutes");
const createSellerRoutes = require("./routes/createSellerRoutes");
app.use("/", pagesRoutes);
app.use("/user", userRoutes);
app.use("/dash", dashboardRoutes);
app.use("/order", orderRoutes);
app.use("/cart", cartRoutes);
app.use("/wishlist", wishlistRoutes);
app.use("/auth", authRoutes);
app.use("/s", storeRoutes);
app.use("/cr", createSellerRoutes);
const {
cron_job1,
cron_job2,
cron_job3,
cron_job4,
} = require("./triggers/search_triggers_and_cron_jobs"); <-- not the name of the actual file
const { **other cron jobs** } = require("./cron-jobs");
const {
update_exchange_currency_rates,
} = require("./cron-jobs/currency_exchange_rates");
const { reset_product_visits } = require("./triggers/products");
const { Home } = require("./handlers/pages");
const { db } = require("./util/admin");
const { product_basic_obj } = require("./util/product_basic_obj");
exports.apis = functions.https.onRequest(app);
// the functionality below is a sample of the kind of executions we perform in the endpoints, where we also experience slow execution times, even for a function in this file
app.get("/test-home", (req, res) => {
let content = {};
db.collection("products")
.where("status", "==", "active")
.orderBy("visited", "desc")
.limit(20)
.get()
.then((data) => {
content.popular_today = [];
data.forEach((x) => {
content.popular_today.push(product_basic_obj(x.data()));
});
return db
.collection("products")
.where("status", "==", "active")
.orderBy("todaysSales", "desc")
.limit(20)
.get();
})
.then((data) => {
content.hot_today = [];
data.forEach((x) => {
content.hot_today.push(product_basic_obj(x.data()));
});
return db.collection("departments").get();
})
.then((data) => {
content.departments = [];
data.forEach((x) => {
content.departments.push(x.data());
});
return db
.collection("departments")
.orderBy("products_sold_today", "desc")
.limit(6)
.get();
})
.then((data) => {
content.top_departments = [];
data.forEach((x) => {
content.top_departments.push(x.data());
});
return res.status(200).json(content);
});
});
//cron jobs
exports.cron_job1 = cron_job1;
exports.cron_job2 = cron_job2;
exports.cron_job3 = cron_job3;
exports.cron_job4 = cron_job4;
Upon executing an endpoint, this is what shows in the console; in a deployed environment we experience the same slow execution times, which seem to be the average for all our executions:
i functions: Beginning execution of "us-central1-apis"
⚠ Google API requested!
- URL: "https://www.googleapis.com/oauth2/v4/token"
- Be careful, this may be a production service.
i functions: Finished "us-central1-apis" in ~9s
i functions: Finished "us-central1-apis" in ~8s
i functions: Beginning execution of "us-central1-apis"
i functions: Finished "us-central1-apis" in ~7s
i functions: Beginning execution of "us-central1-apis"
i functions: Finished "us-central1-apis" in ~7s
i functions: Beginning execution of "us-central1-apis"
i functions: Finished "us-central1-apis" in ~7s
i functions: Beginning execution of "us-central1-apis"
i functions: Finished "us-central1-apis" in ~6s
i functions: Beginning execution of "us-central1-apis"
i functions: Finished "us-central1-apis" in ~7s
i functions: Beginning execution of "us-central1-apis"
i functions: Finished "us-central1-apis" in ~7s
How can I speed up the execution using the information above?
We tried breaking the code into smaller files, which did not give us the faster executions we expected, and we also removed most of our third-party libraries, but neither made a difference. What could we do to bring execution times further down?
Your data loading strategy is sequential: Load 1, then Load 2, then Load 3. When none of the later loads depends on the result of a previous one, that approach is not effective.
Instead, you can use Promise.all() to fire all of those promises "in parallel".
The next issue: you are loading departments from Firestore twice, once as departments and again as top_departments. There is no need to load top_departments separately, because all the data you need is already in departments; you only have to .sort() and .slice() it (or a shallow copy, [...departments]).
I'd try this approach:
// popular_today
function getPopularProductsByVisitedAsync() {
return db
.collection("products")
.where("status", "==", "active")
.orderBy("visited", "desc")
.limit(20)
.get()
.then((data) => {
return data.docs.map((x) => product_basic_obj(x.data()));
});
}
// hot_today
function getPopularProductsByTodaySalesAsync() {
return db
.collection("products")
.where("status", "==", "active")
.orderBy("todaysSales", "desc")
.limit(20)
.get()
.then((data) => {
return data.docs.map((x) => product_basic_obj(x.data()));
});
}
function getAllDepartmentsAsync() {
return db
.collection("departments")
.get()
.then((data) => data.docs.map((x) => x.data()));
}
app.get("/test-home", async (req, res) => {
const [popular_today, hot_today, departments] = await Promise.all([
getPopularProductsByVisitedAsync(),
getPopularProductsByTodaySalesAsync(),
getAllDepartmentsAsync()
]);
// Sort descending so the departments with the most sales today come first.
const top_departments = [...departments]
.sort((a, b) => b.products_sold_today - a.products_sold_today)
.slice(0, 6);
const content = {
popular_today: popular_today,
hot_today: hot_today,
departments: departments,
top_departments: top_departments
};
return res.status(200).json(content);
});
Try to execute your requests in parallel in index.js. This optimization will provide some gains in network request timing.
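The effect of firing the three queries at once can be sketched with plain promises, independent of Firestore; the loaders, values, and delays below are made-up stand-ins for the real queries:

```javascript
// Stand-ins for the three Firestore queries; each resolves after a
// simulated network delay (values and delays here are made up).
const delay = (ms, value) =>
  new Promise((resolve) => setTimeout(() => resolve(value), ms));

const getPopularProductsByVisitedAsync = () => delay(50, ["p1", "p2"]);
const getPopularProductsByTodaySalesAsync = () => delay(50, ["h1", "h2"]);
const getAllDepartmentsAsync = () =>
  delay(50, [
    { name: "toys", products_sold_today: 3 },
    { name: "books", products_sold_today: 9 },
    { name: "food", products_sold_today: 5 },
  ]);

async function loadHome() {
  // All three queries start at once, so the total wait is roughly the
  // slowest single query, not the sum of all three.
  const [popular_today, hot_today, departments] = await Promise.all([
    getPopularProductsByVisitedAsync(),
    getPopularProductsByTodaySalesAsync(),
    getAllDepartmentsAsync(),
  ]);
  // Derive top_departments locally instead of issuing a fourth query.
  const top_departments = [...departments]
    .sort((a, b) => b.products_sold_today - a.products_sold_today)
    .slice(0, 2);
  return { popular_today, hot_today, departments, top_departments };
}
```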

Firebase function crashes while accessing Firestore

Could you please find the error in the following code?
const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();
exports.GetShort = functions.https.onRequest((request, response) => {
response.header("Access-Control-Allow-Origin", "*");
longURL = request.query.long
functions.logger.info("url is - " ,longURL)
SaveToDB(longURL)
})
function SaveToDB(link){
functions.logger.info("here")
admin.firestore().collection("url").where("urlNames","array_contains",link).get().then(
function(querySnapshot){
functions.logger.info("snap, " ,querySnapshot)
querySnapshot.forEach(function(doc) {
functions.logger.info("things : " ,doc.id, " => ", doc.data())
// doc.data() is never undefined for query doc snapshots
console.log(doc.id, " => ", doc.data());
});
}
) .catch(function(error) {
functions.logger.info("Error getting documents: ", error);
});
}
After hitting the above function, the firebase-functions log displays entries up to "here". After that it crashes without any further logs or stack trace.
Below is the content of package.json from the functions directory.
{
"name": "functions",
"description": "Cloud Functions for Firebase",
"scripts": {
"lint": "eslint .",
"serve": "firebase emulators:start --only functions",
"shell": "firebase functions:shell",
"start": "npm run shell",
"deploy": "firebase deploy --only functions",
"logs": "firebase functions:log"
},
"engines": {
"node": "14"
},
"main": "index.js",
"dependencies": {
"firebase-admin": "^9.8.0",
"firebase-functions": "^3.14.1"
},
"devDependencies": {
"eslint": "^7.6.0",
"eslint-config-google": "^0.14.0",
"firebase-functions-test": "^0.2.0"
},
"private": true
}
I would kindly suggest you watch the 3 videos about "JavaScript Promises" from the Firebase video series to see how to manage the life cycle of a Cloud Function and the way to deal with calls to asynchronous methods.
In particular, for an HTTPS Cloud Function, you need to end it with send(), redirect(), or end().
So your code could be adapted as follows:
exports.GetShort = functions.https.onRequest((request, response) => {
response.header("Access-Control-Allow-Origin", "*");
const longURL = request.query.long;
functions.logger.info("url is - ", longURL)
SaveToDB(longURL)
.then(() => {
response.status(200).send('Saved to DB');
})
.catch(error => {
// See video series
response.status(500).send(error);
})
})
function SaveToDB(link) {
functions.logger.info("here")
return admin.firestore().collection("url").where("urlNames", "array_contains", link).get()
.then(querySnapshot => {
functions.logger.info("snap, ", querySnapshot)
querySnapshot.forEach(function (doc) {
functions.logger.info("things : ", doc.id, " => ", doc.data())
// doc.data() is never undefined for query doc snapshots
console.log(doc.id, " => ", doc.data());
// => Here, depending on your real functional requirements, you may need to use Promise.all()
});
return null;
}
).catch(function (error) {
functions.logger.info("Error getting documents: ", error);
// Throw an error
});
}
Well, I found the problem; sometimes the silliest things make the most noise.
Instead of "array-contains", I wrote "array_contains".

TypeError: querySnapshot.forEach is not a function - Firebase Cloud Functions

I have a data structure made this way:
Posts (collection)
  UserId (document)
    Posts (collection)
      postDoc (document)
I'm setting up a cloud function to change all Posts (subcollection) documents upon a certain event, and to do so I'm using collectionGroup queries:
This is how I set it up:
exports.changeIsVisibleFieldAfterDay = functions.pubsub
.schedule("every 2 minutes").onRun((context) => {
const querySnapshot = db.collectionGroup("Posts")
.where("isVisible", "==", true).get();
querySnapshot.forEach((doc) => {
console.log(doc.data());
});
});
In the Firebase Log I receive the following error:
TypeError: querySnapshot.forEach is not a function
Searching online, I found that there may be a problem with the Firebase SDK version, but mine is up to date. I attach the package.json file here:
{
"name": "functions",
"description": "Cloud Functions for Firebase",
"scripts": {
"lint": "eslint .",
"serve": "firebase emulators:start --only functions",
"shell": "firebase functions:shell",
"start": "npm run shell",
"deploy": "firebase deploy --only functions",
"logs": "firebase functions:log"
},
"engines": {
"node": "12"
},
"main": "index.js",
"dependencies": {
"firebase-admin": "^9.2.0",
"firebase-functions": "^3.11.0"
},
"devDependencies": {
"eslint": "^7.6.0",
"eslint-config-google": "^0.14.0",
"firebase-functions-test": "^0.2.0"
},
"private": true
}
The get() method is asynchronous, so you either need to use then() to get the querySnapshot when the Promise returned by get() is fulfilled, or use async/await. More details on how to deal with asynchronous calls in this SO answer.
With then()
const functions = require('firebase-functions');
// The Firebase Admin SDK to access Firestore.
const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();
exports.changeIsVisibleFieldAfterDay = functions.pubsub
.schedule("every 2 minutes").onRun((context) => {
return db.collectionGroup("Posts").where("isVisible", "==", true).get()
.then(querySnapshot => {
querySnapshot.forEach((doc) => {
console.log(doc.data());
});
return null;
})
});
With async/await
const functions = require('firebase-functions');
// The Firebase Admin SDK to access Firestore.
const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();
exports.changeIsVisibleFieldAfterDay = functions.pubsub
.schedule("every 2 minutes").onRun(async (context) => {
const querySnapshot = await db.collectionGroup("Posts").where("isVisible", "==", true).get();
querySnapshot.forEach((doc) => {
console.log(doc.data());
});
return null;
});
Note the return null; at the end. See this doc item for more details on this key point.
Note also that if you want to update several docs within the forEach() loop, you will need to use Promise.all(). Many SO answers cover this case.
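That update case can be sketched without a live Firestore, using an in-memory stand-in for the snapshot docs; the update() stub below is an assumption mirroring the real DocumentReference.update() shape:

```javascript
// In-memory stand-in for query snapshot docs; update() is a stub that
// resolves asynchronously, like DocumentReference.update() would.
const makeDoc = (id) => ({
  id,
  data: { isVisible: true },
  update(fields) {
    return Promise.resolve().then(() => Object.assign(this.data, fields));
  },
});

const docs = [makeDoc("a"), makeDoc("b"), makeDoc("c")];

// Collect one promise per document and wait for all of them, instead of
// firing updates inside forEach() and returning before they finish.
function hideAll(snapshotDocs) {
  return Promise.all(
    snapshotDocs.map((doc) => doc.update({ isVisible: false }))
  );
}
```

Returning the Promise.all() promise from the function body gives Cloud Functions something to wait on before tearing the instance down.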

I am trying to run my test script using Mocha but there is an error: "ReferenceError: beforeEach is not defined"

I am trying to run the test script for my todo app in Node.js using Mocha, but the test suite fails with a reference error: "beforeEach is not defined".
const request = require('supertest');
const expect = require('expect');
const {app} = require('./../server');
const {Todo} = require('./../models/todo');
beforeEach((done) => {
Todo.remove({}).then(() => done());
});
describe('POST /todos', () => {
it('should create a new todo', (done) => {
var text = 'Test todo text';
request(app)
.post('/todos')
.send({text})
.expect(200)
.expect((res) => {
expect(res.body.text).toBe(text);
})
.end((err, res) => {
if (err) {
return done(err);
}
Todo.find().then((todos) => {
expect(todos.length).toBe(1);
expect(todos[0].text).toBe(text);
done();
}).catch((e) => done(e));
});
});
it('should not create todo with invalid body data', (done) => {
request(app)
.post('/todos')
.send({})
.expect(400)
.end((err, res) => {
if (err) {
return done(err);
}
Todo.find().then((todos) => {
expect(todos.length).toBe(0);
done();
}).catch((e) => done(e));
});
});
});
Also, I have included all the necessary packages in my package.json file, which is given below.
{
"name": "todo-api",
"version": "1.0.0",
"description": "",
"main": "index.js",
"scripts": {
"test": "mocha server/**/*.test.js",
"test-watch": "nodemon --exec 'npm test' "
},
"author": "",
"license": "ISC",
"dependencies": {
"bluebird": "^3.5.3",
"body-parser": "^1.15.2",
"express": "^4.14.0",
"mongodb": "^2.2.5",
"mongoose": "^4.5.9"
},
"devDependencies": {
"expect": "^1.20.2",
"mocha": "^3.0.2",
"nodemon": "^1.10.2",
"supertest": "^2.0.0"
}
}
It looks to me like you are trying to clear out the data in your MongoDB database before each new test of your todo app, i.e. you want to find your todo collection and drop everything in it before the next test runs. mongoose.connection.collections.todo references the todo collection sitting inside your database directly, and calling its drop() function empties it:
beforeEach(() => {
mongoose.connection.collections.todo.drop();
});
Please let me know if this fulfils your requirement. Keep in mind that this is an asynchronous operation, so you need to pause the entire testing environment until the operation is complete; you already know this, because you have already attempted to implement the done callback.
Also, drop() accepts a callback function, like so:
beforeEach((done) => {
mongoose.connection.collections.todo.drop(() => {});
});
The callback is only executed once the drop has completed, so you call done() from inside it:
beforeEach((done) => {
mongoose.connection.collections.todo.drop(() => {
done();
});
});
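The same done pattern can be sketched without a live database, using a made-up stand-in for mongoose's collection.drop(callback):

```javascript
// Stand-in for mongoose's collection.drop(callback): clears the docs and
// invokes the callback asynchronously, like the real driver would.
const todoCollection = {
  docs: ["walk dog", "buy milk"],
  drop(callback) {
    setImmediate(() => {
      this.docs = [];
      callback();
    });
  },
};

// The hook shape Mocha expects: accept `done` and call it only after the
// asynchronous drop has completed, so each test starts with a clean slate.
function beforeEachHook(done) {
  todoCollection.drop(() => {
    done();
  });
}
```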
Also, make sure you run the tests through Mocha (npm run test): beforeEach is a global that the Mocha runner injects, so executing the file directly with node would raise exactly this ReferenceError.
I've just tried to replicate your issue in a simple repository using your package.json file. My test file:
const expect = require('expect');
beforeEach((done) => {
done();
});
describe('just a test', function() {
it('test', function() {
expect(true).toBe(true);
})
});
Then, when running npm t, the test was executed successfully.
Perhaps there is a misconfiguration in your project.
