I am writing a simple web app with Firebase Hosting and Cloud Functions. My functions are an onCreate trigger, an onDelete trigger, and an HTTPS function. I want to test my app by running it locally. How can I do this, since firebase serve only works with HTTPS functions and Hosting?
I have tried running firebase serve and firebase functions:shell at the same time on different bash terminals. This causes firebase functions:shell to fail.
The create function:
exports.created = functions.firestore.document('Books/{bookID}')
  .onCreate((snapshot, context) => {
    FUNCTION_BODY
  });
The delete function:
exports.deleted = functions.firestore.document('Books/{bookID}')
  .onDelete((snapshot, context) => {
    FUNCTION_BODY
  });
The HTTPS function:
exports.app = functions.https.onRequest(app);
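For context, app is not shown in the question; it is presumably an Express application along these lines (a sketch, not the asker's actual code):
const express = require('express');
const app = express();
// ...routes served through the HTTPS function go here, for example:
app.get('/books', (req, res) => res.json([])); // hypothetical route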
The error thrown from bash:
$ firebase functions:shell
i functions: Preparing to emulate functions.
Warning: You're using Node.js v10.13.0 but Google Cloud Functions only supports v6.11.5.
! functions: Failed to emulate created
! functions: Failed to emulate deleted
! functions: Failed to emulate app
i functions: No functions to emulate.
No functions emulated.
Output from the second bash terminal:
i functions: Preparing to emulate functions.
Warning: You're using Node.js v10.13.0 but Google Cloud Functions only supports v6.11.5.
i hosting: Serving hosting files from: public
+ hosting: Local server: http://localhost:5000
info: initalised
info: rendering home...
+ functions: app: http://localhost:5001/book-shelf-be347/us-central1/app
info: Worker for app closed due to file changes.
Note: These are separate bash terminals running at the same time on the same machine.
I did some digging through the Firebase documentation and could not find a solution. This is probably because there are no official tools that let you do this. So I finally solved the problem by running the hosting with nodemon and then using
firebase serve --only functions
This solved the port conflict, since the hosting port is now handled by nodemon.
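For reference, one way to reproduce this setup is to serve the Hosting content with a tiny Express static server run under nodemon, while the functions keep running on the emulator's port 5001. This is a sketch under the assumption that the Hosting content is just the static public/ directory; serve-hosting.js is a hypothetical file name, and Hosting rewrites to the HTTPS function are not emulated this way.
// serve-hosting.js -- minimal stand-in for the Hosting emulator (run with: npx nodemon serve-hosting.js)
const express = require('express');
const app = express();
app.use(express.static('public')); // the Firebase Hosting "public" directory
app.listen(5000, () => console.log('Hosting available on http://localhost:5000'));
In a second terminal, firebase serve --only functions keeps serving the functions on http://localhost:5001 as before.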
I hope Firebase will provide new tools for this in the future.
Related
I am currently trying to connect a frontend (React) to a backend (Express/Node.js) within Azure App Services. I am using Windows, since "Virtual applications and directories" are currently not available for Linux, and according to my research that feature is necessary in this case.
Backend sample: server.js
const express = require('express');
const app = express();
const port = 3003;
require("dotenv").config(); // For process.env
[...]
app.get("/api/getBooks", async (req, res) => {
const books = await Books.find();
res.send(books);
});
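One thing the sample does not show is the listen call. On Azure App Service the process has to bind to the port Azure provides via process.env.PORT (the hard-coded 3003 only applies locally). A minimal sketch, appended to the same server.js:
// Azure App Service injects the port (a named pipe under iisnode on Windows) via process.env.PORT
app.listen(process.env.PORT || port, () => {
  console.log(`API listening on ${process.env.PORT || port}`);
});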
Frontend sample: App.js
const getBooks = () => {
  axios.get('/api/getBooks')
    .then(res => {
      setBooks(res.data);
      console.log("Got books: ");
      console.log(res.data);
    })
    .catch(err => {
      console.log(err);
    });
};
Azure: Folder structure
site/server/server.js (Express)
site/wwwroot/index.html (React)
I successfully executed "npm install" via "Development Tools/Console".
The two are already connected via Virtual applications in Azure by using the following configuration.
[Screenshot: Virtual applications configuration in the Azure Portal]
The app generally loads successfully. However, the connection to the backend is not working.
How can I start the node.js server now on Azure and make the proxy working?
I tried to start the server via "node server" on the console. But this does not seem to be working.
I discovered two possible ways to solve this issue.
Assuming you have a client (client/App.js) and a server (server/server.js).
Serve the React App via node.js/Express
Based on the above architecture, the structure needs to change a little, because the React app is no longer served by its own server but directly by Express.
In server/server.js, the following must be called after the Express app is created.
app.use(express.static("../client/build"));
After defining the API endpoints, the last route to define is the default route: the static output of the React build.
app.get("/", (res) => {
res.sendFile(path.resolve(__dirname, "client", "build", "index.html"));
});
Using an FTP client, you can now create the /client/build directory that will contain the built React app. Of course, another directory structure can be used.
The client files from the built React app are then simply uploaded there.
The deployment from the server is best done via Visual Studio Code and the Azure plugin.
In the above structure, /server would then be deployed to your app in the Azure extension (Azure/App Services --> Right click on "myapp" --> Deploy to Web App ...)
Create two App Services
For example: myapp.azurewebsites.net & myapp-api.azurewebsites.net
myapp must simply contain the built React app (/build) in the wwwroot directory. This can be achieved via FTP.
The deployment from the /server to myapp-api is best done via Visual Studio Code and the Azure plugin.
In the above structure, /server would then be deployed to myapp-api in the Azure extension (Azure/App Services --> Right click on "myapp-api" --> Deploy to Web App ...)
Also worth mentioning is that CORS should be configured, so that API calls can only be made from myapp.azurewebsites.net. This can be configured in the Azure Portal.
Occasionally the node dependencies have to be installed afterwards via the SSH console in the Azure Portal. For me it sometimes worked automatically and sometimes not.
To do this, simply change to the wwwroot directory (of the /server) and execute the following command.
npm cache clean --force && npm install
Combine this with React Router
React Router is usually used with React, and it can easily be combined with a statically served web app from Express (see the sketch after the links below).
https://create-react-app.dev/docs/deployment/#other-solutions
Excerpt
How to handle React Router with Node Express routing
https://dev.to/nburgess/creating-a-react-app-with-react-router-and-an-express-backend-33l3
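A sketch of the usual pattern from those guides: declare the API routes first, then a catch-all route that returns index.html, so client-side paths are resolved by React Router in the browser (same build-path assumption as above):
// API routes first...
app.get("/api/getBooks", async (req, res) => { /* ... */ });

// ...then the catch-all: any other path returns the React app
app.get("*", (req, res) => {
  res.sendFile(path.resolve(__dirname, "..", "client", "build", "index.html"));
});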
I ran
firebase serve --only functions
Then ran
functions inspect addMessage
So I could debug the addMessage function. Debugging however did not work.
Running
functions deploy addMessage --trigger-http
functions inspect addMessage
did work and allowed me to debug, but it doesn't seem to support hot reloading.
Is it possible to have hot reloading and debugging working at the same time?
My index.js:
const functions = require('firebase-functions');
// The Firebase Admin SDK to access the Firebase Realtime Database.
const admin = require('firebase-admin');
admin.initializeApp();
exports.addMessage = functions.https.onRequest((req, res) => {
  // Grab the text parameter.
  const original = "123"; // req.query.text;
  // Push the new message into the Realtime Database using the Firebase Admin SDK.
  return admin.database().ref('/messages').push({ original: original }).then((snapshot) => {
    // Redirect with 303 SEE OTHER to the URL of the pushed object in the Firebase console.
    return res.redirect(303, snapshot.ref.toString());
  });
});
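For reference, once firebase serve --only functions is running, the HTTPS function can be triggered with a plain request against the local URL pattern shown earlier in this thread (the project ID and region below are placeholders):
curl http://localhost:5001/YOUR-PROJECT-ID/us-central1/addMessage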
Try: ndb firebase serve
Debugger breakpoints are hit with stack traces visible. Note that it's a little slow, so give the debugger time to instrument the child processes.
Additionally, I was able to debug Cloud Functions in isolation using the following (values in caps are placeholders):
GCLOUD_PROJECT=THE-FIREBASE-PROJECT node --inspect-brk /path/to/functions-framework --target FUNCTION-NAME --port=5000
where /path/to/functions-framework is the full path to the installed functions-framework binary (global in my case), run from the working directory containing the index.js that defines the target functions.
Alternatively, when or where FIREBASE_CONFIG is needed, try this format, adjusted to fit:
FIREBASE_CONFIG="{\"databaseURL\":\"https://YOUR-FIREBASE-PROJECT.firebaseio.com\",\"storageBucket\":\"YOUR-FIREBASE-PROJECT.appspot.com\",\"projectId\":\"YOUR-FIREBASE-PROJECT\"}"
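Putting the two together, a sketch of the full invocation (all-caps values and the framework path are placeholders to adjust):
GCLOUD_PROJECT=THE-FIREBASE-PROJECT FIREBASE_CONFIG="{\"databaseURL\":\"https://YOUR-FIREBASE-PROJECT.firebaseio.com\",\"storageBucket\":\"YOUR-FIREBASE-PROJECT.appspot.com\",\"projectId\":\"YOUR-FIREBASE-PROJECT\"}" node --inspect-brk /path/to/functions-framework --target FUNCTION-NAME --port=5000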
https://github.com/GoogleChromeLabs/ndb
https://cloud.google.com/functions/docs/functions-framework
https://github.com/GoogleCloudPlatform/functions-framework-nodejs/issues/15
As of firebase-tools v7.11.0, the Firebase emulator now supports attaching a debugger with the --inspect-functions option. This answer shows WebStorm-specific instructions that can be easily adapted to other debuggers.
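For example (9229 is Node's default inspector port, which the emulator uses unless told otherwise):
firebase emulators:start --only functions --inspect-functions
Then attach your IDE's Node.js remote debugger to localhost:9229; the functions emulator reloads the code when index.js changes, so breakpoints and hot reloading work together.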
Is it possible to avoid running against the actual Firebase database during development by firing up a local instance and developing against that, like we do with other databases such as MongoDB and MySQL?
You can try this module: firebase-server.
I built an open-source project called firebase-server to implement end-to-end tests in my own application. With firebase-server, my end-to-end tests are now running 40% faster and I no longer depend on an Internet connection for running the tests in development.
Firebase Web Socket Protocol Server. Useful for emulating the Firebase server in tests.
var FirebaseServer = require('firebase-server');
new FirebaseServer(5000, 'localhost.firebaseio.test', {
  states: {
    CA: 'California',
    AL: 'Alabama',
    KY: 'Kentucky'
  }
});
Client side:
var client = new Firebase('ws://localhost.firebaseio.test:5000');
client.on('value', function(snap) {
  console.log('Got value: ', snap.val());
});
For more details
end-to-end-testing-with-firebase-server
firebase-local-development-and-testing-in-angularfire
You can use the Firebase Local Emulator Suite:
The Firebase Local Emulator Suite is a set of advanced tools for developers looking to build and test apps locally using Cloud Firestore, Realtime Database, Cloud Functions, Cloud Pub/Sub and Firebase Hosting.
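A minimal way to try it (assuming firebase-tools is installed and the project has already been initialized with firebase init):
firebase init emulators
firebase emulators:start
The emulators run on localhost, and the client and Admin SDKs can be pointed at them, so no production data is touched during development.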
I was deploying functions just fine, but then it stopped working, and I don't know why. I've reverted back to the sample code (from here or here):
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);
// Listens for new messages added to /messages/:pushId/original and creates an
// uppercase version of the message to /messages/:pushId/uppercase
exports.makeUppercase = functions.database.ref('/messages/{pushId}/original')
  .onWrite(event => {
    // Grab the current value of what was written to the Realtime Database.
    const original = event.data.val();
    console.log('Uppercasing', event.params.pushId, original);
    const uppercase = original.toUpperCase();
    // You must return a Promise when performing asynchronous tasks inside a function, such as
    // writing to the Firebase Realtime Database.
    // Setting an "uppercase" sibling in the Realtime Database returns a Promise.
    return event.data.ref.parent.child('uppercase').set(uppercase);
  });
But now, when I run firebase deploy --only functions I get:
=== Deploying to 'mydb'...
i deploying functions
i functions: ensuring necessary APIs are enabled...
i runtimeconfig: ensuring necessary APIs are enabled...
+ runtimeconfig: all necessary APIs are enabled
+ functions: all necessary APIs are enabled
i functions: preparing functions directory for uploading...
i functions: packaged functions (2.04 KB) for uploading
! functions: Upload Error: Cannot read property 'response' of undefined
i starting release process (may take several minutes)...
i functions: updating function makeUppercase...
! functions[makeUppercase]: Deploy Error: Function load error: Node.js module defined by file index.js is expected to export function named makeUppercase
+ functions: 0 function(s) deployed successfully.
Functions deploy had errors. To continue deploying other features (such as database), run:
firebase deploy --except functions
Error: Functions did not deploy properly.
What is wrong?
The console shows the same error messages, without any further explanation.
Version 3.6.0 of the Firebase Tools just came out... after installing that version, the deploy worked fine!
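If you hit the same error, updating the CLI is a one-liner (assuming a global install):
npm install -g firebase-tools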
Inside your project, from the terminal:
npm install
firebase deploy
It is very helpful to examine the actual logs:
firebase functions:log
The specific issue will be visible there. I sometimes had errors as simple as a missing package.
I am having a huge headache trying to get the ZMQ Node bindings working with Electron, especially on Windows. I am working on Windows 7 and Ubuntu 16.04, and each of them has its own issue.
On Windows, I get an error when I try to do require('zmq')
C:\vueelectron\app\node_modules\bindings\bindings.js:91 Uncaught Error: Could not locate the bindings file. Tried:
→ C:\vueelectron\app\node_modules\zmq\build\zmq.node
→ C:\vueelectron\app\node_modules\zmq\build\Debug\zmq.node
→ C:\vueelectron\app\node_modules\zmq\build\Release\zmq.node
→ C:\vueelectron\app\node_modules\zmq\out\Debug\zmq.node
→ C:\vueelectron\app\node_modules\zmq\Debug\zmq.node
→ C:\vueelectron\app\node_modules\zmq\out\Release\zmq.node
→ C:\vueelectron\app\node_modules\zmq\Release\zmq.node
→ C:\vueelectron\app\node_modules\zmq\build\default\zmq.node
→ C:\vueelectron\app\node_modules\zmq\compiled\6.1.0\win32\x64\zmq.node
I've tried compiling with VS 2013 and 2015, rebuilt multiple times, and used electron-rebuild; nothing seems to work.
On Linux it loads up fine, but when I send a message it seems to get stuck in a loop somewhere and keeps sending hundreds of messages indefinitely. This was resolved by upgrading from the version of ZMQ in the Ubuntu repositories to the latest one downloaded from the ZeroMQ website.
This is the code I used in my index.html file of my Electron app.
const electron = require('electron')
const zmq = require('zmq')
const socket = zmq.socket('req')
socket.connect('tcp://10.10.0.51:3111')
socket.on('message', function (data) {
console.log(socket.identity + ': answer data ' + data)
})
socket.send('test')
Has anyone else been able to get Electron + ZMQ working? If so, what is your development environment like? Thanks.
The problem is the mismatch between the Node.js version bundled with Electron and your local version of Node. The long answer is that you need to compile Electron and ZeroMQ against the same Node.js headers. Here is the response from the Electron community: http://github.com/electron/electron/issues/6805. There's a short answer now, though!
Use zeromq in place of zmq (same API). zeromq provides prebuilt binaries for Electron and Node.js on Linux, Windows, and macOS/OS X. After installing zeromq, rebuild for the version of Electron you're using:
npm rebuild zeromq --runtime=electron --target=1.4.5
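In the snippet above, the switch is essentially a drop-in change (a sketch; the rest of the code stays the same):
const zmq = require('zeromq') // instead of require('zmq'); the classic socket API is unchanged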
Thanks to the zeromq.js team and have fun with ZeroMQ!
It might be safer to put access to your queue behind an API layer. You might have better success with stability too; native modules in Electron can be very tricky.
And by that I mean have a REST server that your Electron application communicates with. The Electron app would send a message to that API, which then queues the message for your application. Restrict access to the queue at the network level so that only the API server can reach it.
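A rough sketch of that shape (file name, port, and endpoint are hypothetical; the queue address is the one from the question and should only be reachable from this server):
// api-server.js -- hypothetical REST layer in front of the queue
const express = require('express')
const zmq = require('zeromq')

const app = express()
app.use(express.json())

const socket = zmq.socket('req')
socket.connect('tcp://10.10.0.51:3111') // the queue; restricted at the network level to this host

app.post('/enqueue', (req, res) => {
  // forward the request body to the queue and return the reply to the Electron app
  socket.once('message', (reply) => res.json({ reply: reply.toString() }))
  socket.send(JSON.stringify(req.body))
})

app.listen(3000, () => console.log('API layer listening on http://localhost:3000'))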