Cloud Functions hidden after uploading to GitHub? - javascript

I have downloaded Node.js and created the Firebase function files in the website directory (firebase.json, the functions folder, and others).
If I write JavaScript Cloud Functions inside the project/functions/index.js file, they won't be private when I upload the project to my GitHub repository for my static website (something.github.io).
So how would I go about calling the Firebase Cloud Functions in my index.js from my static website without uploading the index.js (to keep certain functions private)?
Edit:
I understand now that there are environment variables, but how can I incorporate them into a GitHub Pages website with the Firebase Admin SDK and Cloud Functions?
How do I upload my GitHub Pages project and still link my client side to the environment variables? Would I need to upload my index.js containing my Cloud Functions? But doesn't uploading the index.js defeat the purpose of the client not being able to see the functions/data?
A comment below mentioned a service called Heroku; what exactly is its purpose when I already have GitHub and Firebase interacting with my website and database?
I also saw a method of using dotenv to create a .env file for secret data (such as API keys), with .gitignore preventing the file from being uploaded. Would that work on GitHub Pages, and if so, would the client be able to see the .env? And if they can't, can the client-side website still read the .env even though it isn't pushed to GitHub?

This is a good use of environment variables. Essentially, say you had an API key of 12345. And you had a function as such:
async function fetchResults() {
  await fetch("https://myapi.com/lookup?key=12345");
}
You could do instead:
async function fetchResults() {
  await fetch(`https://myapi.com/lookup?key=${process.env.API_KEY}`);
}
The environment variable is kept on your machine so it never goes to GitHub, hence allowing you to expose the code as open source while maintaining confidentiality of your sensitive keys.
Edit: So I reread your question and I see you're talking about publishing to GitHub Pages. The key thing to note is that the user can see anything that is hosted client-side. GitHub Pages only hosts the "client" portion of your application. So if your client (browser) website makes an API call to myapi.com/lookup?key=12345, the user will be able to see the key no matter what, since it's their browser making the request and they can see anything their browser does.
However, the best practice here is to write server-side code to run part of your application as well. For this, you can use what I suggested above, where you add environment variables to whichever server you use for hosting (for example, you can do this easily with Zeit Now or Heroku). That way, you can share your code while your environment variables stay secret on the machine that runs your server-side code.
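For instance, here is a minimal sketch of such a server-side proxy using Express, assuming Node 18+ for the built-in fetch (the /api/lookup route, the myapi.com URL, and the API_KEY variable name are illustrative assumptions):
// server.js — runs on your hosting provider (e.g. Heroku), never on GitHub Pages
const express = require("express");
const app = express();

// The key is read from the server's environment, so it never appears
// in the client bundle or in the Git repository.
app.get("/api/lookup", async (req, res) => {
  const response = await fetch(
    `https://myapi.com/lookup?key=${process.env.API_KEY}`
  );
  res.json(await response.json());
});

app.listen(process.env.PORT || 3000);
Your GitHub Pages site then calls /api/lookup on this server instead of calling myapi.com directly, so the key never reaches the browser.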

Related

How to write to environment variables on Heroku

I have made a website meant to be used by only one person, so I want to dynamically write to a .env file on Heroku without it resetting,
because this is meant only for one person. I don't want to deal with a database.
Something like this:
require("dotenv").config();
console.log(process.env.MYVAL); // Not my value
process.env.MYVAL = "MYVAL";
console.log(process.env.MYVAL); // MYVAL
You could use the Heroku API to do that, but it will have to restart the dyno (Docs).
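For illustration, setting a config var through the Platform API looks roughly like this (a sketch assuming Node 18+ fetch, an API token in HEROKU_API_KEY, and a hypothetical app name):
// Updates a config var on the app; note this restarts the dyno.
async function setConfigVar(name, value) {
  const res = await fetch("https://api.heroku.com/apps/my-app/config-vars", {
    method: "PATCH",
    headers: {
      Accept: "application/vnd.heroku+json; version=3",
      Authorization: `Bearer ${process.env.HEROKU_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ [name]: value }),
  });
  return res.json();
}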
You can set environment variables in the Settings tab of your Heroku dashboard, and also via the command line. Please check the following documentation for more information.
Configuration and Config Vars
You need to persist data (even if it is a single value). Therefore you should not write to the Heroku file system (it is ephemeral and resets on dyno restart), nor store the value in environment variables (Heroku config vars).
I understand using a database might not be worth it, and in this case I would use external file storage (Amazon S3, Dropbox, or even a GitHub private repository).
Files on Heroku lists some options, with (Python) sample code.
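For example, persisting a single value to S3 from Node could look like this sketch using the AWS SDK for JavaScript v3 (the bucket and key names are hypothetical):
const { S3Client, PutObjectCommand, GetObjectCommand } = require("@aws-sdk/client-s3");

const s3 = new S3Client({ region: "us-east-1" });

// Write the value as a tiny S3 object instead of a local file or config var.
async function saveValue(value) {
  await s3.send(new PutObjectCommand({
    Bucket: "my-app-state",   // hypothetical bucket
    Key: "state/myval.txt",   // hypothetical key
    Body: value,
  }));
}

async function loadValue() {
  const res = await s3.send(new GetObjectCommand({
    Bucket: "my-app-state",
    Key: "state/myval.txt",
  }));
  return res.Body.transformToString();
}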

Allow full synchronized filesystem read/write access over a subdirectory of an S3 bucket?

I have a web service involving the attachment of multiple documents to one of many "objects". To simplify this process and make it easier to edit each file individually if the user so desires, I want the user to be able to synchronise all of these files onto a directory on his/her computer (much like that of Google Drive or Dropbox). If the user were to change one of these files, remove a file, or add a file, this would reflect on my web service, and hence affect the files that are attached to this "object".
What would be the best choice of services to do this? I am currently using a Node.js back-end, although I suspect this will do little to influence the choice of storage. I'm looking for a service that allows the user the flexibility of full filesystem-level CRUD, whilst synchronising the files in a secure manner to a subset of a larger object-storage collection, much like synchronising (and only providing access to) a subdirectory of an AWS S3 bucket, hence the title of this question.
I'm currently looking into (somehow) doing this with AWS S3, although I am open to using another storage service.
Thanks in advance.
AWS Storage Gateway (cached gateway) allows you to edit files locally, and the Gateway will synchronise the updates automatically over to S3.
You will need to install a small VM on the machine. Typically, if your clients have a private data centre / server, this configuration allows a shared folder (or a NAS) to be synchronised with S3.
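If you instead drive the synchronisation yourself, a minimal one-way pull of one bucket "subdirectory" (prefix) with the AWS SDK for JavaScript v3 might look like this sketch (the bucket, prefix, and region are illustrative assumptions):
const { S3Client, ListObjectsV2Command, GetObjectCommand } = require("@aws-sdk/client-s3");
const { mkdir, writeFile } = require("node:fs/promises");
const { dirname, join } = require("node:path");

const s3 = new S3Client({ region: "us-east-1" });
const BUCKET = "my-service-files"; // hypothetical bucket
const PREFIX = "objects/1234/";    // the per-object "subdirectory"

// Downloads every object under PREFIX into localRoot, page by page.
async function pullPrefix(localRoot) {
  let ContinuationToken;
  do {
    const page = await s3.send(new ListObjectsV2Command({ Bucket: BUCKET, Prefix: PREFIX, ContinuationToken }));
    for (const obj of page.Contents ?? []) {
      const dest = join(localRoot, obj.Key.slice(PREFIX.length));
      await mkdir(dirname(dest), { recursive: true });
      const data = await s3.send(new GetObjectCommand({ Bucket: BUCKET, Key: obj.Key }));
      await writeFile(dest, await data.Body.transformToByteArray());
    }
    ContinuationToken = page.NextContinuationToken;
  } while (ContinuationToken);
}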

Google APIs - removing auth scope

I'm changing scopes in an app for Google Classroom. I removed .readonly from the courses scope and added the coursework students scope:
var SCOPES = "https://www.googleapis.com/auth/classroom.courses https://www.googleapis.com/auth/classroom.coursework.students";
I get this error when requesting students even after logging out and attempting to re-authenticate:
Request had insufficient authentication scopes
It seems the token has been cached somewhere.
This GitHub issue, although for Google Sheets, says the token is in the Documents/.credentials/ folder. I don't have this folder on my MacBook Pro (macOS Sierra 10.12.6), though.
Where can I find that folder and remove the saved scopes so it reauthenticates and accepts my new scopes?
If you change the scopes needed in your application, then the user will need to re-authenticate your application, especially if you go from a read-only scope to a read-write scope. This is because you need additional permissions beyond what you had originally requested. List of Google Classroom scopes
Assuming that you are using the Google .NET client library, you can find the user credentials in the %appdata% folder on your machine. By deleting that file you can force re-authentication. I am guessing that you are, since this is the GitHub project you have linked to.
Note: there should be a way of forcing re-auth via code, but I can't remember the command right now; I will have to look it up.
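For the JavaScript client shown in the question, one possibility is to force the consent screen on sign-in, which makes Google grant the new scopes afresh (a sketch assuming the legacy gapi.auth2 library):
// Re-shows the consent screen instead of reusing the cached grant.
gapi.auth2.getAuthInstance().signIn({
  scope: SCOPES,
  prompt: "consent",
});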

Where to "hide" an API key

My question is about where and how to:
hide an API key
access the hidden API key in my app.js file
In this case I am dealing with an Algolia Admin API Key (but aiming to ask this question in a fairly generic way).
In order for my app to work, based on certain user actions I need to update my Algolia index (which requires an Admin API Key).
I understand that I am not supposed to expose my Admin API Key to the front-end (i.e. don't put it in my app.js file). How can I securely pass my Admin API Key to my app.js file, so that I can make updates to my Algolia index?
Some things I've come across:
- Should I hide it in a config.json file? I can't figure out how to use information exported from a config file in my JS (which maybe would defeat the purpose anyway?). I read this
- Also, the Firestore docs mention, in reference to working with Algolia, that "App ID and API Key are stored in functions config variables". How do I store my API key in config variables?
- I read this about using environment variables and exporting them to the app.js file. But if I then push the app-env file to the server (which I assume I'll need to in order for the app.js file to read the API key), how is that more secure than just putting it in the file?
You cannot give the key to the client, through any mechanism, and keep it secret.
Build a web service (in your server-side language of choice, with whatever authentication is appropriate) and write your own API to allow a limited selection of actions to be performed. Your web service then acts as a client to Algolia.
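A minimal sketch of that idea with Express and the Algolia JavaScript client (the route and index name are assumptions, and you would add real authentication before allowing writes):
const express = require("express");
const algoliasearch = require("algoliasearch");

// The Admin API Key lives only on the server, read from the environment.
const client = algoliasearch(process.env.ALGOLIA_APP_ID, process.env.ALGOLIA_ADMIN_KEY);
const index = client.initIndex("my_index"); // hypothetical index name

const app = express();
app.use(express.json());

// Expose only the narrow action the client is allowed to perform.
app.post("/records", async (req, res) => {
  // TODO: authenticate the caller before allowing writes.
  const result = await index.saveObject(req.body, { autoGenerateObjectIDIfNotExist: true });
  res.json(result);
});

app.listen(process.env.PORT || 3000);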
After doing a bit more research this is what I've decided.
I need to create a separate .js file that contains the Admin API Key and the function that listens for changes in my database and makes the appropriate updates to Algolia (that's pretty obvious).
Next I need to get that file to run. I believe I have two options:
Run the file locally. This is secure because the file is not being served anywhere beyond my local machine. The main drawback is that I'd basically have to keep the script running all the time, which would quickly become impractical in production.
Host my file somewhere like Heroku or Nodejitsu, so that it can run perpetually on their servers.
Private Server
We can hide an API key/secret (here, for example, the Algolia Admin API Key) by creating a separate private server.
The private server can be kept private to yourself or the development team.
The app/website sends an API request to the private server, which uses the API key/secret and performs the necessary actions.
It then sends only the necessary data back to the app/website as the response.
This way the API key is hidden from the user.
This approach provides a layer of abstraction over the API key/secret, and the functionality that uses the key stays with the server-side logic.
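From the app/website side, the request then targets your private server rather than Algolia directly; a sketch (the URL is a placeholder):
// The browser only ever sees your own endpoint, never the admin key.
async function createRecord() {
  const res = await fetch("https://my-private-server.example.com/records", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ title: "New record" }),
  });
  return res.json();
}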

Prevent users from seeing Meteor client script by role

In Meteor we put all sensitive code in /server and browser code in /client. Meteor then automatically compiles and minifies all /client side code for us. Thanks Meteor.
However, I'm using https://github.com/alanning/meteor-roles to manage content by user roles. One of those roles is administrator, and I have client-side scripts for use only by that user, e.g. /client/admin-only/**.js. All code in those scripts checks that the user is an administrator and only calls the server to do sensitive tasks, but I don't want anyone but an administrator to be able to even see that code.
What I want to ensure is that these client admin JS files are only downloaded to users who are actual administrators and not included in the auto-compiled/minified JS created by Meteor.
Is there any way to set up Meteor to generate two versions of its client JS - one for normal users and one for administrators - and only download those files based on user role?
The Meteor Guide addresses this issue:
While the client-side code of your application is necessarily accessible by the browser, every application will have some secret code on the server that you don’t want to share with the world. Secret business logic in your app should be located in code that is only loaded on the server. This means it is in a server/ directory of your app, in a package that is only included on the server, or in a file inside a package that was loaded only on the server.
Basically, MDG's guidance is to dumb down that admin view as much as possible. If that's not acceptable, you'll need to bundle it in a separate Meteor application, either on an internally accessible network only, or using two MongoDB instances so you can separate authentication out for the second app.
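In practice, that means the sensitive logic lives in a Meteor method defined under server/, while the (visible) client code only calls it; a sketch assuming the alanning:roles package from the question (the method name is hypothetical):
// server/admin-methods.js — shipped only in the server bundle
Meteor.methods({
  "admin.sensitiveTask"(payload) {
    if (!Roles.userIsInRole(this.userId, "administrator")) {
      throw new Meteor.Error("not-authorized");
    }
    // secret business logic stays here, never sent to any browser
  },
});

// client code (safe for everyone to see) just invokes it:
Meteor.call("admin.sensitiveTask", payload, (err, result) => { /* ... */ });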
