Where to "hide" an API key - javascript

My question is about where and how to:
- hide an API key
- access the hidden API key in my app.js file
In this case I am dealing with an Algolia Admin API Key (but aiming to ask this question in a fairly generic way).
In order for my app to work, I need to update my Algolia index based on certain user actions (which requires an Admin API Key).
I understand that I am not supposed to expose my Admin API Key to the front-end (i.e. don't put it in my app.js file). How can I securely pass my Admin API Key to my app.js file, so that I can make updates to my Algolia index?
Some things I've come across:
- Should I hide it in a config.json file? I can't figure out how to use information exported from a config file in my JS (which maybe would defeat the purpose anyway?). I read this
- Also, the Firestore docs mention, in reference to working with Algolia, that "App ID and API Key are stored in functions config variables". How do I store my API key in config variables?
- I read this about using environment variables, and exporting them to the app.js file. But if I then push the app-env file to the server (which I assume I'll need to in order for the app.js file to read the API key), how is that more secure than just putting it in the file?

You cannot give the key to the client, through any mechanism, and keep it secret.
Build a web service (in your server-side language of choice, with whatever authentication is appropriate) and write your own API to allow a limited selection of actions to be performed. Your web service then acts as a client to Algolia.
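For illustration, a minimal sketch of that idea (not from the original answer), assuming Node with Express and the algoliasearch client; the index name, route, and missing authentication are placeholders:

```js
// server.js - the Admin API Key never leaves this process.
const express = require('express');
const algoliasearch = require('algoliasearch');

const app = express();
app.use(express.json());

// Key is read from the server's environment, not bundled into app.js.
const client = algoliasearch(process.env.ALGOLIA_APP_ID, process.env.ALGOLIA_ADMIN_KEY);
const index = client.initIndex('products'); // hypothetical index name

// Expose only the one action the front-end is allowed to trigger.
// Real code would authenticate the user before touching the index.
app.post('/api/products/:id', async (req, res) => {
  try {
    await index.saveObject({ objectID: req.params.id, ...req.body });
    res.json({ ok: true });
  } catch (err) {
    res.status(500).json({ error: 'index update failed' });
  }
});

app.listen(3000);
```

The front-end then calls /api/products/:id instead of Algolia directly, so the Admin API Key never appears in the bundle.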

After doing a bit more research this is what I've decided.
I need to create a separate .js file that contains the Admin API Key and the function that listens for changes in my database, and makes the appropriate updates to Algolia (that's pretty obvious).
Next I need to get that file to run. I believe I have two options:
Run the file locally. This is secure because the file is not being served anywhere beyond my local machine. The main drawback here is that I'd basically have to keep the script running on my machine all the time, which would quickly become impractical in production.
Host my file somewhere like Heroku or Nodejitsu, so that it can run perpetually on their servers.
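As a hedged sketch of what that hosted file could look like, here is option 2 written as a Firebase Cloud Function (the approach the Firestore docs hint at, with the key kept in functions config variables rather than in app.js); the "notes" collection and index name are hypothetical:

```js
// functions/index.js
// Key stored with:
//   firebase functions:config:set algolia.app_id="..." algolia.admin_key="..."
const functions = require('firebase-functions');
const algoliasearch = require('algoliasearch');

const client = algoliasearch(
  functions.config().algolia.app_id,
  functions.config().algolia.admin_key
);
const index = client.initIndex('notes');

// Listens for changes in the database and mirrors them into Algolia.
exports.syncToAlgolia = functions.firestore
  .document('notes/{noteId}')
  .onWrite((change, context) => {
    if (!change.after.exists) {
      return index.deleteObject(context.params.noteId); // document was deleted
    }
    return index.saveObject({
      objectID: context.params.noteId,
      ...change.after.data(),
    });
  });
```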

Private Server
We can hide an API key/secret (here, for example, the Algolia Admin API Key) by creating a separate private server.
The private server can be kept private to yourself or the development team.
The app/website sends an API request to the private server.
The private server uses the API key/secret to perform the necessary actions.
It then sends only the necessary data back to the app/website as the response.
This way the API key is hidden from the user.
This approach also adds a layer of abstraction: the functionality that uses the API key/secret stays with the server-side logic instead of living in the client.
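A rough sketch of the client side of this arrangement (names are hypothetical): the app calls the private server and only ever sees the response data, never the key.

```js
// app.js (front-end) - no Algolia/Admin key anywhere in here.
async function updateRecord(id, changes) {
  const res = await fetch(`/api/products/${id}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(changes),
  });
  if (!res.ok) throw new Error('update failed');
  return res.json(); // only the data the private server chose to return
}
```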

Related

How can I keep hidden fields like API keys hidden after bundling in CRA

I have a simple interface but I work with multiple APIs, so I don't want to use a backend for it. How can I hide my API keys in CRA without using a backend?
It is suggested to use environment variables for this, and I have used them as shown in the documentation, but this is not exactly what I want. Even though I use environment variables, a simple CTRL + F search in the "chunk.js" files reveals things that should remain hidden, like my API keys. Is there a way to completely prevent this?
No, not really; even if you hid them in the bundle, someone could still sniff them out easily enough with Wireshark or other network monitoring tools.
The way to combat this is to build your own API that connects to the other APIs and returns only the required content.
Users then need to obtain a license key for your API, which you can revoke on your server if you want to refuse access once a license key has been shared publicly.
If you use an API key in a JS bundle like this, it ends up in the bundle; either you bundle it or you don't.
So you can:
Bundle the API key with the JS (not secure)
Get the API key via an API endpoint
Obfuscate the key in the client bundle (e.g. https://github.com/anseki/gnirts), but it's still not secure and somebody can figure it out eventually.
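To make the "own API with a revocable license key" idea above concrete, here is a hedged sketch (all names, the upstream URL, and the in-memory key store are hypothetical; it assumes Node 18+ for the global fetch):

```js
const express = require('express');
const app = express();

// In practice this would be a database so individual keys can be revoked at any time.
const validLicenseKeys = new Set(['demo-license-123']);

app.get('/api/weather', async (req, res) => {
  if (!validLicenseKeys.has(req.header('X-License-Key'))) {
    return res.status(403).json({ error: 'invalid or revoked license key' });
  }
  // The real third-party API key stays on the server and never reaches the CRA bundle.
  const upstream = await fetch(
    `https://api.example.com/weather?key=${process.env.WEATHER_API_KEY}`
  );
  res.json(await upstream.json());
});

app.listen(3000);
```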

Allow full synchronized filesystem read/write access over a subdirectory of an S3 bucket?

I have a web service involving the attachment of multiple documents to one of many "objects". To simplify this process and make it easier to edit each file individually if the user so desires, I want the user to be able to synchronise all of these files onto a directory on his/her computer (much like that of Google Drive or Dropbox). If the user were to change one of these files, remove a file, or add a file, this would reflect on my web service, and hence affect the files that are attached to this "object".
What would be the best choice of services to do this? I am currently using a Node.JS back-end, although I suspect this will do little to influence the choice of storage. I'm looking for a service that allows the user the flexibility of full filesystem-level CRUD, whilst synchronising the files in a secure manner to a subset of a larger object storage collection, much like synchronising (and only providing access to) a subdirectory of an AWS S3 bucket, hence the title of this question.
I'm currently looking into (somehow) doing this with AWS S3, although I am open to using another storage service.
Thanks in advance.
AWS Storage Gateway (cached gateway) allows you to edit files locally and the Gateway will synchronize the update automatically over to S3.
You will need to install a small VM on the machine. Typically, if your clients have a private data centre/server, this configuration allows a "share folder" (or a NAS) to be synchronized with S3.

Securing JS client-side SDKs

I'm working on a React-Redux web-app which integrates with AWS Cognito for user authentication/data storage and with the Shopify API so users can buy items through our site.
With both SDKs (Cognito, Shopify), I've run into an issue: Their core functionality attaches data behind the scenes to localStorage, requiring both SDKs to be run client-side.
But running this code entirely client-side means that the API tokens which both APIs require are completely insecure, such that someone could just grab them from my bundle and then authenticate/fill a cart/see inventory/whatever from anywhere (right?).
I wrote issues on both repos to point this out. Here's the more recent one, on Shopify. I've looked at similar questions on SO, but nothing I found addresses these custom SDKs/ingrained localStorage usage directly, and I'm starting to wonder if I'm missing/misunderstanding something about client-side security, so I figured I should just ask people who know more about this.
What I'm interested in is whether, abstractly, there's a good way to secure a client-side SDK like this. Some thoughts:
Originally, I tried to proxy all requests through the server, but then the localStorage functionality didn't work, and I had to fake it out post-request and add a whole bunch of code that the SDK is designed to take care of. This proved prohibitively difficult/messy, especially with Cognito.
I'm also considering creating a server-side endpoint that simply returns the credentials and blocks requests from outside the domain. In that case, the creds wouldn't be in the bundle, but wouldn't they be eventually scannable by someone on the site once that request for credentials has been made?
Is the idea that these secret keys don't actually need to be secure, because adding to a Shopify cart or registering a user with an application don't need to be secure actions? I'm just worried that I obviously don't know the full scope of actions that a user could take with these credentials, and it feels like an obvious best practice to keep them secret.
Thanks!
Can't you just put the keys and such in a .env file? This way nobody can see what keys you've got stored in there. You can then access your keys through process.env.YOUR_VAR
For Cognito you could store stuff like user pool id, app client id, identity pool id in a .env file.
NPM package for dotenv can be found here: NPM dotenv
Furthermore, what supersecret stuff are you currently storing that you're worried about? By "API tokens", do you mean the OpenId token which you get after authenticating to Cognito?
I can respond to the Cognito portion for this. Your AWS Secret Key and Access Key are not stored in the client. For your React.js app, you only need the Cognito User Pool Id and the App Client Id in your app. Those are the only keys that are exposed to the user.
I cover this in detail in a comprehensive tutorial here - http://serverless-stack.com/chapters/login-with-aws-cognito.html
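As a small sketch of the .env approach for the values that are safe to expose (assuming Create React App, where variables must be prefixed with REACT_APP_ and are inlined into the bundle at build time, so only non-secret IDs belong there):

```js
// .env (placeholders):
//   REACT_APP_COGNITO_USER_POOL_ID=us-east-1_XXXXXXXXX
//   REACT_APP_COGNITO_APP_CLIENT_ID=xxxxxxxxxxxxxxxxxxx

// cognito.js
import { CognitoUserPool } from 'amazon-cognito-identity-js';

// Only the User Pool Id and App Client Id are used client-side;
// no AWS Secret Key or Access Key ever appears in the bundle.
export const userPool = new CognitoUserPool({
  UserPoolId: process.env.REACT_APP_COGNITO_USER_POOL_ID,
  ClientId: process.env.REACT_APP_COGNITO_APP_CLIENT_ID,
});
```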

Get access to DocumentDB with JS

I'm developing an app, which should connect to an external DocumentDB database (not mine). The app is build with Cordova/Ionic.
I found a JavaScript library from Microsoft Azure for connecting to a DocumentDB database, but it asks for some weird stuff like collection_rid and tokens.
I've got the following from the guys of the external DocumentDB database:
Endpoint: https://uiuiui.documents.azure.com:443/
Live DocumentDB API ReadOnly Key: P8riQBgFUH...VqFRaRA==
.Net Connection String: AccountEndpoint=https://uiuiui.documents.azure.com:443/;AccountKey=jl23...lk23==;
But how am I supposed to retrieve the collection_rid and token from this information?
Without row-level authorization, DocumentDB is designed to be accessed from a server-side app, not directly from javascript in the browser. When you give it the master token, you get full access which is generally not what you want for your end-user clients. Even the read-only key is usually not what you want to hand out to your clients. The Azure-provided javascript library is designed to be run from node.js as your server-side app.
That said, if you really want to access it from the browser without a proxy app running on a server, you can definitely do so using normal REST calls directly hitting the DocumentDB REST API. I do not think the Azure-provided SDK will run directly in the browser, but with help from Browserify and some manual tweaking (it's open source) you may be able to get it to run.
You can get the collection name from the same folks who provided you the connection string information and use name-based routing to access the collection. I'm not sure exactly what you mean by token but I'm guessing that you are referring to the session token (needed for session-level consistency). Look at the REST API specs if you want to know the details about how that token gets passed back and forth (in HTTP headers) but it's automatically taken care of by the SDKs if you go that route.
Please note that DocumentDB also provides support equivalent to row-level authorization by enabling you to create specific permissions on the desired entities. Once you have such a permission, you can retrieve the corresponding token, which is scoped to be valid for a certain time period. You would need to set up a mid-tier that can fetch these tokens and distribute to your user application. The user application can then use these tokens as bearer-tokens instead of using the master key.
You can find more details at https://msdn.microsoft.com/en-us/library/azure/dn783368.aspx
https://msdn.microsoft.com/en-us/library/azure/7298025b-bcf1-4fc7-9b54-6e7ca8c64f49
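For what it's worth, a rough sketch of the suggested mid-tier (not from the question): a small Node service keeps the key, uses name-based routing against the collection, and returns only the resulting documents to the Cordova app. It assumes the legacy documentdb SDK, and the database/collection names are hypothetical.

```js
const express = require('express');
const { DocumentClient } = require('documentdb');

const app = express();

// The key stays on this server; the Cordova/Ionic app never sees it.
const client = new DocumentClient('https://uiuiui.documents.azure.com:443/', {
  masterKey: process.env.DOCUMENTDB_KEY,
});

// Name-based routing: no collection_rid needed, just the database and collection names.
const collectionLink = 'dbs/mydb/colls/items';

app.get('/api/items', (req, res) => {
  client.queryDocuments(collectionLink, 'SELECT * FROM c').toArray((err, docs) => {
    if (err) return res.status(500).json({ error: 'query failed' });
    res.json(docs); // only the data, never the key or tokens
  });
});

app.listen(3000);
```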

Which (if any) Javascript storage API (Google Drive, Dropbox, OneDrive) provides automatic syncing?

I have an application that was developed using HTML and javascript. What I need now is to make use of a cloud storage system to access a user's files, which could either be using Google Drive, OneDrive or Dropbox.
One of the requirements is that the application should sync so that new files are added automatically and deleted files removed etc. The sync should be automatic, and there should be no need to poll for changes in the code "manually".
I have determined (as far as I can tell) that with the Dropbox Javascript API, you have to poll for changes and then pull the changes. It seems also with the Google Drive Javascript API that you need to watch for changes and then get those changes. I was leaning towards using OneDrive, but my big problem with that API is that you can (well, so it seems) only access files through a file picker, and I need to get the files without involving the user.
Can anyone confirm the above?
If not, if you need to poll for changes, which would be the best API to use?
And just if anyone has an idea, how often should this be done, and where in the code? Is there some sort of guideline for this?
You can get properties for Files and Folders without the need of the file picker.
File and folder properties (Windows Runtime apps using JavaScript and HTML)
The user will need to authenticate with the service as well as grant consent for your application to access their data. Other than that, there would be no user interaction required.
You can also use the REST APIs directly once authenticated and granted access. The REST APIs are documented here.
Using the REST API
As for the polling interval, I might consider using an "observer" design pattern. Your cloud storage component would register with the "provider" (the parent HTML application) for notifications. You could have the "sync" logic execute when a predefined operation occurs, such as login. You could persist the modified date/time of your application's root data folder and then only look for changes when there is a mismatch.
Polling at a given frequency will only ensure that the data is in sync at that specific time. The user's sync state may or may not be valid when they access your application, regardless of the frequency you put on the polling method.
Regarding the Dropbox API at least, this is correct. Using the Dropbox JavaScript SDK you need to poll for changes and then pull those changes into your app's local state.
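For reference, a minimal polling sketch with the official dropbox JS SDK (newer SDK versions wrap the payload in { result }; the access token and interval are placeholders):

```js
const { Dropbox } = require('dropbox');
const dbx = new Dropbox({ accessToken: process.env.DROPBOX_TOKEN });

let cursor = null;

async function pollForChanges() {
  const response = cursor
    ? await dbx.filesListFolderContinue({ cursor })
    : await dbx.filesListFolder({ path: '', recursive: true });
  const result = response.result;
  cursor = result.cursor;

  for (const entry of result.entries) {
    // Apply each added/changed/deleted entry to the app's local state here.
    console.log(entry['.tag'], entry.path_lower);
  }
  if (result.has_more) return pollForChanges();
}

// Poll on an interval; filesListFolderLongpoll is an alternative to fixed-interval polling.
setInterval(() => pollForChanges().catch(console.error), 30000);
```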
