Node.js client api key - javascript

I have a Node.js backend for an iOS app that will provide JSON data to the app. I want to handle client authentication for each app. The users do not need to create an account. I only want to identify the client apps when providing data and save some data for each client on the Node server.
How do I handle identifying each app on the server?
If I need to create an API key, how do I handle that?
If there is a way to authenticate the app when it first accesses the API, how can I create a unique identifier for the app?
Last, what do I need to know before I deploy the Node server? Can I get away with just pointing a domain to my router, opening a port, and serving the API from there, or is it a must to have a web server set up to handle that?
Thank you

You can find a lot of blog posts on best practices to follow when designing an API, but here is an overall idea.
You can create a client key and send it on every API request or add it as part of the URL.
Example: api.example.com/v1/users?client=android&version=1.1
Use middleware. You can either assign key names at your convenience or use a database to store key/value pairs to manage your clients.
Example:
Create a middleware that handles authentication and API key checking before forwarding the request to the routes.
android => 0, ios => 1, web => 2
url: api.example.com/v1/users?client=0&version=1.1
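As a minimal sketch of that middleware idea, here is an Express-style key checker. The in-memory Map registry is a hypothetical stand-in; in production this would be backed by the database mentioned above:

```javascript
// Hypothetical in-memory client registry; in production this would be a database table.
const clients = new Map([
  ['0', { name: 'android' }],
  ['1', { name: 'ios' }],
  ['2', { name: 'web' }],
]);

// Express-style middleware: reject any request whose ?client= key is unknown.
function clientKeyChecker(req, res, next) {
  const key = req.query.client;
  if (!clients.has(key)) {
    res.statusCode = 401;
    return res.end(JSON.stringify({ error: 'unknown client' }));
  }
  req.client = clients.get(key); // make the client info available to the routes
  next();
}
```

In an Express app you would register this with `app.use(clientKeyChecker)` before your `/v1` routes, so every request is checked in one place.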
There are many ways to create API keys. Here are some of them:
UUID - https://www.npmjs.com/package/uuid
Json web token - https://github.com/auth0/node-jsonwebtoken
Oauth - https://github.com/ciaranj/node-oauth
Again, you can find a lot of online posts explaining best practices to follow in production. If you use Express.js, you can find best practices here: Express Production.
This is just an overview. I suggest you do more research online and ask about the more concrete problems you face as you learn.

Related

Use separate server for centralized users database

I am using Meteor 1.10 + mongodb.
I have multiple mobile chat & information applications.
These mobile applications are natively developed using Meteor DDP libraries.
But I have the same user base for all the apps.
Now I want to create a separate Meteor instance on a separate individual server to keep the user base centralized.
I need suggestions on how I can achieve this architecture with Meteor,
keeping reactivity and performance in mind.
For a centralized user-base with full reactive functionality you need an Authorization Server which will be used by your apps (= Resource Servers) in order to allow an authenticated/authorized request. This is basically the OAuth2 3-tier workflow.
See:
https://www.rfc-editor.org/rfc/rfc6749
https://www.oauth.com/
Login Service
You will also have to write your own login handler (Meteor.loginWithMyCustomAuthServer) in order to avoid DDP.connect because you would then have to manage two userbases (one for the app itself and one for the Authorization Server) and this will get really messy.
This login handler then retrieves the user account data after the OAuth2 authorization request has succeeded, which makes the Authorization Server's user base the single point of truth for any of your registered apps (read up on the OAuth2 workflow regarding clientId and secret).
Subscribing to users
The Auth server is the single point of truth: you create, update, or delete your users there, and on a successful login your local app will always get the latest user data synced from this accounts server (this is how Meteor does it with loginWith<Service>, too).
You then subscribe to your users in the app itself without any DDP remote connection. This of course works only if the user data you want is for users who are currently online.
If you want to subscribe for any user (whose data might not have been synced yet), you still need a remote subscription to a publication on the Authorization server.
Note, that in order to authenticate users with this remote subscription you need an authenticated DDP request (which is also backed by the packages below).
Implementation
Warning: the following is my own implementation. I faced the same issue and found no other implementation before mine.
There is a fully working accounts server (though it is constantly a work in progress):
https://github.com/leaonline/leaonline-accounts
It uses an OAuth2 Node.js implementation, which has been wrapped inside a Meteor package:
https://github.com/leaonline/oauth2-server
and the respective login handler has also been created:
https://github.com/leaonline/meteor-accounts-lea
So I finally have a workaround. It might not be the perfect way to handle this, but it has worked well for me. That said, I am still open to suggestions.
Currently I have 4 connecting applications which depend on the same user base.
So I decided to build an SSO (a centralized server for managing the users database).
All 4 connecting applications ping the SSO for user authentication and user-related data.
Those 4 connecting applications are developed using Meteor.
The main challenge here was to make things reactive/realtime,
e.g. chat/messaging, group creation, showing the users list, and listeners for newly registered users.
In this scenario the users database was on another remote server (the SSO), so on a connecting application I couldn't just call:
Meteor.publish("getUsers")
So on the connecting applications I decided to create a temporary collection called UserReactiveCollection, with the following document structure:
{
  _id: 1,
  userId: '2',
  createdAt: new Date()
}
And I published a subscription:
Meteor.publish("subscribeNewUserSso", function () {
  return UserReactiveCollection.find({});
});
For updating UserReactiveCollection I exposed REST APIs on each connecting application.
Those APIs receive data from the SSO and update UserReactiveCollection.
On the SSO side, whenever a new user is registered, I ping those APIs (on the connecting applications) and send the inserted userId in the payload.
The connecting applications then receive an onDataChanged ping from the subscription and get the userId.
Using that userId, the connecting applications ping back to the SSO, fetch the user details for that specific userId, and prepend them to the users list.
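The update step those REST endpoints perform reduces to a small upsert into the temporary collection. This sketch uses a plain array as a stand-in for the Mongo-backed UserReactiveCollection; the function name is hypothetical:

```javascript
// Upsert a userId received from the SSO into the reactive collection,
// ignoring duplicate pings for an id that is already present.
function upsertReactiveUser(collection, userId) {
  if (!collection.some((doc) => doc.userId === userId)) {
    collection.push({
      _id: String(collection.length + 1),
      userId,
      createdAt: new Date(),
    });
  }
  return collection;
}
```

In the Meteor apps the push would instead be a `UserReactiveCollection.upsert(...)`, which is what triggers the reactive ping to subscribed clients.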
That's how I got it all working. For now I am marking my answer as accepted, though as I mentioned above, it might not be the perfect way to handle this, and I am still open to suggestions.
And special thanks to @Jankapunkt for helping me out.

How to send Dynamodb query results to browser encrypted?

I have a tiny table (100 records) that I keep in my DynamoDB backend. My web app will scan the table at the beginning of the session, and then users will query this data with different parameters. My intention is to avoid hitting my backend for each query and instead do it on the client side (front end) for better performance.
But because I don't want anyone to see my tiny table's data, I would like to encrypt it while sending and decrypt it on the browser side after arrival. I'm using Node.js, DynamoDB, and API Gateway as the backend (AWS serverless architecture).
I'm a newbie and was wondering if it is possible and what the best practices are.
I'll give an example to describe my concern better. Imagine Skyscanner keeps all airline-flight-ticket-price data in one table. They would have 2 options for letting everybody search publicly. First, they can let users query the table every time they search (which will be slow). Second, they can scan the table's data and send it to the browser, and users can search flights much faster on the front end (with arrays etc.). I want to implement the 2nd approach, but I also want to keep my data encrypted so nobody can copy my data and create a very similar website :)
Thanks.
Using Cognito Identity Pools
You can achieve this with authentication using an AWS Cognito Identity Pool (granting who can access which DynamoDB table, and even which partition key) and using the AWS JavaScript SDK, which uses SSL to encrypt the communication between the browser and DynamoDB.
Note: You can also use AWS Cognito User Pools if you don't have a user store that can be connected to the Cognito Identity Pool.
Using API Gateway and Lambda endpoint for Temporary Access Credentials
If you already have an existing authentication mechanism, you can use an API Gateway and Lambda endpoint where the Lambda function has the code to assume an IAM role and send the temporary access credentials to the browser. Then in the browser you can use the AWS SDK for JavaScript to access DynamoDB.
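A sketch of what that Lambda might do, assuming the aws-sdk v2 `.promise()` style. The STS client is injected here so the AWS call can be stubbed; in a real Lambda you would pass `new AWS.STS()`, and the role ARN and session name are placeholders:

```javascript
// Hypothetical Lambda handler body: exchange an already-authenticated request
// for temporary credentials by assuming a narrowly scoped IAM role.
async function temporaryCredentialsHandler(stsClient, roleArn, sessionName) {
  const { Credentials } = await stsClient
    .assumeRole({ RoleArn: roleArn, RoleSessionName: sessionName })
    .promise();
  return {
    statusCode: 200,
    body: JSON.stringify({
      accessKeyId: Credentials.AccessKeyId,
      secretAccessKey: Credentials.SecretAccessKey,
      sessionToken: Credentials.SessionToken,
      expiration: Credentials.Expiration,
    }),
  };
}
```

The browser then feeds these short-lived credentials into the AWS SDK for JavaScript; because the assumed role is scoped to just the one table, leaking them has limited blast radius.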
Here's a demo app that does specifically what you're looking for... once logged in, the user has access to his own "row" in DynamoDB:
https://github.com/awslabs/aws-cognito-angular-quickstart
Here's what you get by running the install script (it creates all of the resources for you):
http://cognito.budilov.com

Securing JS client-side SDKs

I'm working on a React-Redux web-app which integrates with AWS Cognito for user authentication/data storage and with the Shopify API so users can buy items through our site.
With both SDKs (Cognito, Shopify), I've run into an issue: Their core functionality attaches data behind the scenes to localStorage, requiring both SDKs to be run client-side.
But running this code entirely client-side means that the API tokens which both APIs require are completely insecure, such that someone could just grab them from my bundle and then authenticate/fill a cart/see inventory/whatever from anywhere (right?).
I wrote issues on both repos to point this out. Here's the more recent one, on Shopify. I've looked at similar questions on SO, but nothing I found addresses these custom SDKs/ingrained localStorage usage directly, and I'm starting to wonder if I'm missing/misunderstanding something about client-side security, so I figured I should just ask people who know more about this.
What I'm interested in is whether, abstractly, there's a good way to secure a client-side SDK like this. Some thoughts:
Originally, I tried to proxy all requests through the server, but then the localStorage functionality didn't work, and I had to fake it out post-request and add a whole bunch of code that the SDK is designed to take care of. This proved prohibitively difficult/messy, especially with Cognito.
I'm also considering creating a server-side endpoint that simply returns the credentials and blocks requests from outside the domain. In that case, the creds wouldn't be in the bundle, but wouldn't they be eventually scannable by someone on the site once that request for credentials has been made?
Is the idea that these secret keys don't actually need to be secure, because adding to a Shopify cart or registering a user with an application don't need to be secure actions? I'm just worried that I obviously don't know the full scope of actions that a user could take with these credentials, and it feels like an obvious best practice to keep them secret.
Thanks!
Can't you just put the keys and such in a .env file? This way nobody can see what keys you've got stored in there. You can then access your keys through process.env.YOUR_VAR
For Cognito you could store stuff like user pool id, app client id, identity pool id in a .env file.
NPM package for dotenv can be found here: NPM dotenv
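For illustration, the core of what dotenv does is tiny. This is a simplified sketch of its parsing step, not the real implementation (the actual package also handles comments, multiline values, and writes into process.env for you):

```javascript
// Simplified sketch of dotenv's parser: KEY=VALUE lines become entries
// that would normally be merged into process.env.
function parseEnv(text) {
  const vars = {};
  for (const line of text.split('\n')) {
    const match = line.match(/^\s*([\w.-]+)\s*=\s*(.*)$/);
    if (match) {
      // strip optional surrounding quotes around the value
      vars[match[1]] = match[2].replace(/^(['"])(.*)\1$/, '$2');
    }
  }
  return vars;
}
```

Note that for a client-side React bundle this only keeps keys out of your source repo; anything read via `process.env` at build time still ends up in the shipped JavaScript.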
Furthermore, what supersecret stuff are you currently storing that you're worried about? By "API tokens", do you mean the OpenId token which you get after authenticating to Cognito?
I can respond to the Cognito portion for this. Your AWS Secret Key and Access Key are not stored in the client. For your React.js app, you only need the Cognito User Pool Id and the App Client Id in your app. Those are the only keys that are exposed to the user.
I cover this in detail in a comprehensive tutorial here - http://serverless-stack.com/chapters/login-with-aws-cognito.html

How to add database system to WebGL application

I'm currently working on a WebGL sketch drawing project where users can draw arbitrary objects on an html canvas. The javascript libraries and files are all stored on a node.js server which is currently being started up locally every time the software has to be run. Essentially all of the functionality for saving all of the drawn objects on the page has been implemented where the drawings are being written as JSON objects, but the next step is to persist these objects to a database where they can be mapped to a user id. I will also need to implement a login system where users will login and be able to select previously drawn objects to edit from the database.
If this was just a normal website, I would probably just use express.js or something similar, but as the views are essentially rendered entirely in WebGL, I wouldn't think that frameworks would work well with this construct.
Given that I currently just need to create a login system and implement a feature for persisting the JSON object to the DB, are there any frameworks or existing software that accommodates the specified needs of the system?
With regard to authentication, I would recommend taking a look at OAuth and using existing identity providers (e.g. Google, Facebook, etc). You can still retain profiles for your users but you don't have to deal with all of the intricacies of authentication, authorization, security, etc.
There are a ton of JavaScript libraries out there for handling OAuth/OAuth2 interactions. Some even have built-in identity providers. Here are a couple links that returned all sorts of potentially useful libraries:
https://www.npmjs.com/search?q=oauth2
https://www.google.com/search?q=javascript%20oauth2%20library
As for a database, you have a lot of options for storing raw JSON. Some that I've used recently for my JavaScript projects are PostgreSQL, MongoDB, and ArangoDB. You can find well written JS libraries for interacting with any of those.
Another thing to think about is if you want to install the database on your server or use a hosted solution such as RDS or DynamoDB (available from Amazon).
Regardless of the exact authentication and persistence options you choose you will likely use a pattern similar to this:
Your Node.js server is deployed somewhere accessible on the internet, where it exposes the endpoints for your WebGL application, authentication, and saving/loading sketches.
When the user navigates to the WebGL application endpoint on your Node.js server they are required to authenticate (which will utilize your authentication endpoints on the Node.js server).
When the user requests a "save" in your WebGL application you will submit the JSON representation of their sketch (and their authorization token) to the "save" endpoint of your Node.js server.
This "save" endpoint validates the user's authorization token and inserts the JSON into whatever database you've chosen.
The "load" endpoint works just like the "save" endpoint but in reverse. The user asks for a specific sketch. The identity of the sketch (id, name, etc) is sent along with their authorization token. The "load" endpoint on your Node.js server validates their authorization token and then queries the database for the sketch.
The key pattern to notice here is that users don't send requests to your database directly. Your WebGL application should communicate back to your Node.js server, and it should communicate with your database. This is essential for controlling security, performance, and future updates.
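The "save" endpoint logic described above can be sketched as a small async function. `verifyToken` and `db` here are stand-ins for whatever authentication and database libraries you end up choosing (they are injected so the sketch stays library-agnostic):

```javascript
// Hypothetical "save" endpoint body: validate the user's authorization token,
// then persist the sketch JSON against their user id.
async function saveSketch({ token, sketchJson }, verifyToken, db) {
  const user = await verifyToken(token);
  if (!user) {
    return { status: 401, body: { error: 'invalid or expired token' } };
  }
  const id = await db.insert({ userId: user.id, sketch: sketchJson });
  return { status: 200, body: { id } };
}
```

The "load" endpoint would follow the same shape in reverse: verify the token first, then query the database for a sketch belonging to that user.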
Hopefully this gives you an idea of where to go next.
EDIT based on comments:
I searched around for a Node.js + PostgreSQL guide but didn't find anything I would feel comfortable recommending. I did find this JS library though, which I would check out:
https://github.com/vitaly-t/pg-promise
For MongoDB I would just use their official guide:
https://docs.mongodb.org/getting-started/node/introduction/

Web site using backbone for frontend and nodejs for backend

I'm developing a new web site that will be a single-page app with some dialog/modal windows. I want to use Backbone for the frontend. This will call the backend using AJAX/WebSockets and render the resulting JSON using templates.
As a backend I'll use a Node.js Express app that will return the JSON needed for the client; it'll be some kind of API. It will not use server-side views.
Clients will use Facebook, Twitter, etc. for authentication, and maybe a custom registration form.
Client static resources, such as CSS, JS, and HTML files, will be handled by nginx (a CDN later).
Questions that I have now:
How can I determine that a given user has the right to do some action in the API (e.g. delete a building, create a new building)? This is an authorization question. I thought of giving the user a role when they log in and determining their rights based on it. Will this work?
Similar to the above, will this role-based security be enough to secure the API, or do I need to add something like tokens or request signing?
Is this architecture acceptable or I'm over engineering and complicating it?
Passport is an option for the authentication piece of the puzzle. I'm the developer, so feel free to ask me any questions if you use it.
I thought of giving user a role when they login and based on it determine their rights. Will this work?
Yes this will work. You can check for a certain role on the user after it's been fetched from the server. You can then display different UI elements depending on this role.
Will this role based security be enough to secure the api? Or I need to add something like tokens or request signing?
It won't be enough. Anyone could hop into the console and set something like user.admin = true. In your API you'll need to validate a user token from the request, making sure that the related user has the appropriate permissions.
Is this architecture acceptable or I'm over engineering and complicating it?
At the least you should have an API validation layer. That would make a decent enough start, and wouldn't be over-engineering.
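That validation layer can be sketched as Express-style middleware. `decodeToken` below is a stand-in for whatever JWT or session verification you use; the point is that the role comes from the verified token, never from anything the client sets on itself:

```javascript
// Server-side role check: the role is read from the verified token, so
// setting user.admin = true in the browser console changes nothing here.
function requireRole(role, decodeToken) {
  return (req, res, next) => {
    const user = decodeToken(req.headers.authorization);
    if (!user || !(user.roles || []).includes(role)) {
      res.statusCode = 403;
      return res.end('forbidden');
    }
    req.user = user;
    next();
  };
}
```

You would then guard sensitive routes with something like `app.delete('/buildings/:id', requireRole('admin', decodeToken), handler)`.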
For the authentication part of your question, I would use everyauth, which is authentication middleware for Connect/Express. It supports almost every OAuth social network.
For role management you could give node-roles a try. I haven't used it myself, but it should help you out, because it checks the role on the server side. Of course, that is only useful if your API is implemented in Node.js. If that's not the case, you have to "proxy" the API calls through your Node.js app.
I hope I could help you! :)
