Serverless Stack Applications - javascript

I recently came across the term "Serverless Stack", and from a little research I found that it helps in creating serverless web applications using frameworks like ReactJS on the frontend and DynamoDB for the backend, hosted in the cloud on AWS.
However when it comes to the scalability of applications, there is very little to no information available across various blogs.
Has anyone tried this stack in their applications?
I would like to hear:
What other tech stacks (languages, frameworks) can be used as part of this new Serverless Stack?
Does it scale well? (Particularly when a website gets many viewers.)
Can someone shed some light?

From my experience, I can tell you that what you are describing is also known as "no-backend" applications (resource).
The principle of this approach is that you can abstract a number of features that are traditionally implemented in the server tier and move them into decoupled services exposed as SaaS.
As you mentioned, a famous example is hybrid smartphone applications that rely solely on Firebase, which provides authentication, authorization, and a few other backend features.
If you need another kind of feature, such as email, you can handle it within your frontend code by using a suitable email service provider.
In terms of scalability, all you have to do is scale the services you are using, for example by moving to a bigger Firebase plan.
In terms of security, you have to understand that in a web application your code is always visible, so all your business logic can be read, analyzed, and easily exploited. This is why the no-backend approach better fits mobile applications: they are wrapped in containers designed to provide a better level of obscurity about what the application is doing.
Hope this helps.
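To make the "email from the front-end" idea concrete, here is a minimal sketch of the browser calling an email SaaS HTTP API directly. The endpoint URL, payload shape, and `sendEmail` helper are hypothetical, not any real provider's API; note that shipping an unrestricted API key to the browser runs into exactly the visibility problem described above, which is why real providers offer restricted client-side keys.

```javascript
// Hypothetical sketch: the front-end posts a message to an email SaaS.
// The URL and payload fields are placeholders for illustration only.
// `fetchImpl` is injectable so the call can be stubbed in tests.
async function sendEmail(message, fetchImpl = fetch) {
  const res = await fetchImpl("https://mail-provider.example/v1/send", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(message), // e.g. { to, subject, text }
  });
  return res.ok; // true if the provider accepted the message
}
```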

Some of the existing frameworks:
apex: lets you build, deploy, and manage AWS Lambda functions with ease
chalice: Python Serverless Microframework for AWS
claudia: makes it easy to deploy Node.js projects to AWS Lambda and API Gateway
serverless.com: helps building apps on AWS Lambda
Search "serverless" on github to find more.
AWS-based services seem to scale well but have a look at the competition too:
https://azure.microsoft.com/en-us/services/functions/
https://cloud.google.com/functions/
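The frameworks listed above all deploy functions to AWS Lambda (or a competitor's equivalent). As a rough sketch, a Node.js Lambda function behind API Gateway's proxy integration looks like this; the event field names come from that integration, while the greeting logic is just an example.

```javascript
// Minimal AWS Lambda handler sketch (Node.js runtime) behind API
// Gateway proxy integration: read query parameters from the event,
// return a status code, headers, and a JSON string body.
async function handler(event) {
  const params = event.queryStringParameters || {};
  const name = params.name || "world";
  return {
    statusCode: 200,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message: `Hello, ${name}!` }),
  };
}

// In a real deployment this would be the module's export:
// exports.handler = handler;
```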

Related

API, back-end and front-end as all three separate components

I tried to find something on the internet but could not find anything similar. So I'm asking it here:
SITUATION: I have a big API which does some heavy calculations and has a lot of functionality. Some clients are using this API and have implemented it in their software. Now I want to write a front-end for the API so that users can manage their workflow more easily.
CONSIDERED SOLUTION: I am considering making a separate back-end application which would use the API and serve the front end (see the attached picture). The back-end would handle authorization, caching, and data-adapting operations.
QUESTION: I have never come across an app design with these three layers (API-BE-FE). So is it worth doing things this way? Are there any significant drawbacks? Is it safe to put OAuth authorization on the back-end side rather than in the API itself? What are your thoughts?
I agree with your design. You have a specific API which is meant to serve specific endpoints. This way you are separating your concerns, as you can add to your BE things that aren't related to the API itself, but are related to the FE.
Also, many APIs are using credentials and keys so you can implement a similar functionality.
Your considered solution on architecture looks good.
The biggest advantage of implementing a back-end between the front-end and the API is that it provides good separation of concerns. It often happens around me that front-end engineers ask API engineers for new endpoints every time they need one. That looks like simple cooperation, but it sometimes goes too far: this kind of conversation can result in the API accumulating endpoints it shouldn't have. I am not sure what your company's API team's architecture policy is, but simply letting the API grow large for the sake of the front-end is not good. The more functionality the API takes on now, the worse it will get.
In your plan, you are implementing a back-end that accesses the API on behalf of the front-end. This is similar to the BFF (Back-end For Front-end) architecture described by Sam Newman (http://samnewman.io/patterns/architectural/bff/). With this concept, you implement the back-end as a kind of gateway which handles front-end-specific requests to the API. The back-end can even buffer the impact on the API caused by changes in the front-end, if needed. Everything stays well separated.
In a strict BFF, I don't think the back-end plays the role of providing application-related functionality such as authorization, caching, and data-adapting operations, but this depends on you. You could implement new APIs to handle those functionalities and have the back-end be just a gateway that ties them together. It would also work to put those things into the back-end, as long as it doesn't get too fat.
Drawbacks?
The possible drawback, I suppose, is the maintainability of scaling. This depends entirely on the infrastructure team or members you work with, but in production the API and the back-end will run on different servers or stacks, so you may need to keep their scaling consistent under heavy traffic. However, this independence can also be an advantage for monitoring hardware resources. You'll want to find a sweet spot.
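The gateway idea above can be sketched in a few lines: the BFF calls the big API and adapts the response into exactly what the front-end needs, hiding internal fields. Here `fetchFromApi` is a hypothetical stand-in for the real upstream call, and the field names are made up for illustration.

```javascript
// BFF-style data adapting: reshape a raw upstream API payload into a
// front-end-friendly view model. Field names are illustrative only.
function adaptUserForFrontend(apiUser) {
  return {
    id: apiUser.user_id,
    displayName: `${apiUser.first_name} ${apiUser.last_name}`,
    // Internal-only fields (e.g. billing codes, feature flags) are
    // deliberately dropped here, so the front-end never sees them.
  };
}

// The BFF endpoint: fetch from the upstream API, then adapt.
// `fetchFromApi` is injected so the upstream can be stubbed in tests.
async function getUserForFrontend(id, fetchFromApi) {
  const raw = await fetchFromApi(`/users/${id}`);
  return adaptUserForFrontend(raw);
}
```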

Single Page Application - Frontend independent of backend?

I've done some research and I've noticed that in a lot of example Symfony2/AngularJS apps the frontend and backend are combined; for example, views use Twig.
I'd always thought that it's possible (and common practice) to create the frontend and backend separately and join them via an API. In that case, if I want to change the PHP framework, I can do so without any problems; it is enough to keep the API.
So what are the best practices for doing this? It would be great if you could explain it to me, and even better if you could just give me a link to a good example on GitHub or something.
We have been developing some projects using the same approach. Not only do I think it doesn't have any "side effects", but the solution is very elegant too.
We usually create the backend in Node.js, and it is just an API server (not necessarily entirely REST-compliant). We then create another, separate web application for the frontend, written entirely in HTML5/JavaScript (with or without Angular.js). The API server never returns any HTML, just JSON! Not even an index structure.
There are lots of benefits:
The code is very clean and elegant. Communication between the frontend and the backend follows standardized methods. The server exposes some APIs, and the client can use them freely.
It makes it easier to have different teams for the frontend and the backend, and they can work quite freely without interfering with each other. Designers, who usually have limited coding skills, appreciate this too.
The frontend is just a static HTML5 app, so it can (and, in our case, often did) easily be hosted on a CDN. This means that your servers never have to worry about static content at all, and their load is reduced, saving you money. Users are happier too, as CDNs are usually very fast for them.
Some hints that I can give you based on our experience:
The biggest issue is with authentication of users. It's not particularly complicated, but you may want to implement authentication using, for example, a protocol like OAuth 2.0 for your internal use. The frontend app then acts as an OAuth client and obtains an auth token from the backend. You may also want to consider moving the authentication server (with OAuth) onto a separate resource from the API server.
If you host the webapp on a different hostname (e.g. a CDN) you may need to deal with CORS, and maybe JSONP.
The language you write the backend in is not really important. We have done it in PHP (including Laravel), though we got the best results with Node.js. For Node.js, we published our boilerplate on GitHub, based on RestifyJS.
I asked some questions in the past you may be interested in:
Web service and API: "chicken or egg"
Security of an API server: login with password and sessions

Can I use node to power a web application on a separate server?

I asked this (voted to be too broad) Question while working my way through a starter book on node. Reading this book, I'm sure I'll learn the answer to this later, but I'd be more comfortable if I knew this up front:
My Question: Can I (efficiently) continue using a usual webhost such as iPage or GoDaddy to host my web application, building and hosting the front end in a simple, traditional manner through an Apache web server, and communicate with a separate Node.js server (my application back-end) via AJAX for queries and other things that I can more efficiently process via Node?
More specifically, would this be bad practice in terms of efficiency and organization? In other words, would a large-scale commercial application ever be built this way?
Yes, you can separate the front-end of your web application and the APIs that power it. In fact, this is a very common configuration, especially for "large scale commercial applications".
Where you draw the separation line between the two specifically depends on what you are doing, how you're doing it, and what your needs are. Also, in terms of hosting, remember that if you're accessing something server-side across the internet, you're adding extra latency to everything. Consider getting off Go Daddy and using another host that gives you more flexibility, such as a VPS provider.
It's OK. Actually, this is how things should be done: you have a backend API on a separate server and lots of apps that use the API. Just go with an Nginx server; check this Apache vs Nginx comparison.
Yes, you can use Node.js as part of a big application. It depends on which type of interaction you want, and whether you are comfortable mixing technologies; if so, Node is a pretty good choice for the web. I've finished a part of a big nodejs-ruby-php-c-flash application (my part was the Node.js piece) handling very large amounts of data. The application has different levels of interaction, and sometimes I use two languages at once so that each part of the application is best suited to the task I'm working on. There are applications that initiate, run, and destroy multiple OS instances, so building a multi-environment application is not that hard.
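The AJAX side of the setup described above is short: a page served by Apache on the ordinary web host calls the separate Node.js API with `fetch`. The API host name is a placeholder, and `fetchImpl` is injectable purely so the network call can be stubbed; in the browser you would just use the global `fetch`.

```javascript
// Front-end sketch: the Apache-hosted page queries a separate Node.js
// API over AJAX. API_BASE is a placeholder for your Node server's URL.
const API_BASE = "https://api.example.com";

async function runQuery(term, fetchImpl = fetch) {
  const res = await fetchImpl(
    `${API_BASE}/search?q=${encodeURIComponent(term)}`,
    { headers: { Accept: "application/json" } }
  );
  if (!res.ok) throw new Error(`API error: ${res.status}`);
  return res.json();
}
```

Because the page and the API live on different hosts, the Node server must send CORS headers for these requests to succeed.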

Allow users to broadcast live streams through webcam

I'm trying to create a website like Ustream using ruby on rails.
I want users to be able to turn on their webcams and broadcasts live. I also want them to be able to send out a link to their live broadcast. The broadcast will need to work cross-browser also.
How can I do this effectively using ruby on rails?
Please be as detailed as possible. I'm looking for the simplest and most efficient solution.
Thanks in advance.
Your question is pretty vague, so I apologize if this is not as specific as you may hope
Live Streaming & Rails
Rails really isn't designed for live-streaming
Its MVC structure is best used to interact with large data-sets, and is better suited to handling things like authentication, APIs, data-driven applications, etc
The live-streaming functionality you seek is more in the realm of node.js & socket.io, and more specifically websocket architecture, whereby two connected devices can share data across a single connection
There are a number of options available, but they are limited unless you go down the proprietary route:
TokBox
TokBox is the safest bet for Rails apps - it uses a third-party API to connect the devices, and implements the connection on the front-end with JavaScript & Flash. We have actually implemented this before, and it's very simple to do - it's all explained here
Tokbox is now owned by Telefonica, and I believe they are investing heavily to make their technology more widely available and higher quality for developers. So we'll have to see how it goes
WebRTC
This is more like a driver, but is the best quality of all the options. The only issue is that implementing this technology is actually pretty hard. Here is an overview for you:
WebRTC is an open-source project enabling plugin-free, Real Time Communications (RTC) in the browser. It includes the fundamental building blocks for high-quality communications such as network, audio, and video components used in voice and video chat applications.
Recommendations
Having implemented TokBox before, I'd recommend you look at that. You can see a tutorial about it here

Realtime web libraries - replace hookbox with socket.io or what?

I've got a couple projects that were built using hookbox to manage real-time message passing between web clients and servers. Hookbox was great -- it totally abstracted the transport layer, exposing a simple publish/subscribe interface across different channels with an elegant security system.
Unfortunately the hookbox project has rapidly fallen into disarray due to the original maintainer's unwillingness to even put in the effort to hand off ownership. (Grrr!) So it's hard to consider it a viable platform any more.
What's a good platform for providing real-time communication with web apps? Requirements:
Works seamlessly cross-browser, using HTML5 websockets or COMET as available. Transport choice should be invisible to the application layer. I don't care about ancient browsers (IE6)
Client access from both javascript and server-side systems (i.e. in php / python / ruby) -- this is critical
Provides a publish / subscribe metaphor with arbitrary payloads
Allows clients to see what other clients are connected to a channel, i.e. presence
Fine-grained access control through callbacks to any web application (nice to have)
I've heard that socket.io can do some of this, but I get the sense that it's at a lower layer of the stack. Can it connect to non-javascript libraries? Do auth?
I've had a very good experience with NodeJS and Socket.IO over the last 8 months. The server-side component has been very stable for me - I can leave it running with a very high message volume, and its resident memory never really budges above 20MB. So far I've only been able to leave it running for about 4 weeks without terminating the server, but that was only because I needed to update my server-side code.
Works seamlessly cross-browser, using HTML5 websockets or COMET as available. Transport choice should be invisible to application layer. I don't care about ancient browsers (IE6)
Provides a publish / subscribe metaphor with arbitrary payloads
Socket.IO is also a fantastic piece of software. It is under active development, and has a simple pub/sub-style abstraction built in, using EventEmitter (NodeJS) semantics of 'on' (subscribe) and 'emit' (publish). It is also very transparent on the client side about the transport being used. I used it primarily for the straight-up WebSocket support, but it can fall back to Flash-based sockets, xhr-polling, and jsonp polling.
Client access from both javascript and server-side systems (i.e. in php / python / ruby) -- this is critical
NodeJS is JavaScript, running on the V8 engine. It has a ton of 3rd-party modules that provide nice abstractions as well as interfaces to external components, such as databases or message queues, among many other things. As for hitting the system from php/python/ruby, it works like hitting any other server. Choose your method of communication (basic TCP/IP, HTTP POSTs or GETs, or even the filesystem) and NodeJS doesn't really care who is providing the data. Personally, I've implemented a C# client that is working great.
Allows clients to see what other clients are connected to a channel, i.e. presence
It doesn't have any built-in 'presence' logic, though with the 'pub/sub' logic already in place in Socket.IO, all you'd have to do is store state on the server so new clients can retrieve existing presence data. I've implemented my own basic pub/sub on the server that retains state, and all together (including the NodeJS server code and the basic Socket.IO stubs) it was only 50 lines of JavaScript (including whitespace).
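The "store state on the server" idea above boils down to a small in-memory registry like the following sketch. The class and method names are made up for illustration; the socket wiring (calling `join`/`leave` on connect and disconnect events) is omitted.

```javascript
// In-memory presence registry sketch: track which client ids are in
// which channel, so newly connected clients can fetch current presence.
class PresenceRegistry {
  constructor() {
    this.channels = new Map(); // channel name -> Set of client ids
  }
  join(channel, clientId) {
    if (!this.channels.has(channel)) this.channels.set(channel, new Set());
    this.channels.get(channel).add(clientId);
  }
  leave(channel, clientId) {
    const members = this.channels.get(channel);
    if (members) {
      members.delete(clientId);
      if (members.size === 0) this.channels.delete(channel); // tidy up
    }
  }
  members(channel) {
    // Snapshot of current members; empty array for unknown channels.
    return [...(this.channels.get(channel) || [])];
  }
}
```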
Fine-grained access control through callbacks to any web application (nice to have)
Not sure what you mean by 'Fine-grained access control through callbacks to any web application (nice to have)'. The pub/sub event/observer metaphor they have uses callbacks, so you hook specific actions to specific events.
Do auth?
I've had no need, yet, to do any auth for our systems, so I can't speak to it directly. However, if you browse the NodeJS modules you'll notice there are many auth modules available, including LDAP and OAuth, not to mention one module that claims to do "OpenId, Google, OAuth, Twitter, LinkedIn, Yahoo, Readability, Dropbox, Justin.tv, Vimeo, Tumblr, OAuth2, Facebook, GitHub, Instagram, Foursquare, Box.net, LDAP"
Although I haven't tried it yet, I started looking into Pusher for a Node Knockout 2011 entry. In addition to JavaScript, it supports the following non-js clients:
Objective-C
ActionScript
.NET & Silverlight
Ruby
Arduino
If messaging via a 3rd party is a possibility, you can try the service for free using their Sandbox plan (20 connections & up to 100K messages/day) and see if it meets your needs. (I'm a little uncertain about the "presence" requirement, though it may be covered in the docs.)
I recommend using node.js, which has a lot of libraries for various things. One library for real-time messaging is now.js. I don't have a lot of experience with it, but I have tried it, and I would say it worked well and has everything you said you need.
