I'm starting to develop a new Node.js (with Express) web / CRUD / REST multi-page application and I would like to start off on the right foot.
The Application object will have a lot of modules, just for example:
Users management at several levels, with policies for the operations they can perform based on the user level.
Simple forms to insert, update, retrieve database data.
Screens to display real-time sensor data (I have already thought about using libraries such as Socket.io).
The basic application (NodeJS server-side API) will also be "called" by Android / iOS applications to fetch / edit data from mobile.
Considering the project as a multi-page application with many asynchronous calls, I have a doubt about the views management and I would like to figure out which one of the following approaches is the most convenient:
Conjecture 1 (PURE API style?): write, for each request, an Express route that returns only a JSON response, and compose the view client-side (starting with simple jQuery DOM editing, later with React or Angular) after downloading it (via socket or AJAX call).
Server side:
Call 1: /getFooView
router.get('/getFooView', function(req, res, next) {
//code
res.send(htmlview);
});
Call 2: /getFooData
router.get('/getFooData', function(req, res, next) {
//code
res.json({"foo": "bar"});
});
Client side:
JavaScript / jQuery on the client will compose the "getFooView" markup with the JSON data, and the result will then be shown to the user.
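As an illustration, a minimal sketch of that client-side composition could look like the following (jQuery plus a hypothetical #foo-container element and .foo-value placeholder are assumptions here, not part of the original routes):
// Fetch the view fragment first, then the data, and compose them on the client
$.get('/getFooView', function (htmlview) {
  $('#foo-container').html(htmlview); // inject the empty view block
  $.getJSON('/getFooData', function (data) {
    $('#foo-container .foo-value').text(data.foo); // fill it with the JSON payload
  });
});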
Conjecture 2: write, for each request, an Express route that returns the view (usually a tiny HTML block, a list or similar) already composed server-side (let's suppose by Handlebars). In this case, each controller route should interpret the request according to the requester.
Server side:
router.get('/getFoo', function(req, res, next) {
// pseudo code
if(request == "ANDROID" || request == "iOS")
res.json({data});
else
res.render('index', {
// handlebars parsed view
title: blockTitle,
data: blockData,
sensors: sensorsData
});
});
Client Side:
jQuery code to append the received block, as sketched below.
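For comparison, a rough sketch of that client side (again, the #foo-container element is just a placeholder):
// The server already rendered the HTML block (e.g. with Handlebars),
// so the client only appends it
$.get('/getFoo', function (renderedBlock) {
  $('#foo-container').append(renderedBlock);
});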
Which approach is better suited to my purpose? What are the pros and cons?
Sorry for my bad English.
If we are talking about a single-page application, then you shouldn't send views. You can use nginx for client files (CSS, HTML, JS, etc.). nginx will also cache and proxy client requests to your REST API.
So my answer is PURE API + nginx for static files.
Real quick: this question may have already been asked, but I am not totally clear on the terminology and have had no success searching for a solution.
I am attempting to create a simple web application that can receive GET and POST requests. Then, I would like to take the information I receive from these requests and use it in a JavaScript file embedded in the HTML that is displayed. I think it may be clearer with code. This is my app.js:
var express = require('express');
var app = express();
var assert = require('assert'); // not used below, kept from the original setup

// Serve static client-side files from the "javascript" folder
app.use(express.static('javascript'));

// Render the Jade view for the home page (Express picks the engine from the file extension)
app.get('/', function(req, res) {
  res.render('index.jade');
});

app.listen(3000);
I would then like to be able to take information received from a GET or POST request, specifically in JSON format, and then be able to use it in JavaScript that I have linked to index.jade. To make matters slightly more confusing, in index.jade the only line is:
include map.html
So ideally I would be able to get the information to javascript in that html file.
I want to be able to pretty continuously update map.html using javascript based on frequent get and post commands.
This could be a very simple solution, I am pretty new with web application programming in general, and I have been struggling with this problem. Thanks in advance, and ask if you need any clarification.
I think you need to understand what you are doing. You are creating a web server using Node.js; the Express framework simplifies that for you. The official Express documentation is pretty great; you should refer to it, especially this link.
Now, in your application, you need to create endpoints for GET and POST requests. You have already created one endpoint "/" (root location) that loads your home page, which is the contents of the file index.jade. A small example is:
app.get('/json', function (req, res) {
res.setHeader('Content-Type', 'application/json');
res.send(JSON.stringify({ a: 1 }));
});
Jade is a templating engine that resolves to HTML. You should refer to its documentation as well. Since, as you said, you are not very familiar with these, you could simply use plain HTML instead.
After this, the tasks are quite simple. In your JavaScript, you can hit the GET or POST endpoints of your server with AJAX calls and do the desired actions.
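For example, a minimal call against the /json endpoint above might look like this (using jQuery, with a made-up #result element):
// Ask the server for JSON and use it in the page
$.getJSON('/json', function (data) {
  // data is the object the server sent, e.g. { a: 1 }
  $('#result').text('a is ' + data.a);
});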
Hope it helps!
The way I get it, you want to be able to call an endpoint (let's assume /awesome), pass some info, and then write to a JavaScript file that you can reference in your HTML.
Something like the below would write a main.js file in your public folder. You can make it write whatever you want.
var fs = require('fs');
var path = require('path');

// Write (or overwrite) public/main.js relative to the app's directory
fs.writeFile(path.join(__dirname, 'public', 'main.js'), "console.log('Hello!');", function(err) {
  if (err) {
    return console.log(err);
  }
  console.log("The file was saved!");
});
You can then reference /public/main.js in your html.
Check that: Writing files in Node.js
If I misunderstood your intentions, apologies. :)
So you want to continuously receive data from the server; maybe WebSockets are an option for you: http://socket.io/docs/
This way you will open a stable connection from the client to the server. This is a great approach if you want to receive updates made by other clients.
If you want the changes to be triggered by the client, then AJAX calls (http://www.w3schools.com/ajax/), as hkasera mentioned, are the way to go.
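A minimal client-side sketch of the socket approach (the 'update' event name is just an example, and it assumes the socket.io client script is loaded):
// Open a persistent connection and react to updates pushed by the server
var socket = io();
socket.on('update', function (data) {
  // refresh the relevant part of the page with the new data
  console.log('got update from server:', data);
});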
I very much like Meteor's pub/sub. I wonder if there is a way to get a similar workflow, using sails.js or just a socket library in general.
In particular, what I would like to be able to do is something along the lines of:
// Server-side:
App.publish('myCollection', -> collection.find({}))
// Client-side:
let myCollection = App.subscribe('myCollection')
let bob = myCollection.find({name: 'Bob'})
myCollection.insert({name: 'Amelie'}, callback)
All interaction with the server should happen in the background.
I very much like Meteor's pub/sub. I wonder if there is a way to get a similar workflow, using sails.js or just a socket library in general
Basically yes, at least for realtime sync between backend and frontend. Let's review what Meteor has and answer point by point.
Pub/sub
The Pub / Sub concept, as stated by Sabbir, is also supported by sails.js. Though the basics are slightly different:
In Meteor, the client can subscribe to anything it wants, and the server controls what the client receives by publishing only to whom it wants;
whereas in sails.js, the server both subscribes client sockets and publishes to all bound sockets.
Note that, by default:
Meteor contains the autopublish package that just notifies every client without any kind of filtering. To achieve some filtering, you have to meteor remove autopublish; then you can control what your client receives by adding a Mongo query to it, as explained here.
Sails, by default, on its automatic "select" blueprint actions, auto-subscribes the calling socket to the events on the objects returned by the "select".
As a server-side conclusion:
Subscribe: just call the find or findOne blueprint default action through a socket (attaching some where filters or not), and your socket will automatically be subscribed to every event concerning the returned objects => you don't have to code anything on the server, in most cases, for the Subscribe logic.
Publish: every blueprint default action (create, update, destroy, add, remove) auto-publishes to subscribed sockets => you don't have to code anything on the server, in most cases, for the Publish logic.
(Though, if you find yourself implementing some manual controller actions, the Sails API helps you publish and subscribe easily.)
Client handling
Therefore, with both meteor and sails, clients only receive what they're supposed to receive. Time for front-end now.
Philosophy
Meteor, on one hand, with its isomorphic dimension, provides a front-end connector by nature, exposing its data-bound collections.
Sails, on the other hand, is front-end agnostic and can be hit by any HTTP REST connector (JS or not), such as $http, $resource, or more advanced ones like Restangular.
Though, being aware of the complexity of using raw sockets with their API (when it comes to sessions, CORS, CSRF and the like), they developed a JavaScript socket.io wrapper called sails.io.js, designed to be REST-like-over-socket, and it just works like a charm.
Basically, the main difference is that Meteor is one step higher-level than Sails, because it provides the logic for syncing collections and objects.
All interaction with the server should happen in the background.
sails.io.js, the official front-end component, is just not that high-level, especially when it comes to Angular.js.
Though, you can find some community connectors that aim to, kinda, provide the same feature as Mongo data-bound collections and objects. There are sails-resource, spinnaker and angular resource sails. I tried a couple of them, and I should say that I was disappointed. The abstraction level is so high that it just becomes annoying, IMHO. For example, with not-very-RESTful-friendly custom actions, like a login, it becomes very hard to adapt them to your needs.
==> I would advise using a low-level connector, such as angularSails or (my preferred) https://github.com/janpantel/angular-sails, or even raw sails.io.js if you're not using Angular.
Edit: just found a Backbone version, by Sails' creator.
It just works great, and believe me, the "keep my collection in sync with that socket" code is so ridiculously small that finding a module for this is just not worth it.
Some code please, stop talking
In particular, what I would like to be able to do is something along the lines of:
Server
Meteor
# Server-side:
App.publish('myCollection', -> collection.find({}))
Sails
//Nothing to do, just sails generate api myCollection
Client
Meteor
# Client-side:
myCollection = App.subscribe('myCollection')
Sails, with sails.io.js
(Here using lodash for convenience)
var myCollection;

// Initial fetch: a socket GET on the blueprint route returns the records
// and subscribes this socket to their events
sails.io.get('/myCollection').then(
  function(res) {
    myCollection = res.data;
  },
  function(err) {
    // Handle error
  }
);

// Keep the local collection in sync with the server-side events
sails.io.on('myCollection', function(msg) {
  switch (msg.verb) {
    case 'created':
      myCollection.push(msg.data);
      break;
    case 'updated':
      _.extend(_.find(myCollection, { id: msg.id }), msg.data);
      break;
    case 'destroyed':
      _.remove(myCollection, { id: msg.id });
      break;
  }
});
(I leave the find with where filters and the create to your imagination, with the help of [the doc]; see also the sketch below.)
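For completeness, here is a rough sketch of those two calls in the same pseudo-style as the snippet above; check the sails.io.js documentation for the exact signatures in your version:
// "find" with a where filter, via the blueprint route
sails.io.get('/myCollection?where={"name":"Bob"}').then(function (res) {
  var bob = res.data[0];
});

// "create" a new record; subscribed sockets will then receive a 'created' event
sails.io.post('/myCollection', { name: 'Amelie' }).then(function (res) {
  myCollection.push(res.data);
});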
All interaction with the server should happen in the background.
Well, Sails can, but only for Angular, with sails resources.
I'm not very used to that process, so I leave you to read here or here, but once again I'd choose the manual .on() method.
Since I asked this question, I've learned a few things and some new projects have popped up. I decided against sails.io, because when developing with React.js, most of the community's weight is behind webpack, but sails.io uses gulp. I realize these can be used together and there is even an npm package for this, but I wasn't too keen on making my stack bigger than it had to be, so I went with a simple express.js server that I could tailor to my needs.
In order to sync my data, I'm using RethinkDB, which allows me to asynchronously watch the database for changes and then publish the changes to the clients through websockets.
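As a rough illustration of that server-side piece (the table name, event name and connection details are placeholders, and io is assumed to be an already-configured socket.io server instance):
var r = require('rethinkdb');

// Watch a table for changes and forward every change to the connected clients
r.connect({ host: 'localhost', port: 28015, db: 'app' }, function (err, conn) {
  if (err) throw err;
  r.table('items').changes().run(conn, function (err, cursor) {
    if (err) throw err;
    cursor.each(function (err, change) {
      if (err) return console.error(err);
      // change.old_val / change.new_val describe the modification
      io.sockets.emit('itemChanged', change);
    });
  });
});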
I've set up a simple script where I keep an instance of a baobab tree on both the client and the server.
When the tree gets modified on the server, it sends transaction data to the appropriate clients through the websocket
The client merges the transaction with the tree.
This method does not make use of local storage and keeps the data in memory in the node.js process. The data in the transaction is also quite redundant.
The future plan has always been to set something up using redis and local storage ...
... until yesterday when I found deepstream.io!
This is a tool that does exactly what I want and need! Nothing more, nothing less.
Another project worth mentioning is meatier: "like meteor, but meatier". It is composed of many other well-supported open source projects, so you could even pick and choose.
I am trying to set up a simple presentation using three computers synchronized by a central server, and I figured node would be the ideal tool.
I was wondering if there's any way to have all three computers connect to the server via the browser, and if I could control the server to push changes to each?
For example:
Computer-1 visits 10.0.0.1?comp=1&slide=1
Computer-2 visits 10.0.0.1?comp=2&slide=1
Computer-3 visits 10.0.0.1?comp=3&slide=1
Then from the server commandline, I would like to be able to trigger a change so the clients will each be redirected accordingly like so:
Computer-1 visits 10.0.0.1?comp=1&slide=2
Computer-2 visits 10.0.0.1?comp=2&slide=2
Computer-3 visits 10.0.0.1?comp=3&slide=2
I'm new to node, so I'm not even sure if this is the ideal platform, but was wondering what terminology I should be researching to be able to build something like this?
Thank you for your responses! I ended up looking into socket.io and managed to write this system in one evening. Node + socket.io + Express is a pretty amazing toolset, with the socket emit events.
Thank you for pointing me in the right direction, this is exactly the tool I was looking for.
Just in case it may help anyone, in my client/jade template, I have something like:
socket.on('slideUpdate', function (data) {
  // Do things with the data
});
and on the server app.js:
io.on('connection', function (socket) {
socket.on('slideChange', function (data) {
// logic for setting slide data
io.sockets.emit('slideUpdate', { example: exampleData ... });
});
});
where a slideChange event is triggered via a button on the client-side template (see the sketch below).
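The triggering side can be as small as this (the button id is made up):
// Client-side: emit a slideChange event when the "next slide" button is clicked
// (using the same socket as in the snippet above)
document.getElementById('next-slide').addEventListener('click', function () {
  socket.emit('slideChange', { slide: 2 });
});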
For such a presentation I would use the simplest solution, i.e. not WebSockets, not server-sent events and not long polling.
Just do a short poll, i.e. every client calls the server every 100ms for updates. The server then responds with a status update (if there is one).
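A bare-bones version of that polling loop might look like this (the /status endpoint, the comp and currentSlide variables, and jQuery itself are all assumptions):
// Every 100ms, ask the server for the current slide and redirect when it changes
setInterval(function () {
  $.getJSON('/status', function (data) {
    if (data.slide !== currentSlide) {
      currentSlide = data.slide;
      window.location.search = '?comp=' + comp + '&slide=' + currentSlide;
    }
  });
}, 100);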
The past few weeks I've been hard at work with Angular, Node, TDD, Heroku, Amazon S3, etc., trying to get a better picture of how a fully scalable SPA with a solid backend is built, working with Grunt and Bower. I haven't dipped my toes into TDD using Jasmine yet, though I understand how the tests are run through Karma; this is supposedly my next step.
One thing is sure: IT IS A LOT OF INFORMATION
On to the Questions/Rationale on working with all these technologies.
First things first, I played with
Angular App https://github.com/angular-app/angular-app
NG Boilerplate https://github.com/joshdmiller/ng-boilerplate
and read many dozens of posts etc.
I found NG Boilerplate to be the most logically structured (as far as my understanding of these things goes).
As a demo project (which evolved from something really small) I want to make a Single Page CRUD Application using:
NodeJS as backend
Express as a web app framework
NG Boilerplate as the Client
The app deployed to Heroku
MongoDB for DB
Amazon S3 for dynamic storage
Now I want to use Angular-App's (https://github.com/angular-app/angular-app) server as a backend for my NG Boilerplate kickstarter.
I want to know how:
from what I see, does the client connect directly to MongoDB?
how does the Angular client communicate back and forth with Express?
I read an interesting article http://www.espeo.pl/2012/02/26/authentication-in-angularjs-application related to how the authentication works.
Long story short, without me asking a ton of questions, could someone please describe in detail the workflow of such an app? Getting the session, login, access to editing the content, tying Express routes to Angular routes (e.g. route X can be accessed by the admin only), etc.
There's a big blur in my head :).
In the last few months I played a lot with these issues and questions, and I came to the following conclusion:
For my purposes, I needed an app that relies almost entirely on Angular, without a separate backend, with whatever backend there is being handled from within Angular.
Why? Because I want all of my eggs in one basket, I don’t want to configure a ton of stuff on a lot of different parts.
As a basis for my project I ended up using ng-boilerplate, as a boilerplate :), with some changes to the development process, Grunt tasks, etc. This part is for everybody to figure out, depending on each particular project.
Well, the main issue I’m gonna touch here is that, for a true backend, made in Angular, we need secure routes and a secure persistence method, a database.
For the app, I took advantage of ng-boilerplate's modular and dependency-aware structure; I think it's perfect for an Angular app.
Anyhow, I'm gonna take things top to bottom (final-product-wise; the build env, as I said above, is up to you, but ng-boilerplate is awesome), here we go.
On the upper layer we have the actual Angular app, made just the way we want it.
The server container is a Node.js server with Express and other modules to take PARTIAL care of the routing on different browsers and devices. (In my app, I made HTML5 routing that is augmented by Express, with .htaccess-like settings: whenever there's a partial URL it should redirect to index, where Angular will read the requested path and zap you to that location; see the sketch after this list.)
In my case, the whole thing runs on Heroku, as a Node.js application; you can install several other things there if you want to.
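A minimal sketch of that Express fallback (Express 4 syntax, with placeholder paths) could be:
var express = require('express');
var path = require('path');
var app = express();

// Serve the built Angular app
app.use(express.static(path.join(__dirname, 'public')));

// Any other GET falls back to index.html, so Angular's HTML5 routing takes over
app.get('*', function (req, res) {
  res.sendFile(path.join(__dirname, 'public', 'index.html'));
});

app.listen(process.env.PORT || 3000);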
Now, for persistence, to have authentication and security, and NOT to rely on a backend for that, I am using Firebase (https://www.firebase.com/); there are some great tutorials there to help you get going and have true persistence in your Angular app, with routes when you are logged in, access to custom tables/objects in the DB when you are logged in, etc. It's the real deal.
If you don't want to rely on OAuth providers to log in with (Facebook, GitHub, Persona or Twitter) and want custom email addresses and passwords, you can do that directly with Firebase, to create accounts, delete them, etc.
FIREBASE Angular Backend.
So, Firebase, just like they say on the site is a powerful API to store and sync data in realtime.
I don't know exactly how to approach this, so I'm gonna start by creating a Firebase database. Once we create it, in the backend we have several options, one of which is Security.
{
"rules": {
".read": true,
".write": "auth != null"
}
}
Here, if we read the documentation at https://www.firebase.com/docs/security/security-rules.html, we'll learn that we can add rules for each 'table' in our database, so we can have, say, 3 protected 'table' objects and some that are not protected.
We can protect tables on a per-user basis, with different rules, depending on whether the user is logged in or not; we also have inheritance for rules, etc. Please read the documentation there, it really is a good read.
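For example, a per-user rule that only lets an authenticated user read and write his own branch could look like this (the users path is just an example):
{
  "rules": {
    "users": {
      "$uid": {
        ".read": "auth != null && auth.uid == $uid",
        ".write": "auth != null && auth.uid == $uid"
      }
    }
  }
}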
Now, for these rules to take effect we need to enable the Firebase Simple Login and select the desired login method, from Facebook, Twitter, Github, Persona, Email&Password and Anonymous.
For a real app, we need to write info to the DB both as anonymous users (user sessions, etc.) and as logged-in users (with any of the options above) to store and read information.
Me, I wanted to go the quick and easy way and set up Facebook authentication; reading the docs there, I made a quick Facebook app, and in the settings of the application on Facebook I'm putting Firebase's backend: https://www.dropbox.com/s/xcd4b8tty1nlbm5/Screenshot%202014-01-22%2013.51.26.png
This gives an interim link to log in to Facebook and have access to 'tables' that are otherwise locked if the rule is auth != null.
NOW, onto the Angular backend.
Firebase provides a library for us to put in our app, and a SimpleLogin lib; also, for Angular, a factory service called AngularFire.
In my case, I made a local firebaseService with methods that connect to my DB:
angular.module('firebaseService', ['firebase'])
.service('firebaseService', function ($firebase, $rootScope) {
//Input data in Firebase
var URL = "https://glowing-fire-xxxx.firebaseio.com";
var tellFirebase = function(ID, JSON) {
var users = $firebase(new Firebase(URL + '/' + ID));
users.details = JSON;
users.$save('details');
};
return {
addUser: function(ID, JSON) {
tellFirebase(ID, JSON);
if ($rootScope.debugStatus === true) {
console.log('Firebase Service .addUSer Called');
}
},
getUser: function(ID) {
if ($rootScope.debugStatus === true) {
console.log('Firebase Service .getUser Called');
}
}
};
})
From here we do our READ/WRITE; in the controller I have this:
It's worth noting that I have a middleware service (storageManagement) where I switch between Firebase and MongoDB, to avoid confusion.
.controller( 'SomeCtrl', function SomeController( $scope, storageManagement, $firebase, $firebaseSimpleLogin ) {
/*===========================
* ==== FIREBASE LOGIN
* ===========================*/
var URL = "https://glowing-fire-XXXXX.firebaseio.com";
var users = new Firebase(URL);
$scope.auth = $firebaseSimpleLogin(users, function(error, user){});
if ($scope.auth.user == null) {
//$scope.auth.$login('facebook');
}
console.log($scope.auth);
//$scope.auth.$logout('facebook');
$scope.doLogin = function() {
console.log($scope.facebookemail);
console.log($scope.facebookpassword);
$scope.auth.$login('facebook');
$scope.$on("$firebaseSimpleLogin:login", function(evt, user) {
storageManagement.runFirebase();
});
/* example of logging in while asking access to permissions like email, user_list, friends_list etc.
* auth.$login('facebook', {
rememberMe: true,
scope: 'email,user_likes'
});*/
};
$scope.doLogout = function() {
$scope.auth.$logout();
};
});
I’m adding the $firebase service to my controller, and the $firebaseSimpleLogin one.
This here exposes to the scope two handlers (login/logout) that pop up the OAuth window from Facebook; with the email/password setup you won't need to go through this, I think. For a full understanding, please read the full docs at Firebase.
SO, once we are logged in, we can access the tables described in the rules. If we choose email/password, and actually even for Facebook or other methods, we can assign certain rules to certain IDENTITIES, so you could have an ADMIN table where you could save settings that get READ on page load to apply whatever you want.
Now, with routes, we can check the $scope.auth status, if WE PUT IT IN $rootScope, and check that status when going to a route. If the status checks out, we get to that route and it gets populated with stuff from the DB; otherwise, even if someone hacks their way to that route, they won't see anything, because there are no permissions to read that table for unauthorized/wrong-email users.
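A rough sketch of such a route check (assuming ngRoute; requiresAuth is a custom flag you would set on the protected route definitions, and /login is a made-up path):
// In a run block: block navigation to protected routes when there is no logged-in user
app.run(function ($rootScope, $location) {
  $rootScope.$on('$routeChangeStart', function (event, next) {
    if (next.requiresAuth && (!$rootScope.auth || !$rootScope.auth.user)) {
      event.preventDefault();
      $location.path('/login');
    }
  });
});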
This is loosely based on this article: http://www.ng-newsletter.com/posts/back-end-with-firebase.html … I had a hard time changing my mindset from what the guy wrote there, but after ONE WHOLE day of reading the Firebase docs (and setting up middleware, mind you) I figured it out, and it works.
The connection to the DB is exposed like one BIG object where you can do whatever operations you want.
This isn't the most complete explanation, but it should get you well on your way to making some awesome things :D
The best example of this that I've come across is called angular-app.
It's very comprehensive and addresses all your needs. It's written by one of the authors of the fantastic book "Mastering Web Application Development with AngularJS".
https://github.com/angular-app/angular-app
From the github repo:
AngularJS CRUD application demo
Purpose
The idea is to demonstrate how to write a typical, non-trivial CRUD application using AngularJS. To showcase AngularJS in its most advantageous environment we've set out to write a simplified project management tool supporting teams using the SCRUM methodology. The sample application tries to show best practices when it comes to: folders structure, using modules, testing, communicating with a REST back-end, organizing navigation, addressing security concerns (authentication / authorization).
We've learned a lot while using and supporting AngularJS on the mailing list and would like to share our experience.
Stack
Persistence store: MongoDB hosted on MongoLab
Backend: Node.js
Awesome AngularJS on the client
CSS based on Twitter's bootstrap
Build
It is a complete project with a build system focused on AngularJS apps and tightly integrated with other tools commonly used in the AngularJS community:
powered by Grunt.js
tests written using Jasmine syntax
tests are executed by the Karma Test Runner (integrated with the Grunt.js build)
build supporting JS, CSS and AngularJS templates minification
Twitter's bootstrap with LESS templates processing integrated into the build
Travis-CI integration