I'm writing a web crawler in Node. It will crawl my various bank accounts and provide me with a summary of my finances. Acknowledging the security issues around this, I'm just doing it as a proof of concept.
I'm having a problem with structuring my application.
So far my controller modules are:
/controllers/routes.js (contains express routes)
/controllers/configure.js (takes values from /settings.js and interprets them for /app.js)
/controllers/crawler.js (downloads a page, traverses DOM and outputs values from selectors)
/controllers/login.js (provides crawler.js with functions to log in to bank accounts)
Are these valid controller modules, or are they more suited for a directory such as /lib/?
At the end of the day it doesn't matter for the functionality of the project, but I'm presenting it at the end of the week.
Controllers are the pieces that process requests by gluing models and views together. The router routes a request to a controller, which calls methods on models and then renders a view.
Since most of your code exists to fulfill specific tasks that have nothing to do with the frontend of your application: no, most of it is not what I would call controller code.
As you already said, it makes more sense to group that code into modules and put it in other directories. Those functions are either called by the controllers to render the frontend, or (more likely) called via cron jobs to update the database.
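As a minimal sketch of that split in Express (accountModel, getSummary() and the summary view are hypothetical names, not parts of your project):

// controllers/routes.js — route/controller glue only
const express = require('express');
const router = express.Router();
const accountModel = require('../lib/accountModel'); // hypothetical model module in /lib

router.get('/summary', async (req, res, next) => {
  try {
    const accounts = await accountModel.getSummary(); // the model/lib code does the heavy lifting
    res.render('summary', { accounts });              // the controller glues model output to a view
  } catch (err) {
    next(err);
  }
});

module.exports = router;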
Related
I am coming from a models-and-controllers paradigm, where models deal with the DB and controllers hold the business logic that serves the REST APIs.
Now I am looking into LoopBack (a framework based on Node.js) for a new project. But it only has models to do all kinds of stuff. I am not able to understand how I can merge the service layer and the controller layer into models. That sounds a bit confusing to me.
If anyone can point me in the right direction for designing a system with LoopBack, that would be very helpful.
When you create a new model, say Profile, you get 2 new files:
profile.js
profile.json
Consider profile.json your model (really just a declaration of your model), and consider profile.js your controller. All the RESTful APIs you need are dynamically generated by LoopBack; if you need to add additional logic to the regular APIs or create new ones, your starting point is profile.js.
Now, you can structure your application code as you like. I usually put all the application business logic into a service layer, having the module profile_service.js and referencing it from profile.js.
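A rough sketch of that delegation (assuming LoopBack 3; profile_service.js and its summarize() method are hypothetical):

// common/models/profile.js
const profileService = require('../services/profile_service'); // hypothetical service module

module.exports = function (Profile) {
  // a custom endpoint added on top of the generated REST API
  Profile.summary = function (id, cb) {
    profileService.summarize(id)
      .then(result => cb(null, result))
      .catch(cb);
  };

  Profile.remoteMethod('summary', {
    accepts: { arg: 'id', type: 'string', required: true },
    returns: { arg: 'data', type: 'object' },
    http: { path: '/summary', verb: 'get' }
  });
};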
I'm very confused about how Ember controllers work.
I'm starting with Ember and ember-cli now, and I would like to understand more about how controllers work.
If I have a nested route called new inside an events resource, I should have:
models/event
routes/events/new
templates/events/new
What about controllers? Do I just work with one simple controller, or should I use controllers/events/new too?
Isn't there any generator command that will create every resource for me? Do I need to call them one by one?
Thanks.
What about controllers? Do I just work with one simple controller, or should I use controllers/events/new too?
This mainly depends on what your controller needs to do. If it's only the essential stuff a controller does anyway, Ember will create that controller under the hood for you and automatically bubble actions up to its parent controller.
No better place than Ember guides to read what a controller is used for:
The simplest definition is:
Controllers allow you to decorate your models with display logic.
This means that you basically use them as the main communication layer between your route and your template. Essentially, your model comes from your route, through your controller, and into your template. Actions happening in the template go up to the controller and then to the route. Therefore, the controller is essentially the middle layer where you use your model (and other data) to control what is shown to the user, what a user can do, where they can navigate, etc.
However, be aware of the plan for the future:
Controllers are very much like components, so much so that in future versions of Ember, controllers will be replaced entirely with components. At the moment, components cannot be routed to, but when this changes, it will be recommended to replace all controllers with components.
This means, that right now, controller responsibility is limited to two things:
Maintaining application state based on the current route
Handling or bubbling user actions that pass through the controller layer when moving from a component to a route.
All actions triggered in a template are first looked up on the controller; if an action is not handled there, or is bubbled (by returning true), it is then looked up on the route.
Therefore, controllers for your /events or events/new routes aren't necessary at all; you only need them if you want to handle things happening on those routes right away (in a smaller scope) instead of letting everything bubble up to the ApplicationController.
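A minimal sketch of that handling-versus-bubbling behaviour (the save action name is just an example):

// controllers/events/new.js
import Ember from 'ember';

export default Ember.Controller.extend({
  actions: {
    save() {
      // handle display-level concerns here...
      return true; // returning true lets the action keep bubbling up to the route
    }
  }
});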
Isn't there any generator command that will create every resource for me? Do I need to call them one by one?
Yes, you call them one by one. If you don't specifically create a controller, Ember just generates one for you behind the scenes, so you only generate the ones where you want to handle things yourself.
You should visit the link I gave above (or here it is again) to the Ember guides that have many more examples in much more detail.
I'm making a single page application using Require.js and Backbone.js. It's a fairly large web app with a lot of different "pages", aka views. Below is my router to give you an idea. There are several main pages with sub-pages.
So for example there's a Settings section that has multiple different sub pages such as user settings, language settings, email settings etc.
How would I structure many routes and their views for simplicity?
Right now I'm giving each sub-page its own view, but that means I have to import 20-30 views into the router so that all possible views are available for when that page is routed.
Another way I thought of was to have one view per section and load different partials inside that view. That way I only have to load the 5-6 section views into the router... but then the view would have to understand routing.
What's the right way to do this?
I create 'controller' objects that take care of view rendering and model fetches.
I prefer to keep the router clean at all times, which means I don't clutter it with callback functionality. Doing so would make the router a mess over time, while part of its purpose is to give a quick overview of the available routes.
In Backbone, I found that it is useful to create your own conventions, just like a framework would do.
For example, for every view I create, I will create one controller object.
Every controller object has a method named 'makeViews()', which takes care of rendering the view as well as memory management.
In my own theory, I created a method of 'cascading controllers', in the sense that one controller may also control other controllers, and controllers may use 'helper' objects to fulfil certain tasks.
For example, when you say that you may need to manage 20 views and subviews, we could imagine that some of the views are related to each other; that there will be a central controller taking care of the tasks common to related views, and specific controllers taking care of specific, individual view functionality.
A route in my router looks something like this:
auth: function () {
  //--- Check the authStatus and render status-independent views
  var auth_ctr = new Auth_ctr();
  auth_ctr.makeViews();
}
In the example given, you could imagine that you will create and render multiple views. So what I really do is instantiate new controllers from within this controller, each of which individually creates and manages views, provides functionality that supports the view, and gets the collection/model data.
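A rough sketch of such a 'cascading' controller (Login_ctr and Status_ctr are made-up sub-controllers that follow the same makeViews() convention):

var Auth_ctr = function () {
  this.subControllers = [];
};

Auth_ctr.prototype.makeViews = function () {
  // each sub-controller creates, renders and manages its own view
  this.subControllers.push(new Login_ctr(), new Status_ctr());
  this.subControllers.forEach(function (ctr) {
    ctr.makeViews();
  });
};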
It would be important to create a sort of independent 'View manager' that prevents memory leaks from occurring when you render new views each time.
This is just how I do it, but of course, I'm sure there are people who do this differently.
It is a theory I came up with; it has given me a clear structure, and it has worked well for me so far.
I have a basic Express server that needs to store some global variables while handling each request.
More specifically, handling a request involves many operations that need to be tracked in a variable such as global.transaction[].
Of course, if I use the global scope, every connection will share its transaction information with the others. But I need a global-like scope because I have to access the transaction array from many other modules during execution.
Any suggestion on this problem? I feel like is something very trivial but I'm looking for complicated solutions :)
Many thanks!
UPDATE
This is a case scenario, to be more clear.
On every request I have three modules (ModuleA, ModuleB, ModuleC) which read the contents of 10 random files in one directory. I want to keep track of the list of file names read by every request and send the list back with res.write.
So ModuleA/B/C need access to a sort of global variable, but the lists for request_1, request_2, request_3, etc. must not get mixed up.
Here is my suggestion: avoid global state like fire.
It's the number one maintenance problem in Node servers from my experience.
It makes your code not composable and harder to reuse.
It creates implicit dependencies in your code - you're never sure which piece depends on which and it's not easy to verify.
You want the parts of code that each piece of an application uses to be as explicit as possible. It's a huge issue.
The issue
We want to synchronize state across multiple requests and act accordingly. This is a very big problem in writing software - some say even the biggest. The importance of the way objects in an application communicate cannot be overstated.
Some solutions
There are several ways to accomplish sharing state across requests, or server-wide, in a Node server. It depends on what you want to do. Here are the two most common, in my opinion.
I want to observe what the requests do.
I want one request to do things based on what another request did.
1. I want to observe what the requests do
Again, there are many ways to do this. Here are the two I see most.
Using an event emitter
This way requests emit events. The application reads events the requests fire and learns about them accordingly. The application itself could be an event emitter you can observe from the outside.
You can do something like:
request.emit("Client did something silly", theSillyThing);
And then listen to it from the outside if you choose to.
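For example, since the Express request object is already an EventEmitter, a middleware could observe it from the outside (the event name and payload here are made up for illustration):

app.use((req, res, next) => {
  req.on('file:read', (fileName) => {
    console.log('this request read', fileName);
  });
  next();
});

// elsewhere, inside a module that has access to req:
// req.emit('file:read', 'report-3.txt');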
Using an observer pattern
This is like an event emitter but reversed. You keep a list of dependencies on the request and call a handler method on them yourself when something interesting happens on the request.
Personally, I usually prefer an event emitter because I think they usually solve the case better.
2. I want one request to do things based on what another request did.
This is a lot trickier than just listening. Again, there are several approaches here. What they have in common is that we put the sharing in a service.
Instead of having global state, each request gets access to a service - for example, when you read a file you notify the service, and when you want the list of read files you ask the service. The dependency is completely explicit.
The service itself is not global; it is only a dependency of the pieces that need it. For example, it can coordinate resources and data, acting as some form of Repository.
Nice theory! Now what about my use case?
Here are two options for what I would do in your case; they are far from the only solutions.
First option:
Each of the modules are an event emitter, whenever they read a file they emit an event.
A service listens to all their events and keeps count.
Requests have access to that service explicitly and can query it for a list of files.
Requests perform writes through the modules themselves and not the added service.
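A rough sketch of this first option (ModuleA, the 'file:read' event and the ReadTracker service are all made-up names):

const EventEmitter = require('events');

// each module emits an event whenever it reads a file
class ModuleA extends EventEmitter {
  readRandomFiles(dir) {
    // ...pick and read 10 random files from dir, and for each one:
    // this.emit('file:read', fileName);
  }
}

// one tracker per request, observing whatever modules it is given
class ReadTracker {
  constructor(modules) {
    this.files = [];
    modules.forEach(m => m.on('file:read', f => this.files.push(f)));
  }
  list() { return this.files; }
}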
Second option:
Create a service that owns a copy of module1, module2 and module3. (composition)
The service delegates actions to the modules based on what is required from it.
The service keeps the list of files accessed since the requests were made through it.
The request stops using the modules directly - uses the service instead.
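And a sketch of the second option, where a per-request service owns the modules (the readFiles() method on the modules is assumed for illustration):

class FileService {
  constructor() {
    this.modules = [new ModuleA(), new ModuleB(), new ModuleC()];
    this.filesRead = [];
  }

  readAll(dir) {
    this.modules.forEach(m => {
      const names = m.readFiles(dir); // delegate the work to the module
      this.filesRead.push(...names);  // keep track of what was read
    });
  }

  list() { return this.filesRead; }
}

// per request:
// const service = new FileService();
// service.readAll('./data');
// res.write(JSON.stringify(service.list()));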
Both of these approaches have advantages and disadvantages. A more complicated solution, where the services are abstracted further, might be required (those two are pretty simple to do in practice), but I think this is a good start.
One simple way is storing data on the request object.
Here is an example (using Express):
app.get('/hello.txt', function (req, res) {
  req.transaction = req.transaction || [];
  if (req.transaction.length) {
    // something else has already written to this array
  }
  res.send(req.transaction); // respond with whatever was collected on this request
});
However, I don't really see why you would need this. When you call moduleA or moduleB, you just have to pass an object as an argument, and it solves your issue. Maybe you're looking for dependency injection?
Using Koa, ctx.state (see the docs) is meant for exactly this scenario; in Express, I believe this plugin should serve your needs.
In order to keep some data that will be reused by another request on the same server app, I propose using sessions in Express and avoiding any global state or any props drilling from one request to another.
In order to manage session state in Express you could use:
session-file-store: saves the session in a file
express-mongodb-session: saves the session in MongoDB
mssql-session-store: for a relational DB
Of course, there are other techniques to manage sessions in Node.js.
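A minimal sketch with express-session and session-file-store (the secret and options are placeholders):

const express = require('express');
const session = require('express-session');
const FileStore = require('session-file-store')(session);

const app = express();

app.use(session({
  store: new FileStore({ path: './sessions' }), // persist sessions to disk
  secret: 'replace-me',
  resave: false,
  saveUninitialized: false
}));

app.get('/files', (req, res) => {
  req.session.filesRead = req.session.filesRead || [];
  res.json(req.session.filesRead); // data survives across requests from the same client
});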
Most of the open Backbone.js single-page apps and demos out there seem to deal with one or maybe two different resources, and usually populate the collections in question when initially loading the page.
How do you guys deal with multiple (more than 2 or 3) different resources having their data stored remotely? When do you load the content?
An example which should sound familiar to Rails devs:
A current_user object exists, having has_many associations to Project, Team, Task and Invoice models. My client-side app provides some kind of CRUD functionality for these models, maybe some additional views to connect stuff, etc., having in total many different views and corresponding routes.
I want the user to be able to jump straight to any of these routes by e.g. pasting a link, let's say /#project/34/invoices, which would require the app to have the Project with ID 34 loaded as well as the invoices connected to this project.
How do people solve this issue, given that you can easily end up having many cases like this within a single app? Are you just loading everything initially
current_user: {
  projects: {
    invoices: {…},
    tasks: {…},
    …
  },
  …
}
which doesn't seem clean to me, or do you have a clever way to always load what you need?
Cheers!
One approach is to load enough at start-up to do most high-level tasks. Thereafter, lower-level tasks might need to perform fetches to get detail-level resources.
In a travel management app, you might establish a session for a user and get the user and any reservations that user has. This would enable you to quickly show reservation summaries without having to make additional server calls.
Requests for more detailed reservation data might require a lower level call. For instance, if I have a flight reservation, I might periodically call for flight status information.
I think the key question revolves around your caching strategy. You need to evaluate how long data can live within your app without needing to be refreshed. If your resources go stale quickly, then they should be fetched when needed.
I'm working on a backbone project that you can take a look at here: http://sourceforge.net/projects/myelin/ but more specifically maybe this model: http://myelin.git.sourceforge.net/git/gitweb.cgi?p=myelin/myelin;a=blob;f=public/javascripts/models/tab.js;h=6b4cb6ad26d2fc12a817ec027b60b2b5ef13f463;hb=HEAD
(a couple of caveats: This is my first project using pretty much all of the technologies I chose to use, so I'm positive I made some design errors. If you notice anything, just send me a note so I can try to fix it! I didn't use the backbone controller, and rolled up my own, but I think in your case, the default controller is exactly what you're looking for.)
Backbone.js is very REST-ish in its approach to dealing with things. I think if you approach it much the same way as you would in the Rails world, a solid solution may become more apparent.
For example, take a look at the way Backbone does its routing / controllers: http://documentcloud.github.com/backbone/#Controller-routes
It maps a route to an event, which you can set a listener to. So, in your example route above, you could fire an event, and have your projects controller catch it and execute some stuff.
If you were implementing the example in Rails, your invoices controller would be called, you'd grab the data through your model, then send that data off to the view to have it rendered. It's the same idea with Backbone (though it's more code-involved than the Rails world).
In my app, that's what's happening. The user does something in the UI, which triggers an event, which is caught and sent off to a controller. The controllers have methods with names very similar to those in Rails: 'get', 'index', 'create', 'destroy', etc. Once the model has done its thing, it then calls its view and renders, or does whatever it needs to do.
In the example above (tab.js), when a Tab is instantiated, its TabContents collection is built, but empty, which is similar to Rails (I think). Rails won't load the whole kit and caboodle unless you ask it to.
Hope that helps!
I found a way for myself to handle this — maybe it helps someone else:
I'm extending a base collection class that implements an isStale method.
This way I can fetch or "refresh" a collection's data if it hasn't been done for a certain time:
if (this.collection.isStale()) {
  this.collection.fetch();
}
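A minimal sketch of what such a base collection might look like, assuming staleness is defined by a maxAge in milliseconds:

var BaseCollection = Backbone.Collection.extend({
  maxAge: 5 * 60 * 1000, // consider the data stale after five minutes

  fetch: function (options) {
    this._fetchedAt = Date.now();
    return Backbone.Collection.prototype.fetch.call(this, options);
  },

  isStale: function () {
    return !this._fetchedAt || (Date.now() - this._fetchedAt) > this.maxAge;
  }
});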