P.S. I probably don't fully understand the MVC principle, but I always thought that business logic should live in models. Correct me if I'm wrong.
I am writing a project using the MVC pattern.
The router accepts requests and passes them to a controller, which takes the data and sends it to a model. The model validates the data and, if validation succeeds, works with the database. My models have grown very large because database code is mixed in with validation.
exports.modelFunction = (data) => {
  // validate the data ...
  // work with the database
}
With every new fix it becomes harder for me to read the code. I decided to create a utils/validation/ folder where I handle all of the model validations, and yes, now the code is more readable:
exports.modelFunction = (data) => {
  const validation = validationModel(data);
  if (!validation) return { ok: false, message: 'Validation failed' };
  // work with the database
}
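For illustration, the kind of check that ends up in utils/validation/ might look like this (the fields here are made up, not my real model):
exports.validationModel = (data) => {
  // made-up example fields - the real checks depend on the model
  if (!data || typeof data.name !== 'string' || data.name.trim() === '') return false;
  if (data.email && !/^\S+@\S+\.\S+$/.test(data.email)) return false;
  return true;
}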
I figure there are already other patterns that deal with this kind of separation of logic and solve problems like mine, so I decided to ask for a hint.
I think you're confusing data validation with business logic.
The controller is where your business logic lives. It is the gateway that allows access to the model.
If the controller decides that the user can access or modify the data, it passes that data down to your model to do the work.
Your model should validate the data to see whether it has all the required attributes to work with the database. If it doesn't, it fails and returns to your controller. When that happens, your controller applies your business logic and responds to the request with the appropriate status.
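A rough sketch of that flow, with made-up names and a made-up permission check:
const itemModel = require('../models/itemModel'); // hypothetical model

// controller: business logic - access control and the HTTP response
exports.updateItem = async (req, res) => {
  if (!req.user || !req.user.canEdit) {
    // business rule lives here, not in the model
    return res.status(403).json({ ok: false, message: 'Not allowed' });
  }
  // model: validation + database work, reports back { ok, ... }
  const result = await itemModel.modelFunction(req.body);
  if (!result.ok) return res.status(400).json(result);
  return res.status(200).json(result);
};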
I am trying to use a reactive approach in my Angular app, but I am not quite sure that I am using it correctly, so I have a few questions to clarify it.
In my app I have a global status service (a Subject-based service) which holds some basic state of the app: the month and workload id selected by the user (the user can switch them in the navbar).
On the market page I display offers which the user can apply for. To get the data I need, I use observables from the market service with the async pipe, so there is no manual subscription.
Here is part of the market service:
export class MarketService {
  // subject with updates
  private prefChangesSubj = new Subject<MarketOffer>();

  // observable which loads data from the API every time the user changes status
  get loadedEvents() {
    return this.statusService.appStatus.pipe(
      switchMap(status => this.getOffers(status.selectedMonth)),
      shareReplay({ refCount: true })
    );
  }

  // observable consumed by the component - loaded events + changes made by the user
  get events() {
    return this.loadedEvents.pipe(
      switchMap(initEvents => this.prefChangesSubj.asObservable().pipe(
        startWith(initEvents),
        scan((events: { offers: MarketOffer[], dayPrefs: MarketOffer[], month: Date }, update: MarketOffer) => {
          // ...
          return events;
        })
      ))
    );
  }

  getOffers(date: Date) {
    return this.httpClient.get(/* ... */);
  }
}
And now my questions:
Is it OK to have these combined observables (loadedEvents, events) in the service, or should they be combined in the component?
How do I handle errors when I am using the async pipe? For example, in the loadedEvents getter I use switchMap to call getOffers, which fetches data from the API. How do I handle the error if the HTTP call fails? I can use catchError, but then the component wouldn't be notified about the error. Yet I must catch this potential error, because otherwise it will break the whole observable and new data won't be loaded later. How do I solve this problem?
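To make that concrete, this is roughly the shape I have in mind (the { offers, error } wrapper is only an example, not my real model):
// map, catchError and of come from 'rxjs' / 'rxjs/operators'
get loadedEvents() {
  return this.statusService.appStatus.pipe(
    switchMap(status => this.getOffers(status.selectedMonth).pipe(
      map(offers => ({ offers, error: null })),
      // catching inside switchMap means only this inner request fails;
      // the outer appStatus stream stays alive for later status changes
      catchError(err => of({ offers: [], error: err }))
    )),
    shareReplay({ refCount: true })
  );
}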
Is the approach of creating a combined observable from loadedEvents and the changes subject correct, or how should it be done in a reactive way?
I have searched for articles on this topic, but most of them don't cover problems like error handling, so I would be grateful even for links to some good articles or example apps so I can read more about this.
Thanks, and sorry for the long post :)
Abstract: should I use express-validator if I can validate data using a mongoose schema?
I'm a front-end developer, and I usually do some validation on the form before submitting. Now I've started studying Express.
One of the first things I did was receive a form and validate it using a library called express-validator; then the database operations were done. Very simple, no big deal. However, after doing a project by myself I realized that mongoose itself could handle these errors, and not only that, it was quite easy to return them to the front end, especially in the case of an API.
So that is my doubt: why validate this data so many times? I know that database schemas are not only for that, but doing these things once on the front end and twice on the back end causes a lot of repetition and may be a little hard to maintain.
Here are a few lines of code just to illustrate the case. Don't judge me, I'm still learning.
import mongoose from "mongoose";
const TaskSchema = new mongoose.Schema({
name: {
type: String,
required: true,
trim: true,
maxlength: 20,
},
description: {
type: String,
required: false,
},
completed: {
type: Boolean,
default: false,
},
date: {
type: Date,
default: Date.now,
},
});
export default mongoose.model("Task", TaskSchema);
import express from "express";
import taskModel from "../models/tasksModel";
export function createTask(req: express.Request, res: express.Response) {
taskModel
.create(req.body)
.then((task) => res.status(201).send(task))
.catch((err) => res.status(400).send(err));
}
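For comparison, an express-validator version of the same endpoint might look roughly like this (just a sketch; the route path and rules are illustrative):
import express from "express";
import { body, validationResult } from "express-validator";
import taskModel from "../models/tasksModel";

const router = express.Router();

router.post(
  "/tasks", // illustrative route
  // the same rules the mongoose schema already enforces, duplicated here
  body("name").trim().notEmpty().isLength({ max: 20 }),
  body("description").optional().isString(),
  (req, res) => {
    const errors = validationResult(req);
    if (!errors.isEmpty()) {
      return res.status(400).json({ errors: errors.array() });
    }
    taskModel
      .create(req.body)
      .then((task) => res.status(201).send(task))
      .catch((err) => res.status(400).send(err));
  }
);

export default router;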
The thing is that you have both basic and business requirements. By declaring database tables (collections), their properties, data types, constraints, etc., you set up, so to speak, the basic data structure. But you can also have extra business requirements that probably won't be covered by the database syntax.
For example, you have an input that is an array of objects which should then be saved to the database. You probably want to prevent passing duplicate objects (sketched below).
Depending on different conditions (user roles, permissions, etc.) your data may be validated against different validation schemas.
It's also a common case that you validate your input and then transfer it to various layers, modules and components of your application as a DTO.
Last but not least, injection attacks can happen simply through the database consuming unvalidated data.
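For example, the duplicate check mentioned above might be a plain helper that runs before the data ever reaches the schema; this is just a sketch with made-up field names:
// business rule the database schema alone won't express:
// reject an incoming array that contains duplicate objects (compared by name here)
function assertNoDuplicates(items) {
  const names = items.map((item) => item.name);
  if (new Set(names).size !== names.length) {
    throw new Error("Duplicate items are not allowed");
  }
}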
On the front end, each user works with their own copy of the application, which runs on a single machine and accepts input from the local system or known web resources.
On the back end, all users work with a single app, which runs on shared infrastructure and accepts input from the Internet.
Server-side validation is required to ensure data integrity and security, as you have no control over where HTTP requests come from or what payload they deliver. Even legitimate clients may lack client-side validation if your API is called from a script. An invalid or malicious request can affect all users, and data loss may not be recoverable.
Client-side validation is optional and is there to improve UX and give instant feedback on invalid input: you save the HTTP round trip and can validate fields in isolation before the whole form is completed and submitted to the server. Security-wise, the client-side application affects only the current user and can always be restored to its initial state by reloading the page.
So I'm using Meteor's "methods" to transmit data between the client and the server. Is there a recommended pattern for validating the data? I've seen SimpleSchema used on the server like this:
Lists.schema = new SimpleSchema({
name: {type: String},
incompleteCount: {type: Number, defaultValue: 0},
userId: {type: String, regEx: SimpleSchema.RegEx.Id, optional: true}
});
...
const list = {
name: 'My list',
incompleteCount: 3
};
Lists.schema.validate(list);
...which makes sense. Is there something similar I should be using on the client to validate forms? Any extra information would be appreciated as well.
You can use simple-schema to validate method arguments when you use Meteor's validated-method package:
https://github.com/meteor/validated-method
One advantage of this is that the args are validated on the client in the method simulation, so if there is an error the method is rejected before it even reaches the server.
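A minimal sketch of such a method (assuming the simpl-schema npm package; the method name is made up and Lists is the collection from your snippet):
import { ValidatedMethod } from 'meteor/mdg:validated-method';
import SimpleSchema from 'simpl-schema';

export const insertList = new ValidatedMethod({
  name: 'lists.insert', // made-up method name
  // runs in the client simulation and on the server, so bad args fail early
  validate: new SimpleSchema({
    name: { type: String },
    incompleteCount: { type: Number, defaultValue: 0 }
  }).validator(),
  run({ name, incompleteCount }) {
    // Lists is assumed to be the collection from the question
    return Lists.insert({ name, incompleteCount });
  }
});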
You can also use Meteor's check package (https://docs.meteor.com/api/check.html) as part of your validation.
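For example (a small sketch; the method name is made up):
import { Meteor } from 'meteor/meteor';
import { check, Match } from 'meteor/check';

Meteor.methods({
  'lists.updateName'(listId, name) {
    // throws if the arguments don't have the expected shape
    check(listId, String);
    check(name, Match.Where(n => typeof n === 'string' && n.length > 0));
    // ...update the document here
  }
});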
In terms of validating forms, there are many solutions; an example would be jQuery Validation.
The need for client-side validation arises when you are not using AutoForm. If wrong values are passed from the UI, SimpleSchema generates the error on the server side.
If you want no validation on the client side without using AutoForm, you can use the check functionality to check the data sent from the UI. When any check fails, catch it in the asynchronous Meteor.call callback and, using Bootstrap and jQuery, show a user-friendly message in the UI (see the sketch below).
Alternatively, use plain JavaScript and jQuery to meet your needs. That approach is very tedious to maintain once the application is live on the server: to change a simple validation or condition, the entire code has to be rebuilt and minified for production. But if you use SimpleSchema and AutoForm, you only have to change the app.js file and restart the application (perhaps with pm2, as I do), and your application will work fine.
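For instance, the call from the client might look roughly like this (the method name and selector are placeholders):
Meteor.call('lists.insert', formValues, function (error, result) {
  if (error) {
    // show a friendly message, e.g. in a Bootstrap alert
    $('#form-errors').text(error.reason || 'Something went wrong').show();
    return;
  }
  // success: clear the form, hide the alert, etc.
});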
I would also like to add that communication between server and client in Meteor is done through "publications" (server side) and "subscriptions" (client side) to those publications.
To my understanding, methods in Meteor are mainly for CRUD-style operations, and they can be called remotely from the client.
Background: I am using the MEAN stack to build a web app; I am still learning.
The issue: I find the following confusing. Say I have a logged-in user (I am using Passport.js). From Angular I can retrieve it by querying my Node.js server.
What I am doing now is something similar to:
app.get('/userLogged', function (req, res) {
  res.json(req.user);
});
This does not sound safe to me. I might be a novice, but I have seen this in many tutorials. With a console.log in the browser I can print all the info about the user, including the hashed password. My guess is that I should send a minimal set of information to the browser, filtering out the rest.
My question: is this safe at all, or am I just leaving the door open to hackers?
Take a look at the concept of ViewModel. It represents the data you want to share publicly with an external user of the system.
What can be done in your case is to build the right view model out of the data model you store internally. A simplistic example illustrating this concept would be to create a view model for your user object that picks only the data you want to send back:
// This function will return a different version
// of the `user` object having only a `name`
// and an `email` attribute.
var makeViewModel = function (user) {
return _.pick(user, ['name', 'email']);
}
You will then be able to construct the right view model on demand:
app.get('/user',function (req,res){
res.json(makeViewModel(req.user));
});
I've been working on a Backbone.js project that syncs to a Google App Engine REST server I've also put together. I'm using the Appengine-Rest-Server project code to enable the REST interface. It works really well, but there is one problem: when I POST a new model to it, it expects a JSON body in the form of:
{ 'modelname' : {'attribute1':'attribute','attribute2':'attribute'}}
When I use Python and the requests library to POST this data to my REST server, it works fine.
Backbone.js, however, appears to send POST requests without the model name, like this:
{'attribute1':'attribute','attribute2':'attribute'}
Now, not being an expert on what formats are 100% RESTful, I'm not sure whether my REST server is improperly configured (in which case I don't expect you to be able to help with the code), whether Backbone.js is improperly configured, or whether both formats are potentially RESTful and I just need to figure out how to get Backbone to add in the model name.
So, to close: are one or both of these formats compatible with a truly RESTful API? And if requiring the model name in the JSON is not a gross violation of REST, does anyone know how I can make Backbone send POST requests in the proper format?
Thanks!
The most elegant way:
YourModel = Backbone.Model.extend({
name: 'modelName',
toJSON: function () {
var data = {};
data[this.name] = _.clone(this.attributes); // or Backbone.Model.prototype.toJSON.apply(this)
return data;
}
});
Or you can pass the data directly in the options:
model.save(null, {
data: {
// custom format
}
});
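For example (a sketch of that second option): Backbone builds and stringifies the request body itself only when options.data is absent, so you would typically wrap and stringify the payload yourself:
model.save(null, {
  // 'modelName' is whatever key your server expects
  data: JSON.stringify({ modelName: model.toJSON() }),
  contentType: 'application/json'
});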