Fastify and Ajv schema validation - JavaScript

I am trying to validate the querystring parameter 'hccid' as shown below, but the validation does not seem to be working for me. Can someone see what I am missing?
const fastify = require('fastify')({
  ajv: {
    removeAdditional: true,
    useDefaults: true,
    coerceTypes: true
  }
});

const schema = {
  querystring: {
    hccid: { type: 'string' }
  }
};

// Declare a route
fastify.get('/hello', { schema }, function (request, reply) {
  const hccid = request.query.hccid;
  reply.send({ hello: 'world' })
});

// Run the server!
fastify.listen(3000, function (err) {
  if (err) throw err
  console.log(`server listening on ${fastify.server.address().port}`)
});
So with that code, I should get a schema validation exception when I call the service with a completely new query param abc, just like shown below:
http://localhost:3000/hello?abc=1
but there was no error; I got the response {"hello":"world"} back.
I also tried removing the query param altogether (http://localhost:3000/hello)
and I still got {"hello":"world"},
so obviously the validation is not working. What is missing in my code? Any help would be appreciated.

This schema structure solved my problem. Just in case someone wants to check it out if they run into a similar issue:
const querySchema = {
  schema: {
    querystring: {
      type: 'object',
      properties: {
        hccid: {
          type: 'string'
        }
      },
      required: ['hccid']
    }
  }
};
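For completeness, here is a sketch of that corrected schema wired into the route from the question. The additionalProperties: false line is my addition: note that with the question's removeAdditional: true Ajv option, unknown parameters like ?abc=1 get silently stripped rather than rejected, so drop removeAdditional if you want them to fail validation instead.

```javascript
// Sketch: corrected route options for the question's /hello route.
// With Ajv's removeAdditional: true (from the question's fastify setup),
// additionalProperties: false strips unknown query parameters instead of
// rejecting them; a missing hccid still fails via `required`.
const routeOptions = {
  schema: {
    querystring: {
      type: 'object',
      properties: {
        hccid: { type: 'string' }
      },
      required: ['hccid'],
      additionalProperties: false
    }
  }
};

// Usage, assuming the fastify instance from the question:
// fastify.get('/hello', routeOptions, (request, reply) => {
//   reply.send({ hello: 'world' });
// });
```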

According to the docs, you can use querystring or query to validate the query string, and params to validate the route params.
Params would be:
/api/garage/:id, where id is the param, accessible at request.params.id
/api/garage/:plate, where plate is the param, accessible at request.params.plate
Example for param validation would be:
const getItems = (req, reply) => {
  const { plate } = req.params;
  delete req.params.plate;
  reply.send(plate);
};

const getItemsOpts = {
  schema: {
    description: "Get information about a particular vehicle present in garage.",
    params: {
      type: "object",
      properties: {
        plate: {
          type: "string",
          pattern: "^[A-Za-z0-9]{7}$",
        },
      },
    },
    response: {
      200: {
        type: "array",
      },
    },
  },
  handler: getItems,
};

fastify.get("/garage/:plate", getItemsOpts);
done(); // assuming this code lives inside a plugin that receives `done`
Query / Querystring would be:
/api/garage/id?color=white&size=small, where color and size are the two query-string parameters, accessible at request.query.color and request.query.size.
Please refer to the above answer for an example of query validation.
Validation
The route validation internally relies upon Ajv v6, which is a high-performance JSON Schema validator. Validating the input is very easy: just add the fields that you need inside the route schema, and you are done!
The supported validations are:
body: validates the body of the request if it is a POST, PUT, or PATCH method.
querystring or query: validates the query string.
params: validates the route params.
headers: validates the request headers.
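Following the same pattern as the params example above, a body schema for a POST route might look like this (a sketch; the route and option names are illustrative, not from the original answer):

```javascript
// Sketch: body validation for a hypothetical POST /garage route,
// mirroring the params example above. All names here are illustrative.
const addVehicleOpts = {
  schema: {
    body: {
      type: 'object',
      required: ['plate', 'color'],
      properties: {
        plate: { type: 'string', pattern: '^[A-Za-z0-9]{7}$' },
        color: { type: 'string' }
      }
    }
  }
};

// fastify.post('/garage', addVehicleOpts, (req, reply) => {
//   reply.code(201).send(req.body);
// });
```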
[1] Fastify Validation: https://www.fastify.io/docs/latest/Validation-and-Serialization/#validation
[2] Ajv#v6: https://www.npmjs.com/package/ajv/v/6.12.6
[3] Fastify Request: https://www.fastify.io/docs/latest/Request/


Apollo GraphQL merge cached data

I have a page that consists of two components, and each of them has its own request for data. For example:
<MovieInfo movieId={queryParamsId}/>
const GET_MOVIE_INFO = gql`
  query($id: String!){
    movie(id: $id){
      name
      description
    }
  }
`;
Next component:
<MovieActors movieId={queryParamsId}/>
const GET_MOVIE_ACTORS = gql`
  query($id: String!){
    movie(id: $id){
      actors
    }
  }
`;
For each of these queries I use the Apollo hook:
const { data, loading, error } = useQuery(GET_DATA, { variables: { id: queryParamsId } });
Everything is fine, but I got a warning message:
Cache data may be lost when replacing the movie field of a Query object.
To address this problem (which is not a bug in Apollo Client), either ensure all objects of type Movie have IDs, or define a custom merge function for the Query.movie field, so InMemoryCache can safely merge these objects: { ... }
It works OK in Google Chrome, but this error affects Safari: everything crashes. I'm 100% sure it's because of this warning message. On the first request I set Movie data in the cache; on the second request to the same query I just replace the old data with the new, so the previously cached data is undefined. How can I resolve this problem?
Here is the same solution mentioned by Thomas, but a bit shorter:
const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        YOUR_FIELD: {
          // shorthand
          merge: true,
        },
      },
    },
  },
});
This is the same as the following:
const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        YOUR_FIELD: {
          merge(existing, incoming, { mergeObjects }) {
            return mergeObjects(existing, incoming);
          },
        },
      },
    },
  },
});
Solved!
cache: new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        YOUR_FIELD: {
          merge(existing = [], incoming) {
            // What to return here depends on what you actually need to do;
            // in my case I had to save my incoming data as a single object in the cache.
            return { ...existing, ...incoming };
          }
        }
      }
    }
  }
})
The other answers still work, but as of Apollo Client >= 3.3 there's an easier option that doesn't require specifying specific fields or a custom merge function. Instead, you only have to specify the type and it will merge all fields for that type:
const cache = new InMemoryCache({
  typePolicies: {
    YOUR_TYPE_NAME: {
      merge: true,
    }
  }
});
From your example query, I'd guess that an id field should be available, though? Try requesting the ID in your query; that should solve the problem in a much more ideal way.
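Concretely, that would mean adding id to both of the question's queries so InMemoryCache can normalize the Movie objects instead of replacing them (a sketch; written as plain query strings):

```javascript
// Sketch: both queries from the question, now also selecting id so the
// cache can identify and merge the two partial Movie results.
const GET_MOVIE_INFO = `
  query ($id: String!) {
    movie(id: $id) {
      id
      name
      description
    }
  }
`;

const GET_MOVIE_ACTORS = `
  query ($id: String!) {
    movie(id: $id) {
      id
      actors
    }
  }
`;
```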
I had the same issue, caused by an inconsistency between our data values and our schema: a value type within an entity was missing its id value, due to an incomplete data migration.
Temporary solution:
const typePolicies = {
  PROBLEM_TYPE: {
    keyFields: false as false,
  },
  PARENT_TYPE: {
    fields: {
      PROBLEM_FIELD: {
        merge: true
      }
    }
  }
};

How to get nested objects using Firestore REST API?

I'm sending REST API requests using the axios package.
I can get a single document (for example, cities/cityId):
axios.get(`https://firestore.googleapis.com/v1/projects/<PROJECTIDHERE>/databases/(default)/documents/<COLLECTIONNAME>/<DOCID>`)
What I can't do is get a nested document (for example, cities/cityId/streetId).
The database structure is very simple.
cities: {              // COLLECTION
  cityId: {            // DOCUMENT
    streetId: {        // MAP
      buildingId: '...', // STRING
      ...
    },
    ...
  },
}
This article, Google Firestore REST API examples, suggests that it's possible to get nested objects using structured queries. Unfortunately, I've been trying to do it without any success.
Here's my non-working code:
getQuery ({ path, token }) {
  const url = 'https://firestore.googleapis.com/v1/projects/big-boobs/databases/(default)/documents:runQuery'
  const params = {
    from: [ { collectionId: 'cities' } ],
    select: {
      fields: [
        { fieldPath: 'cityId1.streetId' }
      ]
    }
  }
  const options = {
    method: 'get',
    url,
    params,
    headers: {},
    paramsSerializer: function (params) {
      return Qs.stringify(params, { arrayFormat: 'brackets' })
    }
  }
  if (token) options.headers['Authorization'] = `Bearer ${token}`
  return axios(options)
}
I'm getting an error:
{
  "error": {
    "code": 400,
    "message": "Invalid JSON payload received. Unknown name \"from[][collectionId]\": Cannot bind query parameter. Field 'from[][collectionId]' could not be found in request message.\nInvalid JSON payload received. Unknown name \"select[fields][][fieldPath]\": Cannot bind query parameter. Field 'select[fields][][fieldPath]' could not be found in request message."
  }
}
You can select individual fields from a document, but you can't select individual properties from a map field. I think you'll have to settle for getting the entire map called streetId.
If it's unacceptable to pull down the entire map, you can reorganize your data such that each streetId exists in its own document.
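If you do fetch the whole document, remember that the REST API wraps every value in a type marker (stringValue, mapValue, and so on), so the map has to be unwrapped client-side. A sketch, assuming the question's structure where the map contains only strings (the helper name and sample data are mine):

```javascript
// Sketch: unwrap one map field from a Firestore REST document response.
// Firestore wraps every value in a type marker (stringValue, mapValue, ...).
function getStreet(doc, streetId) {
  const street = doc.fields && doc.fields[streetId];
  if (!street || !street.mapValue) return undefined;
  const out = {};
  for (const [key, value] of Object.entries(street.mapValue.fields || {})) {
    // Only strings appear in the question's structure; other value types
    // would need their own unwrapping.
    out[key] = value.stringValue;
  }
  return out;
}

// Usage with the question's structure (sample data is illustrative):
const doc = {
  fields: {
    streetId: {
      mapValue: {
        fields: { buildingId: { stringValue: '42a' } }
      }
    }
  }
};
console.log(getStreet(doc, 'streetId')); // { buildingId: '42a' }
```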

How to validate the request body if the body is a single json array?

I am trying to validate the body of the request using express-validator. The whole body is a single array, so I don't have a field name.
I am using the new API of express-validator and Express version 4.
The body looks like this:
["item1","item2"]
My code:
app.post('/mars/:Id/Id', [
  check('id')
    .isLength({ max: 10 })
    .body() // tried many ways to get the body; most examples I found were for the old API
    .custom((item) => Array.isArray(item))
],
(req, res, next) => {
  const data: string = matchedData(req); // using this method to only pass validated data to the business layer
  return controller.mars(data); // id goes in data.id; I expect there should be a data.body once the body is validated too
});
How do i validate the body?
I did that following the instructions from the docs; here is the code.
Just declare the custom validator when registering expressValidator in your code:
app.use(expressValidator({
  customValidators: {
    isArray: function(value) {
      return Array.isArray(value);
    }
  }
}));
After that you can check validity like this:
req.checkBody('title', 'title is required').notEmpty();
req.checkBody('media', 'media must be an array').isArray();
I was using version 3.2.0 in my project to achieve this behavior.
Here is an example of my request body:
{
  title: 'foo',
  media: [1, 2, 3]
}
Additionally, if you do not want to change your request, I once did a validation like this:
exports.validateAddArrayItem = function(req, res, next) {
  if (req.body.constructor === Array) {
    req.body[0].employee_fk = tk.employee_id;
  }
  req.assert('item', 'The body from request must be an array').isArray();
  var errors = req.validationErrors();
  if (errors) {
    var response = { errors: [] };
    errors.forEach(function(err) {
      response.errors.push(err.msg);
    });
    return res.status(400).json(response);
  }
  return next();
};
Here is an example of that request body:
[{
  employeefk: 1,
  item: 4
}]
If you are using Ajax, try putting your array inside an object, like the following:
$.ajax({
  type: "POST",
  url: url,
  data: { arr: ["item1", "item2"] },
  success: function (data) {
    // process data here
  }
});
Now you can use the arr identifier to apply validation rules:
const { check, body, validationResult } = require('express-validator/check');
...
app.post('/mars/:Id/Id', [
  check('chatId').isLength({ max: 10 }),
  body('arr').custom((item) => Array.isArray(item))
], (req, res, next) => {
  const data: string = matchedData(req);
  return controller.mars(data);
});
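If you would rather keep the bare-array body, another option is a small hand-rolled middleware that guards the body before express-validator runs (a sketch; the middleware name is mine):

```javascript
// Sketch: reject any request whose body is not an array, before the
// express-validator chain runs. The name requireArrayBody is illustrative.
function requireArrayBody(req, res, next) {
  if (!Array.isArray(req.body)) {
    return res.status(400).json({ errors: ['Request body must be an array'] });
  }
  next();
}

// app.post('/mars/:Id/Id', requireArrayBody, [ check('id').isLength({ max: 10 }) ], handler);
```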

Sails.js not applying model scheme when using MongoDB

I'm going through the (excellent) Sails.js book, which discusses creating a User model User.js in Chapter 6 like so:
module.exports = {
  connection: "needaword_postgresql",
  migrate: 'drop',
  attributes: {
    email: {
      type: 'string',
      email: "true",
      unique: 'string'
    },
    username: {
      type: 'string',
      unique: 'string'
    },
    encryptedPassword: {
      type: 'string'
    },
    gravatarURL: {
      type: 'string'
    },
    deleted: {
      type: 'boolean'
    },
    admin: {
      type: 'boolean'
    },
    banned: {
      type: 'boolean'
    }
  },
  toJSON: function() {
    var modelAttributes = this.toObject();
    delete modelAttributes.password;
    delete modelAttributes.confirmation;
    delete modelAttributes.encryptedPassword;
    return modelAttributes;
  }
};
Using Postgres, a new record correctly populates the boolean fields not submitted by the login form as null, as the book suggests should be the case:
But I want to use MongoDB instead of PostgreSQL. I had no problem switching the adapter. But now, when I create a new record, it appears to ignore the schema in User.js and just puts the literal POST data into the DB:
I understand that MongoDB is NoSQL and can take any parameters, but I was under the impression that using a schema in User.js would apply to a POST request to the /user endpoint (via the blueprint routes for now) regardless of what database was sitting at the bottom. Do I need to somehow explicitly tie the model to the endpoint for NoSQL databases?
(I've checked the records that are created in Postgres and MongoDB, and they match the responses from localhost:1337/user posted above)
I understand that MongoDB is NoSQL
Good! In Sails, the sails-mongo Waterline module is responsible for everything regarding MongoDB. I think I found the relevant code: https://github.com/balderdashy/sails-mongo/blob/master/lib/document.js#L95 So sails-mongo simply does not care about nonexistent values. If you think this is bad, feel free to create an issue on the GitHub page.
A possible workaround might be using defaultsTo:
banned: {
  type: "boolean",
  defaultsTo: false
}
You can configure your model to strictly use the schema with this flag:
module.exports = {
  schema: true,
  attributes: {
    ...
  }
}
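Combining the two suggestions, the question's User model might end up looking like this (a sketch: schema: true enforces the attribute list on schemaless adapters like sails-mongo, and defaultsTo fills in the booleans the form doesn't submit):

```javascript
// Sketch: User model with schema: true plus defaultsTo for the booleans
// (attribute names taken from the question).
const User = {
  schema: true,
  attributes: {
    email: { type: 'string', email: true, unique: true },
    username: { type: 'string', unique: true },
    encryptedPassword: { type: 'string' },
    gravatarURL: { type: 'string' },
    deleted: { type: 'boolean', defaultsTo: false },
    admin: { type: 'boolean', defaultsTo: false },
    banned: { type: 'boolean', defaultsTo: false }
  }
};

// In api/models/User.js this object would be the module.exports.
```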
I eventually settled on performing the validations inside my controller.
// a signup form
create: async (req, res) => {
  const { name, email, password } = req.body;
  try {
    const userExists = await sails.models.user.findOne({ email });
    if (userExists) {
      throw 'That email address is already in use.';
    }
    // ...create the user here
  } catch (err) {
    return res.badRequest(err);
  }
}

Best way to validate request params sails.js / node.js

Currently I'm implementing an application using Sails.js. Every time a REST request hits my Sails controller, I'm manually checking validations (with if/else checks), like whether a variable exists or whether it's a valid data type.
Is there any standard way (e.g. custom middleware that validates each request against predefined JSON objects?) to validate parameters, as this is overloading my controller logic?
How is it handled in other programming languages / frameworks in production use?
Thanks in advance.
Prasad.CH
Sails.js is built on top of Express, so it seems you're looking for something like express-validator, which is an Express middleware. You can also use it along with Sails policies, as waza007 suggested.
The best solution for me is https://github.com/epoberezkin/ajv, a JSON Schema validator.
Short example:
var Ajv = require('ajv');
var ajv = Ajv();

var schema = {
  "type": "object",
  "properties": {
    "foo": { "type": "number" },
    "bar": { "type": "string" }
  },
  "required": [ "foo", "bar" ]
};

var data = { "foo": 1 };
var valid = ajv.validate(schema, data);
if (!valid) {
  return res.status(400).send({ message: ajv.errorsText(), errors: ajv.errors });
}
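To keep that check out of every controller, the compiled validator can be wrapped in a reusable middleware (or a Sails policy). A sketch; the factory name is mine, and validate stands for any function in Ajv's style that returns true/false and exposes its errors:

```javascript
// Sketch: wrap an Ajv-style compiled validator (a function returning
// true/false with an `errors` property) in an Express/Sails middleware.
function validateBody(validate) {
  return function (req, res, next) {
    if (!validate(req.body)) {
      // Reject with the validator's error report.
      return res.status(400).json({ errors: validate.errors });
    }
    next();
  };
}

// Usage with Ajv, as in the example above:
// const check = ajv.compile(schema);
// app.post('/todos', validateBody(check), handler);
```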
express-validator is one option, but I prefer to do most of my validation directly in my ODM (Mongoose). For example:
var mongoose = require('mongoose');
var validate = require('mongoose-validator');

var nameValidator = [
  validate({
    validator: 'isLength',
    arguments: [3, 50],
    message: 'Name should be between {ARGS[0]} and {ARGS[1]} characters',
    type: 'isLength'
  }),
  validate({
    validator: 'isAscii',
    passIfEmpty: true,
    message: 'Name should contain Ascii characters only',
    type: 'isAscii'
  })
];

var todoSchema = mongoose.Schema({
  name: {
    type: String,
    required: true,
    validate: nameValidator
  },
  completed: { type: Boolean, required: true }
});
Then you can handle the error object in such a way:
router.post('/newTodo', handlers.newTodo, validationMonad(handlers.getTodos));
If Mongoose returns a validation error when creating the new document in handlers.newTodo, it is passed to the validation handler, which in turn parses it, appends the parsed error to the response, and calls the getTodos handler, which will show the errors in res.errors.
exports.validationMonad = function (reqHandler) {
  return function (err, req, res, next) {
    if (!res.errors) res.errors = [];
    switch (err.name) {
      case 'ValidationError':
        for (var field in err.errors) {
          switch (err.errors[field].kind) {
            case 'required':
              res.errors.push({
                name: err.errors[field].name,
                message: "The field '" + err.errors[field].path + "' cannot be empty.",
                kind: err.errors[field].kind,
                path: err.errors[field].path
              });
              break;
            default:
              res.errors.push({
                name: err.errors[field].name,
                message: err.errors[field].message,
                kind: err.errors[field].kind,
                path: err.errors[field].path
              });
          }
        }
        if (reqHandler) {
          reqHandler(req, res, next);
        } else {
          // JSON
          res.json(res.errors[0]);
        }
        break;
      case 'CastError':
        // Supplied ID is not an ObjectID
        res.errors.push({
          name: err.name,
          message: err.message,
          kind: err.kind,
          path: err.path
        });
        if (reqHandler) {
          reqHandler(req, res, next);
        } else {
          // JSON
          res.json(res.errors);
        }
        break;
      default:
        return next(err);
    }
  };
};
