I'm working on creating a web form that can dynamically read a Swagger endpoint to create form fields. Specifically, right now I am trying to read the schemas from the components section defined by OpenAPI 3.
Example JSON:
{
"openapi": "3.0.1",
"info": {
.......
},
"paths": {
........
},
"components": {
"schemas": {
"FakeAppConfiguration": {
"type": "object",
"properties": {
"setting1": {
"type": "string",
"nullable": true
},
"setting2": {
"type": "string",
"nullable": true
}
},
"additionalProperties": false
},
"OtherFakeAppConfiguration": {
........
},
"ThirdFakeAppConfiguration": {
........
}
}
}
}
}
Using this snippet of JSON as an example, I can easily get the names of the schemas that are defined (the JSON has already been loaded into data using fetch):
for (let schema in data.components.schemas)
{
//this will print out FakeAppConfiguration, OtherFakeAppConfiguration, ThirdFakeAppConfiguration
console.log(schema);
}
My problem now comes in trying to access each of these schema trees without referencing them directly. I could easily write data.components.schemas.FakeAppConfiguration, but that would defeat the purpose of making this dynamic. I've been trying to use the strings obtained in the loop above to access what I want, to no avail. Some examples of things I've tried are below. Can anyone help me get at these schemas without hard-coding the name in dot notation? I have also considered manually parsing the JSON, but I'm trying to avoid that. This is a React app, so if anyone can think of a library that could help, I'm all ears there as well.
//treating like a key
data.components.schemas['FakeAppConfiguration']
//trying to create a map
interface SchemaDef {
type: string,
properties: Properties,
//....etc,etc
}
let i = 0;
let schemas: Map<string, SchemaDef> = new Map<string, SchemaDef>();
for (let schema in data.components.schemas)
{
schemas.set(schema, data.components.schemas[i]);
i++;
}
You could iterate over the Object.entries() of your "schemas" object.
let schemas = {
"FakeAppConfiguration": {
"type": "object",
"properties": {
"setting1": {
"type": "string",
"nullable": true
},
"setting2": {
"type": "string",
"nullable": true
}
},
},
"FakeAppConfiguration2": {
"type": "object",
"properties": {
"setting1": {
"type": "string",
"nullable": true
},
"setting2": {
"type": "string",
"nullable": true
}
},
}
};
for (let [key, value] of Object.entries(schemas)) {
console.log(key, "\n\n", value);
}
I was looking around the docs and couldn't find any direct or indirect solution.
Is there any way to get validation on JSON objects without knowing exactly where the specific object is located?
For example, I want to validate the following sub-object:
{
"grandParent": {
"parent": {
"child": {
"name": "John"
}
}
}
}
The object can be part of a larger JSON file that can be structured as follows:
{
"root": {
"someKey": {
"grandParent": ...
},
"grandParent": ...,
...<go in even deeper>: {
"grandParent": ...
}
}
}
Can I create a json schema that validates the object no matter where it is?
Similar example in glob would be: root.**.grandParent.parent.child
You'll need to use a combination of additionalProperties, items, and recursive references.
First, we define the structure you want to validate. You have to define properties for each layer of the object.
Next, you want your root level to reference that definition. Because you're using a draft earlier than 2019-09, you'll need to wrap that reference in an allOf.
Then you want to make sure that for objects, the values have the root schema applied, and for arrays, each item has the root schema applied.
The use of "$ref": "#" resolves to the root of the schema, which creates the cyclical reference.
Some implementations may not like this, but most should be able to handle it.
Here's a live demo of the below schema: https://jsonschema.dev/s/lBrZk
{
"$schema": "http://json-schema.org/draft-07/schema",
"definitions": {
"grandParentToChild": {
"properties": {
"grandParent": {
"properties": {
"parent": {
"properties": {
"child": {
"properties": {
"name": {
"type": "string"
}
}
}
}
}
}
}
}
}
},
"allOf": [
{
"$ref": "#/definitions/grandParentToChild"
}
],
"additionalProperties": {
"$ref": "#"
},
"items": {
"$ref": "#"
}
}
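If it helps, here's a rough sketch of exercising that schema with Ajv (assuming Ajv 6, which supports draft-07 and handles the cyclic "$ref": "#"; the schema above is assumed to be saved as recursive-schema.json):
import Ajv from "ajv";
import * as fs from "fs";

// Load the draft-07 schema shown above (assumed file name)
const recursiveSchema = JSON.parse(fs.readFileSync("./recursive-schema.json", "utf8"));

const ajv = new Ajv();
const validate = ajv.compile(recursiveSchema);

// Valid at the top level...
console.log(validate({ grandParent: { parent: { child: { name: "John" } } } })); // true

// ...and arbitrarily deep, because additionalProperties/items re-apply "#"
console.log(validate({
  root: { someKey: { grandParent: { parent: { child: { name: "John" } } } } }
})); // true

// A wrong type for "name" fails wherever the structure appears
console.log(validate({ grandParent: { parent: { child: { name: 42 } } } })); // false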
I am using the AJV package in my node.js project.
I am trying to validate some data against a couple of schema files. Both of these schema files are in the same directory:
/dir
|
parent_schema.json
|
sub_schema.json
/data
|
data.json
I am trying to get a super simple example of the $ref property working but I am having trouble. parent_schema.json looks like:
{
"properties": {
"foo": { "type": "string" },
"bar": { "$ref": "sub_schema.json" }
}
}
And sub_schema.json looks like:
{
"properties": {
"sub1": { "type": "string" },
}
}
And I am trying to validate my data.json which for the sake of completeness looks like:
{
"foo": "whatever",
"bar": {
"sub1": "sometext"
}
}
The issue I'm having is with my $ref path. I am getting this error from AJV:
MissingRefError {
message: "can't resolve reference subschema1.json from id #"
missingRef: "subschema1.json"
missingSchema: "subschema1.json"
}
Anyone see what's wrong with my path? I know you are also supposed to use the # to select what specific property you want matched against, but I want the ENTIRE schema to be used.
It's a common misconception that $ref somehow "loads" a file.
See what ajv.js.org says:
$ref is resolved as the uri-reference using schema $id as the base URI (see the example).
And:
You don’t have to host your schema files at the URIs that you use as schema $id. These URIs are only used to identify the schemas, and according to JSON Schema specification validators should not expect to be able to download the schemas from these URIs.
Ajv won't try loading this schema from stack://over.flow/string for example:
{
"$id": "stack://over.flow/string",
"type": "string"
}
If you want to reference that schema in another schema, they both need to have the same base URI stack://over.flow/ e.g.,
{
"$id": "stack://over.flow/object",
"type": "object",
"properties": {
"a": { "$ref": "string#" }
}
}
Here { "$ref": "string#" } says "import the schema at stack://over.flow/string" so you end up with:
{
"$id": "stack://over.flow/object",
"type": "object",
"properties": {
"a": {
"$id": "stack://over.flow/string",
"type": "string"
}
}
}
This allows you to combine small schemas:
const ajv = new Ajv;
ajv.addSchema({
"$id": "stack://over.flow/string",
"type": "string"
});
ajv.addSchema({
"$id": "stack://over.flow/number",
"type": "number"
});
const is_string = ajv.getSchema("stack://over.flow/string");
const is_number = ajv.getSchema("stack://over.flow/number");
console.log(is_string('aaa'), is_string(42));
console.log(is_number('aaa'), is_number(42));
const is_ab = ajv.compile({
"$id": "stack://over.flow/object",
"type": "object",
"properties": {
"a": { "$ref": "string#" },
"b": { "$ref": "number#" }
}
});
console.log(is_ab({a: "aaa", b: 42}));
console.log(is_ab({a: 42, b: "aaa"}));
<script src="https://cdnjs.cloudflare.com/ajax/libs/ajv/6.12.2/ajv.min.js"></script>
(Please note that in your example both schemas are incorrect. You're missing {"type": "object"} in both.)
To answer your question:
const ajv = new Ajv;
ajv.addSchema({
"$id": "stack://over.flow/parent.schema",
"type": "object",
"properties": {
"foo": { "type": "string" },
"bar": { "$ref": "child.schema#" }
}
});
ajv.addSchema({
"$id": "stack://over.flow/child.schema",
"type": "object",
"properties": {
"sub1": { "type": "string" },
}
});
const is_parent = ajv.getSchema("stack://over.flow/parent.schema");
const is_child = ajv.getSchema("stack://over.flow/child.schema");
console.log(is_parent({
"foo": "whatever",
"bar": {
"sub1": "sometext"
}
}));
<script src="https://cdnjs.cloudflare.com/ajax/libs/ajv/6.12.2/ajv.min.js"></script>
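Applied to your two files, a sketch might look like this (assuming Ajv 6 under Node.js; the key you pass to addSchema is used as the schema id because your files have no $id, so the relative $ref "sub_schema.json" resolves to it; adjust the paths to your layout):
import Ajv from "ajv";
import * as fs from "fs";

const ajv = new Ajv();

// Register the sub schema under the same name the parent's $ref uses.
const subSchema = JSON.parse(fs.readFileSync("./dir/sub_schema.json", "utf8"));
ajv.addSchema(subSchema, "sub_schema.json");

const parentSchema = JSON.parse(fs.readFileSync("./dir/parent_schema.json", "utf8"));
const validateParent = ajv.compile(parentSchema);

const data = JSON.parse(fs.readFileSync("./data/data.json", "utf8"));
console.log(validateParent(data), validateParent.errors);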
I'm trying to find a way to alter schema validation to find most appropriate schema for a given object. Let's say we have a schema:
{
"oneOf": [
{
"$ref": "#/definitions/a"
},
{
"$ref": "#/definitions/b"
}
],
"definitions": {
"a": {
"type": "object",
"properties": {
"prop1": {
"enum": ["x"]
}
},
"required": ["prop1"]
},
"b": {
"type": "object",
"properties": {
"prop1": {
"enum": ["y"]
},
"prop2": {
"type": "string"
}
},
"required": ["prop1", "prop2"]
}
}
}
Now, if I have an object { "prop1": "y" }, I want it to be resolved as the #/definitions/b type, even if it's not really valid against that schema. That is, I want to use just the prop1 property for resolving.
I wonder if there is a way to do it using AJV custom keywords, without rebuilding the schema itself? In particular, if schema is not valid for an object, is it possible to use custom keywords to override it and make it valid?
If the objective is to only report errors from the correct schema, you can use either "switch" (with the v5 option; it is moved to ajv-keywords from version 5.0.0) or "if"/"then"/"else" (recommended, as it is likely to be added in JSON Schema draft-07):
{
"id": "schema",
"if": { "properties": { "prop1": { "const": "x" } } },
"then": { "$ref": "#/definitions/a" },
"else": { "$ref": "#/definitions/b" }
}
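A rough usage sketch (assuming Ajv 6+, which supports if/then/else natively; with Ajv 6 the "id" keyword above should be "$id", and the "definitions" from your original schema need to be included, here assumed saved as if-then-else-schema.json):
import Ajv from "ajv";
import * as fs from "fs";

// The schema above with your original "definitions" merged in (assumed file name)
const schema = JSON.parse(fs.readFileSync("./if-then-else-schema.json", "utf8"));

const ajv = new Ajv();
const validate = ajv.compile(schema);

validate({ prop1: "y" });
// prop1 is not "x", so only #/definitions/b is applied: the reported error is the
// missing required property "prop2", not a generic oneOf mismatch.
console.log(validate.errors);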
If you need to know which schema was used for validation you can use a custom keyword to make a note of it:
{
"id": "schema",
"if": { "properties": { "prop1": { "const": "x" } } },
"then": {
"allOf": [
{ "schemaUsed": "schema#/definitions/a" },
{ "$ref": "#/definitions/a" }
]
},
"else": {
"allOf": [
{ "schemaUsed": "schema#/definitions/b" },
{ "$ref": "#/definitions/b" }
]
}
}
The keyword should be defined to store the schema ID in some variable during validation, so that after validation you can see which one was used.
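A minimal sketch of such a keyword, in the Ajv 6 style (ajv.addKeyword(name, definition); Ajv 8 takes a single definition object instead):
import Ajv from "ajv";

const ajv = new Ajv();

let schemaUsed: string | undefined; // set as a side effect of validation

// Register the keyword before compiling the schema that uses it.
ajv.addKeyword("schemaUsed", {
  validate: function (keywordValue: string) {
    schemaUsed = keywordValue; // e.g. "schema#/definitions/b"
    return true;               // the keyword itself never fails validation
  },
  errors: false
});

// After compiling the schema above and calling validate(data),
// schemaUsed holds the id of the branch that was applied.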
If you need to get the actual schema you can do:
var validate = ajv.getSchema('schema#/definitions/a'); // validating function
var schema = validate.schema; // schema as JSON
I'm trying to find one object in a document's array and update its fields.
db.rescuemodels.findAndModify({
query: {
"features": {
$elemMatch: {
"properties.title": "W"
}
}
},
update: {
$set: {
"features": {
"properties": {
"title": "XXX"
}
}
}
}
})
The query is fine and the result is one matching element, but how do I make the update change just one field (in this example, title)? Right now it creates a new object and wipes out the old array.
MongoDB has "Dot Notation" for this purpose, as well as the positional $ operator for referencing matched elements of an array:
db.rescuemodels.findAndModify({
"query": { "features.properties.title":"W" },
"update": { "$set": { "features.$.properties.title":"XXX" } }
})
Note that this only works when there is a single array present as in:
{
"features": [
{ "properties": { "name": "A" } },
{ "properties": { "name": "W" } }
]
}
If you are nesting arrays then MongoDB cannot match with the positional operator beyond the "outer" array:
{
"features": [
{ "properties": [{ "name": "A" }, { "name": "W" }] },
]
}
Positional matching will not work there because you cannot do features.$.properties.$.name, and the matched element index would be 0 and not 1, as $ refers to the outer array.
Also note that under Node.js the MongoDB driver syntax for .findAndModify() is quite different from the shell syntax. The "query" and "update" parts are separate arguments there rather than the single document form used by the shell.
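For reference, a sketch of the same update with the Node.js driver, using findOneAndUpdate (which current drivers recommend over findAndModify); the connection URI and database name are placeholders:
import { MongoClient } from "mongodb";

async function renameFeatureTitle(uri: string) {
  const client = await MongoClient.connect(uri);
  try {
    const result = await client.db("test").collection("rescuemodels").findOneAndUpdate(
      { "features.properties.title": "W" },               // query part
      { $set: { "features.$.properties.title": "XXX" } }, // update part, positional $
      { returnDocument: "after" }                         // driver 4.x+; older drivers use { returnOriginal: false }
    );
    console.log(result);
  } finally {
    await client.close();
  }
}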
To update an individual element in the array "features" you can use the positional operator, $. Your query would look something like this...
db.rescuemodels.findAndModify({
query: {
"features": {
$elemMatch: {
"properties.title": "W"
}
}
},
update: {
$set: {
"features.$.properties.title": "XXX"
}
}
})
I have a folder 'schemas' which contains different JSON files to store different schemas.
For example,
/schemas/apple-schema.json
{
"$schema": "http://json-schema.org/draft-06/schema",
"type": "object",
"properties": {
"apple_name": {
"type": "string"
},
"id": {
"type": "integer"
},
"apple_weight": {
"type": "number"
},
"timestamp": {
"type": "string",
"format": "date-time"
}
},
"required": ["id"]
}
/schemas/mango-schema.json
{
"$schema": "http://json-schema.org/draft-06/schema",
"type": "object",
"properties": {
"mango_name": {
"type": "string"
},
"id": {
"type": "integer"
},
"mango_mature": {
"type": "number"
},
"mango_age": {
"type": "number"
},
"mango_timestamp": {
"type": "string",
"format": "date-time"
}
},
"required": ["id"]
}
Different schemas have different keys. What I want to validate is below:
Keys (e.g. apple_name, id, timestamp, mango_name, mango_mature, mango_age, etc.) across all schemas follow the same naming convention (lowercase with underscores: 'xxx' or 'xxx_yyy').
Any key whose name contains 'timestamp' should be in format 'date-time'
Every schema should contain the key 'id'. (Key 'id' is required for all schemas.)
Is it possible to write a unit test which imports all JSON schemas and handles these validations?
You need a JSON Schema validator like Ajv to do the schema validation at runtime, rather than relying on TypeScript's features.
You may want to write some type definitions for these fruit schemas to help you code with type safety.
TypeScript provides static type checking, which is stripped away at compile time and does not exist at runtime, so you can't check types at runtime.
According to your requirements:
Validating the naming convention
You can do this with a regular expression (see the test sketch at the end of this answer).
Every schema should contain the key 'id'
Here is something TypeScript can offer you:
interface Schema {
id: number;
}
interface AppleSchema extends Schema {
apple_name: string,
apple_weight: number,
// rest properties...
}
interface MangoSchema extends Schema {
mango_name: string,
mango_mature: number,
// rest properties...
}
// enjoy the power of TypeScript
export function testApple(apple: AppleSchema) {
console.log(apple.id); // now you can access apple.id, apple.apple_name, apple.apple_weight ...
}
// even more
export function findFruit<T extends Schema>(fruits: T[], id: number) {
return fruits.find(fruit => fruit.id === id)
}