JSON schema parser in JavaScript

My end goal is to read any JSON schema and represent it as a tree (HTML).
For this I need a method to parse a JSON schema (right?). I went through the implementations on this page and this editor, which outputs an HTML form from a JSON schema.
What I am asking is whether there is any good open-source solution I can use, or is my approach wrong?
Is there a way to get a list of properties along with their attributes?

You can use Ajv with custom keywords to create a JSON data processor/parser (JSON Schema will be used as data in your case).
You will need to define a schema with custom keywords that will process your schema and generate/collect any side effects you need in the validation context (pass this context to the validation function with call/apply and use the passContext option so it is passed on to subschemas and custom keywords).
This approach is used in the JSONScript evaluation schema to evaluate scripts (but instead of a script you would pass your schema as the data).
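A minimal sketch of the idea, assuming an Ajv v6-style addKeyword API; the keyword name collectProperties and the context shape are illustrative, not part of Ajv:

import Ajv from "ajv";

interface Collector {
  found: Array<{ name: string; attrs: unknown }>;
}

const ajv = new Ajv({ passContext: true });

// Custom keyword with a side effect: record every property map it is applied to.
ajv.addKeyword("collectProperties", {
  validate: function (this: Collector, _keyword: unknown, data: Record<string, unknown>) {
    for (const [name, attrs] of Object.entries(data || {})) {
      this.found.push({ name, attrs }); // side effect into the passed context
    }
    return true; // always "valid"; we only want the side effect
  },
});

// A schema-for-schemas: apply the keyword to the top-level "properties" map.
// A real walker would also recurse into nested subschemas.
const validate = ajv.compile({
  type: "object",
  properties: { properties: { collectProperties: true } },
});

const context: Collector = { found: [] };
// The JSON Schema you want to inspect is passed as *data*:
validate.call(context, {
  type: "object",
  properties: { name: { type: "string" }, age: { type: "integer" } },
});
console.log(context.found);
// => [ { name: 'name', attrs: { type: 'string' } },
//      { name: 'age',  attrs: { type: 'integer' } } ]

This also answers the last part of the question: context.found is exactly a list of properties along with their attributes.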

Related

Type for JSON field on Type-GraphQL

I'm defining a custom input field of type JSON in type-graphql. We're using Prisma as well. I tried with Prisma.JsonValue, Prisma.JsonObject and JSON, but I get these errors. Any suggestion is welcome.
You can't directly use Prisma-generated models as type definitions in TypeGraphQL. You will need to create custom classes as shown in the TypeGraphQL docs (with @ObjectType(), @InputType(), etc. decorators).
There is a third party library for generating TypeGraphQL types from Prisma called typegraphql-prisma which you could consider. However, in my opinion it's easier to define the classes manually.
Furthermore, I'm not sure what you mean by JsonValue. If you need to pass arbitrary JSON data, perhaps you could stringify your JSON object and pass it as a String?
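For illustration, a minimal sketch of the hand-written-classes approach, with the arbitrary JSON carried as a stringified String field; the class and field names are made up for the example:

import "reflect-metadata";
import { ObjectType, InputType, Field } from "type-graphql";

// Hypothetical type: arbitrary JSON travels as a plain string field,
// JSON.stringify()-ed by the sender and JSON.parse()-d by the receiver.
@ObjectType()
class Report {
  @Field()
  id!: string;

  @Field()
  payload!: string; // stringified JSON
}

@InputType()
class ReportInput {
  @Field()
  payload!: string; // stringified JSON
}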

Eloquent model attributes as camel case [Laravel 5.2] [Dingo API]

Our Eloquent models have attributes following the Laravel snake case convention.
e.g. first_name, last_name and created_at
However, my frontend (React) follows the JavaScript camelCase convention.
e.g. firstName, lastName and createdAt
Is there a simple way to convert ALL attributes to camel case when sending an API response?
We are using Laravel 5.2 and the Dingo API package.
UPDATE
Following on from the accepted answer I used the Custom Response Format approach. See the following gist for the implementation (includes unit tests):
https://gist.github.com/andrewmclagan/c5e0fe601d23f3b859b89a9f8922be68
You really have a few options. I won't go into implementing them (unless needed), but here are a few I can think of:
In Laravel:
Override the toArray() method on the model. When a model is converted to JSON, it calls toArray(). You can use this method to go through and convert all the keys to camelCase. You would need to override this on every model, though that can be abstracted through a parent class or a trait.
In Dingo:
Use Transformers with the Response Builder. You could create a transformer for every model, or you could create one CamelTransformer and register it with every model. You would convert the keys to camelCase in the transformer.
Create a Custom Response Format. You can create a camelJson response format that extends the default json format. Override the necessary methods with the logic to camelCase the keys.
Use the ResponseWasMorphed event. You can create an event handler for the ResponseWasMorphed event, and go through and camelCase your keys in there.
Any one of these options should be able to do the job; it's just a matter of how global or granular you want these transformations to be. For example, modifying toArray() will affect all code that converts your model to an array (or JSON), not just your response code, so you may not want your change to be that global.
I would think that the Custom Response Format is probably the best combination of ease and appropriateness for your situation.
Don't do this. The server doesn't need to know anything about its clients. If your React application needs to handle properties in camel case, delegate that task to the client. The client should have one point in the system where all requests pass through; that is the correct place to transform the response, right after it is received.
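As a sketch of that client-side approach (fetch-based; the helper names are mine): one wrapper that every request passes through, converting snake_case keys to camelCase on the way in.

// Recursively convert snake_case keys to camelCase.
function camelize(value: unknown): unknown {
  if (Array.isArray(value)) return value.map(camelize);
  if (value && typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>).map(([key, v]) => [
        key.replace(/_([a-z])/g, (_, c: string) => c.toUpperCase()),
        camelize(v),
      ])
    );
  }
  return value;
}

// Single entry point for API calls, so the conversion happens in one place.
async function apiGet(url: string): Promise<unknown> {
  const response = await fetch(url);
  return camelize(await response.json());
}

// apiGet("/users/1") => { firstName: "...", lastName: "...", createdAt: "..." }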
I solved it as follows.
In your controller class:
return response()->json($model);
// => ['userName' => 'foo', 'userKey' => 'bar', ...]
Requirements
Laravel 5+
Install
$ composer require grohiro/laravel-camelcase-json:~1.0
# Laravel 5.7+
$ composer require grohiro/laravel-camelcase-json:~2.0
Add the service provider.
config/app.php
'providers' => [
// default providers
// ...
Grohiro\LaravelCamelCaseJson\CamelCaseJsonResponseServiceProvider::class,
],
Reference: https://github.com/grohiro/laravel-camelcase-json

How to read values from properties file in ExtJs

I want to read values from config.properties and use them in my ExtJS ComboBox. I do not want to read from a JSON file.
Where should I place the config.properties file? Should I place it directly in the web content? I do not want to hard-code the path, or at least I want to reduce it.
And how can I access the property values through JavaScript?
Thanks
This is not ExtJS-specific; it applies to JavaScript in general.
Your best bet would be to include it in a global file that gets loaded with your app and is namespaced appropriately, so as to minimise the chance of collisions with any other variables:
var MyApp = {
    config: {
        property1: 'something',
        property2: 'something2'
    }
};

// Then you can access this anywhere you like:
var myProperty1 = MyApp.config.property1; // assigns the string "something" to myProperty1
You can load the properties file into a data store; this way the key/value pairs in your properties file are mapped to a simple model with the fields key and value.
To let your proxy load data from the properties file and convert it into the corresponding model, you'll have to create a reader that reads the data from the properties file and converts it into the desired records.
This custom package for i18n will give you some idea of how to do that; a simplified sketch follows below.
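A sketch of a simplified variant (it side-steps a full custom reader by parsing the file in the request callback; the URL and names are illustrative):

declare const Ext: any; // ExtJS global

// A plain key/value store the ComboBox can bind to.
const configStore = Ext.create("Ext.data.Store", {
  fields: ["key", "value"],
});

Ext.Ajax.request({
  url: "config.properties", // served from your web content root
  success: (response: { responseText: string }) => {
    const pairs = response.responseText
      .split("\n")
      .map((line: string) => line.trim())
      .filter((line: string) => line && line.charAt(0) !== "#") // skip blanks and comments
      .map((line: string) => {
        const i = line.indexOf("=");
        return { key: line.slice(0, i), value: line.slice(i + 1) };
      });
    configStore.loadData(pairs);
  },
});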

Passing a JSON object to a remote LESS compiler to use as LESS variables

I have an admin panel where users customize the look of a static website (mostly fonts and colors). This panel generates a JSON object with the user values. What I need to do is pass this JSON to the LESS compiler, so that it can dynamically generate a CSS file from a LESS one, using the JSON content as LESS variables. The filename should be different every time, something like file-ID.css (the ID is for the user, and it could be passed via the JSON too).
Is it technically possible (without extending LESS)? I noticed, for example, that you can pass functions to the parser object when you create it; could I use these functions to evaluate the JSON and pass the variables to the compiler?
Obviously I don't need to know the details, just if it is doable and possibly some link to related information if you have it.
Thanks in advance.
The best way I've found to do what I was trying to accomplish was to use a server-side LESS library like PHPLESS to parse the variables from the JSON before compiling. The regular LESS compiler doesn't allow you to dynamically inject variables.
To my knowledge the LESS compiler doesn't support any input other than LESS. It would be trivial to make your own pre-parser that mixes in the variables from the JSON; not even a parser, really, more of a string replacer.
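A sketch of that string-replacer idea: turn the JSON object into LESS variable declarations and prepend them to the source before handing it to the compiler (the function name is mine):

// Prepend "@name: value;" lines built from the JSON object.
function withJsonVars(lessSource: string, vars: Record<string, string>): string {
  const header = Object.entries(vars)
    .map(([name, value]) => `@${name}: ${value};`)
    .join("\n");
  return header + "\n" + lessSource;
}

// withJsonVars("body { color: @brandColor; }", { brandColor: "#336699" })
// => "@brandColor: #336699;\nbody { color: @brandColor; }"

Note that LESS variables are last-definition-wins, so if the source file already defines defaults for the same variable names, append the generated block instead of prepending it.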

Piping/streaming JavaScript objects in Node.js

I'm trying to wrap my head around Node.js streams. I'm pretty new to JavaScript and Node; the last languages I really got were Perl and PHP :-D
I've read the Buffer/Streams documentation at nodejs.org, watched James Halliday at LXJS, read his stream-handbook and Thorsten Lorenz's event-stream post. I'm starting to understand the basics :)
I process data which is serialized in RDF (which is neither JSON nor XML). I manage to fetch the data (in real code via request) and parse it into a JS object using the rdfstore module.
So far I do this:
fs.createReadStream('myRDFdata.ttl').pipe(serialize()).pipe(process.stdout);
where serialize() currently does the job of parsing and serializing at the same time. I use the through module to interface with the stream.
Now I have some more methods (not the real function declaration but I hope you get the point):
getRecipe(parsedRDF) -> takes the parsed RDF (as a JavaScript object) and tells me how to use it
createMeal(parsedRDF, recipe) -> takes the parsed RDF and the recipe from above and creates a new RDF object out of it
this new object needs to get serialized and sent to the browser
(In the real world getRecipe will have to do a user interaction in the browser)
I like the idea of chaining this together via pipes for more flexibility when I enhance the code later. But I don't want to serialize it to an RDF serialization every time; I'd rather just send around the JS object. From what I've read in the documentation, I could use the stringify module to get a string out of each step for piping it to the next step. But:
does this actually make sense, i.e. do I add unnecessary overhead, or is it negligible?
I don't see how I could give the parsedRDF to both methods, with the dependency that getRecipe has to be called first and its output is input for createMeal as well. Are there modules which help me with that?
It might be that I have to ask the user for the final recipe selection so I might need to send stuff to the browser there to get the final answer. Can I do something like this over sockets while the pipe is "waiting"?
I hope this shows what I'm trying to do, if not I will try to give more details/rephrase.
Update: after sleeping on it I figured out some more things:
It probably doesn't make sense to serialize a format like RDF into something non-standard when there are official serialization formats. So instead of using stringify I will simply pass an official RDF serialization between the steps.
This does imply that I parse/serialize the objects in each step, and this surely adds overhead. The question is: do I care? I could extend the RDF module I use to parse from a stream and serialize into one.
I can solve the dependency between getRecipe and createMeal by simply adding some information from getRecipe to parsedRDF; this can be done very easily with RDF without breaking the original data model. But I would still be interested to know whether I could handle dependencies like this with pipes.
Yes, it's okay to make a stream of JS objects;
you just have to remember to pipe it through something that serializes the stream again before writing it to IO.
I'd recommend writing a module called rdf-stream that parses and serializes RDF. You would use it like this:
var fs = require('fs');
var rdf = require('rdf-stream');

fs.createReadStream(file)  // get a text stream
  .pipe(rdf.parse())       // turn it into objects
  .pipe(transform)         // optional, do something with the objects
  .pipe(rdf.stringify())   // turn back into text
  .pipe(process.stdout);   // write to IO
And it could also be used by other people working with RDF in Node, awesome!
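For the optional transform step in that pipeline, here is a minimal object-mode sketch; getRecipe/createMeal stand in for the question's own functions and are assumed synchronous here:

import { Transform } from "stream";

declare function getRecipe(parsedRDF: unknown): unknown;
declare function createMeal(parsedRDF: unknown, recipe: unknown): unknown;

// Object-mode: the stream passes JS objects through instead of Buffers/strings.
const transform = new Transform({
  objectMode: true,
  transform(parsedRDF, _encoding, callback) {
    const recipe = getRecipe(parsedRDF);           // first dependency...
    callback(null, createMeal(parsedRDF, recipe)); // ...then emit the new object
  },
});

If getRecipe has to wait for a user interaction, the callback style still works: call callback(null, result) whenever the answer arrives, and the pipe stays paused until then.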
