TypeScript and REST API response - javascript

Let me preface this question by saying that I am fairly new to TypeScript, but a long-time JavaScript developer.
I have an existing JavaScript application that I am looking to transition to TypeScript. The JavaScript app makes a REST API call to fetch data, checks whether the data exists, and then conditionally renders the app. In JavaScript, I can dynamically check whether properties exist on the response object and render accordingly. In TypeScript, this throws an error because the data is of type any, and the compiler doesn't know whether the property exists or not.
In a TypeScript application, is it common to create types for all of your API responses so that you get type safety on them? Using Node as your backend, I could see a huge opportunity to share models between the backend and frontend. I am currently using .NET Core for my backend, and I am concerned I might be shooting myself in the foot by always having to create TypeScript models from the Entity Framework models.
Does anyone else use .NET Core for the backend and React on the frontend? How do you handle API responses in TypeScript?

I can't tell you if it's common, but we write json-schema files for every endpoint. These schema files are:
Used to validate request bodies and generate errors.
Used to generate documentation.
Converted into TypeScript types, which are used in both the front- and backend.
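To make that pipeline concrete, here is a hypothetical sketch: the schema shape, endpoint, and names below are invented for illustration, the type is what a generator such as json-schema-to-typescript would emit, and the hand-rolled validator stands in for a real schema validator like Ajv.

```typescript
// A hypothetical endpoint schema (illustrative, not from the original post).
const userResponseSchema = {
  type: "object",
  required: ["id", "email"],
  properties: {
    id: { type: "number" },
    email: { type: "string" },
  },
} as const;

// The TypeScript type a generator would emit for that schema,
// shared by both the backend and the frontend:
interface UserResponse {
  id: number;
  email: string;
}

// A minimal hand-rolled validator standing in for a real JSON-schema
// validator; in practice the schema file above drives this check.
function isUserResponse(data: unknown): data is UserResponse {
  const obj = data as Record<string, unknown>;
  return (
    typeof data === "object" && data !== null &&
    typeof obj.id === "number" &&
    typeof obj.email === "string"
  );
}

console.log(isUserResponse({ id: 1, email: "a@b.c" })); // true
console.log(isUserResponse({ id: "1" })); // false
```

The payoff is that the schema file is the single source of truth: validation, documentation, and the shared static type all derive from it.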

We are using the same stack as you--.NET Core backend/React frontend--with TypeScript. The way we handle it is by creating types for the objects the backend sends us and then converting the dynamic checks, like the ones you mention, into user-defined type guards.
So, roughly speaking, for the various data transfer objects the server returns we have a type/type guard pair that looks like this:
// type
export type SomeDto = { someKey: string; someOtherKey: number }
// type guard
export const isSomeDto = (returnedObj: any): returnedObj is SomeDto =>
  typeof returnedObj?.someKey === "string" &&
  typeof returnedObj?.someOtherKey === "number"
Then we basically have the following in the fetching code:
const someReturn = await fetchDataFromApi(endpoint)
if (isSomeDto(someReturn)) {
  // now TypeScript will narrow someReturn to SomeDto
} else {
  // throw or otherwise handle the fact that you have bad data
  // from the server
}
This way you get the dynamic checking in JavaScript at runtime and type safety at compile time. These guards aren't super-fun to write (and you'll want to unit test them all), and I imagine if the number of possible objects were bigger we'd opt for a more automated solution like @Evert mentions in the other answer. But for a few objects, with existing JavaScript validators, it is a pretty convenient way to get typing on your API returns.
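Those guard unit tests can be as small as feeding the guard one valid and a couple of invalid payloads. A minimal sketch, reusing the SomeDto/isSomeDto pair from above:

```typescript
// Sketch: unit-testing a user-defined type guard with valid and invalid payloads.
type SomeDto = { someKey: string; someOtherKey: number };

const isSomeDto = (obj: any): obj is SomeDto =>
  typeof obj?.someKey === "string" && typeof obj?.someOtherKey === "number";

// A valid payload passes; payloads with missing or mistyped keys are rejected.
console.log(isSomeDto({ someKey: "a", someOtherKey: 1 })); // true
console.log(isSomeDto({ someKey: "a" })); // false
console.log(isSomeDto(null)); // false
```

In a real suite these would be assertions in your test runner of choice; the important part is covering each key's absence and wrong type.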

Related

TS: How to get interface from a dynamically created object

I have a schema object that contains the typed property that starts empty.
const schema = {
  typed: {},
  // ...
}
schema.typed will be filled dynamically when the application starts, for example:
typed['name'] = 'Yung Silva'
typed['age'] = 22
in another moment
typed['facebook'] = 'fb.com/yungsilva'
typed['whatsapp'] = 81981355509
there is no pattern, really each time the application is started it will be a totally different and random structure.
I would like to derive an interface for this dynamically assembled object, for example:
type Fields = typeof schema.typed
Is this possible?
What bothers me is that at the moment of creating the object dynamically, I don't know what type to define for schema.typed.
This is not possible, since TypeScript "checks" your types at compile time.
"The goal of TypeScript is to help catch mistakes early (before running the code, at compile time) through a type system and to make JavaScript development more efficient." more
At runtime, the code that runs is normal (more or less) JavaScript code.
There are several libraries (e.g. typescript-is) that can help you check types at runtime, but the common use case doesn't need them.
TypeScript is about type checking ahead of runtime. Its purpose is to check the code's consistency with itself and the third-party libraries it uses, before running it. TypeScript won't be able to assign a type depending on the actual data, because the type system simply doesn't exist anymore at runtime. When you write that it "will be a totally different and random structure", that means you're using plain JavaScript and its dynamic nature, so if your data never has common parts, just don't type it at all (or type it as any).
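If the values at least share a primitive shape, an index signature gives partial safety. A sketch based on the question's examples (the string | number union is an assumption drawn from the sample values):

```typescript
// One way to type a dynamically filled bag of fields: a Record index signature.
// The value union (string | number) is an assumption based on the examples.
const schema: { typed: Record<string, string | number> } = { typed: {} };

// Keys can now be added freely at runtime, and values are checked
// against the declared union.
schema.typed["name"] = "Yung Silva";
schema.typed["age"] = 22;

// The static type of schema.typed is the Record itself, not the
// runtime contents -- that is as precise as compile-time typing can get here.
type Fields = typeof schema.typed; // Record<string, string | number>
```

This doesn't give per-key types (those don't exist until runtime), but it does reject values outside the union and removes the need for any.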

How to generate HTML with javascript on the server side (Node.js)?

I am trying to validate the user input from a registration form, basically one of those forms used when creating an account (signup). After some research, people recommend validating on both the client side and the server side to improve security.
Objective:
What I would like to achieve is when the user submits the form, therefore making a POST request to my server (the url can be /users/signup), the input will be validated with the use of express-validator. This way I can verify if the user specified a valid email, if the password and its confirmation match, etc, and if there are errors, I want to update the html page by adding a list of errors.
Note that I'd prefer to update only the necessary parts instead of the whole page, to avoid redundant rendering.
Now I know that I can use a template engine like Jade or Handlebars, which would result in code similar to this: res.render('signup', {errors: errors.array()}); where errors is a variable containing the validation result, and then the signup file with the code of the particular templating engine. But I would like to know if there are other ways of achieving the same without learning a new template engine, maybe similar to what JSX code looks like.
Here is the code I propose, which uses a module with the necessary implementation (it's not complete):
let express = require("express");
const html = require("../htmlGenerator/html");
const { check, validationResult } = require("express-validator/check");

let router = express.Router();

/* Create a new account/user. */
router.post("/signup", [
  // Input validation.
  check("email", "Email is not valid").isEmail()
  // ...
], (req, res) => {
  const errors = validationResult(req);
  if (!errors.isEmpty())
    res.send(htmlErrorList(errors.array()));
  // ...
});

function htmlErrorList(errors) {
  return html.ul(
    errors.reduce((res, error) =>
      res + html.li({ class: "alert alert-danger" }, error.msg), "")
  );
}
The idea of these functions, ul() and li(), is to create an HTML string from the given attributes (represented by an object whose properties are the attribute names with their corresponding values) and the content inside the tag being created.
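A sketch of what such a module could look like; the html object and its tag helpers are hypothetical, written here to match the usage in the snippet above:

```typescript
// Hypothetical implementation of the html helper module used above.
type Attrs = Record<string, string>;

// Render an attributes object as ` key="value"` pairs.
const renderAttrs = (attrs: Attrs): string =>
  Object.entries(attrs).map(([k, v]) => ` ${k}="${v}"`).join("");

// tag("li") returns a function accepting either (content) or (attrs, content).
const tag = (name: string) =>
  (attrsOrContent: Attrs | string, content: string = ""): string =>
    typeof attrsOrContent === "string"
      ? `<${name}>${attrsOrContent}</${name}>`
      : `<${name}${renderAttrs(attrsOrContent)}>${content}</${name}>`;

const html = { ul: tag("ul"), li: tag("li") };

console.log(html.li({ class: "alert alert-danger" }, "Email is not valid"));
// <li class="alert alert-danger">Email is not valid</li>
```

Note this sketch does no HTML escaping of content or attribute values, which a real version would need before echoing user input back.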
The benefits I see with making or using a module that allows this sort of use are:
It promotes a functional style of coding
We don't need to learn a new template language to achieve something we can do with JavaScript
We can use the full power and capabilities of the JavaScript language, for example writing a function that produces the HTML for the navbar used in all endpoints of the app.
Final notes:
I know that many times we want to access a database or another data source on the server and pull some data, then perform complex computations to format it and display it in a way the user understands. One solution is, again, a template engine, and I'd like to know whether the idea I suggest is valid and whether there are other solutions to this problem.
I appreciate any help or feedback on this subject.
This question is kind of "opinion" based, so I will give my opinion.
1) Use a templating engine. It makes your life a lot easier.
2) Templating engines are easy to use. You can learn to use most of them in half an hour. Making your own code would require a lot more time than just using an already proven method. Plus, other developers can easily read and edit the code if it's written using common tools, whereas they would have to spend time learning whatever custom solution you have come up with.
3) Your example is so simple that you could generate that HTML just by concatenating strings and using template literals. If your example gets more complicated, then your solution would become much, much harder to read and maintain.
4) You don't really need to "output" HTML for errors at all. Just output a JSON object and have the frontend handle it, e.g.:
res.send({errors: [{field: 'username', message: 'username is required'}, ...]});
5) You don't really need to output HTML for anything other than the first page of your app/site. Keywords here are webpack + Angular 2 (Angular is the "A" in the MEAN stack).
It is used for a reason: it works and it's good.
So, in summary, I would say, unless you have a really good reason to avoid template engines, I would go with a template engine.

Eloquent model attributes as camel case [Laravel 5.2] [Dingo API]

Our Eloquent models have attributes following the Laravel snake_case convention,
e.g. first_name, last_name and created_at.
My frontend (React), however, follows the JavaScript camelCase convention,
e.g. firstName, lastName and createdAt
Is there a simple way to convert ALL attributes to camelCase when sending an API response?
We are using Laravel 5.2 and the Dingo API package.
UPDATE
Following on from the accepted answer I used the Custom Response Format approach. See the following gist for the implementation (includes unit tests):
https://gist.github.com/andrewmclagan/c5e0fe601d23f3b859b89a9f8922be68
You really have a few options. I won't go into implementing them (unless needed), but here are a few I can think of:
In Laravel:
Override the toArray() method on the model. When a model is converted to JSON, it calls toArray(). You can use this method to go through and convert all the keys to camelCase. You would need to override this on every model, though, but that can be abstracted through a parent class or a trait.
In Dingo:
Use Transformers with the Response Builder. You could create a transformer for every model, or you could create one CamelTransformer and register it with every model. You would convert the keys to camelCase in the transformer.
Create a Custom Response Format. You can create a camelJson response format that extends the default json format. Override the necessary methods with the logic to camelCase the keys.
Use the ResponseWasMorphed event. You can create an event handler for the ResponseWasMorphed event, and go through and camelCase your keys in there.
Any one of these options should be able to do the job, it's just a matter of how global or granular you want these transformations to be. For example, modifying toArray() will affect all code that converts your model to an array (or json), not just your response code, so you may not want your change to be that global.
I would think that the Custom Response Format is probably the best combination of ease and appropriateness for your situation.
Don't do this. The server doesn't need to know anything about its clients. If your React application needs properties in camelCase, delegate that task to the client. There should be one point in the system through which all responses pass; that's the correct place for the client to transform the response, just after receiving it.
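On the client, that single transformation point can be a small recursive key mapper applied to every parsed response. A sketch in TypeScript (the function names here are invented for illustration):

```typescript
// Convert a single snake_case key to camelCase.
const toCamel = (s: string): string =>
  s.replace(/_([a-z])/g, (_, c: string) => c.toUpperCase());

// Recursively rename the keys of a parsed JSON value, leaving
// primitives untouched and descending into arrays and objects.
const camelizeKeys = (value: unknown): unknown => {
  if (Array.isArray(value)) return value.map(camelizeKeys);
  if (value !== null && typeof value === "object") {
    return Object.fromEntries(
      Object.entries(value as Record<string, unknown>)
        .map(([k, v]) => [toCamel(k), camelizeKeys(v)])
    );
  }
  return value;
};

console.log(camelizeKeys({ first_name: "Ada", created_at: "2016-01-01" }));
// { firstName: 'Ada', createdAt: '2016-01-01' }
```

Hooked into a shared fetch wrapper (or an axios response interceptor), this keeps the Laravel side untouched while the React side only ever sees camelCase.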
I solved it by doing the following:
In your controller class:
return response()->json($model);
// => ['userName' => 'foo', 'userKey' => 'bar', ...]
Requirements
Laravel 5+
Install
$ composer require grohiro/laravel-camelcase-json:~1.0
# Laravel 5.7+
$ composer require grohiro/laravel-camelcase-json:~2.0
Add the service provider.
config/app.php
'providers' => [
    // default providers
    // ...
    Grohiro\LaravelCamelCaseJson\CamelCaseJsonResponseServiceProvider::class,
],
Reference : https://github.com/grohiro/laravel-camelcase-json

Is it possible to use Graphql generated schema as Flow definitions?

I have a project that uses the combination of GraphQL, Relay and Flow. Is there a way to use the GraphQL schema to feed Flow, so the React props can be typed without redeclaring the types?
For Relay I'm already using the script that generates two files: schema.json and schema.graphql. The Babel plugin uses schema.json to translate the queries into javascript.
The other file, schema.graphql, looks similar to Flow's syntax for type definitions, but there are differences. I once tried import type {Definition} from "./schema.graphql", but it failed silently.
The problem with that is that GraphQL scalar types can be used to represent values in various formats, and they do not inherit from a base type you could fall back to for Flow.
A scalar can verify not just a "string" but a "string of 32 hexadecimal digits". It's not possible to declare this kind of type in Flow.
Here are some alternatives: babel-plugin-typecheck, json-to-flow
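To illustrate the scalar problem in TypeScript terms (the Hex32 name and guard are invented for illustration): a scalar like "string of 32 hexadecimal digits" can only be approximated statically with a branded type, and the actual constraint still has to be enforced by a runtime guard.

```typescript
// A branded type approximates a constrained scalar; the constraint itself
// is invisible to the type system and must be enforced at runtime.
type Hex32 = string & { readonly __brand: "Hex32" };

const isHex32 = (s: string): s is Hex32 => /^[0-9a-f]{32}$/i.test(s);

console.log(isHex32("0123456789abcdef0123456789abcdef")); // true
console.log(isHex32("not-hex")); // false
```

The same limitation applies to Flow, which is why generating static types from the schema can never fully capture custom scalars.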
So after a while I decided to tackle this, since our project started to grow and we started to loosen our types. In the end the simplest solution has been to grab the printSchema output and modify it so it emits Flow-friendly text, save that to a .js file, and point the libraries there.
I'll try to post it on GitHub at a later date.

How to manage ASP.NET parameters and Javascript keys

So here is my scenario.
I'm using ASP.NET MVC 3 along with HTML, CSS, and JavaScript/jQuery to build a web application.
I'm using Visual Studio 2010.
We have already released the product (it's at 1.0). However, now that we are in "maintenance" mode for the project, I have a feeling that as new features are added, it will be harder to keep the set of constants in sync between the C# (ASP.NET MVC) and the JavaScript.
For example, in the JavaScript I would create a $.post linked to the MVC URL Controller/Action, and then pass in parameters { key1: value1, key2: value2 }.
The issue is that if the C# parameter names change, or if the positions of parameters in the signature change, I will only find out at runtime that the JavaScript needs to be updated (I'm assuming a programmer who doesn't know the architecture well enough to catch this before runtime).
So my question is: how do you manage the JavaScript side more easily so that it stays in sync with changes made on the C# side? Can the compiler do this for me in some way, or is there a plug-in that can help?
Thanks.
Your question asks about syncing C# constants and JavaScript constants, but then also talks about parameter names and positions.
The positions of parameters matter less in the MVC world than the names, and I've not found a good way of keeping those in sync short of extensive unit and integration testing. You are doing those tests, right? ;)
As far as actual constants and enums, I've taken to using T4 templates to generate both a .cs and a (namespaced) .js file for the constants/enums I need (in my case, out of a database, but could just as easily be anything else).
I can't think of an easy way, but here is something that may help. When I develop a website, I first try to write as little JavaScript as possible in the views and keep it all in .js files. This way you can reuse much of the code, and since it is all plain JavaScript, there won't be the problem you mentioned. I also keep a record of all actions, with their controller and area names, in the database, and use them to manage permissions and security. For your problem, you can add all these methods to the database and later check, with a piece of code, whether each method still exists.
Adding to the DB (in a base controller, so you don't need to do anything manually):
protected override void OnActionExecuting(ActionExecutingContext filterContext)
{
    var area = filterContext.RouteData.DataTokens["area"];
    string areaName = area != null ? area.ToString() : "";
    var controllerName = filterContext.ActionDescriptor.ControllerDescriptor.ControllerName;
    string actionName = filterContext.ActionDescriptor.ActionName;
    // Add to DB
    base.OnActionExecuting(filterContext);
}
Check if it still exists:
bool exist = false;
try
{
    HttpWebRequest request = (HttpWebRequest)System.Net.WebRequest.Create("http://www.example.com/image.jpg");
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    {
        exist = response.StatusCode == HttpStatusCode.OK;
    }
}
catch
{
    // Request failed; exist stays false.
}
Your best option is integration tests. You'll be able to test exactly the actions your users would do. Seleno is a good option (it wraps Selenium) for writing integration tests.
It's worth doing. If you have good integration test coverage you'll run into fewer bugs in production.

Categories

Resources