Variables versus template literal interpolation - javascript

Are there any cases in which you should use ${foo} in your Apollo query instead of variables: {foo: foo}?

You can do a lot with variables in a GraphQL query!
Obviously, you can pass variables as arguments to various fields.
You can also use the @skip and @include directives to control which fields are included in the query.
By combining these two tools, and possibly others in the future, there's basically no need to do string interpolation.
Why is string interpolation in a GraphQL query bad? Well, for a few reasons it's best to think of a GraphQL query string as a static object, rather than a string to be manipulated:
You could accidentally string-manipulate your way to an invalid query. If you don't properly escape some of the arguments, you could end up with a query that returns an error or some totally different data that you didn't expect. Variables are JSON-encoded, so there's no need to escape them.
If your query is dynamically generated, there's no way to do static analysis on it using tools like eslint-plugin-graphql, so you can't check if your query is valid without actually running your code.
It's going to be more difficult for other developers to understand what the shape of the returned data will be if a query is composed of many different strings. You also can't copy-paste the query into something like GraphiQL to try running it if there is arbitrary JavaScript code in there.
In short, there are tons of opportunities to do cool stuff with GraphQL query strings, but once you start manipulating them with arbitrary code that becomes much more difficult or impossible. So my suggestion is, stick to variables.
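For instance, a query that might otherwise be assembled by interpolating strings can stay completely static. Here is a rough sketch using Apollo Client 3's gql tag; the endpoint, schema, field names, and variables are made up for illustration:

import { ApolloClient, InMemoryCache, gql } from '@apollo/client';

const client = new ApolloClient({
  uri: 'https://example.com/graphql', // illustrative endpoint
  cache: new InMemoryCache(),
});

// The query text never changes; arguments and optional fields
// are driven by variables and the @include directive.
const GET_PERSON = gql`
  query GetPerson($id: ID!, $withAddress: Boolean!) {
    person(id: $id) {
      name
      address @include(if: $withAddress) {
        zipCode
      }
    }
  }
`;

// Only the variables object varies between calls.
client
  .query({ query: GET_PERSON, variables: { id: '1', withAddress: true } })
  .then(result => console.log(result.data));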

TL;DR no, don't use string interpolation in templates. It screws up any static analysis one could do.

Related

Why use translation strings instead of the object itself?

Going through https://www.npmjs.com/package/i18n and https://github.com/fnando/i18n-js, it's a pretty common approach to have a locales file with a translate function like i18n.t('string'), plus multiple translation files, using the received string to look up the proper translation.
For example (from i18n-js):
I18n.t("some.scoped.translation");
// or translate with explicit setting of locale
I18n.t("some.scoped.translation", { locale: "fr" });
Well, why, instead of using those string paths, don't we just access the translation JSON directly?
It would still be possible to use the i18n lib to get the user's language and even set it, but instead of using the .t() method, we could just reference the proper translation object.
Wouldn't it help avoid typos, since you would use the object itself to verify the path?
Using the locale strings seems like such a widely adopted practice that I feel I'm missing something about my approach, but I couldn't figure out what it might be. I considered that it would be heavier work to go through the whole translations data, but i18n.t() would still have to do that, plus parse the string into an object path.
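For comparison, here is a rough sketch of the two approaches the question contrasts, reusing the I18n object from the snippets above; the translations object and keys are made up:

// What en.json might deserialize to.
const translations = {
  some: { scoped: { translation: "Hello" } }
};

// Library approach: the key is a plain string that is split and
// resolved at runtime, so a typo only surfaces when the code runs.
I18n.t("some.scoped.translation");    // "Hello"

// Direct-access approach: the path is a real property chain, so
// editors and linters can flag a typo immediately, but the active
// locale and fallbacks now have to be resolved by hand.
translations.some.scoped.translation; // "Hello"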

Using Elvis Operator within String Interpolated Expressions in Angular 2

In my Angular 2 app I am using string interpolation to pull data from a MongoDB. Right now the server/database is in flux, so on occasion that will break my client-side display because, for instance, if I am pulling in data via string interpolation like this:
{{record.addresses[0].zipCode}}
... and then, at a later time, the way the data is organized in the database changes, my display will break, because the client is trying to pull in fields that are no longer there. So I think I should be able to use something like the Elvis operator in a use case like this: if the data is where the client is looking for it, it will print to the screen; if it is not, the field will just be ignored without breaking anything in the display.
So, in short, how would I implement the elvis operator on an expression like I have above?
You use it like this:
{{record?.addresses[0]?.zipCode}}
This checks whether record is defined, then whether addresses[0] is defined on the record object, and then whether zipCode is defined on that object.

Go templating engine that also runs in the browser

I'm developing a web application with Go on the server, and the router will use pushState, so the server will also have to be able to render my templates. That means I'll need a templating engine that works in both Go and JavaScript. The only one I've come across so far is Mustache, but it doesn't seem to be able to handle lowercase properties of structs, and there also doesn't seem to be a way to provide custom names the way JSON struct tags do:
type Person struct {
    Name string `json:"name"`
    Age  int    `json:"age"`
}
So, is there a templating engine that is available in both Go and JavaScript, and that can handle lowercase struct properties?
As the comments above state, you can't expect any 3rd party library to be able to read lowercase properties on your struct, but it appears you are trying to use tags to represent alternative representations of your struct (as you can with the encoding/json library).
What you can do is use something like github.com/fatih/structs to convert your structs to maps and then range through to lowercase all of your keys (copying the values and removing the uppercase versions) and pass it into mustache.Render() as your context. If you want to use struct tags like the encoding/json library does, you'd have to use the reflect package and write a struct-to-map function that takes into account the tags on the struct (basic example given in the documentation here). There are some SO answers on how to write a struct-to-map function using reflection, which you can improve upon to add struct tag handling as you need.
To answer your question: I don't think any current templating library that also works in JavaScript does this for you, but it shouldn't be too hard to get it working with Mustache given the idea above.
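On the browser side, the same template and the lowercased data can then be rendered with mustache.js. A rough sketch, assuming the server exposes the converted map as JSON; the template and field names are illustrative:

// Assumes mustache.js is loaded, providing the global Mustache object.
var template = "{{name}} is {{age}} years old.";

// The same lowercase-keyed data the Go server rendered with,
// e.g. fetched as JSON after the struct-to-map conversion.
var person = { name: "Alice", age: 30 };

var html = Mustache.render(template, person); // "Alice is 30 years old."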

JavaScript to evaluate simple math string like 5*1.2 (eval/white-list?)

I have an input onchange that converts numbers like 05008 to 5,008.00.
I am considering expanding on this, to allow simple calculations. For example, 45*5 would be converted automatically to 225.00.
I could use a character white-list ()+/*-0123456789., and then pass the result to eval. I think these characters are safe enough to prevent any dangerous injection, provided I use an appropriate try/catch, because a syntax error could still be produced.
Is this an OK white-list to check against before passing the string to eval?
Do you recommend a revised white-list?
Do you recommend a different approach (maybe there is already a function that does this)?
I would prefer to keep it lightweight; that is why I like the eval/white-list approach: very little code.
What do you recommend?
That whitelist looks safe to me, but it's not such a simple question. In some browsers, for example, an eval-string like this:
/.(.)/(34)
is equivalent to this:
new RegExp('.(.)').exec('34')
and therefore returns the array ['34','4']. Is that "safe"?
So while the approach can probably be made to work safely, it might be a very tricky proposition. If you do go forward with this idea, I think you should use a much more aggressive approach to validate your inputs. Your principle should be "this is a member of a well-defined set of strings that is known to be 'safe'" rather than "this is a member of an ill-defined set of strings that excludes all strings known to be 'unsafe'". Furthermore, to avoid any risk of operators peeking through that you hadn't considered (such as ++ or += or whatnot), I think you should insert a space in front of every non-digit-non-dot character; and to avoid any risk of parentheses triggering a function call, I think you should handle them yourself by repeatedly replacing (...) with a space plus the result of evaluating ... (after confirming that that result is a number) plus a space.
(By the way, how come = is in your whitelist? I just can't figure out what that's useful for!)
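A minimal sketch of that "well-defined set of strings" idea, deliberately excluding parentheses so they never reach eval; the function name and grammar here are illustrative, not from the question:

function evalSimpleMath(input) {
  // Drop thousands separators, then require: number (operator number)*.
  var expr = String(input).replace(/,/g, "").trim();
  var grammar = /^\d+(\.\d+)?(\s*[+\-*\/]\s*\d+(\.\d+)?)*$/;
  if (!grammar.test(expr)) return null; // not in the known-safe set
  try {
    var value = eval(expr);
    return (typeof value === "number" && isFinite(value)) ? value : null;
  } catch (e) {
    return null; // defensive; shouldn't normally be reached given the grammar check
  }
}

// evalSimpleMath("45*5")       -> 225
// evalSimpleMath("/.(.)/(34)") -> null, rejected by the grammar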
Given that extremely restrictive whitelist, I can't see any way of performing a malicious action beyond throwing an exception. The bracket trick won't work since it requires square brackets [].
Perhaps the safest option is to modify your page's default values parser to only accept numbers and throw out anything else. That way, potentially malicious code in a link will never make it to eval.
This only leaves the possibility of the user typing something malicious into a field, but why even bother worrying about that? The user already has access to a console (Dev Tools) they could use to execute arbitrary code.
An often-overlooked issue with eval is that it causes problems for JavaScript minifiers.
Some minifiers, like YUI, take the safe route and stop renaming variables as soon as they see an eval statement. This means your JavaScript will still work, but your compressed file will be larger than it needs to be.
Others, like Google Closure Compiler, will continue to rename variables, but if you are not careful they can break your code. You should avoid passing strings that contain variable names to eval. So, for example:
var input = "1+2*3";
var result = eval("input"); // unsafe: a minifier may rename the variable but not the string literal
var result = eval(input);   // safe: this reference gets renamed along with the declaration

How dangerous is it to store JSON data in a database?

I need a mechanism for storing complex data structures created in client-side JavaScript. I've been considering using JSON.stringify to convert the JavaScript object into a string, storing it in the database, and then pulling it back out and using JSON.parse to get the JavaScript object back.
Is this just a bad idea or can it be done safely? If it can, what are some pitfalls I should be sure to avoid? Or should I just come up with my own method for accomplishing this?
It can be done and I've done it. It's as safe as your database.
The only downside is it's practically impossible to use the stored data in queries. Down the track you may come to wish you'd stored the data as table fields to enable filtering and sorting etc.
Since the data is user created make sure you're using a safe method to insert the data to protect yourself from injection attacks (don't just blindly concatenate the data into a query string).
It's fine so long as you don't deserialize using eval.
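A minimal sketch of both points, assuming a Node.js backend with the pg client; the table and column names are made up:

const { Pool } = require('pg');
const pool = new Pool(); // connection settings come from the environment

// Store: serialize with JSON.stringify and use a parameterized query,
// never string concatenation.
async function saveDocument(userId, data) {
  await pool.query(
    'INSERT INTO documents (user_id, body) VALUES ($1, $2)',
    [userId, JSON.stringify(data)]
  );
}

// Load: deserialize with JSON.parse, not eval.
async function loadDocuments(userId) {
  const { rows } = await pool.query(
    'SELECT body FROM documents WHERE user_id = $1',
    [userId]
  );
  return rows.map(row => JSON.parse(row.body));
}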
Because you are using a database, you need a server-side language to communicate with it, and any data you have is easily converted to and from JSON in most server-side languages.
I can't imagine a proper use case unless you have a sh*tload of JavaScript, it needs to be very performant, and you have exhausted all other possibilities such as caching, query optimization, etc.
Another downside of doing this is that you can't easily query the data in your database, which is always useful when you want to get any kind of reporting done.
And what if your JSON structure changes? Will you update all the stored JSON in your database, or will you force yourself to cope with the changes in the parsing code?
Conclusion
Imho it is not dangerous to do so, but it leaves little room for manageability and future updates.
