NodeJS use variables in JSON file strings - javascript

I use a JSON file for common phrases so I don't have to type them and maybe in the future they can be translated. So, for example, in my main code I want to say: You don't have the permission to use ${command_name}. This works perfectly fine hardcoded into my .js file, but ultimately I want this string to live in a JSON file, which does not allow any variables to be inserted.
Does anyone know a solution to my problem?
EDIT: Thanks for the suggestions. I guess String.replace would be my best option here. I wish there were some built-in feature that would convert variables in a JSON string into variables declared in the JS file.

You cannot treat template string literals in JSON files the way you can in JavaScript code. You said it yourself. But you could use a template engine for this, or just a simple String.replace().
Example for a template engine: https://github.com/janl/mustache.js
With Mustache (as an example), your code would look like this:
var Mustache = require('mustache');

var trans = {
  command_name: "dump"
};
var output = Mustache.render("You don't have the permission to use {{command_name}}", trans);
With simple String.replace():
var str = "You don't have the permission to use %command_name%";
console.log(str.replace('%command_name%', 'dump'));
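Putting this together with the JSON file from the question, a minimal sketch (the phrases.json layout and the t() helper are just assumptions for illustration) could look like this:
// phrases.json (assumed layout for this sketch):
// { "no_permission": "You don't have the permission to use {{command_name}}" }
var Mustache = require('mustache');
var phrases = require('./phrases.json');

function t(key, vars) {
  // look up the stored phrase and fill in the variables supplied at call time
  return Mustache.render(phrases[key], vars);
}

console.log(t('no_permission', { command_name: 'dump' }));
// -> You don't have the permission to use dump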

You can simply use placeholders. The following function replaces the placeholders with user-defined values:
const messages = {
  msgName: 'Foo is :foo: and bar is :bar:!'
}

function _(key, placeholders) {
  return messages[key].replace(/:(\w+):/g, function(__, item) {
    return placeholders[item] || item;
  });
}
Usage:
_('msgName', { foo: 'one', bar: 'two' })
// "Foo is one and bar is two!"
It's just an example. You can change the placeholder style and the function's behavior any way you want!

You can use the config npm module and separate your JSON files according to your environment.
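A small sketch of that idea (the config/default.json layout and the messages key are assumptions, not part of the question):
// config/default.json (assumed for this sketch):
// { "messages": { "noPermission": "You don't have the permission to use %command_name%" } }
const config = require('config');

const template = config.get('messages.noPermission');
console.log(template.replace('%command_name%', 'dump'));
// -> You don't have the permission to use dump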

./name.json
{
  "command": "this is the output of 'command'"
}
./Node.js
const names = require('./name.json');
console.log('name === ', names.command);
// name === this is the output of 'command'

So the main challenge is keeping string constants in a separate file when some of them are parameterizable, right?
The JSON format itself only handles strings, numbers, booleans, lists and hashmaps, and knows nothing about substitution or parameters.
You are also unable to use template strings like you don't have permission to do ${actionName}, since template strings are interpolated immediately.
So what can you do?
You could write your own parser that takes config data from the JSON file, parses a string, finds references to variables and substitutes them with values. Simple example:
const varPattern = /\${([^{}]+)}/g;

function replaceVarWithValue(templateStr, params) {
  return templateStr.replace(varPattern, (fullMatch, varName) => params[varName] || fullMatch);
}
Or you can use an npm package aimed at localization, like i18n, so it handles the templates for you.
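For example, a rough sketch with the i18n package (the locales directory and the en.json catalog shown in the comment are assumptions):
const i18n = require('i18n');

i18n.configure({
  locales: ['en'],
  directory: __dirname + '/locales' // assumed to contain en.json like:
  // { "no_permission": "You don't have the permission to use {{command_name}}" }
});

console.log(i18n.__('no_permission', { command_name: 'dump' }));
// -> You don't have the permission to use dump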

Basically, you can implement a parse function which, given a text and a dictionary, replaces any occurrence of each dictionary key:
const parse = (template, textMap) => {
  let output = template
  for (let [id, text] of Object.entries(textMap)) {
    output = output.replace(new RegExp(`\\$\\{${id}\\}`, 'mg'), text)
  }
  return output
}

const textMap = {
  commandName: 'grep',
  foo: 'hello',
  bar: 'world'
}

const parsed = parse('command "${commandName}" said "${foo} ${bar}"', textMap)
console.log(parsed)
BTW, I would suggest using an existing string templating engine like string-template to avoid reinventing the wheel.
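For example, string-template takes the same kind of template plus a plain object of values (the phrases object here is just an inline stand-in for your JSON file):
const format = require('string-template');

const phrases = { said: 'command "{commandName}" said "{foo} {bar}"' };
console.log(format(phrases.said, { commandName: 'grep', foo: 'hello', bar: 'world' }));
// -> command "grep" said "hello world"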

Related

Automatically wrapping & unwrapping wrappers.proto types in protobuf.js

I have been using the protobuf.js command line tools pbjs and pbts to generate my JS and TypeScript classes for my defined .proto files. I get a JSON response from my backend API that I am looking to deserialize into the protobuf-generated classes. The recommended way to do this is to use the fromObject method, which takes in the JSON object.
Let's say I have
message ChangeEvent {
string source = 1;
google.protobuf.StringValue code = 2;
}
I'd like to be able to pass in:
const changeEventWithCode = {
  source: 'test',
  code: 'code',
}
const changeEventWithoutCode = {
  source: 'test',
  code: null,
}
and have them both encode & decode to the same thing. However, it seems that if I want to set the code string, I have to do:
const changeEventWithCode = {
  source: 'test',
  code: {
    value: 'code',
  },
}
I was hoping fromObject would handle this, but it doesn't. Is there any way I can hook in some customisation to do this? Alternatively, how can this be achieved with protobuf.js using TypeScript?
I do not think that you can. The wrappers are messages, and I don't think protobuf.js has an option to unbox the values for you.
But ts-proto does! For a google.protobuf.StringValue code, it will create a property signature code: string | undefined. Sounds like this is what you want.
But it looks like the "optional" label is coming back for proto3. This means you can write:
message ChangeEvent {
string source = 1;
optional string code = 2;
}
This is still an experimental feature, added in protoc v3.12.0. And you need a plugin for typescript that supports it.
And there is one: protobuf-ts. It will generate the following typescript:
interface ChangeEvent {
  source: string;
  code?: string;
}
Disclaimer: I am the author of protobuf-ts.
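For completeness: if you stay on protobuf.js, one workaround is to wrap the plain values yourself before calling fromObject. A rough sketch (assuming ChangeEvent is the usual class generated by pbjs, and wrapString is a helper written just for this example):
// Hypothetical helper: turn a plain string (or null) into the { value } shape
// that fromObject expects for google.protobuf.StringValue fields.
function wrapString(value) {
  return value == null ? null : { value: value };
}

const plain = { source: 'test', code: 'code' };
const message = ChangeEvent.fromObject({
  source: plain.source,
  code: wrapString(plain.code),
});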

Converting a nodejs buffer into a javascript object with functions

I know that converting a buffer to a json object is trivial with:
JSON.parse(buffer)
But what about converting a buffer that contains a JavaScript object with functions into a real JavaScript object?
I have files that contain objects with functions, and I need to read these files in and save them to another master object.
While the task of reading the files is not a problem, I cannot find any way of converting the buffer back into a real javascript object again.
This is a very simple example of a file 'test.js' that I am trying to load
{
get:function(){return 'hello'},
somevar: "xxx",
put: function(){return 'world'}
}
Reading this data in, I get a buffer. I can't convert it using JSON.parse as it contains functions, and I can't read it using utf8 encoding as it would just become a string!
var funcs = {}
fs.readFile('test.js',function(err,buff){
funcs['test'] = buff;
})
Is it possible to read a file and convert it into a real JavaScript object?
Edit
OK, I have found a solution, but it uses the dreaded eval(). However, this is backend code and, as far as I can tell, there's no way for anything to be injected by a user from the frontend. I would prefer not to use it, but unless there's something that works without modifying the format of the files, I will:
var funcs = {}
fs.readFile('test.js', 'utf8', function(err, buff){
  eval('var _ = ' + buff);
  funcs['test'] = _;
})
For what it's worth, you could use the Function constructor. It's slightly safer than eval because it doesn't access the local scope. But it can still access global variables.
var script = buffer.toString('utf8'); // assuming the file is in UTF-8
var returnObject = new Function('return ' + script);
var myObject = returnObject();
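Tying that back to the readFile example from the question, a sketch might be:
var fs = require('fs');

var funcs = {};
fs.readFile('test.js', function (err, buff) {
  if (err) throw err;
  var script = buff.toString('utf8');
  // evaluate the file's object literal in its own function scope
  funcs['test'] = new Function('return ' + script)();
});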
Depending on the complexity of the code you're dealing with, this approach may or may not suit your needs:
You can create a temporary file, say "test-module.js", which includes the object from "test.js" but with a module export prepended, for example:
module.exports = {
get: function() { return 'hello' },
somevar: 'xxx',
put: function() { return 'world' }
}
Then, from your main file, you can retrieve the file content already available as a Javascript object:
var funcs = {}
var buff = require('./test-module');
funcs['test'] = [buff.get, buff.put];
Would this solution work for you?

How to handle pluralization in Hogan

I'm using Hogan.js, which is compatible with the Mustache spec.
And I'm having trouble implementing a solid way of doing pluralization.
I would like to keep using Hogan and use http://i18next.com/ for i18n handling
Doing something like this works for the simple cases.
tpl:
{{#plural(count)}}
I have {{count}} apples!
{{/plural(count)}}
data:
{
count: 2,
'plural(count)': function () {
return function () {
return _t[arguments[0].trim()][this['count']]
}
}
}
This requires parsing/scanning/rendering in separate steps to be able to generate all of the required plural methods (plural(key.val) etc.), but that's fine; it only needs to be done once, at server boot.
This breaks on things like
{{#plural(key.nested)}}
that would match if the data looked like
{
'plural(key': {
'val)': ...
}
}
This also requires me to manually look up the values from the context; not a major problem, but there are some cases with lambdas/partials that might be impossible to resolve.
For the default translation mappings, things are a lot less complex, and that's easy to handle.
OK, I found what I think is the best way to handle this problem:
var Hogan = require('hogan.js');
var fs = require('fs');

var tpl_data = fs.readFileSync('./tpl/test.hjs', 'utf8');
var scan = Hogan.scan(tpl_data);
var tree = Hogan.parse(scan);
var gen = Hogan.generate(tree, tpl_data, {asString: false});
var out = gen.render(data); // data = your render data
Alter the tree, replacing all tag keys with i18n where the n matches your pattern /i18n .+/.
I use {{#i18n {count: count, text: 'I have <%count%> apples!'} }} and the like to add the options for i18next, so I match all n's starting with i18n.
Add the i18n handler to Hogan.codegen:
Hogan.codegen.i18n = function (node, context) {
context.code += 't.b(t.v(t.i18n("' + esc(node.n) + '",c,p,0)));';
}
Add the i18n method to the prototype of Hogan.Template:
Hogan.Template.prototype.i18n = function (key, ctx, partials, returnFound) {
  // here ctx is an array of the render scopes, from right to left:
  // the rightmost is the innermost scope, the leftmost is the full render data
  // 1. get the config from the key
  // 2. get the values out of the scope
  // 3. get the values for the translation
  // 4. look up the translation and replace the values in the translation
  // 5. return the translated string
};
Note that inside Hogan.Template.prototype.i18n you can access all of the template's methods.
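As a very rough sketch of steps 1-5 (translate() here is a hypothetical wrapper around i18next, and the parsing assumes keys of the exact form i18n {name: value, text: '...'} with no commas inside the text):
Hogan.Template.prototype.i18n = function (key, ctx, partials, returnFound) {
  var self = this;
  var values = {};
  var text = '';
  // 1. strip the "i18n" prefix and the braces, then split the options naively on commas
  key.replace(/^i18n\s*\{/, '').replace(/\}\s*$/, '').split(',').forEach(function (pair) {
    var idx = pair.indexOf(':');
    var name = pair.slice(0, idx).trim();
    var value = pair.slice(idx + 1).trim();
    if (/^'.*'$/.test(value)) {
      value = value.slice(1, -1);                // literal string option
    } else {
      value = self.d(value, ctx, partials, 0);   // 2. resolve the value from the render scopes
    }
    if (name === 'text') { text = value; } else { values[name] = value; }
  });
  // 3.-5. hand the default text and the values to your i18next wrapper and return the result
  return translate(text, values);
};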

Using locales options inside the jade filter failed

I'm trying to use the locales option inside a filter, but I'm faced with the problem that the locals object is not accessible from the filter.
Locales (JSON):
{"title": "HAMPI"}
Filter:
var jade = require('jade');

jade.filters.Posts = function(block) {
  return '{block:Posts}' + jade.render(block) + '{/block:Posts}';
};
Input:
body
  |#{title}
  :Posts
    div
      a
        #{title}
Output:
<body>
HAMPI
{block:Posts}<div><a><undefined></undefined></a></div>{/block:Posts}
</body>
Can I fix or handle this error?
PS: You can look at the code in this repository. I'm using grunt and the grunt-contrib-jade plugin, but to make grunt-contrib-jade work with filters you should edit ./node_modules/grunt-contrib-jade/tasks/jade.js to reflect the changes from this pull request.
Filters are applied at compile time, whereas rendering, which has access to local variables, is done at runtime. So your local variables are not accessible to filters; they only see raw text. So you can do this:
jade.filters.Posts = function(block) {
return '{block:Posts}'+block+'{/block:Posts}'; //remove render
};
This way you defer the rendering of #{title} until you have the variables. It produces this output:
<body>HAMPI{block:Posts}HAMPI{/block:Posts}</body>
How I tested it:
var jade = require('jade');
var fs = require('fs');

// the filter from above, with render removed
jade.filters.Posts = function(block) {
  return '{block:Posts}' + block + '{/block:Posts}';
};

var fn = jade.compile(fs.readFileSync(__dirname + '/file2.jade', 'utf8'));
console.log(fn({"title": "HAMPI"}));
The same issue is mentioned here: in node.js, how to pass variables to :stylus filter from jade?
For reference you can see these links :
Jade: Pass markdown filter a variable.
:markdown filter processing the text of a String variable
:markdown with variable
[Update]
If you want to use render, then why not pass the local vars with it? So if you do:
jade.filters.Posts = function(block) {
  return '{block:Posts}' + jade.render(block, {"title": "HAMPI"}) + '{/block:Posts}';
};
It gives this:
<body>HAMPI{block:Posts}<div><a><HAMPI></HAMPI></a></div>{/block:Posts}</body>
The downside is that your view locals cannot be used; you have to pass the values in directly.
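One way to make that a bit less ad hoc is to keep the locals in one object and hand it to both the filter and the compiled template (a sketch, with file2.jade standing in for your template file):
var jade = require('jade');
var fs = require('fs');

var locals = { title: 'HAMPI' };

// the filter closes over the same locals object used at render time
jade.filters.Posts = function (block) {
  return '{block:Posts}' + jade.render(block, locals) + '{/block:Posts}';
};

var fn = jade.compile(fs.readFileSync(__dirname + '/file2.jade', 'utf8'));
console.log(fn(locals));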

prototypejs 1.6 to 1.7 json problems

I had a JSON parsing problem when updating my app from Prototype 1.6.1 to 1.7.0.
This is a very simplified model of my JSON as is, saved in tmp.js:
{
  "text": "hello world",
  "fn": function(){ alert('hello world') }
}
and this is my code:
new Ajax.Request('tmp.js', {
onSuccess: function(transport){
var json = transport.responseText.evalJSON();
var button = new Element('button')
.update(json.text)
.observe('click', function(){
json.fn();
});
$('my_div').update(button);
}});
All this worked correctly with 1.6.1: it produced a button that alerted 'hello world' on click.
This does not work in v1.7.0 because my JSON is not valid. I know it should contain only data, not functions.
My question is: why did it work with 1.6.1 (and still does), and is there a way to accomplish the same with 1.7.0? I need to get, via Ajax, a JS object containing user-defined functions.
Thank you
Update:
The reconstruct function is a good solution and I think I'll use it in the future.
Anyway, I found that the eval() function seems to be a good and fast solution:
tmp.js JSON:
{
"text":"hello world",
"fn": "my_alert('hello world')"
}
JS
function my_alert(string){
alert(string);
}
new Ajax.Request('tmp.js', {
onSuccess: function(transport){
var json = transport.responseText.evalJSON();
var button = new Element('button')
.update(json.text)
.observe('click', function(){
eval(json.fn);
});
$('my_div').update(button);
}});
What you've got in that sample data you posted is not JSON. In strict JSON the value of a property can be
a string
a number
boolean true or false
null
an array
an object
There is no way to include a function definition in JSON. (Well, that's not exactly true; you're free to use strings, numbers, arrays, objects, etc. to describe a function in such a way that your code can reconstruct it after the JSON is parsed. The point is that straight JavaScript function expressions are disallowed.)
One simple, slightly disturbing thing you could do is save the function body as a string, and then reconstruct it by calling
foo.fn = new Function(foo.fn);
once the JSON parse is complete.
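For instance, inside the same onSuccess handler, and with tmp.js reshaped so "fn" holds only the body text:
// tmp.js would now contain valid JSON such as:
// { "text": "hello world", "fn": "alert('hello world')" }
var json = transport.responseText.evalJSON();
json.fn = new Function(json.fn); // rebuild a callable function from the body string
json.fn();                       // alerts 'hello world'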
Edit: more details:
The "Function()" constructor takes as its arguments a list of strings representing argument names, followed by a string that's to be used as the function body. If you wanted to encode a complete JavaScript function, therefore, you might want to have it look like an object:
{
'foo': 'plain property',
'someFunction': {
'arguments': [ 'x', 'y' ],
'body': 'if (x > y) return "x"; return "y";'
}
}
Now to turn "someFunction" into a real function, you'd use something like this:
function reconstructFunction(descr) {
var params = (descr.arguments || []).slice(0);
params.push(descr.body);
return Function.apply(null, params);
}
Then you can just pass the function descriptor from your JSON into something like that, and then you have a bona fide JavaScript function to call.
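For example (a small usage sketch, assuming the descriptor above arrives as strict JSON and has already been parsed into data):
var data = transport.responseText.evalJSON(); // or JSON.parse(...) outside Prototype
var someFunction = reconstructFunction(data.someFunction);
console.log(someFunction(3, 2)); // "x"
console.log(data.foo);           // "plain property"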
