Nodejs Couchbase deserialize date property from document - javascript

I'm saving a document in Couchbase which has JavaScript Date values, and I wish to get it back exactly the same, not as the string '2016-01-02T12:13:14Z'.
I found a way to achieve this in plain JavaScript by using the second parameter of JSON.parse (the reviver function), but Couchbase does the deserialization internally, so I can't really use that.
Is there any way to disable the Couchbase deserialization, so that I can avoid doing JSON.stringify + JSON.parse and also avoid deep-walking the object?
bucket.get(key, (err, result) => {
  if (err) {
    // deal with error here
  } else {
    // here "result.value" is already deserialized
    done(result.value);
  }
});
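For reference, the plain-JavaScript approach mentioned above uses the reviver parameter of JSON.parse, roughly like this (illustrative only; the regex and variable names are not from the question):

// json is assumed to be the raw JSON string
// revive ISO 8601 strings into Date objects while parsing
var isoDateRe = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?Z$/;
var obj = JSON.parse(json, function (key, value) {
  if (typeof value === 'string' && isoDateRe.test(value)) {
    return new Date(value);
  }
  return value;
});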

As you probably know, JSON and Date handling can be a bit specific depending on what you're trying to do, and we tend to stick to the defaults. What you're looking for is a fairly advanced use case. At the moment, we don't have direct support for changing the way we do that parsing.
However, there is an interface for this. It's called a "transcoder", and it lets you be very specific about how you want to convert incoming data to and from what is stored. I don't have an example that shows exactly what you want to do, but a good place to start looking at this lower layer is in the tests.
It might be easier, however, to just treat whatever you're storing differently at the application level. From my reading of the reviver parameter you pointed to, that would only apply at retrieval time anyway. There's nothing stopping you from wrapping that get() and mutating the object before passing it along to the next layer, at very low cost I believe.
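A minimal sketch of that wrapping approach, assuming the bucket.get shown in the question and a simple ISO 8601 check (the helper names and regex are illustrative):

// Hypothetical wrapper around bucket.get that revives ISO 8601 strings
// into Date objects before handing the value to the caller.
var isoDateRe = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?Z$/;

function reviveDates(value) {
  if (typeof value === 'string' && isoDateRe.test(value)) {
    return new Date(value);
  }
  if (value && typeof value === 'object') {
    Object.keys(value).forEach(function (key) {
      value[key] = reviveDates(value[key]);
    });
  }
  return value;
}

function getWithDates(bucket, key, callback) {
  bucket.get(key, function (err, result) {
    if (err) {
      return callback(err);
    }
    callback(null, reviveDates(result.value));
  });
}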

Related

Why use translation strings instead of the object itself?

Going through https://www.npmjs.com/package/i18n and https://github.com/fnando/i18n-js, it's a pretty common approach to have a locales file with a translate function like i18n.t('string'), and multiple translation files, using the received string to find the proper translation.
For example (from i18n-js):
I18n.t("some.scoped.translation");
// or translate with explicit setting of locale
I18n.t("some.scoped.translation", { locale: "fr" });
Well, why don't we just access the translation JSON directly instead of using those string paths?
It would still be possible to use the i18n lib to get the user language and even set it, but instead of using the .t() method, we could just reference the proper translation object.
Wouldn't it help avoid typos, since you would use the object itself to verify the path?
Using the locale strings seems to be such a widespread practice that I feel like I'm missing something about my approach, but I couldn't figure out what it might be. I considered that it would be heavier to go through the whole strings data, but i18n.t() would still have to do that, plus parse the string into an object path.
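To make the comparison concrete, here is a rough sketch of the two approaches being contrasted (the translations object and key are made up for illustration):

// Hypothetical translations object, e.g. loaded from a locale JSON file.
const translations = {
  some: {
    scoped: {
      translation: "Hello"
    }
  }
};

// Library style: the key is a plain string, resolved at runtime.
I18n.t("some.scoped.translation");

// Direct-access style suggested in the question: a typo surfaces as an
// undefined property, which editors and linters can catch.
translations.some.scoped.translation;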

Compare JSON structures

I want to compare received JSON data to a JSON 'template', and if it differs in structure (not in the data itself), then do something, like discarding that JSON.
Template:
{
  "data": {
    "id": "1",
    "cmd": "34"
  }
}
Successful JSON:
{
  "data": {
    "id": "15",
    "cmd": "4"
  }
}
Unsuccessful JSON:
{
  "data": {
    "id": "15"
  }
}
This is only an example; the JSON to evaluate is going to be larger, and I want to avoid checking whether each property exists one by one. (This is possible in other languages, hence this question.)
It sounds like you're looking for JSON Schema or other similar tools.
JavaScript itself doesn't provide anything built-in to do this for you. So you'll need an already-written tool to do it (such as JSON Schema) or you'll have to do it yourself, checking the existence and (depending on how strict you want to be) type of each property in the received JSON. You can do that either after parsing, or during parsing by hooking into the parsing process via a "reviver" function you pass into JSON.parse, but either way is going to require doing the checks. (Given the inside-to-outside way JSON.parse works, I suspect using a reviver for this would be quite hard, though. It's much better to use a recursive function on the parsed data afterward.)
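For example, a rough sketch of such a recursive check against the template above (an illustration, not a drop-in validator; the variable names are assumptions):

// Returns true if "candidate" has the same keys (recursively) as "template";
// the values themselves are ignored, only the structure is compared.
function sameStructure(template, candidate) {
  if (typeof template !== 'object' || template === null ||
      typeof candidate !== 'object' || candidate === null) {
    return typeof template === typeof candidate;
  }
  var templateKeys = Object.keys(template).sort();
  var candidateKeys = Object.keys(candidate).sort();
  if (templateKeys.join(',') !== candidateKeys.join(',')) {
    return false;
  }
  return templateKeys.every(function (key) {
    return sameStructure(template[key], candidate[key]);
  });
}

// Usage: discard the payload if its structure differs from the template.
if (!sameStructure(template, JSON.parse(receivedJson))) {
  // discard
}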
I would recommend converting it to an object with JSON.parse(), so you can use the JavaScript API.
If your structure gets more complex in the future (more levels, etc.), you would still be able to do deep compares. Libraries like Immutable.js can come in handy here, as they are used to compare complex states in React applications.
I recently faced this problem as well. I needed to make some comparisons on some objects (it should work for JSON as well), so I wrote a package for doing this that might come in handy for people facing this issue in the future (a bit late for the topic starter).
https://www.npmjs.com/package/js-object-compare

Javascript example in mongodb

I'm trying to write a JavaScript script that will take a large number of values from a text file and use those values to make a large set of queries to find a field. For example, if there are 2000 values given in the text file, I'm trying to read the values from the file and use each value to execute a query. I'm not getting the concept of how to write this, since it is not possible to write 2000 queries separately. Please help.
var values = new Array();
$.get('UserFile.txt', function(data){
  values = data.split('\n');
  console.log(value);
  db.students.find(CID:{$in[values]})
});
This is the concept I want to apply; the UserFile.txt has 20000 values and it will use those values to do the find. Also, it gives "$" is not defined in the shell.
The concept is sound, provided that you actually have a library that can load the contents of a file for you. So this would likely be better suited to another language implementation, or to something like Node.js if you want JavaScript, as opposed to doing this in the mongo shell.
Aside from that, the only issue is your syntax for the $in operator, which should instead be:
values = filecontent.split('\n'); // Somehow obtained from reading a file
db.students.find({ "CID": { "$in": values } });
That is because whatever went into values is already a native array in the language, and it will be passed in as such as the argument to $in, which expects its argument to be an array.
It probably isn't the best idea to pass in a very large array, and your total request must be under the 16MB BSON limit. So you may need to split those array elements across several queries, but in batches rather than one value at a time, which is at least better.
I could be wrong, but I don't find it likely that you are going to find a library you can load in the shell that allows this; you can load native JavaScript as a structure, but that would be beside the point.
Use something else as your language implementation and you can get the desired result by following the correct syntax.
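Putting those pieces together, a rough sketch in Node.js (assuming the official mongodb driver; the connection string, database name, and batch size are illustrative, not from the question):

// Read the values, then query in batches to stay well under the 16MB BSON limit.
const fs = require('fs');
const { MongoClient } = require('mongodb');

async function findByCids() {
  const values = fs.readFileSync('UserFile.txt', 'utf8')
    .split('\n')
    .map(line => line.trim())
    .filter(Boolean);

  const client = await MongoClient.connect('mongodb://localhost:27017');
  const students = client.db('school').collection('students');

  const batchSize = 1000;
  const results = [];
  for (let i = 0; i < values.length; i += batchSize) {
    const batch = values.slice(i, i + batchSize);
    const docs = await students.find({ CID: { $in: batch } }).toArray();
    results.push(...docs);
  }

  await client.close();
  return results;
}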

Understanding Nodejs documentation

I am probably overthinking this, but I am having trouble digesting the Node.js documentation. I am new to JavaScript and come from a Java background.
My question is not about any specific Node.js function, just overall understanding. Below I give an example of what I am trying to understand...
When working with a statically typed language like Java, it is very clear what types are needed for method calls. As a trivial example, if I want to sort an array of ints, I can just look at Arrays.sort and see that it takes an int[] (the same goes for other types as well). I can also see that it returns void.
public static void sort(int[] a)
However, JavaScript is a dynamic language, so there are no types on the API calls. Take this example from the crypto module:
crypto.pbkdf2(password, salt, iterations, keylen, callback)
Asynchronous PBKDF2 applies pseudorandom function HMAC-SHA1 to derive a key of given length from the given password, salt and iterations.
The callback gets two arguments (err, derivedKey).
So, without going out and finding example code, or looking at the Node.js source, how do I know the argument types of the function? I realize that it is possible to infer the types from the names (i.e. callback is a function type), but is there any other way?
For example, the documentation says that the callback gets two arguments, err and derivedKey. What is the type of derivedKey? What is the type of err? Am I missing something about the documentation? How do you know if you are passing in the right types?
Note: I already know what the types of derivedKey and err are, so I don't need answers like "derivedKey is ...". My question is about the overall understanding of the Node.js documentation for someone coming from a statically typed language, and it is not specific to crypto.pbkdf2.
Well, you are pretty much overthinking it. You'll have to guess most of them if they're not explained explicitly; for example, you can guess that iterations and keylen are numbers rather than strings. The Node.js docs explain parameters explicitly when they think you can't guess, or when you have to know something additional about them, like in crypto.createCredentials(details), where they explain that details is a dictionary and which keys you need to use.
For instance, in the case of err and derivedKey, since there is no explicit info, I would have assumed both are strings. If it turned out they were not, I would console.log them in the callback function to see what they are.
The documentation could be a lot clearer if it listed the types of all parameters, but I don't know if it's worth the effort.
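As an illustration of that console.log approach (a sketch only; note that newer Node.js versions also require a digest argument, whereas the older documentation quoted above uses the five-argument form):

// Inspect the callback arguments to learn their types at runtime.
const crypto = require('crypto');

crypto.pbkdf2('password', 'salt', 10000, 64, 'sha1', (err, derivedKey) => {
  console.log(err);                         // null on success, an Error otherwise
  console.log(Buffer.isBuffer(derivedKey)); // true: derivedKey is a Buffer
  console.log(derivedKey.toString('hex'));  // printable hex form of the key
});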
I have some experience with C# and Java, and have been programming JavaScript for about a year, so I might be able to frame this.
objects
One aspect of JavaScript is that you can make objects like this, on the spot:
var options = {
  name: "something",
  age: 9,
  what: function() {
    return 8;
  }
};
Everyone takes advantage of this, so it's a big key to understanding JavaScript libraries.
You can also take the above options object and then go like this:
options.mood = "ok";
In other words, objects are just bundles of properties, and the structure isn't fixed. You can use language constructs like the for ... in loop to iterate through them. That is to say, the "type" of something like err is basically an associative array.
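For example, iterating the options object defined above (illustrative only):

// Print every own property of the options object.
for (var key in options) {
  if (options.hasOwnProperty(key)) {
    console.log(key + " -> " + options[key]);
  }
}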
callbacks
Callbacks are basically everywhere, so the question becomes how to deal with them. A common pattern is function (err, maybeSomething). Most of the time, you only care if err is "something." That is, you'll go like this a lot:
if (err) {
...
}
Frankly, I do a lot of console.log(err) to see what I'm getting back during development.
After that it's really up to the documentation. Some of it is better than others. You're not really missing anything. About the only "trick" is that sometimes a doc will explain everything at the top.
You'll find yourself going into the source sometimes to find out what exactly a library is doing, but 97% of the time a few quick guesses and checks will get you moving.

Custom JSON.stringify fails to Stringify object as whole, but works when iterated one level deep

Hoping someone can spot the error, because I'm having trouble.
Alright, I built my own JSON.stringify just for custom large objects. It may not be exactly to specification for some edge cases, but it's only meant for stringifying large objects that I'm building myself.
Well, it works, and works well for most objects, but I have an object I'm trying to stringify, and it's failing and printing this before exiting:
node.js:134
throw e; // process.nextTick error, or 'error' event on first tick
^
undefined
Not very helpful. The object is fine, because the regular call to JSON.stringify(object) works fine, and when I iterate over the object with for (var x in obj) if (obj.hasOwnProperty(x)) { myStringify(obj[x]); } that works fine too, but if I call it on the top level of the object, it goes to hell... It doesn't really make sense to me, and the only thing I can think of is that the level of recursion is somehow breaking something...
The Parser : https://gist.github.com/958776 - The stringify function I'm calling
ObjectIterator.js : https://gist.github.com/958777 - Mostly to provide the asynchronous iteration
Edit: So, I iterated over the object one level deep and compared the resulting string to the string from JSON.stringify(sameLevelDeep), and they're equal. Since the output is equal, I'm not sure it's how I'm parsing something; is it possible that the object is so large or the amount of recursion so high?
Edit 2: So, I "fixed" the problem, I guess. Instead of pushing every 25th iteration to the next event loop, I push every fifth. I'm not sure why this would make a difference, but it does... I guess the question is now "Why does that make a difference?"
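For illustration, the kind of deferral being described looks roughly like this (a hypothetical sketch, not the actual code in the gists linked above):

// Stringify the values under obj's keys, yielding to the event loop every
// 5th iteration instead of recursing synchronously the whole way.
function stringifyKeysAsync(obj, keys, index, out, done) {
  if (index >= keys.length) {
    return done(out);
  }
  out.push(JSON.stringify(keys[index]) + ':' + JSON.stringify(obj[keys[index]]));
  if (index % 5 === 0) {
    process.nextTick(function () {
      stringifyKeysAsync(obj, keys, index + 1, out, done);
    });
  } else {
    stringifyKeysAsync(obj, keys, index + 1, out, done);
  }
}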
Okay, well, beyond this being a very specific question helping a very specific person, I would like to take it to a different place that might also remove your problem and maybe help others.
Since you are not specifying why you are going through this process, I will have to break it down and guess -- and provide a solution for each guessed scenario.
1. (Browser) You are trying to use JavaScript to crunch data, and provide the user with a result
Downloading at least several megabytes of raw data ("some of these objects are 5-10 million characters") in a webpage to process and display a result is far from optimal; you should probably be doing this operation on the server side and downloading the pre-calculated result.
Besides, no matter what you are doing, JavaScript does not support threads.
setTimeout(function() { JSON.stringify(data); }, 1); shouldn't be much different from what you are doing.
2. (Browser) You are trying to display the downloaded content
You should attempt to download smaller chunks instead of the whole 10+ million character content, and use the built-in JSON.stringify method.
3. (Non-browser) You are trying to use JavaScript for an application that requires threading
You should consider using a different programming language for this application.
In summary
I think you are climbing the wrong mountain; you can achieve the same thing by walking around it without breaking a sweat. If you want to climb a mountain for kicks, there are mountains out there that need it -- but it's not this one.
Translation: work on the architecture to make the obstacle obsolete instead of trying to solve it. If you want to solve a problem, there are problems that need solving -- but it's not this one.
