Node.js: how to get intermediate hash values

I want to secure the content of a sequential file using a SHA-256 hash value.
As the file has to be extended at runtime, a hash calculation is required upon each append.
What I would like to have is a perpetual hash object that is updated with the appended data only and allows intermediate results to be retrieved.
As the implementation should be done in Node.js, the JavaScript code could look like this:
const fs = require('fs'), crypto = require('crypto')
var hashPerp = crypto.createHash('sha256')
var data = 'record 1'
fs.writeFileSync('mylist.dat', data)
hashPerp.update(data)
var hashInter = CLONE(hashPerp)
console.log(hashInter.digest('base64'))
data = 'record 2'
fs.appendFileSync('mylist.dat', data)
hashPerp.update(data)
hashInter = CLONE(hashPerp)
console.log(hashInter.digest('base64'))
...
I could not find an appropriate CLONE() function for this, and cloning the hash object with its internal buffers may not be the only way to solve the problem.
Any suggestions?
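One note: since Node.js v13.1, the built-in Hash objects expose a copy() method, which is exactly the CLONE() sketched above. A minimal sketch:
const crypto = require('crypto');

// hash.copy() (Node.js >= 13.1) returns a new Hash object with the same
// internal state, so the copy can be digested while the original keeps
// accumulating data.
const hashPerp = crypto.createHash('sha256');
hashPerp.update('record 1');
console.log(hashPerp.copy().digest('base64')); // first intermediate digest
hashPerp.update('record 2');
console.log(hashPerp.copy().digest('base64')); // second intermediate digest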

I was overcomplicating this: in my case (fingerprinting the content of a file) a SHA (Secure Hash Algorithm) is actually NOT required; a simple hash function can do the job. So CRC32 will be used, which allows bytes to be added consecutively.
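For instance (a minimal sketch, assuming the npm crc-32 package, whose functions take the previous checksum as an optional seed so the computation can be continued incrementally):
const CRC32 = require('crc-32');

// crc-32 returns a signed 32-bit integer; '>>> 0' converts it to unsigned.
let crc = CRC32.str('record 1');        // checksum after the first record
console.log((crc >>> 0).toString(16));  // intermediate value
crc = CRC32.str('record 2', crc);       // continue from the previous state
console.log((crc >>> 0).toString(16));  // updated value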

Related

Iterate over cells in a CSV file in Node.js

I have a CSV file: "myCSV.csv" with two columns: "first" and "second".
All the data inside is just numbers. So the file looks like this:
first, second
138901801, 849043027
389023890, 382903205
749029820, 317891093
...
I would like to iterate over these numbers and perform some custom parsing on them, then store results in an array.
How can I achieve a behavior like the following?
const parsedData = [];
for (const row of file) {
    parsedData.push(row[0].toString() + row[1].toString());
}
If you're working with a file the user has selected in the browser, you'd make a FileReader in response to the user's action. (See FileReader - MDN.)
But it sounds like you already have the file on your server, in which case you'd use Node's built-in File System module. (See File System - NodeJS.)
If you just want the module's readFile function, you'd require it in your file like:
const {readFile} = require("fs");
And you'd use it to process a text file like:
readFile("myFile.txt", "utf8", (error, textContent) => {
if(error){ throw error; }
const parsedData = [];
for(let row of textContent.split("\n")){
const rowItems = row.split(",");
parsedData.push(rowItems[0].toString() + rowItems[1].toString());
}
}
(See Node.js - Eloquent JavaScript).
However, if you want to handle your CSV as binary data, omit the encoding argument and readFile will hand your callback a Buffer of raw bytes instead of a string. You can then decode it yourself with:
const textContent = buffer.toString("utf8");
...with the textContent parameter in the arrow function replaced by a buffer parameter to receive the binary data.
(String.fromCharCode over a Uint8Array can work for single-byte encodings, but Buffer's toString handles UTF-8 correctly.)
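Tying it together, a minimal sketch using myCSV.csv from the question:
const { readFile } = require("fs");

// No encoding argument, so the callback receives a Buffer of raw bytes.
readFile("myCSV.csv", (error, buffer) => {
    if (error) { throw error; }
    const textContent = buffer.toString("utf8");
    const parsedData = [];
    // slice(1) skips the "first, second" header row
    for (const row of textContent.trim().split("\n").slice(1)) {
        const [first, second] = row.split(",").map(item => item.trim());
        parsedData.push(first + second);
    }
    console.log(parsedData); // e.g. ["138901801849043027", ...]
});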
You might also find these resources helpful:
JS CSV Tutorial - SeegateSite
JS read-text demo - GeeksForGeeks

How can I convert an SQL primary key in JSON to a JavaScript object key with the other data as its value

Whenever I read all my data from a database table and receive it as JSON from my API, I get my data like this (unique_name being the primary key in this example):
[{"unique_name":"alice", "age":18, "city":"kansas"},{"unique_name":"bob", "age":20, "city":"chicago"}]
In my javascript application, however, I need my data to be formatted like so:
const myData = {alice: {age:18, city:"kansas"}, bob: {age:20, city:"chicago"}}
I guess this could be done with object mapping of some sort, but I'm afraid this would be too slow with a lot of entries. Is there any clean way to do this?
Mapping a lot of entries is not really a problem.
If you want more performance, you should do it server-side (e.g. on a Node.js server).
You can simply use a .forEach().
Example (not complete, just so you get the idea):
const myData = {};
mySQL.forEach(entry => myData[entry.unique_name] = entry)
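For completeness, a minimal sketch that also drops the key from each value, producing exactly the desired shape (mySQL being the parsed JSON array, as above):
// Destructure unique_name out of each row; collect the rest as the value.
const myData = Object.fromEntries(
    mySQL.map(({ unique_name, ...rest }) => [unique_name, rest])
);
// => { alice: { age: 18, city: "kansas" }, bob: { age: 20, city: "chicago" } }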

NodeJS md5 'bytestring' like PHP md5(str, true)

I've run into the following issue: I'm trying to convert some string str to an MD5 byte-string hash. In PHP you can use md5(str, true), but in JS (Node.js/Express) I can't find a way to get the same result. I've included the npm module js-md5, but its arrayBuffer method returns a result that differs from PHP's md5(str, true).
Could somebody help me, please?
Thanks.
var md5 = require('md5');
console.log(md5('text'))
Use Node's built-in crypto module and do something like this:
// Requires
var crypto = require('crypto');

// Constructor
function Crypto() {
    this.hash = null;
}

// Hash method: returns the hex MD5 digest of data
Crypto.prototype.encode = function (data) {
    this.hash = crypto.createHash('md5').update(data);
    return this.hash.digest('hex');
};

// Comparison method: true if the two digests match (case-insensitive)
Crypto.prototype.equals = function (data, model) {
    return data.toUpperCase() === String(model).toUpperCase();
};

// Exports
module.exports = Crypto;
Then instantiate this "tool" object in your code and use its methods, e.g.:
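(A hypothetical usage sketch, assuming the class above is saved as crypto-tool.js:)
var Crypto = require('./crypto-tool');
var tool = new Crypto();
var digest = tool.encode('text'); // hex MD5 of 'text'
console.log(tool.equals(digest, tool.encode('text'))); // true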
Easy as pie, and the same thing can be done with other algorithms, like SHA-256 (or ciphers such as AES).
As for the raw_output option (a 16-byte binary digest), you can easily convert the returned hex string to binary with a simple function; see this SO post to learn how.
Have fun.
Short answer:
const crypto = require('crypto');
const buffer = crypto.createHash('md5').update(str).digest();
Long answer: you need to use Node.js's built-in crypto module (no need for an external dependency here), which contains utility functions and classes. It can create hashes (for instance MD5 or SHA-1 hashes) for you using synchronous or asynchronous methods. The utility function crypto.createHash(algorithm) creates a hash with minimal coding. As the docs specify:
The algorithm is dependent on the available algorithms supported by the version of OpenSSL on the platform. Examples are 'sha256', 'sha512', etc. On recent releases of OpenSSL, openssl list-message-digest-algorithms will display the available digest algorithms.
Now, this createHash function returns a Hash object, which can be used with a stream (you can feed it a file, HTTP request, etc.) or a string, as you asked. If you want to use a string, use hash.update(string) to hash it. This method returns the hash itself, so you can chain it with .digest(encoding) to generate a string (if encoding is set) or a Buffer (if it’s not). Since you asked for bytes, I believe a Buffer is what you want (Buffers are Uint8Array instances).
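To make the PHP equivalence concrete, a minimal sketch ('text' is just a sample input):
const crypto = require('crypto');

// PHP: md5($str, true) returns the 16 raw digest bytes.
// Node: digest() with no encoding returns those bytes as a Buffer.
const raw = crypto.createHash('md5').update('text').digest();
console.log(raw.length); // 16

// If a binary *string* (PHP-style) is really needed, the 'latin1'
// encoding maps each byte to one character:
const rawString = crypto.createHash('md5').update('text').digest('latin1');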

Transforming JSON in a node stream with a map or template

I'm relatively new to JavaScript and Node, and I like to learn by doing, but my lack of awareness of JavaScript design patterns makes me wary of trying to reinvent the wheel. I'd like to know from the community whether what I want to do already exists in some form or another. I'm not looking for specific code for the example below, just a nudge in the right direction and what I should be searching for.
I basically want to create my own private IFTTT/Zapier for plugging data from one API to another.
I'm using the node module request to GET data from one API and then POST to another.
request supports streaming to do neat things like this:
request.get('http://example.com/api')
.pipe(request.put('http://example.com/api2'));
In between those two requests, I'd like to pipe the JSON through a transform, cherry picking the key/value pairs that I need and changing the keys to what the destination API is expecting.
request.get('http://example.com/api')
.pipe(apiToApi2Map)
.pipe(request.put('http://example.com/api2'));
Here's a JSON sample from the source API: http://pastebin.com/iKYTJCYk
And this is what I'd like to send forward: http://pastebin.com/133RhSJT
The transformed JSON in this case takes its keys from the value of each object's "attribute" key and its values from each object's "value" key.
So my questions:
Is there a framework, library or module that will make the transform step easier?
Is streaming the way I should be approaching this? It seems like an elegant way to do it. I've created some JavaScript wrapper functions with request to easily access API methods; I just need to figure out the middle step.
Would it be possible to create "templates" or "maps" for these transforms? Say I want to change the source or destination API, it would be nice to create a new file that maps the source to destination key/values required.
Hope the community can help and I'm open to any and all suggestions! :)
This is an Open Source project I'm working on, so if anyone would like to get involved, just get in touch.
Yes, you're definitely on the right track. There are two stream libs I would point you towards: through, which makes it easier to define your own streams, and JSONStream, which helps to convert a binary stream (like what you get from request.get) into a stream of parsed JSON documents. Here's an example using both of those to get you started:
var through = require('through');
var request = require('request');
var JSONStream = require('JSONStream');
var _ = require('underscore');

// Our function(doc) here will get called to handle each
// incoming document in the attributes array of the JSON stream
var transformer = through(function(doc) {
    var steps = _.findWhere(doc.items, {
        label: "Steps"
    });
    var activeMinutes = _.findWhere(doc.items, {
        label: "Active minutes"
    });
    var stepsGoal = _.findWhere(doc.items, {
        label: "Steps goal"
    });
    // Push the transformed document into the outgoing stream
    this.queue({
        steps: steps.value,
        activeMinutes: activeMinutes.value,
        stepsGoal: stepsGoal.value
    });
});

request
    .get('http://example.com/api')
    // The attributes.* here will split the JSON stream into chunks
    // where each chunk is an element of the array
    .pipe(JSONStream.parse('attributes.*'))
    .pipe(transformer)
    .pipe(request.put('http://example.com/api2'));
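As for the third question (templates/maps): the hard-coded labels above could be pulled out into a plain mapping object, so that switching the source or destination API only means writing a new map. A minimal sketch along those lines (assuming the same document shape as above):
// A map from source label to destination key; one such object per API
// pairing, which could live in its own file and be require()'d.
var apiToApi2Map = {
    "Steps": "steps",
    "Active minutes": "activeMinutes",
    "Steps goal": "stepsGoal"
};

// Builds the outgoing document from one incoming doc using the map.
function applyMap(doc, map) {
    var out = {};
    doc.items.forEach(function(item) {
        if (map[item.label]) {
            out[map[item.label]] = item.value;
        }
    });
    return out;
}
The transformer's callback above then reduces to this.queue(applyMap(doc, apiToApi2Map)).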
As Andrew pointed out, there's through or event-stream; however, I made something even easier to use: scramjet. It works the same way as through, but its API is nearly identical to Arrays', so you can use the map and filter methods easily.
The code for your example would be:
const { DataStream } = require('scramjet');
const request = require('request');
const JSONStream = require('JSONStream');

DataStream
    .pipeline(
        request.get('http://example.com/api'),
        JSONStream.parse('attributes.items.*')
    )
    .filter((item) => item.attribute) // filter out items without an attribute
    .reduce((acc, item) => {
        acc[item.attribute] = item.value;
        return acc;
    }, {})
    .then((result) => request.put('http://example.com/api2', result));
I guess this is a little easier to use. However, in this example you do accumulate the data into an object, so if the JSON documents are actually much longer than this, you may want to turn the result back into a JSONStream again.

Parameter retrieval for HTTP PUT requests under IIS5.1 and ASP-classic?

I'm trying to implement a REST interface under IIS5.1/ASP-classic (XP-Pro development box). So far, I cannot find the incantation required to retrieve request content variables under the PUT HTTP method.
With a request like:
PUT http://localhost/rest/default.asp?/record/1336
Department=Sales&Name=Jonathan%20Doe%203548
how do I read Department and Name values into my ASP code?
Request.Form appears to only support POST requests. Request.ServerVariables only gets me to header information. Request.QueryString doesn't get me to the content either...
Based on the replies from AnthonyWJones and ars I went down the BinaryRead path and came up with the first attempt below:
var byteCount = Request.TotalBytes;
var binContent = Request.BinaryRead(byteCount);

// Use an ADO recordset field to convert the byte array to a string
// (201 is adLongVarChar):
var rst = Server.CreateObject('ADODB.Recordset');
rst.Fields.Append('myBinary', 201, byteCount);
rst.Open();
rst.AddNew();
rst('myBinary').AppendChunk(binContent);
rst.Update();
var contentString = rst('myBinary').Value;

// Split the URL-encoded body into name/value pairs:
var parameters = {};
var pairs = HtmlDecode(contentString).split(/&/);
for (var pair in pairs) {
    var param = pairs[pair].split(/=/);
    // decodeURIComponent (rather than decodeURI) also decodes characters
    // like '&' and '=' inside values; '+' must be mapped to a space by hand
    parameters[param[0]] = decodeURIComponent(param[1].replace(/\+/g, ' '));
}
This blog post by David Wang, and an HtmlDecode() function taken from Andy Oakley at blogs.msdn.com, also helped a lot.
Doing this splitting and escaping by hand, I'm sure there are 1001 bugs in here, but at least I'm moving again. Thanks.
Unfortunately, ASP predates the REST concept by quite a few years.
If you are going RESTful, I would consider not using URL-encoded form data. Use XML instead. You will be able to accept an XML entity body with:
Dim xml : Set xml = CreateObject("MSXML2.DOMDocument.3.0")
xml.async = false
xml.Load Request
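Once loaded, values can be read from the DOM; a sketch in JScript (the element names /record/Department and /record/Name are hypothetical, matching whatever the client actually sends):
// JScript equivalent of the load above, plus reading two values
var xml = Server.CreateObject('MSXML2.DOMDocument.3.0');
xml.async = false;
xml.load(Request);
var department = xml.selectSingleNode('/record/Department').text;
var name = xml.selectSingleNode('/record/Name').text;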
Otherwise you will need to use BinaryRead on the Request object, laboriously convert the byte array to text, and then parse the URL encoding yourself, decoding the escape sequences along the way.
Try using the BinaryRead method in the Request object:
http://www.w3schools.com/ASP/met_binaryread.asp
Other options are to write an ASP server component or ISAPI filter:
http://www.codeproject.com/KB/asp/cookie.aspx
