Can I fetch a Readable Stream then convert to JSON client side? - javascript

I'm hoping to use a Google Sheets CSV as a data source. I'd like to fetch the CSV data on the client, then convert it into JSON.
I'm able to fetch the CSV, which returns a ReadableStream. However, I'm not sure how to correctly read that response and convert it into JSON.
I've found an npm package that should help convert the CSV data to JSON, but I'm having a bit of a hard time working with the stream.
Example: https://jsfiddle.net/21jeq1h5/3/
Can anyone point me in the right direction to use the ReadableStream?

Since CSV is simply text, the solution is to use the response.text() method of the fetch() API.
https://developer.mozilla.org/en-US/docs/Web/API/Body/text
Once the text is on board, it is just a matter of parsing the CSV out of the file. If you want objects as output, the headers must be included in the CSV (which yours are).
I've included the code snippet below. It won't run on SO because SO sets the origin to null on AJAX requests. So I've also included a link to a working codepen solution.
fetch('https://docs.google.com/spreadsheets/d/e/KEY&single=true&output=csv')
  .then(response => response.text())
  .then(transform);

function transform(str) {
  // Split into rows, then split each row on commas
  let data = str.split('\n').map(row => row.split(','));
  // The first row holds the column headers
  let headers = data.shift();
  // Build one object per row, keyed by header name
  let output = data.map(row => {
    let obj = {};
    headers.forEach((header, i) => obj[header] = row[i]);
    return obj;
  });
  console.log(output);
}
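For example, given a small CSV string (a made-up input, only to illustrate the shape of the result), transform() logs an array of plain objects keyed by the header row:
// Hypothetical input, just to show the output shape
const csv = 'Name,Score\nAlice,10\nBob,7';
transform(csv);
// logs: [ { Name: 'Alice', Score: '10' }, { Name: 'Bob', Score: '7' } ]
Note that every value comes back as a string, so convert numbers yourself if you need them.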
Pen
https://codepen.io/randycasburn/pen/xjzzvW?editors=0012
Edit
I should add that if you truly want this in a JSON string (per your question), you can run
json = JSON.stringify(output);

Related

How to append an item to a JSON stored in a file?

I'm trying to write a JSON object to a DB with fs, but it's not working as expected. What I want to do is write the JSON data inside this: [] Example:
[
{"name":"jhon"},
{"name":"jhonathan"}
]
Here is the code:
Thanks.
The comment you provided doesn't make much sense, because the JSON data you provided in the post and in the comments is completely different. But I get the gist of it. I guess you have a JSON file containing an array and you want to push new items to it. Let's do this.
The thing is, when you call fs.appendFile, you're only writing to the end of the file. You're not following JSON format by doing so.
You need to do this:
Read file content.
Parse JSON text into an object.
Update the object in memory.
Write object in JSON format back to the file system.
I'll call the synchronous methods for simplicity's sake, but you should be able to convert it to async quite easily.
const fs = require('fs')
// Note the leading slash -- __dirname does not end with a path separator
const filePath = __dirname + '/db.json'
// Reading items from the file system
const jsonData = fs.readFileSync(filePath, 'utf8')
const items = JSON.parse(jsonData)
// Add new item to the item list
items.push(newItem)
// Writing the whole array back to the file system as JSON
const newJsonString = JSON.stringify(items)
fs.writeFileSync(filePath, newJsonString)
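If you do want the asynchronous variant mentioned above, a minimal sketch with fs.promises (available in Node 10+; the file name and newItem are the same assumptions as above) could look like this:
const fs = require('fs').promises

async function addItem(newItem) {
  const filePath = __dirname + '/db.json'
  // Read and parse the current array
  const jsonData = await fs.readFile(filePath, 'utf8')
  const items = JSON.parse(jsonData)
  // Update the array in memory, then write it back as JSON
  items.push(newItem)
  await fs.writeFile(filePath, JSON.stringify(items))
}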

Reading changes in JSON file using JS

I'm trying to read a continuously updated JSON file produced by syslog-ng. Currently syslog-ng, a logging tool, is set to continually append logs of the data I want to a JSON file. I display the data on my cyber attack map for only 30 seconds, until it's not needed anymore. I can read the file and parse what I need, but is there a way to read and parse, over time, only the most recent additions to the file?
Sample code:
//Assume JSON output = {attack source, attack destination, attack type}
//Required modules
var JSONStream = require('JSONStream');
var fs = require('fs');
//Creates readable stream for JSON file parsing
var stream = fs.createReadStream('output.json', 'utf8'),
    parser = JSONStream.parse(['source', 'dest', 'type']);
//Send read data to parser function
stream.pipe(parser);
//Intake data from parser function
parser.on('data', function (obj) {
  //Do something with the object
  console.log(obj);
});
I'm using JSONStream to avoid having to read the whole log file into memory; JSONStream should still be able to parse the bits I want. But is there a method to read only the changes after the original reading is complete?
Use the code example provided in the library:
JSONStream test code
You don't have to wait for the end of the stream; you can use the 'data' callback to do your work object by object.
But the file structure should suit what the library expects, like the example files given in its test folder.
Example file: all_npm.json
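For reference, a minimal sketch in the style of the library's tests (this assumes a top-level structure like all_npm.json, i.e. an object with a rows array whose entries carry a doc property):
var fs = require('fs');
var JSONStream = require('JSONStream');

fs.createReadStream('all_npm.json', 'utf8')
  .pipe(JSONStream.parse('rows.*.doc'))  // emits each doc as soon as it is parsed
  .on('data', function (doc) {
    // work object by object -- no need to wait for 'end'
    console.log(doc.name);
  });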

Reading the headers of a CSV file using angularjs

My CSV file has the headers DateTime, Samplevalue and Hostname.
I want to read these headers, and if, for instance, the CSV file has DateTime as one of its headers, I want to print "this is suitable for a time-series database".
How can I achieve this using AngularJS? How do I code this?
For instance, my CSV looks like
Hostname,Samplevalue,DateTime
Host1,1,2018-05-04 31:21:11
Host2,1,2018-05-05 21:15:10
Host3,1,2018-05-04 11:11:13
Host4,1,2018-05-06 41:21:15
I just need to read any one of the headers and print something like:
console.log('DateTime header is suitable for this database')
Thanks in advance.
You can use $http, then 1) split the returned string by line breaks and 2) split the first line by commas:
$http.get('path/to/csv/').then(function(csv) {
  // The first line of the response holds the headers
  var headers = csv.data.split('\n')[0].split(',')
  headers.forEach(function(header, index) {
    console.log('header #' + index, header)
  })
})
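If the goal is specifically to react to a DateTime header, a small extension of the same idea (the header name is taken from the sample CSV above):
$http.get('path/to/csv/').then(function(csv) {
  var headers = csv.data.split('\n')[0].split(',')
  // Check for the header we care about before printing
  if (headers.indexOf('DateTime') !== -1) {
    console.log('DateTime header found - suitable for a time-series database')
  }
})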
Do you mean you wish to read the CSV data?
If so, the best way to deal with CSV files in JavaScript is to convert them into JSON data.
Check out https://github.com/evanplaice/jquery-csv, it's a simple library where you can pass your CSV data and get JSON in return. Hope it helps!
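For instance, a minimal sketch with jquery-csv (this assumes jQuery and the jquery.csv plugin are loaded and uses the library's toObjects/toArrays helpers; the CSV string is the sample from the question):
var csvString = 'Hostname,Samplevalue,DateTime\nHost1,1,2018-05-04 31:21:11';
var rows = $.csv.toObjects(csvString);      // [{ Hostname: 'Host1', Samplevalue: '1', DateTime: '2018-05-04 31:21:11' }]
var headers = $.csv.toArrays(csvString)[0]; // ['Hostname', 'Samplevalue', 'DateTime']
console.log(headers);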

Extracting gzip data in Javascript with Pako - encoding issues

I am trying to run what I expect is a very common use case:
I need to download a gzip file (of complex JSON datasets) from Amazon S3, and decompress(gunzip) it in Javascript. I have everything working correctly except the final 'inflate' step.
I am using Amazon Gateway, and have confirmed that the Gateway is properly transferring the compressed file (used Curl and 7-zip to verify the resulting data is coming out of the API). Unfortunately, when I try to inflate the data in Javascript with Pako, I am getting errors.
Here is my code (note: response.data is the binary data transferred from AWS):
apigClient.dataGet(params, {}, {})
  .then( (response) => {
    console.log(response); // shows response including header and data
    const result = pako.inflate(new Uint8Array(response.data), { to: 'string' });
    // ERROR HERE: 'buffer error'
  }).catch( (itemGetError) => {
    console.log(itemGetError);
  });
I also tried a version that splits the binary data into an array by adding the following before the inflate:
const charData = response.data.split('').map(function(x){return x.charCodeAt(0); });
const binData = new Uint8Array(charData);
const result = pako.inflate(binData, { to: 'string' });
//ERROR: incorrect header check
I suspect I have some sort of issue with the encoding of the data and I am not getting it into the proper format for Uint8Array to be meaningful.
Can anyone point me in the right direction to get this working?
For clarity:
As the code above is listed, I get a buffer error. If I drop the Uint8Array, and just try to process 'result.data' I get the error: 'incorrect header check', which is what makes me suspect that it is the encoding/format of my data which is the issue.
The original file was compressed in Java using GZIPOutputStream with UTF-8 and then stored as a static file (i.e. randomname.gz).
The file is transferred through the AWS Gateway as binary, so it is exactly the same coming out as the original file, so 'curl --output filename.gz {URLtoS3Gateway}' === downloaded file from S3.
I had the same basic issue when I used the gateway to encode the binary data as 'base64', but did not try a whole lot around that effort, as it seems easier to work with the "real" binary data than to add the base64 encode/decode in the middle. If that is a needed step, I can add it back in.
I have also tried some of the example processing found halfway through this issue: https://github.com/nodeca/pako/issues/15, but that didn't help (I might be misunderstanding the binary format v. array v base64).
I was able to figure out my own problem. It was related to the format of the data being read in by Javascript (either Javascript itself or the Angular HttpClient implementation). I was reading in a "binary" format, but it was not the same as that recognized/used by pako. When I read the data in as base64, and then converted to binary with 'atob', I was able to get it working. Here is what I actually have implemented (starting at fetching from the S3 file storage).
1) Build AWS API Gateway that will read a previously stored *.gz file from S3.
Create a standard "get" API request to S3 that supports binary.
(http://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-payload-encodings-configure-with-console.html)
Make sure the Gateway will recognize the input type by setting 'Binary types' (application/gzip worked for me, but others like application/octet-stream and image/png should work for other types of files besides *.gz). NOTE: that setting is under the main API selections list on the left of the API config screen.
Set the 'Content Handling' to "Convert to text(if needed)" by selecting the API Method/{GET} -> Integration Request Box and updating the 'Content Handling' item. (NOTE: the example in the link above recommends "passthrough". DON'T use that as it will pass the unreadable binary format.) This is the step that actually converts from binary to base64.
At this point you should be able to download a base64 version of your binary file via the URL (test in browser or with Curl).
2) I then had the API Gateway generate the SDK and used the respective apiGClient.{get} call.
3) Within the call, translate the base64->binary->Uint8 and then decompress/inflate it. My code for that:
apigClient.myDataGet(params, {}, {})
  .then( (response) => {
    // HttpClient result is in response.data
    // convert the incoming base64 -> binary string
    const strData = atob(response.data);
    // split it into an array of character codes rather than a "string"
    const charData = strData.split('').map(function(x){ return x.charCodeAt(0); });
    // convert to a typed array
    const binData = new Uint8Array(charData);
    // inflate
    const result = pako.inflate(binData, { to: 'string' });
    console.log(result);
  }).catch( (itemGetError) => {
    console.log(itemGetError);
  });
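As a side note, the atob/split/map steps above can be collapsed into a single conversion with the same result; this is only a stylistic alternative:
// Equivalent one-liner for the base64 -> Uint8Array conversion
const binData = Uint8Array.from(atob(response.data), c => c.charCodeAt(0));
const result = pako.inflate(binData, { to: 'string' });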

load json data to dataset in d3js

I have a JSON file like this. It contains some data.
[{\"Frequency\":\"Building 1\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":46,\"Value1\":26,\"Value2\":22},{\"Name\":\"food\",\"Value\":32,\"Value1\":26,\"Value2\":12}]},{\"Frequency\":\"Building 2\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":48,\"Value1\":26,\"Value2\":23},{\"Name\":\"food\",\"Value\":34,\"Value1\":33,\"Value2\":12}]},{\"Frequency\":\"Building 3\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":57,\"Value1\":22,\"Value2\":24},{\"Name\":\"food\",\"Value\":42,\"Value1\":16,\"Value2\":11}]},{\"Frequency\":\"Building 4\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":59,\"Value1\":26,\"Value2\":33},{\"Name\":\"food\",\"Value\":44,\"Value1\":26,\"Value2\":35}]},{\"Frequency\":\"Building 5\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":62,\"Value1\":26,\"Value2\":11},{\"Name\":\"food\",\"Value\":48,\"Value1\":26,\"Value2\":3}]},{\"Frequency\":\"Building 6\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":62,\"Value1\":26,\"Value2\":21},{\"Name\":\"food\",\"Value\":47,\"Value1\":26,\"Value2\":24}]},{\"Frequency\":\"Building 7\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":58,\"Value1\":26,\"Value2\":22},{\"Name\":\"food\",\"Value\":43,\"Value1\":26,\"Value2\":22}]},{\"Frequency\":\"Building 8\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":48,\"Value1\":26,\"Value2\":2},{\"Name\":\"food\",\"Value\":34,\"Value1\":26,\"Value2\":33}]}]
I want to store this JSON file in a dataset in d3.js.
Right now all the data is static in my code. I want to load this data from the JSON file into d3.js. Can anyone give me an example?
My expected result is:
dataset = JSON.parse("[{\"Frequency\":\"Building 1\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":46,\"Value1\":26,\"Value2\":22},{\"Name\":\"food\",\"Value\":32,\"Value1\":26,\"Value2\":12}]},{\"Frequency\":\"Building 2\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":48,\"Value1\":26,\"Value2\":23},{\"Name\":\"food\",\"Value\":34,\"Value1\":33,\"Value2\":12}]},{\"Frequency\":\"Building 3\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":57,\"Value1\":22,\"Value2\":24},{\"Name\":\"food\",\"Value\":42,\"Value1\":16,\"Value2\":11}]},{\"Frequency\":\"Building 4\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":59,\"Value1\":26,\"Value2\":33},{\"Name\":\"food\",\"Value\":44,\"Value1\":26,\"Value2\":35}]},{\"Frequency\":\"Building 5\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":62,\"Value1\":26,\"Value2\":11},{\"Name\":\"food\",\"Value\":48,\"Value1\":26,\"Value2\":3}]},{\"Frequency\":\"Building 6\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":62,\"Value1\":26,\"Value2\":21},{\"Name\":\"food\",\"Value\":47,\"Value1\":26,\"Value2\":24}]},{\"Frequency\":\"Building 7\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":58,\"Value1\":26,\"Value2\":22},{\"Name\":\"food\",\"Value\":43,\"Value1\":26,\"Value2\":22}]},{\"Frequency\":\"Building 8\",\"Data\":[{\"Name\":\"Medicine\",\"Value\":48,\"Value1\":26,\"Value2\":2},{\"Name\":\"food\",\"Value\":34,\"Value1\":26,\"Value2\":33}]}]");
My data should end up inside the brackets, but I don't know how to do this. Can anyone tell me how?
I tried this before, but it is not working:
d3.json("D3json.json", function(error, data) {
var datas = data;
})
here is my jsfiddle example: Click here to see the example
Thanks
Vinoth S
If I understood your question correctly, you are trying to dynamically load JSON data in contrast to having it hard-coded in your file.
Here is an example how to do it: http://bl.ocks.org/Jverma/887877fc5c2c2d99be10
In general, you have to execute the drawing part after the data has loaded successfully, i.e. within the callback function of d3.json().
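A minimal sketch in the v3/v4 callback style used in the question (the field names come from the posted data; the actual drawing code is omitted):
d3.json("D3json.json", function(error, data) {
  if (error) throw error;
  var dataset = data;                 // the parsed array from the file
  console.log(dataset[0].Frequency);  // e.g. "Building 1"
  // ... do all the drawing here, inside the callback ...
});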
