Trying to parse JSON - javascript

I'm a complete beginner at Node.js and am trying to figure out how to send the contents of a JSON object using REST. My code only results in an error saying "SyntaxError: Unexpected token  in JSON at position 0". I have used an online validator to confirm that the JSON is correct. What is the issue?
// GET courses
const fs = require('fs');
app.get('/api/courses', function(req, res) {
var rawdata = fs.readFileSync('miun-db.json');
var data = JSON.parse(rawdata);
res.send(data.courses);
});

The data inside of your file is not formatted properly. Verify that miun-db.json is properly formatted before attempting to parse.
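One quick way to check this from Node itself (rather than an online validator) is to wrap the parse in a try/catch and look at the reported position; a minimal sketch:
const fs = require('fs');
try {
  const data = JSON.parse(fs.readFileSync('miun-db.json', 'utf8'));
  console.log('Parsed OK, top-level keys:', Object.keys(data));
} catch (err) {
  // "position 0" in the message usually points at an invisible character,
  // such as a byte order mark, at the very start of the file.
  console.error('Invalid JSON:', err.message);
}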

The issue is that your file is not properly formatted; see Working with JSON.
Also, you can just import the JSON file and use it without fs.
import miunDb from './miun-db.json';
app.get('/api/courses', function(req, res) {
res.send(miunDb.courses);
});
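Note that the bare import statement above assumes an ES-module or bundler setup that allows JSON imports; in a plain CommonJS Node app the equivalent would be:
const miunDb = require('./miun-db.json');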

According to the fs.readFileSync docs you can pass options to the function.
fs.readFileSync(path[, options])
If the encoding option is specified then this function returns a string. Otherwise it returns a buffer.
Since you don't pass any options you get a buffer and not the string content of the file.
Either you pass encoding to the function, e.g.
var rawdata = fs.readFileSync('miun-db.json', {encoding:'utf8'});
var data = JSON.parse(rawdata);
or you convert the buffer to a string, e.g.
var rawdata = fs.readFileSync('miun-db.json');
var data = JSON.parse(rawdata.toString('utf8'));
Edit:
Your error message says you have an invalid character which is non printable. Seems like your file starts with a BOM (zero width no-break space (ZWNBSP)).
What happens if you try to remove it?
function stripBOM(content) {
content = content.toString()
// Remove byte order marker. This catches EF BB BF (the UTF-8 BOM)
// because the buffer-to-string conversion in `fs.readFileSync()`
// translates it to FEFF, the UTF-16 BOM.
if (content.charCodeAt(0) === 0xFEFF) {
content = content.slice(1)
}
return content
}
// [...]
var rawdata = fs.readFileSync('miun-db.json', {encoding:'utf8'});
var data = JSON.parse(stripBOM(rawdata));
// [...]
Credit goes to this gist.

Related

Swift 5 - How to convert LONGBLOB/Buffer into Data

I am currently working on a project for school.
I have written an API using Express connected to a MySQL database, and now I am writing the iOS app.
My problem is that I need to save profile pictures. So I saved the PNG data of the picture into a LONGBLOB column in the database, and I want to recreate the image as a UIImage.
To do that I am trying to convert the buffer into Data.
So, the API is returning a buffer created that way:
let buffer = Buffer.from(ppData.data, 'binary').toString('base64');
And on the iOS side I tried:
guard let data = dict["data"] as? Data else {return nil}
Where dict["data"] is the buffer returned by the API.
But it always falls into the "else" branch.
What am I doing wrong?
Edit:
As suggested in the comments, I decoded the Base64-encoded string. The data now decodes, but creating a UIImage from it fails without any details. What I tried is:
let image = UIImage(from: base64DecodedData)
For example:
guard let strData = dict["data"] as? String else {
return nil
}
guard let data = Data(base64Encoded: strData, options: .ignoreUnknownCharacters) else {
return nil
}
guard let picture = UIImage(data: data) else {
return nil
}
Thanks.
The mistake was not in the Swift code but in my API and database structure. After reading some MySQL and Node.js documentation, I switched from LONGBLOB (which is totally oversized) to MEDIUMTEXT.
Also, in the API I was creating a buffer from the data as if it were binary when it was already a base64-encoded string, so I removed this line:
let buffer = Buffer.from(ppData.data, 'binary').toString('base64');
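With that change, the route can return the stored base64 string as-is. A rough sketch of what the corrected Express side might look like (the mysql2 client, route path, and table/column names are assumptions for illustration, not the poster's actual code):
const express = require('express');
const mysql = require('mysql2/promise');

const app = express();
const pool = mysql.createPool({ host: 'localhost', user: 'root', database: 'school_project' });

// The users.picture MEDIUMTEXT column already holds a base64-encoded PNG,
// so it can be sent back directly, with no Buffer conversion on the way out.
app.get('/api/users/:id/picture', async (req, res) => {
  const [rows] = await pool.query('SELECT picture FROM users WHERE id = ?', [req.params.id]);
  if (!rows.length) return res.sendStatus(404);
  res.json({ data: rows[0].picture });
});
The iOS side can then decode it with Data(base64Encoded:) exactly as in the snippet above.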

Iterate over cells in a CSV file in Node.js

I have a CSV file: "myCSV.csv" with two columns: "first" and "second".
All the data inside is just numbers. So the file looks like this:
first, second
138901801, 849043027
389023890, 382903205
749029820, 317891093
...
I would like to iterate over these numbers and perform some custom parsing on them, then store results in an array.
How can I achieve a behavior like the following?
const parsedData = [];
for (const row of file) {
parsedData.push(row[0].toString() + row[1].toString());
}
If you're working with a file the user has selected in the browser, you'd make a FileReader in response to the user's action. (See FileReader - MDN.)
But it sounds like you already have the file on your server, in which case you'd use Node's built-in File System module. (See File System - NodeJS.)
If you just want the module's readFile function, you'd require it in your file like:
const {readFile} = require("fs");
And you'd use it to process a text file like:
readFile("myFile.txt", "utf8", (error, textContent) => {
if(error){ throw error; }
const parsedData = [];
for(let row of textContent.split("\n")){
const rowItems = row.split(",");
parsedData.push(rowItems[0].toString() + rowItems[1].toString());
}
}
(See Node.js - Eloquent JavaScript).
However, if you read the CSV as raw binary data (by omitting the "utf8" argument), the callback receives a Buffer rather than a string, and you'd need a conversion like this at the top of the callback:
const textContent = String.fromCharCode.apply(null, new Uint16Array(buffer));
...with the textContent parameter in the arrow function renamed to buffer to reflect what it now holds.
(If Uint16Array is the wrong size, it might be Uint8Array instead. See Buffer to String - Google.)
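In Node specifically, the Buffer you get back can do that decoding itself; a simpler sketch, assuming the file is UTF-8 encoded:
readFile("myFile.txt", (error, buffer) => {
  if (error) { throw error; }
  // With no encoding argument, readFile hands the callback a Buffer;
  // toString("utf8") turns it into the same text as before.
  const textContent = buffer.toString("utf8");
  // ...then split into rows and cells exactly as above...
});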
You might also find these resources helpful:
JS CSV Tutorial - SeegateSite
JS read-text demo - GeeksForGeeks

SyntaxError: Unexpected token U in JSON when reading a file using fs

I am using Node.js v12.14.1 and I am facing an issue while parsing a JSON file containing \U0001f970.
Here's the content of the file that I am trying to read and parse:
{"randomKey":{"random value \U0001f970\U0001f970":1}}
And here's the program that I wrote to read it:
var fs = require('fs');
var data = JSON.parse(fs.readFileSync('sample.json', 'utf8'));
I am getting the following error when executing the program: SyntaxError: Unexpected token U in JSON at position 29
When I try to parse the JSON in the REPL, it works without any issue.
JSON.parse('{"randomKey":{"random value \U0001f970\U0001f970":1}}')
How do I read the file and parse the JSON without any issue?
The \U is not a valid escape sequence in JSON (and not in JavaScript string literals either). Also, a normal \u escape sequence is followed by 4 digits, not by 8 of them. You will need to transform your file to valid JSON - best by fixing whatever program has written it, but you can also do it on the fly:
const json = '{"randomKey":{"random value \\U0001f970\\U0001f970":1}}';
const data = JSON.parse(json.replace(/\\U([0-9a-f]{4})([0-9a-f]{4})/g, '\\u$1\\u$2'))
After realizing the JSON is invalid, thanks to #Bergi's answer, I have used the following code to transform the data to valid JSON:
var fs = require('fs');
var json = fs.readFileSync('sample.json', 'utf8');
const sanitizedJSON = json.replace(/\\U([0-9a-f]{8})/g, (match) => String.fromCodePoint(parseInt(match.replace(/\\U/g, ''), 16)));
const data = JSON.parse(sanitizedJSON);
console.log(data);
This prints the desired output:
{ randomKey: { 'random value 🥰🥰': 1 } }
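For reference, the valid JSON spelling of U+1F970 uses a surrogate pair, so a correctly written source file would have looked like this:
{"randomKey":{"random value \ud83e\udd70\ud83e\udd70":1}}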

Convert TCP Data in with pipe to JSON

Sorry if the issue is not particularly clear (I am still a novice). I have a simple setup to get data from a mock feed and then convert the data to JSON. I can retrieve the data and display it but converting it to JSON has proved a bit tricky.
var completeData = '';
let client = net.createConnection({ port: 8282 }, () => {
client.write('Test Worked!\r\n');
});
client.on('data', (data) => {
// change buffer to string
let readData = data.toString();
// Merge response data as one string
completeData += readData += '\n';
// End client after server's final response
client.end();
});
Below is the sample output for one packet (pipe-delimited, with some values escaped):
|2054|create|event|1497359166352|ee4d2439-e1c5-4cb7-98ad-9879b2fd84c2|Football|Sky Bet League Two|\|Accrington\| vs \|Cambridge\||1497359216693|0|1|
I would like the pipe-separated fields to become the keys/values of an object. The issue is that some of the values contain escaped pipes (e.g. '\|'), which makes using JavaScript's split function difficult.
My question is there a way to get pipe delimited data from TCP packets and then convert them to a JSON object?
Any help will be highly appreciated. Thank you.
const { StringDecoder } = require("string_decoder");
let header = "";
const decoder = new StringDecoder("utf8");
// chunk is the Buffer received from the socket's 'data' event.
let str = decoder.write(chunk);
header += str;
// Strip whitespace; note that replace() returns a new string, it does not mutate.
header = header.replace(/\s/g, "");
var obj = JSON.parse(header);
console.log(obj);
When receiving data over TCP with the net module, the difficulty is that the data arrives as a continuous stream of chunks. string_decoder is a built-in Node module that safely decodes those chunks into text (even when a multi-byte character is split across chunks), after which the accumulated string can be parsed. Also, since the received packets have variable length, you can check header.length to make sure you only parse a complete object.
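As for the pipe-delimited payload itself, it is not JSON, so it has to be split by hand before it can become an object. A minimal sketch that splits on unescaped pipes and then unescapes the remaining \| sequences (the field names are made up for illustration, since the feed's schema isn't given; the lookbehind regex needs a reasonably recent Node version):
// Split on '|' only when it is not preceded by a backslash, then unescape '\|'.
function parsePacket(packet) {
  const fields = packet
    .split(/(?<!\\)\|/)                    // negative lookbehind keeps escaped pipes intact
    .slice(1, -1)                          // drop the empty strings around the outer pipes
    .map((f) => f.replace(/\\\|/g, "|"));
  // Hypothetical field names, just to show the shape of the resulting object.
  const [id, operation, type, timestamp, uuid, category, subCategory, name, startTime, displayed, suspended] = fields;
  return { id, operation, type, timestamp, uuid, category, subCategory, name, startTime, displayed, suspended };
}

const packet = "|2054|create|event|1497359166352|ee4d2439-e1c5-4cb7-98ad-9879b2fd84c2|Football|Sky Bet League Two|\\|Accrington\\| vs \\|Cambridge\\||1497359216693|0|1|";
console.log(JSON.stringify(parsePacket(packet)));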

How to separate objects with commas when writing them to a JSON file using Node.js?

I am sending right-click coordinates from the client side to the server side, and from there I am writing them to a JSON file, with the coordinates of different points stored as separate objects. This is how my JSON file looks:
{"x":344,"y":158}{"x":367,"y":152}{"x":641,"y":129}
Now the problem is that I have to put commas between the objects in order to make it valid JSON. How can I do that? Here is my request handler function:
.post(function(request, response){
console.log("Request received");
var util = require('util');
var coordinates = request.body;
var imageCoordinates = JSON.stringify(coordinates, null, 2);
fs.appendFile('coords.json', imageCoordinates, finished);
function finished(err){
console.log('all set.');
response.send("All OK");
}
console.log('coords are:', coordinates);
});
OK, one idea.
If you alter your appendFile call to also add a newline after each append, not only does that make the file easier to read if you open it, it also makes it easier to parse,
e.g. fs.appendFile('coords.json', imageCoordinates + '\r\n', finished);
Because we can't run Node here in the browser, in the code below pretend lines is the data read from your file after the above change.
I basically generate valid JSON by splitting on the newlines and rejoining with commas, then wrapping the whole thing in an array.
P.S. The filter(String) just drops the final blank array element produced by the trailing newline.
let lines = '{"x":344,"y":158}\r\n{"x":367,"y":152}\r\n{"x":641,"y":129}\r\n';
var data = JSON.parse('[' + (lines.split('\r\n').filter(String)).join(',') + ']');
console.log(data);
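On the server, reading the appended file back in might then look something like this (a sketch reusing the coords.json name from the question):
const fs = require('fs');
// Read the newline-delimited JSON objects and turn them into a single array.
const lines = fs.readFileSync('coords.json', 'utf8');
const coords = JSON.parse('[' + lines.split('\r\n').filter(String).join(',') + ']');
console.log(coords); // e.g. [ { x: 344, y: 158 }, { x: 367, y: 152 }, ... ]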
