Convert pipe-delimited TCP data to JSON - javascript

Sorry if the issue is not particularly clear (I am still a novice). I have a simple setup to get data from a mock feed and then convert the data to JSON. I can retrieve the data and display it but converting it to JSON has proved a bit tricky.
const net = require('net');

var completeData = '';
let client = net.createConnection({ port: 8282 }, () => {
  client.write('Test Worked!\r\n');
});
client.on('data', (data) => {
  // change buffer to string
  let readData = data.toString();
  // Merge response data as one string
  completeData += readData + '\n';
  // End client after server's final response
  client.end();
});
Below is sample output for one packet (pipe-delimited, with some data escaped):
|2054|create|event|1497359166352|ee4d2439-e1c5-4cb7-98ad-9879b2fd84c2|Football|Sky Bet League Two|\|Accrington\| vs \|Cambridge\||1497359216693|0|1|
I would like the pipe-delimited fields to become keys/values in an object. The issue is that some of the values contain escaped pipes (e.g. '\|'), which makes using JavaScript's split function difficult.
My question is: is there a way to get pipe-delimited data from TCP packets and then convert it to a JSON object?
Any help will be highly appreciated. Thank you.

const { StringDecoder } = require("string_decoder");
let header = "";
const decoder = new StringDecoder("utf8");
let str = decoder.write(chunk); // chunk is your data stream
header += str;
header = header.replace(/\s/g, "");
var obj = JSON.parse(header);
console.log(obj);
While receiving data through TCP with the net module, we face trouble parsing the continuous stream of data. string_decoder is a built-in Node module that decodes Buffers to strings without splitting multi-byte characters across chunk boundaries. Also, since the packets being received have a variable length, one can put a check on header.length to fetch only a complete set of objects.
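The escaped pipes from the question can be handled directly: split on pipes that are not preceded by a backslash, then unescape. Here is a minimal sketch against the sample packet; the key names are my own guesses at the feed's schema (the real spec would define them), and the lookbehind regex needs Node 10+:

```javascript
// Sample packet from the question; "\\|" in this source string is an escaped pipe.
const packet = '|2054|create|event|1497359166352|ee4d2439-e1c5-4cb7-98ad-9879b2fd84c2|Football|Sky Bet League Two|\\|Accrington\\| vs \\|Cambridge\\||1497359216693|0|1|';

const fields = packet
  .split(/(?<!\\)\|/)                   // split only on pipes NOT preceded by a backslash
  .slice(1, -1)                         // drop the empty strings around the outer pipes
  .map((f) => f.replace(/\\\|/g, '|')); // unescape the remaining "\|"

// Hypothetical key names - replace with whatever the feed's spec defines.
const keys = ['id', 'operation', 'type', 'timestamp', 'uuid', 'category',
              'subCategory', 'name', 'startTime', 'displayed', 'suspended'];

const obj = Object.fromEntries(keys.map((k, i) => [k, fields[i]]));
console.log(JSON.stringify(obj));
```

Combined with the string_decoder approach above for reassembling chunks, each complete packet can be converted this way before `JSON.stringify`.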

Trying to parse JSON

I'm a complete beginner at node.js and trying to figure out how to send the contents of a JSON object using REST. My code only results in an error saying "SyntaxError: Unexpected token in JSON at position 0". I have used an online validator to verify that the JSON is correct. What is the issue?
// GET courses
const fs = require('fs');
app.get('/api/courses', function(req, res) {
  var rawdata = fs.readFileSync('miun-db.json');
  var data = JSON.parse(rawdata);
  res.send(data.courses);
});
The data inside of your file is not formatted properly. Verify that miun-db.json is properly formatted before attempting to parse.
The issue is that your file is not properly formatted; check out Working with JSON.
Also you can just import the JSON file and use it without fs.
import miunDb from './miun-db.json';

app.get('/api/courses', function(req, res) {
  res.send(miunDb.courses);
});
According to the fs.readFileSync docs you can pass options to the function.
fs.readFileSync(path[, options])
If the encoding option is specified then this function returns a string. Otherwise it returns a buffer.
Since you don't pass any options you get a buffer and not the string content of the file.
Either you pass encoding to the function, e.g.
var rawdata = fs.readFileSync('miun-db.json', {encoding:'utf8'});
var data = JSON.parse(rawdata);
or you convert the buffer to string. E.g.
var rawdata = fs.readFileSync('miun-db.json');
var data = JSON.parse(rawdata.toString('utf8'));
Edit:
Your error message says you have an invalid character which is non printable. Seems like your file starts with a BOM (zero width no-break space (ZWNBSP)).
What happens if you try to remove it?
function stripBOM(content) {
  content = content.toString();
  // Remove byte order marker. This catches EF BB BF (the UTF-8 BOM)
  // because the buffer-to-string conversion in `fs.readFileSync()`
  // translates it to FEFF, the UTF-16 BOM.
  if (content.charCodeAt(0) === 0xFEFF) {
    content = content.slice(1);
  }
  return content;
}
// [...]
var rawdata = fs.readFileSync('miun-db.json', {encoding:'utf8'});
var data = JSON.parse(stripBOM(rawdata));
// [...]
Credit goes to this gist.
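The failure mode described above is easy to reproduce: `JSON.parse` only skips ordinary whitespace, so a leading U+FEFF (what the UTF-8 BOM becomes after decoding) throws the exact "Unexpected token" SyntaxError from the question, and stripping it makes the parse succeed. A minimal demonstration:

```javascript
// Prepend a BOM to otherwise valid JSON.
const withBom = "\uFEFF" + '{"courses": []}';

let threw = false;
try {
  JSON.parse(withBom); // throws: U+FEFF is not JSON whitespace
} catch (e) {
  threw = true;
}

// Strip the first character when it is U+FEFF, then parse again.
const stripped = withBom.charCodeAt(0) === 0xFEFF ? withBom.slice(1) : withBom;
const parsed = JSON.parse(stripped);
console.log(threw, parsed.courses); // true []
```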

Swift 5 - How to convert LONGBLOB/Buffer into Data

I am currently working on a project for school.
I have written an API using Express connected to a mysql database. And now I am writing the iOS app.
My problem is that I need to save profile pictures. So I saved the PNG data of the picture in a **LONGBLOB** column in the db, and I want to recreate the image as a **UIImage**.
To do that I am trying to convert the buffer into `Data`.
So, the API is returning a buffer created that way:
let buffer = Buffer.from(ppData.data, 'binary').toString('base64');
And on the iOS side I tried:
guard let data = dict["data"] as? Data else {return nil}
Where dict["data"] is the buffer returned by the API.
But it always enters the "else" branch.
What am I doing wrong?
Edit:
As was said in the comments, I decoded the Base64-encoded string. Now the data is decoded, but creating a UIImage from it fails without any details. What I tried is:
let image = UIImage(from: base64DecodedData)
For example:
guard let strData = dict["data"] as? String else {
    return nil
}
guard let data = Data(base64Encoded: strData, options: .ignoreUnknownCharacters) else {
    return nil
}
guard let picture = UIImage(data: data) else {
    return nil
}
Thanks.
The mistake was not in the Swift code but in my API and database structure. After reading some MySQL and Node.js documentation, I switched from LONGBLOB (which is totally oversized) to MEDIUMTEXT.
Also, in the API I was creating a buffer from binary data rather than from base64-encoded string data, so I removed this line:
let buffer = Buffer.from(ppData.data, 'binary').toString('base64');
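The base64 leg of this pipeline can be sanity-checked on the Node side in isolation. A minimal sketch (the four sample bytes are just the PNG signature prefix, standing in for real image data):

```javascript
// Round trip: raw image bytes -> base64 string (what ends up in MEDIUMTEXT and
// in the JSON response) -> bytes again (what the iOS client reconstructs).
const bytes = Buffer.from([0x89, 0x50, 0x4e, 0x47]); // sample data: PNG signature prefix
const base64 = bytes.toString('base64');
const decoded = Buffer.from(base64, 'base64');
console.log(base64, decoded.equals(bytes)); // the decoded bytes match the originals
```

If this round trip holds but the image still fails to load, the corruption happened before encoding (e.g. in the database column), which is exactly what the LONGBLOB-to-MEDIUMTEXT switch above addressed.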

How would I parse a large TSV file in node.js?

I'm extremely new to Node and JS. I have a large TSV file (1.5 GB) that I need to read in and parse into either an array or a JSON object. How would I go about doing that? I don't get an error when I try the code below, but it never enters the callback.
var d3 = require("d3-dsv");

d3.tsvParse("amazon_reviews_us_Mobile_Apps_v1_00.tsv", function(error, data) {
  var sum = 0;
  data.forEach(function(d) {
    d.helpful_votes += d.helpful_votes;
    sum += d.helpful_votes;
  });
  console.log("Total Helpful Votes: " + sum);
});
Any help would be appreciated.
You need to find a module that provides a streaming parser for a TSV file, meaning that it doesn't load the whole file into memory. You can use readline if your parser is synchronous:
const {createInterface} = require("readline");
const {createReadStream} = require("fs");

createInterface({input: createReadStream("amazon_reviews_us_Mobile_Apps_v1_00.tsv")})
  .on('line', (data) => doSomethingWith(data.split("\t")))
  .on('close', () => doSomethingWhenDone())
You wrote that you want to parse that file and change it to an array or object of some sort. You'll still need to keep an eye on your memory, but you could use my scramjet, which will allow you to transform the data any way you like:
const {StringStream} = require("scramjet");
const {createReadStream, createWriteStream} = require("fs");

StringStream.from(createReadStream("amazon_reviews_us_Mobile_Apps_v1_00.tsv"))
  // read the file
  .CSVParse({delimiter: "\t"})
  // parse as csv
  .map((entry) => doSomething(entry))
  // whatever you return here it will be changed
  // this can be asynchronous too, so you can do requests...
  .toJSONArray()
  .pipe(createWriteStream("somefile.json"))
Let me know what you are trying to achieve besides counting, and I'll edit the answer.
BTW, for just counting votes the solution by @hugo-elhaj-lahsen is also good; I'm not sure why it was downvoted.
Use d3.tsvParse with a row function. Note that tsvParse takes the file's contents as a string, not a filename, which is why your callback never runs. Since your file is very large, one optimisation we can do is, instead of doing a for-each on every element after D3 has parsed everything, do the summing in the row function at parsing time:
var d3 = require("d3-dsv");
var fs = require("fs");
var sum = 0;
var contents = fs.readFileSync("amazon_reviews_us_Mobile_Apps_v1_00.tsv", "utf8");
d3.tsvParse(contents, d => {
  sum += +d.helpful_votes; // the parsed field is a string, so coerce it to a number
  return d; // since this is the parser, we need to return the parsed object
});
console.log("Total helpful votes", sum);

How to separate objects with commas when writing them to a json file using nodejs?

I am sending right click coordinates from the client side to the server side and from there I am writing them down in a json file, having coordinates of different points stored as different objects. This is how my json file looks:
{"x":344,"y":158}{"x":367,"y":152}{"x":641,"y":129}
Now the problem is that I have to put a comma between the first two objects in order to make it valid JSON. How can I do that? Here is my request handler function:
.post(function(request, response){
  console.log("Request received");
  var util = require('util');
  var coordinates = request.body;
  var imageCoordinates = JSON.stringify(coordinates, null, 2);
  fs.appendFile('coords.json', imageCoordinates, finished);
  function finished(err){
    console.log('all set.');
    response.send("All OK");
  }
  console.log('coords are:', coordinates);
});
Ok, one idea.
If you alter your appendFile call to also add a newline after each append, not only would this make the file easier to read if you opened it, it would also make it easier to parse.
e.g. fs.appendFile('coords.json', imageCoordinates + '\r\n', finished);
Because we can't run node here in the browser, in the code below pretend lines is the data read from your file after the above modification.
I basically generate some valid JSON by splitting on the newlines and rejoining with commas, then wrap the result inside an array.
ps. the filter(String) just filters out the last array element, which is blank because of the trailing newline.
let lines = '{"x":344,"y":158}\r\n{"x":367,"y":152}\r\n{"x":641,"y":129}\r\n';
var data = JSON.parse('[' + (lines.split('\r\n').filter(String)).join(',') + ']');
console.log(data);

Transferring strings containing emojis to server

On my website whenever a user enters a mobile emoji like 😀 into an input field it will be saved as ?? in my database.
Those emojis are encoded in utf8mb4, so I already updated my database collation to utf8mb4_general_ci.
While the emoticons can be saved successfully now, when transferring a message containing an emoji from a client to my server, it still gets changed to ?? somewhere, and I am now trying to figure out where and how to solve it.
Sending the message to my server happens in this ajax call:
function updateStatus() {
  var status = $("#status").val();
  jsRoutes.controllers.Userdata.updateStatus( status ).ajax({
    success : function(data) {
      $("#profil").cftoaster({content: data});
    },
    error : function(err) {
      $("#profil").cftoaster({content: err.responseText});
    }
  });
}
On serverside I am using java based Play Framework 2.4.4.
This is the beginning of the method which is called in the ajax call:
public static Result updateStatus(String status) {
    String receivedName = Application.getSessionUser();
    Logger.debug("RecvStatus: " + status);
    ...
}
The Logger output already is ?? for an emoticon.
The route looks like this:
PUT /status/ controllers.Userdata.updateStatus(status: String)
EDIT:
To make sure the transfer from client to server is alright, I am now transferring the actual unicode escape values. I changed my server function like this:
Logger.debug("RecvStatus: " + status);
status = status.replace("\\", "");
String[] arr = status.split("u");
status = "";
for (int i = 1; i < arr.length; i++) {
    int hexVal = Integer.parseInt(arr[i], 16);
    status += (char) hexVal;
}
Logger.debug("RecvStatus: " + status);
Logger.debug("RecvStatus: " + status);
and get the following output:
[debug] application - RecvStatus: \ud83d\ude01
[debug] application - RecvStatus: ?
which means the problem is probably on the Java side.
So, for anyone having a similar problem, here is my workaround.
First I tried converting the String in Java to base64 and storing it that way. Unfortunately, after decoding the base64, the unicode information was still lost and ?? was shown instead of the emoticons.
What I then did was convert the received String into unicode escape values and then into base64; when I load the Strings from the database, I first decode the base64 and then convert the unicode information back into an actual String. This way the emoticons are stored and afterwards shown correctly.
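The core of that workaround is a lossless encode/decode round trip around the storage layer. The poster did it in Java; a minimal sketch of the same idea in Node (the status string is a made-up example using the emoji from the debug log):

```javascript
// Base64-encode the UTF-8 bytes before storing, decode after loading, so no
// layer in between can mangle the 4-byte emoji characters.
const status = "Grinning \u{1F601}"; // 😁 from the debug log
const stored = Buffer.from(status, "utf8").toString("base64");   // ASCII-safe
const restored = Buffer.from(stored, "base64").toString("utf8");
console.log(restored === status); // the emoji survives the round trip
```

Because the stored value is plain ASCII, it passes unchanged through any connection or column that would otherwise replace 4-byte UTF-8 sequences with ?. The usual direct fix is to make every layer (JDBC connection, column, table) use utf8mb4, but the base64 detour sidesteps that entirely.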
