Problems with Blob acquisition when receiving data through a web-socket - javascript

While communicating with a server over a WebSocket, I send data to the server as JSON.
However, when receiving data through onmessage, the data arrives as a Blob, not as JSON or a string.
I couldn't find a way to convert this Blob to JSON, so I'm asking here.
Is there any way to receive the data as a string or JSON?
Or how can I convert the Blob to a string or JSON?
The server is written in C++ and sends the data as a string.
useEffect(() => {
  //var socket = new WebSocket("ws://172.30.1.50:65432/websocket");
  const socket = new WebSocket("ws://ip:port/websocket"); // For convenience, I wrote down ip and port.
  socket.binaryType = "blob";
  socket.onopen = function (event) {
    socket.send(JSON.stringify({ 'CMD': 'test' }));
    console.log("connection?");
  };
  socket.onmessage = function (event) {
    console.log(event.data);
  };
}, []);

The WebSocket protocol supports two kinds of data messages: text and binary. When a text message is received by Javascript, it is deserialised as a string; when a binary message is received, it is deserialised as either an ArrayBuffer or a Blob, depending on the value of the websocket's binaryType property (the default is to return a Blob).
The fact that you got a Blob indicates that the server sent a binary message. That is unusual for JSON, so either the server is doing something out of the ordinary, or the protocol is not actually based on JSON.
In any case, you can convert the received data into a string by calling the async .text() method:
let data = await event.data.text();
If the content is indeed JSON, then you can parse it as usual:
let value = JSON.parse(data);
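Putting the two steps together, a minimal sketch of an async onmessage handler (assuming the payload really is JSON text) could look like this:
socket.onmessage = async function (event) {
  // event.data is a string for text frames and a Blob for binary frames
  const text = typeof event.data === "string" ? event.data : await event.data.text();
  const value = JSON.parse(text);
  console.log(value);
};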

Related

Sending DOM element from client to server

I tried to clone a DOM object and send it to the server using socket.io and express, but after copying the element
( const element = document.getElementById('id').cloneNode(true); )
my server receives an empty object {},
and when I do console.log(element); on the client side everything works.
I send it to the server with:
socket.on('document', (data, callback) => {
  callback( document.getElementById(data[2]).cloneNode(true) );
});
Socket.io converts objects to JSON to send them over the wire, and DOM elements have no enumerable own properties that survive that conversion, so they serialize to an empty object. You can see this for yourself:
const body = JSON.stringify(document.body);
console.log(body);
You'll need to copy the values you care about into a new object and convert any non-JSON data types into something that JSON can represent.
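For example, a hypothetical sketch (the element id and field names are made up for illustration) that sends only the serialisable parts of an element:
const el = document.getElementById('id');
const payload = {
  tag: el.tagName,
  id: el.id,
  html: el.outerHTML,        // plain strings survive JSON.stringify
  dataset: { ...el.dataset } // data-* attributes as a plain object
};
socket.emit('document', payload);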

How can I write a Buffer directly to websocket-stream in Node.js without converting to string?

This one is a bit dense. I'm building a web-socket based FUSE filesystem in Node.js (v14.14.0) using the fuse-native package.
To transfer the file data between the client and the server, I'm using the websocket-stream package to stream the binary data back and forth.
This works fine when transferring a file from the server to the client, but when trying to transfer a file from the client to the server, I'm running into a problem.
fuse-native passes around Buffer instances with binary segments of the file being transferred. I'm trying to write the Buffer to a websocket-stream stream and receive it on the server, where it will be streamed to a temporary file.
Here's how this happens. On the client-side, the following method is called:
write(buffer) {
  console.log('write buffer pre slice', buffer)
  const write_stream = WebSocketStream(`ws://localhost:5746/?socket_uuid=${this.socket_uuid}&node_uuid=${this.node_uuid}&length=${this.length}&position=${this.position}&writing_file=true`, {
    perMessageDeflate: false,
    binary: true,
  })
  console.log(write_stream)
  console.log('writing buffer', buffer.toString(), buffer)
  write_stream.push(buffer)
  write_stream.push(null)
}
According to the Node.js docs, I should be able to pass the Buffer directly to the stream. However, on the server, no data is ever received. Here's how the server is receiving:
async on_file_write_stream(stream, request) {
  let { socket_uuid, node_uuid, length = 4096, position = 0 } = url.parse(request.url, true).query
  if ( typeof position === 'string' ) position = parseInt(position)
  if ( typeof length === 'string' ) length = parseInt(length)

  const socket = this.sockets.find(x => x.uuid === socket_uuid)
  if ( !socket.session.temp_write_files ) socket.session.temp_write_files = {}

  const placeholder = socket.session.temp_write_files?.[node.uuid] || await tmp.file()
  socket.session.temp_write_files[node.uuid] = placeholder

  console.log('Upload placeholder:', placeholder)
  console.log('write stream', stream)
  console.log('write data', { placeholder, position, length })

  stream.pipe(fs.createWriteStream(placeholder.path, { flags: 'r+', start: position }))
}
Once the client-side code finishes, the temporary file is still completely empty. No data is ever written.
The strange part is that when I cast the buffer to a string before writing it to the stream (on the client side), all works as expected:
write(buffer) {
  console.log('write buffer pre slice', buffer)
  const write_stream = WebSocketStream(`ws://localhost:5746/?socket_uuid=${this.socket_uuid}&node_uuid=${this.node_uuid}&length=${this.length}&position=${this.position}&writing_file=true`, {
    perMessageDeflate: false,
    binary: true,
  })
  console.log(write_stream)
  console.log('writing buffer', buffer.toString(), buffer)
  write_stream.push(buffer.toString())
  write_stream.push(null)
}
This works fine for text-based files, but binary files become mangled when transferred this way. I suspect it's because of the string cast before transfer.
How can I send the buffer data along the stream without casting it to a string first? I'm not sure why the server-side stream isn't receiving data when I write the Buffer directly.
Thanks in advance.
For the curious, here is the full server-side file and the client-side file.
Kind of a hack, but I worked around the problem by base64 encoding the Buffer on the client-side and decoding it on the server side.
Thanks to user Bojoer for pointing me in the direction of the combined-stream library, which I used to pipe the Buffer to the stream:
const combined_stream = CombinedStream.create()
combined_stream.append(buffer.toString('base64'))
combined_stream.pipe(write_stream)
Then, decode it on the server-side:
const encoded_buffer = await this._bufferStream(stream)
const decoded_buffer = Buffer.from(encoded_buffer.toString(), 'base64')
console.log({encoded_buffer, decoded_buffer})
const combined_stream = CombinedStream.create()
combined_stream.append(decoded_buffer)
combined_stream.pipe(fs.createWriteStream(placeholder.path, { flags: 'r+', start: position }))
This is sub-optimal, though, as converting back and forth from base64 costs processing time and imposes a bandwidth penalty. I'm still curious about the original problem of why I can't write a binary Buffer to the websocket stream directly. Maybe it's a limitation of the library.
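One possible explanation, offered here only as an assumption rather than something confirmed in the thread: websocket-stream hands back a Duplex stream, and push() belongs to the readable side's internals, so pushed chunks never reach the writable side that actually sends frames. Writing to the stream with write()/end() instead might deliver the Buffer as a binary frame without the base64 round trip:
write(buffer) {
  const write_stream = WebSocketStream(`ws://localhost:5746/?socket_uuid=${this.socket_uuid}&node_uuid=${this.node_uuid}&length=${this.length}&position=${this.position}&writing_file=true`, {
    perMessageDeflate: false,
    binary: true,
  })
  write_stream.write(buffer) // write to the writable side instead of push()
  write_stream.end()         // end the stream once the chunk is queued
}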

How do I stream JSON objects from ExpressJS?

I'm trying to stream JSON objects from an ExpressJS / Node backend API to a frontend site.
I do not want to use Socket.IO for various reasons. As I understand it, the native streaming libraries should support streaming objects; it appears that Express is what complicates this.
My frontend code seems straightforward: I use fetch to request my target URL, get a read stream from the response object, and set that read stream to objectMode: true.
Frontend Example:
async function () {
  let url = "myurl";
  let response = await fetch(url, {
    method: 'GET',
    mode: 'cors',
    credentials: 'include'
  });
  const reader = response.body.getReader({ objectMode: true });
  // Where things are a bit ambiguous
  let x = true;
  while (x) {
    const { done, value } = await reader.read();
    if (done) { break; }
    // do something with value ( I push it to an array )
  }
}
Backend Node Example (fails because I cannot change the stream to objectMode):
router.get('/', (request, response) => {
  response.writeHead(200, { 'Content-Type' : 'application/json' });
  MongoDB.connection.db.collection('myCollection').find({}).forEach( (i) => {
    response.write(i);
  }).then( () => {
    response.end()
  })
})
Now my problem is that there does not appear to be any way to change the ExpressJS write stream to objectMode: true. To my dismay, the ExpressJS documentation doesn't even acknowledge the existence of the write() function on the response object: https://expressjs.com/en/api.html#res
How do I change this over to objectMode: true?
Conversely, I tried to work with the write stream as a string. The problem I run into is that when the send buffer fills up, it fills by characters, not by whole objects, which means that at some point invalid JSON is passed to the requester.
A suggested solution that I run into often is that I could read all of the chunks on the client and assemble valid JSON. That defeats the purpose of streaming, so I'm trying to find a better way.
For what I believe is the same problem, I cannot figure out how to talk directly to the write stream object from the Express code, so I am unable to use the native writeStream property writable.length to manually check whether there is space for the entire JSON object as a string. This is preventing me from using stringified JSON with newline terminators.
https://nodejs.org/api/stream.html#stream_writable_writablelength
https://nodejs.org/api/stream.html#stream_writable_writableobjectmode
Could someone set me straight? I am working with 100k+ records in my Mongo database, and I really need partial page loading to work so that users can start picking through the data.
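One common workaround, sketched here under the assumption that a recent MongoDB driver (with async-iterable cursors) is in use, is to skip objectMode entirely and emit newline-delimited JSON: one complete stringified document per line, pausing on backpressure via the boolean that response.write() returns so no object is ever split across a buffer boundary.
router.get('/', async (request, response) => {
  response.writeHead(200, { 'Content-Type': 'application/x-ndjson' });
  const cursor = MongoDB.connection.db.collection('myCollection').find({});
  for await (const doc of cursor) {
    // one complete JSON document per line
    const ok = response.write(JSON.stringify(doc) + '\n');
    if (!ok) {
      // the socket buffer is full; wait for it to drain before continuing
      await new Promise(resolve => response.once('drain', resolve));
    }
  }
  response.end();
});
The client can then split the received text on newlines and JSON.parse each complete line as it arrives, which keeps partial page loading intact.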

Is this a correct way to parse incoming JSON over websocket, and respond depending on the type of message?

So I am receiving JSON over a WebSocket from a charge point using OCPP 1.6 JSON.
I am trying to parse the message and respond appropriately, depending on what the message is, using Node.js.
Here is the message that I receive:
[ 2,
'bc7MRxWrWFnfQzepuhKSsevqXEqheQSqMcu3',
'BootNotification',
{ chargePointVendor: 'AVT-Company',
chargePointModel: 'AVT-Express',
chargePointSerialNumber: 'avt.001.13.1',
chargeBoxSerialNumber: 'avt.001.13.1.01',
firmwareVersion: '0.9.87',
iccid: '',
imsi: '',
meterType: 'AVT NQC-ACDC',
meterSerialNumber: 'avt.001.13.1.01' } ]
In this case it is the 'BootNotification' message, to which I need to respond with an 'Accepted' message.
Here is my code:
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', function connection(ws) {
  ws.on('message', function incoming(message) {
    // Make incoming JSON into a JavaScript object
    var msg = JSON.parse(message)
    // Print whole message to console
    console.log(msg)
    // Print only the message type to console. For example BootNotification, Heartbeat etc...
    console.log("Message type: " + msg[2])
    // Send response depending on what the message type is
    if (msg[2] === "BootNotification") {
      // Send correct response
    } // Add all the message types
  });
});
With this I get the message type printed to the console as a string:
Message type: BootNotification
So my question is: is this the correct way to get the type of the message?
I am new to this so I want to make sure.
The specification for OCPP 1.6 JSON is available here: OpenChargeAlliance website
I guess YES. JSON.parse is the built-in for, well, parsing JSON strings. If that goes wrong it throws an error, so you might want to wrap it in a try/catch.
Since the message you receive is an array, there is no other way than to access its items by numeric index.
In such cases I personally prefer to have something like this:
const handlers = {
  'BootNotification': request => ({ 'msg': 'what a request' })
};
Then you can:
let response = { 'msg': 'Cannot handle this' };
if (handlers.hasOwnProperty(msg[2])) {
  response = handlers[msg[2]](msg);
}
But that is just the way I would go.
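Putting both suggestions together, a rough sketch of the message handler (using the handlers map defined above) might look like this:
ws.on('message', function incoming(message) {
  let msg;
  try {
    msg = JSON.parse(message); // throws on invalid JSON
  } catch (err) {
    console.error('Received invalid JSON:', err.message);
    return;
  }
  let response = { 'msg': 'Cannot handle this' };
  if (handlers.hasOwnProperty(msg[2])) {
    response = handlers[msg[2]](msg); // dispatch on the Action at index 2
  }
  ws.send(JSON.stringify(response));
});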
If you are asking about the OCPP message structure (rather than about parsing JSON), I can give you the details for OCPP 1.6.
In OCPP 1.6, the client (Charging Station) sends a CALL (similar to a request in HTTP) to the server (Charging Station Management System). All CALLs have a strict structure of 4 elements:
MessageTypeId (integer)
UniqueId (UUID, string)
Action (string)
Payload (JSON object containing the arguments relevant to the Action.)
or as in your example:
[
2,
'bc7MRxWrWFnfQzepuhKSsevqXEqheQSqMcu3',
'BootNotification',
{ chargePointVendor: 'AVT-Company',
chargePointModel: 'AVT-Express',
chargePointSerialNumber: 'avt.001.13.1',
chargeBoxSerialNumber: 'avt.001.13.1.01',
firmwareVersion: '0.9.87',
iccid: '',
imsi: '',
meterType: 'AVT NQC-ACDC',
meterSerialNumber: 'avt.001.13.1.01' }
]
So the Action is always at index 2 (which is how you retrieve it when you parse the received message). You can refer to philipp's answer above for how to handle errors when parsing.
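Based on that structure, a hedged sketch of replying to the BootNotification with a CALLRESULT (MessageTypeId 3, echoing the UniqueId, followed by the response payload) could look like this; the interval value is only an example:
if (msg[2] === "BootNotification") {
  const callResult = [
    3,      // MessageTypeId 3 = CALLRESULT
    msg[1], // echo the UniqueId of the incoming CALL
    {
      status: "Accepted",
      currentTime: new Date().toISOString(),
      interval: 300 // heartbeat interval in seconds (example value)
    }
  ];
  ws.send(JSON.stringify(callResult));
}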

How to send ArrayBuffer in json from js client to nodejs server?

I am trying to send data of type ArrayBuffer in JSON to my server using socket.io, like this:
socket.emit('record', {
  name: myUsername + '.wav',
  data: data // ArrayBuffer
});
On the server side, when I receive the 'record' event on the socket, I get the data from the JSON and save it to a file with that name, like this:
socket.on('record', function(message){
  var fileWriter = new wav.FileWriter(message.name, {
    channels: 1,
    sampleRate: 48000,
    bitDepth: 16
  });
  message.data.pipe(fileWriter);
});
I am using the require('wav') and require('stream') packages from npm. The problem is that my server crashes on message.data.pipe(fileWriter); with this error:
TypeError: message.data.pipe is not a function
What am I doing wrong? Can't I send an ArrayBuffer like this in socket.io?
data is an ArrayBuffer, not a ReadableStream, so it has no .pipe() method. To pipe it anywhere, you first need to get it into a stream: enqueue the data into a ReadableStream and pass that to a WritableStream or TransformStream (for example using .pipeThrough()); see "Receiving data via stdin and storing as a variable/array".
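On the Node side, a hedged sketch of one way to do that is to convert the received payload to a Buffer and wrap it in a Readable stream before piping it into the wav FileWriter:
const { Readable } = require('stream');

socket.on('record', function (message) {
  const fileWriter = new wav.FileWriter(message.name, {
    channels: 1,
    sampleRate: 48000,
    bitDepth: 16
  });
  // message.data arrives as a Buffer/ArrayBuffer, not a stream
  const buffer = Buffer.from(message.data);
  // wrap the single chunk in a Readable so it can be piped
  Readable.from([buffer]).pipe(fileWriter);
});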
