Sending DOM element from client to server - javascript

I tried to clone a DOM element and send it to the server using socket.io and express, but after copying the element
( const element = document.getElementById('id').cloneNode(true); )
my server receives an empty object {},
yet when I do console.log(element); on the client side everything looks fine.
I send it to the server with:
socket.on('document', (data, callback) => {
  callback( document.getElementById(data[2]).cloneNode(true) );
});

Socket.io converts objects to JSON to send over the wire. DOM elements have no properties that will convert to JSON.
const body = JSON.stringify(document.body);
console.log(body);
You'll need to copy the values you care about into a new object and convert any non-JSON data types into something that JSON can represent.
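For example, a minimal sketch that mirrors the question's callback but sends only JSON-friendly values (the property names here are just an illustration of what you might pick):
socket.on('document', (data, callback) => {
  const element = document.getElementById(data[2]);
  // copy only the JSON-serialisable values the server actually needs
  callback({
    tagName: element.tagName,
    id: element.id,
    html: element.outerHTML
  });
});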

Related

Problems with Blob acquisition when receiving data through a web-socket

While communicating with the server over a WebSocket, I send data to the server as JSON.
But when receiving data through onmessage, the data arrives as a Blob, not as JSON or a string.
I couldn't find a way to convert this Blob to JSON, so I'm asking here.
Is there any way to receive the data as a string or JSON?
Or how can I convert a Blob to a string or JSON?
The server is written in C++ and sends the data as a string.
useEffect(() => {
  //var socket = new WebSocket("ws://172.30.1.50:65432/websocket");
  const socket = new WebSocket("ws://ip:port/websocket"); // For convenience, I wrote down ip and port.
  socket.Type = "blob";
  socket.onopen = function (event) {
    socket.send(JSON.stringify({ 'CMD': 'test' }));
    console.log("connection?");
  };
  socket.close;
  socket.onmessage = function (event) {
    console.log(event.data);
  };
}, []);
The WebSocket protocol supports two kinds of data messages: text and binary. When a text message is received by Javascript, it is deserialised as a string; when a binary message is received, it is deserialised as either an ArrayBuffer or a Blob, depending on the value of the websocket's binaryType property (the default is to return a Blob).
The fact that you got a Blob indicates that the server sent a binary message. This is unusual for a JSON message, so either the server is very unusual, or the protocol is not based on JSON.
In any case, you can convert the data into a string by calling the async .text() method:
let data = await event.data.text();
If the content is indeed JSON, you can then parse it as usual:
let value = JSON.parse(data);
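Put together in the onmessage handler, that might look like this (a sketch; it also tolerates the server sending a text frame instead):
socket.onmessage = async (event) => {
  const data = typeof event.data === 'string' ? event.data : await event.data.text();
  const value = JSON.parse(data);
  console.log(value);
};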

How to save big object to file nodejs?

I have a big object that I need to send from the server (nodejs) to the client.
But every time I try to send it I get an "invalid string length" error.
And that's expected, because the object is really big. That's why I'd like to save it to a file and then send the file to the client.
I don't know the depth of the object. The object itself is an octree.
I don't have any code for saving the object to a file, because every approach I think of leads me to stringify the object, which in turn leads to "invalid string length".
Here is a screenshot of the object. Every q(n) key has the same recursive structure as the result key.
Thanks!
First, try saving it to a JSON file and sending that file directly to the client side:
const fs = require('fs');
const path = require('path');

fs.writeFileSync("file_name.json", data);
res.header("Content-Type", 'application/json');
res.sendFile(path.join(__dirname, 'file_name.json'));
A good solution for handling large data transfer between server and client is to stream the data.
Pipe the result to the client like so:
const fs = require('fs');
const fileStream = fs.createReadStream("path_to_your_file.json");
res.writeHead(200, {'Content-Type': 'application/json'});
fileStream.pipe(res);
or follow this blog, which stringifies one element at a time and concatenates them.
I suggest you stream it, though.
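As a rough illustration of that "stringify one piece at a time" idea (a sketch, not the blog's code, assuming an Express app and a bigObject variable), you can walk the top-level keys with a generator and pipe the chunks:
const { Readable } = require('stream');

function* jsonChunks(obj) {
  yield '{';
  const keys = Object.keys(obj);
  for (let i = 0; i < keys.length; i++) {
    // each key/value pair is stringified on its own instead of building one giant string
    yield JSON.stringify(keys[i]) + ':' + JSON.stringify(obj[keys[i]]);
    if (i < keys.length - 1) yield ',';
  }
  yield '}';
}

app.get('/big-object', (req, res) => {
  res.setHeader('Content-Type', 'application/json');
  Readable.from(jsonChunks(bigObject)).pipe(res);
});
If a single value is still too large to stringify on its own, the generator would have to recurse deeper, but the principle is the same.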
You could try something like this, I'm unsure if Express would throw the same length error by using res.json.
// Assuming you're using Express and data is valid JSON.
res.json(data);

How do I allow url string to alter JSON message returned when creating a REST API using express?

I'm relatively new to express and I'm trying to create a locally hosted API that contains my resume. I understand that in a call such as:
app.get("/url", (res, req, next) => {
res.json({
//json body
});
});
the "url" can be used to return a different JSON based upon the html string. However, I'm hoping that there's a way for me to make it so that if the url is just "/", it returns the whole JSON object. Otherwise, if it contains an identifier like: "/experience", then it'll just return a JSON object containing the experience section of the JSON. However, I don't want to copy and paste each section of the JSON for every different possible "GET" call. Does anyone know a way I can do this?
If you have the object, iterate over its keys and register a route for each.
app.get("/", (req, res) => {
  res.json(theObj);
});

for (const [key, value] of Object.entries(theObj)) {
  app.get("/" + key, (req, res) => {
    res.json(value);
  });
}
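For illustration, with a hypothetical resume object such as the one below, the loop above registers GET /experience and GET /education, while GET / still returns the whole thing:
const theObj = {
  experience: [{ company: "Example Co", role: "Developer" }],
  education: [{ school: "Example University", degree: "BSc" }]
};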

How to return multiple updates of a JSON using expressjs and nodejs

I have a server side task that takes some time to compute, and I'd like to periodically send updates to the client. I store and send the information as an object (via JSON), and it isn't an array where I can send data sequentially. Rather, I want to send some new information, and update others as the calculation continues.
From other posts on here I realize that:
response.json(object) is a nice and easy way to send an object json in one go, with headers set and everything. However, this - like response.send() - terminates the connection:
var app = express()
app.get('/', (request, response) => {
  response.json( { hello:world } );
})
Alternatively, one could set the headers manually, and then use response.write with JSON.stringify
response.setHeader('Content-Type', 'application/json');
response.write(JSON.stringify({ hello:world } ));
response.end();
The above two methods work for sending an object in one go, but ideally what I'd like to do is send incremental updates to my object. E.g.
response.setHeader('Content-Type', 'application/json');
response.write( JSON.stringify( { hello:[world], foo:bar } ) );
// perform some operations
response.write( JSON.stringify( { hello:[world, anotherWorld], foo:cat } ) );
response.end()
However, what is happening on the clientside is:
After the first response.write, the client receives { hello:[world], foo:bar } but does not trigger my callback
After the second response.write, I can see the data received is { hello:[world], foo:bar }{ hello:[world, anotherWorld], foo:cat } still does not trigger my callback
My callback is only called after response.end(), and then I get an exception when trying to parse it as JSON, because it isn't a valid JSON anymore, but a bunch of JSONs stuck back to back with no comma or anything: Uncaught (in promise) SyntaxError: JSON.parse: unexpected non-whitespace character after JSON data at line 1 column XXX of the JSON data.
Ideally my client callback would be triggered upon receiving each write, and it would remove that bit of data from the buffer so to speak, so the next incoming json would be standalone.
Is what I want to do possible? If so, how?
My fetch code btw:
fetch(url)
  .then(response => response.json()) // parse the JSON from the server
  .then(returnInfo => {
    onReturn(returnInfo);
  });
For your use case, consider using WebSockets to deliver incremental updates to your UI. A WebSocket connection has three stages: connect, message, and disconnect. On page load, your front end opens a persistent connection to the backend. You can send the first JSON payload on connect and then, whenever your backend has an update, send it as another message. I have written a blog post that implements WebSockets using Python and JavaScript, but you can implement similar logic in NodeJS:
https://blog.zahidmak.com/create-standalone-websocket-server-using-tornado/
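A minimal sketch of that idea in Node.js, assuming the ws package (any WebSocket library would do): the first snapshot is sent on connect, and every later update is its own message, so the client can JSON.parse each one independently.
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: 8080 });

wss.on('connection', (ws) => {
  // initial state, sent as soon as the client connects
  ws.send(JSON.stringify({ hello: ['world'], foo: 'bar' }));

  // ... perform some operations, then push the update as a separate message
  ws.send(JSON.stringify({ hello: ['world', 'anotherWorld'], foo: 'cat' }));
  ws.close();
});
On the client, socket.onmessage = (event) => onReturn(JSON.parse(event.data)); fires once per message, so each update arrives as standalone, valid JSON.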

How do I stream JSON objects from ExpressJS?

I'm trying to stream JSON objects from an ExpressJS / Node backend API to a frontend site.
I do not want to use Socket.IO for various reasons. As I understand it, the native streaming libraries should support streaming objects; it appears that Express is what complicates this.
My frontend code seems straightforward. I use Fetch to get my target URL, get a read stream from the response object, and set that read stream to objectMode: true.
Frontend Example:
async function () {
  let url = "myurl";
  let response = await fetch(url, {
    method: 'GET',
    mode: 'cors',
    credentials: 'include'
  });
  const reader = response.body.getReader({ objectMode: true });
  // Where things are a bit ambiguous
  let x = true;
  while (x) {
    const { done, value } = await reader.read();
    if (done) { break; }
    // do something with value ( I push it to an array )
  }
}
Backend Node Example (fails because I cannot change the stream to objectMode)
router.get('/', (request, response) => {
  response.writeHead(200, { 'Content-Type' : 'application/json' });
  MongoDB.connection.db.collection('myCollection').find({}).forEach((i) => {
    response.write(i);
  }).then(() => {
    response.end();
  });
})
Now my problem is that there does not appear to be any way to change the ExpressJS write stream to objectMode: true. To my dismay, the ExpressJS documentation doesn't even acknowledge the existence of the write() function on the response object: https://expressjs.com/en/api.html#res
How do I change this over to objectMode: true ?
Conversely, I tried to work with the write stream as a string. The problem I run into is that when the send buffer fills up, it fills by characters, not by whole objects. This means that at some point invalid JSON is passed to the requester.
A suggested solution I run into often is that I could read all of the chunks on the client and assemble valid JSON there. This defeats the purpose of streaming, so I'm trying to find a better way.
For what I believe is the same reason, I cannot figure out how to talk directly to the write stream object from the Express code, so I am unable to use the native writable.length property to manually check whether there is space for the entire JSON object as a string. This is preventing me from using stringified JSON with newline terminators.
https://nodejs.org/api/stream.html#stream_writable_writablelength
https://nodejs.org/api/stream.html#stream_writable_writableobjectmode
Could someone set me straight? I am working with 100k+ records in my Mongo database, and I really need partial page loading to work so that users can start picking through the data.
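A minimal sketch of the newline-delimited idea described above, reusing the names from the question's code (router, MongoDB.connection): each document is written as one complete JSON line, and write()'s boolean return value is used to wait for 'drain' instead of inspecting writable.length.
router.get('/', async (request, response) => {
  response.writeHead(200, { 'Content-Type': 'application/x-ndjson' });
  const cursor = MongoDB.connection.db.collection('myCollection').find({});
  for await (const doc of cursor) {
    // one complete JSON object per line, so the client never receives a split object
    const ok = response.write(JSON.stringify(doc) + '\n');
    if (!ok) {
      // back-pressure: wait until the socket buffer drains before writing more
      await new Promise((resolve) => response.once('drain', resolve));
    }
  }
  response.end();
});
The client can then split the incoming text on newlines and JSON.parse each complete line as it arrives.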
