I created an Express backend that sends JSON data in chunks of text using res.write(JSON.stringify(JSONChunk)).
I want to handle and process each chunk from res.write in my React frontend, and I am using the following method:
My backend pseudocode:
for (let i = 0; i < slices; i++) {
  const JSONChunk = await getData(i); // the getData operation can take some time
  res.write(JSON.stringify(JSONChunk));
}
res.end();
FE pseudocode:
fetch(API, OPTS).then(async (response) => {
  const reader = response.body.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) {
      break;
    }
    const text = new TextDecoder("utf-8").decode(value);
    const result = JSON.parse(text);
    const processedResult = process(result);
  }
});
However, when I try the above code on some systems, I get an error from JSON.parse(text) and can see that 'text' does not contain the full JSON string, only a partial one.
I was wondering if I am doing anything wrong, or if there is a better way to do the above.
Any help/suggestions appreciated.
Thank you!
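One way to make this robust (a sketch, not from the original post): reader.read() chunks split at arbitrary byte boundaries rather than at res.write() boundaries, so each document needs framing, for example newline-delimited JSON, and the client must buffer until a full frame arrives.

// Server side (assumption: one JSON document per line, i.e. NDJSON).
res.write(JSON.stringify(JSONChunk) + "\n");

// Client side: accumulate bytes and only JSON.parse complete lines.
fetch(API, OPTS).then(async (response) => {
  const reader = response.body.getReader();
  const decoder = new TextDecoder("utf-8");
  let buffer = "";
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    let newline;
    while ((newline = buffer.indexOf("\n")) >= 0) {
      const line = buffer.slice(0, newline);
      buffer = buffer.slice(newline + 1);
      if (line.trim()) process(JSON.parse(line)); // only complete documents reach JSON.parse
    }
  }
});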
I am trying to upload a PDF from the frontend to my Node server. The PDF uploads to the Node server successfully, but when I go to open it, I am unable to. Instead, I see a message that says "File can't be opened. Something went wrong." Why is this happening?
Also, please don't suggest third-party upload middleware like multer, etc. I am aware of these libraries, but I just want pure Node. Thank you so much.
Frontend code:
const uploadFile = document.getElementById("uploadFile");
uploadFile.addEventListener("change", (event) => {
  readFile(event.target.files[0]);
});

function readFile(file) {
  const uploadDesignPDF = `http://localhost:7000/api/upload/design`;
  let fileReader = new FileReader();
  fileReader.readAsDataURL(file);
  fileReader.addEventListener("load", async (event) => {
    let pdfStrChunk = event.target.result.replace(
      /^data:application\/[a-z]+;base64,/,
      ""
    );
    let fileSize = file.size;
    const chunk = 85000;
    let numOfChunkSet = Math.ceil(fileSize / chunk);
    let remainingChunk = fileSize;
    let currentChunk = 0;
    let chunkSet = [];
    let range = {};
    let data = {};
    for (let i = 0; i < numOfChunkSet; i++) {
      remainingChunk -= chunk;
      if (remainingChunk < 0) {
        remainingChunk += chunk;
        chunkSet.push(remainingChunk);
        range.start = currentChunk;
        range.end = currentChunk + chunk;
        currentChunk += remainingChunk;
      } else {
        chunkSet.push(chunk);
        range.start = currentChunk;
        range.end = (i + 1) * chunkSet[i];
        currentChunk += chunk;
      }
      const chunkRead = pdfStrChunk.slice(range.start, range.end);
      data.dataPDF = chunkRead;
      let response = await fetch(uploadDesignPDF, {
        method: "POST",
        body: JSON.stringify(data),
        headers: {
          "Content-Type": "application/json",
        },
        responseType: "arrayBuffer",
        responseEncoding: "binary",
      });
      let results = await response.json();
      console.log(results);
    }
  });
}
Backend route:
const { uploadDesigns } = require("./upload.designs.controller.js");
const router = require("express").Router();
router.post("/upload/design", uploadDesigns);
Backend:
uploadDesigns: async (req, res) => {
  try {
    fs.writeFileSync(`./designs/testingPDF6.pdf`, req.body.dataPDF, "base64");
    res.status(200).json({
      message: "done with chunk",
    });
  } catch (error) {
    res.status(500).json({
      message: "Something went wrong. Please refresh page.",
    });
  }
}
You are working with the base64 data URL in vain. It is much more effective to use an ArrayBuffer: an ArrayBuffer works in 1-byte units, while base64 turns every 3 bytes into 4 characters, inflating the payload by about a third.
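To see the overhead concretely (a small illustration in Node, not from the original answer):

// 300,000 raw bytes become 400,000 base64 characters (~33% larger).
const bytes = Buffer.alloc(300000);
console.log(bytes.toString("base64").length); // 400000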
Instead of sending the file in chunks, I would suggest tracking progress through XMLHttpRequest.upload.onprogress. I would only use chunks if the upload went through a WebSocket.
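For example (a sketch; the endpoint URL is taken from the question, the rest is an assumption):

// Upload the File object directly and report progress as it goes.
const xhr = new XMLHttpRequest();
xhr.upload.onprogress = (e) => {
  if (e.lengthComputable) {
    console.log(`uploaded ${Math.round((e.loaded / e.total) * 100)}%`);
  }
};
xhr.open("PUT", "http://localhost:7000/api/upload/design");
xhr.setRequestHeader("Content-Type", "application/pdf");
xhr.send(file); // the File from the <input type="file"> change event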
If the PDF file is the only information sent to the server, I'd prefer to send the file directly, without field names or other FormData metadata. In that case, it would be appropriate to change the POST method to PUT.
If you prefer to send the file directly, it would be ideal to use fs.createWriteStream() instead of fs.writeFileSync().
Then this approach will work on the server:
const ws = fs.createWriteStream(tmpFilePath);
req.pipe(ws); // stream the raw request body straight to disk
To verify the integrity of the data, you can add an md5 or sha hash to the request headers and, on the server, duplicate the data stream into the object created by crypto.createHash(). In case of a hash mismatch, the file can be uploaded again.
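A minimal sketch of that server side, assuming an Express route and a hypothetical x-file-sha256 request header set by the client:

const crypto = require("crypto");
const fs = require("fs");

router.put("/upload/design", (req, res) => {
  const expected = req.headers["x-file-sha256"]; // hex digest computed by the client
  const hash = crypto.createHash("sha256");
  const ws = fs.createWriteStream("./designs/testingPDF6.pdf");

  req.on("data", (chunk) => hash.update(chunk)); // duplicate the stream into the hash
  req.pipe(ws);

  ws.on("finish", () => {
    if (hash.digest("hex") === expected) {
      res.status(200).json({ message: "upload verified" });
    } else {
      res.status(400).json({ message: "hash mismatch, please upload again" });
    }
  });
});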
I am implementing FIDO2 (WebAuthn) in an Angular application.
I have gotten the PublicKeyCredentialCreationOptions object and can successfully register.
But after calling
let response = await navigator.credentials.create({'publicKey': myPublicKeyCredentialCreationOption })
I try to send the response to the server, but this fails.
When I tried to look at the object in the browser using
console.log(JSON.stringify(response))
I get
{}
as output (?), but when doing
console.log(response)
I get an object with values in the console...
How should the object get serialized to send to the server?
PublicKeyCredential objects contain ArrayBuffer values that cannot be serialized as JSON. You could base64url-encode these values in your Angular app and decode them on the server to get the same byte arrays back. A helper library that does exactly that for WebAuthn exists: https://github.com/github/webauthn-json
Here's a very simple example for anyone who needs it:
function bufferToBase64url(buffer) {
  // modified from https://github.com/github/webauthn-json/blob/main/src/webauthn-json/base64url.ts
  const byteView = new Uint8Array(buffer);
  let str = "";
  for (const charCode of byteView) {
    str += String.fromCharCode(charCode);
  }
  // Binary string to base64
  const base64String = btoa(str);
  // Base64 to base64url
  // We assume that the base64url string is well-formed.
  const base64urlString = base64String
    .replace(/\+/g, "-")
    .replace(/\//g, "_")
    .replace(/=/g, "");
  return base64urlString;
}
...
create publicKeyCredentialCreationOptions
...
navigator.credentials.create({
  publicKey: publicKeyCredentialCreationOptions
}).then(credential => {
  // credential created
  // console.log(credential); <-- check the output to see what you need to call bufferToBase64url(credential.<...>) on below
  // convert the credential to something JSON-serializable
  const serializeable = {
    authenticatorAttachment: credential.authenticatorAttachment,
    id: credential.id,
    rawId: bufferToBase64url(credential.rawId),
    response: {
      attestationObject: bufferToBase64url(credential.response.attestationObject),
      clientDataJSON: bufferToBase64url(credential.response.clientDataJSON)
    },
    type: credential.type
  };
  const serialized = JSON.stringify(serializeable);
  console.log(serialized);
}).catch(err => {
  // an error occurred
  console.error(err);
});
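For the server side (a sketch, not part of the original answer), Node can reverse the encoding with Buffer, which has understood "base64url" since Node 14.18/15.7:

// Turn the base64url fields back into the byte arrays WebAuthn libraries expect.
function base64urlToBuffer(base64urlString) {
  return Buffer.from(base64urlString, "base64url");
}

// "body" here is assumed to be the parsed JSON request body.
const rawId = base64urlToBuffer(body.rawId);
const clientDataJSON = base64urlToBuffer(body.response.clientDataJSON);
const attestationObject = base64urlToBuffer(body.response.attestationObject);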
I am currently receiving drag-and-drop / uploaded images as data URLs and displaying them with those URLs.
What I am now trying to do is send those uploaded images to my backend web API (ASP.NET Core) to store them in a SQLite database; this is a requirement for my application.
Currently I am converting the data URL to an ArrayBuffer using the following code.
async srcToFile(context, asset) {
  const files = asset[0].files.fileList;
  let results = [];
  for (let i = 0; i < files.length; i++) {
    const file = files[i];
    const data = file.data;
    const name = file.name;
    const mimeType = file.type;
    await fetch(data)
      .then(function(res) {
        const r = res.arrayBuffer();
        console.warn('resource ', r);
        return r;
      })
      .then(function(buf) {
        console.warn('buffer: ', [buf]);
        const fileData = {data: [buf], name: name, type: mimeType};
        results.push(fileData);
        console.warn('results of file: ', fileData);
      });
  }
  console.warn(results);
  return results;
}
Then I put it in a data object to send to my server via axios. This is what that data object looks like:
const data = {
  Name: asset[0].name,
  Detail: asset[0].detail,
  Files: asset[0].files.fileList
};
When I console out the Files, it shows there is ArrayBuffer data in it. But when I send it to my server, it looks like that data is stripped out of the request, because when I inspect the request I no longer have that data in there, and I cannot figure out why that is happening.
This is my axios call:
axios.post('https://localhost:5001/api/Assets', data)
  .then(res => console.log(res))
  .catch(error => console.log(error));
And my backend web API POST controller:
public async Task<ActionResult> PostAsset([FromBody] AssetSaveRequest request, [FromForm] List<IFormFile> files)
{
    foreach (var file in files)
    {
        if (file.Length > 0)
        {
            using (var ms = new MemoryStream())
            {
                file.CopyTo(ms);
                var fileBytes = ms.ToArray();
                string s = Convert.ToBase64String(fileBytes);
                // act on the Base64 data
            }
        }
    }
    var assetCreationDto = new AssetCreationDto(request);
    //var assetCreationDto = "";
    try
    {
        var asset = _mapper.Map<Asset>(assetCreationDto);
        _context.Assets.Add(asset);
        //await _context.SaveChangesAsync();
        var assetDto = _mapper.Map<AssetDto>(asset);
        return CreatedAtAction("GetAsset", new {assetDto.Id}, assetDto);
    }
    catch (DbUpdateException dbe)
    {
        var errorCode = ((Microsoft.Data.Sqlite.SqliteException) dbe.InnerException).SqliteErrorCode;
        switch (errorCode)
        {
            case 19:
                Console.WriteLine(((Microsoft.Data.Sqlite.SqliteException) dbe.InnerException).Message);
                break;
            default:
                Console.WriteLine("Something went wrong");
                break;
        }
    }
    catch (Exception e)
    {
        Console.WriteLine(e);
        throw;
    }
    return null;
}
I don't know whether that part works, because I never get the file data; I do, however, get the Name and Detail, which come in fine.
I am looking for advice on what I should do here to get this to work. I have tried converting the ArrayBuffer to a base64 string, but that does not come out right. Any help and suggestions would be great to get me back on track with this project.
UPDATE:
I have modified my srcToFile code to give me a File, and now I am using axios to send the file and data to the backend (working with one file at this time), but all I'm getting in the request now is [object Object]. I've tried JSON.stringify on my data like so:
const data = JSON.stringify({
  Name: asset[0].name,
  Detail: asset[0].detail,
  Files: asset[0].files.fileList
});
It stringifies the name and detail but wipes out the file, and I get nothing on the backend.
I have tested with Postman and made several successful posts, but I can't seem to send the correct data from my Vue frontend.
That is where I am at now. Any suggestions always help.
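For reference, a sketch of the usual multipart approach (not from the original post; the form field names are assumptions that would have to match the controller's [FromForm] binding). File and ArrayBuffer values serialize to {} or get dropped by JSON.stringify, so files normally travel as multipart/form-data instead:

// Send the metadata and the file as form parts; axios sets the multipart boundary itself.
const formData = new FormData();
formData.append("Name", asset[0].name);
formData.append("Detail", asset[0].detail);
formData.append("files", file); // "file" is the File produced by the updated srcToFile

axios.post('https://localhost:5001/api/Assets', formData)
  .then(res => console.log(res))
  .catch(error => console.log(error));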
I would like to be able to convert the JSON string sent from the server into a JavaScript object on an HTML page. The raw JSON string is being displayed, but I would rather display it as a JavaScript object instead.
case '/get_list':
  if (req.method == 'POST') {
    console.log("POST");
    var body = '';
    req.on('data', function(data) {
      body += data;
      console.log("Partial body: " + body);
    });
    req.on('end', async function() {
      console.log("Body: " + body);
      var json = JSON.parse(body);
      console.log("name is " + json.name); // get name
      const { Client } = require('pg');
      const connectionString = 'postgresql://postgres:password@localhost/app';
      const client = new Client({
        connectionString: connectionString,
      });
      await client.connect(); // create a database connection
      console.log("user input is " + json.name1);
      // Returns the result from the database as a JSON string to the HTML page
      const res3 = await client.query('SELECT name, studentno, proname FROM applications WHERE name = $1 LIMIT 1', [json.name1]);
      await client.end();
      // json = res2.rows;
      json = res3.rows;
      var obj = JSON.parse(res3.rows);
      var json_str_new = JSON.stringify(json); // working
      console.log(obj);
      console.log(json_str_new);
      res.end(json_str_new);
    });
  }
  break;
Actual results
{"name":"jake","studentno":10001212,"proname":"asdasdas"}
Expected/required results
{
  name: 'jake',
  studentno: 10001212,
  proname: 'asdasdas'
}
If you're planning on using the JSON for anything, such as traversing and reading its data, then JSON.parse() is exactly what you need. Pretty-printing is only useful for humans, so unless the output is exclusively for human consumption, the result you have should be fine.
However, if you are only going to display the data, then I would recommend formatting the output into some HTML/CSS display.
But assuming you are planning on using the data for something, as mentioned before and by others, JSON.parse() is all you need to generate a JS object.
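If the goal is simply indented output, JSON.stringify's third argument handles that (a small illustration; note the keys stay double-quoted, unlike Node's console rendering of an object):

const obj = JSON.parse('{"name":"jake","studentno":10001212,"proname":"asdasdas"}');

console.log(obj.name); // traverse the parsed object: "jake"
console.log(JSON.stringify(obj, null, 2)); // pretty-print with a 2-space indent
// {
//   "name": "jake",
//   "studentno": 10001212,
//   "proname": "asdasdas"
// }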
I'm querying Solr 7.5 for some large objects and would like to render them to a browser UI as they are returned.
What are my options for reading the response bit by bit when using the select request handler?
I don't think there is anything native to Solr that does what you are asking.
One approach would be to return only the IDs of the documents that match your query's criteria (and not include the heavy part of the document), and then fetch the large part of each document asynchronously from the client.
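Roughly like this (a sketch, not from the original answer; the Solr URL and field names are assumptions, and fl=id restricts the response to the id field):

// First fetch only the matching IDs, then hydrate each heavy document on demand.
// (Run inside an async function.)
const base = 'https://my-solr7/people/select';
const ids = await fetch(`${base}?q=something&fl=id&wt=json`)
  .then((r) => r.json())
  .then((j) => j.response.docs.map((d) => d.id));

for (const id of ids) {
  const doc = await fetch(`${base}?q=id:${encodeURIComponent(id)}&wt=json`)
    .then((r) => r.json());
  render(doc.response.docs[0]); // render() is a hypothetical UI hook
}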
It turns out I was looking in the wrong place; I just needed to read up on the Fetch API.
response.json() reads the response to completion.
response.body.getReader() lets you grab the stream in chunks and decode it from there.
let test = 'https://my-solr7/people/select?q=something';
fetchStream(test);

function fetchStream(uri, params = {}) {
  const options = {
    method: 'GET',
  };
  const decoder = new TextDecoder();
  fetch(uri, options)
    .then((response) => {
      const reader = response.body.getReader();
      const read = (result) => {
        if (result.done) {
          console.log(decoder.decode()); // flush any bytes still buffered in the decoder
          return;
        }
        console.log(result.value);
        // {stream: true} keeps multi-byte characters that span chunks intact
        let chunk = decoder.decode(result.value, { stream: true });
        console.log(chunk);
        reader.read().then(read);
      };
      reader.read().then(read);
    });
}