Perhaps the question isn't worded in the best way, but here's some more context. Using GridFSBucket, I'm able to store a file in Mongo and obtain a download stream for that file. Here's my question. Let's say I wanted to send that file back as a response to my HTTP request.
I do:
downloadStream.pipe(res);
On the client side, when I print the responseText, I get a long string with funky characters that look encrypted. What is the format/type of this string/stream? How do I set up my response so that I can get the streamed data as an ArrayBuffer on the client side?
Thanks
UPDATE:
I haven't solved the problem yet; however, the suggestion by @Jekrb gives exactly the same output as doing console.log(this.responseText). It looks like the string is not a buffer. Here is the output from these 2 lines:
console.log(this.responseText.toString('utf8'))
var byteArray = new Uint8Array(arrayBuffer);
UPDATE 2 - THE CODE SNIPPETS
Frontend:
var savePDF = function(blob) {
  //fs.writeFile("test.pdf", blob);
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function() {
    if (this.readyState === XMLHttpRequest.DONE && this.status === 200) {
      // TODO: Handle the file in the response, which will be displayed.
      console.log(this.responseText.toString('utf8'));
      var arrayBuffer = this.responseText;
      if (arrayBuffer) {
        var byteArray = new Uint8Array(arrayBuffer);
      }
      console.log(arrayBuffer);
    }
  };
  xhr.open("POST", "/pdf", true);
  xhr.responseType = 'arrayBuffer';
  xhr.send(blob);
};
Backend:
app.post('/pdf', function(req, res) {
  MongoClient.connect("mongodb://localhost:27017/test", function(err, db) {
    if (err) return console.dir(err);
    console.log("Connected to Database");

    var bucket = new GridFSBucket(db, { bucketName: 'pdfs' });
    var CHUNKS_COLL = 'pdfs.chunks';
    var FILES_COLL = 'pdfs.files';

    // insert file
    var uploadStream = bucket.openUploadStream('test.pdf');
    var id = uploadStream.id;
    uploadStream.once('finish', function() {
      console.log("upload finished!");
      var downloadStream = bucket.openDownloadStream(id);
      downloadStream.pipe(res);
    });

    // This pipes the POST data to the file
    req.pipe(uploadStream);
  });
});
My guess is that either the response is being output as plain binary which is not base64 encoded (still a buffer), or it is a compressed (gzip) response that needs to be uncompressed first.
Hard to pinpoint the issue without seeing the code though.
UPDATE:
Looks like you're missing the proper response headers.
Try setting these headers before the downloadStream.pipe(res):
res.setHeader('Content-disposition', 'attachment; filename=test.pdf');
res.set('Content-Type', 'application/pdf');
Your stream is likely already a buffer. You might be able to call responseText.toString('utf8') to convert the streamed data into a readable string.
I solved it!!!
Basically, preset the response type to "arraybuffer" before you make the request, using

xhr.responseType = "arraybuffer";

Note the all-lowercase value: the 'arrayBuffer' spelling used in the snippet above is not a valid responseType and is silently ignored. Now, once you receive the response, don't use responseText; use response instead. response contains the ArrayBuffer with the data for the file.
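Applied to the frontend snippet from the question, the fix looks roughly like this (a sketch; the backend stays the same):

var xhr = new XMLHttpRequest();
xhr.onreadystatechange = function() {
  if (this.readyState === XMLHttpRequest.DONE && this.status === 200) {
    // this.response is an ArrayBuffer here; accessing this.responseText would throw.
    var byteArray = new Uint8Array(this.response);
    console.log(byteArray.byteLength);
  }
};
xhr.open("POST", "/pdf", true);
xhr.responseType = "arraybuffer"; // all lowercase; "arrayBuffer" is silently ignored
xhr.send(blob);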
I am uploading a CSV file using FormData and XMLHttpRequest. Here is the code for that.
I have a form wrapped around an HTML file input, in whose onchange event I am executing this code. I have tried sending the form directly as well, and also reading the form element into the FormData object.
let formData = new FormData();
let file = e.target.files[0];
var blob = new Blob([file], {type: 'text/csv'});
formData.append("payoutUpload", blob, 'processed.csv');
let uri = encodeURI(`${window.serviceUri}${path}`);

var req = new XMLHttpRequest();
req.onload = (result) => {
  if (req.status === 500 && result && result.code === 'ECONNRESET') {
    console.log('Connection was reset, hence retry the sendRequest function');
  } else if (req.status === 200) {
    // success
  } else {
    console.log("Error while retrieving data");
  }
};
req.onerror = (e) => {
  console.log('There was an error while retrieving data from service', e);
};
req.open('POST', uri, true);
req.setRequestHeader('Content-Type', 'multipart/form-data');
req.setRequestHeader('Authorization', 'Bearer ' + token);
req.send(formData);
When I send the request, I can see that the file is being sent in the form of Request Payload.
On the Node.js backend, I am running Express and formidable. I am not using body-parser; I am using Express's built-in json and urlencoded middleware.
Here is the formidable part.
const form = formidable({ multiples: true });
form.parse(req, (err, fields, files) => {
  console.log(`error is ${JSON.stringify(err)}`);
  console.log(`fields is ${JSON.stringify(fields)}`);
  console.log(`files JSON: ${JSON.stringify(files)}`);
  console.log('file in request: ' + files.payoutUpload);
  console.log(`req.body: ${req.body}`);
  options.file = files.payoutUpload;
});
I get err, fields, and files as empty. I have searched through all the similar questions and set the request headers correctly (which is usually the issue). I can see that the request body still has the file payload on the server end, but formidable does not parse it. Can anyone tell me what I am doing wrong?
UPDATE: I have tried other packages for parsing the file, like multer and express-fileupload; all of them return files as empty. I have also tried the fetch API to send my request, but with no luck.
req.setRequestHeader('Content-Type', 'multipart/form-data')
When you send multipart/form-data, you must include a boundary parameter in the header; however, you can't know what value you need to set for this.
Don't set the Content-Type header at all. Allow XMLHttpRequest to generate it automatically from the FormData object.
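With that, the upload request above becomes (a sketch, assuming the same formData, uri, and token):

var req = new XMLHttpRequest();
req.open('POST', uri, true);
// No Content-Type header here: XMLHttpRequest generates
// 'multipart/form-data; boundary=...' from the FormData object itself.
req.setRequestHeader('Authorization', 'Bearer ' + token);
req.send(formData);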
My overarching goal is to save some JSON data I create on a webpage to my files locally. I am definitely sending something to the server, but not in a format I seem to be able to access.
JsonData looks like:
{MetaData: {Stock: "UTX", Analysis: "LinearTrend2"},
 Projections: [2018-10-12: 127.62, 2018-10-11: 126.36000000000001, 2018-10-10: 132.17, 2018-10-09: 140.12, 2018-10-08: 137.73000000000002, …]}
XMLHttpRequest on my webpage:
function UpdateBackTestJSON(JsonUpdate) { // JsonUpdate being the JSON object from above
  var request = new XMLHttpRequest();
  request.open('POST', 'UpdateBackTestJSON');
  request.setRequestHeader("Content-Type", "application/json;charset=UTF-8");
  // request.setRequestHeader("Content-Type", "text/plain;charset=UTF-8");
  request.onload = function() {
    console.log("Updated JSON File");
  };
  console.log("about to send request");
  console.log(JsonUpdate);
  request.send(JSON.stringify(JsonUpdate));
}
and I handle POST requests on my server (rather carelessly, I realize; just going for functionality as a start here):
var http = require('http')
  , fs = require('fs')
  , url = require('url')
  , port = 8008;

var server = http.createServer(function (req, res) {
  var uri = url.parse(req.url);
  var qs = require('querystring');
  if (req.method == 'POST') {
    var body = '';
    req.on('data', function (data) {
      body += data;
      // 1e6 === 1 * Math.pow(10, 6) === 1 * 1000000 ~~~ 1MB
      if (body.length > 1e6) {
        // FLOOD ATTACK OR FAULTY CLIENT, NUKE REQUEST
        req.connection.destroy();
      }
    });
    req.on('end', function () {
      var POST = qs.parse(body);
      console.log(POST); // PARSED POST IS NOT THE RIGHT FORMAT... or something, idk what's going on
      UpdateBackTestData(POST);
    });
  }

  function UpdateBackTestData(TheJsonData) {
    console.log("UpdateBackTestData");
    console.log(TheJsonData);
    JsonUpdate = JSON.parse(TheJsonData);
    console.log(JsonUpdate["MetaData"]);
    //var Stock = JsonUpdate["MetaData"]["Stock"];
    //var Analysis = JsonUpdate["MetaData"]["Analysis"];
    fs.writeFile("/public/BackTestData/" + Analysis + "/" + Stock + ".json", TheJsonData, function (err) {
      if (err) {
        console.log(err);
      }
      console.log("updated BackTest JSON!!!");
    });
  }
});
server.listen(port);
Most confusing to me is that when I run this, the JSON object I'm trying to pass does go through to the server, but the entirety of the data arrives as a string used as a key for a blank value in an object. When I parse the body of the POST, I get: {'{MetaData:{'Stock':'UTX','Analysis:'LinearTrend2'},'Projections':[...]}': ''}. So my data is there... but not in a practical format.
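A minimal illustration of what querystring.parse does to a JSON body, which is exactly the shape above (the body string here is a stand-in):

const qs = require('querystring');

const body = '{"MetaData":{"Stock":"UTX"}}';

// With no '&' or '=' to split on, the whole JSON string becomes a key
// mapped to an empty value:
console.log(qs.parse(body)); // { '{"MetaData":{"Stock":"UTX"}}': '' }

// The client sent application/json, so JSON.parse recovers the object:
console.log(JSON.parse(body)); // { MetaData: { Stock: 'UTX' } }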
I would prefer not to use express or other server tools, as I have a fair amount of other services set up in my server that I don't want to go back and change if I can avoid it.
Thanks for any help
I have the URL to a blob which I'm trying to upload to Azure Storage; there doesn't seem to be an obvious way of doing this, as none of the APIs handle uploading a blob URL directly.
I'm trying to do something like this:
blobService.createBlockBlobFromLocalFile('taskcontainer', 'myfile.png', blobUrl, (error, result, response) => {
});
This doesn't work. I've tried to find ways to read the blob URL into a readable stream and upload that, but haven't gotten very far either.
I basically have a file selected by the user using react-dropzone, which provides me with a blob URL (which can look like this: blob:http://localhost:3000/cd8ba70e-5877-4112-8131-91c594be8f1e) pointing to the local file. My goal is to now upload that blob URL to an Azure container.
Firebase storage has a 'put' function which allows you to upload the blob from a url: https://firebase.google.com/docs/storage/web/upload-files
This is the closest I have gotten:
var blobUrl = acceptedFiles[0].preview;

var xhr = new XMLHttpRequest();
xhr.open("GET", blobUrl);
xhr.responseType = "text"; // force the HTTP response, response-type header to be blob
xhr.onload = function () {
  const Stream = require('stream');
  const readable = new Stream.Readable();
  readable.push(xhr.responseText);
  readable.push(null);
  blobService.createBlockBlobFromStream('taskcontainer', 'myblob.png', readable, xhr.responseText.length, (error, result, response) => {
    var ok = 0;
  });
};
xhr.send();
The file (or parts of it?) seems to get uploaded, but the end result is that the file type is lost and I can't view the uploaded PNG.
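One thing that stands out: reading binary data through responseType = "text" mangles it, which would explain the unviewable PNG. A sketch of reading the blob URL back as real binary using standard web APIs (the actual Azure upload call is left open):

// blobUrl is the local blob: URL from react-dropzone.
fetch(blobUrl)
  .then(res => res.blob())
  .then(blob => {
    // blob.type still carries the MIME type (e.g. 'image/png'), and
    // blob.arrayBuffer() (or a FileReader) yields the raw bytes to hand
    // to whichever Azure upload method you use.
    console.log(blob.type, blob.size);
  });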
You could try the following:

var azure = require('azure-storage');
var blobService = azure.createBlobService('', ''); // account name, account key

blobService.createBlockBlobFromLocalFile('nodecontainer', 'AzureDC', 'azure_center.png', function(error, result, response) {
  if (!error) {
    console.log(response);
  } else {
    console.log(error);
  }
});
EDIT
Check this code snippet to upload blob to azure storage
I'm using protobufs for serializing my data. So far I serialize my data on the server (node restify), send it, receive it (the request is made by XMLHttpRequest), and deserialize it on the client.
Now I want to employ zipping to reduce the transferred file size. I tried using the library pako, which uses zlib.
In a basic script that I used to compare the protobuf-plus-zipping performance against JSON, I used it this way, and there were no problems:
var buffer = proto.encode(data, 'MyType'); // own library on top of protobufs
var output = pako.deflate(buffer);
var unpacked = pako.inflate(output);
var decoded = proto.decode(unpacked, 'MyType'); // decode the round-tripped bytes
However if I try to do this in a client-server model I can't get it working.
Server:
server.get('/data', function (req, res) {
  const data = getMyData();
  const buffer = proto.encode(data, 'MyType');
  res.setHeader('content-type', 'application/octet-stream;');
  res.setHeader('Content-Encoding', 'gzip;');
  return res.send(200, buffer);
});
My own proto library serializes the data with protobuf and then deflates it:
...
let buffer = type.encode(message).finish();
buffer = pako.deflate(buffer);
return buffer;
The request looks like this:
public getData() {
  return new Promise((resolve, reject) => {
    const request = new XMLHttpRequest();
    request.open("GET", this.url, true);
    request.responseType = "arraybuffer";
    request.onload = function(evt) {
      const arr = new Uint8Array(request.response);
      const payload = proto.decode(request.response, 'MyType');
      resolve(payload);
    };
    request.send();
  });
}
The proto.decode method first inflates the buffer (buffer = pako.inflate(buffer);) and then deserializes it from protobuf.
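Roughly, the decode path looks like this (a sketch of what the wrapper does; root is a hypothetical protobufjs root matching the type.encode(message).finish() call above):

function decode(buffer, typeName) {
  const inflated = pako.inflate(buffer);   // undo the pako.deflate from the encode side
  const type = root.lookupType(typeName);  // protobufjs type lookup (assumed wrapper detail)
  return type.decode(inflated);            // counterpart of type.encode(...).finish()
}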
If the request is made, I get the following error, "Uncaught incorrect header check", returned by the inflate method of pako:
function inflate(input, options) {
  var inflator = new Inflate(options);
  inflator.push(input, true);
  // That will never happens, if you don't cheat with options :)
  if (inflator.err) { throw inflator.msg || msg[inflator.err]; }
  return inflator.result;
}
Also, I looked at the request in Postman and found the following:
The deflated response looks like this: 120,156,60,221,119,64,21,237,119,39,240,247,246,242,10,49,191,244,178,73,54,157 and has a length of 378564.
The same request without deflating (the raw protobuf) renders as binary garbage (e.g. �:�: (� 0�8#H …) and has a length of 272613.
I'm assuming that I'm doing something incorrectly on the server side, since the deflated response is larger than the one not using compression.
Is it the Content-Type header? I'm out of ideas.
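For reference, one consistent pairing (a sketch, not a tested fix): pako.deflate produces a zlib stream, not a gzip stream, so declaring Content-Encoding: gzip (and the stray semicolon makes the value malformed anyway) can make the browser try to decompress the body itself. Dropping that header and inflating the raw bytes on the client keeps both sides in agreement:

// Server (restify): the wrapper already deflated the protobuf bytes.
server.get('/data', function (req, res) {
  const buffer = proto.encode(getMyData(), 'MyType');
  res.setHeader('content-type', 'application/octet-stream'); // no Content-Encoding header
  return res.send(200, Buffer.from(buffer));
});

// Client: inflate the raw bytes of the ArrayBuffer response.
const bytes = new Uint8Array(request.response);
const inflated = pako.inflate(bytes); // 'incorrect header check' means the input wasn't zlib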
I'm consuming a JSON stream and am trying to use fetch to do so. The stream emits some data every few seconds. Using fetch to consume the stream gives me access to the data only when the stream closes server side. For example:
var target; // the url.
var options = {
  method: "POST",
  body: bodyString,
};

var drain = function(response) {
  // hit only when the stream is killed server side.
  // response.body is always undefined. Can't use the reader it provides.
  return response.text(); // or response.json();
};

var listenStream = fetch(target, options).then(drain).then(console.log).catch(console.log);

/*
returns data to the console log with a 200 code only when the server stream has been killed.
*/
However, there have been several chunks of data already sent to the client.
Using a Node-inspired method in the browser like this works every single time an event is sent:
var request = require('request');
var JSONStream = require('JSONStream');
var es = require('event-stream');

request(options)
  .pipe(JSONStream.parse('*'))
  .pipe(es.map(function(message) { // pipe catches each fully formed message.
    console.log(message);
  }));
What am I missing? My instinct tells me that fetch should be able to mimic the pipe or stream functionality.
response.body gives you access to the response as a stream. To read a stream:
fetch(url).then(response => {
  const reader = response.body.getReader();
  reader.read().then(function process(result) {
    if (result.done) return;
    console.log(`Received a ${result.value.length} byte chunk of data`);
    return reader.read().then(process);
  }).then(() => {
    console.log('All done!');
  });
});
Here's a working example of the above.
Fetch streams are more memory-efficient than XHR, as the full response doesn't buffer in memory, and result.value is a Uint8Array, making it way more useful for binary data. If you want text, you can use TextDecoder:
fetch(url).then(response => {
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  reader.read().then(function process(result) {
    if (result.done) return;
    const text = decoder.decode(result.value, {stream: true});
    console.log(text);
    return reader.read().then(process);
  }).then(() => {
    console.log('All done!');
  });
});
Here's a working example of the above.
Soon TextDecoder will become a transform stream, allowing you to do response.body.pipeThrough(new TextDecoder()), which is much simpler and allows the browser to optimise.
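A sketch of that, assuming a browser where the transform stream has landed (the name that shipped is TextDecoderStream):

fetch(url).then(response => {
  const reader = response.body
    .pipeThrough(new TextDecoderStream())
    .getReader();
  return reader.read().then(function process(result) {
    if (result.done) return;
    console.log(result.value); // already a string; no manual decoding needed
    return reader.read().then(process);
  });
});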
As for your JSON case, streaming JSON parsers can be a little big and complicated. If you're in control of the data source, consider a format that's chunks of JSON separated by newlines. This is really easy to parse, and leans on the browser's JSON parser for most of the work. Here's a working demo, the benefits can be seen at slower connection speeds.
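A sketch of that newline-delimited approach, building on the reader pattern above (assuming one complete JSON message per line):

fetch(url).then(response => {
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffered = '';

  return reader.read().then(function process(result) {
    if (result.done) return;
    buffered += decoder.decode(result.value, {stream: true});
    const lines = buffered.split('\n');
    buffered = lines.pop(); // keep any trailing partial line for the next chunk
    lines.filter(Boolean).forEach(line => console.log(JSON.parse(line)));
    return reader.read().then(process);
  });
});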
I've also written an intro to web streams, which includes their use from within a service worker. You may also be interested in a fun hack that uses JavaScript template literals to create streaming templates.
Turns out I could get XHR to work, which doesn't really answer the request vs. fetch question. It took a few tries and the right ordering of operations to get it right. Here's the abstracted code. @jaromanda was right.
var _tryXhr = function(target, data) {
  console.log(target, data);
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    console.log("state change.. state: " + this.readyState);
    console.log(this.responseText);
    if (this.readyState === 4) {
      // gets hit on completion.
    }
    if (this.readyState === 3) {
      // gets hit on new event
    }
  };
  xhr.open("POST", target);
  xhr.setRequestHeader("cache-control", "no-cache");
  xhr.setRequestHeader("Content-Type", "application/json");
  xhr.send(data);
};
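One caveat with the readyState 3 approach: responseText accumulates everything received so far, not just the newest chunk, so the handler has to track what it has already consumed. A sketch, assuming newline-delimited JSON messages:

var seen = 0;     // index into responseText that we've already processed
var partial = ''; // trailing partial line from the previous event

xhr.onreadystatechange = function () {
  if (this.readyState === 3 || this.readyState === 4) {
    partial += this.responseText.substring(seen);
    seen = this.responseText.length;
    var lines = partial.split('\n');
    partial = lines.pop(); // keep the incomplete last line for the next event
    lines.filter(Boolean).forEach(function (line) {
      console.log(JSON.parse(line)); // assumes one JSON message per line
    });
  }
};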