How to save a zip file represented as a string in Node.js - javascript

I have a response from an eBay API:
--MIMEBoundaryurn_uuid_C91296EA5FF69EE9571479882375576565344
Content-Type: application/xop+xml; charset=utf-8; type="text/xml"
Content-Transfer-Encoding: binary
Content-ID: <0.urn:uuid:C91296EA5FF69EE9571479882375576565345>
Success1.1.02016-11-23T06:26:15.576Z514
--MIMEBoundaryurn_uuid_C91296EA5FF69EE9571479882375574545344
Content-Type: application/zip
Content-Transfer-Encoding: binary
Content-ID:
PKY'uIi[��#�50014028337_report.xmlUT y�2Xy�2Xux
00�R�j�#��+��[��PlX#�(�x,=l�q]Lfewc��w Ĥ��O��١�HT���t��GGT�
��6�;���'������.$����=d����m;c}Wߦ�RW�A
f�����g�I��4U��x��3��f���ғ{f��xj�,+���ۖI%5��B's��G,#��t,L{�c�����MD笓��)!�9��
�M�o;8_��<�i�y����sz���u���=��Ջ^2�S��%+2�2�`QV�$�����~?�w�ǥ�_Q�퉦�'PKY'uIi[��#���50014028337_report.xmlUTy�2Xux
00PK\�
--MIMEBoundaryurn_uuid_C91296EA5FF69EE9571479882375576565344--
This is of type string, and I extracted the attached zip file data, i.e.:
PKY'uIi[��#�50014028337_report.xmlUT y�2Xy�2Xux
00�R�j�#��+��[��PlX#�(�x,=l�q]Lfewc��w Ĥ��O��١�HT���t��GGT�
��6�;���'������.$����=d����m;c}Wߦ�RW�A
f�����g�I��4U��x��3��f���ғ{f��xj�,+���ۖI%5��B's��G,#��t,L{�c�����MD笓��)!�9��
�M�o;8_��<�i�y����sz���u���=��Ջ^2�S��%+2�2�`QV�$�����~?�w�ǥ�_Q�퉦�'PKY'uIi[��#���50014028338_report.xmlUTy�2Xux
00PK\�
This shows that it contains a report.xml. But when I write this data to a zip file, the zip file is created, yet extracting it gives an error.
fs.writeFile("./static/DownloadFile.zip", fileData, 'binary', function(err){
if (err) throw err;
console.log("success");
});
How can I write this data to a zip file properly? Please advise; I can share more information if required.
EDIT:
I tried writing the zip file in PHP, and it writes successfully with this code:
$zipFilename = "DownloadFile.zip";
$data = $fileData;
$handler = fopen($zipFilename, 'wb')
    or die("Failed. Cannot Open $zipFilename to Write!</b></p>");
fwrite($handler, $data);
fclose($handler);
Please advise how I can achieve the same thing in Node.js.

Depending on which HTTP client you are using, the implementation might change a little.
With axios I'm doing something like this:
I'm requesting a zip file, so I specify the Accept header as application/zip.
In order to get a Buffer and not a binary string, I specify the responseType as arraybuffer.
const res = await axios.get('/routToThat/file', {
    headers: {
        Accept: 'application/zip',
    },
    responseType: 'arraybuffer',
});
By doing the latter, instead of receiving a binary string from the response:
A#B�ArE⏾�7�ϫ���f�걺N�����Yg���o_M^�D�T�U X_���e?� hi\...
I receive a Buffer:
Buffer(22781691) [80, 75, 3, …]
Once the request is resolved and I have that Buffer, I use the same writeFile function from fs.
NOTE: I'm not specifying the encoding in writeFile.
fs.writeFile(name, res.data, (err) => {
    if (err) throw err;
    console.log("success");
});
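Putting the two snippets together, a minimal end-to-end sketch (the route and output file name are placeholders carried over from above):
const fs = require('fs');
const axios = require('axios');

async function downloadZip() {
    const res = await axios.get('/routToThat/file', {
        headers: { Accept: 'application/zip' },
        responseType: 'arraybuffer', // in Node, res.data arrives as a Buffer
    });
    // No encoding argument: res.data is already a Buffer
    fs.writeFile('DownloadFile.zip', res.data, (err) => {
        if (err) throw err;
        console.log("success");
    });
}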

As I can see in your code example, your binary data is already mangled by the request module. Just set
encoding: null
in the request options, and the zip file in body will be valid binary data (now a Buffer instead of a UTF-8 string!) that you can decompress. As long as you see the question marks, you still have the encoding issue.
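A minimal sketch of that fix, assuming the request module and reusing the file path from the question (the endpoint URL is a placeholder):
const fs = require('fs');
const request = require('request');

request.get({
    url: 'https://api.example.com/file', // hypothetical endpoint
    encoding: null // body arrives as a Buffer instead of a utf-8 string
}, function(err, response, body) {
    if (err) throw err;
    // body is a Buffer here, so no encoding argument is needed
    fs.writeFile("./static/DownloadFile.zip", body, function(err) {
        if (err) throw err;
        console.log("success");
    });
});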

Related

Fetch API is modifying request body inconsistently

I'm trying to upload an arbitrary list of files:
for (let i = 0; i < files.length; i++) {
    let fileTypeToUse = files[i].type;
    fetchWithRetry(url, 1000, 2, {
        method: 'POST',
        credentials: 'same-origin',
        headers: {
            'Content-Type': fileTypeToUse
        },
        body: files[i],
    });
}
This works for most file types (including images) by taking the bytes and sending them in the body of the request. But when I try to upload audio of type "audio/mpeg" my server receives a file which is about 60% larger than I expected.
I initially assumed this meant the file was being base64 encoded by the browser, so I tried to decode the file. Unfortunately, this hypothesis seemed to be incorrect. I received a decoding error on the server: base64.StdEncoding.Decode: illegal base64 data at input byte 3
For reference, the original question included screenshots of the files object being uploaded and of the oversized object the browser sent to the server.
Related issue: https://stackoverflow.com/a/40826943/12713117. Unfortunately, it encourages people to upload the file as form/multipart instead of uploading the file directly.
I have two questions. First, why is the uploaded object larger than the file accessible in javascript? Second, how do I know if an arbitrary file will be sent as-is (which is the case with images) or if it will be encoded and need decoding on the server?
Fetch With Retry Code
// wait() is assumed to be a promise-based delay helper, e.g.:
// const wait = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
function fetchWithRetry(url, delay, tries, fetchOptions = {}) {
    return fetch(url, fetchOptions).catch((err) => {
        let triesLeft = tries - 1;
        if (triesLeft == null || triesLeft < 1) {
            throw err;
        }
        return wait(delay).then(() => fetchWithRetry(url, delay * 2, triesLeft, fetchOptions));
    });
}
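As a diagnostic (an assumption on my part, not a confirmed fix), you can read the file into an ArrayBuffer yourself and compare its byteLength with file.size before posting; an ArrayBuffer body also rules out any re-encoding of the File object along the way:
async function uploadRaw(url, file) {
    // file.arrayBuffer() yields the exact bytes; byteLength should equal file.size
    const bytes = await file.arrayBuffer();
    console.log(file.size, bytes.byteLength);
    return fetchWithRetry(url, 1000, 2, {
        method: 'POST',
        credentials: 'same-origin',
        headers: { 'Content-Type': file.type },
        body: bytes
    });
}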

How can I compress JSON and uncompress on PHP?

BTW, this is not a duplicate: I'm not trying to compress in PHP, I'm trying to compress client-side and decompress in PHP.
I'm trying to compress a JSON array containing 5 base64 images and some text before sending it to my PHP API.
I have tried lz-string and pako and it appears to be sending the compressed payload as expected but I'm having issues decompressing on the PHP backend.
TypeScript gzip:
var payload = pako.gzip(JSON.stringify(data), { to: 'string' });
let headers = new Headers({ "Content-Encoding": "gzip" });
headers.append('Content-Type', 'application/json; charset=x-user-defined-binary');
let requestOptions = new RequestOptions({ headers: headers });
var url = this.baseUrl + "/app/notice/issue/compressed";
return this.http.post(url, payload, requestOptions).pipe(
    timeout(100000),
    map(res => res.json()),
    catchError(err => throwError(err.message))
);
PHP
$input = file_get_contents("php://input");
$test = gzdecode($input);
echo $test;
Output
"
<div style="border:1px solid #990000;padding-left:20px;margin:0 0 10px 0;">
<h4>A PHP Error was encountered</h4>
<p>Severity: Warning</p>
<p>Message: gzdecode(): data error</p>
</div>"
If anyone can help in reducing the size of my JSON, that would be really helpful.
Thanks.
Your PHP example looks fine and should work.
I'm not sure about the TypeScript part: the binary string can be additionally encoded (as UTF-8, maybe?) before being sent by the browser or the http.post() method.
Check the Content-Length sent with the request against the actual length of the binary string after compression. They obviously must be the same.
I suggest removing the { to: 'string' } option from the pako.gzip() call to get a Uint8Array instead of a string, and using XMLHttpRequest instead of http.post().
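A minimal sketch of that suggestion (assumes pako is loaded; the URL is reused from the question):
// gzip to a Uint8Array (no { to: 'string' }), then send the raw bytes
var bytes = pako.gzip(JSON.stringify(data));

var xhr = new XMLHttpRequest();
xhr.open('POST', this.baseUrl + "/app/notice/issue/compressed");
xhr.setRequestHeader('Content-Type', 'application/json');
xhr.setRequestHeader('Content-Encoding', 'gzip');
xhr.send(bytes); // a Uint8Array is transmitted as-is, with a matching Content-Length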
See also:
Angular Post Binary Data
https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/Sending_and_Receiving_Binary_Data

Node JS + DIME - sending binary data in POST

There's a file 1740 bytes long; its contents are read into a Buffer res. res.length is 1740 and res.toString('binary', 0, res.length).length is also 1740.
I send a POST request using the request lib:
request.post({
    url: endpoint,
    headers: headers,
    body: res.toString('binary', 0, res.length)
}, callback);
The request goes to a gSOAP server. Through hours of debugging on the server I send the request to, we found the following: the request that arrives at the server is 1753 bytes long and some characters are converted. In particular, hex B7 becomes C2 B7, so it's converted as described here: http://www.fileformat.info/info/unicode/char/b7/index.htm
I tried setting encoding: 'binary' and encoding: null in the request params, with the same result (with encoding: null I only get the error message as a buffer, but that's all).
I tried using the https library and piping a stream into the request, same result.
Best regards, Alexander
EDIT
At the moment, I found a workaround with cURL: just sending the request from the CLI with --data-binary "@file_to_which_i_dumped_the_request" does the trick. But the app and the Node.js server itself are shipped within an installer, so we'd have to install cURL on users' machines too, which is... acceptable, but not the best option.
So is there a way to send a binary POST body with Node.js?
Thanks.
Don't use the binary string encoding: it has been deprecated, and it only makes sense if "the other side" will decode it back to a buffer.
Just use the buffer directly:
request.post({
    url: endpoint,
    headers: headers,
    body: res
}, callback);

Node.js - How to serve multiple SVG files to a browser

I'm new to Node and server-oriented code, and I'm trying to fetch multiple SVG files stored on the server.
Here is my client-side code, using jQuery:
$.ajax({
    url: someURL,
    data: someData
})
.done(function(data) {
    console.log('data got', data);
    callback(null, data);
})
.fail(function() {
    callback(new Error('Cannot access files'));
});
And here is my server-side code:
// links is an array of links to the different svg files
var svgs = [];
async.eachSeries(links, function(link, next) {
    fs.readFile(link, function(err, svg) {
        svgs.push(svg);
        next(err);
    });
}, function(err) {
    if (err) {
        response.writeHead(500);
        response.end(JSON.stringify(err));
        return;
    }
    response.writeHead(200);
    response.end(svgs); // Doesn't work
    // response.end(svgs[0]); // Works
});
As long as I send only one file to the browser (which seems to be a Buffer instance), everything works fine, but when I try to send multiple ones as an array, the request succeeds yet the returned data is empty. That may be related to the MIME type of what I'm trying to send, but I couldn't find how to handle that.
You'll have to convert svgs into a String or Buffer first. One option is to stringify it as JSON:
response.writeHead(200, {
    'Content-Type': 'application/json'
});
response.end(JSON.stringify(svgs));
This is because response.write(), which response.end() calls to handle the data (svgs), doesn't accept arrays. From the Node.js docs: "chunk can be a string or a buffer. If chunk is a string, the second parameter specifies how to encode it into a byte stream. By default the encoding is 'utf8'."
Each svg provided by fs.readFile() is a Buffer, which is why writing svgs[0] works.
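Note that JSON.stringify() serializes each Buffer as an object of the form {"type":"Buffer","data":[...]}. If the client expects SVG markup, a variant (a sketch, not part of the original answer) is to read each file as a UTF-8 string instead:
// Passing 'utf8' makes fs.readFile yield strings, so the JSON
// response contains SVG markup rather than serialized Buffer objects
fs.readFile(link, 'utf8', function(err, svg) {
    svgs.push(svg);
    next(err);
});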

Receiving POST request with node.js

I have an Axis M1011 camera which is set up to send a series of JPEG images to a service (using HTTP POST) for as long as it detects motion. I'm building the service using node.js.
I'm successfully receiving the POST requests with their headers, but I am having trouble saving the data in the body of the request. Here is the code:
function addEvent(req, res)
{
    var buffer = '';
    console.log(req.headers);
    req.on("data", function(chunk)
    {
        console.log("chunk received");
        buffer += chunk;
    });
    req.on("end", function()
    {
        console.log("saving file");
        fs.writeFile("./tmp/" + new Date().getTime() + ".jpg", buffer, function(error)
        {
            if (error)
            {
                console.log(error);
            }
            else
            {
                console.log("saved");
                res.send("OK");
                res.end();
            }
        });
    });
}
On the console I get this kind of output. Of course, the content-length differs from file to file:
{ host: '192.168.0.100:8888',
'content-type': 'image/jpeg',
'content-disposition': 'attachment; filename="file13-07-19_20-49-44-91"',
'content-length': '18978' }
chunk received
chunk received
chunk received
chunk received
chunk received
chunk received
chunk received
chunk received
chunk received
chunk received
chunk received
chunk received
chunk received
chunk received
saving file
saved
The problem is that I always get the same corrupted file in the tmp folder, about 33 KB in size, no matter how big the image is. What am I doing wrong when receiving these files?
You need to process the POST request to get the file that has been sent. When you submit a file in a POST request, you wrap the file's metadata as well as its data and send it to the server.
The server has to decode the request and extract the file; simply saving the raw request won't do. You did not mention whether you are using any web-server framework. It is better to use one like Express, which does this for you: Express will parse the request, get the file object, and save it into a temporary file.
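Separately, note that buffer += chunk in the question's code converts each Buffer chunk to a UTF-8 string, which by itself corrupts binary data. If the camera posts the raw JPEG as the body (as the image/jpeg Content-Type suggests), a minimal sketch that keeps the bytes intact would collect the chunks as Buffers:
function addEvent(req, res)
{
    var chunks = [];
    req.on("data", function(chunk)
    {
        chunks.push(chunk); // keep each chunk as a Buffer
    });
    req.on("end", function()
    {
        // Buffer.concat preserves the raw bytes; no UTF-8 round trip
        fs.writeFile("./tmp/" + new Date().getTime() + ".jpg", Buffer.concat(chunks), function(error)
        {
            if (error) console.log(error);
            else res.end("OK");
        });
    });
}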
