I am using fs.writeFile and I can't save the file in ANSI1255 (windows-1255).
The file ends up UTF-8 encoded.
I tried:
const encodedData = windows1255.encode(doc);
but it doesn't work for me.
Can't you just use the optional options.encoding param, like this?
filehandle.writeFile(data, {
  encoding: 'ANSI1255' // or whatever it's properly called
});
If that doesn't work, this might: https://stackoverflow.com/a/53855344/7451109
Related
I'm trying to get a blob for UTF-16 (I think). Here's the code. I'm not including the full string because it is too long.
const createBlobUrl = (str) => {
  console.log(str);
  // ?PNG\r\n\u001a\n\u0000\u0000\u0000\rIHDR\u0000\u0000\b$... (binary PNG data mangled into a string; truncated here)
  const blob = new Blob([str], {
    type: 'image/png',
  });
  return URL.createObjectURL(blob);
};
First of all, is this even UTF-16? Second, what am I doing wrong? BTW, my goal is to pass this URL into an anchor tag.
Figured it out. @Kaiido, you were right: I was in fact losing data over HTTP. I had to encode the data to base64 on the backend before sending it over HTTP, and then decode it on the frontend with something like:
https://stackoverflow.com/a/35713609/11477406
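The fix described above can be sketched with standard APIs; the payload below is a stand-in fabricated with btoa for the demo, where the real backend would send something like pngBuffer.toString('base64'):

```javascript
// Stand-in for the base64 text the backend would send over HTTP:
const payload = btoa('\x89PNG');

// atob yields a "binary string" (one char per byte); map the char codes
// into a Uint8Array so the Blob receives real bytes, not UTF-16 text:
const bytes = Uint8Array.from(atob(payload), c => c.charCodeAt(0));
const blob = new Blob([bytes], { type: 'image/png' });
const url = URL.createObjectURL(blob); // safe to put on an <a href>
```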
I am downloading a .csv.gz file from a remote server, and I have the contents of this file stored as a string. Here is a small sample of what I see when I console.log it:
�}�v������)��t�Y�j�8p0�eCR��
l��1�=���6������~̵r�����0c7�7L���������U:���0�����g��
How can I unzip this in Node.js so that it converts it to the original .csv file?
I have tried zlib.gunzip(Buffer.new(body), callback), but then I get an error
incorrect header check at Gunzip.zlibOnError (zlib.js:152:15)
The file itself is valid, and I can double-click to unzip and open it on my computer.
I create the file using: zlib.createGzip(); and then gzip.pipe(writeStream);
Update
The (actual) issue was that my data was utf8-encoded, so I needed to ensure it remained either a Buffer or binary.
The problem is that fs.createWriteStream defaults to utf-8 encoding; change that to binary and you'll be able to create a valid buffer that gunzip will happily accept.
You could probably accomplish this by changing your code to:
gzip.pipe(fs.createWriteStream(path, { encoding: 'binary' }));
see https://nodejs.org/api/fs.html#fs_fs_createwritestream_path_options
UPDATE:
I have modified the code so that you now have a Buffer which actually gets decompressed.
const zlib = require('zlib');
const fs = require('fs');

function decompressFile(filename) {
  const input = fs.createReadStream(filename);
  const data = [];
  input.on('data', function (chunk) {
    data.push(chunk);
  }).on('end', function () {
    const buf = Buffer.concat(data);
    zlib.gunzip(buf, function (err, buffer) {
      if (!err) {
        console.log(buffer.toString() + '\n');
      } else {
        console.log(err);
      }
    });
  });
}
decompressFile('TestFileSheet1.csv.gz');
This looks straightforward, so I think the problem might be somewhere else in your code, or in the HTTP library you are using. Check whether the response's Content-Encoding header is gzip before calling zlib.gunzip; your HTTP library might already be decompressing the csv file for you.
I have linked a JSON response to the user's request to fetch an Excel document; the response looks like this:
{
  "format":   // file extn ---- only xls
  "docTitle": // file name
  "document": // base64 encoded data
}
I tried to handle the API on the frontend by decoding the document key using the atob() function.
downloadTemplate() {
  this.fetchData.fetchDownloadTemplate().subscribe(
    res => {
      let docString = atob(res.document);
      let format = res.format;
      let fileName = res.docTitle;
      let file = new Blob([docString], { type: "text/xls" });
      let url = URL.createObjectURL(file);
      let dwldLink = document.getElementById('template_link');
      dwldLink.setAttribute("href", url);
      dwldLink.setAttribute("download", fileName);
      document.body.appendChild(dwldLink);
      dwldLink.click();
    },
    err => {
      console.log(err.responseJSON.message);
    });
}
But the data gets corrupted. On doing some research, I learned that atob() decodes using the ASCII charset, while the encoding was done with the UTF-8 charset.
Can you suggest an alternative method for decoding the data with the UTF-8 charset in TypeScript? The Angular (v4+) project I am working on throws errors on using plain JS validators.
While searching for a suitable module that supports Unicode-to-binary conversion, I came across
https://github.com/dankogai/js-base64#decode-vs-atob-and-encode-vs-btoa
Using Javascript's atob to decode base64 doesn't properly decode utf-8 strings
Base64.decode() supports UTF-8 decoding (via its utob conversion), unlike the native atob().
The module supports Angular (v4+), and it also ships type definitions for TypeScript (2.x).
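Whichever library you settle on, the underlying fix can be sketched with standard APIs alone: decode the base64 into real bytes first, then run text through TextDecoder, or hand the bytes straight to a Blob for binary payloads like the .xls document. The payload below is a stand-in built for the demo.

```javascript
// Stand-in for the backend's base64 of a UTF-8 string:
const b64 = Buffer.from('שלום', 'utf8').toString('base64');

// atob alone returns one char per *byte*, which mangles multi-byte UTF-8.
// Convert the char codes into real bytes first:
const bytes = Uint8Array.from(atob(b64), c => c.charCodeAt(0));

// For text, decode those bytes as UTF-8:
const text = new TextDecoder('utf-8').decode(bytes);
console.log(text); // שלום

// For binary payloads such as the .xls document, skip the text step and
// pass `bytes` straight to the Blob:
//   new Blob([bytes], { type: 'application/vnd.ms-excel' });
```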
I'm trying to generate a QR code and then return it as a Base64-encoded data URL. I know that there's a module (https://github.com/cmanzana/qrcode-npm) which is supposed to be able to do this directly, but I'm having trouble installing the canvas module dependency. I'm still working on that front, but for now my attempted workaround is to generate an image stream with an alternate module, then convert it to Base64. This is what I have so far:
var qrBase64 = '';
var qrImg = qr.image(qrText, { type: 'png', ec_level: 'L' });
qrImg.on('readable', function () {
  qrBase64 += qrImg.read().toString('base64');
});
qrImg.on('end', function () {
  qrBase64 = "data:image/png;base64," + qrBase64;
  return res.json({
    success: true,
    qrBase64: qrBase64
  });
});
It seems to work in that it gives me a string which resembles a Base64-encoded string. However, if I try to view it in a browser, I get an invalid URL error. If I pipe the qrImg to a file and then use an online tool to convert it to Base64, the result (which is valid and works in a browser) does not match my Node result.
You need to base64-encode all of the image data at once: encoding the chunks separately and concatenating the results does not yield the same string as concatenating first and encoding once. Take a look at this example:
btoa("12" + "34") -> "MTIzNA=="
btoa("12") + btoa("34") -> "MTI=MzQ="
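Applied to the stream above, that means collecting the Buffer chunks in the 'readable' handler and encoding once in 'end' with Buffer.concat. A minimal sketch of the difference:

```javascript
// Two chunks whose byte lengths are not multiples of 3, like stream chunks:
const chunks = [Buffer.from('12'), Buffer.from('34')];

// Encoding each chunk separately inserts '=' padding mid-stream:
const perChunk = chunks.map(c => c.toString('base64')).join('');
console.log(perChunk); // 'MTI=MzQ='

// Concatenate first, then encode once:
const allAtOnce = Buffer.concat(chunks).toString('base64');
console.log(allAtOnce); // 'MTIzNA=='
```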
I am encoding an MP3 file to Base64 in Node.js using this method:
encodebase64 = function (mp3file) {
  var bitmap = fs.readFileSync(mp3file);
  var encodedstring = new Buffer(bitmap).toString('base64');
  fs.writeFileSync('encodedfile.bin', encodedstring);
};
and then I want to reconstruct the MP3 file from the Base64 .bin file, but the file created is missing some headers, so obviously there's a problem with the decoding.
The decoding function is:
decodebase64 = function (encodedfile) {
  var bitmap = fs.readFileSync(encodedfile);
  var decodedString = new Buffer(bitmap, 'base64');
  fs.writeFileSync('decodedfile.mp3', decodedString);
};
I wondered if anyone can help.
Thanks.
Perhaps it is an issue with the encoding parameter; see this answer for details. Try using utf8 when decoding to see if that makes a difference. What platform are you running your code on?
@Noah mentioned an answer about base64 decoding using Buffers, but if you use the same code from that answer and try to create MP3 files with it, they won't play, and their file size will be larger than the originals, just like you experienced in the beginning.
We should write the buffer directly to the mp3 file we want to create, without converting it (the buffer) to an ASCII string:
// const buff = Buffer.from(audioContent, 'base64').toString('ascii'); // don't
const buff = Buffer.from(audioContent, 'base64');
fs.writeFileSync('test2.mp3', buff);
More info about fs.writeFile / fs.writeFileSync