Create image from ArrayBuffer in Nodejs - javascript

I'm trying to create an image file from chunks of ArrayBuffers.
all = fs.createWriteStream("out." + imgtype);
for (i = 0; i < end; i++) {
    all.write(picarray[i]);
}
all.end();
where picarray contains ArrayBuffer chunks. However, I get the error TypeError: Invalid non-string/buffer chunk.
How can I convert ArrayBuffer chunks into an image?

Have you tried first converting it into a Node.js Buffer? (This is the native Node.js Buffer interface, whereas ArrayBuffer is the browser interface and is not fully supported for Node.js write operations.)
Something along the lines of this should help:
var all = fs.createWriteStream("out." + imgtype);
for (var i = 0; i < end; i++) {
    // wrap each ArrayBuffer in a Uint8Array view and copy it into a Node.js Buffer
    var buffer = Buffer.from(new Uint8Array(picarray[i]));
    all.write(buffer);
}
all.end();
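A variant sketch, not from the original answer, assuming picarray is a plain array of ArrayBuffer chunks: copy each chunk into a Buffer, concatenate them, and write the image in a single call:
const buffers = picarray.slice(0, end).map(chunk => Buffer.from(new Uint8Array(chunk)));
fs.writeFile("out." + imgtype, Buffer.concat(buffers), (err) => {
    if (err) console.error(err);
});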

After spending some time on this, I got it working perfectly.
As mentioned by @Nick, you will have to convert the buffer array you received from the browser into a Node.js Buffer.
var readWriteFile = function (req) {
    var fs = require('fs');
    var data = Buffer.from(req);
    fs.writeFile('fileName.png', data, 'binary', function (err) {
        if (err) {
            console.log("There was an error writing the image");
        } else {
            console.log("The file was written");
        }
    });
};

ArrayBuffer is a browser interface and cannot be written to a file directly; we need to convert it to Buffer, the native API of the Node.js runtime.
These few lines of code will create the image.
const fs = require('fs');
let data = arrayBuffer; // your image is stored in the arrayBuffer variable
data = Buffer.from(data);
fs.writeFile(`Assets/test.png`, data, err => { // Assets is a folder in your root directory
    if (err) {
        console.log(err);
    } else {
        console.log('File created successfully!');
    }
});

Related

Why is nodeJs not reading entire binary file from disk?

I have a PDF file which I want to read into memory using Node.js. Ideally I'd like to encode it using base64 for transferring it. But somehow the read function does not seem to read the full PDF file, which makes no sense to me. The original PDF was generated using pdfKit and is fine and viewable in a PDF reader program.
The original file test.pdf is 90 kB on disk, but if I read it and write it back to disk only 82 kB remain, and the new PDF test-out.pdf is broken. The PDF viewer says:
Unable to open document. The pdf document is damaged.
The base64 encoding therefore also does not work correctly. I tested it using this webservice. Does anyone know why this is happening, and how to resolve it?
I found this post already.
const fs = require('fs');
let buf = fs.readFileSync('test.pdf'); // returns raw buffer binary data
// buf = fs.readFileSync('test.pdf', {encoding:'base64'}); // for the base64 encoded data
// ...transfer the base64 data...
fs.writeFileSync('test-out.pdf', buf); // should be pdf again
EDIT MCVE:
const fs = require('fs');
const PDFDocument = require('pdfkit');

let filepath = 'output.pdf';

class PDF {
    constructor() {
        this.doc = new PDFDocument();
        this.setupdocument();
        this.doc.pipe(fs.createWriteStream(filepath));
    }

    setupdocument() {
        var pageNumber = 1;
        this.doc.on('pageAdded', () => {
            this.doc.text(++pageNumber, 0.5 * (this.doc.page.width - 100), 40, {width: 100, align: 'center'});
        });

        this.doc.moveDown();
        // draw some headline text
        this.doc.fontSize(25).text('Some Headline');
        this.doc.fontSize(15).text('Generated: ' + new Date().toUTCString());
        this.doc.moveDown();
        this.doc.font('Times-Roman', 11);
    }

    report(object) {
        this.doc.moveDown();
        this.doc
            .text(object.location + ' ' + object.table + ' ' + Date.now())
            .font('Times-Roman', 11)
            .moveDown()
            .text(object.name)
            .font('Times-Roman', 11);
        this.doc.end();
        let report = fs.readFileSync(filepath);
        return report;
    }
}

let pdf = new PDF();
let buf = pdf.report({location: 'athome', table: 'wood', name: 'Bob'});
fs.writeFileSync('outfile1.pdf', buf);
The encoding option for fs.readFileSync() tells the function what encoding the file is already in, so the code reading the file knows how to interpret the data. It does not convert the data into that encoding.
In this case your PDF is binary, not base64, so you are telling it to convert from base64 into binary, which mangles the data.
You should not pass the encoding option at all; you will then get the raw binary buffer (which is what a PDF file is: raw binary). If you then want to convert that to base64 for some reason, you can do buf.toString('base64') on it. But that is not its native format, and if you write that converted data back out to disk, it won't be a valid PDF file.
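A minimal sketch of that round trip, not from the original answer (the test-roundtrip.pdf filename is just a placeholder): read the raw Buffer, encode it for transfer, then decode it back to a Buffer before writing.
const fs = require('fs');
const raw = fs.readFileSync('test.pdf');            // raw binary Buffer
const b64 = raw.toString('base64');                 // base64 string, safe to transfer as text
const restored = Buffer.from(b64, 'base64');        // decode back to raw binary
fs.writeFileSync('test-roundtrip.pdf', restored);   // byte-for-byte copy of the original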
To just read and write the same file out to a different filename, leave off the encoding option entirely:
const fs = require('fs');
let buf = fs.readFileSync('test.pdf'); // get raw buffer binary data
fs.writeFileSync('test-out.pdf', buf); // write out raw buffer binary data
After a lot of searching I found this GitHub issue. The problem in my question seems to be the call to doc.end(), which for some reason doesn't wait for the stream to finish (the finish event of the write stream). Therefore, as suggested in the GitHub issue, the following approaches work:
callback based:
const doc = new PDFDocument();
const writeStream = fs.createWriteStream('filename.pdf');
doc.pipe(writeStream);
doc.end();
writeStream.on('finish', function () {
    // do stuff with the PDF file
});
or promise based:
const stream = fs.createWriteStream(localFilePath);
doc.pipe(stream);
.....
doc.end();
await new Promise<void>(resolve => {
    stream.on("finish", function() {
        resolve();
    });
});
Or, even nicer: instead of calling doc.end() directly, call the savePdfToFile function below:
function savePdfToFile(pdf : PDFKit.PDFDocument, fileName : string) : Promise<void> {
    return new Promise<void>((resolve, reject) => {
        // To determine when the PDF has finished being written successfully
        // we need to confirm the following 2 conditions:
        //
        //   1. The write stream has been closed
        //   2. PDFDocument.end() was called synchronously without an error being thrown
        let pendingStepCount = 2;
        const stepFinished = () => {
            if (--pendingStepCount == 0) {
                resolve();
            }
        };
        const writeStream = fs.createWriteStream(fileName);
        writeStream.on('close', stepFinished);
        pdf.pipe(writeStream);
        pdf.end();
        stepFinished();
    });
}
This function should correctly handle the following situations:
PDF generated successfully
Error is thrown inside pdf.end() before write stream is closed
Error is thrown inside pdf.end() after write stream has been closed
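A usage sketch under the same assumptions (the document content and the output.pdf filename are placeholders):
const doc = new PDFDocument();
doc.text('Hello PDF');                       // build the document as usual
await savePdfToFile(doc, 'output.pdf');      // replaces the direct doc.end() call
const buf = fs.readFileSync('output.pdf');   // safe to read: the write stream is closed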

create-react-app javascript convert file to a Uint8Array

I have a create-react-app that updates the firmware of connected Bluetooth devices.
In order to do this, I need to convert the firmware file (.zip) to a Uint8Array.
The firmware file is saved locally in my public/ folder.
So I try to extract the bytes with this function:
var fimware_zip = process.env.PUBLIC_URL + '/ZioV8_1.2.7.zip'
this.loadFile(fimware_zip)
With loadFile defined as:
// Load a file, set the bytes to firmware_byte_array
loadFile = async (my_file) => {
    console.log(my_file)
    var fr = new FileReader();
    fr.onload = (e) => {
        var arrayBuffer = e.target.result;
        var array = new Uint8Array(arrayBuffer);
        this.setState({ firmware_byte_array: array })
    }
    fr.readAsArrayBuffer(my_file);
}
However I get the following error:
Unhandled Rejection (TypeError): Failed to execute 'readAsArrayBuffer' on 'FileReader': parameter 1 is not of type 'Blob'.
I've searched high and low looking for how to convert a file into a Blob type and I just can't do it.
I've also tried putting the .zip file in the src/ folder and importing it using
import fimware_zip from './ZioV8_1.2.7.zip'
But that also doesn't work.
Any help would be greatly appreciated
You can only use readAsArrayBuffer on Blobs or File objects (such as those you get from input type="file" elements).
I assume there's some kind of server process involved in this app, in which case you can use fetch:
const loadFile = async (my_file) => {
    const response = await fetch(my_file);
    if (!response.ok) {
        throw new Error("HTTP error " + response.status);
    }
    const array = new Uint8Array(await response.arrayBuffer());
    // ...use or return `array` here...
};
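A usage sketch in the question's setup, assuming loadFile is changed to return array instead of only using it internally:
const firmware_zip = process.env.PUBLIC_URL + '/ZioV8_1.2.7.zip';
const array = await loadFile(firmware_zip);   // fetch the zip from the public/ folder
this.setState({ firmware_byte_array: array });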

Buffer not recognized as buffer on nodejs with sails js

I'm trying to get the buffer from a Blob being sent to my Sails.js server.
An example of what is being sent to the server is this:
Blob(3355) {size: 3355, type: "video/x-matroska;codecs=avc1,opus"}
Once on the server side, I do the following:
let body = new Array(0);
let buffer;
let readable = req.file('recordingPart');
readable.on('data', (chunk) => {
    body.push(new Buffer(chunk));
});
readable.on('end', () => {
    buffer = Buffer.concat(body);
    console.log('There will be no more data.', buffer.length, buffer);
});
When running this part of the code I get the error:
buffer.js:226
throw new errors.TypeError(
^
TypeError [ERR_INVALID_ARG_TYPE]: The first argument must be one of type string, buffer, arrayBuffer, array, or array-like object. Received type object
at Function.from (buffer.js:226:9)
at new Buffer (buffer.js:174:17)
...
In this case the error is at body.push(new Buffer(chunk)), specifically on new Buffer(chunk).
My first approach was similar:
let body = [];
let buffer;
let readable = req.file('recordingPart');
readable.on('data', (chunk) => {
    body.push(chunk);
});
readable.on('end', () => {
    buffer = Buffer.concat(body);
    console.log('There will be no more data.', buffer.length, buffer);
});
but I've got this error:
buffer.js:475
throw kConcatErr;
^
TypeError [ERR_INVALID_ARG_TYPE]: The "list" argument must be one of type array, buffer, or uint8Array
at buffer.js:450:20
In this one the error occurs at Buffer.concat(body).
I got some guidance from this answer: Node.js: How to read a stream into a buffer?
Can anyone help me get the buffer from that req.file?
You can get the uploaded file as below.
req.file('recordingPart').upload(function (err, uploadedFiles) {
    if (err) return res.serverError(err);
    // Logic with uploadedFiles goes here
});
You can get the file descriptor from uploadedFiles[0].fd and use it to read or stream the file as below.
fs.readFile(uploadedFiles[0].fd, function (err, data) {
    // Without an encoding argument, `data` is already a raw Buffer
    var myBuffer = Buffer.from(data);
    // Logic with myBuffer goes here
});
To use fs as above, require the fs module first:
var fs = require('fs');
Your current upload approach will work, but there's a newer way you might want to consider:
// Upload the image.
var info = await sails.uploadOne(photo, {
    maxBytes: 3000000
})
// Note: E_EXCEEDS_UPLOAD_LIMIT is the error code for exceeding
// `maxBytes` for both skipper-disk and skipper-s3.
.intercept('E_EXCEEDS_UPLOAD_LIMIT', 'tooBig')
.intercept((err) => new Error('The photo upload failed: ' + util.inspect(err)));
Full Example Here
Also check out the Sails.JS Platzi Course for video tutorials on this latest upload functionality using the example project Ration.

Upload a file stream to S3 without a file and from memory

I'm trying to create a CSV from a string and upload it to my S3 bucket. I don't want to write a file; I want it all to be in memory.
I don't want to read from a file to get my stream. I would like to make a stream without a file, something like this createReadStream method, but instead of a file I would like to pass a string with my stream's contents.
var AWS = require('aws-sdk'),
    zlib = require('zlib'),
    fs = require('fs'),
    s3Stream = require('s3-upload-stream')(new AWS.S3());

// Set the client to be used for the upload.
AWS.config.loadFromPath('./config.json');

// Create the streams
var read = fs.createReadStream('/path/to/a/file');
var upload = s3Stream.upload({
    "Bucket": "bucket-name",
    "Key": "key-name"
});

// Handle errors.
upload.on('error', function (error) {
    console.log(error);
});
upload.on('part', function (details) {
    console.log(details);
});
upload.on('uploaded', function (details) {
    console.log(details);
});

read.pipe(upload);
You can create a Readable stream and push your string directly to it, which can then be consumed by your s3Stream instance.
const Readable = require('stream').Readable
let data = 'this is your data'
let read = new Readable()
read.push(data) // Push your data string
read.push(null) // Signal that you're done writing
// Create upload s3Stream instance and attach listeners go here
read.pipe(upload)
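As a side note, not from the original answer: on Node.js 12+ you can build the same stream with Readable.from, reusing the upload instance created above:
const { Readable } = require('stream');
const read = Readable.from(['this is your data']); // emits the string as a single chunk
read.pipe(upload);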

Convert Blob data to Raw buffer in javascript or node

I am using the jsPDF plugin, which generates a PDF and saves it to the local file system. In jsPDF.js, there is a piece of code which generates PDF data in Blob format:
var blob = new Blob([array], {type: "application/pdf"});
and then saves the Blob data to the local file system. Instead of saving, I need to print the PDF using the node-printer plugin.
Here is some sample code to do so:
var fs = require('fs');
var dataToPrinter;
fs.readFile('/home/ubuntu/test.pdf', function(err, data) {
    dataToPrinter = data;
});

var printer = require("../lib");
printer.printDirect({
    data: dataToPrinter,
    printer: 'Deskjet_3540',
    type: 'PDF',
    success: function(id) {
        console.log('printed with id ' + id);
    },
    error: function(err) {
        console.error('error on printing: ' + err);
    }
});
fs.readFile() reads the PDF file and returns the data as a raw buffer.
Now what I want is to convert the Blob data into a raw buffer so that I can print the PDF.
If you are not using Node.js, you should know that the browser does not have a Buffer class implementation, and you are probably compiling your code for a browser environment with something like Browserify. In that case you need the buffer library, which provides a Buffer class meant to match a Node.js Buffer object as closely as possible (the implementation is at feross/buffer).
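A minimal sketch of that path; the require('buffer/') form (trailing slash) is how feross/buffer documents forcing the npm package over any built-in, and blob.arrayBuffer() is assumed to be available in the target browsers:
const { Buffer } = require('buffer/'); // feross/buffer, bundled by Browserify or webpack
blob.arrayBuffer().then((arrayBuffer) => {
    const buffer = Buffer.from(arrayBuffer); // browser-side equivalent of a Node.js Buffer
    // ...send `buffer` to the server, where node-printer can consume it...
});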
If you are using node-fetch (not OP's case) then you probably got a blob from a response object:
const fetch = require("node-fetch");
const response = await fetch("http://www.stackoverflow.com/");
const blob = await response.blob();
This blob is an internal implementation and exists only inside the node-fetch or fetch-blob libraries. To convert it to a native Node.js Buffer object, you need to transform it into an ArrayBuffer first:
const arrayBuffer = await blob.arrayBuffer();
const buffer = Buffer.from(arrayBuffer);
This buffer object can then be used for things such as file writes and server responses.
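For example, a quick sketch of those two uses (the page.html filename and the res response object are placeholders):
const fs = require('fs');
fs.writeFileSync('page.html', buffer); // write the fetched body to disk
// res.end(buffer);                    // or send it back in an HTTP response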
For me, it worked with the following:
const buffer=Buffer.from(blob,'binary');
So this buffer can be stored in Google Cloud Storage or on local disk with the fs Node package.
I used a Blob file to send data from client to server through the DDP protocol (Meteor), so when the file arrives at the server I convert it to a Buffer in order to store it.
var blob = new Blob([array], {type: "application/pdf"});
var arrayBuffer, uint8Array;
var fileReader = new FileReader();
fileReader.onload = function() {
    arrayBuffer = this.result;
    uint8Array = new Uint8Array(arrayBuffer);
    var printer = require("./js/controller/lib");
    printer.printDirect({
        data: uint8Array,
        printer: 'Deskjet_3540',
        type: 'PDF',
        success: function(id) {
            console.log('printed with id ' + id);
        },
        error: function(err) {
            console.error('error on printing: ' + err);
        }
    });
};
fileReader.readAsArrayBuffer(blob);
This is the final code which worked for me. The printer accepts the Uint8Array format.
Try:
var blob = new Blob([array], {type: "application/pdf"});
var buffer = new Buffer(blob, "binary");
