Node.js: Asynchronous issue when using ImageMagick

I am trying to use ImageMagick to resize my images and then pass them off to an image compression tool to compress them; the tool I am trying to use is pngquant.
Here is a snippet of my code:
// Rename and move original file
if (fs.existsSync(tmpPath + '/' + file)) {
  fs.renameSync(tmpPath + '/' + file, filePath + '/' + fileName);
  // Create new versions of each file
  Object.keys(geddy.config.uploader.imageVersions).forEach(function (version) {
    console.log(version);
    counter += 1;
    var opts = geddy.config.uploader.imageVersions[version];
    console.log(fileName);
    imageMagick.resize({
      width: opts.width,
      srcPath: filePath + '/' + fileName,
      dstPath: filePath + '/' + fileName.replace(/\.[^/.]+$/, "") + '_' + version + fileType
    }, finish(filePath + '/' + fileName.replace(/\.[^/.]+$/, "") + '_' + version + fileType));
  });
} else {
  console.log('Unable to find tmp file!');
}
Then here is my callback finish:
finish = function (file) {
  execFile(pngquantPath, ['256', '--force', file], function (err, data) {
    console.log(err, data);
  });
};
However, every time pngquant runs it reports that the file cannot be found. If I take the file parameter, go into the shell, and run pngquant file by hand, it works. So I am assuming it is an asynchronous issue (the file is not there yet when pngquant tries to run).
This is the error I end up with in the console:
{ [Error: spawn EACCES] code: 'EACCES', errno: 'EACCES', syscall: 'spawn' } ''
Any help is appreciated.

imageMagick.resize() expects two arguments: an options object and a callback.
For the second argument, you are not passing finish as your callback. Instead, you are invoking finish and passing whatever finish returns (probably undefined) as the callback argument.
Because you are invoking finish at the same time you make the resize() call, it is running immediately, before ImageMagick can complete its resize operation.
Try changing the resize() line to something like this:
imageMagick.resize({
  // (your resize options)
}, function (err) {
  if (err) { return console.error(err); } // whatever error handling you need
  var thePath = filePath + '/' + fileName.replace(/\.[^/.]+$/, "") + '_' +
    version + fileType;
  finish(thePath);
});
Now you’re passing a callback function as the second argument, and that function will not be invoked until imageMagick.resize() is done. And of course it has the signature of all Node callbacks, which is that the first argument to the callback is err. When it’s invoked, the callback does some error-checking and then calls finish.
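To make the distinction concrete, here is a tiny, made-up illustration (doWorkThen and its callback are hypothetical names, not part of the imagemagick module) of passing a function reference versus passing the result of calling it:

// A made-up async helper: runs the supplied callback once the "work" is done.
function doWorkThen(callback) {
  setTimeout(function () {
    if (typeof callback === 'function') { callback(); }
  }, 100);
}

function finish() { console.log('finished'); }

doWorkThen(finish);   // passes the function itself; 'finished' prints after the work
doWorkThen(finish()); // calls finish() right away and passes its return value
                      // (undefined), so nothing runs when the work completes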

Related

write() does not write sequentially?

I have found that the write() method of the stream.Writable class does not write data sequentially. When I am sending an attachment to the server in chunks, this code assembles the data chunks in the wrong order if no delay occurs. If I put a debug message like console.log() in the middle of the loop (for example, to dump the data and watch what is being written), the bug disappears. So what is the race condition in this code? It looks like I am enforcing sequential assembly of the file, so I do not understand what is wrong.
My code:
function join_chunks(company_id, attachment_id, num_chunks) {
  var stream;
  var file;
  var output_filename = ATTACHMENTS_PATH + '/comp' + company_id + '/' + attachment_id + '.data';
  var input_filename;
  var chunk_data;
  var chunk_count = 0;
  stream = fs.createWriteStream(output_filename, {flags: 'w+', mode: 0666});
  console.log('joining files:');
  for (var i = 0; i < num_chunks; i++) {
    input_filename = ATTACHMENTS_PATH + '/comp' + company_id + '/' + attachment_id + '-' + (i + 1) + '.chunk';
    console.log(input_filename);
    fs.readFile(input_filename, (err, chunk_data) => {
      if (err) throw err;
      stream.write(chunk_data, function() {
        chunk_count++;
        if (chunk_count == num_chunks) {
          console.log('join finished. closing stream');
          stream.end();
        }
      });
    });
  }
}
The console:
joining files:
/home/attachments/comp-2084830518/67-1.chunk
/home/attachments/comp-2084830518/67-2.chunk
/home/attachments/comp-2084830518/67-3.chunk
/home/attachments/comp-2084830518/67-4.chunk
join finished. closing stream
Node version: v6.9.2
fs.readFile is an asynchronous operation, so the reads you kick off in the loop can complete in any order, and each chunk is written in whatever order its read happens to finish. The console.log() only appears to fix things because it slows the loop down enough for the reads to complete in order.
If you want the chunks written in order, either read them synchronously with fs.readFileSync, or use the callbacks to sequence the work so that chunk i + 1 is only read after chunk i has been written.
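Here is a minimal sketch of the sequential approach, reusing the names from the question (ATTACHMENTS_PATH, the chunk-file layout, and so on); treat it as a starting point rather than a drop-in replacement:

var fs = require('fs');

function join_chunks(company_id, attachment_id, num_chunks) {
  var output_filename = ATTACHMENTS_PATH + '/comp' + company_id + '/' + attachment_id + '.data';
  var stream = fs.createWriteStream(output_filename, {flags: 'w+', mode: 0666});

  function writeChunk(i) {
    if (i >= num_chunks) {
      console.log('join finished. closing stream');
      return stream.end();
    }
    var input_filename = ATTACHMENTS_PATH + '/comp' + company_id + '/' + attachment_id + '-' + (i + 1) + '.chunk';
    fs.readFile(input_filename, function (err, chunk_data) {
      if (err) throw err;
      // Move on to the next chunk only after this one has been written.
      stream.write(chunk_data, function () {
        writeChunk(i + 1);
      });
    });
  }

  writeChunk(0);
}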

Getting 'undefined' After fs.stat

I have a function that tests whether a file exists or not before editing it. I use fs.stat.
fs.stat('../fill/bower.json', function (err, stats) {
  if (err) {
    console.log('You don\'t have a ' + clc.red('bower.json') + ' file! Type ' + clc.bgBlack.white('touch bower.json') + ' to get started.');
    return;
  }
  if (stats.isFile()) {
    var json = JSON.parse(fs.readFileSync('../bower.json', 'utf8')),
        string = '\n Dependencies: ' + json;
    fs.writeFile('../fill/README.md,', string, 'utf8');
    console.log('it\'s saved!');
  }
})
However, every time I run it (bower.json doesn't exist on purpose), it prints undefined before You don't have a bower.json file!. Why does this happen, and how can I stop the function from printing undefined?
Edit: for reference, here's my terminal window after running the command:
Why is undefined printed, and what do I do to have that not be displayed?
You're returning nothing or undefined from your reading function.
Gist for posterity
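As a small sketch of where that undefined comes from (checkBower is a hypothetical wrapper around the question's code, and the assumption is that whatever runs the command prints the function's return value):

var fs = require('fs');

function checkBower() {
  fs.stat('../fill/bower.json', function (err, stats) {
    if (err) { return console.log('You don\'t have a bower.json file!'); }
    // ... rest of the work ...
  });
  // There is no return statement here, so the call evaluates to undefined
  // immediately; that undefined is what gets printed first, and the message
  // from the asynchronous fs.stat callback only shows up afterwards.
}

console.log(checkBower()); // -> undefined, then the fs.stat message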

Can I make node.js FTP synchronous?

I have a little FTP script which basically transfer an entire directory tree (by walking it with fs.readdir) to an FTP server one file at a time (I have to do some analysis on each file as it's uploaded hence the one-at-a-time behaviour).
However, the bit that does a single file (there's another bit for directories which uses c.mkdir rather than c.put) looks like this:
console.log('Transferring [' + ival + ']');
var c = new Ftp();
c.on('ready', function() {
  c.put(ival, ival, function(err) {
    console.log(err);
  });
  c.end();
});
As you can see, it uses a very simple method of logging, in that failures are simply sent to the console.
Unfortunately, since the FTPs are done asynchronously, errors are being delivered to the console in a sequence totally unrelated to the file name output.
Is there a way to force the FTP to be done synchronously so that errors would immediately follow the file name? Basically, I want the entire sequence from the initial console.log to the final }); to be done before moving on to the next file.
Even if there is, it's not recommended. You generally don't want to block the event loop with such a long synchronous operation.
What would probably be more useful is using recursion or Promises to ensure that things happen in a sequence.
Example:
let ivals = [/* lots of ivals here */];

function putItems(ivals) {
  let ival = ivals[0];
  console.log('Transferring [' + ival + ']');
  var c = new Ftp();
  c.on('ready', function() {
    c.put(ival, ival, function(err) {
      console.log(err);
      c.end();
      // Don't continue if we're out of items.
      if (ivals.length === 1) { return; }
      putItems(ivals.slice(1)); // Call again with the rest of the items.
    });
  });
}

putItems(ivals);
It can probably be done more intelligently by using a nested function and a single FTP context. But you get the point.
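Since Promises were mentioned above as an alternative to recursion, here is a minimal sketch of the same sequencing as a promise chain; the Ftp constructor and c.put come from the question, and wrapping them this way is only an assumption about how you might adapt the code:

function putItem(ival) {
  return new Promise(function (resolve, reject) {
    console.log('Transferring [' + ival + ']');
    var c = new Ftp();
    c.on('ready', function () {
      c.put(ival, ival, function (err) {
        c.end();
        // Log the error right next to the file name, then move on.
        if (err) { console.log('[' + ival + ']', err); }
        resolve();
      });
    });
    c.on('error', reject);
  });
}

// Run the uploads strictly one after another.
ivals.reduce(function (chain, ival) {
  return chain.then(function () { return putItem(ival); });
}, Promise.resolve());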
Without making things synchronous, you can solve your error-logging problem by simply logging the file name along with the error. You can wrap this in a closure so you keep track of which ival goes with a particular error:
(function(ival) {
  console.log('Transferring [' + ival + ']');
  var c = new Ftp();
  c.on('ready', function() {
    c.put(ival, ival, function(err) {
      console.log('[' + ival + ']', err);
    });
    c.end();
  });
})(ival);
Why don't you just push the errors to an array, and when all uploads are done, you will have that array with all the errors in order?
I would do something like this:
var errArray = [];
console.log('Transferring [' + ival + ']');
var c = new Ftp();
c.on('ready', function() {
  c.put(ival, ival, function(err) {
    errArray.push(err);
  });
  c.end();
});
c.on('end', function() {
  errArray.forEach(function(err) {
    console.log(err);
  });
});

Incorrect header check zlib

Running the code below to download and unzip files. It works as intended when I try it with one file, but when I do multiple at the same time I get the following error:
Error: incorrect header check at Zlib._handle.onerror
var downloadUnzipFile = function (mID) {
  try {
    // Read File
    console.log("Started download/unzip of merchant: " + mID + " # " + new Date().format('H:i:s').toString());
    request(linkConst(mID))
      // Un-Gzip
      .pipe(zlib.createGunzip())
      // Write File
      .pipe(fs.createWriteStream(fileName(mID)))
      .on('error', function (err) {
        console.error(err);
      })
      .on('finish', function() {
        console.log("CSV created: " + fileName(mID));
        console.log("Completed merchant: " + mID + " # " + new Date().format('H:i:s').toString());
        //console.log("Parsing CSV...");
        //csvReader(fileName);
      });
  } catch (e) {
    console.error(e);
  }
}
module.exports = function(sMerchants) {
  var oMerchants = JSON.parse(JSON.stringify(sMerchants));
  oMerchants.forEach(function eachMerchant(merchant) {
    downloadUnzipFile(merchant.merchant_aw_id);
  })
};
Any ideas?
Thanks
EDIT:
To clarify, I'd like to run through each item (merchant) in the array (merchants) and download a file and unzip it. The way I currently do it means the downloading/unzipping all happens at the same time (which I think might be causing the error). When I remove the forEach loop and just try to download/unzip one merchant, the code works.
Yeah, as you suggest, it's likely that if you try to unzip too many files concurrently, you will run out of memory. Because you are handling streams, the unzip operations are asynchronous, which means your forEach loop keeps iterating before each unzip operation completes. There are plenty of Node packages that let you handle async operations so you can run the unzip function sequentially, but the simplest approach might just be a recursive function call. E.g.:
var downloadUnzipFile = function (mID) {
  try {
    // Read File
    console.log("Started download/unzip of merchant: " + mID + " # " + new Date().format('H:i:s').toString());
    return request(linkConst(mID))
      // Un-Gzip
      .pipe(zlib.createGunzip())
      // Write File
      .pipe(fs.createWriteStream(fileName(mID)));
  } catch (e) {
    console.log(e);
    return false;
  }
}

module.exports = function(sMerchants) {
  var merchants = JSON.parse(JSON.stringify(sMerchants)),
      count = 0;
  downloadUnzipFile(merchants[count]['merchant_aw_id'])
    .on('error', function(err) {
      console.log(err);
      // Continue unzipping files, even if you encounter an error.
      // You can also remove these lines if you want the script to exit.
      if (merchants[++count]) {
        downloadUnzipFile(merchants[count]['merchant_aw_id']);
      }
    })
    .on('finish', function() {
      if (merchants[++count]) {
        downloadUnzipFile(merchants[count]['merchant_aw_id']);
      }
    });
};
Haven't tested, of course. The main idea should work though: call downloadUnzipFile recursively whenever the previous call errors out or finishes, as long as there are still items in the merchants array.
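If you would rather not hand-roll the recursion, a utility such as the async package can handle the sequencing. A rough sketch (it assumes you are willing to add that dependency, and it reuses the stream-returning downloadUnzipFile from the answer above):

var async = require('async');

module.exports = function (sMerchants) {
  var merchants = JSON.parse(JSON.stringify(sMerchants));

  // Process one merchant at a time: the next download only starts once the
  // previous write stream has finished or errored.
  async.eachSeries(merchants, function (merchant, next) {
    var stream = downloadUnzipFile(merchant['merchant_aw_id']);
    if (!stream) { return next(); } // synchronous failure caught in try/catch

    var movedOn = false;
    function step(err) {
      if (movedOn) { return; }
      movedOn = true;
      if (err) { console.log(err); }
      next(); // keep going even if this merchant failed
    }
    stream.on('error', step);
    stream.on('finish', function () { step(); });
  });
};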

How to test a custom module running node-fluent-ffmpeg (an async module)?

How do I test a custom module which is simply running a node-fluent-ffmpeg command, using Mocha and Chai?
// segment_splicer.js
var config = require('./../config');
var utilities = require('./../utilities');
var ffmpeg = require('fluent-ffmpeg');

module.exports = {
  splice: function(raw_ad_time, crop) {
    if (!raw_ad_time || !crop) throw new Error("!!!!!!!!!! Missing argument");
    console.log("##### LAST SEGMENT IS BEING SPLITTED.");
    var segment_time = utilities.ten_seconds(raw_ad_time);
    var last_segment_path = config.akamai_user_base + 'segment' + (segment_time + 1) + "_" + config.default_bitrate + "_av-p.ts?sd=10&rebase=on";
    var command = ffmpeg(last_segment_path)
      .on('start', function(commandLine) {
        console.log('##### COMMAND: ' + commandLine);
      })
      .seekInput('0.000')
      .outputOptions(['-c copy', '-map_metadata 0:s'])
      .duration(crop)
      .on('error', function(err, stdout, stderr) {
        throw new Error('##### VIDEO COULD NOT BE PROCESSED: ' + err.message);
        console.log('##### VIDEO COULD NOT BE PROCESSED: ' + err.message);
      })
      .output('public/' + 'segment' + (segment_time + 1) + "_" + config.default_bitrate + "_av-p.ts").run();
  }
}
Here is what I tried:
// test/segment_splicer.js
var expect = require('chai').expect;
var segment_splicer = require('../lib/segment_splicer');

describe('Segment Splicer', function() {
  it('should work', function(done) {
    expect(segment_splicer.splice(1111111, 20)).to.throw(Error);
    done();
  });
});
I get this:
1) Segment Splicer should work:
AssertionError: expected undefined to be a function
Because I receive undefined from the segment_splicer.splice method.
Thank you!
The test fails on the expect line itself, not because of anything inside splice.
A test will only fail if you assert or expect something in the test which is not true, or if the subject under test throws an uncaught error.
Here, expect(...).to.throw has to be handed a function reference. Because you invoke segment_splicer.splice(1111111, 20) yourself, Chai receives the call's return value, undefined, which is why it reports "expected undefined to be a function". The only error your subject throws synchronously is when fewer than 2 arguments are passed, which is not the case in your test.
The ffmpeg method also seems to be asynchronous, which is not compatible with the way you have structured your test.
There are many examples available on setting up async tests, including:
How Mocha Makes Testing Asynchronous JavaScript Processes Fun
Asynchronous Unit Tests With Mocha, Promises, And WinJS
Testing Asynchronous JavaScript
You've gone some way to doing this by referencing the done argument. When this is specified, Mocha will wait until it is called before considering the test finished.
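As a rough illustration of that async style (a sketch only: it assumes splice is reworked to take a Node-style callback that fires when ffmpeg finishes or errors, which the current module does not do, and it reuses the requires from test/segment_splicer.js above):

describe('Segment Splicer', function() {
  it('splices the last segment', function(done) {
    segment_splicer.splice(1111111, 20, function(err, outputPath) {
      expect(err).to.equal(null);
      expect(outputPath).to.be.a('string');
      done(); // Mocha waits for this before ending the test
    });
  });

  it('throws when arguments are missing', function() {
    // The synchronous guard clause can be tested directly by passing a
    // function reference for Chai to invoke:
    expect(function() { segment_splicer.splice(); }).to.throw(Error);
  });
});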
