I'm writing a Node.js server that watches a directory full of empty files for changes. When a file changes, it notifies a client, then empties the file. The watch code is:
fs.watch("./files/", function(event, targetfile){
console.log(targetfile, 'is', event)
fs.readFile("./files/"+targetfile, 'utf8', function (err,data) {
if (err) {
return console.log(err);
}
if (data=="") return; //This should keep it from happening
//Updates the client here
fs.truncate("./files/"+targetfile, 0);
});
});
The change event fires twice, so the client gets updated twice. This can't happen. It's like the watch callback is invoked twice at the same time, and both invocations run before either reaches the truncate call. How do I keep this from happening? I can't, say, block a thread, because I need the server to stay responsive in real time for the other files.
Thank you for the help. I'm new to Node.js, but so far I'm loving it.
You can use the Underscore utility method _.once, which keeps a function from executing more than once. You'd have to make your code look like this:
var func = _.once(function (targetfile) {
    fs.readFile("./files/" + targetfile, 'utf8', function (err, data) {
        if (err) {
            return console.log(err);
        }
        if (data == "") return; //This should keep it from happening
        //Updates the client here
        fs.truncate("./files/" + targetfile, 0);
    });
});

fs.watch("./files/", function (event, targetfile) {
    console.log(targetfile, 'is', event);
    func(targetfile);
});
If you want it executed more than once, but you want to filter out duplicate events, you can use a function such as _.throttle or _.debounce.
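For example, here is a minimal sketch of the debounce variant, assuming Underscore is available and that collapsing change events which arrive within 100 ms is acceptable (the handlers cache and the delay are illustrative choices, not part of the original code):

var fs = require('fs');
var _ = require('underscore');

var handlers = {}; // one debounced handler per file name

function handleChange(targetfile) {
    fs.readFile("./files/" + targetfile, 'utf8', function (err, data) {
        if (err) return console.log(err);
        if (data == "") return;
        //Updates the client here
        fs.truncate("./files/" + targetfile, 0, function (err) {
            if (err) console.log(err);
        });
    });
}

fs.watch("./files/", function (event, targetfile) {
    // create the debounced wrapper for this file on first sight
    if (!handlers[targetfile]) {
        handlers[targetfile] = _.debounce(handleChange, 100);
    }
    handlers[targetfile](targetfile);
});

This way each file keeps its own debounce window, so activity on one file does not suppress notifications for another.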
Related
I want to encapsulate socket.io events into functions to make them more readable for my project. Here is an example. I am trying to turn this:
io.on('connection', (socket) => {
    //code
});
Into something like this:
function isThereANewConnection() {
    io.on('connection', (socket) => {
        //..return true?
    });
}

function update() {
    if (isThereANewConnection()) {
        //do something with the socket data..
    }
}
I cannot seem to figure out how I could implement this, since I cannot return anything from the event callback. Does anyone know how to do this?
You haven't really explained what you're trying to accomplish by putting "events into functions", so it's hard to know precisely what you want it to look like.
At its heart, Node.js is an event-driven environment. As such, you register event listeners and they call you when there's an event. That's the structure of the socket.io connection event.
If you don't want everything inline such as this:
io.on('connection', (socket) => {
    //code
});
You can put the event listener callback into a function itself such as:
io.on('connection', onConnection);
Then, somewhere else (either in the same module or imported into this module), you would define the onConnection function.
function onConnection(socket) {
    // put your code here to handle incoming socket.io connections
    socket.on('someMsg', someData => {
        // handle incoming message here
    });
}
Note: inline callback functions (of the type you seem to be asking to avoid) are very common in Node.js programming. One reason for using them is that an inline callback can see all the variables of the parent scope without having them passed in as arguments, and there are times when this is very useful. So it might be a style you will find useful rather than one to avoid.
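As a small illustration of that point, here is a sketch (the room parameter and the 'joined' event name are invented for the example) in which the inline callback reads a variable from its parent scope without it being passed in:

function setupRoom(io, room) {
    io.on('connection', (socket) => {
        // 'room' is visible here through the closure; a standalone
        // onConnection function would need it passed in explicitly
        socket.join(room);
        socket.emit('joined', room);
    });
}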
I don't think this is going to structure your code in a way that is more readable and useful to you. But if you did want this, you could use async/await.
async function getNewConnection() {
    return new Promise((resolve) => {
        // once() removes the listener after the first connection,
        // so repeated calls don't pile up listeners
        io.once('connection', resolve);
    });
}
Then in your other code:
const socket = await getNewConnection(); // must itself be inside an async function
Really though, for this connection handler you should stick with the callback pattern. It keeps things simple for you. Run your updates when the connection comes in, and not the other way around.
Update: I have kinda found the OOP solution for this (it is not the best):
class ConnectionHandler {
    init() {
        this.listenForConnections();
    }

    listenForConnections() {
        io.on('connection', (socket) => {
            //do stuff
            this.newConnection = true;
            this.update();
        });
    }

    // the update method contains all the logical decisions of the class
    update() {
        if (this.isThereANewConnection()) {
            //add connection to list.. or handle the rest of the logic
        }
        //render screen
    }

    isThereANewConnection() {
        if (this.newConnection) {
            this.newConnection = false;
            return true;
        }
        return false;
    }
}
Here is my demo code:
doGet('../loaderDemo/1.lst');
doGet('../loaderDemo/2.lst');
doGet('../loaderDemo/3.lst');
doGet('../loaderDemo/4.lst');
doGet('../loaderDemo/5.lst');

function doGet(filename) {
    $.get(filename, function (data) {
        console.log(data + filename);
    });
}
the line "console.log(...)" may not be executed as the order of doGet(), the output contents is not as the order of 1.lst -> 2.lst -> 3.lst -> 4.lst -> 5.lst.
Actually the output order is just random in each execution.
how could I let it outputs in order?
Thanks for your help :-)
-------------Update-------------------
the ".lst" files are 3D models that I want to load. I just want to load the models in order so that I can render an animation properly. so which is the best solution in this case?
each ".lst" files includes the information of one frame. and in this demo,the outputs of "console.log()" must be in order as 1.lst -> 2.lst -> 3.lst -> 4.lst -> 5.lst so that I can handle rendering a frame animation.
jQuery's $.get returns a Promise (of sorts). So, with a minimal rewrite, you can do as follows:
doGet('../loaderDemo/1.lst')
    .then(function () {
        return doGet('../loaderDemo/2.lst'); // return, so the chain waits for each download
    })
    .then(function () {
        return doGet('../loaderDemo/3.lst');
    })
    .then(function () {
        return doGet('../loaderDemo/4.lst');
    })
    .then(function () {
        return doGet('../loaderDemo/5.lst');
    });

function doGet(filename) {
    // added return
    return $.get(filename, function (data) {
        console.log(data + filename);
    });
}
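With more files, spelling out each .then gets repetitive. As a sketch of the same idea (assuming the doGet above, which returns the $.get promise), the chain can also be built from an array with Array.prototype.reduce:

var files = ['1.lst', '2.lst', '3.lst', '4.lst', '5.lst'];

// start from an already-resolved promise and append one doGet per file
files.reduce(function (chain, name) {
    return chain.then(function () {
        return doGet('../loaderDemo/' + name);
    });
}, $.Deferred().resolve().promise());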
If, however, the order of download completion is not important, but the order of "processing" is, you can use jQuery.when to "wait" for all the downloads to complete and then process the results in the required order:
$.when(doGet('../loaderDemo/1.lst'),
       doGet('../loaderDemo/2.lst'),
       doGet('../loaderDemo/3.lst'),
       doGet('../loaderDemo/4.lst'),
       doGet('../loaderDemo/5.lst')
).done(function (v1, v2, v3, v4, v5) {
    [].forEach.call(arguments, function (arg) {
        console.log(arg.data, arg.filename);
    });
});

function doGet(filename) {
    return $.get(filename)
        .then(function (data) {
            // need this so we can access the filename and the data for each result
            return {
                data: data,
                filename: filename
            };
        });
}
Welcome to the world of asynchronous programming. The best way to handle this is to fire all five async calls at the same time, but delay the console.log statements until they have all completed, and then run them in order.
(This is, of course, assuming that it's really important to run them in the same order all the time. An even better solution might be refactoring your code so that it doesn't matter which one completes first)
Here's an example for your problem.
Most importantly, this is way faster than any of the sequential solutions posted, because it runs the 5 calls at the same time instead of one after the other.
$.when(
    doGet('../loaderDemo/1.lst'),
    doGet('../loaderDemo/2.lst'),
    doGet('../loaderDemo/3.lst'),
    doGet('../loaderDemo/4.lst'),
    doGet('../loaderDemo/5.lst')
).done(function (res1, res2, res3, res4, res5) {
    console.log(res1);
    console.log(res2);
    console.log(res3);
    console.log(res4);
    console.log(res5);
});
(My previous edit used $q, but it turns out that jQuery has a built-in that works almost the same.)
I'm facing trouble trying to accomplish a pretty banal task. I need to create a Node.js Readable stream from an input txt file and perform some transformation on this stream (create a JSON object for each line).
The problem is that I want this stream to be infinite: after the last line is read, the stream should just start again from the beginning. My solution works, but I'm getting a warning message:
(node) warning: possible EventEmitter memory leak detected. 11 drain listeners added. Use emitter.setMaxListeners() to increase limit.
I hoped to find a simple solution without reading and buffering the file directly.
var fs = require('fs');
var util = require('util');
var split = require('split');
var Transform = require('stream').Transform;

//Transform stream object
var TransformStream = function () {
    Transform.call(this, {objectMode: true});
};
util.inherits(TransformStream, Transform);
TransformStream.prototype._transform = onTransform;
TransformStream.prototype._flush = onEnd;

var ts = new TransformStream();

var infinStream = function () {
    var r = fs.createReadStream(filePath);
    r.pipe(split(), {end: false})
        .pipe(ts, {end: false});
    r.once('end', function () {
        //console.log('\n\n\n\nRead file stream finished. Lines counted:\n\n\n\n' + detectionCounter);
        r.removeAllListeners();
        r.destroy();
        infinStream();
    });
    return r;
};

infinStream();
return ts;
From the comments:
I need server that will be live for 24h/day, and will simulate device
output all the time.
To do that, a recursive function is a good idea, and the approach you've taken is OK. When you don't need different transformation steps on your data, a stream is not really needed; simple events can do exactly what you want, and they are easier to understand.
The error in your code is where you put your listener. The r.once listener is inside your recursive function. You are defining r inside that function, so every call creates a new r, and because of that r.once does not work the way you expect.
What you can do:
Make a recursive function which emits an event.
Use the data from your event outside.
This is just a simple concept using plain events, which keep firing with the data from your file:
var fs = require('fs');
var EventEmitter = require('events').EventEmitter;
var Emitter = new EventEmitter();

// Your recursive function
var simulateDeviceEvents = function () {
    fs.readFile('device.txt', function (err, data) {
        if (err) throw err;
        // Just emit the event here
        Emitter.emit('deviceEvent', data);
        // Recurse from inside the callback, once this read has finished;
        // if this happens too fast you could also call it with a timeout.
        setImmediate(simulateDeviceEvents);
    });
};

// Start the function
simulateDeviceEvents();

//IMPORTANT: The listener must be defined outside your function!
Emitter.on('deviceEvent', function (data) {
    // Do something with your data here
});
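If the plain recursion re-reads the file too quickly, the timeout variant hinted at in the comment could look like this sketch (the 1000 ms interval is an arbitrary choice):

var simulateDeviceEventsPaced = function () {
    fs.readFile('device.txt', function (err, data) {
        if (err) throw err;
        Emitter.emit('deviceEvent', data);
        // wait one second before reading the file again
        setTimeout(simulateDeviceEventsPaced, 1000);
    });
};

simulateDeviceEventsPaced();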
gulp.task('minify-css', function () {
    gulp.src(['src/test/test.css])
        .pipe(concat('test1.css'))
        .pipe(minifyCSS())
        .pipe(gulp.dest('src/source/'))
        .pipe(filesize())
});

gulp.task('copy-css', ['minify-css'], function () {
    gulp.src('src/source/*.css')
        .pipe(gulp.dest('src/dest/'));
});
It seems that the first time I run 'gulp copy-css':
Starting 'minify-css'...
[18:54:39] Finished 'minify-css' after 61 ms
[18:54:39] Starting 'copy-css'...
[18:54:39] Finished 'copy-css' after 1.86 ms
but the copy operation doesn't work, probably because it executes before the file to be copied has been generated.
Even though I have declared minify-css as a dependency of the copy-css task, it seems that convention is not being followed in this case.
When gulp copy-css is run a second time, the file is copied, because the file was already generated by the previous run. But that would defeat the purpose when the code is used in production.
Change both tasks as below:
gulp.task('minify-css', function () {
    return gulp.src(['src/test/test.css'])
        .pipe(concat('test1.css'))
        .pipe(minifyCSS())
        .pipe(gulp.dest('src/source/'))
        .pipe(filesize());
});

gulp.task('copy-css', ['minify-css'], function () {
    return gulp.src('src/source/*.css')
        .pipe(gulp.dest('src/dest/'));
});
Add return so that the next task runs only after the first one finishes; otherwise your copy-css runs even before minify-css is done, hence the error.
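For reference, returning the stream is only one of the ways a gulp 3.x task can signal completion; this sketch shows the three common options (the task names here are made up for illustration):

// 1) return the stream - gulp waits for it to finish
gulp.task('by-stream', function () {
    return gulp.src('src/**/*.css').pipe(gulp.dest('out/'));
});

// 2) take a callback and invoke it when the stream ends
gulp.task('by-callback', function (cb) {
    gulp.src('src/**/*.css')
        .pipe(gulp.dest('out/'))
        .on('end', cb);
});

// 3) return a promise - gulp waits for it to resolve
gulp.task('by-promise', function () {
    return Promise.resolve();
});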
I would check out this question, as it seems you might have to add the return keyword before gulp.src(['src/test/test.css])...
Try with a callback in the on('finish') event:
gulp.task('minify-css', function (cb) {
    gulp.src(['src/test/test.css'])
        .pipe(concat('test1.css'))
        .pipe(minifyCSS())
        .pipe(gulp.dest('src/source/'))
        .pipe(filesize())
        .on('finish', function () {
            // All tasks should now be done
            return cb();
        });
});

gulp.task('copy-css', ['minify-css'], function () {
    gulp.src('src/source/*.css')
        .pipe(gulp.dest('src/dest/'));
});
PS: You also got a syntax error in your gulp.src array at the very top, just a missing quote ' here: gulp.src(['src/test/test.css])
I've happily written a Node.js server which uses socket.io to communicate with the client. This all works well.
The socket.on('connection'...) handler got a bit big, which made me think of an alternative way to organize my code and add the handlers in a generator function like this:
sessionSockets.on('connection', function (err, socket, session) {
    control.generator.apply(socket, [session]);
});
The generator takes an object that contains the socket events and their respective handler functions:
var config = {
    //handler for event 'a'
    a: function (data) {
        console.log('a');
    },
    //handler for event 'b'
    b: function (data) {
        console.log('b');
    }
};

function generator(session) {
    //set up socket.io handlers as per config
    for (var method in config) {
        console.log('CONTROL: adding handler for ' + method);
        //'this' is the socket; generator is called that way
        this.on(method, function (data) {
            console.log('CONTROL: received ' + method);
            config[method].apply(this, data);
        });
    }
}
I was hoping this would add the socket event handlers to the socket, which it kind of does, but when any event comes in, it always calls the handler added last, in this case always the b-function.
Anyone have any clue what I am doing wrong here?
The problem appears because by the time the this.on callback triggers (let's say a few seconds after you bind it), the for loop has finished and the method variable holds its last value.
To fix that you may use some JavaScript magic:
//set up socket.io handlers as per config
var socket = this;
for (var method in config) {
    console.log('CONTROL: adding handler for ' + method);
    (function (realMethod) {
        socket.on(realMethod, function (data) {
            console.log('CONTROL: received ' + realMethod);
            config[realMethod].apply(this, data);
        });
    })(method); //declare function and call it immediately (passing the current method)
}
This "magic" is hard to understand when you first see it, but when you get it, the things become clear :)