Problems with learnyounode exercise 5 - JavaScript

Hi guys, I'm trying to solve the exercises from nodeschool. I started with learnyounode, which people recommend a lot for beginners. The exercise question is this:
# LEARN YOU THE NODE.JS FOR MUCH WIN!
## FILTERED LS (Exercise 5 of 13)
Create a program that prints a list of files in a given directory,
filtered by the extension of the files. You will be provided a directory
name as the first argument to your program (e.g. '/path/to/dir/') and a
file extension to filter by as the second argument.
For example, if you get 'txt' as the second argument then you will need to
filter the list to only files that end with .txt. Note that the second
argument will not come prefixed with a '.'.
Keep in mind that the first arguments of your program are not the first
values of the process.argv array, as the first two values are reserved for
system info by Node.
The list of files should be printed to the console, one file per line. You
must use asynchronous I/O.
─────────────────────────────────────────────────────────────────────────────
## HINTS
The fs.readdir() method takes a pathname as its first argument and a
callback as its second. The callback signature is:
function callback (err, list) { /* ... */ }
where list is an array of filename strings.
Documentation on the fs module can be found by pointing your browser here:
file://C:\Users\Filipe\AppData\Roaming\npm\node_modules\learnyounode\node_apidoc\fs.html
You may also find node's path module helpful, particularly the extname
method.
Documentation on the path module can be found by pointing your browser
here:
file://C:\Users\Filipe\AppData\Roaming\npm\node_modules\learnyounode\node_apidoc\path.html
─────────────────────────────────────────────────────────────────────────────
» To print these instructions again, run: learnyounode print
» To execute your program in a test environment, run: learnyounode run
program.js
» To verify your program, run: learnyounode verify program.js
» For help run: learnyounode help
I tried to solve the exercise by doing this:
var fs = require('fs');

fs.readdir(process.argv[2], process.argv[3], function (err, list) {
  for (var i = 0; i < list.length; i++) {
    var fileParts = list[i].split(".");
    if (fileParts[1] == process.argv[3]) {
      console.log(list[i] + "\n");
    }
  }
});
Why doesn't it work? I don't know what I am doing wrong. If someone can give me a tip to solve this I'd appreciate it a lot, since the solutions on the web seem quite strange to me.

Hm, I just tested your code and it actually runs fine for me. Verify returns with "solved". Not sure what you are experiencing. By the way: put in at least some error handling. Below is my solution:
const fs = require('fs');
const test = '.' + process.argv[3]

fs.readdir(process.argv[2], (err, data) => {
  if (err) {
    console.error(err);
  } else {
    data.filter(file => {
      if (file.substring(file.length - test.length) === test) {
        console.log(file)
      }
    })
  }
});
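For comparison, here is a minimal sketch of the approach the exercise hints point at, using path.extname to do the filtering. The structure and names are mine, not the official solution:

var fs = require('fs');
var path = require('path');

var dir = process.argv[2];
var ext = '.' + process.argv[3]; // exercise passes the extension without the leading '.'

fs.readdir(dir, function (err, list) {
  if (err) {
    return console.error(err);
  }
  list
    .filter(function (file) {
      // path.extname('file.txt') returns '.txt', so compare against '.' + ext
      return path.extname(file) === ext;
    })
    .forEach(function (file) {
      console.log(file);
    });
});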

Related

Node.js Function not updating value after 1st invoke

I've recently taken interest in the Discord.js framework, and was designing a bot for a server. Apologies in advance for the messiness of the code.
The issue I'm facing is that after I first run the command and the function is invoked, the value of ticketValue does not update to the updated value in my JSON file.
const fs = require("fs");

module.exports = {
  commands: ["ticket"],
  minArgs: 1,
  expectedArgs: "<message>",
  callback: (message, arguments, text) => {
    // Console log to notify the function has been invoked.
    console.log("FUNCTION RUN")
    let jsondata = require("../ticketNum.json")
    let ticketValue = jsondata.reportNews.toString()

    // Turning the number into a 4 digit number.
    for (let i = ticketValue.length; i < 4; i++) {
      ticketValue = `0${ticketValue}`
    }
    console.log(`#1 ${ticketValue}`)

    // Creating the Discord channel.
    message.guild.channels.create(`report-incident-${ticketValue}`, {
      type: 'text',
      permissionOverwrites: [
        {
          id: message.author.id,
          deny: ['VIEW_CHANNEL'],
        },
      ],
    })

    // Adding one to the ticket value and storing it in a JSON file.
    ticketValue = Number(ticketValue) + 1
    console.log(`TICKET VALUE = ${ticketValue}`)
    fs.writeFile("./ticketNum.json", JSON.stringify({ "reportNews": Number(ticketValue) }), err => {
      console.log(`Done writing, value = ${ticketValue}`)
    })
    console.log(require("../ticketNum.json").reportNews.toString())
  },
}
I believe this is due to something called the require cache. You can either invalidate the cache for your JSON file each time you write to it, or preferably use fs.readFile to get the up-to-date contents.
Also worth noting that you are requiring from ../ticketNum.json but you are writing to ./ticketNum.json. This could also be a cause.
You seem to be using JSON files as a database and while that is perfectly acceptable depending on your project's scale, I would recommend something a little more polished like lowdb which stills uses local JSON files to store your data but provides a nicer API to work with.
You should only use require when the file is static while the app is running.
The caching require performs is really useful, especially when you are loading tons of modules with the same dependencies.
require also does some special stuff to look for modules locally, globally, etc. So you might see unintended things happen if a file is missing locally and require goes hunting.
These two things mean it's not a good replacement for the fs tools node provides for file access and manipulation.
Given this, you should use fs.readFileSync or one of the other read functions. You're already using fs to write, so that isn't a large lift: it's just changing the line or two where you have require in place of a read.
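As a rough sketch of that change (purely illustrative: the path resolution is an assumption, only the file name comes from the question), reading the JSON fresh on every call instead of relying on require's cache might look like this:

const fs = require("fs");
const path = require("path");

// Resolve one consistent location for the ticket file,
// so the read path and the write path can't drift apart.
const ticketFile = path.join(__dirname, "..", "ticketNum.json");

// Read the current value on every call; unlike require(), this is
// never cached for the lifetime of the process.
function readTicketValue() {
  const jsondata = JSON.parse(fs.readFileSync(ticketFile, "utf8"));
  return jsondata.reportNews;
}

function writeTicketValue(value) {
  fs.writeFileSync(ticketFile, JSON.stringify({ reportNews: Number(value) }));
}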

In Node.js, asking for a value using Prompt, and using that value in a main js file

I'm pretty new to node.js and it seems fairly easy to use, but when it comes to getting a value from the command line and returning that value to be used in another package or .js file, it seems harder than I expected.
Long story short, I've used an npm package (akamai-ccu-purge) to successfully enter a file to purge on the Akamai network.
I want to make it more dynamic though by prompting the user to enter the file they want purged and then using that in the akamai package.
After making a few tries using var stdin = process.openStdin(); I actually found another npm package called Prompt that seemed to be easier. Both ways seem to have the same problem though.
Node doesn't seem to want to stop for the input. It seems to want to automatically make the purge without waiting for input even though I've called that module first. It actually gets to the point where I should enter the file but it doesn't wait.
I am definitely missing something in my understanding or usage here, what am I doing wrong?
My code so far is:
var purgeUrl = require('./getUrl2');
var PurgerFactory = require('../../node_modules/akamai-ccu-purge/index'); // this is the directory where the index.js of the node module was installed

// area where I placed the authentication tokens
var config = {
  clientToken: //my tokens and secrets akamai requires
};

// area where urls are placed. More than one can be listed with comma separated values
var objects = [
  purgeUrl // I am trying to pull this from the getUrl2 module
];

// Go for it!
var Purger = PurgerFactory.create(config);

Purger.purgeObjects(objects, function(err, res) {
  console.log('------------------------');
  console.log('Purge Result:', res.body);
  console.log('------------------------');
  Purger.checkPurgeStatus(res.body.progressUri, function(err, res) {
    console.log('Purge Status', res.body);
    console.log('------------------------');
    Purger.checkQueueLength(function(err, res) {
      console.log('Queue Length', res.body);
      console.log('------------------------');
    });
  });
});
The getUrl2 module looks like this:
var prompt = require('../../node_modules/prompt');

//
// Start the prompt
//
prompt.start();

//
// Get property from the user
//
prompt.get(['newUrl'], function (err, result) {
  //
  // Log the results.
  //
  console.log('Command-line input received:');
  console.log('  http://example.com/custom/' + result.newUrl);
  var purgeUrl = 'http://example.com/custom/' + result.newUrl;
  console.log(purgeUrl);
  module.exports = purgeUrl;
});
Thanks again for the help!
I would probably just allow getURL2 to expose a method that will be invoked in the main module. For example:
// getURL2
var prompt = require('../../node_modules/prompt');

module.exports = {
  start: function(callback) {
    prompt.start();
    prompt.get(['newUrl'], function (err, result) {
      // the callback is defined in your main module
      return callback('http://example.com/custom/' + result.newUrl);
    });
  }
}
Then in your main module:
require('./getUrl2').start(function(purgeURL) {
  // do stuff with the purgeURL defined in the other module
});
The implementation may differ, but conceptually, the work that depends on the user's input needs to happen as a result of that input being collected by your second module. Callbacks are a common way to do this (as are Promises). However, since prompt does not expose a method that requires a Promise, you can do it with plain old callbacks.
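If you do prefer Promises, a rough sketch of the same idea (a hypothetical variant that wraps prompt's callback yourself; requiring the package by name here instead of by path) could look like this:

// getUrl2.js, hypothetical Promise-based variant
var prompt = require('prompt');

module.exports = {
  start: function () {
    return new Promise(function (resolve, reject) {
      prompt.start();
      prompt.get(['newUrl'], function (err, result) {
        if (err) return reject(err);
        // build the URL from the user's input, as in the question
        resolve('http://example.com/custom/' + result.newUrl);
      });
    });
  }
};

Then the main module would consume it with .then():

require('./getUrl2').start().then(function (purgeUrl) {
  // do stuff with purgeUrl here
});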
You might also want to search around for articles on writing command line tools (sometimes referenced as CLIs) or command line apps with Node. I found the following article to be helpful when trying to figure this out myself:
http://javascriptplayground.com/blog/2015/03/node-command-line-tool/
Also, the command-line-args module worked well for me (though there's a number of other modules out there to choose from):
https://www.npmjs.com/package/command-line-args
Good luck!

Issue with output list for learnyounode #6 MAKE IT MODULAR

Just started coding last Thursday, bear with me here:
My code for this question of the tutorial is returning a list of just the extension names from the directory and not a list of the files with said extension. E.g., if I used a directory with 3 .js files and used js as my extension argument on the command line, then I would get
1. js
2. js
3. js
as the output. Here is the question from the tutorial and my code. THANK YOU!
The question from learnyounode tutorial number 6:
LEARN YOU THE NODE.JS FOR MUCH WIN!
─────────────────────────────────────
MAKE IT MODULAR
Exercise 6 of 13
This problem is the same as the previous but introduces the concept of modules. You will need to create two files to solve this.
Create a program that prints a list of files in a given directory, filtered by the extension of the files. The first argument is the directory name and the second argument is the extension filter. Print the list of files (one file per line) to the console. You must use asynchronous I/O.
You must write a module file to do most of the work. The module must export a single function that takes three arguments: the directory name, the filename extension string and a callback function, in that order. The filename extension argument must be the same as was passed to your program. i.e. don't turn it into a RegExp or prefix with "." or do anything else but pass it to your module where you can do what you need to make your filter work.
The callback function must be called using the idiomatic node(err, data) convention. This convention stipulates that unless there's an error, the first argument passed to the callback will be null, and the second will be your data. In this case, the data will be your filtered list of files, as an Array. If you receive an error, e.g. from your call to fs.readdir(), the callback must be called with the error, and only the error, as the first argument.
You must not print directly to the console from your module file, only from your original program.
In the case of an error bubbling up to your original program file, simply check for it and print an informative message to the console.
These four things are the contract that your module must follow.
Export a single function that takes exactly the arguments described.
Call the callback exactly once with an error or some data as described.
Don't change anything else, like global variables or stdout.
Handle all the errors that may occur and pass them to the callback.
The benefit of having a contract is that your module can be used by anyone who expects this contract. So your module could be used by anyone else who does learnyounode, or the verifier, and just work.
and my code is:
module (p6m.js):
var fs = require('fs'), ph = require('path'), exports = module.exports = {}

exports.f = function (path, ext, callbk) {
  fs.readdir(path, function (err, files) {
    if (err) {
      return callbk(err, null)
    }
    files = files.filter(
      function (file) {
        return ph.extname(file) === "." + ext
      }
    )
    return callbk(null, files)
  })
}
and my program (p6.js):
var p6m = require('./p6m'), path = process.argv[2], ext = process.argv[3]

p6m.f(path, ext, function (err, files) {
  if (err) { return console.log.error('Error occured:', err) };
  files.forEach(function (file) {
    console.log(file)
  })
})
I got the same problem with my code because of the need to export a single function. So instead of exporting a module function like this:
exports = module.exports = {}
exports.f = function (path, ext, callbk) {...};
try doing it this way:
module.exports = function (path, ext, callbk) {...};
Because it is a single function, you don't need to give it the name "f" as you are doing in this statement:
exports.f = function (path, ext, callbk) {...};
Whenever you import the module, it will resolve directly to this single function, since the module contains only that function.
You can try this piece of code, it works well for me.
module code: mymodule.js
var fs = require('fs');
var ph = require('path');

module.exports = function (path, ext, callbk) {
  var pathio = "." + ext;
  fs.readdir(path, function (err, files) {
    if (err)
      return callbk(err);
    else {
      var listf = []; // listf is the resultant list
      for (var i = 0; i < files.length; i++) {
        if (ph.extname(files[i]) === pathio) {
          listf.push(files[i]);
        }
      }
      callbk(null, listf);
    }
  });
}
program code : moduletest.js
var mod = require('./mymodule');

mod(process.argv[2], process.argv[3], function (err, listf) {
  if (err) {
    console.log('Error!')
  } else {
    for (var i = 0; i < listf.length; i++) {
      console.log(listf[i]);
    }
  }
});
And do remember, the learnyounode series is very specific about its way of coding and syntax, so even if your logic is right you still might not pass; your code needs to be clean and optimized. I'd suggest referring to the discussions on nodeschool itself for the various issues you might run into in the learnyounode series.
That will work and output the right results, but what they are looking for is something like this:
module.exports = function() {};
Because they only want one function total in the exports.
You could also do something like this:
module.exports = FindFilesByExtension;
function FindFilesByExtension(path, ext, callback) {
  // your code
}
Here is my solution.
This is my module file, filteredls.js:
var fs = require('fs');
var path = require('path');

module.exports = function filterFiles(folder, extension, callback) {
  fs.readdir(folder, function(err, files) {
    if (err) return callback(err);
    var filesArray = [];
    files.forEach(function(file) {
      if (path.extname(file) === "." + extension) {
        filesArray.push(file);
      }
    });
    return callback(null, filesArray);
  });
}
And here is my test file for the module, modular.js:
var ff = require('./filteredls.js');

ff(process.argv[2], process.argv[3], function(err, data) {
  if (err)
    return console.error(err);
  data.forEach(function(file) {
    console.log(file);
  });
});

Delete (unlink) files matching a regex

I want to delete several files from a directory, matching a regex. Something like this:
// WARNING: not real code
require('fs').unlink(/script\.\d+\.js$/);
Since unlink doesn't support regexes, I'm using this instead:
var fs = require('fs');

fs.readdir('.', (error, files) => {
  if (error) throw error;
  files.filter(name => /script\.\d+\.js$/.test(name)).forEach(fs.unlink);
});
which works, but IMO is a little more complex than it should be.
Is there a better built-in way to delete files that match a regex (or even just use wildcards)?
No, there is no globbing in the Node libraries. If you don't want to pull in something from NPM then not to worry, it just takes a few lines of code. But in my testing the code provided in other answers mostly won't work. So here is my code fragment: tested, working, pure native Node and JS.
let fs = require('fs')
const path = './somedirectory/'
let regex = /[.]txt$/

fs.readdirSync(path)
  .filter(f => regex.test(f))
  .map(f => fs.unlinkSync(path + f))
You can look into glob https://npmjs.org/package/glob
require("glob").glob("*.txt", function (er, files) { ... });
//or
files = require("glob").globSync("*.txt");
glob internally uses minimatch. It works by converting glob expressions into JavaScript RegExp objects. https://github.com/isaacs/minimatch
You can do whatever you want with the matched files in the callback (or in case of globSync the returned object).
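For example, a minimal sketch that deletes the matched files, mirroring the globSync call above (the pattern is only illustrative, and older glob versions expose the same thing as glob.sync):

var fs = require("fs");
var glob = require("glob");

// Find files matching the glob pattern, then delete each one.
var files = glob.globSync("script.*.js");
files.forEach(function (file) {
  fs.unlinkSync(file);
});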
I have a very simple solution for this. Read the directory in node.js using the fs.readdir API. This will give you an array of all the files in the directory. Once you have that array, iterate over it with a for loop and apply the regex.
The code below will delete all files starting with "en" and with the extension ".js":
fs.readdir('.', (err, files) => {
  for (var i = 0, len = files.length; i < len; i++) {
    var match = files[i].match(/en.*.js/);
    if (match !== null)
      fs.unlink(match[0]);
  }
});
The answer could depend on your environment. It looks like you are running on node.js. A quick perusal of the node.js documentation suggests there is no "built in" way to do this, i.e., there isn't a single function call that will do this for you. The next best thing might involve a small number of function calls. As I wrote in my comment, I don't think there's any easy way to make your suggested answer much briefer just relying on the standard node.js function calls. That is, if I were in your shoes, I would go with the solution you already suggested (though slightly cleaned up).
One alternative is to go to the shell, e.g.,
var exec = require('child_process').exec;
exec('ls | grep "script[[:digit:]]\\\+.js" | xargs rm');
Personally, I would strongly prefer your offered solution over this gobbledygook, but maybe you're shooting for something different.

node.js execute system command synchronously

I need a function in node.js
result = execSync('node -v');
that will synchronously execute the given command line and return all the text that command writes to stdout.
P.S. Sync is wrong, I know. Just for personal use.
UPDATE
Now we have mgutz's solution, which gives us the exit code, but not stdout! Still waiting for a more precise answer.
UPDATE
mgutz updated his answer and the solution is here :)
Also, as dgo.a mentioned, there is a stand-alone module, exec-sync.
UPDATE 2014-07-30
The ShellJS lib arrived. Consider it the best choice for now.
UPDATE 2015-02-10
AT LAST! NodeJS 0.12 supports execSync natively.
See the official docs.
Node.js (since version 0.12 - so for a while) supports execSync:
child_process.execSync(command[, options])
You can now directly do this:
const execSync = require('child_process').execSync;
code = execSync('node -v');
and it'll do what you expect. (By default the i/o results are piped to the parent process.) Note that you can also use spawnSync now.
See execSync library.
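A quick usage note (the 'node -v' example command is reused from the question): execSync returns a Buffer unless you pass an encoding, so you will usually want to convert the result:

const { execSync } = require('child_process');

// Without an encoding option, execSync returns a Buffer.
const asBuffer = execSync('node -v');
console.log(asBuffer.toString().trim());

// With an encoding option, it returns a string directly.
const asString = execSync('node -v', { encoding: 'utf8' });
console.log(asString.trim());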
It's fairly easy to do with node-ffi. I wouldn't recommend it for server processes, but for general development utilities it gets things done. Install the library:
npm install node-ffi
Example script:
var FFI = require("node-ffi");
var libc = new FFI.Library(null, {
  "system": ["int32", ["string"]]
});
var run = libc.system;
run("echo $USER");
[EDIT Jun 2012: How to get STDOUT]
var lib = ffi.Library(null, {
  // FILE* popen(char* cmd, char* mode);
  popen: ['pointer', ['string', 'string']],
  // void pclose(FILE* fp);
  pclose: ['void', ['pointer']],
  // char* fgets(char* buff, int size, FILE* fp);
  fgets: ['string', ['string', 'int', 'pointer']]
});

function execSync(cmd) {
  var
    buffer = new Buffer(1024),
    result = "",
    fp = lib.popen(cmd, 'r');
  if (!fp) throw new Error('execSync error: ' + cmd);
  while (lib.fgets(buffer, 1024, fp)) {
    result += buffer.readCString();
  }
  lib.pclose(fp);
  return result;
}

console.log(execSync('echo $HOME'));
Use the ShellJS module.
Call the exec function without providing a callback.
Example:
var version = exec('node -v').output;
There's an excellent module for flow control in node.js called asyncblock. If wrapping the code in a function is OK for your case, the following sample may be considered:
var asyncblock = require('asyncblock');
var exec = require('child_process').exec;

asyncblock(function (flow) {
  exec('node -v', flow.add());
  result = flow.wait();
  console.log(result); // There'll be trailing \n in the output

  // Some other jobs
  console.log('More results like if it were sync...');
});
The native Node.js solution is:
const {execSync} = require('child_process');
const result = execSync('node -v'); // 👈 this does the trick
Just be aware that some commands return a Buffer instead of a string, and if you need a string just add an encoding to the execSync options:
const result = execSync('git rev-parse HEAD', {encoding: 'utf8'});
...and it is also good to have a timeout on a sync exec:
const result = execSync('git rev-parse HEAD', {encoding: 'utf8', timeout: 10000});
This is not possible in Node.js; both child_process.spawn and child_process.exec were built from the ground up to be async.
For details see: https://github.com/ry/node/blob/master/lib/child_process.js
If you really want to have this blocking, then put everything that needs to happen afterwards in a callback, or build your own queue to handle this in a blocking fashion; I suppose you could use Async.js for this task.
Or, in case you have way too much time to spend, hack around in Node.js itself.
This is the easiest way I found:
exec-Sync:
https://github.com/jeremyfa/node-exec-sync
(Not to be confused with execSync.)
Execute shell command synchronously. Use this for migration scripts, cli programs, but not for regular server code.
Example:
var execSync = require('exec-sync');
var user = execSync('echo $USER');
console.log(user);
Just to add that even though there are few use cases where you should use them, spawnSync / execFileSync / execSync were added to node.js in these commits: https://github.com/joyent/node/compare/d58c206862dc...e8df2676748e
You can achieve this using fibers. For example, using my Common Node library, the code would look like this:
result = require('subprocess').command('node -v');
My approach for the last 5 years has been to have these 2 lines:
const { execSync } = require('child_process');
const shell = (cmd) => execSync(cmd, {encoding: 'utf8'});
Then enjoy:
shell('git remote -v')
or
out = shell('ls -l')
...and so on.
I got used to implementing "synchronous" stuff at the end of the callback function. It's not very nice, but it works. If you need to implement a sequence of command line executions you need to wrap exec in some named function and call it recursively.
This pattern seems to be usable for me:
SeqOfExec(someParam);

function SeqOfExec(somepParam) {
  // some stuff
  // .....
  // .....
  var execStr = "yourExecString";
  child_proc.exec(execStr, function (error, stdout, stderr) {
    if (error != null) {
      if (stdout) {
        throw Error("Smth goes wrong" + error);
      } else {
        // consider that empty stdout causes
        // creation of error object
      }
    }
    // some stuff
    // .....
    // .....
    // you also need some flag which will signal that you
    // need to end loop
    if (someFlag) {
      // your synch stuff after all execs
      // here
      // .....
    } else {
      SeqOfExec(someAnotherParam);
    }
  });
};
I had a similar problem and I ended up writing a node extension for this. You can check out the git repository. It's open source and free and all that good stuff!
https://github.com/aponxi/npm-execxi
ExecXI is a node extension written in C++ to execute shell commands one by one, outputting each command's output to the console in real-time. Optional chained and unchained modes are present, meaning that you can choose to stop the script after a command fails (chained), or you can continue as if nothing had happened!
Usage instructions are in the ReadMe file. Feel free to make pull requests or submit issues!
EDIT: Originally it didn't return the stdout, it just output it in real-time. It does now; I just released that today. Maybe we can build on it.
Anyway, I thought it was worth mentioning.
you can do synchronous shell operations in nodejs like so:
var execSync = function(cmd) {
  var exec = require('child_process').exec;
  var fs = require('fs');

  //for linux use ; instead of &&
  //execute your command followed by a simple echo
  //to file to indicate process is finished
  exec(cmd + " > c:\\stdout.txt && echo done > c:\\sync.txt");

  while (true) {
    //consider a timeout option to prevent infinite loop
    //NOTE: this will max out your cpu too!
    try {
      var status = fs.readFileSync('c:\\sync.txt', 'utf8');
      if (status.trim() == "done") {
        var res = fs.readFileSync("c:\\stdout.txt", 'utf8');
        fs.unlinkSync("c:\\stdout.txt"); //cleanup temp files
        fs.unlinkSync("c:\\sync.txt");
        return res;
      }
    } catch(e) { } //readFileSync will fail until file exists
  }
};

//won't return anything, but will take 10 seconds to run
console.log(execSync("sleep 10"));

//assuming there are a lot of files and subdirectories,
//this too may take a while, use your own applicable file path
console.log(execSync("dir /s c:\\usr\\docs\\"));
EDIT - this example is meant for windows environments, adjust for your own linux needs if necessary
I actually had a situation where I needed to run multiple commands one after another from a package.json preinstall script in a way that would work on both Windows and Linux/OSX, so I couldn't rely on a non-core module.
So this is what I came up with:
#cmds.coffee
childproc = require 'child_process'

exports.exec = (cmds) ->
  next = ->
    if cmds.length > 0
      cmd = cmds.shift()
      console.log "Running command: #{cmd}"
      childproc.exec cmd, (err, stdout, stderr) ->
        if err? then console.log err
        if stdout? then console.log stdout
        if stderr? then console.log stderr
        next()
    else
      console.log "Done executing commands."

  console.log "Running the follows commands:"
  console.log cmds
  next()
You can use it like this:
require('./cmds').exec ['grunt coffee', 'nodeunit test/tls-config.js']
EDIT: as pointed out, this doesn't actually return the output or allow you to use the result of the commands in a Node program. One other idea for that is to use LiveScript backcalls. http://livescript.net/
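For reference, a rough plain-JavaScript equivalent of that CoffeeScript helper (the same recursive callback-chaining idea, translated by hand rather than taken from any library) might look like this:

// cmds.js
var childproc = require('child_process');

exports.exec = function (cmds) {
  function next() {
    if (cmds.length > 0) {
      var cmd = cmds.shift();
      console.log("Running command: " + cmd);
      childproc.exec(cmd, function (err, stdout, stderr) {
        if (err) console.log(err);
        if (stdout) console.log(stdout);
        if (stderr) console.log(stderr);
        next(); // run the next command only after this one finishes
      });
    } else {
      console.log("Done executing commands.");
    }
  }

  console.log("Running the following commands:");
  console.log(cmds);
  next();
};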
