Passing multiple parameters to NodeJS callback using export module - javascript

Scenario 1:
I have two files in my Atom project, weather.js and example.js. weather.js uses module.exports to export a function, and example.js pulls it in with require.
My weather.js:
var request = require('request');
module.exports= function(justAnothercallback)
{
justAnothercallback('This is from weather');
}
myExample.js
var fromWeather = require('./weather.js');
fromWeather(function(weather){
console.log(weather);
});
If I run node myExample.js, the output is:
This is from weather
Scenario 2:
Now I pass one more callback in my weather.js:
module.exports= function(justAnothercallback, SecondCallback) {
justAnothercallback('This is from weather');
SecondCallback('This is second callback)');
}
And my example.js is modified to accommodate the second callback function:
var fromWeather = require('./weather.js');
fromWeather(function(weather, anotherfunc){
console.log(weather);
console.log(anotherfunc);
});
From the terminal we get:
> node example-callback.js
This is from weather
undefined
/Users/NodeCourse/async/weather.js:7
SecondCallback('This is second callback)');
^
TypeError: SecondCallback is not a function
at module.exports (/Users/oogway/NodeCourse/async/weath
My question is: aren't these the same? I just added one more callback and it failed, but it works fine if I pass just one callback. Why? Please help with this.

In your code here, you are only passing one callback with two parameters:
var fromWeather = require('./weather.js');
fromWeather(function(weather, anotherfunc){
console.log(weather);
console.log(anotherfunc);
});
This is what passing two callbacks would look like:
var fromWeather = require('./weather.js');
fromWeather(function(){
console.log('Hello from callback 1');
}, function(){
console.log('Hello from callback 2');
});
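Alternatively, if the goal is simply to hand two values back to the caller, both can be passed through a single callback, which matches the two-parameter function already used in the question. A minimal sketch along those lines (file and variable names mirror the question and are only illustrative):
// weather.js
module.exports = function(justAnothercallback) {
// one callback, invoked with two arguments
justAnothercallback('This is from weather', 'This is the second value');
};
// example.js
var fromWeather = require('./weather.js');
fromWeather(function(weather, second) {
console.log(weather); // "This is from weather"
console.log(second);  // "This is the second value"
});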

Related

node.js using nested functions from different files

I want to write understandable code in Node.js, so I put some frequently used functions into other files and access them from there.
So I have a function that calls a function from another Node.js file, and in that other file another function is called as well.
Importantly, if I put everything in one file the code works, so it must be an issue with module.exports and using functions from another file.
I have one file that gets quotes from a decentralised exchange (quoter_uni_v2.js):
module.exports = function quotes_uni_v2(tokenIn, tokenOut, amountIn, router) {
const quotedAmountOut = router.getAmountsOut(amountIn.toString(), [
tokenIn,
tokenOut,
]);
return quotedAmountOut;
};
And I import this function in my second helper file (quotes_5.js). It is split into two files because in the second one I have to call the function multiple times:
var quotes_uni_v2 = require("./quotes_uni_v2");
module.exports = async function (router1, router2, route, amount_wei) {
console.log(route);
var amount_Out = await quotes_uni_v2.quotes_uni_v2(
route[1],
route[2],
amount_wei,
router1
);
...
return (
Math.round(ethers.utils.formatEther(amount_Out[1].toString()) * 100) / 100
);
};
After that I try to call the function in my main.js:
const quotes_uni_v2 = require("./quotes_uni_v2");
const quotes_5 = require("./quotes_5");
async function calc(route) {
amountOut = await new quotes_5(
quickswap_router,
sushiswap_router,
route,
amount_wei
);
return amountOut;
};
But calling the quotes function does not work... The error is:
TypeError: quotes_5 is not a constructor...
Can someone help me?
Thanks!
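For what it's worth, the error message points at two mismatches between the exports and the call sites: quotes_uni_v2.js exports the function itself (so it should be called directly, not as a property of the module), and quotes_5.js exports a plain async function (so it should be called without new). A minimal sketch of the adjusted call sites, assuming the exports stay as shown above:
// quotes_5.js
var quotes_uni_v2 = require("./quotes_uni_v2");
module.exports = async function (router1, router2, route, amount_wei) {
// the module exports the function itself, so call it directly
var amount_Out = await quotes_uni_v2(route[1], route[2], amount_wei, router1);
// ...
};
// main.js
const quotes_5 = require("./quotes_5");
async function calc(route) {
// an exported async function is called, not constructed with `new`
const amountOut = await quotes_5(quickswap_router, sushiswap_router, route, amount_wei);
return amountOut;
}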

Run separate functions in node.js file from terminal

I have a JavaScript file that I'm running some Node tasks in, and I'd like to be able to run them separately based on the terminal command I trigger.
For instance, my Node.js file myFile.js could look like this:
const mysql = require('mysql');
const fs = require('fs');
const getDbData = () => {
...
...
}
const fileTransform = () => {
// file transformation functionality
}
I'd like to be able to run each function separately, so that I can say node myFile.js getDbData in the terminal. Do I need to export each of the functions to be able to do this?
You can supply command-line arguments to your script on the node command line. You receive them in the process.argv array. Your arguments start at index 2 (0 is the full path to node, 1 is the full path to your script).
So for instance:
switch (process.argv[2]) {
case "getData":
getData();
break;
case "etlData":
etlData();
break;
// ...
}
Note that it's true that your arguments start at index 2 even if a Node argument precedes your script on the actual command line. For instance:
node --use-strict your-script.js
...will still have the full path to node in process.argv[0] and the full path to your script in process.argv[1]. The --use-strict argument isn't in the array at all.
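As a small illustration (the paths in the comments are only examples and will differ per machine), running node --use-strict your-script.js getData and printing the array shows the layout described above:
// your-script.js
console.log(process.argv);
// [ '/usr/local/bin/node',      // process.argv[0]: path to node
//   '/path/to/your-script.js',  // process.argv[1]: path to the script
//   'getData' ]                 // process.argv[2]: your first argument
// note: --use-strict does not show up in the array at all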
Or you can put your functions on an object and use the argument as a key:
function getData() {
// ...
}
function etlData() {
// ...
}
const functions = {
getData,
etlData
};
const fn = functions[process.argv[2]] || (() => { console.log("Invalid option"); });
fn();
Try using process.argv.
https://stackabuse.com/command-line-arguments-in-node-js/
Parse the command-line arguments and, for example, evaluate them with eval().

How to export get request parameters to another JS file

app.js
app.get('/save', function(req,res){
var switchInput = {
sw1: req.query.switch1,
sw2: req.query.switch2,
sw3: req.query.switch3,
sw4: req.query.switch4,
sw5: req.query.switch5,
sw6: req.query.switch6,
}
console.log(switchInput);
module.exports = switchInput
res.send(switchInput);
});
simulate.js
var mongoose = require('mongoose');
var suit = require('../app')
...
function batteryLife(t){
var elapsed = Date.now() - t;
t_remaining = fullTime - elapsed;
t_battery = secondsToHms(Math.floor(t_remaining/1000));
//console.log(Math.floor(elapsed/1000) + ' s');
console.log(suit.sw1);
return t_battery;
};
Console Log:
{ sw1: 'true',
sw2: 'true',
sw3: 'true',
sw4: 'true',
sw5: 'true',
sw6: 'true' }
--------------Simulation started--------------
undefined
undefined
undefined
undefined
--------------Simulation stopped--------------
When I try to access these values from a different JS file they print as undefined. I am using Postman to simulate the values.
The values will log from here, but print as undefined in the other JS file.
Is there a way to correct this? I'm not sure what I am doing wrong.
The values are loading into switchInput, but they are not coming out on the simulate.js side.
First of all, when using your favourite web server such as Express, your request (req) will flow through middleware before reaching your specific endpoint. This means your request parameters are accessible at any point along the way, which can help with specific code logic (middleware-the-core-of-node-js-apps).
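As an illustration of that point (a sketch only; the route and query names are taken from the question, the middleware itself is hypothetical):
// app.js
app.use('/save', function (req, res, next) {
// the same req flows through here before the /save handler runs,
// so the query parameters are already accessible at this point
console.log(req.query.switch1);
next();
});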
I agree with vibhor1997a: you should not export something there. Basically you only assign module.exports at the top level of a file, not at run time.
If you really want to deal with switchInput in another file, you could:
do what vibhor1997a suggests (a sync or async function)
have a middleware before reaching your endpoint
raise an event with your switchInput as an argument (example; a minimal sketch follows this list)
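For the third option, a minimal sketch using Node's built-in EventEmitter (the shared events-bus.js module and its name are illustrative, not part of the question's code):
// events-bus.js
const EventEmitter = require('events');
module.exports = new EventEmitter();
// app.js
const bus = require('./events-bus');
app.get('/save', function (req, res) {
const switchInput = { sw1: req.query.switch1 /* ... */ };
bus.emit('switchInput', switchInput); // publish the values
res.send(switchInput);
});
// simulate.js
const bus = require('./events-bus');
bus.on('switchInput', function (input) {
console.log(input.sw1); // values arrive here whenever /save is hit
});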
You're exporting from inside a request handler, which isn't a good idea. What you can do instead is call a function in the other file that needs the values, and pass the values to it.
Example
app.js
const simulate = require('./simulate');
app.get('/save', function(req,res){
var switchInput = {
sw1: req.query.switch1,
sw2: req.query.switch2,
sw3: req.query.switch3,
sw4: req.query.switch4,
sw5: req.query.switch5,
sw6: req.query.switch6,
}
simulate(switchInput);
res.send(switchInput);
});
simulate.js
module.exports = function(input){
//have all your functions and code that require input here
function foo(){...}
function bar(){...}
}

How to test node data chunking function

I'm working on a project which uses node and we're trying to achieve 100% coverage of our functions. This is the only function we haven't tested, and it's within another function.
var userInput = "";
req.on("data", function(data){
userInput += data;
});
How do you go about testing this function? We tried exporting the function from another file but no luck.
I should mention that we are using tape as a testing module.
You need to trigger this "data" event on req so that the callback gets called.
For instance, supposing you have req in your test, you could do something like this (Mocha/Chai style):
req.emit('data', 'sampleData');
expect(userInput).to.equal('sampleData');
req.emit('data', {sampleData: 'wrongOrRightSampleDataHere'}) should do it.
When instantiating the http object (and hence the req object), make sure you instantiate a new one so that no other test receives this event.
To be more complete...
var assert = require('assert')
var EventEmitter = require('events')
function test() {
var hasBeenCalledAtLeastOnce = false
var userInput = "";
// a plain EventEmitter stands in for req so the example is self-contained
var req = new EventEmitter()
req.on("data", function(data){
userInput += data;
if(hasBeenCalledAtLeastOnce) {
assert.equal(userInput, "HelloWorld", "userInput is in fact 'HelloWorld'")
}
hasBeenCalledAtLeastOnce = true
});
req.emit('data', "Hello")
req.emit('data', "World")
}
test()
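Since the question mentions tape, the same idea can be written as a tape test, using a plain EventEmitter as a stand-in for the real req stream (a sketch under that assumption):
const test = require('tape');
const EventEmitter = require('events');
test('accumulates data chunks', function (t) {
const req = new EventEmitter(); // stand-in for the request object
let userInput = '';
req.on('data', function (data) {
userInput += data;
});
req.emit('data', 'Hello');
req.emit('data', 'World');
t.equal(userInput, 'HelloWorld', 'chunks are concatenated');
t.end();
});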

How to set up a yeoman test for a subgenerator that reads package.json

I have a subgenerator that uses the name from package.json. Now I want to test that function, and I wrote a before() that is supposed to create a dummy package.json for the test.
The problem is that the subgenerator cannot read the dummy JSON file.
Test file:
before(function (done) {
helpers.run(path.join( __dirname, '../addcomponent'))
.inDir(path.join( __dirname, './tmp'), function(dir) {
fs.copyTpl(
path.join(__dirname, '../app/templates/_package.json'),
dir + 'package.json',
{ ProjectName: 'foo' }
);
var test = fs.readJSON(dir + 'package.json');
console.log('test: ' + test); // returns the object
console.log('test.name: ' + test.name); // returns the correct name
})
.withArguments(['foo'])
.withPrompts(prompts)
.withOptions(options)
.on('end', done);
});
But in my sub-generator:
var memFs = require('mem-fs');
var editor = require('mem-fs-editor');
var store = memFs.create();
var fs = editor.create(store);
...
init: function() {
this.pkg = this.fs.readJSON('package.json');
console.log('this.pkg: ' + this.pkg); // returns undefined
}
// or
init: function() {
this.on('ready', function() {
this.pkg = this.fs.readJSON('package.json');
console.log('this.pkg: ' + this.pkg); // returns undefined
});
}
// or
anyOther: function() {
this.pkg = this.fs.readJSON('package.json');
console.log('this.pkg: ' + this.pkg); // returns undefined
}
The whole setup can be found here: https://travis-ci.org/markusfalk/generator-kickstart/builds/58892092
Thanks for any help.
Edit: I'll keep the old answer underneath; it's probably relevant to most people running into this issue, but not to you.
The idea behind mem-fs is to have an in-memory store. It doesn't write anything to disk automatically; it keeps the state in the mem-fs instance. In this case, you're creating your own mem-fs instance, while Yeoman uses another instance. This means the file you write is never seen by Yeoman (and never written to disk).
For you, the fix would be to use the generator instance provided as the first parameter of the ready event.
helpers.run(path.join( __dirname, '../addcomponent'))
.on('ready', function (generator) {
generator.fs.write('file.txt', 'foo');
});
Another option is to use Node.js's synchronous fs methods (fs.writeFileSync(), etc.).
My guess is you're using this.fs.readJSON() inside your generator constructor.
The constructor is initialized before the ready event is triggered. This means you read the file before it is actually written.
The usual fix is to never read inside the constructor. You can delay this step until the initializing phase where the inDir() (or the ready event) callback has run.
As a side note, you should use inTmpDir() rather than inDir().
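Putting those two suggestions together, a minimal sketch of the before() using inTmpDir() and a synchronous write, so the dummy package.json is on disk before the generator's constructor runs (the test-helper require may differ by Yeoman version, and the file contents here are placeholders):
var path = require('path');
var fs = require('fs');
before(function (done) {
helpers.run(path.join(__dirname, '../addcomponent'))
.inTmpDir(function (dir) {
// written synchronously, so it exists before the generator reads it
fs.writeFileSync(
path.join(dir, 'package.json'),
JSON.stringify({ name: 'foo' })
);
})
.withArguments(['foo'])
.withPrompts(prompts)
.withOptions(options)
.on('end', done);
});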
