Return array with fast-csv in Node - javascript

I am attempting to parse a large file using the fast-csv library and return its values as an array to a config.js file. Please help, as the value of countries in the config's module.exports section ends up being undefined.
Parser:
import csv from 'fast-csv';

export function getCountries() {
  let countries = [];
  csv.fromPath('./src/config/csv_configs/_country.csv')
    .on('data', function (data) {
      countries.push(data);
    })
    .on('end', function () {
      return countries;
    });
}
Config:
import {getCountries} from '../tools/csv_parser';

let countryList = [];

module.exports = {
  port: process.env.PORT || 8000,
  token: '',
  countries: getCountryList()
};

function getCountryList() {
  if (countryList.length === 0) {
    countryList = getCountries();
  }
  return countryList;
}
I understand this is due to my attempting to return a value from the anonymous function passed to .on(), but I do not know the proper approach.

You're correct that returning a value from the callback passed to .on('end', ...) is the source of your problem.
Streams are asynchronous. If you want to use the fast-csv library, you're going to need to return a promise from getCountries(). However, I'm assuming that's not what you want, since you're using the result in a config file, which is loaded synchronously.
Either you need to read your CSV synchronously, or you need to refactor the way your application works so that your config can be asynchronous. I'm assuming the second option isn't possible.
You probably want to look into using another CSV library that doesn't use streams, and is synchronous. Two examples from a quick Google search are:
https://www.npmjs.com/package/csv-load-sync
https://www.npmjs.com/package/csvsync
I haven't used either of these libraries personally, but it looks like they'd support what you're trying to do. I'm assuming your CSV file is small enough to be stored in memory all at once; if not, you're going to have to explore more complicated options.
As a side note, is there any specific reason that the data has to be in CSV format? It would seem to be much easier to store it in JSON format. JSON can be imported to your config file directly with require; no external libraries needed.


How to run a pickle file in node js

Working on an AI/ML project, I need a way to run a pickle file inside Node.js so I can use it to run the algorithm on the data users submit.
Try the node-pickle library to convert the pickle file to a JSON object. Here is the documentation for node-pickle:
const nodePickle = require('node-pickle');

// Convert pickled object to a JSON object
nodePickle.load(pickledData)
  .then(data => {
    // data is a JSON object here
  });
Then you can use tensorflow.js to run that JSON object as a model.
TensorFlow.js documentation
The only one I could find from a quick Google search is node-jpickle. According to its docs, it seems able to handle most of the situations pickle can, even some of the more advanced ones, such as classes:
var jpickle = require('jpickle');

function MyClass() {}
jpickle.emulated['__main__.MyClass'] = MyClass;

var unpickled = jpickle.loads(pickled);
If you're just trying to pickle/unpickle, you can just do it like you would in Python:
var unpickled = jpickle.loads(pickled);
The docs don't say anything about a plain load function, however.

Node.js Function not updating value after 1st invoke

I've recently taken an interest in the Discord.js framework and was designing a bot for a server. Apologies in advance for the messiness of the code.
The issue I'm facing is that after the first run of the command, the function is invoked, but the value of ticketValue does not update to the new value in my JSON file.
const fs = require("fs");

module.exports = {
  commands: ["ticket"],
  minArgs: 1,
  expectedArgs: "<message>",
  callback: (message, arguments, text) => {
    // Console log to notify the function has been invoked.
    console.log("FUNCTION RUN")
    let jsondata = require("../ticketNum.json")
    let ticketValue = jsondata.reportNews.toString()
    // Turning the number into a 4 digit number.
    for (let i = ticketValue.length; i < 4; i++) {
      ticketValue = `0${ticketValue}`
    }
    console.log(`#1 ${ticketValue}`)
    // Creating the Discord channel
    message.guild.channels.create(`report-incident-${ticketValue}`, {
      type: 'text',
      permissionOverwrites: [
        {
          id: message.author.id,
          deny: ['VIEW_CHANNEL'],
        },
      ],
    })
    // Adding one to the ticket value and storing it in a JSON file.
    ticketValue = Number(ticketValue) + 1
    console.log(`TICKET VALUE = ${ticketValue}`)
    fs.writeFile("./ticketNum.json", JSON.stringify({ "reportNews": Number(ticketValue) }), err => {
      console.log(`Done writing, value = ${ticketValue}`)
    })
    console.log(require("../ticketNum.json").reportNews.toString())
  },
}
I believe this is due to the require cache. You can either invalidate the cache for your JSON file each time you write to it, or, preferably, use fs.readFile to get the up-to-date contents.
Also worth noting: you are requiring from ../ticketNum.json but writing to ./ticketNum.json. That could also be a cause.
You seem to be using JSON files as a database, and while that is perfectly acceptable depending on your project's scale, I would recommend something a little more polished, like lowdb, which still uses local JSON files to store your data but provides a nicer API to work with.
You should only use require when the file is static while the app is running.
The caching require performs is really useful, especially when you are loading tons of modules with the same dependencies.
require also does some special stuff to look for modules locally, globally, etc. So you might see unintended things happen if a file is missing locally and require goes hunting.
These two things mean it's not a good replacement for the fs tools node provides for file access and manipulation.
Given this, you should use fs.readFileSync or one of the other read functions. You're already using fs to write, so it isn't a large lift to change the line or two where you have require in place of a read.

Loading local json file with require?

I've been googling for hours now, so I thought I'd just ask here. For some reason my require() does not work. RequireJS is included, and as far as I can see the return value should be my data in the exact same order as my JSON file.
Here is my code:
$(document).ready(async function() {
  let data = await fetchData('./data/file.json');
  console.log(data);
});

// fetch and return data
function fetchData(path) {
  return require([path]);
}
I originally had this solution (which worked with a local host, but I need it to work without a host):
function fetchData(path) {
  return fetch(path).then(response => {
    return response.json().then((data) => {
      return data;
    }).catch((err) => {
      console.log(err);
    });
  });
}
It gives me several script errors and MIME type mismatches plus it logs this instead of my data:
s(e, t, i)
  arguments: null
  caller: null
  defined: function defined(e)
  isBrowser: true
  length: 3
  name: "s"
  prototype: Object { … }
  specified: function specified(e)
  toUrl: function toUrl(e)
  undef: undef(i)
I don't know what else I should try.
Thank you!
RequireJS is not compatible with Node.js's require function. It is designed for AMD modules, not CommonJS modules, and it does not support loading plain JSON files. This is why your first attempt does not work.
Your second attempt does not work because file-system requests are treated as cross-origin requests.
The only way to load a JSON file when working from the local filesystem is to have the user select it with an <input type="file"> and then read it with JavaScript.
If you want to read hard-coded JSON then you might consider baking it into your app. The simple way to do that would be to just paste it in as a JS object literal. More complex programs might benefit from using a tool like Webpack (which would need a JSON loader) and pulling it into the JS at build-time rather than development time (the aforementioned pasting approach) or run time (which is impossible as mentioned in previous paragraphs).
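The pasting approach can be as simple as an ordinary script file that defines the data as an object literal; no fetch and no server needed (the file and variable names below are made up):

```javascript
// data.js -- the JSON pasted in as a plain object literal, loaded with a
// normal <script src="data.js"> tag before the scripts that use it.
const APP_DATA = {
  countries: [
    { code: 'US', name: 'United States' },
    { code: 'FR', name: 'France' }
  ]
};

// Any later script on the page can use it directly:
const names = APP_DATA.countries.map((c) => c.name);
```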
You have two options:
Use a tool like Webpack to put all your frontend files into the bundle
Download JSON file from web server:
I assume you are using jQuery already, so:
$.get('https://petstore.swagger.io/v2/swagger.json').then(function(data) {
  console.log(data.swagger);
});
In this case you have to make your JSON file available through the web server:
$.get('/dist/data/file.json').then(function(data) {
  console.log(data);
});

Node.js can not always get value from my config file

I am using node.js in the server side. I have a RESTful service like;
app.get('/getAndroidVersion', function (req, res) {
  res.json({ version: config.androidVerion });
});
This service must return the version value from the config file. Although I change the androidVerion value in the config file, node.js returns a different value from the one in the config. For example, the version value in the config file is 2, but node.js returns 3.
After changing the version value in the config file, I added a console.log statement before the service responds, like this:
app.get('/getAndroidVersion', function (req, res) {
  console.log(config.androidVerion);
  res.json({ version: config.androidVerion });
});
It also writes a value different from the config file to the console. What is the problem? Does node.js cache the version number, or return random values for it?
How can I handle this problem? Thank you.
Calls to require are cached.
From Node.js: Modules docs:
Modules are cached after the first time they are loaded. This means (among other things) that every call to require('foo') will get exactly the same object returned, if it would resolve to the same file.
You have several options here:
Use fs.readFile for each request to read from disk
You can use readFile instead of require, to read your config file from disk for every request. This ensures you always get the most up-to-date version of the file.
Store config in a database
If this is a high-traffic app you might want to keep those config settings in a database instead of a file. DB engines usually avoid touching the disk for frequently executed queries.
Write your own persisted object mechanism
Going a step further, you can write your own mechanism for storing K/V pairs: it persists the object to disk whenever a value is set, but never touches the disk when a value is read.
Here's an (untested) example:
const fs = require('fs')
const path = require('path')

class PersistedObject {
  constructor(path) {
    this.path = path
    this.obj = {}
  }
  load() {
    const json = fs.readFileSync(this.path, 'utf8')
    this.obj = json ? JSON.parse(json) : {}
  }
  set(key, value, cb) {
    this.obj[key] = value
    fs.writeFile(this.path, JSON.stringify(this.obj), cb)
  }
  get(key) {
    return this.obj[key]
  }
}

// Construct with a path to your config.json.
const config = new PersistedObject(path.resolve(__dirname, './config.json'))

// Load data from disk. Run this once on startup.
config.load()

// Calling `.set` sets the in-memory `obj`,
// and also persists the obj on disk.
config.set('firstName', 'Mary', (err, result) => {
  // Calls to `.get` use the in-memory `obj` and do not touch the disk.
  console.log(config.get('firstName')) // logs 'Mary'
})

NodeJS Group Functions Under A Sub-Class

Perhaps I have not worded the title correctly; below is an explanation of what I'm trying to do.
I'm creating a helper.js file for my project. It contains many functions, one of which I've pasted below. I export this function using module.exports.
function generateInternalError(message, stack_trace) {
  if (process.env.NODE_ENV == 'dev') {
    console.log({ message: message, stack_trace: stack_trace });
  } else {
    console.log({ message: message });
  }
}

module.exports = {
  generateInternalError: generateInternalError
};
Where I want to utilize this function I would call:
helper.generateInternalError('Not Found',new Error().stack);
And it works as expected.
But, what I have been tasked with, is creating categories of functions. Essentially I need the following:
helper.errors.generateInternalError('Not Found',new Error().stack);
I cannot seem to figure out the right way to export a class of functions or an object of functions in Node.js such that I don't get an error like:
TypeError: helper.errors.generateClientError is not a function
Any assistance is appreciated.
Thank you
The module.exports property of a file is simply an object that maps names to functions. You can define it arbitrarily, for example:
module.exports = {
  errors: {
    generateInternalError,
    ...
  },
  ...
};
Then, require('./helper').errors.generateInternalError will be defined.
helpers is just noise if everything is a helper, so drop that. Just use regular modules, and unless you are sure a category will only ever contain one function, export multiple functions. If you do export just one function with module.exports, you don't need to make it a property of an object, which also means you can simply write const genError = require('./errors').
Don't make something like helpers.errors.someErrorFunc: helpers is noise, and you create categories with separate module files anyway. Don't try to make Node.js look like Java or something equally horrible.
It might be better to structure your helper subclasses in separate files.
Example
src/helpers.js
src/helpers/
src/helpers/errors.js
File helpers.js
module.exports = {
  errors: require('./helpers/errors')
}
File helpers/errors.js
module.exports = {
  generateInternalError: function () {
    // write some internal error code here
  }
};
Structuring your code like this will keep your root helpers file very organized and create a pattern that is easy to replicate for new subclasses.
If you prefer a less modular approach, you could simply return one big object, as others have demonstrated:
module.exports = {
  errors: {
    generateInternalError: function () {
      // internal error code
    },
    generateDatabaseError: function () {
      // db error code
    }
  }
}
