How to run a pickle file in node js - javascript

Working on an AI/ML project, I need a way to load a pickle file inside Node.js so that I can run the algorithm on the data that was submitted.

Try using the node-pickle library to convert the pickle file to a JSON object. Here is the documentation for node-pickle:
const nodePickle = require('node-pickle');
// Convert pickled object to JSON object
nodePickle.load(pickledData)
  .then(data => {
    // data is a JSON object here
  });
Then you can use TensorFlow.js to run that JSON object as a model.
TensorFlow.js documentation

The only one I could find from a quick Google search is node-jpickle. According to its docs, it seems to be able to handle most of the situations pickle can, even some of the more advanced ones such as classes:
function MyClass() {}
var jpickle = require('jpickle');
jpickle.emulated['__main__.MyClass'] = MyClass;
var unpickled = jpickle.loads(pickled);
If you're just trying to pickle/unpickle, you can just do it like you would in Python:
var unpickled = jpickle.loads(pickled);
The docs don't mention a plain load function, however.

Related

Importing an external library into nodejs vm

I am developing a Node.js library that allows the user to write their own JS code, which will then be executed. For example:
var MyJournal = Yurnell.newJournal();
module.exports = function(deployer) {
  MyJournal.description = "my first description";
  // deployment steps
  deployer.deploy(MyJournal);
};
This eventually gets called using the Node.js vm module:
var script = vm.createScript(fileWithFrontend.content, file);
script.runInNewContext(context);
passing in the Yurnell and deployer objects via the context parameter.
My question is whether there is a way for the user to also import their own libraries into the script, and if so, where in the user's path the script would look for the library.
For example, it would be useful for them to write something like var moment = require('moment'); in their code and format dates using that library as well.
Thanks
You can use the node:vm module linker.
For a detailed explanation, see https://nodejs.org/api/vm.html#modulelinklinker
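As a rough illustration (my sketch, not part of the original answer), the linker approach could look like the following. It assumes the user's script is written as an ES module and that Node is started with --experimental-vm-modules; each import specifier (for example 'moment') is resolved with the host's own require and exposed to the sandbox as a synthetic module.
const vm = require('node:vm');

// Run user-supplied ESM source and let it import host-installed packages.
async function runUserModule(source, globals = {}) {
  const context = vm.createContext({ console, ...globals });
  const mod = new vm.SourceTextModule(source, { context });

  await mod.link(async (specifier, referencingModule) => {
    // Resolve the import with the host's require, then wrap it so the
    // sandboxed module can import it.
    const dep = require(specifier);
    return new vm.SyntheticModule(
      ['default'],
      function () { this.setExport('default', dep); },
      { context: referencingModule.context }
    );
  });

  await mod.evaluate();
  return mod.namespace;
}
If the user scripts stay in CommonJS style (module.exports, require(...)), a simpler option is to build a require with module.createRequire pointed at a file inside the user's project directory and pass it in through the context object, so lookups happen in the user's own node_modules.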

Jest mock pdfmake and fs.createWriteStream

I am very new to Jest and unit testing in general. I am trying to write a test for a piece of code that essentially uses pdfmake and fs.createWriteStream in order to create and write to a PDF file.
As I am reading tutorials (and getting a bit confused, since this is material I am not used to), I have tried to put together a couple of things: an fs.js module to try to get the write stream mocked. About pdfmake I am not very sure - could I perhaps skip it, assume I have a target file, and therefore mock just the stream creation?
describe('createWriteStream', () => {
  const MOCK_FILE_INFO = {
    '/path/to/file1.pdf': 'console.log("file1 pdf contents");'
  };
});
Something like the above?
And then just have fs.createWriteStream report success as its result?
The original code looks something like:
let pdfDoc = new pdfPrinter({
  Roboto: {
    normal: new Buffer(require('pdfmake/build/vfs_fonts.js').pdfMake.vfs['Roboto-Regular.ttf'], 'base64')
  }
}).createPdfKitDocument(docDefinition);
pdfDoc.pipe(fs.createWriteStream(path));
pdfDoc.end();
I understand this is not a lot of info, but I guess someone who is well versed in unit testing might know how best to shape this. Thanks
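Not an answer from the thread, just a minimal sketch of one way the mocking could look, assuming the code above is wrapped in a hypothetical generatePdf(path) module: mock fs so that createWriteStream returns an in-memory PassThrough stream, let pdfmake write into that, and assert on the requested path.
const { PassThrough } = require('stream');

jest.mock('fs');
const fs = require('fs');
const generatePdf = require('./generatePdf'); // hypothetical module under test

test('pipes the generated pdf to the requested path', () => {
  // Nothing touches the real filesystem: createWriteStream hands back
  // an in-memory stream that pdfDoc.pipe() can write into.
  fs.createWriteStream.mockReturnValue(new PassThrough());

  generatePdf('/path/to/file1.pdf');

  expect(fs.createWriteStream).toHaveBeenCalledWith('/path/to/file1.pdf');
});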

Return array with fast-csv in Node

I am attempting to parse a large file using the fast-csv library and return its values as an array to a config.js file. Please help, as the value of countries in the config's module.exports section ends up being undefined.
Parser:
import csv from 'fast-csv';

export function getCountries() {
  let countries = [];
  csv.fromPath('./src/config/csv_configs/_country.csv')
    .on('data', function(data) {
      countries.push(data);
    })
    .on('end', function() {
      return countries;
    });
}
Config:
import {getCountries} from '../tools/csv_parser';

let countryList = [];

module.exports = {
  port: process.env.PORT || 8000,
  token: '',
  countries: getCountryList()
};

function getCountryList() {
  if (countryList.length === 0) {
    countryList = getCountries();
  }
  return countryList;
}
I understand this is due to me attempting to return a value from the anonymous callback passed to .on(); however, I do not know the proper approach.
You're correct that returning a value from the .on('end', ...) callback is the source of your problem.
Streams are asynchronous. If you want to use this fast-csv library, you're going to need to return a promise from getCountries(). However, I'm assuming that's not what you want, since you're using the result in a config file, which is synchronous.
Either you need to read your csv synchronously, or you need to refactor the way your application works to be able to have your config be asynchronous. I'm assuming the second option isn't possible.
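For reference, a promise-returning version of getCountries() - a sketch only, keeping the same fast-csv stream API the question already uses - would look roughly like this:
import csv from 'fast-csv';

export function getCountries() {
  return new Promise((resolve, reject) => {
    const countries = [];
    csv.fromPath('./src/config/csv_configs/_country.csv')
      .on('data', data => countries.push(data))
      .on('error', reject)
      .on('end', () => resolve(countries)); // resolve instead of returning
  });
}
Callers then have to await the promise (getCountries().then(...)), which is exactly why it does not fit a synchronous config file.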
You probably want to look into using another CSV library that doesn't use streams, and is synchronous. Two examples from a quick Google search are:
https://www.npmjs.com/package/csv-load-sync
https://www.npmjs.com/package/csvsync
I haven't used either of these libraries personally, but it looks like they'd support what you're trying to do. I'm assuming your CSV file is small enough to all be stored in memory at once, if not, you're going to have to explore more complicated options.
As a side note, is there any specific reason that the data has to be in CSV format? It would seem to be much easier to store it in JSON format. JSON can be imported to your config file directly with require; no external libraries needed.
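To illustrate that last point, assuming the country data were converted to a file such as ./csv_configs/countries.json (hypothetical path), the config could stay fully synchronous:
// config.js
module.exports = {
  port: process.env.PORT || 8000,
  token: '',
  // require() reads and parses the JSON synchronously at load time
  countries: require('./csv_configs/countries.json')
};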

Remove line from file without mutating the buffer variable in node js

I am new to Node.js.
I have written a function that reads the contents of a file and stores them in a variable as an array. Finally, I mutate the variable and write it back to the file. See below:
function removeItem(file, item) {
  return fs.promises.readFile(file, 'utf8').then(function(contents) {
    var data = contents.split(/\n/);
    data.splice(data.indexOf(item), 1);
    return fs.promises.writeFile(file, data.join('\n'));
  });
}
Is there a way to do the same without mutating the variable, or even without having to store the contents of the file in a variable and delete the line from it, in Node.js?
Thanks
If you don't want to mutate the variable with data.splice(), then you can use data.slice(), which doesn't mutate the original array. See the docs:
https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Array/slice
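For example (a sketch, not from the original answer), removing the item without touching data could look like either of these:
var i = data.indexOf(item);

// slice() copies the parts before and after the item into a new array
var withoutItem = data.slice(0, i).concat(data.slice(i + 1));

// or filter() builds a new array without any matching lines
var alsoWithoutItem = data.filter(function (line) { return line !== item; });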
Also instead of reading the contents of the file into a variable, you can create a readable stream and filter out the line that you don't want.
See how I created my filt module that filters lines of standard input:
https://www.npmjs.com/package/filt
The source code and a lot of examples are on GitHub:
https://github.com/rsp/node-filt
Basically, what you can do here - using the handy split module - is something like this:
const fs = require('fs');
const split = require('split');

fs.createReadStream(file).pipe(split()).on('data', (line) => {
  // if line is something ... etc.
});
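A fuller sketch of that streaming idea (my illustration, not part of the original answer; it assumes the split and through2 packages and a hypothetical removeLine(file, item) helper): stream the file line by line, drop the unwanted line, write everything else to a temporary file, and rename it over the original, so the whole file never has to sit in a variable.
const fs = require('fs');
const split = require('split');
const through2 = require('through2');

function removeLine(file, item) {
  return new Promise((resolve, reject) => {
    const tmp = file + '.tmp';
    fs.createReadStream(file)
      .pipe(split())
      .pipe(through2((line, enc, cb) => {
        const text = line.toString();
        // forward every line except the one we want to remove
        cb(null, text === item ? '' : text + '\n');
      }))
      .pipe(fs.createWriteStream(tmp))
      .on('finish', () => fs.rename(tmp, file, err => (err ? reject(err) : resolve())))
      .on('error', reject);
  });
}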

How can I return a real JavaScript array from a Java method with Nashorn?

I'm writing an API to be used by some JavaScript code. Some of the methods in this API should return a real JavaScript array. Unfortunately, this doesn't work:
// MyApi.java
import java.util.List;

public interface MyApi {
    String[] returnsJavaArray();
    List<String> returnsJavaList();
}
// MyScript.js
var api = getMyApi();
var strings = api.returnsJavaArray(); // Returns some kind of proxy
strings = api.returnsJavaList(); // Also a proxy
While the proxies support basic things like strings[i], I need them to be actual arrays in order to use some Array polyfills. What's the best way to do this in Nashorn?
My only idea so far is to write a JavaScript wrapper of the whole MyApi and wrap the results with Java.from(api.returnsJavaArray()) but that's pretty tedious.
You should make your API return a string of JavaScript.
Then, in your JS file, you can call
eval(stringOfJsReturnFromServer);
to make your script run.
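For illustration only (assuming a hypothetical returnsJsArrayLiteral() method on the API that returns JavaScript source such as "['a', 'b', 'c']"), the script side of that approach would be:
// MyScript.js
var strings = eval(api.returnsJsArrayLiteral()); // hypothetical method
print(Array.isArray(strings)); // true - a real JS array, so Array polyfills work on it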
