How to run user-submitted modules securely in Node.js? - javascript

We are planning to develop a business-oriented application platform on Node.js + Express, and we would like to allow users to run their own native Node.js modules (sets of files: JS, CSS, HTML), so generally it should work like a portal with portlets/servlets. Users should be able to install modules on the server side along with their client parts, and these modules should interact with the platform and other modules through some API. So we need to isolate these modules from direct access to the system files and database, while they still have access to their own files and database. Please help me figure out what direction we should dig in to make this secure.
I have looked into sandboxing with vm and child processes.
I tried:
// Main file:
var util = require('util'),
    vm = require('vm'),
    fs = require('fs'),
    sandbox = {
      animal: 'cat',
      count: 2,
      require: require // I pass it to make it possible for the module to
                       // include some additional files,
                       // but it opens access to all system files
    };
var context = vm.createContext(sandbox);
fs.readFile('./user_modules/index.js', 'utf8', function (err, data) {
  if (err) throw err;
  vm.runInNewContext(data, context);
  console.log(util.inspect(context));
});
// User module:
// user_modules/index.js
var fs = require('fs');
count++;
animal = 'Dog';
fs.readFile('README.md', 'utf8', function (err, data) {
  animal = 'Fox';
});
I passed the require function to the module to make it possible to include additional files, but it opens access to all system files. Is it possible to tell vm or a child process to work only with specific folders? Currently I have no idea how to work with the database, but I think that when a user installs his module, the platform should copy all the files and create a DB schema for the user; then, when the module launches, I only need to pass in an object connected to that user's DB schema.
Please help me, I'm really new to Node; any suggestions on how to solve my issue?
Thanks in advance

One thing you could do is to create a shim function around require that does whatever validation you want, and then calls the system's require function. You can then pass that in to the sandbox as a replacement for "require".
I'm not sure of all the changes that would be necessary to make a "secure" sandbox for node.js. To some extent, that's going to depend on what the user-submitted modules need to do.
One way to help ensure that the user modules can't interfere with your code would be to run them in their own process. On a unix system, you can use chroot to create an isolated filesystem for the process to run in, and then communicate with the process over a stdio pipe, or a socket.

Related

How to manage releases with ASP.NET pointing to new versions of Webpacked JS files?

I'm working with a client who has a monolithic ASP.NET application as their backend, but also hosts Webpacked JS files for their React.js frontend in the same Git repo. The obvious problem is that any time they want to do a frontend release, they have to build and release the entire .NET application along with it, due to the manifest.json that lives on the EC2 instance. The idea is that there will eventually be many versions across many instances, canaried out to different users, going through various levels of post-production testing, etc.: a standard DevOps CI/CD pipeline with post-production health checks and automated rollbacks. That is the end goal. In the meantime, they need to split their frontend/backend releases apart, which means separate versioning for both. So, after much preamble, the question to the community is two-fold:
Has anyone experienced this type of environment and come up with a viable solution?
Does anyone have a good suggestion for how to approach this problem?
Keep in mind that this solution should also expect local development, PR testing, and a blue/green prod setup.
Assuming that you can refactor your project so that the client-side assets live in a folder of their own, call it client-{version}, you can create a project setting for the client-side path (e.g. client-v1/). Then, when you need to move from v1 to v2, a command-line command can change this setting at runtime. Deploying the client side would follow this algorithm:
deploy the client-side
change the client-{version} to a newer version
trigger command-line command to the project which will ensure that the new client-side path will be used from now on
I have used the following solution (I learned it from JSDelivr, UnPkg, etc.):
Private NPM Repository (ProGet)
Version Table (stores package name and production version to be used)
A controller to download and extract package to a local folder for requested package with version.
Here is how it works:
/// Let's assume this will serve JS/image/CSS, everything,
/// from a path like /js-pkg/packageName[#version]/...
[RoutePrefix("js-pkg")]
public class JSContent : Controller
{
    [HttpGet("{packageName}/{*path}")]
    public ActionResult Get(string packageName, string path)
    {
        string version = null;
        if (packageName.Contains("#"))
        {
            var tokens = packageName.Split('#');
            packageName = tokens[0];
            version = tokens[1];
        }
        if (version == null)
        {
            // Redirect to the versioned URL for performance reasons:
            // since every real request ends up versioned,
            // you don't have to worry about a stale cache.
            string releasedVersion = db.Versions
                .Where(x => x.Package == packageName)
                .Select(x => x.Version)
                .First();
            return Redirect($"/js-pkg/{packageName}#{releasedVersion}/{path}");
        }
        // Versioned responses can be cached aggressively.
        Response.CacheControl = "public, max-age=36000000";
        // You need to set up file-based locking here.
        string folder = $"d:\\temp\\{packageName}\\{version}";
        if (!System.IO.Directory.Exists(folder))
        {
            DownloadAndExtract(
                $"https://proget...../npm/repo/{packageName}/-/{packageName}-{version}.tgz",
                folder);
        }
        string file = $"{folder}\\{path}";
        if (!System.IO.File.Exists(file))
        {
            return HttpNotFound();
        }
        return File(file);
    }
}
Now, to host the files, you can simply put a CSHTML view with the following lines:
<script src="/js-pkg/packageName/dist/ui/start.packed.js"></script>
<link rel="stylesheet" href="/js-pkg/packageName/dist/ui/start.css">
There are multiple advantages to this:
All your JS packages can stay outside the main repository. As the UI usually becomes more and more complicated over time, it can be divided into multiple packages that all exist independently.
Multiple teams can manage packages independently, and undoing a released package is easy: you simply go into the database and change the version.
In a debug environment, you can also supply the version in the query string or in the package name to test a different JS package against the production/staging backend.
A separate frontend removes all Node-related code from the backend; the backend will not need unnecessary frontend processing.
You will eventually have many small repositories, each independently developed and maintained. Your commits will be smaller and clearer; a single repository for frontend and backend accumulates too many commits per feature/change.

How can I require modules with patterns in the path?

How can I include all files in Node.js, like
require('./packages/city/model/cities')
require('./packages/state/model/states')
require('./packages/country/model/countries')
with something like
require('./packages/*/model/*')
the same way Grunt loads files?
You can't (or at least you shouldn't)
In order to do this, you would have to overload node's native require function, which is highly inadvisable.
The CommonJS pattern might seem tedious to you, but it's a very good one and you shouldn't try to break it just because you saw shortcuts in other languages/frameworks.
By introducing some form of magic in your module, you suddenly change everything that programmers can (and should be able to) safely assume about the CommonJS pattern itself.
Due to the one-to-one correspondence in Node's module loading system, it won't be possible natively, but I would not be surprised if there is a package for this.
The best you can do is create an index.js that loads the modules present in the directory and exports them as its own.
module.exports = function () {
  return {
    city: require('./packages/city/model/'),
    state: require('./packages/state/model/'),
    country: require('./packages/country/model/')
  };
};
You would have to load the models in a similar fashion in all three directories as well.
I know that this solution is not what you are looking for, but in my experience this method makes it easier to manage custom packages, as you can add/remove features easily.
Node.js's require allows you to
load only one module at a time
load modules only in a synchronous fashion.
That is how the module system works in Node.js. But if you want minimatch-style matching functionality, you can roll your own, like this:
var path = require("path"),
    glob = require("glob");

function requirer(pattern) {
  var modules = {},
      files = glob.sync(pattern);
  files.forEach(function (currentFile) {
    // strip the directory and the ".js" extension to get the module key
    var fileName = path.basename(currentFile);
    fileName = fileName.substring(0, fileName.lastIndexOf(".js"));
    modules[fileName] = require(currentFile);
  });
  return modules;
}
This depends on the glob module, which lets you use minimatch patterns to search for files; we then require each found file, store it in an object, and return the object. It can be used like this:
var modules = requirer('./packages/*/model/*.js');
console.log(modules.cities);
P.S.: I am already working on making this a public module.

How to call a program on the server using MeteorJs?

Currently, I am building an application in Meteor. The application is based on transforming application information into an XML format, then running a program on that XML on the server, and after that reading the XML file the program created and bringing it into a new collection.
On the other hand, on my Linux machine I run this manually:
(*) ./myprogram -xml=myOutput.xml -out=myout.xml -timelimit=300
where myOutput.xml is the generated input file,
myout.xml is the file generated by the program,
and timelimit is the maximum runtime of the program.
So far I have done the following:
Once all the data is in, I build an XML file with the values stored in my collections. This is performed as follows:
(1) At server startup, declare the use of fs:
Meteor.startup(function () {
  fs = Npm.require('fs');
  // ...
});
In my methods I build the file with its corresponding information:
Meteor.methods({
  transform: function () {
    var wstream = fs.createWriteStream('myOutput.xml');
    // ...
    // ...operations over the collections
    // ...
    wstream.end();
    // ... (2) ...
  },
});
All good up to (2): the file is created correctly.
After (2), I need to execute the program as in (*). The program depends on the XML file that was built and produces an output file.
Having contextualized my problem, my questions are:
How can I run the program on the server using the file created? And
where should I put the executable "myprogram" so that I can use it?

Is it possible to get a file list for a certain public directory from HTML/JavaScript?

I plan to implement a module component system (plugin) for the Node.js client side.
For instance (simplified):
/www/index.html
/www/index.js
/www/modules/moduleA/module.js
/www/modules/moduleA/some.html
/www/modules/moduleB/...
...
It's certainly possible to hardcode the list of module dir-names and require (with browserify) the components, but I would rather discover them automatically from the modules directory structure.
In node.js, I do this easily as:
var fs = require('fs');
var modules = {};
fs.readdir('./www/modules', function (err, modulesDir) {
  modulesDir.forEach(function (modulename) {
    modules[modulename] = require('./www/modules/' + modulename + '/module');
  });
});
However, on the client side, even with browserify, the same approach is not possible as far as I can tell, since fs is basically not supported in the browser.
How can we read the directory names under a certain directory path?
Thanks.
P.S. I just found
https://github.com/brianloveswords/filesystem-browserify
but it looks a bit old; if you have a good recommendation, please let me know.

Access node.js File System module in Meteor

I'm creating a web app that will edit some config files stored on a user's HD, and decided to give Meteor a shot.
I'd like to use Node.js's File System module to handle the I/O of the config files, but I haven't been able to figure out how to include the module. After some searching, I found the following code here on Stack Overflow, which is supposed to let me require the module:
var require = __meteor_bootstrap__.require;
var fs = require('fs');
However, even with this placed inside the if(server) portion of my code, my application still throws an error telling me that 'fs' is undefined.
Has anyone else encountered this issue?
From Meteor 0.6.0 you need to use Npm.require:
var fs = Npm.require('fs');
