Node.js fs module inside the browser - javascript

I have a scenario where I want to export data to CSV from the client-side. I will have a textbox/area or whatever where the user can input data and then ideally with one click, a local CSV file will be updated with that data.
This is easily achievable server-side with Node.js and its core modules (specifically the fs module), but apparently not so much from a browser.
I found out that certain Node modules (for example underscore) support RequireJS's way of making a module work in the browser. So for underscore I did this:
methods.js
define(['underscore'], function (_) {
    var Methods = {
        doSomething: function () {
            var x = _.size({one: 1, two: 2, three: 3, xuz: 3});
            alert(x);
        }
    };
    return Methods;
});
common.js
requirejs.config({
    baseUrl: 'node_modules',
    paths: {
        underscore: 'underscore/underscore'
    }
});
require(['methods'], function (y) {
    y.doSomething();
});
index.html
<script data-main="common" src="require.js"></script>
<script>
    require(['common'], function () {
        require(['methods.js']);
    });
</script>
The above works fine and will show an alert: 4.
But when I try the same with the fs module it won't work. It will show this error:
Module name "util" has not been loaded yet for context: _. Use require([])
As far as I can understand, this is because fs requires several other modules, one of them being util.
So I proceeded to add all these modules to the RequireJS config. But still no luck, so I specifically tested the util module by itself, as it doesn't seem to require other modules to work.
And now I am stuck on this error: Uncaught ReferenceError: exports is not defined
I tried modularizing this util module by encapsulating the whole module source code in this:
define([], function() {})
But it didn't work either... I've also tried copying underscore's module pattern, but still no luck.
So I was wondering if anyone managed to use util & fs modules (or any other core NodeJS modules) within the browser with libraries such as RequireJS or Browserify.

That's right: exports is Node-specific (it's used to make part of your module available outside the module) and isn't supported by web browsers. Even though Node.js is technically JavaScript, each environment has its own globals (like window in browsers, and exports in Node.js apps) that can't be interchanged.
That said, here's a client-side JS approach to the CSV problem.
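A minimal sketch of the idea: read the user's input, wrap it in a Blob, and trigger a download through a temporary anchor element. Note that the browser can only offer the file as a download for the user to save, not silently update an existing local file; the element IDs below are hypothetical.
// A sketch: read the user's input, build a CSV Blob, and offer it as a download.
// 'dataInput' and 'exportButton' are hypothetical element IDs.
document.getElementById('exportButton').addEventListener('click', function () {
    var text = document.getElementById('dataInput').value;
    var blob = new Blob([text], { type: 'text/csv;charset=utf-8;' });
    var link = document.createElement('a');
    link.href = URL.createObjectURL(blob);
    link.download = 'export.csv';      // suggested file name in the save dialog
    document.body.appendChild(link);
    link.click();                      // triggers the download
    document.body.removeChild(link);
    URL.revokeObjectURL(link.href);
});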

Your best bet (and probably your only one) is to use the HTML5 FileSystem API. I am not aware of any other browser feature that would allow you to work with files on the client computer, except maybe Flash and similar plugins.
I am a bit confused by your browserify tag, since you are obviously not using Browserify. It would fix your issue with "exports is not defined".
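For reference, the sandboxed FileSystem API mentioned above looks roughly like the sketch below. It is Chrome-only (webkit-prefixed) and writes into a browser-managed sandbox, not to an arbitrary file on the user's disk.
// A rough sketch of the sandboxed (webkit-prefixed) FileSystem API.
// Files live in a browser-managed sandbox, not at an arbitrary local path.
window.webkitRequestFileSystem(window.TEMPORARY, 1024 * 1024, function (fs) {
    fs.root.getFile('export.csv', { create: true }, function (fileEntry) {
        fileEntry.createWriter(function (writer) {
            writer.write(new Blob(['one,two\n1,2\n'], { type: 'text/csv' }));
        });
    });
}, function (err) {
    console.error(err);
});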

Related

Difficulty Loading Modules with Require.js

I'm trying to load and use a module I defined with Require. As far as I can tell, I'm following the patterns on the official site, but I'm not getting the loaded module inside the require call.
main.js:
define(function (require) {
    require(['./util'], function (util) {
        util.dictionary();
    });
    ...
});
util.js:
define(function () {
    "use strict";
    var util = {};
    ...
    return util;
});
With this, util.dictionary() fails because util is undefined.
Both files are on the same directory level, and I haven't defined any baseURL for Require.
Why is this failing?
After a lot of poking around, I discovered that Chrome was using cached versions of some of my files, and clearing the cache caused the modules to load correctly.
Yes, I'm a web development noob.
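A common way to avoid this during development is RequireJS's urlArgs cache-busting option, for example:
// Development-only cache busting: appends a changing query string to every module URL.
requirejs.config({
    urlArgs: 'bust=' + (new Date()).getTime()
});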

Require JS files dynamically on runtime using webpack

I am trying to port a library from grunt/requirejs to webpack and stumbled upon a problem, that might be a game-breaker for this endeavor.
The library I try to port has a function, that loads and evaluates multiple modules - based on their filenames that we get from a config file - into our app. The code looks like this (coffee):
loadModules = (arrayOfFilePaths) ->
  new Promise (resolve) ->
    require arrayOfFilePaths, (ms...) ->
      for module in ms
        module ModuleAPI
      resolve()
The require here needs to be called at runtime and behave like it did with RequireJS. Webpack seems to only care about what happens in the "build process".
Is this something that webpack fundamentally doesn't care about? If so, can I still use RequireJS with it? What is a good solution for loading assets dynamically at runtime?
edit: loadModules can load modules that are not present at the build time of this library. They will be provided by the app that implements my library.
So I found that my requirement to load files at runtime that are only available at "app-compile-time", and not at "library-compile-time", is not easily met with webpack.
I will change the mechanism so that my library no longer requires the files itself, but is passed the required modules instead. Something tells me this is going to be the better API anyway.
edit to clarify:
Basically, instead of:
# in my library
load = (path_to_file) ->
  (require path_to_file).do_something()

# in my app (using the 'compiled' library)
cool_library.load("file_that_exists_in_my_app")
I do this:
# in my library
load = (module) ->
  module.do_something()

# in my app (using the 'compiled' library)
module = require("file_that_exists_in_my_app")
cool_library.load(module)
The first code worked in require.js but not in webpack.
In hindsight, I feel it's pretty wrong to have a third-party library load files at runtime anyway.
There is a concept named context (http://webpack.github.io/docs/context.html), which allows you to make dynamic requires.
There is also the possibility to define code split points: http://webpack.github.io/docs/code-splitting.html
function loadInContext(filename) {
    return new Promise(function (resolve) {
        require(['./' + filename], resolve);
    });
}

function loadModules(namesInContext) {
    return Promise.all(namesInContext.map(loadInContext));
}
And use it like the following:
loadModules(arrayOfFiles).then(function (modules) {
    modules.forEach(function (module) {
        module(moduleAPI);
    });
});
But this is likely not what you need - you will have a lot of chunks instead of one bundle with all required modules, which would probably not be optimal.
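For reference, a minimal sketch of the context approach mentioned above, assuming the loadable files live under a hypothetical ./modules directory:
// A sketch of webpack's require.context: everything under ./modules is bundled
// at build time, and individual modules are picked by name at runtime.
var context = require.context('./modules', false, /\.js$/);

function loadByName(name) {
    return context('./' + name + '.js');
}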
It is better to define the module requires in your config file and include it in your build:
// modulesConfig.js
module.exports = [
    require(...),
    ....
];

// run.js
require('modulesConfig').forEach(function (module) {
    module(moduleAPI);
});
You can also try using a library such as this: https://github.com/Venryx/webpack-runtime-require
Disclaimer: I'm its developer. I wrote it because I was also frustrated with the inability to freely access module contents at runtime. (in my case, for testing from the console)

Basic requirejs concept needed to make the musicjson package work

I'd like to use the musicjson.js package, which helps convert MusicXML files into JSON notation. I'm checking whether it's a good way to import, for example, MusicXML scores exported from Finale into the browser, to play with the Fermata/VexFlow classes.
https://github.com/saebekassebil/musicjson
The thing is that this module works with require (calling Node packages like fs), and I'm just a newbie with requirejs... Even though I spent some time on the tutorial on the website, I still don't get how to solve this kind of basic problem, where the dependencies of musicjson.js need to be called like:
var xmldom = require('flat-xmldom'),
    fs = require('fs'),
    path = require('path'),
    util = require('util');
My index.php page does the classic require call:
<!DOCTYPE html>
<html>
<head>
    <!-- javascript head -->
    <!-- REQUIRE -->
    <script data-main="scripts/main" src="bower_components/requirejs/require.js"></script>
</head>
<body>
</body>
</html>
In my scripts/main.js, I'd simply like to do what the musicjson readme shows:
var music = require('musicjson');
music.musicJSON(xml, function (err, json) {
    // Do something with the MusicJSON data
});
I also put the flat-xmldom folder, fs.js, path.js and util.js in the same scripts/ directory.
When I do this, I just get this classic error:
Error: Module name "musicjson" has not been loaded yet for context: _. Use require([])
That looks like a common error referenced on the requirejs website, but when I try what I guess should be written, I get a bit lost trying to determine where the fundamental conceptual mistake is here:
require.config({
    baseUrl: '/scripts/',
    paths: {
        flatxmldom: './flat-xmldom/__package__',
        fs: 'fs',
        path: 'path',
        util: 'util',
        musicjson: 'musicjson'
    }
});
require(['flatxmldom', 'fs', 'path', 'util', 'musicjson'], function (flatxmldom, fs, path, util, musicjson) {});
The error returned in this case, for example:
Module name "fs" has not been loaded yet for context: _. Use require([])
Thanks a lot for your attention.
So, this is not a RequireJS problem per se. The package you want to use is a Node.js package. It is intended to run in Node (a server/desktop execution environment for JavaScript). It cannot/will not run in a web page in a browser.
The packages it is trying to use (fs in particular) provide access to system resources such as the file system. Node provides these packages as part of its core libraries to any package that runs in Node. A browser, for security reasons, is specifically designed to never allow direct access to such resources to any code that runs in it, because who knows where that code came from or what it might try to do.
I haven't really tried to do this myself, but browserify (an alternative to requirejs) claims that it will allow you to use any node package in your application.
If musicjson is at the heart of what your app should achieve and requirejs is a small step on the way to getting there, you could try your luck with browserify instead.
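As a rough sketch of the Browserify route: with main.js as your entry point, you would bundle it with "browserify main.js -o bundle.js" and include bundle.js in your page. Keep in mind that Browserify stubs out fs with an empty module, so any code in the package that actually touches the file system will still fail in the browser; only the in-memory conversion API has a chance of working. The xmlString variable below is hypothetical.
// main.js - bundled with Browserify.
// 'xmlString' is a hypothetical MusicXML string you already have in the page.
var music = require('musicjson');

music.musicJSON(xmlString, function (err, json) {
    if (err) { return console.error(err); }
    console.log(json); // Do something with the MusicJSON data
});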

Is there a way to lazily set the path of a resource with RequireJS?

So, I have an app that is using requireJS. Quite happily. For the most part.
This app makes use of Socket.IO. Socket.IO is being provided by nodejs, and does not run on the same port as the main webserver.
To deal with this, in our main js file, we do something like this:
var hostname = window.location.hostname;
var socketIoPath = "http://" + hostname + ":3000/socket.io/socket.io";
requirejs.config({
    baseUrl: "/",
    paths: {
        app: "scripts/appapp",
        "socket.io": socketIoPath
    }
});
More complicated than this, but you get the gist.
Now, in interactive mode, this works swimmingly.
The ugliness starts when we try to use r.js to compile this (technically we're using grunt to run r.js, but that's beside the point).
In the config for r.js, we set an empty path for socket.io (to avoid it failing to pull in), and we set our main file as the mainConfigFile.
The compiler yells about this, saying:
Running "requirejs:dist" (requirejs) task
>> Error: Error: The config in mainConfigFile /…/client.js cannot be used because it cannot be evaluated correctly while running in the optimizer. Try only using a config that is also valid JSON, or do not use mainConfigFile and instead copy the config values needed into a build file or command line arguments given to the optimizer.
>> at Function.build.createConfig (/…/r.js:23636:23)
Now, as near as I can figure, this is due to the fact that I'm using a variable to set the path for "socket.io". If I take this out, require runs great, but I can't run the raw (unbuilt) code from a server. If I leave it in, my debug server is happy, but the build breaks.
Is there a way that I can lazily assign the path of "socket.io" at runtime so that it doesn't have to go into the requirejs.config() method at that point?
Edit: Did some extensive research on this. Here are the results.
Loading from CDN with RequireJS is possible with a build. However, if you're using the smaller Almond loader, it's not possible.
This leaves you with a few options:
1. Use almond along with a local copy of the file in your build.
2. Use the full require.js loader and try to use a CDN.
3. Use a <script> tag just for that resource.
I say try for #2 because there are some caveats. You'll need to include require.js in your HTML with the data-main attribute for your built file. But if you do this, require and define will be global functions, allowing users to require any of your internal modules and mess around with them. If you're okay with this, you'll need to follow the "empty: scheme" in your build config (but not in your main config).
But the fact remains that you now have another HTTP request. If you only want one built file, which includes the require.js loader, you'll need to optimize for only one file.
Now, if you want to avoid users being able to require your modules, you'll have to do something like wrap:true in your build. But as far as I can tell, once your module comes down from CDN, if it's AMD, it's going to look for a global define function to register itself with, and that won't exist because it's now wrapped in a closure.
The lesson I took away from all this: inline your resources to your build. It makes sense. You reduce HTTP requests, minify it all and get gzip compression. You don't expose your modules to the world and everything is a lot simpler. If you cache your resources properly you won't even need to worry about it.
But since new versions of socket.io don't like AMD, here's how I did it. Make sure to include the socket.io <script> tag before requirejs. Then create a requirejs module named socket.io with the following contents:
define([], function () {
    var io = window.io;
    window.io = null;
    return io;
});
Set the path like so: 'socket.io': 'core/socket.io' or wherever you want.
And require it as normal! The build works fine this way.
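Consuming the shim then looks roughly like this (the port and event name here are just placeholders):
// A sketch of using the shimmed module; io comes from the socket.io <script> tag.
require(['socket.io'], function (io) {
    var socket = io.connect('http://' + window.location.hostname + ':3000');
    socket.on('connect', function () {
        console.log('connected');
    });
});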
Original answer
Is it possible that you could make use of the path config fallbacks specified in the RequireJS API? Maybe you could save the file locally as a fallback so your build will work.
The socket.io GitHub repository specifies that you can serve the client with the files in the socket.io-client package's dist/ directory.
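A minimal sketch of that fallback idea, assuming the client file is saved locally at scripts/lib/socket.io (a hypothetical path):
// RequireJS tries the runtime URL first and falls back to the local copy.
requirejs.config({
    paths: {
        "socket.io": [
            "http://" + window.location.hostname + ":3000/socket.io/socket.io",
            "scripts/lib/socket.io"
        ]
    }
});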

Refactoring a userscript to use javascript modules

I'm working on a userscript - in particular this userscript - which has been designed to encapsulate functionality in modules. In order to be able to do some automated testing I would like to split the modules into their own files and use node.js's module exporting and require functions to combine into one file for use in Greasemonkey or simple browser extensions.
My first thought was to just copy the modules into their own files as such
module.js
var exportedModule = (function () {
    var Module = {
        // public functions and members
    };
    // private functions and members
    return Module;
}());
module.exports = exports = exportedModule;
And then have a central file that requires each of these modules, perhaps compiling them with something like Browserify.
script.js
var importedModule = require('./module');
importedModule.init();
Is this possible?
It seems to me that you would be better off using RequireJS, which uses AMD-style modules and is inherently more browser-friendly. Node's CommonJS-style modules are synchronous and do not fit the browser model very well.
Of course, using requirejs will change your scripts a bit.
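For example, the module above rewritten in AMD style might look roughly like this:
// module.js as an AMD module (a sketch of the requirejs-style rewrite)
define([], function () {
    var Module = {
        // public functions and members
    };
    // private functions and members
    return Module;
});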
It's possible, and Browserify makes it easy:
browserify src/my.user.js -o dist/my.user.js
The metadata in the source file may get moved, but it's still parsed correctly (by Greasemonkey at least).
For a more complex example which compiles various assets, including CSS and images, see here.
