I am trying to install a module for Node.js.
I have looked for tutorials on installing modules, but only managed to find quick methods for installing via the command prompt.
I need to install the following module: https://gist.github.com/1510680 (which consists of JavaScript files). But as I'm new to Node and all, I have no idea how to do so.
I have been searching for quite some time and would really appreciate it if someone could provide an extremely layman, dummy-proof, step-by-step guide to 'install' this module (I have never installed any module before; I only recently stumbled into the world of Node.js and am still exploring). Help is appreciated, and thanks so much.
Take a look at the node API docs. I've linked directly to the section on modules, and it includes most of what you need to know.
In a nutshell, though, you include external scripts by calling require(...). If the JS file you want is already in CommonJS form, you can simply save the file in your project folder and use it by assigning the result of a require call to a local variable. For example, to use the SimpleCluster module you linked to, you would add the following line to your main script:
var SimpleCluster = require('./SimpleCluster.js')
The above will load and evaluate the file "SimpleCluster.js" found in the same directory as your script, and anything that it exports will be accessible through your local SimpleCluster variable. Since the gist that you linked to is in CommonJS format, the above should work for you without needing to modify SimpleCluster.js.
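To make the mechanism concrete, here is a minimal sketch (the file greet.js and its function are made up for illustration; they are not part of the gist): whatever a CommonJS module assigns to module.exports is exactly what require() hands back.

// greet.js -- assigns a function to module.exports
module.exports = function greet(name) {
  return 'Hello, ' + name + '!';
};

// main.js -- in the same directory as greet.js
var greet = require('./greet.js');
console.log(greet('node')); // prints "Hello, node!"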
If you need more information, I'd recommend starting with nodebeginner.org. While the above advice should be sufficient to get you started, it can be brittle. If you go through Manuel Kiessling's little book (and you can read all of it online, though I'm sure he'd appreciate it if you bought it), you'll get an intro to node's package manager, NPM, which is the preferred way of handling dependencies.
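For packages published to the npm registry, the flow is similar except that npm downloads the files for you and you require the package by name (express below is just a well-known example, not something this gist needs):

npm install express

// then, in your script:
var express = require('express');
var app = express();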
I was wondering if anyone here had some ideas or experience getting rid of long, hard-to-maintain relative imports inside of large node.js cloud functions projects. We’ve found that the approach which uses local NPM packages is very sub-optimal because of how quickly we tend to roll out and test new packages and functionality, and refactoring from JS to TS is impossible for us at the moment. We'd love to do it in the near future but are so slammed as it is currently :(
Basically what I'm trying to do in cloud functions is go from
const {helperFunction} = require('../../../../../../helpers')
to
const {helperFunction} = require('helpers')
I have been unable to get Babel or anything similar to that working in cloud functions. Intuitively I feel like there is an obvious solution to this beyond something like Artifact Registry or local NPM packages, but I'm not seeing it! Any help would be greatly, greatly appreciated :)
For CommonJS modules in general, you have several options, listed below. I don't know the Google Cloud environment, so you will have to decide which options seem appropriate to it.
The most attractive option to me is #5 as it's just a built-in search up the directory tree, designed specifically to look in the node_modules sub-directories of your parent directories. If you can install the shared modules in that way, then nodejs should be able to find them as long as your directory hierarchy is retained by Google Cloud.
Options #3 and #4 are hacking on the loader, which has its own risks, but does give you implementation flexibility, as you could implement your own prefix that looks in a particular spot. But it's hacking and may or may not work in the Google Cloud environment.
Options #1 and #2 rely on environment variables and shared directories which may or may not be relevant in the Google cloud environment.
1. You can specify the environment variable NODE_PATH as a colon-delimited (or semicolon-delimited on Windows) list of paths to search for modules. Doc here.
2. In addition, nodejs will search $HOME/.node_modules and $HOME/.node_libraries, where $HOME is the user's home directory.
3. There is a package called module-alias here that is designed specifically to help you solve this. You define module aliases in your package.json, import this one module, and then you can use the directory aliases in your require() statements (see the sketch at the end of this answer).
4. You can make your own pre-processor for resolving a module filename that is being loaded by require. You can do this by monkey patching Module._resolveFilename to either modify the filename passed to it or to add additional search paths to the options argument. This is the general concept that the module-alias package (mentioned in point #3 above) uses.
5. If the actual location of the helper module you want to load is in a node_modules directory somewhere above your current module directory on this volume, it can be found automatically as long as the require is just a filename, as in require("helpers"). An example in the doc here describes this:
For example, if the file at /home/ry/projects/foo.js called require('bar.js'), then Node.js would look in the following locations, in this order:
/home/ry/projects/node_modules/bar.js
/home/ry/node_modules/bar.js
/home/node_modules/bar.js
/node_modules/bar.js
You can see how it automatically searches up the directory tree, looking in each parent node_modules sub-directory all the way up to the root. If you put these common, shared modules in your main project file's node_modules or in any directory above, they will be found automatically. This might be the simplest approach, as it is just a matter of directory structuring and removes all the ../../ segments from your paths. Clearly these common, shared modules are already located somewhere shared; you just need to make sure they're in this search hierarchy so they can be found automatically.
Note this info is for CommonJS modules and may be different for ESM modules.
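To make option #3 concrete, here is a minimal sketch of the module-alias setup. The "./functions/helpers" path and the plain "helpers" alias are assumptions chosen to match the require('helpers') call you're after; adjust them to wherever your helpers module actually lives.

In package.json, add the alias map that module-alias reads:

"_moduleAliases": {
  "helpers": "./functions/helpers"
}

Then register the aliases once, at the very top of your entry point, before anything else is required:

// index.js
require('module-alias/register');

// after registration the deep relative paths are no longer needed
const { helperFunction } = require('helpers');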
I have been looking for a way to share some code between projects, specifically for SPFX and Fluent UI. We found 3 main ways to do that.
1.
Creating a component library is the way that seemed least complicated, because it uses the same infrastructure and does all the building without the need to configure it.
But this adds some issues: we need to build and manually link the solution locally to make it work. This will also work if we put it in a repo, so this is mitigated.
The second issue is that this will also implicitly require Fluent UI and React, plus we have to place it inside an SPFX component library project.
2.
I saw some promise using paths in TS, and this works fine while using the TS compiler. It will go to the folder that your project is referring to and build it at compile time, which is great. But it did not work in SPFX.
3.
Another way was to have a post-install script to sync the folders, which seems easy enough, but I wonder how practical this is, plus how people are doing it, if they are.
All I want to figure out now is a way to take my component code and share it as if it were in a folder of my src, or a simple extension of the code. No need to have extra dependencies or build steps, just code that can be used as a ts/tsx file. For example:
shared lib:
// assuming I have React and Fluent UI already installed for the project
import * as React from 'react';
import { DefaultButton as Button } from '@fluentui/react';

export const fancyCustomButton = (props) => {
  return (<Button text="Standard" />);
};
src project folder:
import { fancyCustomButton } from 'shared-lib';
It is fine if it needs to build the files before we can use them, but can we do it at build time or when the package is installed? Also, wouldn't it increase my bundle size by making both modules dependent on things already available (React, Fluent UI)?
Given the way Microsoft have architected the loading of bundles in SharePoint and Teams - I believe an SPFX component library is the best way to share code between different solutions, particularly if you are looking to minimise bundle size...
Imagine you have a library for something re-usable: a form, a set of standard branded components, something of that nature. You could put your shared code in repos and add references to it, either by publishing your own repo publicly or using the npm install git+https://yourUrl syntax; but what effectively happens there is that your code is pulled down into node_modules for each project, and any referenced module code is included in your bundles. If you have two, three, four or more web parts on the same page using those same libraries, you're multiplying how many times that code is included on the page.
If you follow Microsoft's guide on setting up a component library project however, your npm link commands allow your types to be recognised in consuming projects without needing to actually include the bundled distribution code. You can deploy your library code once to the App Catalog, and if it's referenced in other solutions -- it's loaded on pages as needed: once.
I have found the development experience to be quite flaky at times, but it does work. When I run gulp clean on my library code, or come back to it after some time, I sometimes find that I need to run npm link and npm link my-project-name again as per the instructions in the above tutorial. Whenever you run gulp build on your library, you should also rebuild the project that consumes the library, either by using gulp build / bundle or by saving a file (if you're running gulp serve). This process works well for developing, and once it comes time to deploy, all you need to do is add a named reference to your library inside package.json and then deploy both .sppkg files to your App Catalog.
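For reference, the linking itself boils down to two commands; my-library-name below is a placeholder for whatever name is in your library's package.json:

# in the library project's root folder
npm link

# in the consuming web part project's root folder
npm link my-library-name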
In terms of your last question re: bundle size - react is not actually included in the dependencies for an SPFX library project, but you will find it's available to use. When you build your library, if you take a look in the generated javascript in your dist folder, you will see it's listed as one of the dependencies for the webpacked content along with react-dom and ControlStrings. It's on line 1.
office-ui-fabric-react is included as a devDependency thanks to the @microsoft/sp-webpart-workbench package that gets scaffolded with all SPFX projects, and if you check your library's dist javascript, you will see that included components are being webpacked and included in your bundle. I'm not entirely sure whether, when you pull this code into your consuming project, webpack then tree-shakes to de-duplicate and ensure only necessary code is included. Someone else may be able to illuminate us there or provide a more accurate explanation of what's going on... I will update this response if anyone comments to let me know.
And finally, this is more of a personal way of working, but it may be worth consideration:
When developing a library, I sometimes reference it in other projects via a local npm install ../filepath command. This ensures that when I install the library as described, the consuming project installs any necessary dependencies. I'm able to tweak both projects if I need to. When it comes time to deploy, I commit my changes to both projects, deploy my library code to the App Catalog, and then npm uninstall the library from the consuming project and add a reference as described in the above tutorial. When I deploy projects that use my library, they just work.
I recently developed a library that uses pnpjs, in particular the @pnp/sp library that is used to talk to SharePoint. If you look at the Getting Started guide for that library, they expect you to pass a reference to your Application Customizer or Web Part context during setup, or explicitly set things up using a base URL and so forth - and of course, a library doesn't really have a page context of any sort - the whole point of this code is that it's reusable. So that poses a challenge. My solution was to do the setup in the consuming web part(s) and ensure that they pass a reference to the sp object (which is of type SPRest) to any code or components that exist in my library. My library has peerDependencies on those pnp libraries so that the code isn't duplicated in consuming projects. Sometimes you have to think about what your library needs to include in its bundle and what you expect consuming solutions to already have, and maybe find ways to ensure things aren't included that aren't needed.
For example, in the scenario you talk about, you may want to ensure fluentui or office-ui-fabric-react are only devDependencies or peerDependencies for your library. As long as your library and the project(s) consuming your library both use the right version(s) you shouldn't have any trouble, and you can document any pre-requisites with your library documentation. You can check which versions of these libraries are installed per the SPFX version you are currently using ie. SPFX v1.11 or v1.12 etc. Just run npm ls <packagename> to get a breakdown, or check your package.json file.
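As a sketch, the relevant section of the library's package.json would look something like this; the version ranges here are placeholders, so match them to what your SPFX version actually ships with:

"peerDependencies": {
  "react": "16.x",
  "office-ui-fabric-react": "7.x"
}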
So I am using the following tutorial. I can access localhost, but I am only putting in an HTML file. The tutorial does not go into depth about setting up a JavaScript file and getting that file to transpile. Can someone help me with the setup? I have googled this topic and I keep coming across tutorials that talk about needing Babel for the transpiling, yet the tutorial says
Now that everything’s ready to go, we can get webpack to watch for
code changes in our directory and transpile.
But I feel like this means it should just bundle up everything in my project since webpack is a bundling product. Further, I am confused by the call to webpack:
CMD webpack --watch --watch-polling
Does this mean I don't need to put in an entry point and an output? I thought with webpack I needed to provide those two details, but the tutorial again just calls webpack directly with polling.
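For reference, my understanding is that webpack --watch still needs an entry point and an output; it usually reads them from a webpack.config.js in the project root (or falls back to webpack's defaults if there is none). A minimal config looks something like this, where the file names and paths are placeholders and not taken from the tutorial:

// webpack.config.js -- minimal sketch, paths are placeholders
const path = require('path');

module.exports = {
  entry: './src/index.js',                 // where bundling starts
  output: {
    path: path.resolve(__dirname, 'wwwroot/dist'),
    filename: 'bundle.js'                  // the bundled result
  },
  module: {
    rules: [
      {
        test: /\.js$/,                     // run .js files through Babel
        exclude: /node_modules/,
        use: 'babel-loader'
      }
    ]
  }
};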
Thanks in advance for any help getting my structure set up so I can run webpack and React in Docker.
In terms of setting up the files, I had to add
COPY wwwroot ./wwwroot
just to get the project to display something at localhost. But I want files to be brought over automatically if possible, and I am not sure of the intent of the tutorial with regard to file structure.
I'm using a package that loads Font Awesome from a CDN. However, the CDN used is pretty terrible, often stalling or timing out, so I'd like to switch it out.
The source code is incredibly simple, and looking through it I found this:
style.href = '//maxcdn.bootstrapcdn.com/font-awesome/4.3.0/css/font-awesome.min.css';
Switching to a different CDN would only require changing this one line (I presume), but the problem is I don't know how! Or rather, I don't know where it is available in my project.
How do I actually do this?
Create a packages/ directory at the top level directory of your app (myapp/packages) and simply git clone the package you want to modify into it.
Add the package via meteor add and you'll be able to edit the files of the package you just cloned.
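Once the package is cloned into packages/ and added, the actual fix is the one-line change you already identified, for example pointing it at another CDN that hosts the same Font Awesome 4.3.0 build (cdnjs here is just one option):

style.href = '//cdnjs.cloudflare.com/ajax/libs/font-awesome/4.3.0/css/font-awesome.min.css';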
I am in the process of converting an existing Rails 3.1 app I made for a client into a Backbone.js app with the Rails app only as a backend server extension. This is only a personal project of mine, to learn more about Backbone.js.
While setting up Backbone.js (using Backbone-on-Rails), I noticed I have some dependencies (like backbone-forms) that come from external sources and are frequently updated.
I've grown accustomed to using Bundler to manage my Ruby gems, but I haven't found anything similar for JavaScript files. I'm wondering if there is any way to do the same for Javascript (and possibly css) files.
Basically I can see three possibilities to solve this issue:
Simply write down all the sources for each JS file and check these sources from time to time to see what has changed.
Use some kind of existing "Bundler for Javascript" type of tool, I've been looking for something like this but have yet to find anything (good).
Since most of these JS files will be coming from Git anyway, use Git to get the files directly and use checkout to get the latest version from time to time.
I prefer the last option, but was hoping for some more input from other people who have gone this route or preferred some other way to tackle this issue (or is this even an issue?).
I figure the Git way seems easy, but I am not quite sure yet how I could make this work nicely with Rails 3.1 and Sprockets. I guess I'd try to checkout a single file using Git and have it be cloned in a directory that is accessible to Sprockets, but I haven't tried this yet.
Any thoughts?
You don't mention it in your alternatives, but ideally you should use something like Maven to manage your dependencies. Unfortunately, there are no public repositories for javascript files. This discussion lists some other options which might be of help to you: JQuery Availability on Maven Repositories
For now I've settled on using the Git solution combined with some guard-shell magic.
The steps I follow:
Create a dependencies directory somewhere on your local drive
Clone the repositories with javascript (or css) files you want to use in the app
Set up a custom guard-shell command to do the following:
group 'dependencies' do
  guard 'shell' do
    dependencies = '~/path/to/dependencies/'
    watch(%r{backbone-forms/src/(backbone\-forms\.js)}) {|m| `cp #{dependencies + m[0]} vendor/assets/javascripts/#{m[1]}` }
  end
end
Place the Guardfile at the root of the app directory
It takes some time to set things up, but after that, when you have the Guard running and you pull changes into your dependencies, the required files are automatically copied to your application directory, where they become part of your repository.
It seems to work great. You need to do some work for each new file you want to include in the asset pipeline, but all that is required is cloning the repository into your dependencies directory and adding a single line to your Guardfile, for example for the backbone-forms CSS:
watch(%r{backbone-forms/src/(backbone\-forms\.css)}) {|m| `cp #{dependencies + m[0]} vendor/assets/stylesheets/#{m[1]}` }
Also, the reason I added this Guard to a group is because I keep my dependencies outside the main application directory, which means guard normally doesn't check my dependencies directory. To make this work, I start up my main Guard processes using bundle exec guard -g main and use bundle exec guard -w ~/path/to/dependencies -g dependencies in a new terminal window/tab to specify the -w(atchdir).