How do I configure .eslintrc for a project laid out like this example?
user
└── projectA
    ├── index.html
    └── lib
        ├── .eslintrc
        ├── main.js
        └── main_lib.js
The HTML file includes both JS files. How does .eslintrc need to be configured to properly handle the "function is not defined" and "defined but not used" errors?
Two rules are relevant to your question:
For "defined but not used", it's the no-unused-vars rule.
For "function is not defined", if you're trying to use a global function declared in an external script, you can declare it in the globals section of .eslintrc, like "globals": { "yourFunctionName": "readonly" }, to make ESLint aware that this function is not declared by you but is still available from somewhere.
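Putting both pieces together, a minimal .eslintrc sketch could look like this (the function name and the warning level are placeholders, adjust them to your project):

```json
{
  "rules": {
    "no-unused-vars": ["warn", { "args": "after-used" }]
  },
  "globals": {
    "yourFunctionName": "readonly"
  }
}
```

With the global declared as "readonly", ESLint stops reporting it as undefined and additionally flags any accidental reassignment of it.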
This seems like a dumb question, but I struggle to find the answer.
The situation
Here is my folder structure:
myProject/
├── module1/
│   ├── config.ts
│   └── init.ts  # symlink
├── module2/
│   ├── config.ts
│   └── init.ts  # symlink
└── symlinks/
    └── init.ts  # the real, non-duplicated file
The file init.ts imports local files like so:
import config from './config'
// ...
The problem
The problem is that TypeScript throws Cannot find module './config' or its corresponding type declarations.
I tried playing with the TypeScript option preserveSymlinks, but it didn't solve my problem.
I know about other ways to achieve my goal, but they are overkill for just one file, and they don't solve relying on a relative path (like creating an npm module, creating a function and passing the relative file content as a parameter, or even generating files at runtime...).
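For reference, the option mentioned above is set in tsconfig.json like this (a sketch; as noted, on its own it did not solve the relative-import problem in this layout):

```json
{
  "compilerOptions": {
    "preserveSymlinks": true
  }
}
```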
=> I am working with TypeScript in a monorepo.
Is it possible to use symlinks this way? If not, are there other (simple) alternatives?
Assuming that I have a monorepo with structure like this:
├── packages
│   └── foo
│       ├── a.js
│       ├── index.js
│       ├── index.d.ts
│       └── package.json
└── tsconfig.json
foo is a legacy package written in JavaScript, and its index.js exports all the objects that may be imported by other packages.
Recently, I wrote a declaration file index.d.ts for this package, which looks like:
foo/index.d.ts
declare module 'foo' {
// export all available object here
export const something: SomeThing
...
}
Now, I want to add a file b.ts under package foo:
foo/b.ts
import { something } from './a' // error: can not find the declaration file of the module ...
As I expected, this raises an error, even though I have made the type declaration in index.d.ts.
Through my attempts, I found that adding an a.d.ts file solves this problem:
foo/a.d.ts
export const something: SomeThing
Besides, I can also solve it by not using a relative import:
foo/b.ts
import { something } from 'foo'
However, this solution looks strange, and I am not sure whether it would cause a circular module dependency.
Is there any other way to reuse the type declarations in index.d.ts with minimal changes?
Change your tsconfig.json to add an include section, like this:
{
"include": ["foo/**/*"]
}
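If the legacy JavaScript sources should also be part of the compilation, a slightly fuller sketch could look like this (the allowJs flag is an assumption, only needed when the .js files themselves must be loaded by the compiler):

```json
{
  "compilerOptions": {
    "allowJs": true
  },
  "include": ["foo/**/*"]
}
```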
Recently, I've been supporting a Laravel project that uses Laravel Mix as its front-end asset bundler (CSS and JS). I noticed that, in the main JS file, the developer used some require() calls at the start, and those require() calls referred to other JS files located in the same directory; those JS files are "normal" browser-friendly scripts (like jQuery, Bootstrap, GSAP, etc.). After those require() calls, the developer used normal ES6 import statements referring to Node.js dependencies and other modules within the project.
To better explain what I'm trying to show...
File tree:
.
└── src
    ├── bootstrap.js
    ├── components
    │   ├── AnotherFeature.js
    │   ├── Carousel.js
    │   └── Something.js
    ├── jquery.js
    └── main.js
The main.js file...
require("jquery");
require("bootstrap");
import {AnotherFeature} from "./components/AnotherFeature";
import {Carousel} from "./components/Carousel";
import {Something} from "./components/Something";
class Main {
constructor() {
new AnotherFeature();
new Carousel();
new Something();
}
}
new Main();
And, even using Laravel Mix (which uses webpack under the hood), jQuery, $, and bootstrap are all available in the browser's console.
My question is: how can I do something like this outside Laravel Mix? I'm working on a project that uses JS, some third-party resources aren't ES6 modules, and I can't find a clean way to use them in my project. The way Laravel Mix "imports" extra JS files would fix my issues while keeping everything simple and bundled into one JS file.
You can find my project on my GitHub. It is a WordPress starter template that people at my job want to use, and this method of "requiring" and "importing" things would be very helpful.
Thanks!
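Outside Laravel Mix, plain webpack can produce the same result. A minimal webpack.config.js sketch (file paths are taken from the tree above; the ProvidePlugin part is what makes legacy scripts that expect a global $ or jQuery work inside the bundle, which is essentially what Mix configures for you):

```javascript
// webpack.config.js — minimal sketch, not a drop-in replacement for Mix
const path = require('path');
const webpack = require('webpack');

module.exports = {
  entry: './src/main.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle.js'
  },
  plugins: [
    // Inject the jquery module wherever $ or jQuery is referenced,
    // so non-module scripts keep working once bundled
    new webpack.ProvidePlugin({
      $: 'jquery',
      jQuery: 'jquery'
    })
  ]
};
```

To additionally expose $ on window (so it is usable from the browser console, as with Mix), webpack's expose-loader is the usual tool.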
I am trying to write some JS modules which should be usable in both a browser and node.js environment.
To facilitate this I'm writing these modules as AMD modules, and using requirejs in the Node.js environment to provide the define() function as well as the ability to load these modules.
However, I've encountered some behavior I do not understand.
I made a SSCCE that illustrates the problem:
├── bar.js
├── node_modules
│   └── foo
│       ├── foo.js
│       ├── index.js
│       ├── node_modules
│       │   └── requirejs
│       │       ├── bin
│       │       │   └── r.js
│       │       ├── package.json
│       │       ├── README.md
│       │       └── require.js
│       └── package.json
└── test.js
foo is a Node module which wraps the AMD module foo.js:
node_modules/foo/foo.js
define([], function() {
return function() {
console.log("This is foo!");
};
});
node_modules/foo/index.js
var requirejs = require('requirejs');
requirejs.config({
baseUrl: __dirname,
nodeRequire: require
});
module.exports = requirejs('foo');
bar.js is another AMD module:
define([], function() {
return function() {
console.log("This is bar!");
};
});
test.js is a script that wants to use both foo and bar:
var requirejs = require('requirejs');
requirejs.config({
    baseUrl: __dirname,
    nodeRequire: require
});
var foo = require("foo");
foo();
var bar = requirejs("bar");
console.log("bar is:", bar);
If I run test.js, I get this:
This is foo!
bar is: undefined
/home/harmic/tmp/test_rjs/node_modules/foo/node_modules/requirejs/bin/r.js:393
throw err;
^
Error: Mismatched anonymous define() module: function () {
return function() {
console.log("This is bar!");
};
}
http://requirejs.org/docs/errors.html#mismatch
at makeError (/home/harmic/tmp/test_rjs/node_modules/foo/node_modules/requirejs/bin/r.js:418:17)
at intakeDefines (/home/harmic/tmp/test_rjs/node_modules/foo/node_modules/requirejs/bin/r.js:1501:36)
at null._onTimeout (/home/harmic/tmp/test_rjs/node_modules/foo/node_modules/requirejs/bin/r.js:1699:25)
at Timer.listOnTimeout (timers.js:119:15)
There are two things that don't make sense to me here.
In test.js, the call to requirejs("bar") returns undefined. In fact, the loading of the module appears to be deferred, as if there were some circular dependency going on, because the module definition for bar is only executed after the call has returned.
Why does RequireJS think this is an anonymous define()? My use case does not appear to meet the criteria at the given URL.
To suppress the second issue, I tried naming the define in bar.js, like this:
define('bar', [], function() {
...
That works in the sense that the exception is gone, but requirejs("bar") still returns undefined.
The example is a simplified version of what I am trying to do. Basically, I will have a number of modules containing some common components that can be used both in the browser and in Node, and some Node-specific components that will only be used in Node. There will be dependencies between the modules.
If there is some better way of doing this then I'm open to that also.
I've recreated the code and hierarchy you show in the question locally, but I'm not able to reproduce the exact error message you report. Looking at what you show in the question, I do not see how your code could result in that error message. When I run your code, the error I get is:
This is foo!
/tmp/t1/node_modules/requirejs/bin/r.js:2604
throw err;
^
Error: Tried loading "bar" at /tmp/t1/node_modules/foo/bar.js then tried node's require("bar") and it failed with error: Error: Cannot find module 'bar'
at /tmp/t1/node_modules/requirejs/bin/r.js:2597:27
Which is exactly what I would expect: RequireJS running in node will try to use Node's own require if it cannot find a module through its own machinery.
Now, the problem you ran into is not a problem with running RequireJS in Node but a problem with how you use RequireJS. You could run into the exact same issue in a browser. The problem is that you are running requirejs.config twice, without using contexts so one config overwrites the other. (RequireJS is able to merge configs but the configs you are using are such that they'll overwrite one another.) I can get the code to run by changing test.js so that:
The configuration of RequireJS uses a context value.
I save the return value of requirejs.config and use this to require modules.
The end result is:
var requirejs = require('requirejs');
var r = requirejs.config({
    context: "me",
    baseUrl: __dirname,
    nodeRequire: require
});
var foo = require("foo");
foo();
var bar = r("bar");
console.log("bar is:", bar);
Ideally, the code in index.js should also use a context.
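A sketch of what that could look like (the context name "foo" is an arbitrary choice; any name not shared with other configs works):

```javascript
// node_modules/foo/index.js — same wrapper as before, but with its own context
var requirejs = require('requirejs');
var r = requirejs.config({
    context: 'foo',       // isolates this config from the one in test.js
    baseUrl: __dirname,
    nodeRequire: require
});
module.exports = r('foo');
```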
This being said, I've written code that runs in Node and the browser as AMD modules for years. It is rare that I need to load AMD modules with RequireJS in Node. One rare case where I'd want to do it is if I'm testing how a module gets its configuration through module.config. I've never ever found the need to use amdefine. What I do use when I want to load AMD modules in Node is amd-loader (aka node-amd-loader). It just hooks into Node's module loading machinery to make it able to load AMD modules. I guess the downside of amd-loader is that projects that want to use your code then have to depend on having a loader installed like amd-loader.
In the end I did not get a satisfactory answer to this problem.
I have worked around it by using amdefine for these modules in the Node.js environment, instead of trying to use requirejs.
The main drawback of this approach is that you have to add some boilerplate at the top of every AMD module so that it loads amdefine when needed; other than that, amdefine is working for my use case at least.
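The boilerplate in question is the standard pattern from the amdefine README, shown here applied to the bar.js module from the question:

```javascript
// amdefine boilerplate: in Node, define() doesn't exist, so pull it in
// from the amdefine package; in an AMD loader, this line is a no-op
if (typeof define !== 'function') { var define = require('amdefine')(module); }

define([], function () {
    return function () {
        console.log("This is bar!");
    };
});
```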
I also found the UMD project, which provides some other alternatives.
My problem is the following. I have a JavaScript application that uses the so-called module pattern: I have multiple JS files (one per class), and during the build process all these files are concatenated into a single file and wrapped in an IIFE. So in my Karma config file I specify:
files: ['src/**/*.js', 'tests/**/*.js']
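For context, the build output described above looks roughly like this (the Foo namespace and the Module body are hypothetical illustrations, not the asker's actual code):

```javascript
// Sketch of the concatenated build output: the class files are wrapped
// in one IIFE so that nothing leaks into the global scope except the
// namespace object returned at the end
var Foo = (function () {
    // contents of src/Foo/module.js (hypothetical)
    function Module(name) {
        this.name = name;
    }
    Module.prototype.greet = function () {
        return 'Hello from ' + this.name;
    };
    // only the namespace escapes the IIFE
    return { Module: Module };
})();

console.log(new Foo.Module('Foo').greet()); // prints "Hello from Foo"
```

Because the unbuilt source files are loaded individually by Karma, the two Module classes end up in the same global scope, which is exactly the conflict described below.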
The problem arises because I need to use several "modules" in this app. Here is the example of the tree structure of the code:
├── karma_unit.conf.js
├── src
│   ├── Bar
│   │   └── module.js
│   └── Foo
│       └── module.js
└── tests
    └── unit
        ├── Bar
        │   └── test.js
        └── Foo
            └── test.js
So I have two Module classes at the same time. This is not a problem in the built code, but it is a problem for the unit tests, because the names conflict.
I know that I could have a separate config file for each such module and run the tests several times (once per config file), but this is very undesirable.
I also assumed that files are executed in their inclusion order, so I tried writing in the config file:
files: [
'src/Foo/*.js',
'tests/Foo/*.js',
'src/Bar/*.js',
'tests/Bar/*.js',
]
But this did not help.
So my question is: how can I handle this situation, where I'm forced to have several JavaScript classes with the same name in a single project, without running the tests several times or renaming the classes?
Thanks in advance.
This reference link details a solution for your problem:
http://karma-runner.github.io/0.8/plus/RequireJS.html
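In outline, that approach loads the sources as AMD modules through RequireJS, so each Module lives in its own module scope instead of the shared global one. A karma.conf.js sketch following the linked docs (the jasmine framework and the test-main.js bootstrap file name are assumptions):

```javascript
// karma.conf.js — sketch based on the karma-requirejs documentation
module.exports = function (config) {
  config.set({
    frameworks: ['jasmine', 'requirejs'],
    files: [
      'test-main.js',                               // RequireJS bootstrap
      { pattern: 'src/**/*.js', included: false },  // served, loaded on demand
      { pattern: 'tests/**/*.js', included: false }
    ]
  });
};
```

With included: false, Karma serves the files but does not execute them directly; RequireJS loads each one into its own module scope, so the two Module classes never collide.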