Sequelize sync doesn't create table in db - javascript

I have a project structure like this:
├── index.ts
├── package.json
├── package-lock.json
├── src
│   ├── controllers
│   ├── models
│   │   └── repository.ts
│   ├── routes
│   │   └── repository.ts
│   └── services
│       ├── minio.ts
│       └── sequelize.ts
└── tsconfig.json
// services/sequelize.ts
import { Sequelize } from 'sequelize'

const sequelizeConnection = new Sequelize("randomDb", "randomUser", "randomPassword", {
  host: "randomHost",
  dialect: "mysql"
})

export default sequelizeConnection
// models/user.ts
import { DataTypes } from 'sequelize'
import sequelizeConnection from '../services/sequelize'

const User = sequelizeConnection.define("user", {
  name: {
    type: DataTypes.STRING(30),
    allowNull: false
  }
})

export default User
// index.ts
import express from 'express'
import sequelize from './src/services/sequelize'

const app = express()

const startApp = async () => {
  try {
    await sequelize.sync()
    app.listen(3000, () => {
      console.log(`Listen on port 3000`)
    })
  } catch (error) {
    console.log("Synchronization error")
  }
}

startApp()
When I start the app, sequelize.sync() should create the "users" table, but for some reason it doesn't. I tried calling sync() directly on User and that worked, so the database connection isn't the problem.
await User.sync()
But in my case I have many more models, and I don't want to call sync() on each one individually.

I believe the issue here is just with your folder structure: the models directory should be on the same level as index.ts.
Documentation on the .sync() method isn't great, likely because migrations are the preferred approach for production applications (Reference). There may be a config attribute you can set somewhere to keep your current structure, though I dug around a little and couldn't find anything.
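Worth noting: sequelize.sync() can only create tables for models that have actually been registered, i.e. whose define() call ran because the model module was imported somewhere before sync(). The toy registry below mimics that behaviour (plain JavaScript, not the real Sequelize API; all names here are invented for illustration):

```javascript
// Toy stand-in for Sequelize's model registry: sync() only sees models
// whose define() call has actually executed.
const definedModels = [];

// Stand-in for sequelizeConnection.define(name, attributes):
function define(name) {
  definedModels.push(name);
  return { name };
}

// Stand-in for sequelize.sync(): one CREATE TABLE per registered model.
function sync() {
  return definedModels.map((name) => "CREATE TABLE " + name + "s");
}

console.log(sync()); // [] — no model module has been imported/defined yet
define("user");      // importing models/user.ts is what runs the real define()
console.log(sync()); // [ 'CREATE TABLE users' ]
```

So regardless of where the models directory sits, every model file needs to be imported (directly or via a barrel index file) before sync() runs.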

Related

Allow imports only from index file of folder, except inside the folder with import/no-internal-modules

Given the following folder structure:
my-project
├── components
│   ├── getUser.js
│   └── updateUser.js
└── utils
    ├── models
    │   ├── index.ts
    │   └── api.ts
    └── services
        ├── index.ts
        ├── api.ts
        └── price
            ├── index.ts
            └── getPrice.ts
I would like to make sure that every file outside of the services folder only imports from the services index file, like so:
import { getPrice } from 'utils/services';
but I would also like files inside the services folder to import from each other, for example that the api file can do import { getPrice } from './price';
I have this rule in the root .eslintrc
"import/no-internal-modules": [ "error", {
"forbid": [ "/**/util/services/*" ]
} ]
and inside the services folder I have another .eslintrc with this rule:
"import/no-internal-modules": [
"error",
{
"allow": [
"util/services/**"
]
}
]
But running this seems to do the exact opposite of what I want: right now it only gives errors for the files inside the services folder, saying for example Reaching to "./price/get" is not allowed on a file inside the services folder.
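For reference, both forbid and allow in import/no-internal-modules take minimatch globs matched against the import specifier. One detail worth double-checking (an observation from the tree above, not a verified fix): the folder in the tree is utils, while both globs say util, so neither pattern may be matching anything at all. A root-level sketch with the matching spelling:

```json
{
  "rules": {
    "import/no-internal-modules": ["error", {
      "forbid": ["**/utils/services/**"]
    }]
  }
}
```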

Cannot get basic index.js to work in simple mocha test project

I have a simple project laid out like:
project
│   package.json
│
├───productA
│   └───test
│           spec1.js
│
└───common
    │   index.js
    │
    └───sharedFunction1
            auth.js
            index.js
In spec1.js, I can successfully import a function from auth.js like this:
import { some_function } from "../../common/sharedFunction1/auth.js"
However, I thought that I should be able to use index.js files so that I can import like this:
import { some_function } from "../../common"
The function I am trying to import is something like (just a simple example):
import fs from 'fs';

export function get_file_listing(folder) {
  var files = fs.readdirSync(folder);
  return files;
}
and my index.js under sharedFunction1 looks like:
export * from './auth';
and my index.js under common looks like:
export * from "./sharedFunction1";
VSCode auto import in the spec file seems to think the import should be:
import { some_function } from "../../common";
But when I run my test with that I get:
Error [ERR_UNSUPPORTED_DIR_IMPORT]: Directory import ... is not supported resolving ES modules imported from ...
My package.json has:
"type": "module",
I have read a hundred different pages trying to understand what I am doing wrong, but am stuck. Can anyone tell me what I am doing wrong or if this should even work?
I found that I apparently need this file:
.mocharc.json
{
  "node-option": [
    "experimental-specifier-resolution=node"
  ]
}

Mocking a yaml file in jest

I have a yaml file that holds some config information, and I use it in a module that I want to test. When testing, I want to mock it so it only contains simplified, static data; that keeps the tests easy, and if the config changes I don't have to edit them. Here is what I tried so far:
// config/index.js
const yaml = require('js-yaml');
const fs = require('fs');
const path = require('path');

const filePath = path.join(__dirname, 'stuff.yaml');

module.exports = {
  getStuff() {
    return yaml.safeLoad(fs.readFileSync(filePath, 'utf8'));
  },
  setStuff(stuff) {
    fs.writeFile(filePath, yaml.safeDump(stuff), err => console.log(err));
  }
};
// test/config.test.js
const config = require('../config');

test('getStuff', () => {
  jest.mock('../config/stuff.yaml');
  expect(config.getStuff()).toEqual({/*..*/});
});
My file structure being:
project-root/
├── config/
│   ├── __mocks__/
│   │   └── stuff.yaml (the mock file)
│   ├── stuff.yaml (the real file)
│   └── index.js
└── test/
    └── config.test.js
But the test still returns the data from the real file. In short, I want to mock a text file on the file system so that any module reads the mock instead of the real one.
Note: I don't really care if the mock version is on the disk or I just have it as a string in memory. Having it in memory would even be beneficial in the sense of the tests being faster.
You can probably update your Jest configuration and leverage moduleNameMapper to handle this.
{
  "moduleNameMapper": {
    "config/stuff.yaml": "<rootDir>/config/__mocks__/stuff.yaml"
  }
}
You could also try setMock - https://facebook.github.io/jest/docs/en/jest-object.html#jestsetmockmodulename-moduleexports
jest.setMock('config/stuff.yaml', require('config/__mocks__/stuff.yaml'));

Node JS Cannot Get /

I am following this Node JS tutorial, but I have encountered a problem where I get the response Cannot GET / instead of Hello. Here is the structure for my code:
myapp/
├── app
│   └── routes
│       ├── index.js
│       └── note_routes.js
├── conf
│   ├── httpd-app.conf
│   └── httpd-prefix.conf
├── htdocs
├── node_modules
│   └── ...
├── package.json
├── package-lock.json
└── server.js
server.js looks like this:
const express = require('express');
const MongoClient = require('mongodb').MongoClient;
const bodyParser = require('body-parser');

const app = express();
const port = 3000;

require('./app/routes')(app, {});

app.listen(port, () => {
  console.log('We are live on ' + port);
});
index.js is this:
const noteRoutes = require('./note_routes');

module.exports = function(app, db) {
  noteRoutes(app, db);
  // Other route groups could go here, in the future
};
And note_routes.js looks like this:
module.exports = function(app, db) {
  app.post('/notes', (req, res) => {
    // You'll create your note here.
    res.send('Hello')
  });
};
As can be seen, this is a very simple setup. According to the above-linked tutorial, this should be enough and should respond with Hello. However, this is not the case.
I am almost 100% sure that the port is correct, because I get a response other than a 503.
What's wrong here, and how can I get this to work as intended?
Your app.post() in note_routes should be app.get(), or if you plan on making POST requests as well you can chain both handlers on one route:
app.route('/notes')
  .get((req, res) => res.send('Hello'))
  .post((req, res) => { /* create the note */ });
This is wrong on a whole new kind of level. I would imagine you are starting your server with node server.js.
Firstly, server.js does not require index.js anywhere.
note_routes should contain .get(), NOT .post().
In your browser you should be visiting http://localhost:3000/notes, NOT plain old http://localhost:3000.
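The mechanics can be sketched without Express at all: handlers are stored per method-and-path, so a handler registered for POST /notes is never consulted when the browser issues a GET (a plain-JavaScript stand-in, not the Express API):

```javascript
// Minimal method-aware router: handlers keyed by "METHOD path".
const routes = {};

function register(method, path, handler) {
  routes[method + " " + path] = handler;
}

function dispatch(method, path) {
  const handler = routes[method + " " + path];
  // Express responds similarly ("Cannot GET /") when no route matches.
  return handler ? handler() : "Cannot " + method + " " + path;
}

// note_routes.js registers only a POST handler:
register("POST", "/notes", () => "Hello");

console.log(dispatch("POST", "/notes")); // 'Hello'
console.log(dispatch("GET", "/notes"));  // 'Cannot GET /notes' — what the
                                         // browser's plain GET actually hits
```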

Intern path error in browser client

I have a conventional recommended Intern directory structure:
MyProject
├── node_modules
│   └── intern-geezer
│       └── client.html
├── src
│   └── myFunction.js
└── tests
    ├── intern.js
    └── unit
        └── ps.js
with a very simple config:
useLoader: {
  'host-node': 'dojo/dojo',
  'host-browser': 'node_modules/dojo/dojo.js'
},
loader: {
  packages: []
},
suites: [ 'tests/unit/ps' ]
and tests:
define(function (require) {
  var tdd = require('intern!tdd');
  var assert = require('intern/chai!assert');
  // Global function to test, not an AMD module
  var parseF = require('src/myFunction.js');
  var he = require('tests/he');

  tdd.suite('My tests', function() {
    // etc
  });
});
but when I open the browser client, the loader looks for the test suite inside the intern-geezer directory.
I am not setting a baseUrl in the config (or in the browser URL). I didn't have this trouble going through the regular (non-geezer) intern tutorial. Since the baseUrl defaults to two directories up from client.html I don't see what I'm doing wrong. Thanks for any help. (Yes, I will need geezer for ancient IE. No, I do not want to rewrite the function I'm testing as an AMD module.)
The Intern loader doesn't know how to get to your tests because they haven't been registered as a package. When using the recommended directory structure, you'll want to also set up the recommended loader configuration so that Intern knows where to find your code and tests.
loader: {
  packages: [
    { name: 'app', location: 'src/' },
    { name: 'tests', location: 'tests/' }
  ]
}
Then, update your tests to correctly find the code you need to test.
// Global function to test, not an AMD module
var parseF = require('app/myFunction.js');
Now, Intern should be able to correctly find the code and tests.
