In a Node.js Sequelize model, how to obtain type information of attributes?

I have a working model with Postgres and Sequelize in Node.js. Say the model is Person and it has name and age fields. Now I want to dynamically inspect the model class and obtain information about its attributes, like their name and, most of all, their type.
Using Person.attributes
I get some information:
name:
 { type:
    { options: [Object],
      _binary: undefined,
      _length: 255 },
But as you can see, the type object does not say whether name is a varchar or a boolean.
Does anyone know how to get this information with Sequelize?

You can iterate over the rawAttributes of a Model:
for (let key in Model.rawAttributes) {
  console.log('Field: ', key); // this is the name of the field
  console.log('Type: ', Model.rawAttributes[key].type.key); // Sequelize type of the field
}
So the example for name being a Sequelize.STRING would be
Field: name
Type: STRING
Or you can do almost the same iteration, but instead of Model.rawAttributes[key].type.key you can use Model.rawAttributes[key].type.toSql(), which would generate this result:
Field: name
Type: VARCHAR(255)
EDIT
Accessing defaultValue of field:
Model.rawAttributes[field].defaultValue
Checking if field allows NULL values:
Model.rawAttributes[field].allowNull
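Putting those pieces together, a small helper that dumps the metadata for every attribute could look like this (just a sketch based on the properties above; Person stands in for any registered model):
function describeModel(Model) {
  // Collect name, Sequelize type, SQL type, default value and nullability per attribute.
  return Object.keys(Model.rawAttributes).map(field => {
    const attribute = Model.rawAttributes[field];
    return {
      field,                            // e.g. 'name'
      type: attribute.type.key,         // e.g. 'STRING'
      sqlType: attribute.type.toSql(),  // e.g. 'VARCHAR(255)'
      defaultValue: attribute.defaultValue,
      allowNull: attribute.allowNull
    };
  });
}

console.log(describeModel(Person));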

It seems you are looking for native type information.
I'm not familiar with Sequelize, except that I know it uses the node-postgres driver underneath, which automatically provides type information with every query that you make.
Below is a simple example of dynamically getting type details for any_table, using pg-promise:
var pgp = require('pg-promise')(/*initialization options*/);
var db = pgp(/*connection details*/);

db.result('SELECT * FROM any_table LIMIT 0', [], a => a.fields)
    .then(fields => {
        // complete details for each column
    })
    .catch(error => {
        // an error occurred
    });
There are several fields available for each column there, including name and dataTypeID that you are looking for ;)
As an update, following the answer that does use Sequelize for it...
The difference is that here we get the raw values directly as they are provided by PostgreSQL, so dataTypeID is the raw type id exactly as PostgreSQL reports it, while Type: STRING is an interpreted type as implemented by Sequelize. Also, we are getting the type details dynamically here, versus statically in that Sequelize example.
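If you want to turn a raw dataTypeID into a readable type name, one option (my own addition, not part of the original answer) is to look it up in PostgreSQL's pg_catalog.pg_type system catalog:
// Sketch: resolve a column's dataTypeID (a pg_type OID) to its PostgreSQL type name.
function typeNameFromOid(oid) {
  return db.one('SELECT typname FROM pg_catalog.pg_type WHERE oid = $1', [oid], row => row.typname);
}

// typeNameFromOid(1043).then(console.log); // e.g. 'varchar'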

item.rawAttributes[key].type.key === this.DataTypes.JSONB.key;
@piotrbienias Is the above comparison valid?
(Sorry for writing here, as I can't add comments.)
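For what it's worth, a standalone version of that check might look like this (a sketch; it assumes DataTypes is imported from the sequelize package):
const { DataTypes } = require('sequelize');

// Sketch: report whether a given attribute of a model is declared as JSONB,
// comparing type keys rather than type instances.
function isJsonbAttribute(Model, key) {
  return Model.rawAttributes[key].type.key === DataTypes.JSONB.key;
}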


Dynamic tables/models using Sequelize?

This question is more of a theoretical one, since I have not started implementation yet.
The situation is as follows:
I have an application in which users can upload structured data (like Excel, CSV etc.).
Due to requirements, I want to store them in the database, ideally creating a new table on the fly with the table name set to the file name and columns based on the file itself.
This initialisation is still doable with Sequelize, I think.
However, as Sequelize relies on models, I am stuck: I am not sure what types and how many columns there will be, which creates the need for something like a 'dynamic model' or 'generic model'.
I am not sure how to do this, and I cannot find anything related when searching. I would appreciate your 2 cents on this approach, and if there are other ideas I am very eager to hear them.
Thanks in advance!
First, you need to add code that maps your dynamic columns into Sequelize column definitions, like:
const columnDefinitions = [
  {
    field: 'field_name_from_csv',
    type: DataTypes.INTEGER,
    allowNull: true
  },
  {
    field: 'field_name2_from_csv',
    type: DataTypes.INTEGER,
    allowNull: true
  },
]
So you need to determine the data type of a certain column and create an appropriate column definition.
Second, you need to store all this mapping info in (a) special table(s) so you know what dynamic tables you have and how to register them in Sequelize.
Once you have a table name and column mappings you can register all tables as models in Sequelize:
// here you need to convert an array of mappings into the object where field names should be keys and column definitions should be values.
const columnMappingsAsObject = ...
const tableName = 'dynamic_table_name'
const dynamicModel = sequelize.define(tableName, columnMappingsAsObject, {
  tableName
});
// now you can use it to get records and so on:
const records = await dynamicModel.findAll({})
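As one possible way to fill in the columnMappingsAsObject placeholder above (my own sketch, based only on the columnDefinitions shape from this answer):
// Sketch: turn the array of column definitions into the { fieldName: definition } object
// that sequelize.define() expects as its second argument.
const columnMappingsAsObject = columnDefinitions.reduce((mappings, { field, ...definition }) => {
  mappings[field] = definition;
  return mappings;
}, {});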

How can I use DEFAULT values via knex insert?

My goal is to dynamically insert data into a table via knex.
Code looks like this:
const knexService = require("../knexService.js")

async function insertObjectToKnex() {
  const insertObject = {
    id: "DEFAULT",
    someKey: "someValue"
  };
  await knexService.db("table").insert(insertObject);
}
With DEFAULT, the next free id should be used as the database id; the table is configured for this and it works with raw SQL. With knex.js I get the following error:
invalid input syntax for type integer: "DEFAULT"
Using the useNullAsDefault: true config is not possible, because the id is not nullable.
How can I trigger the default value in knex? I did not find anything in the documentation or with Google that could at least give a hint about this issue!
While it is not mentioned in the documentation of knex.js, you should simply not include fields that are supposed to get a DEFAULT value in the insert object. The database will then apply the default value to that column of the row.
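In other words, a version of the code from the question without the id field (a sketch, reusing the knexService module from the question):
const knexService = require("../knexService.js")

async function insertObjectToKnex() {
  const insertObject = {
    // no `id` here: omitting the column lets the database apply its DEFAULT (e.g. the next sequence value)
    someKey: "someValue"
  };
  await knexService.db("table").insert(insertObject);
}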

Perform a TypeORM find/search operation matching an array of JSON

I have my TypeORM column defined like this; what I want is an array of JSON objects, which I managed to get.
@Column({
  type: 'jsonb',
  array: false,
  default: () => "'[]'",
  nullable: false,
})
tokens!: Array<{ token: string }>;
This is how the field looks, and I am fine with it. What I want is to find a document with a particular token, so I came up with the code below, but it returns an empty array.
const user = await User.find({ where: { _id: decoded._id, tokens: { token: token } } });
Normally, when I am using Mongoose, I can get this working with
const user = await User.findOneBy({ _id: decoded._id, "tokens.token": token });
and this returns the particular user matching the id and token string passed.
I want help on how to find a user using the id and the token string inside the array of objects, thanks.
TypeORM does not natively support queries on PostgreSQL jsonb columns. Performing a query on the data in a jsonb column requires you to either issue a raw query or write your own WHERE clause in .where or .andWhere of a query builder (doc).
For reference, the jsonb query syntax documentation can be found here.
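As an illustration only (not part of the original answer), a query-builder version using PostgreSQL's jsonb containment operator @> could look roughly like this, assuming the User entity above follows the ActiveRecord pattern and has an _id column:
// Sketch: match users whose jsonb `tokens` array contains an object with the given token.
const user = await User.createQueryBuilder('u')
  .where('u._id = :id', { id: decoded._id })
  .andWhere('u.tokens @> CAST(:tokens AS jsonb)', {
    tokens: JSON.stringify([{ token }]),
  })
  .getOne();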

Select all the fields in a mongoose schema

I want to obtain all the fields of a schema in mongoose. Now I am using the following code:
let Client = LisaClient.model('Client', ClientSchema)
let query = Client.findOne({ 'userclient': userclient })
query.select('clientname clientdocument client_id password userclient')
let result = yield query.exec()
But I want all the fields, no matter whether they are empty. As always, thank you in advance.
I'm not sure if you want all fields in a SQL-like way, or if you want them all in a proper MongoDB way.
If you want them in the proper MongoDB way, then just remove the query.select line. That line is saying to only return the fields listed in it.
If you meant in a SQL-like way, MongoDB doesn't work like that. Each document only has the fields you put in when it was inserted. If you only gave the document certain fields when you inserted it, that document will only have those fields, even if other documents in the same collection have different fields.
To determine all available fields in the collection, you'd have to find all the documents, loop through them all and build an object with all the different keys you find.
If you need each document returned to always have the fields that you specify in your select, you'll just have to transform your object once it's returned.
const fields = ['clientname', 'clientdocument', 'client_id', 'password', 'userclient'];
let Client = LisaClient.model('Client', ClientSchema)
let query = Client.findOne({ 'userclient': userclient })
query.select(fields.join(' '))
let result = yield query.exec()
fields.forEach(field => result[field] = result[field]);
That forEach loop will set all the fields you want to either the value in the result (if it was there) or to undefined if it wasn't.
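If you actually need to discover every field that occurs anywhere in the collection (the "loop through them all" approach mentioned above), a rough sketch could be (my own addition, assuming the same Client model and an async context):
// Sketch: collect every key that appears in any document of the collection.
const docs = await Client.find({}).lean().exec();
const allKeys = new Set();
docs.forEach(doc => Object.keys(doc).forEach(key => allKeys.add(key)));
console.log([...allKeys]); // e.g. ['_id', 'clientname', 'userclient', ...]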
MongoDB is schemaless and does not have tables; each collection can contain different types of items. Usually the objects are somehow related or have a common base type.
Retrieve individual records using
db.collectionName.findOne() or db.collectionName.find().pretty()
To get all the key names you need to use MapReduce:
mapReduceKeys = db.runCommand({
  "mapreduce": "collection_name",
  "map": function() {
    for (var key in this) {
      emit(key, null);
    }
  },
  "reduce": function(key, stuff) {
    return null;
  },
  "out": "collection_name" + "_keys"
})
Then run distinct on the resulting collection to find all the keys:
db[mapReduceKeys.result].distinct("_id") //["foo", "bar", "baz", "_id", ...]

MarkLogic: find by property value

I have a MarkLogic 8 database:
declareUpdate();

var book0 = {
  id: fn.generateId({qwe: 'book'}),
  username: 'book',
  password: 'pass'
};

var book1 = {
  id: fn.generateId({asd: 'book'}),
  username: 'user',
  password: 'pass1'
};

xdmp.documentInsert(
  'zz' + book0.id,
  book0,
  xdmp.defaultPermissions(),
  ['qwe']);

xdmp.documentInsert(
  'xx' + book1.id,
  book1,
  xdmp.defaultPermissions(),
  ['qwe']);
So I want to find them by name with the Node.js API:
var db = marklogic.createDatabaseClient(connection.connInfo);
var qb = marklogic.queryBuilder;

function findByName(name) {
  return db.documents.query(
    qb.where(
      qb.collection('qwe'),
      qb.value('username', name)
    )
  ).result();
}
The problem is that it finds not only user or user0, but also users, and if I create a document with username book it will find both book and books.
A values query matches the entire text of a JSON property by stemming each word in the text (if stemming is enabled, which is the default).
Where (as in this case) that's not what you want, you can do either of the following:
Create a string range index (with the root collation if you only need exact matches) for the JSON property
Turn on word searches in the database configuration and use the "unstemmed" option on the query.
If you also turn off stemmed search in the database configuration, you don't have to pass the option (and you avoid the extra resources required for maintaining both types of indexes).
To limit the configuration change to a specific property, you can configure a field for the property instead of configuring the entire database.
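For the range-index route, once a string range index exists on the username JSON property, the query could switch to an exact range match along these lines (a sketch only, not from the original answer; it assumes the index is configured with the collation you query with):
// Sketch: exact match against the string range index on `username`.
function findByNameExact(name) {
  return db.documents.query(
    qb.where(
      qb.collection('qwe'),
      qb.range('username', '=', name)
    )
  ).result();
}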
For more background, see:
http://docs.marklogic.com/guide/search-dev/stemming
http://docs.marklogic.com/guide/admin/text_index
http://docs.marklogic.com/cts.jsonPropertyValueQuery?q=cts.jsonPropertyValueQuery&v=8.0&api=true
Hoping that helps,
