How do I override a sequelize defaultValue function for testing purposes?

I'm trying to add unit/integration testing to my Sequelize project, and I'm running into a problem when moving from the Postgres dialect to SQLite. I am attempting to override the 'defaultValue' function on 'id'. The generated CREATE TABLE syntax is correct, but the original defaultValue is still used in the INSERT statement generated by .create().
I have created a minimal sample project that illustrates the problem I'm describing with a failing test.
Here's the relevant code snippet:
User = sequelize.define('User', {
  id: {
    type: Sequelize.UUID,
    field: 'id',
    defaultValue: sequelize.literal("uuid_generate_v1mc()"),
    primaryKey: true
  },
  name: {
    type: Sequelize.STRING,
    field: "first_name"
  },
});
// why doesn't setting these attributes override the default value
// provided by User.create?
User.attributes.id.defaultValue='b9c96442-2c0d-11e6-b67b-9e71128cae77';
User.tableAttributes.id.defaultValue='b9c96442-2c0d-11e6-b67b-9e71128cae77';
What's the best way to inject or mock the defaultValue function?

This answer misses the "why" part, but here is how I made it work:
before(function() {
  // Define a table that uses a custom default value
  User = sequelize.define('User', {
    id: {
      type: Sequelize.UUID,
      field: 'id',
      defaultValue: sequelize.literal("uuid_generate_v1mc()"),
      primaryKey: true
    },
    name: {
      type: Sequelize.STRING,
      field: "first_name"
    },
  });
  User.attributes.id.defaultValue.val = '"b9c96442-2c0d-11e6-b67b-9e71128cae77"';
  return sequelize.sync();
});
Though, I suspect there should be an easier way to achieve the same result.
What is interesting is that if the defaultValue were not a literal or fn but, say, a plain string value instead, things would be much easier: we could have just added a hook:
User.beforeCreate(function (user) {
  user.dataValues.id = 'b9c96442-2c0d-11e6-b67b-9e71128cae77';
});
I suggest you seek answers to the "why" question on the Sequelize issue tracker.
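For what it's worth, an arguably cleaner route is to pick the default at definition time, so that tests on SQLite never depend on a Postgres-only function. A minimal sketch, assuming an environment flag of your own choosing:
// Sketch: choose the defaultValue when the model is defined.
// Sequelize.UUIDV1 generates the UUID in JavaScript, so no database
// function is required under SQLite.
var isTest = process.env.NODE_ENV === 'test'; // assumed convention

User = sequelize.define('User', {
  id: {
    type: Sequelize.UUID,
    field: 'id',
    primaryKey: true,
    defaultValue: isTest
      ? Sequelize.UUIDV1
      : sequelize.literal("uuid_generate_v1mc()")
  },
  name: {
    type: Sequelize.STRING,
    field: "first_name"
  },
});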

Related

Sequelize allow null set to false on foreign key

Here is the code for one of my tables:
const { Model, DataTypes } = require('sequelize');
const Algorithm = require('./algorithm');
const Exchange = require('./exchange');
const User = require('./user');

//#JA - This model defines the api keys for each user's exchange
//#JA - For security reasons, in case the database gets hacked, the keys will be stored using encryption.
module.exports = function(sequelize){
  class AlgorithmRule extends Model {}

  const AlgorithmModel = Algorithm(sequelize); //#JA - Gets an initialized version of the Algorithm class
  const ExchangeModel = Exchange(sequelize);   //#JA - Gets an initialized version of the Exchange class
  const UserModel = User(sequelize);           //#JA - Gets an initialized version of the User class

  const AlgorithmRuleFrame = AlgorithmRule.init({
    algorithm_id: {
      type: DataTypes.INTEGER,
      allowNull: false,
      references: {
        model: AlgorithmModel,
        key: 'id',
      }
    },
    exchange_id: {
      type: DataTypes.STRING,
      references: {
        model: ExchangeModel,
        key: 'name',
      },
    },
    user_id: {
      type: DataTypes.INTEGER,
      references: {
        model: UserModel,
        key: 'id',
      },
    },
    type: { //Partial-Canceled implies that the order was partially filled and then canceled.
      type: DataTypes.ENUM('Percent Of Equity', 'Cash'),
      allowNull: false,
      defaultValue: 'Percent Of Equity'
    },
    type_value: { //#JA - This will be treated as either a percentage or a 'cash' value, depending on the type chosen for the algorithm.
      type: DataTypes.DECIMAL(20, 18),
      allowNull: false
    },
  }, {
    sequelize,
    modelName: 'AlgorithmRule',
    indexes: [{ unique: true, fields: ['algorithm_id', 'exchange_id', 'user_id'] }]
  });

  return AlgorithmRuleFrame;
};
I'm trying to set this up so that I can set allowNull: false on algorithm_id, exchange_id, and user_id. I want there to HAVE to be values in those columns for any record to be allowed.
I can't even get allowNull: false to apply manually through the database itself. So my first question is: is this even possible?
If it is, how do I do it with Sequelize?
I can't use the typical hasOne() foreign key commands, because then I can't create a composite unique index across the foreign keys. The only way I was able to do this was the way I did, using the references: structure.
How do I set allowNull: false for a foreignKey reference defined the way I have it?
To be clear, something like this will NOT work:
Task.belongsTo(User, { foreignKey: { allowNull: false }, onDelete: 'CASCADE' })
This will NOT work because I'm using a composite unique key across 3 foreign keys, and to do that I need to reference the key by name, which is not possible unless it's defined on the table before these commands are run. Hopefully this makes sense.
Any help is greatly appreciated!
Okay so apparently
algorithm_id: {
  type: DataTypes.INTEGER,
  allowNull: false,
  references: {
    model: AlgorithmModel,
    key: 'id',
  }
},
This code is correct. HOWEVER, if you already created the database and the foreign key was already defined, the alter command will NOT change allowNull. You have to COMPLETELY drop the table, and THEN the allowNull: false attribute will take effect.
This threw me for a loop for a long time, so I hope this saves someone else a lot of frustration.
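In a test or development setup, that usually boils down to something like the following sketch (force: true drops and recreates every table, so never run it against data you need to keep):
// Sketch: dropping and recreating the tables applies the new
// allowNull: false constraint on the foreign key columns.
sequelize.sync({ force: true }).then(() => {
  console.log('Tables dropped and recreated');
});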

Sequelize running migration results Cannot read property 'key' of undefined

I wanted to update a column to set allowNull to false, but running db:migrate fails with this error message:
Cannot read property 'key' of undefined
Here's the code of migration:
'use strict';
module.exports = {
up: (queryInterface, Sequelize) => {
return queryInterface.changeColumn('Notes', 'title', {
allowNull: false
});
},
down: (queryInterface, Sequelize) => {
return queryInterface.changeColumn('Notes', 'title', {
allowNull: true
});
}
};
Per the document I followed, there seems to be nothing wrong with my code.
The table and the field both exist, so what am I doing wrong?
Just had this problem myself. From what I see in the source code, Sequelize assumes you always provide the type when changing a column. It's also in the documentation you linked: "Please make sure, that you are completely describing the new data type."
Just a tip for how you can debug it: I think the query interface object is not able to find the name of the table or column that has to be migrated. Try printing your Sequelize object and the models inside it, and use exactly the same names that your Sequelize object holds. I have tried the same library and it works like a charm.
You have to provide the column type whether you want to change it or not:
'use strict';

module.exports = {
  up: (queryInterface, Sequelize) => {
    return queryInterface.changeColumn('Notes', 'title', {
      type: Sequelize.STRING,
      allowNull: false
    });
  },
  down: (queryInterface, Sequelize) => {
    return queryInterface.changeColumn('Notes', 'title', {
      type: Sequelize.STRING,
      allowNull: true
    });
  }
};

Dynamoose - cannot query and where with 2 keys

I have opened a related issue on GitHub, but maybe someone here will be able to help quicker.
Summary:
ValidationException: Query key condition not supported
I need to find records from the last (amount) seconds at a given location.
Pretty simple, but already related to other issues:
One and another one
WORKS:
Activity.query('locationId').eq(locationId).exec();
DOES NOT WORK:
Activity.query('locationId').eq(locationId).where('createdAt').ge(date).exec();
Code sample:
Schema
const Activity = (dynamoose: typeof dynamooseType) => dynamoose.model<ActivityType, {}>('Activity',
  new Schema({
    id: {
      type: String,
      default: () => {
        return uuid();
      },
      hashKey: true,
    },
    userId: {
      type: String,
    },
    locationId: {
      type: String,
      rangeKey: true,
      index: {
        global: true,
      },
    },
    createdAt: { type: Number, rangeKey: true, required: true, default: Date.now },
    action: {
      type: Number,
    },
  }, {
    expires: 60 * 60 * 24 * 30 * 3, // activity logs to expire after 3 months
  }));
Code which executes the function
The funny part is that I found this as a workaround proposed for use until they merge the PR that adds the ability to specify timestamps as keys, but unfortunately it does not work.
async getActivitiesInLastSeconds(locationId: string, timeoutInSeconds: number) {
  const Activity = schema.Activity(this.dynamoose);
  const date = moment().subtract(timeoutInSeconds, 'seconds').valueOf();
  return await Activity.query('locationId').eq(locationId)
    .where('createdAt').ge(date).exec();
}
I suspect createdAt is not a range key of your table / index. You need to either do .filter('createdAt').ge(date) or modify your table / index schema.
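For illustration, the filter variant would look roughly like this (a sketch with a hypothetical wrapper function; the filter runs after the key lookup, so DynamoDB still reads every item for the location and only trims the result set afterwards):
// Sketch: filter on createdAt instead of using it in the key condition.
async function getRecentActivities(locationId, date) {
  return Activity.query('locationId').eq(locationId)
    .filter('createdAt').ge(date)
    .exec();
}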
I'm pretty sure the problem is that by specifying rangeKey: true on the createdAt property, you are making it the range key of the table (I don't think that is the correct term) rather than of the global index. That range key will be linked to the id property.
I believe the easiest solution would be to change your locationId index to be something like the following:
index: {
  global: true,
  rangeKey: 'createdAt',
},
That way you are being very explicit about which index you want to set createdAt as the rangeKey for.
After making that change please remember to sync your changes with either your local DynamoDB server or the actual DynamoDB service, so that the changes get populated both in your code and on the database system.
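For reference, the full locationId attribute would then look roughly like this (a sketch of one possible reading, with the range key declared inside the index rather than on the attribute itself):
locationId: {
  type: String,
  index: {
    global: true,          // global secondary index on locationId
    rangeKey: 'createdAt', // createdAt becomes the index's range key
  },
},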
Hopefully this helps! If it doesn't fix your problem please feel free to comment and I'll help you further.

Update embedded document mongoose

I'm looking for an easy way of updating an embedded document using mongoose without having to set each specific field manually. Looking at the accepted answer to this question, once you find the embedded document that you want to update you have to actually set each respective property and then save the parent. What I would prefer to do is pass in an update object and let MongoDB set the updates.
e.g. if I was updating a regular (non embedded) document I would do this:
models.User.findOneAndUpdate({ _id: req.params.userId }, req.body.user, function(err, user) {
  err ? resp.status(500).send(err) : user ? resp.send(user) : resp.status(404).send();
});
Here I don't actually have to go through each property in req.body.user and set the changes. Is there a way of doing this kind of thing with subdocuments as well?
My Schema is as follows:
var UserSchema = BaseUserSchema.extend({
  isActivated: { type: Boolean, required: true },
  files: [FileSchema]
});

var FileSchema = new mongoose.Schema({
  name: { type: String, required: true },
  size: { type: Number, required: true },
  type: { type: String, required: true },
});
And I'm trying to update a file based on user and file id.
Do I need to create a helper function to set the values, or is there a MongoDB way of doing this ?
Many thanks.
Well, presuming that you have your "filedata" in a variable, and of course the user _id that you are updating, then you want the $set operator:
var user = { /* The user information, at least the _id */ };
var filedata = { /* From somewhere with _id, name, size, type */ };
models.User.findOneAndUpdate(
  { "_id": user._id, "files._id": filedata._id },
  {
    "$set": {
      "files.$.name": filedata.name,
      "files.$.size": filedata.size,
      "files.$.type": filedata.type
    }
  },
  function(err, user) {
    // Whatever in here, such as a message; the update is already done.
  }
);
Or really, just $set only the fields that you actually mean to "update", as long as you know which ones you mean. So if you only need to change the "size", then just set that, for example.
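Since the original goal was to pass in an update object without naming each field, a small helper can build the positional paths dynamically. A sketch, where buildFileSet and the req.body.file payload are hypothetical names rather than anything Mongoose provides:
// Sketch: turn an arbitrary update object into positional $set paths,
// so any subset of file fields can be updated in one call.
function buildFileSet(updates) {
  var set = {};
  Object.keys(updates).forEach(function (key) {
    set["files.$." + key] = updates[key];
  });
  return set;
}

models.User.findOneAndUpdate(
  { "_id": user._id, "files._id": filedata._id },
  { "$set": buildFileSet(req.body.file) },
  function (err, user) {
    // Handle the result as before.
  }
);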

Extend Ext.data.Model (add fields dynamically)

I extended an existing model by adding fields using the prototype. Everything works fine: the data can be received from the server side and used on the client side. But when I now update my data and send it back to the server side, the "new" fields are not recognized by the writer of the proxy.
To be more specific: I have a model like this:
Ext.define('Osgaar', {
    extend: 'Ext.data.Model',
    fields: [
        { name: 'first', type: 'string' },
        { name: 'second', type: 'string' },
        { name: 'third', type: 'string' }
    ],
    proxy: {
        type: 'rest',
        url: 'public/svcmethod',
        reader: {
            type: 'json',
            root: 'data'
        },
        writer: {
            type: 'json',
            writeAllFields: false
        }
    }
});
I am extending the model like that:
Osgaar.prototype.fields.add({ name: 'fourth', type: 'string' });
I tried setting writeAllFields to true to get all attributes transferred, but only those from the defined model are written, not the one added using the prototype (Fiddler confirms that).
Does anybody know a way to solve this without defining a new model?
Thank you in advance.
I think the best solution here is the following:
Osgaar.prototype.fields.add(new Ext.data.Field({ name: 'fifth', type: 'string' })); // create an Ext.data.Field instance, not just a plain Object
I took a quick look at the Writer implementation, and here is the method that is called by write() when you write data:
getRecordData: function(record) {
    var isPhantom = record.phantom === true,
        writeAll = this.writeAllFields || isPhantom,
        nameProperty = this.nameProperty,
        fields = record.fields, // <- look here
        data = {},
        changes,
        name,
        field,
        key;

    if (writeAll) {
        fields.each(function(field) {
            if (field.persist) { // <- checks the persist property!
                name = field[nameProperty] || field.name;
                data[name] = record.get(field.name);
            }
        });
Then I checked the value of the persist property of a field added to the prototype after the model is defined, and it turned out to be undefined. This is because you are not truly creating an Ext.data.Field instance that would inherit all the Field defaults and other useful stuff; you're simply adding a plain Object to the fields collection. Osgaar.prototype.fields is just a MixedCollection, and since you're working with it directly, there's no place where the Ext.data.Field constructor might be called implicitly.
If it's common for your application logic to add Model fields on the fly, consider implementing an addField() method on your custom Models (create another base class in the inheritance chain), as sketched below.
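A rough sketch of such a base class; the class name and addField are illustrative, not part of the Ext API:
// Sketch: a base model whose subclasses can grow fields at runtime.
// Wrapping the config in Ext.data.Field applies defaults such as persist.
Ext.define('MyApp.data.DynamicModel', {
    extend: 'Ext.data.Model',
    inheritableStatics: {
        // 'this' is the calling class, so each subclass manages its own fields
        addField: function (config) {
            this.prototype.fields.add(new Ext.data.Field(config));
        }
    }
});

// Usage, assuming Osgaar extends MyApp.data.DynamicModel:
// Osgaar.addField({ name: 'fourth', type: 'string' });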
Hope this helps, good luck!
I've been using ExtJS for quite a while so this was like a quiz to me :)
I found another solution for my original issue.
Instead of adding the fields to the prototype of the model I did the following:
Osgaar_Temp = Osgaar;
delete Osgaar;

Ext.define('Osgaar', {
    extend: 'Osgaar_Temp',
    fields: [
        { name: 'typeCategories', type: 'string' }
    ]
});
This seems to be the best solution.
