I'm currently trying to make changes to an existing DB using the migrations plugin for PersistenceJS. I can add/edit/delete items in the DB just fine — but…
How to add a column to an existing(!) table?
How to change the type of an existing(!) column, e.g. from 'text' to 'integer'?
These changes should retain currently existing data.
Sadly, the documentation is a little scarce, maybe you could help?
Here's the current, working setup:
persistence.store.websql.config(persistence, 'tododatabase', 'todos are fun', 5*1024*1024);
var Todo = persistence.define('Todo', {
task: 'TEXT',
priority: 'INT',
done: 'BOOL'
});
persistence.schemaSync();
function addTodo( item ){
var todo = new Todo();
todo.task = item.task;
todo.priority = item.priority;
todo.done = item.done;
persistence.add(todo);
persistence.flush();
};
function deleteTodo( item, callback ){
// item.id was created automatically by calling "new Todo()"
Todo.all().filter('id','=', item.id ).destroyAll( function(){
persistence.flush( callback );
});
};
The migration code that kinda works:
persistence.defineMigration(1, {
up: function() {
this.createTable('Todo', function(t){
t.text('task');
t.integer('priority');
t.boolean('done');
});
},
down: function() {
this.dropTable('Todo');
}
});
persistence.defineMigration(2, {
up: function() {
this.addColumn('Todo', 'due', 'DATE');
},
down: function() {
this.removeColumn('Todo', 'due');
}
});
function migrate( callback ){
console.log('migrating...');
persistence.migrations.init( function(){
console.log('migration init');
// this should migrate up to the latest version, in our case: 2
persistence.migrate( function(){
console.log('migration complete!');
} );
});
}
Results…
Calling migrate() only logs up to "migration init"; the complete handler is never called and the "due" column is not created.
Not calling schemaSync() before calling migrate(), as Zef Hemel himself proposed in this post, yields the same result as 1.
Changing the first line to persistence.store.websql.config(persistence, 'newdatabase', 'testing migration', 5*1024*1024);, not calling schemaSync() and only calling migrate() will successfully log "migration complete!" — but it does so in a new, completely empty database "newdatabase", which of course does not retain any existing data.
Summary
There is a database that was created using persistence.store.websql.config(...), persistence.define('Todo',...) and persistence.schemaSync().
I now want to keep all the data that already exist in that database, but want to
change the type of column priority from 'integer' to 'text'
add a column due with type 'date' to all existing Todos
If you could push me in the right direction, I'd greatly appreciate it!
Thanks!
I finally got it working. There are a number of issues with my initial requirements that I'd like to point out for future reference. Take a look at the first migration definition:
persistence.defineMigration(1, {
up: function() {
this.createTable('Todo', function(t){
...
Not surprisingly, createTable will do exactly that: it will execute the SQL statement 'CREATE TABLE Todo ...', which will silently fail and halt the migration if there is a table with the name Todo already. This is why it worked with a new database, but not with the existing one. Bear in mind: I already had a live database with a table "Todo" that needed updating. If you're starting fresh (i.e. you've not used schemaSync), createTable works just fine. Since the Migrations plugin does not provide a createTableIfNotExists method, I needed to utilize executeSql as follows:
persistence.defineMigration(1, {
up: function() {
this.executeSql('CREATE TABLE IF NOT EXISTS Todo (id VARCHAR(32) PRIMARY KEY, task TEXT, priority INT, done BOOL)');
...
Now that the migration from schema version 0 to 1 succeeded, the migration to version 2 was successful as well.
With the migration to version 3, the type of the priority column needed to change from int to text. This would normally be done using the ALTER COLUMN SQL command, which is not supported by Web SQL / SQLite. See Omitted Features for SQLite.
Altering a column with SQLite requires a 4-step workaround:
persistence.defineMigration(3, {
up: function() {
// rename current table
this.executeSql('ALTER TABLE Todo RENAME TO OldTodo');
// create new table with required columns and column types
this.executeSql('CREATE TABLE Todo (id VARCHAR(32) PRIMARY KEY, task TEXT, priority TEXT, done BOOL)');
// copy contents from old table to new table
this.executeSql('INSERT INTO Todo(id, task, priority, done) SELECT id, task, priority, done FROM OldTodo');
// delete old table
this.executeSql('DROP TABLE OldTodo');
},
...
Of course, after changing the column type, the entity definition for 'Todo' should also be changed:
var Todo = persistence.define('Todo', {
task: 'TEXT',
priority: 'TEXT', // was 'INT'
due: 'DATE',
done: 'BOOL'
});
And finally, the complete source:
persistence.store.websql.config(persistence, 'tododatabase', 'todos are fun', 5*1024*1024);
// persistence.debug = true;
//v0 + v1
// var Todo = persistence.define('Todo', {
// task: 'TEXT',
// priority: 'INT',
// done: 'BOOL'
// });
//v2
// var Todo = persistence.define('Todo', {
// task: 'TEXT',
// priority: 'INT',
// due: 'DATE',
// done: 'BOOL'
// });
//v3
var Todo = persistence.define('Todo', {
task: 'TEXT',
priority: 'TEXT',
due: 'DATE',
done: 'BOOL'
});
persistence.defineMigration(1, {
up: function() {
this.executeSql('CREATE TABLE IF NOT EXISTS Todo (id VARCHAR(32) PRIMARY KEY, task TEXT, priority INT, done BOOL)');
},
down: function() {
this.dropTable('Todo');
}
});
persistence.defineMigration(2, {
up: function() {
this.addColumn('Todo', 'due', 'DATE');
},
down: function() {
this.removeColumn('Todo', 'due');
}
});
persistence.defineMigration(3, {
up: function() {
// rename current table
this.executeSql('ALTER TABLE Todo RENAME TO OldTodo');
// create new table with required columns
this.executeSql('CREATE TABLE Todo (id VARCHAR(32) PRIMARY KEY, task TEXT, priority TEXT, due DATE, done BOOL)');
// copy contents from old table to new table
this.executeSql('INSERT INTO Todo(id, task, priority, due, done) SELECT id, task, priority, due, done FROM OldTodo');
// delete current table
this.executeSql('DROP TABLE OldTodo');
},
down: function() {
this.executeSql('ALTER TABLE Todo RENAME TO OldTodo');
this.executeSql('CREATE TABLE Todo (id VARCHAR(32) PRIMARY KEY, task TEXT, priority INT, due DATE, done BOOL)');
this.executeSql('INSERT INTO Todo(id, task, priority, due, done) SELECT id, task, priority, due, done FROM OldTodo');
this.executeSql('DROP TABLE OldTodo');
}
});
function migrate( callback ){
console.log('migrating...');
persistence.migrations.init( function(){
console.log('migration init');
persistence.migrate( function(){
console.debug('migration complete!');
callback();
} );
});
};
migrate( onMigrationComplete );
function onMigrationComplete(){
// database is ready. do amazing things...
};
That's a great explanation, thank you! But I think I know an easier way to achieve this.
I ran into the same trouble as you: I've got a set of schemas described with persistence.define and created with persistence.schemaSync.
So this is my particular case:
// This is my mixin for all schemas
var Versioned = persistence.defineMixin('Versioned', {
serverId: "TEXT",
intVersion: "INT",
dtSynced: "DATE",
dtCreatedAt: "DATE",
dtUpdatedAt: "DATE",
delete: "BOOL",
update: "BOOL",
add: "BOOL",
isReadOnly: "BOOL"
});
// This is one of the schemas I need to update with a new field.
var Person = persistence.define('Person', {
fullName: "TEXT",
rate: "INT"
});
//... More schema definitions
// Setup mixin
Person.is(Versioned);
// Sync schemas
persistence.schemaSync();
OK, nothing special about it. Now, after my app has been in production for a few months, I want to add a new field isEmployed to the Person schema.
According to the docs, I should rewrite all of my schema definitions as migrations and stop using persistence.schemaSync(). But I don't want to rewrite all of my definitions. Instead, I define a new migration right after the PersistenceJS init code:
// Init ORM
persistence.store.websql.config(
persistence,
'Sarafan',
'0.0.2',
'Sarafan.app database',
100 * 1024 * 1024,
0
);
// Define Migrations
persistence.defineMigration(1, {
up: function () {
this.addColumn('Person', 'isEmployed', 'BOOL');
}
});
// ... describing isVersioned mixin
// Updated schema definition with a new field 'isEmployed'
var Person = persistence.define('Person', {
fullName: "TEXT",
rate: "INT",
isEmployed: "BOOL"
});
//... More schema definitions
// Setup mixin
Person.is(Versioned);
// Apply the migrations right from the schemaSync callback.
persistence.schemaSync(function (tx) {
persistence.migrations.init(function () {
persistence.migrate(function(){
// Optional callback to be executed after initialization
});
});
});
So that's it! I have only tested this approach for adding new fields to the schema.
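One hedged addition: the migration above only defines up. If you want it to be reversible, a minimal sketch of the matching down step could look like this (assuming removeColumn behaves the way migration 2 in the answer above uses it):

persistence.defineMigration(1, {
  up: function () {
    this.addColumn('Person', 'isEmployed', 'BOOL');
  },
  down: function () {
    // mirrors the addColumn above; removeColumn is used the same way in the earlier answer's migration 2
    this.removeColumn('Person', 'isEmployed');
  }
});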
Let me know if it does or doesn't work for you.
I'll get right to the point: I'm having a problem updating rows after changing the status column's attributes.
up: function(queryInterface, Sequelize) {
return queryInterface.changeColumn('projects', 'status', {
type: Sequelize.ENUM('processing', 'unassigned', 'ongoing', 'completed'),
allowNull: false,
defaultValue: 'unassigned'
}).then(function() {
return Project.update({
status: 'unassigned'
}, {
where: {
status: 'processing'
}
});
});
}
The Project.update() call doesn't seem to work in any case, but changing the column's attributes works.
Any ideas, guys? I'm somewhat of a newbie with Sequelize, and any idea would be a great help. Thanks.
Depending on how you execute the migration (via sequelize-cli or programmatically via umzug), there is a different way to expose the table via the ORM.
In your case, queryInterface is passed as an argument to your function, so you can do a "raw query" via its attached sequelize property.
up: function(queryInterface, Sequelize) {
return queryInterface.changeColumn('projects', 'status', {
type: Sequelize.ENUM('processing', 'unassigned', 'ongoing', 'completed'),
allowNull: false,
defaultValue: 'unassigned'
}).then(function() {
return queryInterface.sequelize
.query("UPDATE projects SET status='unassigned' WHERE status='processing'");
});
}
By doing this you will run a raw query against your database.
You can check out this gist for more details on an advanced way of using the ORM inside the migration.
I'm a fan of using umzug programmatically, which executes the migrations and also provides the initialized models of your database. If you configure it properly, you will benefit from the exposed models (e.g. sequelize.model('project').update()) and have better-looking code.
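For reference, a rough sketch of what that programmatic setup can look like with the umzug 2.x API (hedged: the migrations path and the reuse of your existing sequelize instance are assumptions, not something from the question):

var Umzug = require('umzug');

var umzug = new Umzug({
  storage: 'sequelize',
  storageOptions: { sequelize: sequelize },             // reuse your existing Sequelize connection
  migrations: {
    params: [sequelize.getQueryInterface(), Sequelize], // same (queryInterface, Sequelize) signature as above
    path: './migrations'                                // assumed location of your migration files
  }
});

umzug.up().then(function (migrations) {
  console.log('Executed migrations:', migrations.map(function (m) { return m.file; }));
});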
I have the following models in my Sailsjs application with a many-to-many relationship:
event.js:
attributes: {
title : { type: 'string', required: true },
description : { type: 'string', required: true },
location : { type: 'string', required: true },
maxMembers : { type: 'integer', required: true },
currentMembers : { collection: 'user', via: 'eventsAttending', dominant: true },
creator : { model: 'user', required: true },
invitations : { collection: 'invitation', via: 'eventID' },
tags : { collection: 'tag', via: 'taggedEvents', dominant: true },
lat : { type: 'float' },
lon : { type: 'float' },
},
tags.js:
attributes: {
tagName : { type: 'string', unique: true, required: true },
taggedEvents : { collection: 'event', via: 'tags' },
},
Based on the documentation, this relationship looks correct. I have the following method in tag.js that accepts an array of tag strings, and an event id, and is supposed to add or remove the tags that were passed in:
modifyTags: function (tags, eventId) {
var tagRecords = [];
_.forEach(tags, function(tag) {
Tag.findOrCreate({tagName: tag}, {tagName: tag}, function (error, result) {
tagRecords.push({id: result.id})
})
})
Event.findOneById(eventId).populate('tags').exec(function(error, event){
console.log(event)
var currentTags = event.tags;
console.log(currentTags)
delete currentTags.add;
delete currentTags.remove;
if (currentTags.length > 0) {
currentTags = _.pluck(currentTags, 'id');
}
var modifiedTags = _.pluck(tagRecords, 'id');
var tagsToAdd = _.difference(modifiedTags, currentTags);
var tagsToRemove = _.difference(currentTags, modifiedTags);
console.log('current', currentTags)
console.log('remove', tagsToRemove)
console.log('add', tagsToAdd)
if (tagsToAdd.length > 0) {
_.forEach(tagsToAdd, function (tag) {
event.tags.add(tag);
})
event.save(console.log)
}
if (tagsToRemove.length > 0) {
_.forEach(tagsToRemove, function (tagId) {
event.tags.remove(tagId)
})
event.save()
}
})
}
This is how the method is called from the event model:
afterCreate: function(record, next) {
Tag.modifyTags(tags, record.id)
next();
}
When I post to event/create, I get this result: http://pastebin.com/PMiqBbfR.
It looks as if the method call itself is looped over, rather than just the tagsToAdd or tagsToRemove array. What's more confusing is that at the end, in the last log of the event, it looks like the event has the correct tags. When I then post to event/1, however, the tags array is empty. I've also tried saving immediately after each .add(), but still get similar results.
Ideally, I'd like to loop over both the tagsToAdd and tagsToRemove arrays, modify their ids in the model's collection, and then call .save() once on the model.
I have spent a ton of time trying to debug this, so any help would be greatly appreciated!
There are a few problems with your implementation, but the main issue is that you're treating certain methods--namely .save() and .findOrCreate()--as synchronous, when they are (like all Waterline methods) asynchronous and require a callback. So you're effectively running a bunch of code in parallel and not waiting for it to finish before returning.
Also, since it seems like what you're trying to do is replace the current event tags with this new list, the method you came up with is a bit over-engineered--you don't need to use event.tags.add and event.tags.remove. You can just use plain old update.
So you could probably rewrite the modifyTags method as:
modifyTags: function (tags, eventId, mainCb) {
// Asynchronously transform the `tags` array into an array of Tag records
async.map(tags, function(tag, cb) {
// For each tag, find or create a new record.
// Since the async.map `cb` argument expects a function with
// the standard (error, result) node signature, this will add
// the new (or existing) Tag instance to the resulting array.
// If an error occurs, async.map will exit early and call the
// "done()" function below
Tag.findOrCreate({tagName: tag}, {tagName: tag}, cb);
}, function done (err, tagRecords) {
if (err) {return mainCb(err);}
// Update the event with the new tags
Event.update({id: eventId}, {tags: tagRecords}).exec(mainCb);
});
}
See the full docs for async.map here.
If you wanted to stick with your implementation using .add and .remove, you would still want to use async.map, and do the rest of your logic in the done method. You don't need two .save calls; just run all the .add and .remove code first, then do a single .save(mainCb) to finish it off, as in the sketch below.
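A rough sketch of that variant, under the same assumptions as the update version above (it reuses your findOneById/populate/add/remove calls and lodash, and is untested):

modifyTags: function (tags, eventId, mainCb) {
  async.map(tags, function (tag, cb) {
    Tag.findOrCreate({tagName: tag}, {tagName: tag}, cb);
  }, function done (err, tagRecords) {
    if (err) { return mainCb(err); }
    Event.findOneById(eventId).populate('tags').exec(function (err, event) {
      if (err) { return mainCb(err); }
      var currentIds  = _.pluck(event.tags, 'id');
      var modifiedIds = _.pluck(tagRecords, 'id');
      // queue up all additions and removals first...
      _.difference(modifiedIds, currentIds).forEach(function (id) { event.tags.add(id); });
      _.difference(currentIds, modifiedIds).forEach(function (id) { event.tags.remove(id); });
      // ...then persist everything with a single save
      event.save(mainCb);
    });
  });
}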
And I don't know what you're trying to accomplish by deleting the .add and .remove methods from currentTags (which is a direct reference to event.tags), but it won't work and will just cause confusion later!
I have an MVC ListBoxFor control that I'm trying to bind data to and update using a Kendo MultiSelectFor.
The idea is that there is a list of users in the ListBox and a list of available users in the MultiSelect box. When users are selected from the MultiSelect box and the add button is clicked, an Ajax call is made to an action that updates the users list server side (through various API calls, which all work fine), and client-side JavaScript is used to update the users and available users arrays, with the binding keeping the controls up to date with the updated lists.
I wish I could pin this down to just one issue, but honestly every time I try something I come up with different errors, so I'll just go with the latest iteration.
Model:
public IEnumerable<UserInformation> Users { get; set; }
public IEnumerable<UserInformation> AvailableUsers { get; set; }
JavaScript ViewModel:
var viewModel = kendo.observable({
availableUsersSelected: [],
users: #(Html.Raw(Json.Encode(this.Model.Users))),
availableUsers: #(Html.Raw(JsonConvert.SerializeObject(this.Model.AvailableUsers))),
moveToUsers: function () {
this.availableUsersSelected = this.get('availableUsersSelected');
this.users.push(this.availableUsers);
if (this.availableUsersSelected.length > 0) {
var formAction = '#Url.Combine(Url.Content("~/"), ControllerActions.Groups.GroupDefault, ControllerActions.Groups.AddUser)';
$.ajax({
url: formAction,
type: 'POST',
data: {
model: JSON.stringify(
{
groupId: $('#GroupId').val(),
users: this.availableUsersSelected
}
)
},
success: function (result) {
if (result) {
this.users.remove(this.availableUsersSelected);
}
}
});
}
}
});
MultiSelectFor control
#(Html.Kendo()
.MultiSelectFor(u => u.AvailableUsers)
.Placeholder("Please select")
.BindTo(new SelectList(Model.AvailableUsers, "Id", "Name"))
.HtmlAttributes(new { data_bind = "value: availableUsersSelected" })
)
ListBox control
#(Html.EditorLine(Language.Fields.Users, Html.ListBoxFor(u => u.Users, new SelectList(Model.Users, "Id", "Name"), new { #class = "form-control", data_bind = "source: users", data_value_field ="Id", data_text_field = "Name" })))
Add control
<img src="~/Content/images/up-arrow.jpg" alt="Move to users" width="30" data-bind="events: {click: moveToUsers}" />
To reiterate, the Ajax call and updating server side all work fine, it's the client side control binding that I'm struggling to understand.
The errors I'm getting are 1) a syntax error with the comma on this line users: #(Html.Raw(Json.Encode(this.Model.Users))), and the line after it (same thing, effectively), and 2) a "ReferenceError: Id is not defined" on the moveToUsers function call when the add button is pressed.
(I can honestly say that the amount of frustration I'm experiencing with this is driving me insane, so sorry if it came across in the question)
So, after calming down a bit and reading a few more bits of the documentation about data binding and observable arrays, I realised I was making a few fundamental errors.
JavaScript ViewModel:
var viewModel = {
availableUsersSelected: new kendo.data.ObservableArray([]),
users: new kendo.data.ObservableArray(#(Html.Raw(Json.Encode(this.Model.Users)))),
availableUsers: new kendo.data.ObservableArray(#(Html.Raw(Json.Encode(this.Model.AvailableUsers)))),
moveToUsers: function () {
if (viewModel.availableUsersSelected.length > 0) {
var formAction = '#Url.Combine(Url.Content("~/"), ControllerActions.Groups.GroupDefault, ControllerActions.Groups.AddUser)';
$.ajax({
url: formAction,
type: 'POST',
data: {
model: JSON.stringify(
{
groupId: $('#GroupId').val(),
users: viewModel.availableUsersSelected
}
)
},
success: function (result) {
if (result) {
removeFromAvailableUsers();
}
else
alert('add failed!');
},
failure: function () {
alert('ajax failed!');
}
});
}
}
};
function removeFromAvailableUsers() {
for (var i = 0; i < viewModel.availableUsersSelected.length; ++i) {
viewModel.users.push(viewModel.availableUsersSelected[i]);
viewModel.availableUsers.remove(viewModel.availableUsersSelected[i]);
}
var ele = $('#AvailableUsers').data("kendoMultiSelect");
ele.value("");
ele.input.blur();
};
The main differences are that, instead of declaring the entire object as a Kendo observable, each array is declared as an observable array, and the arrays are referenced through the viewModel object instead of assuming that the "this" scope will encapsulate them.
Then, as D_Learning mentioned in the comments above, I was unnecessarily using two bindings for the MultiSelect control, so that then became:
#(Html.Kendo()
.MultiSelectFor(u => u.AvailableUsers)
.Placeholder("Please select")
.HtmlAttributes(new { data_bind = "source: availableUsers, value: availableUsersSelected", data_value_field = "Id", data_text_field = "Name" })
)
(Notice no ".BindTo" property)
Aside from that, the MVC side of things stayed the same and it all works perfectly.
If you wish to add or remove data in the Kendo MultiSelect, you will need to do so via its DataSource, e.g.:
$("#AvailableUsers").data("kendoMultiSelect").dataSource.add({"text": "new Item", "value": 1000});
For more detail about adding or removing items via the Kendo DataSource, see: Kendo DataSource Adding Removing Items
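Removing an item from the MultiSelect works the same way through its DataSource. A hedged sketch, reusing the value 1000 from the add example above (the field names depend on how your DataSource is configured):

// look up the item we just added and remove it from the MultiSelect's DataSource
var ds = $("#AvailableUsers").data("kendoMultiSelect").dataSource;
var match = $.grep(ds.data(), function (item) { return item.value === 1000; })[0];
if (match) {
    ds.remove(match);
}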
Similarly, you can remove an item from the ListBox as below:
// ListBox1 is assumed to be the <select> DOM element; clearSelection() is an assumed helper
var selectedIndex = ListBox1.selectedIndex; // selectedIndex is a property, not a function
clearSelection();
if (selectedIndex != -1) {
    ListBox1.options.remove(selectedIndex);
}
For more detail about adding or removing items from an HTML ListBox, see: HTML Listbox Items Manipulation.
Please let me know if you have any error after this.
EDIT: ANSWER FOUND
I found the answer in this post. There is a private store config field called remoteSort that is set to true by default, so client-side sorters won't get used unless remoteSort is explicitly set to false.
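For anyone skimming, here is a minimal sketch of what that fix looks like in the store config (hedged: remoteSort is a private/undocumented flag, and the rest of the config is trimmed down from the store shown below):

Ext.create('Rally.data.wsapi.Store', {
    model: 'PortfolioItem/Feature',
    autoLoad: true,
    fetch: ['Name', 'UserStories'],
    remoteSort: false, // without this, the client-side sorterFn below is silently ignored
    sorters: [{
        property: 'UserStories',
        sorterFn: function (o1, o2) {
            return o1.get('UserStories').Count - o2.get('UserStories').Count;
        }
    }]
});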
I'm trying to sort a grid of Features by the number of User Stories in them. I've tried a couple different things. The first was a sorter function in the data Store Configuration:
Ext.create('Rally.data.wsapi.Store',{
model: 'PortfolioItem/Feature',
autoLoad:true,
start: 0,
pageSize: 20,
fetch: ['Name', 'ObjectID', 'Project', 'Release', 'UserStories','State','_ref'],
context://context stuff
filters:[
//filter stuff
],
sorters:[
{
property:'UserStories',
sorterFn: function(o1, o2){
return o1.get('UserStories').Count - o2.get('UserStories').Count;
}
}
],
listeners: //listener stuff
But this always returned nothing. (When the sorter was not included, I did get back all the correct Features, but I could not sort by the number of user stories per Feature.)
I also tried adding a sorter to the column in the grid, as seen in this post:
xtype: 'rallygrid',
width:'50%',
height:400,
scroll:'vertical',
columnCfgs: [
{
text:'Stories',
dataIndex:'UserStories',
xtype:'numbercolumn',
sortable:true,
doSort: function(direction){
var ds = this.up('grid').getStore();
var field = this.getSortParam();
console.log(ds, field, direction);
ds.sort({
property: field,
direction:direction,
sorterFn: function(us1, us2){
return (direction=='ASC'? 1 : -1) * (us1.get(field).Count - us2.get(field).Count);
}
});
},
width:'20%',
renderer:function(us){
return us.Count;
}
}
]
But I was having the same issues that the person in the other thread was having, where nothing was getting sorted.
I'm thinking that I am overlooking something simple - I am so close to making this work. :)
I have a grid that needs to be updated with server information.
Here is the way that it should work:
A user selects an item
Make a JsonRest query with the item ID selected
Update the grid - showing notes relating to item selected
Here is how the grid is setup:
function noteTabSetup() {
var store = JsonRest({target:"//localhost/program/notes", idAttribute:"id"});
var structure = [{ field: 'id', name: 'Id', width: '5em' },
{ field: 'name', name: 'Name', width: '12%' },
{ field: 'description', name: 'Description' }];
var noteGrid = new Grid({
id: 'noteGrid',
pageSize: 20,
store: store,
cacheClass: Cache,
structure: structure,
filterServerMode: true,
selectRowTriggerOnCell: true,
bodyLoadingInfo: "Loading notes ...",
bodyEmptyInfo: "No notes found",
modules: [SingleSort, VirtualVScroller, moveColumn,
selectColumn, dndColumn, selectRow, Filter]}, noteTab);
noteGrid.startup();
}
When an item is selected, the selected item ID is passed to:
function noteLoad(itemId) {
console.log("In NoteLoad");
var grid = registry.byId("noteGrid");
if (!itemId || 0 === itemId.length) { console.log("no ItemId chosen"); }
else {
console.log("In NoteLoad with an itemId");
grid.model.clearCache();
// Error on second run
grid.store.query({ find: "ByItem", item: itemId }).then(function(result) {
grid.setStore(new ItemFileReadStore({data: {items : result}}));
});
grid.body.refresh();
console.log("model: " + grid.rowCount());
};
};
On the first item selected, everything works well - the query fires, and the grid is updated with notes related to the selected item.
On the second item selected, I receive this error from firebug:
TypeError: grid.store.query is not a function
grid.store.query({ find: "ByItem", item: itemIds }).then(function(result) {
-----------------------------------^
Any ideas?! Thank you in advance.
Chris
Thank you for the reply - it makes sense that the store was being replaced by ItemFileReadStore. If possible, I would like to use JsonRest directly to update the grid.
I've tried a handful of variations based off of your comment, without luck:
Query fires and result is returned. Grid is not updated:
grid.model.clearCache();
grid.store.query({ find: "ByItem", item: itemIds }).then(function(results){
console.log('notes: ' + results[0].name);
});
grid.body.refresh();
Error: grid.store.fetch is not a function:
grid.store.fetch({ query: { find: "ByItem", item: itemIds }});
Syntax error in Dojo.js (line 15):
grid.store.query({ find: "ByItem", item: itemIds }).then(function(result) {
grid.setStore(new JsonRest({data: {items : result}}));
});
I've done a lot of searches and can't find a good example where the grid is being updated from a JsonRest object. Thank you.
Because your code itself replaces the store the first time it runs:
grid.store.query({ find: "ByItem", item: itemId }).then(function(result) {
grid.setStore(new ItemFileReadStore({data: {items : result}}));
});
Here, the grid's store is initially a JsonRest store which, after the query method runs, is replaced by the new ItemFileReadStore object. The mistake is that "query" is not a method of ItemFileReadStore; it is a parameter passed to its "fetch" method. Check out some examples from the Dojo documentation on this.
The JsonRest store, on the other hand, does have a "query" method; hence the error on the second run. Change your code accordingly if you want to keep using ItemFileReadStore.
e.g., with an ItemFileReadStore the query goes inside the fetch() call:
store.fetch( { query: { name: 'Ice cream' },
onItem: function(item) {
console.log( store.getValue( item, 'name' ) );
console.log( 'cost: ', store.getValue( item, 'cost' ) );
}
});
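Alternatively, if you would rather keep the newer dojo/store API end to end (so .query() is still available on every run), a hedged sketch would be to swap in dojo/store/Memory instead of ItemFileReadStore (assuming the gridx Cache accepts it the same way it accepted the original JsonRest store):

require(['dojo/store/Memory'], function (Memory) {
    grid.store.query({ find: 'ByItem', item: itemId }).then(function (results) {
        // Memory implements the same dojo/store API as JsonRest,
        // so grid.store.query() keeps working on subsequent selections
        grid.setStore(new Memory({ data: results, idProperty: 'id' }));
    });
});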