I'm using Angular Fullstack for a web app.
I'm posting my data with $http.post() as this object:
{ title: "Some title", tags: ["tag1", "tag2", "tag3"] }
When I edit my object and try to $http.put(), for example:
{ title: "Some title", tags: ["tag1"] }
the console shows HTTP PUT 200, but when I refresh the page I still receive the object with all 3 tags.
This is how I save it to MongoDB:
exports.update = function(req, res) {
  if (req.body._id) {
    delete req.body._id;
  }
  Question.findByIdAsync(req.params.id)
    .then(handleEntityNotFound(res))
    .then(saveUpdates(req.body))
    .then(responseWithResult(res))
    .catch(handleError(res));
};
function saveUpdates(updates) {
  return function(entity) {
    var data = _.merge(entity.toJSON(), updates);
    var updated = _.extend(entity, data);
    return updated.saveAsync()
      .spread(function(updated) {
        return updated;
      });
  };
}
Can someone explain how to save the object with the removed items?
What am I doing wrong?
It is pretty bad practice to use things like _.merge or _.extend in client code (meaning your Node.js client to the database, not the browser) after retrieving from the database. Notably, _.merge is the problem here, as it does not "take away" things but rather "augments" what is already there with the information you have provided. Not what you want here, and there is also a better way.
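To illustrate, here is a standalone sketch using lodash, mirroring the data from the question; the array-merging behaviour shown is exactly why "tag2" and "tag3" survive:

var _ = require('lodash');

var existing = { title: "Some title", tags: ["tag1", "tag2", "tag3"] };
var updates  = { title: "Some title", tags: ["tag1"] };

// _.merge merges arrays by index, so only tags[0] is overwritten;
// the remaining elements are left in place.
_.merge(existing, updates);
// => { title: "Some title", tags: ["tag1", "tag2", "tag3"] }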
You should simply use "atomic operators" like $set to do this instead:
Question.findByIdAndUpdateAsync(
  req.params.id,
  { "$set": { "tags": req.body.tags } },
  { "new": true }
)
.then(function(result) {
  // deal with returned result
});
You also really should be targeting your endpoints rather than having a "generic" object write. So the above would be specifically targeted at a "PUT" for the related "tags" only, and would not touch other fields in the object.
If you really must throw a whole object at it and expect an update from all the content, then use a helper to fix the update statement correctly:
function dotNotate(obj, target, prefix) {
  target = target || {};
  prefix = prefix || "";

  Object.keys(obj).forEach(function(key) {
    if ( typeof(obj[key]) === "object" ) {
      // Recurse into sub-documents, extending the key prefix
      dotNotate(obj[key], target, prefix + key + ".");
    } else {
      target[prefix + key] = obj[key];
    }
  });

  return target;
}
var update = { "$set": dotNotate(req.body) };

Question.findByIdAndUpdateAsync(
  req.params.id,
  update,
  { "new": true }
)
.then(function(result) {
  // deal with returned result
});
This will produce a correctly structured update statement no matter what object you throw at it.
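For instance, a nested object (a made-up input, purely to illustrate the flattening) comes out in dot notation, so each leaf value is $set individually rather than replacing a whole sub-document:

dotNotate({ "title": "Some title", "meta": { "views": 5 } });
// => { "title": "Some title", "meta.views": 5 }
// As an update: { "$set": { "title": "Some title", "meta.views": 5 } }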
Though in this case, just passing the body in directly is probably good enough:
Question.findByIdAndUpdateAsync(
  req.params.id,
  { "$set": req.body },
  { "new": true }
)
.then(function(result) {
  // deal with returned result
});
There are other approaches with atomic operators that you could also fit into your handling logic. But it is best to consider doing these per element, with at least root document properties and things like arrays treated separately as a child, as sketched below.
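As a rough sketch of that "per element" idea (req.body.newTags is a hypothetical field, just to show the shape; $each pushes several values at once):

Question.findByIdAndUpdateAsync(
  req.params.id,
  {
    "$set":  { "title": req.body.title },                      // root property
    "$push": { "tags": { "$each": req.body.newTags || [] } }   // array as a child
  },
  { "new": true }
)
.then(function(result) {
  // deal with returned result
});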
All the atomic operations interact with the document "in the database" and "as it is at modification". Pulling data from the database, modifying it, and then saving it back offers no such guarantees that the data has not already been changed, and you may just be overwriting other changes already committed.
In truth, your "browser client" should have been aware that the "tags" array had the other two entries, and then your "modify request" should simply $pull the entries to be removed from the array, like so:
Question.findByIdAndUpdateAsync(
  req.params.id,
  { "$pull": { "tags": { "$in": ["tag2", "tag3"] } } },
  { "new": true }
)
.then(function(result) {
  // deal with returned result
});
And then, "regardless" of the current state of the document on the server when modified, those changes would be the only ones made. So if something else modified at added "tag4", and the client had yet to get the noficiation of such a change before the modification was sent, then the return response would include that as well and everything would be in sync.
Learn the update modifiers of MongoDB, as they will serve you well.
Related
I'm trying to build CRUD operations with Node.js, but I don't know how to check whether the JSON file already contains a specific record, so that I can update only the needed object without overwriting the other data, or add a new record if there is none.
This is what I have so far (right now the insert operation overwrites the whole file and leaves it with only the inserted data):
JSON:
{
  "id": 1,
  "name": "Bill",
},
{
  "id": 2,
  "name": "Steve",
},
Code:
var fs = require('fs');

var operation = POST.operation; // POST request comes with operation = update/insert/delete

if (operation == 'insert') {
  fs.readFile("data.json", "utf8", function (err, data) {
    var updateData = {
      id: POST.id,
      name: POST.name,
    };
    var newUser = JSON.stringify(updateData);
    fs.writeFile("data.json", newUser, "utf8"); // overwrites the whole file
    console.log(err);
  });
}
else if (operation == 'update') {
  fs.readFile("data.json", "utf8", function (err, data) {
  });
}
else if (operation == 'delete') {
  fs.readFile("data.json", "utf8", function (err, data) {
  });
}
else
  console.log("Operation is not supported");
Most examples I've found used a database and Express.js, so they didn't really help much.
Sorry, I'm a newbie.
So first off, that is not valid JSON (unless this is only part of the file). You will need to wrap that whole list in an array for it to be valid JSON (you can check whether your JSON is valid with an online validator).
If you go with a structure like this:
[
  {
    "id": 1,
    "name": "Bill"
  },
  {
    "id": 2,
    "name": "Steve"
  }
]
I think the easiest way to check if the ID already exists will be to read the JSON in as an array and check if the ID has already been assigned. Something like this:
var json = require('/path/to/data.json'); // Your object array

// In the if (operation == 'insert') block
var hasId = json.some(function (obj) {
  return obj.id === POST.id;
});

if (hasId) {
  // Do something with duplicate
}

json.push({
  id: POST.id,
  name: POST.name
});
// Write the whole JSON array back to file
More on the Array.prototype.some function.
So basically you will be keeping the whole JSON file in memory (in the json array) and when you make a change to the array, you write the whole updated list back to file.
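A minimal sketch of that write-back step (assuming the json array and POST object from the code above):

var fs = require('fs');

// After changing the in-memory array, serialize the whole list
// and replace the file contents.
fs.writeFile("data.json", JSON.stringify(json, null, 2), "utf8", function (err) {
  if (err) console.log(err);
});

// 'delete' works the same way on the in-memory array first, e.g.:
// json = json.filter(function (obj) { return obj.id !== POST.id; });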
You may run into problems with this if that array gets very large, but at that point I would recommend looking into using a database. Mongo (albeit not the greatest DB in the world) is fairly easy to get set up and to integrate with JavaScript while playing and experimenting.
I hope this helps!
Good luck
So, I'm experimenting with AngularJS and, as an exercise, figured I would make a simple application using the Steam API. I have made a simple Spring Boot REST service that acts as a reverse proxy for the Steam API, so that certain calls can be forwarded. At this time there are two actions:
/user/ provides a list of Steam IDs.
/user/:id/games provides the output of the following API:
http://api.steampowered.com/IPlayerService/GetOwnedGames/v0001/?key=MY_STEAM_KEY&steamid=STEAM_ID&format=json
which returns an answer in the following format:
{
  "response": {
    "game_count": 3,
    "games": [
      {
        "appid": 70000,
        "playtime_forever": 0
      },
      {
        "appid": 550,
        "playtime_forever": 0
      },
      {
        "appid": 274900,
        "playtime_forever": 0
      }
    ]
  }
}
What I want to achieve is to extract the games array from this JSON object and append it to the correct user, and I want to do this for all users. I have achieved something close to what I want using the $resource object, by defining the following factories:
angular.module("SAM.Resources", [])
  .factory("UserList", ["$resource",
    function ($resource) {
      return $resource('/user');
    }])
  .factory("GamesList", ["$resource",
    function ($resource) {
      return $resource('/user/:id/games', {
        id: '#id'
      });
    }
  ]);
And then in my controller I use the following:
UserList.query(function(response){
  $scope.users = response ? response : [];
  for (index = 0; index < $scope.users.length; ++index) {
    user = $scope.users[index];
    $scope.users[index].games = GamesList.get({id: user.id});
  }
});
This is close to what I want, however, it returns something of the format:
{
  "id": "76561198119953061",
  "name": "Yuri",
  "games": {
    "response": {
      "game_count": 3,
      "games": [
        {
          "appid": 70000,
          "playtime_forever": 0
        },
        {
          "appid": 550,
          "playtime_forever": 0
        },
        {
          "appid": 274900,
          "playtime_forever": 0
        }
      ]
    }
  }
}
And I don't want the games.response.games construction. I have tried to change it to:
$scope.users[index].games = GamesList.get({id:user.id}).response.games;
which fails; that seems logical, as it is a promise and doesn't immediately contain the response object.
I've also tried to use something like
GamesList.get({id: user.id}, function(response){
  angular.extend(user, response);
});
Which does indeed append the response to the user object, only the user object is always the last value in the array by the time the promise resolves.
So basically my question comes down to: How can I extend my User object with the Games list?
You need to change your code around a bit:
UserList.query(function(response){
  $scope.users = response ? response : [];
  for (index = 0; index < $scope.users.length; ++index) {
    user = $scope.users[index];
    (function(index, id){
      GamesList.get({id: id}, function(response){ // Get the games from the response
        $scope.users[index].games = response.response.games;
      });
    })(index, user.id);
  }
});
In the for loop, user keeps changing value. By the time the first GamesList.get call returns a value, your loop will already be at the last user.
Wrapping that in an IIFE gives those variables their own scope per iteration.
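As an aside, a hedged alternative sketch: $resource results also expose a $promise, so with forEach scoping the unwrapping could read like this (this mirrors the code above and is not from the original answer):

$scope.users.forEach(function (user) {
  GamesList.get({ id: user.id }).$promise.then(function (response) {
    user.games = response.response.games; // unwrap the nested response
  });
});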
for (index = 0; index < $scope.users.length; ++index) {
  user = $scope.users[index];
  $scope.users[index].games = GamesList.get({id: user.id}, function(response){
    angular.extend(user, response);
  });
}
When you do that, the user variable will change at every step. But the anonymous callback will be executed later. So only the last user is used.
You can fix that by using an anonymous function as a scope, with forEach:
$scope.users.forEach(function(user) {
  user.games = GamesList.get({id: user.id}, function(response){
    angular.extend(user, response);
  });
});
If you want to avoid the user.games.response.games, you need to merge the objects in a different way.
$scope.users.forEach(function(user) {
  GamesList.get({id: user.id}, function(response){
    user.games = response.response.games;
    user.game_count = response.response.game_count;
  });
});
I receive a jsonObject and want to perform a MongoDB update:
The jsonObject: {"tablename":"1","inventar":[{"ean":"802.6180.222"},{"ean":"657.7412.878"}]}
The existing document (shortened):
"tablename": "1",
"accepted": false,
"inventar": [
{
"ean": "802.6180.222",
"accepted": "0"
},
{
"ean": "657.7412.878",
"accepted": "0"
}
],
I need to set the accepted value to "1" for each object in the array (each "ean" that appears in the inventar array of the jsonObject).
The Code:
app.post('/in_accept', function(request, response){
  var jsonString = request.body.json;
  var jsonObj = JSON.parse(jsonString);
  var InUser = jsonObj.in_user;
  var InDate = jsonObj.in_date;
  var inventar = jsonObj.inventar; // is an Array
  var tablename = jsonObj.tablename;
  console.log(inventar);

  var query = {"tablename": tablename};
  var update = {"accepted": true, CODE FOR UPDATING INVENTAR};
  var options = {"upsert": false, "new": true};

  Move.findOneAndUpdate(query, update, options,
    function(err, Move) {
      console.log( Move );
    });

  response.json({"success": true});
});
I know that MongoDB provides the $each operator, but I'm stuck on the overall syntax.
For each "ean", the accepted value should be set to "1".
Thanks
The only real "sane" way to do this, aside from retrieving the object via .findOne() or a variant, making the modifications in code, and calling .save() (which is not considered "sane", as the concurrency issues just make that approach "mental"), is to perform "multiple" updates, essentially one for each array member you want to change.
Your "best" approach is to gig into the core driver right now and get access to the Bulk Operations API methods:
var input = { "tablename": "1", "inventar": [{ "ean": "802.6180.222" }, { "ean": "657.7412.878" }] },
    bulk = Move.collection.initializeOrderedBulkOp();

// Build the statements
input.inventar.forEach(function(inventar) {
  bulk.find({
    "tablename": input.tablename,
    "inventar.ean": inventar.ean
  }).updateOne({
    "$set": { "inventar.$.accepted": 1 }
  });
});

// Then execute
bulk.execute(function(err, result) {
  if (!err) {
    response.json({ "success": true });
  } else {
    // handle error
  }
});
That makes sure that both requests are sent to the server at the same time, in a single request with a single response.
Each "query" from the .find() matches an element in the array and returns its "index" value via the positional $ operator, which is used in the "update" portion of the method to $set the value at the matched index position.
I think I may have an odd use case here. I've got a Code model with code, title, description attributes. Users are documenting work (healthcare), they enter the code, say 7, and 7 always means that something particular happened, say "The patient was cured." Whatever, doesn't matter. Point is, I don't want to bother saving the title and description in every model, but I want to be able to pull them for displaying.
So the API delivers an array of codes like [ 1, 13, "A4" ]. I'm trying to use both can.Model.parseModel and can.Map.define to coerce that array into Code models, but I'm having a hard time.
Why are parseModel and parseModels never called in this example? fiddle
Code = can.Model.extend({
  parseModel: function(data) {
    // return { code: data }
    console.log('Never hit!');
  },
  parseModels: function() {
    // ...
    console.log('Never hit!');
  }
}, {
  _title: can.compute(function() {
    // return title from cached lookup
  })
});
Model = can.Model.extend({
  findAll: 'GET /Models'
}, {
  define: {
    Codes: {
      Type: Code.List
    }
  }
});

can.fixture('GET /Models', function() {
  return [
    { Codes: [1,2,3] }, // I want to turn each number into an object
    { Codes: [4,5,6] },
    { Codes: [7,8,9] }
  ];
});

Model.findAll({});
.parseModels is only called during retrieval of CRUD service data.
To make your example work, you have to make a Model.parseModels convert each Codes array into an array of objects, along these lines:
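A rough sketch of that (the {value: num} shape is an assumption, matching the alternative below):

Model = can.Model.extend({
  findAll: 'GET /Models',
  parseModels: function (data) {
    // Convert each raw Codes array (e.g. [1,2,3]) into objects so
    // that Code.List receives maps rather than bare numbers.
    return data.map(function (item) {
      item.Codes = item.Codes.map(function (num) {
        return { value: num };
      });
      return item;
    });
  }
}, {
  define: {
    Codes: {
      Type: Code.List
    }
  }
});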
Alternatively, you could change Model's define.Codes.Type to something like:
Codes: {
  type: function(newVal){
    if (newVal instanceof Code.List) {
      return newVal;
    } else {
      return new Code.List( newVal.map(function(num){ return {value: num}; }) );
    }
  }
}
In the products collection, I have an array of recentviews which has 2 fields, viewedby & vieweddate.
In a scenario where I already have a record with that viewedby, I need to update it. For example, if I have an array like this:
"recentviews" : [
{
"viewedby" : "abc",
"vieweddate" : ISODate("2014-05-08T04:12:47.907Z")
}
]
And the user is abc, so I need to update the above, and if there is no record for abc I have to $push.
I have tried $set as follows:
db.products.update(
  { _id: ObjectId("536c55bf9c8fb24c21000095") },
  { $set: {
      "recentviews": {
        viewedby: 'abc',
        vieweddate: ISODate("2014-05-09T04:12:47.907Z")
      }
    }
  }
)
The above query erases all the other elements in the array.
Doing what you describe is actually not a singular operation, but I'll walk through the parts required in order to do this, and otherwise cover other possible situations.
What you are looking for is in part the positional $ operator. You need part of your query to also "find" the element of the array you want.
db.products.update(
  {
    "_id": ObjectId("536c55bf9c8fb24c21000095"),
    "recentviews.viewedby": "abc"
  },
  {
    "$set": {
      "recentviews.$.vieweddate": ISODate("2014-05-09T04:12:47.907Z")
    }
  }
)
So the $ stands for the matched position in the array so the update portion knows which item in the array to update. You can access individual fields of the document in the array or just specify the whole document to update at that position.
db.products.update(
  {
    "_id": ObjectId("536c55bf9c8fb24c21000095"),
    "recentviews.viewedby": "abc"
  },
  {
    "$set": {
      "recentviews.$": {
        "viewedby": "abc",
        "vieweddate": ISODate("2014-05-09T04:12:47.907Z")
      }
    }
  }
)
If the fields do not in fact change and you just want to insert a new array element if the exact same one does not exist, then you can use $addToSet
db.products.update(
  {
    "_id": ObjectId("536c55bf9c8fb24c21000095"),
    "recentviews.viewedby": "abc"
  },
  {
    $addToSet: {
      "recentviews": {
        "viewedby": "abc",
        "vieweddate": ISODate("2014-05-09T04:12:47.907Z")
      }
    }
  }
)
However, if you are just looking to "push" onto the array by a singular key value when that key does not exist, then you need some more manual handling: first see whether the element in the array exists, and then issue the $push statement where it does not.
You get some help from the mongoose methods in doing this by tracking the number of documents affected by the update:
Product.update(
  {
    "_id": ObjectId("536c55bf9c8fb24c21000095"),
    "recentviews.viewedby": "abc"
  },
  {
    "$set": {
      "recentviews.$": {
        "viewedby": "abc",
        "vieweddate": ISODate("2014-05-09T04:12:47.907Z")
      }
    }
  },
  function(err, numAffected) {
    if (numAffected == 0) {
      // Document not updated so you can push onto the array
      Product.update(
        {
          "_id": ObjectId("536c55bf9c8fb24c21000095")
        },
        {
          "$push": {
            "recentviews": {
              "viewedby": "abc",
              "vieweddate": ISODate("2014-05-09T04:12:47.907Z")
            }
          }
        },
        function(err, numAffected) {
        }
      );
    }
  }
);
The only word of caution here is that there is a bit of an implementation change in the writeConcern messages from MongoDB 2.6 compared to earlier versions. Being unsure right now how the mongoose API actually implements the return of the numAffected argument in the callback, the difference could matter.
In prior versions, even if the data you sent in the initial update exactly matched an existing element and no real change was required, the "modified" amount would still be returned as 1, even though nothing was actually updated.
From MongoDB 2.6 the write concern response contains two parts: one part shows the modified count and the other shows the match count. So while a match would be returned by the query portion matching an existing element, the actual modified document count would be 0 if no change was in fact required.
So depending on how the return number is actually implemented in mongoose, it might actually be safer to use the $addToSet operator on that inner update, to make sure the zero affected documents did not simply mean that the exact element already existed.
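A hedged sketch of that safer variant, reusing the values from above (only the operator in the fallback update changes):

Product.update(
  { "_id": ObjectId("536c55bf9c8fb24c21000095") },
  {
    "$addToSet": {
      "recentviews": {
        "viewedby": "abc",
        "vieweddate": ISODate("2014-05-09T04:12:47.907Z")
      }
    }
  },
  function(err, numAffected) {
    // the element is only added when the exact same
    // sub-document is not already present in the array
  }
);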