AngularJS convert Date to getTime() before sending to server - javascript

I have a form that uses <input type="datetime-local" ng-model="course.endDate" .. and sets a variable on the model.
Before sending the date to the server I have to convert the date 2015-04-04T22:00:00.000Z to an integer, as returned by getTime().
In the controller I added this: course.endDate = course.endDate.getTime(); it works for the server side, but Angular complains in the console with the error below. (As said, it works, but I would like to avoid the errors.)
Error: [ngModel:datefmt] Expected `1325458800000` to be a date
http://errors.angularjs.org/1.3.15/ngModel/datefmt?p0=1325458800000
at REGEX_STRING_REGEXP (angular.js:63)
at Array.<anonymous> (angular.js:19938)
at Object.ngModelWatch (angular.js:23419)
at Scope.$get.Scope.$digest (angular.js:14300)
at Scope.$get.Scope.$apply (angular.js:14571)
at done (angular.js:9698)
at completeRequest (angular.js:9888)
at XMLHttpRequest.requestLoaded (angular.js:9829)
How can I solve this, then?
I had the idea of adding an extra field that is used only in the form (formEndDate) and converting it into another one (endDate = formEndDate.getTime()) for the server side, but then the server refuses the call since the parameter formEndDate is not allowed, and if I remove formEndDate everything breaks.
Additional problem:
When I fetch data from the server I get an integer that needs to be converted into a date before it can be used in the form, so I also have to convert the value before allowing an edit. How can I do this? (The fetched data are in an array, so it would be terrific to do the conversion without having to iterate over the whole array.)
Solution
Thanks to the two answers (I accepted the first that came in) I also (somehow) solved the problem of the form when editing. I did so by creating an extra field and using it for the form when editing (I do inline editing).
I created a gist here

Before sending the data to the server, make a copy and set endDate. Then send the copy to the server:
var courseCopy = angular.copy(course);
courseCopy.endDate = courseCopy.endDate.getTime();
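For the data coming back from the server (the additional problem above), a minimal sketch; the /courses endpoint is made up here, and it does iterate once over the array, something a transformResponse (see the last answer below) could centralize instead:
$http.get('/courses').then(function (response) {
  $scope.courses = response.data.map(function (course) {
    // ms timestamp from the server -> Date object for the datetime-local input
    course.endDate = new Date(course.endDate);
    return course;
  });
});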

In Angular it's good to keep all form data under one property, like:
$scope.formData = {endDate : 'xxx', ...};
Before sending data to the server, you need to create a copy of formData:
var formDataCopy = angular.copy($scope.formData);
Then you can apply any transformation operations to that copy. Any changes will not affect the data in your scope.
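A short sketch of that flow (the /courses endpoint and the endDate field are made up for illustration):
var formDataCopy = angular.copy($scope.formData);
formDataCopy.endDate = formDataCopy.endDate.getTime(); // Date -> ms timestamp for the server
$http.post('/courses', formDataCopy);                  // $scope.formData stays untouched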

You can use transformRequest to transform the request's body, like so (taken from the official docs):
function appendTransform(defaults, transform) {
  // We can't guarantee that the default transformation is an array
  defaults = angular.isArray(defaults) ? defaults : [defaults];
  // Append the new transformation to the defaults
  return defaults.concat(transform);
}

$http({
  url: '...',
  method: 'GET',
  transformRequest: appendTransform($http.defaults.transformRequest, function(value) {
    // transform the payload here
    return value;
  }),
  transformResponse: appendTransform($http.defaults.transformResponse, function(value) {
    // transform the response here
    return value;
  })
});
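Applied to the question, a hedged sketch of what the request transform could look like; here the custom function is prepended rather than appended, so it runs on the JavaScript object before the default transform serializes it to JSON (the /courses URL and the endDate field are assumptions):
$http({
  url: '/courses',
  method: 'POST',
  data: course,
  transformRequest: [function (data) {
    var copy = angular.copy(data);
    if (copy.endDate instanceof Date) {
      copy.endDate = copy.endDate.getTime(); // Date -> ms timestamp
    }
    return copy;
  }].concat($http.defaults.transformRequest)
});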

Related

passing non-url encoded parameters to an ajax call in node.js

I am trying to pass parameters from my website to a couchdb server through a node.js server.
I absolutely need to pass {} in a url. Not a string, not an empty object, the actual {} characters. It is used to define the end_key parameter in couchdb views.
At the moment, my call goes like this:
let url = "/trades";
let ajax_options = {
  data: {
    design_name: 'bla',
    view_name: 'blabla',
    params_view: {
      group_level: 2,
      start_key: ["1", 0],
      end_key: ["1", {}]
    }
  }
};
$.ajax(url,ajax_options).then((res) => { ... });
When it passes through Node.js and the nano library with
db.view(req.query.design_name, req.query.view_name, req.query.params_view)
the end_key object in params_view becomes ["1"] instead of ["1",{}], which is what I would like to see.
I have verified that, with the correct value for end_key, the view gives me the expected result.
How can I prevent this behavior?
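One possible workaround (a sketch, not a confirmed fix) is to serialize the view parameters yourself so the empty object survives the query-string encoding, and parse them back on the Node side before handing them to nano:
let ajax_options = {
  data: {
    design_name: 'bla',
    view_name: 'blabla',
    // send the CouchDB view parameters as one JSON string
    params_view: JSON.stringify({
      group_level: 2,
      start_key: ["1", 0],
      end_key: ["1", {}]
    })
  }
};
// and in the Node handler:
db.view(req.query.design_name, req.query.view_name, JSON.parse(req.query.params_view));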

API Connect - 500 error when including basic Javascript

I'm trying some basic API Connect tutorials on IBM's platform (running locally using loopback) and have got completely stuck at an early point.
I've built a basic API service with some in-memory data and setter / getter functions. I've then built a separate API which takes two GET parameters and uses one of my getter functions to perform a search based on two criteria. When I run it, I successfully get a response with the following JSON object:
[{"itemId":1,"charge":9,"itemSize":2,"id":2}]
I've then tried to add a piece of server logic that modifies the response data - at this point, I'm just trying to add an extra field. I've added a Javascript component in the Assemble view and included the following code (taken from a tutorial), which I thought should modify the message body returned by the API while still passing it through:
//APIC: get the payload
var json = apim.getvariable('message.body');
//console.error("json %s", JSON.stringify(json));
//same: code to inject new attribute
json.platform = 'Powered by IBM API Connect';
//APIC: set the payload
//message.body = json;
apim.setvariable('message.body', json);
Instead of getting an extra JSON parameter ("platform"), all I get is a 500 error when I call the service. I'm guessing that I'm doing something fundamentally wrong, but all the docs suggest these are the right variable names to use.
You can't access json.platform directly, because at that point the json variable is of JSON (string) type. Are you sure you can add a property to a value of that type? What if you first parse the json variable into a normal object, then add the new property, and finally stringify it again before assigning it to the body?
var json = JSON.parse(apim.getvariable('message.body')); //convert to normal object
json.platform = 'Powered by IBM API Connect'; //add new property
apim.setvariable('message.body', JSON.stringify(json)); //convert to json again before setting as body value
You need to read the context in a specific format and put your logic inside that function. For example, if your message is JSON you need to do:
apim.readInputAsJSON(function (error, json) {
  if (error) {
    // handle error
    apim.error('MyError', 500, 'Internal Error', 'Some error message');
  } else {
    //APIC: get the payload
    var json = apim.getvariable('message.body');
    //console.error("json %s", JSON.stringify(json));
    if (json) {
      //same: code to inject new attribute
      json.platform = 'Powered by IBM API Connect';
      //APIC: set the payload
      //message.body = json;
      apim.setvariable('message.body', json);
    }
  }
});
Reference:
IBM Reference
If message.body is empty, put an invoke/proxy policy before your gateway/javascript policy, for example.

'An undeclared property' when trying to create record via Web API

I am getting an error which I just cannot seem to debug. I am trying to create a custom activity entity via a custom HTML/JavaScript web resource.
The user clicks a button and the following params:
var params = {
  'rob_faqid#odata.bind': '/rob_faqs(guid-here)',
  'rob_source': 180840000,
  'subject': 'Signpost',
  'actualstart': new Date(),
  'actualend': new Date()
};
Are passed to this URL:
https://dynamicsorg/api/data/v8.2/rob_quickactions/
With the following headers:
xhr.setRequestHeader('OData-MaxVersion', '4.0');
xhr.setRequestHeader('OData-Version', '4.0');
xhr.setRequestHeader('Accept', 'application/json');
xhr.setRequestHeader('Content-Type', 'application/json; charset=utf-8');
xhr.setRequestHeader('Prefer', 'return=representation');
This gives me an HTTP code of 400 (bad request) and this error message:
An undeclared property 'rob_faqid' which only has property annotations in the payload but no property value was found in the payload. In OData, only declared navigation properties and declared named streams can be represented as properties without values.
Interestingly, I get this error whether I use an actual GUID or if I put some gibberish in there (suggesting it is not to do with the value being passed in).
I can create the records manually via the standard form.
I am using the odata.bind elsewhere within the same project with no errors.
After a good night's sleep I realised my error. To set the value of a lookup field, you need to use the relationship schema name, not the property name.
Once I changed that, all worked fine.
When you want to set the value of a lookup field during the creation or update of a (new) record via the web API, you have to use either the Schema Name or the Logical Name of the lookup followed by the bind annotation.
For default fields like primarycontactid the logical name has to be used (first column in the screenshot).
For custom fields like rob_FaqId the schema name has to be used (second column in the screenshot).
var params = {
  'rob_FaqId#odata.bind': '/rob_faqs(guid-here)',
  'rob_source': 180840000,
  'subject': 'Signpost',
  'actualstart': new Date(),
  'actualend': new Date()
};
Screenshot of a solution > entities > your entity > fields:
So the general structure to create a new record with an already set lookup field via the web API is this:
{
  "logicalorschemaName#odata.bind": "/relatedentitys(guid)" // don't forget the plural 's'
}
Or another example from the official documentation: how to create a new account record and directly assign an already existing contact as the primary contact.
var newAccountRecordObj = {
  "name": "Sample Account",
  "primarycontactid#odata.bind": "/contacts(00000000-0000-0000-0000-000000000001)"
};
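For context, a minimal sketch of how such a create request could be sent with XMLHttpRequest, combining the headers from the question with the corrected payload (the URL is the one from the question; the rest is illustrative):
var xhr = new XMLHttpRequest();
xhr.open('POST', 'https://dynamicsorg/api/data/v8.2/rob_quickactions/', true);
xhr.setRequestHeader('OData-MaxVersion', '4.0');
xhr.setRequestHeader('OData-Version', '4.0');
xhr.setRequestHeader('Accept', 'application/json');
xhr.setRequestHeader('Content-Type', 'application/json; charset=utf-8');
xhr.setRequestHeader('Prefer', 'return=representation');
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 201) {
    // Prefer: return=representation makes the service return the created record
    var created = JSON.parse(xhr.responseText);
  }
};
xhr.send(JSON.stringify(params));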
While the accepted answer is correct in this instance, it doesn't seem to be the whole story. In some cases it's necessary to use <logical name>_<entity name>. For instance, when doing a POST to sharepointdocumentlocations, I had to use:
"regardingobjectid_contact#odata.bind": "/contacts(xxxx)"
"parentsiteorlocation_sharepointdocumentlocation#odata.bind": "/sharepointdocumentlocations(xxx)"
This may be something to do with the fact that those relationships can point to more than one type of entity, but I haven't found any Microsoft documentation about it.

How to parse node js response of mongodb data in angular?

I have an HTTP server in Node [not Express]. On button click I have a GET method, which pulls documents from MongoDB (using Mongoose) and displays them on an Angular page.
On button click:
$http.get('/get').success(function(response){
console.log(response);
//logic to store JSON response of database and perform repeat to display each document returned on UI
});
In the Node code, where the server is created using http.createServer instead of Express:
if (req.url === "/get") {
  res.writeHead(200, {'content-type': 'text/plain'});
  modelName.find({}, 'property1 prop2 prop3', function (err, docs) {
    res.write('response...: ' + docs);
  });
}
Here is my issue:
I'm able to send the response from Node.js to Angular, but how do I parse it? If I don't add 'response...:' before docs, I get the error 'first argument should be a string or buffer'. On the Angular side I get a response like:
response...:{_id:....1, prop1: 'a',prop2: 'b',prop3: 'c'},
{_id:....2, prop1: 'ab',prop2: 'bc',prop3: 'cd'}
I want to display the documents in a tabular format.
I don't know your exact setup, but I think you should transfer application/json instead of text/plain.
You cannot simply concatenate a string to docs; send either just the docs (to transfer an array) or an object such as {'response': docs}, serialized with JSON.stringify before writing it to the response.
Consider moving from $http to a resource service. In your resource service, you need to set isArray to false if you transfer an object, or to true if you transfer an array: https://docs.angularjs.org/api/ngResource/service/$resource
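A minimal sketch of the raw-http handler under those assumptions (application/json content type, JSON.stringify before writing, and res.end to actually finish the response; modelName and the projected fields are taken from the question):
if (req.url === "/get") {
  modelName.find({}, 'property1 prop2 prop3', function (err, docs) {
    if (err) {
      res.writeHead(500, {'Content-Type': 'application/json'});
      res.end(JSON.stringify({error: 'query failed'}));
      return;
    }
    res.writeHead(200, {'Content-Type': 'application/json'});
    res.end(JSON.stringify(docs)); // the documents go over the wire as a JSON array
  });
}
$http then parses the JSON automatically, so the response data is an array you can hand straight to ng-repeat to build the table.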

SAPUI5 Create OData entity with dates - generates incorrect request payload that ends in CX_SXML_PARSE_ERROR

We are trying to create an entity that has date attributes via an OData service. The backend is an SAP system. This entity has only 3 key attributes plus a bunch of other attributes. We have identified that the dates in the keys are the root cause of the problem.
Keys:
Pernr type string,
Begda type datetime,
Endda type datetime.
The code below (which does not work) has been severely simplified while trying to troubleshoot the issue. At the moment, it reads an entity from an entity set and immediately tries to create one with exactly the same data.
Code:
var oODataModel = new sap.ui.model.odata.ODataModel("/sap/opu/odata/sap/Z_PERSONAL_DATA_SRV/");

//Test entity to be saved
var entity = null;

//Handler for read error
var handleReadE = function (oEvent) {
  alert("error");
};

//Handler for read success
var handleRead = function (oEvent) {
  //Get the data read from backend
  entity = oEvent.results[0];
  //Try to create a new entity with same data
  oODataModel.create('/PersDataSet', entity, null, function () {
    alert("Create successful");
  }, function (oError) {
    alert("Create failed", oError);
  });
};

oODataModel.read("/PersDataSet", null, [], true, handleRead, handleReadE);
In the gateway error log, an XML parsing error appears. In this log we can see the request data, and it shows that the dates are transported as String types. These dates are defined in the service as DateTime, so the request is rejected.
Example:
<m:properties>
<d:Pernr m:type="Edm.String">00000001</d:Pernr>
<d:Endda m:type="Edm.String">9999-12-31T00:00:00</d:Endda>
<d:Begda m:type="Edm.String">1979-05-23T00:00:00</d:Begda>
When the entity is read, the backend does not send any type information. It sends like the following example:
<m:properties>
<d:Pernr>72010459</d:Pernr>
<d:Endda>9999-12-31T00:00:00</d:Endda>
<d:Begda>1876-07-21T00:00:00</d:Begda>
And, indeed, if we try to save the same info without the type="..." attributes, it works. So the problem is the incorrect types ODataModel.create adds to the XML.
My questions are:
Can I tell ODataModel.create not to add this type info? It is not doing a good job of inferring the types.
Can anyone share an example reading and writing dates through odata?
Thank you very much in advance.
The data returned from oODataModel.read is raw; before you post it, you need to parse it:
var handleRead = function (oEvent) {
  //Get the data read from backend
  entity = oEvent.results[0];
  var newEntity = jQuery.extend({}, entity);
  delete newEntity.__metadata;
  newEntity.Begda = new Date(entity.Begda);
  newEntity.Endda = new Date(entity.Endda);
  //Try to create a new entity with same data
  oODataModel.create('/PersDataSet', newEntity, null, function () {
    alert("Create successful");
  }, function (oError) {
    alert("Create failed", oError);
  });
};
Why not use JSON instead of XML?
Thanks all for the help.
We got this working by accounting for the following:
The problem of the wrong types appended to the attributes comes from the read itself. The object returned by read has a __metadata attribute which describes the values. In this object the dates are set with type=Edm.String, even when the service says they are DateTime. To me this is a bug of the .read function.
When the same object is then used to save, create sees the __metadata on the entry and uses those values, producing type Edm.String for the dates. This caused the request to be rejected. Manually changing these __metadata.properties...type to Edm.DateTime makes it work.
In the end, we did the following:
Dates are parsed manually from the OData response, creating a JS Date object from the strings in format "yyyy-mm-ddT00:00:00", to make them work with control bindings. When we want to save, the reverse is done.
The object to be created is a new object with only the attributes we care about (no __metadata).
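A rough sketch of the conversion described above (the helper names are made up for illustration):
// OData string -> JS Date, for the control bindings
function parseODataDate(value) { // e.g. "1979-05-23T00:00:00"
  return new Date(value);
}

// JS Date -> OData string, before sending the entity back
function formatODataDate(date) {
  function pad(n) { return n < 10 ? '0' + n : '' + n; }
  return date.getFullYear() + '-' + pad(date.getMonth() + 1) + '-' + pad(date.getDate()) + 'T00:00:00';
}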
