I'm storing data in my local Node.js Elasticsearch database. I've inserted records, and one of the fields is a created_at field, which I've stored with new Date(); however, upon retrieval of the data, it seems these are strings, and thus my query doesn't return any results.
My data looks like:
{
"_index": "my_table",
"_type": "_doc",
"_id": "V1mouXcBt3ZsPNCwd3A1",
"_version": 1,
"_seq_no": 0,
"_primary_term": 1,
"found": true,
"_source": {
"properties": {
"created_at": {
"type": "date",
"format": "yyyy-MM-dd HH:mm:ss"
},
"updated_at": {
"type": "date",
"format": "yyyy-MM-dd HH:mm:ss"
}
},
"data": {
"name": "performance_overview",
"friendly_name": "Performance Overview",
"data": {
"cached": "9:39:21am",
"from": "02/19/2021 00:00:00",
"to": "02/19/2021 23:59:59",
"success": true,
"message": "message",
"sessions": "265",
"leads": "123",
"conversion_rate": "46.42"
},
"created_at": "2021-02-19 09:02:16",
"updated_at": "2021-02-19 09:02:16"
}
}
}
I'm using Moment.js and have formatted the dates, and attempted what I believe are accurate field mappings, but I'm new to Elasticsearch and this doesn't return any results either.
const from = moment(from).startOf('day')
const results = await elastic.find('my_table', {
query: {
range: {
created_at: {
gte: moment(from).format('YYYY-MM-DD HH:MM:SS'),
}
}
}
})
elastic.find is a custom function that I've written; it's exported from another JS file and looks like the following:
const find = async (table, data) => {
try {
const found = await client.search({
index: table,
body: data
}, {
ignore: [404],
maxRetries: 3
})
return found.body
} catch (err) {
return err
}
}
Elasticsearch is a JSON-in, JSON-out interface that stores your datetime data as strings of a given format (it could also be numeric (milli)seconds since the epoch, if you choose so).
As to why new Date() was converted to a string: at some point before the network transport, the JS client serializes your document-containing request body by calling JSON.stringify on it. When this function is called on a JS object, the JS runtime looks for an implementation of the .toJSON method on that object and serializes the value returned by that method. You can verify this by running:
const date = new Date();
console.assert(`"${date.toJSON()}"` === JSON.stringify(date))
Now, in your elastic.find('my_table', {...}) call you're attempting two things at once — setting the mapping and querying the index. That's not going to work.
You've got to define your mapping before you ingest any documents. When you do that, make sure your date fields have a proper format to prevent inconsistencies down the line:
{
"properties": {
"created_at": {
"type": "date",
"format": "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"
}
}
}
After that add some documents to your table/index (through the index operation).
Then, and only then, will you be able to use your range query.
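For illustration, here is a minimal sketch of that whole flow, assuming the official @elastic/elasticsearch client (v7.x) talking to a local node; the index name, date format and dates are taken from the question, everything else is illustrative.
const { Client } = require('@elastic/elasticsearch');
const client = new Client({ node: 'http://localhost:9200' });

async function run() {
  // 1. Create the index with the date mapping BEFORE indexing any documents
  await client.indices.create({
    index: 'my_table',
    body: {
      mappings: {
        properties: {
          created_at: { type: 'date', format: 'yyyy-MM-dd HH:mm:ss' }
        }
      }
    }
  });

  // 2. Index a document whose created_at matches that format
  await client.index({
    index: 'my_table',
    body: { name: 'performance_overview', created_at: '2021-02-19 09:02:16' },
    refresh: true // make it searchable right away (fine for a demo, avoid during bulk ingestion)
  });

  // 3. Only now will the range query match
  const { body } = await client.search({
    index: 'my_table',
    body: {
      query: {
        range: {
          created_at: { gte: '2021-02-19 00:00:00' }
        }
      }
    }
  });
  console.log(body.hits.hits);
}

run().catch(console.error);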
Related
I am trying to use the POST method to insert some data from a person with JSON. I am using JS code to construct it, but when I start the transformation, it sends me "ERROR: invalid character ' ' in literal true (expecting 'e')". Does anyone know how to solve it?
const obj = {
"num_matricula": num_matricula,
"limit_date": "2022-05-20",
"admission_date": admission_date,
"cost_center": cost_center,
"pos_number": pos_number,
"role": role,
"department": department,
"pagamento": {
"vinculo": vinculo,
"valor": valor,
"recorrencia": recorrencia,
"contaBancaria": {
"banco": "001",
"carta": "c9160763-db6c-4e8c-a1ad-ad8709c99be2"
}
},
"deficiencia": deficiencia,
"jornada": jornada,
"profile": {
"name": name,
"email": email,
"mobile": mobile
},
"exame": {
"clinica": "6dc84ce4-7d9f-48ec-b9b1-a8a895a21fd4",
"data": "2022-05-15",
"hora": "14:00",
"obs": "Comparecer de manhã",
"guia": "e37dab24-c7a4-4b92-b9d1-32ed538b8300",
},
"docs": ["c9e26093-5e0c-4bd2-bea3-ac5182a6179f"],
"send_sms": true,
"send_email": true
};
const myJSON = JSON.stringify(obj);
Some columns are already provided with data from a previous step (you can see this in the images below); that is why I just repeated the column names in the JS code. Just so you know, the boolean columns are: send_email, send_sms and deficiencia.
The problem is that JSON is plain text. So, taking your first line as an example, this is not valid JSON: "num_matricula": num_matricula
Only numbers (and the literals true, false and null) may appear without double quotes: "num_matricula": 1234
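As a quick illustration (the values below are made up), letting JSON.stringify build the payload puts double quotes around string values automatically, while numbers and booleans stay as bare literals:
const num_matricula = 1234;           // a number may appear unquoted in JSON
const admission_date = "2022-05-01";  // a string must end up in double quotes
const send_sms = true;                // booleans are the bare literals true/false

const myJSON = JSON.stringify({ num_matricula, admission_date, send_sms });
console.log(myJSON); // {"num_matricula":1234,"admission_date":"2022-05-01","send_sms":true}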
Good day, developers. I need help. I have a JSON string that I have gotten from an API in Angular and logged to the console to see it. I want to get a particular piece of data from the JSON and store it in a variable, because I need to query that data again. Below is what my JSON data looks like.
{
"done": true,
"record_id": "5fc0a7ac88d2f8534f8e59d8",
"customer_id": "5fa1541f6bd09b290f736608",
"balance": {
"clientId": "qh8RKp9BGhmKCn8ZAAED",
"status": true,
"balance_callback": "https://webhook.site/92b09e29-f08e-4472-b7e8-5875155360d67",
"data": {
"formatted": [
{
"available_balance": 31500,
"ledger_balance": 32000,
"ref": "saving-account",
"status": "active",
"account": "5fa9e536f6b7bb837cb22byu",
"connected": true
},
{
"available_balance": 11200,
"ledger_balance": 11200,
"ref": "current-account",
"status": "active",
"account": "5fa9e535f6b7bb837cb22buy",
"connected": false
},
{
"available_balance": 2000,
"ledger_balance": 2000,
"ref": "current-account",
"status": "active",
"account": "5fa9e536f6b7bb837cb22bty",
"connected": false
}
]
}
},
"Post": {
"callback_url": "https://webhook.site/92b09e29-f08e-4472-b7e8-5875155360d67"
},
"guarantors": [],
"redirect_url": "",
"launchAgain": true,
"hideExit": "",
"options": {},
"directors": null,
"auth": {
"clientId": "qh8RKp9BGhmKCn8ZAythj",
"status": true,
}
}
I want to get the callback_url under Post in the JSON above. Any ideas on how to do that with any JavaScript method?
Let's say you fetch the data like this:
fetch('urlOfTheApi')
  .then(response => response.json())
  .then(json => {
    console.log(json);
  });
So the solution is
fetch('urlOfTheApi')
  .then(response => response.json())
  .then(json => {
    let callback_url = json.Post.callback_url;
    return fetch(callback_url, { method: 'POST' });
  })
  .then(() => console.log('ok'))
  .catch(console.error);
Based on your JSON, you should create an interface that defines the shape of the JSON object, or at least the particular part of it that you need. If you just need that property, you could create the following interface:
interface Response {
Post: {
callback_url: string;
}
}
In the service where you request the endpoint from your API, declare that the method returns data of that Response type. Then, inside the subscription to that method, you can access the property on the JSON object.
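Here is a minimal sketch of that idea, assuming Angular's HttpClient; the service name and endpoint URL are made up, and the interface is renamed ApiResponse here to avoid clashing with the built-in DOM Response type.
import { HttpClient } from '@angular/common/http';
import { Injectable } from '@angular/core';
import { Observable } from 'rxjs';

interface ApiResponse {
  Post: {
    callback_url: string;
  };
}

@Injectable({ providedIn: 'root' })
export class RecordService {
  constructor(private http: HttpClient) {}

  getRecord(): Observable<ApiResponse> {
    // The generic parameter tells the compiler what shape the JSON has
    return this.http.get<ApiResponse>('/api/record'); // hypothetical endpoint
  }
}

// In the component that needs the value:
// this.recordService.getRecord().subscribe(res => {
//   const callbackUrl = res.Post.callback_url; // the value you want to query again
// });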
My mongoDB collection looks like this:
[
{
"id": "myid",
"field": {
"total": 1,
"subfield": [
{
"time": "2020-08-06T08:33:57.977+0530"
},
{
"time": "2020-05-08T04:13:27.977+0530"
}
]
}
},
{
"id": "myid2",
"field": {
"total": 1,
"subfield": [
{
"time": "2020-07-31T10:15:50.184+0530"
}
]
}
}
]
I need to update all the documents and convert the date string in the time field, inside the subfield array, to the MongoDB ISO date format.
I have thousands of documents and hundreds of objects in the subfield array.
I'm aware of the aggregation operators $toDate and $convert.
But I don't want to use aggregation because, to use $toDate or $convert, I would need to unwind the field.subfield array, which is again an expensive operation.
I want to update my document and save it with the date format.
My MongoDB server version: 4.0.3
I tried the following but it doesn't seem to work and also doesn't return any errors.
db.collection.find().forEach(function(doc) {
doc.field.subfield.time=new ISODate(doc.field.subfield.time);
db.collection.save(doc);
})
You missed a loop over subfield, because it's an array:
db.collection.find().forEach(function(doc) {
doc.field.subfield.forEach(function(r) {
r.time = new ISODate(r.time);
})
db.collection.save(doc);
})
If this is a one-time operation then the time does not matter; I think it will take about the same time whether you do it with aggregation or with forEach.
If you are planning to upgrade your MongoDB version, then from 4.2 you have the option to update with updateMany(), using an update with an aggregation pipeline:
db.collection.updateMany({},
[{
$set: {
"field.subfield": {
$map: {
input: "$field.subfield",
as: "r",
in: {
$mergeObjects: [
"$$r",
{ time: { $toDate: "$$r.time" } }
]
}
}
}
}
}]
)
I am making a db query upon hitting a POST API endpoint. The query needs to update the json column in my networks table, which has only 3 columns (id, name, and json). I specifically need to update the coreEssentials array with another value, so I have been using a set "json" = ? SQL query where I paste in the entire column value with my changes in that specific field, and it works when run manually in the db. The only issue is that I first need to make a SQL call to SELECT the json column for a specific id (long story, but a backend application generates some data into the JSON, including the coreEssentials key/object I need to update, puts it into the data, and only after that can I update it).
I was doing this manually in my Postgresql GUI (DBbeaver) and my query simply looks like this:
update network set "json" = '{
"uid": "randomUid",
"etag": "randomEtag",
"name": "randomNameAgain",
"state": "PENDING",
"Type": "ABC",
"version": 1,
"dealerId": "random_uuid",
"Param": {
"AreaId": 0,
"AreaIdStr": "0.0.0.0",
"DeadInterval": 0,
"HelloInterval": 0
},
"networkData": {
"tic": "311",
"toe": "980",
"tac": "201",
"tac_id": "201",
"timeZone": null
},
"production": false,
"customerName": "random_name",
"IPPool": "0.0.0.0/32",
"customerEmail": "random#email.com",
"coreEssentials": [ ],
"deployment": "A"
}'
coreEssentials starts out as an empty array, but I need to set it to this:
[{
"version": 1,
"component": "purple",
"instanceId": "1"
},
{
"version": 1,
"component": "gray",
"instanceId": "1"
},
{
"version": 1,
"component": "blue",
"instanceId": "1"
} ]
I'm using a Node.js backend with the pg-promise (PostgreSQL) library. Can anyone give me advice on how to do this query?
I just set the coreEssentials array on the data object returned from the first SQL query, like @vitaly-t suggested :D
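For anyone landing here, a minimal sketch of that flow with pg-promise (the table and column names come from the question; the connection string, function name and WHERE condition on id are assumptions):
const pgp = require('pg-promise')();
const db = pgp('postgres://user:pass@localhost:5432/mydb'); // assumed connection string

async function setCoreEssentials(networkId, coreEssentials) {
  // 1. Read the current json value for this network row
  const { json } = await db.one('SELECT "json" FROM network WHERE id = $1', [networkId]);

  // 2. Change only the field we care about
  json.coreEssentials = coreEssentials;

  // 3. Write the whole document back (pg-promise serializes the object to JSON)
  await db.none('UPDATE network SET "json" = $1 WHERE id = $2', [json, networkId]);
}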
The server I'm working with changed the REST format from plain JSON:
{
"removedVertices": [
{
"id": "1",
"info": {
"host": "myhost",
"port": "1111"
},
"name": "Roy",
"type": "Worker"
}
],
"id": "2",
"time": 1481183401573
}
To Jackson format:
{
"removedVertices": [
"java.util.ArrayList",
[
{
"id": "1",
"info": [
"java.util.HashMap",
{
"host": "myhost",
"port": "1111"
}
],
"name": "Roy",
"type": "Worker"
}
],
"id": "2",
"time": 1482392323858
}
How can I parse it the way it was before in Angular/Javascript?
Assuming only arrays are affected, I would use underscore.js and write a recursive function to remove the Jackson type.
function jackson2json(input) {
return _.mapObject(input, function(val, key) {
if (_.isArray(val) && val.length > 1) {
// discard the Jackson type and keep the 2nd element of the array
return val[1];
}
else if (_.isObject(val)) {
// apply the transformation recursively
return jackson2json(val);
}
else {
// keep the value unchanged (i.e. primitive types)
return val;
}
});
}
If the API is supposed to be RESTful, then the server should not return non-plain-JSON results. I think the server side needs to fix that.
I think it is because the server enabled the Polymorphic Type Handling feature.
Read "Jackson Default Typing for object containing a field of Map" and "JacksonPolymorphicDeserialization".
Disable the feature and you will get a result identical to plain JSON.
The main difference I see is that arrays have an additional string element at index 0. If you always get the same structure, you can do it like this:
function jacksonToJson(jackson) {
  // removedVertices is ["java.util.ArrayList", [ ...vertices ]]; keep only the real array
  jackson.removedVertices = jackson.removedVertices[1];
  jackson.removedVertices.forEach((rmVert) => {
    // info is ["java.util.HashMap", { ... }]; keep only the real object
    rmVert.info = rmVert.info[1];
  });
  return jackson;
}