I've been trying to see if there's an easier way to manage a dataset by putting it all into a text file rather than keeping it in the JS itself (the text file will be several hundred lines long by the end), but I can't seem to get the array to populate the way I need it to.
In the end, I need an array that'll look like this:
var names = [
  {
    "name": "john",
    "tag": ["tall", "blue eyes", "ginger", "fast"],
  },
  {
    "name": "morgan",
    "tag": ["stout", "blue eyes", "dark"],
  },
  {
    "name": "ryan",
    "tag": ["average", "brown eyes", "fast", "strong", "perceptive"]
  }
]
Populated with all the names and tags from a text file formatted like this (or something like this, if there's a format that'll work better):
john: tall ,blue eyes, ginger, fast
morgan: stout, blue eyes, dark
ryan: average, brown eyes, fast, strong, perceptive
Here's where I've gotten thus far, from searching around here and elsewhere. I'm mostly struggling with the array of tags: currently it comes out as a string, and I'm not sure how to break it down.
const { readFile, promises: fsPromises } = require('fs');

readFile('NAMES.txt', 'utf-8', (err, data) => {
  if (err) throw err;
  var name = data.split(/\r?\n/), result = [], anotherarray = [];
  name.forEach((pair) => {
    if (pair !== '') {
      let splitpair = pair.split(': ');
      let key = splitpair[0].charAt(0).toLowerCase() + splitpair[0].slice(1);
      result[key] = splitpair[1];
    }
  });
  for (var i in result) anotherarray.push({ "name": i, "tag": result[i] });
  console.log(anotherarray);
});
Any help or pointing in the right direction would be much appreciated!
You could use the readline node module to read the file line by line and process it.
I think the simplest way to process each line is in two steps:
Split the name from the tags on `:`.
Split the tags on `,` and trim the spaces (trimming after splitting lets you handle entries with extra spaces, like in your sample file).
NOTE: I also lowercase the values, to match your example.
Example:
const readline = require("readline");
const fs = require("fs");

// Create the interface to read the file line by line
const file = readline.createInterface({
  input: fs.createReadStream('text.txt'),
});

// answer
const names = [];

// Process file line by line
file.addListener("line", (line) => {
  if (line.trim() === "") return; // skip empty lines
  // Split name and tags
  const [name, tags] = line.split(":");
  // Insert the name with parsed tags
  names.push({
    name,
    tag: tags.split(",").map((e) => e.trim().toLowerCase()),
  });
});

// Log answer
file.addListener("close", () => {
  console.log(names);
});
The output:
[
{ name: 'john', tag: [ 'tall', 'blue eyes', 'ginger', 'fast' ] },
{ name: 'morgan', tag: [ 'stout', 'blue eyes', 'dark' ] },
{
name: 'ryan',
tag: [ 'average', 'brown eyes', 'fast', 'strong', 'perceptive' ]
}
]
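If you'd rather keep the readFile approach from the question, the same two-step parsing works on the whole file contents at once. A minimal sketch, reusing your NAMES.txt and your blank-line check:
const { readFile } = require("fs");

readFile("NAMES.txt", "utf-8", (err, data) => {
  if (err) throw err;
  const names = data
    .split(/\r?\n/)
    .filter((line) => line.trim() !== "") // skip blank lines, like the original check
    .map((line) => {
      const [name, tags] = line.split(":");
      return {
        name: name.trim().toLowerCase(),
        tag: tags.split(",").map((t) => t.trim().toLowerCase()),
      };
    });
  console.log(names);
});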
I have a JavaScript object of the following form:
const data = {
  title: "short string",
  descriptions: [
    "Really long string...",
    "Really long string..."
  ]
}
The long strings need to be excluded from the indexes and, for whatever reason, I can't figure out what format the object needs to be in to save it:
const entityToSave = dataToDatastore(data);
datastore.save({
  key: datastore.key(["TestEntity"]),
  data: entityToSave
})
I simply need to know what entityToSave should look like. I've tried about twenty different shapes, and every attempt that uses excludeFromIndexes has either thrown an error saying the string was too large or ended up as an Entity type instead of an Array type.
I can get it to work via the GCP admin interface so I feel like I'm going crazy.
Edit: For convenience I am including a script that should run as long as you (1) set the PROJECT_ID and (2) add an appropriately long string to the descriptions array.
const { Datastore } = require("@google-cloud/datastore");

const PROJECT_ID = null;

const data = {
  title: "short string",
  descriptions: [
    "Really long string...",
    "Really long string...",
  ]
}

const entityToSave = dataToDatastore(data);

async function save() {
  const datastore = new Datastore({
    projectId: PROJECT_ID,
  });
  console.log(entityToSave);
  const entity = {
    key: datastore.key(["TestEntity"]),
    data: entityToSave
  };
  await datastore.save(entity);
}

function dataToDatastore(data) {
  return data
}

save();
I simply need to know what dataToDatastore should look like. I've already tried numerous variations based on documentation and discussions from four or five different places and not one has worked.
You have to apply `excludeFromIndexes: true` to every entry in the array that crosses the 1500-byte limit.
const data = {
  title: "short string",
  descriptions: [
    "Really long string...", // > 1500 bytes
    "Really long string...", // > 1500 bytes
  ]
}
Here is how entityToSave should look:
const entityToSave = data.descriptions.map(description => ({
  value: description,
  excludeFromIndexes: true
}));
console.log(entityToSave);
// this will transform data.descriptions to
// [
//   { value: 'Really long string...', excludeFromIndexes: true },
//   { value: 'Really long string...', excludeFromIndexes: true }
// ]
This applies `excludeFromIndexes: true` to every value in the array.
// Then save the entity
datastore.save({
  key: datastore.key(["TestEntity"]),
  data: entityToSave
})
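Note that this entityToSave contains only the descriptions. If you also want to keep the title property from the question's data, a sketch of the same idea (assuming the nested value/excludeFromIndexes form works the same way inside a named property):
const entityToSave = {
  title: data.title, // short string, safe to leave indexed
  descriptions: data.descriptions.map((description) => ({
    value: description,
    excludeFromIndexes: true, // keep the long strings out of the indexes
  })),
};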
For more information, check this GitHub issue and Stack Overflow thread.
I'm getting a JavaScript object array. I'm trying to write it to a JSON file by replacing certain parts of the stringified output. I don't want commas between the objects, and I need to insert a new JSON object before every existing object.
The current JS object is:
[{name:"abc", age:22},
{name:"xyz", age:32},
{name:"lmn", age:23}]
The expected JSON output file is:
{"index":{}}
{name:"abc", age:22}
{"index":{}}
{name:"xyz", age:32}
{"index":{}}
{name:"lmn", age:23}
My code is:
sourceData = data.map((key) => key['_source']);
const strData = JSON.stringify(sourceData);
const newStr = strData.replace("},{", "}\n{");
const newStr2 = newStr.replace(`{"name"`, `{"index" : {}}\n{"name"`);
fs.writeFile('data.json', newStr2, (err) => {
  if (err) {
    throw err;
  }
  console.log("JSON data is saved.");
});
But in my output only the first object is changing. I need this to happen across the entire output JSON file. Please help. Thanks!
(Your code only changes the first object because String.prototype.replace with a string pattern replaces just the first match.) I'd insert dummy objects first and then map stringify over the result:
const array = [
  {name: "abc", age: 22},
  {name: "xyz", age: 32},
  {name: "lmn", age: 23}
];

const result = array
  .flatMap(item => [{index: {}}, item])
  .map(x => JSON.stringify(x))
  .join('\n');

console.log(result);
// fs.writeFileSync('path/to/file', result)
Note that the result isn't JSON (but it looks like valid JSON-Lines).
I wouldn't try to do string manipulation on the result of JSON.stringify. JSON is too complex for manipulation with basic regular expressions.
Instead, since you're outputting the contents of the array, handle each array entry separately:
const ws = fs.createWriteStream("data.json");
sourceData = data.map((key) => key['_source']);
for (const entry of sourceData) {
  const strData = JSON.stringify(entry);
  ws.write(`{"index":{}}\n${strData}\n`);
}
ws.end();
console.log("JSON data is saved.");
Live Example (obviously not writing to a file):
const sourceData = [
  {name:"abc", age:22},
  {name:"xyz", age:32},
  {name:"lmn", age:23}
];
for (const entry of sourceData) {
  const strData = JSON.stringify(entry);
  console.log(`{"index":{}}\n${strData}`); // Left off the trailing \n because console.log breaks up the lines
}
console.log("JSON data is saved.");
Here is a sample of the .log file I need to convert. I am using Node.
{"test": "data", "test1": 123, "foo": "feel me??"}
{"test": "data", "test1": 123, "foo": "feel me??"}
{"test": "data", "test1": 123, "foo": "feel me??"}
I am importing it by using this code.
let data = fs.readFileSync(log_path, 'utf8', function(err, data){
  if (err) throw err;
  let tweets = data.split('\n').map(line => JSON.parse(line));
  return tweets;
  fs.close(data, (err) => {
    console.log(err);
  })
})
As you can see, it's not separated by commas, so it's not in JSON format. I'm trying to read the file and then split it by newline, but that doesn't seem to be working.
Assuming "feel me??" is meant to be a property value, you could split up the lines and then map them to an array of objects:
const text = `{"test": "data", "test1": 123, "foo": "feel me??"}
{"test": "data", "test1": 123, "foo": "feel me??"}
{"test": "data", "test1": 123, "foo": "feel me??"}`;

const arrOfObjs = text.split('\n')
  .map(line => JSON.parse(line));

console.log(arrOfObjs);
The other problem is that readFileSync, as its name implies, reads the file synchronously. It doesn't accept a callback like that. Change your file-reading code to:
let data = fs.readFileSync(log_path, 'utf8');
// do stuff with the `data` string
Remember that readFileSync opens and closes the file for you, so you don't need fs.close at all.
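Putting it together, a minimal sketch (assuming log_path points at the .log file and that it may end with a blank line):
const fs = require("fs");

const data = fs.readFileSync(log_path, "utf8");
const tweets = data
  .split("\n")
  .filter((line) => line.trim() !== "") // guard against a trailing blank line
  .map((line) => JSON.parse(line));
console.log(tweets);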
Personally, I used the read-last-lines package to get only the last few lines of my log file.
To make it work I had to slightly modify the code from the accepted answer, which was a great help to me. I'm adding it here in case someone is struggling with a similar issue.
const readLastLines = require('read-last-lines');

readLastLines.read('info.log', 10)
  .then((lines) => {
    const arrOfStringObjs = lines.split('\n');
    let arrOfObjs = [];
    arrOfStringObjs.forEach((strObj) => {
      if (strObj !== undefined && strObj !== null && strObj !== '') {
        arrOfObjs.push(JSON.parse(strObj));
      }
    });
    console.log(arrOfObjs);
  });
Hope it helps.
There are some objects in an array like this:
const result = [
  {
    "_id": "Dn59y87PGhkJXpaiZ",
    "title": "Something",
    "synonyms": [ "Anything", "else" ]
  },
  { ... }
]
I get this result by performing this query:
Content.find({
  $or: [
    { title: { $regex: new RegExp(term, 'i') } },
    { synonyms: { $regex: new RegExp(term, 'i') } }
  ]
}).toArray()
As you can see, I'm searching for title (string) or synonym (array) elements by a given search term.
So searching for some or any will give me the first document as my result.
In my component I do the output of the data like this:
render () {
  return result.map((link, index) => {
    return <Dropdown.Item
      text={link.title}
      key={index}
    />
  })
}
But right now I get the output Something for the dropdown item when I search for the term any. For the user this doesn't make sense.
Of course any should give me the output Anything, and some should give me the output Something.
In this example you could also search for thing, and I would expect two output elements (from one single result document): Anything and Something.
I'm not quite sure how to modify the code to get this result. I think the best place to modify is the React component (the output), not the server request result.
You could issue two separate queries on the server side to keep track of which part of each document was matched. This ensures that a document that matches only via synonyms doesn't contribute its title to the results.
const matchingTitles = Content.find(
  { title: { $regex: new RegExp(term, 'i') } },
  { title: 1 }
).toArray().map(x => x.title);

const matchingSynonyms = Content.find(
  { synonyms: { $regex: new RegExp(term, 'i') } },
  { synonyms: 1 }
).toArray().map(x => x.synonyms).reduce((x, y) => x.concat(y), []);

return Array.from(new Set([...matchingTitles, ...matchingSynonyms]));
I fetch the strings separately using two queries and then take the set union of them.
On the client side you can use these strings directly to display the search results.
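For example, the component's render could map over the returned strings instead of document titles (a sketch, assuming the union array from the server is available as result):
render () {
  return result.map((text, index) => {
    return <Dropdown.Item
      text={text}
      key={index}
    />
  })
}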
I have a Mongo collection with an array field called 'tags'. What I want to do is create a single object that stores all of the various tags with a label and value. The end result should be an object I can use in a Select2 field in a Meteor application to create the results options. I have gotten close, but none of my solutions have worked, and they're all super ugly (read: not functional JavaScript).
Here is a sample document:
{
  "_id": "sjkjladlj",
  "title": "Coldplay is Cool",
  "tags": ["music", "yuppie"]
}
Now the end result I would like is:
[
  {
    value: "music",
    label: "music"
  },
  {
    value: "yuppies",
    label: "yuppies"
  },
  {
    value: "Some tag from another doc",
    label: "Some tag from another doc"
  }
]
Any ideas?
Here is the closest I have gotten.
options: function() {
  tagsArray = [];
  ca = Notes.find({}, {tags: 1}).fetch();
  ca.forEach(function(it) {
    result = it.tags;
    tagsArray.push(result);
  });
  console.log(tagsArray);
  return tagsArray;
}
You can try the aggregation pipeline, like this:
db.collection.aggregate([
  { $project: { _id: 0, tags: 1 } },
  { $unwind: "$tags" },
  { $project: { "value": "$tags", "label": "$tags" } }
])
Update: As soon as I posted, I realized I simply needed to add an inner loop. It's ugly, but it works.
options: function() {
  tagsArray = [];
  ca = Notes.find({}, {tags: 1}).fetch();
  ca.forEach(function(it) {
    result = it.tags;
    result.forEach(function(child) {
      inner = {};
      inner.value = child;
      inner.label = child;
      tagsArray.push(inner);
    });
  });
  console.log(tagsArray);
  return tagsArray;
}
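If you want something closer to functional JavaScript, a tidier sketch of the same loop (assuming Meteor's fields projection option and an environment with Array.prototype.flatMap):
options: function() {
  return Notes.find({}, { fields: { tags: 1 } })
    .fetch()
    .flatMap((doc) => doc.tags || []) // collect every tag across documents
    .map((tag) => ({ value: tag, label: tag }));
}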