How to convert a javascript object array file to customized json file - javascript

I'm getting a JavaScript object array and trying to write it to a JSON file while replacing certain params. I don't want commas separating the objects, and I need to insert a new JSON object before every existing object.
current js object is
[{name:"abc", age:22},
{name:"xyz", age:32},
{name:"lmn", age:23}]
the expected json output file is,
{"index":{}}
{name:"abc", age:22}
{"index":{}}
{name:"xyz", age:32}
{"index":{}}
{name:"lmn", age:23}
My code is
sourceData = data.map((key) => key['_source'])
const strData = JSON.stringify(sourceData);
const newStr = strData.replace("},{", "}\n{");
const newStr2 = newStr.replace(`{"name"`, `{"index" : {}}\n{"name"`);
fs.writeFile('data.json', newStr2, (err) => {
if (err) {
throw err;
}
console.log("JSON data is saved.");
});
But in my output only the first object is changing. I need this to happen across my entire output JSON file. Please help. Thanks

I'd insert dummy objects first and then map stringify over the result:
const array = [
{name: "abc", age: 22},
{name: "xyz", age: 32},
{name: "lmn", age: 23}]
const result = array
.flatMap(item => [{index: {}}, item])
.map(x => JSON.stringify(x))
.join('\n')
console.log(result)
// fs.writeFileSync('path/to/file', result)

Note that the result isn't JSON (but it looks like valid JSON-Lines).
I wouldn't try to do string manipulation on the result of JSON.stringify. JSON is too complex for manipulation with basic regular expressions.
Instead, since you're outputting the contents of the array, handle each array entry separately:
const ws = fs.createWriteStream("data.json");
const sourceData = data.map((key) => key['_source'])
for (const entry of sourceData) {
const strData = JSON.stringify(entry);
ws.write(`{"index":{}}\n${strData}\n`);
}
ws.end();
console.log("JSON data is saved.");
Live Example (obviously not writing to a file):
const sourceData = [
{name:"abc", age:22},
{name:"xyz", age:32},
{name:"lmn", age:23}
];
for (const entry of sourceData) {
const strData = JSON.stringify(entry);
console.log(`{"index":{}}\n${strData}`); // Left off the trailing \n because console.log breaks up the lines
}
console.log("JSON data is saved.");
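As an aside, the reason only the first object changed in the original code is that String.prototype.replace with a string pattern substitutes only the first match; a global regex (or replaceAll on Node 15+) hits every occurrence, though the per-entry approaches above avoid the issue entirely. A minimal sketch:

```javascript
// replace() with a string pattern substitutes only the first match.
const s = '{"a":1},{"a":2},{"a":3}';
const once = s.replace("},{", "}\n{");   // only the first separator changes
const all = s.replace(/\},\{/g, "}\n{"); // a global regex changes every one
console.log(once); // the second "},{" is still there
console.log(all);  // all separators replaced with newlines
```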

SheetJS specify header order with json_to_sheet

I am using SheetJS in Angular to export JSON as an .xlsx file. For reference, the JSON could be as follows:
[{
"ID": "E111",
"Name": "John",
"LastLogin": "2022-02-12"
},
{
"ID": "E112",
"Name": "Jake",
"Score": 22,
"LastLogin": "2022-02-12"
}]
Note: The keys to the object are unknown, and can vary. The only known keys are ID and LastLogin.
I am using the following function to export
public exportAsExcelFile(json: any[], excelFileName: string): void {
const worksheet: XLSX.WorkSheet = XLSX.utils.json_to_sheet(json);
console.log('worksheet',worksheet);
const workbook: XLSX.WorkBook = { Sheets: { 'data': worksheet }, SheetNames: ['data'] };
const excelBuffer: any = XLSX.write(workbook, { bookType: 'xlsx', type: 'array' });
this.saveAsExcelFile(excelBuffer, excelFileName);
}
private saveAsExcelFile(buffer: any, fileName: string): void {
const data: Blob = new Blob([buffer], {
type: EXCEL_TYPE
});
FileSaver.saveAs(data, fileName + '_export_' + new Date().getTime() + EXCEL_EXTENSION);
}
The resulting Excel file (screenshot omitted) ends up with the new Score column appearing after LastLogin.
I want LastLogin to be the last column no matter the object. Is there a way to achieve this? I am pretty new to this, so any help is appreciated.
The behaviour of SheetJS here is to take the order of the Excel column headers from the first row of data; as new object keys are encountered in later rows, the matching column header is added at the end.
To control this behaviour and get the output formatted the way you want, you can process the input json before calling XLSX.utils.json_to_sheet.
Define this function:
function restructureObjectForSheet(obj) {
// for each object in the array put the keys in a new array
// flatten that array
// there will be duplicate names which can be removed with Set
// turn it back into an array
const uniqKeys = Array.from(new Set(obj.map(o => Object.keys(o)).flat()));
// remove LastLogin from this array
// then put LastLogin at the end of the array
const endKey = "LastLogin";
const rowHeaders = uniqKeys.filter(k => k !== endKey).concat(endKey);
// process the original data into a new array
// first entry will define row headers in Excel sheet
const newData = obj.map(o => {
return rowHeaders.reduce((a, c) => { a[c] = o[c]; return a; }, {});
});
return newData;
}
I've commented the code, but the essential features are:
get an array of all the unique keys across all the objects in your input array (your json variable)
ensure LastLogin is the last element of the array
create one new object per input object; where the original data does not have a property (e.g. Score), the value is undefined
Now, in your exportAsExcelFile method, just make this adjustment before the 1st line:
const newJson = restructureObjectForSheet(json);
const worksheet: XLSX.WorkSheet = XLSX.utils.json_to_sheet(newJson);
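For reference, a quick sanity check of the helper on the sample data above (repeated here so the snippet is self-contained; Array.prototype.flat needs Node 11+):

```javascript
// Same helper as above: collect all unique keys, move LastLogin to the end,
// then rebuild each object with keys in that order.
function restructureObjectForSheet(obj) {
  const uniqKeys = Array.from(new Set(obj.map((o) => Object.keys(o)).flat()));
  const endKey = "LastLogin";
  const rowHeaders = uniqKeys.filter((k) => k !== endKey).concat(endKey);
  return obj.map((o) =>
    rowHeaders.reduce((a, c) => { a[c] = o[c]; return a; }, {})
  );
}

const json = [
  { ID: "E111", Name: "John", LastLogin: "2022-02-12" },
  { ID: "E112", Name: "Jake", Score: 22, LastLogin: "2022-02-12" },
];

console.log(Object.keys(restructureObjectForSheet(json)[0]));
// [ 'ID', 'Name', 'Score', 'LastLogin' ]
```

Since json_to_sheet takes the column order from the first row's keys, LastLogin now lands in the last column for every row.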

Populating Nested Array from Text File

Been trying to see if there might be an easier way for me to manage a dataset by putting it all into a text file rather than having it in the JS itself (the text file will be several hundred lines long by the end), but I can't seem to get the array to populate the way that I need it to.
In the end, I need an array that'll look like this:
var names = [
{
"name": "john",
"tag": ["tall","blue eyes","ginger","fast"],
},
{
"name": "morgan",
"tag": ["stout","blue eyes","dark"],
},
{
"name": "ryan",
"tag": ["average","brown eyes","fast","strong","perceptive"]
}
]
Populated with all the names and tags from the text file formatted like this (or something like this, if there's a formatting that'll work better):
john: tall ,blue eyes, ginger, fast
morgan: stout, blue eyes, dark
ryan: average, brown eyes, fast, strong, perceptive
Here's where I've gotten myself thus far, searching around here and elsewhere. Mostly struggling with the array of tags. Currently it's spitting it out as a string, but I'm not really sure how to break it down.
const { readFile, promises: fsPromises } = require('fs');
readFile('NAMES.txt', 'utf-8', (err, data) => {
if (err) throw err;
var name = data.split(/\r?\n/), result = [], anotherarray = [];
name.forEach((pair) => {
if (pair !== '') {
let splitpair = pair.split(': ');
let key = splitpair[0].charAt(0).toLowerCase() + splitpair[0].slice(1);
result[key] = splitpair[1];
}
});
for (var i in result) anotherarray.push({ "name": i, "tag": result[i] });
console.log(anotherarray);
});
Any help or pointing in the right direction would be much appreciated!
You could use the readline Node module to read the file line by line and process it.
I think the simplest way to process each line is in two steps:
Split the name from the tags on ':'
Split the tags on ',' and trim the whitespace (trimming after the split handles entries with extra spaces)
NOTE: I also lowercased the values to match your example.
Example:
const readline = require("readline");
const fs = require("fs");
// Create the interface to read the file line by line
const file = readline.createInterface({
input: fs.createReadStream('text.txt'),
});
// answer
const names = [];
// Process file line by line
file.addListener("line", (line) => {
// Split name and tags
const [name, tags] = line.split(":");
// Insert the name with parsed tags
names.push({
name,
tag: tags.split(",").map((e) => e.trim().toLowerCase()),
})
});
// Log answer
file.addListener("close", () => {
console.log(names);
});
The output:
[
{ name: 'john', tag: [ 'tall', 'blue eyes', 'ginger', 'fast' ] },
{ name: 'morgan', tag: [ 'stout', 'blue eyes', 'dark' ] },
{
name: 'ryan',
tag: [ 'average', 'brown eyes', 'fast', 'strong', 'perceptive' ]
}
]
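If the file comfortably fits in memory, the same two-step parse can also be written as a pure function and fed the result of fs.readFileSync instead of streaming with readline. A sketch (the function name is mine):

```javascript
// Same two-step parse as a pure function: in real use, pass it
// fs.readFileSync('NAMES.txt', 'utf-8') instead of the inline sample.
function parseNames(text) {
  return text
    .split(/\r?\n/)
    .filter(Boolean) // skip empty lines
    .map((line) => {
      const [name, tags] = line.split(":");
      return {
        name: name.trim().toLowerCase(),
        tag: tags.split(",").map((t) => t.trim().toLowerCase()),
      };
    });
}

const sample = "john: tall ,blue eyes, ginger, fast\nmorgan: stout, blue eyes, dark";
console.log(parseNames(sample));
```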

How do I convert a .csv file to a javascript array of dictionaries. Keys of each dictionary are column headers in the .csv file and items are values

For example, if I had a .csv file where the column headers are mentioned in the first row of the file and their subsequent values are specified in the following rows like so,
index,id,description,component,service
0,5,lorem ipsum,7326985,Field Service
The first step is to separate out the data and the column headers. I'm going to make the assumption that the csv is stored in your program as an array of strings, each representing a line, i.e.
const csv = [
"index,id,description,component,service",
"0,5,lorem ipsum,7326985,Field Service"
]
If your CSV isn't already represented this way, you can achieve this with the fs module:
const fs = require('fs');
fs.readFile('./data.csv', (err, data) => {
if (err) {
throw new Error(err);
}
const csv = String(data) // convert the buffer to a string
.split('\n') // Split the string into an array where each item contains one line
.filter(Boolean); // Remove any empty lines
// Do the rest of the operations on the CSV data here
});
We can then split the headers from the data rows using a rest element in array destructuring, after splitting each string on the commas:
const [ headers, ...data ] = csv.map(row => row.split(','));
Now our objects look something like this:
// headers
[ 'index', 'id', 'description', 'component', 'service' ]
// data
[
[ '0', '5', 'lorem ipsum', '7326985', 'Field Service' ]
]
And we can now go ahead and map each of the arrays in the data array to an object, using each value's index to map it to a specific heading from the CSV.
data.map(row => {
const rowObject = {};
row.forEach((value, index) => {
rowObject[headers[index]] = value;
});
return rowObject
});
We of course have to return this mapped data object somewhere, or assign it to a new variable, as Array.map() does not update the original array but creates a new one. Putting it all together into a code snippet, it could look like this:
const csv = [
"index,id,description,component,service",
"0,5,lorem ipsum,7326985,Field Service"
];
function csvToJSON(csv) {
const [headers, ...data] = csv.map(row => row.split(','));
return data.map(row => {
const rowObject = {};
row.forEach((value, index) => {
rowObject[headers[index]] = value;
});
return rowObject
});
}
console.log(csvToJSON(csv));
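For what it's worth, the same conversion can be written more compactly with Object.fromEntries (Node 12+), under the same assumption that no field contains quoted commas (the function name is mine):

```javascript
// Compact variant: build each row object directly from [header, value] pairs.
function csvToObjects(csv) {
  const [headers, ...data] = csv.map((row) => row.split(","));
  return data.map((row) =>
    Object.fromEntries(row.map((value, i) => [headers[i], value]))
  );
}

const rows = csvToObjects([
  "index,id,description,component,service",
  "0,5,lorem ipsum,7326985,Field Service",
]);
console.log(rows);
```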

Re-ordering of JSON data

I am new to JSON and want to re-structure my JSON response,
JSON-structure now-
{
"map": {
"Cityname": "[Jammu, Srinagar]",
"Pincode": "[180001, 190001]"
}
}
How I need it to be-
[
{ Cityname: "Jammu", Pincode: 180001},
{ Cityname: "Srinagar", Pincode: 190001}
]
Is there a way to do so, I searched for some possible solutions but was unable to do so.
Here is a dynamic way of doing it that should work with any JSON string with the same layout. I fixed parts of your JSON string (the arrays are real arrays rather than strings).
// first we use JSON.parse() to turn a json string into a JS Object.
const data = JSON.parse(`{ "map": { "cityName": ["Jammu", "Srinagar"], "pinCode": [180001, 190001]}}`);
// cities will hold our city objects
const cities = [];
// Object.entries(theObject) returns theObject's keys and values as an array:
// [[key, value], [key, value]]
// forEach() loops through each key/value pair.
// Because we know each entry is a key/value pair,
// I destructured it into its key and values using:
// const [key, values] = prop;
Object.entries(data.map).forEach((prop) => {
const [key, values] = prop;
values.forEach((value, index) => {
// now we check that the current index is an object.
// we do this because we can't add a property and value otherwise.
if(typeof cities[index] != "object"){
cities[index] = {};
}
// now we set the current value
cities[index][key] = value;
})
})
console.log(cities);
Your JSON response is not quite logical, because there is no mapping between city and pincode. I assumed that Cityname and Pincode are in the same order in the arrays. I used the exact JSON structure you provided in the question.
You can skip the additional substring and parse steps if your JSON data has the correct data types (Cityname as a string array, Pincode as an int array).
const json = {
"map": {
"Cityname": "[Jammu, Srinagar]",
"Pincode": "[180001, 190001]"
}
}
const someFunc = () => {
let output = [];
const {Cityname, Pincode} = json.map;
const citynameArray = Cityname.substring(1, Cityname.length - 1).split(",").map((s) => s.trim()); // trim, so " Srinagar" becomes "Srinagar"
const pincodeArray = JSON.parse(Pincode);
citynameArray.map((v,i) => {
output.push({Cityname: v, Pincode: pincodeArray[i]})
});
return output;
}
console.log(someFunc());
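Combining the two answers, here is a compact sketch that parses each bracketed string, trims the entries, and zips the two arrays by index (same same-order assumption):

```javascript
// The response as received: both "arrays" are bracketed strings.
const response = {
  map: {
    Cityname: "[Jammu, Srinagar]",
    Pincode: "[180001, 190001]",
  },
};

// Strip the brackets, split on commas, and trim each entry.
const parseList = (s) => s.slice(1, -1).split(",").map((v) => v.trim());

const cities = parseList(response.map.Cityname);
const pincodes = parseList(response.map.Pincode).map(Number);

// Zip the two lists by index into one object per city.
const result = cities.map((city, i) => ({ Cityname: city, Pincode: pincodes[i] }));
console.log(result);
// [ { Cityname: 'Jammu', Pincode: 180001 }, { Cityname: 'Srinagar', Pincode: 190001 } ]
```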

Multifield inputs request in node js server side

My frontend form request gives me this
req.body {
name: ["abc","ggg"],
class: ["A","B"],
bloodGroup: ["A+","B-"]
}
I have tried to solve the problem with my code like this
let arr = [];
req.body.name.forEach((item, index)=>{
arr.push({
name: item,
class: req.body.class[index],
bloodGroup: req.body.bloodGroup[index]
})
})
return arr;
Is there a more dynamic way, where I pass only my request and it returns me objects of this data?
This is a sample req.body; there will be more data under each key:
name, class, bloodGroup
ex:
name ["a","b","c",...] so on...
class ["ab","bc","dc",...] so on...
bloodGroup ["a+","b+","c-",...] so on...
We can do an Array.map on the first property of the request body.
We then use Object.entries on req.body and map it to return only the correct entry for each index. We then use Object.fromEntries() to create each object.
This means that however many properties are present on req.body, they will be present on the result objects.
const req = {
body: {
name: ["abc","ggg", "ccc"],
class: ["A","B", "C"],
bloodGroup: ["A+","B-", "C-"],
height: [175, 182, 190]
}
}
const arr = Object.values(req.body)[0].map((item, index) => {
const entries = Object.entries(req.body).map(([key, properties]) => ([key, properties[index]]));
return Object.fromEntries(entries);
});
console.log(arr);
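The same idea can be wrapped as a reusable helper that turns any object of parallel arrays into an array of objects (the helper name is hypothetical; it assumes all arrays have equal length):

```javascript
// Turn an object of parallel arrays into an array of objects, one per index.
function zipBody(body) {
  const keys = Object.keys(body);
  const length = body[keys[0]].length; // assumes all arrays are the same length
  return Array.from({ length }, (_, i) =>
    Object.fromEntries(keys.map((k) => [k, body[k][i]]))
  );
}

const body = {
  name: ["abc", "ggg"],
  class: ["A", "B"],
  bloodGroup: ["A+", "B-"],
};
console.log(zipBody(body));
```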
