Send file from frontend to Node.js backend to convert XML to JSON - javascript

I'm sending XML files from my frontend page to my backend Node.js server using a POST request.
FRONTEND: simple code that gets the files and posts them to the server
...
`req = {
  method: "POST",
  url: 'api.com/sendfiles',
  data: { files: files },
};`
...
BACKEND CODE
req.body.files
` {
  name: 'file.xml',
  size: 35003,
  url: 'blob:http://localhost/89bd5938-5cc4-48bc-809f-ab46c243ed7d',
  _file: {}
}`
--
...
`req.body.files.forEach(async doc => {
  request.get(
    {
      url: doc.url,
    },
    function (error, response, body) {
      console.log(body, error, response);
    }
  );
});`
...
I need to process the blob files received in the POST request to get the XML files and convert them to JSON.

Try this one:
const parser = require('fast-xml-parser'); // v3 API; v4 uses `new XMLParser()` instead
const fetch = require('node-fetch');

const xmlToJSON = (xml) => {
  // keep XML attributes, prefixed with '_' in the resulting object
  const jsonObj = parser.parse(xml, {
    ignoreAttributes: false,
    attributeNamePrefix: '_',
  });
  return jsonObj;
};

const processXMLFile = async (file) => {
  const response = await fetch(file.url);
  const xml = await response.text();
  const json = xmlToJSON(xml);
  return json;
};

const processFiles = async (files) => {
  const jsonFiles = [];
  for (const file of files) {
    const jsonFile = await processXMLFile(file);
    jsonFiles.push(jsonFile);
  }
  return jsonFiles;
};

// inside an async route handler:
const files = req.body.files;
const jsonFiles = await processFiles(files);
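Note that the question's snippet uses `forEach(async doc => …)`, which fires all the requests but never waits for any of them to finish; that is why the answer uses a `for…of` loop with `await` instead. A minimal, network-free sketch of the difference:

```javascript
// forEach does not await async callbacks: the outer function returns
// before any callback resumes after its first await.
async function withForEach(items) {
  const out = [];
  items.forEach(async (n) => {
    await Promise.resolve(); // suspends the callback
    out.push(n * 2);
  });
  return out.length; // snapshot taken before any push has happened
}

// for...of awaits each iteration, so every result is in place on return.
async function withForOf(items) {
  const out = [];
  for (const n of items) {
    await Promise.resolve();
    out.push(n * 2);
  }
  return out.length;
}

withForEach([1, 2, 3]).then((len) => console.log(len)); // 0
withForOf([1, 2, 3]).then((len) => console.log(len));   // 3
```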

Related

Node js: Download file to memory and pass to library function without saving to disk

The code below is working fine, but I received some code review feedback:
"Why download and save the file to disk, only to read it back into memory?"
However, after spending some hours exploring options with Buffer and streams, I just don't seem to be getting anywhere.
const fs = require('fs');
const { PdfData } = require('pdfdataextract');
const axios = require('axios').default;

const getPDFText = async ({ url }) => {
  const tmpDir = `${process.cwd()}/my_dir`;
  const writer = fs.createWriteStream(`${tmpDir}/document.pdf`);
  const response = await axios({
    url,
    method: 'get',
    responseType: 'stream'
  });
  response.data.pipe(writer);
  const text = await new Promise((resolve, reject) => {
    writer.on('finish', () => {
      const fileData = fs.readFileSync(`${tmpDir}/document.pdf`);
      PdfData.extract(fileData, {
        get: {
          // ...
        },
      })
        .then(resolve)
        .catch(reject);
    });
    writer.on('error', reject);
  });
  return text;
};
How can I avoid saving the file to disk, and to instead feed it into the PdfData.extract method?
The signature for .extract says it accepts an Uint8Array.
Something like
const { PdfData } = require('pdfdataextract');
const axios = require('axios').default;

async function getPDFText({ url }) {
  const response = await axios({
    url,
    method: 'get',
    responseType: 'arraybuffer',
  });
  const pdfUint8Array = new Uint8Array(response.data);
  const res = await PdfData.extract(pdfUint8Array, /* ... */);
  console.log(res);
  return res;
}
could do the trick?
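A side note on the `new Uint8Array(response.data)` step: in Node, axios typically resolves an `arraybuffer` response to a `Buffer`, and `Buffer` is itself a `Uint8Array` subclass, so the extra copy is often unnecessary. A quick check:

```javascript
// Node's Buffer extends Uint8Array, so a Buffer can usually be passed
// straight to an API whose signature asks for a Uint8Array.
const buf = Buffer.from('%PDF-1.4');
console.log(buf instanceof Uint8Array); // true

// new Uint8Array(buf) still works and makes an element-wise copy:
const copy = new Uint8Array(buf);
console.log(copy[0] === buf[0], copy.length === buf.length); // true true
```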

consume s3 getobjectcommand result in angular (open as pdf)

I am trying to open a file from a s3 bucket using angular as a pdf. To do this, I have a node service running which gets the object, which I call from angular. Then I'm trying to open in angular as a pdf. Is there something I am missing? When I open the PDF, it shows up as a blank (white) document.
Below is my node code:
const streamToString = (stream) =>
  new Promise((resolve, reject) => {
    const chunks = [];
    stream.on("data", (chunk) => chunks.push(chunk));
    stream.on("error", reject);
    stream.on("end", () => resolve(Buffer.concat(chunks).toString("utf8")));
  });

const readFile = async function getObj(key) {
  const params = {
    Bucket: vBucket,
    Key: key,
  };
  const command = new GetObjectCommand(params);
  const response = await client.send(command);
  const { Body } = response;
  return streamToString(Body);
};
And here I am consuming in angular and opening as PDF
The service:
getObj(key: String): Observable<any> {
  const httpOptions = {
    'responseType': 'arraybuffer' as 'json'
    //'responseType' : 'blob' as 'json' // This also worked
  };
  return this.http.get<any>(environment.s3Ep + '/getfile?key=' + key, httpOptions);
}
And code consuming the service:
this.s3Svc.getObj(key).subscribe(
  res => {
    let file = new Blob([res], { type: 'application/pdf' });
    var fileURL = URL.createObjectURL(file);
    window.open(fileURL);
  }
);
I started experiencing the same issue. Found a solution, replacing streamToString with streamToBuffer as follows:
const streamToBuffer = async (stream: Readable): Promise<Buffer> => {
  return new Promise((resolve, reject) => {
    const chunks: Array<any> = []
    stream.on('data', (chunk) => chunks.push(chunk))
    stream.on('error', reject)
    stream.on('end', () => resolve(Buffer.concat(chunks)))
  })
}
and the code that consumes it:
const command = new GetObjectCommand({ Bucket, Key })
const data = await s3.send(command)
const content = await streamToBuffer(data.Body as Readable)
fs.writeFileSync(destPath, content)
In my case I'm writing to a PDF file.
Writing as a string retrieved from streamToString or writing it as buffer.toString() resulted in the blank PDF.
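The blank-PDF symptom has a simple cause: decoding arbitrary binary data as UTF-8 and re-encoding it is lossy, because byte sequences that are not valid UTF-8 get replaced. A minimal demonstration of why the `streamToString` round trip corrupted the file:

```javascript
// 0xFF can never appear in valid UTF-8, so the utf8 round trip swaps it
// for the bytes of the U+FFFD replacement character.
const original = Buffer.from([0x25, 0x50, 0x44, 0x46, 0xff]); // "%PDF" + a raw byte
const roundTripped = Buffer.from(original.toString('utf8'), 'utf8');

console.log(original.equals(roundTripped)); // false — the binary was corrupted
console.log(roundTripped.length); // 7, not 5
```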

Text -> PNG -> ReadStream, all done on the front-end?

I'm not sure if this is even possible, but here's what I'm trying to do:
Let the user enter some text
Generate a PNG from that text
Upload it to Pinata, which requires it to be in ReadStream format
Do all of this on the front-end
I've managed to accomplish (1) and (2) using html2canvas.
The tricky part is (3). The reason it has to be in ReadStream format is because that's the format Pinata's SDK wants:
const fs = require('fs');
const readableStreamForFile = fs.createReadStream('./yourfile.png');
const options = {
  pinataMetadata: {
    name: MyCustomName,
    keyvalues: {
      customKey: 'customValue',
      customKey2: 'customValue2'
    }
  },
  pinataOptions: {
    cidVersion: 0
  }
};
pinata.pinFileToIPFS(readableStreamForFile, options).then((result) => {
  // handle results here
  console.log(result);
}).catch((err) => {
  // handle error here
  console.log(err);
});
I realize that this would be no problem to do on the backend with node, but I'd like to do it on the front-end. Is that at all possible? Or am I crazy?
I'm specifically using Vue if that matters.
For anyone interested, the solution ended up being fetch + blob:
const generateImg = async () => {
  const canvas = await html2canvas(document.getElementById('hello'));
  const img = canvas.toDataURL('image/png');
  const res = await fetch(img);
  return res.blob();
};
This blob can then be passed into a more manual version of their SDK:
const uploadImg = (blob: Blob) => {
  const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`;
  const data = new FormData();
  data.append('file', blob);
  const metadata = JSON.stringify({
    name: 'testname',
  });
  data.append('pinataMetadata', metadata);
  const pinataOptions = JSON.stringify({
    cidVersion: 0,
  });
  data.append('pinataOptions', pinataOptions);
  return axios
    .post(url, data, {
      maxBodyLength: 'Infinity' as any, // needed to prevent axios from erroring out with large files
      headers: {
        // @ts-ignore
        'Content-Type': `multipart/form-data; boundary=${data._boundary}`,
        pinata_api_key: apiKey,
        pinata_secret_api_key: apiSecret,
      },
    })
    .then((response) => {
      console.log(response);
    })
    .catch((error) => {
      console.log(error);
    });
};
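As a design note on the fetch + blob step: `canvas.toDataURL` returns a `data:` URL whose payload is plain base64, so the bytes can also be recovered by splitting the string and decoding it directly (shown here with Node's `Buffer` for illustration; in the browser, `atob` plus a `Uint8Array` plays the same role). The sample payload below is the 8-byte PNG file signature, standing in for real canvas output:

```javascript
// Decode a data: URL without fetch. The payload here is the PNG
// signature, used as stand-in data for a real canvas export.
const dataUrl = 'data:image/png;base64,iVBORw0KGgo=';
const base64Payload = dataUrl.split(',')[1];
const bytes = Buffer.from(base64Payload, 'base64');

console.log(bytes.length); // 8
console.log(bytes.slice(1, 4).toString('ascii')); // "PNG"
```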

How can I get my mongoose collection data displayed on the client side?

I am trying to fetch the data from my mongoose collection and display it on the client side. But I am getting 'Object' on the client side instead of the data, even though the data is saved in the database.
Can anyone help me? Thanks.
This is my code
router.js
const mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/Library', { useNewUrlParser: true });

const api_list = new mongoose.Schema({
  data: Array
});
const api = mongoose.model('api_listLs', api_list);

router.post('/api/list/Post', (request, response) => {
  const data = request.body;
  const apiList = new api({ data: data });
  apiList.save().then(() => {
    console.log(data);
    response.json(data);
  }).catch(() => {
    response.status(400).send("Item was not sent to the database");
  });
});

router.get('/api/list/Get', (request, response) => {
  api.find({}, (err, data) => {
    if (err) {
      response.end();
      return;
    }
    response.json(data);
  });
});
page1.js
I used a fetch 'POST' request to pass data to the database and save it in a collection.
function TableRow() {
  let items = 'hey';
  let domore = 'hi';
  let cells = document.querySelectorAll('#recieve-info td');
  cells.forEach(cell => cell.onclick = async function () {
    let prevcell = cell.previousElementSibling;
    if (prevcell) {
      let data = { items, domore };
      let options = {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json'
        },
        body: JSON.stringify(data)
      };
      const response = await fetch('/api/list', options);
      const json = await response.json();
      console.log(json);
    }
  });
}
page2.js
Here I used a fetch 'GET' request to retrieve data from the database and show it on the client side.
async function ParaG() {
  const response = await fetch('/api/list');
  const data = await response.json();
  console.log(data);
  alert(data);
  for (item of data) {
    const para = document.querySelector('.Second-Para');
    para.innerHTML += `${item.data}, ${item.data}`;
  }
}
'.Second-Para' is a p element in a container 'div'
Here is the output: I am getting 'Object' instead of the data. This app passes data from one page to another on click.
The result is in the data key, and data is an array of objects, so use destructuring ({ data }) to access it, like this:
async function ParaG() {
  const response = await fetch('/api/list');
  const { data } = await response.json();
  console.log(data);
  alert(data);
  for (item of data) {
    let info = item.data[0]; // access the data
    const para = document.querySelector('.Second-Para');
    para.innerHTML += `${info.items}, ${info.domore}`;
  }
}
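Incidentally, the literal text 'Object' in the alert is JavaScript's default string conversion at work: `alert` stringifies its argument, and plain objects stringify to `[object Object]`. A one-liner shows the effect, and why you need `JSON.stringify` or direct field access instead:

```javascript
// Default string conversion of a plain object loses its contents.
const record = { items: 'hey', domore: 'hi' };
console.log(String(record)); // "[object Object]"

// JSON.stringify keeps the data readable:
console.log(JSON.stringify(record)); // {"items":"hey","domore":"hi"}
```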

How do I upload an array of images using the fetch API

I am trying to upload an array of images using my custom API (Node.js); here I just want to save the file names to my database. Uploading the file itself works fine (the upload middleware I created works, since I have tried the same middleware in Postman).
When it comes to the actual JavaScript file, it is not working and shows the following error:
Cast to string failed for value "{ '0': {}, '1': {} }" at path "images", images: Cast to Array failed for value "[ { '0': {}, '1': {} } ]" at path "images"
Here is my code
const createProperty = ['city', 'propertyType', 'propertyName', 'images'];
const newProperty = new Array();
for (var i = 0; i < myids.length; i++) {
  newProperty[i] = $(myids[i]).val();
}
newProperty[myids.length] = [document.getElementById('PropertyPhotos').files];

const data = {};
createProperty.forEach((id, index) => {
  data[createProperty[index]] = newProperty[index];
});
await createData(data);
// here is the createData function
export const createData = async (data) => {
  try {
    const res = await axios({
      method: 'POST',
      url: '/api/v1/properties',
      data
    });
    if (res.data.status === 'success') {
      showAlert('success', 'Property Created Successfully');
    }
    console.log(res.data);
  } catch (err) {
    showAlert('error', err.response.data.message);
    console.log(err.response.data.message);
  }
};
It looks like you are using Mongo. Look here and here.
And to send files you need to use FormData, but I don't see it in your code:
const formdata = new FormData();
formdata.append("file", fileInput.files[0], "/filepath.pdf");
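The `Cast to Array failed for value "[ { '0': {}, '1': {} } ]"` error is consistent with this: browser `File` objects have no enumerable own properties, so `JSON.stringify` turns each one into `{}` before it ever reaches the server. A small stand-in class (`FakeFile` is hypothetical, mimicking how `File` exposes its fields) demonstrates the effect:

```javascript
// Mimic a browser File: its metadata lives on non-enumerable
// properties, which JSON.stringify silently skips.
class FakeFile {
  constructor(name) {
    Object.defineProperty(this, 'name', { value: name, enumerable: false });
  }
}

const files = [new FakeFile('a.png'), new FakeFile('b.png')];
console.log(JSON.stringify(files)); // [{},{}] — the names are gone
```

That is why the files must go through `FormData` (multipart/form-data) rather than a JSON body.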
