How to convert BLOB into PDF file in the Node environment? - javascript

I have a Node.js server that fetches blob data for a PDF file from another web service. After receiving the blob, I want to turn it back into a PDF file.
If anyone knows how to achieve this, please help.
Here is the code I have tried so far:
const fetch = require('node-fetch');
const Blob = require('fetch-blob');
const fs = require('fs');

fetch(url, options)
  .then(res => {
    console.log(res);
    res.blob().then(async (data) => {
      const result = data.stream();
      // the line below saves a blank PDF file
      fs.createWriteStream(objectId + '.pdf').write(result);
    })
  })
  .catch(e => {
    console.log(e);
  });

Modification points:
Since fs.createWriteStream(objectId + '.pdf').write(data) needs binary data it can write directly, change res.blob() to res.buffer().
Change .then(res => {res.blob().then() to .then(res => res.buffer()).then(.
Modified script:
fetch(url, options)
  .then(res => res.buffer())
  .then(data => {
    fs.createWriteStream(objectId + '.pdf').write(data);
  })
  .catch(e => {
    console.log(e);
  });
Note:
This modification assumes that the fetch request using url and options itself works fine.
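As a side note, on node-fetch v3 res.buffer() is deprecated in favor of res.arrayBuffer(); a minimal sketch of the same flow under that version (reusing url, options, and objectId from above):
const fs = require('fs');

// node-fetch v3 is ESM-only; from CommonJS, load it via dynamic import
const fetch = (...args) => import('node-fetch').then(({ default: f }) => f(...args));

fetch(url, options)
  .then(res => res.arrayBuffer())
  .then(arrayBuffer => {
    // Convert the ArrayBuffer into a Node Buffer before writing the PDF
    fs.writeFile(objectId + '.pdf', Buffer.from(arrayBuffer), err => {
      if (err) console.log(err);
    });
  })
  .catch(e => console.log(e));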
References:
node-fetch
write()

Related

Express.js - hit URL to download file from API

I am working with the Shopify Admin API. I create a bulk operation and in return it gives me a URL. When you hit this URL it automatically downloads a file. This file has information that my server needs.
So where I am at: I send the request for the bulk operation, then I hit the URL, and the file gets downloaded by my browser.
What I need is to hit the URL, then download and save the file into a folder on my server where I can start working with it. I don't know how to do this; I tried and looked around. Any help would be awesome. Thanks.
My code:
app.get("/postBulkProds", async (req,res) => {
let url = GRAPHQL_CUSTOM_URL;
await fetch(url, postBulkProducts())
.then(res=> res.json())
.then(bulk => res.send(bulk))
.catch(error => res.send(error));
});
app.get("/getBulkProds", async(req,res, next) => {
let data;
let url = GRAPHQL_CUSTOM_URL;
await fetch(url, getBulkProducts())
.then(products=> (products.json(products)))
.then(products => (data = products))
.catch(error => res.send(error));
console.log(data.data.currentBulkOperation.url);
res.locals.bulk = data;
let url2 = data.data.currentBulkOperation.url;
res.redirect(url2);
});
EDIT-SOLVED
fetch(url2)
  .then(r => r.text())
  .then(t => {
    // t is already the raw JSONL text, so write it out as-is;
    // re-encoding it with JSON.stringify would wrap the file contents in quotes
    fs.writeFile('./myfile.jsonl', t, err => {
      if (err) {
        console.log("error writing", err);
      } else {
        console.log("success");
      }
    });
  })
  .catch(error => console.log(error));
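If the bulk file is large, a streaming variant avoids holding the whole body in memory; a minimal sketch assuming node-fetch v2, where res.body is a Node readable stream:
const fs = require('fs');
const { pipeline } = require('stream');
const fetch = require('node-fetch');

fetch(url2)
  .then(res => {
    if (!res.ok) throw new Error('Download failed: ' + res.status);
    // Pipe the response body straight into the file and surface any stream error
    pipeline(res.body, fs.createWriteStream('./myfile.jsonl'), err => {
      if (err) console.log('error writing', err);
      else console.log('success');
    });
  })
  .catch(error => console.log(error));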

Is it possible to read a .csv file with the JavaScript fetch API?

Like this:
fetch('state_wise_data.csv')
  .then(response => response.json())
  .then(data => console.log(data))
  .catch(err => console.log(err))
I tried this but it didn't work.
First of all, CSV is not JSON. Fetch has no built-in CSV support; you will need to download the CSV as a string (you can use response.text()) and run it through a third-party CSV parser.
To parse the CSV you can use Papa Parse:
"Isn't parsing CSV just String.split(',')?"
Heavens, no. Papa does it right. Just pass in the CSV string with an optional configuration.
Example:
const response = fetch('state_wise_data.csv')
  .then(response => response.text())
  .then(v => Papa.parse(v))
  .catch(err => console.log(err));

response.then(v => console.log(v));
It also supports file downloading:
Papa.parse('state_wise_data.csv', {
  download: true,
  complete: results => {
    console.log(results);
  }
});
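If your CSV has a header row, Papa Parse can also return objects keyed by column name via its header option; a small sketch:
Papa.parse(csvString, {
  header: true,           // use the first row as object keys
  skipEmptyLines: true,   // ignore trailing blank lines
  complete: results => {
    console.log(results.data); // array of row objects
  }
});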
Fetch works 100% with .csv files (or even an API endpoint that takes req.query).
Set 'content-type': 'text/csv' in the fetch headers: {}, and use res.text() instead of res.json() to interpret the data.
const downloadCsv = async () => {
  try {
    const target = `https://SOME_DOMAIN.com/data/csv/addresses.csv`; // file
    // const target = `https://SOME_DOMAIN.com/api/data/log_csv?$"queryString"`; // the target can also be an API that takes req.query
    const res = await fetch(target, {
      method: 'get',
      headers: {
        'content-type': 'text/csv;charset=UTF-8',
        // 'Authorization': ... // in case you need authorisation
      },
    });
    if (res.status === 200) {
      const data = await res.text();
      console.log(data);
    } else {
      console.log(`Error code ${res.status}`);
    }
  } catch (err) {
    console.log(err);
  }
};
CSV is not a JSON file type, so you can't parse it as JSON text. You can check how to parse CSV text in JavaScript here: Example JavaScript code to parse CSV data.
I would use the following method and call it where you currently console.log the data.
const parseCSV = (data) => {
  // create an empty array
  const csvData = [];
  // this will return each line as an individual string
  const lines = data.split("\n");
  // loop through the lines and return an array of individual
  // strings within the line that are separated by a comma
  for (let i = 0; i < lines.length; i++) {
    csvData[i] = lines[i].split(",");
  }
  // return a 2D array (an array of arrays)
  // e.g. [ [1,2,3], [3,4,5], [6,7,8] ]
  return csvData;
};
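Plugged into the original fetch chain, it would look something like this (a sketch assuming the same state_wise_data.csv path):
fetch('state_wise_data.csv')
  .then(response => response.text())
  .then(data => {
    // parseCSV returns one inner array per CSV line
    const rows = parseCSV(data);
    console.log(rows);
  })
  .catch(err => console.log(err));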
Pretty easy: GET the URL with a normal fetch request, convert the response to text first, and it's done.
fetch('sample-url.csv')
  .then((response) => response.text())
  .then((data) => console.log(data));

fetch JSON works in Node but fails in the browser (JSON.parse unexpected end of data at line 1 column 1)

I'm trying to get a JSON file from an API. When using this code in Node.js it works perfectly fine and I can see the results I'm looking for, but when I run it in the browser it fails with the error message in the title.
async function searchVolumes(volume, url, apiKey) {
  let result = await fetch(url, { mode: 'no-cors' })
    .then(res => res.json())
    .then(json => json.results)
    .catch(error => console.log('Error reading data ' + error));
  return result;
}
I've tried it in Firefox and Edge and have the same problem in both of them. Checking the network tab I can see the result, and the JSON is fine with no errors; and as I said at the beginning, it works in Node.
I modified it according to sideshowbarker's links: I removed the no-cors mode (an opaque no-cors response has an unreadable, empty body, which is why JSON.parse fails) and added a cors-anywhere proxy server, and it's working now:
const proxy = 'https://cors-anywhere.herokuapp.com/';
// the proxy URL has to go in front of the target URL
url = proxy + url;

async function searchVolumes(volume, url, apiKey) {
  let result = await fetch(url)
    .then(res => res.json())
    .then(json => json.results)
    .catch(error => console.log('Error reading data ' + error));
  return result;
}
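Note that the public cors-anywhere demo instance is heavily rate-limited, so for anything beyond a quick test a self-hosted proxy is more dependable; a sketch using Express and node-fetch (the /proxy route name is hypothetical):
const express = require('express');
const fetch = require('node-fetch');

const app = express();

// Hypothetical route: forwards ?url=... upstream and adds a CORS header
app.get('/proxy', async (req, res) => {
  try {
    const upstream = await fetch(req.query.url);
    res.set('Access-Control-Allow-Origin', '*');
    res.send(await upstream.text());
  } catch (err) {
    res.status(502).send('Proxy error: ' + err.message);
  }
});

app.listen(3000);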

React + Node/Express | Rendering a PDF binary stream blob in React

In React I have hyperlinks which initiate a fetch for PDF files from a backend Node server with Express. The issue is that the stream opens a new window of binary text instead of a PDF file.
React frontend:
//Link
<a href={'#'} onClick={() => this.renderPDF(row.row.pdfid)}> {row.row.commonName}.pdf</a>
//Fetch call
renderPDF = (pdfLink) => {
fetch('http://localhost:8000/pdf' + '?q=' + pdfLink, {
method: 'GET'
//credentials: 'include'
})
.then(response => response.blob())
.then(blob => URL.createObjectURL(blob))
.then(url => window.open(url))
.catch(error => this.setState({
error,
isLoading: false
}));
}
Node backend:
app.get('/pdf', (req, res) => {
  let readStream = fs.createReadStream(req.query["q"]);
  // When the stream is done being read, end the response
  readStream.on('close', () => {
    res.end();
  });
  // Stream chunks to the response
  readStream.pipe(res);
});
Any input would be much appreciated.
Updating your code:
app.get('/pdf', (req, res) => {
  let readStream = fs.createReadStream(req.query["q"]);
  let stat = fs.statSync(req.query["q"]);
  // When the stream is done being read, end the response
  readStream.on('close', () => {
    res.end();
  });
  // Set the headers, then stream chunks to the response
  res.setHeader('Content-Length', stat.size);
  res.setHeader('Content-Type', 'application/pdf');
  res.setHeader('Content-Disposition', 'inline; filename=test.pdf');
  readStream.pipe(res);
});
Try with this now. Also, check that req.query['q'] is defined and not empty, just to rule out an error on that side.
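If the response headers can't be changed server-side, the MIME type can also be forced on the client; a sketch reworking the renderPDF fetch chain from the question (same local endpoint):
fetch('http://localhost:8000/pdf' + '?q=' + pdfLink)
  .then(response => response.arrayBuffer())
  .then(buf => {
    // Force the MIME type so window.open renders the blob as a PDF
    const blob = new Blob([buf], { type: 'application/pdf' });
    window.open(URL.createObjectURL(blob));
  })
  .catch(error => console.log(error));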

Download and upload image without saving to disk

Using Node.js, I am trying to get an image from a URL and upload it to another service without saving the image to disk. The following code works when saving the file to disk and using fs to create a readable stream, but since I am running this as a cron job on a read-only file system (webtask.io), I want to achieve the same result without saving the file to disk temporarily. Shouldn't that be possible?
request(image.Url)
  .pipe(
    fs
      .createWriteStream(image.Id)
      .on('finish', () => {
        client.assets
          .upload('image', fs.createReadStream(image.Id))
          .then(imageAsset => {
            resolve(imageAsset)
          })
      })
  )
Do you have any suggestions for how to achieve this without saving the file to disk? The upload client accepts the following:
client.assets.upload(type: 'file' | 'image', body: File | Blob | Buffer | NodeStream, options = {}): Promise<AssetDocument>
Thanks!
How about passing the buffer down to the upload function, since per your statement it will accept a buffer?
As a side note: this keeps the image in memory for the duration of the method execution, so if you call this numerous times you might run out of resources.
request
  .get(url)
  .on('response', function (res) {
    var data = [];
    res.on('data', function (chunk) {
      data.push(chunk);
    }).on('end', function () {
      var buffer = Buffer.concat(data);
      // Pass the buffer to the upload client
      client.assets.upload('image', buffer);
    });
  });
I tried various libraries, and it turns out that node-fetch provides a way to return a buffer. So this code works:
fetch(image.Url)
  .then(res => res.buffer())
  .then(buffer => client.assets
    .upload('image', buffer, { filename: image.Id }))
  .then(imageAsset => {
    resolve(imageAsset)
  })
Well, I know it has been a few years since the question was originally asked, but I have encountered this problem now, and since I didn't find an answer with a comprehensive example, I made one myself.
I'm assuming that the file path is a valid URL and that the end of it is the file name. I need to pass an API key to this API endpoint, and a successful upload sends me back a response with a token.
I'm using node-fetch and form-data as dependencies.
const fetch = require('node-fetch');
const FormData = require('form-data');

const secretKey = 'secretKey';

const downloadAndUploadFile = async (filePath) => {
  const fileName = new URL(filePath).pathname.split("/").pop();
  const endpoint = `the-upload-endpoint-url`;
  const formData = new FormData();
  let jsonResponse = null;
  try {
    const download = await fetch(filePath);
    const buffer = await download.buffer();
    if (!buffer) {
      console.log('file not found', filePath);
      return null;
    }
    formData.append('file', buffer, fileName);
    const response = await fetch(endpoint, {
      method: 'POST',
      body: formData,
      headers: {
        ...formData.getHeaders(),
        "Authorization": `Bearer ${secretKey}`,
      },
    });
    jsonResponse = await response.json();
  } catch (error) {
    console.log('error on file upload', error);
  }
  return jsonResponse ? jsonResponse.token : null;
}
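A usage sketch (the image URL here is hypothetical):
// Hypothetical URL; the token comes back from the upload endpoint on success
downloadAndUploadFile('https://example.com/images/photo.jpg')
  .then(token => {
    if (token) console.log('upload token:', token);
    else console.log('upload failed');
  });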
