Here's my problem:
I want to create a Google Sheets extension in which I extract data from a sheet in Google Sheets and modify it using methods in Node.js.
Then, with the modified data in a string, I want to upload that string to the client's Drive as a CSV or XML file. So I don't have a local file I can use for the upload, just a string variable.
How do I upload that string?
Thanks a lot; this is my first app and I'm struggling a bit.
Code
const {google} = require ('googleapis');
const keys = require ('./keys.json');
const client = new google.auth.JWT(
keys.client_email, null,
keys.private_key,
['https://www.googleapis.com/auth/drive'],
'https://www.googleapis.com/…'
);
client.authorize(function(err, tokens){
if (err){
console.log(err);
return
} else {
console.log('Connected');
gsrun(client);
}
});
async function gsrun(cl) {
const gsapi = google.sheets({version: 'v4', auth: cl});
}
You have to set your file's metadata and the data it will contain (importantly, the MIME type for this case must be text/csv), and the file's body will be a simple string. This code will help you, assuming you have already completed the OAuth process and have the string you want to insert:
module.exports.init = async function (){
// Before calling the API, build your own Drive service instance
// In the second argument, you must pass your own string message
const pro = await uploadSimpleString(drive, null);
console.log(pro);
}
const uploadSimpleString = (drive, message) => {
// Set file metadata and data
message = message || 'This is a simple String nice to meet you';
const fileMetadata = {'name': 'uploadSimpleString.csv'};
const media = {
mimeType: 'text/csv',
body: message
};
// Return the Promise result after completing its task
return new Promise((resolve, reject) => {
try{
// Call Files: create endpoint
return drive.files.create({
resource: fileMetadata,
media: media,
fields: 'id'
},(err, results) => {
// Result from the call
if (err) return reject(`Drive error: ${err.message}`);
resolve(results);
})
} catch (error){
console.log(`There was a problem in the promise: ${error}`);
reject(error);
}
});
}
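The init function above assumes a drive service instance is already in scope. As a minimal sketch (assuming the authorized JWT client from the question is what you use for auth), it could be built like this:
// Sketch only: build the Drive service that init() and uploadSimpleString() expect.
// `client` is assumed to be the authorized JWT client shown in the question.
const { google } = require('googleapis');
const drive = google.drive({ version: 'v3', auth: client });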
Notice
To test this code, run it in your CLI using this command:
node -e 'require("./index.js").init()'
Where index.js is your file's name and init() is your main function.
Docs
For more info, please check these links, and also consider using the [google-drive-api] tag; that way there is a better chance of receiving help because more people will be able to find your question.
How to get Help
Files: create
G Suite documents and corresponding export MIME types
Related
Some context
I've created a service worker to send notifications to registered users.
It worked well until I tried to implement a sort of ID for each person who registers to the service worker (to send notifications).
I do that because I have to delete old registrations from my database, so I chose to allow each user three registrations (one for a mobile device and two others for different browsers on a computer); if there are more, I want to remove the oldest from the database.
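To make that idea concrete, this is roughly the cleanup I have in mind; it is only a sketch, and it assumes the webpushsub table has an auto-increment webpushsub_id column, which does not appear in the code below:
// Sketch only: keep the three most recent registrations for a user and delete the rest.
// Assumes a hypothetical auto-increment column `webpushsub_id` on the webpushsub table.
const cleanupQuery = `
  DELETE FROM webpushsub
  WHERE webpushsub_code = ?
    AND webpushsub_id NOT IN (
      SELECT webpushsub_id FROM (
        SELECT webpushsub_id FROM webpushsub
        WHERE webpushsub_code = ?
        ORDER BY webpushsub_id DESC
        LIMIT 3
      ) AS latest
    )`;
connection.query(cleanupQuery, [usrCode, usrCode], (err) => {
  if (err) throw err;
});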
Tools
I'm using Node.js, Express, and MySQL for the database.
The issue
When I launch a subscription I get this error:
SyntaxError: Unexpected token o in JSON at position 1
at JSON.parse (<anonymous>)
I saw in another post that this happens when you try to JSON.parse something that is already an object.
But in my case, I can't find where I parse it; see the parts concerned:
// service.js (service worker file)
// saveSubscription saves the subscription to the backend
const saveSubscription = async (subscription, usrCode) => {
const SERVER_URL = 'https://mywebsite:4000/save-subscription'
subscription = JSON.stringify(subscription);
console.log(subscription); // I got here what I expect
console.log(usrCode); // <-------------------------------- HERE I GOT UNDEFINED
const response = await fetch(SERVER_URL, {
method: 'post',
headers: {
'Content-Type' : 'application/json',
},
body : {
subscription: subscription,
usrCode: usrCode
}
})
return response
}
But when I console.log(usrCode) in my inspector, I get the right value.
So how can I get the value in service.js?
Maybe the problem comes from:
const bodyParser = require('body-parser')
app.use(bodyParser.json())
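For reference, the error message itself is easy to reproduce outside my code: JSON.parse coerces a non-string argument to a string, an object becomes "[object Object]", and the parse then fails on the "o" at position 1 (a generic illustration, not my actual code):
// Generic illustration, not taken from the project code.
JSON.parse({ a: 1 });          // SyntaxError: Unexpected token o in JSON at position 1
JSON.parse('[object Object]'); // same error: this is the string an object coerces to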
At the beginning I thought the issue was in the back end (because I'm not really good with async functions).
Here is the back end, in case I got something wrong there.
// index.js (backend)
// Insert into database
const saveToDatabase = async (subscription, usrCode) => {
// make to connection to the database.
pool.getConnection(function (err, connection) {
if (err) throw err; // not connected!
console.log(usrCode);
console.log(subscription);
connection.query(`INSERT INTO webpushsub (webpushsub_info, webpushsub_code) VALUES ('${subscription}', '${usrCode}')`, function (err, result, fields) {
// if any error while executing above query, throw error
if (err) throw err;
// if there is no error, you have the result
console.log(result);
connection.release();
});
});
}
// The new /save-subscription endpoint
app.post('/save-subscription', async (req, res) => {
const usrCode = req.body.usrCode; // <------------------ I'm not sure about this part
const subscription = req.body.subscription
await saveToDatabase(JSON.stringify(subscription, usrCode)) //Method to save the subscription to Database
res.json({ message: 'success' })
})
By searching on Google, I found this tutorial. The reason why usrCode is undefined is that the service worker doesn't have access to data stored in the front end.
First you have to pass it in the URL as follows:
// swinstaller.js (front)
// SERVICE WORKER INITIALIZATION
const registerServiceWorker = async (usrCode) => {
const swRegistration = await navigator.serviceWorker.register('service.js?config=' + usrCode); //notice the file name
return swRegistration;
}
And then get it in the service worker:
// service.js (service worker file)
// get the usrCode
const usrCode = new URL(location).searchParams.get('config');
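For completeness, the registration call on the page side then just forwards the code; a small usage sketch, assuming usrCode is already available in the front-end script:
// In the page script: register the worker, appending the user's code as a query parameter.
registerServiceWorker(usrCode).then((registration) => {
  console.log('Service worker registered with scope:', registration.scope);
});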
I am trying to implement a function which provides a temporary file URL for downloading. Right now the user enters their email ID and the function responds with a file URL. To fetch the file I am using the firebase-admin SDK to access the storage bucket and retrieve a file.
Problem: Right now everything is working fine except that the file URL is not working. I am getting the error: Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object.
app.post('/getFile', (req, res) => {
let email = req.body.mail;
if (micro.validateEmail(email)) {
micro.checkIfUserExists(admin, email)
.then(data => {
storage.file('form.pdf')
.getMetadata()
.then(fileRes => {
sendMail.send(email, fileRes[0].mediaLink).then(msg => {
res.send({
uri: fileRes,
res: msg,
mail: email,
data: data
})
});
});
});
}
});
Edit:
So I have found that I could use getSignedUrl to get a URL, but I don't know how to specify the file path for which the URL is needed.
const bucket = storage.bucket();
//-
// Generate a URL that allows temporary access to list files in a bucket.
//-
const config = {
action: 'list',
expires: '03-17-2025'
};
//-
// If the callback is omitted, we'll return a Promise.
//-
bucket.getSignedUrl(config).then(function(data) {
const url = data[0];
});
Edit 2
So I have found that you can specify the file by calling bucket.file():
const bucket = storage.bucket().file('form.pdf');
But now I have another problem. I am getting this error in the Firebase function log:
Permission iam.serviceAccounts.signBlob is required to perform this operation on service account
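For reference, getSignedUrl can also be called on a single file object instead of the bucket; here is a sketch of what I am experimenting with (the 'read' action and the form.pdf path are my own assumptions):
// Sketch (untested): request a temporary read URL for one specific object.
const file = storage.bucket().file('form.pdf');
const signedUrlConfig = {
  action: 'read',        // 'read' for downloading a single file, instead of 'list'
  expires: '03-17-2025'
};
file.getSignedUrl(signedUrlConfig).then((data) => {
  const url = data[0];   // temporary download URL for form.pdf
  console.log(url);
});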
I have two functions in separate files to split up the workflow.
const download = function(url){
const file = fs.createWriteStream("./test.png");
const request = https.get(url, function(response) {
response.pipe(file);
});
}
This function in my fileHelper.js is supposed to take a URL with an image in it and then save it locally to test.png
function uploadFile(filePath) {
fs.readFile('credentials.json', (err, content) => {
if (err) return console.log('Error loading client secret file:', err);
// Authorize a client with credentials, then call the Google Drive API.
authorize(JSON.parse(content), function (auth) {
const drive = google.drive({version: 'v3', auth});
const fileMetadata = {
'name': 'testphoto.png'
};
const media = {
mimeType: 'image/png',
body: fs.createReadStream(filePath)
};
drive.files.create({
resource: fileMetadata,
media: media,
fields: 'id'
}, (err, file) => {
if (err) {
// Handle error
console.error(err);
} else {
console.log('File Id: ', file.id);
}
});
});
});
}
This function in my googleDriveHelper.js is supposed to take the file path and upload that stream into my Google Drive. These two functions work on their own, but https.get works asynchronously, and if I call googleDriveHelper.uploadFile(filePath) right after the download, it doesn't have time to get the full file, so a blank file is uploaded to my Drive instead.
I want to find a way so that when fileHelper.download(url) is called, the file is automatically uploaded to my Drive.
I also don't know if there is a way to create a read stream directly from the download function to the upload function, so I can avoid having to save the file locally just to upload it.
I believe your goal is as follows.
You want to upload a file retrieved from a URL to Google Drive.
When you download the file from the URL, you want to upload it to Google Drive without creating the file.
You want to achieve this using googleapis with Node.js.
You have already been able to upload a file using Drive API.
For this, how about this answer?
Modification points:
In the download function, the retrieved buffer is converted to a stream, and that stream is returned.
In the uploadFile function, the retrieved stream is used for the upload.
When the file ID is retrieved from the response of the Drive API, please use file.data.id instead of file.id.
With the above modifications, the file downloaded from the URL can be uploaded to Google Drive without creating a local file.
Modified script:
Please modify your script as follows.
download()
const request = require("request"); // retrieves the file content as a buffer (used below)

const download = function (url) {
return new Promise(function (resolve, reject) {
request(
{
method: "GET",
url: url,
encoding: null,
},
(err, res, body) => {
if (err || res.statusCode != 200) {
reject(err);
return;
}
const stream = require("stream");
const bs = new stream.PassThrough();
bs.end(body);
resolve(bs);
}
);
});
};
uploadFile()
function uploadFile(data) { // <--- Modified
fs.readFile("drive_credentials.json", (err, content) => {
if (err) return console.log("Error loading client secret file:", err);
authorize(JSON.parse(content), function (auth) {
const drive = google.drive({ version: "v3", auth });
const fileMetadata = {
name: "testphoto.png",
};
const media = {
mimeType: "image/png",
body: data, // <--- Modified
};
drive.files.create(
{
resource: fileMetadata,
media: media,
fields: "id",
},
(err, file) => {
if (err) {
console.error(err);
} else {
console.log("File Id: ", file.data.id); // <--- Modified
}
}
);
});
});
}
For testing
For example, to test the above scripts, how about the following script?
async function run() {
const url = "###";
const data = await fileHelper.download(url);
googleDriveHelper.uploadFile(data);
}
References:
Class: stream.PassThrough
google-api-nodejs-client
I am currently in the process of creating a REST API for my personal website. I'd like to include some downloads and I would like to offer the possibility of selecting multiple ones and download those as a zip file.
My first approach was pretty simple: an array of URLs, a request for each of them, zip the results, send the zip to the user, delete it. However, I think this approach is too dirty, considering there are things like streams around, which seem quite fitting for this task.
Now I've experimented a bit and am currently struggling with the basic concept of working with streams and events across different scopes.
The following worked:
const r = request(url, options);
r.on('response', function(res) {
res.pipe(fs.createWriteStream('./file.jpg'));
});
From my understanding, r is an incoming stream in this scenario; I listen for the response event on it, and as soon as it occurs, I pipe it to a stream which I use to write to the file system.
My first step was to refactor this so it fits my case better, but I already failed here:
async function downloadFile(url) {
return request({ method: 'GET', uri: url });
}
Now I wanted to use a function which calls downloadFile() with different URLs and saves all those files to disk using createWriteStream() again:
const urls = ['https://download1', 'https://download2', 'https://download3'];
urls.forEach(element => {
downloadFile(element).then(data => {
data.pipe(fs.createWriteStream('file.jpg'));
});
});
Using the debugger I found out that the "response" event does not exist on the data object. Maybe that's already the issue? Moreover, I figured that data.body contains the bytes of my downloaded document (a PDF in this case), so I wonder if I could just stream this somewhere else?
After reading some Stack Overflow threads I found the following module: archiver
Reading this thread: Dynamically create and stream zip to client
#dankohn suggested an approach like that:
archive
.append(fs.createReadStream(file1), { name: 'file1.txt' })
.append(fs.createReadStream(file2), { name: 'file2.txt' });
This makes me assume I need to be able to extract a stream from my data object to proceed.
Am I on the wrong track here, or am I getting something fundamentally wrong?
Edit: lmao thanks for fixing my question I dunno what happened
Using archiver seems to be a valid approach; however, it would be advisable to use streams when feeding large data from the web into the zip archive. Otherwise, the whole archive would need to be held in memory.
archiver does not support adding files from streams, but zip-stream does. For reading a stream from the web, request comes in handy.
Example
// npm install -s express zip-stream request
const request = require('request');
const ZipStream = require('zip-stream');
const express = require('express');
const app = express();
app.get('/archive.zip', (req, res) => {
var zip = new ZipStream()
zip.pipe(res);
var stream = request('https://loremflickr.com/640/480')
zip.entry(stream, { name: 'picture.jpg' }, err => {
if(err)
throw err;
})
zip.finalize()
});
app.listen(3000)
Update: Example for using multiple files
Adding an example which processes the next file in the callback function of zip.entry() recursively.
app.get('/archive.zip', (req, res) => {
var zip = new ZipStream()
zip.pipe(res);
var queue = [
{ name: 'one.jpg', url: 'https://loremflickr.com/640/480' },
{ name: 'two.jpg', url: 'https://loremflickr.com/640/480' },
{ name: 'three.jpg', url: 'https://loremflickr.com/640/480' }
]
function addNextFile() {
var elem = queue.shift()
var stream = request(elem.url)
zip.entry(stream, { name: elem.name }, err => {
if(err)
throw err;
if(queue.length > 0)
addNextFile()
else
zip.finalize()
})
}
addNextFile()
})
Using Async/Await
You can encapsulate it into a promise to use async/await like:
await new Promise((resolve, reject) => {
zip.entry(stream, { name: elem.name }, err => {
if (err) reject(err)
resolve()
})
})
zip.finalize()
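Putting that together, the recursive addNextFile can be replaced with a plain loop; a sketch, assuming the same zip, queue, and request variables as in the example above:
// Sketch: add each queued download to the archive sequentially using async/await.
app.get('/archive.zip', async (req, res) => {
  const zip = new ZipStream();
  zip.pipe(res);
  for (const elem of queue) {
    const stream = request(elem.url);
    // Wait until the current entry has been fully written into the archive.
    await new Promise((resolve, reject) => {
      zip.entry(stream, { name: elem.name }, err => (err ? reject(err) : resolve()));
    });
  }
  zip.finalize();
});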
I'm integrating Quickblox in my Ionic 2 app; so far I've been able to do everything except upload a file.
In Quickblox you have to upload a file using a function of theirs that, according to the JS documentation, looks like this:
var inputFile = $("input[type=file]")[0].files[0];
var params = {name: inputFile.name,
file: inputFile,
type: inputFile.type,
size: inputFile.size,
public: false};
QB.content.createAndUpload(params, function(err, response){
if (err) {
console.log(err);
} else {
console.log(response);
var uploadedFile = response;
var uploadedFileId = response.id;
}
});
So I translated the above code to TypeScript and I have something like this:
uploadFile(filename) {
File.resolveDirectoryUrl(cordova.file.dataDirectory).then(
(dirEntry) => {
File.getFile(dirEntry, filename, { create: false }).then(
(fileEntry) => {
console.log(fileEntry);
fileEntry.file((file) => {
console.log(file);
var params = {
name: file['name'],
file: file,
type: file['type'],
size: file['size'],
'public': false
};
quickblox.content.createAndUpload(params,
(err, response) => {
if (err) {
console.log(err);
} else {
console.log(response);
var uploadedFileId = response.id;
var msg = {
type: 'groupchat',
body: "[attachment]",
extension: {
save_to_history: 1,
}
};
msg["extension"]["attachments"] = [{ id: uploadedFileId, type: 'photo' }];
quickblox.chat.send(this.xmpp_room_jid, msg);
}
});
})
}).catch(
(err) => {
console.log(err);
}
);
}
);
}
This works in the sense that I get OK responses from the Quickblox server, but when I go to the Quickblox admin console to check the uploaded content, I find that the image I uploaded has 0 bytes.
So after checking the code for a while, I compared all my function calls side by side with the Quickblox example app, and the only difference I could find was in the File object.
This is the File object I get in my Ionic 2 app:
And this is the one I get in the quickblox js example:
Everything else looks identical except this File object.
I'm almost sure that this is the problem I'm having, and because I'm quite new to this field, I couldn't find a way to convert my Ionic File object into something like the File object from the JS example.
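One thing I am considering trying (a sketch based on the standard FileReader API, not verified against Quickblox) is reading the cordova file into an ArrayBuffer and passing a real Blob to createAndUpload instead of the cordova File object:
// Sketch (untested with Quickblox): convert the cordova File into a standard Blob before uploading.
fileEntry.file((file) => {
  const reader = new FileReader();
  reader.onloadend = () => {
    const blob = new Blob([reader.result], { type: file['type'] });
    const params = {
      name: file['name'],
      file: blob,
      type: file['type'],
      size: blob.size,
      'public': false
    };
    quickblox.content.createAndUpload(params, (err, response) => {
      if (err) {
        console.log(err);
      } else {
        console.log(response);
      }
    });
  };
  reader.readAsArrayBuffer(file);
});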
Thanks in advance to all for your time and help.
EDIT:
I attach the requests/responses logs from my Ionic app:
Could you please post the code you used to connect to chat, create a session, and open a video call?
The Quickblox documentation is very bad and I got stuck at connecting to chat.