I've just started using GAS and I'm having issues uploading files to a Google Drive folder.
I'm reading the form's (HTML + Bootstrap) file inputs this way and creating the files in a Drive folder. There are 4 file inputs for different topics on the form.
function processForm(theForm) {
  // Blobs from the four file inputs
  var talleres = theForm.talleres;
  var sistEval = theForm.sistEval;
  var otrosDoc = theForm.otrosDoc;
  var matsDigi = theForm.matsDigi;
  var folder = DriveApp.getFolderById(folderId);
  // Folder named after the tutor, created in the Drive root
  var nombTuto = theForm.curso + '_' + theForm.nombre + '_' + theForm.paterno + '_' + theForm.materno;
  var foldTuto = DriveApp.createFolder(nombTuto);
  var doc = foldTuto.createFile(talleres);
  var doc2 = foldTuto.createFile(sistEval);
  var doc3 = foldTuto.createFile(otrosDoc);
  var doc4 = foldTuto.createFile(matsDigi);
  // Move the new folder into the target folder and remove it from the root
  folder.addFolder(foldTuto);
  DriveApp.removeFolder(foldTuto);
}
I've limited the file size with a JS validation in form.html to 100 MB per file, but when I try to upload more than 50 MB in total (across the 4 files) the browser console shows:
NetworkError: Connection failure due to HTTP 500
The page freezes and, obviously, nothing is saved.
So I need to know whether I have to change the upload method. I've been searching the GAS documentation but can't find anything about upload quotas or any detail on the 500 error beyond "Internal Server Error".
Sorry about my English. Regards.
With the Drive service of Google Apps Script the limit is 10 MB per upload. I guess blobs are subject to the same limit as everything else; it's an oversight on Google's part. From the documentation:
Throws an exception if content is larger than 10MB.
Issue already reported here: #552, #2806
I guess this limitation comes from the POST size limit of URL Fetch.
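As a hedged illustration, one way to fail fast instead of hitting the opaque HTTP 500 is to check each blob's size before calling createFile (the helper name below is hypothetical):

function createFileIfWithinLimit(folder, blob) {
  var LIMIT = 10 * 1024 * 1024; // 10 MB per-blob limit of the Drive service
  if (blob.getBytes().length > LIMIT) {
    throw new Error(blob.getName() + ' exceeds the 10 MB Drive service limit');
  }
  return folder.createFile(blob);
}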
I hope you're good!
I have a REST API in PHP (Flight PHP as the framework) running on a server, and I want to download a PDF saved on that server, but I'm having trouble with it.
The API resource that needs to be called to download the PDF is like:
GET /sales/:id/download
If I call the resource above from a browser, it downloads the PDF and displays it without any trouble.
Now, in the frontend (a.k.a. a web application running in my browser) I have the following code:
$scope.download = function (id) {
  $http.get($rootScope.api_url + 'sales/' + id + '/download')
    .then(function (response) {
      var resp = response.data;
      var blob = new Blob([resp], {type: 'application/pdf'});
      saveAs(blob, folio + ".pdf"); // yup, I'm using SaveAs.js
    }, function (reason) {
      alert("The file wasn't downloaded");
    });
};
The code above downloads a PDF file... but it is blank!
After opening both PDFs (the one generated by the backend and the one saved by the JS code), the downloaded one shows characters I can't read (literally, I can't read them).
So my question is: how can I download the file using a different encoding? And what is the best way to encode the file so that no characters are lost?
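For reference, a common cause of this symptom is $http decoding the binary response body as text before it reaches the Blob. A minimal, hedged sketch that requests an ArrayBuffer instead (keeping SaveAs.js; id is used for the file name here because folio isn't defined above):

$scope.download = function (id) {
  // responseType: 'arraybuffer' keeps the PDF bytes intact instead of
  // letting $http treat them as a UTF-8 string.
  $http.get($rootScope.api_url + 'sales/' + id + '/download', { responseType: 'arraybuffer' })
    .then(function (response) {
      var blob = new Blob([response.data], { type: 'application/pdf' });
      saveAs(blob, id + '.pdf');
    }, function (reason) {
      alert("The file wasn't downloaded");
    });
};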
I have made a Chrome app that relies heavily on Chrome's fileSystem API to record and save video streams from various websites. Since the stream data is processed in JavaScript before being saved, simply downloading the streams doesn't work.
Now I am considering making a Firefox version...
I know that Firefox has a sandboxed file system API, but as far as I know, it is not possible to save the files to the physical file system.
The only option I can see is creating a blob from the sandboxed file system and downloading that blob.
I have actually two questions:
Are there any options I have missed to create and save files directly in the physical file system from Firefox addons?
Even if I have to rely on the sandboxed file system, is it possible to open files in append mode, i.e. to append data to existing files?
Yes to your first question: there is the io/file API. Opening a file returns a stream (io/bytestream). Examples from the docs:
function readBinaryDataFromFile(filename) {
  var fileIO = require("sdk/io/file");
  var data = null;
  if (fileIO.exists(filename)) {
    var ByteReader = fileIO.open(filename, "rb");
    if (!ByteReader.closed) {
      data = ByteReader.read();
      ByteReader.close();
    }
  }
  return data;
}

function writeBinaryDataToFile(data, filename) {
  var fileIO = require("sdk/io/file");
  var ByteWriter = fileIO.open(filename, "wb");
  if (!ByteWriter.closed) {
    ByteWriter.write(data);
    ByteWriter.close();
  }
}
Seems like this should be easier. But admittedly I don't understand blobs.
function doGet(e) {
  var app = UiApp.createApplication();
  var panel = app.createVerticalPanel().setId('panel');
  var fileUpload = app.createFileUpload().setName('theFile').setId('theFile');
  var handler = app.createServerChangeHandler('uploadfile');
  handler.addCallbackElement(panel);
  fileUpload.addChangeHandler(handler);
  panel.add(fileUpload);
  app.add(panel);
  return app;
}

function uploadfile(e) {
  // data returned which can be used to create a blob
  // assuming mime-type to be a text file in this example
  var fileBlob = Utilities.newBlob(e.parameter.thefile, "text/plain", "file.txt");
  // Create a new file
  var doc = DocumentApp.create('Uploaded Text File');
  doc.appendParagraph(fileBlob.getDataAsString());
  // Save and close the document
  doc.saveAndClose();
  //var doc = DocsList.createFile(fileBlob.getDataAsString());
  var app = UiApp.getActiveApplication();
  app.getElementById('panel').add(app.createLabel('File Uploaded successfully'));
  return app;
}
I keep getting undefined returned when I attempt to upload a file using this Google Apps Script code. All I want to do is upload a text file to my Google Drive.
What should I do to fix this code? Or is there a better way to do this?
Google Apps Script now has experimental Advanced Google Services. I counted 15 advanced services, including the Drive API.
Enable the Drive API from the Resources menu.
You must also go to the Google Developers Console and enable the Drive API there.
The Google Documentation is at: Advanced Drive Services
Here is some sample code from Google:
function uploadFile() {
  var image = UrlFetchApp.fetch('http://goo.gl/nd7zjB').getBlob();
  var file = {
    title: 'google_logo.png',
    mimeType: 'image/png'
  };
  file = Drive.Files.insert(file, image);
  Logger.log('ID: %s, File size (bytes): %s', file.id, file.fileSize);
}
EDIT
I thought there was no way to upload a file from the user's computer to Google Drive using the HTML Service, but luckily I'm wrong again! :)
File Upload with HTML Service
To do a file upload, you cannot use a server handler. You must use a FormPanel and a Submit button. See the FileUpload widget's documentation.
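For illustration, a minimal sketch of the FormPanel approach, keeping the field name 'theFile' from the question (the exact Drive call and labels are assumptions):

function doGet() {
  var app = UiApp.createApplication();
  // A FormPanel posts its child widgets to the script's doPost() on submit.
  var form = app.createFormPanel();
  var panel = app.createVerticalPanel();
  panel.add(app.createFileUpload().setName('theFile'));
  panel.add(app.createSubmitButton('Upload'));
  form.add(panel);
  app.add(form);
  return app;
}

function doPost(e) {
  // The uploaded file arrives as a blob under the FileUpload's name.
  var fileBlob = e.parameter.theFile;
  DriveApp.createFile(fileBlob);
  var app = UiApp.getActiveApplication();
  app.add(app.createLabel('File uploaded successfully'));
  return app;
}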
I'm looking for a way to save large files (exactly 8 megabytes) in Safari. I have tried the URI scheme, eligrey's FileSaver, and the Flash plugin Downloadify. All of these cause Safari to allocate memory until the web worker process reaches about 2 gigabytes, and then Safari crashes.
I realize there have been questions like this before, but I have tried everything they suggest. Links:
Using HTML5/Javascript to generate and save a file
How to Save a file at client side using JavaScript?
create a file using javascript in chrome on client side
This code works on Firefox & Google Chrome (uses the eligreyFileSaver library for saveAs):
function io_saveData() {
  var bb;
  var buffer;
  var data;
  alert("The file will now be saved.");
  bb = new BlobBuilder();
  for (var i = 0; i < kMapHeight; i++) {
    var stduint8 = new Uint8Array(uint16map[i].buffer);
    var stduint8LittleEndian = new Uint8Array(kMapWidth * 2);
    // byte swap work around
    for (var j = 0; j < stduint8.length; j += 2) {
      stduint8LittleEndian[j] = stduint8[j + 1];
      stduint8LittleEndian[j + 1] = stduint8[j];
    }
    bb.append(stduint8LittleEndian.buffer);
  }
  var blob = bb.getBlob("example/binary");
  saveAs(blob, "Data File");
  bb = null;
  buffer = null;
  data = null;
}
I'm looking for a way for Safari to create a download without crashing. The deployment environment is Mac OS X, so each machine has Apache and PHP built in, but I would rather not take that route.
Here you go. First store the file in the HTML5 file system, and once the data is stored, download it using the FileSaver API. I tried this and got good results without blocking the UI or crashing the browser; it's better to do it in web workers for performance.
Here is a helpful article on it:
TEMPORARY storage has a default quota of 50% of available disk as a shared pool. (50GB => 25GB) (Not restricted to 1GB anymore)
http://updates.html5rocks.com/tag/filesystem
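For illustration, a rough sketch of writing a blob with the prefixed (Chrome-only) FileSystem API and exposing it via a URL; the file name and quota size here are assumptions:

// Request 8 MB of TEMPORARY storage (webkit-prefixed API, Chrome only).
window.webkitRequestFileSystem(window.TEMPORARY, 8 * 1024 * 1024, function (fs) {
  fs.root.getFile('data.bin', { create: true }, function (fileEntry) {
    fileEntry.createWriter(function (writer) {
      writer.onwriteend = function () {
        // fileEntry.toURL() can now be used as the href of a download link.
        console.log('Saved: ' + fileEntry.toURL());
      };
      writer.write(blob); // the blob built by io_saveData above
    });
  });
}, function (err) {
  console.error(err);
});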
Unfortunately, Safari 7 does not seem to support writing files.
https://github.com/eligrey/FileSaver.js/issues/12
http://caniuse.com/#feat=filesystem
I have file upload functionality on one of my pages. I check the extension of the file using JavaScript. Now I want to restrict the user from uploading files greater than 1 MB. Is there any way I can check the file size before upload using JavaScript?
My code currently looks like this:
<script language="JavaScript">
function validate() {
  var filename = document.getElementById("txtChooseFile").value;
  var ext = getExt(filename);
  if (ext == "txt" || ext == "csv")
    return true;
  alert("Please upload Text files only.");
  return false;
}
function getExt(filename) {
  var dot_pos = filename.lastIndexOf(".");
  if (dot_pos == -1)
    return "";
  return filename.substr(dot_pos + 1).toLowerCase();
}
</script>
See http://www.w3.org/TR/FileAPI/. It is supported by Firefox 3.6; I don't know about any other browsers.
Within the onchange event of a <input id="fileInput" type="file" /> simply:
var fi = document.getElementById('fileInput');
alert(fi.files[0].size); // maybe fileSize, I forget
You can also return the contents of the file as a string, and so forth. But again, this may only work with Firefox 3.6.
Now it is possible to get the file size using pure JavaScript. Nearly all browsers support FileReader, which you can use to read the file size, and you can also show an image without uploading the file to a server.
Code:
var oFile = document.getElementById("file-input").files[0]; // input box with type="file"
var img = document.getElementById("imgtag");
var reader = new FileReader();
reader.onload = function (e) {
  console.log(e.total); // file size
  img.src = e.target.result; // putting the file in the DOM without a server upload
};
reader.readAsDataURL(oFile);
You can get the file size directly from the File object using the following code:
var fileSize = oFile.size;
Other than acquiring the filename, there is no way for you to find out any other details about the file in JavaScript, including its size.
Instead, you should configure a server-side script to block oversized uploads.
Most of these answers are way out of date. It is currently possible to determine file size client-side in any browser that supports the File API, which is pretty much every browser other than IE9 and older.
It might be possible using a lot of browser-specific code. Take a look at the source of TiddlyWiki, which manages to save itself on the user's hard drive by hooking into Windows Scripting Host (IE), XPCOM (Mozilla), etc.
I don't think there is any way of doing that with plain JS from a web page.
With a browser extension maybe, but page JavaScript cannot access the filesystem for security reasons.
Flash and Java should have similar restrictions, but maybe they are a bit less strict.
Not possible. It would be a major security concern to allow client-side scripts to read file info from an end user's hard drive.
See here:
http://www.kavoir.com/2009/01/check-for-file-size-with-javascript-before-uploading.html
As for all the people saying this has to be done server-side: they are absolutely right, it does.
In my case, though, the maximum size I will accept is 128 MB. If a user tries to upload something that is 130 MB, they should not have to wait through a five-minute upload to find out it is too big, so for usability's sake I need an additional check before they submit the page, as sketched below.
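A minimal sketch of such a pre-submit check, assuming a file input with id "uploadFile" and the 128 MB limit above (the id and messages are made up):

function validateSize() {
  var input = document.getElementById('uploadFile'); // hypothetical input id
  var maxBytes = 128 * 1024 * 1024; // 128 MB
  if (input.files.length && input.files[0].size > maxBytes) {
    alert('This file is larger than 128 MB; please choose a smaller one.');
    return false; // block the submission, e.g. via onsubmit="return validateSize()"
  }
  return true;
}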
I had the same issue. Here's a simple JavaScript snippet that worked for me; adding it for future Googlers.
HTML
<input type="file" name="photo" id="photo" accept="image/*">
JS
const file = document.getElementById('photo');
file.addEventListener('change', () => {
  // Size in KB (add one more /1024 for MB)
  const filesize = file.files[0].size / 1024;
  if (filesize > 500) { // alert if greater than 500 KB
    console.log(filesize);
    alert('Please upload an image smaller than 500 KB');
    return;
  }
});