504 Error with jQuery File Upload? - javascript

I am using the API for blueimp's jQuery File Upload plugin to upload large and small files to S3. I am sending a PUT request to the server, which handles the S3 uploading. My code is below:
// Initialize the file upload so we don't get errors
angular.element(document.querySelector('#add-file-btn')).fileupload({
    url: '/test'
});
// Set up options for file upload
angular.element(document.querySelector('#add-file-btn')).fileupload(
    'option',
    {
        url: '/temp', // Temporary, replaced in handleAdd
        type: 'PUT',
        dataType: 'json',
        dropZone: 'dropzone-area',
        acceptFileTypes: /(\.|\/)(csv|jpe?g|png)$/i, // Only accept csv files (images enabled for testing)
        progressInterval: 2000, // Check progress every 2 seconds
        autoUpload: true,
        paramName: 'files',
        formData: {
            fileSize: null
        },
        beforeSend: function(xhr, data) {
            // Set request headers
            xhr.setRequestHeader('Authorization', 'bearer ' + storageService.get('token'));
            xhr.setRequestHeader('Accept', 'application/json');
        },
        multipart: true,
        add: handleAdd,
        submit: handleSubmit,
        progress: handleProgress,
        processdone: handleProcessDone,
        processfail: handleProcessFail,
        start: handleStart,
        stop: handleStop,
        done: handleDone
    }
);
When I upload a small file (200 KB), the file uploads successfully. When I try to upload a large file (2.66 GB) and log the progress in my handleProgress() function, I see that the progress hangs for several minutes, then rapidly logs the rest of the progress; right after it reaches 100%, a 504 Gateway Timeout error is returned.
function handleProgress(e, data) {
    var progress = parseInt(data.loaded / data.total * 100, 10);
    data.files[0].progress = progress;
    console.log(progress);
}
The backend developer says his code works for uploading large files via the command line, so we're trying to figure out why it doesn't also work from the browser. Any ideas? I know a 504 error is supposed to be a purely server-side problem, but it's confusing that it works from the CLI...
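If the gateway in front of the upload endpoint times out before one multi-GB request can finish, blueimp's chunked-upload support may sidestep the 504 by keeping each individual request short. A sketch of the plugin option, assuming the backend can reassemble Content-Range chunks (this is a suggestion, not the poster's confirmed fix):

```javascript
// Sketch: split the upload into 10 MB chunks so no single request
// outlives the gateway's timeout. Requires the server to accept and
// reassemble Content-Range chunk requests.
angular.element(document.querySelector('#add-file-btn')).fileupload('option', {
    maxChunkSize: 10 * 1024 * 1024 // 10 MB per request
});
```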

Related

Issue in grapesJS - Asset manager handling

I'm now working on a grapesJS implementation in our application. I have gone through the documentation provided and some of the issues reported about the Asset Manager on GitHub, etc. I was not able to keep the list of uploaded images in the Asset Manager: after a hard refresh, the list disappears.
I have tried uploading files to Amazon S3; the upload was OK, I got its response, and the image showed in the list. In this case, though, I was not able to edit the images due to a CORS issue.
Later, I tried with base64-encoded images. With this method, I was able to edit the uploaded images, and I got them in the image listing. But as I said earlier, the image list is only available until a hard refresh. I think it is coming from the cache.
I have doubts about a few parts of the code; can you help me out?
Here is the assetManager section code:
assetManager: {
    storageType: '',
    storeOnChange: true,
    storeAfterUpload: true,
    embedAsBase64: true, // Make this false to upload image to AmazonS3
    upload: siteURL + 'assets/CONTENT/img', // For temporary storage, Upload endpoint, set `false` to disable upload
    assets: [],
    // headers: {}, // Custom headers to pass with the upload request
    // params: {}, // Custom parameters to pass with the upload request, eg. csrf token
    credentials: 'include', // The credentials setting for the upload request, eg. 'include', 'omit'
    autoAdd: 1, // If true, tries to add automatically uploaded assets.
    dropzone: 0, // Enable an upload dropzone on the entire editor (not document) when dragging files over it
    openAssetsOnDrop: 1, // Open the asset manager once files have been dropped via the dropzone
    multiUpload: true, // Allow uploading multiple files per request. If disabled, filename will not have '[]' appended
    showUrlInput: true, // Toggles visibility of the assets url input
    uploadFile: function(e) {
        // Ref: https://blog.webnersolutions.com/adding-image-upload-feature-in-grapesjs-editor/
        var files = e.dataTransfer ? e.dataTransfer.files : e.target.files;
        var formData = new FormData();
        for (var i in files) {
            formData.append('file-' + i, files[i]); // Containing all the selected images from local
        }
        $.ajax({
            url: siteURL + 'uploadImage_base64', // Save image as base64 encoded - it's a function
            // url: siteURL + 'uploadImage', // Upload image to AmazonS3 - it's a function
            type: 'POST',
            data: formData,
            contentType: false,
            crossDomain: true,
            dataType: 'json',
            mimeType: "multipart/form-data",
            processData: false,
            success: function(result) {
                var myJSON = [];
                if ((typeof(result['data']) != 'undefined') && (result != 'null')) {
                    $.each(result['data'], function(key, value) {
                        myJSON[key] = value;
                    });
                    console.log(myJSON);
                    // while using base64 encode => 0: {name: "ipsumImage.png", type: "image", src: "data:image/png;base64,iVBORw0KGgAAVwA…AAAAAAAAAAAAAAAD4Pv4B6rBPej6tvioAAAAASUVORK5CYII=", height: 145, width: 348}
                    // while using AmazonS3 => 0: {name: "logo_sigclub.png", type: "image", src: "https://amazonaws.com/assets/CONTENT/img/logo_sigclub.png", status: true, message: "Uploaded successfully", …}
                    editor.AssetManager.add(myJSON); // adding images to the Asset Manager of GrapesJS
                }
            }
        });
    }
}
My doubts are:
Is upload: siteURL + 'assets/CONTENT/img' the image upload path, or a function that uploads the image?
If I'm using Amazon S3, will I be able to get the editing option to work despite the CORS issue?
How can I show the uploaded images in the Asset Manager without them disappearing later?
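One aside on the uploadFile handler above: for (var i in files) also walks the non-index properties of a FileList (such as length and item), which can append junk entries to the FormData. A plain index loop visits only the actual files. A minimal sketch, using a mock array-like object in place of a real FileList:

```javascript
// Mock FileList: numeric indices plus the extra members a real one exposes.
var files = { 0: 'a.png', 1: 'b.png', length: 2, item: function () {} };

// for...in picks up 'length' and 'item' as well as the indices...
var forInKeys = [];
for (var k in files) { forInKeys.push(k); }

// ...while an index loop visits only the files themselves.
var names = [];
for (var i = 0; i < files.length; i++) { names.push(files[i]); }

console.log(forInKeys.length); // 4: includes the non-file keys
console.log(names); // ['a.png', 'b.png']
```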

Dropzone.js - Laravel | 500 Internal Server Error

I want to upload my images with dropzone.js. I receive the images on the server side fine, but when I try to save them, I get a 500 Internal Server Error.
Dropzone.options.mDropzoneTwo = {
    paramName: "file", // The name that will be used to transfer the file
    maxFiles: 10,
    maxFilesize: 10, // MB
    url: 'images/save',
    method: 'POST',
    headers: {
        "X-CSRF-TOKEN": document.head.querySelector("[name=csrf-token]").content
    },
    uploadMultiple: true,
    accept: function(file, done) {
        done();
    },
    success: function () {
        console.log()
    }
};
And here is my controller. I'm saving images to public/uploads folder.
$realname = str_slug(pathinfo($request->file('file')->getClientOriginalName(), PATHINFO_FILENAME));
$extension = $request->file('file')->getClientOriginalExtension();
$new_name = $realname."-".time().".".$extension;
$request->file('file')->move(public_path('uploads/'.str_slug($new_name)));
$image = new Media();
$image->image = str_slug($new_name);
$image->image_path = "images/".str_slug($new_name);
$image->image_alt_name = $realname;
$image->save();
As per the comments:
It means your application is not getting a file object from $request->file('file'). You can get more info by printing $request, and then check whether the file is actually being sent from the client script.
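For what it's worth, one likely cause (an assumption based on Dropzone's documented behavior, not something the poster confirmed): with uploadMultiple: true, Dropzone posts the files as file[0], file[1], ..., so $request->file('file') comes back null on the Laravel side. A sketch with multiple upload disabled, so each request carries a single plain 'file' parameter:

```javascript
// Sketch: with uploadMultiple off, Dropzone sends one request per file,
// with the file under the plain paramName ('file'), which matches
// $request->file('file') in the controller.
Dropzone.options.mDropzoneTwo = {
    paramName: "file",
    url: 'images/save',
    method: 'POST',
    uploadMultiple: false // files arrive as 'file', not 'file[0]', 'file[1]', ...
};
```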

Javascript Blob to File / Sending Files through Ajax Request

In an Express app, I have a simple image uploader which was working perfectly using Ajax PUT requests. But now I need to crop the images before sending them to the server.
I plugged in Croppie to help with this. On form submit I get the cropped image blobs from Croppie, convert them to files, append the files to the Ajax FormData object, and send the FormData off to Express.
But the server isn't receiving any files, and examining the PUT request in Chrome's dev tools shows that no file info is being sent.
I've console.logged the blobs and the files, and as far as I can tell they are being created properly.
Getting the blobs and converting them to files:
var fileList = []
if ($croppieContainers.length > 0) {
    $.each($croppieContainers, function(i) {
        $(this).croppie('result', {
            type: 'blob' // Get the blob from the croppie container
        }).then(function(blob) {
            let file = new File([blob], "image-" + i, {
                type: blob.type,
                lastModifiedDate: new Date() // Make a file from the blob...
            });
            fileList.push(file) // And push it to fileList
        });
    });
}
Appending the files to ajaxData:
if (fileList.length > 0) {
    $.each('fileList', function(file) {
        ajaxData.append($input.attr('name'), file);
    })
}
Posting to Express:
$.ajax({
    url: $form.attr('action'),
    type: $form.attr('method'),
    data: ajaxData,
    // dataType: 'json',
    cache: false,
    contentType: false,
    processData: false,
    complete: function() {
        $form.removeClass('is-uploading');
    },
    success: function(data) {
        // Handlers...
console.log(blob) returns:
Blob
size:288345
type: "image/png"
__proto__: Blob
And the file:
File
lastModified : 1478972364233
lastModifiedDate : Sat Nov 12 2016 17:39:24 GMT+0000 (GMT)
name : "image-0"
size : 288345
type : "image/png"
webkitRelativePath : ""
__proto__ : File
I've tried getting the file a different way, but it gives the same result:
if ($croppieContainers.length > 0) {
    $.each($croppieContainers, function(i) {
        let file = new File([$(this).croppie('result', {
            type: 'blob'
        })], "image-" + i)
        ajaxData.append($input.attr('name'), file)
    });
}
I'm not sure how to debug this, apart from console.logging at various stages and checking the XHR in dev tools...
Can anyone see what the problem is, or tell me how I can debug further?
Thank you!
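One thing worth checking in the append step: $.each('fileList', ...) iterates the string 'fileList' rather than the array, and jQuery's $.each callback signature is (index, value), so file would receive an index even if the array were passed. A minimal sketch of the loop with both issues removed, assuming ajaxData is the FormData instance, with plain strings standing in for the File objects:

```javascript
// jQuery's $.each callback receives (index, value), and the original code
// quoted the array name, so no File ever reached the FormData.
// A plain index loop avoids both pitfalls.
var fileList = ['image-0', 'image-1']; // stand-ins for the File objects
var appended = [];
for (var i = 0; i < fileList.length; i++) {
    // in the real code: ajaxData.append($input.attr('name'), fileList[i]);
    appended.push(fileList[i]);
}
console.log(appended.length); // 2: both files reach the request body
```

Note also that croppie('result') resolves asynchronously (the .then callback pushes to fileList later), so fileList may still be empty at the moment the $.ajax call fires; the append and send steps may need to wait for those promises.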

Cannot Access uploaded file with XHR - laravel

I use Laravel with LPology/Simple-Ajax-Uploader to upload files via XHR requests.
However, I cannot access the uploaded file with either
Input::file('name') // or
Input::get('name')
I get NULL for both.
How can I access a file uploaded via an XHR request?
Here is my JS code:
var sizeBox = document.getElementById('sizeBox'), // container for file size info
    progress = document.getElementById('progress'); // the element we're using for a progress bar
var uploader = new ss.SimpleUpload({
    button: 'upload-btn', // file upload button
    url: '{{action('filesController#postPicUpload')}}', // server side handler
    name: 'myfile', // upload parameter name
    progressUrl: 'uploadProgress.php', // enables cross-browser progress support (more info below)
    responseType: 'json',
    allowedExtensions: ['jpg', 'jpeg', 'png', 'gif'],
    maxSize: 1024, // kilobytes
    hoverClass: 'ui-state-hover',
    focusClass: 'ui-state-focus',
    startXHR: function() {
        console.log('startXHR');
    },
    startNonXHR: function() {
        console.log('startNonXHR');
    },
    disabledClass: 'ui-state-disabled',
    onSubmit: function(filename, extension) {
        this.setFileSizeBox(sizeBox); // designate this element as file size container
        this.setProgressBar(progress); // designate as progress bar
    },
    onComplete: function(filename, response) {
        console.log(response);
        if (!response) {
            alert(filename + ' upload failed');
            return false;
        }
        // do something with response...
    }
});
and here is my action:
public function postPicUpload() {
    $file = Input::file('myfile');
    $file->move('imgs/' . date('Y/n/j/', time()), $file->getClientOriginalName());
}
any help?
I had the same problem with a normal form; I had forgotten to add the form encoding:
enctype="multipart/form-data"

Upload files to Dropbox using a Dropbox Core API in Javascript

I am working on a simple Chrome extension that needs to upload files to the user's Dropbox folder. I am using the simple AJAX request below to upload files; however, it only works for files with extensions such as .txt, .json, .c, etc., i.e. files whose MIME type is text/plain or similar. All other file types, such as PDFs and image files, get corrupted and produce blank contents. What am I missing to upload the files correctly?
function startUpload() {
    var folderPath = $(this).closest('tr').attr('path') + '/';
    var file = $("#upload_file")[0].files[0];
    if (!file) {
        alert("No file selected to upload.");
        return false;
    }
    var reader = new FileReader();
    reader.readAsText(file, "UTF-8");
    reader.onload = function (evt) {
        uploadFile(folderPath + file.name, evt.target.result, file.size, file.type);
    }
}
// function to upload file to folder
function uploadFile(filepath, data, contentLength, contentType) {
    var url = "https://api-content.dropbox.com/1/files_put/auto" + filepath;
    var headers = {
        Authorization: 'Bearer ' + getAccessToken(),
        contentLength: contentLength,
    };
    var args = {
        url: url,
        headers: headers,
        crossDomain: true,
        crossOrigin: true,
        type: 'PUT',
        contentType: contentType,
        data: data,
        dataType: 'json',
        success: function(data) {
            getMetadata(filepath.substring(0, filepath.lastIndexOf('/')), createFolderViews);
        },
        error: function(jqXHR) {
            console.log(jqXHR);
        }
    };
    $.ajax(args);
}
I believe the issue is reader.readAsText(file, "UTF-8"). If the file isn't a text file, this will misinterpret the contents. I think you want reader.readAsBinaryString or reader.readAsArrayBuffer. (I haven't tested it myself.)
EDIT
After testing this myself, I found that readAsArrayBuffer is what you need, but you also need to add processData: false as an option to $.ajax to prevent jQuery from trying to convert the data to fields in a form submission.
Also be sure to use dataType: 'json' to properly parse the response from the server.
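The corruption described above is easy to demonstrate: decoding arbitrary binary bytes as UTF-8 is lossy, so the string handed to the upload no longer round-trips to the original bytes. A minimal sketch using TextDecoder/TextEncoder, which apply the same UTF-8 decoding rules that readAsText uses:

```javascript
// Bytes like those at the start of a PNG file; 0x89 is not valid UTF-8.
var bytes = new Uint8Array([0x89, 0x50, 0x4e, 0x47]); // "\x89PNG"

// Text-style decoding (what readAsText does) replaces invalid sequences
// with the U+FFFD replacement character...
var text = new TextDecoder('utf-8').decode(bytes);

// ...so re-encoding the string yields different bytes: the data is corrupted.
var roundTrip = new TextEncoder().encode(text);

console.log(text.charCodeAt(0) === 0xfffd); // true: replacement character
console.log(roundTrip.length); // 6, not 4: the original bytes are gone
```

This is exactly why readAsArrayBuffer (plus processData: false) works: the bytes are sent as-is, with no text decoding step in between.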
