I use Laravel with LPology/Simple-Ajax-Uploader to upload files via XHR requests.
However, I cannot access the uploaded file with
Input::file('name') // or with
Input::get('name')
I get NULL for both.
How can I access a file uploaded via an XHR request?
Here is my JS code:
var sizeBox = document.getElementById('sizeBox'), // container for file size info
    progress = document.getElementById('progress'); // the element we're using for a progress bar

var uploader = new ss.SimpleUpload({
    button: 'upload-btn', // file upload button
    url: '{{ action('filesController@postPicUpload') }}', // server-side handler
    name: 'myfile', // upload parameter name
    progressUrl: 'uploadProgress.php', // enables cross-browser progress support (more info below)
    responseType: 'json',
    allowedExtensions: ['jpg', 'jpeg', 'png', 'gif'],
    maxSize: 1024, // kilobytes
    hoverClass: 'ui-state-hover',
    focusClass: 'ui-state-focus',
    startXHR: function() {
        console.log('startXHR');
    },
    startNonXHR: function() {
        console.log('startNonXHR');
    },
    disabledClass: 'ui-state-disabled',
    onSubmit: function(filename, extension) {
        this.setFileSizeBox(sizeBox); // designate this element as file size container
        this.setProgressBar(progress); // designate as progress bar
    },
    onComplete: function(filename, response) {
        console.log(response);
        if (!response) {
            alert(filename + ' upload failed');
            return false;
        }
        // do something with response...
    }
});
And here is my controller action:
public function postPicUpload() {
    $file = Input::file('myfile');
    $file->move('imgs/' . date('Y/n/j/', time()), $file->getClientOriginalName());
}
Any help?
I had the same problem with a normal form; I had forgotten to add the following attribute to the form tag:
enctype="multipart/form-data"
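For an XHR upload, the equivalent of that attribute is sending the file inside a FormData object, which makes the browser encode the request as multipart/form-data automatically. A minimal sketch, assuming a plain file input and a hypothetical /pic-upload endpoint:

var input = document.getElementById('file-input'); // assumed <input type="file">
var formData = new FormData();
formData.append('myfile', input.files[0]); // must match the parameter name the server reads

var xhr = new XMLHttpRequest();
xhr.open('POST', '/pic-upload'); // hypothetical endpoint
// Do not set Content-Type manually; the browser adds the multipart boundary itself.
xhr.onload = function() {
    console.log(xhr.status, xhr.responseText);
};
xhr.send(formData);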
Related
I'm now working on a grapesJS implementation in our application. I have gone through the documentation and some of the issues reported about the Asset Manager on GitHub, but I was not able to keep the list of uploaded images in the Asset Manager: after a hard refresh the list disappears.
I tried uploading files to Amazon S3; the upload worked, I got its response, and the image showed up in the list, but I was not able to edit the images due to a CORS issue.
Later I tried base64-encoded images. With this method I was able to edit the uploaded images, and they also appeared in the image listing, but as I said, the list is only available until a hard refresh, so I think it is coming from the cache.
I have doubts about a few parts of the code; can you help me out?
Here is the assetManager section code:
assetManager: {
    storageType: '',
    storeOnChange: true,
    storeAfterUpload: true,
    embedAsBase64: true, // Make this false to upload images to Amazon S3
    upload: siteURL + 'assets/CONTENT/img', // For temporary storage; upload endpoint, set `false` to disable upload
    assets: [],
    // headers: {}, // Custom headers to pass with the upload request
    // params: {}, // Custom parameters to pass with the upload request, eg. csrf token
    credentials: 'include', // The credentials setting for the upload request, eg. 'include', 'omit'
    autoAdd: 1, // If true, tries to add automatically uploaded assets
    dropzone: 0, // Enable an upload dropzone on the entire editor (not document) when dragging files over it
    openAssetsOnDrop: 1, // Open the Asset Manager once files have been dropped via the dropzone
    multiUpload: true, // Allow uploading multiple files per request; if disabled, '[]' is not appended to the filename
    showUrlInput: true, // Toggles visibility of the asset URL input
    uploadFile: function(e) {
        // Ref: https://blog.webnersolutions.com/adding-image-upload-feature-in-grapesjs-editor/
        var files = e.dataTransfer ? e.dataTransfer.files : e.target.files;
        var formData = new FormData();
        for (var i = 0; i < files.length; i++) { // indexed loop; for...in would also pick up FileList properties
            formData.append('file-' + i, files[i]); // all the selected images from local
        }
        $.ajax({
            url: siteURL + 'uploadImage_base64', // Save image as base64 encoded - it's a function
            // url: siteURL + 'uploadImage', // Upload image to Amazon S3 - it's a function
            type: 'POST',
            data: formData,
            contentType: false,
            crossDomain: true,
            dataType: 'json',
            mimeType: 'multipart/form-data',
            processData: false,
            success: function(result) {
                var myJSON = [];
                if (result && typeof result.data !== 'undefined') { // check result before reading .data
                    $.each(result.data, function(key, value) {
                        myJSON[key] = value;
                    });
                    console.log(myJSON);
                    // while using base64 encode => 0: {name: "ipsumImage.png", type: "image", src: "data:image/png;base64,iVBORw0KGgAAVwA…AAAAAAAAAAAAAAAD4Pv4B6rBPej6tvioAAAAASUVORK5CYII=", height: 145, width: 348}
                    // while using AmazonS3 => 0: {name: "logo_sigclub.png", type: "image", src: "https://amazonaws.com/assets/CONTENT/img/logo_sigclub.png", status: true, message: "Uploaded successfully", …}
                    editor.AssetManager.add(myJSON); // adding images to the Asset Manager of GrapesJS
                }
            }
        });
    }
}
I have doubts about the following:
Is upload: siteURL + 'assets/CONTENT/img' the image upload path, or a function that uploads the image?
If I'm using Amazon S3, can I get the editing option to work despite the CORS issue?
How can I show the uploaded images in the Asset Manager without them disappearing later? (A sketch follows below.)
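On the last point, a minimal sketch of one way to repopulate the Asset Manager after a refresh, assuming a hypothetical siteURL + 'assets/list' endpoint that returns the previously uploaded images as a JSON array in the same shape the success handler above logs:

editor.on('load', function() {
    $.getJSON(siteURL + 'assets/list', function(saved) { // hypothetical endpoint
        if (saved && saved.length) {
            editor.AssetManager.add(saved); // same call used in the upload success handler
        }
    });
});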
As of now, I have a file upload API using Shrine to handle file attachments, and I have added tus-ruby-server to the mix to add the ability to resume uploads.
I have followed the steps here to make it so that when a file is uploaded through tus-ruby-server's endpoint, a database record is also created using Shrine's file handling.
When I POST a file directly to my API's uploads endpoint, it sends the JSON data like so:
{"video"=>#<ActionDispatch::Http::UploadedFile:0x007f811d1d5920 #tempfile=#<Tempfile:/tmp/RackMultipart20170517-19364-6pfr0g.webm>, #original_filename="video.webm", #content_type="video/webm", #headers="Content-Disposition: form-data; name=\"video\"; filename=\"video.webm\"\r\nContent-Type: video/webm\r\n">, "id"=>"35"}
When I upload a file using tus and then POST that generated file data to my API, it sends the following JSON (which fails):
{"video"=>"{\"id\":\"http://localhost:3000/files/8794bf67f32d01cabb9416ca9febf3c3\",\"storage\":\"cache\",\"metadata\":{\"filename\":\"video.webm\",\"size\":142288,\"mime_type\":\"\"}}", "id"=>"35"}
Is there any indication of what is going wrong here?
Here's the JavaScript that handles the uploading (using tus-js-client):
var upload = new tus.Upload(file, {
    chunkSize: 1024 * 1024,
    endpoint: 'http://localhost:3000/files',
    metadata: {filename: file.name, content_type: file.type},
    onError: function(error) {
        alert(error);
    },
    onProgress: function(bytesSent, bytesTotal) {
        var progress = parseInt(bytesSent / bytesTotal * 100, 10);
        var percentage = progress.toString() + '%';
        console.log(percentage + ' finished');
    },
    onSuccess: function(result) {
        var fileData = {
            id: upload.url,
            storage: 'cache',
            metadata: {
                filename: upload.file.name.match(/[^\/\\]+$/)[0],
                size: upload.file.size,
                mime_type: upload.file.type
            }
        };
        var fileDataJSON = JSON.stringify(fileData);
        var formData = new FormData();
        formData.append('video', fileDataJSON);
        makeXMLHttpRequest('http://localhost:3000/api/uploads/' + gon.currentId, 'PUT', formData);
    }
});

console.log('Uploading video recording to server...');
upload.start();
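One visible difference between the two payloads is that the direct upload hands the API an actual multipart file part, while the tus flow sends video as a single JSON string that the server has to parse itself. If the endpoint expects nested parameters, one possible workaround (a sketch, assuming the Rails-side strong parameters permit this structure) is to append the fields individually so Rack parses video into a hash instead of a string:

var formData = new FormData();
formData.append('video[id]', upload.url);
formData.append('video[storage]', 'cache');
formData.append('video[metadata][filename]', upload.file.name.match(/[^\/\\]+$/)[0]);
formData.append('video[metadata][size]', upload.file.size);
formData.append('video[metadata][mime_type]', upload.file.type);
makeXMLHttpRequest('http://localhost:3000/api/uploads/' + gon.currentId, 'PUT', formData);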
The JavaScript I used before to handle uploading without tus:
var formData = new FormData();
formData.append('video', blob, 'video.webm');
makeXMLHttpRequest('http://localhost:3000/api/uploads/' + gon.currentId, 'PUT', formData);
Dropzone.js seems to upload images as multipart form data. How do I get it to upload images the same way an upload works with cURL or a binary image upload with Postman?
I'm getting a pre-signed URL for S3 from a server. The pre-signed URL allows an image upload, but no form fields:
var myDropzone = new Dropzone("#photo-dropzone");
myDropzone.options.autoProcessQueue = false;
myDropzone.options.autoDiscover = false;
myDropzone.options.method = "PUT";

myDropzone.on("addedfile", function(file) {
    console.log("Photo dropped: " + file.name);
    console.log("Do presign URL: " + doPresignUrl);
    $.post(doPresignUrl, { photoName: file.name, description: "Image of something" })
        .done(function(data) {
            myDropzone.options.url = data.url;
            console.log(data.url);
            myDropzone.processQueue();
        });
});
If I use the returned URL with Postman, set the body to binary, and attach the image, the upload works fine. However, if the Dropzone library uses the same URL to upload the image to S3, I get a 403 because S3 does not expect form fields.
Update:
An Ajax alternative works as shown below with an S3 signed URL, but Dropzone.js does not seem willing to put the raw image data in the PUT message body.
$.ajax({
    url: data.url,
    type: 'PUT',
    data: file,
    processData: false,
    contentType: false,
    headers: {'Content-Type': 'multipart/form-data'},
    success: function() {
        console.log("File was uploaded");
    }
});
Set maxFiles to 1.
Dropzone.autoDiscover = false;
dzAllocationFiles = new Dropzone("div#file-container", {
    url: "api.php?do=uploadFiles",
    autoDiscover: false,
    maxFiles: 1,
    autoQueue: true,
    addRemoveLinks: true,
    acceptedFiles: "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"
});

dzAllocationFiles.on("success", function(file, response) {
    // Success operations
});

dzAllocationFiles.on("maxfilesexceeded", function(file, response) {
    allocationFileNames = [];
    this.removeAllFiles();
    this.addFile(file);
});
Add the options below and it works: overriding xhr.send makes Dropzone send the raw file as the request body instead of multipart form data.
myDropzone.options.sending = function(file, xhr) {
    var _send = xhr.send;
    xhr.send = function() {
        _send.call(xhr, file); // send the raw file bytes instead of the FormData body
    };
};
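With this override in place, Dropzone still manages the queue, events, and progress, but the body that reaches the signed URL is the bare file, which matches what a pre-signed S3 PUT expects.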
I am using the API of blueimp's jQuery File Upload plugin to upload large and small files to S3. I am sending a PUT request to the server, which handles the S3 upload. My code is below:
// Initialize the file upload so we don't get errors
angular.element(document.querySelector('#add-file-btn')).fileupload({
    url: '/test'
});

// Set up options for file upload
angular.element(document.querySelector('#add-file-btn')).fileupload(
    'option',
    {
        url: '/temp', // Temporary, replaced in handleAdd
        type: 'PUT',
        dataType: 'json',
        dropZone: 'dropzone-area',
        acceptFileTypes: /(\.|\/)(csv|jpe?g|png)$/i, // Only accept csv files (images enabled for testing)
        progressInterval: 2000, // Check progress every 2 seconds
        autoUpload: true,
        paramName: 'files',
        formData: {
            fileSize: null
        },
        beforeSend: function(xhr, data) {
            // Set request headers
            xhr.setRequestHeader('Authorization', 'bearer ' + storageService.get('token'));
            xhr.setRequestHeader('Accept', 'application/json');
        },
        multipart: true,
        add: handleAdd,
        submit: handleSubmit,
        progress: handleProgress,
        processdone: handleProcessDone,
        processfail: handleProcessFail,
        start: handleStart,
        stop: handleStop,
        done: handleDone
    }
);
When I upload a small file (200 KB), the file is successfully uploaded. When I try to upload a large file (2.66 GB) and log the progress from my handleProgress() function, the progress hangs for several minutes, then rapidly logs the remainder, so that right after it reaches 100% a 504 Gateway Timeout error is returned.
function handleProgress(e, data) {
    var progress = parseInt(data.loaded / data.total * 100, 10);
    data.files[0].progress = progress;
    console.log(progress);
}
The backend guy says his code works for uploading large files via the command line, so we're trying to figure out why it's not also working from the browser. Any ideas? I know the 504 error is supposed to be a purely server-side problem, but it's confusing that it works from the CLI...
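One thing worth trying on the browser side, as a sketch rather than a confirmed fix, is blueimp's chunked upload support, so that no single request runs long enough to trip the gateway timeout. The 10 MB chunk size below is an arbitrary choice, and the server must accept blueimp's Content-Range chunk protocol:

angular.element(document.querySelector('#add-file-btn')).fileupload('option', {
    maxChunkSize: 10 * 1024 * 1024 // split the upload into 10 MB requests so each one finishes quickly
});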
I know many have asked this question and I have looked into their posts, but I still haven't found the correct answer, so here goes.
I'm trying to upload an image to S3 using the browser-based upload technique introduced by the Amazon developers. Right now I am able to calculate both the policy and the signature on my end, but when I try to upload an image I always get a "Signature does not match" error (>.<). One main complication is that the credentials I have are only temporary ones from the AWS Security Token Service, consisting of an access key, a secret key, and a security token. I'll post my code; any help is appreciated.
Here's my policy JSON conversion:
function setValues(accesskey, secretkey, token) {
    var folder = 'avatars/email#domain.com/',
        acl = 'public-read',
        bucket = 'my-bucket';
    var POLICY_JSON = {
        "expiration": "2013-12-03T12:29:27.000Z",
        "conditions": [
            {"bucket": bucket},
            ["starts-with", "$key", folder],
            {"acl": acl},
            {"success_action_redirect": "201"},
            ["starts-with", "$Content-Type", "image/png"]
        ]
    };
    var policy = Base64.encode(JSON.stringify(POLICY_JSON));
    var signature = b64_hmac_sha1(secretkey, policy);
    return {
        "policy": policy,
        "signature": signature,
        "folder": folder,
        "acl": acl,
        "bucket": bucket
    };
}
My upload function:
function upload(acl, accesskey, policy, signature, token) {
    $(':button').click(function() {
        var file = document.getElementById('file').files[0];
        // var key = "events/" + (new Date).getTime() + '-' + file.name;
        var xdate = '20131016T105000Z';
        // var formData = new FormData($('form')[0]);
        var formData = new FormData();
        formData.append("key", "avatars/email#domain.com/${filename}");
        formData.append("acl", acl);
        formData.append("success_action_redirect", "201");
        formData.append("Content-Type", file.type);
        formData.append("AWSAccessKeyId", accesskey);
        formData.append("policy", policy);
        formData.append("signature", signature);
        // formData.append("x-amz-security-token", token);
        formData.append("file", file);
        $.ajax({
            url: 'https://mybucket.s3.amazonaws.com', // Server script to process data
            type: 'POST',
            xhr: function() { // Custom XMLHttpRequest
                var myXhr = $.ajaxSettings.xhr();
                if (myXhr.upload) { // Check if upload property exists
                    myXhr.upload.addEventListener('progress', progressHandlingFunction, false); // For handling the progress of the upload
                }
                return myXhr;
            },
            // Ajax events
            beforeSend: function(e) {
                e.setRequestHeader("Authorization", "AWS " + accesskey + ":" + signature);
                e.setRequestHeader("x-amz-date", xdate);
                e.setRequestHeader("x-amz-acl", acl);
                e.setRequestHeader("x-amz-security-token", token);
                // alert('Are you sure you want to upload document.');
            },
            success: function(e) { alert('Upload completed'); },
            error: function(jqXHR, textStatus, errorThrown) {
                console.log(textStatus);
                console.log(errorThrown);
            },
            // Form data
            data: formData,
            // Options to tell jQuery not to process data or worry about content-type
            cache: false,
            contentType: false,
            processData: false
        });
    });
}
My only HTML:
<form enctype="multipart/form-data">
    <input id="file" name="file" type="file" />
    <input type="button" value="Upload" />
</form>
This is what confuses me: at first, inside mybucket there is a folder named avatars. When my colleague uploaded an image (say image1.jpg) using his email address (say other_email_address#domain.com) and the upload succeeded, looking at the bucket we saw it had created a folder named after his email address with the image inside it. Why is that?
mybucket/
- avatars/
  - my_email_address#domain.com/
    - image1
  - other_email_address#domain.com/
    - image1
  - and so on...
The tools I used are: webtoolkit.base64.js, sha1.js, and base64-binary.js.
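One detail that commonly causes "Signature does not match" with STS credentials: for browser-based POST uploads, the temporary x-amz-security-token has to be sent as a form field and included in the signed policy's conditions, not just set as a header. A sketch of the adjusted policy and form, based on the code above:

var POLICY_JSON = {
    "expiration": "2013-12-03T12:29:27.000Z",
    "conditions": [
        {"bucket": bucket},
        ["starts-with", "$key", folder],
        {"acl": acl},
        {"success_action_redirect": "201"},
        ["starts-with", "$Content-Type", "image/png"],
        {"x-amz-security-token": token} // the STS token must be part of the signed policy
    ]
};

// ...and the matching form field, appended before the file:
formData.append("x-amz-security-token", token);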