javascript File to java bytes[] with resteasy - javascript

I am trying to upload a file from an AngularJS front end to a JBoss back office, but I get this exception when the service is called:
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of byte[] out of START_OBJECT token
at [Source: io.undertow.servlet.spec.ServletInputStreamImpl#1cc8ac9f; line: 1, column: 39] (through reference chain: fr.test.Document["datas"
])
I guess there is something wrong in how I convert the JavaScript File into a Java byte[], but I don't really know what.
I collect the file via a regular input type="file":
<input
    type="file"
    style="display: none;"
    onchange="angular.element(this).controller().addDocument(this)"
>
Then I catch the result with an onchange method in my controller (I read in this link, Return the Array of Bytes from FileReader(), that I have to use a promise that returns a byte array):
...
ctrl.addDocument = function(element) {
    var file = element.files[0];
    var fileData = new Blob([element.files[0]]);
    var promise = new Promise(function(resolve) {
        var reader = new FileReader();
        reader.readAsArrayBuffer(fileData);
        reader.onload = function() {
            var arrayBuffer = reader.result;
            var bytes = new Uint8Array(arrayBuffer);
            resolve(bytes);
        };
    });
    promise.then(function(data) {
        var document = {
            name: file.name,
            type: file.type,
            datas: data
        };
        console.dir(document);
        ctrl.doc = document;
    }).catch(function(err) {
        console.log(err);
        growl.error('Unable to upload file');
    });
};
...
Finally, the back office is called through a REST service. Here is the declaration of my back office service:
@POST
@Path("/uploadFile")
@Consumes(MediaType.APPLICATION_JSON)
public void uploadFile(Document document) {
    LOGGER.info("upload document !");
}
And the Document object, which contains the same properties:
public class Document {

    private byte[] datas;
    private String name;
    private String type;

    public byte[] getDatas() {
        return datas;
    }

    public void setDatas(byte[] datas) {
        this.datas = datas;
    }
...
If I comment out the line "ctrl.doc = document;" in my controller (the bytes part), the service works correctly.
What did I miss? Can I really send a file this way?
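A note on the exception itself: Jackson binds a byte[] property to a Base64-encoded JSON string, while JSON.stringify on a Uint8Array produces an object of index/value pairs, which is the START_OBJECT token in the trace. A minimal sketch of that behaviour (the Document class here is a simplified stand-in for the one above, with public fields instead of getters):

import com.fasterxml.jackson.databind.ObjectMapper;

public class JacksonByteArrayDemo {

    // Stand-in DTO: Jackson maps byte[] to and from a Base64 string.
    public static class Document {
        public String name;
        public String type;
        public byte[] datas;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        // A Base64 string deserializes into byte[] with no custom code.
        Document ok = mapper.readValue(
                "{\"name\":\"a.txt\",\"type\":\"text/plain\",\"datas\":\"aGVsbG8=\"}",
                Document.class);
        System.out.println(ok.datas.length); // 5 bytes ("hello")

        // A serialized Uint8Array arrives as {"0":104,"1":105,...}, which Jackson rejects.
        try {
            mapper.readValue(
                    "{\"name\":\"a.txt\",\"type\":\"text/plain\",\"datas\":{\"0\":104,\"1\":105}}",
                    Document.class);
        } catch (Exception e) {
            System.out.println(e.getMessage()); // the START_OBJECT error from the question
        }
    }
}

So the byte values reach Jackson in a shape it does not map to byte[]; the related answers below avoid this by sending either a Base64 string or multipart form data instead.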

Related

Ionic 4 upload screenshot to server

I want to take a screenshot and upload it to a server (I use Spring Boot). For this I used the native screenshot library and its Angular service to get the image URI, transformed the image URI to a blob, and sent it using FormData and a POST request of HttpClient. The problem is in the back office, where I get "no parameter named file is found". Please, can anyone help me?
N.B.: I use MultipartFile as the web service parameter type and the @RequestParam annotation.
Here is the Java code:
@PostMapping(value = { "/uploadImg/{idColis}" })
public void uploadScreenShot(@RequestParam("file") MultipartFile file, @PathVariable String idColis) {
    if (file != null) {
        try {
            fileService.importerImage(file);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
The Angular code used:
call(colis: any) {
    this.screenshot.URI(80).then(img => {
        this.screenShotsuccess = 'screened';
        this.colisService.upload(img, colis).subscribe(res => {
            this.screenShotsuccess = 'screened and uploaded';
        });
    }, err => {
        this.screenShotsuccess = err;
    });
}

upload(imgData: any, colis: any) {
    // Replace extension according to your media type
    const imageName = colis.codeEnvoi + '.jpg';
    // call method that creates a blob from dataUri
    const imageBlob = this.dataURItoBlob(imgData.URI);
    const imageFile = new File([imageBlob], imageName, { type: 'image/jpeg' });
    let postData = new FormData();
    postData.append('file', imageFile);
    let data: Observable<any> = this.httpClient.post(this.wsListeUploadImage + colis.codeEnvoi, postData);
    return data;
}

dataURItoBlob(dataURI) {
    console.log(dataURI);
    const byteString = window.atob(dataURI.split(',')[1]);
    const arrayBuffer = new ArrayBuffer(byteString.length);
    const int8Array = new Uint8Array(arrayBuffer);
    for (let i = 0; i < byteString.length; i++) {
        int8Array[i] = byteString.charCodeAt(i);
    }
    const blob = new Blob([int8Array], { type: 'image/jpeg' });
    return blob;
}
Here is the error that I got:
2019-12-29 08:21:07.276 WARN 5356 --- [nio-8080-exec-7] .w.s.m.s.DefaultHandlerExceptionResolver : Resolved [org.springframework.web.multipart.support.MissingServletRequestPartException: Required request part 'file' is not present]
In your Angular code, you are creating FormData correctly, but you never use it:
let data:Observable<any> = this.httpClient.post(this.wsListeUploadImage+colis.codeEnvoi,{'file':imageFile});
Change it to
let data:Observable<any> = this.httpClient.post(this.wsListeUploadImage+colis.codeEnvoi, postData);

Saving jpg file with cloud-code Parse-Server

I'm trying to save jpg files with cloud code on parse server ...
On Android I can do it this way:
Bitmap bitmap = ((BitmapDrawable) myImageView.getDrawable()).getBitmap();
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, stream);
byte [] byteArrayPhotoUpdate = stream.toByteArray();
final ParseFile pictureFileParse = new ParseFile( newUserInfo.getObjectId() + ".JPEG",byteArrayPhotoUpdate);
newUserInfo.put("profile_picture",pictureFileParse);
newUserInfo.saveInBackground();
But I have no idea how to do this in the cloud code. I call my cloud code functions like this
HashMap<String, String> params = new HashMap();
ParseCloud.callFunctionInBackground("myCloudFuncion", params, new FunctionCallback<String>() {
    @Override
    public void done(String aFloat, ParseException e) {
    }
});
but I have no idea how to pass a bitmap in the HashMap params.
I have already searched the internet, but nothing I found helped; the links that refer to something useful are old and outdated, from the era of the old Parse...
In the Parse docs I found this:
var base64 = "V29ya2luZyBhdCBQYXJzZSBpcyBncmVhdCE=";
var file = new Parse.File("myfile.txt", { base64: base64 });
This confused me, because I do not know whether the two "base64" tokens refer to the variable or to the Base64 type.
Should I convert my bitmap to Base64 and send it as a parameter to the cloud code?
If you have been through this and know how, I will be very happy to hear your solution.
Thank you!
You need to convert your image bitmap to Base64, like this:
Bitmap bitmap = ((BitmapDrawable) img.getDrawable()).getBitmap();
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, stream);
byte [] byteArrayPhotoUpdate = stream.toByteArray();
String encodedfile = new String(Base64.encodeBase64(byteArrayPhotoUpdate), "UTF-8");
And then send your Base64 string in the params, like this:
HashMap<String, String> params = new HashMap();
params.put("fileInfo", encodedfile);
ParseCloud.callFunctionInBackground("saveParseUserInfo", params, new FunctionCallback<String>() {
    @Override
    public void done(String aFloat, ParseException e) {
        Log.i("ewaeaweaweaweawe", "done: " + aFloat);
    }
});
Now, in your cloud code, use this:
Parse.Cloud.define("saveParseUserInfo", function(request, response) {
var userId = request.user.id;
var base64 = request.params.fileInfo;
var userClass = Parse.Object.extend("User");
//create a user object to set ACL
var userObject = userClass.createWithoutData(userId);
//create new ParseObject
var userPublicClass = Parse.Object.extend("userPublic");
var userPublic = new userPublicClass();
var aclAction = new Parse.ACL(userObject);
aclAction.setPublicReadAccess(true);
userPublic.setACL(aclAction);
userPublic.set("name", "name random");
userPublic.set("username", "username_random");
//Now create a Parse File object
var file = new Parse.File("photo.jpeg", { base64: base64 });
//set file object in a colum profile_picture
userPublic.set("profile_picture",file);
//save
userPublic.save(null, { useMasterKey: true,
success: function(actionSuccess) {
response.success("saved!!");
},
error: function(action, error) {
// Execute any logic that should take place if the save fails.
// error is a Parse.Error with an error code and message.
response.error(error.message);
}
});
});
I hope this helps you.
This answer works if you do not wish to use Base64, which requires API 26 and above on Android.
I know João Armando has answered this question, but this is for the benefit of others who, like me, support versions before API 26.
P.S. Base64.encodeBase64(...) is deprecated and Base64.getEncoder()... is used now, which requires API 26.
There are 3 key parts to the solution:
Convert your bitmap to byteArray
Send this byteArray directly as params when calling your cloud function
Format this byteArray in cloud code itself
In Android:
Convert bitmap to byte[]
Bitmap bitmap = <Your source>;
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.PNG, 100, stream);
byte[] byteArray = stream.toByteArray();
Send as params when calling cloud function
HashMap<String, Object> params = new HashMap<>();
params.put("imageInByteArray", byteArray);
ParseCloud.callFunctionInBackground("yourCloudFunction", params, new FunctionCallback<Map>() {
    @Override
    public void done(Map object, ParseException e) {
        if (e == null) {
            // Success
        } else {
            // Failed
        }
    }
});
In cloud function/code
Depending on the version of JavaScript you use, the code may differ. I am using a backend-as-a-service provider, which has moved beyond the older promise-style code. The logic should still be applicable regardless.
Parse.Cloud.define("reportId", async request => {
// Retrieve and set values from client app
const imageInByteArray = request.params.imageInByteArray;
// Format as ParseFile
var file = new Parse.File("image.png", imageInByteArray);
// Initialize your class, etc.
....
// Save your object
await yourImageObject.save(null, {useMasterKey:true});
});

Batched Media Upload to Azure Blob Storage through WebApi

My web app currently allows users to upload media one-at-a-time using the following:
var fd = new FormData(document.forms[0]);
fd.append("media", blob); // blob is the image/video
$.ajax({
    type: "POST",
    url: '/api/media',
    data: fd
})
The media then gets posted to a WebApi controller:
[HttpPost, Route("api/media")]
public async Task<IHttpActionResult> UploadFile()
{
if (!Request.Content.IsMimeMultipartContent("form-data"))
{
throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);
}
string mediaPath = await _mediaService.UploadFile(User.Identity.Name, Request.Content);
return Ok(mediaPath);
}
Which then does something along the lines of:
public async Task<string> UploadFile(string username, HttpContent content)
{
    var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer imagesContainer = blobClient.GetContainerReference("container-" + user.UserId);

    var provider = new AzureStorageMultipartFormDataStreamProvider(imagesContainer);
    await content.ReadAsMultipartAsync(provider);

    var filename = provider.FileData.FirstOrDefault()?.LocalFileName;
    // etc
}
This is working great for individual uploads, but how do I go about modifying this to support batched uploads of multiple files through a single streaming operation that returns an array of uploaded filenames? Documentation/examples on this seem sparse.
public class AzureStorageMultipartFormDataStreamProvider : MultipartFormDataStreamProvider
{
    private readonly CloudBlobContainer _blobContainer;
    private readonly string[] _supportedMimeTypes = { "images/png", "images/jpeg", "images/jpg", "image/png", "image/jpeg", "image/jpg", "video/webm" };

    public AzureStorageMultipartFormDataStreamProvider(CloudBlobContainer blobContainer) : base("azure")
    {
        _blobContainer = blobContainer;
    }

    public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
    {
        if (parent == null) throw new ArgumentNullException(nameof(parent));
        if (headers == null) throw new ArgumentNullException(nameof(headers));

        if (!_supportedMimeTypes.Contains(headers.ContentType.ToString().ToLower()))
        {
            throw new NotSupportedException("Only jpeg and png are supported");
        }

        // Generate a new filename for every new blob
        var fileName = Guid.NewGuid().ToString();

        CloudBlockBlob blob = _blobContainer.GetBlockBlobReference(fileName);
        if (headers.ContentType != null)
        {
            // Set appropriate content type for your uploaded file
            blob.Properties.ContentType = headers.ContentType.MediaType;
        }

        this.FileData.Add(new MultipartFileData(headers, blob.Name));
        return blob.OpenWrite();
    }
}
Assuming your AzureStorageMultipartFormDataStreamProvider is similar to the class mentioned on this blog, it is actually already processing multiple files if there are multiple files in the request.
So all you need to do is change your UploadFile to return an IEnumerable<string> and change your controller to handle the media paths accordingly.
So your MediaService would have:
var filenames = provider.FileData.Select(x => x.LocalFileName).ToList();
return filenames;
And your controller would have:
var mediaPaths = await _mediaService.UploadFile(User.Identity.Name, Request.Content);
return Ok(mediaPaths);
Since you did not post the related code for the AzureStorageMultipartFormDataStreamProvider class, I created a custom AzureStorageMultipartFormDataStreamProvider which inherits from MultipartFileStreamProvider to enable the Web API to handle batched uploads of multiple files.
In the AzureStorageMultipartFormDataStreamProvider we can override the ExecutePostProcessingAsync method.
In this method, we can get the uploaded file data and then upload it to Azure Storage.
For more details, refer to the code below. The complete controller:
public class UploadingController : ApiController
{
    public Task<List<FileItem>> PostFile()
    {
        if (!Request.Content.IsMimeMultipartContent("form-data"))
        {
            throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);
        }

        var multipartStreamProvider = new AzureStorageMultipartFormDataStreamProvider(GetWebApiContainer());
        return Request.Content.ReadAsMultipartAsync<AzureStorageMultipartFormDataStreamProvider>(multipartStreamProvider).ContinueWith<List<FileItem>>(t =>
        {
            if (t.IsFaulted)
            {
                throw t.Exception;
            }
            AzureStorageMultipartFormDataStreamProvider provider = t.Result;
            return provider.Files;
        });
    }

    public static CloudBlobContainer GetWebApiContainer(string containerName = "webapi-file-container")
    {
        // Retrieve storage account from connection-string
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
            "your connection string");

        // Create the blob client
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference(containerName);

        // Create the container if it doesn't already exist
        container.CreateIfNotExists();

        // Enable public access to blob
        var permissions = container.GetPermissions();
        if (permissions.PublicAccess == BlobContainerPublicAccessType.Off)
        {
            permissions.PublicAccess = BlobContainerPublicAccessType.Blob;
            container.SetPermissions(permissions);
        }

        return container;
    }
}
public class FileItem
{
    /// <summary>
    /// file name
    /// </summary>
    public string Name { get; set; }

    /// <summary>
    /// size in bytes
    /// </summary>
    public string SizeInMB { get; set; }

    public string ContentType { get; set; }
    public string Path { get; set; }
    public string BlobUploadCostInSeconds { get; set; }
}
public class AzureStorageMultipartFormDataStreamProvider : MultipartFileStreamProvider
{
    private CloudBlobContainer _container;

    public AzureStorageMultipartFormDataStreamProvider(CloudBlobContainer container)
        : base(Path.GetTempPath())
    {
        _container = container;
        Files = new List<FileItem>();
    }

    public List<FileItem> Files { get; set; }

    public override Task ExecutePostProcessingAsync()
    {
        // Upload the files to azure blob storage and remove them from local disk
        foreach (var fileData in this.FileData)
        {
            var sp = new Stopwatch();
            sp.Start();

            string fileName = Path.GetFileName(fileData.Headers.ContentDisposition.FileName.Trim('"'));

            CloudBlockBlob blob = _container.GetBlockBlobReference(fileName);
            blob.Properties.ContentType = fileData.Headers.ContentType.MediaType;

            //set the number of blocks that may be simultaneously uploaded
            var requestOption = new BlobRequestOptions()
            {
                ParallelOperationThreadCount = 5,
                SingleBlobUploadThresholdInBytes = 10 * 1024 * 1024 // maximum for 64MB, 32MB by default
            };

            //upload a file to blob
            blob.UploadFromFile(fileData.LocalFileName, options: requestOption);
            blob.FetchAttributes();
            File.Delete(fileData.LocalFileName);

            sp.Stop();

            Files.Add(new FileItem
            {
                ContentType = blob.Properties.ContentType,
                Name = blob.Name,
                SizeInMB = string.Format("{0:f2}MB", blob.Properties.Length / (1024.0 * 1024.0)),
                Path = blob.Uri.AbsoluteUri,
                BlobUploadCostInSeconds = string.Format("{0:f2}s", sp.ElapsedMilliseconds / 1000.0)
            });
        }

        return base.ExecutePostProcessingAsync();
    }
}
The result looks like this (screenshot omitted).
I would check out uploading the media directly to blob storage after getting SAS tokens for all your files from the Web API in one request. Upload the files from your client using promises and HTTP requests, which will parallelize the upload.
That is the right design and approach here; it will also increase your upload speed and reduce latency.
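A rough sketch of that idea only, shown here with the legacy Azure Storage SDK for Java (com.microsoft.azure:azure-storage) purely as an illustration; the .NET storage client used in the question exposes the same concept through GetSharedAccessSignature. The Web API hands each client a short-lived, write-only URL per file:

import com.microsoft.azure.storage.CloudStorageAccount;
import com.microsoft.azure.storage.blob.CloudBlobClient;
import com.microsoft.azure.storage.blob.CloudBlobContainer;
import com.microsoft.azure.storage.blob.CloudBlockBlob;
import com.microsoft.azure.storage.blob.SharedAccessBlobPermissions;
import com.microsoft.azure.storage.blob.SharedAccessBlobPolicy;

import java.util.Date;
import java.util.EnumSet;

public class SasTokenService {

    // Returns a URL the client can upload the blob to directly, bypassing the Web API.
    public String getUploadUrl(String connectionString, String containerName, String blobName) throws Exception {
        CloudStorageAccount account = CloudStorageAccount.parse(connectionString);
        CloudBlobClient client = account.createCloudBlobClient();
        CloudBlobContainer container = client.getContainerReference(containerName);
        CloudBlockBlob blob = container.getBlockBlobReference(blobName);

        // Write-only policy that expires after 15 minutes.
        SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy();
        policy.setPermissions(EnumSet.of(SharedAccessBlobPermissions.WRITE));
        policy.setSharedAccessExpiryTime(new Date(System.currentTimeMillis() + 15 * 60 * 1000));

        String sasToken = blob.generateSharedAccessSignature(policy, null);
        return blob.getUri().toString() + "?" + sasToken;
    }
}

The client then PUTs each file directly to its returned URL (for a block blob, with the x-ms-blob-type: BlockBlob header), so the file bytes never flow through the Web API.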

Saving image/files into Parse Cloud using Parse Code

I have this code (Java/Android) that takes the bitmap, converts it to a byte array, puts it into a Map under the key 'picture', and finally sends it up by calling the "createcard" function:
public static void createCard(final String nametext, final String initialbalancetext, String databaseclass, String cardnotes, String cardtype, Bitmap picture) {
    ByteArrayOutputStream stream = new ByteArrayOutputStream();
    picture.compress(Bitmap.CompressFormat.PNG, 100, stream);
    byte[] byteArray = stream.toByteArray();

    Map<String, Object> map = new HashMap<>();
    map.put("cardname", nametext);
    map.put("balance", Double.valueOf(initialbalancetext));
    map.put("database", databaseclass);
    map.put("cardnotes", cardnotes);
    map.put("picture", byteArray);

    // Check if network is connected before creating the card
    if (CardView.isNetworkConnected) {
        ParseCloud.callFunctionInBackground("createcard", map, new FunctionCallback<String>() {
            @Override
            public void done(String s, ParseException e) {
                // COMPLETE
                CardListCreator.clearadapter();
                CardListAdapter.queryList();
            }
        });
    }
}
This is my "createcard" function (Javascript/Parse Cloudcode). It supposedly takes the 'picture' key and grabs the byte array, and attempts to save it:
Parse.Cloud.define("createcard", function(request, response){
var cardname = request.params.cardname;
var balance = request.params.balance;
var database = request.params.database;
var cardnotes = request.params.cardnotes;
var picture = request.params.picture;
var picturename = "photo.jpg";
var parseFile = new Parse.File(picturename, picture);
parseFile.save().then(function(parseFile){
var currentuser = Parse.User.current();
var CurrentDatabase = Parse.Object.extend(database);
var newObject = new CurrentDatabase;
newObject.set("cardname", cardname);
newObject.set("balance", balance);
newObject.set("user", currentuser);
newObject.set("cardnotes", cardnotes);
newObject.set("cardpicture", parseFile);
newObject.save(null, {
success: function(newObject){
console.log(newObject);
response.success(newObject + " successfully created");
}, error: function(newObject, error){
console.log(error);
response.error(error+" error");
}
})
});
});
Now my problem is that the function is simply not working. I don't know why, as my console.log isn't actually logging anything. Does anyone have any ideas?
I hope this code answers your question.
private void Update() {
    ParseQuery<ParseObject> query = ParseQuery.getQuery("Table Name");
    query.getInBackground(parsekey, new GetCallback<ParseObject>() {
        public void done(ParseObject gameScore, ParseException e) {
            if (e == null) {
                if (camera != null) {
                    ByteArrayOutputStream stream = new ByteArrayOutputStream();
                    camera.compress(Bitmap.CompressFormat.JPEG, 100, stream);
                    byte imageInByte[] = stream.toByteArray();
                    encodedImage = Base64.encodeToString(imageInByte, Base64.DEFAULT);
                    image = new ParseFile("image.txt", imageInByte);
                } else {
                    Bitmap bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.user_icon);
                    ByteArrayOutputStream stream = new ByteArrayOutputStream();
                    bitmap.compress(Bitmap.CompressFormat.JPEG, 100, stream);
                    byte imageInByte[] = stream.toByteArray();
                    encodedImage = Base64.encodeToString(imageInByte, Base64.DEFAULT);
                    image = new ParseFile("image.txt", imageInByte);
                }
                findViewById(R.id.editName)).getText().toString(),Profile.this);
                gameScore.put("name", ((EditText) findViewById(R.id.editName)).getText().toString().trim());
                gameScore.put("gender", gender_val);
                gameScore.put("country", country_val);
                gameScore.put("dateofbirth", ((TextView) findViewById(R.id.editdate)).getText().toString().trim());
                gameScore.put("profession", profession);
                gameScore.put("image", image);
                gameScore.saveInBackground();
                Toast.makeText(Profile.this, "SuccessFull Updated", Toast.LENGTH_LONG).show();
            } else {
                Log.e("erroreeeee", DataBase.getUserObjectId(Profile.this) + " " + e.getMessage());
            }
        }
    });
}
For everyone in the future: you want to put the image into a byte[] and put that into the map. You don't need to encode it to a string unless you want to decode it in the cloud code. In the cloud code, read the byte[] param that you uploaded, put it into a Parse.File, and finally call save on it.
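A minimal Android-side sketch of that advice for the question's 'picture' parameter (class and method names here are illustrative; other params such as cardname and balance go into the map the same way, and the cloud-code half can mirror the byte-array answer shown earlier on this page):

import android.graphics.Bitmap;
import com.parse.FunctionCallback;
import com.parse.ParseCloud;
import com.parse.ParseException;

import java.io.ByteArrayOutputStream;
import java.util.HashMap;

public class CardUploader {

    // Sends the bitmap's raw bytes in the params map; no Base64 round trip needed.
    public static void sendCard(Bitmap picture) {
        ByteArrayOutputStream stream = new ByteArrayOutputStream();
        picture.compress(Bitmap.CompressFormat.PNG, 100, stream);

        HashMap<String, Object> params = new HashMap<>();
        params.put("picture", stream.toByteArray());

        ParseCloud.callFunctionInBackground("createcard", params, new FunctionCallback<String>() {
            @Override
            public void done(String result, ParseException e) {
                // e == null means the cloud function returned successfully
            }
        });
    }
}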

File upload with ember-upload, how to fill request with additional data for servicestack?

By way of introduction: I have a problem with communication between ServiceStack and an application written in Ember.js via REST. I am using the ember-uploader component to upload a file to ServiceStack.
View hbs:
<table class="table table-bordered table-hover">
    {{file-upload}}
</table>
The component in CoffeeScript:
ABC.FileUploadComponent = Ember.FileField.extend(
  url: "/api/upload"
  filesDidChange: (->
    uploadUrl = @get("url")
    console.log uploadUrl
    files = @get("files")
    test = { fileName: "test" }
    uploader = Ember.Uploader.create(
      url: uploadUrl
    )
    uploader.upload(files[0], test) unless Ember.isEmpty(files)
    console.log files
    return
  ).observes("files")
)
The component in JavaScript:
ABC.FileUploadComponent = Ember.FileField.extend({
    url: "/api/upload",
    filesDidChange: (function() {
        var files, test, uploadUrl, uploader;
        uploadUrl = this.get("url");
        console.log(uploadUrl);
        files = this.get("files");
        test = {
            fileName: "test"
        };
        uploader = Ember.Uploader.create({
            url: uploadUrl,
            data: test
        });
        if (!Ember.isEmpty(files)) {
            uploader.upload(files[0], test);
        }
        console.log(files);
    }).observes("files")
});
My service model:
namespace ABC.Service.ServiceModel
{
    public class Upload
    {
        [Route("/upload")]
        public class UploadRequest : IRequiresRequestStream
        {
            public System.IO.Stream RequestStream { set; get; }
            public object FileName { set; get; }
        }

        public class UploadResponse
        {
            public int Successed { set; get; }
        }
    }
}
My Service Method
namespace ABC.Service.Service
{
    public class UploadService : ServiceBase // ServiceBase inherits from ServiceStack.Service
    {
        public Upload.UploadResponse Post(Upload.UploadRequest request)
        {
            var req = base.Request;
            var reqThatIwant = request.FileName;
            return new Upload.UploadResponse() { Successed = 1 };
        }
    }
}
And here is a screenshot from the debugger watch (not included here).
So my question is: how do I have to change the code to get the data marked as "2" into the Request object marked as "1" (as marked on the screenshot)?
Handling Raw Request Stream
When you use IRequiresRequestStream you're saying you want to take over deserializing the Request and access the raw HTTP Request Body as a Stream. As a result, ServiceStack won't attempt to read from the Request body and instead injects the HTTP Request stream; in this case the only Request DTO parameters it will be able to populate are those on the /pathinfo or ?QueryString, e.g.:
[Route("/upload/{FileName}")]
public class Upload : IRequiresRequestStream
{
public Stream RequestStream { set; get; }
public string FileName { set; get; }
}
Accessing FormData HTTP POSTs
But if the JavaScript component is sending you HTTP POST FormData (i.e. application/x-www-form-urlencoded or multipart/form-data), then it's very unlikely you want to treat it like a raw Request Stream; instead, access the Request.FormData or Request.Files that were posted.
Handling File Upload examples
Based on your screenshot, the HTTP Request Content-Type is multipart/form-data, in which case you will most likely be able to access any uploaded files using Request.Files.
Some examples of accessing HTTP Uploaded Files are available in the Live Demos:
Imgur - Save uploaded files to a MemoryStream
public object Post(Upload request)
{
    foreach (var uploadedFile in Request.Files
        .Where(uploadedFile => uploadedFile.ContentLength > 0))
    {
        using (var ms = new MemoryStream())
        {
            uploadedFile.WriteTo(ms);
            WriteImage(ms);
        }
    }
    return HttpResult.Redirect("/");
}
Rest Files - Save to FileSystem
public void Post(Files request)
{
    var targetDir = GetPath(request);

    var isExistingFile = targetDir.Exists
        && (targetDir.Attributes & FileAttributes.Directory) != FileAttributes.Directory;

    if (isExistingFile)
        throw new NotSupportedException(
            "POST only supports uploading new files. Use PUT to replace contents of an existing file");

    if (!Directory.Exists(targetDir.FullName))
        Directory.CreateDirectory(targetDir.FullName);

    foreach (var uploadedFile in base.Request.Files)
    {
        var newFilePath = Path.Combine(targetDir.FullName, uploadedFile.FileName);
        uploadedFile.SaveTo(newFilePath);
    }
}
HTTP Benchmarks - Handle multiple and .zip uploaded files
public object Post(UploadTestResults request)
{
    //...
    foreach (var httpFile in base.Request.Files)
    {
        if (httpFile.FileName.ToLower().EndsWith(".zip"))
        {
            using (var zip = ZipFile.Read(httpFile.InputStream))
            {
                var zipResults = new List<TestResult>();
                foreach (var zipEntry in zip)
                {
                    using (var ms = new MemoryStream())
                    {
                        zipEntry.Extract(ms);
                        var bytes = ms.ToArray();
                        var result = new MemoryStream(bytes).ToTestResult();
                        zipResults.Add(result);
                    }
                }
                newResults.AddRange(zipResults);
            }
        }
        else
        {
            var result = httpFile.InputStream.ToTestResult();
            newResults.Add(result);
        }
    }
}
