Compress Data in JavaScript and send it to Flask Server - javascript

So my team has made a small tool to perform Image Annotation and Labeling. And we were trying to optimize our code.
We were able to compress data from the server, but we have been trying to perform compression on data that is being sent from client to server.
What data, you may ask? It's just a text file, around 2-3 MB.
Is there any way we can perform compression?
We are using JavaScript and want to send the data to Flask.
This is the first question I am posting on here :)

You can try this library: pako (a JavaScript port of zlib):
var binaryString = pako.deflate("2-3mb text content", { to: 'string' });
// Here you can base64-encode it, make XHR requests and so on.
On the server side, the receiving Flask view will get compressed data. You should decompress it using something like zlib.decompress(compressedData).
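As a rough end-to-end sketch in the browser (the /upload route name, the readAnnotationFile helper, and the Flask-side call are assumptions for illustration, not from the question): compress with pako, send the raw bytes, and let the Flask view call zlib.decompress on the request body.
// Minimal browser-side sketch; assumes pako is loaded (e.g. via <script src=".../pako.min.js">).
// The matching Flask view would do something like zlib.decompress(request.get_data()).
const text = readAnnotationFile();                    // hypothetical: the 2-3 MB annotation text
const bytes = new TextEncoder().encode(text);         // string -> Uint8Array
const compressed = pako.deflate(bytes);               // zlib-compressed Uint8Array

fetch('/upload', {                                    // hypothetical Flask endpoint
  method: 'POST',
  headers: { 'Content-Type': 'application/octet-stream' },
  body: compressed
}).then(res => console.log('upload status:', res.status));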

Related

Deflating a response in Rails & inflating in JS

I have created an API in Ruby on Rails. Rather than sending the response body back with the API response, I am broadcasting it to an external service, which then takes care of the realtime distribution to all connected clients (i.e. WebSockets).
Today I have hit a snag where I realized that our 3rd party provider only allows data packets of up to 25kb. One of our responses has started giving problems as the data grew to be more than this limit and the 3rd party service has started blocking calls. As a side note, data packets will seldom grow to be greater than 25kb.
I did some research and was contemplating what the best idea would be. One idea I was thinking of, was to compress the response using ZLib and then to decompress it on the JS side. The article that led to this was this StackOverflow question.
I managed to get the deflation and Base64 encoding right, but could not decode on the JS side. I also tested the generated Base64 string, and services like this one flag it as invalid.
My code looks like this:
In a Rails Controller
...
compressed_data = Zlib::Deflate.deflate(json_data.to_s)
encoded_data = Base64.encode64(compressed_data)
broadcast encoded_data
...
In JS that receives the broadcast:
import pako from 'pako';
import { decode } from 'js-base64';
...
const decoded = decode(payload.data);
const decompressed = pako.inflate(decoded);
...
When I execute the broadcast, I get the error: unknown compression method. I understand that this might have something to do with pako, but I have also tried other approaches with no success. Does anybody have any ideas or perhaps even a better way to approach this problem?
UPDATE:
The Base64 string generated in rails looks like this:
eJxlU2Fr2zAQ/SuHP25JkJW2abMPY6OllJWkNGFftmEU+RKLypKRTknD6H/f\nyQ5ru0EQutPdy3vvzr8LUxfz8nJ2NSrIkMViXkj4GjyGCNc+tBjgZxJCXsAq\nbbfePsF3NNa4XTEqaow6mI6Md9xWSgGPqm0RPkC+r8gjeW9yLDn+5vktRDPA\nYWrtJ4uwPBUIka9wr/qiCTze3N6t1o9f1nfLBTzcLK7vFref4cGiiggBdyYS\nU9scQRHkJEE5axjEO6AGoVZH2ODWB+xDlXRmOYGl0wgHhEbtM4xGs8cajj6F\nE2hQuXA0pGokZWyEg7GW4SCiIyDfQwa0uFccW4aI5PUTqF1+PzR+aNDekdKU\noXKT9m1nkQZCeyRiE6ELXmNkvWzniWRlvVYnT+/9gVUuQ4euVjyc16JIKlBV\nK+onJmQ5FuVYynU5nQvBv4kQ4qOQfHvTxCinFpesfc3TscswK2NZANdvTF0z\nuwqfCV0cqDj/JuSSriL1XIUeTXCcjjy3qgvYmtT2qRq3KlkmiZ2PhvqcTiGg\n00cGXKgF4+iADFFXigYdYlzKsTxbl2I+vZpPy4mUs786Ule/K+5Flyya32Uu\nvijL1+KIocrbPcv0gnK66cOzy1ER2fMsPMeDFSy5Mo7ZtGxBtVc2YSzmP0q2\ncSTzMc3HeT5yTvwaFU2/q9kG/oLOR0XLQzeaV7Hq0saa2FTkn9D9a7bl4VKq\n/xuC9W73/kHFNs+5H7HnFcCaZTlFKeTMATf5z/rvMO/VYEtuffkDW0lDVA==\n
Your data starts off with a zlib header, but the compressed data is corrupted for some reason.
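One common culprit (an assumption, since the broadcast transport isn't shown): js-base64's decode() returns a UTF-8 string, which mangles the binary zlib bytes before pako ever sees them. Decoding straight to a Uint8Array avoids that; a sketch:
import pako from 'pako';
import { toUint8Array } from 'js-base64';   // decodes Base64 directly to bytes

// Ruby's Base64.encode64 inserts newlines, so strip whitespace first,
// then keep the data as bytes instead of running it through a text decoder.
const bytes = toUint8Array(payload.data.replace(/\s/g, ''));
const json = pako.inflate(bytes, { to: 'string' });
const data = JSON.parse(json);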

Client vs server image processing and display

Client vs server image processing.
We have a big system that runs on JSF (PrimeFaces) and EJB3, with some JavaScript logic on the side (for things like Firebase).
So we ran into this problem: we have a servlet that serves images. The backend takes a query, extracts a BLOB image from the DB, turns that BLOB into a byte array, stores it in the browser session's memory, and the servlet serves it at url-OurSite/image/idImage. The front end references it with <img>(url/image/id)</img>, and that works fine so far.
Now we are trying a new, more direct way to show the image: we send the BLOB/raw data to the frontend, convert it to Base64 there, and pass it to the HTML.
Base64 codec = new Base64();
String encoded = codec.encodeBase64String(listEvidenciaDev.get(i).getImgReturns());
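For reference, the frontend side of that second approach is just a data URI; a minimal sketch (the element id and the JPEG content type are assumptions):
// `encoded` is the Base64 string produced by encodeBase64String above.
const img = document.getElementById('evidencia');     // hypothetical element id
img.src = 'data:image/jpeg;base64,' + encoded;        // assumes the BLOB is a JPEG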
Both work, for almost all cases.
Note: We didn't try this before because we couldn't pass the raw data through our layers of serialized objects and RMI. Now we can, of course.
So now there are two ways.
Either we send the data to the servlet and expose it on some URL, which means the backend does all the work and the frontend just calls the URL,
or we send the data to the frontend, which does some magic and transforms it into an image.
This brings up two questions.
If we send the frontend the raw object, or make it call a URL to show the image content, does the end user download the same amount of data? This is important because we have some remote branch offices with poor internet connections.
Is it worth passing the hard work to the frontend (converting the data) or to the backend (converting and publishing)?
EDIT:
My question is not about the BLOB (what I call raw data) being bigger than Base64.
It is: is passing the data as an object and transforming it into a readable picture heavier on bandwidth than passing a URL from our servlet to the actual image and loading it in the HTML?
I chose to close this question because we ran some tests and the bandwidth usage on the front end was the same.
Anyway, we make use of both solutions.
If we don't want to burden the frontend with a lot of encoding, we set up a servlet for those images (which means more code and more server load). We look for the best optimization for each specific case.
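For reference, Base64 turns every 3 bytes into 4 characters, so an uncompressed Base64 transfer is roughly a third larger than serving the raw image from a servlet URL; HTTP transport compression usually hides most of that difference, which is consistent with the test result above. A quick estimate:
// Base64 output length is 4 * ceil(n / 3), padding included.
function base64Length(rawBytes) {
  return Math.ceil(rawBytes / 3) * 4;
}
console.log(base64Length(100 * 1024));   // 100 KiB image -> 136536 characters (~33% bigger)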

What is the proper place for upload logic in a MEAN stack application

So I'm trying to wrap my head around developing full applications with the MEAN stack (MongoDB, Express, Angular, Node.js).
I understand that using Express and Node I can create a REST API with endpoints to grab data for my app. I also understand that Angular is for the front end only. So my question is this: when you have something like an upload form and you want to upload an image to the server, would you want to create an API endpoint called something like "/api/upload/" and have all your logic for uploading the image inside that endpoint, or would you want all that upload logic somewhere else and then simply provide the file name to the "/api/upload/" endpoint with a POST request?
I'm not 100% sure what you mean by upload logic - there isn't much needed, but I would put everything (including the file itself) in a POST to /api/upload/, then save it however you wish to within that function.
It is always better to put your business logic on the server side, and I suggest you follow that approach. If you do, you can easily manipulate images when required. For example, while uploading a logo or avatar we sometimes need crop, resize, and similar operations on the image, such as when the same image is used for both the thumbnail and the profile picture. This approach is very meaningful here: we can respond to the user and kick off a separate process for image manipulation without making the end user wait. Most apps follow this approach for a better experience; a rough sketch of the idea follows below.
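A sketch of that idea in Node/Express terms (the sharp library, the 202 response, and the field names are all assumptions for illustration, since the answer doesn't fix a backend):
const sharp = require('sharp');   // assumed image library for resizing

// Hypothetical handler: acknowledge the upload right away, then build the
// thumbnail without making the end user wait for it.
async function handleAvatarUpload(req, res) {
  const original = req.fileBuffer;                    // hypothetical: raw upload bytes
  res.status(202).json({ status: 'processing' });     // respond before the heavy work

  const thumbnail = await sharp(original)
    .resize(128, 128)
    .jpeg()
    .toBuffer();
  // ... persist `original` and `thumbnail` wherever images are stored
}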
You can apply a separation of concerns: first off, you can delegate the file handling to the client to be more "user friendly", crop and resize the image there, and then post the resulting file as "a file" or as Base64 to the server, which stores it either in the database or on the file system.
I'd recommend a combination of these two libraries for the client:
Angular File Upload and ngImgCrop
Then you can post the image and use a body parser to "catch" it in Express. I'd recommend busboy, and it could be part of an endpoint as you mentioned, like api/file/upload for example:
// your controller
exports.uploadDocument = function(req, res, next) {
  req.pipe(req.busboy);

  req.busboy.on('file', function(fieldname, file, filename, encoding, contentType) {
    // implementation
  });

  // updating req.body with busboy parameters
  req.busboy.on('field', function(fieldname, val) {
    req.body[fieldname] = val;
  });

  req.busboy.on('finish', function() {
    // implementation
    next();
  });
};
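The controller above relies on req.busboy, which the connect-busboy middleware provides; wiring it up could look roughly like this (the route path and module path are assumptions):
const express = require('express');
const busboy = require('connect-busboy');
const upload = require('./controllers/upload');   // hypothetical module exporting uploadDocument

const app = express();
app.use(busboy());                                 // populates req.busboy on each request
app.post('/api/file/upload', upload.uploadDocument);
app.listen(3000);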
I hope that helps.
Short answer: express should do the upload, but it doesn't have to.
Long answer: It's a design question. If it was an image, you could have angular post the image to the imgur api and send that returned url to your server, but that's much more unreliable than doing it server-side. People could be using your site from phones, tablets, etc and though image uploading is fairly fast and consistent, the server will always do it the fastest of all because it doesn't depend on the client's wireless connection strength. Angular and Express are equally good at validating the image file.
Most importantly, if something goes wrong, you'll want to log the attempt (and maybe the image), and you are mostly unable to do that on the client side.
I would strongly advise you to move your logic to the backend and do it with Express:
More secure, as the code is not exposed in the client browser
Fast uploading (it can be fast on the client side too, but if the client browser is slow, uploading suffers accordingly)
Less code exposed to the client browser

Compress JSON message on server and decompress in client side

I send JSON messages to a service which republishes these messages to our users. The problem is that some of my messages are bigger than the maximum allowed message size, so I wondered if I could apply some kind of compression to my messages and then decompress them on the client side.
First I tried Gzip in C#, but it seems hard to decompress using JavaScript.
Some other voices suggest trying LZMA and Zlib.
Am I on the right track, or should I think about this differently?
I found a solution. It succeeds in decompressing compressed text produced with both C# and PHP; zlib is used for the compression.
I got the solution from JSXCompressor; you can download this example:
http://jsxgraph.uni-bayreuth.de/distrib/jsxcompressor.zip
See testhelloworld.php.
In PHP, the compression is done with gzcompress and the compressed output is then encoded with base64_encode.
$x = 'Some text or json';
$compressed = base64_encode(gzcompress($x, 9)); // compression level, from 1 (fastest) to 9 (best)
// echo $compressed;
file_put_contents('compressed.txt', $compressed);
For decompression :
$.ajax('compressed.txt').done(function (res) {
console.info(JXG.decompress(res));
});
One possibility would be to use BSON, which is a binary encoding of JSON data.
http://bsonspec.org/
On the client side there are a few libraries for encoding/decoding. One comes with MongoDB: https://github.com/mongodb/js-bson
On the server side, JSON.NET supports BSON serialization/deserialization: http://james.newtonking.com/archive/2009/12/26/json-net-3-5-release-6-binary-json-bson-support
This isn't compression exactly. Just a more compact representation of the JSON data.
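For a feel of the client-side API, a small sketch with the bson npm package from the js-bson project above (the document contents are made up for illustration):
const { serialize, deserialize } = require('bson');

const doc = { user: 'alice', scores: [1, 2, 3] };
const bytes = serialize(doc);                          // Buffer of BSON bytes
console.log(bytes.length);                             // compact, but not always smaller than JSON
console.log(deserialize(bytes));                       // { user: 'alice', scores: [ 1, 2, 3 ] }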

Best format to send image from javascript client to SQL server

I am making an application that will store user information in an Azure SQL Server DB, including a profile photo downloaded from Facebook. On the server side, an ASP.NET MVC4 controller will receive the information and send it to the database.
The client side is JavaScript, and I thought of sending the image in JSON (once converted to Base64). Is that a good option? Is it better to send the JPG directly? What are the advantages of sending information in JSON?
In SQL Server the image field would be stored as nvarchar(max).
Are you going to return the image as a binary stream with content type image/jpeg, or as a Base64-encoded text stream? It is far more likely that you're going to do the former, so there is little reason to go through an intermediate Base64-encoded transfer. And of course, store the images as VARBINARY(MAX). Even if you chose to store them as Base64, picking a Unicode data type for Base64 text is really wasteful (double the storage cost for no reason); Base64 fits very well in VARCHAR(MAX).
But, especially in a SQL Azure environment, you should consider storing media in Azure Blob storage and keeping only the blob path in your database.
In my opinion, it's better to send the image directly as a .jpg using multipart forms or something like that.
Sending information in JSON is useful when you transfer structured data, like collections or objects that you will want to query or deserialize later.
The client side is JavaScript, and I thought of sending the image in JSON (once converted to Base64). Is that a good option?
As Pasrus pointed out, you are not going to manipulate the image data, so JSON does not seem to be a good choice here.
One option is to put the Base64 data into the src attribute of the HTML img tag and send it.
What are the advantages of sending information in json?
Please check these answers; there are many:
Advantages of using application/json over text/plain?
In SQL Server the image field would be stored as nvarchar(max).
Please refer to this link:
Storing images in SQL Server?
