Compress JSON message on server and decompress in client side - javascript

I send JSON messages to a service that republishes them to our users. The problem is that some of my messages exceed the maximum allowed message size, so I wondered whether I could apply some kind of compression to my messages on the server and then decompress them on the client side.
First I tried Gzip in C#, but it seemed hard to decompress with JavaScript.
Others have suggested trying LZMA and zlib.
Am I on the right track, or should I be thinking about this differently?

I found a solution that successfully decompresses text compressed with either C# or PHP; zlib is used for the compression.
I got the solution from JSXCompressor. You can download the example here:
http://jsxgraph.uni-bayreuth.de/distrib/jsxcompressor.zip
See testhelloworld.php.
In PHP, the compression is done with gzcompress, and the compressed output is then encoded with base64_encode:
$x = 'Some text or json';
$compressed = base64_encode(gzcompress($x, 9)); // compression level: 1 (fastest) to 9 (smallest)
// echo $compressed;
file_put_contents('compressed.txt', $compressed);
For decompression on the client side, JSXCompressor's JXG.decompress handles both the base64 decoding and the inflation:
$.ajax('compressed.txt').done(function (res) {
console.info(JXG.decompress(res));
});

One possibility would be to use BSON, which is a binary encoding of JSON data.
http://bsonspec.org/
On the client side there are a few libraries for encoding/decoding. One comes with Mongo DB: https://github.com/mongodb/js-bson
On the server side, Json.NET supports BSON serialization/deserialization: http://james.newtonking.com/archive/2009/12/26/json-net-3-5-release-6-binary-json-bson-support
This isn't compression exactly, just a more compact binary representation of the JSON data.
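To make the format concrete, here is a hand-rolled BSON encoding of the one-element document {"a": 1}. This is a sketch of the wire format only; real code would use a library such as js-bson:

```javascript
// Hand-rolled BSON encoding of {"a": 1}, to show the wire format.
function bsonInt32Doc(key, value) {
  const keyBytes = Buffer.from(key, 'utf8');
  const int32 = Buffer.alloc(4);
  int32.writeInt32LE(value);
  // element: type byte 0x10 (int32) + cstring key + little-endian int32
  const element = Buffer.concat([
    Buffer.from([0x10]),
    keyBytes,
    Buffer.from([0x00]),
    int32,
  ]);
  // document: int32 total length + elements + trailing 0x00
  const header = Buffer.alloc(4);
  header.writeInt32LE(4 + element.length + 1);
  return Buffer.concat([header, element, Buffer.from([0x00])]);
}

const doc = bsonInt32Doc('a', 1);
console.log(doc.length); // → 12 (bytes, vs 7 characters for the JSON {"a":1})
```

As the comment shows, BSON is not always smaller than JSON for tiny documents; its advantages are typed values and fast traversal.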

Related

Deflating a response in Rails & inflating in JS

I have created an API in Ruby on Rails. Rather than sending the response body back with the API response, I am broadcasting it to an external service, which then takes care of the realtime distribution to all connected clients (i.e. WebSockets).
Today I have hit a snag where I realized that our 3rd party provider only allows data packets of up to 25kb. One of our responses has started giving problems as the data grew to be more than this limit and the 3rd party service has started blocking calls. As a side note, data packets will seldom grow to be greater than 25kb.
I did some research and was contemplating what the best idea would be. One idea I was thinking of, was to compress the response using ZLib and then to decompress it on the JS side. The article that led to this was this StackOverflow question.
I managed to get the deflation and Base64 encoding right, but could not decode it on the JS side. I also tested the generated Base64 string, but services like this one flag it as invalid.
My code looks like this:
In a Rails Controller
...
compressed_data = Zlib::Deflate.deflate(json_data.to_s)
encoded_data = Base64.encode64(compressed_data)
broadcast encoded_data
...
In JS that receives the broadcast:
import pako from 'pako';
import { decode } from 'js-base64';
...
const decoded = decode(payload.data);
const decompressed = pako.inflate(decoded);
...
When I execute the broadcast, I get the error: unknown compression method. I understand that this might have something to do with pako, but I have also tried other approaches with no success. Does anybody have any ideas or perhaps even a better way to approach this problem?
UPDATE:
The Base64 string generated in Rails looks like this:
eJxlU2Fr2zAQ/SuHP25JkJW2abMPY6OllJWkNGFftmEU+RKLypKRTknD6H/f\nyQ5ru0EQutPdy3vvzr8LUxfz8nJ2NSrIkMViXkj4GjyGCNc+tBjgZxJCXsAq\nbbfePsF3NNa4XTEqaow6mI6Md9xWSgGPqm0RPkC+r8gjeW9yLDn+5vktRDPA\nYWrtJ4uwPBUIka9wr/qiCTze3N6t1o9f1nfLBTzcLK7vFref4cGiiggBdyYS\nU9scQRHkJEE5axjEO6AGoVZH2ODWB+xDlXRmOYGl0wgHhEbtM4xGs8cajj6F\nE2hQuXA0pGokZWyEg7GW4SCiIyDfQwa0uFccW4aI5PUTqF1+PzR+aNDekdKU\noXKT9m1nkQZCeyRiE6ELXmNkvWzniWRlvVYnT+/9gVUuQ4euVjyc16JIKlBV\nK+onJmQ5FuVYynU5nQvBv4kQ4qOQfHvTxCinFpesfc3TscswK2NZANdvTF0z\nuwqfCV0cqDj/JuSSriL1XIUeTXCcjjy3qgvYmtT2qRq3KlkmiZ2PhvqcTiGg\n00cGXKgF4+iADFFXigYdYlzKsTxbl2I+vZpPy4mUs786Ule/K+5Flyya32Uu\nvijL1+KIocrbPcv0gnK66cOzy1ER2fMsPMeDFSy5Mo7ZtGxBtVc2YSzmP0q2\ncSTzMc3HeT5yTvwaFU2/q9kG/oLOR0XLQzeaV7Hq0saa2FTkn9D9a7bl4VKq\n/xuC9W73/kHFNs+5H7HnFcCaZTlFKeTMATf5z/rvMO/VYEtuffkDW0lDVA==\n
Your data starts off with a valid zlib header, but the compressed data after it is corrupted. A likely cause is decoding the base64 into a JavaScript string: js-base64's decode interprets the decoded bytes as UTF-8 text, which mangles binary data. Decode to a byte array instead (js-base64 also exports toUint8Array) and pass that array to pako.inflate.

Compress Data in JavaScript and send it to Flask Server

So my team has made a small tool for image annotation and labeling, and we have been trying to optimize our code.
We were able to compress data coming from the server, but we have been struggling to compress the data that is sent from the client to the server.
What data, you may ask: it's just a text file of around 2-3 MB.
Is there any way we can perform this compression?
We are using JavaScript and want to send the data to Flask.
This is the first question I am posting on here :)
You can try the pako library:
var binaryString = pako.deflate("2-3mb text content"); // a Uint8Array (the { to: 'string' } option was removed in pako 2.x)
// Here you can base64-encode it, make XHR requests, and so on.
But on the server side, the receiving endpoint will get compressed data. You should decompress it with something like zlib.decompress(compressedData) in Python.

How to send binary data from a Node.js socket.io (v2.0.x) server to a browser client?

Engine.io's send function does send binary data as binary; it can be seen as binary in DevTools. Socket.io handles it as JSON and sends it as text.
Maybe the Engine.io send function can somehow be accessed via the Socket.io instance?
Or maybe I need to implement a custom encoder/decoder for this?
Answering my own question because it may save someone some time.
It turns out that binary data in the form of a Uint8Array is not sent properly; it just needs to be converted to a regular Buffer, and then it is transported over the wire as binary.
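The conversion is a one-liner; a minimal sketch (the emit call is commented out since it assumes a live socket.io connection):

```javascript
// A Uint8Array payload wrapped in a Node Buffer before emitting, so that
// socket.io transmits it as a binary frame instead of JSON text.
const raw = new Uint8Array([0x89, 0x50, 0x4e, 0x47]); // e.g. the start of a PNG

// Buffer.from copies the bytes; it replaces the deprecated new Buffer(...).
const wire = Buffer.from(raw);

// socket.emit('frame', wire); // assumes a connected socket.io socket

console.log(Buffer.isBuffer(wire)); // → true
```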
This was already answered here. The client code provided is written in Java.
EDIT: there is also JavaScript client code.

Compress JSON response in PHP

I have a simple MVC model. I'm making an Ajax request where I send some data to be processed by PHP and retrieve database records as JSON. As this object could be quite large, is there some way I could compress it on the PHP (server) side and decompress it on the JavaScript (client) side?
$.ajax({
    url: "/php/function/link/",
    dataType: 'json',
    data: {
        "date": date,
    },
    type: "POST",
    success: function (_data) {
        // load compressed data here and decompress it
    },
    error: function () {
        alert("Some error fetching!");
    }
});
I tried the following methods, but they didn't seem to work (I was getting errors while decompressing on the JavaScript end):
JSONC
https://stackoverflow.com/a/11901649/1443702
Are there any other better ways?
I simply need to:
compress data in JavaScript on the client -> send it to the server (PHP) -> decompress it and run the database queries -> compress the result -> send it back to the client -> decompress it in JavaScript
The best way is simply to enable HTTP compression in the web server; all modern browsers and servers support it. Read more about HTTP compression. As a bonus, all of your traffic will be compressed, not just the AJAX traffic.
For php - You can try this: http://rosettacode.org/wiki/LZW_compression#PHP
For JavaScript - http://rosettacode.org/wiki/LZW_compression#JavaScript
Ideally you should avoid sending large data sets unless the use case is unavoidable; I would suggest reconsidering your design. But I recently ran into a similar use case as part of a product requirement, where I needed to compress about 5 MB of JSON data (partially) in JavaScript. I tried the above and was able to achieve roughly 50% compression.
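For reference, the dictionary-building idea behind LZW, in the spirit of the Rosetta Code versions linked above, can be sketched in a few lines of JavaScript. This toy version assumes 8-bit input characters and returns an array of integer codes rather than a packed bit stream:

```javascript
function lzwCompress(s) {
  const dict = new Map();
  for (let i = 0; i < 256; i++) dict.set(String.fromCharCode(i), i);
  let phrase = '';
  let next = 256;
  const out = [];
  for (const ch of s) {
    const combined = phrase + ch;
    if (dict.has(combined)) {
      phrase = combined;            // keep growing the current phrase
    } else {
      out.push(dict.get(phrase));   // emit the code for the known phrase
      dict.set(combined, next++);   // learn the new phrase
      phrase = ch;
    }
  }
  if (phrase !== '') out.push(dict.get(phrase));
  return out;
}

function lzwDecompress(codes) {
  const dict = new Map();
  for (let i = 0; i < 256; i++) dict.set(i, String.fromCharCode(i));
  let next = 256;
  let prev = dict.get(codes[0]);
  let out = prev;
  for (const code of codes.slice(1)) {
    // An unknown code refers to the phrase currently being built.
    const entry = dict.has(code) ? dict.get(code) : prev + prev[0];
    out += entry;
    dict.set(next++, prev + entry[0]); // mirror the compressor's dictionary
    prev = entry;
  }
  return out;
}

const msg = 'TOBEORNOTTOBEORTOBEORNOT';
const codes = lzwCompress(msg);
console.log(codes.length, msg.length); // → 16 codes for 24 input characters
```

For production use, zlib via HTTP compression is almost always the better choice; this sketch is only meant to show why LZW compresses repetitive JSON well.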

In node.js and express, how should a client send a large, up to 3k, amount of data to the server?

The client will be sending my server a change log containing a list of commands and parameters; whether it is JSON or not is TBD.
This payload could be 3 or 4 KB, and is not likely to be more.
What is the standard approach to this requirement?
Should the client send JSON containing all of the changes as part of the request body?
Any recommendations? Lessons learned?
Just POST the data. 3-4 KB is nothing unless you're dealing with feature-phone WAP browsers in the middle of rural India, performance issues of the "OMG, I'm Google and care about every byte ever because of my zillion-user userbase" type, or something like that.
If you're really worried about payload size, you can gzip-base64 encode it before sending - but only do this if a) you really care about this (which is unlikely) and b) your payload is large enough that this saves you bandwidth. (gzip-base64'ing small payloads often increases their size, since there isn't enough data to get enough compression benefit to offset the 33% size increase from base64 encoding.)
You can use a normal JSON POST to send 3-4 KB of data.
You should pay more attention to what you do with the data received on the server side: whether you buffer up all the data before you start processing it (store it in a DB or elsewhere), or process it in chunks. If you are simply dumping the data into files on the server, you should create a Writable stream and pump the received chunks into it.
How are you going to process the received data on the server? Then again, 3-4 KB is not really a worrying amount of data.
You can set the maximum upload size with
app.use(express.limit('5mb'));
if that's an issue. (Note that express.limit was removed in Express 4; the modern equivalent is app.use(express.json({ limit: '5mb' })).)
But there shouldn't really be any limitation on this by default, except the maximum buffer size (which I believe is 1 GB).
It also sounds like this is something you can just post to the server with a regular POST request; in other words, use a form with a file input and upload the file the regular way, as 4 KB isn't really a big file.
