Deflating a response in Rails & inflating in JS - javascript

I have created an API in Ruby on Rails. Rather than sending the response body back with the API response, I am broadcasting it to an external service, which then takes care of the realtime distribution to all connected clients (i.e. WebSockets).
Today I hit a snag: our third-party provider only allows data packets of up to 25 KB. One of our responses started causing problems once its data grew beyond this limit, and the third-party service began blocking the calls. As a side note, data packets will seldom grow beyond 25 KB.
I did some research into the best approach. One idea was to compress the response using Zlib and then decompress it on the JS side; this StackOverflow question is what led me there.
I managed to get the deflation and Base64 encoding right, but could not decode on the JS side. I also tested the generated Base64 string, and services like this one flag it as invalid.
My code looks like this:
In a Rails Controller
...
compressed_data = Zlib::Deflate.deflate(json_data.to_s)
encoded_data = Base64.encode64(compressed_data)
broadcast encoded_data
...
In JS that receives the broadcast:
import pako from 'pako';
import { decode } from 'js-base64';
...
const decoded = decode(payload.data);
const decompressed = pako.inflate(decoded);
...
When I execute the broadcast, I get the error: unknown compression method. I understand this might have something to do with pako, but I have also tried other approaches without success. Does anybody have any ideas, or perhaps a better way to approach this problem?
UPDATE:
The Base64 string generated in rails looks like this:
eJxlU2Fr2zAQ/SuHP25JkJW2abMPY6OllJWkNGFftmEU+RKLypKRTknD6H/f\nyQ5ru0EQutPdy3vvzr8LUxfz8nJ2NSrIkMViXkj4GjyGCNc+tBjgZxJCXsAq\nbbfePsF3NNa4XTEqaow6mI6Md9xWSgGPqm0RPkC+r8gjeW9yLDn+5vktRDPA\nYWrtJ4uwPBUIka9wr/qiCTze3N6t1o9f1nfLBTzcLK7vFref4cGiiggBdyYS\nU9scQRHkJEE5axjEO6AGoVZH2ODWB+xDlXRmOYGl0wgHhEbtM4xGs8cajj6F\nE2hQuXA0pGokZWyEg7GW4SCiIyDfQwa0uFccW4aI5PUTqF1+PzR+aNDekdKU\noXKT9m1nkQZCeyRiE6ELXmNkvWzniWRlvVYnT+/9gVUuQ4euVjyc16JIKlBV\nK+onJmQ5FuVYynU5nQvBv4kQ4qOQfHvTxCinFpesfc3TscswK2NZANdvTF0z\nuwqfCV0cqDj/JuSSriL1XIUeTXCcjjy3qgvYmtT2qRq3KlkmiZ2PhvqcTiGg\n00cGXKgF4+iADFFXigYdYlzKsTxbl2I+vZpPy4mUs786Ule/K+5Flyya32Uu\nvijL1+KIocrbPcv0gnK66cOzy1ER2fMsPMeDFSy5Mo7ZtGxBtVc2YSzmP0q2\ncSTzMc3HeT5yTvwaFU2/q9kG/oLOR0XLQzeaV7Hq0saa2FTkn9D9a7bl4VKq\n/xuC9W73/kHFNs+5H7HnFcCaZTlFKeTMATf5z/rvMO/VYEtuffkDW0lDVA==\n

Your data starts off with a zlib header, but the compressed data is corrupted for some reason.

Related

Compress Data in JavaScript and send it to Flask Server

So my team has made a small tool to perform Image Annotation and Labeling, and we were trying to optimize our code.
We were able to compress data from the server, but we have been trying to perform compression on data sent from client to server as well.
What data, you may ask? It's just a text file of around 2-3 MB.
Is there any way we can perform compression?
We are using JavaScript and want to send the data to Flask.
This is the first question i am posting on here :)
You can try this lib: pako
var binaryString = pako.deflate("2-3mb text content", { to: 'string' });
// Here you can do base64 encode, make xhr requests and so on.
But on the server side, the responding endpoint will receive compressed data. You should decompress it using something like zlib.decompress(compressedData)

Client vs. server image processing and display

Client-side vs. server-side image processing.
We have a big system that runs on JSF (PrimeFaces) and EJB3, with some JavaScript logic on top (e.g. for using Firebase and such).
We ran into this problem: we have a servlet that serves images. The backend takes a query, extracts a BLOB image from the DB, turns the BLOB into a byte array, puts it into the browser session memory, and the servlet serves it at url-OurSite/image/idImage. The frontend references it with an <img> tag pointing at url/image/id, and that works fine so far.
Then we tried a new, more direct way to show images: we send the BLOB/raw data to the frontend, convert it to Base64 there, and pass it to the HTML.
Base64 codec = new Base64();
String encoded = codec.encodeBase64String(listEvidenciaDev.get(i).getImgReturns());
Both approaches work in almost all cases.
Note: We didn't try this before because we couldn't pass the raw data through our layers of serialized objects and RMI. Now we can, of course.
So now there are two ways.
Either we send the data to the servlet and publish it at some URL, which means the backend does all the work and the frontend just calls the URL,
or we send the data to the frontend, which does some magic and transforms it into an image.
This brings up two questions.
If we send the frontend the raw object, versus having it call a URL to fetch the image content, does the end user download the same amount of data? This is important because we have some remote branch offices with poor internet connections.
Is it worth pushing the hard work onto the frontend (converting the data) or onto the backend (converting and publishing)?
EDIT:
My question is not about the BLOB (what I call raw data) being bigger than Base64.
It is: is passing the data as an object and transforming it into a viewable picture heavier on bandwidth than passing a URL from our servlet that serves the actual image and loading it in the HTML?
I chose to close this question because we ran some tests and the bandwidth usage on the frontend was the same.
Anyway, we make use of both solutions.
If we don't want to burden the frontend with a lot of encoding, we set up a servlet for those images (which comes with more code and more server load). We look for the best optimization in each specific case.
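For what it's worth, Base64 inflates raw bytes by roughly a third before any transfer compression, which is worth weighing against those slow branch-office links. A quick check of the arithmetic:

```javascript
// Base64 turns every 3 raw bytes into 4 output characters (~33% overhead)
const raw = Buffer.alloc(3000); // stand-in for image BLOB bytes
const encoded = raw.toString('base64');

console.log(raw.length, '->', encoded.length); // 3000 -> 4000
```

If the HTTP responses are gzip-compressed in transit, much of that overhead disappears, which may explain measurements showing similar bandwidth for both approaches.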

How to POST JSON attributes with underscore?

The issue:
On the server, I'm receiving my api_key parameter as api key (with a space instead of an underscore). How do I send it with the underscore?
Code:
const data = {
  api_key: this.state.api_key
};
axios.post('/resource', data)
  .then(response => { console.log(response); });
I'm using both React and axios (must use axios) on the frontend, and Rails on the backend.
Thanks
UPDATE
The rails action which first receives the parameters:
private

def resource_params
  ActiveModelSerializers::Deserialization.jsonapi_parse(params)
end
Axios serializes object params with JSON.stringify, which means the request is almost certainly being sent the way you want (unless you are transforming it somehow). The issue is probably on the Rails end. Looking at your resource_params method, I can see you are using ActiveModelSerializers' JSON API adapter. However, the request you are sending with Axios is not JSON API compliant. You might try JSON.parse(params) instead, or better yet, use Strong Parameters. Also, you mentioned that you are using resource_params as a before action, but it doesn't look like you are assigning an instance variable there as one might expect. How are you handling that result? Are you calling the method directly in your action as well?
Anyway if that doesn't help I would be happy to continue our discussion in the chat.
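For reference, jsonapi_parse expects a JSON:API-shaped document; a sketch (the resource type 'users' is made up), which also shows that JSON.stringify leaves underscores in keys untouched:

```javascript
// JSON:API-shaped request body; the resource type 'users' is hypothetical
const payload = {
  data: {
    type: 'users',
    attributes: { api_key: 'abc123' }
  }
};

// axios serializes object bodies with JSON.stringify under the hood
const body = JSON.stringify(payload);
console.log(body.includes('api_key')); // true: underscores survive serialization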
After hours beating my head about this, I've found the culprit: Visual Studio Code
This is the output on VS Code:
And this is the output on linux terminal:
This is such a stupid issue but it never occurred to me that the bloody text editor would omit the underscores.
Truly, #chris-g, this wasn't an issue with JSON or React; it was on the Rails side as #Xavier suspected, though the issue is unrelated to the missing underscores themselves.
This is what I get for giving Microsoft some credit after so long.

Send data in chunks with nodejs

I'm quite new to nodejs and I'm working on a backend for an Angular 4 application. The problem is that the backend is quite slow to produce the whole data for the response and I'd like to send data over time as soon as it's available. I was reading about RxJS but I can't really figure out how to use it in node, can you please help me?
Maybe you are looking for a way to stream the data
Express
Normally you respond with res.send(data); it can be called only once.
If you are reading and sending a large file, you can stream the file data while being read with res.write(chunk) and on the 'end' event of the file reading, you call res.end() to end the response.
EDIT: As you state, what you want is to stream each chunk as soon as it is available, so you can use the res.flush() command between writes (just flush after each res.write(chunk)).
It would be much faster in your case but the overall compression will be much less efficient.

Read/write json from js to file on server (not server app)

I'm working with a .js client and have an object that I need to write out to a file on the server. File I/O with JavaScript is new to me... I was planning on using jQuery and JSON. I'm using Java server-side. I don't have a problem reading what I get back from my servlet, but the file I/O is killing me! A couple of questions:
I can open a file I generated myself via the .js with an $.ajax call, but it's not handling my JSON syntax (I tried both $.getJSON and $.ajax; it's handwritten JSON, so I'm probably doing something wrong with it). I used Firebug's console and it looks OK...
How can I write my object to a file on the server?
Then, when I want to read it, what do I need to do to process it? Right now I'm using a jsonFilter function (which uses JSON.parse if that's available, otherwise eval) to process data I get from the servlet.
The object I'm writing isn't simple, but it's not super complex either. There's an array that contains an array, but that shouldn't make a difference as long as the software both reads and writes it.
Thanks for any help! I'm at a loss; I've tried a lot of different things.
You can open a file located on the server via Ajax by requesting the file and loading it into a JSON object. You might want to lint your JSON.
You cannot write to a file on the server directly from the client; allowing that would be a severe security hole.
Common practice is to change the JSON data and then send it via Ajax to server-side code. The server then does the file I/O.
Yes, using JSON.parse where available and otherwise falling back to eval is correct. I would recommend json2.js.
The data should be fine as long as it passes JSONLint.
Your main issue is that it's impossible to write to the server directly from the client. Have the client load the data through Ajax, change it, and then ask the server to update the file.
JavaScript in the browser has no file I/O; you should use Ajax or an HTTP request to send the data to the server and have the server do the I/O...
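On the "will my nested arrays survive" worry: JSON round-trips nested arrays without loss, so as long as both sides speak JSON the structure is safe. For example:

```javascript
// an object whose array contains another array, like the one described above
const obj = { name: 'annotations', rows: [[1, 2], [3, 4]] };

// serialize on one side, parse on the other: the structure is preserved
const roundTripped = JSON.parse(JSON.stringify(obj));

console.log(roundTripped.rows[1][0]); // 3
```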
