PNG Data Url damaged after sending it with Ajax - javascript

I wanted to send a PNG data URL via Ajax to my PHP script, but the URL I receive in PHP is not the same as the one on the client side.
$.ajax({
    url: "proceed.php",
    type: "post",
    data: "signature=" + signaturePad.toDataURL('image/png'),
    error: function(e) {
        alert("ERROR");
        console.log(e);
    },
    success: function(e) {
        alert(e);
    }
});
I think it gets damaged while sending; maybe an encoding problem?
I already tried encoding the value with JSON, but it's the same problem...
data: "signature=" + JSON.stringify(signaturePad.toDataURL("image/png"))

I presume you are using HTML5's canvas.toDataURL() as described here.
Your image data is not the same because, instead of sending the actual image file, you are sending the data of the image as it is painted on your page.
As the browser throws away unneeded data and (probably) keeps only RGBA and size information, the image you receive is understandably 'mangled'. The documentation also states that the image resolution will always be 96 dpi.
There can also be problems arising from using a URI component to transfer "binary data" (those quotes are supposed to be huge). There seems to be no defined lower or upper bound for the length of a URI component, as stated here. I would suggest using this technique only for small images (IMHO around 40x40 px).
Refer here for how to send bigger images over jQuery's $.ajax().
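A minimal sketch of one way to get the data URL across intact (assuming the same signaturePad object): passing data as an object lets jQuery URL-encode the value, so characters such as + in the Base64 payload survive the application/x-www-form-urlencoded POST.
// Passing an object makes jQuery run encodeURIComponent on the value before sending,
// so the data URL arrives at proceed.php unchanged.
$.ajax({
    url: "proceed.php",
    type: "post",
    data: { signature: signaturePad.toDataURL('image/png') },
    success: function (e) {
        alert(e);
    },
    error: function (e) {
        console.log(e);
    }
});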

Related

JavaScript img.src onerror event - get reason of error

There can be different reasons for <img> load errors, such as a network error response or bad image data.
The error object received from onerror doesn't seem to specify the exact reason.
Is there a way to know whether the error is due to a network error, say an HTTP 500 or a network timeout?
EDIT:
I'm not looking for an alternative way to load a resource, such as an AJAX request. I need an answer specifically for the <img> tag with the onerror event. The reason is that I'm using this method for pixel-tracking, and I need a way to retry upon network errors. I'm also not looking for alternative tracking methods such as JSONP.
Edit 16Nov16 2020GMT
Maybe you are pixel-tracking in emails or other clients with limited JavaScript capabilities.
One idea that comes to mind is to use URL query parameters in your <img>'s src URL.
With regard to network timeouts, consider the scenario where a user opens an email, lets it load entirely, then disconnects from the internet, and somehow this does not give the tracking pixel enough time to load.
https://developer.mozilla.org/en-US/docs/Web/API/WindowTimers/setTimeout
I would suggest using setTimeout() inside your onerror function.
This keeps attempting to set/load the <img>'s src URL. You could append the number of seconds it took until a successful load to your src URL as a query parameter like ?s=<sec>
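A minimal sketch of that retry idea (the pixel URL, the retry interval, and the ?s= parameter name are assumptions):
// Retry loading a tracking pixel from onerror, recording the elapsed seconds in the URL.
var started = Date.now();
function loadPixel(img) {
    var seconds = Math.round((Date.now() - started) / 1000);
    // Re-setting src triggers another load attempt; ?s= carries the elapsed time.
    img.src = 'https://example.com/pixel.gif?s=' + seconds;
}
var img = new Image();
img.onerror = function () {
    // Wait a bit and try again instead of giving up on the first network error.
    setTimeout(function () { loadPixel(img); }, 5000);
};
loadPixel(img);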
As for detecting 500 status codes on image loads, you might consider creating a custom 500 error page on the server which then creates, for example, a MySQL database entry with all the request information you have access to. If you also use the query parameters mentioned above, you get slightly more information attached to each error.
onerror for <img> gives limited information about the network
The information that is available from <img> can be found at
https://www.w3.org/TR/html/semantics-embedded-content.html#htmlimageelement-htmlimageelement
Older answer:
Perhaps a route you would like to try is to use AJAX to load the image data and set the <img> src to the base64 of the image data received. I hope this helps.
Edit 14Nov16 2018GMT
Alternatively, use AJAX to determine whether the image loads properly, then use the same URL you gave to AJAX as the src for your <img>. The extra request is of course redundant, but it avoids the issue of long data: URLs.
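A sketch of that probe-then-reuse approach (the image URL and the selector are assumptions):
// Probe the image URL with AJAX; if the request succeeds, let the browser load
// the same URL normally, avoiding a long data: URL entirely.
$.ajax({ type: 'GET', url: '/path-to-my/image.png' })
    .done(function () {
        document.querySelector('#hlogo img').src = '/path-to-my/image.png';
    })
    .fail(function (xhr) {
        console.error('Image request failed with status ' + xhr.status);
    });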
Edit 15Nov16 0832GMT
Also, regarding network timeouts, I found this thread to be useful: JQuery Ajax - How to Detect Network Connection error when making Ajax call
Apparently you can specify a timeout for AJAX, much like using error, except you just provide the milliseconds manually.
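A minimal sketch of that timeout option (the URL and the 5-second value are assumptions):
// jQuery's `timeout` is given in milliseconds; when it fires, the error callback
// receives the string 'timeout' as its textStatus argument, which lets you tell
// a network timeout apart from an HTTP error status.
$.ajax({
    url: '/path-to-my/image.png',
    timeout: 5000,
    error: function (xhr, textStatus) {
        if (textStatus === 'timeout') {
            // retry, log, or report the timeout here
        }
    }
});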
Converting to Base64
https://developer.mozilla.org/en-US/docs/Web/API/WindowBase64/Base64_encoding_and_decoding
https://developer.mozilla.org/en-US/docs/Web/API/WindowBase64/btoa
var encodedData = window.btoa("Hello, world"); // encode a string
Or, if you are concerned about older browsers being able to use btoa(), you might be interested in Google's https://chromium.googlesource.com/chromiumos/platform/spigots/+/refs/heads/firmware-u-boot-v1/base64_encode.js
Status Code checks in jQuery's AJAX
jQuery: How to get the HTTP status code from within the $.ajax.error method?
$.ajax({
    type: 'GET',
    url: '/path-to-my/image.png',
    data: null,
    success: function (data) {
        alert('hooray! 200 status code!');
        // Convert to Base64 and use it as the img src.
        // Note: jQuery hands you the body as text, so truly binary PNG data
        // may not survive this intact; it works best for small, ASCII-safe payloads.
        document.querySelector("#hlogo img").src = "data:;base64," + btoa(data);
    },
    error: function (xhr, ajaxOptions, thrownError) {
        switch (xhr.status) {
            case 400:
                // Take action, referencing xhr.responseText as needed.
                break;
            case 404:
                // Take action, referencing xhr.responseText as needed.
                break;
            case 500:
                // Take action, referencing xhr.responseText as needed.
                break;
        }
    }
});
Notes
https://www.rfc-editor.org/rfc/rfc2397#section-3
dataurl := "data:" [ mediatype ] [ ";base64" ] "," data
mediatype := [ type "/" subtype ] *( ";" parameter )
data := *urlchar
parameter := attribute "=" value
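For example, a PNG data URL following this grammar looks like the line below; the iVBORw0KGgo prefix appears in every Base64-encoded PNG because it encodes the PNG file signature.
data:image/png;base64,iVBORw0KGgo...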
https://www.rfc-editor.org/rfc/rfc2046#section-4.2
Using of a generic-purpose image viewing application this way
inherits the security problems of the most dangerous type supported
by the application.
https://www.rfc-editor.org/rfc/rfc2397#page-4
The effect of using long "data" URLs in applications is currently
unknown; some software packages may exhibit unreasonable behavior
when confronted with data that exceeds its allocated buffer size.
Other References
Unknown file type MIME?
Asynchronously load images with jQuery

Loading very large images with AngularJS

I'm trying to load a very large image (from a web app's perspective), around 10-20 MB, with an AJAX call.
I use angular resource:
'use strict';
angular.module('myApp')
    .factory('Picture', function ($resource, DateUtils) {
        return $resource('api/pictures/search', {}, {
            'get': {
                method: 'GET',
                transformResponse: function (data) {
                    data = angular.fromJson(data);
                    return data;
                }
            }
        });
    });
and in return I get a JSON response where one of the fields contains the Base64-encoded image, something like this:
{title: "some title", picture: "<Base64-encoded image data>", ...}
Upon successful response I take response.picture and put it into an <img>:
<img ng-src="{{pictureSrc}}" class="img-responsive center-block" alt="Picture">
like so:
Picture.get({id: resourceId}, function(response){
    $scope.pictureSrc = 'data:image/png;base64,' + response.picture;
});
This approach works up to about 10 MB, but as soon as the size of the image increases past that, JSON deserialization fails.
If I change the Content-Type of the response to image/png and return only the image data, the app still fails, but somewhere inside angular.js.
I realize my current approach isn't that flexible, and it seems at this point I have to change the way this image fetching is done rather than fix this one issue.
Hence my question: is there a sure way to get large images (up to 20MB) with an AJAX call using AngularJS?
UPD:
I'm running FF 42.0 and Chrome 45.0.2454.101 (64-bit), but it's expected to work on all major browsers as well as IE8 (that one is probably a topic for a different conversation).
UPD 2:
Clarification:
FF freezes if I return only the image data, not JSON, in my response. In the JSON case it says JSON.parse: unterminated string at line 1 column 1572865 of the JSON data.
An HTTP response (i.e. the JSON in your case) has no size limit; browser memory is the limitation.
In your case the object parsed from the JSON response consumes too much memory, hence the page becomes unresponsive. It's better to test with different data sizes and check at what point your application stops working correctly.
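One possible way to reduce the memory pressure (a sketch, assuming an endpoint that can serve the raw bytes; 'api/pictures/<id>/raw' is hypothetical) is to skip the Base64/JSON round trip and fetch the image as a Blob:
// Request the image bytes directly as a Blob and hand the browser an object URL,
// so no multi-megabyte Base64 string ever has to live inside a JSON document.
$http.get('api/pictures/' + resourceId + '/raw', { responseType: 'blob' })
    .then(function (response) {
        $scope.pictureSrc = URL.createObjectURL(response.data);
    });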
Hope it helps!

JSON.parse causes Chrome tab to crash ("Aw, Snap") despite use of try/catch

I'm loading some JSON data from an AJAX query:
$.ajax({'url': url, type: params.method, 'data': data, timeout: this.settings.timeout, success: function(d,a,x){
    console.log('request Complete', params.endpoint, params.params);
    var json = null;
    try {
        json = JSON.parse(d);
    } catch(e) {
        console.error(e);
    }
    console.log('json');
    // omitted for brevity...
}
});
I'm seeing occasional "Aw, Snap" crashes in Chrome where the last console.log shown is the "request Complete" one (neither the error nor the 2nd log ever appears).
I suppose it's important to note that the data may be large (sometimes as big as ~15 MB), which is why I'm not printing out d on every request and looking for malformed JSON (yet... I may resort to that). FWIW, I've also tried $.parseJSON instead of JSON.parse.
Research I've done into the "Aw, Snap" error is vague at best. My best guess atm is that this is an OOM. Unfortunately, there's not much I can do to decrease the footprint of the result set.
Is there any way I could, at the least, gracefully fail?
What happens when you tell jQuery the response data is JSON via the dataType property? Doing so should cause jQuery to pre-parse it and just give you the data. If you're right about what is going on, this might cause a crash too; jQuery does some sanity checks before parsing, though.
$.ajax({
    url: url,
    type: params.method,
    data: data,
    dataType: 'json',
    timeout: this.settings.timeout,
    success: function (d, a, x) {
        // `d` should already be parsed into an object or array, and ready to use
    }
});
If that doesn't help, please post your actual JSON response for us to take a look at.
"#the system" nailed it. 15MB of info coming back is too much for your browser to handle. I tripped upon this trying to see why my 64 Byte encoded image string is crashing the chrome tab, and it's only 200-500KB.
To 'gracefully fail', it seems you will need to have server side logic in place to prevent this from happening; perhaps limit the size of the JSON string. Only have X characters of length in the initial JSON string and add a property for 'isFinishedSending: false' or 'leftOff: [index/some marker]' so you can timeout for a bit and hit your server again with the the place you left off.
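A sketch of that chunking idea (the endpoint and the isFinishedSending/leftOff/items fields are assumptions about your own API, not an existing protocol):
// Fetch a large result in bounded pages so no single response has to be huge.
function fetchInPages(url, marker, collected, done) {
    $.ajax({
        url: url,
        dataType: 'json',
        data: { leftOff: marker },            // resume marker from the previous page
        success: function (page) {
            collected = collected.concat(page.items);
            if (page.isFinishedSending) {
                done(collected);              // all pages received
            } else {
                // Back off briefly, then ask for the next page from where we left off.
                setTimeout(function () {
                    fetchInPages(url, page.leftOff, collected, done);
                }, 250);
            }
        }
    });
}
fetchInPages('/api/big-result', 0, [], function (items) {
    console.log('received', items.length, 'items');
});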

google spreadsheets - ajax call (get and post)

What I need to do is read the content of a "public" Google spreadsheet (by public I mean that I shared the sheet via "File > Publish to the web", so it's accessible without being logged into a Google account), and, why not, write something into it too.
Googling around, I found that I can access the sheet and get the XML equivalent of its content with something like
https://spreadsheets.google.com/feeds/list/<sheetCode>/od6/public/values
It works great if I load that URL into a browser. But I need to find a "JavaScript way" to get and handle the returned value, i.e. the XML (or JSON, but XML would be preferable).
I tried to use an AJAX call, but I think there's something messy with the protocol... I can't get the server response correctly.
$.ajax({
    type: "GET",
    url: "https://spreadsheets.google.com/feeds/list/<sheetCode>/od6/public/values",
    success: function(data){ alert("yeah"); },
    error: function(){ alert("fail.."); },
    dataType: "xml",
});
I also tried to get JSON instead of XML by adding "?alt=json" to the URL and changing the dataType, but I still have the same problem.
Any idea / suggestion?
Thanks in advance, best regards
You need to make the request as a JSONP call, and you need to specify a callback method. This can be done in jQuery using:
var url = 'https://spreadsheets.google.com/feeds/list/<CODE>/od6/public/values?alt=json-in-script&callback=?';
jQuery.getJSON(url).success(function(data) {
    console.log(data);
}).error(function(message) {
    console.error('error' + message);
}).complete(function() {
    console.log('completed!');
});
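If the feed does load, the rows of the old v3 list feed typically appear under data.feed.entry, with each column exposed as a gsx$<columnname> field; a sketch (adjust to whatever console.log(data) actually shows, and 'name' is a hypothetical column):
jQuery.getJSON(url).success(function (data) {
    // Each published row shows up as an entry; column values live under gsx$<columnname>.$t.
    (data.feed.entry || []).forEach(function (row) {
        console.log(row.gsx$name && row.gsx$name.$t);
    });
});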
Documentation and samples for Google Spreadsheets.

Make the browser poll for an image until available

I have a small HTTP server which generates some images on the fly, and the generation process may take some time. After generation, the image is cached indefinitely.
Currently, if a user requests an image which is not cached, the server returns a 202 Accepted with a Refresh header. If the image is cached, a 301 Moved Permanently is sent and the user is redirected to a unique URL (different images may share the same unique URL).
The whole system breaks if the image is referenced in an <img> tag (on Firefox, at least). Can this be solved without JavaScript? If not, what would the script look like?
I'm not sure if you can do it without JavaScript, but you could probably do this with AJAX: point at the server and check whether the image is there; if it is, display it, otherwise try again 30 seconds later. It could be something like:
function getImage(img) {
    $.ajax({
        cache: true,
        url: <<ADDRESS>>,
        data: "",
        timeout: 60000, // jQuery timeouts are in milliseconds
        error: function (jqXHR, error, errorThrown) {
            // Not there yet (or the request failed); try again in 30 seconds.
            setTimeout(function() {
                getImage(img);
            }, 30000);
        },
        success: function (data) {
            //set the image
        }
    });
}
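A hypothetical way to wire it up (the element id and generated-image URL are assumptions; the success handler above is where the src would actually be assigned):
// Stash the real URL on the element, start polling, and once the AJAX probe
// succeeds let the browser load it as a normal <img>.
var img = document.getElementById('gallery-thumb');
img.setAttribute('data-src', '/images/generated/42.png');
getImage(img);
// ...and inside success: img.src = img.getAttribute('data-src');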
OK, you're then hoping that the image will come down at some point.
The only other option would be to generate the image before it is requested. For example, if it is just creating a thumbnail for a photo gallery, why wait until it is requested? Just generate it as soon as you have the source image.
Hope that helps / makes sense.
