Internet Explorer and Base64 image display - javascript

To manipulate images more easily on the client side with JavaScript, I wrote a server-side function (in VB 2010) that converts a file into a Base64 string, which I then send to the client.
When I tried it in Internet Explorer 8.0 with three different images, one portrait and two landscape, only the portrait image was displayed in full, while both landscape images were truncated (I could see just the upper part of the image).
I thought I had a bug in my conversion function, until I tried my local page in Firefox: every image is displayed perfectly there.
So here is my question: is this a well-known bug in Internet Explorer? If so, is there a well-known remedy for that well-known bug?

IE8 can only display Base64 data URI images up to 32KB in size. It's a marginally annoying limitation, but you can still get by with icons and the like. Keep in mind that Base64-encoded images are on average 33% more data sent down the pipe anyway, so it's... eh, y'know, use your judgement.
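A minimal sketch of guarding against that limit (the helper function and the fallback URL are my own assumptions, not from the original code):

    // Fall back to a plain URL when the data URI would exceed IE8's 32KB cap.
    var IE8_DATA_URI_LIMIT = 32768;

    function setImageSource(img, base64Png, fallbackUrl) {
        var dataUri = 'data:image/png;base64,' + base64Png;
        if (dataUri.length > IE8_DATA_URI_LIMIT) {
            img.src = fallbackUrl; // too big for IE8, request the file normally
        } else {
            img.src = dataUri;
        }
    }
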
For anyone interested, it's also possible to make these work in IE6/7 (without, I might add, that 32KB limit...): http://venodesigns.net/2010/06/17/you-got-your-base64-in-my-css/

IE8 is limited to 32KB. One way around this is to serve the image in 'tiles', where each tile is under 32KB.
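For illustration, a rough sketch of that tiling idea, assuming the server has already split the image into row-major tiles of a known, fixed width (all names here are hypothetical):

    // Lay out base64 tiles in a fixed-width container so they
    // wrap and reassemble into the full image.
    function renderTiles(container, tiles, tileWidth, columns) {
        container.style.width = (columns * tileWidth) + 'px';
        container.style.fontSize = '0'; // suppress whitespace gaps between tiles
        for (var i = 0; i < tiles.length; i++) {
            var img = document.createElement('img');
            img.src = 'data:image/png;base64,' + tiles[i];
            container.appendChild(img);
        }
    }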

Related

How to get pasted images with transparency in Javascript

When detecting pasted images using event.clipboardData, the alpha channel is lost. Apparently this is because Windows stores images in the clipboard as 24-bit bitmaps.
I've heard that some applications store the transparency data of copied images separately, but I can't find out whether this can be accessed through clipboardData.
Here's a pasted image detector I wrote a while ago:
http://12Me21.github.io/paste/
I actually researched the subject of transparency on the Windows clipboard extensively, so while I'm not really experienced with JavaScript, I can help you along with the principles.
The clipboard works with a system of string IDs for its clipboard types, and you can put multiple such types on the clipboard simultaneously. In recent years, a lot of applications (including MS Office and GIMP) have started using the type "PNG" for transparent images, which contains a raw byte stream with the bytes of a PNG image.
However, a much more commonly used one is the DIB format, with clipboard ID "DeviceIndependentBitmap", and that one may be... problematic. Like the PNG one, it is a raw byte stream, but the actual file format is a bit of a mess. For more information about the DIB format, I advise you to read through this question and the answer I gave to it, and the Device Independent Bitmap specs and BITMAPINFOHEADER struct on MSDN. Long story short, it's a 32bpp RGB format that's abused as (and sometimes mistaken for) ARGB. I have no idea how feasible it is to parse DIB into a usable image in JavaScript, but a lot of programs (including Google Chrome) seem to use it as the only transparency-capable format they put on the clipboard when copying an image.
To fully support transparent pasting you'll probably need to support both the PNG and DIB formats (and, given the problems and controversies surrounding DIB, preferably in that order). This answer may give more information on the general method of sifting through the clipboard and parsing the PNG and DIB images, though it is C# code, not JavaScript.
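Note that from JavaScript you never see the raw Windows format IDs; the browser normalizes clipboard entries to MIME types, so the PNG type shows up as image/png. A minimal sketch of picking it up in a paste handler:

    // Prefer a PNG clipboard item, which keeps the alpha channel.
    document.addEventListener('paste', function (event) {
        var items = event.clipboardData && event.clipboardData.items;
        if (!items) return;
        for (var i = 0; i < items.length; i++) {
            if (items[i].type === 'image/png') {
                var blob = items[i].getAsFile();
                var img = document.createElement('img');
                img.src = URL.createObjectURL(blob);
                document.body.appendChild(img);
                return;
            }
        }
    });
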
Here's what I found:
In all tests I've done, pasting pixels results in a single image/png blob in the clipboard, so there's not much you can do - it either works, or doesn't.
The behaviour is consistent between paste event, navigator.clipboard.read, and pasting into contenteditable.
Transparency is hopelessly hit and miss on Windows depending on how the data is copied:
from Paint.NET: opaque in both Chromium and Firefox.
from Krita: transparent in Firefox, opaque in Chromium.
from Aseprite: transparent in Chromium, opaque in Firefox.
This is probably due to the aforementioned nightmare of clipboard formats on Windows.
Transparency works remarkably well on Linux - I have tested copying from Krita, Aseprite, and GIMP with Firefox and Chromium and all is well.
The specification does not say anything about discarding or keeping the alpha channel, which perhaps makes it implementation-defined in general.
A workaround is to also let the user provide their images via <input type="file" accept="image/*"/> and/or drag-and-drop events (for files), both of which are free of this issue (likely because the file is passed through as-is).
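A minimal sketch of that fallback, which hands you the original file bytes with transparency intact:

    // File input fallback: the browser passes the file through as-is.
    var input = document.createElement('input');
    input.type = 'file';
    input.accept = 'image/*';
    input.addEventListener('change', function () {
        var file = input.files[0];
        if (!file) return;
        var img = document.createElement('img');
        img.src = URL.createObjectURL(file); // alpha channel preserved
        document.body.appendChild(img);
    });
    document.body.appendChild(input);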

How can I show a TIFF image in Google Chrome?

I really need help with this problem. I have searched for more than two days but couldn't find any solution.
I have an application written with the Ext.NET framework on the front-end side, and I have a problem with showing a TIF file in the browser. As you know, some browsers don't support TIF files; only IE and Safari do, but I want to show them in Google Chrome. I also want not only to view an image but also to magnify and shrink it in order to examine it. What can I do to meet this need? According to some research, a TIF can be shown after conversion to another standard image format (PNG, JPG). Would you show me a way to overcome this problem?
Thanks in advance
TIFF is not a format suitable for images on the Internet, and it is poorly supported. Instead, you should use:
JPG for photos (JPG's compression is lossy, but it's good for photos).
PNG for clip art and schematics (it has lossless compression, which is good for large flat-color areas).
The preferred way to convert would be to access the TIFFs directly on your server and convert them there, either with a graphical tool like GIMP or with a console batch converter like ImageMagick (check their websites for examples).
I cannot recommend sending a TIFF to the client and having the client convert it on every access: it sends too much data, lengthens the page load, drains the battery of handheld devices, and is much less portable.
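If your server stack allows it, here is a minimal Node.js sketch of server-side batch conversion, assuming ImageMagick's convert binary is installed (the directory and file names are assumptions):

    // Convert every TIFF in a directory to PNG by shelling out to ImageMagick.
    const { execFile } = require('child_process');
    const fs = require('fs');
    const path = require('path');

    const dir = './tiffs'; // hypothetical input directory
    for (const name of fs.readdirSync(dir)) {
        if (!/\.tiff?$/i.test(name)) continue;
        const src = path.join(dir, name);
        const dst = src.replace(/\.tiff?$/i, '.png');
        execFile('convert', [src, dst], (err) => {
            if (err) console.error('failed to convert', src, err);
        });
    }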

How to prevent the difference in AudioContext.decodeAudioData under Chrome/FF

I'm using wavesurfer.js for a tool I wrote, and I used it to display a certain MP3 file. The problem I have is that if I load the MP3 file in both browsers, the one in Chrome gets chopped off at the beginning. I started debugging the problem, and it seems that a call to audioContext.decodeAudioData from the Web Audio API results in the MP3 getting chopped off in Chrome; the input in both cases is identical (a 2781-byte array goes in).
From Firefox I get 121248 samples back and the waveform looks good; from Chrome I get 114330 and it's chopped off at the beginning.
I tested another, longer file and it also came back with 6918 samples missing.
The same problem occurred under Linux with FF, where 124416 samples are returned (gstreamer plugin).
(By the way, these are all comparable since the systems all use 48kHz output.)
The plugin that decodes MP3 under Windows for Firefox is the VLC plugin, since Firefox itself cannot decode MP3 due to licensing issues.
The file is encoded as:
Audio file with ID3 version 2.4.0, contains: MPEG ADTS, layer III, v2.5, 24 kbps, 8 kHz, Monaural
Originally it was a 32kHz mono PCM file.
I can imagine that MP3 is not sample-consistent due to different decoder implementations (I kind of answered my own question there).
What would be a codec that produces sample consistency across browsers? I know WAV should produce comparable results, right?
How about Ogg? Can I assume consistency in the number of samples, since the codebase should be the same? Or do these differences in samples stem from the audio API being built up differently in the different browsers (a bit counter-intuitive for an API, I would say)?
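For reference, a minimal sketch of how such sample counts can be compared across browsers (the file name is a placeholder, and older browsers need the callback form of decodeAudioData rather than the promise form used here):

    // Decode the same file in each browser and log how many samples come back.
    var ctx = new (window.AudioContext || window.webkitAudioContext)();

    fetch('clip.mp3')
        .then(function (res) { return res.arrayBuffer(); })
        .then(function (buf) { return ctx.decodeAudioData(buf); })
        .then(function (audioBuffer) {
            console.log('samples:', audioBuffer.length,
                        'sampleRate:', audioBuffer.sampleRate);
        });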

HTML5 LocalStorage seems to become corrupt

I'm using BootUp.js (https://github.com/TradeMe/bootup.js) to load and store CSS and JavaScript files in HTML5 localStorage. The site is mobile-focused, so the time saving and speed boost this creates are great! However, I've noticed the odd occasion where the CSS (I've never noticed it with JS) becomes corrupt in storage, and so the site renders horribly until the storage is cleared and the CSS files are refetched from the server.
I've seen this happen very sporadically in Safari on an iPhone 4 (iOS 6), Chrome on a Galaxy S3, and Chrome on a Nexus 7, so it doesn't seem to be limited to any particular device, browser, or OS. Is this an issue anyone has come across before? Is it possible that the data has just somehow become corrupt? Are there any known issues with WebKit (I guess) that could cause it?
I'm planning to implement a workaround by storing some kind of checksum that can be generated in JS to ensure the data is fully there. If not, clear it out and fetch from the server.
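A minimal sketch of that checksum plan, using a simple non-cryptographic hash (the key-naming scheme here is my own):

    // djb2 string hash: a cheap integrity check, not tamper-proof.
    function hash(str) {
        var h = 5381;
        for (var i = 0; i < str.length; i++) {
            h = ((h << 5) + h + str.charCodeAt(i)) | 0; // h * 33 + char
        }
        return String(h);
    }

    function saveWithChecksum(key, value) {
        localStorage.setItem(key, value);
        localStorage.setItem(key + ':sum', hash(value));
    }

    function loadWithChecksum(key) {
        var value = localStorage.getItem(key);
        if (value === null || localStorage.getItem(key + ':sum') !== hash(value)) {
            localStorage.removeItem(key);          // missing or corrupt:
            localStorage.removeItem(key + ':sum'); // clear it and refetch
            return null;
        }
        return value;
    }
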
I'd first use this:
http://arty.name/localstorage.html
Mobile browsers tend to cut back on storage space due to obvious memory limitations. Your CSS and JavaScript might be too big, even when minified.
The other thing I can think of for this behaviour is that localStorage might become corrupt when a save starts and the page is refreshed at the same time. I'm not familiar with the exact inner workings of browsers here, but I'm guessing they might stop a save in the middle.
Also, have a look here:
http://hacks.mozilla.org/2012/03/there-is-no-simple-solution-for-local-storage/
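Since setItem throws when the quota is exceeded, a save can at least be verified rather than assumed, which also guards against an interrupted write. A sketch:

    // Verify a localStorage write instead of assuming it succeeded.
    function trySave(key, value) {
        try {
            localStorage.setItem(key, value);
            return localStorage.getItem(key) === value; // read back to verify
        } catch (e) {
            return false; // quota exceeded or storage unavailable
        }
    }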

Does HTML/JavaScript client-side image compression seem like a reasonable, fool-proof plan?

I have a plan to add images to a chat I've created, but the problem is that I have a small amount of bandwidth to use and I don't want to overstay my welcome, so does this form of file compression seem legitimate and safe? If you open the developer tools in any common browser, you can see how many bytes go in and out of the local compressor, and the elapsed time.
The result differs on each computer for the same image, and it uses the "image/webp" format when Chrome is available, because that uses less space than any other format. GIFs lose their animation and PNGs lose their transparency.
Is there anything I am missing? It combines HTML5's canvas.toDataURL() compression and LZW compression to deliver maximum results. It works in Chrome and IE10; I haven't been able to test it in any other browsers. My goal isn't to make it compatible with every browser, but to deliver a convenient form of compression.
It combines HTML5's canvas.toDataURL() compression
That's not "compression", it's "encoding", and it's a bad idea. You're not compressing anything, converting the image to a base64-encoded data URI will decompress the image, as you can fit a lot fewer bytes in base64 than you can in actual 256 bit binary encoding. LZW Compressing the resulting text will have negligible benefit.
You could put your images on another host; there are plenty of free picture-hosting services out there.
Your bandwidth will be safe, since no image data passes through your server. But it depends on what you need to do with these pictures...
