I'm trying to display a quite large PDF on my website. Everything is exactly the same as the official demo, and it works well with the example file. But when I use my own file by adding the ?file= parameter, the page refuses to redraw at any scale above 40%, so the preview looks blurry at normal scale (it still redraws when I change the scale from 20% to 30%, but when I switch to 100%, the page stretches what it rendered at 40% into the 100% scale, instead of rendering another image at 100%). The page size is 2766.8 x 2531.8 mm (horizontal).
How should I modify the scripts in order to improve the maximum resolution to at least 100%?
In most (but NOT ALL) web-browser-based PDF viewers, calls to Acrobat or a clone viewer like PDF.js accept a "suggested" magnification set as a #zoom fragment.
The limits accepted are fixed in two places:
the browser's PDF extender range (often 8-6400%), and
the browser's own allowed zoom range (often 25-500%).
It does NOT have to be respected, since the user controls their PDF extender viewing preferences. It only applies IF the download is returned to the browser's inline frame, which it does not have to be: it can be directed at an external application/pdf viewer instead, where the #zoom is not respected.
Here is that example with the requested actual scale (and also zoomed in at #200 and out at #50), for testing in mainstream browsers with PDF extenders that return downloads to the browser.
Each click is potentially a fresh download, unless the browser reuses the same locally cached file, which can itself cause other problems, e.g. when the user has "go to the same page" settings active.
Normally the #zoom goes after .pdf, e.g. sample.pdf#zoom=percent, but note that here the value is passed to the HTML worker.
http://mozilla.github.io/pdf.js/web/viewer.html#zoom=200
http://mozilla.github.io/pdf.js/web/viewer.html#zoom=100
http://mozilla.github.io/pdf.js/web/viewer.html#zoom=50
Remember the scale is NOT guaranteed, especially if the user's browser or device has overriding zoom settings.
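For instance, here is a hedged embedding sketch that combines the ?file= parameter from the question with a #zoom fragment (the viewer path and PDF URL are placeholder assumptions, and the stock viewer applies same-origin restrictions to file=):

// Point an iframe at the PDF.js viewer; ?file= selects the document
// and #zoom suggests the initial scale. Both paths are assumptions.
const frame = document.createElement("iframe");
frame.src = "/pdfjs/web/viewer.html?file=" +
            encodeURIComponent("/docs/large-plan.pdf") + "#zoom=100";
frame.style.width = "100%";
frame.style.height = "800px";
document.body.appendChild(frame);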
I've realized what happened after another day's work. In the original viewer.js, the resolution is not unlimited. Instead, the canvas size is capped at 16M pixels:
maxCanvasPixels: {
  value: 16777216, // 16M
  kind: OptionKind.VIEWER
},
For my case, I tried 256M pixels (since the PNG version of this file has 232M pixels). It does render a nice and clear preview with more pixels, but the rendering process becomes very laggy and the browser itself is almost unresponsive.
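For anyone else hitting this, a minimal sketch of raising the limit without patching viewer.js, assuming a recent stock viewer that exposes PDFViewerApplicationOptions; the 64M value is only an illustrative compromise between sharpness and responsiveness, not a recommendation:

// Run before the viewer initializes, e.g. in a script tag added to viewer.html.
// 67108864 = 64 megapixels: above the 16M default for sharper zoom, well
// below 256M to keep rendering responsive. Tune for your target devices.
PDFViewerApplicationOptions.set("maxCanvasPixels", 67108864);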
Related
I'm working on integrating screen capturing in a framework I'm expanding. I'm requesting the MediaStream of the screen through the getDisplayMedia method and recording it using the RecordRTC library. Both Firefox and Chrome allow the user to specify what exactly will be shared: the entire screen, a specific window, or a single tab (Chrome only). I noticed the choice here significantly affects the resulting video file size. The results below are from 30-second recordings, where the browser window filled the entire screen.
Chrome:
Entire Screen: 3.44 MB
Window: 0.81 MB
Tab: 0.15 MB
Firefox:
Entire Screen: 5.23 MB
Window: 3.56 MB
Of course, when recording a window as opposed to the entire screen, the resolution becomes slightly smaller. For the Firefox recording: entire screen = 2560x1440 and window = 2488x1376, which is about 93% of the pixels, so I don't think that should make that much of a difference.
I've tried looking at the Chromium source code (as that's open source and what Chrome is based on) to figure out what the difference is between the options, but I can't seem to figure out what is going on. None of my Google searches were successful either.
Does anyone know what the reason for these large differences is?
I'm on Ubuntu 20.04 if that makes a difference.
This is because when you record a window or a tab, the browser is responsible for rendering the content, so it knows when something new has been painted and when nothing has.
You can clearly see this in Chrome, where they'll even fire a mute event on the VideoTrack of a tab capture after 5 seconds in which nothing is animated.
So, since they know nothing new is being painted, they don't pass anything to the stream and instead create a single frame with a very long duration.
When capturing the desktop, however, they're not responsible for the rendering anymore and don't know whether something has changed: they have to pass every frame as a new frame.
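You can observe this yourself with a sketch like the one below, which just listens for the standard mute/unmute events on the captured video track (the 5-second muting described above is a Chrome implementation detail, so behavior will differ between browsers):

// Capture a screen/window/tab and log when the browser stops delivering
// new frames (Chrome mutes an idle tab-capture track after ~5 seconds).
async function observeCapture() {
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true });
  const [track] = stream.getVideoTracks();
  track.addEventListener("mute", () => console.log("no new frames painted"));
  track.addEventListener("unmute", () => console.log("painting resumed"));
}
observeCapture();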
As part of a project of mine, I want to write an image viewer inside a web browser.
The images are extracted from video files on the server and sent to the client. The image should be displayed as is and not be scaled by the browser (unless the user explicitly chooses to do so), as this would distort the image.
This image illustrates the problem. The Win32 application (with DPI scaling disabled) shows a 20x20 image without any scaling (the black area). Chrome shows the 20x20 image (here a green image) scaled by a factor of two. What I want is for the image in the browser to have the same area as the black square inside the Win32 application, regardless of which DPI setting the user has chosen on their system.
To be clear: serving the user an image with a higher resolution is not an option. Neither is having the browser scale the image with different settings acceptable. JavaScript-based solutions, however, may work as well.
I have not yet been able to find a solution to that problem.
I haven't found a way to disable it, but I found something that basically reverses the scaling (source):
// Counteract the browser's DPI scaling: shrink the element by the inverse
// of devicePixelRatio so one image pixel maps to one device pixel.
var factor = 1 / window.devicePixelRatio;
document.querySelector('img').style.transform = 'scale(' + factor + ')';
I do not know how accurate this scaling will be and whether the image will still be rendered pixel-perfect. However, assuming the browser does some rounding internally before rendering, it should be.
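A slightly fuller sketch under the same assumption: transform scales around the element's center by default, so the origin has to be pinned, and devicePixelRatio changes when the user zooms or drags the window to another monitor:

// Hypothetical helper: render an image 1:1 in device pixels.
function renderUnscaled(img) {
  img.style.transformOrigin = 'top left'; // scale() is centered by default
  img.style.transform = 'scale(' + (1 / window.devicePixelRatio) + ')';
}
renderUnscaled(document.querySelector('img'));
// Coarse catch-all: re-apply when zoom or a monitor change alters the ratio.
window.addEventListener('resize', function () {
  renderUnscaled(document.querySelector('img'));
});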
I am trying to make a website with a minimalist feel, so I put a fullscreen image as the background-image of body. I have transition: background-position 1s set as a CSS rule for body, with an easing function, to create a smooth scrolling effect when going to other pages in the same HTML file (I have no actual scrollbar, just navigation elements). The thing I noticed was that once I started scrolling, the memory reserved by the page went from a small 77 MB to over 500 MB! I tested this in Firefox too, where it doesn't seem to happen (either because pages have no separate processes or because memory allocation works differently, I imagine). Why does this happen in Google Chrome and not in other browsers? And how can I reduce the enormous amount of memory reserved by my page?
To give some information on what I am using:
Browser: Google Chrome
RAM: 8 GB
The page uses JavaScript with the following plugins:
jQuery
Bootstrap
Background image dimensions: 1440 x 540
A few possible causes of the problem:
The image is too big to be rendered with a transition and an easing function.
I should not use background-image for this, but create a new <img/> as a background.
I somehow only checked it with the developer tools open in Chrome, which itself increases the memory allocated.
It's not the image causing the problem, but the web fonts that I scroll simultaneously with the background image, also using a transition and an easing function.
And I want to add that maybe this is not even a problem after all; it's just that I have never seen a page go over 300 MB of allocated memory.
Not sure if this should be posted as a comment instead, however:
Nothing you can do about that. Chrome trades RAM for performance.
For instance, each Chrome tab runs in its own process for that purpose.
I've had cases where a single YouTube tab took over 1.5 GB of memory and virtual memory.
Even a blank tab takes 45 MB of memory for me.
So there is no issue with your code at all.
OK, so I'm building an application similar to Battle.net's launcher :) Tabs for multiple applications, and I would like to add a "dynamic content area" where I could post news and such. So I placed a WebBrowser there, set a width and a height, and proceeded to create a static HTML file.
I am aware that WPF's measure unit is not the pixel. Therefore, when creating the CSS for the HTML, I multiplied the WebBrowser's dimensions by 1.33 and used that for my HTML's body width and height. It looked good on my work monitor. When I got home, surprise! Apparently my home monitor works at a higher DPI value, so the image I had there was larger than the browser window, the text was way too big, etc. I did try switching to em as a unit, but still no luck.
So, how can I create a webpage that will fit all kinds of DPI settings, seeing as I can't really get the monitor's DPI values (not reliably, as far as I have been able to read)? Is it possible?
The other possibility I have considered is to create placeholders for an image and some text (WPF controls, that is) and populate them from the server, but I may want to change the layout at some point, etc., so this doesn't really help me much in the long run.
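For what it's worth, one hedged option is to read the effective DPI from inside the page itself instead of hard-coding a 1.33 factor, assuming the WPF WebBrowser control is IE-based (deviceXDPI and logicalXDPI are nonstandard, IE-only properties):

// Inside the page hosted by the WPF WebBrowser control (IE engine).
// 96/96 = 1.0 at standard DPI, 120/96 = 1.25, and so on.
var dpiFactor = screen.deviceXDPI / screen.logicalXDPI;
// Counteract the control's scaling so the layout fits at any DPI setting.
document.body.style.zoom = 1 / dpiFactor;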
On a site of mine, my client is reporting that images that are reduced in size by code (i.e. with a specified width/height) appear jagged and pixellated. I asked her what browser she uses and, inevitably, it's Internet Explorer.
Is there a way to optimise images in IE or do I need to manually resize the images in Photoshop before I put them on the site?
The images in question are resized from 220x220 to 80x80, and I have JavaScript that expands them to 220x220 upon clicking.
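For reference, a sketch of the kind of click-to-expand script being described (the img.thumb selector and the exact sizes are assumptions, and old-IE event quirks are ignored for brevity):

// Toggle each thumbnail between its scaled-down 80x80 display size and
// the full 220x220 on click; the browser does the resampling either way.
var thumbs = document.querySelectorAll('img.thumb');
for (var i = 0; i < thumbs.length; i++) {
  (function (img) {
    img.addEventListener('click', function () {
      var expanded = img.width === 220;
      img.width = expanded ? 80 : 220;
      img.height = expanded ? 80 : 220;
    });
  })(thumbs[i]);
}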
Resizing down or up in a browser can look terrible. It varies from browser to browser, but apparently IE is the worst.
It's best either to write a server-side script to create thumbnails or to resize them manually yourself if image quality is important. It also saves bandwidth, as you don't need to load the big image only to display a tenth of the pixels.
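A minimal server-side sketch, assuming Node.js with the sharp package (an assumption; any image library works):

// Pre-generate an 80x80 thumbnail once on the server instead of letting
// the browser downscale the 220x220 original on every page view.
const sharp = require('sharp');
sharp('images/photo-220.jpg')
  .resize(80, 80)
  .toFile('images/photo-80.jpg')
  .then(() => console.log('thumbnail written'));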
You should avoid using width and height for resizing, as it causes longer loading times (on slow connections and with big images).
A better idea is to make thumbnails (with Photoshop, for example) and use the "Save for Web" option to reduce the file size even more.
http://acidmartin.wordpress.com/2009/01/05/better-image-scaling-and-resampling-in-internet-explorer/
Bicubic image resampling is turned off by default in IE. You can use this to turn it on in your reset stylesheet:
img {
  -ms-interpolation-mode: bicubic;
}
Use timthumb; it will create thumbnails for you. You just need to link to the script and specify the size of the thumbnail, and that's it. http://www.darrenhoyt.com/2008/04/02/timthumb-php-script-released/
I'm using it on one of my sites -> http://iv-designs.org/
You can see the images are clean and not pixelated.
Assuming your images are JPEGs, the easiest option is to use IE7's bicubic image resizing feature, which you can turn on using CSS:
img { -ms-interpolation-mode: bicubic; }
Be aware that it has performance implications (using it a lot will slow the browser down). It also has no effect in IE6, and is no longer needed in IE8.
Another way (which does work in IE6) is to use Ethan Marcotte's wonderful Fluid Images script, which uses some damn clever CSS filters to fix the problem in IE6 and 7. My own variation on the theme fixes the right-click problem, but requires jQuery.