Live HTML preview without updating the whole preview every time content changes - javascript

I'm building a markdown editor using node-webkit (which lets you build native applications with HTML/JavaScript), marked (a JavaScript markdown compiler) and CodeMirror (a code editor). So I'm basically building a markdown editor with HTML and JavaScript (+ jQuery) that runs as a native application.
One of its features is a live preview of the compiled HTML. This live preview sits inside an iframe and updates every time the user changes any content in the CodeMirror editor. The problem is that when the preview updates, the whole document inside the iframe gets replaced, so it's basically reloading a whole webpage every time you press a key. This is not very efficient and makes the whole application run slowly. It's also a problem when you add embedded media like a YouTube video to your document, because that video gets reloaded every time the preview updates.
I tried to solve the problem by adding a timer so the preview won't update faster than once every 250 milliseconds. This solves the slowness problem, but embedded media will still reload on every preview update.
I tried some other live-preview HTML/markdown editors and most of them used the same method for the live preview as me, except StackEdit (and probably some others, but StackEdit is a good example). I noticed that in StackEdit, when you embed a YouTube video somewhere in the document and then edit some text somewhere else, the YouTube video doesn't reload. That's exactly what I need: only update the content in the preview that has changed. How can I get my live preview to work like that?
Note: This is how the preview currently gets updated:
var HTML = marked(CodeMirror.getValue());
$('iframe').contents().find('.content').html(HTML);
(This runs, throttled to once every 250 milliseconds, whenever the content of the CodeMirror editor changes.)

You're probably looking for a way to compare HTML with something like HTML_PREV. If that's the case, you may be looking for HTML diff algorithms, like htmldiff.js. You also might be interested in this related question.
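
One straightforward way to do that (a rough sketch, not StackEdit's actual implementation) is to diff at the block level: render the new markdown into a detached element, then only replace the top-level blocks whose HTML actually changed, so untouched nodes such as an embedded video keep their state:

function updatePreview(markdown) {
  var container = $('iframe').contents().find('.content');

  // Render the new markdown into a detached element instead of
  // writing it straight into the iframe.
  var fresh = $('<div>').html(marked(markdown));

  var oldBlocks = container.children();
  var newBlocks = fresh.children();
  var max = Math.max(oldBlocks.length, newBlocks.length);

  for (var i = 0; i < max; i++) {
    var oldBlock = oldBlocks.eq(i);
    var newBlock = newBlocks.eq(i);

    if (!newBlock.length) {
      oldBlock.remove();              // block was deleted
    } else if (!oldBlock.length) {
      container.append(newBlock);     // block was added at the end
    } else if (oldBlock.prop('outerHTML') !== newBlock.prop('outerHTML')) {
      oldBlock.replaceWith(newBlock); // block changed
    }
    // Identical blocks (e.g. the one containing the YouTube embed)
    // are left alone, so they never reload.
  }
}

// Called from the same throttled change handler as before:
// updatePreview(CodeMirror.getValue());

Note that this index-based comparison is naive: inserting a new paragraph above the video shifts the indices and still replaces the video's block. A keyed diff, or a DOM-diffing/patching library such as morphdom, handles insertions and moves more robustly.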

Related

Embedding PDF with flipping pages

I've seen a lot of questions and answers here that help you embed PDF documents, but they are all "vertical read", where you scroll down to see the other pages.
Is there any plugin, or maybe even a simple script, that would help you embed and read the PDF document horizontally? Like where you see two pages and click next at the top to see the next two.
Just like a book. (No fancy animations though)
Thanks.
There are plugins that let you read PDF documents, but they might be fancy.
http://www.jqueryrain.com/2012/09/best-jquery-pdf-viewer-plugin-examples/
or
http://fliphtml5.com/free-pdf-to-jquery-flipbook.php
I especially like the last one.
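
If you would rather build it yourself than use a plugin, one possible approach (just a sketch, assuming a recent pdf.js build exposing pdfjsLib and the turn.js flipbook library are loaded; the file name and element ID are placeholders) is to render each PDF page to a canvas and hand the canvases to the flipbook:

// Render every page of the PDF to a canvas, then turn the container
// into a two-page flipbook.
pdfjsLib.getDocument('book.pdf').promise.then(function (pdf) {
  var renders = [];
  for (var i = 1; i <= pdf.numPages; i++) {
    renders.push(pdf.getPage(i).then(function (page) {
      var viewport = page.getViewport({ scale: 1.0 });
      var canvas = document.createElement('canvas');
      canvas.width = viewport.width;
      canvas.height = viewport.height;
      var task = page.render({ canvasContext: canvas.getContext('2d'), viewport: viewport });
      return task.promise.then(function () { return canvas; });
    }));
  }
  Promise.all(renders).then(function (canvases) {
    canvases.forEach(function (canvas) {
      $('#flipbook').append(canvas);
    });
    // turn.js shows two pages side by side by default, like a book.
    $('#flipbook').turn({ width: 800, height: 600, autoCenter: true });
  });
});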
You can set the initial view of a PDF to show two pages.
If you add Fullscreen view to that, and the user has not deactivated clicking to advance to the next page, it should give the effect you want (not verified).
If Fullscreen view does not work, the user can use the cursorLeft and cursorRight buttons to navigate.

Select a section of a web page and output all JS used there?

Is there an application that allows me to select a section of a web page, and then outputs all js used there? I've been told I can do this with Chrome Inspector, but haven't had any success so far.
Example:
On this page - http://preview.oklerthemes.com/porto/2.7.0/page-left-sidebar.html - there is a tabbed box in the sidebar. I want to easily grab all the JS/CSS needed for that box. I usually use the Inspector to look at all the styles and grab them from each CSS file, but I don't know how to do this for the JS.
It's not quite clear from your question what you're asking.
Are you trying to see what JS causes writes or changes to a particular part of a web page? The easiest way would be to open the page with the element inspector, right-click a particular chunk of HTML, and set a breakpoint on modifications.
The next time a function causes any changes, the breakpoint will trigger and you'll be able to crawl up the call stack to see what the cause was.
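
If you want a programmatic complement to that, a MutationObserver pasted into the console will log every change made to that part of the page. It won't tell you which script made the change (the breakpoint does that), but it shows you when and what changes. The selector below is just a guess at the tabbed box:

var target = document.querySelector('.sidebar .tabs');

var observer = new MutationObserver(function (mutations) {
  mutations.forEach(function (m) {
    console.log(m.type, m.target, {
      added: m.addedNodes,
      removed: m.removedNodes,
      attribute: m.attributeName
    });
  });
});

observer.observe(target, {
  childList: true,      // nodes added or removed
  attributes: true,     // attribute changes
  characterData: true,  // text changes
  subtree: true         // watch descendants too
});

// Call observer.disconnect() when you are done.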

How to get content in a flash embedded on a web site?

When dealing with a simple web page, even if the page doesn't allow selecting text or right-clicking, a browser with tools like inspect element is enough to get the content you want.
But is it possible to get the content of embedded flash? For example, in a flash online game there is a text paragraph; how can I select and copy that text? Would JavaScript be useful, or is it simply not possible?
Thanks!
Edit:
For example, I'm playing an online card game and chatting with my opponent, and I'd like to save our chat somewhere. But I can't even select the text, so it's impossible to copy and save it. Hope that clears it up a bit.
Depending on how the text is embedded in the SWF, you may be able to select it as you would normal text. If it's embedded as an image, you'll need to use OCR tools.
You can also try to decompile the SWF, using a program such as Sothink's SWF Decompiler

Take a screenshot of a certain website at a given moment

I want to take a screenshot from my website of another website, or preferably one part of it (an object tag).
I want it to work this way: I click a button that sends a request to screenshot the page at that moment. A couple of ideas are to insert the page inside a flash object and screenshot it, or to open a browser on the server and, when I click the button, send an AJAX request telling the server to take the screenshot with that browser.
How should I do this? I'm kind of failing right now with the flash idea. The page I am trying to screenshot is a live camera that uses a .wvx object, but I can't even do it with a .swf object.
Thanks!
You can try PhantomJS. They have several examples of rendering page output, and it supports plugins like flash too.
render() always renders the entire page. To just render one <object> tag, I guess you can do one of:
create a page that only contains that <object> tag.
use javascript to remove everything else.
crop the final screenshot based on <object> page coordinates.
To use phantomjs from php, try php-PhantomjsRunner.
Edit 1: In case you only want to render a flash file that does not actually rely on the web page it is in, you can try Gnash according to the blog post "Server-side PNG rendering of SWF images using Gnash" by Valentine Bichkovsky.
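
Here is a rough PhantomJS sketch of the cropping option, restricting the render to the <object> tag's coordinates via page.clipRect (the URL, selector and delay are placeholders):

// Save as shoot.js and run with: phantomjs shoot.js
var page = require('webpage').create();

page.open('http://example.com/camera-page', function (status) {
  if (status !== 'success') {
    console.log('Failed to load the page');
    phantom.exit(1);
    return;
  }

  // Give the page's plugins/JS a moment to render before measuring.
  window.setTimeout(function () {
    // Measure the <object> tag inside the page context.
    var rect = page.evaluate(function () {
      var el = document.querySelector('object');
      if (!el) { return null; }
      var r = el.getBoundingClientRect();
      return { top: r.top, left: r.left, width: r.width, height: r.height };
    });

    if (rect) {
      page.clipRect = rect;   // crop the render to just that rectangle
    }

    page.render('screenshot.png');
    phantom.exit();
  }, 2000);
});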

Getting the final/timed render of a URL

I am looking for a way to, given a URL, get the source of a webpage back after the JavaScript has been run on it. For example:
I have a webpage with a <div>.
On loading the page, some JavaScript populates the div.
Viewing the source of the page through a browser will not give the information which is within the div.
As far as I know, in order for the browser to render the page the div must have been filled with (X|D)HTML which would mean that the source of the page after being rendered is still just nested markup, so theoretically there should be a "final" version of the page source.
I have considered using a rendering engine like WebKit or Gecko and somehow adapting it to do this; however, this is a fairly large task and I don't really want to duplicate something which has already been done. Does anyone know of a way of performing this task?
Regards.
Update: I am aiming to use Selenium (as mentioned in the comments to the accepted answer) to do this automatically for several pages. My project is a web spider which by design needs to target a number of pages in which the content I am aiming to reach is not available until after the JavaScript has populated everything.
Firefox add-ons such as the WebDev toolbar or Firebug have options like 'View generated source'.
As far as timing goes, just about the only option you have is a snippet of JavaScript code. You could record a start time as early as possible during page load and check again when the page has completed (either at DOM-ready or when the page is completely downloaded). It's going to be highly variable, however, and if you are trying to time it in order to improve speed (which is good to know, and to do), just getting Firebug + YSlow would be far more useful.
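
A minimal sketch of that timing snippet (just an illustration of the idea; the Navigation Timing API in newer browsers is more precise):

// Record a start time as early as possible, ideally in the first
// script inside <head>.
var start = Date.now();

document.addEventListener('DOMContentLoaded', function () {
  console.log('DOM ready after ' + (Date.now() - start) + ' ms');
});

window.addEventListener('load', function () {
  console.log('Page fully loaded after ' + (Date.now() - start) + ' ms');
});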
Within Firefox you can get the final rendered DIV by waiting for the browser to finish rendering, then pressing Ctrl-A to select all content on the page and finally selecting "Show selection source" from the right-click menu.
This shows you the manipulated/populated DOM-code of the page.
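
For the Selenium route mentioned in the question's update, here is a minimal sketch using the Node selenium-webdriver bindings (the browser choice and the '#content' readiness check are assumptions):

var webdriver = require('selenium-webdriver');

var driver = new webdriver.Builder().forBrowser('firefox').build();

driver.get('http://example.com/')
  .then(function () {
    // Wait until the JS-populated div has content (selector is hypothetical).
    return driver.wait(function () {
      return driver.executeScript(
        "var el = document.querySelector('#content');" +
        "return !!(el && el.innerHTML.length);"
      );
    }, 10000);
  })
  .then(function () {
    // Serialize the DOM after the scripts have run.
    return driver.getPageSource();
  })
  .then(function (renderedSource) {
    console.log(renderedSource);
  })
  .then(function () {
    return driver.quit();
  });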
