Need to refresh a single IE7 web tab from the command line - javascript

I need to create a batch file that will stop a process and then refresh a defined tab in Internet Explorer 7. I just need some help/pointers on the tab refresh part, or whether it's even possible... I don't want IE to open another tab, and another browser is not an option as the web-based program is only compatible with IE. Any ideas? I've experimented with a VBS file with no luck, and since it's a web-based program I cannot add JavaScript to the page...

I know you said you tried VBScript, but it really is the most suitable solution for what you're trying to achieve. See "Hey, Scripting Guy! Blog: How Can I Tell if Any Internet Explorer Windows Are Open to a Particular Web Site?"
See the code:
Set objShell = CreateObject("Shell.Application")
Set objShellWindows = objShell.Windows  ' all open shell windows, including IE tabs

For i = 0 To objShellWindows.Count - 1
    Set objIE = objShellWindows.Item(i)
    strURL = objIE.LocationURL
    If InStr(strURL, "http://www.microsoft.com/technet/scriptcenter") Then
        blnFound = True
    End If
Next
Each iteration returns an instance of an open Internet Explorer window's WebBrowser control; in IE7, each tab appears as its own entry in the collection. Instead of setting blnFound = True, call objIE.Refresh2().

You could frame the site, then refresh it from the outer frame with JavaScript on a timer. This may or may not suit your needs.

This is doable, but it's a little tricky and it requires a constraint: the tab you want to refresh has to have been opened by a JavaScript call to window.open, and it has to have a name. Let's call that name foo. Then you simply need to load another web page in that same browser session to execute the following JavaScript:
window.open('http://other.site.url/etc', 'foo');
This means you need to know both the name of the window and the target URL. But it's certainly doable.
Doing this from a batch file requires some scripting. In VBScript the code would be something like:
Dim browser
Set browser = CreateObject("SHDocVw.InternetExplorer")
browser.visible = True
browser.navigate("http://mysite.org/refresh.html")
Where refresh.html is the page containing the above JavaScript followed by a call to window.close().
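For concreteness, a minimal sketch of what refresh.html might contain, reusing the name foo and the placeholder URL from above (note that IE may prompt before letting a script close a window it considers not script-opened):
<html>
<body>
<script type="text/javascript">
// Re-target the already-open tab named 'foo' at the URL,
// which reloads it, then close this helper window.
window.open('http://other.site.url/etc', 'foo');
window.close();
</script>
</body>
</html>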

Assuming you have control over the web page too...
I'm surprised no one brought up the age-old meta refresh.
Rather than do some goofy iframe/JavaScript magic, or some crazy IE COM object mambo, you could always write a meta refresh tag into your <head> given that a certain querystring is passed (or all the time, I don't know what your needs are).
Again, I'm not sure this suits your needs, but it is quick and pretty clean.
Put this in your <head> and it will refresh the page once every 60 seconds:
<meta http-equiv="refresh" content="60">
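If you only want this behaviour when a certain querystring is present, a small script-based variant (the autorefresh parameter name here is just an illustration):
<script type="text/javascript">
// Reload every 60 seconds, but only when the page was requested
// with a hypothetical ?autorefresh=1 querystring.
if (window.location.search.indexOf('autorefresh=1') !== -1) {
    setTimeout(function () {
        window.location.reload();
    }, 60000);
}
</script>
Since the querystring survives the reload, the page keeps refreshing until it is loaded without the parameter.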


Scrape sub-elements in tables which cannot be found in the HTML but only in Chrome > F12 > Elements

I have tried to scrape the scoring/event times and player names from http://en.gooooal.com/soccer/analysis/8401/events_840182.html, but it doesn't work.
require(RCurl)
require(XML)
# Parse the events page straight from the URL
lnk <- "http://en.gooooal.com/soccer/analysis/8401/events_840182.html"
doc <- htmlTreeParse(lnk, useInternalNodes = TRUE)
# Grab every table cell -- this comes back empty because the cells
# are filled in by JavaScript after the page loads
x <- unlist(xpathApply(doc, "//table/tr/td"))
The normal HTML page doesn't show the details of the table contents. The nodes can only be seen via:
>>> open Chrome >>> press F12 >>> Elements panel
Can someone help? Thanks a lot.
If you reload the page while the Chrome developer tools are active, you can see that the real data is fetched via XHR from http://en.gooooal.com/soccer/analysis/8401/goal_840182.js?GmFEjC8MND. This URL contains the event id 840182, which you can scrape from the page. The part after the ? seems to be just a way to circumvent browser caching, and 8401, again, seems to be just the first digits of the id.
So, you can load the original page, construct the second URL, and get real data from there.
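As a sketch of that construction (shown in JavaScript for brevity; the same string manipulation works in R):
// Hypothetical: derive the data URL from a scraped event id.
var eventId = '840182';
var dataUrl = 'http://en.gooooal.com/soccer/analysis/' +
    eventId.substring(0, 4) + '/goal_' + eventId + '.js';
// -> http://en.gooooal.com/soccer/analysis/8401/goal_840182.js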
Anyway... in most cases it's a morally questionable practice to scrape data from web sites. I hope you know what you're doing :)
It sounds as if the content is inserted asynchronously using JavaScript, so using RCurl won't help you there.
You'll need a headless browser which can actually parse and execute JavaScript (if you know Ruby you could start looking at the cucumber-selenium-chromedriver combo), or maybe just use your browser with Greasemonkey/Tampermonkey to mimic a real user browsing the site and scrape the scores that way.
The contents are probably generated (by JavaScript, e.g. from an AJAX call) after the (HTML) page loads. You can check that by loading the page in Chrome with JavaScript disabled.
I don't think you can instruct RCurl to execute JavaScript...

javascript failing with permission denied error message

I have a classic ASP web page that used to work... but the network guys have made a lot of changes, including moving the app to a Windows 2008 server running IIS 7.5. We also upgraded to IE 9.
I'm getting a Permission denied error message when I try to click on the following link:
<a href=javascript:window.parent.ElementContent('SearchCriteria','OBJECT=321402.EV806','cmboSearchType','D',false)>
But other links like the following one work just fine:
<a href="javascript:ElementContent('SearchCriteria','OBJECT=321402.EV806', 'cmboSearchType','D',false)">
The difference is that the link that is failing is in an iframe. I noticed in other posts that it makes a difference whether or not the iframe content comes from another domain.
In my case, it doesn't. But I am getting data from another server by doing the following...
set objhttp = Server.CreateObject("winhttp.winhttprequest.5.1")
objhttp.open "get", strURL
objhttp.send
and then I change the actual HTML that I get back... add some hyperlinks etc. Then I save it to a file on my local server (saved as *.html files).
Then, when my page is loading, I look for the specific HTML file and load it into the iframe.
I know some group policy options in IE have changed... and I'm looking into those changes. But the fact that one JavaScript link works makes me wonder whether the problem lies somewhere else?
Any suggestions would be appreciated.
Thanks.
You could try with Msxml2.ServerXMLHTTP instead of WinHttp.WinHttpRequest.
See "Differences between Msxml2.ServerXMLHTTP and WinHttp.WinHttpRequest?" for the difference between the two.
This excellent site about ASP has plenty of code samples on how to use Msxml2.ServerXMLHTTP, which is the more recent of the two:
http://classicasp.aspfaq.com/general/how-do-i-read-the-contents-of-a-remote-web-page.html
About the IE9 issue: connect a PC with an older IE or another browser to test whether the browser is the culprit. Also, in IE9 (or better, in Firefox with Firebug) use the development tools (F12) and watch the console for errors while the contents of the iframe load.
Your method of getting dynamic pages is not efficient, I'm afraid; ASP itself can do that, and you could use e.g. a div instead of an iframe and replace its contents with what you get from the request. I would need to see more code to give better advice.
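As a rough sketch of the div idea (the /processed.html URL and the "content" id are hypothetical; any same-origin endpoint serving the rewritten HTML would do, and the script must run after the div exists):
<script type="text/javascript">
// Fetch the server-processed HTML and inject it into a div,
// avoiding the iframe and its cross-frame security checks.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/processed.html", true);
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        document.getElementById("content").innerHTML = xhr.responseText;
    }
};
xhr.send();
</script>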

Checking if a website doesn't permit iframe embed

I am writing a simple lightbox-like plugin for my app, and I need to embed an iframe that is linked to an arbitrary page. The problem is, many web sites (for example Facebook, nytimes, and even Stack Overflow) check to see if they are being embedded within a frame and, if so, refresh the page with themselves as the parent page. This is a known issue, and I don't think there's anything that can be done about it. However, I would like the ability to know beforehand if a site supports being embedded or not. If it doesn't, I'd like to open the page in a new tab/window instead of using an iframe.
Is there a trick that allows me to check this in JavaScript?
Maybe there is a server-side script that can check links to see if they permit an iframe embed?
I am developing a browser extension, so there is an opportunity to do something very creative. My extension is loaded on every page, so I'm thinking there's a way to pass a parameter in the iframe url that can be picked up by the extension if it destroys the iframe. Then I can add the domain to a list of sites that don't support iframe embed. This may work since extensions aren't loaded within iframes. I will work on this, but in the meantime....
Clarification:
I am willing to accept that there's no way to "bust" the "frame buster," i.e. I know that I can't display a page in an iframe that doesn't want to be in one. But I'd like for my app to fail gracefully, which means opening the link in a new window if iframe embed is not supported. Ideally, I'd like to check iframe embed support at runtime (javascript), but I can see a potential server-side solution using a proxy like suggested in the comments above. Hopefully, I can build a database of sites that don't allow iframe embed.
Check the X-Frame-Options header using the following code (PHP):
$url = "http://stackoverflow.com";
$header = get_headers($url, 1);
// Note: header-name casing is preserved exactly as the server sent it
echo $header["X-Frame-Options"];
If the returned value is DENY, SAMEORIGIN, or ALLOW-FROM, then you can't use an iframe with that URL.
Probably pretty late, but what you need to do is make a request, likely from your server, and look for the X-Frame-Options header. If it's there at all you can just open a new tab, because if it is present it is one of the following: DENY, SAMEORIGIN, ALLOW-FROM. In any of these cases it's likely that you don't have access to open the page in an iframe.
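A sketch of the same check in JavaScript (assuming Node 18+ and its built-in fetch; note that a page can also block framing via a Content-Security-Policy frame-ancestors directive, which this does not cover):
// Issue a HEAD request and inspect the X-Frame-Options header.
async function allowsFraming(url) {
    const res = await fetch(url, { method: "HEAD" });
    const xfo = res.headers.get("x-frame-options");
    return xfo === null; // no header -> framing is (probably) allowed
}

allowsFraming("http://stackoverflow.com").then(function (ok) {
    console.log(ok ? "open in iframe" : "open in new tab");
});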
This subject has been discussed forever on the web with a particularly interesting (failed) attempt here:
Frame Buster Buster ... buster code needed
The bottom line is that even if you are able to construct a proxy that parses the contents of the page that you want in your iframe and removes the offending code before it is served to the iframe you may still come under "cease and desist" from the site if they get to hear about you doing it.
If you don't want your development to be widely available, you could probably get away with it. If you want your development to become popular, forget about it, and build a less underhand way of dealing with it.
Or develop it for mobile only... ;)
UPDATE: OK, following on from your comment, here's a bit of a taster:
In JavaScript, capture the click on the link:
$("a").click(function(e){
preventDefault(e); // make sure the click doesn't happen
// call a server side script using ajax and pass the URL this.href
// return either a true or false; true = iframe breakout
// set the target attribute of the link to "_blank" for new window (if true)
// set the target attribute of the link to "yourframename" for iframe (if false)
// only now load the page in the new window or iframe
});
Server side, in PHP:
$d = file_get_contents($url); // $url is the URL you sent from the browser
// now parse $d to find .top, .parent etc. in the <head></head> block
// return true or false

Take Screenshot of Browser via JavaScript (or something else)

For support reasons I want users to be able to take a screenshot of the current browser window as easily as possible and send it over to the server.
Any (crazy) ideas?
That would appear to be a pretty big security hole in JavaScript if you could do this. Imagine a malicious user injecting that code into your site with an XSS attack and then screenshotting all of your daily work. Imagine that happening with your online banking...
However, it is possible to do this sort of thing outside of JavaScript. I developed a Swing application that used screen capture code like this which did a great job of sending an email to the helpdesk with an attached screenshot whenever the user encountered a RuntimeException.
I suppose you could experiment with a signed Java applet (shock! horror! noooooo!) that hung around in the corner. If executed with the appropriate security privileges given at installation it might be coerced into executing that kind of screenshot code.
For convenience, here is the code from the site I linked to:
import java.awt.Dimension;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.image.BufferedImage;
import javax.imageio.ImageIO;
import java.io.File;
...
public void captureScreen(String fileName) throws Exception {
    // Capture the full primary screen and write it out as a PNG
    Dimension screenSize = Toolkit.getDefaultToolkit().getScreenSize();
    Rectangle screenRectangle = new Rectangle(screenSize);
    Robot robot = new Robot();
    BufferedImage image = robot.createScreenCapture(screenRectangle);
    ImageIO.write(image, "png", new File(fileName));
}
...
Please see the answer shared here for a relatively successful implementation of this:
https://stackoverflow.com/a/6678156/291640
Utilizing:
https://github.com/niklasvh/html2canvas
You could try to render the whole page into a canvas and save this image back to the server. Have fun :)
A webpage can't do this (or at least, I would be very surprised if it could, in any browser) but a Firefox extension can. See https://developer.mozilla.org/en/Drawing_Graphics_with_Canvas#Rendering_Web_Content_Into_A_Canvas -- when that page says "Chrome privileges" that means an extension can do it, but a web page can't.
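For a flavour of what that chrome-privileged code looks like (a sketch only; drawWindow is Gecko-specific and unavailable to ordinary web pages):
// Runs only with chrome privileges, e.g. inside a Firefox extension.
var canvas = document.createElement("canvas");
canvas.width = window.innerWidth;
canvas.height = window.innerHeight;
var ctx = canvas.getContext("2d");
// Paint the live web content into the canvas on a white background.
ctx.drawWindow(window, 0, 0, canvas.width, canvas.height, "rgb(255,255,255)");
var pngDataUrl = canvas.toDataURL("image/png"); // ready to upload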
Seems to me that support needs (at least) the answers to two questions:
What does the screen look like? and
Why does it look that way?
A screenshot -- a visual -- is very necessary and answers the first question, but it can't answer the second.
As a first attempt, I'd try to send the entire page up to support. The support tech could display that page in his browser (answers the first question); and could also see the current state of the customer's html (helps to answer the second question).
I'd try to send as much of the page as is available to the client JS by way of AJAX or as the payload of a form. I'd also send info not on the page: anything that affects the state of the page, like cookies or session IDs or whatever.
The cust might have a submit-like button to start the process.
I think that would work. Let's see: it needs some CGI somewhere on the server that catches the incoming user page and makes it available to support, maybe by writing a disk file. Then the support person can load (or have loaded automatically) that same page. All the other info (cookies and so on) can be put into the page that support sees.
PLUS: the client JS that handles the submit button's onclick() could also include any useful JS variable values!
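A minimal sketch of that capture-and-send step (the /support/capture endpoint is hypothetical):
// Serialize the page as the user currently sees it and POST it,
// along with cookies and the URL, to a support endpoint.
function sendPageToSupport() {
    var payload = "html=" + encodeURIComponent(document.documentElement.outerHTML) +
        "&cookies=" + encodeURIComponent(document.cookie) +
        "&url=" + encodeURIComponent(window.location.href);
    var xhr = new XMLHttpRequest();
    xhr.open("POST", "/support/capture", true);
    xhr.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    xhr.send(payload);
}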
Hey, this can work! I'm getting psyched :-)
HTH
-- pete
I've seen people do this with one of two approaches:
set up a separate server for screenshotting and run a bunch of Firefox instances on there; check out these two gems if you're doing it in Ruby: selenium-webdriver and headless
use a hosted solution like http://url2png.com (way easier)
You can also do this with the Fireshot plugin. I use the following code (that I extracted from the API code so I don't need to include the API JS) to make a direct call to the Fireshot object:
var element = document.createElement("FireShotDataElement");
element.setAttribute("Entire", true);
element.setAttribute("Action", 1);
element.setAttribute("Key", "");
element.setAttribute("BASE64Content", "");
element.setAttribute("Data", "C:/Users/jagilber/Downloads/whatev.jpg");
if (typeof(CapturedFrameId) != "undefined")
    element.setAttribute("CapturedFrameId", CapturedFrameId);
document.documentElement.appendChild(element);
var evt = document.createEvent("Events");
evt.initEvent("capturePageEvt", true, false);
element.dispatchEvent(evt);
Note: I don't know if this functionality is only available for the paid version or not.
Perhaps http://html2canvas.hertzen.com/ could be used. Then you can capture the display and then process it.
You might try PhantomJS, a headless browsing toolkit.
http://phantomjs.org/
The following JavaScript example demonstrates basic screenshot functionality:
var page = require('webpage').create();
page.settings.userAgent = 'UltimateBrowser/100';
page.viewportSize = { width: 1200, height: 1200 };
page.clipRect = { top: 0, left: 0, width: 1200, height: 1200 };
page.open('https://google.com/', function () {
    page.render('output.png');
    phantom.exit();
});
I understand this post is 5 years old, but for the sake of future visits I'll add my own solution here which I think solves the original post's question without any third-party libraries apart from jQuery.
var pageClone = $('html').clone();
// Make sure that CSS and images load correctly when opening this clone
pageClone.find('head').append("<base href='" + location.href + "' />");
// OPTIONAL: Remove potentially interfering scripts so the page is totally static
pageClone.find('script').remove();
var htmlString = pageClone.html();
You could remove other parts of the DOM you think are unnecessary, such as the support form if it is in a modal window. Or you could choose not to remove scripts if you prefer to maintain some interaction with dynamic controls.
Send that string to the server, either in a hidden field or by AJAX, and then on the server side just attach the whole lot as an HTML file to the support email.
The benefits of this are that you'll get not just a screenshot but the entire scrollable page in its current form, plus you can even inspect and debug the DOM.
Print Screen? Old school and a couple of keypresses, but it works!
This may not work for you, but on IE you can use the snapsie plugin. It doesn't seem to be in development anymore, but the last release is available from the linked site.
I think you need an ActiveX control; without one I can't imagine how. You could force the user to install it first; after installation, the client-side ActiveX control should work and you can capture.
We are temporarily collecting Ajax states, data in form fields and session information. Then we re-render it at the support desk. Since we test and integrate for all browsers, there are hardly any support cases for display reasons.
Have a look at the red button at the bottom of holidaycheck.
Alternatively there is html2canvas. But it is only applicable to newer browsers, and I've never tried it.
In JavaScript? No. I do work for a security company (sort of NetNanny type stuff) and the only effective way we've found to do screen captures of the user is with a hidden application.

How can I monitor the rendering time in a browser?

I work on an internal corporate system that has a web front-end using Tomcat.
How can I monitor the rendering time of specific pages in a browser (IE6)?
I would like to be able to record the results in a log file (separate log file or the Tomcat access log).
EDIT: Ideally, I need to monitor the rendering on the clients accessing the pages.
The Navigation Timing API is available in modern browsers (IE9+) except Safari:
function onLoad() {
    var now = new Date().getTime();
    var page_load_time = now - performance.timing.navigationStart;
    console.log("User-perceived page loading time: " + page_load_time);
}
In case a browser has JavaScript enabled one of the things you could do is to write an inline script and send it first thing in your HTML. The script would do two things:
Record current system time in a JS variable (if you're lucky the time could roughly correspond to the page rendering start time).
Attach JS function to the page onLoad event. This function will then query the current system time once again, subtract the start time from step 1 and send it to the server along with the page location (or some unique ID you could insert into the inline script dynamically on your server).
<script language="JavaScript">
var renderStart = new Date().getTime();
window.onload = function() {
    var elapsed = new Date().getTime() - renderStart;
    // send the info to the server
    alert('Rendered in ' + elapsed + 'ms');
};
</script>
... usual HTML starts here ...
You'd need to make sure that the page doesn’t override onload later in the code, but adds to the event handlers list instead.
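One way to do that (a sketch reusing renderStart from the snippet above; attachEvent covers IE8 and earlier, addEventListener everything else):
// Register a load handler without clobbering window.onload.
function addLoadHandler(fn) {
    if (window.addEventListener) {
        window.addEventListener('load', fn, false);
    } else if (window.attachEvent) { // IE8 and earlier
        window.attachEvent('onload', fn);
    }
}

addLoadHandler(function () {
    var elapsed = new Date().getTime() - renderStart;
    // send the info to the server here
});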
As far as non-invasive techniques are concerned, Hammerhead measures complete load time (including JavaScript execution), albeit in Firefox only.
I've seen usable results when a JavaScript snippet could be added globally to measure the start and end of each page load operation.
Have a look at Selenium - they offer a remote control that can automatically start different browsers (e.g. IE6), load pages, test for specific content on the page. At the end reports are generated that also show the rendering times.
Since others are posting answers that use other browsers, I guess I will too. Chrome has a very detailed profiling system that breaks down the rendering time of the page and shows the time it took for each step along the way.
As for IE, you might want to consider writing a plugin. There seems to be few tools like this on the market. Maybe you could sell it.
On Firefox you can use Firebug to monitor load time. With the YSlow plugin you can even get recommendations how to improve the performance.
