I would like to make a small application to check special offers on some web site. This application should access this site periodically (once every few hours), parse the HTML to find the offer and notify me about the offer somehow.
I would like to develop it in JavaScript as a Chrome extension. Do you know about any examples of such an extension I can learn from?
Chrome extensions have all the features you're after:
Request permission to the website you want to fetch.
Make a background page with a setInterval that makes an ajax request to the website and checks the contents.
Use notifications to notify the user of an update. After notifying, store the contents locally so you know when the live contents have been updated again. A minimal sketch combining these steps is shown below.
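Roughly, those three steps could look like this (a hypothetical background-page sketch; it assumes a persistent background page, the "notifications" permission and a host permission for the offer site in the manifest, and the URL and parsing regex are placeholders for your own site's markup):

```javascript
// Hypothetical sketch: periodically fetch the offers page, compare against the
// last seen offer, and notify when it changes. URL/regex are placeholders.
var OFFER_URL = 'http://example.com/offers';

function parseOffer(html) {
  // Placeholder: extract the offer text from the fetched HTML however suits the site.
  var match = html.match(/<div class="offer">([^<]+)<\/div>/);
  return match ? match[1] : null;
}

function checkOffers() {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', OFFER_URL, true);
  xhr.onload = function () {
    var offer = parseOffer(xhr.responseText);
    var lastSeen = localStorage.getItem('lastOffer'); // what we already notified about
    if (offer && offer !== lastSeen) {
      localStorage.setItem('lastOffer', offer);
      chrome.notifications.create('offer', {
        type: 'basic',
        iconUrl: 'icon.png',
        title: 'New offer',
        message: offer
      }, function () {});
    }
  };
  xhr.send();
}

checkOffers();
setInterval(checkOffers, 3 * 60 * 60 * 1000); // re-check every three hours
```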
Instead of making an extension, would it not be better to just subscribe to a particular website's RSS feed? You can download a nifty extension called RSS Live Links to give you updates, and you can subscribe to their offers feed.
When you access a website, the browser downloads all the required files (static files such as CSS and scripts) or fetches them via AJAX. You can watch this download process in real time using the Network tab in your browser's devtools.
My question is: is it possible to "listen" to a file being downloaded using JavaScript, the way the browser does in the Network tab?
A concrete example would be to show the user what the browser is currently downloading from my website.
While searching the Internet, I've seen that it's possible to override the native XHR functions: Add a "hook" to all AJAX requests on a page
Nevertheless, I don't think image and CSS downloads will trigger the XHR functions, because the browser handles those in its own way.
I'm keen to hear what the community thinks about it.
Thanks in advance!
If you are explicitly downloading resources in your JavaScript code, you can inject hooks to track the AJAX requests, as per your message above. You can alternatively use the Resource Timing API to track network timing information of your requests, which is nice.
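For instance, a small illustrative snippet using the Resource Timing API (the PerformanceObserver form is the newer way of doing it; the logging is purely for demonstration):

```javascript
// Sketch: log resources this page loads, using the Resource Timing API.
var observer = new PerformanceObserver(function (list) {
  list.getEntries().forEach(function (entry) {
    console.log(entry.name,                      // resource URL
                entry.initiatorType,             // "img", "css", "xmlhttprequest", ...
                Math.round(entry.duration) + ' ms');
  });
});
observer.observe({ entryTypes: ['resource'] });

// Or, as a one-off snapshot of everything fetched so far:
performance.getEntriesByType('resource').forEach(function (entry) {
  console.log(entry.name, entry.initiatorType);
});
```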
However, it is not possible to see the Network information of resources out of your control, as it requires access to the browser engine.
It is possible to get such information using a Chrome Extension, as an API exists that opens you up to this information. See chrome.devtools.network.
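For example, a devtools-page script in such an extension could look roughly like this (a sketch only; it assumes a devtools_page is declared in the extension manifest):

```javascript
// Runs in the extension's devtools page; fires once per finished request.
chrome.devtools.network.onRequestFinished.addListener(function (request) {
  // "request" is a HAR entry: request/response metadata plus timings.
  console.log(request.request.url,
              request.response.status,
              request.time + ' ms');
});
```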
I'm creating a personal home page, due to the fact that iGoogle will be discontinued. One of the things I'm trying to create, is a speed dial-type interface, with website thumbnails as links, and I'd like to automate this process.
I attempted screenshot automation a few years back with Linux and the WebKit engine, and that works fine. But my problem is that I want the screenshots to come from my browser, i.e. my Gmail inbox, not the login page I'd get if attempting a remote screenshot.
I thought of using html2canvas, but again, I'd have to load the source of the webpages remotely using a proxy, and that's not what I want. Another attempt of mine was to load the website in an iframe, extract the source, and pass it on to html2canvas. Unfortunately most websites like Google, Facebook, etc. don't allow embedding their pages in iframes, so I'm still stuck.
How do plugins like FoxTab and SpeedDial take their screenshots from within the browser, without popups etc.? They do it "browser side", silently. Is it possible to duplicate this using just JavaScript? Or is there a way I could accomplish the same another way, perhaps with a custom addon or something?
Have you considered using a service like http://webthumbnail.org/ ?
http://phantomjs.org/ is also a great tool for that if you want to do it yourself.
Take a look at PhantomJS. We use it to take screenshots of all our hosted sites periodically. PhantomJS is a headless WebKit implementation.
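A minimal capture script looks roughly like this (the URL and output filename are placeholders, not our actual setup); run it with `phantomjs screenshot.js`:

```javascript
// screenshot.js -- render a page and save it as a PNG with PhantomJS.
var page = require('webpage').create();
page.viewportSize = { width: 1280, height: 800 };

page.open('http://example.com/', function (status) {
  if (status !== 'success') {
    console.log('Failed to load page');
    phantom.exit(1);
  }
  page.render('thumbnail.png'); // writes a PNG of the rendered page
  phantom.exit();
});
```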
I'm working on a web app which uses Backbone's HTML5 History option. In order to avoid having to code everything on the client and on the server, I'm using this method to route every request to index.html
I was wondering if there is a way to get Twitter Cards to work with this setup, as currently Twitter can't read the page because everything is loaded dynamically with JavaScript.
I was thinking about using User Agents to detect whether it's the TwitterBot, and if it is, serving a static version of the page with the required meta-tags. Would this work?
Thanks.
Yes.
At one job we did this for all the SEO/search/Facebook stuff, etc.
We would sniff the user agent, and if it was one of the following crawlers:
Facebook Open Graph
Google
Bing
Twitter
Yandex
(a few others I can't remember)
we would redirect to a special page that was written to dump all the relevant data about the page for SEO purposes into a nicely formatted (but completely unstyled) page.
This allowed us to retain our Google index position and proper Facebook sharing even though our site was a total single-page app in Backbone.
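In Node/Express terms the idea looks roughly like this (a hypothetical sketch, not our actual code; the bot regex and renderSeoPage are placeholders you'd fill in for your own routes):

```javascript
// Hypothetical Express sketch: crawlers get a pre-rendered, unstyled page with
// the meta tags and content; everyone else gets the single-page app shell.
var express = require('express');
var app = express();

var BOTS = /facebookexternalhit|googlebot|bingbot|twitterbot|yandex/i;

function renderSeoPage(path) {
  // Placeholder: look up the title, description and og:/twitter: tags for this
  // route and return plain HTML containing them plus the main content.
  return '<html><head><title>Offer page</title></head><body>...</body></html>';
}

app.use(function (req, res, next) {
  var ua = req.headers['user-agent'] || '';
  if (BOTS.test(ua)) {
    return res.send(renderSeoPage(req.path)); // crawler: serve the static version
  }
  next(); // real visitor: fall through to the Backbone app
});

app.use(express.static(__dirname + '/public'));
app.get('*', function (req, res) {
  res.sendFile(__dirname + '/public/index.html'); // HTML5 pushState catch-all
});

app.listen(3000);
```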
Yes, serving a specific page to Twitterbot with the right metadata markup will work.
You can test your results while developing using the Cards preview tool.
https://dev.twitter.com/docs/cards/preview (with your static URL or just the tags).
I have seen this post:
Authenticate a facebook user in a Firefox plug-in
and in the third comment someone said that it isn't possible to load the Facebook SDK into a Firefox extension. But why?
The JavaScript SDK provided by Facebook relies on a script from connect.facebook.com being inserted into a web page. However, when you are an extension you don't have a web page around to load this script into - you have extension pages. These extension pages are privileged, so loading the script into them would give that script permission to do things like reading files on the user's disk drive (or simply formatting it). Doing that with a script from some remote server is a pretty big security risk, even if Facebook is considered a trusted site - its servers could get hacked, or the traffic might be intercepted and modified. An attacker could then essentially take over the user's computer.
Getting an unprivileged context for the Facebook SDK is theoretically possible. Practically, however, this is complicated enough that I doubt anybody has done it (it's further complicated by the fact that App IDs are bound to a specific host name).
I have seen this excellent Firefox extension, Screengrab!. It takes a "picture" of the web page and copies it to the clipboard or saves it to a PNG file. I need to do the same, but with a new web page, from a URL I have in JavaScript. I can open the web page in a new window, but then I need to invoke the extension programmatically (rather than pressing its control) and have it save the page once the page is fully loaded.
Is it possible?
I am pretty certain that it is not possible to access any Firefox add-on through web page content. This could create privacy and/or security issues within the Firefox browser (as the user has never given you permission to access such content on their machine). For this reason, I believe Firefox add-ons run in an entirely different JavaScript context, thereby making this entirely impossible.
However, as Dmitriy's answer states, there are server-side workarounds that can be performed.
It does not look like Screengrab has any JavaScript API.
There is a PHP solution for Saving Web Page as Image.
If you need to do it from JavaScript (from client side) - you can:
Step 1: Create a PHP server app that does the trick (see the link), and that accepts a JSONP call.
Step 2: Create a client-side page (JavaScript) that sends a JSONP request to that PHP script. See my answer here; it will help you create such a request. A rough sketch of the client side follows.
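Something along these lines (the endpoint URL, the url/callback parameter names, and the shape of the response are assumptions about how you build the PHP side):

```javascript
// Sketch: JSONP call to a hypothetical screenshot.php endpoint that returns
// something like { image: "http://example.com/shots/abc.png" }.
function requestScreenshot(pageUrl, callback) {
  var cbName = 'onShot_' + Date.now();
  var script = document.createElement('script');

  window[cbName] = function (data) {
    delete window[cbName];
    script.parentNode.removeChild(script);
    callback(data);
  };

  script.src = 'http://example.com/screenshot.php' +
               '?url=' + encodeURIComponent(pageUrl) +
               '&callback=' + cbName;
  document.head.appendChild(script);
}

// Usage: request a shot and show it on the page.
requestScreenshot('http://example.com/', function (data) {
  var img = document.createElement('img');
  img.src = data.image;
  document.body.appendChild(img);
});
```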