I want to create an app which is able to inject a JavaScript snippet into a web page when the user clicks an item in the share menu (alternative methods of triggering the app to do this may also be OK).
I've found that adding an item into the share menu is pretty easy using the ACTION_SEND intent.
However, I've done a lot of googling and searching SO, but I can't seem to find any examples of people injecting scripts into an Android web browser, which is why I'm starting to doubt if this is possible.
Essentially, I have a bookmarklet which injects JavaScript into whatever web page you're looking at. I want to build an app which does exactly what the bookmarklet does. Any alternative suggestions for achieving this would also be interesting to hear.
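For context, the bookmarklet is roughly the standard script-tag injection pattern (the URL below is just a placeholder, not the real script):

javascript:(function () {
  var s = document.createElement('script');
  s.src = 'https://example.com/my-script.js'; // placeholder for the real script URL
  document.body.appendChild(s);
})();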
Related
I am new to the world of programming, so I am trying to make a simple login script that will log me in to target.com when run! Eventually I will want to add other things to it, but I can't seem to figure out the best way to do this. Any tips would be appreciated!
A tool like Puppeteer or Selenium can be used to automate browser tasks.
The basic idea is to use the tool to navigate to the website URL, then, once on the page, get references to the DOM elements you want to interact with. After that you can use the tool to simulate events like clicking, typing, etc.
However, logging in to sites like Target with a script can be tricky because they guard against automated access. Common defenses include CAPTCHAs and inspection of request headers.
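To make the idea concrete, here is a rough Puppeteer sketch; the URL, selectors and credentials are placeholders, and Target's real login form will likely use different element names plus the protections mentioned above:

const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch({ headless: false }); // visible browser for debugging
  const page = await browser.newPage();

  // Navigate to the login page (URL and selectors below are assumptions).
  await page.goto('https://www.target.com/account', { waitUntil: 'networkidle2' });

  // Grab the DOM elements and simulate typing and clicking.
  await page.type('#username', 'me@example.com');
  await page.type('#password', 'my-password');
  await page.click('button[type="submit"]');

  await page.waitForNavigation();
  await browser.close();
})();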
After looking around a bit I came to no conclusion about this matter: do Google and other search engines crawl pages that are only accessible through ng-click, without an anchor tag? Or does an anchor tag always need to be present for crawling to work successfully?
I have to build various elements which link to other pages in a generic way and ng-click is the best solution for me in terms of flexibility, but I suppose Google won't "click" those elements since they have no anchor tag.
Besides the obvious ui-sref on an anchor tag, I have thought about other solutions like:
<a ng-click="controller.changeToLink()">Link name</a>
Although I am not sure if this is good practice either.
Can someone please clarify this issue for me? Thanks.
Single-page applications are in general very SEO-unfriendly; ng-click not being followed is the least of the problems.
The application does not get rendered server side, so search engine crawlers have a hard time properly indexing the content.
According to this latest recommendation, the Google crawler can render and index most dynamic content.
The way it works is that the crawler waits for the JavaScript to kick in and render the application, and only indexes after the content has been injected into the page. This process is not 100% reliable, and until recently single-page applications could not compete with static pages for SEO.
Single-page apps are not SEO friendly, and this is the main reason why most sites only use them for things like their menu system, where the much better user experience compared to full page reloads matters more than indexability.
This is slowly changing, as Angular Universal, Ember Fast Boot and React are adding the possibility to render an SEO-friendly page on the server, but still have it take over as an SPA on the client side.
I think your best bet to improve your SEO is to submit a sitemap file to Google using their webmaster tools. This will let Google know about the pages that you trigger via ng-click.
Note that this only has a chance of working if you are using HTML5 mode for the router rather than hash-based URLs (URLs using #), as Google does not index the fragment part.
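For reference, HTML5 mode is a small config change in the Angular 1 router (it also needs a <base href="/"> tag and server-side rewrites so deep links resolve to index.html); a minimal sketch, with the module name assumed:

angular.module('myApp', ['ngRoute'])   // 'myApp' and ngRoute are assumptions
  .config(function ($locationProvider) {
    // Use real URL paths instead of #-fragments so crawlers can index the routes.
    $locationProvider.html5Mode(true);
  });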
In general it's very hard to get good SEO for an Angular 1 app, and that's why it's mostly not used for public indexable content. The sweet spot of AngularJS is building the private "dashboard" section of your app that users can access after logging in.
Try using prerender.io to prerender these AngularJS pages, filter out bot requests, and serve the prerendered pages from the page cache.
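If the app happens to be served through Node/Express, prerender.io's prerender-node middleware handles the bot detection for you; a minimal sketch (the token is a placeholder):

const express = require('express');
const prerender = require('prerender-node');

const app = express();
// Crawler requests get prerendered HTML; normal users get the regular SPA.
app.use(prerender.set('prerenderToken', 'YOUR_PRERENDER_TOKEN')); // placeholder token
app.use(express.static('public'));
app.listen(3000);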
I want to add a bit of extra HTML to an existing site based on a REST API call response.
Specifically, www.arbookfind.com lets you search for kids' school books with an "AR" test. (My son has to read a certain number of books at a level.) It has a link to amazon.com if you find a book you want to buy. However, I would like to know whether a book is available for Kindle (most are not). Right now I have to click the Amazon link, check the page, go back and try the next one - it can take 10 tries to find one available on the Kindle. Painful!
I was after ideas for the easiest way to do this. That is, without touching the arbookfind.com web site, can I add some JavaScript (jQuery) to all the returned HTML pages? The JavaScript would look in the returned page for each book, fire off an Amazon ItemSearch query (?) to see if it is available on Kindle, then inject an HTML link to the Kindle book on Amazon. I can learn how to write the JavaScript - I am just after some pointers for the easiest way to augment the current site.
That way I can use the current arbookfind.com site to find a book, but it is faster for me to identify which books are available on Kindle without manually trying each link by hand.
E.g. a web browser plugin that runs some JavaScript on each returned page? A Varnish proxy with some smart logic to fiddle with pages on the way through? A PHP app acting as a proxy server? Thanks!
Maybe you want something like the Chrome extension Tampermonkey.
It allows you to add and manage user scripts for websites - that is, JavaScript "snippets" which are added to websites matching specific patterns.
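A skeleton for this case could look like the sketch below; the @match pattern, the '.book-result' selector and the Kindle lookup are all assumptions that would need to be adapted to the real arbookfind.com markup and whatever Amazon API you end up calling:

// ==UserScript==
// @name         AR BookFind Kindle checker
// @match        https://www.arbookfind.com/*
// @grant        GM_xmlhttpRequest
// ==/UserScript==

(function () {
  'use strict';

  // '.book-result' is an assumed selector for each book row in the results page.
  document.querySelectorAll('.book-result').forEach(function (result) {
    var title = result.querySelector('a').textContent.trim();
    checkKindleAvailability(title, function (kindleUrl) {
      if (!kindleUrl) return; // not available on Kindle
      var link = document.createElement('a');
      link.href = kindleUrl;
      link.textContent = ' Kindle edition';
      result.appendChild(link);
    });
  });

  // Placeholder: query whatever Amazon lookup you choose (ItemSearch / PA-API)
  // via GM_xmlhttpRequest and call back with the Kindle URL, or null if none.
  function checkKindleAvailability(title, callback) {
    callback(null);
  }
})();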
Hi I am relatively new to this topic so I have no idea if this is possible.
What I want to do is create a widget which could be attached to any web page out there dynamically. This widget has nothing to do with any web page in particular, but once the widget is created, all visitors of those web pages should be able to see it (is this possible?)
I don't know where to start... should this service be a browser plugin (add-on), or is there a way to dynamically manipulate someone else's DOM?
Any thoughts, help etc would be a great help!
Thanks.
If I have understood your question correctly, you want a script that is injected into every webpage the user visits and displays a widget, correct?
You could create an add-on, although you would have to create a separate add-on for each browser you plan to support, and they can sometimes be a bit more complicated than they have to be for something as simple as script injection.
A better alternative is to create a user script, which is basically a JavaScript file that runs whenever a visitor opens a website matching a pattern you specify (for instance, all websites they visit). Firefox supports user scripts through the Greasemonkey extension, and Opera and Google Chrome have built-in support.
If you want to learn how to make your own user scripts, you can check out the Greasemonkey wiki or study some existing scripts.
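A minimal user script that injects a widget into every page the visitor opens might look like this (the widget content and styling are just an example):

// ==UserScript==
// @name     My widget
// @include  *
// ==/UserScript==

(function () {
  var widget = document.createElement('div');
  widget.textContent = 'Hello from my widget';
  widget.style.cssText = 'position:fixed; bottom:10px; right:10px; padding:8px; background:#eee; z-index:99999;';
  document.body.appendChild(widget);
})();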
I'm connecting to a web site, logging in.
The website redirects me to new pages and Mechanize deals with all the cookie and redirection jobs, but I can't get to the last page. I used Firebug, did the same job again, and saw that there are two more pages I have to pass through with Mechanize.
I took a quick look at the pages and saw that there is some JavaScript and HTML code, but I couldn't understand it because it doesn't look like normal page code. What are those pages for? How can they redirect to other pages? What should I do to get past them?
If you need to handle pages with Javascript, try WATIR or Selenium - those drive a real web browser, and can thus handle any Javascript. WATIR Classic requires either IE or Firefox with a certain extension installed, and you will see the pages flash on the screen as it works.
Your other option would be understanding what the Javascript on the offending page does and bypassing it manually, but that seems onerous.
At present, Mechanize doesn't handle JavaScript. There's talk of eventually merging Johnson's capabilities into Mechanize, but until that happens, you have two options:
Figure out the JavaScript well enough to understand how to traverse those pages.
Automate an actual browser that does understand JavaScript using Watir.
What are those pages for? How can they redirect to other pages? What should I do to get past them?
Sometimes work is done on those pages. Sometimes the JavaScript is there to prevent automated access like what you're trying to do :). A lot of websites have unnecessary checks to make sure you have a "good" browser, so make sure that your user_agent is set to something common, like IE. Sometimes setting the user_agent to look like an old browser will let you get past without JavaScript.
Website automation is fun because you have to outsmart the website and its software developers, using multiple strategies. Like the others said, Watir is the best tool for getting past JavaScript at the moment.