Specifically, what I'm trying to do is create a mobile version of a site I don't have access to. The best approach I can think of is this:
My site executes their PHP search file and then displays the results page, but first modifies its DOM to use my CSS. Is this technically possible?
Your site can definitely access web content from another site, filter/transform it however it wants, and then forward the result wherever it wants. It is potentially not a simple problem, though, as so much web content is dynamic. For example, if the source site has content that's formatted with CSS that's dynamically built by JavaScript, it'd be fairly difficult to come up with an automated transformation.
Whether the original site's owners will be happy about your site doing that is a separate issue.
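To make the idea concrete, here is a minimal sketch of such a fetch-and-transform proxy. I've written it for Node.js as an assumption; the same thing can be done in PHP with file_get_contents and string/DOM manipulation. The upstream URL and the stylesheet path are placeholders.

const http = require('http');

http.createServer(async (req, res) => {
  // Pass the visitor's query string through to the other site's search script (placeholder URL).
  const query = req.url.includes('?') ? req.url.slice(req.url.indexOf('?')) : '';
  const upstream = await fetch('https://example.com/search.php' + query); // Node 18+ global fetch
  let html = await upstream.text();

  // Naive transform: swap their stylesheet links for your mobile stylesheet.
  // Pages whose styling is built dynamically by JavaScript will need much more than this.
  html = html.replace(/<link[^>]+rel=["']stylesheet["'][^>]*>/gi,
                      '<link rel="stylesheet" href="/mobile.css">');

  res.writeHead(200, { 'Content-Type': 'text/html' });
  res.end(html);
}).listen(3000);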
First of all, I have a background in C (++), Java, MATLAB and Python, mainly used for scientific and electronic applications (math operations on data, reading data from sensors, microcontrollers).
But I'm relatively new to both HTML (CSS) and JavaScript.
For both I've read some books. In HTML books, multiple pages are done with links (<a></a>).
In JavaScript (which feels a lot more natural to me than HTML), I've seen some examples where there is only one HTML page, full of divs, which are shown and hidden each time a certain page needs to be shown.
This is done with the jQuery commands $('#div1').hide() and $('#div2').show();
Now my question is, what is the best practice? When is it better to have multiple HTML pages, and when is it better to have just hide/show divs with Javascript?
Thanks
Not everyone can use JavaScript, and not every computer or browser has the basics of Java installed, but every computer and browser can read HTML.
To identify if a visitor is using Java:
How to check whether Java plugins are installed or not in a browser using Code .?
Java is mostly installed nowadays along with the browser's basic functions, but older Navigator or IE browsers don't always have it installed by default.
More info here also:
How can I detect the Java runtime installed on a client from an ASP .NET website?
The easiest is using <ul><li> CSS navigation themes. Check this site out for more info:
https://medialoot.com/blog/how-to-create-a-responsive-navigation-menu-using-only-css/
When you have multiple HTML pages and the user clicks on links, then on each click a new web page has to be fetched from the server and then rendered.
Whereas when you do it in JavaScript, the same web page is altered, so there are no additional requests to the server. And this will be much faster than loading a new web page.
But remember, the initial loading time of the second approach is longer, though usually negligible.
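As an illustration of the second (single-page) approach, here is a minimal jQuery sketch; the class name, ids and data attribute are made up:

// One HTML document containing several <div class="page" id="..."> sections.
$('a[data-page]').on('click', function (e) {
  e.preventDefault();                    // don't fetch a new document from the server
  $('.page').hide();                     // hide every "page" div
  $('#' + $(this).data('page')).show();  // show only the one this link points to
});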
Let me point out that there is no "best practice" to the question that you are asking. It is entirely up to the team if they want to push all the content in one page or keep them separately.
If in case you have a content that requires decent amount of images to be loaded, or contents that you are sure will rarely be seen, you might want to keep them in separate pages so as to make the page load faster.
If you have heavy content which requires a lot of interaction with JavaScript/jQuery, then you certainly might want to keep it in separate pages so that later, when you want to debug or add to the code, it can be easily done.
The converse of the above holds true as well.
If in case you just have small content, or simple text content then you can easily do it in a single page.
Maybe you should use a tab component? Bootstrap wraps one very nicely:
http://getbootstrap.com/javascript/
Maybe that's the best approach; also take a look at AngularJS routes specifically, they should do whatever you are looking for.
In order to see a live preview of another website that users link to on my website, I'm using iframes.
However, this is probably not the best solution, as a website is loaded directly into mine, with every JavaScript element, etc., that is on the linked page.
My question: how dangerous is it to do such a thing? What is the worst case scenario that could happen? Could a linked site just by using JavaScript (or other technologies) do any serious harm to my site or my user's data?
And then, the second part of my question is, of course, about the website preview.
All I found so far are scripts that use more than one PHP and JS file in order to load a website preview picture.
Isn't there an easier way to do this? What do you suggest?
how dangerous is it to do such a thing?
Some websites do not like to be embedded using frames. Such websites can possibly take over the full browser by ensuring they are loaded in the topmost window. Aside from that, as long as your website and the website you are loading aren't from the same domain, it won't be able to access your cookies, DOM, etc. So it's pretty safe in that respect.
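If you want an extra layer of protection, one option (an addition on my part, not something the answer above relies on) is the HTML5 iframe sandbox attribute, which lets you strip the embedded page of scripts and of the ability to navigate your top window. A rough sketch; linkedUrl and the container id are hypothetical:

// Build the preview iframe and lock it down.
var frame = document.createElement('iframe');
frame.src = linkedUrl; // the URL the user linked to
// An empty sandbox applies all restrictions (no scripts, no plugins, no form submission,
// no top-window navigation); add back only the capabilities you actually need.
frame.setAttribute('sandbox', '');
document.getElementById('preview-container').appendChild(frame);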
about the website preview
There aren't many foolproof mechanisms other than generating the preview image server side, as I believe the scripts you've seen do.
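For what it's worth, here is a minimal server-side sketch using headless Chrome via Puppeteer. That is my choice of tool, not necessarily what the scripts you found use; the URL and output filename are placeholders.

const puppeteer = require('puppeteer');

// Capture a screenshot of a page to serve as the preview image.
async function capturePreview(url, outFile) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.setViewport({ width: 1024, height: 768 });
  await page.goto(url, { waitUntil: 'networkidle2' });
  await page.screenshot({ path: outFile });
  await browser.close();
}

capturePreview('https://example.com', 'preview.png');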
If I decided to use some JavaScript in my website like
$('#body').load(URL);
or
$.get(URL, {param:value}, function(){ ... });
or
window.title = 'TEXT';
Is it good for SEO? Or am I recommended to use pure PHP for data on the page for SEO purposes?
The question of whether JavaScript is good for SEO or not is missing the point. We should pretty much assume that any content which is only available via JavaScript will not be crawled by the search engines. Google at least claims to be able to crawl some JavaScript-only content but is fairly tight-lipped about what exactly it can crawl. Other search engines probably don't crawl it, and it's certainly the case that not all do. So assume it doesn't get crawled.
That doesn't mean it's bad for SEO.
If the content would contribute to your SEO, then serving it only via JavaScript is bad for SEO. If the content is neutral to SEO, then it's neutral for SEO. So the answer to your question really depends on the nature of your content. If the content is part of your SEO campaign, then stick with server-side HTML generation, be it PHP or some other method. Otherwise the question of SEO has no bearing on the decision to use JavaScript or not. Accessibility would be another thing to take into account: JavaScript-only content is terrible for that.
The larger search engines can and do render limited amounts of JavaScript. However, for SEO purposes your best bet is rendering the content via HTML rather than JavaScript. A good rule of thumb is to utilize HTML for content and for expressing basic content structure (e.g. paragraph-type text = p, lists = ul/ol, headings = h1/h2/h3, etc.), CSS for presentation, and JS for client-side programming. With that being said, always ensure a good user experience first. If you can do the above while providing a great user experience, great! If you can't, users come first. It's likely you can keep both users and bots happy 95% of the time if you take the time to do so.
Further reading (sorry, I can only post one link as a new user):
Matt Cutts Interview (Check out #26 on Google Javascript Rendering)
A spider's view of Web 2.0
I think first you should consider what SEO means. It means "Search Engine Optimization" ... how does a search engine get data in the first place for it to be optimized?
It does a GET on the page and whatever data is returned in the GET is processed. No JS engine. No POST data. So you should be optimizing for whatever data is returned on a GET.
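You can see what a crawler gets by fetching the raw markup yourself; anything your scripts add afterwards simply isn't in it. A tiny sketch (placeholder URL, works in a modern browser or Node 18+):

// The response body of a plain GET is all a non-JS crawler has to work with.
fetch('https://example.com/page')
  .then(res => res.text())
  .then(html => console.log(html)); // DOM changes made later by your JavaScript never show up here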
Additionally, you tagged this with PHP, but the question has nothing to do with PHP.
Have you seen any of the questions on this list?
https://stackoverflow.com/search?q=javascript+seo
No sir, Google does not parse Flash and JavaScript properly, so it may not crawl areas built with JavaScript or Flash content. I suggest you keep your website simple, but if it is necessary to have Flash/JavaScript content then you should keep a text-based backup.
The first thing you should be asking is not what is good for SEO, but what is good for users. For users, loading data with JavaScript will give them an interactive page, where they can start seeing the page immediately while it is still loading and where the page can update without having to reload it.
From Google's Webmaster Guidelines and article on Cloaking, you should not assume that crawlers can understand JavaScript. This does not mean that you should not use JavaScript on your website, but rather that you should provide the textual equivalent in noscript tags, for use both by users with JavaScript disabled and by crawlers, bearing in mind that the content of these noscript tags should be roughly equivalent to what is shown with JavaScript enabled; showing different content to users and to search engines is called "cloaking" and is frowned upon, to say the least.
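A minimal sketch of that noscript pattern (the id and the content are made up):

<div id="articles"></div>
<script>
  // Visitors with JavaScript get the dynamic version rendered into #articles here.
</script>
<noscript>
  <!-- Crawlers and users without JavaScript get roughly the same content as plain HTML. -->
  <ul>
    <li><a href="/articles/first-post.html">First post</a></li>
    <li><a href="/articles/second-post.html">Second post</a></li>
  </ul>
</noscript>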
Google doesn't (yet) execute a page's JavaScript (JS). So if your JS replaces or creates content on a page, then that content would normally be invisible to the crawlers (not good).
But the Googlers have implemented a URL hack that enables your server to create pages (from the server, not from JS) with all the different variants of your JS page's content.
This solves the SEO problem of Ajax-powered pages. At least for Google searches...
See Crawlable Ajax.
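A rough sketch of how that scheme worked (it has since been deprecated by Google): a URL like http://example.com/#!page=about is requested by the crawler as http://example.com/?_escaped_fragment_=page=about, and your server returns a pre-rendered HTML snapshot of that state. The snapshot and app-shell strings below are trivial stand-ins.

const http = require('http');
const { parse } = require('url');

// Stand-ins; a real site would render actual page content here.
const appShellHtml = '<!doctype html><div id="app"></div><script src="/app.js"></script>';
function renderSnapshot(state) {
  return '<!doctype html><h1>Snapshot of state: ' + state + '</h1>';
}

http.createServer((req, res) => {
  const query = parse(req.url, true).query;
  res.writeHead(200, { 'Content-Type': 'text/html' });
  if (query._escaped_fragment_ !== undefined) {
    // The crawler rewrote #!... into ?_escaped_fragment_=... and wants the static snapshot.
    res.end(renderSnapshot(query._escaped_fragment_));
  } else {
    res.end(appShellHtml); // normal visitors get the JavaScript-driven page
  }
}).listen(3000);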
JavaScript, or any script for that matter, should never be used to house your site's content, ever! The entire web is driven by HTML and CSS, and in rare cases XML languages; everything else is a headache when it comes to SEO. Ask yourself this question: what exactly is SEO, and what is it that search engines are indexing? JavaScript and other programming/scripting languages are not content formats standardized by the W3C the way HTML, CSS, and XML are, which makes them essentially worthless when it comes to indexing content. It's OK to use scripts to add additional functionality to your pages, embed apps like social networking plugins, and so on, but you should never use them to hold your website's HTML, CSS, or actual content, for any reason. Here's a link to a good article that will explain why you should be using HTML and CSS, and not a million scripts: optimizing webpages using proper html markup.
Scripts cause other problems besides code that is hard for search engines to decipher. For one, they are harder for browsers to process, causing pages to load much slower than "static" pages made with HTML and CSS. Pages made with PHP tend to create "dynamic" URLs that users and search engines cannot read. This is why Google recommends that people who use JSP or PHP for their webpages include a sitemap; otherwise your links may never be found and might as well not exist.
Stick to the conventions! Let's face it, we have standards for a reason. If every electronic component in your home had a different type of plug that required a special socket, and all those devices had differing voltage and amperage requirements, what would happen? You would essentially burn down your house! And you'd be spending five hours a day at the hardware store looking for special adapters to fit your wall sockets.
If you plan on designing a website, use scripts only for embedding apps or connecting to a database, and use HTML and CSS to build "static" webpages. Also, use text links, as they are both human- and search-engine-readable, and easy to index and make sense of. Never use scripts for your links. Programming and scripting can be fun, but not on the internet.
Search engines index HTML, CSS, and content (multimedia, graphics, videos, text; that's it!); everything else is pointless and annoying to both users and search engines alike. For best results use XML and design a custom language.
Google can crawl, index and rank JavaScript-generated content.
But... it uses an old Chrome version (42) with an old JavaScript rendering engine.
The consequence is that your JavaScript code needs to work in older browsers and older Chrome versions (older than 42). So no fancy ES6 functions; you need to use polyfills or Babel, for example.
Although you can do a lot with JavaScript (like click events or injecting your mobile menu), it's still recommended to use an a-href link instead of a button with a JavaScript event handler that then navigates to a new page.
You can check the mobile testing tool from Google: https://search.google.com/test/mobile-friendly and check the errors/warnings/logs. If the rendered output looks like intended, Google will see your content.
In search console you can also ask to index the page. Sometimes the javascript crawler is first, sometimes the 'classic' crawler.
Double-check it a few days afterwards by googling a sentence or paragraph from your page.
There's no answer about if it's better or not. Content is content and Google should rank your website, SPA, PWA, AMP site, PDF document, online Doc, wikipage, and so on based on their content, not on the underlying technique.
If you are familiar with JavaScript, give it a go.
Regards, Peter
I recently implemented a fix to create separate landing pages depending on whether or not the user has JavaScript enabled. Basically, the way it works is this.
The default page is an HTML page with no JavaScript, a basic version of the site. Upon landing on it, there is a script that says if JavaScript is enabled then go to another page. That landing page is generated by sending the user request through a JSP file that renders the page (header, footer, etc.). The final landing page is http://whatever.com/home.jsp if the user has JavaScript enabled.
My question is whether this will hurt SEO. Considering 99% of the world has JavaScript enabled, I would hate to compromise any SEO benefit to accommodate the 1% who don't enable JavaScript.
Hope that makes sense.
In general, searchbots should be treated as browsers with JS disabled. I think you can now imagine where they'll land.
This whole question is, by the way, completely unrelated to JSP. JSP is just a server-side view technology which provides a template to write HTML/CSS/JS in and provides capabilities to control the page flow dynamically with taglibs and to access backend data with EL. All that web browsers and bots see (and thus all that counts for SEO) is its generated HTML output.
http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Short version: if your JS sends them to entirely different content, it's probably bad, and Google may give you a hard time. Other than that, you should be good.
If the alternative version is an (almost) full-featured, full-content version, then it's perfectly OK.
Google even advises making alternatives for Flash-only sites, for example, with regard to usability.
Read the Google FAQ.
You touch on two topics, one described as "cloaking", the other as "duplicate content". With cloaking, you present different (optimized-with-bad-intent) content based upon identification of the client that accesses it, e.g. by inspecting the User-Agent header (Googlebot versus browser). You are not doing this; you just want to present content in a way that suits your client best, like a redirect to a page optimized for mobile clients ("m.example.com").
The other thing is how to avoid duplicate content. There's a way by indicating the original content source with a canonical tag, see here: http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
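For reference, the canonical hint is a single link element in the head of each duplicate or alternate page, pointing at the preferred URL (the address here is a placeholder):

<link rel="canonical" href="https://www.example.com/original-page/">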
There have been many debates about this topic already here, but none of them fully answered my question so I figured I would pose it and hope I get one or two decent answers.
We're planning on relaunching our company website in the next few months. Our current site, for the most part, is text-driven, and because of this we rank very well on Google, Yahoo, and Bing for our primary keywords. We want to increase the "wow factor" of the site a bit (we're an interactive agency) but still maintain a majority of our search engine footing. The option to use Flash, AJAX, and other technologies that are not considered to be search engine friendly has come up numerous times in our meetings, and each time we have to evaluate what kind of impact it would have on us from an SEO perspective.
Assuming a good portion of the site content will be encapsulated within a Flash (swf) file, what would be the best course of action for maintaining current rankings? I've read numerous times that Google indexes Flash files but I am unsure as to what extent. Further, is there a method of telling Google not to index a Flash file (through a variable or otherwise)?
Finally, I had an idea that seemed sound in theory and wanted to put it out into the world and see what type of feedback I receive on it:
Again, assuming the whole page is in a Flash file living on index.html, would it be possible to build out the site as normal (set up a logical directory structure, add content to static pages within said structure, etc), specify paths to those static pages in a Google XML Sitemap file, and have the spiders crawl only those pages (which are rich in content) while the user experiences some concoction of Flash/Javascript/AJAX/etc? If this works, what would be the pros/cons of this solution? Thanks for bearing with me on this slightly off-kilter question.
Referencing Google, I found that they have made impressive strides in indexing Flash-based web pages. The only limitation I found from reading the article is that they are currently still limited in these three areas:
Googlebot does not execute some types of JavaScript. So if your web page loads a Flash file via JavaScript, Google may not be aware of that Flash file, in which case it will not be indexed.
We currently do not attach content from external resources that are loaded by your Flash files. If your Flash file loads an HTML file, an XML file, another SWF file, etc., Google will separately index that resource, but it will not yet be considered to be part of the content in your Flash file.
While we are able to index Flash in almost all of the languages found on the web, currently there are difficulties with Flash content written in bidirectional languages. Until this is fixed, we will be unable to index Hebrew language or Arabic language content from Flash files.
By the sounds of it you won't have any problems with any of the 3 'problems'. Based on this document Flash sounds like a viable option for you.
Adobe has been working on their end as well to make SWFs more search-engine friendly. So with the combined efforts of both Adobe and Google/Yahoo, if you take a dip in ranking, within a year or two the search algorithms will be better than they are even now.
As far as not indexing goes, you should be able to add a simple
User-agent: *
Disallow: /directory/
Disallow: /directory/page.html
to your robots.txt file.
Andrew,
I've had to deal with this sort of thing a few times and I'd recommend maintaining both a Flash site (for users) and an HTML site (for search engines). Here's how you do it:
With whatever server-side stuff you're using, set up some kind of switch that determines whether a particular request is for HTML or for whatever your Flash movie consumes (XML, JSON, another SWF, whatever). Every page on your site should be able to return HTML and whatever you choose to feed your Flash movie. A query string parameter like "requestType=Flash" will work just fine.
Put all of the content in your HTML pages in a div tag and make the div invisible with CSS. Use SWFObject to check if the requesting browser supports Flash and, if it does, have SWFObject replace your HTML content with your Flash movie. Search engine spiders will ignore your scripts and simply crawl your HTML pages and if you'd like to show the HTML to users with browsers that don't support Flash (like mobile browsers), just make the HTML content visible after SWFObject has determined that the browser doesn't support Flash.
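A minimal sketch of the SWFObject part of that setup; the paths, element id, dimensions and Flash version are placeholders:

// #content holds the crawlable HTML and is hidden with CSS.
// If Flash 9+ is available, SWFObject replaces it with the movie; otherwise reveal the HTML.
swfobject.embedSWF('/swf/site.swf', 'content', '960', '600', '9.0.0');

if (!swfobject.hasFlashPlayerVersion('9.0.0')) {
  document.getElementById('content').style.display = 'block';
}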
Once your Flash movie has loaded, have it request whatever data it needs from the server using the same URL of the page that it was loaded on, but with the addition of the switch variable above.
Handle navigation from that point on with SWFAddress. When a user clicks a button to request a new page, pass the request through SWFAddress first, which will update the browser history using the hash mark trick, and then have your Flash movie make its request to the server.
I'm currently working on a site for a friend that uses this technique here (I should note, to protect my pride, that the site is still very much a work in progress):
http://www.casabarbuenosaires.com/
A browser request to any page on the site will first return the HTML representation of that page (you can view source in your browser to see that). SWFObject then replaces the HTML content with a Flash movie that loads a custom XML description of the same page which the Flash movie then constructs and displays.
I've worked on sites in the past that have used this technique and gotten excellent search engine results. Since you don't need to worry too much about what your HTML site looks like to humans, you can focus solely on what it looks like to search engines.
Another added benefit of building your site this way is that you are compelled to separate your site's content/copy from its visual representation. Throwing your entire site into a single SWF is generally NOT a good way to do that. It's much easier to maintain (or re-skin or scrap) a site when your content isn't all mixed up with your code.
Hope this helps,
Scott