jQuery or any JavaScript for an obligatory attribution link

The idea is to:
have a script which embeds an attribution link into the website,
then this script should be integrated into the main script that makes the website work,
then this global script should be encrypted so that visitors cannot read it...
or
have just an embedded attribution link like this:
I use a Blogger template for one of my blogs (http://prohungarica.blogspot.com) where the template author has integrated his attribution (you can see it at the very bottom) and I cannot find out how he did it... I would like the same thing for my own templates.
How might I do this?
I am interested in solutions other than jQuery or JavaScript too!
EDIT:
Since the first part of my question turns out to be too difficult for my knowledge (and some people did not even understand it), the second part of my question remains:
How do I embed an attribution link in my own template so that users cannot take it off, just like the example in my blog (see above)? Maybe with jQuery or some other JavaScript?

HOW TO SHOW AN ATTRIBUTION LINK THAT NOBODY CAN TAKE OFF
The web is not a good place to achieve such a thing. An explicit design goal of HTML is to make life easier for the user (visitor), even if doing so makes life harder for the author.
have a script which embeds an attribution link into the website
(That's the easy part, which you already know how to do.)
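For reference, a minimal sketch of that easy part could look like the following; the selector, link text and URL are placeholders for your own values:
// Append an attribution link to the end of the page once the DOM is ready.
$(document).ready(function () {
  $('<a>', {
    href: 'http://example.com/',          // hypothetical author URL
    text: 'Template by Example Author'    // hypothetical attribution text
  }).appendTo('body');
});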
then this script should be integrated into the main script that makes the website work
This tends to be hard, because the web browser is usually what makes most of a website "working." Anything you write as an author rather than an implementor is well isolated from the browser. I think the most you could do here is come up with a new layer of abstraction between the browser and the content. Otherwise it would be too easy to isolate the attribution logic. This is no small endeavor.
then this global script should be encrypted so that visitors cannot read it
You probably mean "obfuscated." Encryption would prevent the visitor's browser from running the script, unless you also handed over the keys, which would make it pointless. Obfuscation keeps the code runnable, but makes it harder (though not impossible) for humans to understand. That said, it's inappropriate to give product recommendations on Stack Overflow, but I hope this clarification helps you find what you want.
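To illustrate the difference, here is a hand-made example (not the output of any particular tool): both snippets do the same thing, but the second has been renamed and compacted so that it is much harder to read.
// Readable version.
function addAttribution() {
  var link = document.createElement('a');
  link.href = 'http://example.com/';               // hypothetical author URL
  link.textContent = 'Template by Example Author'; // hypothetical attribution text
  document.body.appendChild(link);
}

// Hand-obfuscated equivalent: same behaviour, harder to follow.
function _0a(){var b=document.createElement('a');b.href='http://example.com/';b.textContent='Template by Example Author';document.body.appendChild(b);}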

Related

Will Google search results reflect website text that I replace with a script? [duplicate]

I am just wondering if Google or other search engines execute JavaScript on your web page. For example, if you set the title tag using JavaScript, does the Google search engine see that?
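For concreteness, "setting the title tag using JavaScript" here means something like the following (an illustrative snippet; the text values and the element id are placeholders):
document.title = 'My dynamically set page title';  // replaces the text of the <title> element
document.getElementById('headline').textContent = 'A dynamically generated headline';  // hypothetical element id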
There have been some experiments performed for SEO purposes which indicate that at least the big players (Google, for example) can and do follow some simple JavaScript. They avoid sneaky redirects and such, but some basic content manipulation does seem to get through. (I don't have a link handy for Google themselves confirming or denying this, it's just various posts I've come across when dealing with this before.)
However, this is generally considered unreliable. If SEO is being done for any important purpose, don't rely on the spiders indexing much dynamic content.
There's actually a very good (in my opinion, anyway) answer here to a very similar question. What I like about that answer is how it breaks down the steps for generating good, indexable, and best of all maintainable web pages with concerns properly separated. Adhering as much as possible to this process generally results in good SEO, good accessibility, and good design skills in general.
Yes, Google executes Javascript. How much is a moving target.
Google executed some Javascript as early as 2011: http://searchengineland.com/google-can-now-execute-ajax-javascript-for-indexing-99518
This article from 2012 documents some experiments on what Javascript Google did and did not run at the time: http://moz.com/ugc/can-google-really-access-content-in-javascript-really
In May 2014, Google said publicly that they execute Javascript: http://googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html Although that post says that Google has been getting better, there are no publicly available details on what Javascript Google does and does not execute -- but presumably they are at least as good at it as they were in 2012.
I'm pretty sure they don't. However, you can see for yourself: Google has a tool which will show you your page as it sees it, at http://www.google.com/webmasters/
If the text is in the on-page JavaScript, Google will see the text, but it will not be seen as the text of the title element.
But hey, this is quite easy to test. Just do it, wait two days, and then Google your site with site:... and look what's in the headline. If it's in there, then the answer is yes, Google sees it; if not, Google doesn't. It's easily testable.
(P.S.: my money is on no.)
We need to remember that JavaScript is a client-side language and always starts executing on the client side. If all of the titles or content are produced via JavaScript, then they are output client-side, and I doubt they will show up in Google search (whereas if they are output in the .html, then yes).
If I am correct, as of late meta tags are "fuel for search engines" and have ties to SEO; it is common for robots to be scripted to crawl your site.

Breach injection through shadow root

I'm not a JS programmer. I don't have enough skill to test it myself, so I am asking for help.
Is it possible to inject a script or an HTML tag into a shadow root element, to check for holes on the web?
For example <script>alert("alert");</script>
Maybe something else like by using <content>?
Main question: Is it possible or not?
And additional question: How?
According to what I could dig up in comments, you want to know if your users can inject code in your website pages. The answer is yes, the user has all the rights to play with the DOM in front of him. The easy way is by simply opening your favorite browser's developers' tools.
Do it yourself... Open your developer's tools window here, reach the console and write
document.write("<script>alert(\"alert\");</script>");
As you can see, you can change anything, even directly on Stack Overflow. But that causes absolutely no harm to others, just you. The fact that you can do that on SO doesn't mean that it is insecure at all! It is just that your browser has full control over what it received...
Now, the question should rather be: how can I detect possible points in my application where such injection can be harmful?
The answer is simple: never trust client input. The server should always validate the inputs and make sure there is no database injection possible. When displaying user-provided content, one should also make sure there is no hidden code tag that would be run by the browser of the users looking at the webpage.
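As a small illustration of "never trust client input", here is a minimal HTML-escaping helper of the kind you would run over user-provided text before writing it into a page (a sketch; in practice, rely on the escaping facilities of your framework or template engine):
// Replace the characters that would let user input break out into markup.
function escapeHtml(text) {
  return String(text)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}

// escapeHtml('<script>alert("alert");</script>')
// -> '&lt;script&gt;alert(&quot;alert&quot;);&lt;/script&gt;'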
StackOverflow is not suited for this kind of knowledge sharing. I suggest you read about website security in general and then find more in depth resources related to your technology stack and the usage you have with your users' input.
Also, if you are asking this for a real job task you have been given, the most important thing to do would be to tell your manager you are not fit for the task. Not because you lack the talent, but because you lack the knowledge. This shows that you are smart enough to see the task as very important (security IS very important) and that you are not willing to play with the company's reputation.
See workplace.stackexchange.com if you'd like to know how to best explain that to your superiors.

Javascript isn't good for SEO, is it?

If I decided to use some JavaScript in my website like
$('#body').load(URL);
or
$.get(URL, {param:value}, function(){ ... });
or
document.title = 'TEXT';
Is it good for SEO? Or am I recommended to use pure PHP for data on the page for SEO purposes?
The question of whether JavaScript is good for SEO or not is missing the point. We should pretty much assume that any content which is only available through JavaScript will not be crawled by the search engines. Google at least claims to be able to crawl some JavaScript-only content but is fairly tight-lipped about what exactly it can crawl. Other search engines probably don't crawl it, and it's certainly the case that not all do. So assume it doesn't get crawled.
That doesn't mean it's bad for SEO.
If the content would contribute to your SEO, then serving it only via JavaScript is bad for SEO. If the content is neutral to SEO, then it's neutral for SEO. So the answer to your question really depends on the nature of your content. If the content is part of your SEO campaign, then stick with server-side HTML generation, be it PHP or some other method. Otherwise the question of SEO has no bearing on the decision to use JavaScript or not. Accessibility would be another thing to take into account: JavaScript-only content is terrible for that.
The larger search engines can and do render limited amounts of JavaScript. However, for SEO purposes your best bet is rendering the content via HTML rather than JavaScript. A good rule of thumb is to use HTML for content and for expressing basic content structure (e.g. paragraph-type text = p, lists = ul/ol, headings = h1/h2/h3, etc.), CSS for presentation, and JS for client-side programming. With that being said, always ensure a good user experience first. If you can do the above while providing a great user experience, great! If you can't, users first. It's likely you can keep both users and bots happy 95% of the time if you take the time to do so.
Further reading (sorry, I can only post one link as a new user):
Matt Cutts Interview (Check out #26 on Google Javascript Rendering)
A spider's view of Web 2.0
EDIT Added that for "a new user" ;) ~ drachenstern
I think first you should consider what SEO means. It means "Search Engine Optimization" ... how does a search engine get data in the first place for it to be optimized?
It does a GET on the page and whatever data is returned in the GET is processed. No JS engine. No POST data. So you should be optimizing for whatever data is returned on a GET.
Additionally, you tagged this with PHP, but the question has nothing to do with PHP.
Have you seen any of the questions on this list?
https://stackoverflow.com/search?q=javascript+seo
No sir, Google does not parse Flash and JavaScript properly, so it may not crawl the areas that use JavaScript or Flash content. I suggest you keep your website simple, but if it is necessary to keep Flash/JavaScript content, then you should keep a text-based backup.
The first thing you should be asking is not what is good for SEO, but what is good for users. For users, loading data with JavaScript will give them an interactive page, where they can start seeing the page immediately while it is still loading and where the page can update without having to reload it.
From Google's Webmaster Guidelines and article on Cloaking, you should not assume that crawlers can understand JavaScript. This does not mean that you should not use JavaScript on your website, but rather that you should provide the textual equivalent in noscript tags, for use both by users with JavaScript disabled
as well as for crawlers, bearing in mind that the content of these noscript tags should be roughly equivalent to what was shown with JavaScript enabled; showing different content to users and to search engines is called "cloaking" and is frowned upon to say the least.
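For example, the kind of markup meant here looks roughly like this (the content is a placeholder; the noscript text should stay roughly equivalent to what the script produces):
<script>
  // Dynamic version of the content for visitors with JavaScript enabled.
  document.write('<p>Our product catalogue, rendered dynamically.</p>');
</script>
<noscript>
  <!-- Roughly equivalent content for crawlers and visitors without JavaScript. -->
  <p>Our product catalogue, rendered as plain HTML.</p>
</noscript>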
Google doesn't (yet) execute a page's Javascript (JS). So if your JS replaces/creates content on a page then the content would normally be invisible to the crawlers (not good).
But, the Googlers have implemented a url hack that enables your server to create pages (from the server, not from JS), with all the different varients of your JS page's content.
This solves the SEO problem of Ajax powered pages. At least for Google searches...
See Crawlable Ajax
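In short, that scheme (since deprecated by Google) mapped "hash-bang" URLs to a query-string form that your server could answer with pre-rendered HTML. A rough sketch of the mapping:
// A hash-bang URL such as       http://example.com/#!page=about
// was fetched by the crawler as http://example.com/?_escaped_fragment_=page=about
// and the server answered that form with static HTML.
function toEscapedFragmentUrl(hashBangUrl) {
  // The scheme also percent-escaped some characters in the fragment;
  // that detail is omitted in this sketch.
  var parts = hashBangUrl.split('#!');
  if (parts.length < 2) return hashBangUrl;  // not an AJAX-crawlable URL
  return parts[0] + '?_escaped_fragment_=' + parts[1];
}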
JavaScript, or any scripts for that matter, should never be used to house your site's content, ever! The entire web is driven by HTML and CSS, and in rare cases XML languages; everything else is a headache when it comes to SEO. Ask yourself this question: what exactly is SEO, and what is it that search engines are indexing? JavaScript and other programming/scripting languages are proprietary in the sense that they are NOT standards as defined by the W3C, which means they are essentially worthless when it comes to indexing content. On the other hand, HTML, CSS, and XML are real standards developed for the web! It's OK to use scripts to add additional functionality to your pages, embed apps like social networking plugins, and so on, but you should never use them to hold your website's HTML, CSS, or actual content, ever, for any reason. Here's a link to a good article that will explain why you should be using HTML and CSS, and not a million scripts: optimizing webpages using proper html markup.
Scripts cause other problems besides code that is hard for search engines to decipher. For one, they are harder for browsers to process, causing pages to load much slower than "static" pages made with HTML and CSS would. Pages made with PHP tend to create "dynamic" URLs that users and search engines cannot read. This is why Google recommends that people who use JSP or PHP for their webpages include a sitemap; otherwise your links will never be found and might as well not exist.
Stick to the conventions! Let's face it, we have standards for a reason. If every electronic component in your home had a different type of plug that required a special socket, and all those devices had differing voltage and amperage requirements, what would happen? You would essentially burn down your house! And you'd be spending 5 hours a day at the hardware store looking for those special adapters to fit your wall sockets.
If you plan on designing a website, use scripts for embedding apps or connecting with a database only, and use HTML and CSS to build "static" webpages. Also, use text links, as they are both human- and search-engine-readable, and easy to index and make sense of. Never use scripts for your links. Programming and scripting can be fun, but not on the internet.
Search engines index HTML, CSS, and content (multimedia, graphics, videos, text, that's it!); everything else is pointless and annoying to both users and search engines alike. For best results use XML and design a custom language.
Google can crawl, index and rank javascript generated content.
But... it uses an old Chrome version (42) with an old javascript render engine.
The consequence is that your JavaScript code needs to work in older browsers and old Chrome versions (around version 42). So no fancy ES6 features; you need to use polyfills or a transpiler such as Babel, for example.
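For example, code like the first snippet below would need to be rewritten (or transpiled) into something a Chrome 42-era render engine can run; the function itself is just an illustration:
// ES6: arrow function + template literal - too new for a very old render engine.
const greet = name => `Hello, ${name}`;

// ES5 equivalent that an old engine can execute.
var greetEs5 = function (name) {
  return 'Hello, ' + name;
};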
Although you can do a lot with javascript (like click events or inject your mobile menu), it's recommended to still use a-href instead of a button with a javascript event and then using a function to get to a new page.
You can check the mobile testing tool from Google: https://search.google.com/test/mobile-friendly and check the errors/warnings/logs. If the rendered output looks like intended, Google will see your content.
In Search Console you can also request indexing of the page. Sometimes the JavaScript crawler comes first, sometimes the 'classic' crawler.
Double-check it a few days afterwards by googling a sentence or paragraph from your page.
There's no answer about if it's better or not. Content is content and Google should rank your website, SPA, PWA, AMP site, PDF document, online Doc, wikipage, and so on based on their content, not on the underlying technique.
If you are familiar with JavaScript, give it a go.
Regards, Peter

Problems with noobs putting my GA code into their sites

I don't mean for the title to be derogatory, but this is a rather frustrating problem, and I'm looking for a good workaround, given a language barrier involved.
I have a site set up for a plugin I wrote, and, rather than use the site's resources to write their own code, I've had people simply rip the code from the samples on the site. Normally, this wouldn't be any issue at all, but they are also taking my Google Analytics instantiation, so my Analytics data is getting very skewed by incorporating visitation data from their websites.
I've been able to contact the English-speaking site owners with little issue. The problem lies in the Japanese language sites that are yanking the code. I have no idea how to ask them to take down the analytics portion.
Long-term, I'm providing a package that streamlines the learning-to-use process, but in the meantime, what can I do about this language barrier? Is there a way around this problem that I haven't thought of?
You could modify the javascript on your pages to only load Analytics if the domain matches your own.
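A rough sketch of that idea, based on the classic asynchronous ga.js snippet (the domain and account ID are placeholders):
// Only set up Google Analytics when the page is served from your own domain,
// so copies of the snippet pasted into other sites record nothing.
if (/(^|\.)example\.com$/.test(window.location.hostname)) {
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);
  _gaq.push(['_trackPageview']);
  (function () {
    var ga = document.createElement('script');
    ga.async = true;
    ga.src = ('https:' === document.location.protocol ? 'https://ssl' : 'http://www') +
             '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
}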
Two ideas:
Don't show your actual GA code on your site. Replace it with some filler code that makes it obvious it's meant to be replaced. Since I'm not sure what your plugin is about I'm not sure how practical this is, but I think there must be a way.
Use Google Translate to give foreign users the option to see your page translated into their own language. Google even offers a tool to add a "Choose your language" drop-down to any page. (And of course make sure the most important parts of your site are in plain, easily-translated English.)
Good luck!
Also, building on Greg's solution, if the domain doesn't match your own, you could alert a message telling the implementer to remove the code, a la what Crockford did with his JSON parser.
I would contact them in English (using plain language), and at the bottom I would add a version of the same text run through Google Translate - with an apology that it was generated by Google Translate and may not be accurate.

How does disqus work?

Does anyone know how disqus works?
It manages comments on a blog, but the comments are all held on third-party site. Seems like a neat use of cross-site communication.
The general pattern used is JSONP
It's actually implemented in a fairly sophisticated way (at least on the jQuery site)... they defer the loading of the disqus.js and thread.js files until the user scrolls to the comment section.
The thread.js file contains JSON content for the comments, which is rendered into the page after it's loaded.
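The JSONP pattern itself is simple: the page injects a script tag whose response calls a function the page has defined, which is how data can be fetched across domains without XMLHttpRequest. A bare-bones sketch (the URL and callback name are made up for illustration):
// The third-party endpoint wraps its JSON response in a call to the named
// callback, e.g.  handleComments({"comments": [...]});
function handleComments(data) {
  // render data.comments into the page here
}

var script = document.createElement('script');
script.src = 'http://example-comments.com/thread.js?callback=handleComments'; // hypothetical URL
document.getElementsByTagName('head')[0].appendChild(script);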
You have three options when adding Disqus commenting to a site:
Use one of the many integrated solutions (WordPress, Blogger, Tumblr, etc. are supported)
Use the universal JavaScript code
Write your own code to communicate with the Disqus API
The main advantage of the integrated solutions is that they're easy to set up. In the case of WordPress, for example, it's as easy as activating a plug-in.
Having the ability to communicate with the API directly is very useful, and offers two advantages over the other options. First, it gives you as the developer complete control over the markup. Secondly, you're able to process comments server-side, which may be preferable.
It looks like it is done using the easyXDM library, which uses the best available method for the current browser to communicate with the other site.
Quoting Anton Kovalyov's (former engineer at Disqus) answer to the same question on a different site that was really helpful to me:
Disqus is a third-party JavaScript application that runs in your browser and injects itself on publishers' websites. These publishers need to install a small snippet of JavaScript code that makes the first request to our servers and loads initial JavaScript loader. This loader then creates all necessary iframe elements, gets the data from our servers, renders templates and injects the result into some element on the page.
As you can probably guess there are quite a few different technologies supporting what seems like a simple operation. On the back-end you have to run and scale a gigantic web application that serves millions of requests (mostly read). We use Python, Django, PostgreSQL and Redis (for our realtime service).
On the front-end you have to minimize your payload, make sure your app is super fast and that it doesn't break in extremely hostile environments (you will be surprised how screwed up publisher websites can be). Cross-domain communication—ability to send messages from hosting website to your servers—can be tricky as well.
Unfortunately, it is impossible to explain how everything works in a comment on Quora, or even in an article. So if you're interested in the back-end side of Disqus just learn how to write, run and operate highly-scalable websites and you'll be golden. And if you're interested in the front-end side, Ben Vinegar and myself (both front-end engineers at Disqus) wrote a book on the topic called Third-party JavaScript (http://thirdpartyjs.com/).
I'm planning to read the book he mentioned, I guess it will be quite helpful.
Here's also a link to the official answer to this question on the Disqus site.
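For what it's worth, the publisher-side "small snippet of JavaScript code" mentioned in the quote above is usually just an asynchronous script loader along these lines (a generic sketch, not Disqus's actual embed code; the URL is made up):
// Fetch the third-party script without blocking the rest of the page.
(function () {
  var s = document.createElement('script');
  s.src = 'https://widgets.example.com/embed.js';  // hypothetical widget URL
  s.async = true;
  (document.head || document.getElementsByTagName('head')[0]).appendChild(s);
})();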
Short answer? AJAX. You get your own URL, e.g. "site.com/?comments=ID", included via JavaScript... but with real-time updates like that you would need a polling server.
I think they keep the content on their site and your site only sends and receives the data to/from Disqus. Now I wonder what happens if you decide that you want to bring your commenting in-house without losing all existing comments! How easily would you get to your data, I wonder? They claim that the data belongs to you, but they have control over it, and there is not much explanation on their site about this.
I'm always leaving comments on the Disqus platform. Sometimes a comment seems to be removed once you refresh, and sometimes it's not. I think the ones that are removed are held for moderation without it saying so.
