I am trying to block referrer spam on client and server side:
client side:
<script type='text/javascript'>
var unforgivable = ["darodar.com", "econom.co", "ilovevitaly.co"];
var re = new RegExp(unforgivable.join("|"), "i");
if (document.referrer.match(re))
window.location = "http://google.com/";
</script>
server side, via a servlet filter:
static String[] unforgivable = new String[]{"darodar.com", "econom.co", "lovevitaly"};

for (String badUrl : unforgivable) {
    if (requestURI.contains(badUrl)) {
        // redirect and stop processing; no writer is needed when redirecting
        response.sendRedirect("http://www.google.com");
        return;
    }
}
However, I keep seeing ilovevitaly.co and darodar referrers when I look at my Google Analytics. Any clue?
Based on several searches on the internet and several tests on the website I manage, here is a summary of what I've read:
There are 2 kinds of bots / referral spammers:
- a) those that actually crawl your site, like semalt. Look at the traffic logs supplied by your hosting provider to identify them.
- b) those that have only grabbed your Google Analytics ID and generate fake traffic that shows up in your Google stats. They never visit your site ("darodar.com", "econom.co" and "ilovevitaly.co" belong to this family).
Members of the A category can be blocked through .htaccess rules.
Members of the B category must be filtered out with a filter in GA.
More details at the link.
Happy New Year - Best wishes
Note: this text was not run through Google Translate. I beg your pardon, my English isn't fluent.
The only option is to try to hide your Google Analytics ID before the spammers' bots parse your website's homepage. This can be done by splitting the ID in the Analytics JS code like this:
ga('create', 'UA-XX' + 'XXXX' + 'XX-X', 'auto');
Google Analytics fires at request time, before your request handlers run. So yes, they will show up.
You might try blacklisting the IP addresses: https://cloud.google.com/appengine/docs/java/config/dos
That will fire before your request handlers, and before Google Analytics.
Filter future and historical GA spam of all types with the guide linked below. Hostname filtering is particularly easy.
https://www.ohow.co/ultimate-guide-to-removing-irrelevant-traffic-in-google-analytics/
The only valid hostnames are those of your website's (sub)domains. The author of the guide has created, and maintains, a regex to exclude all types of GA spam.
This makes GA usable again :-)
Try using .htaccess to stop the spam; a template can be found here: http://www.sebastianviereck.de/en/template-referer-spamm-htaccess-to-remove-from-google-analytics/
Background
To pass a client_id from one domain to another, Google supports adding a "linker" parameter to outgoing links that are part of the cross-domain tracking setup. This linker parameter contains the client_id, the session_id, (I believe) information about Google Ads, e.g. the gclid, and a basic fingerprint plus timestamp. On the receiving domain, if the browser fingerprint matches and the timestamp is not too far in the past, the passed client_id and session_id are stored in a first-party cookie on the second domain and used from then on.
analytics.js / GA-UA
With analytics.js (GA-UA) you could easily do the following, to decorate URLs manually:
function decorateUrl(urlString) {
var ga = window[window['GoogleAnalyticsObject']];
var tracker;
if (ga && typeof ga.getAll === 'function') {
tracker = ga.getAll()[0]; // Uses the first tracker created on the page
urlString = (new window.gaplugins.Linker(tracker)).decorate(urlString);
}
return urlString;
}
Yet, when only gtag is loaded, window.ga and window.gaplugins are not defined. As far as I see, there is currently no documented way to manually generate links with the linker parameter with gtag.
In Google's documentation, they suggest setting up the linker manually (https://support.google.com/analytics/answer/10071811?hl=en#zippy=%2Cmanual-setup).
But this has several disadvantages: for example, I have to create custom "fingerprint" logic (so that decorated URLs cannot simply be shared), and Google Ads information is not included.
Either way, I would like to use the internal gtag logic to decorate URLs. For reference, the manual setup looks roughly like the sketch below.
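(This is only a sketch of that manual approach, assuming the documented gtag('get', ...) API and plain client_id / session_id URL parameters; the G-XXXXXXX measurement ID and the helper name are placeholders.)
function decorateUrlManually(urlString, callback) {
  // Read the IDs from gtag (asynchronous, hence the callback)
  gtag('get', 'G-XXXXXXX', 'client_id', function (clientId) {
    gtag('get', 'G-XXXXXXX', 'session_id', function (sessionId) {
      var url = new URL(urlString);
      // Pass them as plain URL parameters - no fingerprint, no expiry, no Ads data
      url.searchParams.set('client_id', clientId);
      url.searchParams.set('session_id', sessionId);
      callback(url.toString());
    });
  });
}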
"Hacky" Workaround Solution
gtag automatically decorates <a> tags (as soon as they are clicked) that lead to a cross-domain tracking domain specified in the GA4 data stream settings (e.g. "test.com"), but I specifically need to decorate URLs manually (i.e. without immediately navigating to them).
I thought about doing the following:
Create a dummy, hidden <a> tag with the URL to decorate
Prevent redirection with onclick='event.preventDefault();'
Simulate a click on the hidden element so that gtag automatically adds the linker URL parameter to the href attribute
Extract the new href attribute
Remove the hidden element
function decorateUrlGtag(urlString) {
var tempAnchorEl = document.createElement("a");
tempAnchorEl.setAttribute("type", "hidden");
tempAnchorEl.setAttribute("href", urlString);
tempAnchorEl.setAttribute("onclick", "event.preventDefault(); return false");
document.body.appendChild(tempAnchorEl);
tempAnchorEl.click();
var urlWithLinker = tempAnchorEl.href;
tempAnchorEl.remove();
return urlWithLinker;
}
This also does not work, because gtag does not seem to register the tempAnchorEl.click() call. If I click the link manually, the URL is decorated as expected.
Suggested Solutions
The solutions outlined here (Google Analytics gtag.js Manually adding the linker cross-domain parameter to URLs) also do not work for me:
First answer: even after gtag is initialized, I do not see a global ga object
Second answer: same problem (no ga defined)
Do you (1) know if there is a way to generate the linker parameter manually with gtag that I have overlooked, (2) know how to make my "hacky" solution work or (3) have another possible solution?
I haven't grokked this solution, and I am not sure it answers your question directly, but Simo does give an outline of how to configure GA4 cross domain tracking here:
https://www.simoahava.com/gtm-tips/cross-domain-tracking-google-analytics-4/#how-to-configure-cross-domain-tracking-manually
He breaks the problem down into steps but does not go into great detail. He provides one code snippet:
"...you could also load the URL parameter values directly into the GA4 configuration with something like:
gtag('config', 'G-12345', {
// Namespace roll-up trackers
cookie_prefix: 'roll-up',
// Pull in the Client ID from the URL
client_id: (new URLSearchParams(document.location.search)).get('client_id'),
// Pull in the Session ID from the URL
session_id: (new URLSearchParams(document.location.search)).get('session_id')
});
"
Hope that helps!
I found hundreds of cloned versions of my website.
Whoever is doing this uses code that clones my web pages and changes my website name, mydomain.com, to clone1.com, clone2.com, clone3.com, etc. This makes it impossible to use a simple JS or PHP check that compares the page URL to mysite.com and then redirects.
It also does not work using .htaccess.
For this reason I have created this code:
<script type="text/javascript">
if (window.location.href== "http://clone1.com/cat1/{{{ $title->id }}}-{{ (Str::slug($title->title)) }}/cat2/{{ $se->n }}/cat3/{{ $episode->ep_n }}")
{
window.location.href = 'http://google.com/';
}
</script>
This script serves its purpose, but it is too long and also very restrictive because it must contain the exact URL.
I'm looking to do this:
<script type="text/javascript">
// pseudocode - redirect if the current URL contains any of:
// clone1.com, clone2.com, clone3.com, clone4...
if (window.location.href contains one of these domains) {
window.location.href = 'http://google.com/';
}
</script>
How can I create a global JS (JavaScript) check that detects whether the current page is not on my domain and then redirects the reader to my domain and the same page?
Many thanks
1. Best Solution - Early Detection
Depending on your main traffic source, it is possible to detect who is scraping you and block them based on their IP, headers, number of page views and other data, using PHP & .htaccess.
I really like this answer on Stack Overflow, which discusses almost all the options available for early detection.
How to detect fake users ( crawlers ) and cURL
2. Plugins & Extensions for Open Source Content Management Systems
WordPress
If you are using the WordPress CMS, you can try plugins like Wordfence, which can detect and block fake Google crawlers, block based on the number of page views, etc.
Other CMS
If you can't find a similar solution for your CMS of choice, consider asking the community for help with creating one, as I believe many people could benefit from it.
3. Solution for already stolen content with JavaScript
Sometimes the easiest way to hide something in JS is to actually HIDE it by OBFUSCATING it and spreading it across multiple important files. For example, obfuscate some important file on your website without which the website just wouldn't work properly.
For example, put an obfuscated version of the code below somewhere in a JS file in the header. Obfuscate this code using any of the free services online, or download your own library from GitHub:
Non-Obfuscated:
w='mysite.com'; // your real hostname, compared against window.location.hostname below
function check_origin(){
var check = 587;
if(window.location.hostname != w){
window.location.href = w;
}
return check;
}
var check = check_origin();
Obfuscated example:
var _0x303e=["\x6D\x79\x73\x69\x74\x65\x2E\x63\x6F\x6D","\x68\x6F\x73\x74\x6E\x61\x6D\x65","\x6C\x6F\x63\x61\x74\x69\x6F\x6E","\x68\x72\x65\x66"];w= _0x303e[0];function check_origin(){var check=587;if(window[_0x303e[2]][_0x303e[1]]!= w){window[_0x303e[2]][_0x303e[3]]= w};return check}var check=check_origin()
Now put additional code in some footer JS file to verify that the code above wasn't modified in any way:
Non-Obfuscated example:
if(w!=='mysite.com'||check == false || typeof check == 'undefined' || check !== 587 ){
window.location.href = 'mysite.com';
}
Obfuscated:
var _0x92bb=["\x6D\x79\x73\x69\x74\x65\x2E\x63\x6F\x6D","\x75\x6E\x64\x65\x66\x69\x6E\x65\x64","\x68\x72\x65\x66","\x6C\x6F\x63\x61\x74\x69\x6F\x6E"];if(w!== _0x92bb[0]|| check== false|| typeof check== _0x92bb[1]|| check!== 587){window[_0x92bb[3]][_0x92bb[2]]= _0x92bb[0]}
I used a free online service found via a Google search for the term "Free Online JS Obfuscator":
https://javascriptobfuscator.com/Javascript-Obfuscator.aspx
4. Fight thieves with the available legal methods, e.g. request a ban from search engines under the Digital Millennium Copyright Act of 1998
Here is a blog-post that describes what to do when someone is stealing your content.
https://lorelle.wordpress.com/2006/04/10/what-do-you-do-when-someone-steals-your-content/
You can investigate who is doing it and report them to their partners, search engines and advertisers in order to disrupt their business.
Depending on their country of origin and yours, it is maybe even possible to sue them and win.
Why not check whether the hostname is yours?
if(window.location.hostname != 'mysite.com'){
window.location.href = 'http://google.com/';
}
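If you want to send the visitor to the same page on your own domain instead of to Google (as the question asks), a variation along these lines should work; mysite.com is a placeholder for your real hostname:
if (window.location.hostname != 'mysite.com') {
  // keep the path and query string, only swap the host
  window.location.href = 'https://mysite.com' + window.location.pathname + window.location.search;
}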
This question already has answers here:
Detect the Internet connection is offline?
(22 answers)
Closed 8 years ago.
How do you check if there is an internet connection using jQuery? That way I could have some conditionals saying "use the google cached version of JQuery during production, use either that or a local version during development, depending on the internet connection".
The best option for your specific case might be:
Right before your closing </body> tag:
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="js/vendor/jquery-1.10.2.min.js"><\/script>')</script>
This is probably the easiest way given that your issue is centered around jQuery.
If you wanted a more robust solution you could try:
var online = navigator.onLine;
Read more about the W3C spec on offline web apps; however, be aware that this works best in modern web browsers. With older web browsers it may not work as expected, or at all.
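For example, a minimal sketch that combines the flag with the standard online / offline events on window:
if (navigator.onLine) {
  console.log('Browser reports a network connection');
}

// react when connectivity changes
window.addEventListener('online', function () {
  console.log('Connection restored');
});
window.addEventListener('offline', function () {
  console.log('Connection lost');
});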
Alternatively, an XHR request to your own server isn't a bad method for testing your connectivity. One of the other answers states that there are too many points of failure for an XHR, but if your XHR is flawed when establishing its connection, it will also be flawed during routine use anyhow. If your site is unreachable for any reason, then your other services running on the same servers will likely be unreachable too. That decision is up to you.
I wouldn't recommend making an XHR request to someone else's service, not even google.com for that matter. Make the request to your own server, or not at all.
What does it mean to be "online"?
There seems to be some confusion around what being "online" means. Consider that the internet is a bunch of networks; sometimes you're on a VPN without access to the internet "at large" or the World Wide Web, and companies often have their own networks with limited connectivity to other external networks, yet you would still be considered "online". Being online only means that you are connected to a network; it says nothing about the availability or reachability of the services you are trying to connect to.
To determine if a host is reachable from your network, you could do this:
function hostReachable() {
// Handle IE and more capable browsers
var xhr = new ( window.ActiveXObject || XMLHttpRequest )( "Microsoft.XMLHTTP" );
// Open new request as a HEAD to the root hostname with a random param to bust the cache
xhr.open( "HEAD", "//" + window.location.hostname + "/?rand=" + Math.floor((1 + Math.random()) * 0x10000), false );
// Issue request and handle response
try {
xhr.send();
return ( xhr.status >= 200 && (xhr.status < 300 || xhr.status === 304) );
} catch (error) {
return false;
}
}
You can also find the Gist for that here: https://gist.github.com/jpsilvashy/5725579
Details on local implementation
Some people have commented, "I'm always getting false back." That's probably because you're testing it on your local server. Whatever server you're making the request to must be able to respond to the HEAD request; that can of course be changed to a GET if you want.
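Tying this back to the original question, a sketch of how the check could pick the jQuery source (the URLs are taken from the earlier answer; whether reaching your own host is a good proxy for reaching the CDN is an assumption you have to accept):
if (hostReachable()) {
  // host reachable - use the CDN copy
  document.write('<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.3/jquery.min.js"><\/script>');
} else {
  // host not reachable - fall back to the local copy
  document.write('<script src="js/vendor/jquery-1.10.2.min.js"><\/script>');
}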
OK, maybe a bit late in the game, but what about checking with an online image?
I mean, the OP needs to know whether to grab the Google CDN copy or the local jQuery copy, but that doesn't mean the browser can't run JavaScript either way, right?
<script>
function doConnectFunction() {
// Grab the Google CDN copy
}
function doNotConnectFunction() {
// Grab the local jQuery copy
}
var i = new Image();
i.onload = doConnectFunction;
i.onerror = doNotConnectFunction;
// CHANGE IMAGE URL TO ANY IMAGE YOU KNOW IS LIVE
i.src = 'http://gfx2.hotmail.com/mail/uxp/w4/m4/pr014/h/s7.png?d=' + escape(Date());
// escape(Date()) is necessary to override possibility of image coming from cache
</script>
Just my 2 cents
A five-years-later version:
Today, there are JS libraries for this, if you don't want to get into the nitty-gritty of the different methods described on this page.
One of these is https://github.com/hubspot/offline. It checks the connectivity of a pre-defined URI, by default your favicon. It automatically detects when the user's connectivity has been re-established and provides neat events like up and down, which you can bind to in order to update your UI.
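Usage is roughly like this, based on the project's README (the favicon URL and the handlers are just examples; check the repo for the current API):
// after including offline.min.js on the page
Offline.options = { checks: { xhr: { url: '/favicon.ico' } } }; // URI to test against

Offline.on('down', function () {
  console.log('Connection lost');
});
Offline.on('up', function () {
  console.log('Connection re-established');
});

Offline.check(); // trigger a manual connectivity check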
You can mimic the Ping command.
Use Ajax to request a timestamp from your own server, set a 5-second timer with setTimeout, and if there's no response, try again.
If there's no response after 4 attempts, you can assume that the internet is down.
You can then run this routine at regular intervals, like every 1 or 3 minutes.
That seems like a good and clean solution to me. A rough sketch of the idea is below.
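This is only a sketch of that approach (using the XMLHttpRequest timeout property in place of a manual setTimeout); the /timestamp endpoint, the retry count and the interval are placeholders:
// try up to 4 times with a 5-second timeout each, then report the result
function checkConnection(callback, attempt) {
  attempt = attempt || 1;
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/timestamp?rand=' + Date.now(), true);
  xhr.timeout = 5000;
  xhr.onload = function () { callback(true); };
  xhr.onerror = xhr.ontimeout = function () {
    if (attempt >= 4) {
      callback(false); // 4 failed attempts: assume the internet is down
    } else {
      checkConnection(callback, attempt + 1);
    }
  };
  xhr.send();
}

// re-check every 3 minutes
setInterval(function () {
  checkConnection(function (online) {
    console.log(online ? 'Online' : 'Offline');
  });
}, 3 * 60 * 1000);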
You can try sending XHR requests a few times; if you get errors, it means there's a problem with the internet connection.
I wrote a jQuery plugin for doing this. By default it checks the current URL (because that has already been loaded once from the web), or you can specify a URL to use as an argument. Always making a request to Google isn't the best idea because it's blocked in different countries at different times, and you might also be at the mercy of what the connection across a particular ocean/weather front/political climate is like that day.
http://tomriley.net/blog/archives/111
I have a solution that works here to check if an internet connection exists:
$.ajax({
url: "http://www.google.com",
context: document.body,
error: function(jqXHR, exception) {
alert('Offline')
},
success: function() {
alert('Online')
}
})
Sending XHR requests is bad because it could fail if that particular server is down. Instead, use Google's API library to load their cached version(s) of jQuery.
You can use Google's API to perform a callback after loading jQuery, and this will check whether jQuery was loaded successfully. Something like the code below should work:
<script type="text/javascript" src="https://www.google.com/jsapi"></script>
<script type="text/javascript">
// Load jQuery via the Google Loader
google.load("jquery", "1");
// Call this function when the page has been loaded
function test_connection() {
  if (window.jQuery) {
    // jQuery WAS loaded.
  } else {
    // jQuery failed to load. Grab the local copy.
  }
}
google.setOnLoadCallback(test_connection);
</script>
The Google API documentation can be found here.
A much simpler solution:
<script language="javascript" src="http://maps.google.com/maps/api/js?v=3.2&sensor=false"></script>
and later in the code:
var online;
// check whether this function works (online only)
try {
var x = google.maps.MapTypeId.TERRAIN;
online = true;
} catch (e) {
online = false;
}
console.log(online);
When you are not online, the Google script will not be loaded, so the check throws an exception.
So I am building a web app. In it, I want a button that opens Google Calendar and logs in automatically. I already tried to create a JS function, but I guess I did something wrong.
The JS code:
var popupWindow;
function OpenCalendar() {
popupWindow = window.open('https://accounts.google.com/ServiceLogin?service=cl', 'Calendar');
popupWindow.focus();
popupWindow.document.getElementById('Email').value = 'mail';
popupWindow.document.getElementById('Passwd').value = 'pass';
}
You cannot access a document from another domain through JavaScript; the same-origin policy restricts this for security reasons. In short, this is not possible.
You can take a look at Google Accounts authentication and authorization at this link: https://developers.google.com/accounts/
I have at my disposal JavaScript and Classic ASP. Using these two, how can I check whether a user is a member of a particular Active Directory group? I know VBScript has a memberOf function, but I can only use JavaScript. Any help is appreciated.
You'll need to ensure that your web server is set to use Windows Authentication. Then you can use Request.ServerVariables("LOGON_USER") to get the current user's domain\username.
You'll then query Active Directory using ADSI to get group membership.
Here's a link to msdn's ADSI pages. http://msdn.microsoft.com/en-us/library/aa772170%28v=vs.85%29.aspx
This page has some sample scripts (in VBScript).
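Since the question asks for JavaScript, here is a rough sketch of the same idea in server-side JScript (Classic ASP). It is only an outline under these assumptions: Windows Authentication is enabled, the LDAP path and group name are placeholders, and the handling of the multi-valued memberOf attribute may need adjusting for your environment:
<%@ Language="JScript" %>
<%
// Who is the current user? e.g. "MYDOMAIN\jsmith"
var logon = String(Request.ServerVariables("LOGON_USER"));
var samAccountName = logon.split("\\")[1];

// Query Active Directory through ADO with the ADSI OLE DB provider
var conn = Server.CreateObject("ADODB.Connection");
conn.Provider = "ADsDSOObject";
conn.Open("Active Directory Provider");

var cmd = Server.CreateObject("ADODB.Command");
cmd.ActiveConnection = conn;
cmd.CommandText =
    "SELECT memberOf FROM 'LDAP://DC=example,DC=com' " +
    "WHERE samAccountName='" + samAccountName + "'";

var rs = cmd.Execute();
var isMember = false;
if (!rs.EOF && rs.Fields("memberOf").Value != null) {
    // memberOf is multi-valued, so it comes back as a VB-style array
    var groups = new VBArray(rs.Fields("memberOf").Value).toArray();
    for (var i = 0; i < groups.length; i++) {
        if (groups[i].indexOf("CN=MyGroup,") === 0) { isMember = true; }
    }
}
rs.Close();
conn.Close();

Response.Write(isMember ? "member" : "not a member");
%>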
As far as I know, there is no way to access Active Directory from JavaScript running in the browser; it cannot access anything outside of its sandbox.
In case I misunderstood your question and you meant server-side checking, use ASP functions to check for it.
You might also try using JavaScript to instantiate a WScript.Network object:
var WshNetwork = new ActiveXObject("WScript.Network");
From there, you can get
var netWorkUserName = WshNetwork.UserName;
var netWorkDomain = WshNetwork.UserDomain;
A word of warning: I'm pretty sure this is IE only and requires security changes in IE.
You'll need AJAX and a connection to the AD using ADODB.Connection with the "ADsDSOObject" provider.
EDIT: I saw your comment above. Here's a start:
ldapCommand.CommandText = "select sn from '" & _
"LDAP://example.com/DC=example,DC=com" & _
"' WHERE samAccountName=" & "'" & username & "'"
Set ldapRecordSet = ldapCommand.Execute
ldapCommand is an ADODB.Command. If the query returns no records (or Execute throws an error), the user was not found in the domain.