I have a new site that I am putting together, and part of it shows statistics for the site's users. I would like to create a widget that others can use on another website by invoking JavaScript that reads data from my server and shows the statistics for a given user, but I am having a hard time finding specific tutorials that cover this in Django.
I have seen the link at Alex Marandon's site [0], but it looks to me like that approach passes HTML back to the widget, and I am having a hard time figuring out how to do this using something like XML.
Are there any django apps for doing this or does anyone know of good how-tos?
[0] http://alexmarandon.com/articles/web_widget_jquery/
This is not really a matter of Django; you can solve it with the most common solution: JavaScript.
Give your users this to put on their websites.
<script type="text/javascript" src="http://mysite.com/widget/user/124546465"></script>
In a Django view, render the following template:
(function(){
    document.write('<div class="mysite-userprofile">');
    document.write('My visits are {{ total_visits }}<br />');
    document.write('</div>');
})();
So in your view you may have something like this; the mimetype is important:
from django.contrib.auth.models import User
from django.shortcuts import get_object_or_404, render_to_response

def total_visits(request, user_id):
    user = get_object_or_404(User, id=user_id)
    # Count the visits; you may have to write your own counting logic here
    total_visits = Visits.objects.filter(user=user).count()
    context = {'total_visits': total_visits}
    return render_to_response('widget_total_visits.html', context,
                              mimetype='text/javascript')
What can you do next?
User settings, like this:
<script type="text/javascript">
    var mysite_options = {
        'just_friends': true,
        'theme': 'bluemarine',
        'realtime': true
    };
</script>
<script type="text/javascript" src="http://mysite.com/widget/user/124546465"></script>
Then, in your template, you can use the variables that were set before the script was included on your user's website. Simple stuff.
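For example, a minimal sketch of how the served template could honor those options (the option names come from the snippet above; the default value and the CSS class naming are assumptions):

(function(){
    // Read the options the host page defined before including this script
    var opts = window.mysite_options || {};
    var theme = opts.theme || 'bluemarine'; // assumed default theme
    document.write('<div class="mysite-userprofile mysite-theme-' + theme + '">');
    document.write('My visits are {{ total_visits }}<br />');
    document.write('</div>');
})();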
Later, you can use POST requests to gather information from your users' visitors, for stats.
And of course make it Ajax!
I hope this gives you a path to follow.
Obey the one rule: Keep It Simple Silly!
I know it may be very Web 1.0, but an iframe really is your best friend in this situation. A simple piece of code such as <script>document.write("<iframe src='yoursite.com/userwidget/" + username + "' height='30' width='150'></iframe>");</script> to inject an iframe at load time will save you a lot of time writing JSONP async code and DOM manipulators, making sure that all the elements you inject onto their page are styled correctly on every different website, and worrying about origin policies.
If your code plugs in an iframe pointed at your page, then:
The origin is you! You don't have to worry about it.
You can use Django templates instead of JS to construct the widget being shown.
Their CSS won't mess with your presentation.
Their JS can't easily manipulate your stats ;)
This is exactly what iframes were designed to do.
I need to embed a React web application into another JavaScript web application. I cannot use an iframe or an <object>, I cannot require anything on the server, and I don't know which frontend technologies I will find, so I must use only the basic technologies available in every application (JavaScript or jQuery).
I tried some ways to do it, for example in plain JavaScript:
var qr = new XMLHttpRequest();
qr.open('get', 'https://www.mywebapp.com/?c1=v1&c2=v2', true);
qr.onload = function(){ test2.innerHTML = qr.responseText; };
qr.send();
and in jQuery:
var link = "https://mywebapp.com";
var params = "c1=v1&c2=v2";
$("#test").load(
    link,
    params,
    function(){ alert("ciao"); }
);
Also in jQuery I have tried to use $.ajax, $.ajaxSetup plus $.ajax, and other approaches. In every attempt I tried changing the value of every parameter I could find, such as traditional: true/false or contentType: 'text/html'. The result of every attempt was the same: the page loaded, but it could not find its static resources (JavaScript, CSS, images). I need help.
Thank you
Wow, after some clarification it sounds like the chatbot you have written needs to be embedded into a page in another application. This way the chatbot doesn't know anything about the page it's being embedded on, and the embedding page only needs a little bit of script to kick-start the loading of the chatbot.
You'll need to make sure you have JavaScript and CSS that can be downloaded from the chatbot server and dropped into any page, regardless of which frameworks are in use.
Your JavaScript is then free to grab any other resources (HTML fragments/JavaScript/CSS/images/etc.) as needed from the chatbot host. This is similar to how you would go about embedding Disqus or Google Maps onto a page.
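As a rough illustration, the kick-start script on the embedding page could be as small as this (the host URL and file name are assumptions):

(function(){
    // Load the chatbot's bootstrap script from its own server;
    // that script can then fetch whatever CSS/HTML/images it needs.
    var s = document.createElement('script');
    s.src = 'https://chatbot.example.com/embed.js'; // hypothetical URL
    s.async = true;
    document.getElementsByTagName('head')[0].appendChild(s);
})();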
I think helping implement this fully is way out of scope for a single Stack Overflow question, but hopefully this gives you some idea of what is required and puts you on the right track.
My application uses AngularJS for the frontend and .NET for the backend.
In my application I have a list view. On clicking each list item, it will fetch a pre-rendered HTML page from S3.
I am using Angular UI-Router states.
app.js
...
.state('staticpage', {
    url: "/staticpage",
    templateUrl: function(){
        return 'http://xxxxxxx.cloudfront.net/staticpage/staticpage1.html';
    },
    controller: 'StaticPageCtrl',
    title: 'Static Page'
})
StaticPage1.html
<div>
    Hello static world 1!
</div>
How do I do SEO here?
Do I really need to make HTML snapshots using PhantomJS or the like?
Yes, PhantomJS would do the trick, or you can use prerender.io; with that service you can just use their open-source renderer and run your own server.
Another way is to use the _escaped_fragment_ meta tag.
I hope this helps; if you have any questions, add comments and I will update my answer.
Did you know that Google renders HTML pages and executes the JavaScript code in them, and no longer needs any pre-rendering?
https://webmasters.googleblog.com/2014/05/understanding-web-pages-better.html
And take a look at these:
http://searchengineland.com/tested-googlebot-crawls-javascript-heres-learned-220157
http://wijmo.com/blog/how-to-improve-seo-in-angularjs-applications/
My project's front-end is also built on top of Angular, and I decided to solve the SEO issue like this:
I created an endpoint for all search engines (SE) where all the requests go with the _escaped_fragment_ parameter;
I parse the HTTP request for the _escaped_fragment_ GET parameter;
I make a cURL request with the parsed category and article parameters and get the article content;
Then I render the simplest (and SEO-friendly) template for SE with the article content, or throw a 404 Not Found exception if the article does not exist.
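To illustrate the idea only (Node/Express is assumed purely for illustration, since my actual back-end is PHP; fetchArticle and the template name are hypothetical):

app.get('*', function(req, res, next){
    var fragment = req.query._escaped_fragment_;
    if (fragment === undefined) return next(); // normal SPA response for humans
    // Fetch the article content and render a simple, SEO-friendly template
    fetchArticle(fragment, function(err, article){
        if (err || !article) return res.status(404).send('Not Found');
        res.render('seo_template', { content: article });
    });
});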
In total: I do not need to prerender any HTML pages or use prerender.io, I have a nice user interface for my users, and search engines index my pages very well.
P.S. Do not forget to generate sitemap.xml and include in it all the URLs (with _escaped_fragment_) which you want to be indexed.
P.P.S. Unfortunately my project's back-end is built on top of PHP, so I cannot show you an example suitable for you. But if you want more explanation, do not hesitate to ask.
Firstly, you cannot assume anything.
Google does say that their bots can understand JavaScript applications very well, but that is not true for all scenarios.
Start by using the 'Fetch as Google' feature in Webmaster Tools on your link and see if the page is rendered properly. If yes, you need not read further.
In case you see just your skeleton HTML, this is because the Google bot assumes the page load is complete before it actually is. To fix this you need an environment where you can recognize that a request is from a bot and return it a prerendered page.
To create such an environment, you need to make some changes in your code.
Follow the instructions in Setting up SEO with AngularJS and PhantomJS,
or alternatively just write code in any server-side language like PHP to generate prerendered HTML pages of your application.
(PhantomJS is not mandatory.)
Create a redirect rule in your server config which detects the bot and redirects it to the prerendered plain HTML files. (The only thing you need to make sure of is that the content of the page you return matches the actual page content, or else bots might not consider the content authentic.)
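A minimal sketch of such detection (Express middleware is assumed purely for illustration; the user-agent list and snapshot path are assumptions, and the same rule can be expressed in nginx/Apache/IIS config instead):

var BOT_RE = /googlebot|bingbot|yandex|baiduspider/i; // assumed bot list

app.use(function(req, res, next){
    if (BOT_RE.test(req.headers['user-agent'] || '')){
        // Serve the static prerendered snapshot matching the requested path
        return res.sendFile(__dirname + '/snapshots' + req.path + '.html');
    }
    next(); // humans get the normal AngularJS application
});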
Note that you also need to consider how you will add entries to sitemap.xml dynamically when you add pages to your application in the future.
In case you are not looking for such overhead and you are lacking time, you can surely use a managed service like prerender.
Eventually bots will mature, they will understand your application, and you will say goodbye to your SEO proxy infrastructure. But this is just for the time being.
At this point in time, the question really becomes somewhat subjective, at least with Google: it really depends on your specific site, like how quickly your pages render, how much content renders after the DOM loads, etc. Certainly (as @birju-shaw mentions) if Google can't read your page at all, you know you need to do something else.
Google has officially deprecated the _escaped_fragment_ approach as of October 14, 2015, but that doesn't mean you might not want to still pre-render.
YMMV on trusting Google (and other crawlers) for reasons stated here, so the only definitive way to find out which is best in your scenario would be to test it out. There could be other reasons you may want to pre-render, but since you mentioned SEO specifically, I'll leave it at that.
If you have a server-side templating system (PHP, Python, etc.), you can implement a solution like prerender.io.
If you only have AngularJS files hosted on a static server (e.g. Amazon S3), have a look at the answer in the following post: AngularJS SEO for static webpages (S3 CDN).
Yes, you need to prerender the page for the bots. prerender.io can be used, and your page must have the meta tag:
<meta name="fragment" content="!">
I am currently using JavaScript and XMLHttpRequest on a static HTML page to create a view of a record in Zotero. This works nicely except for one thing: the page's HTML title.
I can of course also change the <title>...</title> tag, but if someone wants to post the view to, for example, Facebook, the static title of the web page will be shown there.
I can't think of any way to fix this with just a static page and JavaScript. I believe I need a dynamically created page from a server that does something similar to XMLHttpRequest.
For PHP there is HTTPRequest. Now to the problem: in the JavaScript version I can use asynchronous calls, but with PHP I think I need synchronous calls. Is that something to worry about?
Is there perhaps some other way to handle this that I am not aware of?
UPDATE: It looks like those trying to answer are not at all familiar with Zotero. I should have been more clear. Zotero is a reference db located at http://zotero.org/. It has an API that can be used through XMLHttpRequest (which is what I said above).
Now, I cannot use that in the scenario I described above, so I want to call the Zotero server from my server instead (through PHP or something else).
(If you are not familiar with the concepts it might be hard to understand and answer the question. Of course.)
UPDATE 2: For those interested in how Facebook scrapes a URL you post there, please test here: https://developers.facebook.com/tools/debug
As you can see by testing there, no JavaScript is run.
Sorry, I'm not sure I understand what you are trying to ask. Are you just wanting to change the page's title?
Why not use JavaScript?
document.title = newTitle;
Facebook expects the title (or the Open Graph og:title tags) to be present when it fetches the page. It won't execute any JavaScript for you to fill in the blanks.
A cool workaround would be to detect the Facebook scraper with PHP by parsing the User Agent string, and serving a version of the page with the information already filled in by PHP instead of JavaScript.
As far as I know, the Facebook scraper uses this header for User Agent: "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)"
You can check to see if part of that string is present in the header and load the page accordingly.
if (strpos($_SERVER['HTTP_USER_AGENT'], 'facebookexternalhit') !== false)
{
//synchronously load the title and opengraph tags here.
}
else
{
//load the page normally
}
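In the bot branch, the served page would then contain the tags the scraper reads, something like this (the values are placeholders; in this scenario PHP would fill them in from the Zotero API):

<meta property="og:title" content="Record title fetched from Zotero" />
<meta property="og:description" content="Short description of the record" />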
I want to make an app that can display on any webpage, just like how Disqus or IntenseDebate render on articles & web pages.
It will display a mini-ecommerce store front.
I'm not sure how to get started.
Is there any sample code, framework, or design pattern for these "widgets"?
For example, I'd like to display products.
Should I first create a webservice or RSS that lists all of them?
Or can one of these Ajax scripts simply digest an XHTML webpage and display that?
Thanks for any tips, I really appreciate it.
Basically you have two options: use iframes to wrap your content, or use DOM-injection style.
IFRAMES
Iframes are the easy ones: the host site includes an iframe whose URL includes all the necessary params.
<p>Check out this cool webstore:</p>
<iframe src="http://yourdomain.com/store?site_id=123"></iframe>
But this comes with a cost: there's no easy way to resize the iframe to fit its content. You're pretty much stuck with the initial dimensions. You can come up with some kind of cross-frame script that measures the size of the iframe contents and forwards it to the host site, which then resizes the iframe based on the numbers from the script, but this is really hacky.
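For what it's worth, a sketch of that cross-frame measurement using postMessage (the store-frame id is an assumption, and a real script should also check e.origin):

// Inside the iframe: report the content height to the host page
window.parent.postMessage({ storeHeight: document.body.scrollHeight }, '*');

// On the host page: resize the iframe when the message arrives
window.addEventListener('message', function(e){
    if (e.data && e.data.storeHeight){
        document.getElementById('store-frame').style.height =
            e.data.storeHeight + 'px'; // 'store-frame' is the iframe's assumed id
    }
});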
DOM injection
The second approach is to "inject" your own HTML directly into the host page. The host site loads a <script> tag from your server, and the script includes all the information needed to add HTML to the page. There are two approaches: the first is to generate all the HTML on your server and use document.write to inject it.
<p>Check out this cool webstore:</p>
<script src="http://yourdomain.com/store?site_id=123"></script>
And the script source would be something like
document.write('<h1>Amazing products</h1>');
document.write('<ul>');
document.write('<li>green car</li>');
document.write('<li>blue van</li>');
document.write('</ul>');
This replaces the original <script> tag with the HTML inside the document.write calls, and the injected HTML becomes part of the original page, so there are none of the resizing problems you get with iframes.
<p>Check out this cool webstore:</p>
<h1>Amazing products</h1>
<ul>
<li>green car</li>
<li>blue van</li>
</ul>
Another approach for the same thing is to separate the data from the HTML. The included script would consist of two parts: the drawing logic and the data in serialized form (i.e. JSON). This gives a lot of flexibility to the script compared to the rather stiff document.write approach. Instead of outputting HTML directly to the page, you generate the needed DOM nodes on the fly and attach them to a specific element.
<p>Check out this cool webstore:</p>
<div id="store"></div>
<script src="http://yourdomain.com/store_data?site_id=123"></script>
<script src="http://yourdomain.com/generate_store"></script>
The first script consists of the data and the second one of the drawing logic.
var store_data = [
{title: "green car", id:1},
{title: "blue van", id:2}
];
The second script would be something like this:
var store_elm = document.getElementById("store");
for(var i = 0; i < store_data.length; i++){
    var link = document.createElement("a");
    link.href = "http://yourdomain.com/?id=" + store_data[i].id;
    link.innerHTML = store_data[i].title;
    store_elm.appendChild(link);
}
Though a bit more complicated than document.write, this approach is the most flexible of them all.
Sending data
If you want to send some kind of data back to your server, you can use script injection (you can't use AJAX because of the same-origin policy, but there are no restrictions on script injection). This consists of putting all the data into the script URL (remember, IE limits URL length to about 2,083 characters) and having the server respond with the needed data.
var message = "message to the server";
var header = document.getElementsByTagName('head')[0];
var script_tag = document.createElement("script");
var url = "http://yourserver.com/msg";
script_tag.src = url+"?msg="+encodeURIComponent(message)+"&callback=alert";
header.appendChild(script_tag);
Now your server gets the request with the GET params msg=message to the server and callback=alert, does something with it, and responds with
<?php
$l = strlen($_GET["msg"]);
echo $_GET["callback"] . '("request was ' . $l . ' chars");';
?>
which would produce
alert("request was 21 chars");
If you change the alert to some kind of function of your own, you can pass messages around between the webpage and the server.
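For example, reusing the variables from the snippet above (handleServerMessage and the status element are hypothetical names):

// Define your own callback on the page...
function handleServerMessage(msg){
    document.getElementById("status").innerHTML = msg; // assumed target element
}
// ...and ask the server to call it instead of alert:
script_tag.src = url + "?msg=" + encodeURIComponent(message) +
                 "&callback=handleServerMessage";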
I haven't done much with either Disqus or IntenseDebate, but I do know how I would approach making such a widget. The actual display portion of the widget would be generated with JavaScript. So, say you had a div tag with an id of commerce_store. Your JavaScript code would search the document when it is first loaded (or when an AJAX request alters the page) and find whether a commerce_store div exists. Upon finding such a container, it would auto-generate all the necessary HTML. If you don't already know how to do this, you can google 'dynamically adding elements in javascript'. I recommend making a custom JavaScript library for your widget. It doesn't need to be anything too crazy. Something like this:
window.onload = function(){
    widget.load();
};

var widget = {
    load: function(){
        // search for the commerce_store div,
        // then get some data from the sql database
        // (ajax() is a placeholder helper; a sketch of it is given below)
        var dat = ajax('actions/getData.php', {type: 'get', params: {page: 123}});
        // display HTML for the data
        for (var i = 0; i < dat.length; i++){
            this.addDatNode(dat[i]);
        }
    },
    addDatNode: function(stuff){
        // make a node
        var n = document.createElement('div');
        // style the node, and add content
        n.innerHTML = stuff;
        document.getElementById('commerce_store').appendChild(n);
    }
};
Of course, you'll need to set up some type of AJAX framework to get database info and such, but that shouldn't be too hard.
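For completeness, a minimal sketch of the ajax() helper assumed above (synchronous XHR is used only for brevity; a real widget should pass a callback and go asynchronous, and the endpoint is assumed to return JSON):

function ajax(url, opts){
    // Build the query string from opts.params
    var query = '';
    for (var k in opts.params){
        query += (query ? '&' : '?') + k + '=' + encodeURIComponent(opts.params[k]);
    }
    var xhr = new XMLHttpRequest();
    xhr.open(opts.type || 'get', url + query, false); // synchronous for brevity
    xhr.send();
    return JSON.parse(xhr.responseText);
}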
For Disqus and IntenseDebate, I believe the comment forms and everything are all just HTML (generated through JavaScript). The actual 'plugin' portion of the script would be a backend framework of ASP, PHP, SQL, etc. The simplest way to do this would probably be some PHP and SQL code. The SQL would be used to store all the comments or sales info in a database, and the PHP would be used to manipulate the data. Something like this:
function addSale(){ //MySQL code here };
function deleteSale(){ //MySQL code here };
function editSale(){ //MySQL code here };
//...
And your big PHP file would have all of the actions your widget would ever need to do (in regards to altering the database). But even with this big PHP file, you'll still need some way of calling individual functions with your AJAX framework. Look back at the actions/getData.php request in the example JavaScript framework. 'Actions' refers to a folder with a bunch of PHP files, one for each method. For example, addSale.php:
include("../db_connect.php");
db_connect();
// make sure the user is logged in
include("../authenticate.php");
authenticate();
// Get any data that AJAX sent to us
$dat = $_GET['sale_num'];
// Run the method
include("../PHP_methods.php");
addSale($dat);
The reason you would want separate files for PHP_methods and the runner files is that you could potentially have more than one PHP_methods file. You could have three method APIs: one for displaying content, one for requesting content, and one for altering content. Once you start reusing your methods more and more, it's best to have them all in one place. Rewritten once, rewritten everywhere.
So, really, that's all you'd need for the widget. Of course, you would want to have an install script that sets up the commerce database and all. But the actual widget would just be a folder with the aforementioned script files:
install.php: gets the database set up
JavaScript library: to load the HTML content and forms and conduct ajax requests
CSS file: for styling the HTML content and forms
db_connect: a generic php script used to connect to the database
authenticate: a php script to check if a user is logged in; this could vary, depending on whether you have your own user system, or are using gravitars/facebook/twitter/etc.
PHP_methods: a big php file with all the database manipulation methods you'd need
actions folder: a bunch of individual php files that call the necessary PHP methods; call each of these php files with AJAX
In theory, all you'd have to do would be to copy that folder over to any website and run install.php to get it set up. Any page you want the widget to run on would simply include the .js file, and it will do all the work.
Of course, that's just how I would set it up. I assume that changes in programming languages, or setup specifics will vary. But, the basic idea holds similar for most website plugins.
Oh, and one more thing. If you were intending to sell the widget, it would be extremely difficult to secure all of those scripts from redistribution. Your best bet would be to keep the PHP files on your own server. The client would need to have their own db_connect.php that connects to their own database and all, but the actual AJAX requests would need to refer to the files on your remote server. The request would need to send the URL of the valid db_connect, with some type of password or something. Actually, come to think of it, I don't think it's possible to do remote server file sharing. You'd have to research it a bit more, 'cuz I certainly don't know how you'd do it.
I like Azmisov's solution, but it has some disadvantages.
Sites might not want your code on their servers. It'd be much better if you switched from AJAX to loading scripts (e.g. jQuery's getJSON).
So I suggest:
Include jQuery hosted on Google, plus a short jQuery script from your domain, on the client sites. Nothing more.
Write the script with cross-domain calls to your server (through getJSON or getScript) so that everything is fetched directly and nothing has to be installed on the client's server. See here for examples, and see the sketch after this list. Adding content to the page is easy enough with jQuery (just one command) that I won't cover it here.
Distribute easily by providing two lines of <script src=...></script>.
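A sketch of step 2 (the endpoint URL, the JSON shape, and the #store container are all assumptions):

// Cross-domain JSONP call; jQuery treats callback=? as a JSONP request
$.getJSON('https://yourdomain.com/store.json?site_id=123&callback=?',
    function(data){
        var $store = $('#store'); // container provided by the host page
        $.each(data.products, function(i, p){
            $store.append($('<a>').attr('href', p.url).text(p.title));
        });
    });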
I have a div on my page whose content is loaded from the server by AJAX, but in this scenario Google and other search engines don't index the content of the div. The only solution I see is to recognize when the page is fetched by a search robot and return the complete page without AJAX.
1) Is there a simpler way?
2) How do I distinguish humans from robots?
You could also provide a link to the non-AJAX version in your sitemap, and when you serve that file (to the robot), make sure to include a canonical link element pointing to the "real" page you want users to see:
<html>
<head>
[...]
<link rel="canonical" href="YOUR_CANONICAL_URL_HERE" />
[...]
</head>
<body>
[...]
YOUR NON_AJAX_CONTENT_HERE
</body>
</html>
edit: if this solution is not appropriate (some comments below point out that this solution is non-standard and only supported by the "big three"), you might have to rethink whether you should make the non-AJAX version the standard solution and use JavaScript to hide/show the information instead of fetching it via AJAX. If business-critical information is being fetched, you have to realize that not all users have JavaScript enabled, and thus they won't be able to see it. A progressive-enhancement approach might be more appropriate in this case.
Google gets antsy if you try to show different things to your users than to crawlers. I suggest simply caching your query (or whatever it is that needs AJAX) and then using AJAX to replace only what you need to change. You still haven't really explained what's in this div that only AJAX can provide. If you can do it without AJAX then you should, not just for SEO but for braille readers, mobile devices, and people without JavaScript.
You can specify a sitemap in your robots.txt. That sitemap should be a list of your static pages. You should not be giving Google a different page at the same URL, so you should have different URLs for static and dynamic content. Typically, the static URL is something like .../blog/03/09/i-bought-a-puppy and the dynamic URL is something like .../search/puppy.
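For reference, the robots.txt line is just this (the URL is a placeholder):

Sitemap: http://www.example.com/static-sitemap.xml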