Making a server-side "localhost" API call in JavaScript

In JavaScript, it is possible to load server-side files via HTML or JavaScript. I am trying to do the same thing, except via an API call instead of reading a file directly (since this file is automatically generated).
I'm hosting a Flask server bound to http://0.0.0.0:5000/ on a remote Linux server. On the server itself I can make a GET request from the command line with wget http://localhost:5000/data and it returns the correct result.
However, if I make an ajax call from an HTML page, it instead tries to GET the data from my local computer's localhost rather than the server's localhost.
For example, I have a simple index.html on the server, served by nginx:
<script
src="https://code.jquery.com/jquery-3.2.1.min.js"
integrity="sha256-hwg4gsxgFZhOsEEamdOYGBf13FyQuiTwlAQgxVSNgt4="
crossorigin="anonymous">
</script>
<script>
var url = "http://127.0.0.1:5000/data"
$.ajax({
dataType: "json",
url: url,
success: function( data ) {
console.log(data)
}
});
</script>
But it doesn't call the server's localhost; it calls the localhost of my computer (i.e. it returns results only if I am also running the API service locally, and ERR_CONNECTION_REFUSED otherwise). How can I make it call the server's localhost? Is this even possible?
An obvious workaround would be to process the file on the server, save it to disk, and then fetch that file directly via ajax, but I would like to avoid this if possible.
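For context, the $.ajax call runs in the visitor's browser, so http://127.0.0.1:5000 points at the visitor's machine, not the server. A minimal sketch of one way around this, assuming nginx is configured to proxy a path such as /data through to the Flask app (that proxy config is an assumption, not shown in the question): request the endpoint with a relative URL so the browser always talks to the host that served the page.
// "/data" resolves against the page's own origin, i.e. the remote server
var url = "/data";
$.ajax({
    dataType: "json",
    url: url,
    success: function (data) {
        console.log(data);
    }
});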

Related

Execute JS in server side with PHP [duplicate]

For a project at school I am trying to make a website that can show your grades in a prettier way than is being done now.
I have been able to log in to the site using cURL, and now I want to get the grades into a string so I can process it with PHP.
The only problem is that cURL fetches the HTML source code before it has been modified by the JavaScript that loads the grades.
So basically I want the DOM you see when you open Firebug or the inspector, as a string, so I can process it with PHP.
Does anyone have an idea how to do this? I have seen several posts saying that you have to wait until the page has loaded, but I have no clue how to make my site wait for a third-party site to load.
The code I am waiting to be executed, and whose result I want, is this:
<script type="text/javascript">
var widgetWrapper = $("#objectWrapper325");
if (widgetWrapper[0].timer !== undefined) {
    clearTimeout( jQuery('#objectWrapper325')[0].timer );
}
widgetWrapper[0].timer = setTimeout( function() {
    if (widgetWrapper[0].xhr !== undefined) {
        widgetWrapper[0].xhr.abort();
    }
    widgetWrapper[0].xhr = jQuery.ajax({
        type: 'GET',
        url: "",
        data: {
            "wis_ajax": 1,
            "ajax_object": 325,
            'llnr': '105629'
        },
        success: function(d) {
            var goodWidth = widgetWrapper.width();
            widgetWrapper.html(d);
            /* update width, needed for bug with standard template */
            $("#objectWrapper325 .result__overview").css('width', goodWidth - $("#objectWrapper325 .result__subjectlabels").width());
        }
    });
}, 500 + (Math.random() * 1000));
</script>
First you have to understand a subtle but very important difference between using cURL to get a web page and using your browser to visit that same page.
1. Loading a page with a browser
When you enter the address in the location bar, the browser resolves the URL to an IP address. Then it contacts the web server at that address, asking for a web page. From then on the browser speaks only HTTP with the web server. HTTP is a protocol for carrying documents over a network. The browser is actually asking for an HTML document (a bunch of text) from the web server. If the page is static, the web server just picks up an HTML file and sends it over the network. If it's a dynamic page, the web server uses some higher-level code (like PHP) to generate the web page, then sends it over.
Once the web page has been downloaded, the browser parses it and interprets the HTML inside, which produces the actual rendered page. During parsing, when the browser finds script tags it interprets their content as JavaScript, a language used in the browser to manipulate the look of the web page and do things inside the browser.
Remember, the web server only sent a web page containing HTML content; it has no clue what JavaScript is.
So when you load a web page in a browser, the JavaScript is ONLY interpreted once it has been downloaded by the browser.
2. What is cURL
If you take a look at the curl man page, you'll learn that curl is a tool to transfer data from/to servers using one of several supported protocols, HTTP among them.
When you download a page with curl, it downloads the page the same way your browser does, but it does not parse or interpret anything. cURL does not understand JavaScript or HTML; all it knows is how to talk to web servers.
3. Solution
So what you need in your case is to download the page the way cURL does it, and also somehow get the JavaScript interpreted as if it were inside a browser.
If you have followed me up to here, then you're ready to take a look at CasperJS.
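For a concrete starting point, here is a minimal CasperJS sketch; the URL is a placeholder and the selectors are borrowed from the snippet above, so treat both as assumptions. It loads the page, waits until the page's own JavaScript has filled the widget in, and prints the rendered HTML, which PHP can then consume:
var casper = require('casper').create();

// load the grades page; replace with the real (logged-in) URL
casper.start('https://example.com/grades');

// wait until the page's own ajax call has populated the widget
casper.waitForSelector('#objectWrapper325 .result__overview', function () {
    // dump the rendered HTML of the widget
    this.echo(this.getHTML('#objectWrapper325'));
});

casper.run();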

External URL linking

I have the following code:
<script>
var fileContent;
$.ajax({
url : "text.txt",
dataType: "text",
...
It loads the text from a .txt file in order to obtain the data. If text.txt is in the same path as the HTML file, it loads the data. However, if I place the file in a different folder and type, for example:
url: "../../../files/text.txt"
it does not let me obtain the file. Any ideas how to do it, or how to implement it without changing the code significantly? Thanks!
There are three possible causes of this:
You are using HTTP
You are using HTTP and the path you are trying to access is not exposed by your web server. (You cannot access files above the directory root by default).
You need to give the file a URL on your web server and request that URL.
You are using local files
Different browsers have different security restrictions for Ajax on local files.
You can get this problem if the file is in a directory above the HTML document (some browsers only allow you to access files in the same or a lower directory).
You can resolve this by using HTTP. Ajax in general works very poorly without HTTP.
You simply have the URL wrong
Correct the URL.
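For the first cause, a minimal sketch of the fix (the paths here are hypothetical): expose the file under the server's document root and request it by a root-relative URL instead of climbing out of the root with ../:
// assumes text.txt has been moved to <document root>/files/text.txt
$.ajax({
    url: "/files/text.txt",
    dataType: "text",
    success: function (fileContent) {
        console.log(fileContent);
    }
});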

Import XML with jQuery: works on server, not locally

I'm working on a script that reads an XML file and then outputs the data. It works perfectly when it runs on my web server, but won't run from my local machine. (The "542Data.xml" file is stored in the same folder as the HTML page on both the server and my computer, and I checked that all file versions are the same. I've tried it in Firefox and Chrome with the same results.)
<div id="output"></div>
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>
<script>
$(document).ready(function()
{
    $.ajax({
        type: "GET",
        url: "542Data.xml",
        dataType: "xml",
        success: parseXml
    });
});

function parseXml(xml)
{
    $(xml).find("point").each(function(index)
    {
        $("#output").append("Name: " + $(this).attr("name") + "<br />");
    });
}
</script>
The XML is structured as:
<?xml version="1.0"?>
<destinations>
    <point name="Tot Lot at Bryan Park">
        <lat>39.15611</lat>
        <long>-86.52664</long>
        <type>outdoors</type>
    </point>
    <point name="Playground at Cascades Park">
        <lat>39.19633</lat>
        <long>-86.53581</long>
        <type>outdoors</type>
    </point>
</destinations>
What do I need to change to get this working locally?
EDIT: I was wrong, it's working in Firefox. (embarrassed!)
Your script works fine for me in Firefox.
Chrome has a security feature that disallows what you want to do (using file:/// for AJAX requests). You need to start your browser with:
chrome --disable-web-security
to disable security checks. (--allow-file-access-from-files might also do the trick, but I haven't tested it.)
Warning: disabling security checks affects your browser's security and should only be done temporarily for development. If you plan to run the code from your local machine for a prolonged period, consider installing a web server on your local machine.
If by "working locally" you mean you have the html and xml file in a folder and open the HTML file by double clicking on it, then there is no way.
In order for it to work locally you need an web server which will resolve http requests. Opening a local file on a file system is not what is happening here. .ajax() is making a server request. Without a server it won't work.
What are you using to develop? Check if the development server you are using can serve XML files.
According to the given (small) amount of info, there may be a security reason, i.e. importing jQuery from Google's repository. Please give more code, or look at the error console in Firefox (Ctrl+Shift+J) and copy-paste the error if there is one, or just download jQuery and include it from a local path.
It runs on the server but not on your machine because the ajax request needs a local server running. To make it work, start a local server on your machine. For example, if you're on Windows, download WAMP; if on Linux, install LAMP. Put your files in the www folder and start the local server, then access your file using localhost/your_file_name. That'll give you the result you want.

Why is there no way to download a file using an ajax request?

In our application we need to implement the following scenario:
A request is sent from the client
The server handles the request and generates a file
The server returns the file in the response
The client browser displays the file download popup dialog and allows the user to download the file
Our application is ajax-based, so it would be very easy and convenient for us to send an ajax request (e.g. using the jQuery.ajax() function).
But after googling, it turned out that file downloading is only possible with a non-ajax POST request (as described in this popular SO thread). So we had to implement an uglier and more complex solution that required building an HTML form with nested hidden fields.
Could someone explain in simple words why ajax requests cannot be used to download a file? What's the mechanism behind that?
It's not about AJAX. You can download a file with AJAX, of course; however, the file will be kept in memory, i.e. you cannot save it to disk, because JavaScript cannot interact with the disk. That would be a serious security issue, and it is blocked in all major browsers.
This can be done using the HTML5 Blob feature. There is a library, FileSaver.js, that can be used as a wrapper on top of that feature.
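As a rough sketch of that approach (the endpoint and filename here are made up): fetch the response as a Blob, then hand it to FileSaver.js's saveAs():
// assumes FileSaver.js is loaded; '/api/report' is a hypothetical endpoint
var xhr = new XMLHttpRequest();
xhr.open('GET', '/api/report');
xhr.responseType = 'blob'; // keep the response body as binary data
xhr.onload = function () {
    if (xhr.status === 200) {
        saveAs(xhr.response, 'report.pdf'); // saveAs() comes from FileSaver.js
    }
};
xhr.send();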
That's the same question I asked myself two days ago. There was a project with a client written in ExtJS and a server side implemented in ASP.NET, and I had to port the server side to Java. There was a function to download an XML file that the server generates after an Ajax request from the client. We all know that it's supposed to be impossible to download a file after an Ajax request, only to store it in memory. But... in the original application the browser showed the usual dialog with open, save and cancel options. ASP.NET somehow changed the standard behaviour... It took me two days to prove it again: there is no way to download a file from such a request the usual way... the only exception being ASP.NET. Here is the ASP.NET code:
public static void WriteFileToResponse(byte[] fileData, string fileName)
{
    var response = HttpContext.Current.Response;
    var returnFilename = Path.GetFileName(fileName);
    var headerValue = String.Format("attachment; filename={0}",
        HttpUtility.UrlPathEncode(
            String.IsNullOrEmpty(returnFilename)
                ? "attachment" : returnFilename));
    response.AddHeader("content-disposition", headerValue);
    response.ContentType = "application/octet-stream";
    response.AddHeader("Pragma", "public");
    var utf8 = Encoding.UTF8;
    response.Charset = utf8.HeaderName;
    response.ContentEncoding = utf8;
    response.Flush();
    response.BinaryWrite(fileData);
    response.Flush();
    response.Close();
}
This method was called from a WebMethod, which, in turn, was called from Ext.Ajax.request. That's the magic. As for me, I ended up with a servlet and a hidden iframe...
You can do this by using a hidden iframe in your download page.
Just set the src of the hidden iframe in your ajax success response and your task is done...
$.ajax({
    type: 'GET',
    url: './page.php',
    data: $("#myform").serialize(),
    success: function (data) {
        // point the hidden iframe at the file's URL; the browser then
        // shows the download dialog (this assumes the response body
        // contains that URL)
        $("#middle").attr('src', data);
    }
});
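For completeness, this approach assumes a hidden iframe with that id already exists somewhere in the page, for example:
<iframe id="middle" style="display: none;"></iframe>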

External JSON data with offline development

I am developing a web app that accesses some external JSON data. I'm currently using jQuery's getJSON to get the data and call the callback.
My internet at home is terrible, so I'm regularly not connected. I am looking for a way to develop this app while disconnected from the internet.
My initial thought was to have an OFFLINE variable that I set, which changes the location of the scripts to a local file, but because jQuery's getJSON uses dynamically named functions for callbacks, it would need some server intelligence.
More info on how getJSON callbacks work here: http://docs.jquery.com/Ajax/jQuery.getJSON
I'm sure there's an easier way. Any suggestions?
EDIT: Let me try to clarify a bit.
I'm currently running a local web server; I have to, since script tags can't reference a local file for security reasons.
I'm currently calling getJSON with the URL http://twitter.com/status/user_timeline/user.json?callback=?
If I downloaded that JSON response and hosted it on the local web server, it wouldn't work, because the callback name changes every time, yet the feed keeps the function name it was originally fetched with.
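One way to make such a saved copy usable (a sketch of my own, not from the answers below) is to pin the callback name with jQuery's jsonpCallback option, so the name no longer changes between requests; the saved file then just has to wrap the JSON in that fixed name, e.g. handleTimeline([...]):
$.ajax({
    url: "http://localhost/user.json", // the locally hosted copy of the feed
    dataType: "jsonp",
    jsonpCallback: "handleTimeline", // fixed name instead of jQuery's generated one
    success: function (data) {
        // same handling as with the live feed
    }
});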
I have a similar problem. Try XAMPP for an easy PHP/Apache/MySQL install on your machine.
I use DreamHost to host my site. I manage everything with a Subversion repository, which lets me simply run 'svn update' on my live site when I am ready to pull in my changes.
I also define all my paths relative to a base_url variable, which is set depending on the HTTP host, so I don't have to change anything for my site to run on different web servers. I use CodeIgniter, and my config file looks like this:
switch ($_SERVER['HTTP_HOST']) {
    case "claytonhp":
        $config['base_url'] = "http://claytonhp/<project_url>";
        break;
    // etc.
}
To use that same path in my javascript, I put the following at the top of each html file:
<script type="text/javascript">
siteUrl = '<?= base_url();?>';
</script>
<script type="text/javascript" src="<?= base_url();?>public/scripts/external/jquery/jquery.js"></script>
<!-- Local functionality -->
<script type="text/javascript" src="<?= base_url();?>public/scripts/common.js"></script>
<!-- etc -->
Then, my jquery ajax calls look like this:
$.ajax({
    type: "POST",
    url: siteUrl + "index.php/ajax_controller/getSomeData",
    dataType: "json",
    data: "id=5",
    success: successCallback,
    error: errorCallback
});
Just use a web server (IIS is built into Windows; otherwise use Apache or XAMPP). That way, you're always connected to your web site (use http://localhost/...).
Quick solution is to just run a local web server. This is a good idea for all sorts of reasons.
If you don't want to do that, just define the URL to get the JSON from somewhere global, and pass it to getJSON(). Just don't forget to set it back before you put your code up on the server.
I used a local Sinatra web server and replaced the hosts in my /etc/hosts file. It's nice because it's super easy to define new services.
I often forget to reset my hosts file, which can cause a lot of frustration, so I created a script to wrap the whole thing as well.
Here's an example that will serve up a twitter user feed.
run.sh
#!/bin/bash
# back up /etc/hosts, apply the offline overrides, run the fake server,
# then restore the original hosts file when the server exits
cp /etc/hosts /etc/hosts.original
cat offline_hosts >> /etc/hosts
ruby server.rb -p 80
cp /etc/hosts.original /etc/hosts
offline_hosts
127.0.0.1 twitter.com
server.rb
#!/usr/bin/ruby
require 'sinatra'

# twitter user
# http://twitter.com/status/user_timeline/$USER.json?callback=?
get '/status/user_timeline/:username.json', :host_name => /twitter\.com/ do
  render_file "feeds/#{params[:username]}.json"
end

# wrap the file contents in the requested JSONP callback, if any
def render_file filename
  output = File.open(filename).read
  output = "#{params[:callback]}(#{output});" if params[:callback]
  output
end
