The URL is entered by end users as a string on my page, so it may point to any domain.
JavaScript on the current page needs to sniff the URL, verify that it is still valid, and return its type as image, video, or audio, also considering the HTML5 video and audio tags and existing Flash embeds. There is no need to wait for the complete file transfer.
Can someone help, starting from the concept? Thanks very much.
I'm aware of the cross-domain problem with AJAX, so I have no idea even how to start.
If what you're asking is:
Given any URL -> look the URL up using a JavaScript AJAX request and determine whether it is a video/audio/image, then, once detected, use the URL accordingly - then you can do something like this:
jQuery and AJAX response header
However, you won't be able to make a request using client-side JavaScript to another domain, as it will require a cross-domain request (where your alternatives are JSONP, or special CORS headers in the response).
You're better off passing the URL to your own server and performing the logic there (via some kind of server-side web request), then passing a payload back to the client with the required information in JSON or something - e.g.
{payload: 'video'}
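A minimal server-side sketch of that idea, assuming Node.js 18+ with Express (the /sniff route and response shape are illustrative, not from the original answer):

// The client POSTs a URL; the server issues a HEAD request and maps the
// Content-Type header to a coarse media type, so no file body is transferred.
const express = require('express');
const app = express();
app.use(express.json());

app.post('/sniff', async (req, res) => {
  const head = await fetch(req.body.url, { method: 'HEAD' });
  const type = head.headers.get('content-type') || '';
  const payload = type.startsWith('image/') ? 'image'
                : type.startsWith('video/') ? 'video'
                : type.startsWith('audio/') ? 'audio'
                : 'unknown';
  res.json({ payload });
});

app.listen(3000);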
Old question, but I recently wrote a utility that might help you out. It's a CORS-enabled MIME-type checker. See the API doc at lecoq.herokuapp.com
Use it like so: example
I have been playing around with the requests library in Python 3 for quite some time now, and have decided to create a test program. For this program, I'm using the website https://ytmp3.cc/ as an example. But it turns out that a lot is going on on the client side, it seems.
Some keys and other values are being generated, and I have been using Firefox's built-in network monitor to figure out in which requests this happens, but without luck.
As far as I know, the requests library can't keep a "page" open and modify the DOM and its content by making more requests.
Could anyone take a look and give a qualified guess on how the special keys are generated, and how I could possibly get these for my own requests?
For example, when loading the web page, the first request made is for the root, and the response contains the page HTML. What I noticed is that at the bottom, there's a URL containing some key and number:
<script id="cs" src="js/converter-1.0.js?o=7_1a-a&=_1519520467"></script>
id: 7_1a-a
number: _1519520467
This is used for making the next request, but then a lot of subsequent requests are made, and some other keys appear as well. But I can't find where these come from, since they are not returned by any request.
I know that when inserting a YouTube link, a request will be made to a URL, as seen below.
https://d.ymcdn.cc/check.php?callback=jQuery33107639361236859977_1519520481166&v=eVD9j36Ke94&f=mp3&k=7_1a-a&_=1519520481168
This returns the following:
jQuery33107639361236859977_1519520481166({"sid":"21","hash":"2a6b2475b059101480f7f16f2dde67ac","title":"M\u00d8 - Kamikaze (Official Video)","ce":1,"error":""})
From this I can construct the download URL, using the hash from above:
https://yyd.ymcdn.cc/ + 2a6b2475b059101480f7f16f2dde67ac (hash) + /eVD9j36Ke94 (youtube video id)
But how do I get
jQuery33107639361236859977_1519520481166&v=eVD9j36Ke94 and 1519520481168,
which I need to create the request?
You can probably save yourself and the operator of that website a lot of headache by just using youtube-dl, specifically with the --extract-audio --audio-format mp3 options. It's probably what that website itself uses.
youtube-dl is written in Python and can easily be used programmatically.
If you insist on sending requests to that website for whatever reason, here's how I'd do it:
callback=jQuery33107639361236859977_1519520481166 specifies the name of the callback for the JSONP request. Any name you provide will be printed back out. For example, passing callback=foo will result in the following response:
foo({...})
You can omit it entirely and the server will serve just a JSON response in this case, which is nice.
_=1519520481168 is just to prevent the response being cached. It's randomly generated, just like the above parameter. The website checks for existence, however, so you have to at least pass something in.
The website, like many, checks for a valid Referer header.
Here's a minimal cURL command line to make a request to that website:
curl 'https://d.ymcdn.cc/check.php?v=eVD9j36Ke94&f=mp3&k=aZa4__&_=1' -H 'Referer: https://ytmp3.cc/'
I have a design issue with my SPA, and hope someone can give me some direction. A user profile page is rendered like this:
The browser fetches /some-username.
The server checks to see whether the request was an XMLHttpRequest or not. It is not, so it simply returns the bundled JavaScript app to the browser to execute.
The JavaScript bundle is executed in the browser; it sees the current URL and makes an AJAX request, again to /some-username.
The server sees the XMLHttpRequest header, looks up the user who has the custom URL "/some-username" and returns the JSON data about the user back to the JavaScript to render.
This feels wrong. The app should be making RESTful requests to /users/:id to fetch the user data. But how can it know the id that corresponds to the user with the URL /some-username?
Is it worth adding an extra HTTP request just to look up the resource identifier? Something like /get_user_id?url=/some-username.
Are you flexible about your API? If so, you may change /some-username to /user-id, or, if you want to include the username, /user-id/username but ignore the username.
As an alternative, it is also common to make requests in a filter form, like /users?username=peter
And feel free to use /users/peter if your username identifies the user, because it's actually the ID (it doesn't have to be an integer) and then your URL is exactly /users/:id
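As a rough sketch of the filter form (the endpoint shapes are illustrative and assume the response is a JSON array of matching users):

// Resolve the username to its canonical id, then use the RESTful route.
$.getJSON('/users', { username: 'peter' }, function (results) {
  var user = results[0];                    // assume usernames are unique
  $.getJSON('/users/' + user.id, function (profile) {
    renderProfile(profile);                 // renderProfile is a placeholder
  });
});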
There is nothing "unRESTful" about /some-username. It's just another resource. The response - I hope - contains the canonical URL /user/id anyway, either as a header or as some kind of "self" link.
That's also how you could achieve your goal. Embed the URL in the page either as JavaScript or as a header equivalent (unfortunately you cannot read the headers of the page request with JavaScript):
<!-- header equivalent; can also use a custom header like X-User-Location -->
<meta http-equiv="Location" content="/user/id">
<!-- JavaScript -->
<script>
  var userURL = '/user/id';
</script>
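A small sketch of reading that embedded value on the client (assuming the <meta> tag above; the selector is just one way to do it):

// Read the canonical user URL embedded in the page, then fetch the resource.
var meta = document.querySelector('meta[http-equiv="Location"]');
var userURL = meta ? meta.getAttribute('content') : null;
$.getJSON(userURL, function (user) {
  // render the profile with the returned JSON
});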
I recommend keeping your current approach.
I have been playing around with the jQuery library the last week or two.
Very handy! I am now playing with the AJAX requests to retrieve things such as the weather, current downloads and more, which have been going well so far!
I have now tried to connect up to my ISP to get my current data usage (peak, off peak etc).
When I use Chrome, I can manually type the variables into the URL and have the required JSON code show in the browser. The issue is, that it seems to return text/html instead of application/json.
When you go into developer tools, it shows text/html. This makes it difficult for me to retrieve the data from my home server using AJAX and JSONP. See here for a failed query (you can still see the text/html output, which is in a JSON format): Failed JSON Query on ISP
My question is, how could I get this data from the server URL, then make it into JSON that jQuery can read?
When I try the .load , $.get functions I run into Cross Origin Issues...
EDIT:Here is the PDF documentation for the API (Download at the bottom of the page)
Notice that I need to append certain values (user / pass / token). My ultimate aim is to have my JS read these values and store them.
The issue is, that it seems to return text/html instead of application/json.
That's a serverside issue. Go and file a bug report.
This make it difficult for me to retrieve the data
Not by itself. You should be able to override how the response is parsed, e.g. in jQuery by using the dataType parameter.
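For example, a minimal sketch that forces jQuery to parse the body as JSON even though the server labels it text/html (the URL is a placeholder and assumes the request is same-origin or goes through a proxy):

$.ajax({
  url: '/usage',       // placeholder endpoint
  dataType: 'json'     // parse the response as JSON regardless of its Content-Type
}).done(function (data) {
  console.log(data);
});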
using AJAX and JSONP
Notice that you cannot use JSONP, as it is not supported by that API (judging from the docs and a simple ?callback=test try). If you want support for that, file a bug report against the service provider.
When I try the .load, $.get functions I run into Cross Origin Issues...
Yes. They don't send CORS headers either. I suspect that this API is only used internally, and by devices that are not subject to a same-origin policy.
how could I get this data from the server URL, then make it into JSON that jQuery can read?
Use a proxy on your own server (that runs in the same domain as your app). It can also fix that content-type header.
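A rough sketch of such a proxy, assuming Node.js 18+ with Express on the same origin as your app (the /usage route and ISP_API_URL are placeholders; real credentials would be added server-side):

// Forward the request server-side and rewrite the Content-Type to application/json.
const express = require('express');
const app = express();

app.get('/usage', async (req, res) => {
  const upstream = await fetch(process.env.ISP_API_URL); // real API URL goes here
  const body = await upstream.text();
  res.type('application/json').send(body);               // fix the content-type header
});

app.listen(3000);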
For more details see also Ways to circumvent the same-origin policy, though most of the methods require cooperation of the service provider (to implement serverside features).
If I understand you correctly, you ask for a certain value and it gives you a string. Most APIs in the world send a string that you have to parse into JSON or some language-native object. I would suggest looking at the Parsing JSON Strings link; it explains how to take well-formatted strings and parse them into readable JSON objects.
var obj = jQuery.parseJSON( '{ "name": "John" }' );
alert( obj.name === "John" );
If you go on further and start using PHP, take a look at Parsing JSON Strings with PHP.
EDIT:
Use the .done() method to grab the text from the other page after the AJAX call:
$.ajax(...).done(function (html) {
  // do what you want with the html from the other page
  var object = $.parseJSON(html);
});
OK, here's my problem. I'm working on this little site called 10winstreak, and I'm trying to detect whether a stream is live or not with JavaScript, because the server we run the site off of can't handle processing every single request with PHP. The basis of detecting whether a stream is live is that you go to its XML file and, if it's live, one of the tags will say something along the lines of true; often the XML file on their site will be empty if a particular stream isn't live. For example, if you have a twitch.tv stream for gamespot, you go to http://api.justin.tv/api/stream/list.xml?channel=gamespot - if it's got stuff in it then it's live, and if not then it's not.
so basically my code looks like this:
function check(URL, term) {
  $.get(URL, function (data) {
    console.log(data);
    // data is whatever the server returns from the request; do whatever is
    // needed with it to show who is live.
    var number = data.search(term);
    if (number > -1) {
      document.write("Live");
    } else {
      document.write("Offline");
    }
  });
}
and URL is a URL that gets passed in, and term is the term to search for in the XML file (usually "true" or "True"). But before anything happens I end up with "XMLHttpRequest cannot load http://api.own3d.tv/liveCheck.php?live_id=6815. Origin (my server's URL) is not allowed by Access-Control-Allow-Origin."
I've looked into it all over the net and I don't seem to be able to find anything that I can use. There's a lot of theory but not enough actual code, and I don't understand the theory well enough to start typing code out. From what I've seen you have two ways to go: use JSONP, or add a line somewhere on your server to allow cross-domain access. I don't fully understand either of these, nor do I know how or what to do. It would be a lot of help for someone to show me what needs to be done to get rid of this error. Of course, if you can explain it to a non-coder like me it would be even more awesome, but at this point, as long as the code works, for all I care it might as well be magic lol.
You can solve it :)
Take a look at xReader
<script src="http://kincrew.github.com/xReader/xReader.full.js"></script>
<script type="text/javascript">
  xReader("http://api.own3d.tv/liveCheck.php?live_id=6815", function(data) {
    alert(data.content);
  });
</script>
I think you need a cache-busting option, but you can be banned from YQL.
I think it's because the path is not relative. You may be calling this from a different domain/subdomain. You can potentially allow other origins to access it, which may open up a security hole, or you can create a proxy locally.
In PHP creating a proxy is easy: http://blog.proxybonanza.com/programming/php-curl-with-proxy/
Now, instead of directing your request straight to that URL, send the request from jQuery to your own local URL and have it access the remote one on the server side.
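For instance, a small sketch of the client side of that (the /proxy/liveCheck path is a hypothetical local route that your server-side proxy would expose):

// Call the same-origin proxy instead of the remote API directly.
$.get('/proxy/liveCheck', { live_id: 6815 }, function (data) {
  console.log(data);
});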
Another option would be to use YQL: http://www.parrisstudios.com/?p=333 (I wrote an article about this a while ago)... In that way you can turn the response into JSON, which can be accessed cross-domain (as can javascript).
You could ask for the API responses to all be returned using a JSONP server and in JSON.
You aren't going to be able to do this via client-side javascript unless they've enabled some way to retrieve their data cross-domain (CORS, JSONP, some flash widgety thing getting read permissions from crossdomain.xml file(s) located on their server...)
Short answer: unless 10winstreak offers a JSONP service, you'll have to do things on the server-side.
Slightly longer answer:
For security reasons browsers won't let you make AJAX requests from www.example.com to www.example2.com (or any other domain except www.example.com). There isn't much you can do about this except use JSONP (and you can only do that if the remote webservice offers it).
Therefore, what you end up needing to do is ask your server "hey what's on that other server?" and (since it's not limited the way a browser is) it can go get the XML from that other server. There are various ways of doing this, either with code or Apache config; not sure what's right for you, but hopefully now you understand the general principle.
P.S. See this question: Wouldn't have been simpler to just discard cookies for cross-domain XHR? if you are curious why browsers do this.
* EDIT *
I just checked out JustinTV's site, and it appears that they already have a PHP library for you to use:
https://github.com/jtvapi/jtv_php_api
This is very likely your best bet (if you want to keep using PHP that is; if not they have libraries for other languages: http://www.justin.tv/p/api).
Here's the problem:
1.) We have a page here: www.blah.com/mypage.html
2.) That page requests a js file from www.foo.com like this...
<script type="text/javascript" src="http://www.foo.com/jsfile.js"></script>
3.) "jsfile.js" uses Prototype to make an Ajax request back to www.foo.com.
4.) The ajax request calls www.foo.com/blah.html. The callback function gets the html response and throws it into a div.
This doesn't seem to work though, I guess it is XSS. Is that correct?
If so, how can I solve this problem? Is there any other way to get my html from www.foo.com to www.blah.com on the client without using an iframe?
It is XSS and it is forbidden. You should really not do things that way.
If you really need to, make your AJAX code call the local code (PHP, ASP, whatever) on blah.com and make it behave like a client: fetch whatever you need from foo.com and return that back to your page. If you use PHP, you can do this with fopen('http://www.foo.com/blah.html', 'r') and then reading the contents as if it were a regular file.
Of course, allow_url_fopen needs to be enabled in your php.ini.
There is a W3C proposal for allowing sites to specify other sites which are allowed to make cross-site queries to them. (Wikipedia might want to allow all requests for articles, say, but Google Mail wouldn't want to allow requests, since this might allow any website open while you are logged into Google Mail to read your mail.)
This might be available at some point in the future.
As mentioned above, JSONP is a way around this. However, the site that you are requesting the data from needs to support JSONP in order for you to use it on the client. (JSONP essentially injects a script tag into the page and provides a callback function that the server calls with the results.)
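A rough sketch of that pattern without any library (the endpoint URL and callback name below are placeholders):

// Define the callback the remote server will wrap its JSON in.
function handleData(data) {
  console.log(data);
}

// Inject a script tag pointing at the JSONP endpoint; the server responds
// with handleData({...}), which then runs in this page.
var script = document.createElement('script');
script.src = 'http://www.foo.com/api?callback=handleData'; // placeholder URL
document.head.appendChild(script);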
If the site you are making a request to does not support JSONP, you will have to proxy the request on your server. As mentioned above you can do this on your own server, or what I have done in the past is use http://www.jsonpit.com, which will proxy the request for you.
One option is to implement a proxy page which takes the needed url as a parameter. e.g. http://blah.com/proxy?uri=http://foo.com/actualRequest
JSONP was partially designed to get around the problem you are having
http://ajaxian.com/archives/jsonp-json-with-padding
jQuery has it in its $.getJSON method:
http://docs.jquery.com/Ajax/jQuery.getJSON
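A minimal sketch using jQuery's JSONP support (the URL is a placeholder; jQuery replaces the "?" in callback=? with a generated callback name):

// jQuery treats "callback=?" as a JSONP request and handles the script injection.
$.getJSON('http://www.foo.com/api?callback=?', function (data) {
  console.log(data);
});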
The proxy approach shown above could become a large security hole. I suggest you verify the site name against a whitelist and build the actual URI being proxied on the server side.
For cross-domain hits, this is a good working example and is now considered somewhat "standard": http://www.xml.com/pub/a/2005/12/21/json-dynamic-script-tag.html
There are other ways as well, e.g. injecting iframes with document.domain altered:
http://fettig.net/weblog/2005/11/28/how-to-make-xmlhttprequest-connections-to-another-server-in-your-domain/
I still agree that the easy way is calling a proxy in the same domain, but then it's not a truly client-side WS call.