Express using a patternless dynamic base URL to render different pages - javascript

I am interested in having a route that responds to a request with a file (e.g. Express's res.sendFile()) based on the URL's base parameter, i.e. www.example.com/:parameter. The problem is that the URLs are completely user generated and completely dynamic. Similar to GitHub: www.github.com/username could render a user's profile and www.github.com/project could render a project, but both are arbitrary strings with no pattern, and the machine has no way of knowing that www.github.com/username refers to a user view unless it does some type of check.
app.all('/*', function(req, res) {
  res.sendFile('index.html', { root: config.server.distFolder });
});
GitHub responds to server requests with different views based on the parameter, even though the parameters have no predefined pattern.
i.e. it would be easy to know that www.github.com/user/username is a user route and the server could respond with a user view (the pattern to match would be www.github.com/user/:user). But when the string is completely dynamic it becomes more difficult: how does Express know whether to respond with a user view or a project view for a URL like example.com/cococola?
I believe you would somehow be able to check the URL parameter, understand that it refers to (in this case) either a project or a user's page, and then render that view. How do you do this without making a synchronous call and forcing the user to wait for the server to check what view-type the parameter string refers to before responding?
I'm using Angular. Are there other ways to respond to server requests with different pages based on URLs that have no predeterminable matching pattern? The reason is that I would like to separate my site into many different apps: www.example.com/username might require a user's profile SPA, whereas www.example.com/projectname might require a user's project SPA. But, as these are user defined, there is no way to respond based on a matching pattern for the parameter. I would like to keep the URL as minimal as possible :-)
Any help is appreciated. Thanks :-)

Just use a database or some kind of key-value store where the key is the URL parameter and the value is the view type. Then you just need to do a simple lookup.
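As a minimal Express sketch of that lookup (the lookupType function, the stand-in routeTypes data, and the dist/ paths are placeholders for whatever store and layout you actually use):

const express = require('express');
const path = require('path');
const app = express();

// Hypothetical lookup against your store (Redis, Mongo, a plain object, ...)
async function lookupType(slug) {
  const routeTypes = { octocat: 'user', linguist: 'project' };   // stand-in data
  return routeTypes[slug] || null;
}

app.get('/:slug', async function (req, res) {
  const type = await lookupType(req.params.slug);
  if (type === 'user') {
    res.sendFile(path.join(__dirname, 'dist/user-app/index.html'));
  } else if (type === 'project') {
    res.sendFile(path.join(__dirname, 'dist/project-app/index.html'));
  } else {
    res.status(404).send('Not found');
  }
});

app.listen(3000);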

Related

(Node.js) How to get URL of a binary file stored in server?

I am developing a service using node.js, which returns a url to a file stored in my service's database. In my program, I save the file to, for example, "./aFolder/filename.jpg". I'm going to use DigitalOcean.
My questions are:
1. What is the form of such a url?
2. How can I get that url in my code using node.js?
Thank you in advance.
The URL will be whatever you want to make it since you'll presumably be the one writing the code to handle incoming requests.
Something like http://uri/download/aFolder/filename.jpg seems like a reasonable choice, but you'd need to write your app to accept those paths.
You may want to look into Express, where you can add route handlers via app.route() so that anything under /download gets processed by a particular callback. That callback maps the URL onto your filesystem and will likely send the correct file over using res.download().
A basic skeleton might be:
app.route('/download/*')
  .all(function(req, res) {
    // req.params[0] is everything after /download/; validate it before using it in a real app
    res.download(req.params[0]);
  });

How to obtain the full url of an Angular view in Global.asax

In an ASP.NET MVC application, there is Request.Url to access the URL in Global.asax.
But for an Angular application, a URL looks like http://domain/#/home. The Request.Url we obtain from Application_BeginRequest or Application_EndRequest is http://domain/; the Angular routes are not included.
That is reasonable, because those routes are added on the client side. But is it possible to get the value of the full URL on the MVC server side?
Update:
Just picked Matteo's answer as the correct one. Let me clarify a bit.
I have been trying this for one purpose: rewrite my url.
In the past, I used to check Request.ApplicationPath and manipulate the URL with string functions or built-in tools like VirtualPathUtility.
The need for the hash part is valid because the query string parameters are appended there. For example, I have a url like this:
http://[domain]/#/pay/cancel?paymentId=[some guid]
The conventional wisdom brought me to Global.asax to access those query parameters. I found none. Everything behind the hash tag is conveniently ignored.
So the correct way is to handle that part of the URL in the client-side code. I am using ui-router, so for URL rewrites/redirects, use $urlRouterProvider.when(oldUrl, newUrl);. To access query parameters, use $state.params.
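A minimal sketch of that client-side handling, assuming angular-ui-router; the state name, template, and redirect URLs are invented for illustration:

angular.module('app', ['ui.router'])
  .config(function ($stateProvider, $urlRouterProvider) {
    // Redirect an old hash URL to a new one instead of rewriting it server side
    $urlRouterProvider.when('/pay/complete', '/pay/cancel');

    // Everything after the # (including ?paymentId=...) is handled here,
    // never in Request.Url
    $stateProvider.state('payCancel', {
      url: '/pay/cancel?paymentId',
      templateUrl: 'pay-cancel.html',
      controller: function ($scope, $state) {
        $scope.paymentId = $state.params.paymentId;
      }
    });
  });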
Lesson learned: think clearly and approach different problem with different mindset.
There is no way, as the "hash" part of a URL is never sent to the server; it is not really part of the URL from the server's point of view. Have you ever used anchors in a page to create an index? The concept is the same.
Anyway I can't possibly imagine how the hash part could be useful server side. My guess is that you think it's useful because you're approaching a problem the wrong way.
If you complement your question with more details, like what you're trying to achieve, it's very possible we can provide you with an appropriate solution.

Efficient way to pass arrays in url

I am building a webapp and have a few arrays that I would like to pass through the URL in order to make the results of my application easily sharable.
Is there an efficient way to do this? I know a lot of websites (like YouTube) use some sort of encoding to make their URLs shorter; would that be an option here?
Thanks in advance!
What I suspect you're asking is this: you have some page where the user can alter information, and you want a way to create a URL on the fly with that information so it can easily be accessed again. I've listed two approaches here:
Use the query string. On your page you can have a button saying "save" that produces a URL with info about what the user did. For example, if I have a webpage where all I do is put my name in and select a color, I can encode that as http://my-website.com/page?name=John_Doe&color=red. Then, if I visit that link, your page could access the query object in JavaScript and load a page with the name and color field already set.
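A rough browser-side sketch of that query-string approach; the page URL and parameter names are made up:

// Build a shareable URL from the current state
var name = 'John_Doe';
var colors = ['red', 'blue'];
var shareUrl = 'http://my-website.com/page' +
  '?name=' + encodeURIComponent(name) +
  '&colors=' + encodeURIComponent(colors.join(','));

// Restore the state when the page loads
var params = new URLSearchParams(window.location.search);
var restoredName = params.get('name');
var restoredColors = (params.get('colors') || '').split(',');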
An approach for the "YouTube-style" URLs would be to create a hash of the relevant information corresponding to the page. For example, suppose I were creating a service for users to store plaintext files, where each file has the following attributes: title, date, name, and body. We can create a hash of the string: hash_string = someHashFunction(title+date+name).
Of course, this is a very naive hashing scheme, but something like this may be what you are looking for. Following this, your URL would be something like http://my-website.com/hash_string. The key here is not only creating these URLs, but having a means to route requests on the server side to the page corresponding to the hash_string.
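A loose sketch of that idea using Node's built-in crypto module; the field values and the length of the truncated id are arbitrary choices for illustration:

const crypto = require('crypto');

function makeShareId(title, date, name) {
  // Naive, as the answer says: nothing here guarantees uniqueness, so a real
  // app would check the generated id against its database before using it.
  return crypto.createHash('sha256')
    .update(title + date + name)
    .digest('hex')
    .slice(0, 11);
}

// The server would then route GET /<shareId> back to the stored page,
// e.g. with an Express route like app.get('/:shareId', ...) backed by a lookup.
console.log(makeShareId('My notes', '2015-06-01', 'John_Doe'));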

Multiple URL parameters and rails/backbone.js

I have just begun to port a layered single-page JS app onto Backbone.js and was trying to understand how to handle composite URL parameters with routes and splats in Backbone.js. The backend is Rails and sends JSON.
There are various entities (models) like filters, dimensions, features, questions which can be passed via request parameters.
URL 1
/display/#widget?id=42&fon=1,2,4&foff=6,9,19&q=1a2bc3abc4d
URL 2
/display/#widget?id=42&compare=345,567,90&fon=1,2,4&foff=6,9,19&q=1a2bc3abc4d
How do I structure these non-RESTful URLs, keep the same functionality, and allow bookmarkability?
Thanks
Backbone's router, for the purpose of invoking views, cares only about the hash portion of window.location. However, it does keep track of the search portion for the purpose of maintaining the browser history.
Therefore, the decision about bookmarkability is your responsibility: the hash will invoke a specific route, and what views that route hides or shows is up to you. How those views parse the search string and react is also up to you.
I can see what you want to do: change a model through the search portion, then render it. It's a bit of a chained trigger: hash-change -> model-sync -> show-view. Structuring that sounds like it'll be fun, but Backbone is capable.
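Purely as an illustration (not from the answer above), here is one way a Backbone router could capture and parse the query part of a fragment like #widget?id=42&fon=1,2,4; the Widget model and the comma-splitting convention are assumptions:

var AppRouter = Backbone.Router.extend({
  routes: {
    // The splat captures everything after "widget?"
    'widget?*query': 'showWidget'
  },

  showWidget: function (query) {
    // Turn "id=42&fon=1,2,4&foff=6,9" into { id: '42', fon: ['1','2','4'], ... }
    var params = {};
    (query || '').split('&').forEach(function (pair) {
      var kv = pair.split('=');
      params[kv[0]] = kv[1] && kv[1].indexOf(',') !== -1 ? kv[1].split(',') : kv[1];
    });

    var widget = new Widget({ id: params.id });  // hypothetical model
    widget.fetch({ data: params });              // sync, then a view can render it
  }
});

new AppRouter();
Backbone.history.start();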

How to prevent direct access to my JSON service?

I have a JSON web service to return home markers to be displayed on my Google Map.
Essentially, http://example.com calls the web service to find out the location of all map markers to display like so:
http://example.com/json/?zipcode=12345
And it returns a JSON string such as:
{"address": "321 Main St, Mountain View, CA, USA", ...}
So on my index.html page, I take that JSON string and place the map markers.
However, what I don't want to have happen is people calling out to my JSON web service directly.
I only want http://example.com/index.html to be able to call my http://example.com/json/ web service ... and not some random dude calling the /json/ directly.
Question: how do I prevent direct calling/access to my http://example.com/json/ web service?
UPDATE:
To give more clarity: http://example.com/index.html calls http://example.com/json/?zipcode=12345 ... and the JSON service
- returns semi-sensitive data,
- returns a JSON array,
- responds to GET requests, and
- the browser making the request has JavaScript enabled.
Again, what I don't want to have happen is for people to simply look at my index.html source code and then call the JSON service directly.
There are a few good ways to authenticate clients.
By IP address. In Apache, use the Allow / Deny directives.
By HTTP auth: basic or digest. This is nice and standardized, and uses usernames/passwords to authenticate.
By cookie. You'll have to come up with the cookie.
By a custom HTTP header that you invent.
Edit:
I didn't catch at first that your web service is being called by client-side code. It is literally NOT POSSIBLE to prevent people from calling your web service directly if you let client-side JavaScript do it. Someone could just read the source code.
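As a minimal Express sketch of the custom-header option in the list above (the header name and token are invented, and, per the caveat in the edit, this only deters casual callers because the value is visible to anyone reading the client code):

const express = require('express');
const app = express();

app.use('/json', function (req, res, next) {
  if (req.get('X-App-Token') === 'some-shared-secret') return next();
  res.status(403).send('Forbidden');
});

// The page's own XHR would then send the header explicitly, e.g.
// xhr.setRequestHeader('X-App-Token', 'some-shared-secret');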
Some more specific answers here, but I'd like to make the following general point:
Anything done over AJAX is being loaded by the user's browser. You could make a hacker's life hard if you wanted to, but, ultimately, there is no way of stopping me from getting data that you already freely make available to me. Any service that is publicly available is publicly available, plain and simple.
If you are using Apache you can set allow/deny on locations.
http://www.apachesecurity.net/
or here is a link to the apache docs on the Deny directive
http://httpd.apache.org/docs/2.0/mod/mod_access.html#deny
EDITS (responding to the new info).
The Deny directive also works with environment variables. You can restrict access based on browser string (not really secure, but discourages casual browsing) which would still allow XHR calls.
I would suggest the best way to accomplish this is to have a token of some kind that validates the request is a 'good' request. You can do that with a cookie, a session store of some kind, or a parameter (or some combination).
What I would suggest for something like this is to generate a unique url for the service that expires after a short period of time. You could do something like this pretty easily with Memcache. This strategy could also be used to obfuscate the service url (which would not provide any actual security, but would raise the bar for someone wanting to make direct calls).
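Purely as an illustration of that short-lived URL idea (a plain in-memory Map stands in for Memcache, and the route, lifetime, and sample data are arbitrary):

const express = require('express');
const crypto = require('crypto');
const app = express();
const liveTokens = new Map();   // token -> expiry timestamp; Memcache would replace this

// Called while rendering index.html, so the page gets a URL that soon expires
function issueJsonUrl(zipcode) {
  const token = crypto.randomBytes(8).toString('hex');
  liveTokens.set(token, Date.now() + 60 * 1000);   // valid for one minute
  return '/json/' + token + '?zipcode=' + encodeURIComponent(zipcode);
}

app.get('/json/:token', function (req, res) {
  const expiry = liveTokens.get(req.params.token);
  liveTokens.delete(req.params.token);             // single use
  if (!expiry || expiry < Date.now()) {
    return res.status(403).json({ error: 'expired or unknown URL' });
  }
  res.json([{ address: '321 Main St, Mountain View, CA, USA' }]);
});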
Lastly, you could also use public key crypto to do this, but that would be very heavy. You would need to generate a new pub/priv key pair for each request and return the pubkey to the js client (here is a link to an implementation in javascript) http://www.cs.pitt.edu/~kirk/cs1501/notes/rsademo/
You can add a random number as a flag to determine whether the request is coming from the page you just sent (see the sketch after the steps):
1) When generating index.html, add a random number to the JSON request URL:
Old: http://example.com/json/?zipcode=12345
New: http://example.com/json/?zipcode=12345&f=234234234234234234
Add this number to the session context as well.
2) The client browser renders index.html and requests the JSON data via the new URL.
3) Your server gets the JSON request and checks the flag number against the session context. If it matches, respond with the data. Otherwise, return an error message.
4) Clear the session context at the end of the response, or when a timeout is triggered.
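That sketch, in Express and assuming express-session for the session context; renderIndex is a placeholder for however you actually template index.html:

const express = require('express');
const session = require('express-session');
const crypto = require('crypto');
const app = express();

app.use(session({ secret: 'change-me', resave: false, saveUninitialized: true }));

// Placeholder templating: a real app would render index.html with the URL embedded
function renderIndex(opts) {
  return '<script>fetch(' + JSON.stringify(opts.jsonUrl) + ').then(function (r) { return r.json(); });</script>';
}

// Step 1: mint the flag while serving index.html and embed it in the JSON URL
app.get('/', function (req, res) {
  req.session.jsonFlag = crypto.randomBytes(16).toString('hex');
  res.send(renderIndex({ jsonUrl: '/json/?zipcode=12345&f=' + req.session.jsonFlag }));
});

// Steps 3 and 4: check the flag, answer, then clear it so it is single use
app.get('/json/', function (req, res) {
  if (req.query.f && req.query.f === req.session.jsonFlag) {
    delete req.session.jsonFlag;
    return res.json([{ address: '321 Main St, Mountain View, CA, USA' }]);
  }
  res.status(403).json({ error: 'invalid or missing flag' });
});

app.listen(3000);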
Accept only POST requests to the JSON-yielding URL. That won't prevent determined people from getting to it, but it will prevent casual browsing.
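A tiny sketch of that restriction in Express: register the handler only for POST, so a plain GET of the URL in a browser falls through to a 404:

const express = require('express');
const app = express();

app.post('/json/', function (req, res) {
  res.json([{ address: '321 Main St, Mountain View, CA, USA' }]);
});

app.listen(3000);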
I know this is old but for anyone getting here later this is the easiest way to do this. You need to protect the AJAX subpage with a password that you can set on the container page before calling the include.
The easiest way to do this is to require HTTPS on the AJAX call and pass a POST variable. HTTPS + POST ensures the password is always encrypted.
So on the AJAX/sub-page do something like
if ($_POST["access"] == "makeupapassword")
{
...
}
else
{
echo "You can't access this directly";
}
When you call the AJAX make sure to include the POST variable and password in your payload. Since it is in POST it will be encrypted, and since it is random (hopefully) nobody will be able to guess it.
If you want to include or require the PHP directly on another page, just set the POST variable to the password before including it.
$_POST["access"] = "makeupapassword";
require("path/to/the/ajax/file.php");
This is a lot better than maintaining a global variable, session variable, or cookie, because some of those persist across page loads, so you have to make sure to reset the state after checking so users can't get accidental access.
Also, I think it is better than page headers because it can't be sniffed, since it is secured by HTTPS.
You'll probably have to have some kind of cookie-based authentication. In addition, Ignacio has a good point about using POST. This can help prevent JSON hijacking if you have untrusted scripts running on your domain. However, I don't think using POST is strictly necessary unless the outermost JSON type is an array. In your example it is an object.
