I read an article that talks about progressive enhancement for JavaScript, and the author mentioned:
First, build an old-fashioned website
that uses hyperlinks and forms to pass
information to the server. The server
returns whole new pages with each
request.
Now, use JavaScript to intercept those links and form submissions and
pass the information via
XMLHttpRequest instead. You can then
select which parts of the page need to
be updated instead of updating the
whole page.
I'm a little curious: does that mean returning HTML markup from the server instead of JSON, which usually means building the markup on the client side? Is there a disadvantage to this approach?
Also, I notice that applications such as Facebook look pretty crippled when I disable JavaScript (can't post updates, etc.). Does that mean they don't handle graceful degradation properly?
Does progressive enhancement mean no JSON with AJAX?
No, it most certainly does not mean that. If JavaScript is disabled, there is no XMLHttpRequest, so there is no ajax.
Now, use JavaScript to intercept those links and form submissions and pass the information via XMLHttpRequest instead.
The JavaScript bits that intercept links and form submissions can freely change where the requests are made, URL parameters, and so on, which means that ajaxified URLs don't have to be identical to JavaScript-less ones. For example:
<a href="/some/page">linky</a>
could be intercepted and turned into an XMLHttpRequest which is actually made to
/some/page.json, or
/some/page.html?ajax=1, or
/bibbidi/bobbidi/boo (for all that it matters)
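For illustration, here is a minimal sketch of that interception (the a.linky selector, the #content element, and the response shape are all made up for the example); the plain href keeps working for clients without JavaScript:

    document.querySelector('a.linky').addEventListener('click', function (event) {
      event.preventDefault(); // stop the normal full-page navigation

      var xhr = new XMLHttpRequest();
      // The ajax request need not hit the same URL as the link's href:
      xhr.open('GET', '/some/page.json');
      xhr.onload = function () {
        var data = JSON.parse(xhr.responseText); // assumed { body: '...' } shape
        // Update only the part of the page that changed:
        document.getElementById('content').textContent = data.body;
      };
      xhr.send();
    });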
Progressive enhancement means that you start the page with code that will work everywhere, and then progressively add functionality that is accepted by that user's browser. A good example of this is AJAX-type functionality with anchors. When the page loads, you can use URLs in the hrefs so that spiders and non-JavaScript browsers can still get the content. But you also add an onclick that does the AJAX loading. That way both the enabled and disabled clients get the best behavior that they can.
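As a rough sketch of that pattern (the id values and URL are hypothetical), the markup carries a real href for spiders and JavaScript-less browsers, and the enhancement layer takes over only where it can run:

    // Markup (works without JavaScript):
    //   <a id="news-link" href="/news.html">Latest news</a>
    var link = document.getElementById('news-link');
    link.onclick = function () {
      var xhr = new XMLHttpRequest();
      xhr.open('GET', this.href); // same URL the plain link would visit
      xhr.onload = function () {
        document.getElementById('content').innerHTML = xhr.responseText;
      };
      xhr.send();
      return false; // cancel normal navigation for enhanced clients only
    };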
Essentially, progressive enhancement means you place the priority and importance on building a fully working "no-JavaScript" website first, then slowly enhance it by adding JavaScript functionality and then AJAX, while keeping the no-JavaScript features working.
This allows those who have JavaScript disabled to access and use the site as normal.
I am trying to create routes with vanilla JavaScript, but every time I type a URL in the address bar I get an error saying 'Cannot GET /about'. I'd appreciate a link to a tutorial or an answer to this kind of problem, since it is my first time doing this with vanilla JavaScript and I have no clue.
Taking "Vanilla JavaScript" to mean "JavaScript, running in the browser, without the use of third-party libraries":
What you want is not (reasonably) possible.
When you type a URL into the address bar, the browser makes an HTTP request to that URL, and the HTTP server for the origin of the URL (i.e. the scheme + hostname + port) is responsible for delivering something (typically a webpage) back to the client.
You can't substitute client-side JavaScript for that initial request to the HTTP server.
There is an edge case: I think a progressive web app can use a service worker to intercept the request and generate a response internally. This is no good for handling the initial request, though, since the PWA wouldn't be installed at that time.
Generally, when you are writing a single page application you will need two parts for your URL handling.
The first part is the History API. This allows you to write JavaScript which tells the browser:
In response to the click the user just performed, I am going to update the DOM. If you were to visit this URL then you would get the same result as the changes I am making to the DOM, so go ahead and update the address bar to represent that.
It also lets you hook into the browser's back navigation so you can undo those changes if the user clicks back.
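A rough sketch of both halves (the element ids, URL, and render functions are invented for the example):

    function renderHomePage()  { document.getElementById('content').textContent = 'Home'; }
    function renderAboutPage() { document.getElementById('content').textContent = 'About'; }

    document.getElementById('about-link').addEventListener('click', function (event) {
      event.preventDefault();
      renderAboutPage();                                  // update the DOM in place
      history.pushState({ page: 'about' }, '', '/about'); // update the address bar
    });

    // Hook back/forward navigation so the DOM is restored to match the URL:
    window.addEventListener('popstate', function (event) {
      if (event.state && event.state.page === 'about') {
        renderAboutPage();
      } else {
        renderHomePage();
      }
    });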
The second part is where you make sure that the server really does deliver the same content for that other URL.
There are three strategies for achieving this:
Have the server return a more-or-less empty HTML document that checks the URL as it loads and then populates itself entirely with JavaScript. This is a poor approach which might as well just use hash bangs.
Generate all the HTML documents in advance. This is a strategy employed by Gatsby and Next.js. This is very efficient, but doesn't work for frequently updated content.
Generate the HTML documents on demand with server side code. Next.js can do this too.
You can do this when you write vanilla JavaScript (kinda), but it takes a lot of work, since you need to write all the code that runs on Node.js (where you might not count it as vanilla any more) to generate the HTML documents. I strongly recommend using a framework.
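To make the 'Cannot GET /about' error concrete: it goes away as soon as the server answers every route, even with the crude first strategy. A bare-bones Node.js sketch (the file name and port are arbitrary):

    const http = require('http');
    const fs = require('fs');

    http.createServer((req, res) => {
      // Every route gets the single-page shell; the client-side code
      // then reads location.pathname and renders the matching view.
      // Strategies 2 and 3 would instead return real per-URL HTML here.
      res.writeHead(200, { 'Content-Type': 'text/html' });
      fs.createReadStream('index.html').pipe(res);
    }).listen(3000);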
I am building a website which uses a lot of JavaScript. I want to know if a user can edit the JS as well as see it.
For example, I have an AJAX function which calls a.php. Can a user just edit the JS function in Firebug or something similar to change it to b.php, which I don't want to be available to everybody?
Similarly, I call an AJAX function with parameter x. Is it possible for a user to change that parameter to y and then call the function?
Yes. Anything in the user's browser is under the control of the user.
You have control over nothing beyond the edge of your HTTP server.
Anything that is front end (HTML, CSS, JavaScript in any of its forms, or any other client-side scripting language) can be modified, and it is your job as a web developer to expect it to be modified, whether out of a user's curiosity or in an attempt to find vulnerabilities.
That is why, while client-side validations (JavaScript in any form, or just the HTML5 ones) are useful, it is of the utmost importance that you also validate this stuff on the server side, with whatever language you are using (PHP, Ruby, ASP, just to give a few examples).
On Chrome, users can simply press F12 to see your JavaScript/HTML/CSS code and try to modify it. We as web designers/developers do it as well, for inspiration or to check how something works; expect other people with different intentions to do it too.
The same goes for Firefox, Opera, and pretty much any other web browser.
Your job is not to prevent this, but to prevent that when someone changes something on the client side, the server side is ready to respond back in an appropriate way, preventing harm to the information on your servers.
To give a concrete example, that is why people take so much time making sure queries to databases are sanitized and not subject to SQL injection. More information about that here: http://www.unixwiz.net/techtips/sql-injection.html
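As a small sketch of the principle in Node.js (the parameter name and whitelist are invented for the example): whatever the client-side code was supposed to send, the server re-checks it.

    const http = require('http');

    const ALLOWED_ACTIONS = ['list', 'view']; // defined server-side, out of the user's reach

    http.createServer((req, res) => {
      const params = new URL(req.url, 'http://localhost').searchParams;
      const action = params.get('action');

      // The page's JS sends ?action=list, but a user can change that in
      // the dev tools to anything at all, so validate it here regardless:
      if (!ALLOWED_ACTIONS.includes(action)) {
        res.writeHead(400);
        return res.end('Invalid action');
      }
      res.end('OK: ' + action);
    }).listen(3000);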
So no, you can't prevent users from modifying your front-end files; at most you can try some practices I've seen around, like disabling right click (really annoying).
I have a web page whose content must be constructed on the fly. When the user clicks certain parts of the page, it must load information into a special content <div> from a file placed on the server in the same directory as the page.
As far as I understand, with JavaScript I must use AJAX, so I have a question: should I configure the server so that it can handle AJAX requests specifically, or is it just a simple GET request over HTTP which should be supported by any web server anyway?
And my second question: if AJAX is a technology which only works when the server is properly configured, can I do what I need with a simple GET from JavaScript somehow?
Also, if it is easier to use server-side scripting, how can it be done with VBScript?
AJAX requests are very much like usual HTTP requests. So you do not need to configure your server in any special way to make them work.
A usual server should already support at least GET and POST requests.
One thing, that might be important for you, however, is, that as long as there is no other "protection" for the files, everyone can access them directly, too. So in case the AJAX-loaded content contains some kind of user sensitive data, you should put some access control in place!
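For example, loading a file from the same directory really is just a plain GET; the file name and element id below are placeholders:

    var xhr = new XMLHttpRequest();
    xhr.open('GET', 'info.txt'); // the same request the browser itself could make
    xhr.onload = function () {
      if (xhr.status === 200) {
        document.getElementById('special-content').textContent = xhr.responseText;
      }
    };
    xhr.send();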
AJAX involves server side scripting, so it doesn't make sense to say it is easier to use server side scripting. Additionally, AJAX is nothing more than GET or POST requests that a script carries out for you asynchronously, allowing you to use the server responses in a document without reloading the entire page.
AJAX in and of itself is not so much of a technology as a technique. You can use AJAX, for example, without ever using the ubiquitous XmlHttpRequest object supplied by javascript.
With the jQuery AJAX methods, you can request text, HTML, XML, or JSON from a remote server using both HTTP GET and HTTP POST, and you can load the external data directly into the selected HTML elements of your web page.
And yes, you do not need to configure the server specially.
I suggest the jQuery framework (no server configuration needed); see also Sirko's answer:
http://api.jquery.com/jQuery.ajax/
This will help you load dynamic content.
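For instance, with placeholder names, loading a fragment into a selected element is a one-liner, and $.ajax exposes the same request with more control:

    // Fetch 'info.html' with a plain GET and inject it into #content:
    $('#content').load('info.html');

    // The lower-level equivalent:
    $.ajax({
      url: 'info.html',
      method: 'GET',
      success: function (html) {
        $('#content').html(html);
      }
    });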
I have a series of .swf files that I inherited from an old version of a site I'm trying to rebuild.
When flash_element.submitForm() is called, they POST some data directly to a static URL ("/submit") and then, depending on the response, reload the browser page.
I would very much like to capture the data that they POST using JavaScript, preferably without it getting sent at all, so that I can have more intelligent logic to handle the request/response than is built into the .swf files I've inherited.
Basically: When a flash object makes a http request, can I catch and cancel this event in Javascript?
Basically, no. You can try to use the various SWF disassembler/reassembler tools, like the swfdump.exe that comes with Flex, to get rid of the POST or change it to a JavaScript call. There's precious little control or knowledge you can gain from a SWF directly from JavaScript that the SWF doesn't make explicitly available via the appropriate APIs. This is as it should be: if what you suggested were possible, it would be a fairly serious security hole.
I have a form on my page where users enter their credit card data. Is it possible in HTML to mark the form's action as constant, to prevent malicious JavaScript from changing the form's action property? I can imagine an XSS attack which changes the form URL to make users post their secret data to the attacker's site.
Is it possible? Or, is there a different feature in web browsers which prevents these kinds of attacks from happening?
This kind of attack is possible, but this is the wrong way to prevent against it. If a hacker can change the details of the form, they can just as easily send the secret data via an AJAX GET without submitting the form at all. The correct way to prevent an XSS attack is to be sure to encode all untrusted content on the page such that a hacker doesn't have the ability to execute their own JavaScript in the first place.
More on encoding...
Sample code on StackOverflow is a great example of encoding. Imagine what a mess it would be if every time someone posted some example JavaScript, it actually got executed in the browser. E.g.,
<script type="text/javascript">alert('foo');</script>
Were it not for the fact that SO encoded the above snippet, you would have just seen an alert box. This is of course a rather innocuous script - I could have coded some JavaScript that hijacked your session cookie and sent it to evil.com/hacked-sessions. Fortunately, however, SO doesn't assume that everyone is well intentioned, and actually encodes the content. If you were to view source, for example, you would see that SO has encoded my perfectly valid HTML and JavaScript into this:
&lt;script type="text/javascript"&gt;alert('foo');&lt;/script&gt;
So, rather than embedding actual < and > characters where I used them, they have been replaced with their HTML-encoded equivalents (&lt; and &gt;), which means that my code no longer represents a script tag.
Anyway, that's the general idea behind encoding. For more info on how you should be encoding, that depends on what you're using server-side, but most all web frameworks include some sort of "out-of-the-box" HTML Encoding utility. Your responsibility is to ensure that user-provided (or otherwise untrusted) content is ALWAYS encoded before being rendered.
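A minimal sketch of such a utility in JavaScript (real frameworks ship hardened versions of this, which you should prefer):

    function encodeHtml(untrusted) {
      return String(untrusted)
        .replace(/&/g, '&amp;')   // must run first, so entities added below aren't re-escaped
        .replace(/</g, '&lt;')
        .replace(/>/g, '&gt;')
        .replace(/"/g, '&quot;')
        .replace(/'/g, '&#39;');
    }

    // encodeHtml('<script>alert("foo")</script>')
    // => '&lt;script&gt;alert(&quot;foo&quot;)&lt;/script&gt;'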
Is there a different feature in web browsers which prevents these kinds of attacks from happening?
Your concern has since been addressed by newer browser releases through the new Content-Security-Policy header.
By configuring script-src, you can disallow inline JavaScript outright. Note that this protection will not necessarily extend to users on older browsers (see CanIUse).
Allowing only whitelisted scripts will defeat most JavaScript XSS attacks, but may require significant modifications to your content. Also, blocking inline JavaScript may be impractical if you are using a web framework that relies heavily on inline JavaScript.
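As a sketch, here is the header being set from a Node.js server (the policy itself is only an example):

    const http = require('http');

    http.createServer((req, res) => {
      // Only same-origin script files may run; inline <script> blocks
      // are refused by the browser under this policy.
      res.setHeader('Content-Security-Policy', "script-src 'self'");
      res.setHeader('Content-Type', 'text/html');
      res.end('<p>Hello</p>');
    }).listen(3000);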
Nope, nothing to really prevent it.
The only thing I would suggest is to have some server-side validation of any information coming to the server from a user form.
As the saying goes: Never trust the user