Here is what I am trying to do: I am generating a custom text file containing a test. Each test is unique to the user, and I don't want my server to accumulate all those text files.
Is there a way to use Ajax/JavaScript/jQuery to detect when the user has finished the download and, if so, get a return value (say, 1 for finished) that can be sent back to the PHP script so it can delete the file from the server, in more or less real time?
I know there are plenty of ways to do this in PHP, such as running cleanup when the user logs out, but I wanted to try the method above since it could have many other cool applications. I tried most search engines, but they have nothing close to what I need.
Why do you need to store them in a file at all? Just use a PHP script that generates the test and outputs it directly to the user. Then there is nothing left on the server to delete once the download completes.
If it's important, you may want the user to return to your server with the hash of the downloaded file. If the hash matches, you know two things:
1. The user downloaded the file successfully
2. It's now ok to delete the file
Well, it is very simple: I don't know how to make a PHP page send its output directly to the user, other than having PHP create a text file and force the browser to download it. That creates the problem of many text files piling up in a temporary folder.
Now, if the test requires, say, 15 chapters, each as a text or HTML file, the script neatly zips all those files and sends the archive to the user, which runs into the same problem. Once the user has finished downloading, I am trying to get some kind of script to delete the temporary zip or text file from the temporary directory in somewhat real time.
If I could MD5 a downloaded file using JavaScript, I would welcome it as a hack around the problem, but how would the JavaScript gain access to the user's downloads folder? There are security issues there, if I am not mistaken. I hope this rounds out the question a bit more.
I have a good solution for you here, using the jQuery File Download plugin I created. It gives you the behavior of an Ajax file download (which is not actually possible) complete with success and failure callbacks. In a nutshell, you can use the success callback (which indicates the file download succeeded) to perform an Ajax POST back to the server to delete the file. Take a look at the blog post for an example of how to use the success callback option, or at the demo, which uses those callbacks to show modals informing the user of what is going on.
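A rough sketch of that wiring; deleteFile.php and the option names here are illustrative (check the plugin's docs for the exact callback names), and the poster function is injected so the logic stands on its own:

```javascript
// Builds the options for the download call; `post` performs the Ajax POST
// (e.g. jQuery's $.post), so the success callback can ask the server to
// delete the temporary file once the download has completed.
function makeDownloadOptions(post) {
  return {
    successCallback: function (url) {
      post('deleteFile.php', { file: url });  // hypothetical cleanup endpoint
    },
    failCallback: function (url) {
      // download failed: keep the file on the server so the user can retry
    }
  };
}
// usage sketch: $.fileDownload('makeTest.php', makeDownloadOptions($.post));
```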
Related
I have a list of webpages: example.com/object/140, example.com/object/141, example.com/object/142, ...
Each page should have a particular background image: example.com/assets/images/object/140.jpg, example.com/assets/images/object/141.jpg, ...
Some images are missing, and in that case I use a default image. But when I check whether the image exists, I get a 404 error. I have already read in several places that there is no direct way to avoid this.
So I did the following: I created a service in the backend (C#) that checks whether the file exists with File.Exists(fileName). That way I avoided the error on my localhost. So far so good.
Now I have published the frontend and the backend as two different services in Azure. The images live in the frontend, but the file-checking service is in the backend, so my method no longer works: I can't access the frontend's folders directly from the backend. One solution could be to make an HTTP call from the backend to the frontend, but that doesn't make much sense; it's getting too messy.
One option could be to store a boolean in the DB recording whether each image exists, but I think this is prone to inconsistencies (for example, if the boolean is not updated immediately when an image is uploaded or deleted), even if I run a daily job to clean it up.
Still another option could be to store the images themselves in the DB and retrieve them together with the DTOs of the objects I'm loading on each page, but I would guess that images shown only in the frontend should be stored in the frontend... shouldn't they?
Therefore:
a) Are any of these ideas acceptable? Is there a better way to avoid this error?
b) Another possibility: is there a way to access the frontend folders from the backend? I get a bit lost with publishing and artifacts in Azure, and I don't know whether this is somehow possible.
I'm not sure how you've built the frontend, but I'm assuming the background images are set with CSS. It is possible to specify multiple background images in the same rule; the browser loads them all and stacks them in layers, with the first one on top. If the first image loads successfully and isn't transparent, it is the only thing the user will see; but if it fails to load, for example because it doesn't exist, the second image shows instead.
See this other answer for more details: https://stackoverflow.com/a/22287702/53538
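A minimal sketch of such a rule, with a hypothetical default.jpg as the fallback layer:

```css
/* The browser layers these with the first image on top; default.jpg only
   shows through if 140.jpg fails to load (404) or is transparent. */
.object-140 {
  background-image: url('/assets/images/object/140.jpg'),
                    url('/assets/images/default.jpg');
}
```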
I'm not sure I fully understand this aspect of site security, but it seems it is a bad idea to keep PHP files in the webroot, so we move them into a PHP folder; let's call it phpIncludes. Some issues that remain obscure to me:
1) How do I access a file doStuff.php inside phpIncludes from JavaScript? It seems a bad idea to have my JavaScript reveal the name of the folder where the sensitive stuff lives, as in:
executeAjaxCall("phpFiles/doStuff.php",success,error);
2) Or from a "master" PHP file that resides in the webroot? Can't that be tampered with, precisely because it is in the webroot?
$secretVariableThatOnlyThisPHPWouldKnow="bla"; //then check it in the included file? Does that work?
//in JS
executeAjaxCall("masterFile.php?fileNeeded=doStuff");
//in PHP
$secretVar="bla";
include("myPathString/phpIncludes/".$_GET['fileNeeded'].".php");
//in all other INCLUDEable PHP files
if (!isset($secretVar))
{
die();
}
if (!isFromThisDomain)
{
die();
}
(How do I perform that last test?)
What I have in mind is: "If someone wants to fetch the PHP files inside phpIncludes directly, they can't; but the files can still be reached through HTML-initiated requests, so those requests need to be validated. Is that easily doable without sessions, for example by generating something in master.php that doStuff.php would recognize and hence do its job? Or are sessions the way to go?"
I am actually wondering about the no-session scenario because it frees me from attempting to implement anti-session-hijacking code (if I ever learn how...). On the other hand, I am also thinking: isn't the variable $secretVar accessible/tamperable, since it lives in the webroot?
3) Do as in (2), but put master.php inside yet another folder (let's call it "master"), keeping it out of the webroot? In that case, do I reach my phpIncludes folder from the master folder using getcwd() plus string manipulation? Or is there a more elegant (lazier...) method?
Option 3 seems magical: the only file left in the webroot would be index.php, which simply bootstraps the HTML+JS and does nothing sensitive. Or am I missing something?
It is completely irrelevant what your PHP file and folder structure looks like. The only thing that matters is this:
What happens when you access a particular URL?
phpFiles/doStuff.php is a URL first and foremost. It doesn't matter how you access it ("directly" via the browser address bar, via AJAX, curl, whatever else); all that matters is what happens when you access that URL. And it's entirely up to you to ensure that nothing undesirable will happen with each URL access.
Don't publicly expose any URLs which aren't meant to exist in the first place. If you have a bunch of .php files which aren't meant to be accessed directly via a URL, then don't publicly expose them. That means either blocking access to them in your web server configuration (e.g. deny all in an .htaccess file), or taking those files out of the public webroot to begin with.
Validate all input and necessary conditions in all publicly exposed URL endpoints as necessary. The user needs to be logged in to do something? Verify that. You require certain query parameters or POST body data? Verify that. Validate and verify every incoming request on its own merits before doing anything. Whether you repeat this validation code in each file individually or do it somewhere centrally is up to you.
Split your code across multiple files as appropriate to make it reusable. Per points 1 and 2, you simply need to take care which files are publicly exposed as URL entry points, and at which point you need to do what sort of validation.
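For the .htaccess route mentioned above, a minimal sketch (this is Apache 2.4 syntax; on Apache 2.2 the equivalent was `Order deny,allow` plus `Deny from all`):

```apache
# phpIncludes/.htaccess - refuse all direct HTTP requests to files in this
# folder; PHP's include() still works because it reads from disk, not over HTTP
Require all denied
```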
I have a PHP file that writes a significant amount of data to a "dump" file, and I am using jQuery's $('#element').load() to pull that data into my user-interface page. I am creating this dump file on the fly, and as more content is appended, my jQuery reloads the data every few seconds. This works fine most of the time. The problem arises when the read happens while the file is being written, which halts my PHP script. On the PHP side I check with is_writable() to make sure the file is available, but is there an "is_readable()"-type function for JavaScript/jQuery? I know I could create an intermediate PHP file to check, but that would require a significant amount of change. Is there an easy way to check in JavaScript?
In your Ajax call, you should just request the file, and if it's not available, have an error handler that does whatever you want to do in that case. There is no way to preflight it, unless you add a separate Ajax endpoint on your server that reports whether the file is ready.
If asking for the file when it doesn't exist causes your PHP script a problem, then you need to fix that issue so the script can return either empty data or an error to the Ajax call without messing up the server.
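A sketch of that shape: the retry logic is kept framework-agnostic by injecting the actual GET (e.g. jQuery's $.get or fetch, wrapped to return a promise of text). dump.txt and the few-second interval come from the question:

```javascript
// fetchText: () => Promise<string>  - performs the GET against dump.txt
// onData:    called with the dump contents on success
// schedule:  queues the next attempt (normally setTimeout)
function makePoller(fetchText, onData, intervalMs, schedule) {
  return function poll() {
    fetchText()
      .then(onData)                        // got the dump: render it
      .catch(function () { /* busy or missing: just skip this round */ })
      .then(function () { schedule(poll, intervalMs); });
  };
}
// usage sketch:
// makePoller(() => fetch('dump.txt').then(r => r.text()),
//            t => $('#element').text(t), 3000, setTimeout)();
```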
I ended up using an intermediary PHP file that waits for the dump file to become available before trying to read from it, then simply echoes its contents.
All I want is:
select a file
a small progress bar (unless that's not simple)
fail/success confirmation on the client side
trigger an action on the server side
all of that without a page reload
Other than that, the simpler the better.
A snippet would be most welcome.
There are plenty of scripts and tutorials around. Check, for example, http://www.ajaxf1.com/tutorial/ajax-file-upload-tutorial.html
Apparently it's not as trivial as one might think, since you can't just take the body of a form containing an <input type='file'/> tag and submit it with XMLHttpRequest.
But you can submit the form with a target of a hidden <iframe/> and then poll the server with an XMLHttpRequest object for status updates. That, however, requires that your server-side script handling the upload does so asynchronously; otherwise you will only get a status update once the file has been fully uploaded, not the kind of progress updates you want. I think this is a challenge for most web frameworks to date, but I have never actually had a reason to dig into it. Sounds fun, though...
If you just want to submit the file, independently of the rest of the form, you do the same thing, but without worrying about progress updates.
What you can do is replace the <input type='file'/> with an <input type='hidden'/> containing the server-side ID of the just-uploaded file once the upload completes. That way, when the user hits save, you know which files you actually want to keep.
That hidden field could also be a checkbox, which would let the user undo a file upload by simply unchecking it before hitting save.
File uploads using the XMLHttpRequest object are not possible in all browsers (only Firefox and Safari/Chrome support it), so for a cross-browser implementation, use the <iframe> trick.
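The iframe trick from above, as a minimal markup sketch (upload.php is a hypothetical handler name):

```html
<!-- the form posts into the hidden iframe, so the page itself never reloads -->
<iframe name="uploadTarget" style="display:none"></iframe>
<form action="upload.php" method="post" enctype="multipart/form-data" target="uploadTarget">
  <input type="file" name="testFile">
  <input type="submit" value="Upload">
</form>
```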
If you want a real XHR file upload, I have written an extended article on how to do it in Firefox 3. It's so low-level that you actually have to build the HTTP request from JavaScript strings.
Maybe GearsUploader will fit.
In our current project we are providing a PDF download that can be customized by the user through an HTML form he submits. It takes several seconds to dynamically generate the PDF and I'd like to visualize this, e.g. by disabling the submit button until the download starts. Unfortunately, I couldn't find a way to detect when the download starts*. So I wouldn't know when to re-enable the submit button.
I already tried specifying an IFrame as the target of my HTML form, hoping that its onload event would fire. It doesn't, however, probably because the PDF is sent with a "Content-Disposition: attachment" header and is never actually loaded into the IFrame.
The only solution I can think of right now involves generating the PDF to a temporary file on the server, which I would like to avoid.
*) Let me clarify: I don't need to know whether the download finished, or even whether it really started. I'd like to detect the point at which the browser asks the user whether to open or save the file. I guess this happens when the browser receives the HTTP headers.
If I were you, I would make an Ajax call to the server with the form data, generate the file, and return its name/ID/whatever back to the JavaScript, which then sets window.location to something like download.php?id=x (the file is already generated, so that request just sets the headers and reads it out). At that point you can re-enable the submit button.
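Sketched with the browser bits injected so the flow is clear (generate.php and download.php are hypothetical endpoint names):

```javascript
// post:         (url, onDone) => void, e.g. (url, cb) => $.post(url, formData, cb)
// navigate:     url => void, e.g. url => { window.location = url; }
// enableSubmit: re-enables the disabled submit button
function startDownload(post, navigate, enableSubmit) {
  post('generate.php', function (id) {
    // the PDF already exists on the server; this request just streams it back,
    // so the browser's open/save prompt appears almost immediately
    navigate('download.php?id=' + encodeURIComponent(id));
    enableSubmit();
  });
}
```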
What you want is to detect when the size of the downloaded file changes from 0 to a positive value. As far as I know, that is impossible with JavaScript alone; you would need a plug-in that can access the client's file system.
A recommended workaround: create a session per download and have the client poll the server for the download's status, which could be "nonexistent", "not started", "started", or "finished". You need some server-side work to persist and update the status, plus an Ajax framework.
The simplest solution would be to estimate the time (generously) and go with that. It's a hack, but it gives the desired effect. The other option is to submit the form via Ajax and have the generator return details to the calling page; http://www.jquery.com/ might be a good place to start for that.