I have an input field where I paste a download URL.
After that, I use an AJAX request to get the file info from the response headers: Content-Length, MIME type and, in case I use cURL, Accept-Ranges.
I then start a consecutive loop of XHR2 range requests to my PHP file:
http://www.example.com/chunks.php?url=http://url.com/someFile.ext&range=0-1024
http://www.example.com/chunks.php?url=http://url.com/someFile.ext&range=1024-2048
....
I can also change it to
http://www.example.com/chunks.php?url=http://url.com/someFile.ext&range=0-1024
http://www.example.com/chunks.php?url=http://url.com/someFile.ext&range=1025-2049
....
depending on where my script starts reading the file.
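Note that if the range string is passed to cURL's CURLOPT_RANGE (as in the first approach below), HTTP byte ranges are inclusive on both ends, so 0-1024 and 1024-2048 overlap by one byte. A small helper (the function name is mine, not part of the original script) that generates non-overlapping ranges:

```php
<?php
// Build inclusive, non-overlapping HTTP Range strings for a file of $size
// bytes, $chunk bytes per request. Byte ranges are inclusive on both ends,
// so a 1024-byte chunk is "0-1023", not "0-1024".
function build_ranges(int $size, int $chunk): array
{
    $ranges = [];
    for ($start = 0; $start < $size; $start += $chunk) {
        $end = min($start + $chunk, $size) - 1; // inclusive upper bound
        $ranges[] = $start . '-' . $end;
    }
    return $ranges;
}
```

For example, build_ranges(3000, 1024) yields ['0-1023', '1024-2047', '2048-2999'].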
My first approach was using cURL and setting the ranges:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $_GET['url']);
curl_setopt($ch, CURLOPT_RANGE, $_GET['range']);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($ch);
curl_close($ch);
echo $result;
?>
This works great, but if the range chunks are bigger than 1 MB there is no animation in the client-side onprogress event, because CURLOPT_RETURNTRANSFER buffers the whole chunk before it is echoed.
I could probably use a custom CURLOPT_READFUNCTION, but I don't know how that works, so I changed approach and used a simple fopen:
<?php
$r = explode('-', $_GET['range']); // get the (from, to) range
$cc = ($r[1] - $r[0]);             // calculate the client chunk length
$sc = 128;                         // set the server chunk length
$b = "";                           // buffer
$bytes = 0;                        // bytes read
$h = fopen($_GET['url'], "rb");    // open the url
fseek($h, $r[0]);                  // jump to the "from" pointer taken from the link
while ($bytes < $cc) {             // while bytes read is smaller than the client chunk
    // if the server chunk + bytes read is bigger than the client chunk,
    // then the server chunk is the client chunk - bytes read
    $sc = (($bytes + $sc) > $cc ? ($cc - $bytes) : $sc); // probably an error here
    $b = fread($h, $sc);  // read the buffer
    $bytes += strlen($b); // add the buffer length to the bytes read
    echo $b;              // echo the buffer
    ob_flush();           // flush
    flush();              // flush
}
fclose($h); // close
?>
Now this works: I get the right animation on the client, the final size is correct, and the pointers should be OK (0-1024, 1024-2048) since I use fseek and fread.
But the file is corrupt.
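One possible cause of the corruption (an assumption, since it depends on the PHP build and the remote server): PHP's http:// stream wrapper is not seekable, so the fseek() can fail silently and every request reads from byte 0 again. A sketch that avoids seeking entirely by asking the remote server for the range via a stream context (the function name is mine):

```php
<?php
// Fetch byte range $range (e.g. "0-1023") of $url by sending a Range header
// through a stream context, instead of calling fseek() on a non-seekable
// http:// stream, and echo it to the client in small server-side chunks.
function proxy_range(string $url, string $range, int $serverChunk = 128): void
{
    $ctx = stream_context_create([
        'http' => [
            'method' => 'GET',
            'header' => "Range: bytes={$range}\r\n",
        ],
    ]);
    $h = fopen($url, 'rb', false, $ctx);
    while (!feof($h)) {
        echo fread($h, $serverChunk);
        if (ob_get_level()) {
            ob_flush();
        }
        flush();
    }
    fclose($h);
}

// proxy_range($_GET['url'], $_GET['range']); // same inputs as the original script
```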
Also, after some tests, this is very slow.
A better approach would be cURL with CURLOPT_READFUNCTION, or fsockopen.
So I guess something like:
<?php
$read = function ($ch, $chunk) {
    // here I need small chunks of the response flushed
};
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $_GET['url']);
curl_setopt($ch, CURLOPT_RANGE, $_GET['range']);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_READFUNCTION, $read);
$result = curl_exec($ch);
curl_close($ch);
echo $result;
?>
If you have a better solution, I'm open to anything that uses JavaScript and PHP.
The point of this is to create a download manager with resume support that stores the file in window.webkitRequestFileSystem without filling up the browser's memory.
Let's say the client chunks are 8 MB and the server chunks are 256 kB:
every 8 MB a chunk is appended to a file previously created with window.webkitRequestFileSystem,
and every 256 kB I get an update of the average download speed, which lets me create a nice animation.
That way the PHP on the server uses only 256 kB of RAM, and the client browser can empty the garbage every 8 MB (theoretically).
EDIT2
For this code I found a solution:
it allows you to request ranges, for example 0-100,
and get those 100 bytes output chunked!
This gives an AJAX script a continuous, smooth progress bar.
<?php
function w($ch, $chunk) {
    echo $chunk;
    ob_flush();
    flush();
    return strlen($chunk);
}
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $_GET['url']);
curl_setopt($ch, CURLOPT_RANGE, $_GET['range']);
curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
curl_setopt($ch, CURLOPT_WRITEFUNCTION, 'w');
curl_exec($ch);
curl_close($ch);
?>
But I hope you have an even better solution! Thanks.
I got it to work with PHP cURL's CURLOPT_WRITEFUNCTION callback setting. The following example callback function, curl_write_flush, intended for that cURL option, writes every chunk received and flushes the output to the browser.
<?php
/**
 * CURLOPT_WRITEFUNCTION callback which flushes the output buffer and the SAPI buffer.
 *
 * @param resource $curl_handle
 * @param string   $chunk
 */
function curl_write_flush($curl_handle, $chunk)
{
    echo $chunk;
    ob_flush(); // flush output buffer (Output Control configuration specific)
    flush();    // flush output body (SAPI specific)
    return strlen($chunk); // tell cURL there was output (if any)
}
$curl_handle = curl_init($_GET['url']);
curl_setopt($curl_handle, CURLOPT_RANGE, $_GET['range']);
curl_setopt($curl_handle, CURLOPT_BINARYTRANSFER, 1);
curl_setopt($curl_handle, CURLOPT_WRITEFUNCTION, 'curl_write_flush');
curl_exec($curl_handle);
curl_close($curl_handle);
I tried with small files and big files and it works great, but you can't set a custom chunk size.
The download streams at the same speed I can get from my ISP.
If you have anything better, I'm open to any answer.
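On the chunk-size point: cURL accepts CURLOPT_BUFFERSIZE as a hint for how much data it hands to the write callback per call, so something like the following sketch (the function name and default are mine) may get closer to a custom chunk size. cURL treats the value as a request, not a guarantee, so the callback can still receive smaller pieces.

```php
<?php
// Same CURLOPT_WRITEFUNCTION approach as above, plus CURLOPT_BUFFERSIZE
// as a hint for the amount of data passed to the callback per invocation.
function stream_range(string $url, string $range, int $bufferSize = 256 * 1024): void
{
    $write = function ($ch, $chunk) {
        echo $chunk;
        if (ob_get_level()) {
            ob_flush(); // flush PHP's output buffer, if one is active
        }
        flush();        // flush the SAPI buffer
        return strlen($chunk);
    };
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RANGE, $range);
    curl_setopt($ch, CURLOPT_BUFFERSIZE, $bufferSize); // chunk-size hint only
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, $write);
    curl_exec($ch);
    curl_close($ch);
}

// stream_range($_GET['url'], $_GET['range']);
```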
Related
I have a PHP script that updates the download counter of the appropriate file (it sends the headers along with the file). But if I click the cancel button in the save-file dialog, the counter is incremented anyway, which seems wrong.
My idea is to send an AJAX request and decrement the counter if the cancel button is clicked. But how do I detect the cancel-button click?
You should try Track file download progress with JavaScript and do the increment only when the download is 100% done.
I solved my problem by sending the file in parts with fread in PHP.
It looks like this:
// send the file (it shows the save-file dialog)
header('Cache-control: private');
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($filepath));
header('Content-Disposition: attachment; filename=' . $filename);
flush();
$file = fopen($filepath, "rb");
while (!feof($file)) {
    // send the current file part to the browser
    print fread($file, round($download_rate * 1024));
    // flush the content to the browser
    flush();
    // sleep one second to throttle the transfer to $download_rate kB/s
    sleep(1);
}
// if everything went right and the file was fully downloaded
if (feof($file)) {
    update_current_counter_or_insert_new_one_in_DATABASE();
}
fclose($file);
This link helped me solve it (the php.net readfile article). So it works for me, more or less.
I have a <video> and <audio> element which load a file (mp4, mp3, doesn't matter) from my server via Range requests.
It seems, however, that the element only requests the end range from my server, and from there on tries to stream directly from byte 0 to the end, causing the player to be stuck in a "download loop" which makes the browser suspend all other actions until the download is complete.
Does anyone know a solution to this issue? Do I, for example, have to give my stream request an actual end in its Content-Length or Accept-Ranges?
Here's the full request list from Chrome; at the bottom you can see that a request for the URL view?watch=v__AAAAAA673pxX just stays pending until a new request is placed by the element.
In a nutshell: HTML5 media elements get stuck in a download loop when using HTTP range requests and cause all other requests to stay "pending".
UPDATE
The issue was resolved server-side.
Whereas the original stream function would literally output every byte, I modified the code to output ONLY the size of the actual buffer. This forces the elements to make a new request for the remaining data.
An important note here: return Content-Length, Accept-Ranges and Content-Range values that match the file's size and the start and end positions of each HTTP range request.
For future references:
function stream() {
    $i = $this->start;
    set_time_limit(0);
    while (!feof($this->stream) && $i <= $this->end) {
        $bytesToRead = $this->buffer;
        if (($i + $bytesToRead) > $this->end) {
            $bytesToRead = $this->end - $i + 1;
        }
        $data = fread($this->stream, $bytesToRead);
        echo $data;
        flush();
        $i += $bytesToRead;
    }
}
The new stream function:
function stream()
{
    // added a time limit for safeguarding
    set_time_limit(3);
    echo fread($this->stream, $this->offset);
    flush();
}
Suppose you have a video of 1,000,000 bytes.
When your browser requests the video for the first time, it sends headers like this:
Host: localhost
Range: bytes=0-
The header Range: bytes=0- means the browser is asking the server to return as much as it can, i.e. no end position is specified.
To this the server would usually reply with the whole file except the last byte, to preserve the range context:
Accept-Ranges: bytes
Content-Length: 999999
Content-Range: bytes 0-999998/1000000
Now suppose your video is downloaded to 30% and you seek to 70%; the browser will then request that part, with headers like this:
Host: localhost
Range: bytes=700000-
It seems however that the element only request the end Range from my server,
As you can see, you inferred wrongly: it's the starting position of the requested part of the video.
Now the server might reply like this:
Accept-Ranges: bytes
Content-Length: 300000
Content-Range: bytes 700000-999999/1000000
Note the Content-Range header: it explicitly says which portion of the file is being sent. My guess is that your server is not sending this information and the browser is getting confused.
Also, MIME types can sometimes cause problems: try to use the exact MIME type of your file, like Content-Type: video/mp4. If you use Content-Type: application/octet-stream, compression may kick in, which would disable range headers.
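The header arithmetic above can be wrapped in a small helper (the function name is mine): given the requested start, an optional end, and the file size, it returns a consistent trio of header values, with Content-Length equal to end - start + 1 because both bounds are inclusive.

```php
<?php
// Compute consistent 206 response headers for "Range: bytes=$start-$end"
// against a file of $size bytes. $end === null means "to end of file".
function range_headers(int $start, ?int $end, int $size): array
{
    if ($end === null || $end > $size - 1) {
        $end = $size - 1; // clamp to the last byte of the file
    }
    return [
        'Content-Range'  => "bytes {$start}-{$end}/{$size}",
        'Content-Length' => (string)($end - $start + 1), // inclusive bounds
        'Accept-Ranges'  => 'bytes',
    ];
}
```

For the seek example above, range_headers(700000, null, 1000000) gives Content-Range "bytes 700000-999999/1000000" and Content-Length "300000".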
This is somewhat related to my other PHP video-streaming post, but this time the issue is that the seek bar for the videos does not work in Chrome.
I have found several different posts about it here on Stack Overflow, but none of them resolved the issue. I would link all of them, but I can't seem to find the same posts I found yesterday.
I am going to list two versions of the PHP code. I should also point out exactly what I'm doing before the PHP loads the video data. On an HTML page, I have a <video> tag without <source> tags. I use JavaScript to make an AJAX call to a PHP file that has the source tags. The source tags themselves don't contain direct links to the video source files; instead, they reference yet another PHP file that loads the data.
Top-level HTML for the video. Super simple.
<video id="showvideo" height="540" width="864" controls></video>
Now for the AJAX call
function showVideo() {
    if (window.XMLHttpRequest) {
        // code for IE7+, Firefox, Chrome, Opera, Safari
        xmlhttp = new XMLHttpRequest();
    } else {
        // code for IE6, IE5
        xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
    }
    xmlhttp.onreadystatechange = function() {
        if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
            document.getElementById("showvideo").innerHTML = xmlhttp.responseText;
        }
    };
    xmlhttp.open("GET", "/firstphpfile.php", true);
    xmlhttp.send();
}
The JavaScript function runs when the page loads.
Here's the contents of firstphpfile.php
<?php
echo "
<source src=\"http://example.com/video1.php?type=stuff.mp4\" type=\"video/mp4\">
<source src=\"http://example.com/video2.php?type=stuff.ogv\" type=\"video/ogg\">
";
?>
Again, not a big deal. Now I am going to post a couple of different versions of the video1.php file that actually grabs the file resource.
Version 1:
<?php
$file = 'video.mp4';
$filesize = filesize($file);
$offset = 0;
$length = $filesize;
if ( isset($_SERVER['HTTP_RANGE']) ) {
// if the HTTP_RANGE header is set we're dealing with partial content
$partialContent = true;
// find the requested range
// this might be too simplistic, apparently the client can request
// multiple ranges, which can become pretty complex, so ignore it for now
preg_match('/bytes=(\d+)-(\d+)?/', $_SERVER['HTTP_RANGE'], $matches);
$offset = intval($matches[1]);
$length = intval($matches[2]) - $offset;
} else {
$partialContent = false;
}
$file = fopen($file, 'r');
// seek to the requested offset; this is 0 if it's not a partial content request
fseek($file, $offset);
$data = fread($file, $length);
fclose($file);
if ( $partialContent ) {
// output the right headers for partial content
header('HTTP/1.1 206 Partial Content');
header('Content-Range: bytes ' . $offset . '-' . ($offset + $length) . '/' . $filesize);
}
// output the regular HTTP headers
header('Content-Type: video/mp4');
header('Content-Length: ' . $filesize);
header('Accept-Ranges: bytes');
// don't forget to send the data too
print($data);
?>
Version 2 (I like this one better for what it does in Firefox, but still no dice in Chrome)
<?php
$file = 'video.mp4';
$mime = "video/mp4"; // The MIME type of the file, this should be replaced with your own.
$size = filesize($file); // The size of the file
// Send the content type header
header('Content-type: ' . $mime);
// Check if it's a HTTP range request
if(isset($_SERVER['HTTP_RANGE'])){
// Parse the range header to get the byte offset
$ranges = array_map(
'intval', // Parse the parts into integer
explode(
'-', // The range separator
substr($_SERVER['HTTP_RANGE'], 6) // Skip the `bytes=` part of the header
)
);
// If the last range param is empty, it means the EOF (End of File)
if(!$ranges[1]){
$ranges[1] = $size - 1;
}
// Send the appropriate headers
header('HTTP/1.1 206 Partial Content');
header('Accept-Ranges: bytes');
header('Content-Length: ' . ($ranges[1] - $ranges[0])); // The size of the range
// Send the ranges we offered
header(
sprintf(
'Content-Range: bytes %d-%d/%d', // The header format
$ranges[0], // The start range
$ranges[1], // The end range
$size // Total size of the file
)
);
// It's time to output the file
$f = fopen($file, 'rb'); // Open the file in binary mode
$chunkSize = 8192; // The size of each chunk to output
// Seek to the requested start range
fseek($f, $ranges[0]);
// Start outputting the data
while(true){
// Check if we have outputted all the data requested
if(ftell($f) >= $ranges[1]){
break;
}
// Output the data
echo fread($f, $chunkSize);
// Flush the buffer immediately
#ob_flush();
flush();
}
}
else {
// It's not a range request, output the file anyway
header('Content-Length: ' . $size);
// Read the file
#readfile($file);
// and flush the buffer
#ob_flush();
flush();
}
?>
So, while both versions play the video without problems, seeking only works in Firefox. The second version makes it so you can only seek backwards, which I prefer.
There was another version I tried, but I had already deleted that code before writing this and haven't found it again.
I am not sure what I'm doing wrong, and no solution I have found has fixed seeking in the Chrome version of the video.
OK, so I finally got it to work. I decided not to load the PHP files with JavaScript.
I also got rid of the MIME-type variable and just set the header properly. I found that using a variable for the MIME type caused my browsers to load the wrong MIME type for the Content-Type header, thus causing the video resource to fail.
OK, so for about a week now I've been doing tons of research on making XMLHttpRequests to servers. I have learned a lot about CORS, AJAX/jQuery requests and the Google Feed API, and I am still completely lost.
The Goal:
There are two sites in the picture, both of which I have access to. The first is a WordPress site which has the RSS feed, and the other is my localhost site running off XAMPP (soon to be a published site when I'm done). I am trying to get the RSS feed from the WordPress site and display it on my localhost site.
The Issue:
I run into the infamous Access-Control-Allow-Origin error in the console. I know I can fix that by setting the header in the website's .htaccess file, but online aggregators are able to read and display the feed when I just give them the link. So I don't really know what those sites are doing that I'm not, and what the best way is to achieve this without opening up easy security holes in either site.
I would highly prefer not to use any third-party plugins for this; I would like to aggregate the feed through my own code, as I have done for an RSS feed on the localhost site, but if I have to, I will.
UPDATE:
I've made HUGE progress learning PHP and finally have a working bit of code that lets me download the feed files from their various sources and store them in cache files on the server. What I have done is set an AJAX request behind some buttons on my site which switch between the RSS feeds. The AJAX request POSTs a JSON-encoded array containing some data to my PHP file, which downloads the requested feed via cURL (an http_get_contents helper copied from a GitHub dev, as I don't know how to use cURL yet) and stores it in a cache file named after the md5 of the URL. It then filters what I need from the data and sends it back to the front end. However, I have two more questions... (It's funny how that works: you get one answer and end up with two more questions.)
Question #1: Where should I store the cache files and the PHP files on the server? I've heard that you are supposed to store them below the web root, but I am not sure how to access them that way.
Question #2: When I look at the site's source in the browser as I click the buttons that send AJAX requests to the PHP file, the PHP file visibly shows up in the list of source files, and more and more copies of it are downloaded as I keep clicking the buttons. Is there a way to prevent this? I may have to implement another method to get this working.
Here is my working php:
<?php
// cURL http_get_contents declaration
function http_get_contents($url, $opts = array()) {
$ch = curl_init();
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
curl_setopt($ch, CURLOPT_USERAGENT, "{$_SERVER['SERVER_NAME']}");
curl_setopt($ch, CURLOPT_URL, $url);
if (is_array($opts) && $opts) {
foreach ($opts as $key => $val) {
curl_setopt($ch, $key, $val);
}
}
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
if (false === ($retval = curl_exec($ch))) {
die(curl_error($ch));
} else {
return $retval;
}
}
//receive and decode $_POSTed array
$post = json_decode($_POST['jsonString'], true);
$url = $post[0];
$xmn = $post[1]; //starting item index number (i.e. to return 3 items from the feed, starting with the 5th one)
$xmx = $xmn + 3; //max number (so three in total to be returned)
$cache = '/tmp/' . md5($url) . '.html';
$cacheint = 0; //this is how I set if the feed will be downloaded from the site it is from, or if it will be read from the cache file, I will implement a way to check if there is a newer version of the file on the other site in the future
//if the cache file doesn't exist, download feed and write contents to cache file
if(!file_exists($cache) || ((time() - filemtime($cache)) > 3600 * $cacheint)) {
$feed_content = http_get_contents($url);
if($feed_content = http_get_contents($url)) {
$fp = fopen($cache, 'w');
fwrite($fp, $feed_content);
fclose($fp);
}
}
//parse and echo results
$content = file_get_contents($cache);
$x = new SimpleXmlElement($content);
$item = $x->channel->item;
echo '<tr>';
for($i = $xmn; $i < $xmx; $i++) {
echo '<td class="item"><p class="title clear">' .
$item[$i]->title .
'</p><p class="desc">' .
substr($item[$i]->description, 0, 250) .
'... <a href="' .
$item[$i]->link .
'" target="_blank">more</a></p><p class="date">' .
$item[$i]->pubDate .
'</p></td>';
}
echo '</tr>';
?>
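The "newer version" check mentioned above could be sketched with cURL's time-condition options; this is an assumption about how you might wire it in, not part of the original code. CURL_TIMECOND_IFMODSINCE sends an If-Modified-Since header, so an unchanged feed comes back as a 304 with no body and the cache is left alone.

```php
<?php
// Sketch: refresh $cache from $url only if the remote file is newer than
// the cache's mtime, using an If-Modified-Since conditional request.
function refresh_cache(string $url, string $cache): void
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    if (file_exists($cache)) {
        curl_setopt($ch, CURLOPT_TIMECONDITION, CURL_TIMECOND_IFMODSINCE);
        curl_setopt($ch, CURLOPT_TIMEVALUE, filemtime($cache));
    }
    $body = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    if ($status === 200 && $body !== false) {
        file_put_contents($cache, $body); // a 304 leaves the cache untouched
    }
}
```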
I have a PHP script that imports large amounts of data from CSV files, with validation.
For that I need to show progress to the user, so I used event streaming.
When I echo something, I want it transferred to the client piece by piece, instead of the server sending the whole output in bulk.
I have already played around with ob_start(), ob_implicit_flush() and ob_flush(), but they didn't work.
My script works perfectly on another server. The server configurations are given below.
Configuration on which the code is not responding as desired:
OS: Linux
PHP Version 5.4.36-0+deb7u3
Server API: CGI/FastCGI
Memory_limit: 128M
output_buffering: no value
As I said, the code works properly on another server which has almost the same configuration:
OS: Linux
PHP Version 5.4.37
Server API: CGI/FastCGI
Memory_limit: 256M
output_buffering: no value
Below is my sample code for sending event:
<?php
header("Content-Type: text/event-stream");
header("Cache-Control: no-cache");
header("Access-Control-Allow-Origin: *");
$lastEventId = floatval(isset($_SERVER["HTTP_LAST_EVENT_ID"]) ? $_SERVER["HTTP_LAST_EVENT_ID"] : 0);
if ($lastEventId == 0) {
    $lastEventId = floatval(isset($_GET["lastEventId"]) ? $_GET["lastEventId"] : 0);
}
echo ":" . str_repeat(" ", 2048) . "\n"; // 2 kB padding for IE
echo "retry: 2000\n";
// event stream
$i = $lastEventId;
while ($i <= 100) {
    if ($i == 100) {
        echo "data: stop\n\n"; // the blank line is needed to dispatch the event
        ob_flush();
        flush();
        break;
    } else {
        echo "id: " . $i . "\n";
        echo "data: " . $i . ";\n\n";
        ob_flush();
        flush();
        sleep(1);
    }
    $i++;
}
?>
Below is my client page on which I need response:
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
<title>EventSource example</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<script src="../jquery/eventsource.js"></script>
<script>
    var es = new EventSource("events.php");
    var listener = function(event) {
        console.log(event.data);
        var type = event.type;
        if (event.data == 'stop') {
            es.close();
        } else {
            var div = document.createElement("div");
            div.appendChild(document.createTextNode(type + ": " + (type === "message" ? event.data : es.url)));
            document.body.appendChild(div);
        }
    };
    var errlistener = function(event) {
        es.close();
    };
    es.addEventListener("open", listener);
    es.addEventListener("message", listener);
    es.addEventListener("error", errlistener);
</script>
</head>
<body>
</body>
</html>
Your best method to return chunked data to the browser is to use WebSockets: get the client to open a socket to your file reader, and then you can chunk the data to the browser without a problem.
Once it has finished, you can close the socket.
A good tutorial for WebSockets:
http://www.phpbuilder.com/articles/application-architecture/optimization/creating-real-time-applications-with-php-and-websockets.html
With this method you could also, if you wanted, implement verification, so the server is not just blindly sending chunks; it sends each chunk on request by the JavaScript.
So your client could say "I need chunk 5", and your server would implement something like:
$requestedChunk = 5; // this would be set by the JavaScript sending the request
$chunkSize = 256;    // this would be your chunk size
$readPosition = $requestedChunk * $chunkSize;
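The arithmetic above, wrapped into a runnable sketch (file-based here for illustration; a real WebSocket server would read from wherever it stores the file, and the function name is mine):

```php
<?php
// Return chunk number $n (0-based) of $path, $chunkSize bytes per chunk.
// The last chunk may be shorter; past the end of the file, '' is returned.
function read_chunk(string $path, int $n, int $chunkSize = 256): string
{
    $h = fopen($path, 'rb');
    fseek($h, $n * $chunkSize); // $readPosition = $requestedChunk * $chunkSize
    $data = fread($h, $chunkSize);
    fclose($h);
    return $data === false ? '' : $data;
}
```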
The link no longer works, so here is one built on Ratchet: https://blog.samuelattard.com/the-tutorial-for-php-websockets-that-i-wish-had-existed/
I had a similar problem: event streams were working as expected (returning chunks) on a server using the Apache 2.0 handler, but not on a server using FastCGI (which returned everything in bulk). I assumed that something in FastCGI was the culprit, so I tried to resolve the problem by switching to CGI. Now the event stream works as expected.
Whether using CGI or FastCGI, the Server API shows up as CGI/FastCGI, so I assume the server it works on for you is running CGI and the server it doesn't work on is running FastCGI. Try changing the non-working server to CGI.
As for why it doesn't work under FastCGI, I'm not entirely sure, but unless FastCGI is a hard requirement and CGI isn't possible, the above solution should work.
Many things can prevent a chunked response, such as (but not limited to):
a proxy or any other buffering mechanism on the web server;
output_buffering set to on in php.ini (you should explicitly set it to off);
gzip enabled on the web server.
You should check these first.
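Those checks can be turned into a best-effort preamble before streaming. The pieces are server-specific (X-Accel-Buffering is honored by nginx, apache_setenv exists only under mod_php), so treat this as a sketch rather than a universal fix; note that output_buffering itself cannot be changed at runtime and must be turned off in php.ini.

```php
<?php
// Best-effort unbuffering before emitting chunked / event-stream output.
header('X-Accel-Buffering: no');            // ask nginx not to buffer this response
if (function_exists('apache_setenv')) {
    @apache_setenv('no-gzip', '1');         // disable mod_deflate for this request
}
@ini_set('zlib.output_compression', 'Off'); // no gzip at the PHP level
while (ob_get_level() > 0) {
    ob_end_flush();                         // unwind any userland output buffers
}
ob_implicit_flush(true);                    // flush automatically after every echo
```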