PHP And AJAX Download of a few MB file freezes website - javascript

Hello, I've searched everywhere for the answer, but none of the solutions I've tried have helped.
What I am building is a site that connects to YouTube to let users search for videos and download them as MP3 files. I have built the site with the search etc., but I am having a problem with the download part (I've already worked out how to get the YouTube audio file). The audio format is originally audio/mp4, so I need to convert it to MP3, but first I need to get the file onto the server.
So on the download page I've made a script that sends an AJAX request to the server to start downloading the file. It then polls a different page every few seconds to find out the progress and updates it on the page the user is viewing.
However, the problem is that while the video is downloading, the whole website freezes (no pages load until the file is fully downloaded), so when the script tries to find out the progress, it can't until the download is completely done.
The script which downloads the file:
<?php
session_start();
if (isset($_GET['yt_vid']) && isset($_GET['yrt'])) {
    set_time_limit(0); // to prevent the script from stopping execution
    include "assets/functions.php";
    define('CHUNK', (1024 * 8 * 1024));
    if ($_GET['yrt'] == "gphj") {
        $vid = $_GET['yt_vid'];
        $mdvid = md5($vid);
        // check if the file already exists; if not, proceed to downloading it
        if (!file_exists("assets/videos/" . $mdvid . ".mp4")) {
            // urlScraper() gets the audio file URL; it sends a simple cURL
            // request and takes less than a second to complete
            $url = urlScraper($vid);
            if (!isset($_SESSION[$mdvid])) {
                $_SESSION[$mdvid] = array(time(), 0, retrieve_remote_file_size($url));
            }
            $file = fopen($url, "rb");
            // The file is stored on the server so it doesn't have to be downloaded every time
            $localfile_name = "assets/videos/" . $mdvid . ".mp4";
            $localfile = fopen($localfile_name, "w");
            $time = time();
            while (!feof($file)) {
                $_SESSION[$mdvid][1] = (int)$_SESSION[$mdvid][1] + 1;
                // write through the handle opened above instead of reopening
                // the file with file_put_contents() on every chunk
                fwrite($localfile, fread($file, CHUNK));
            }
            echo "Execution time: " . (time() - $time);
            fclose($file);
            fclose($localfile);
            $result = curl_result($url, "body");
        } else {
            echo "Failed.";
        }
    }
}
?>

I also had that problem in the past. The reason it does not work is that the session file can only be held open for writing by one script at a time: as long as your download script keeps the session open, every other request that calls session_start() blocks until it finishes.
What you need to do is modify your download script and call session_write_close() directly after each write to the session, like:
session_start();
if (!isset($_SESSION[$mdvid])) {
    $_SESSION[$mdvid] = array(time(), 0, retrieve_remote_file_size($url));
}
session_write_close();
and also in the while loop:
while (!feof($file)) {
    session_start();
    $_SESSION[$mdvid][1] = (int)$_SESSION[$mdvid][1] + 1;
    session_write_close();
    file_put_contents($localfile_name, fread($file, CHUNK), FILE_APPEND);
}
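With the session lock released after every write, a separate progress endpoint can read the session without blocking. A minimal sketch of such an endpoint (the filename progress.php and the JSON response shape are my assumptions, not from the question):

```php
<?php
// progress.php - a sketch of the polling endpoint the question describes.
session_start();
$mdvid = md5(isset($_GET['yt_vid']) ? $_GET['yt_vid'] : '');
// Copy the progress data, then release the session lock immediately so
// this request never blocks the download script or any other page.
$progress = isset($_SESSION[$mdvid]) ? $_SESSION[$mdvid] : null;
session_write_close();

header('Content-Type: application/json');
if ($progress === null) {
    echo json_encode(array('status' => 'unknown'));
} else {
    // $progress = array(startTime, chunksRead, totalBytes), as set by the downloader
    list($started, $chunks, $total) = $progress;
    $bytes = $chunks * 1024 * 8 * 1024; // CHUNK size from the download script
    echo json_encode(array(
        'status'  => 'downloading',
        'percent' => $total > 0 ? min(100, round($bytes / $total * 100)) : 0,
    ));
}
```

The front-end poller can then request progress.php?yt_vid=... every few seconds and read the percentage.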

Related

How to have every visitor on a website be on the same point of an audio file to simulate a scheduled live stream?

I am attempting to simulate a scheduled live audio stream without the use of any third-party tools/software. To do so, I would need every visitor on the website to be at the same point in the audio file. My initial plan was to have a PHP script that keeps track of the time and writes it to a .json file:
ini_set('max_execution_time', 0);
include 'mp3file.class.php';
$file = "./audioDuration.json";
$mp3file = new MP3File("Nightride.mp3");
$duration = $mp3file->getDurationEstimate();
$tracker = 0;
while ($tracker < $duration) {
    $tracker++;
    file_put_contents($file, $tracker);
    sleep(1);
}
And the Javascript :
$.getJSON("audioDuration.json",
    function(returnedData) {
        document.getElementById('audioElement').currentTime = returnedData;
    }
);
However, being completely new to PHP, I did not realize that any user can run this script from their own browser, which would cause audioDuration.json to contain the wrong data. I've done some research, and it appears there are ways to have a PHP script only run if the server requests it. I am not sure if this is the most practical way to accomplish this.
I feel you should use a server-side resource to be sure every client gets the same "time" to set up your audio file.
Why not use the server clock, e.g. PHP's date('H:i:s') function? If you have a 1-hour-long file, you can ignore the hours and use only the minutes and seconds to work out where the audio file should start.
And you don't even need JavaScript to call the server to get the value. If you use PHP to generate your HTML, you can print the value directly into the page's JavaScript when rendering it, something like:
echo '<script type="text/javascript">
document.getElementById("audioElement").currentTime = ' . $timer . ';
</script>';
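The suggestion above can be sketched as follows (the 3600-second duration is a placeholder; a real page would use the actual track length):

```php
<?php
// Sketch: derive the playback position from the server clock, so every
// visitor computes the same offset into the looping audio file.
$durationSeconds = 3600; // length of the audio file (placeholder value)

// Seconds elapsed since the top of the hour, server time:
$secondsIntoHour = (int)date('i') * 60 + (int)date('s');

// Position within the track, wrapping around if the track is shorter
// than an hour:
$timer = $secondsIntoHour % $durationSeconds;

// Inject the value into the page's JavaScript while rendering the HTML:
echo '<script type="text/javascript">
document.getElementById("audioElement").currentTime = ' . $timer . ';
</script>';
```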

PHP Video Stream Seekbar Unusable in Chrome

This is somewhat related to my other PHP video streaming post, but this time the issue is that the seekbar for the videos does not work in Chrome.
I have found several different posts about it here at Stack Overflow, but none of them have resolved the issue. I would link all of them, but I can't seem to find the same posts I found yesterday.
I am going to list two versions of the PHP code. I should also point out what exactly I'm doing before the PHP loads the video data. On an HTML page, I have a <video> tag without <source> tags. I use Javascript to make an AJAX call to a PHP file that has the source tags. The source tags themselves don't contain direct links to the video source files. Instead, they reference yet another PHP file that loads the data.
Top level HTML For Video. Super simple.
<video id="showvideo" height="540" width="864" controls></video>
Now for the AJAX call
function showVideo() {
if (window.XMLHttpRequest) {
// code for IE7+, Firefox, Chrome, Opera, Safari
xmlhttp = new XMLHttpRequest();
} else {
// code for IE6, IE5
xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
}
xmlhttp.onreadystatechange = function() {
if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
document.getElementById("showvideo").innerHTML = xmlhttp.responseText;
}
}
xmlhttp.open("GET", "/firstphpfile.php", true);
xmlhttp.send();
}
The Javascript function loads when the page loads.
Here's the contents of firstphpfile.php
<?php
echo "
<source src=\"http://example.com/video1.php?type=stuff.mp4\" type=\"video/mp4\">
<source src=\"http://example.com/video2.php?type=stuff.ogv\" type=\"video/ogg\">
";
?>
Again, not a big deal. Now I am going to post a couple different versions of the video1.php file that actually grabs the file resource.
Version 1:
<?php
$file = "video.mp4";
$filesize = filesize($file);
$offset = 0;
$length = $filesize;
if (isset($_SERVER['HTTP_RANGE'])) {
    // if the HTTP_RANGE header is set we're dealing with partial content
    $partialContent = true;
    // find the requested range
    // this might be too simplistic, apparently the client can request
    // multiple ranges, which can become pretty complex, so ignore it for now
    preg_match('/bytes=(\d+)-(\d+)?/', $_SERVER['HTTP_RANGE'], $matches);
    $offset = intval($matches[1]);
    $length = intval($matches[2]) - $offset;
} else {
    $partialContent = false;
}
$file = fopen($file, 'r');
// seek to the requested offset, this is 0 if it's not a partial content request
fseek($file, $offset);
$data = fread($file, $length);
fclose($file);
if ($partialContent) {
    // output the right headers for partial content
    header('HTTP/1.1 206 Partial Content');
    header('Content-Range: bytes ' . $offset . '-' . ($offset + $length) . '/' . $filesize);
}
// output the regular HTTP headers
header("Content-Type: video/mp4");
header("Content-Length: $filesize"); // double quotes so the variable is interpolated
header('Accept-Ranges: bytes');
// don't forget to send the data too
print($data);
?>
Version 2 (I like this one better for what it does in Firefox, but still no dice in Chrome)
<?php
$file = "video.mp4";
$mime = "video/mp4"; // The MIME type of the file, this should be replaced with your own.
$size = filesize($file); // The size of the file
// Send the content type header
header('Content-type: ' . $mime);
// Check if it's a HTTP range request
if (isset($_SERVER['HTTP_RANGE'])) {
    // Parse the range header to get the byte offset
    $ranges = array_map(
        'intval', // Parse the parts into integers
        explode(
            '-', // The range separator
            substr($_SERVER['HTTP_RANGE'], 6) // Skip the `bytes=` part of the header
        )
    );
    // If the last range param is empty, it means the EOF (End of File)
    if (!$ranges[1]) {
        $ranges[1] = $size - 1;
    }
    // Send the appropriate headers
    header('HTTP/1.1 206 Partial Content');
    header('Accept-Ranges: bytes');
    header('Content-Length: ' . ($ranges[1] - $ranges[0])); // The size of the range
    // Send the ranges we offered
    header(
        sprintf(
            'Content-Range: bytes %d-%d/%d', // The header format
            $ranges[0], // The start range
            $ranges[1], // The end range
            $size // Total size of the file
        )
    );
    // It's time to output the file
    $f = fopen($file, 'rb'); // Open the file in binary mode
    $chunkSize = 8192; // The size of each chunk to output
    // Seek to the requested start range
    fseek($f, $ranges[0]);
    // Start outputting the data
    while (true) {
        // Check if we have outputted all the data requested
        if (ftell($f) >= $ranges[1]) {
            break;
        }
        // Output the data
        echo fread($f, $chunkSize);
        // Flush the buffer immediately
        @ob_flush();
        flush();
    }
    fclose($f);
} else {
    // It's not a range request, output the file anyway
    header('Content-Length: ' . $size);
    // Read the file
    @readfile($file);
    // and flush the buffer
    @ob_flush();
    flush();
}
?>
So, while both versions play the video without problems, seeking only works in Firefox. The second version only lets you seek backwards there, which I prefer.
There was another version I tried, but I had already deleted the code before writing this and haven't found it again.
I am not sure what I'm doing wrong and no solutions I have found solved the issue of allowing the Chrome version of the video to seek.
OK, so I finally got it to work. I decided not to load the PHP files in with JavaScript.
Also, I got rid of the MIME type variable and just set the header directly. I found that using a variable for the MIME type caused my browsers to load the wrong MIME type for the Content-Type header, which made the video resource fail.
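For reference, here is a minimal sketch of single-range handling with the header arithmetic Chrome expects: an inclusive end byte in Content-Range, and a Content-Length equal to the bytes actually sent rather than the whole file size. The filename is a placeholder, and this handles only a single range, not multipart ranges:

```php
<?php
// Sketch: serve a file while honouring a single HTTP Range request.
function serve_with_ranges($file, $mime = 'video/mp4') {
    $size = filesize($file);
    $start = 0;
    $end = $size - 1;
    if (isset($_SERVER['HTTP_RANGE']) &&
        preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
        $start = intval($m[1]);
        if ($m[2] !== '') {      // an open-ended range ("bytes=500-") means "to EOF"
            $end = intval($m[2]);
        }
        header('HTTP/1.1 206 Partial Content');
        // Content-Range uses an *inclusive* end byte:
        header("Content-Range: bytes $start-$end/$size");
    }
    header('Content-Type: ' . $mime);
    header('Accept-Ranges: bytes');
    // Content-Length is the number of bytes being sent, not the file size:
    header('Content-Length: ' . ($end - $start + 1));

    $fp = fopen($file, 'rb');
    fseek($fp, $start);
    $remaining = $end - $start + 1;
    while ($remaining > 0 && !feof($fp)) {
        echo fread($fp, min(8192, $remaining));
        $remaining -= 8192;
        flush();
    }
    fclose($fp);
}

if (is_file('video.mp4')) { // guard so the sketch only runs when the placeholder file exists
    serve_with_ranges('video.mp4');
}
```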

Cross-Domain Rss Feed Request?

OK, so for about a week now I've been doing tons of research on making XMLHttpRequests to servers and have learned a lot about CORS, AJAX/jQuery requests and the Google Feed API, and I am still completely lost.
The Goal:
There are 2 sites in the picture, both I have access to, the first one is a wordpress site which has the rss feed and the other is my localhost site running off of xampp (soon to be a published site when I'm done). I am trying to get the rss feed from the wordpress site and display it on my localhost site.
The Issue:
I run into the infamous Access-Control-Allow-Origin error in the console. I know I can fix that by setting the header in the website's .htaccess file, but there are online aggregators that can simply read and display the feed when I give them the link. So I don't really know what those sites are doing that I'm not, and what the best way is to achieve this without posing any easy security threats to either site.
I would highly prefer not to use any third-party plugins for this; I would like to aggregate the feed through my own code, as I have done for an RSS feed on the localhost site, but if I have to, I will.
UPDATE:
I've made HUGE progress with learning PHP and have finally got a working bit of code that lets me download the feed files from their various sources and store them in cache files on the server. What I have done is put an AJAX request behind some buttons on my site which switch between the RSS feeds. The AJAX request POSTs a JSON-encoded array containing some data to my PHP file, which downloads the requested feed via cURL (an http_get_contents helper copied from a GitHub dev, as I don't know how to use cURL yet) and stores it in an md5-named cache file; it then filters what I need from the data and sends it back to the front end. However, I now have two more questions... (It's funny how that works: getting one answer and ending up with two more questions.)
Question #1: Where should I store the cache files and the PHP files on the server? I have heard that you are supposed to store them below the web root, but I am not sure how to access them that way.
Question #2: When I look at the site's sources in the browser as I click the buttons that send an AJAX request to the PHP file, the PHP file shows up in the list of source files, and more and more copies of it are downloaded as I keep clicking. Is there a way to prevent this? I may have to implement another method to get this working.
Here is my working php:
<?php
// cURL http_get_contents declaration
function http_get_contents($url, $opts = array()) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);
    curl_setopt($ch, CURLOPT_USERAGENT, "{$_SERVER['SERVER_NAME']}");
    curl_setopt($ch, CURLOPT_URL, $url);
    if (is_array($opts) && $opts) {
        foreach ($opts as $key => $val) {
            curl_setopt($ch, $key, $val);
        }
    }
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    if (false === ($retval = curl_exec($ch))) {
        die(curl_error($ch));
    } else {
        return $retval;
    }
}
// receive and decode the $_POSTed array
$post = json_decode($_POST['jsonString'], true);
$url = $post[0];
$xmn = $post[1]; // starting item index number (e.g. to return 3 items from the feed, starting with the 5th one)
$xmx = $xmn + 3; // max number (so three in total will be returned)
$cache = '/tmp/' . md5($url) . '.html';
$cacheint = 0; // cache lifetime in hours: controls whether the feed is downloaded from the source site or read from the cache file; I will implement a way to check for a newer version on the other site in the future
// if the cache file doesn't exist or is stale, download the feed and write it to the cache file
if (!file_exists($cache) || ((time() - filemtime($cache)) > 3600 * $cacheint)) {
    // download once and reuse the result (the original fetched the URL twice)
    if ($feed_content = http_get_contents($url)) {
        $fp = fopen($cache, 'w');
        fwrite($fp, $feed_content);
        fclose($fp);
    }
}
// parse and echo the results
$content = file_get_contents($cache);
$x = new SimpleXmlElement($content);
$item = $x->channel->item;
echo '<tr>';
for ($i = $xmn; $i < $xmx; $i++) {
    echo '<td class="item"><p class="title clear">' .
        $item[$i]->title .
        '</p><p class="desc">' .
        substr($item[$i]->description, 0, 250) .
        '... <a href="' .
        $item[$i]->link .
        '" target="_blank">more</a></p><p class="date">' .
        $item[$i]->pubDate .
        '</p></td>';
}
echo '</tr>';
?>

write a file on local disk from web app [duplicate]

I am trying to create and save a file to the root directory of my site, but I don't know where it's creating the file, as I cannot see it anywhere. Also, I need the file to be overwritten every time, if possible.
Here is my code:
$content = "some text here";
$fp = fopen("myText.txt","wb");
fwrite($fp,$content);
fclose($fp);
How can I set it to save on the root?
It's creating the file in the same directory as your script. Try this instead.
$content = "some text here";
$fp = fopen($_SERVER['DOCUMENT_ROOT'] . "/myText.txt","wb");
fwrite($fp,$content);
fclose($fp);
If you are running PHP on Apache then you can use the environment variable called DOCUMENT_ROOT. This means that the path is dynamic and can be moved between servers without messing about with the code.
<?php
$fileLocation = getenv("DOCUMENT_ROOT") . "/myfile.txt";
$file = fopen($fileLocation,"w");
$content = "Your text here";
fwrite($file,$content);
fclose($file);
?>
This question was asked years ago, but here is a modern approach using PHP 5 or newer:
$filename = 'myfile.txt';
if (!file_put_contents($filename, 'Some text here')) {
    // writing the file failed (permission problem, maybe); debug or log here
}
If the file doesn't exist in that directory it will be created, otherwise it will be overwritten unless the FILE_APPEND flag is set.
file_put_contents is a built-in function that has been available since PHP 5.
Documentation for file_put_contents
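Combining the two answers, a short sketch that writes (and overwrites) the file at the document root and logs on failure (the path and content are placeholders):

```php
<?php
// Write the file relative to the web root rather than the script's
// working directory, and check for failure explicitly.
$path = $_SERVER['DOCUMENT_ROOT'] . '/myText.txt';
if (file_put_contents($path, "some text here") === false) {
    // a false return usually means a permissions or path problem
    error_log("Could not write $path - check directory permissions");
}
```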
fopen() resolves a relative path against the current working directory, which is normally the directory of the script being executed. In other words, if you're just running the file ~/test.php, your script will create ~/myText.txt.
This can get a little confusing if you're using any URL rewriting (such as in an MVC framework) as it will likely create the new file in whatever the directory contains the root index.php file.
Also, you must have correct permissions set and may want to test before writing to the file. The following would help you debug:
$fp = fopen("myText.txt", "wb");
if ($fp === false) {
    // do debugging or logging here
} else {
    fwrite($fp, $content);
    fclose($fp);
}

Event Source -> Server returns event stream in bulk rather than in chunks

I have a PHP script that imports large data from CSV files with validation.
For that I need to show progress to the user, so I have used event streaming.
When I echo something, I want it to be transferred to the client piece by piece, instead of the server sending the whole output in bulk.
I have already played around with ob_start(), ob_implicit_flush() and ob_flush(), but they didn't work.
My script works perfectly on another server.
The server configurations are given below.
Server configuration on which the code does not respond as desired:
OS: Linux
PHP Version: 5.4.36-0+deb7u3
Server API: CGI/FastCGI
memory_limit: 128M
output_buffering: no value
As I said, the code works properly on another server, which has almost the same configuration:
OS: Linux
PHP Version: 5.4.37
Server API: CGI/FastCGI
memory_limit: 256M
output_buffering: no value
Below is my sample code for sending event:
<?php
header("Content-Type: text/event-stream");
header("Cache-Control: no-cache");
header("Access-Control-Allow-Origin: *");
$lastEventId = floatval(isset($_SERVER["HTTP_LAST_EVENT_ID"]) ? $_SERVER["HTTP_LAST_EVENT_ID"] : 0);
if ($lastEventId == 0) {
    $lastEventId = floatval(isset($_GET["lastEventId"]) ? $_GET["lastEventId"] : 0);
}
echo ":" . str_repeat(" ", 2048) . "\n"; // 2 kB padding for IE
echo "retry: 2000\n";
// event stream
$i = $lastEventId;
while ($i <= 100) {
    if ($i == 100) {
        echo "data: stop\n\n"; // a blank line terminates each SSE message
        ob_flush();
        flush();
        break;
    } else {
        echo "id: " . $i . "\n";
        echo "data: " . $i . ";\n\n";
        ob_flush();
        flush();
        sleep(1);
    }
    $i++;
}
?>
Below is my client page on which I need response:
<!DOCTYPE html>
<html>
<head>
    <meta charset="utf-8" />
    <title>EventSource example</title>
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <script src="../jquery/eventsource.js"></script>
    <script>
        var es = new EventSource("events.php");
        var listener = function(event) {
            console.log(event.data);
            var type = event.type;
            if (event.data == 'stop') {
                es.close();
            } else {
                var div = document.createElement("div");
                div.appendChild(document.createTextNode(type + ": " + (type === "message" ? event.data : es.url)));
                document.body.appendChild(div);
            }
        };
        var errlistener = function(event) {
            es.close();
        };
        es.addEventListener("open", listener);
        es.addEventListener("message", listener);
        es.addEventListener("error", errlistener);
    </script>
</head>
<body>
</body>
</html>
Your best method to return chunked data to the browser is to use WebSockets: have the client open a socket to your file reader, and you can then send the data to the browser in chunks without a problem.
Then, once it has finished, you can close the socket.
A good tutorial for WebSockets:
http://www.phpbuilder.com/articles/application-architecture/optimization/creating-real-time-applications-with-php-and-websockets.html
With this method you could also implement verification, so the server doesn't just push chunks; it sends each chunk on request from the JavaScript.
So your client could say "I need chunk 5" and your server would implement something like:
$requestedChunk = 5; // this would be set by the JavaScript sending the request
$chunkSize = 256; // this would be your chunk size
$readPosition = $requestedChunk * $chunkSize;
Link no longer works, so here is one built on Ratchet: https://blog.samuelattard.com/the-tutorial-for-php-websockets-that-i-wish-had-existed/
I had a similar problem. Event streams were working as expected (returning chunks) on a server using the Apache 2.0 handler, but not on a server using FastCGI (which returned everything in bulk). I assumed that something in FastCGI was the culprit, so I tried to resolve the problem by switching to CGI. Now the event stream works as expected.
Whether you use CGI or FastCGI, the Server API shows up as CGI/FastCGI, so I assume the server it works on for you is running CGI and the server it doesn't work on is running FastCGI. Try changing the non-working server to CGI.
As for why it doesn't work in FastCGI, I'm not entirely sure, but unless CGI is impossible in your setup, the above solution should work.
Many things can prevent a chunked response, including but not limited to:
A proxy or any other buffering mechanism on the web server
output_buffering set to on in php.ini (you should explicitly set it to off)
gzip compression enabled on the web server
You should check these first.
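As a concrete starting point, the PHP-side buffering layers can usually be disabled like this (X-Accel-Buffering is an nginx-specific header; whether your particular FastCGI chain honours these settings is something to verify on your server):

```php
<?php
// Sketch: prepare an SSE response so each echo reaches the client immediately.
header('Content-Type: text/event-stream');
header('Cache-Control: no-cache');
// Tell nginx (if it is buffering the FastCGI/proxy response) not to buffer:
header('X-Accel-Buffering: no');

// Turn off zlib compression and discard any active PHP output buffers:
@ini_set('zlib.output_compression', '0');
while (ob_get_level() > 0) {
    ob_end_clean();
}

echo "data: ping\n\n";
flush();
```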
