Export PHP table with pagination to EXCEL, PDF, PRINTABLE - javascript

How can I export my PHP table with pagination to Excel and PDF? I've tried many plugins and tutorials on YouTube with no luck: they only export the data currently displayed, not the entire table. I want to export my table to at least a printable version, and ideally to PDF or Excel format. My table has a maximum of 100 rows, paginated at 10 rows per page. How can I do this? I need your expertise.
PHP Export to branch:
<?php
include "connect.php";
require('lib/js/fpdf.php');

$result = mysqli_query($conn, "SELECT * FROM tblSales");
$header = mysqli_query($conn, "SELECT UCASE(`COLUMN_NAME`)
    FROM `INFORMATION_SCHEMA`.`COLUMNS`
    WHERE `TABLE_SCHEMA` = 'DB_NAME'
      AND `TABLE_NAME` = 'tblSales'
      AND `COLUMN_NAME` IN ('date', 'sales')");

$pdf = new FPDF();
$pdf->AddPage();
$pdf->SetFont('Arial', 'B', 16);

foreach ($header as $heading) {
    foreach ($heading as $column_heading)
        $pdf->Cell(95, 12, $column_heading, 1);
}
foreach ($result as $row) {
    $pdf->Ln();
    foreach ($row as $column)
        $pdf->Cell(95, 12, $column, 1);
}
$pdf->Output();
?>
AJAX Code:
$(document).on('click', '.export-branch-excel', function () {
    var branch = $("#branch-hidden-data").val();
    $.ajax({
        url: "export-branch-excel.php",
        data: "id=" + branch,
        method: "POST",
        dataType: "text",
        success: function (data) {
            window.location = "export-branch-excel.php";
        }
    });
});

@lawrence agulto I changed this code to the procedural style as you asked. Try this code.
Download in Excel:
<?php
include_once "connect.php";

$query = mysqli_query($conn, "SELECT * FROM tblSales ORDER BY date DESC");

$columnHeader = "Column Name" . "\t" . "Column Name1" . "\t" . "Column Name2" . "\t" . "Column Name3" . "\t";

$setData = '';
if (mysqli_num_rows($query) > 0) {
    while ($rec = mysqli_fetch_assoc($query)) {
        $rowData = '';
        foreach ($rec as $value) {
            $value = '"' . $value . '"' . "\t";
            $rowData .= $value;
        }
        $setData .= trim($rowData) . "\n";
    }
}

header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=Nobelz_Sushank.xls");
header("Pragma: no-cache");
header("Expires: 0");

echo "\t\tSales Data\n";
echo ucwords($columnHeader) . "\n" . $setData . "\n";
?>
Download in PDF:
Download the FPDF library from here.
Change the query, table name, and TABLE_COLUMN_NAME placeholders according to your needs.
<?php
include_once "connect.php";
require('fpdf181/fpdf.php');

$result = mysqli_query($conn, "YOUR QUERY");
$header = mysqli_query($conn, "SELECT UCASE(`COLUMN_NAME`)
    FROM `INFORMATION_SCHEMA`.`COLUMNS`
    WHERE `TABLE_SCHEMA` = 'DB_NAME'
      AND `TABLE_NAME` = 'TABLE_NAME'
      AND `COLUMN_NAME` IN ('TABLE_COLUMN_NAME', 'TABLE_COLUMN_NAME1', 'TABLE_COLUMN_NAME2', 'TABLE_COLUMN_NAME3')");

$pdf = new FPDF();
$pdf->AddPage();
$pdf->SetFont('Arial', 'B', 16);

foreach ($header as $heading) {
    foreach ($heading as $column_heading)
        $pdf->Cell(95, 12, $column_heading, 1);
}
foreach ($result as $row) {
    $pdf->Ln();
    foreach ($row as $column)
        $pdf->Cell(95, 12, $column, 1);
}
$pdf->Output();
?>

The subject is way too broad. What you need to do is:
supply a download link on the page (which means that any parameters needed for the export need to be saved in a session; or you can do the more difficult "download after AJAX POST")
the link runs the same query you use to paginate, with the necessary parameters (see above), but without the pagination. It then retrieves a number of rows, possibly a large number of rows. You are already 95% done on this.
you use a PHP library (the new PHPExcel, now PhpSpreadsheet, for flexibility and features; Spout for speed) or a templating library (e.g. TBS) to pull these rows into an Excel file. This is the part you need to flesh out; see the sketch after this list.
send the Excel file along as a binary download.
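For the Excel part, a minimal sketch with PhpSpreadsheet, assuming it is installed via Composer; the query, column names, filename and connect.php connection are placeholders taken from the question, not your actual schema:
<?php
// Hypothetical export.php sketch: run the same query as the paginated page,
// minus LIMIT/OFFSET, and stream the result out as a real .xlsx file.
use PhpOffice\PhpSpreadsheet\Spreadsheet;
use PhpOffice\PhpSpreadsheet\Writer\Xlsx;

require 'vendor/autoload.php';
include 'connect.php'; // assumed mysqli connection in $conn, as in the question

$result = mysqli_query($conn, "SELECT date, sales FROM tblSales"); // no pagination

$spreadsheet = new Spreadsheet();
$sheet = $spreadsheet->getActiveSheet();
$sheet->fromArray(array('Date', 'Sales'), null, 'A1'); // header row

$rowIndex = 2;
while ($row = mysqli_fetch_row($result)) {
    $sheet->fromArray($row, null, 'A' . $rowIndex++);
}

header('Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
header('Content-Disposition: attachment; filename="sales.xlsx"');

$writer = new Xlsx($spreadsheet);
$writer->save('php://output');
exit;
Spout follows the same pattern but streams row by row, which matters more once the row count grows far beyond 100.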
The PDF version is the same (if a bit more complicated, since you normally need to compose the PDF yourself; there are ways to do otherwise, but they require access to the server, e.g. TBS + unoconv), except that you use a PDF library instead of an Excel one, e.g. FPDF or TCPDF.
For a quick result, and if you don't care about a formatted Excel file (backgrounds, logos, borders...), you can replace the Excel part with a "fake" Excel: create an HTML table, a CSV file, or (even worse) a simple unescaped tab-separated blob of data, give it an XLS(X) extension and go. From my experience, this gives you the worst possible result: something that works 95-97% of the time (1). Great expectations, periodically dashed, with a huge maintenance backlog (2). If it's a pet project, go for it; if you need it in a professional setting, take the time to do it right.
(1) the remaining 3-5% being made up of UTF-8 characters, quotes, and numbers and dates written in any format except the one Excel expects on import, etc.
(2) add date formatting. Improve quoting. Detect UTF-8. Detect UTF-8 which was actually ISO-8859-15. Wait, unescaped. And so on and so forth. Oh, and every time, everything must be done yesterday. And why does it still not work?
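That said, if you do go the quick route anyway, at least let fputcsv() handle the quoting and prepend a UTF-8 BOM so Excel detects the encoding. A minimal sketch, using the table and column names from the question as placeholders:
<?php
// Hedged sketch of the "quick" CSV route with correct quoting and UTF-8.
include "connect.php"; // assumed mysqli connection in $conn

header('Content-Type: text/csv; charset=UTF-8');
header('Content-Disposition: attachment; filename="sales.csv"');

$out = fopen('php://output', 'w');
fwrite($out, "\xEF\xBB\xBF");             // UTF-8 BOM so Excel detects the encoding
fputcsv($out, array('date', 'sales'));    // header row

$result = mysqli_query($conn, "SELECT date, sales FROM tblSales");
while ($row = mysqli_fetch_assoc($result)) {
    // format dates explicitly so Excel does not have to guess (footnote 1)
    $row['date'] = date('Y-m-d H:i:s', strtotime($row['date']));
    fputcsv($out, $row);                  // fputcsv handles quoting and escaping
}
fclose($out);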

Related

How to save a TensorFlow model's multiple files in PHP

I have developed a neural network model with TensorFlow. I want to save my model's weights each time they update, so I thought of updating a file on the server every time it learns. But the documentation on the TensorFlow.js website about saving a model using an HTTP server is really confusing. I did some research and found some code, but it still isn't working. I know I am missing the "multipart/form-data" handling and the fact that there are two files: "The body consists of two files, with filenames model.json and model.weights.bin". I could not find anything that could help me. Link to the TensorFlow documentation!
javascript to save
model.save('http://example.com/save.php');
save.php
<?php
$putdata = fopen("php://input", "r");
$fname = "weights.json";
$file = fopen("../static/" . $fname, 'w');
while ($data = fread($putdata, 1024)) {
    fwrite($file, $data);
}
fclose($file);
fclose($putdata);
?>
http://php.net/manual/en/function.file-put-contents.php
I don't see the input section with PHP stdin. This makes me feel like the connection isn't sending a stream like a socket connection, but rather a standard HTTP payload with a body. But wait, there are caveats. If it is a JSON payload you'll need the second two lines (like you had in your code, but not as a resource). The first two are my guess as to what may be going on. Remember you can debug through your browser's console to see the data payload, request method, etc.
$data = '<pre>' . json_encode($_POST) . '</pre>';
file_put_contents('stdPost.html', $data);
$data = file_get_contents('php://input');
file_put_contents('stdInput.json', $data);
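Since model.save('http://...') issues a multipart/form-data POST with two attached files (model.json and model.weights.bin), PHP should expose them through $_FILES rather than php://input. A minimal sketch along those lines; the ../static/ target directory is taken from your code, everything else is an assumption:
<?php
// Hypothetical save.php sketch: accept the two files tf.js uploads
// (model.json and model.weights.bin) as a normal multipart/form-data POST.
$targetDir = __DIR__ . '/../static/'; // same directory as in the question

foreach ($_FILES as $upload) {
    // basename() strips any path component from the client-supplied filename
    $dest = $targetDir . basename($upload['name']);
    if (!move_uploaded_file($upload['tmp_name'], $dest)) {
        http_response_code(500);
        exit('Could not save ' . $upload['name']);
    }
}

header('Content-Type: application/json');
echo json_encode(array('status' => 'ok'));
If $_FILES comes up empty, the debugging snippet above will show you what actually arrived in $_POST and php://input.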

How to generate a .doc using AJAX and PHP

Today I need some help. I know it's not that hard, and there is a lot of help for doing this in PHP on this site, but I couldn't find anything with AJAX, the technology I am learning now and want to master some day.
My code is the following.
$(".descarga").click(function(){
var paquete={idArchivo:$(this).val()};
$.post("includes/descargarPublicacion.php",paquete,procesarDatos);
});
So when a button with the "descarga" class is clicked, I build a "packet", which I send with the POST method to the PHP file called descargarPublicacion.php.
This is what the PHP file looks like:
<?php
session_start();
$mysqli = new mysqli("localhost", "root", "", "registroflashback");
if (isset($_GET['idArchivo'])) {
    header("Content-type: application/vnd.msword");
    header("Cache-Control: must-revalidate,post-check=0, pre-check=0");
    header("Content-disposition:attachment;filename=yeaboi.doc");
    header("Expires: 0");
    $idPubliGet = $_GET['idArchivo'];
    $resultadoBusqueda = $mysqli->query("SELECT * FROM publicaciones WHERE idPubli='$idPubliGet'");
    if ($resultadoBusqueda->num_rows > 0) {
        //$resultadoBusqueda['titulo'];
        echo 'descarga exitosa';
    } else {
        echo 'descarga no exitosa';
    }
} else {
    echo 'descarga no exitosa';
}
?>
I did a little research and people told me to use the headers to convert the file and download it, but it doesn't work for me; it doesn't generate any file. However, it does execute the echo 'descarga exitosa', which I use as the return value for the following function in the JS file.
function procesarDatos(datos_devueltos){
    alert(datos_devueltos);
    if (datos_devueltos == "descarga exitosa") {
        $("#alertaDescarga").show(1000);
    }
    if (datos_devueltos != "descarga exitosa") {
        $("#alertaDescargaError").show(1000);
    }
}
How could I generate a .doc file from HTML using AJAX and jQuery? I know I'm almost there; it must be some small detail, but I don't know which one, which is why I'm asking for experienced help. Thank you!
I do not understand why you want to serve the .doc file via AJAX. In my opinion it's easier to just provide a valid .doc over a normal GET request.
$(".descarga").click(function(){
//onClick transfer id via Get-Param and download file
window.location = "includes/descargarPublicacion.php?idArchivo="+$(this).val();
});
PHP part (descargarPublicacion.php):
<?php
if (isset($_GET['idArchivo'])) {
    header("Content-type: application/vnd.msword");
    header("Cache-Control: must-revalidate,post-check=0, pre-check=0");
    header("Content-disposition:attachment;filename=yeaboi.doc");
    header("Expires: 0");
    // ID is available via GET because we send it as a URL param
    $idPubliGet = $_GET['idArchivo'];
    // #TODO fetch the relevant data with the given ID
    // #TODO generate valid(!) doc file output
    // - just echo'ing something will not result in a valid document for Word
    echo $documentContent;
}
?>
Providing/generating a valid Word document is a little bit more complicated. I would recommend looking into a library which does all the work for you.
E.g. https://github.com/PHPOffice/PHPWord
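For example, a minimal sketch with PHPWord, assuming it is installed via Composer and that $fila is the row you already fetched for the requested idArchivo (the column names are guesses):
<?php
// Hypothetical descargarPublicacion.php sketch using PHPWord.
require 'vendor/autoload.php';

$phpWord = new \PhpOffice\PhpWord\PhpWord();
$section = $phpWord->addSection();
$section->addText($fila['titulo']);      // e.g. the publication title
$section->addText($fila['contenido']);   // e.g. the publication body

header('Content-Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document');
header('Content-Disposition: attachment; filename="yeaboi.docx"');

$writer = \PhpOffice\PhpWord\IOFactory::createWriter($phpWord, 'Word2007');
$writer->save('php://output');
exit;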
If you instead want to serve just a simple .txt file, change your Content-Type header to text/plain and the filename to yeaboi.txt, and print/echo out the text content.

Improve Page Performance, save PHP array on server?

Is it possible to store a PHP array on my server? Right now it gets rebuilt from a CSV file every time someone reloads the page, which is unnecessary since the file only changes once per hour.
At the moment the page takes about 9 seconds to load, which is quite long. The CSV file has 10k+ rows with 9 elements per row, so it would be really good for performance if the server didn't have to process 100k+ elements for each user.
I already have a cronjob for downloading the CSV file, so it would be good if the parsing step could run right after the download finishes, only once per hour.
cronjob:
<?php
function download_remote_file($file_url, $save_to) {
    $content = file_get_contents($file_url);
    file_put_contents($save_to, $content);
}
download_remote_file(<url here>, realpath(".") . '/dump.csv');
?>
and this happens with every reload of the page:
1st: Parse data to array
$url = 'dump.csv';
$csvData = file_get_contents($url);
$lines = explode(PHP_EOL, $csvData);
$array = array();
foreach ($lines as $line) {
    // escape backslashes and hashes as HTML entities
    $line = str_replace("\\", "&#92;", $line);
    $line = str_replace("#", "&#35;", $line);
    $array[] = str_getcsv($line);
}
2nd: pass array to Javascript
var array = <?php echo json_encode( $array ) ?>;
3rd: create HTML table
//some code
4th: initialise data table plugin
$(document).ready(function () {
    createtable();
    $('#scoreboard').DataTable({
        "iDisplayLength": 50,
        language: {
            decimal: ".",
        },
        "lengthMenu": false,
        "bLengthChange": false
    });
});
Is there something that could be done faster?
For example, as mentioned, saving the PHP array server-side, or maybe saving the JS array along with the HTML table somehow?
-Innerwolf
After you parse your CSV, do this:
$file = fopen('/tmp/output.js', 'w');
// write plain JavaScript (no <script> tags, since the file is loaded via src)
fwrite($file, 'var array = ');
fwrite($file, json_encode($array));
fwrite($file, ';');
fclose($file);

copy('/path/to/script.js', '/path/to/script.js.bak');
rename('/tmp/output.js', '/path/to/script.js');
Then, later on when you are outputting the HTML, you just need to stick in a:
<script type="text/javascript" src="/scripts/script.js">
in the header. People's browsers should cache it properly too. Note the copy and move -- you don't strictly need to make a backup copy, but you MUST use a move() to replace the 'live' script -- move() is atomic, more or less, and won't result in anyone getting a half-file.
Also, note that you'll need write permissions to where the script is -- there are ways to keep this pretty secure (not letting your PHP script write all over the hard drive), but that's out of scope here.
Since you mention getting the data on an hourly basis, I suggest the following:
grab the CSV file with cron and store the data in a database on an hourly basis
configure your DataTables component to use server-side data (see the sketch after this list)
This way you won't force every user to download the entire array on every first page load.
The server-side script only fetches the number of records that need to be displayed on that particular page of the table.
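A minimal sketch of such a server-side endpoint, assuming the cron job now imports dump.csv into a scores table and that a connect.php like the one in the other snippets provides $conn (all names are placeholders):
<?php
// Hypothetical scoreboard-data.php sketch: a minimal server-side endpoint for
// DataTables. It answers the draw/start/length parameters DataTables sends and
// returns only the rows for the requested page.
include "connect.php";

$start  = isset($_GET['start'])  ? (int)$_GET['start']  : 0;  // row offset sent by DataTables
$length = isset($_GET['length']) ? (int)$_GET['length'] : 50; // page size sent by DataTables
$draw   = isset($_GET['draw'])   ? (int)$_GET['draw']   : 1;  // echoed back to match responses

$total = mysqli_fetch_row(mysqli_query($conn, "SELECT COUNT(*) FROM scores"));
$total = (int)$total[0];

$res  = mysqli_query($conn, "SELECT * FROM scores LIMIT $start, $length");
$rows = array();
while ($row = mysqli_fetch_row($res)) {
    $rows[] = $row;
}

header('Content-Type: application/json');
echo json_encode(array(
    'draw'            => $draw,
    'recordsTotal'    => $total,
    'recordsFiltered' => $total, // no search filtering in this sketch
    'data'            => $rows,
));
On the client side you would then set serverSide: true and point the DataTables ajax option at this script instead of embedding the whole array in the page.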

Sending data to PHP page to make a spreadsheet out of it

For some reason, when I try to send JSON data to a PHP page (where it should be downloaded as a spreadsheet), it runs without error but doesn't bring up the prompt to download the spreadsheet. The JSON is generated without problems (I previously had the PHP page create the file on the server, before trying to make it download without creating it).
Here is the JavaScript code that sends the JSON data to the server:
function writeToSpreadsheet()
{
    // get the JSON for #theTable
    var tableJSON = tableToJSON("tr:not(#titleRow)");
    //alert(tableJSON);
    alert("Sending table data to be written to the spreadsheet...");
    $.post('/ResearchProject/tableContent/exportTable.php', {'table': tableJSON})
        .done(function(response) { alert(((response == '') ? response : (tableJSON.title + ' written to file!'))); })
        .fail(function (xhr, ajaxOptions, thrownError) { alert("ERROR:" + xhr.responseText + " - " + thrownError); });
}
and here is exportTable.php
<?php
function cleanData(&$str)
{
    $str = preg_replace("/\t/", "\\t", $str);    // escaping all of the tabs
    $str = preg_replace("/\r?\n/", "\\n", $str); // escaping any and all cases of carriage return
    // if there is a single double-quote in the string, we wrap the string in quotes,
    // replace every double-quote with two double-quotes, and end with a double-quote
    if (strstr($str, '"')) $str = '"' . str_replace('"', '""', $str) . '"';
}

// the data is coming from a JSON object that is being sent here
if (isset($_POST['table']))
{
    $tableJSON = $_POST['table']; # somehow, this is already a PHP array (exactly the one we need)!!
    // get the name of the table from the $tableJSON
    $tableName = $tableJSON['title'];
    // get the title row from $tableJSON
    $titleRow = $tableJSON['titleRow'];
    // fix the titleRow
    foreach ($titleRow as $heading)
    {
        $heading = trim(preg_replace('/\s+/', ' ', $heading));
    }
    // get the rows from $tableJSON
    $rows = $tableJSON['rows'];
    // form the filename from the tableName
    $fileName = $tableName . '.xls';
    // here, we download the file without even creating it
    header("Content-Disposition: attachment; filename=\"$fileName\"");
    header("Content-Type: application/vnd.ms-excel");
    // we echo the titleRow first
    array_walk($titleRow, 'cleanData');
    echo implode(chr(9), $titleRow) . "\r\n";
?>
<script>console.log('Title row written to file.');</script>
<?php
    // now we echo the data
    foreach ($rows as $row)
    {
        array_walk($row, 'cleanData');
        echo implode(chr(9), $row) . "\r\n";
?>
<script>console.log('Data row written to file.');</script>
<?php
    }
}
else
{
    echo 'You sent me no data :(\n';
}
?>
OK, MikeWarren, how do I test this??
You can test it by selecting a table from the dropdown menu and clicking the "Export table to spreadsheet" button here: http://dinotator.biokdd.org/ResearchProject/tableViewer.php
I am trying to have it so the table that is on the HTML page gets converted into a JSON object and then downloaded. Thus, I would need to POST the data to the PHP page, right? (Query strings don't work.)
Query strings won't work because you are using jQuery's $.post call which means that your data is sent in the body of the request, as opposed to a query string which is what a GET uses. For JSON you do indeed want to use a POST.
As for what's going wrong, you need to decode your JSON into a PHP array using json_decode. Unfortunately PHP can't simply handle the JSON as it is.
So most likely you'll want to do:
// now a poorly named variable
$tableJSON = json_decode($_POST['table']);
Also, looking at your Ajax, $.post does accept a .fail() listener, but it doesn't pass any error data as part of the callback. So if you want to be able to handle incoming response errors you'll need to use $.ajax:
$.ajax({
    type: "POST",
    url: "/your/url.php",
    dataType: "json",
    error: errorCallback
});
Finally, looking at how your code is structured: if you're actually trying to save to a file, you're going to need some more logic. Right now you're just rendering that table and returning it as a response, which will show up in your done function. You'll need to add some more logic to make it actually download. This question covers your exact problem.
Good luck!
I have found so much bad advice on the internet about how to solve this problem. One of the answers here didn't work for me either. :(
I decided to get advice from a friend of mine, and he and I settled on this approach:
Have my exportData.php simply write the data to $_SESSION, echo a JSON-encoded "success", and then exit (a sketch of this step follows the list)
On the client side, if "success" was received, have the JavaScript open a new tab pointing at a file I created called downloadFile.php, which actually does the downloading.
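A minimal sketch of that first step, assuming the same table JSON shape as exportTable.php above (title, titleRow, rows); the exact field handling is a guess:
<?php
// Hypothetical exportData.php sketch for step 1: stash the posted table in the
// session, report success as JSON, and leave the actual download to downloadFile.php.
session_start();

if (!isset($_POST['table'])) {
    header('Content-Type: application/json');
    echo json_encode(array('status' => 'error', 'message' => 'No data received'));
    exit;
}

$table = $_POST['table'];
if (is_string($table)) {
    $table = json_decode($table, true); // only needed if the client sent a JSON string
}

$_SESSION['fileName'] = $table['title'] . '.xls';
$_SESSION['titleRow'] = $table['titleRow'];
$_SESSION['rows']     = $table['rows'];

header('Content-Type: application/json');
echo json_encode(array('status' => 'success'));
exit;
On success, the client's done() handler can then simply call window.open('downloadFile.php') (or set window.location) so the browser navigates to the download instead of receiving it as an AJAX response.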
Why didn't sending the data between files work?
Downloading data entails setting the right headers and printing the data. When you send the data via AJAX to the file that does this, the buffer the data is printed to is the one for the AJAX response. You can see this by writing something like
success: function(response)
{
    alert(response);
}
and watching the data you "downloaded" get printed on-screen instead of downloaded.
However, if you navigate to the file instead of simply passing data to it, your data will download, provided the file has access to the data you are trying to download. You can see examples of this here: www.the-art-of-web.com/php/dataexport/ . In those examples the data was "static" (that is, existing only in the scope of that PHP file until the download happened).
We then see that we should let another file handle the downloading. Here is what its contents should look like:
<?php
if (!isset($_SESSION))
    session_start();

function cleanData(&$str)
{
    $str = preg_replace("/\t/", "\\t", $str);    // escaping all of the tabs
    $str = preg_replace("/\r?\n/", "\\n", $str); // escaping any and all cases of carriage return
    // if there is a single double-quote in the string, we wrap the string in quotes,
    // replace every double-quote with two double-quotes, and end with a double-quote
    if (strstr($str, '"')) $str = '"' . str_replace('"', '""', $str) . '"';
}

// get the data from $_SESSION
if (isset($_SESSION))
{
    $fileName = $_SESSION['fileName'];
    $titleRow = $_SESSION['titleRow'];
    $rows = $_SESSION['rows'];
    // set the Excel headers
    header("Content-Type: application/vnd.ms-excel");
    //header("Content-type: application/octet-stream");
    header("Content-Disposition: attachment; filename=\"$fileName\"");
    header("Pragma: no-cache");
    header("Expires: 0");
    // attempt download
    array_walk($titleRow, 'cleanData');
    echo implode(chr(9), $titleRow) . "\r\n";
    // now we echo the data
    foreach ($rows as $row)
    {
        array_walk($row, 'cleanData');
        echo implode(chr(9), $row) . "\r\n";
    }
}
else
{
    die('Problem with session variable. Data could not be sent for download.');
}
exit;
?>
Of course, before doing this, make sure that you have 'fileName', 'titleRow', and 'rows' already written to $_SESSION.
This should help anyone having trouble downloading an HTML table to an Excel spreadsheet via PHP, and the best part is that you don't have to bloat your server with an entire library, potentially for the functionality of one button!

Formatting datetime in JSON script

This is my PHP/JSON script from localhost:
http://www16.zippyshare.com/v/6486125/file.html is the link if you need to download the PHP files to edit them in your answers, if you want. (The link to the JSON file is mentioned in large-schedule.js in the file. Instructions on usage are provided.)
It partially works (the file echoes the data).
This is the code:
<?
header('Content-type: application/json; charset=utf-8');
header("access-control-allow-origin: *");

$link = mysql_pconnect("localhost", "test", "test") or die("Could not connect");
mysql_select_db("radiostations") or die("Could not select database");

$arr = array();
$rs = mysql_query("SELECT * FROM radio1r1");
while ($obj = mysql_fetch_object($rs)) {
    $arr[] = $obj;
}

echo '{"success":true,"error":"","data":{"schedule":['.json_encode($arr).'}';
echo isset($_GET['callback'])
    ? "{$_GET['callback']}($json)"
    : $json;
However, I cannot get the contents of the fields startminutes and endminutes (stored as DATETIME) to display as 01/02/2013 00:00:00 within the JSON; instead they come out as
01/02/\2013 00:00:00
The fields I have are in the SQL file above.
As a PHP/JSON file the code works at a basic level and I can do callbacks well, but is there an easier way to get the success/error/data structure to display without building it manually?
As for the query string callback, I intend to do it so it has these 4 stations with different results from the MySQL tables:
Radio 1
Anytown FM
Big City FM
so the callback would look like
http://127.0.0.1/phpradiostation/radioschedule-json.php?callback=?&name=Anytown+FM
or
http://127.0.0.1/phpradiostation/radioschedule-json.php?callback=?&name=Big+City+FM
I have got it halfway there with regard to the JSON, but it displays a blank page despite there being data in the database!
PHP info: I'm using 5.4.1.0, on MAMP, OS X Mavericks, if that's relevant.
Basically, what I am asking for is help with actually getting it to display the data in the JavaScript.
Any help is appreciated!
As far as I can understand:
The first thing I notice is that you are using json_encode in the wrong way. What you should do is create a multidimensional array and use json_encode to convert that whole array to JSON, rather than building the JSON string manually.
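A minimal sketch of that idea, reusing $rs from the code above: build one PHP array for the whole response, format the DATETIME columns with date(), and let json_encode() produce the JSON in a single call. The column names (startminutes, endminutes) are the ones mentioned in the question and may need adjusting:
<?php
$schedule = array();
while ($row = mysql_fetch_assoc($rs)) {
    // reformat the DATETIME columns before encoding
    $row['startminutes'] = date('d/m/Y H:i:s', strtotime($row['startminutes']));
    $row['endminutes']   = date('d/m/Y H:i:s', strtotime($row['endminutes']));
    $schedule[] = $row;
}

$response = array(
    'success' => true,
    'error'   => '',
    'data'    => array('schedule' => $schedule),
);

$json = json_encode($response);
echo isset($_GET['callback']) ? "{$_GET['callback']}($json)" : $json;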
The answer to your question is yes, JSON content should be escaped when it is passed. That is why it shows as 01/02/\2013 00:00:00. What you have to do is decode the JSON data on the client side. See the two links below.
How to JSON decode array elements in JavaScript?
Parse JSON in JavaScript?
Also use jsonlint to validate your JSON data.
