Serve files from Apache to JavaScript

I am writing my first web application with JavaScript and WebGL. For now I am running the app on localhost from Apache. The app needs to work with data that is available instantly. Until now I have worked with AJAX calls that happen during runtime, which no longer works for my purposes. So instead of serving individual files from server to client on request, I want the application to load all files from the server to the client side at initialization time. (I want this to happen automatically at startup so I don't have to add every new file as a URL in the HTML index.)

I understand I should do this with server-side scripting, probably with PHP since I have an Apache localhost. I have different folders which hold my necessary resources in uniform data formats (.txt, .png and .json). So what I want to do is, before the JavaScript app starts, look through each folder and send one object per folder that holds filenames as keys bound to file data.

Is my intuition right that I need to do that with PHP? If yes, where do I start to tell my application what to do when (first serve the files with PHP, then start the JavaScript app)? How do I do this on localhost? Should I already think about extending my toolset (e.g. using Node.js on the server side, locally for now)? If so, what lightweight tools do you propose for this kind of work? I feel I am missing some design principles here.
EDIT:
Keep in mind that I don't want to specifically call a single file... I am already doing that. What I need is a script that automatically serves all the files in a certain folder on the server to the client side at the app's init time, before the actual application logic starts.

Your question is kind of broad, so I'll try my best. Why does AJAX not work for real-time data but loading all the files once does? If you're working with real-time data, why not look into WebSockets or, at the bare minimum, AJAX queries?
If you want to pass data from the server to the client, you will need to use an HTTP request no matter what. A GET or POST request is necessary for the client to request data from the server and receive it as a response.
You could theoretically pass the data from PHP straight to the view of the application (which is technically done through a GET request whenever a user requests something like a .php file from the server), but this isn't as flexible as giving JavaScript access to the data. You can do some hacks and 'transfer' the data from the view to JavaScript with some .value methods, but this isn't ideal and can be prone to security holes. It also means the data is only passed once.
So what would need to happen is that the data would be processed upon initialization and then immediately transferred to the client by means of JavaScript and HTTP requests.
In short, if you want JavaScript to have access to the data and use it in variables or manipulate it further, you need an HTTP request such as GET or POST issued by JavaScript. Otherwise, you need to pass the data to the view upon initialization (through PHP), but then you can't work with real-time data, because the data is only passed once per page refresh.
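For illustration, here is a minimal sketch of that "pass data straight to the view" approach (the file and variable names are placeholders, not a fixed convention): the server renders the data into an inline script at page load, after which it is only refreshed on the next page request.
<?php
// index.php -- render-time embedding (a sketch, not the only way)
$initialData = ['greeting' => 'hello']; // placeholder data
?>
<script>
// json_encode() produces a valid JavaScript literal, so the app
// can read this global at startup without any extra request.
var initialData = <?= json_encode($initialData) ?>;
</script>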
Example of scandir():
<?php
// scandir() returns filenames, not data from the files
$fileArray = scandir('datafolder/'); // relative path reference to the folder 'datafolder'
$finalArray = [];
foreach ($fileArray as $filename) {
    if ($filename === '.' || $filename === '..') {
        continue; // scandir() also returns the . and .. entries
    }
    // scandir() only retrieves the filenames, not the paths, so prepend
    // the folder so the script knows where to look
    $file = fopen('datafolder/' . $filename, 'r');
    $tempArray = fgetcsv($file, 1024); // temp array holding this iteration's file contents
    fclose($file);
    array_push($finalArray, $tempArray); // store the data for later use
}
The data can be used however you need, depending on what it is. Say you need to combine the data from multiple .csv files: you can read each file and append it to a single array. If you instead want to read multiple distinct files and preserve the independence of each file, you can create multiple arrays and pass back a single JSON-encoded object that contains each file's data as a separate attribute, such as:
{
  "dataOne": [0, 1, 2, 3, 4, 5, 6, ...],
  "dataTwo": ["new", "burger", "milkshake"],
  "dataThree": ["Mary", "Joe", "Pam", "Eric"]
}
Which can be created with a PHP associative array using one of the following methods:
// assuming $arrayOne is already assigned from reading a file and storing its contents
$data['dataOne'] = $arrayOne;
// or merge a new key into the existing array
$data += ['dataTwo' => $arrayTwo];
// or assign the key inline (note: array_push() appends at numeric
// indices, so it cannot set string keys like these)
$data['dataThree'] = ['Mary', 'Joe', 'Pam', 'Eric'];
Then $data, a single array containing all the different sets of data, can simply be passed back, with each set kept distinct.
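Tying this back to the original question, a minimal sketch of an endpoint that scans a folder and returns one JSON object keyed by filename might look like the following (the names loadfolder.php and datafolder/ are assumptions; binary files such as .png are Base64-encoded so they survive JSON transport):
<?php
// loadfolder.php -- hypothetical endpoint returning a whole folder as JSON
$dir = 'datafolder/';
$data = [];
foreach (scandir($dir) as $filename) {
    if ($filename === '.' || $filename === '..') {
        continue; // scandir() also returns the . and .. entries
    }
    $contents = file_get_contents($dir . $filename);
    // .png is binary and cannot be embedded in JSON directly;
    // .txt and .json contents can be passed through as-is
    if (pathinfo($filename, PATHINFO_EXTENSION) === 'png') {
        $contents = base64_encode($contents);
    }
    $data[$filename] = $contents; // filename as key, file data as value
}
header('Content-Type: application/json');
echo json_encode($data);
The client then makes one request per folder at initialization time and receives the whole filename-keyed object before the application logic starts.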

Related

Client vs. server image processing and display

We have a big system which runs on JSF (PrimeFaces), EJB3 and sometimes JavaScript logic (for things like Firebase).
We ran into this problem: we have a servlet to serve some images. The backend takes a query, extracts a BLOB image from the DB, turns that BLOB into an array of bytes, sends it to the browser session memory, and the servlet serves it at url-OurSite/image/idImage. The front end calls it with <img src="url/image/id"> and this works fine so far.
Now we are trying a new, direct way to show images: we send the BLOB/raw data to the frontend, convert it to Base64 there, and pass it to the HTML.
// org.apache.commons.codec.binary.Base64 (Apache Commons Codec)
Base64 codec = new Base64();
String encoded = codec.encodeBase64String(listEvidenciaDev.get(i).getImgReturns());
Both work, for almost all cases.
Note: we didn't try this before because we couldn't pass the raw data through our layers of serialized objects and RMI. Now we can, of course.
So now there are two ways.
Either we send the data to the servlet and put it at some URL, which means the backend does all the work and the frontend just calls the URL,
or we send the data to the frontend, which does some magic and transforms it into an image.
This brings up two questions:
Does the final user download the same amount of data whether we send the frontend the raw object or have it call a URL to show the image content? This is important because we have some remote branch offices with poor internet connections.
Is it worth passing the hard work to the frontend (converting the data) or to the backend (converting and publishing)?
EDIT:
My question is not about the BLOB (what I call raw data) being bigger than Base64.
It is: is passing the data as an object and transforming it into a viewable picture heavier on internet bandwidth than passing a URL from our servlet with the actual image and loading it in the HTML? (For reference, Base64 encodes every 3 bytes as 4 characters, so an uncompressed Base64 payload is roughly 33% larger than the raw bytes, though gzip on the response largely offsets this.)
I chose to close this question because we ran some tests and bandwidth usage on the front end was the same.
Anyway, we make use of both solutions.
If we don't want to burden the frontend with a lot of encoding, we set up a servlet for those images (which comes with more code and more server load). We look for the best optimization in each specific case.

How can you access the HTTP response from a server using client-side JavaScript?

I'm trying to do client-side processing of some data sent in a server's HTTP response.
Here's what I'm working with: I have a web application that sends commands to a backend simulation engine, which then sends back a bunch of data/results in the response body. I want to be able to access this response using JavaScript (note: not making a new response, but simply accessing the data already sent from the server).
Right now, I am able to do this via a kludgy hack of sorts:
var responseText = "{{response}}";
This is using Django's template system, where I have already pre-formatted the template context variable "response" to contain a pre-formatted string representation of a csv file (i.e., proper unicode separators, etc).
This results in a huge string being transmitted to the page. Right now, this supports my immediate goal of making this data available for download as a csv, but it doesn't really support more sophisticated tasks. Also, I'm not sure if it will scale well when my string is, say, 2 MB as opposed to less than 1 KB.
I'd like to have this response data stored more elegantly, perhaps as part of the DOM or maybe in a cache (?) [not familiar with this].
The ideal way to do this is to not load the CSV on document load, either as a JavaScript variable or as part of the DOM. Why load 2 MB of data for the user every time when their intention may not be to download the CSV every time?
I suggest creating a controller/action for downloading the CSV and fetching it on click of the download button.
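The transferable idea is a dedicated endpoint that returns the CSV with download headers. Since this question is Django, that would be a view returning an HttpResponse with the same headers; it is sketched in PHP here only to match the rest of this page (the endpoint name and helper function are hypothetical):
<?php
// download.php -- hypothetical endpoint streaming the results as a file
$csvString = fetch_simulation_results(); // placeholder for your data source
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="results.csv"');
echo $csvString;
With Content-Disposition: attachment, the browser prompts a file download instead of rendering the response, so the 2 MB string never has to live in the page.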

Access a cookie that was set on a javascript file, but not on the HTML

I have a script that needs some external information to work with. It fetches this using Ajax requests. So far so good.
However, the script needs some of its data right from the start. So I have been pondering a few options for supplying it with that initial data at page load time:
Simplest: Just have it perform an Ajax request for the data right away. Downside of this is extra latency and more requests than strictly needed.
Ugly: Add a small script fragment at HTML render time that provides the initial data
Bad caching properties: Create the whole JS file dynamically and add the data right then.
Impossible: Something with headers... but unfortunately it seems we can't access them (see e.g. this question). Doing the extra Ajax request is not useful here as in that case we might just as well use option #1.
Something with cookies...
Not tried yet: Create a dynamic 'initial-data.js' script whose sole purpose is to load the initial data. This would at least only send the data when needed, but it would require all users of my script to include two script files instead of one... and it still causes an extra request...
I am trying out the cookie option for transporting the initial data, but so far without success. What I am trying to do:
When the browser requests the .js file, have the server add a Set-Cookie header with the initial data in it in the response.
In the JS file, read out the cookie.
It doesn't work. It seems I need to set the cookie on the response for the .html instead of the .js for the browser to make it available to the script... That's too bad, as it would involve adding the Set-Cookie header to every page, even though it's only needed by this particular piece of JS.
I was actually very happy with the solution I thought I found because it let me send the initial data along with the request for the script only to those pages that actually use the script... Too bad!
Is there any way to do what I'm trying to do using cookies, headers or some similar mechanism?
Do you guys have any tips for this situation?
Background:
I am trying to write a semi-offline application. Semi-offline in that it should continue to work when offline (apart from some functions that simply need connectivity), but it is expected to have regular periods with connectivity. So I'm using local storage and syncing with the server when possible.
To be able to have the client generate new items while offline, I include an ID generator that gets handed ID blocks by the server and consumes them as it generates IDs. The data I was trying to send to the script in a cookie is the initial list of ID blocks plus some settings, and looks like this:
/suid/suid.json:3:3:dxb,dyb,dzb
(colon-separated fields: url, min, max, blocks)
Where:
url = path to JSON for subsequent Ajax requests
min = minimum amount of ID blocks to keep in local storage
max = maximum amount of ID blocks to keep in local storage
blocks = comma separated list of ID blocks
The ID blocks are encoded as sort-of Base32 strings. I'm using a custom formatting scheme because I want 53-bit IDs to be as short as possible in text format while still being easily human-readable, writable and URL-safe.

Alternative to passing Data to JavaScript from PHP?

I have a fairly large application and I'm currently trying to find a way around having to pass data from PHP (user tokens for third-party APIs and such) through the DOM. Currently I use data-* attributes on a single element and parse the data from that, but it's pretty messy.
I've considered just making the contents of the element encoded JSON with all the config in it, which would greatly improve the structure and effectiveness, but storing sensitive information in the DOM isn't ideal or secure whatsoever.
Getting the data via AJAX is also not feasible, as the application requires this information all the time, on any page, so running an AJAX request on every page load before allowing user input or control would be a pain for users and add load to my server.
Something I've considered is an initial request for the information, storing it in the cache/localStorage along with a checksum of the data, and including the checksum for the up-to-date data in the DOM. On every page load it would compare checksums, and if they differ (JavaScript has out-of-date data in cache/localStorage), it would send another request.
I'd rather not have to go down this route, and I'd like to know if there are any better methods you can think of. I can't find any alternative methods in other questions/Google, so any help is appreciated.
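For reference, the server side of the checksum handshake described above might be sketched like this (illustrative only; names are placeholders):
<?php
// In the page template: emit only a checksum of the current config,
// not the config itself, so the sensitive data stays out of the DOM
$checksum = md5(json_encode($config)); // $config is the placeholder payload
?>
<script>
// The client compares this against the checksum stored alongside its
// cached copy and re-requests the config only on a mismatch.
window.configChecksum = "<?= $checksum ?>";
</script>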
You could also create a PHP file that sends a JavaScript Content-Type header and request it as a normal JavaScript file: <script src="config.js.php"></script> (considering the filename is config.js.php). You can then structure your JavaScript code and simply assign the values dynamically.
For security, especially if login is required, this file should only return the values once the user is logged in; otherwise you simply return a blank file.
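A minimal sketch of what such a config.js.php might look like (the session keys and variable names are assumptions):
<?php
// config.js.php -- served with a JavaScript Content-Type so the browser
// treats it like any other script file
header('Content-Type: application/javascript');
session_start();
if (!isset($_SESSION['user_id'])) {
    exit; // not logged in: return a blank file, as suggested above
}
$config = [
    'apiToken' => $_SESSION['api_token'], // hypothetical session values
    'userName' => $_SESSION['user_name'],
];
echo 'var AppConfig = ' . json_encode($config) . ';';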
You could also just emit the JSON you need in your template and assign it to a JavaScript global.
This is especially easy if you are using a templating system that supports inheritance, like Twig. You could then do something like this in the base template for your application:
<script>
MyApp = {};
MyApp.cfg = {{cfg | tojson | safe}};
</script>
where cfg is a PHP dictionary in the templating context. Those filters aren't Twig-specific; they're just there to give you the idea.
It wouldn't be safe for sensitive information, but it is easier than storing the info in local storage.

Possible to cache JSON to increase performance / load time?

I'm using a JSON file to autopopulate a drop down list. It's by no means massive (3000 lines and growing) but the time taken to refresh the page is becoming very noticeable.
The first time the page is loaded, the JSON is read, and the option the user has selected dictates which part of the JSON is used to populate the drop-down.
It's then loaded again on every refresh or menu selection after that. Is it possible to somehow cache the values to prevent the need to reload it time and time again?
Thanks.
EDIT: More Info:
It's essentially a unit converter; the JSON holds all the details. When a user selects 'Temp', for example, a call is made and the lists are populated. Once a conversion is complete you can spend all day running temperature conversions and they'll be fine, but every time a user changes the conversion type, say to length, the page refreshes and takes a noticeable amount of time.
Unfortunately, I don't know of a standardized global caching mechanism in PHP. This article says that Zend Optimizer+, a third-party accelerator, is being included in core PHP (as OPcache) starting in version 5.5. Not sure what version you are using, but you could try that.
On a different note, have you considered file storage as andrew pointed out? I think it combined with $_SESSION could really help you in this case. Let me give you an example that would work with your existing JSON data:
Server Side
Store your JSON data in a .json file on your PHP server:
{
  "data": "some data",
  "data2": "more data",
  "data3": [
    ...
  ],
  ...
}
Note: Make sure to properly format your JSON data. Remember all strings must be enclosed in double quotes ".
In PHP, use an if statement to decide the appropriate action:
error_reporting(E_ALL);
ini_set("display_errors", "On");
session_start();

if (isset($_SESSION['dataCache'])) {
    echo json_encode($_SESSION['dataCache']);
} else {
    $file = 'data.json';
    if (!is_file($file) || !is_readable($file)) {
        die("File not accessible.");
    }
    $contents = file_get_contents($file);
    $_SESSION['dataCache'] = json_decode($contents, true);
    echo $contents;
}
Let's dig into the above code a little more. Here's what it does in a nutshell:
Turn on error reporting and start session support.
Check to see if we've already read the file for this user.
If so, pull the value from storage and echo it out and exit. If not continue below.
Save off the file name and do a little error checking to ensure PHP can find, open and read the contents of the file.
Read the file contents.
Save the decoded JSON, which is an associative array because of the `true` parameter passed to `json_decode`, into your `$_SESSION` variable.
Echo the contents to the screen.
This will save you the time and hassle of parsing JSON data and/or building it manually on the server. It will be cached for the user's session so that they can use it throughout.
Client Side
I assume you are using AJAX to fetch the information? Correct me if not, but I was assuming that's where some of your JavaScript comes into play. If so, you may consider this:
Store the returned data in localStorage on the user's browser when it's returned from the server:
$.ajax({
...
success: function (res) {
localStorage.setItem("dataCache", JSON.stringify(res));
},
...
});
Or if you use promise objects:
$.ajax({
...
}).done(function (res) {
localStorage.setItem("dataCache", JSON.stringify(res));
});
When you need to read it you can do a simple test:
var data;
// getItem() returns null if the item is not in local storage.
// Since null is falsy in JavaScript, it will be evaluated as false.
if(localStorage.getItem("dataCache")) {
data = JSON.parse(localStorage.getItem("dataCache"));
} else {
// Make ajax call, fetch object and store in localStorage in the success or done callbacks as described above
}
Notes:
localStorage is a new feature in HTML5, so it's not fully supported on all browsers yet. Most of the major ones do support it, however, even as far back as IE8. But there is no standardized limit on how much these browsers are required to hold per site; around 5 MB per origin is typical.
It's important to take that into consideration. You may well not be able to store the entire string in localStorage once it grows large enough, but you could use this as a start. Combined with the server-side solution, you should see a performance increase.
Hope this helps.
I use the browser's cache to ensure that my large chunk of JSON is only downloaded once per session. I program in ASP.NET, but I'm sure PHP has the same mechanisms:
On session start, I generate a random string as a session key for my dynamic JavaScripts. This key gets stored in the ASP.NET session state under the key JsonSessionID. That way I can refer to it in my page markup.
I have a "generic http handler" (an ashx file) that when called by the browser, returns a .js file containing my JSON.
In my HTML I include the dynamic script:
<script type="text/javascript" src="/dynamicJSON.ashx?v=<%= JsonSessionID %>"></script>
The browser will automatically cache any URLs included as scripts. The next time the browser is asked to load a cached script from a URL, it will just load up the file from the local disk. This includes dynamic pages like this.
By adding the ?v= in there, I ensure that the JSON is updated once per session.
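Since the answer notes PHP has the same mechanisms, a rough PHP equivalent of that per-session cache-buster might look like this (file names and session keys are placeholders):
<?php
// In the page template: generate one random key per session and use it
// as the ?v= parameter, so the script URL stays stable for the session
session_start();
if (!isset($_SESSION['jsonVersion'])) {
    $_SESSION['jsonVersion'] = bin2hex(random_bytes(8)); // random_bytes() needs PHP 7+
}
?>
<script src="/dynamicJSON.php?v=<?= $_SESSION['jsonVersion'] ?>"></script>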
Edit
I just realized that your JSON is probably static. If that's the case, you can just put your JSON into a static .js file that you include in your HTML, and the browser will cache it.
// conversionData.js
var conversionData = { "a":1,"b":2,"c":3 };
When you include the conversionData.js, the conversionData variable will be in scope with the rest of your page's JavaScript that dynamically updates the drop-downs.
Edit 2
If you are serving static files, this blog post has a good pattern for cache-busting based on the file's date modified property. i.e. the file is only downloaded when it is changed on the server.
I have yet to find a good method for cache-busting JSON created via database lookup tables, other than per-session. Which isn't ideal because the database could change mid-session.
Once you've got your JSON data decoded into an object, you can just keep the object around; it should persist until a page reload at least.
If you want to persist between reloads, you might want to look at HTML5's localStorage etc.
You would need to come up with an aging strategy; maybe store the current date alongside the data so you can compare it and expire entries as needed.
I would suggest storing your JSON data in a session. On first page load you can write a script to fetch your JSON data, then store it in a session.
On each page load/refresh afterwards you can check the session to decide what to do: use the session data or fetch your JSON data again.
This approach suits me for small-scale data (for example: an array of products, colors, sizes, prices).
Based on your data, you should test your loading times.
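As a rough sketch of that session approach in PHP (the file name and session key are placeholders):
<?php
// jsondata.php -- hand back the same JSON for the whole session
session_start();
if (!isset($_SESSION['jsonData'])) {
    // First load: read the file once and keep it for later requests
    $_SESSION['jsonData'] = file_get_contents('conversions.json');
}
header('Content-Type: application/json');
echo $_SESSION['jsonData'];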
Here is a simple hack: create a call to a PHP file as a GET request with a parameter like "bla-bla.html" or "bla-bla.css". It makes the browser think it is not PHP but rather HTML or CSS, and the browser will cache it.
To verify that the trick is working, go to the Network tab of the browser dev panel and check the "type" and "transferred" columns: instead of php and the actual size, you will find html and "(cached)".
This is also good to know when you pass parameters like "blah-blah.html" to a PHP file and expect that it will not be cached. Well, it will be cached.
Tested on Firefox Quantum 57.0.1 (Mac, 64-bit).
P.S.
Chrome 63 on Mac is capable of recognizing the real file type in this situation, so it cannot be fooled.
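A less fragile alternative than disguising the extension is to send explicit cache headers from the PHP script itself (a sketch; the one-hour max-age is arbitrary):
<?php
// Tell the browser it may reuse this response for up to one hour,
// regardless of how it sniffs the file type
header('Content-Type: application/json');
header('Cache-Control: max-age=3600');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 3600) . ' GMT');
echo json_encode($responseData); // $responseData assembled elsewhere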
Thinking outside the box here: if your list has 3,000 lines and growing (as you said), is it possible for you to establish its maximum size? Let's say the answer is 10,000 items at most; then do you really need an AJAX call? You could transfer the data straight away with the page. (Depending on your architecture, of course, you could come up with a different solution.)
