I'm creating a React app that will periodically fetch JSON objects from a URL and display their content.
(example of an url i'm using: http://demo0046512.mockable.io/stream/anycontent ).
To make the project more flexible, I want it to be able to display content while offline as well.
For that reason, when I first fetch the URLs in that JSON object, I want to store their content so I can access it later, since the URLs won't be of any use while I'm offline.
To store the data I'm using the localForage API. My idea was to create a JSON object just like the one I fetched, but with every URL replaced by its content (the text/image/video itself), and then store that with localForage so it can be read when offline, but I haven't found a way to do that so far.
For instance: {ex1 : "https://video.com"} would be stored as {ex2: videoItself}
Can this be done?
Code for anyone interested (Ctrl + f and type " /*!! " for the problem)
https://pastebin.com/AagzuGmx
You want to cache the results you get back from the URLs. Might those values ever change? If so, you probably want to keep the URLs as well as the results you fetch from them; something like: if {ex1 : "https://video.com"}, then store {"https://video.com": video}.
Edit:
The fact that you are caching, and that the data may change, matters here, but I understand your question now. You want to store things like videos using localForage. This can be done, but as the docs say, "All types are supported in every storage backend, though storage limits in localStorage make storing many large Blobs impossible." So your example of a video will only work for small videos. On mobile browsers, 5MB is the limit, and you might not even get that much because of how localStorage works: it only stores strings, so if strings are stored inefficiently (and they are in some versions of Android's browser), you might only get 2.5MB. By the way, this limit is per domain, not per file.
One more caveat: you have to encode/serialize these files before storing them. Will they just be videos? Also images? Media with metadata? Entire pages with media? Because the way you encode them might depend on that. If it's just one media type, you can encode it like the final setItem example from the docs. Then create a URI like the example does, which can be the src of an <img> or <video> tag.
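If it is just one media type, here is a rough sketch of that approach (this is an illustration, not the exact docs example; it assumes localForage is already loaded and uses a made-up "media:" key prefix): fetch the URL as a Blob, store the Blob, and later turn it back into an object URL for the src attribute.

function cacheUrl(url) {
    // Fetch the remote file and store its Blob under a key derived from the URL.
    return fetch(url)
        .then(function (response) { return response.blob(); })
        .then(function (blob) { return localforage.setItem('media:' + url, blob); });
}

function loadFromCache(url) {
    // Resolves to an object URL usable as an <img>/<video> src, or null if the URL was never cached.
    return localforage.getItem('media:' + url).then(function (blob) {
        return blob ? URL.createObjectURL(blob) : null;
    });
}

// Usage (hypothetical element):
// loadFromCache('https://video.com').then(function (src) {
//     if (src) { document.querySelector('video').src = src; }
// });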
After doing some searching, it seems that this type of issue affects many people, so an answer to this question could help many users of Yahoo's YQL platform.
I am essentially aiming to fetch a semi-static CSV document stored on a web server and then parse it in JavaScript. Semi-static means that the CSV document isn't getting appended to with additional entries; rather, existing entries are getting modified.
Using the YQL console https://developer.yahoo.com/yql/console/ gives me updated data with every call made to my semi-static CSV file stored on a remote server. I can modify the data and the YQL console will successfully return the updated data. But when I extract the REST query it provides and simply paste it into a browser window, the data returned corresponds to the very first query I made. When I embed the query as a $.getJSON request in JavaScript, like so:
$.getJSON("https://query.yahooapis.com/v1/public/yqlq=select%20*%20from%20csv%20where%20url%3D'mywebsite.csv'&format=json").done(function (data) {
I still get the outdated data. If I switch to a different web browser or device, the information is still outdated, which gives me the feeling that it is not a cache issue on the local machine.
I believe the problem is in one of two spots:
1) Perhaps Yahoo caches the queries and only acquires updated information from tables/files that grow dynamically
2) I am not using the YQL query correctly.
As an additional note, the exact same query structure works perfectly with Google Forms (which can export as a CSV) and also worked without a hitch on a dynamically growing CSV document that I used with a now-antiquated database, before a quick switch to the simple semi-static document.
Any thoughts or fixes that can work on my semi-static CSV document?
This sounds like a browser cache issue.
Try adding a timestamp to the URL so that every request has a unique URL and therefore can't be served from cache:
var params = {
    q: "select * from csv where url='mywebsite.csv'", // jQuery URL-encodes this for us
    format: "json",
    _v: Date.now() // cache-busting timestamp
};
$.getJSON("https://query.yahooapis.com/v1/public/yql", params).done(function (data) { /* ... */ });
I'm working with HTML canvas and creating a data URI to preview an image that the user can build himself. When he hits a preview button, the image gets displayed:
$("#show_label").html('<img src='+canvas.toDataURLWithMultiplier("png",1,1)+' width="157" height="202">');
Works fine. Now I want to save the data URL in a cookie, as I want to restore the image at a later stage on the website or store it in my database.
$.cookie("current_label",canvas.toDataURL());
That won't work.
I understand this isn't a no-brainer, although it would make sense to be able to store objects in a cookie as well. I guess I would have to encode the data URI and decode it later, but how? Researching didn't give me any results.
Also, could it be that the cookies would become too large, making this not even worth considering?
Should I store the dataURL in a SQL database instead and just use a cookie to identify the database entry (e.g. via ID)?
Any help appreciated.
Try using Web Storage objects.
Do you have to use cookies? Might I suggest using localStorage?
Example:
localStorage['aVariable'] = 'a Value';
//A week later this will still hold value:
alert(localStorage['aVariable'])
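Applied to the question, a minimal sketch (assuming the same canvas and #show_label elements from the question) could look like this:

// Save the preview in localStorage instead of a cookie.
localStorage.setItem('current_label', canvas.toDataURL());

// Later, even after a reload, restore it:
var saved = localStorage.getItem('current_label');
if (saved) {
    $("#show_label").html('<img src="' + saved + '" width="157" height="202">');
}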
I'm working on an add-item page for a basic webshop; the shop owner can add item images via drag/drop or by browsing directly. When images are selected, I'm storing the base64 in an array. I'm now not too sure how best to deal with sending/storing these item images for proper use. After giving Google a bit of love, I'm thinking the image data could be sent as base64 and saved back to an image via something like file_put_contents('/item-images/randomNumber.jpg', base64_decode($base64)); then adding the item's image paths to its database record for later retrieval. Below is an untested example of how I currently imagine sending the image data; is something like this right?
$("#addItem").click(function() {
var imgData = "";
$.each(previewImagesArray, function(index, value) {
imgData += previewImagesArray[index].value;
});
$.post
(
"/pages/add-item.php",
"name="+$("#add-item-name").val()+
"&price="+$("#add-item-price").val()+
"&desc="+$("#add-item-desc").val()+
"&category="+$("#add-item-category :selected").text()+
"&images="+imgData
);
return false;
});
Really appreciate any help; I'm relatively new to web development.
I essentially do the same thing you are doing: get the base64 from the browser, post it back, and store it. A few comments...
First, an HTTP POST has no mandatory size limitation, but in practice your backend will limit the size of posted data (e.g. a 2M post_max_size in PHP). Since you are sending base64, you are reducing the effective payload you can send: every 3 bytes of image become 4 bytes of base64, so you get less image transferred per byte on the network. Either send multiple posts or increase your post size limit to fit your needs.
Second, as #popnoodles mentioned, using a random number will likely not be sufficient in the long term. Use either a database primary key or the tempnam family of functions to generate a unique identifier. I disagree with #popnoodles' implementation, however: it's quite possible for two different people to upload the same file. For example, my circa-2013 Winter Bash avatar on SO was taken from an online library; someone else could use that same icon. We would collide, so an MD5 of the contents is not sufficient in general, though in your use case it could be.
Finally, you probably will want to base64-decode on the server, but give some thought to whether you need to. You can use a data: URL and inline the base64 image data directly. This has the same network issue as before: more transfer is required to send it. But a data URL works very well for lots of very small images (e.g. avatars) or pages that will be cached for a very long time (especially if your users have reasonable data connections). Summary: consider the use case before presuming you need to base64-decode.
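For illustration, skipping the decode and inlining the base64 on the client is as simple as this sketch (base64String here is a placeholder for one base64-encoded JPEG, e.g. one entry of your previewImagesArray):

// Build an <img> straight from the base64 payload via a data URL.
var img = new Image();
img.src = 'data:image/jpeg;base64,' + base64String;
document.body.appendChild(img);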
I'm currently working on developing a web-based music player. The issue I'm having is pulling a list of all the songs from the database and sending it to the client. The client has the ability to dynamically create playlists, and therefore must have access to a list of the entire library. This library can range upwards of 20,000 unique songs. I'm preparing the data on the server side using Django and this tentative schema:
{
    id: "1",
    cover: "http://example.com/AlbumArt.jpg",
    name: "Track Name",
    time: "3:15",
    album: "Album Name",
    disc: (1, 2),
    year: "1969",
    mp3: "http://example.com/Mp3Stream.mp3"
},
{
    id: "2",
    ...
}
What is the best method of DYNAMICALLY sending this information to the client? Should I be using JSON? Can JSON effectively send this text file consisting of 20,000 entries? Is it possible to cache this playlist on the client side so that this huge request doesn't have to happen every time the user logs in, but only when there has been a change in the database?
Basically, what I need at this point is a dependable method of transmitting a text-based playlist consisting of around 20,000 objects, each with their own attributes (name, size, etc...), in a timely manner. Sort of like Google Music: when you log in, you are presented with all the songs in your library. How are they sending this list?
Another minor question that comes to mind is, can the browser (mainly Chrome) handle this amount of data without sacrificing usability?
Thank you so much for all your help!
I just took a look at the network traffic for Google Play, and they transmit the initial library screen (around 50 tracks) via JSON, with the bare minimum of metadata (name, track ID, and album art ID). When you load the main library page, it makes a request to an extremely basic HTML page that appears to insert items from an inline JS object (Gist sample). The total file was around 6MB, but it was cached and nothing needed to be transferred.
I would suggest doing a paginated JSON request to pull down the data, and using ETags and caching to ensure it isn't retransmitted unless it absolutely needs to be. And instead of a normal pagination of ?page=5&count=1000, try ?from=1&to=1000, so that deleting item 995 only invalidates the cached ?from=1&to=1000 response, not ?from=1001&to=2000 (whereas with ?page=2&count=1000 every later page would shift and be invalidated too).
Google Play Music does not appear to use Local Storage, IndexedDB, or Web SQL, and loads everything from the cached file and parses it into a JS object.
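As a rough illustration of the range-based pagination idea above (the /library endpoint, page size, and total count are all hypothetical), the client could pull the library down in chunks and let the browser's HTTP cache and ETags handle revalidation:

function fetchRange(from, to) {
    // The browser revalidates cached responses with ETags automatically for plain GETs.
    return $.getJSON('/library', { from: from, to: to });
}

function fetchLibrary(total, pageSize, done) {
    var tracks = [];
    var from = 1;
    (function next() {
        if (from > total) { return done(tracks); }
        fetchRange(from, Math.min(from + pageSize - 1, total)).done(function (chunk) {
            tracks = tracks.concat(chunk);
            from += pageSize;
            next();
        });
    })();
}

// fetchLibrary(20000, 1000, function (tracks) { console.log(tracks.length); });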
Have you seen this http://code.flickr.net/2009/03/18/building-fast-client-side-searches/ ?
I've been using this array system myself lately (for 35K objects) and it is fast (assuming you don't want to render them all on screen).
Basically the server builds a long string in the form
1|2|3$cat|dog|horse$red|blue|green
which is sent as a single string in the HTTP response. Take the responseText field and convert it to arrays using:
var arr = request.responseText.split('$');
var ids = arr[0].split('|');
var names = arr[1].split('|');
Clearly, you end up with arrays of strings at the end, not objects, but arrays are fast for many operations. I've used $ and | as delimiters in this example, but for live use I use something more obscure. My 35K objects are completely handled in less than 0.5 s (iPad client).
You can save the strings to localStorage, but watch the 5MB limit, or use a shim such as Lawnchair. (NB: I also like SpenserJ's answer, which may be easier to implement depending on your environment.)
This method doesn't easily work for all JSON data types; they need to be quite flat. I've also found these big arrays to behave well performance-wise, even on smartphones, iPod Touch, etc. (see jsperf.com for several tests around String.split and array searching).
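For what it's worth, here is a quick sketch of consuming those parallel arrays, using the example payload above:

// Look up the record with id "2" across the parallel arrays.
var i = ids.indexOf('2');
if (i !== -1) {
    console.log(names[i]); // "dog" in the example above
}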
You could implement a file-like object that wraps the JSON file and spits out proper chunks.
For instance, since you know that your JSON file is a single array of music objects, you could create a generator that wraps the JSON file and returns chunks of the array.
You would have to do some string content parsing to get the chunking of the JSON file right.
I don't know what generates your JSON content. If possible, I would consider generating a number of manageable files instead of one huge file.
I would test the performance of sending the complete JSON in a single request. Chances are that the slowest part will be rendering the UI, not the response time of the JSON request. I recommend storing the JSON in a JavaScript object on the page and only rendering the UI dynamically as needed based on scrolling. The JavaScript object can serve as a data source for the client-side scrolling. Should the JSON be too large, you may want to consider server-backed scrolling.
This solution is also browser agnostic (it doesn't require HTML5).
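A very rough sketch of that render-as-you-scroll idea, ignoring the spacer/offset bookkeeping a real virtual scroller needs (the element IDs, row height, and window.playlist variable are all hypothetical):

var ROW_HEIGHT = 30;   // pixels per track row (hypothetical)
var VISIBLE_ROWS = 40; // how many rows to render at a time

function renderWindow(tracks, scrollTop) {
    // Render only the slice of the playlist that is currently scrolled into view.
    var start = Math.floor(scrollTop / ROW_HEIGHT);
    var rows = tracks.slice(start, start + VISIBLE_ROWS).map(function (t) {
        return '<div class="track">' + t.name + ' - ' + t.album + '</div>';
    });
    document.getElementById('track-list').innerHTML = rows.join('');
}

document.getElementById('track-viewport').addEventListener('scroll', function (e) {
    renderWindow(window.playlist, e.target.scrollTop);
});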
I'm using a JSON file to auto-populate a drop-down list. It's by no means massive (3,000 lines and growing), but the time taken to refresh the page is becoming very noticeable.
The first time the page is loaded, the JSON is read, and the option the user has selected dictates which part of the JSON is used to populate the drop-down.
It's then loaded again on every refresh or menu selection after that. Is it possible to somehow cache the values so it doesn't need to be reloaded time and time again?
Thanks.
EDIT: More Info:
It's essentially a unit converter; the JSON holds all the details. When a user selects 'Temp', for example, a call is made and the lists are populated. Once a conversion is complete you can spend all day running temperature conversions and they'll be fine, but every time a user changes the conversion type (to length, say), the page refreshes and takes a noticeable amount of time.
Unfortunately, I don't know of a standardized global caching mechanism in PHP. This article says that Optimizer Plus, a third-party accelerator, is being included in core PHP starting in version 5.5. Not sure what version you are using, but you could try that.
On a different note, have you considered file storage, as andrew pointed out? I think that, combined with $_SESSION, could really help you in this case. Let me give you an example that would work with your existing JSON data:
Server Side
Store your JSON data in a .json file on your PHP server:
{
    "data": "some data",
    "data2": "more data",
    "data3": [
        ...
    ],
    etc.
}
Note: Make sure to properly format your JSON data. Remember all strings must be enclosed in double quotes ".
In PHP, use an if statement to decide the appropriate action:
error_reporting(E_ALL);
ini_set("display_errors", "On");

session_start();

if (isset($_SESSION['dataCache'])) {
    echo json_encode($_SESSION['dataCache']);
} else {
    $file = 'data.json';
    if (!is_file($file) || !is_readable($file)) {
        die("File not accessible.");
    }
    $contents = file_get_contents($file);
    $_SESSION['dataCache'] = json_decode($contents, true);
    echo $contents;
}
Let's dig into the above code a little more. Here's what we are doing in a nutshell:
Turn on error reporting and start session support.
Check to see if we've already read the file for this user.
If so, pull the value from the session, echo it out, and exit. If not, continue below.
Save off the file name and do a little error checking to ensure PHP can find, open and read the contents of the file.
Read the file contents.
Save the decoded JSON, which is an associative array because of the `true` parameter passed to `json_decode`, into your `$_SESSION` variable.
Echo the contents to the screen.
This will save you the time and hassle of parsing JSON data and/or building it manually on the server. It will be cached for the user's session so that it can be used throughout.
Client Side
I assume you are using Ajax to fetch the information? If not, correct me, but I was assuming that's where some of your JavaScript comes into play. If so, you may consider this:
Store the returned data in localStorage in the user's browser when it comes back from the server:
$.ajax({
    ...
    success: function (res) {
        localStorage.setItem("dataCache", JSON.stringify(res));
    },
    ...
});
Or if you use promise objects:
$.ajax({
    ...
}).done(function (res) {
    localStorage.setItem("dataCache", JSON.stringify(res));
});
When you need to read it you can do a simple test:
var data;
// getItem returns null if the item is not in localStorage.
// null is falsy, so when nothing is cached the check fails and we fall into the else branch.
if (localStorage.getItem("dataCache")) {
    data = JSON.parse(localStorage.getItem("dataCache"));
} else {
    // Make the Ajax call, fetch the object, and store it in localStorage
    // in the success or done callback as described above.
}
Notes:
localStorage is a newer HTML5 feature, so it's not fully supported in all browsers yet. Most of the major ones support it, however, even as far back as IE8 (I think). However, there is no standardized size limit on how much these browsers are required to hold per site.
It's important to take that into consideration. You probably will not be able to store an entire 30,000-line string in localStorage. However, you could use this as a start; combined with the server-side solution, you should see a performance increase.
Hope this helps.
I use the browser's cache to ensure that my large chunk of JSON is only downloaded once per session. I program in ASP.NET, but I'm sure PHP has the same mechanisms:
On session start, I generate a random string as a session key for my dynamic JavaScript. This key gets stored in the ASP.NET session state under the key JsonSessionID. That way I can refer to it in my page markup.
I have a "generic http handler" (an ashx file) that when called by the browser, returns a .js file containing my JSON.
In my HTML I include the dynamic script:
<script type="text/javascript" src="/dynamicJSON.ashx?v=<%= JsonSessionID %>"></script>
The browser will automatically cache any URLs included as scripts. The next time the browser is asked to load a cached script from a URL, it will just load up the file from the local disk. This includes dynamic pages like this.
By adding the ?v= in there, I ensure that the JSON is updated once per session.
Edit
I just realized that your JSON is probably static. If that's the case, you can just put your JSON into a static .js file that you include in your HTML, and the browser will cache it.
// conversionData.js
var conversionData = { "a":1,"b":2,"c":3 };
When you include the conversionData.js, the conversionData variable will be in scope with the rest of your page's JavaScript that dynamically updates the drop-downs.
Edit 2
If you are serving static files, this blog post has a good pattern for cache-busting based on the file's date modified property. i.e. the file is only downloaded when it is changed on the server.
I have yet to find a good method for cache-busting JSON created via database lookup tables, other than per-session. Which isn't ideal because the database could change mid-session.
Once you've got your JSON data decoded into an object, you can just keep the object around; it will persist at least until a page reload.
If you want to persist between reloads, you might want to look at HTML5's localStorage, etc.
You would need to come up with an expiry strategy: maybe just store the current date alongside the data so you can compare it later and expire as needed.
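A small sketch of that expiry idea (the one-day TTL and the "dataCache" key are arbitrary choices):

var TTL = 24 * 60 * 60 * 1000; // one day, in milliseconds

function saveCache(obj) {
    // Store the data together with the time it was saved.
    localStorage.setItem('dataCache', JSON.stringify({ savedAt: Date.now(), data: obj }));
}

function loadCache() {
    var raw = localStorage.getItem('dataCache');
    if (!raw) { return null; }
    var entry = JSON.parse(raw);
    if (Date.now() - entry.savedAt > TTL) {
        localStorage.removeItem('dataCache'); // stale: drop it and force a re-fetch
        return null;
    }
    return entry.data;
}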
I would suggest storing your JSON data in a session. On first page load, you can write a script to get your JSON data and then store it in a session.
On each page load/refresh afterwards, you can check your session to decide what to do: use the session data or fetch your JSON data again.
This approach suits me for small-scale data (for example: an array of products, colors, sizes, prices).
Based on your data, you should test your loading times.
Here is a simple hack:
Make a GET request to the PHP file with a parameter like "bla-bla.html"
or "bla-bla.css"... well, you know, it makes the browser think it is not PHP but rather "html" or "css", and the browser will cache it.
To verify that the trick is working, go to the "Network" tab of the browser dev panel; you will see a "Type" column there along with "Transferred". Instead of showing php and the actual size, it will show "html" and "(cached)".
This is also good to know when you pass parameters like "blah-blah.html" to a PHP file and expect the result not to be cached. Well, it will be cached.
Tested on FireFox Quantum 57.0.1 (Mac 64bit)
P.S.
Chrome 63 on Mac is capable of recognising the real file type in this situation, so it cannot be fooled.
Thinking out of the box here: you said your list has 3,000 lines and growing, but is it possible for you to establish its maximum size? Let's say the answer is 10,000 items (max); do you really need an Ajax call then? You could transfer the data straight away with the page (depending on your architecture, of course, you could come up with a different solution).