Homepage Ajax/Google Maps - Server overhead - javascript

We're adding a Google Map plugin to our homepage, which dynamically updates a single marker whenever there is a new product search on our site (which we read from our database). So: "has there been a new search via our site? If yes, reposition the marker based on the new search's coords".
Currently, every "n" seconds (we haven't settled on a value yet) an Ajax call is made (using setInterval) to determine whether there has been a new search, and if there has, it returns a small JSON response. The script run via the Ajax call is a PHP script, which queries the database for the last row in our searches table (order by desc limit 1).
So, my question is (not being a sysadmin): could this setup put an undesirable strain on our server? Should I incorporate a timeout, or something that turns off the Ajax call after 100 goes, or after 15 minutes (I mean, who sits for 15 minutes watching markers dynamically appear on a Google map?!).
Our homepage only receives roughly 200 visits a day.

Given the statistics you've provided (your website gets 200 visits per day, and your server returns a small JSON response that you extract and display in the UI), a setup like this is normal practice. You could even ping the server via AJAX every 5 seconds to get more up-to-date data; it won't cause any performance issues at this level of traffic.
Just make sure you don't have geographically separated servers; otherwise you'll need some other synchronization mechanism to track users' locations based on their searches.
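To address the second part of the question, a minimal sketch of the polling loop with a built-in cut-off might look like this; the latest-search.php endpoint, the JSON fields, and repositionMarker() are placeholders for your own code:

// Minimal polling sketch: check for a new search every 5 seconds,
// stop automatically after 15 minutes so idle tabs don't keep hitting the server.
// 'latest-search.php', the JSON fields and repositionMarker() are placeholders.
var pollInterval = 5000;                          // 5 seconds between checks
var maxPolls = (15 * 60 * 1000) / pollInterval;   // stop after roughly 15 minutes
var pollCount = 0;
var lastSearchId = null;

var timer = setInterval(function () {
    if (++pollCount > maxPolls) {
        clearInterval(timer);                     // stop polling for idle visitors
        return;
    }
    $.getJSON('latest-search.php', function (data) {
        // Only move the marker when the latest row id actually changed.
        if (data && data.id !== lastSearchId) {
            lastSearchId = data.id;
            repositionMarker(data.lat, data.lng); // your Google Maps marker update
        }
    });
}, pollInterval);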

How can I avoid getting the same posts twice

I'm writing the code for a web application using PHP, MySQL, and JavaScript.
It's a very simple social network where users can create posts and see, like, and comment on other users' posts. On the main page it loads posts and orders them by a score based on the number of likes, the number of comments, and when the post was created.
Since I can't load every post at once (because ideally there could be millions of them), I load the top N posts and then, when I scroll down, it loads more posts (with an Ajax request) and adds them to the bottom of the page. The problem is that since the posts are ordered dynamically, if I just limit the number of posts and then offset them in later requests, I sometimes get the same post twice, and some posts never get shown. How can I solve this?
Right now it just checks with JavaScript the id of every new post and discards the ones that are already on the page (checking the ids of the posts on the page), but I don't like it, because every time it loads more posts it has to check whether every single post is already on the page, and as the number of posts grows it will get very slow.
If you have a lot of processing power on the server and you're able to do the scoring for any point in the past (e.g. taking into account only likes prior to that time), you can make your page send: (1) the timestamp of the first load, and (2) the number of posts already there (or, better, the score of the worst post it received, because posts could have been deleted since then).
Let the server compute the list as it was at that moment and send the next posts.
If you have more memory on the server than processing power, you could instead save a copy of the scored list for some time (1h, 24h…) whenever someone requests it.
And if you want the server to send the new best posts of the moment: if you know the present listing and you know what the user already got (via their timestamp and the last score they received), you can make the server remove what they already have and give them the rest.
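As a rough sketch of the client side of that idea (the /posts/more endpoint, the parameter names, and the response fields are all assumptions), the page remembers the timestamp of its first load and the score of the worst post it has received, and sends both back with every scroll request:

// Sketch: keyset-style loading so the server can exclude posts already sent.
// '/posts/more', the parameter names and the response fields are hypothetical.
var firstLoadedAt = Math.floor(Date.now() / 1000); // timestamp of the first page load
var lastScore = null;                              // score of the worst post received so far

function loadMorePosts() {
    $.getJSON('/posts/more', {
        loaded_at: firstLoadedAt,   // server scores posts as of this moment
        after_score: lastScore,     // server returns only posts scoring below this
        limit: 20
    }, function (posts) {
        posts.forEach(function (post) {
            appendPostToPage(post);                    // your existing rendering code
        });
        if (posts.length > 0) {
            lastScore = posts[posts.length - 1].score; // remember the new cut-off
        }
    });
}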

1 video view = 1 Parse request? How to improve that?

I'm working on an app a little like Vine, where several looped videos are displayed on the user's screen. I need to count one view per loop. That means if the user repeats the video 5 times, it will count 5 views. And this is the model I want to use for every video in my app.
I use Parse for my back-end and a webview to show the videos. This means that I use JavaScript to send requests to Parse, with Ajax calls.
My problem is that I don't really know how to limit the number of requests sent to Parse when I add a view on a video.
Maybe I should save the video views to a MySQL database and then, once a day with a cron task, save the MySQL results to Parse? I don't really know how to proceed, but I really need to limit the number of requests to Parse.
How would you design this?
Thanks!
My first thought is to not optimize too early. There should be plenty of time, as you accrue zillions of users, to improve the design.
If you want to improve it early (and still use Parse), keep the object that tracks views "pinned" locally (see this blog entry). Update the view count as often as needed, then update Parse on an NSTimer.
The app may become inactive at any time, and if unsaved views have been counted since the last time the timer fired, then there's one more problem to solve. The app delegate is told via applicationDidEnterBackground, and can request a moment to finish "one last thing". See here under "Executing Finite Length Tasks".
There (in the dispatch block suggested by the sample code), save the object that counts views (saveInBackgroundWithBlock:), invalidate the timer, and tell iOS you're done with [application endBackgroundTask:bgTask];
What I would do is store the videos somewhere else and save 1 view per click.
You can save this click in the background using something like this:
userClick.saveInBackground()
It saves the click in a background process, so the user doesn't have to wait for the sync with Parse.
note: You should use Bolts (https://github.com/BoltsFramework/Bolts-iOS) to get saveInBackground() working.
* edit *
Maybe it's smart to sync with Parse every x clicks, say 5 or 10, to limit the number of requests.
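Since the question mentions a webview and JavaScript Ajax calls, here is a minimal sketch of that batching idea on the client side; sendViewsToParse() is a placeholder for whatever call you already make to Parse (for example, incrementing a views field on the video object):

// Sketch: batch view counts locally and flush to Parse every BATCH_SIZE loops.
// sendViewsToParse() is a placeholder for your existing Parse call.
var BATCH_SIZE = 10;
var pendingViews = {};   // videoId -> views counted but not yet sent

function countView(videoId) {
    pendingViews[videoId] = (pendingViews[videoId] || 0) + 1;
    if (pendingViews[videoId] >= BATCH_SIZE) {
        flushViews(videoId);
    }
}

function flushViews(videoId) {
    var count = pendingViews[videoId];
    if (!count) { return; }
    pendingViews[videoId] = 0;
    sendViewsToParse(videoId, count); // one request covers many views
}

// Also flush whatever is left when the page is about to go away.
window.addEventListener('pagehide', function () {
    Object.keys(pendingViews).forEach(flushViews);
});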

How can I chunk results (lazy load?) without rewriting my whole application (Laravel + jQuery, SPA style)

I've developed a web application with the concept of a Single Page Application, but using none of the modern techs and frameworks.
So I have a jQuery page that dynamically requests data from localhost, a Laravel instance that compiles the entries in the DB (within a given time interval).
If the client wants to see all the entries for last week, the app works fine. But if he wants to see the results for the whole of last month... well, there are so many that the default PHP execution time (30 seconds) isn't enough to process all the data. I can easily override this, of course, but then the jQuery client will loop through these arrays of objects and do stuff with them (sort, find, sum...), so I'm not even sure jQuery can handle this much data.
So my question can be broken into two:
Can Laravel's ->paginate() be used so the jQuery Ajax requests can also chunk the data? How does this work (hopefully in a manner that doesn't force me to rewrite all the code)?
How could I store large amounts of information on the client? It's only temporary, but the users will hang around for a considerable amount of time on my webpage, and I don't want them to wait 5 minutes every time they press a button.
Thanks.
If you want to provide an interface to a large amount of data stored in a backend, you should paginate the data. This is a standard approach, so I'm sure your client will be ok with that.
Using pagination is pretty simple - see the docs for Laravel 5.0 here: http://laravel.com/docs/5.0/pagination
In order to paginate results in the backend, you need to call paginate($perPage) on your query instead of get() in your controller, like this:
$users = User::whereIsActive(true)->paginate(15);
This will return a paginated result with 15 records per page. The page number will be taken from the page parameter of the request. In order to get the 3rd page of users, you'll need your frontend jQuery app to send a request to a URL like:
/users?page=3
I don't recommend caching data in the frontend application. The data can be changed by some other user and you won't even know about it. And with pagination, your requests should be lightweight enough that you can stop worrying about the request sent to fetch each page of results.
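On the jQuery side, each request only needs to pass the page number. A rough sketch, assuming the controller returns the paginator as JSON (an object with a data array plus current_page and last_page fields) at the /users route:

// Sketch: fetch one page at a time from the paginated endpoint.
// The '/users' route and renderUser() are assumptions about your app.
var currentPage = 0;
var lastPage = null;

function loadNextPage() {
    if (lastPage !== null && currentPage >= lastPage) { return; } // nothing left
    $.getJSON('/users', { page: currentPage + 1 }, function (result) {
        currentPage = result.current_page;
        lastPage = result.last_page;
        result.data.forEach(function (user) {
            renderUser(user);   // append to the table/list in the page
        });
    });
}

loadNextPage(); // call again from a "Load more" button or a scroll handler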
Not sure if you're subscribed to Laracasts, but Jeffrey Way is amazing at explaining features of Laravel and I highly recommend his videos.
In short, you can paginate the results, then on the view, when you call the foreach on your items, you can array_chunk() the results to display them how you need to. But the paginated results are going to be fetched using a query string in the URL, and I'm not sure that is what you want if you're already using a lot of jQuery to keep everything on the same page.
https://laracasts.com/lessons/crazy-simple-pagination
But assuming you're already paginating the results with whatever jQuery you've already written for the JSON data...
You could also use query scopes to fetch the data for the time ranges you need, creating a simple API to use with Ajax. I think that's probably what you're looking for.
So here's what I would do, assuming you're already doing some pagination manually with your JavaScript:
Create a few query scopes to filter the data for different lengths of time
Create simple routes to fetch results from a URI using the query scopes
Fetch the JSON data by performing Ajax requests to the URIs you created (see the sketch after the link below)
More information on Query Scopes: http://laravel.com/docs/5.1/eloquent#query-scopes
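The Ajax side of that approach could look roughly like this; the /api/entries route and the range parameter are hypothetical stand-ins for whatever routes you wire up to your query scopes, and the response is assumed to be paginated JSON with a data array:

// Sketch: request a time-scoped, paginated slice of entries.
// '/api/entries' and the 'range' values map to routes/scopes you define server-side.
function loadEntries(range, page) {
    return $.getJSON('/api/entries', { range: range, page: page || 1 });
}

// Usage: fetch the first page of last month's entries, then render it.
loadEntries('last-month').done(function (result) {
    result.data.forEach(renderEntry);   // renderEntry() is your existing display code
});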

Best way to handle graphing and display of large data sets

Our website provides various data services to our clients, one of which is gauge data. Some gauges log information every 15 minutes, some every minute. This data is sent to our SQL database.
All of this data is displayed via a graph (generated server side via PHP and JPGraph) with each individual log entry being displayed as a row in a collapsible table (jQuery 1.10.2).
When a client wants to view the data, they select a date range and which gauges they would like to view. If they want to view the last 3 days of a gauge that logs every minute, then it loads pretty quickly. If they want to view 2 of those, then it takes around 15-30 seconds to load. The real problem comes when they want to view a month's worth of data, especially for more than 1 gauge. This can take upwards of 15-20 minutes to load, and the browser repeatedly asks if we want to stop the script from populating the collapsible table rows (jQuery).
Obviously this is a problem since clients want a relatively fast response (1-5 min max). Ideally, we would also like to be able to pull gauge data from several months at a time. The only way we can do that now is to pull data 2 weeks at a time and compile the total manually.
For reference: if I wanted to pull a month's worth of data for 2 of our once-a-minute-logging gauges, there would be 86,400 rows added via jQuery to a collapsible table. The page takes approx. 5 minutes to load and the browser is terribly slow during this time period.
My question is: What is the best way to pull/graph/populate data using a PHP-based server (Symfony 1.4 framework) and JavaScript?
Should we look into upgrading our allotted processing power/RAM (we are hosted by GoDaddy)? Is there a faster way to populate collapsibles than with jQuery? All of our calculations are done server side. Should we just pull the raw data and let the client side do the data processing? Should we split the data processing between client and server?
Here's a screenshot of the web page. It's cropped so that client-sensitive information is not displayed:
In response to my comment.
Since you only need the entire data set on the server side (you create your graph on the server), you don't actually need to send the entire data set to the client.
Instead, send a small portion to the client, say the first 200 results. Then you can go ahead and cache the rest of the result set into a JSON file (or a lite database, whatever you want really). Then create an interface where the user can request more data. Infinite scroll is nice but has its own problems; maybe just a button that says "load more data". As others have said, anything more than a few hundred data points in a table at one time is overkill, because people won't look at them anyway. Then, when they hit the button to get more data, you send an AJAX request to the server with the correct parameters for what data you want.
For example the first time they click getMoreData() you want to get the next 200 data points. So you send getMoreData(start=200, length=200). Your server picks up the AJAX request and finds the correct data in the JSON file or the lite database, wherever you have cached the results. And the user can keep requesting more data (making sure you update your start parameter), and you only ever return a small subset. The user doesn't even realize that they don't have the whole data-set there in front of them because it looks like they do.
One thing that is complicated about this is sorting and searching. If you want to implement those, then you need to make sure the sorting/searching happens on the server side, against the cached results.
So basically you have a system where you can create the entire graph on the server side, which shouldn't take long. What does take long is loading the entire data set on the client side, so you break that up into small chunks. You can even easily create pagination and the like with this method.
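A minimal sketch of that getMoreData() idea on the client side might look like this; the /gauge-data endpoint and the appendRows() helper are assumptions standing in for your own endpoint and table code:

// Sketch: pull the cached result set down in 200-row slices on demand.
// '/gauge-data' and appendRows() are placeholders for your own endpoint and table code.
var CHUNK = 200;
var nextStart = 200;   // the first 200 rows were rendered with the initial page

function getMoreData() {
    $.getJSON('/gauge-data', { start: nextStart, length: CHUNK }, function (rows) {
        appendRows(rows);              // add the rows to the collapsible table
        nextStart += rows.length;      // advance the cursor
        if (rows.length < CHUNK) {
            $('#load-more').hide();    // no more cached data to fetch
        }
    });
}

$('#load-more').on('click', getMoreData);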

Way to asynchronously load SharePoint profile images

I'm working on a SharePoint farm that has User Profiles enabled. We're creating a community feature which has a profile wall of all members of that community. I need to retrieve profile pictures from a search-based source and display the results, as they are returned, in an appealing, efficient way.
We have two avenues:
1: FAST search indexes the profiles of every user every 6 hours. We can run a membership query and return all members of [x] community.
2: We can use the profile API to do a search. This is slower but does not rely on the 6 hour index and therefore gives us up to date information.
We need to make this call via JavaScript, as server-side code is locked down and not an option. I'd like to write a function that calls these profiles and loads the images into a wall one at a time as they are retrieved, possibly in a timed loop, so an image loads every 100 milliseconds.
I believe profile photos are stored as a text property containing the photo URL, so the URL can be set as an image's source.
How would I go about quickly loading a set of images asynchronously to provide a good user experience?
Since you do not have the server-side code option, I would suggest going for a jQuery script which renders these images. This JavaScript code can be loaded asynchronously, as suggested in this article:
https://wiki.base22.com/display/btg/How+to+load+JavaScript+dynamically+with+jQuery
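As a rough sketch of the rendering side (assuming you have already retrieved an array of profile objects whose PictureURL property holds the photo URL; the array, PictureURL and DisplayName are assumptions based on the question), the images can be added to the wall one at a time on a short timer so the page stays responsive:

// Sketch: drip-feed profile photos into the wall, one every 100 ms.
// 'profiles', PictureURL and DisplayName are assumptions based on the question.
function renderProfileWall(profiles) {
    var index = 0;
    (function addNext() {
        if (index >= profiles.length) { return; }
        var img = $('<img>', {
            src: profiles[index].PictureURL,   // photo URL from the profile property
            alt: profiles[index].DisplayName || ''
        });
        $('#profile-wall').append(img);        // the browser loads each image asynchronously
        index++;
        setTimeout(addNext, 100);              // queue the next image in 100 ms
    })();
}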
