How would I speed up this JSON request process? [closed] - javascript

Apologies if this is too abstract.
I'm using mainly jQuery to create a Chrome extension where users can subscribe to feeds. A 'random' selection of items from these feeds is then delivered on the user's new tab page.
Here's the process I'm planning to use. Is this the fastest way of doing it?
1. Check which feeds the user is subscribed to and use nested $.getJSON requests to bring them in.
2. Cache these locally for at least the next 12 hours so they can be quickly retrieved again (sketched below).
3. Convert the feeds to JSON objects and combine using concat.
4. Shuffle the items in this new, single, combined feed.
5. Load the first 36 results (and display using Masonry).
6. Add an infinite scroll that loads blocks of 36 results when the user scrolls down.
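For step 2, here's a minimal sketch of a client-side cache with a 12-hour TTL, assuming localStorage is usable on the new tab page (chrome.storage.local would also work) and a hypothetical feedUrls array of subscribed feed URLs:

    var CACHE_TTL = 12 * 60 * 60 * 1000; // 12 hours in milliseconds

    function getFeed(url) {
      // Serve from the local cache when the entry is fresh enough.
      var cached = JSON.parse(localStorage.getItem(url) || 'null');
      if (cached && Date.now() - cached.time < CACHE_TTL) {
        return $.Deferred().resolve(cached.data).promise();
      }
      // Otherwise fetch, cache with a timestamp, and pass the data along.
      return $.getJSON(url).then(function (data) {
        localStorage.setItem(url, JSON.stringify({ time: Date.now(), data: data }));
        return data;
      });
    }

    // Fetch all subscribed feeds in parallel rather than nesting requests.
    $.when.apply($, feedUrls.map(getFeed)).done(function () {
      var combined = [].concat.apply([], arguments); // step 3: merge
      // step 4: shuffle `combined`, then hand the first 36 items to Masonry
    });

Fetching the feeds in parallel rather than nesting the $.getJSON calls also avoids paying the ~1 second per feed serially.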
On top of this, I also intend to cache the JSON feeds on the server for 12 hours or so.
Is this the quickest way of going about it? Might it be faster to swap steps 2 and 3 around, given that users won't subscribe to new feeds very often?
As the emphasis is on loading speed, there's plenty else I'm happy to sacrifice. For example, I don't have to use JSON if a database would be better. Similarly, the caching can be for a long period of time, because the items the user is shown will be in a 'random' order and so needn't be the latest.

You could do most of this on the server side and only send the 36 required items, which should load very fast depending on their content.
Your biggest bottlenecks are the following:
Fetching the various feeds (~1 second per feed)
Building a list of those feeds for a specific user
Downloading the feeds to the user's browser
The last two are not terribly long, but the difference between loading 36 presorted items and organizing an arbitrary number of random elements is important when the new tab page is only used for a few seconds.
The easiest solution is to fetch all feeds on the server, put them in a database and retrieve the last 36 matches for the users that need them. This has several advantages:
Fetching from a database takes milliseconds. Fetching a feed from a remote page can take a few seconds.
Databases are made to be queried, so fetching, merging and sorting results is easy.
The short, pre-filtered list of items will download far faster.
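As a rough sketch of the server-side retrieval under these assumptions (the node mysql package and the items/subscriptions schema below are illustrative, not prescribed by the answer):

    // Hypothetical schema: items(feed_id, title, url, published_at)
    // and subscriptions(user_id, feed_id).
    var mysql = require('mysql');
    var db = mysql.createConnection({ host: 'localhost', database: 'feeds' });

    function latestItemsFor(userId, cb) {
      db.query(
        'SELECT i.title, i.url FROM items i ' +
        'JOIN subscriptions s ON s.feed_id = i.feed_id ' +
        'WHERE s.user_id = ? ORDER BY i.published_at DESC LIMIT 36',
        [userId],
        cb // cb(err, rows) receives the 36 pre-filtered items
      );
    }

A scheduled job would refresh the items table from the remote feeds every 12 hours or so, so no user request ever waits on a remote fetch.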

Related

What would be the best solution to store and work with data in JavaScript without a database? [closed]

Alright, please allow me to elaborate on that. I am trying to build a small application for visual flight planning, to help with the calculations. I am using HTML/CSS/JavaScript to make something simple. There are objects for airports, waypoints and airplanes, for example. You select an airport for your origin, another for your destination, input the waypoints you will fly through, and choose the aircraft; the app then uses the coordinates and airplane characteristics to calculate the legs. It has a few more steps, but that's the basic concept.
So what I would like to do is store the data in a separate file and retrieve just the data I need when an element is selected. I am at beginner level, so I don't know if I am daydreaming.
Right now what I did was create an array with all airports, another with all waypoints and another with the aircraft. A function retrieves the data from the array and returns the object. But that seems like a waste of memory.
Another idea I had was to make a function with a switch statement, using the IDs (airport/waypoint codes, for example) and returning the selected object. I did this for the magnetic deviation, using the coordinates.
I would like to make something that I can update later in the future, but with a simple structure, if at all possible. SQL certainly comes to mind, but I am thinking of something more local. localStorage also appears to be an option, but it would have to be initialized every time the browser loaded, and it would still be a waste of memory.
This question is somewhat open-ended, and some people could even view it as opinion-based. However, I think a concrete answer can be found by accounting for some details.
Storing it in the application (arrays/objects)
Yes, that's what you're doing now, and that's what you view as a waste of memory, I know. However, depending on your situation, you shouldn't worry too much about that.
Anyway, I'd recommend keeping your data this way when it is not too large and is unlikely to change. This is probably not the best alternative for you, though, since you said you intend to update the data in the future. It's not only about memory, but also about maintainability.
Local web storage
If the solution above is not suitable for you, this one probably isn't either. Why? It's simple: to store data in localStorage, you need to have that data in your code first, in the form of an array or object. In the end, you'd have the data in your application AND in local web storage. It doesn't make much sense, don't you agree?
localStorage should be used for storing data which is relative to the user and which would otherwise be lost after the page is left, unless you save it. It is not a good choice for storing data that must be available to you under any condition.
Using a JSON file
This could be a good solution if your data is not likely to change constantly. It would also meet your expectations, since you could request data from it only when it's needed. You should take your own situation into account, but from what I understand of what you said (you have data which is going to change sometimes and is somewhat large), using a JSON file is a good option if you don't want to use a database.
Finally, there are several advantages, even for static (but somewhat large) data, to using a database. You could opt, for example, for SQLite, which is an embedded database. However, that is a topic for another question.
For now, since your data will change sometimes and is potentially lengthy, using a JSON file looks like the most suitable option within the "no-database" limits.
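To make the JSON-file option concrete, here is a minimal sketch; the file name, its shape, and the sample values are made up for illustration:

    // Hypothetical data/airports.json:
    // { "SBGR": { "name": "Guarulhos", "lat": -23.43, "lon": -46.47 }, ... }
    var airportsPromise; // fetched and parsed once, then reused

    function getAirport(code) {
      airportsPromise = airportsPromise ||
        fetch('data/airports.json').then(function (res) { return res.json(); });
      return airportsPromise.then(function (airports) { return airports[code]; });
    }

    getAirport('SBGR').then(function (airport) {
      console.log(airport.name, airport.lat, airport.lon);
    });

This keeps the data out of your code, fetches it only when first needed, and lets you update the file later without touching the application.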
localStorage is the way to go.
Just save your object with localStorage.setItem('item', JSON.stringify(obj)) and retrieve it later with JSON.parse.
There's no better way to save data locally on the client side.
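The full round trip, with an illustrative key and object:

    var plan = { origin: 'SBGR', destination: 'SBSP', waypoints: [] };
    localStorage.setItem('flightPlan', JSON.stringify(plan)); // store as a string

    var restored = JSON.parse(localStorage.getItem('flightPlan') || 'null');
    if (restored) {
      console.log(restored.origin); // "SBGR"
    }

Note that localStorage only holds strings, hence the stringify/parse pair.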

Create navigable tree from csv of path names [closed]

We are struggling with a large migration project currently and are on the last leg of putting together the user interface.
I have a list of (effectively) folder paths from the old system, like so:
/Programme1/Project1/WorkPackage1/Resources
/Programme1/Project1/WorkPackage1/Plans
/Programme1/Project1/WorkPackage1/Finance
/Programme1/Project1/WorkPackage1/Reporting
/Programme1/Project1/WorkPackage1/Documents
/Programme1/Project1/WorkPackage2/Resources
/Programme1/Project1/WorkPackage2/Plans
/Programme1/Project1/WorkPackage2/Finance
/Programme1/Project1/WorkPackage2/Reporting
/Programme1/Project1/WorkPackage2/Documents
/Programme1/Project2/WorkPackage1/Resources
/Programme1/Project2/WorkPackage1/Plans
/Programme1/Project2/WorkPackage1/Finance
/Programme1/Project2/WorkPackage1/Reporting
/Programme1/Project2/WorkPackage1/Documents
/Programme2/Project1/WorkPackage1/Resources
/Programme2/Project1/WorkPackage1/Plans
/Programme2/Project1/WorkPackage1/Finance
/Programme2/Project1/WorkPackage1/Reporting
/Programme2/Project1/WorkPackage1/Documents
/Programme2/Project1/WorkPackage2/Resources
/Programme2/Project1/WorkPackage2/Plans
/Programme2/Project1/WorkPackage2/Finance
/Programme2/Project1/WorkPackage2/Reporting
/Programme2/Project1/WorkPackage2/Documents
/Programme2/Project2/WorkPackage1/Resources
/Programme2/Project2/WorkPackage1/Plans
/Programme2/Project2/WorkPackage1/Finance
/Programme2/Project2/WorkPackage1/Reporting
/Programme2/Project2/WorkPackage1/Documents
Currently these are in a CSV. We need to be able to create a navigable object that a user can move through to find the relevant documentation.
We have a number of issues:
There are 114,000+ rows in the CSV.
We know the max number of subfolders and it's large (too many to code manually!).
There are special characters in the list, including umlauts, French accented chars and Greek alphabet chars...
A fair number (2,000+) of the rows are longer than 400 characters.
We're also limited in what tools we can use. We've been playing with JSON/jQuery/jsTree/JavaScript/Excel VBA and have had some success, but it's been painful.
If anyone out there has had a similar challenge and any success I'd be interested in finding out how you went about it!
Thanks for looking.
Fohls
If I were you, I'd transform the flat paths from the CSV into a tree structure and store it in a database. This is essentially the information you already have, but without the redundancy of the CSV. From there it's pretty straightforward to transform it into a presentation form; jsTree is one good option. It shouldn't take long to get that up and running.
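A minimal sketch of that transform in JavaScript, assuming the CSV has already been read into an array of path strings; the { text, children } node shape is the JSON format jsTree accepts via its core.data option:

    function buildTree(paths) {
      var root = {};
      paths.forEach(function (path) {
        var node = root;
        path.split('/').filter(Boolean).forEach(function (part) {
          node = node[part] = node[part] || {}; // create missing levels
        });
      });
      // Convert the nested object into jsTree's node format.
      function toJsTree(obj) {
        return Object.keys(obj).map(function (name) {
          return { text: name, children: toJsTree(obj[name]) };
        });
      }
      return toJsTree(root);
    }

    // $('#tree').jstree({ core: { data: buildTree(paths) } });

With 114,000 rows you would likely still want the database plus lazy loading, building each level on demand rather than the whole tree up front.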
The solution was to create a mini DB (one table) containing all the folder paths in a single column, plus their parent and whether they had children (figured out using VBA...).
We then used a REST call to pull back the paths/nodes related to the node we clicked on, and pushed those into a JavaScript function on the page to convert them to JSON and feed them to jsTree!
Painful, but it works... Now to work out how to make the throbber display when we click...

How 'heavy' is it on the server to run MySQL commands every few seconds [closed]

I have a set of data that is used by a GUI for positions and such, and those positions are also stored in MySQL. Every time the data changes, the change is reflected on the server too.
So if I want to update MySQL as frequently as jQuery (or JavaScript, I don't know) tracks "mousemove" motion, that is, send a request to the server to change the value in MySQL for every "mousemove" event, how 'heavy' is that, especially when multiple users are using the same server?
What would be a better solution? I'm thinking of waiting 3 seconds for the motion to finish, and only sending the request to the server if there are no more motions.
Multiple people aren't using my server yet, but this always concerns me and hinders me from progressing. Please help.
It's better to have an intermediate in-memory solution, e.g. Memcached, to mitigate the number of database calls. But before we begin, let's have some numbers.
So let's do the math, shall we?
You want to track these events in detail:
| Action to be recorded        | Frequency of occurrence | Per user/minute avg. |
|:----------------------------:|:-----------------------:|:--------------------:|
| 1. MouseMove!!! (Seriously?) | Very heavy!!!           | 200                  |
| 2. Clicks (MouseDown)        | Medium to high          | 10                   |
| 3. Hover (MouseOver)         | High                    | 50                   |
So, as a very rough ballpark estimate, for one active user the total events fired in a session would be 260 per minute.
For, let's say, 10 concurrent users, your events per minute become 2,600.
And this becomes a surefire way of DDoS(ing) your own server.
Some useful hints.
Try to log the events in batches, i.e. collect the events as they occur on the client side and, once a threshold is crossed, send a single request to the server to log that batch (see the sketch after this list).
Do not use your main application server for this kind of logging, because this thing is -- wait for it -- logging and you should use a separate server for maintaining logs.
As said earlier, on the server too, implement an in-memory store of logs before pushing them to the database.
This data doesn't even need to be stored in a relational DB. Nearly all big companies use flat/NoSQL stores for this kind of info.
If you still decide to proceed with a relational DB, use transactions when doing the INSERTs.
If the tracking is to be done only for logged-in users, and a dedicated in-memory cache is not feasible for you, you could store the tracking data in the user's session and, again, move the records to the DB after a threshold.
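A minimal sketch of the client-side batching from the first hint (the /log endpoint and the batch size are made up):

    var buffer = [];
    var BATCH_SIZE = 50; // threshold before a batch is shipped

    document.addEventListener('mousemove', function (e) {
      buffer.push({ t: Date.now(), x: e.clientX, y: e.clientY });
      if (buffer.length >= BATCH_SIZE) flush();
    });

    function flush() {
      if (!buffer.length) return;
      var batch = JSON.stringify(buffer.splice(0, buffer.length)); // empty the buffer
      if (navigator.sendBeacon) {
        navigator.sendBeacon('/log', batch); // queued even during page unload
      } else {
        $.post('/log', { events: batch });
      }
    }

    // Ship whatever is left when the user leaves the page.
    window.addEventListener('unload', flush);

One request per 50 events turns the 2,600 events per minute above into roughly 52 requests per minute, which an ordinary server absorbs easily.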

Best technology to handle such a complex system: web app + heavy processing [closed]

I'm working on a web application (JavaScript + PHP) for a taxi company. I need to add an advanced alert system in which each app user (company employees) can specify what type of alert they would like to get. Let's say user#1 wants to track taxi#101 and get alerted, with details shown, whenever it enters any customer's district; user#2 wants to get alerted whenever taxi#202 and taxi#303 enter any customer's area, etc.
Note: there are many other cases for alerts, but I prefer to think of them one after the other.
I started sketching the high-level design as follows:
1. I'll add an alert option to each taxi in the app.
2. user#1 will choose taxi#101 to get alerts from.
3. The taxiID will be added to a DB table along with the userID.
4. Do batch processing every 3 hours, joining this table with another table that has the taxis' positions.
5. Get a list of positions + areas + userIDs (not sure what technology I should use here to generate this list).
6. A script runs to check each area from the list and gets all the customer info inside it.
7. Join the result table from step 6 with the table from step 3.
8. Every time a user logs in, a script runs to check the table and list the info for the customers the taxi came near.
My questions are:
How can I translate these steps into a real software design flow? (exact techniques)
There must be a technique to handle all this headache. Imagine that there are 50 users and over 5,000 taxis. I'm thinking of Drools and have started reading about it, but I'm still not sure how it fits here.
I'd be grateful for any hint.
EDIT
Thank you Alexander. Here are the answers to your questions:
Q: Why would any one user even need to get alerted?
A: I'm not sure. For generating reports or something like that.
Q: Do you always know the positions of Taxis?
A: Yes, the system receives real-time positions through GPS.
Q: Do you always know the positions of Users?
A: No, it's not important. They are regular employees. I just need their IDs so I can show each one the info he asked for.
Q: How do you know what an 'area' is, and what determines if a user/taxi is in range?
A: When I get the position of the taxi, I draw a virtual zone around it and call it an area. It should be inserted in the DB as well.
Q: are the rules predefined? Or must the user be able to create them dynamically?
A: Yes, they are predefined. There are many other rules, such as "the user can choose to get alerted when a specified taxi has not moved for a long time", but I started with this one.
The scenario: each user will request different alerts for different taxis, and I should do some processing and then display the results for each user.
I couldn't figure out how to process these different variables and what's the best solution for this case.
Out of interest: why would any one user even need to get alerted?
What data do you have access to?
Do you always know the positions of Taxis? Is this GPS? Do drivers add/update data?
Do you always know the positions of Users? Is this GPS? Do users add/update data?
How do you know what an 'area' is, and what determines if a user/taxi is in range?
KISS - Keep It Simple Shadin
I propose the following entities in your system:
Taxi
    + id : int
    + positions : Position[]
User
    + id : int
    + username / password / email etc.
    + observing : Taxi[]
    + rules : Rule[]
Rule
    + (I do not know what you need or expect from this)
I don't know what you mean by area, but you probably need some entity for that as well, though maybe not in the database - I don't know.
Now, for the rules. It seems you want a Domain Specific Language (DSL).
Then the question becomes, are the rules predefined? Or must the user be able to create them dynamically?
If they are predefined, you could simply implement a class for each rule and execute it in your controller (or whatever you use to handle HTTP requests) when it's time to check that rule. Something like myRule.checkFor(user). It may even be a worker process of some kind.
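For the predefined case, a class per rule could be as simple as this sketch (all names here are illustrative, not an existing API):

    // One class per predefined rule type.
    function NoMovementRule(taxiId, maxIdleSeconds) {
      this.taxiId = taxiId;
      this.maxIdleSeconds = maxIdleSeconds;
    }

    NoMovementRule.prototype.checkFor = function (user) {
      // fetch recent positions for this.taxiId; if the taxi has not moved
      // within maxIdleSeconds, queue an alert for `user`
    };

    // e.g. in the controller or a worker:
    // user.rules.forEach(function (rule) { rule.checkFor(user); });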
I'll update my answer when you answer my questions as comments :)
EDIT:
Ok, so I think I'm starting to get a feel for your problem. This would be my solution:
REST API:
POST yoursite.com/v1/taxi/1.json {x:0.23, y:23} //update position for taxi 1 with json payload
POST yoursite.com/v1/user/1/register {taxi:1, eventType:NO_MOVEMENT, args:"3600"} //no mov for 1 hr
POST yoursite.com/v1/user/1/unregister {taxi:1, eventType:NO_MOVEMENT }
This seems to me like it should be sufficient for aggregating the data you need. It answers the following questions:
Where are the taxis? Which user is looking for events on x taxi?
You'll need a table (taxipos) for storing each taxi's id and its position. [taxiid:int, long:float, lat:float]
You'll need a table (listeners) for storing which taxi is listened to by which users and for what event. [userid:int, taxiid:int, event:enum, arguments:string]
You'll need a table (events) that stores each event. [eventid:int, eventType:enum, taxiid:int]
The next step in my proposed solution would be to have a worker continuously aggregate events based on the received data. Say you define a NoMovement process. This process is run at a frequency sufficient to detect the smallest allowed interval in your system. For this example, let's assume we can run at a 15-minute interval (that is to say, you're not interested in a taxi that stands still for only 14 minutes).
Now, each of your processes will be able to store events in a new table in your database. Let's investigate the NoMovement process (which will be run every 15 minutes by your worker):
For each $taxiid present in listeners#taxiid:
    Retrieve the taxipos closest to now and the taxipos closest to 15 minutes ago (the interval between them must be >= 15 minutes)
    Check the distance between those two positions
    If the distance is < THRESHOLD (I don't know what this should be)
        Add a row to the event table with (eventType:NO_MOVEMENT, taxiid:$taxiid); you get $taxiid from the loop
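A rough JavaScript sketch of that process; getPositions and recordEvent are hypothetical data-access helpers over the taxipos and events tables, and the threshold value is a guess:

    var THRESHOLD_METERS = 50; // arbitrary; tune for GPS jitter

    // Haversine distance between two { lat, long } points, in meters.
    function distanceMeters(a, b) {
      var R = 6371000, rad = Math.PI / 180;
      var dLat = (b.lat - a.lat) * rad, dLon = (b.long - a.long) * rad;
      var h = Math.sin(dLat / 2) * Math.sin(dLat / 2) +
              Math.cos(a.lat * rad) * Math.cos(b.lat * rad) *
              Math.sin(dLon / 2) * Math.sin(dLon / 2);
      return 2 * R * Math.asin(Math.sqrt(h));
    }

    // Run every 15 minutes for each taxiid present in listeners.
    function checkNoMovement(taxiId) {
      var positions = getPositions(taxiId, Date.now() - 15 * 60 * 1000);
      if (positions.length < 2) return; // not enough data to decide
      var first = positions[0], last = positions[positions.length - 1];
      if (distanceMeters(first, last) < THRESHOLD_METERS) {
        recordEvent({ eventType: 'NO_MOVEMENT', taxiid: taxiId });
      }
    }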
Finally, it's just a matter of:
GET yoursite.com/v1/events/taxiid (your client will need to store which taxiids to ask events for, and call GET at an appropriate interval; the client should then locally mark events by id as read and filter those out before alerting)
Does this make any sense to you?
From what you are saying, it sounds like your system will include mobile devices of some kind, and you will need a map system for tracking.
The best solution I can recommend is using something like Leaflet with the OpenStreetMap API. This gives you the map as an SVG with ample XML data for tracking locations, and a simple JS script can use AJAX to post location data to the server at set intervals.
Using Backbone.js gives you simple event-listener control for your alerts. You just have to test the taxi location against the customer areas when locations are updated.
List of OSM frameworks including Leaflet http://wiki.openstreetmap.org/wiki/Frameworks
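A minimal Leaflet sketch along those lines, assuming a div with id "map" on the page and a hypothetical /position endpoint on the server:

    var map = L.map('map').setView([51.5, -0.09], 13);
    L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
      attribution: '&copy; OpenStreetMap contributors'
    }).addTo(map);

    var taxiMarker = L.marker([51.5, -0.09]).addTo(map);

    // Post the device position at a set interval and move the marker.
    setInterval(function () {
      navigator.geolocation.getCurrentPosition(function (pos) {
        var latlng = [pos.coords.latitude, pos.coords.longitude];
        taxiMarker.setLatLng(latlng);
        $.post('/position', { lat: latlng[0], lng: latlng[1] });
      });
    }, 15000);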

Dynamic filtering, am I doing it wrong?

So I have an Umbraco site with a number of content-managed products in it. I need to search/filter this dataset on the front end based on 5 criteria.
I'd estimate I will have 300 products. I need to filter this data very fast and hide/show options that are no longer relevant based on the previous selections.
I'm currently building a webservice and jquery implementation using AJAX.
Is the best way to do this to load it into a JavaScript data structure and operate on it there, or will AJAX calls be fast enough? Obviously the former will mean duplicating the functionality on the server side for non-JavaScript users.
If you need to filter the data "very fast" then I imagine the best way is to preload all the data then manipulate it client side. If you're waiting for an Ajax response every time the user needs to filter the data then it's not going to be as fast as filtering it on the client (assuming they haven't got an ancient computer running IE6).
It would depend on the complexity of your filtering. If all you're doing is showing results where, for example, the product's price is greater than $10, then client-side will definitely be much faster. If you're going to be doing complex searches, then it's possible that processing server-side could be faster. The other question is how much data is saved for each product: preloading a few hundred products with a lot of data may take some time.
As always, the only way you'll truly be able to answer this question is by profiling the two solutions.
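For scale, here is a minimal sketch of the client-side route (field names and criteria are made up):

    var products = []; // loaded once via AJAX when the page opens

    function filterProducts(criteria) {
      return products.filter(function (p) {
        // A product matches when every non-null criterion equals its field.
        return Object.keys(criteria).every(function (key) {
          return criteria[key] == null || p[key] === criteria[key];
        });
      });
    }

    var visible = filterProducts({ colour: 'red', category: 'sofas' });

At 300 products a linear scan like this completes in well under a millisecond, so the preload-and-filter approach is very likely to beat a round trip to the server for this dataset.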
