best way to relate a variable and its ID - javascript

I am getting results from a database with a simple PHP while loop. One of the pieces of information is a number that links to another table where the value is stored. I can think of plenty of ways to link this information and display the text related to the value, but I want to know the fastest way to do it, as I have a huge set of results, so every bit of speed makes a difference. Is an array fastest? JavaScript? Any advice you can give me would be great.
The schema would look something like this:
col_table
colID(autonumber) colName(str) colState(int) colDate(date)
state_table
stateID(int) stateType(str)
I want to select the correct state type based on the colState matching a stateID, and output the stateType while preserving the stateID so I can edit the field and update the database using the number.

Using MySQL will be faster.
If you have to go through a PHP loop to read your results and make a new MySQL request each time, your script will take longer.
You can increase speed on the MySQL side by creating the right kind and number of indexes and by choosing wisely what is stored in each field.
The later you parse content, the longer it will take. If you go for JS, you will have to read the DB, loop through the results in PHP, and do it all again in JS, making yet more requests...
A join can be a good solution, and a view can be even easier to work with. You can also consider caching results. A sketch of the join follows.
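For example, a minimal sketch of the join, using the schema above. Node's mysql2 package is an assumption here (the question uses PHP); the SQL itself works verbatim from mysqli or PDO as well.
const mysql = require('mysql2/promise');

async function loadCols() {
  // Hypothetical connection details.
  const conn = await mysql.createConnection({ host: 'localhost', user: 'app', database: 'mydb' });
  // One query returns both the stateID (kept for editing) and the
  // stateType (for display). An index on col_table.colState helps here.
  const [rows] = await conn.execute(
    'SELECT c.colID, c.colName, c.colDate, s.stateID, s.stateType ' +
    'FROM col_table c JOIN state_table s ON s.stateID = c.colState'
  );
  await conn.end();
  return rows;
}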

Use a timer in PHP and a trial-and-error method: use the time returned by the timer to evaluate speed and efficiency.

You should prepare your data on the server side; it is faster.
Whether you put the work on your server or on the database depends on the case. If you have complex object graphs, then processing the results from the DB to build the associations would be time-consuming, so an ORM is the way to go; otherwise, as in your case with a simple join, I would simply retrieve all the data from the DB.
If you use PHP for rendering as well, then render it using PHP, not JS.
If you use JS for your UI, then prepare the data on the server side and publish it via a REST web service as JSON (i.e., using PHP's json_encode function), then retrieve it from JS and output it, as in the sketch below.
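A minimal sketch of that last approach, assuming a hypothetical /api/results.php endpoint that echoes json_encode($rows), and a <ul id="results"> container on the page:
fetch('/api/results.php')
  .then(function (response) { return response.json(); })
  .then(function (rows) {
    var list = document.getElementById('results');
    rows.forEach(function (row) {
      var li = document.createElement('li');
      li.textContent = row.colName + ' - ' + row.stateType;
      li.dataset.stateId = row.stateID; // keep the ID around for later edits
      list.appendChild(li);
    });
  });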

Related

Dynamically creating tables with indexedDb

On my web app, the user can request different data lines. Each data line has a unique "statusID", say "18133". When the user requests to load the data, it is either loaded from the server or from IndexedDB (that is the part I'm trying to figure out). To make it as fast as possible, I want the index to be the timestamp of the data, as I will request ranges which are smaller than the actual data in the IndexedDB. However, I am trying to figure out how to create the stores and store the data properly. I tried to dynamically create stores every time data with a new ID is requested, but creating stores is only possible in "onupgradeneeded". I also thought about storing everything in the same store, but I fear the performance would make that a bad choice. I do not really know how to approach this.
What I do know: if you index a value, the data is sorted, which is exactly what I want. I don't know if the following is possible, but it would solve my issue too: store everything in the same store, index by "statusID" and index by "timestamp". This way it would be fast too, I guess.
Note that I am talking about many, many datapoints, possibly in the millions.
You can index by multiple values, allowing you to get everything with a given statusID while restricting to a range for your timestamp. So I'd go with the one-datastore solution; performance should not be an issue. There is a sketch after the link below.
This earlier post may be helpful: Javascript: Searching indexeddb using multiple indexes
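A sketch of the single-store design with a compound index; the store and index names here are illustrative:
var open = indexedDB.open('mydb', 1);

open.onupgradeneeded = function (e) {
  var db = e.target.result;
  var store = db.createObjectStore('datapoints', { autoIncrement: true });
  // Compound index: records sort by statusID first, then timestamp.
  store.createIndex('status_time', ['statusID', 'timestamp']);
};

open.onsuccess = function (e) {
  var db = e.target.result;
  var index = db.transaction('datapoints')
                .objectStore('datapoints')
                .index('status_time');
  // All datapoints for statusID 18133 within an example timestamp range:
  var range = IDBKeyRange.bound([18133, 0], [18133, Date.now()]);
  index.getAll(range).onsuccess = function (ev) {
    console.log(ev.target.result);
  };
};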

Is it possible to send localStorage data, and then retrieve it again? AJAX?

My scenario is this: I have a very small array in my js file. When the page loads, a function loops through the array and generates an li element for each item in the array, displaying its name and price in the li. The array is constructed like this:
var gameList = [
{ name: "", value: 0.00},
]
Secondly, I have a simple form on the page that allows me to add new items to the array, and using localStorage it's possible for me to keep a dynamically updated array. I push new items into the array (gameList), then at the end of the session I save it using localStorage.
localStorage.setItem("updatedGameList", JSON.stringify(gameList));
I have a couple of lines at the start of my code that sets my original array 'gameList' to be equal to the locally stored, updated game list.
var retrievedData = localStorage.getItem("updatedGameList");
gameList = JSON.parse(retrievedData);
So this is fine for now, but the growing array - which I want to keep and maintain - is only available in this browser, on this machine.
So, my question is: can I send this locally stored data somewhere? Maybe my personal domain? (Which is where I will host the app when it's finished.) That way I could then reference it properly in my js file so that the data is always available. Maybe the array could have its own js file?
I realise that this may not be the best way to be handling what is essentially a database. But I'm only part way through an online course and I'm using the tools that I have to make this work.
And lastly, in terms of maintenance of the array, is there any way to send it back to Sublime in the form of a .js file? I know this could be a crazy question. The updated array will become pretty big, maybe 200 items eventually, and it would be much easier to maintain from within Sublime.
Thanks for your time, and apologies if part of this request is ridiculous!! :)
I have just been reading about AJAX, and thought maybe there's a way to send the updated array as a json file to somewhere(!) on my website, and then request that same file at the start of each new session, so I'm always working with, and saving, the latest updated array.
Thanks for reading, and hopefully you have some answers! :)
Although not quite what I was looking for - essentially some way of automatically getting the new array, sending it somewhere more secure than local storage, and then referencing the new array to give me the most up-to-date starting point each time (all with just JavaScript) - the 'dirty' way suggested below turned out to be sufficient for now, until I start using databases.
From Kirupa, over at the forums:
Not a ridiculous question at all! You can send your own data anywhere you want, but it will require some level of server-related code. The easiest way to send data back and forth is through JSON, and you can convert your array into a JSON format easily using something like the following:
var jsonData = JSON.stringify(myArray);
From here, you can send this data to a database, to another web site, or to your e-mail server. If you want something really quick and dirty, you can literally just copy the contents of your JSON-ized array using the Chrome Dev Tools, save it on disk as a .js file, and reference it again in your app. That is a manual way of doing something that you don't really care about automating.
The best solution is to store this in a database. They've gotten easier to deal with as well. Firebase is my go-to for things like this, and this video might give you some ideas: https://www.youtube.com/watch?v=xAsvwy1-oxE
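As a rough sketch of the AJAX idea from the question, with hypothetical /save-games and /load-games endpoints (each needs a small server-side script behind it):
function saveToServer() {
  var data = localStorage.getItem('updatedGameList') || '[]';
  return fetch('/save-games', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: data // the JSON string produced by JSON.stringify(gameList)
  });
}

function loadFromServer() {
  return fetch('/load-games')
    .then(function (res) { return res.json(); })
    .then(function (list) { gameList = list; });
}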

Using Jquery AutoComplete with dictionary list

I have a dictionary list of about 58,040 words and I don't think jQuery autocomplete can handle that many, as the browser hangs.
The list is
words = ['axxx', 'bxxx', 'cxxx' /* and so on */];
$(".CreateAddKeyWords input").autocomplete({ source: words });
Am I doing something wrong?
Is there another free tool that I can use?
Edit
I am using .NET. I have retrieved the data from the database and can loop through it server-side, but how do you send the data back? If it should be JSON, what should the format look like?
Is there another free tool that I can use?
Yes. Instead of hardcoding 58,040 words in your HTML or JavaScript file, you could load them from a remote data source using AJAX. Basically, you have a server-side script which, when queried with the current user input, prefilters the results and sends them to the client to display as suggestions.
You should also require a minimum length of user input before searching (so it isn't querying with 1 or 2 characters).
$(".CreateAddKeyWords input").autocomplete({ source: words, minLength: 3 });
It's possible the browser is hanging because it is trying to search on the very first character, which is not very useful. ~58k entries is not a large dataset by most standards, especially when you narrow it with a 2-3 character minimum. A remote-source sketch follows.
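A sketch of the remote-source version; the /words handler URL is an assumption. jQuery UI appends the user's input as the "term" query parameter and expects a JSON array back, either ["apple", "apricot"] or [{ "label": "...", "value": "..." }] (which also answers the JSON-format question in the edit above):
$(".CreateAddKeyWords input").autocomplete({
  source: "/words",  // server filters by the "term" parameter, returns JSON
  minLength: 3       // don't query on 1-2 characters
});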
That's just way too much data to load into your webpage. Limit it to 2 letters:
1) set the autocomplete min length to at least 2
2) Create a webpage that returns JSON data - http://mydomain.com/words.php?q={letters}
You can have the filter sort 'begins with' matches before 'contains' matches, or any variation you prefer.
Use that page as your remote data source. With the min length set, autocomplete knows when to query for new data.
I thought this was an interesting problem, and hacked up a backend service that solves auto-completion.
My code is at https://github.com/badgerman/fastcgi/ (look for complete.c), and the quick and dirty javascript proof of concept from that repository is currently at http://enno.kn-bremen.de/prefix.html (no guarantees that it will stay up for very long, since this is running on the Raspberry Pi in my home).

How dangerous is it to store JSON data in a database?

I need a mechanism for storing complex data structures created in client-side JavaScript. I've been considering using JSON.stringify to convert the JavaScript object into a string, storing it in the database, and then pulling it back out and using JSON.parse to get the JavaScript object back.
Is this just a bad idea or can it be done safely? If it can, what are some pitfalls I should be sure to avoid? Or should I just come up with my own method for accomplishing this?
It can be done and I've done it. It's as safe as your database.
The only downside is it's practically impossible to use the stored data in queries. Down the track you may come to wish you'd stored the data as table fields to enable filtering and sorting etc.
Since the data is user-created, make sure you use a safe method to insert it, to protect yourself from injection attacks (don't just blindly concatenate the data into a query string). For example:
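A parameterized-insert sketch. The asker's server stack isn't specified, so this assumes Node with the mysql2 package and a hypothetical documents table:
const mysql = require('mysql2/promise');

async function saveDocument(userObject) {
  const conn = await mysql.createConnection({ host: 'localhost', user: 'app', database: 'mydb' });
  // The ? placeholder keeps the user-supplied string out of the SQL text entirely.
  await conn.execute('INSERT INTO documents (payload) VALUES (?)',
                     [JSON.stringify(userObject)]);
  await conn.end();
}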
It's fine so long as you don't deserialize using eval. For example:
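A minimal illustration of that point:
var stored = '{"name":"x"}';            // string pulled back from the database

var obj = JSON.parse(stored);           // safe: parses data, nothing executes
// var obj = eval('(' + stored + ')');  // unsafe: runs whatever the string contains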
Because you are using a database, you need a server-side language to communicate with it, and any data is easily converted to and from JSON with most server-side languages.
I can't imagine a proper use case unless you have a sh*tload of JavaScript, it needs to be very performant, and you have exhausted all other possibilities such as caching, query optimization, etc.
Another downside of doing this is that you can't easily query the data in your database, which is always nice when you want to get any kind of reporting done.
And what if your JSON structure changes? Will you update all the stored structures in your database, or will you force yourself to cope with the changes in the parsing code?
Conclusion
Imho it is not dangerous to do so, but it leaves little room for manageability and future updates.

Finding changes in MongoDB database

I'm designing a MongoDB database that works with a script that periodically polls a resource and stores the response in the database. Right now my database has one collection with four fields: id, name, timestamp, and data.
I need to be able to find out which names had changes in the data field between script runs, and which did not.
In pseudocode,
if (data[name][timestamp] == data[name][timestamp+1])  // data has not changed
    store data in collection 1
else  // data has changed between script runs for this name
    store data in collection 2
Is there a query that can do this without iterating and running javascript over each item in the collection? There are millions of documents, so this would be pretty slow.
Should I create a new collection named timestamp for every time the script runs? Would that make it faster/more organized? Is there a better schema that could be used?
The script runs once a day so I won't run into a namespace limitation any time soon.
OK, this is a neat question, b/c the short answer is basically: you will have to iterate and run JavaScript over each item.
The part where this gets "neat" is that this isn't really different from what an SQL solution would have to do. I mean, you're basically joining a table to itself where x.1=x.1 and y.1=y.2. Even if the relational DB can handle such a beast, it's definitely not going to be fast with millions of entries.
So the truth is, you're doing this the right way. Here are the extra details I would use to make it cleaner.
Ensure that you have an index on Name/Timestamp.
Run a db.mycollection.find().forEach() across the data set.
For each entry, you're going to: a) perform the comparison; b) save appropriately; c) update a flag indicating that this record has been processed.
On future loads you should be able to add a query to your find: db.mycollection.find({flag: {$exists: false}}).forEach(...)
Use db.eval() to help with speed.
The reason for the "Name/Timestamp" index is that you're going to be looking up each "successor" by "Name/Timestamp", so you want to be quick here.
The reason for the "processed" flag is that you should never have to re-run the same item. If given timestamp 'n' you find 'n+1', then that's the only 'n+1' you're going to have.
Honestly, if you're only running this once a day, it's quite likely that the speed will be just fine, especially if you're only running on new records. Just assume that it's going to take several minutes.
