I have a situation where I need to insert into multiple tables, and the last table depends on the one before it plus an extra list of items. I am using pg with Node and JavaScript.
Table 1:
insert into table1 (col1, col2) values (1, 2) returning col1 as table1_id
Table 2:
insert into table2 (col1, col2) values
(table1_id, val1),
(table1_id, val2),
...
(table1_id, val_n)
I have seen examples here where people do this, but with just one row in the second table. I will have a list of items (val1 ... val_n) that needs to be merged with the id generated by the first insert into table1. This is going to be a relationship table holding the list of items that belong to a record in table1.
Any clues would be appreciated.
Thanks
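To make it concrete, this is roughly what I'm trying to end up with using node-postgres, in one transaction (the table and column names are just placeholders, and I'm not sure this is the right approach):
const { Pool } = require('pg');
const pool = new Pool(); // connection settings come from the environment

async function insertWithChildren(col1, col2, items) {
  const client = await pool.connect();
  try {
    await client.query('BEGIN');
    // first insert, returning the generated id
    const res = await client.query(
      'INSERT INTO table1 (col1, col2) VALUES ($1, $2) RETURNING col1 AS table1_id',
      [col1, col2]
    );
    const table1Id = res.rows[0].table1_id;
    // second insert: one row per item, all pointing at the new id
    // (items is a JS array of strings; is there a better way to merge it with the id?)
    await client.query(
      'INSERT INTO table2 (col1, col2) SELECT $1, unnest($2::text[])',
      [table1Id, items]
    );
    await client.query('COMMIT');
    return table1Id;
  } catch (err) {
    await client.query('ROLLBACK');
    throw err;
  } finally {
    client.release();
  }
}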
I think you should create a 'dao' layer in Postgres: a Postgres function that takes all the parameters and returns what the frontend needs. This Postgres function performs all the inserts you want.
You can manage your problem like this:
CREATE OR REPLACE FUNCTION function_what_i_want(val1 integer, val2 integer, my_argument integer[])
RETURNS bigint AS
$BODY$
DECLARE
  result bigint;
BEGIN
  -- parent row; the id comes from the sequence
  INSERT INTO table1(col1, col2, col3) VALUES (nextval('seq1'), val1, val2);
  result := currval('seq1');
  -- one child row per element of the array argument
  INSERT INTO table2(c1, c2)
  SELECT result, m.c2
  FROM unnest(my_argument) AS m(c2);
  RETURN result;
END;
$BODY$ LANGUAGE plpgsql;
I don't know your project. If you don't need to search on these tables, it might be better to store these lists as JSON.
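From Node you can then call the function in a single statement. A rough sketch with node-postgres, assuming a configured Pool and that the function returns the new table1 id as above:
// inside an async function, with a configured pg Pool
const res = await pool.query(
  'SELECT function_what_i_want($1, $2, $3) AS table1_id',
  [1, 2, [10, 20, 30]]
);
console.log(res.rows[0].table1_id);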
I want to store this list of lists, but I don't know how to store it in MySQL.
list[x][y] - the items in this list contain {li:[{x:x,y:y}], pos:{x:y}}
list[x][y].li[z].x
list[x][y].li[z].y
list[x][y].pos.x
list[x][y].pos.y
For better understanding, please have a look at this
edited:
Is this right? So this means I will only have 2 tables?
You should use a separate table with sub-lists that have a parent_id column, and then a third table with the actual list items of the low-level lists.
The query for this will look like this:
SELECT li.x, li.y, sl.id
FROM li_items li
JOIN sub_lists sl on li.list_id = sl.id
JOIN lists l on sl.parent_id = l.id;
The process of converting the result rows depends on whether you use an ORM or a plain MySQL client.
You could also store it as JSON, as the deleted answer suggested, but then you won't be able to query specific items without selecting and parsing all the lists. You could also use MySQL's JSON column type, but in your case having separate tables seems to be better.
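To make the three-table layout concrete, here is a rough sketch of the DDL the query above assumes. The names match the query; the column types are guesses, and putting pos_x/pos_y on sub_lists is just one way to hold the pos values from your structure:
CREATE TABLE lists (
  id INT AUTO_INCREMENT PRIMARY KEY
);

CREATE TABLE sub_lists (
  id INT AUTO_INCREMENT PRIMARY KEY,
  parent_id INT NOT NULL,
  pos_x INT,
  pos_y INT,
  FOREIGN KEY (parent_id) REFERENCES lists(id)
);

CREATE TABLE li_items (
  id INT AUTO_INCREMENT PRIMARY KEY,
  list_id INT NOT NULL,
  x INT,
  y INT,
  FOREIGN KEY (list_id) REFERENCES sub_lists(id)
);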
I've searched the DataTables documentation for how one can add data to the table here
https://datatables.net/examples/data_sources/index.html but nowhere could I find a way to insert data into a single cell for a given column. For example, I need something like:
"columns": [ {target (0), "value which will be inserted"} ]
There is a way to insert data from an array, but each section of the array contains values for the whole row (see here https://datatables.net/examples/data_sources/js_array.html), whereas I need to insert data into different cells depending on the column, since I don't know the column labels initially. This is because the data will be in JSON format: first I need to extract the unique dates from the JSON objects, and these become my column headers. Then, for each object, based on its date, I need to put it into the relevant date column. So the logic should be something like:
if this date column (from the table) == json object date then put it there
Thanks
After a lot of research it seems this feature is not available in DataTables, so I've solved the problem with jQuery. Using the code below I can iterate through a column and insert values into it.
// change 1 to the index of the column at hand
$("#table tr > :nth-child(1)").each(function(index) {
  // skip the column header
  if (index > 0) {
    // insert the value
    $(this).html("<span class='fixed-size-square'>value to be inserted</span>");
  }
});
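For the date-matching part of the question, a rough sketch of the same idea: read the header text of each column, find the one whose label equals the object's date, and write into that cell. The data shape ({date: ..., value: ...}) and the #table id are assumptions, not anything from DataTables itself:
// collect the header labels once
var headers = $("#table th").map(function () { return $(this).text(); }).get();

data.forEach(function (item) {            // data: parsed JSON, e.g. [{date: "2015-01-01", value: 42}, ...]
  var col = headers.indexOf(item.date);   // which column matches this object's date
  if (col !== -1) {
    // write into that cell of the first data row (adjust the row selector as needed)
    $("#table tr").eq(1).children().eq(col).html(item.value);
  }
});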
I am struggling to come up with a query for an SQLite table to UPDATE OR INSERT at a given rowid. The problem is that my database relies on the rowid, but when populating the table there may not be enough rows yet. So the table may look like this:
rowID  data1  data2  data3
-----  -----  -----  -----
0      18     1543
1      5
2      35     918
For my query I want to be able to insert data2 = 453 at rowID = 16, for example. Also, just to clarify, I do not have an explicit rowID column.
I am thinking the only way would be to insert empty rows for rowID 3 through 15 before inserting the 16th row. It does not matter to me if rows are empty, as they will eventually be populated. I will not be working with more than 50 rows or more than 8 columns, so it is a reasonably small table. Does anyone know a way forward?
One additional thing to note: I am running the query from Titanium, so the programming language is JavaScript.
The documentation says:
The rowid value can be accessed using one of the special case-independent names "rowid", "oid", or "_rowid_" in place of a column name.
[...]
An INSERT statement may provide a value to use as the rowid for each row inserted.
So just specify the values you want:
INSERT INTO MyTable(rowid, data2) VALUES(16, 453)
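Since the question mentions UPDATE OR INSERT, note that SQLite's INSERT OR REPLACE also accepts an explicit rowid, and the statement can be run from Titanium's database module. A rough sketch (the database name is made up; REPLACE rewrites the whole row, so other columns of an existing row 16 would be cleared):
var db = Ti.Database.open('mydb');
// creates row 16 if it does not exist, otherwise replaces it
db.execute('INSERT OR REPLACE INTO MyTable (rowid, data2) VALUES (?, ?)', 16, 453);
db.close();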
I have an ajax function which calls a servlet to get a list of products from various web services; the number of products can go up to 100,000. I need to show this list in an HTML table.
I am trying to provide users an interface to filter this list based on several criteria. Currently I am using a simple jQuery plugin to achieve this, but I found it to hog memory and time.
The JavaScript I use basically applies a regex to search and filter rows matching the filtering criteria.
I was thinking of an alternate solution wherein I filter the JSON array returned by my servlet and bind the HTML table to it. Is there a way to achieve this, and if there is, is it more efficient than the regex approach?
Going through up to 100,000 items and checking if they meet your criteria is going to take a while, especially if the criteria might be complex (must be CONDO with 2 OR 3 bedrooms NOT in zip code 12345 and FIREPLACE but not JACUZZI).
Perhaps your servlet could cache the data for the 100,000 items and it could do the filtering, based on criteria posted by the user's browser. It could return, say, "items 1-50 of 12,456 selected from 100,000" and let the user page forward to the next 50 or so, and even select how many items to get back (25, 50, all).
If they select "all" before narrowing down the number very far, then a halfway observant user will expect it to take a while to load.
In other words, don't even TRY to manage the 100,000 items in the browser, let the server do it.
1. User enters filter and hits search.
2. Ajax call to database; the database has indexes on appropriate columns and the database does the filtering.
3. Database returns result.
4. Show result in table. (Probably want it to be paged to only show 100-1000 rows at a time, because 100,000 rows in a table can really slow down your browser.)
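For step 2, the server-side query would be something along these lines; the products table and its columns are made up here to match the criteria example above:
-- "items 1-50 of N selected": filter in the database, page the result
SELECT *
FROM products
WHERE type = 'CONDO'
  AND bedrooms IN (2, 3)
  AND zip <> '12345'
  AND has_fireplace = 1
  AND has_jacuzzi = 0
ORDER BY id
LIMIT 50 OFFSET 0;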
Edit: Since you don't have a database, the best you're going to be able to do is run the regex over the JSON dataset and add the matching results to the table. You'll want to save the JSON dataset in a variable in case they change the search. (I'm assuming that right now you're adding everything to the table and then using the jQuery table plugin to filter it.)
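A rough sketch of that approach: keep the dataset in a variable and rebuild the table body on each search. The element ids and the name field are placeholders:
var dataset = [];                        // filled once from the ajax response

function applyFilter(pattern) {
  var re = new RegExp(pattern, 'i');
  var rows = dataset
    .filter(function (item) { return re.test(item.name); })
    .map(function (item) {
      return '<tr><td>' + item.name + '</td></tr>';
    });
  // replace the body in one go instead of touching 100,000 existing rows
  $('#products tbody').html(rows.join(''));
}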
I'm assuming that by filtering you mean only displaying a subset of the data; and not sorting.
As you populate the data into the table, add classes to each row for everything in that row you want to filter by, e.g.:
<tr class="filter1 filter2 filter3">....
<tr class="filter1 filter3">....
<tr class="filter2">....
<tr class="filter3">....
Then when you want to apply a filter you can do something like:
$('TR:not(.filter1)').hide();
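If the user can switch filters, one way (the #products id is just an example) is to show everything again before hiding the rows that don't match the newly chosen class:
$('#products tr').show();
$('#products tr:not(.filter2)').hide();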
I agree with Berry that 100,000 rows in the browser is a bit of a stretch, but if there's anything that comes close to handling something of that magnitude, it's jOrder. http://github.com/danstocker/jorder
Create a jOrder table based on your JSON and add the most necessary indexes, meaning the ones you absolutely must filter by.
E.g. you have a "Name" field with people's names.
var table = jOrder(json)
.index('name', ['Name'], { sorted: true, ordered: true });
Then, for instance, this is how you select the records where the Name field starts with "John":
var filtered = table.where([{ Name: 'John' }], { mode: jOrder.startof, renumber: true });
Later, if you need paging in your table, just feed the table builder a filtered.slice(...).
If you're getting back xml, you could just use jQuery selection
$('.class', context) where context is your xml response.
From this selection, you could just write the XML to the page and use CSS to style it. That's where I'd start, at least. I'm doing something similar in one of my applications, but my dataset is smaller.
I don't know what you mean by "bind". You can parse the JSON and then use a for loop (or $.each()) to populate either straight HTML or a grid plugin via its insert/add methods.
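For instance, a minimal sketch with $.each(), assuming the parsed JSON is an array of objects with name and price fields (made-up field names):
var html = '';
$.each(data, function (i, item) {        // data: the parsed JSON array
  html += '<tr><td>' + item.name + '</td><td>' + item.price + '</td></tr>';
});
$('#products tbody').html(html);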