Live HTML table source updated by multiple users - JavaScript

Sorry for the strange question, but I've been stuck on this for a few days now.
We're trying to connect all of our sales from multiple locations and query our warehouse quantities.
Unfortunately it's too close to Christmas to make POS software upgrades, so I've been tasked with writing the middleware.
Currently, I'm copying all the .mdb files from all shopfronts, running LINQ to join warehouse quantities, and auto-generating an HTML table, all in a C# WinForms app in Visual Studio.
So, what I'm wondering is: where should I go from here?
Ideally I'd love the final result to be a webpage with an interactive table in which users can remove/hide rows (once an item is picked) and export the final picked/non-picked results, with all of this accessed simultaneously and live-updated on tablet PCs by multiple users who are picking.
I've tried the jQuery DataTables plugin, but unless I'm mistaken it doesn't support simultaneous editing. (I'm very new to jQuery.)
If someone could please point me in the right direction, even if it means I should change database types to SQL, use PHP, or even use carrier pigeons, they will be my hero!

Related

How to transfer values (items) from one NetSuite business system form to another NetSuite form automatically?

Currently, I have to fill out 2 of the same forms with the same language & part numbers on the web-based business system "NetSuite" made by Oracle (extremely annoying & a waste of time). I need a software/code solution to read one form entry and duplicate it to the other automatically; I'm still feeling out the best way to do this and get it to transfer/skim properly.
This is between 2 sister companies. Each value (part) has a different part number linked to it, but internally they cannot be linked, for reporting purposes and to track which company sells what.
One company starts its part numbers with 100XXX-XX and the other starts with 300XXX-XX. Again, they are basically the same parts.
Not sure if Tampermonkey or JavaScript will be able to do this properly, as I don't even know where to start.
Any recommendations or a walkthrough on the best way to do this would be awesome. I know it might be a little hard since it's 2 different item systems.
Maybe just pull the descriptions of the items, since they will be almost the same?
You can create a user event script in the first company and a RESTlet in the second one.
The user event script in the first company will create a JSON object for the item being created, pre-process the changes to the part number (or any other changes required) for the second company, and send it to the second company's RESTlet. The RESTlet will then create the item in the second company.
With this approach you don't need to deal with any 3rd-party application.
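A minimal sketch of the pre-processing step the user event script would do before POSTing the JSON payload to the second company's RESTlet. The mapping rule (100XXX-XX to 300XXX-XX) comes from the question; the function name and payload shape are illustrative assumptions, not NetSuite APIs:

```javascript
// Hedged sketch: build the JSON payload for the sister company's RESTlet.
// The prefix swap (100 -> 300) is from the question; field names are assumptions.
function buildSisterCompanyPayload(item) {
  // Swap the first company's 100 prefix for the sister company's 300 prefix.
  var sisterNumber = item.partNumber.replace(/^100/, "300");
  return {
    partNumber: sisterNumber,
    description: item.description, // descriptions are "almost the same"
    sourceId: item.partNumber      // keep a back-reference for auditing
  };
}
```

In SuiteScript 2.x this logic would typically live in an afterSubmit user event script, with the payload sent to the RESTlet's URL via the N/https module.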

How to handle an extremely big table in a search?

I'm looking for suggestions on how to handle the following use case with the Python Django framework; I'm also open to using JavaScript libraries/AJAX.
I'm working with a pre-existing table/model called revenue_code with over 600 million rows of data.
The user needs to search three fields within one search (code, description, room) and be able to select multiple search results, similar to the Kendo UI multi-select. I first started off by combining the codes in django-filter as shown below, but my application became unresponsive; after waiting 10-15 minutes I was able to view the search results but couldn't select anything.
https://simpleisbetterthancomplex.com/tutorial/2016/11/28/how-to-filter-querysets-dynamically.html
I've also tried Kendo UI, Select2, and Chosen, because I need the user to be able to select as many rev codes as they need, upwards of 10-20, but all gave the same unresponsive page when they attempted to load the data into the control/multi-select.
Essentially what I'm looking for is something like the link below, which allows the user to make multiple selections and will handle a massive amount of data without becoming unresponsive. Ideally I'd like to be able to run my search without displaying all the data.
https://petercuret.com/add-ajax-to-django-without-writing-javascript/
Is the Django framework meant to handle this type of volume? Would it be better to export this data into a file and read the file? I'm not looking for code, just some pointers on how to handle this use case.
What is the basic mechanism of "searching 600 million rows"? Basically, what a database does is build an index before search time, general enough for different types of queries; at search time you then search only the index, which is much smaller (so it can fit in memory) and faster. But no matter what, "searching" by its nature has no "pagination" concept, and if 600 million records cannot fit in memory at the same time, then parts of them have to be swapped in and out repeatedly; the more parts, the slower the operation. These details are hidden behind the algorithms in databases like MySQL.
There are very compact representations, like a bitmap index, which can let you search data such as male/female very fast, or any data where one bit per piece of information is enough.
So whether it's Django or not does not really matter. What matters is the tuning of the database, the design of the tables to facilitate the queries (the types of indexes), and the total amount of memory at the server end to keep the data in memory.
Check this out:
https://dba.stackexchange.com/questions/20335/can-mysql-reasonably-perform-queries-on-billions-of-rows
https://serverfault.com/questions/168247/mysql-working-with-192-trillion-records-yes-192-trillion
How many rows are 'too many' for a MySQL table?
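To make the index idea above concrete, here is a toy sketch, not database code, just an illustration of why searching a pre-built, sorted index beats scanning all the rows:

```javascript
// Toy illustration: a sorted index lets you find a key in O(log n)
// comparisons instead of scanning all n rows. Real databases use
// B-trees and similar structures, but the principle is the same.
function binarySearch(sortedKeys, key) {
  var lo = 0, hi = sortedKeys.length - 1;
  while (lo <= hi) {
    var mid = (lo + hi) >> 1;           // midpoint of the remaining range
    if (sortedKeys[mid] === key) return mid;
    if (sortedKeys[mid] < key) lo = mid + 1;
    else hi = mid - 1;
  }
  return -1; // not found
}
```

With 600 million keys, a scan does up to 600 million comparisons; the binary search above does about 30.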
You can't load all the data into your page at once. 600 million records is too many.
Since you mentioned select2, have a look at their example with pagination.
The trick is to limit your SQL results to maybe 100 or so at a time. When the user scrolls to the bottom of the list, it can automatically load in more.
Send the search query to the server, and do the filtering in SQL (or NoSQL or whatever you use). Database engines are built for that. Don't try filtering/sorting in JS with that many records.
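A minimal sketch of the paging approach described above: the browser only ever asks the server for one small page of matches. The endpoint URL, response shape, and page size are illustrative assumptions, not part of the original question:

```javascript
// Hedged sketch: build the query parameters Select2 sends for each page.
// Select2 hands us params.term and params.page; default to page 1.
var PAGE_SIZE = 100;

function buildSearchParams(term, page) {
  return { q: term || "", page: page || 1, page_size: PAGE_SIZE };
}

// With jQuery and Select2 loaded, wiring the control looks roughly like:
//
// $("#revcode").select2({
//   multiple: true,                // user may pick 10-20 codes
//   ajax: {
//     url: "/api/revcodes/",       // server does the filtering in SQL
//     delay: 250,                  // debounce keystrokes
//     data: function (params) {
//       return buildSearchParams(params.term, params.page);
//     },
//     processResults: function (data) {
//       // assumed server response: { results: [...], has_more: true/false }
//       return { results: data.results,
//                pagination: { more: data.has_more } };
//     }
//   }
// });
```

The `pagination: { more: ... }` flag is what makes Select2 load the next page automatically when the user scrolls to the bottom of the list.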

HTML to search Excel and display Result

I am very new to HTML and JavaScript.
My project is to make a website in HTML that can search an Excel file for user-defined criteria and display the results back on the webpage.
Example: a user searches the ID number #1234, and the website then displays the information (name, location, gender, etc.) that corresponds to that ID number (the ID number is in column A).
The data is in Excel, typically one row per unique ID, and is currently in .xlsx format, but this can be changed.
Sorry for the lengthy question and lack of a code example. I have looked at many different options, but I can't even get those to work.
Any direction or help on this would be greatly appreciated.
In my opinion, this is going to be a monumental task if you are a beginner, especially since you don't know where to start or what to search for on Google; I'd say you should start with a smaller project. If, however, you can convert your data to JSON (assuming you own the data), it will be more approachable.
If you want to work with the Excel data anyway, with JavaScript, there is a parser for spreadsheets (SheetJS) that can help you:
Parser and writer for various spreadsheet formats. Pure-JS cleanroom
implementation from official specifications, related documents, and
test files. Emphasis on parsing and writing robustness, cross-format
feature compatibility with a unified JS representation, and ES3/ES5
browser compatibility back to IE6.
This is the community version. We also offer a pro version with
performance enhancements, additional features by request, and
dedicated support.
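Once the spreadsheet has been parsed into an array of row objects, the lookup itself is simple. This is a hedged sketch: with SheetJS the array would come from something like XLSX.utils.sheet_to_json(workbook.Sheets[workbook.SheetNames[0]]), and the field names (ID, Name, Location, Gender) are assumptions based on the question's example:

```javascript
// Hedged sketch: find the row matching a user-entered ID.
// `rows` is assumed to be an array of plain objects, one per spreadsheet row,
// e.g. produced by SheetJS's sheet_to_json (not shown here).
function findById(rows, id) {
  for (var i = 0; i < rows.length; i++) {
    // Compare as strings so "1234" (from an <input>) matches the number 1234.
    if (String(rows[i].ID) === String(id)) return rows[i];
  }
  return null; // no matching ID
}
```

On the page you would then render the returned row's Name, Location, Gender, etc. into the result area; converting the data to JSON once, as suggested above, avoids re-parsing the .xlsx file on every search.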

Get list of users from Domino Directory in JavaScript [duplicate]

I'm aware that the SSJS version of @DbColumn() has the same 64k limitation as the original Formula language version. So up until now I used NotesView.getColumnValues() instead, believing that there I wouldn't face such a limitation.
Which obviously is wrong, as an urgent support call yesterday showed me, as well as this crash report by IBM.
The code in question is used to populate the selectItems control in a comboBox; opening the page hosting the comboBox crashes the server's HTTP task and, in consequence, the entire server:
<xp:selectItems>
    <xp:this.value><![CDATA[#{javascript:database.getView("vwInvBySupplier").getColumnValues(0);}]]></xp:this.value>
</xp:selectItems>
This is looking up all category entries from a view. I'm using the combo as a dynamic category filter to the view displayed on the same page.
What alternatives are there to retrieve the complete list of all category entries from the view, even if the data retrieved exceed 64k?
Sidenotes:
I'm fully aware that showing over 2000 entries in a comboBox might not be a convincing usability concept for some, but the customer loves being able to see all available entries in one place and then select from that list. In any case, the standard solution, a view panel full of category entries, twisties, and the need to step through numerous pages, is not an option.
The application is running on Domino 9.0.1, Windows Server 2008 64-bit.
Luckily, a JavaScript array is not limited to 64k.
Create an array: var values = [];
"Walk" through the view with a view navigator and add each entry's value to the array with values.push("new value").
Return values.
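The three steps above can be sketched as follows. This is a hedged SSJS-style sketch: `view` is assumed to be a NotesView (e.g. database.getView("vwInvBySupplier")), and getColumnValues() is assumed to return a java.util.Vector, hence elementAt(0):

```javascript
// Hedged sketch: walk a Notes view with a navigator, collecting the first
// column value of every entry into a plain JavaScript array.
function collectFirstColumn(view) {
  var values = [];
  var nav = view.createViewNav();
  var entry = nav.getFirst();
  while (entry != null) {
    values.push(entry.getColumnValues().elementAt(0));
    var next = nav.getNext(entry);
    entry.recycle(); // release the back-end Domino object as you go
    entry = next;
  }
  return values;
}
```

Recycling each entry inside the loop matters here: with thousands of entries, leaving the back-end objects alive is exactly the kind of thing that exhausts the HTTP task's memory.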

Best Practices for displaying large lists

Are there any best practices for returning large lists of orders to users?
Let me try to outline the problem we are trying to solve. We have a list of customers with 1-5,000+ orders associated with each. We pull these orders directly from the database and present them to the user in a paginated grid. The view we have is a very simple "select columns from orders", which worked fine when we were first starting, but as we grow it's causing performance/contention problems. There seem to be a million and one ways to skin this cat (return only a page's worth of data, only return the last 6 months of data, etc.), but as I said, I'm just wondering if there are any resources out there that provide a little more hand-holding on how to solve this problem.
We use SQL Server as our transactional database and select the data out in XML format. We then use a mixture of XSLT and JavaScript to create our grid. We aren't married to the presentation solution, but we are married to the database solution.
My experience.
Always set reasonable default values in the UI. You don't want users clicking "Retrieve" and getting everything.
Set a limit on the number of records that can be returned.
Only return from the database the records you are going to display.
If forward/backward consistency is important, store the entire result set from the query in a temp table and return just the page you need to display. When paging up/down, retrieve the next set from the temp table.
Make sure your indexes cover your queries.
Use different queries for different purposes. Think "Open Orders" vs. "Closed Orders". These might perform much better as separate queries instead of one generic query.
Set parameter defaults in the stored procedures. Protect your query from a UI that does not set reasonable limits.
I wish we did all these things.
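A minimal sketch of the "defaults and limits" points above, applied server-side so that a UI that forgets to set limits can't ask for everything. The names and numbers are illustrative assumptions:

```javascript
// Hedged sketch: sanitize a page request before it reaches the database.
// Defaults and caps are illustrative; tune them to your own data.
var DEFAULT_PAGE_SIZE = 50;
var MAX_PAGE_SIZE = 500;
var DEFAULT_MONTHS_BACK = 6;   // e.g. only show recent orders by default

function sanitizeOrderQuery(params) {
  params = params || {};
  var size = parseInt(params.pageSize, 10);
  if (isNaN(size) || size < 1) size = DEFAULT_PAGE_SIZE;
  if (size > MAX_PAGE_SIZE) size = MAX_PAGE_SIZE; // hard cap, protects the DB
  var page = parseInt(params.page, 10);
  if (isNaN(page) || page < 1) page = 1;
  return {
    page: page,
    pageSize: size,
    monthsBack: params.monthsBack || DEFAULT_MONTHS_BACK
  };
}
```

The same idea belongs in the stored procedures themselves (parameter defaults), so the protection holds even if a caller bypasses this layer.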
I'd recommend doing some profiling to find the actual bottlenecks. Perhaps you have access to Visual Studio Profiler? http://msdn.microsoft.com/en-us/magazine/cc337887.aspx There are plenty of good profilers out there.
Otherwise, my first stop would be pagination, to bring back fewer records from the DB, which is easier on the connection and the memory footprint. Take a look at this (I'm assuming you're on SQL Server >= 2005):
http://www.15seconds.com/issue/070628.htm
I'm not sure from the question exactly what UI problem you are trying to solve.
If it's that the customer can't work with a table that is just one big amorphous blob, then let him sort on the fields: order date, order number, your SKU number, his SKU number maybe, and I guess others, too. He might find it handy to do a multi-column stable sort as well.
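The multi-column stable sort mentioned above can be sketched like this. The trick is to tie-break on the original row index, which keeps the sort stable even where the engine's Array.prototype.sort isn't:

```javascript
// Hedged sketch: sort rows by several fields in priority order,
// preserving the original order of rows that compare equal (stable).
function multiSort(rows, fields) {
  // Decorate each row with its original index for the stability tie-break.
  var decorated = rows.map(function (row, i) { return { row: row, i: i }; });
  decorated.sort(function (a, b) {
    for (var k = 0; k < fields.length; k++) {
      var f = fields[k];
      if (a.row[f] < b.row[f]) return -1;
      if (a.row[f] > b.row[f]) return 1;
    }
    return a.i - b.i; // equal on all fields: keep original order
  });
  return decorated.map(function (d) { return d.row; });
}
```

Called as, say, multiSort(orders, ["orderDate", "orderNumber"]), this gives the customer the "sort by date, then by order number" view without any server round-trip.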
If it's that the table headers scroll up and disappear when he scrolls down through his orders, that's more difficult. Read the SO discussion to see if the method there gives a solution you can use.
There is also a jQuery mechanism for keeping the header within the viewport.
HTH
EDIT: plus, I'll second @Iain's answer: do some profiling.
Another EDIT: @Scott Bruns's answer reminded me that when we started designing the UI, the biggest issue by far was limiting the number of records the user had to look at. So yes, I agree with Scott that you should give the user some way to see only a limited number of records right from the start; that is, before he ever sees a table, he has told you a lot about what he wants to see.
Stupid question, but have you asked the users of your application for input on what records that they would like to see initially?
