PostgreSQL: INSERT Error while reading data from CSV File - javascript

I am inserting values from a CSV file into a PostgreSQL table. The code used to work fine earlier, but now that I'm on my local machine it fails despite many different attempts.
const query =
  "INSERT INTO questions VALUES (DEFAULT,$1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12)";
questionData.forEach((row) => {
  questions.push(
    db.query(query, row).catch((err) => {
      console.log(err);
    })
  );
});
This is my insertion logic. questionData just holds every row of the CSV file, and questions is the array of promises that I Promise.all() at the end.
The error I get is in this link
I am going crazy trying to fix this. I have changed absolutely nothing in the backend, and my CSV files have only 12 rows, which are the ones I'm trying to insert.
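For context, this is roughly how those promises are awaited afterwards (a minimal sketch; questions is the array of query promises built above):
Promise.all(questions)
  .then(() => console.log("all rows inserted"))
  .catch((err) => console.log(err));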
Edit:
What is 5+3,qwerty,mcq,chemistry,cat,cat,easy,2001,FALSE,nah,{8},"{7,2,8,3}"
What is 5+4,qwerty,mcq,maths,cat,cat,easy,2002,FALSE,nah,{9},"{7,9,5,3}"
What is 5+5,qwerty,mcq,physics,cat,cat,easy,2003,FALSE,nah,{10},"{7,2,10,3}"
What is 5+6,qwerty,mcq,chemistry,cat,cat,easy,2004,FALSE,nah,{11},"{11,2,5,3}"
What is 5+7,qwerty,mcq,maths,cat,cat,easy,2005,FALSE,nah,{12},"{7,2,12,3}"
What is 5+8,qwerty,mcq,physics,cat,cat,easy,2006,FALSE,nah,{13},"{13,2,5,3}"
What is 5+9,qwerty,mcq,chemistry,cat,cat,easy,2007,FALSE,nah,{14},"{7,14,5,3}"
What is 5+10,qwerty,mcq,maths,cat,cat,easy,2008,FALSE,nah,{15},"{7,2,15,3}"
This is my CSV

The error states that you're trying to insert more values than there are columns in your table.
I see your insert statement has DEFAULT as the first value... what is that about?
If your target table has only 12 columns, then you should be inserting the following:
INSERT INTO questions VALUES ($1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12)
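If the table does have an auto-generated id column in addition to the twelve data columns, another option is to name the target columns explicitly so the number of values always matches. A minimal sketch with hypothetical column names (adjust them to the real questions table definition):
// Hypothetical column names; the id column is assumed to be SERIAL/IDENTITY
// and is omitted so PostgreSQL fills it in.
const query =
  "INSERT INTO questions " +
  "(question, answer, type, subject, tag1, tag2, difficulty, marks, is_active, note, correct, options) " +
  "VALUES ($1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12)";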

Related

MongoImport csv combine/concat various columns to one array for import

I have another interesting case that I have never faced before, so I'm asking the SO community for help and also sharing my experience with it.
The case || What we have:
A CSV file (exported from another SQL DB) with the following structure
(headers):
ID,SpellID,Reagent[0],Reagent[1..6],Reagent[7],ReagentCount[0],ReagentCount[1..6],ReagentCount[7]
You can also check a full .csv data file here, at my
dropbox
My gist on GitHub, which helps you understand how MongoImport works.
What we need:
I'd like to receive the following structure (schema) to import into a MongoDB collection:
ID(Number),SpellID(Number),Reagent(Array),ReagentCount(Array)
6,898,[878],[1]
with ID, SpellID, and two arrays: in the first we store all Reagent IDs, like [0,1,2,3,4,5,6,7], from all the Reagent[n] columns, and the second array has the same length and represents the quantity of each Reagent ID, taken from all the ReagentCount[n] columns.
OR
Transposed objects with the following structure (schema):
ID(Number),SpellID(Number),ReagentID(Number),Quantity/Count(Number)
80,2675,1,2
80,2675,134,15
80,2675,14,45
As you may see, the difference between the first example and this one is that every document in the collection represents a single ReagentID and its quantity for a SpellID. So if one Spell_ID has N different reagents, there will be N documents in the collection; we know there can't be more than 7 unique Reagent_IDs belonging to one Spell_ID according to our .csv file.
I am working on this problem right now with the help of Node.js and npm i csv (or any other module for parsing CSV files), just to make my CSV file available for importing into my DB via mongoose. I'll be very thankful to anyone who can provide a relevant contribution to this case. Either way, I will solve this problem eventually and share my solution in this question.
As for the first variant, I guess there should be a one-time script for MongoImport that could concat all the Reagent[n] and ReagentCount[n] columns into two separate arrays, like I mentioned above, via --fields, but unfortunately I don't know how, and there are no relevant examples on SO or in the official Mongo docs. So if you have enough experience with MongoImport, feel free to share it.
Finally I solved my problem the way I wanted to, but without using mongoimport.
I used npm i csv and wrote a function for parsing my CSV file. In short:
const fs = require('fs');
const csv = require('csv');

async function FuncName (path) {
  try {
    let eva = fs.readFileSync(path, 'utf8');
    csv.parse(eva, async function (err, data) {
      // console.log(data[0]); prints the headers, if they exist
      for (let i = 1; i < data.length; i++) { // start from 1 because row 0 holds the headers; without headers, start from 0
        console.log(data[i][34]); // where i is the row number and 34 is a column index
      }
    });
  } catch (err) {
    console.log(err);
  }
}
It loops over the CSV file and exposes the data as arrays, which lets you operate on it however you want.
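To get from those raw rows to the first target shape, something like the sketch below works. It assumes the column order from the header above: ID, SpellID, then eight Reagent[n] columns, then eight ReagentCount[n] columns; the exact indices are assumptions, not verified against the real file.
function rowToDoc(row) {
  // Turn one parsed CSV row into { ID, SpellID, Reagent: [...], ReagentCount: [...] }.
  const reagents = [];
  const counts = [];
  for (let j = 0; j < 8; j++) {
    const reagent = Number(row[2 + j]);   // Reagent[0..7]
    if (reagent !== 0) {                  // skip empty reagent slots
      reagents.push(reagent);
      counts.push(Number(row[10 + j]));   // ReagentCount[0..7]
    }
  }
  return {
    ID: Number(row[0]),
    SpellID: Number(row[1]),
    Reagent: reagents,
    ReagentCount: counts
  };
}
Each resulting object can then be saved through a mongoose model or collected and inserted in bulk.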

Reading data from sqlite using sql.js

I am new to JavaScript; it might seem a stupid question, but I have tried a lot to get the right result and still fail... I have to read and write data in an SQLite file, and I found this library, which the developer describes as a port: https://github.com/lovasoa/sql.js. Please see my code below:
First, I am selecting a local file via this HTML
<input type="file" id="check" onchange="working(this.files)">
The js file behind this is:
function working(data) {
  // adding database
  var sql = window.SQL;
  var db = new sql.Database(data);
  // writing a query
  // read existing table called propertystring
  var query = "SELECT * FROM propertystring";
  var result = db.exec(query);
  console.log(result);
}
I am trying to debug using the Google Chrome console, and it's saying
Uncaught Error: no such table: propertystring
However, I have created this table in my SQLite database and it shows up in SQLiteman. The description of the table is below:
-- Describe PROPERTYSTRING
CREATE TABLE "propertystring" (
"sValue" TEXT,
"TypeID" TEXT
);
INSERT INTO propertystring (sValue, TypeID) VALUES ("yes its working", "71");
I hope it's a very simple thing and anyone using sql.js can easily answer me.
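For what it's worth, sql.Database expects the raw bytes of the database file rather than the FileList from the input element, so the selected file has to be read first. A minimal sketch, assuming sql.js is already loaded on window.SQL:
function working(files) {
  var reader = new FileReader();
  reader.onload = function () {
    // Build the database from the file's bytes, then query it.
    var db = new window.SQL.Database(new Uint8Array(reader.result));
    var result = db.exec("SELECT * FROM propertystring");
    console.log(result);
  };
  reader.readAsArrayBuffer(files[0]);
}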

MongoDb bulk insert limit issue

I'm new to Mongo and Node. I was trying to upload a CSV into MongoDB.
Steps include:
Reading the CSV.
Converting it into JSON.
Pushing it to MongoDB.
I used the 'csvtojson' module to convert CSV to JSON and pushed it using this code:
MongoClient.connect('mongodb://127.0.0.1/test', function (err, db) { // connect to mongodb
  var collection = db.collection('qr');
  collection.insert(jsonObj.csvRows, function (err, result) {
    console.log(JSON.stringify(result));
    console.log(JSON.stringify(err));
  });
  console.log("successfully connected to the database");
  //db.close();
});
This code works fine with CSVs up to 4 MB in size; beyond that it doesn't work.
I tried to log the error with
console.log(JSON.stringify(err));
and it returned {}
Note: Mine is a 32-bit system.
Is it because there is a document limit of 4 MB on 32-bit systems?
I'm in a scenario where I can't restrict the size and number of attributes in the CSV file (i.e., the code will be handling various kinds of CSV files). So how do I handle that? Are there any modules available?
If you are not having a problem parsing the CSV into JSON, which presumably you are not, then perhaps just restrict the list size being passed to insert.
As I can see, the .csvRows element is an array, so rather than sending all of the elements at once, slice it up and batch the elements in the call to insert. It seems likely that the number of elements is the cause of the problem rather than the size; splitting the array up into a few inserts rather than one should help.
Experiment with 500, then 1000, and so on until you find a happy medium.
Sort of coding it:
var batchSize = 500;
for (var i = 0; i < jsonObj.csvRows.length; i += batchSize) {
  var docs = jsonObj.csvRows.slice(i, i + batchSize); // slice's end index is exclusive, so this takes batchSize rows
  collection.insert(docs, function (err, result) {
    // Also don't JSON convert a *string*
    console.log(err);
    // Whatever
  });
}
And doing it in chunks like this.
You can arrange the data as an array of elements, and then simply use the MongoDB insert function, passing that array to it.

How to load related entity of external data source in Lightswitch (Visual Studio 2013)

I have 2 tables, both in an Azure SQL Database that is connected to my LightSwitch SharePoint app. I am doing some manipulation of the data in code, and it appears to be working, except that when I load the entities from one table, I am not able to see the related entities in the other.
Basically, I have a products table and an invoice lines table. Each invoice line record contains a product code, which relates to the products table's PK. I have defined the relationship in LightSwitch, but when I load the invoice line record, I can't see the product information.
My code is as follows:
// Select invoice and get products
myapp.AddEditServiceRecord.InvoicesByCustomer_ItemTap_execute = function (screen) {
  screen.ServiceRecord.InvoiceNumber = screen.InvoicesByCustomer.selectedItem.INVO_NO;
  // Delete existing lines (if any)
  screen.ServiceDetails.data.forEach(function (line) {
    line.deleteEntity();
  });
  // Add products for selected invoice
  screen.getInvoiceLinesByNumber().then(function (invLines) {
    invLines.data.forEach(function (invLine) {
      invLine.getProduct().then(function (invProduct) {
        var newLine = new myapp.ServiceDetail();
        newLine.ServiceRecord = screen.ServiceRecord;
        newLine.ProductCode = invLine.ProductCode;
        newLine.ProductDescription = invProduct.Description;
        newLine.CasesOrdered = invLine.Cases;
      });
    });
  });
};
The idea is that a list of invoices is on the screen 'InvoicesByCustomer', and the user taps one to add the details of that invoice to the 'ServiceRecord' table. If I comment out the newLine.ProductDescription = invProduct.Description line, it works perfectly in adding the correct product codes and cases values. I have also tried a few other combinations of the above code, but in each case the related product entity appears as undefined in the JavaScript debugger.
EDIT: I also read this article on including related data (http://blogs.msdn.com/b/bethmassi/archive/2012/05/29/lightswitch-tips-amp-tricks-on-query-performance.aspx) and noticed the section on 'Static Spans'. I checked and this was set to 'Auto (Excluded)' so I changed it to 'Included', but unfortunately this made no difference. I'm still getting the invProduct is undefined message. I also tried simply invLine.Product.Description but it gives the same error.
The solution in this case was a simple one. My data was wrong, and therefore LightSwitch was doing its job correctly!
In my Invoices table, the product code was something like 'A123', whereas in my Products table the product code was 'A123   ' (padded with spaces on the right). When doing SQL queries against the data, SQL was able to match the records, but LightSwitch (correctly) saw the two fields as different and so could not relate them.
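The underlying comparison difference is easy to reproduce in plain JavaScript (an illustrative sketch, not LightSwitch-specific):
// A SQL equality comparison typically ignores the trailing padding,
// but an exact string comparison does not:
'A123' === 'A123   '          // false
'A123' === 'A123   '.trim()   // true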
I may have wasted several hours on this, but it's not wasted when something has been learnt...or so I'll tell myself!

Get full data set, sorted with YUI Data Table with Pagination

I hope I am describing my issue well enough... here goes:
I have a YUI DataTable that gets a server-side set of records via JSON and then populates the data.
Users can click on the headers to sort the data in three of the six columns (each of which uses a custom sort function). The sorting is done client-side.
When a user sorts the data, I need to be able to get a complete list of the values from one of the columns being shown. I need all the data available, not just what's rendered to the page. The data hidden via pagination must be included.
Any ideas? I've tried the handleDataReturnPayload and doBeforeLoadData methods of the DataTable but both give the original, unsorted data.
I'm really stuck here and I've got a client depending on a feature that depends on me getting this sorted list.
Thanks in advance.
Satyam, over at the YUI Forums, answered my question perfectly:
The data is actually stored in the RecordSet. At any time you can go and look at it, and it will be sorted as shown on the screen, but it will have all the data, whether shown or not. Method getRecordset() will give you a reference to it and then you can loop through its records. You can listen to the columnSortEvent to be notified a sort has occurred.
I just subscribed to the columnSortEvent event and looped through the array returned by datatable.getRecordSet().getRecords().
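In code, the approach looks roughly like this (a minimal sketch, assuming a YUI 2 DataTable instance named myDataTable and a hypothetical column key of "price"):
// After each client-side sort, collect the full, sorted column across all pages.
myDataTable.subscribe("columnSortEvent", function () {
  var records = myDataTable.getRecordSet().getRecords();
  var values = [];
  for (var i = 0; i < records.length; i++) {
    values.push(records[i].getData("price"));
  }
  console.log(values); // every row in the current sort order, not just the visible page
});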
I'd recommend posting this to the YUI forums -- http://yuilibrary.com/forum/ -- that's a great place to get support on DataTable issues.
I stumbled upon this question looking for information on how to retrieve data from a DataSource that was NOT displayed in the DataTable. If you place the hidden data in the DataSource before any field you wish to display, it will be rendered blank, but if you place it after the last field that will be rendered (as defined by the columns object), then it will not render but will still be accessible through the record.
var columns = [
  { key: "Whatchamacallits", children: [
    { key: "name" },
    { key: "price" }
  ]}
];
var oDataSource = new YAHOO.util.DataSource('index.php...');
oDataSource.responseType = YAHOO.util.DataSource.TYPE_JSARRAY;
oDataSource.responseSchema = {
  fields: ["name", "price", "id"] // "id" is not in columns, so it is never rendered but stays on the record
};
var dt = new YAHOO.widget.DataTable("dt-id", columns, oDataSource, {});
dt.subscribe("rowClickEvent", dt.onEventSelectRow);
dt.subscribe("rowSelectEvent", function (p) { alert(p.getData('id')); });
