MongoDB insert document via shell - JavaScript

I want to perform CRUD operations on my collection. To automate this, I wrote a script that inserts documents into the collection.
The script basically reads the data from a JSON file and inserts it into my database collection.
'use strict';

const fs = require('fs');
const path = require('path');
const { spawn } = require('child_process');

(function () {
  const username = process.env.DB_USERNAME,
    password = process.env.DB_PASSWORD,
    args = process.argv.slice(3),
    log = console.log,
    mongoServer = "mongodb+srv://***server-name***/";

  let database,
    documents,
    collection;

  if (args.length === 2) {
    database = args[0];
    collection = args[1];
    const raw = fs.readFileSync(path.join(__dirname, 'documents.json'));
    documents = JSON.parse(raw);
    log(documents);
    const writeAction = spawn('mongo', [`\"${mongoServer}${database}\" -u ${username} -p ${password} --eval \"db.${collection}.insert(${documents})\" `], { shell: true });
    writeAction.stdout.on('data', data => log((`stout: ${data}`));
    writeAction.stderr.on('data', data => log((`StdErr: ${data}`)));
    writeAction.on('close', (code) => log(`Child process exited with code: ${code}.`));
  } else {
    log('A database and a collection has to be specified!\n In Order:\n 1. Database\n 2. Collection');
    process.exit(1);
  }
})();
When I read the JSON file, the console logs the following:
[ { id: '3685b542-61d5-45da-9580-162dca725966',
mission:
'The American Red Cross prevents and alleviates human suffering in the face of emergencies by mobilizing the power of volunteers and the generosity of donors.',
street1: '2025 E Street, NW',
profile_url:
'https://www.pledgeling.com/organizations/42/american-red-cross' } ]
So the JSON looks fine to me, but when I execute the script it throws this error:
stout: 2019-11-08T18:08:33.901+0100 E QUERY [js] uncaught exception: SyntaxError: missing ] after element list :
@(shell eval):1:23
2019-11-08T18:08:33.901+0100 E - [main] exiting with code -4
Does anyone know how to overcome this error?

It clearly does not see the closing bracket. So, as a first step, try to lose the brackets and test again. Also, do not forget to force 'utf8' encoding:
const raw = fs.readFileSync(path.join(__dirname, 'documents.json'), 'utf8');
documents = JSON.parse(raw);

The line...
writeAction.stdout.on('data', data => log((`stout: ${data}`));
...is missing a closing parenthesis.
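
Beyond the missing parenthesis, note that ${documents} interpolates the parsed JavaScript array back into a string, which yields "[object Object]" rather than JSON text, and that is exactly the kind of input that makes the shell's eval report "missing ] after element list". A minimal sketch of one way to avoid it, re-serializing with JSON.stringify before building the --eval argument (same variables as the script above; the single quotes around --eval assume a POSIX shell, so the double quotes inside the JSON survive):

// hedged sketch: hand the mongo shell valid JSON text, not "[object Object]"
const docsJson = JSON.stringify(documents);
const writeAction = spawn(
  'mongo',
  [`"${mongoServer}${database}" -u ${username} -p ${password} --eval 'db.${collection}.insert(${docsJson})'`],
  { shell: true }
);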

Related

MongoError: BSON field 'insert.documents.0' is the wrong type 'binData', expected type 'object'

I'm using Node 18.7 on Ubuntu. I'm trying to parse a bunch of CSV files to objects and load them into MongoDB, following https://github.com/AbdullahAli/node-stream-to-mongo-db, example 4 (Stream from a local file). I have created the collection in the local db.
MongoError: BSON field 'insert.documents.0' is the wrong type 'binData', expected type 'object'
My code:
const csv = require('csvtojson');
const streamToMongoDB = require('stream-to-mongo-db').streamToMongoDB;
const fs = require('fs'); // needed below for createReadStream/createWriteStream

// where the data will end up
// const outputDBConfig = { dbURL : 'mongodb://localhost:27017/streamToMongoDB', collection : 'devTestOutput' };
const outputDBConfig = { dbURL: 'mongodb://localhost:27017/local', collection: 'devTestOutput' };

// create the writable stream
const writableStream = streamToMongoDB(outputDBConfig);

const main = async () => {
  // `path` here is the location of the input CSV (defined elsewhere)
  const readFileStream = fs.createReadStream(path);
  const writeFileStream = fs.createWriteStream(__dirname + '/file2');
  await readFileStream
    .pipe(csv(), { objectMode: true })
    .on('data', x => console.log(x.toString()))
    .on('error', err => console.log(err))
    .pipe(writableStream);
  console.log('finish');
};

main();
When I run index.js I get:
MongoError: BSON field 'insert.documents.0' is the wrong type 'binData', expected type 'object'
What am I doing wrong?
Based on https://github.com/Keyang/node-csvtojson#api:
const csv=require('csvtojson')
const converter=csv(parserParameters, streamOptions)
The stream options go inside the csv() call. After I saw this, I was able to change the code to:
await readFileStream.pipe(csv({delimiter:',',ignoreEmpty:true},{ objectMode: true })).on('data', data => console.log(data))
and it started working.
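
For reference, a minimal sketch of the whole corrected pipeline under the same assumptions (local MongoDB, csvtojson, stream-to-mongo-db); csvPath is a hypothetical placeholder for the input file:

const fs = require('fs');
const csv = require('csvtojson');
const { streamToMongoDB } = require('stream-to-mongo-db');

// writable stream that inserts each parsed row into the target collection
const writableStream = streamToMongoDB({
  dbURL: 'mongodb://localhost:27017/local',
  collection: 'devTestOutput'
});

const csvPath = './input.csv'; // hypothetical input file
fs.createReadStream(csvPath)
  .pipe(csv({ delimiter: ',', ignoreEmpty: true }, { objectMode: true })) // stream options go here
  .pipe(writableStream)
  .on('finish', () => console.log('finish'))
  .on('error', err => console.log(err));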

SyntaxError: Unexpected token { in JSON at position 101033 at JSON.parse (<anonymous>) fix?

So I am trying to complete the open source mission in TwilioQuest, but I can't, because the terminal gives me an error when I run this command:
git commit -m "feat(pixels): add my new pixel"
It tells me there is an error in one of the scripts, which is this one, written in JavaScript:
const fs = require('fs');
const path = require('path');
const { promisify } = require('util');
const { sortPixels, pixelsToString } = require('../utils/pixels-helper');

const readFile = promisify(fs.readFile);
const writeFile = promisify(fs.writeFile);
const pixelFilePath = path.join('_data', 'pixels.json');

readFile(pixelFilePath, { encoding: 'utf8' })
  .then(pixelFileData => {
    const pixels = JSON.parse(pixelFileData);
    const sortedPixelString = pixelsToString(sortPixels(pixels));
    writeFile(pixelFilePath, sortedPixelString);
  })
  .catch(console.log);
The error it gives me for this script is this:
SyntaxError: Unexpected token { in JSON at position 101033
at JSON.parse (<anonymous>)
8 | const filePath = path.resolve(__dirname, '../_data/', dataJsonFile);
9 | const pixelJsonString = await readFile(filePath, 'utf8');
> 10 | return JSON.parse(pixelJsonString);
| ^
11 | }
12 |
13 | describe('pixels', () => {
What is this error about and how can I fix it?
Regarding the git side of this: the error is triggered by a custom script, most probably .git/hooks/pre-commit.
You may open .git/hooks/pre-commit in an editor to see how your code is tested, and run .git/hooks/pre-commit in your terminal to see whether the test passes or fails.
Regarding JavaScript debugging: use your regular tools and methods to debug.
For example:
you may add console.log(filePath) before calling JSON.parse to see which file was opened, or console.log(pixelJsonString) to see what is returned from your function,
you may use a node debugger to follow your program's execution
...
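
As a concrete instance of that advice, a small sketch that prints the text around the offset reported in the error (101033), so you can see which token in _data/pixels.json is malformed; a missing comma or stray brace between pixel entries would show up right there:

// prints ~80 characters on either side of the offset JSON.parse complained about
const fs = require('fs');
const raw = fs.readFileSync('_data/pixels.json', 'utf8');
const pos = 101033; // position from the SyntaxError message
console.log(raw.slice(Math.max(0, pos - 80), pos + 80));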

GCloud Function with Puppeteer - 'Process exited with code 16' error

Having trouble finding any documentation or cause for this sort of issue. I'm trying to run a headless Chrome browser script that pulls the current song playing from kexp.org and returns it as a JSON object. Testing with the npm package @google-cloud/functions-framework does return the correct response, but when deployed to GCloud I receive the following error when hitting the API trigger:
Error: could not handle the request
Error: Process exited with code 16
at process.on.code (invoker.js:396)
at process.emit (events.js:198)
at process.EventEmitter.emit (domain.js:448)
at process.exit (per_thread.js:168)
at logAndSendError (/workspace/node_modules/@google-cloud/functions-framework/build/src/invoker.js:184)
at process.on.err (invoker.js:393)
at process.emit (events.js:198)
at process.EventEmitter.emit (domain.js:448)
at emitPromiseRejectionWarnings (internal/process/promises.js:140)
at process._tickCallback (next_tick.js:69)
Full Script:
const puppeteer = require('puppeteer');

let browserPromise = puppeteer.launch({
  args: [
    '--no-sandbox'
  ]
});

exports.getkexp = async (req, res) => {
  const browser = await browserPromise;
  const context = await browser.createIncognitoBrowserContext();
  const page = await context.newPage();
  try {
    const url = 'https://www.kexp.org/';
    await page.goto(url);
    await page.waitFor('.Player-meta');
    let content = await page.evaluate(() => {
      // finds elements by data attribute and maps to an array
      // note: needs map because puppeteer needs a serialized element
      let player = [...document.querySelectorAll('[data-player-meta]')].map((player) =>
        // cleans up and removes empty strings from the array
        player.innerHTML.trim());
      // creates an object and removes empty strings
      player = { ...player.filter(n => n) };
      let songList = {
        "show": player[0],
        "artist": player[1],
        "song": player[2].substring(2),
        "album": player[3]
      };
      return songList;
    });
    context.close();
    res.set('Content-Type', 'application/json');
    res.status(200).send(content);
  } catch (e) {
    console.log('error occurred: ' + e);
    context.close();
    res.set('Content-Type', 'application/json');
    res.status(200).send({
      "error": "occurred"
    });
  }
};
Is there documentation for this error type? The function was deployed to GCloud via the CLI with the following parameters:
gcloud functions deploy getkexp --trigger-http --runtime=nodejs10 --memory=1024mb
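
No answer is recorded for this question here, but the stack trace itself narrows things down: the emitPromiseRejectionWarnings and logAndSendError frames indicate the function died on an unhandled promise rejection, which the Functions Framework surfaces as "Process exited with code 16". A defensive sketch, under the assumption that the rejection escapes from a puppeteer call outside the try block (for instance the launch promise awaited before it); this is a sketch, not a confirmed fix for this deployment:

exports.getkexp = async (req, res) => {
  let browser;
  try {
    // launching inside the handler keeps a launch failure inside the catch
    browser = await puppeteer.launch({ args: ['--no-sandbox'] });
    const page = await browser.newPage();
    await page.goto('https://www.kexp.org/');
    await page.waitFor('.Player-meta');
    const content = await page.evaluate(() => {
      /* same scraping code as above */
    });
    res.set('Content-Type', 'application/json');
    res.status(200).send(content);
  } catch (e) {
    console.log('error occurred: ' + e);
    res.status(500).send({ "error": String(e) });
  } finally {
    if (browser) await browser.close(); // close even when the scrape fails
  }
};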

Error: ENOENT: no such file or directory even when file exists in Firebase Cloud Functions

I'm relatively new to Cloud Functions and have been trying to solve this issue for a while. Essentially, the function I'm trying to write is called whenever an upload to Firebase Cloud Storage completes. However, about half the time it runs, it fails with the following error:
The following error occurred: { Error: ENOENT: no such file or directory, open '/tmp/dataprocessing/thisisthefilethatiswritten.zip'
errno: -2,
code: 'ENOENT',
syscall: 'open',
path: '/tmp/dataprocessing/thisisthefilethatiswritten.zip' }
Here's the code:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const inspect = require('util').inspect;
const path = require('path');
const os = require('os');
const fs = require('fs-extra');

const firestore = admin.firestore();
const storage = admin.storage();

const runtimeOpts = {
  timeoutSeconds: 540,
  memory: '2GB'
};

const uploadprocessing = functions.runWith(runtimeOpts).storage.object().onFinalize(async (object) => {
  const filePath = object.name;
  const fileBucket = object.bucket;
  const bucket_fileName = path.basename(filePath);
  const uid = bucket_fileName.match('.+?(?=_)');
  const original_filename = bucket_fileName.split('_').pop();
  const bucket = storage.bucket(fileBucket);
  const workingDir = path.join(os.tmpdir(), 'dataprocessing/');
  const tempFilePath = path.join(workingDir, original_filename);

  await fs.ensureDir(workingDir);
  await bucket.file(filePath).download({ destination: tempFilePath });

  // this particular code block I included because I was worried that the file wasn't
  // being written to the tmp directory, but the results of the function
  // seem to confirm to me that the file does exist
  await fs.ensureFile(tempFilePath);
  console.log('success!');
  fs.readdirSync(workingDir).forEach(file => {
    console.log('file found: ', file);
  });
  console.log('post success');
  fs.readdirSync('/tmp/dataprocessing').forEach(file => {
    console.log('tmp file found: ', file);
  });

  fs.readFile(tempFilePath, function (err, buffer) {
    if (!err) {
      // data processing comes here. Please note that half the time it never gets into this
      // branch; instead it goes into the else branch below and outputs that error.
    } else {
      console.log("The following error occurred: ", err);
    }
  });
  fs.unlinkSync(tempFilePath);
  return;
});

module.exports = uploadprocessing;
I've been trying so many different things, and the weird part is that when I add code inside the if (!err) branch (which doesn't actually run, because of the err), the function sometimes just starts working quite consistently, then stops working again when I add different code. I would have assumed the issue came from the code I added, but the error also shows up when I literally just change, add, or remove comments, which should have no effect on the function at all...
Any thoughts? Thank you in advance!!! :)
fs.readFile is asynchronous and returns immediately; your callback is invoked some time later with the contents of the buffer. That means fs.unlinkSync is going to delete the file at the same time it's being read: you effectively have a race condition, and it's possible for the file to be removed before it's ever read.
Your code should wait until the read is complete before moving on to the delete. Perhaps you want to use fs.readFileSync instead.
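
A minimal sketch of that suggestion applied to the question's code (the synchronous variant; await fs.promises.readFile would work just as well inside the async handler):

// read synchronously so the unlink below cannot race the read
const buffer = fs.readFileSync(tempFilePath);
// ...data processing with buffer goes here...
fs.unlinkSync(tempFilePath); // safe now: the read has already completed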

How to manually create a Git blob object and then read the contents using Node.js, like in the Pro Git chapter on Git internals?

I read the chapter called "Git Internals - Git Objects" in the Pro Git book.
The final part, entitled "Object Storage", shows how you can manually create a Git blob object and then read the contents of that object back. This is shown using Ruby.
I tried to do the same thing in Node.
First I created a directory called my-git-tests and ran git init in it. In it I created one JavaScript file called s.js, analogous to the Ruby commands in the chapter; here it is:
const crypto = require('crypto');
const path = require('path');
const fs = require('fs');
const zlib = require('zlib');

const content = 'what is up, doc?';
const header = `blob ${Buffer.from(content).length}\0`;
console.log('Header', header.length, header);

const store = header + content;
console.log('Store is ', store);

const hash = crypto.createHash('sha1');
const sha1 = hash.update(store, 'utf-8').digest('hex');
console.log('SHA-1 is ', sha1);

const objectPath = `.git/objects/${sha1.substr(0, 2)}/${sha1.substr(2)}`;
console.log('Path is ', objectPath);
fs.mkdirSync(path.dirname(objectPath));

let zlibCompress;
zlib.deflate(store, (err, buffer) => {
  if (!err) {
    zlibCompress = buffer.toString('base64');
    console.log('zlib: ', zlibCompress);
    fs.writeFile(objectPath, zlibCompress, function (err) {
      if (err) {
        console.log(err);
      }
      console.log('saved');
    });
  } else {
    console.log('Error compressing.');
  }
});
When I run this script, the output is
Header 8 blob 16
Store is blob 16what is up, doc?
SHA-1 is bd9dbf5aae1a3862dd1526723246b20206e5fc37
Path is .git/objects/bd/9dbf5aae1a3862dd1526723246b20206e5fc37
zlib: eJwFwYEBACAEBMCV8kKNQ8/+I3RXvKyxzJbU4yDF4AHF9sLC8rZ5Gh/tqwrk
saved
However, when I try to read the Git object:
git cat-file -p bd9dbf5aae1a3862dd1526723246b20206e5fc37
I get
error: inflate: data stream error (incorrect header check)
error: unable to unpack bd9dbf5aae1a3862dd1526723246b20206e5fc37 header
fatal: Not a valid object name bd9dbf5aae1a3862dd1526723246b20206e5fc37
I'm not sure what I am doing wrong here.
Don't use base64.
Replace zlibCompress = buffer.toString('base64'); with zlibCompress = buffer;
git cat-file will read this perfectly fine.
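
Applied to the question's script, the corrected tail would write the raw deflated Buffer (a sketch; everything before the deflate call stays unchanged):

zlib.deflate(store, (err, buffer) => {
  if (err) return console.log('Error compressing.');
  // write the compressed bytes directly; no base64 round-trip
  fs.writeFile(objectPath, buffer, function (err) {
    if (err) return console.log(err);
    console.log('saved');
  });
});

After rewriting the object, git cat-file -p bd9dbf5aae1a3862dd1526723246b20206e5fc37 should print what is up, doc? as expected.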
