I have two PostgreSQL databases running on two servers. To free some space on db1, I would like to move the following tables from db1 to db2: logs_foo, logs_bar, logs_baz.
To keep consistency in our Rails application, I would keep the same table names in db2. Obviously, once the transfer is done, I will delete the three tables on db1. db1 is set as the "primary" DB and db2 will contain only the logs_* tables.
I have already made the changes in my Rails application, and all new logs go into the proper logs_* table.
What is the best way to copy a table from db1 into db2 on two different servers? Should we go with a script? Is it better in Rails or bash? Can I do it in a tool like DBeaver?
https://stackoverflow.com/questions/13595333/how-copy-data-from-one-database-to-another-on-different-server This one talks about an Oracle tool, but it was posted in 2012...
I thought about creating a copy of logs_foo named logs_foo1 in db1 and, after moving logs_foo1 into db2, renaming or copying it into logs_foo. But this approach is time-consuming, and I would lose some logs written in production in the meantime.
I could also copy the whole of db1 as a file, load it into db2, and then delete all the useless data, but I don't think that's the proper way to do it.
You can try using the pg_dump and restore method. Kindly check the commands below.
Export at the table level:
pg_dump -h localhost -U postgres -p 5432 -t table_name database_name > path/table_name.sql
Import the same table on the target server:
psql -h localhost -U postgres -p 5432 database < path/table_name.sql
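For the multi-table, cross-server case in the question, pg_dump also accepts several -t flags, and its output can be piped straight into psql on the target server, so no intermediate file is needed. A sketch (the host names, user, and database name are placeholders), written to a file here so it can be reviewed before running:

```shell
# Save a small migration script; db1.example.com, db2.example.com, mydb
# and the postgres user are all placeholders for your own values.
cat > /tmp/migrate_logs.sh <<'EOF'
#!/bin/sh
set -e  # abort on the first error
pg_dump -h db1.example.com -U postgres -d mydb \
        -t logs_foo -t logs_bar -t logs_baz \
  | psql -h db2.example.com -U postgres -d mydb
EOF
chmod +x /tmp/migrate_logs.sh
```

Only drop the three tables on db1 after verifying that the row counts on db2 match.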
Related
I've recently started work on a project that requires a PostgreSQL database. I've followed numerous guides on setting up a PostgreSQL database on Fedora, and have got my DB up and running. I am able to connect to it using sudo -u postgres psql. The project I am working on has an init-db bash script that sets up all the relations and tables for the database, which I am able to query once I am at the psql prompt in my terminal.
My issue is that no other application or software is able to connect to the database. At first I thought it was an issue with the code left by previous developers, but on further investigation I am unable to connect even using the JetBrains 'Database Tools and SQL' plugin.
I have tried a few things to fix this (listed below), but none have worked:
edited the postgresql.conf file to contain listen_addresses = '*'
edited the pg_hba.conf file to have host all all 0.0.0.0/0
tried connecting to the database via the JetBrains 'Database Tools and SQL' plugin using localhost, my public IP, and my private IP
(yes, I am sure I have the correct login details)
None of the above worked. I have tinkered with this for four hours now and have not gotten it to connect, so any suggestions would be much appreciated. If there is anything I have missed, please ask and I will update the post with the needed information.
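One detail worth double-checking, as a hedged aside: a host line in pg_hba.conf needs a trailing authentication method (the line quoted above has none), and PostgreSQL only picks up a listen_addresses change after a restart (on Fedora, typically sudo systemctl restart postgresql). A complete entry might look like:

```
# TYPE  DATABASE  USER  ADDRESS    METHOD
host    all       all   0.0.0.0/0  scram-sha-256
```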
I have two database servers, one for dev and one for production. What I want to do is create a dump of the production DB and sync it with the dev DB at a specific time, using Node.js.
Postgres provides the pg_dump (single DB) and pg_dumpall (all DBs) utilities out of the box. The simplest approach would probably be a cron job on the source server to dump the DB and scp the file over to the destination server, plus a cron job on the destination server to restore it:
On the source server:
crontab -e
0 0 * * 0 pg_dumpall -U postgres > ~/backups/backup.bak
# scp to your destination server
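A sketch of the full source-side entry once the copy step is added (the destination host, user, and paths are placeholders); the && ensures the dump finished before scp runs, and the scp must use key-based SSH auth since cron cannot type a password:

```
# min hour day month weekday  command
0 0 * * 0  pg_dumpall -U postgres > ~/backups/backup.bak && scp ~/backups/backup.bak deploy@dev-server:~/backups/backup.bak
```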
If you’re working in one of the major cloud environments, they will have their own tools that can help. E.g. on AWS you can automate snapshots and then restore from a snapshot (or have a Lambda perform the restore from the snapshot, as you suggest, in Node.js).
Or (the super-cool way) use AWS DMS (Database Migration Service) with CDC (Change Data Capture): you can replicate a source DB near-instantaneously to one or many target replicas, avoiding the need for dumps and restores.
I assume you meant dump, and googled that for you:
https://www.joseverissimo.com/blog/automatically-export-a-database-and-import-it-to-another-using-node-js
Now you just need to make sure it runs as a cron job.
I have installed MongoDB on my computer. I didn't define any password or username during the installation. I can see everything with Robo 3T.
Now I want to protect my database.
I have tried to set up authentication for the database. I followed https://docs.mongodb.com/manual/tutorial/enable-authentication/. It did not work.
I can still reach MongoDB with Robo 3T and see all the information.
I have also tried to start MongoDB with the --auth parameter. I have defined a configuration file that looks like this:
And to start MongoDB:
mongod -f C:\mongodb\conf\mongodb.conf
MongoDB starts, but it does not ask for any password, and I can still save data with Postman without authentication.
What I want to do:
Protect my database against Robo 3T. :))
I don't want to save any data without auth.
Build a Node.js connection string that includes the password, like
mysql://root:password@localhost:port/dbName
Here is my Node.js index.js code:
This is my model:
PS: I am very new to Node.js and MongoDB.
EDIT: inside the conf file:
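The conf contents are not shown in the post; for reference, a minimal mongod.conf sketch that enforces auth could look like the following (the Windows paths are assumptions). Note that authorization only becomes useful once users exist: create an admin user in the admin database first (db.createUser(...) from the mongo shell), then restart with this security block, after which unauthenticated clients such as Robo 3T can still connect but cannot read or write data.

```yaml
# Minimal mongod.conf sketch; dbPath is an assumption, adjust to your install.
storage:
  dbPath: C:\mongodb\data
net:
  port: 27017
  bindIp: 127.0.0.1
security:
  authorization: enabled
```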
I am studying the MEAN stack using this tutorial. But the tutorial connects to a remote MongoDB installation. I have MongoDB running on the CentOS 7 localhost.
How do I alter the mongoose connect line in server.js from the tutorial link above to connect to a localhost database instead of a remote database?
Here is the current line from server.js that needs to be changed to point to the localhost mongodb:
mongoose.connect('mongodb://node:nodeuser@mongo.onmodulus.net:27017/uwO3mypu');
A specific MongoDB database has not been created yet. Do I need to create that as well?
I'm fairly new at Mongo too, but I know how to connect to a local DB. Basically I had to do the following:
Download the latest version of mongodb from https://www.mongodb.com/download-center?jmp=nav#community (according to your settings)
Once installed, I created a folder that will contain my DB.
Use a command-line instance to start Mongo like this:
mongod --dbpath [YOUR_DB_PATH]
Created a DB: use mydb
With that you should already have a MongoDB instance listening for connections on the default port, so you can change that line to this:
mongoose.connect('mongodb://localhost:27017/mydb');
Again, this is really basic and creates a Mongo DB connection with all the default options. This will keep you rolling, but you may need to dig a bit more for custom options.
Hope this helps
I have a web app built with Node.js, and I want to do some integration testing of my API that will involve hitting a live database.
Is there any easy way to load & execute an SQL dump file to pre-populate the database before running my tests?
You could group the SQL queries that restore your database in whatever teardown or setup events you need.
You might want to use some kind of flag before running the tests that need a clean data structure.
To load the dump, in a terminal:
mysql -u youruser -p yourdatabasename < /path/to/your/dump_file.sql
Then type in the password when prompted.
If you want to create the dump, in a terminal:
mysqldump -u youruser -p yourdatabasename > /path/to/your/dump_file.sql
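Putting the pieces together, a minimal sketch of a pre-test seeding step (the user, database name, dump path, and npm test command are all assumptions), saved to a file here so it can be reviewed before wiring it into your test setup:

```shell
# Save a wrapper script: restore the dump, then run the test suite.
# testuser, testdb, the fixture path and TEST_DB_PASSWORD are placeholders.
cat > /tmp/seed_and_test.sh <<'EOF'
#!/bin/sh
set -e  # stop if the restore fails
mysql -u testuser -p"$TEST_DB_PASSWORD" testdb < ./fixtures/dump_file.sql
npm test
EOF
chmod +x /tmp/seed_and_test.sh
```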