Specify different unique key in Dexie database schema? - javascript

My basic Dexie database schema is something like this:
const db = new Dexie('MyDatabase');
// Declare tables, IDs and indexes
db.version(1).stores({
  myrecords: 'record_id'
});
I want to use record_id as a unique key. In IndexedDB I can do this like the following:
var myrecordsObjectStore = db.createObjectStore('myrecords', {
  keyPath: 'record_id'
});

This should work using the & prefix for unique, as noted in the docs:
db.version(1).stores({
  myrecords: '&record_id'
});
See Dexie Quick Reference
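As a quick sanity check (a minimal sketch; the name field is made up): because record_id is the key, add() rejects a second record with the same record_id, while put() would overwrite it.

const db = new Dexie('MyDatabase');
db.version(1).stores({
  myrecords: '&record_id' // first entry is the primary key; & marks it unique
});

async function demo() {
  await db.myrecords.add({ record_id: 1, name: 'first' });
  try {
    // A second add() with the same record_id is rejected
    await db.myrecords.add({ record_id: 1, name: 'duplicate' });
  } catch (err) {
    console.log(err.name); // "ConstraintError"
  }
}
demo();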

Related

TypeORM createQueryBuilder: Is it possible to use a table name as a string instead of an entity

In my current scenario, a third party will be creating tables dynamically in my DB and storing the table name as a varchar and the column names as jsonb in another table, which is defined as an Entity in my NestJS backend.
This is so I can keep track of (and query) these tables, since I have no control over their creation.
For this purpose I'd like to use TypeORM's createQueryBuilder instead of raw queries, as it's easier for me to work with the abstraction.
As far as I know, TypeORM's createQueryBuilder needs a defined Entity in the from clause.
Something like this:
return await getManager()
  .createQueryBuilder()
  .select('*')
  .from(MyDefinedModel, 'modelAlias')
  .getRawMany();
So I'd like to do something like:
const tableName = await getDynamicallyGeneratedTableNames().getFirstOne()
// now tableName points to a string that is a table name, e.g. 'table-data-192239'
return await getManager()
  .createQueryBuilder()
  .select('*')
  .from(tableName, 'tableAlias')
  .getRawMany();
Passing the table name like this points to the right table, but TypeORM (and TypeScript) complains because tableName is a string, not an Entity (class) type.
I really don't want to type-cast and start doing nasty things if there is something cleaner to achieve this.
I didn't find any solution in the official docs.
Any brilliant ideas out there?
Thanks, y'all
You can pass the table name instead of an Entity to getRepository:
let tableName = 'user';
let query = getRepository(tableName).createQueryBuilder(tableName);
You can select from a table by its name, without defining an entity beforehand, like this:
const res = await manager
  .createQueryBuilder()
  .select()
  .from("tableName", null)
  .where("tableName.id = :id", { id: 1 })
  .getRawMany();
Be sure to set the second parameter of from() explicitly to null.
You could try using a raw query:
import { getManager } from 'typeorm'
const tableName = await getDynamicallyGeneratedTableNames().getFirstOne()
// Use the table name in a raw SQL query
const result = await getManager().query(`SELECT * FROM ${tableName}`)
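If you also need to filter, values (unlike the table name, which is an identifier and cannot be bound) can be passed as bound parameters instead of being interpolated. A sketch continuing the snippet above, assuming a PostgreSQL driver where placeholders are $1, $2, ...:

// Values go through the parameters array; only the table name is interpolated
const rows = await getManager().query(
  `SELECT * FROM ${tableName} WHERE id = $1`, // with MySQL the placeholder would be ?
  [1]
)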

Firebase pushing array - Javascript

I am using Firebase to store information for a workout application.
A user adds a workout name and then I push it to the database. I can keep pushing these, but my issue is that it does not seem to be stored as an array, just as an object. See the screenshots below...
As you can see in the console log picture, the workouts property is an object, not an array as I expect.
The code I'm using to push it:
let newWorkout = {
  title: 'title',
  exercises: [{
    name: 'pulldownsnsn',
    sets: 4
  }]
}
let ref = firebase.database().ref("/userProfile/" + this.userId);
ref.child("workouts").push(newWorkout);
The Firebase Database stores lists of data in a different format, to cater for the multi-user and offline aspects of the modern web. The -K... keys are called push IDs and are the expected behavior when you call push() on a database reference.
See this blog post on how Firebase handles arrays, this blog post on the format of those keys, and the Firebase documentation on adding data to lists.
Arrays are handy, but they are a distributed database nightmare for one simple reason: index element identification is not reliable when elements get pushed or deleted. Firebase database instead uses keys for element identification:
// javascript object
['hello', 'world']
// database object
{ -PKQdFz22Yu: 'hello', -VxzzHd1Umr: 'world' }
It gets tricky when using push(), because it does not actually behave like a normal array push, but rather like a key generation followed by an object modification.
Example usage
firebase.database().ref('/uri/to/list').push(
newElement,
err => console.log(err ? 'error while pushing' : 'successful push')
)
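To read the list back as an array on the client, one option is to collect the children of the snapshot (a minimal sketch, using the same path as above):

firebase.database().ref('/uri/to/list').once('value').then(snapshot => {
  const items = [];
  // Each child key is a push ID; collect just the values into a plain array
  snapshot.forEach(child => {
    items.push(child.val());
  });
  console.log(items);
});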
Here's an example from the Firebase documentation (note that this one uses Cloud Firestore rather than the Realtime Database):
const admin = require('firebase-admin');
// ...
const washingtonRef = db.collection('cities').doc('DC');
// Atomically add a new region to the "regions" array field.
const unionRes = await washingtonRef.update({
regions: admin.firestore.FieldValue.arrayUnion('greater_virginia')
});
// Atomically remove a region from the "regions" array field.
const removeRes = await washingtonRef.update({
regions: admin.firestore.FieldValue.arrayRemove('east_coast')
});
More info in the Firebase documentation.

Select all the fields in a mongoose schema

I want to obtain all the fields of a schema in Mongoose. Right now I am using the following code:
let Client = LisaClient.model('Client', ClientSchema)
let query = Client.findOne({ 'userclient': userclient })
query.select('clientname clientdocument client_id password userclient')
let result = yield query.exec()
But I want all the fields, no matter if they are empty. As always, thank you in advance.
I'm not sure if you want all fields in a SQL-like way, or if you want them all in a proper MongoDB way.
If you want them in the proper MongoDB way, then just remove the query.select line. That line is saying to only return the fields listed in it.
If you meant in a SQL-like way, MongoDB doesn't work like that. Each document only has the fields you put in when it was inserted. If you only gave the document certain fields when you inserted it, that document will only have those fields, even if other documents in the collection have different fields.
To determine all available fields in the collection, you'd have to find all the documents, loop through them all and build an object with all the different keys you find.
If you need each document returned to always have the fields that you specify in your select, you'll just have to transform your object once it's returned.
const fields = ['clientname', 'clientdocument', 'client_id', 'password', 'userclient'];
let Client = LisaClient.model('Client', ClientSchema)
let query = Client.findOne({ 'userclient': userclient })
query.select(fields.join(' '))
let result = yield query.exec()
fields.forEach(field => result[field] = result[field]);
That forEach loop will set each of the fields you want either to the value in the result (if it was there) or to undefined if it wasn't.
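If result is a full Mongoose document rather than a plain object, assigning keys on it may not behave exactly as the loop above expects; a hedged variant of the same idea uses .lean() so the query returns a plain JavaScript object:

const fields = ['clientname', 'clientdocument', 'client_id', 'password', 'userclient'];
let Client = LisaClient.model('Client', ClientSchema)
let query = Client.findOne({ 'userclient': userclient })
query.select(fields.join(' '))
query.lean() // return a plain object instead of a Mongoose document
let result = yield query.exec()
// On a plain object, assigning undefined still creates the key
fields.forEach(field => result[field] = result[field])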
MongoDB is schemaless and does not have tables; each collection can hold different types of items. Usually the objects are somehow related or share a common base type.
Retrieve individual records using
db.collectionName.findOne() or db.collectionName.find().pretty()
To get all key names you need to use MapReduce:
mapReduceKeys = db.runCommand({
  "mapreduce": "collection_name",
  "map": function () {
    for (var key in this) {
      emit(key, null);
    }
  },
  "reduce": function (key, stuff) {
    return null;
  },
  "out": "collection_name" + "_keys"
})
Then run distinct on the resulting collection so as to find all the keys
db[mapReduceKeys.result].distinct("_id") //["foo", "bar", "baz", "_id", ...]
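On MongoDB 3.4.4 or newer, a possible alternative is an aggregation with $objectToArray, which avoids writing an extra output collection (a sketch using the same "collection_name" placeholder as above):

db.collection_name.aggregate([
  // Turn each document into an array of { k, v } pairs
  { $project: { kv: { $objectToArray: "$$ROOT" } } },
  { $unwind: "$kv" },
  // Collect the distinct key names across all documents
  { $group: { _id: null, keys: { $addToSet: "$kv.k" } } }
])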

Update Array from Document (MongoDB) in Javascript not Working

I've been looking for an answer for like five hours straight, hope somebody can help. I have a MongoDB collection results (I'm using mLab) which looks like this:
{
  "user": "5818be9c74aaec1824c28626",
  "results": [{
    "game_id": 14578,
    "level1": -1,
    "level2": -1,
    "level3": -1
  },
  { ....
  }]
},
{ "user": ....
}
"user" is a MongoID I save in a previous part of the code, "results" is a record of scores. When an user does a new score, I have to update the score of the corresponding level (I'm using NodeJS).
This is one of the things I've tried so far.
app.get('/levelCompleted/:id/:time', function (request, response) {
  var id = request.params.id;
  var time = parseInt(request.params.time);
  var u = game.getUserById(id);
  var k = "results.$.level" + (u.level);
  // I build the key to update dynamically
  dbM.collection("results").update(
    { user: id,
      "results.game_id": u.game_id
      // u has its own game_id
    },
    { $set: { k: time } }
  );
  ...
  response.send(...);
});
I've checked the content of every variable and parameter, also tried using $elemMatch and dot notation, and set upsert and multi, with no results. I've used an identical command in the mongo shell and it worked on the first try.
(Screenshot: the same update run in the mongo shell)
If someone could tell me what I'm doing wrong or point me in the right direction, it would be great.
Thanks
When you use a Mongo ID as a field in MongoDB, you can't just pass a string with the id to do the query; you have to wrap that string as an ObjectId (the ID type in Mongo). Just add a new require in your Node.js file:
var ObjectID = require("mongodb").ObjectID;
And use the imported constructor in your update request.
dbM.collection("results").update(
{user:ObjectID(id),...
...
}
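Putting it together, a hedged sketch of the corrected update. Note the computed property name [k]: with a plain { k: time }, MongoDB would set a field literally named "k" rather than the dynamic results.$.levelN path, so that may have been a second issue besides the string id:

var ObjectID = require("mongodb").ObjectID;

var k = "results.$.level" + (u.level);
dbM.collection("results").update(
  {
    user: ObjectID(id),            // match the stored ObjectId, not the raw string
    "results.game_id": u.game_id   // positions the $ operator on the matching array element
  },
  { $set: { [k]: time } },         // computed key so the dynamic path is used
  function (err, result) {
    if (err) console.log(err);
  }
);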

Nedb Multiple Collection Single Datastore

I am new to NeDB. It's kind of what SQLite is for the SQL community, but for the Node.js community.
[https://github.com/louischatriot/nedb]
I wanted to ask if it is possible to have multiple collections in a single database file (datastore).
If there is, could you please show me a code sample of how to go about it?
I have tried this:
var Datastore = require('nedb'),
databaseURL="tudls.db",
db = new Datastore({filename: databaseURL, autoload: true});
This creates a single datastore called db.
From the documentation, I saw that nedb is mongo-like. So to insert a record I tried this:
app.post('/todos', function (req, res) {
  var task = req.body.text;
  db.todols.insert({ text: task, done: false }, function (err, saved) {
    if (err || !saved) {
      res.send("Task not saved...");
    }
    res.send("Task saved...");
  });
});
However, I get a 'cannot call method insert of undefined' error. I thought that calling the collection name (todols) when inserting a record would work, so that I could then add another collection to the datastore (db.user), but I was mistaken.
Hence, is it possible to have multiple collections in a single datastore or am I to have a datastore for each collection? If it is possible, does anyone know how to achieve this?
Thank you...
This really is a "lite" interpretation of MongoDB, and as such there isn't the same concept of "databases" and "collections" that exists in the full-featured product; it also omits a lot of other features.
If you want things to appear as if you do have various "collections", then as suggested in the manual page you define various DataStore objects within a structure to make things look that way:
var db = {};
db.todols = new DataStore('/path/to/todols.db');
db.other = new DataStore('/path/to/other.db');
That makes it appear that you have "collections", which to NeDB are in fact just Datastore objects.
//create multiple collections
var Datastore = require('nedb');
var db = {};
db.users = new Datastore('path/to/users.db');
db.robots = new Datastore('path/to/robots.db');
//access it
db.users.loadDatabase();
db.robots.loadDatabase();
//find some documents
db.users.find({name: "john"}, function (err,docs){ console.log(docs); });
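With a per-collection datastore like that, the original route from the question works (a minimal sketch; the file path is made up):

var Datastore = require('nedb');
var db = {};
db.todols = new Datastore({ filename: 'path/to/todols.db', autoload: true });

app.post('/todos', function (req, res) {
  var task = req.body.text;
  db.todols.insert({ text: task, done: false }, function (err, saved) {
    if (err || !saved) {
      return res.send("Task not saved...");
    }
    res.send("Task saved...");
  });
});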
