Cannot return Promise - javascript

I'm using TypeORM, so I've created a method that returns a connection:
public static async getConnection(): Promise<Connection> {
  if (DatabaseProvider.connection) {
    return DatabaseProvider.connection;
  }
  const { type, host, port, username, password, database, extra, entities, migrations } = DatabaseProvider.configuration;
  DatabaseProvider.connection = await createConnection({
    type, host, port, username, password, database,
    extra,
    entities: [
      entities
    ],
    migrations: [
      migrations
    ]
  } as any);
  return DatabaseProvider.connection;
}
I want to assign the connection to a Telegraf bot instance, so I have created a .d.ts file to specify the type:
export interface TelegrafContext extends Context {
  db: Connection
}
then:
bot.context.db = DatabaseProvider.getConnection().then((conn) => { return conn; });
and I get:
Type 'Promise<Connection>' is missing the following properties from type 'Connection': name, options, isConnected, driver, and 32 more.
What did I do wrong?

Probably because you're trying to assign a Promise<Connection> to bot.context.db when it should be a Connection; calling .then() still returns a Promise, not the resolved value.
So you can either do:
DatabaseProvider.getConnection().then((conn) => {
  bot.context.db = conn
});
or, inside an async function:
bot.context.db = await DatabaseProvider.getConnection()
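For reference, here is a minimal bootstrap sketch of the second option, assuming Telegraf v4; the import paths for DatabaseProvider and TelegrafContext are hypothetical:
import { Telegraf } from 'telegraf'
import { DatabaseProvider } from './database-provider' // assumption: where getConnection() lives
import { TelegrafContext } from './telegraf-context'   // assumption: the interface from the .d.ts above
async function bootstrap() {
  const bot = new Telegraf<TelegrafContext>(process.env.BOT_TOKEN!)
  // Resolve the connection once at startup, so handlers see a Connection, not a Promise
  bot.context.db = await DatabaseProvider.getConnection()
  bot.on('text', async (ctx) => {
    // ctx.db is a plain Connection here
    await ctx.db.query('SELECT 1')
    await ctx.reply('database connection is alive')
  })
  await bot.launch()
}
bootstrap()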

Related

How to dynamically connect to a MongoDB database using Nest.js

I need to create a separate database for each entity in my client's app. I'm going to determine the database name according to the subdomain from which the request is coming. How can I achieve that and connect dynamically to a database using NestJS and Mongoose?
UserService
async findOneByEmail(email: string, subdomain: string): Promise<User | any> {
  const liveConnections = await Databases.getConnection(subdomain)
  const user = await liveConnections.model(User.name, UserSchema).find()
  // { 'securityInfo.email': email }
  if (!user)
    throw new HttpException(
      {
        status: HttpStatus.BAD_REQUEST,
        error: 'user not found',
        field: 'user',
      },
      HttpStatus.BAD_REQUEST
    )
  return user
}
The class that I created:
class DataBases extends Mongoose {
  private clientOption = {
    keepAlive: true,
    useNewUrlParser: true,
    useUnifiedTopology: true,
  }
  private databases: { [key: string]: Connection } = {}
  private getConnectionUri = (companyName = '') =>
    `mongodb+srv://${process.env.MONGODB_USERNAME}:${process.env.MONGODB_PASSWORD}@cluster0.2lukt.mongodb.net/${companyName}?retryWrites=true&w=majority`
  public async getConnection(companyName = ''): Promise<Connection> {
    const connection = this.databases[companyName]
    return connection ? connection : await this.createDataBase(companyName)
  }
  private async createDataBase(companyName = ''): Promise<Connection> {
    // Create a new connection; if the database does not exist yet, it is created on first use
    const newConnection = await this.createConnection(
      this.getConnectionUri(companyName),
      this.clientOption
    )
    this.databases[companyName] = newConnection
    return newConnection
  }
}
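(Side note: for Databases.getConnection(subdomain) in the service above to work as written, this class has to be exposed as a single shared instance. A minimal sketch, assuming hypothetical file names and that the class above is exported:)
// databases.instance.ts (hypothetical file name)
// Export one shared instance so every service reuses the same connection cache in `this.databases`
import { DataBases } from './databases' // assumption: the class above lives here and is exported
export const Databases = new DataBases()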
I fixed it. The DataBases class I created works really well, and if you know a better way, please tell me.
What I had to change was the way I use the connection to MongoDB.
Now I can connect to different databases depending on the subdomain.
I hope it will help someone!
UserService
async findOneByEmail(email: string, subdomain: string): Promise<User | any> {
  const liveConnections = await Databases.getConnection(subdomain)
  const user = await liveConnections
    .model(User.name, UserSchema)
    .findOne({ 'securityInfo.email': email })
    .exec()
  if (!user)
    throw new HttpException(
      {
        status: HttpStatus.BAD_REQUEST,
        error: 'user not found',
        field: 'user',
      },
      HttpStatus.BAD_REQUEST
    )
  return user
}
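For context, one way the subdomain argument could be derived is from the Host header of the incoming request. A minimal sketch, assuming an Express-based Nest app and a hypothetical UserController wired to the service above:
import { Controller, Get, Query, Req } from '@nestjs/common'
import { Request } from 'express'
import { UserService } from './user.service' // assumption: path to the service above
@Controller('users')
export class UserController {
  constructor(private readonly userService: UserService) {}
  @Get('by-email')
  findOneByEmail(@Query('email') email: string, @Req() req: Request) {
    // 'acme.example.com' -> 'acme'; fall back to an empty string for bare domains
    const parts = req.hostname.split('.')
    const subdomain = parts.length > 2 ? parts[0] : ''
    return this.userService.findOneByEmail(email, subdomain)
  }
}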

Get populated data from Mongoose to the client

On the server I am populating user data, and when I print it to the console everything works fine, but I am not able to access the data on the client or even in the GraphQL Playground.
This is my schema:
const { model, Schema } = require("mongoose");
// Post model
const postSchema = new Schema({
  body: String,
  user: {
    type: Schema.Types.ObjectId,
    ref: "User",
  },
});
module.exports = model("Post", postSchema);
// User model
const userSchema = new Schema({
  username: String,
});
module.exports = model("User", userSchema);
const { gql } = require("apollo-server");
module.exports = gql`
  type Post {
    id: ID!
    body: String!
    user: [User]!
  }
  type User {
    id: ID!
    username: String!
  }
  type Query {
    getPosts: [Post]!
    getPost(postId: ID!): Post!
  }
`;
Query: {
  async getPosts() {
    try {
      const posts = await Post.find()
        .populate("user");
      console.log("posts: ", posts[0]);
      // This works and returns the populated user with the username
      return posts;
    } catch (err) {
      throw new Error(err);
    }
  },
}
But on the client or even in Playground, I can't access the populated data.
query getPosts {
  getPosts {
    body
    user {
      username
    }
  }
}
My question is how to access the data from the client.
Thanks for your help.
You are using this feature in the wrong way. You should define an object in your resolvers named after your model, and that object should contain a method that returns the related user based on the parent value.
Here is the full documentation from the Apollo Server docs on how to use this feature.
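A minimal sketch of that resolver-map approach, assuming the Post and User models shown above are exported from hypothetical ./models files:
import Post from "./models/Post"; // assumption: model file paths
import User from "./models/User";
export default {
  Query: {
    getPosts: async () => Post.find(),
  },
  Post: {
    // Field resolver: parent.user is the ObjectId stored on the post,
    // so look the related user up here instead of calling .populate() in the query
    user: async (parent: any) => {
      const user = await User.findById(parent.user);
      return user ? [user] : []; // the schema declares `user: [User]!`, hence the array
    },
  },
};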
Use lean() like this:
const posts = await Post.find().populate("user").lean();

Does this method close the MongoDB connection?

I have a method that connects to MongoDB, but I can't figure out whether this connection is closed after a call is made or not.
This is the method:
import { Db, MongoClient } from "mongodb";
let cachedConnection: { client: MongoClient; db: Db } | null = null;
export async function connectToDatabase(mongoUri?: string, database?: string) {
  if (!mongoUri) {
    throw new Error(
      "Please define the MONGO_URI environment variable inside .env.local"
    );
  }
  if (!database) {
    throw new Error(
      "Please define the DATABASE environment variable inside .env.local"
    );
  }
  if (cachedConnection) return cachedConnection;
  cachedConnection = await MongoClient.connect(mongoUri, {
    useNewUrlParser: true,
    useUnifiedTopology: true,
  }).then((client) => ({
    client,
    db: client.db(database),
  }));
  return cachedConnection!;
}
I use this with Next.js, and I am afraid that the app I am building will go down if there are too many connections. Intuitively, I think that the MongoDB connection ends after a call, but I am not sure.

TypeORM Apollo nested query resolver

I have a schema (with the appropriate database tables and entity classes defined) like
type User {
  id: Int!
  phoneNumber: String!
}
type Event {
  id: Int!
  host: User
}
and I'm trying to use Apollo to write a query like
query {
  event(id: 1) {
    host {
      firstName
    }
  }
}
But I can't figure out how to get the Apollo library to resolve the User type in the host field from the hostId that is stored on the event object.
I modified the event to return the hostId field, and that works perfectly fine, but GraphQL won't resolve the id to the appropriate user type. What am I missing?
Edit: missing resolver code:
event: async (parent: any, args: { id: number }) => {
  const eventRepository = getConnection().getRepository(Event);
  const event = await eventRepository.findOne(args.id);
  return event;
},
I managed to get a working version by using findOne(args.id, { relations: ['host'] }), but I don't like that, because it seems like something that would be appropriate to delegate to GraphQL to handle.
Your resolver should look like this:
const resolver = {
  Query: {
    event: async (_: any, args: any) => {
      return await event.findOne(args.id);
    },
  },
  // Field resolvers for the Event type: `event` and `user` here are assumed to be
  // the repositories/models for those entities
  Event: {
    host: async (parent: any, args: any, context: any) => {
      return await user.findOne({ id: parent.hostId });
    },
  },
};
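The same pattern written against the TypeORM repositories from the question, as a self-contained sketch (the entity import paths and the hostId column name are assumptions based on the description above):
import { getConnection } from 'typeorm';
import { Event } from './entity/Event'; // assumed entity paths
import { User } from './entity/User';
export const resolvers = {
  Query: {
    event: async (_: any, args: { id: number }) =>
      getConnection().getRepository(Event).findOne(args.id),
  },
  Event: {
    // Field resolver: turn the hostId stored on the event row into the related User
    host: async (parent: { hostId: number }) =>
      getConnection().getRepository(User).findOne(parent.hostId),
  },
};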

Using dataloader for resolvers with nested data from ArangoDB

I'm implementing a GraphQL API over ArangoDB (with arangojs) and I want to know how to best implement dataloader (or similar) for this very basic use case.
I have 2 resolvers with DB queries shown below (both of these work): the first fetches Persons, and the second fetches a list of Record objects associated with a given Person (one-to-many). The association is made using ArangoDB's edge collections.
import { Database, aql } from 'arangojs'
import pick from 'lodash/pick'
const db = new Database('http://127.0.0.1:8529')
db.useBasicAuth('root', '')
db.useDatabase('_system')
// id is the auto-generated userId, which is `_key` in Arango
const fetchPerson = id => async (resolve, reject) => {
  try {
    const cursor = await db.query(aql`RETURN DOCUMENT("PersonTable", ${String(id)})`)
    // Unwrap the results from the cursor object
    const result = await cursor.next()
    return resolve( pick(result, ['_key', 'firstName', 'lastName']) )
  } catch (err) {
    return reject( err )
  }
}
// id is the auto-generated userId (`_key` in Arango) who is associated with the records via the Person_HasMany_Records edge collection
const fetchRecords = id => async (resolve, reject) => {
  try {
    const edgeCollection = await db.collection('Person_HasMany_Records')
    // Query simply says: `get all connected nodes 1 step outward from origin node, in edgeCollection`
    const cursor = await db.query(aql`
      FOR record IN 1..1
        OUTBOUND DOCUMENT("PersonTable", ${String(id)})
        ${edgeCollection}
        RETURN record`)
    return resolve( cursor.map(each =>
      pick(each, ['_key', 'intro', 'title', 'misc']))
    )
  } catch (err) {
    return reject( err )
  }
}
export default {
  Query: {
    getPerson: (_, { id }) => new Promise(fetchPerson(id)),
    getRecords: (_, { ownerId }) => new Promise(fetchRecords(ownerId)),
  }
}
Now, if I want to fetch the Person data with the Records as nested data, in a single request, the query would be this:
aql`
  LET person = DOCUMENT("PersonTable", ${String(id)})
  LET records = (
    FOR record IN 1..1
      OUTBOUND person
      ${edgeCollection}
      RETURN record
  )
  RETURN MERGE(person, { records: records })`
So how should I update my API to employ batch requests / caching? Can I somehow run fetchRecords(id) inside of fetchPerson(id) but only when fetchPerson(id) is invoked with the records property included?
Here is the setup file; notice I'm using graphql-tools, because I took this from a tutorial somewhere.
import http from 'http'
import db from './database'
import schema from './schema'
import resolvers from './resolvers'
import express from 'express'
import bodyParser from 'body-parser'
import { graphqlExpress, graphiqlExpress } from 'apollo-server-express'
import { makeExecutableSchema } from 'graphql-tools'
const app = express()
// bodyParser is needed just for POST.
app.use('/graphql', bodyParser.json(), graphqlExpress({
  schema: makeExecutableSchema({ typeDefs: schema, resolvers })
}))
app.get('/graphiql', graphiqlExpress({ endpointURL: '/graphql' })) // if you want GraphiQL enabled
app.listen(3000)
And here's the schema.
export default `
  type Person {
    _key: String!
    firstName: String!
    lastName: String!
  }
  type Records {
    _key: String!
    intro: String!
    title: String!
    misc: String!
  }
  type Query {
    getPerson(id: Int!): Person
    getRecords(ownerId: Int!): [Records]!
  }
  type Schema {
    query: Query
  }
`
So, the real benefit of dataloader is that it stops you from doing N+1 queries. For example, if in your schema Person had a records field and you asked for the first 10 people's records, a naive GraphQL schema would cause 11 requests to be fired: 1 for the first 10 people, and then one for each person's records.
With dataloader implemented, you cut that down to two requests: one for the first 10 people, and then one for all of the records of those ten people.
With your schema above, it doesn't seem that you can benefit from dataloader in any way, since there's no possibility of N+1 queries. The only benefit you might get is caching, if you make multiple requests for the same person or records within a single request (which, again, isn't possible based on your schema design unless you are using batched queries).
Let's say you want the caching though. Then you could do something like this:
// loaders.js
import DataLoader from 'dataloader';
// The callback functions take a list of keys and return a list of values to
// hydrate those keys, in order, with `null` for any value that cannot be hydrated
export default {
  personLoader: new DataLoader(loadBatchedPersons),
  personRecordsLoader: new DataLoader(loadBatchedPersonRecords),
};
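For reference, a sketch of what one of those batch callbacks could look like against the arangojs setup from the question (something to be defined alongside, or imported into, loaders.js above; the ./database import and the key handling are assumptions):
import { aql } from 'arangojs'
import db from './database' // assumption: the arangojs Database instance shown in the question
// DataLoader contract: receive N keys, return N results in the same order,
// with null for any key that cannot be resolved
export const loadBatchedPersons = async (ids: readonly (string | number)[]) => {
  // DOCUMENT() accepts an array of keys, so all requested persons come back in one query
  const cursor = await db.query(aql`RETURN DOCUMENT("PersonTable", ${ids.map(String)})`)
  const docs: any[] = await cursor.next()
  return ids.map((id) => docs.find((doc) => doc && doc._key === String(id)) || null)
}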
You then want to attach the loaders to your context for easy sharing. Modified example from Apollo docs:
// app.js
import loaders from './loaders';
app.use(
  '/graphql',
  bodyParser.json(),
  graphqlExpress(req => {
    return {
      schema: myGraphQLSchema,
      context: {
        loaders,
      },
    };
  }),
);
Then, you can use them from the context in your resolvers:
// ViewerType.js:
// Some parent type, such as `viewer`, often
{
  person: {
    type: PersonType,
    // Call .load() with whatever key identifies the person in your schema
    resolve: async (viewer, args, context, info) =>
      context.loaders.personLoader.load(args.id),
  },
  records: {
    type: new GraphQLList(RecordType), // This could also be a connection
    resolve: async (viewer, args, context, info) =>
      context.loaders.personRecordsLoader.load(args.ownerId),
  },
}
I guess I was confused about the capability of dataloader. Serving nested data was really the stumbling block for me.
This is the missing code. The export from resolvers.js needed a Person property:
export default {
  Person: {
    records: (person) => new Promise(fetchRecords(person._key)),
  },
  Query: {
    getPerson: (_, { id }) => new Promise(fetchPerson(id)),
    getRecords: (_, { ownerId }) => new Promise(fetchRecords(ownerId)),
  },
}
And the Person type in the schema needed a records property.
type Person {
  _key: String!
  firstName: String!
  lastName: String!
  records: [Records]!
}
It seems these features are provided by Apollo's graphql-tools.
