Parse Server one-to-one relationship to store different pointer types - javascript

I'm designing the Event model of my server, which contains many kinds of events related to many users. For example (since I use the JavaScript SDK, let me describe the structure with TypeScript):
interface IEventBase {
  user: Parse.User;
}
interface IEventStartWork extends IEventBase {
  late: boolean;
}
interface IEventAfterWork extends IEventBase {
  jobDone: boolean;
}
interface IEventDrinkWaters extends IEventBase {
  howManyTimes: number;
}
interface IEventGoOutForLunch extends IEventBase {
  howLong: number;
}
In my case, I have more than 100 different kinds of events, and I need to fetch all the events that happened to one user as fast as possible. I cannot put them all into one single collection, because I have to add indexes to each event type when necessary. So I attempted to put them into one indexed collection with a pointer:
interface IEvents {
  user: Parse.User;
  pointer: IEventBase;
}
With this design I can quickly pull out all of a user's events in any time range.
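For reference, this is roughly the query the design is meant to enable (a minimal sketch using the JavaScript SDK; someUser, startDate and endDate are placeholders):

const query = new Parse.Query('Events');
query.equalTo('user', someUser);                    // only this user's events
query.greaterThanOrEqualTo('createdAt', startDate); // start of the time range
query.lessThan('createdAt', endDate);               // end of the time range
query.include('pointer');                           // also fetch the concrete event object
const events = await query.find();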
But then I get an error when storing different events into the Events collection:
schema mismatch for Events.entity; expected Pointer<EventStartWork> but got Pointer<EventAfterWork>
What can I do to achieve this goal with Parse Server? Can I maybe disable schema checking on a single field?

Related

How to exclude hooks from entity types in typeorm / typegraphql

I'm sure this is a really common issue, but I just cannot seem to find any accepted solutions.
When combining typeorm and typegraphql, you create entities with the entity's properties. However, typeorm also allows hooks such as beforeInsert to be added to the entity.
The issue I'm having is that the entity type includes these hooks as properties, which are not returned from the database, e.g.:
// Define the entity
@Entity()
@ObjectType()
export class MyEntity extends CustomBaseEntity {
  @Field(() => ID)
  @PrimaryGeneratedColumn("uuid")
  id: string;

  @Field()
  @Column({ type: "bigint", nullable: true })
  created: number;

  @BeforeInsert()
  beforeInsert() {
    this.created = Date.now();
  }
}
// Pull the entity from the database
const myEntityRow = await myEntityRepository.findOneBy({ id });
// As you can see there is a type mismatch:
// the type is MyEntity (including the beforeInsert method) even though there is no
// beforeInsert prop on the actual entity
console.log(
  myEntityRow // { id: 1, created: 123 }
);
Meaning that something like this does not work:
const destructuredEntity = { ...myEntityRow };
await myEntityRepository.save(destructuredEntity); // Type error - missing properties "beforeInsert"
Right now I'm thinking that I probably just need to remove these hook functions and just put any methods like this within the service, any ideas?
and just put any methods like this within the service, any ideas?
That's definitely the best choice.
It's better not to use inversion of control when you can avoid it and it doesn't give you a significant advantage.
Just put the created initialization in the constructor, or make it a default in your database.
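For example, a minimal sketch of that alternative with the entity from above (a property initializer has the same effect as assigning the value in the constructor):

@Entity()
@ObjectType()
export class MyEntity extends CustomBaseEntity {
  @Field(() => ID)
  @PrimaryGeneratedColumn("uuid")
  id: string;

  // no @BeforeInsert hook: the value is set when the object is constructed;
  // alternatively, declare a database-side default via @Column({ default: ... })
  @Field()
  @Column({ type: "bigint", nullable: true })
  created: number = Date.now();
}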

Typescript data modeling shared by both front-end and back-end - Classes or interfaces?

I wonder what's the best way of sharing the same data types between the client (React) and the server (Express + Socket.IO).
In my game I have different rooms; each room saves its current status, something like:
class GameRoom {
  players: Player[];
  started: boolean;
  currentPlayerTurn: Player;
  dices: [number, number];

  constructor({ players = [], started = false, currentPlayerTurn = null, dices = [1, 1] } = {}) {
    this.players = players;
    this.started = started;
    this.currentPlayerTurn = currentPlayerTurn;
    this.dices = dices;
  }

  startGame() {
    this.currentPlayerTurn = this.players[0];
    this.started = true;
  }

  // etc.
}
The room is generated on the server, sent to the client as JSON, and then rebuilt on the client. I sync the data with socket events, and everything's perfect.
But there's a problem with the React side of the story: changing GameRoom properties won't cause a rerender. That means I have to forceRerender() each time something is edited, or listen to class changes. Both options are a mess and I described it deeply in this question.
This mess made me think that maybe classes are not the best way to go. Using an interface would solve this problem entirely, but I'd lose instance functions like GameRoom.startGame(), which would have to be turned into utility functions, like:
export function startGame(gameRoom: GameRoom) {
  gameRoom.currentPlayerTurn = gameRoom.players[0];
  gameRoom.started = true;
}
which is another mess, since they're hidden in the code, the developer needs to know they exist, and gameRoom must not be edited directly.
If you guys have any idea on how to model my data types, I'd be more than happy to hear.
Thanks!
I would go with a functional approach. I'm not a huge fan of classes in JS.
Define a type for your game, but set the properties to readonly. This will tell the developer that they shouldn't mutate the GameRoom.
export type GameRoom = {
  readonly players: Player[];
  readonly started: boolean;
  readonly currentPlayerTurn: Player;
  readonly dices: [number, number];
}
Then, you can define all your changes as pure functions: you take your inputs and create a new object that is your updated GameRoom. This makes things easy to test and makes changes easy to track.
export function startGame(gameRoom: GameRoom): GameRoom {
  return {
    ...gameRoom,
    currentPlayerTurn: gameRoom.players[0],
    started: true
  };
}
Alternatively, you could use something like Redux and define your changes as a set of actions (e.g. add-player, start-game, etc.). Then, you can dispatch those actions from anywhere in your code.
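A rough sketch of what those actions could look like as a reducer (the action names and the reducer itself are just examples):

type GameRoomAction =
  | { type: 'add-player'; player: Player }
  | { type: 'start-game' };

export function gameRoomReducer(state: GameRoom, action: GameRoomAction): GameRoom {
  switch (action.type) {
    case 'add-player':
      // return a new room with the player appended
      return { ...state, players: [...state.players, action.player] };
    case 'start-game':
      return { ...state, currentPlayerTurn: state.players[0], started: true };
    default:
      return state;
  }
}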
The great thing about immutability is that every time you change your object you return a new one, so React will always re-render appropriately.

Angular extend API model for component purpose

I often need to extend my API model with parameters I use only in the component view.
For example, I have a model:
export class Person {
  name: string;
  surname: string;
  address: string;
}
It is something I get from the API:
getPersons(): Observable<Person[]> {
  return this.httpClient.get<Person[]>(`${environment.API}/person`);
}
When I get this in my component, I often need to extend the model with a parameter/attribute obtained from data processing after the request, or just a simple 'active'/'select' parameter for tracking UI visualization state.
Which approach should I use for this purpose? I know of two solutions:
1) Add the parameters to the class model even though they are not part of the server response, and just separate them from the standard parameters:
export class Person {
  name: string;
  surname: string;
  address: string;
  ui_active: boolean;  // I know that those parameters are not from the API
  ui_fullName: string; // response, but they are here to make my life easier
}
2) Make another class that extends Person with those parameters:
export class PersonExtended extends Person {
  ui_active: boolean;
  ui_fullName: string;
}
But this approach complicates things, since I have two models and I need to switch between them all the time.
What is the best practice for this kind of situation?
Just make those fields optional with the ?-operator:
export class Person {
  name: string;
  surname: string;
  address: string;
  ui_active?: boolean;
  ui_fullName?: string;
}
Thus, you can use them but you don't have to.
EDIT:
If you have to remove them at some point, use the following generic method:
private omitFields<T>(object: T, fields: Array<string>): T {
  for (const field of fields) {
    delete object[field];
  }
  return object;
}
And use it, for example, like this:
this.omitFields(el, ['ui_active', 'ui_fullName']);
I think that code-wise the second option is right, but as far as I know there is nothing wrong with the first option either; just make sure the new parameters can be null or ignored in some way, so you don't get errors when you don't set them where they are not needed.
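If you go with the second option, the switching between the two models can be reduced to a single mapping step in the service, something like this (a sketch using the rxjs map operator and the PersonExtended class from above):

getPersonsExtended(): Observable<PersonExtended[]> {
  return this.getPersons().pipe(
    map(persons => persons.map(p => ({
      ...p,
      ui_active: false,
      ui_fullName: `${p.name} ${p.surname}`
    })))
  );
}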

Does it make sense to normalize/denormalize tree structured data when designing ngrx state?

I am building a todo application with a list of todos which can also have sub-todos. Before using ngrx, my model looked like this:
export interface Todo {
  id: string;
  title: string;
  isDone?: boolean;
  parentId?: string;
  subTodos?: Todo[];
}

<todo *ngFor="let todo of todos$ | async">
  <todo *ngFor="let subTodo of todo.subTodos"></todo>
</todo>
This was handy for displaying the data inside the view, since I could use nested *ngFors. Because the reducer logic for updating the sub-todos got a little complicated, I thought about using entities. But normalizing data seems to come with some trade-offs: now I not only have to filter my todos to remove the sub-todos from the main list, I also have to look up the data for every single sub-todo. That seems like a lot of computation just to simplify the update logic.
The model looks like this now:
export interface Todo {
  id: string;
  title: string;
  isDone?: boolean;
  parentId?: string;
  subTodos?: string[]; // Todo ids
}
I read a lot about ngrx but was unable to find any example with such a data structure. I'm wondering how you would tackle this problem. Do you always normalize your arrays and split up related models? If so, do you denormalize the data when you use it in your views, or do you look it up in dedicated components (which seems ugly to me)? And how do you design the interfaces in that case? Do you create two separate interfaces for the view model and the state model?
You should still be able to have the same view models as before.
This is where selectors come into play: they can transform your state into a view model that best fits your needs.
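For example, a selector along these lines could rebuild the nested view model from the normalized state, so the component can keep the same nested *ngFor as before (a sketch; AppState and the entities dictionary are illustrative, not part of the question):

import { createSelector } from '@ngrx/store';

// illustrative state shape (normalized: one dictionary of todos keyed by id)
export interface AppState {
  todos: { entities: { [id: string]: Todo } };
}

// view model with resolved sub-todos, rebuilt from the flat entity dictionary
export interface TodoVm {
  id: string;
  title: string;
  isDone?: boolean;
  subTodos: TodoVm[];
}

export const selectTodoEntities = (state: AppState) => state.todos.entities;

export const selectTodoTree = createSelector(selectTodoEntities, entities => {
  const toVm = (todo: Todo): TodoVm => ({
    id: todo.id,
    title: todo.title,
    isDone: todo.isDone,
    subTodos: (todo.subTodos || []).map(id => toVm(entities[id]))
  });
  // keep only top-level todos; sub-todos are resolved recursively via their ids
  return Object.values(entities)
    .filter(todo => !todo.parentId)
    .map(toVm);
});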

How can I instantiate a Sequelize model to be used in Unit Tests?

I am trying to write unit tests for a Sequelize model. I have an instance method on the class that calls Sequelize's update method. Turning update into a mocked function is no problem; I just can't seem to figure out how to properly do X = new MyModel() without some kind of error. It tells me Class constructor model cannot be invoked without 'new'.
I took a look at https://sequelize-mock.readthedocs.io/en/stable/ but the way it's written makes me think it's better suited for mocking the Models when testing classes that are taking advantage of a model, rather than the model itself.
const BaseUtut = sequelizeInstance.define('utut', {
  id: {
    type: Sequelize.INTEGER,
    primaryKey: true,
    autoIncrement: true
  },
  metadata: {
    type: Sequelize.JSON
  }
});

class Utut extends BaseUtut {
  async updateFromFlat(id, metadata) {
    // in reality, I modify metadata, and want to test things around that
    const modifiedMetadata = methodThatModified(metadata);
    // I want to see that update is called with the right output of the modified metadata
    return await this.update(
      {
        id: id,
        metadata: modifiedMetadata
      },
      {
        where: {
          locale: metadata.locale
        }
      }
    );
  }
}
So with the above code sample, if I have an instance of Utut, I know I can easily mock update. I just don't know how to initialize the class so I can then mock things.
The intent of the test I want to write is simply to see that update is called with the right parameters, based on my own code that has nothing to do with Sequelize. I want to make sure my transforms on the inputs are set up right when going to Sequelize. I am not planning to check that the DB is actually updated. The only problem I am having is actually instantiating the model.
So all I need is for code like this to work:
const instance = new Utut(); // This line doesn't work
instance.update = jest.fn();
I'm not sure why it says I need to invoke with new, when that is exactly what I am doing.
So taking advantage of the Model returned from Sequelize.define is what causes this issue. Rather than doing class Utut extends BaseUtut {, I did this:
// Static methods
Object.assign(BaseUtut, {});
// Instance methods
Object.assign(BaseUtut.prototype, {});
Doing that lets me call new on the model with no issues in my tests.
Update
Just saw this was still unanswered on 9/23/19. The above answer looks to be for an older version. I now just do:
class MyClass extends Sequelize.Model {}
MyClass.init(definition, options);
export default MyClass;
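Applied to the example above, that newer pattern could look roughly like this (a sketch assuming Sequelize v5+ and Jest; methodThatModified and the sqlite connection string are stand-ins):

import { Sequelize, Model, DataTypes } from 'sequelize';

const sequelizeInstance = new Sequelize('sqlite::memory:');
// stand-in for the asker's own transform
const methodThatModified = (metadata: any) => metadata;

class Utut extends Model {
  async updateFromFlat(id: number, metadata: any) {
    const modifiedMetadata = methodThatModified(metadata);
    return this.update({ id, metadata: modifiedMetadata });
  }
}

Utut.init(
  {
    id: { type: DataTypes.INTEGER, primaryKey: true, autoIncrement: true },
    metadata: { type: DataTypes.JSON }
  },
  { sequelize: sequelizeInstance, modelName: 'utut' }
);

// in a test, the class can now be instantiated directly and update mocked:
const instance = Utut.build();
instance.update = jest.fn() as any;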
