Set up internal dependencies at build time - javascript

I'm looking at using Terraform to build a small multi-cloud project. The project will call some API, perform some manipulation of the data received, then store the result in the cloud.
I've written a TypeScript project that I intend to upload as either an AWS Lambda function or an Azure Function, depending on the choice of cloud provider, set at deployment.
The problem I'm facing is that the TypeScript project must be able to switch its storage mechanism depending on which cloud provider it is being deployed to. I have something similar to this, setting the cloud provider as an environment variable on deployment:
export const handler = async (event: any = {}): Promise<any> => {
  const users: User[] = await fetchUsers();
  users.PerformSomeUpdate();
  const userRepository: IUserRepository = getUserRepository();
  userRepository.Save(users);
  return JSON.stringify(event, null, 2);
};

function getUserRepository(): IUserRepository {
  if (process.env["CLOUD_PROVIDER"] == "AWS") {
    return new S3Repository();
  }
  if (process.env["CLOUD_PROVIDER"] == "AZURE") {
    return new AzureBlobRepository();
  }
  throw new Error("Unsupported CLOUD_PROVIDER: " + process.env["CLOUD_PROVIDER"]);
}
This works fine, but I'd rather not have to check the cloud provider on each execution of the function. Is there a way I can set this dependency at the build/deployment stage instead?
I've looked into several DI frameworks, but I don't think they're the right answer, as a DI container resolves dependencies at run time.
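One way to pin the dependency at build time (a sketch, assuming esbuild as the bundler; esbuild is not part of the original setup) is to inline CLOUD_PROVIDER as a compile-time constant, so the bundler's dead-code elimination can strip the unused repository:

// build.ts - run once per deployment target; Terraform sets CLOUD_PROVIDER first
import { build } from 'esbuild';

const provider = process.env.CLOUD_PROVIDER ?? 'AWS';

build({
  entryPoints: ['src/handler.ts'], // hypothetical entry point
  bundle: true,
  platform: 'node',
  outfile: 'dist/handler.js',
  define: {
    // every dotted access process.env.CLOUD_PROVIDER becomes a string literal;
    // note that the bracket form process.env["CLOUD_PROVIDER"] is not
    // substituted, so the handler would need to use the dotted form
    'process.env.CLOUD_PROVIDER': JSON.stringify(provider),
  },
}).catch(() => process.exit(1));

With the constant inlined, the matching branch folds to a constant-true condition, the other branch becomes unreachable, and the unused repository (and its SDK imports) can be dropped from the bundle, so nothing is checked at run time.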

Related

How to wrap code that uses transactions in a transaction and then roll it back?

I'm setting up my integration testing rig. I'm using the beforeEach and afterEach hooks to wrap every single test in a transaction that rolls back, so that the tests don't affect each other. A simplified example might be this:
const { repository } = require("library");
// knex is the test instance, configured elsewhere with a single DB connection

describe("Suite", function () {
  beforeEach(async function () {
    await knex.raw("BEGIN");
  });

  afterEach(async function () {
    await knex.raw("ROLLBACK");
  });

  it("A test", async function () {
    const user = await repository.createUser();
    user.id.should.equal(1);
  });
});
This worked fine because I configured knex to use a single DB connection for tests. Hence calling knex.raw("BEGIN"); created a global transaction.
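For reference, that single-connection setup is just a matter of pinning the pool (a sketch; TEST_DATABASE_URL is a placeholder):

import knex from 'knex';

// exactly one pooled connection, so raw BEGIN/ROLLBACK apply to every
// query in the suite - they all run on that same connection
const testKnex = knex({
  client: 'pg',
  connection: process.env.TEST_DATABASE_URL,
  pool: { min: 1, max: 1 },
});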
Now, however, the library's repository, which I can't control, started using transactions internally; i.e. createUser() begins and then commits the created user. This broke my tests: my afterEach hook no longer rolls back the changes because they were already committed.
Is there a way in Postgres to roll back a transaction that has (already committed) nested transactions?
Or maybe a way to use knex to prevent the repository from starting transactions in the first place? It uses knex.transaction() to create them.
Thanks!
Judging by the looks of an example debug log, knex does in fact detect transaction nesting automatically and switches nested transactions from the irreversible commit/rollback to manageable savepoint s1 / release s1 / rollback to s1 statements, the way I was guessing in my comment.
In this case, it should be enough for you to wrap your calls in a transaction, so that you "own" the top-level one. Knex should detect this and force the underlying transactions to use savepoints instead of commits, all of which you can then undo, rolling back the top-level transaction. If I read the doc right:
const { repository } = require("library");

describe("Suite", function () {
  it("A test", async function () {
    try {
      await knex.transaction(async (trx) => {
        const user = await repository.createUser();
        user.id.should.equal(1);
        await trx.rollback();
      });
    } catch (error) {
      console.error(error);
    }
  });
});
That's assuming none of the calls below issues a knex.raw("COMMIT") or somehow calls .commit() on the outer, top-level transaction.
As may be guessed from the tags, the library in question is Strapi and I'm trying to write tests for the custom endpoints I implemented with it.
As noted by @zagarek, Postgres itself can't roll back already-committed transactions. Knex does support nested transactions (using savepoints), but you must explicitly refer to the parent transaction when creating a new one for it to get nested.
Many have tried to achieve this setup. See the threads under e.g. here or here. It always boils down to somehow passing the test-wrapping transaction all the way down to your ORM/repository/code under test and instructing it to scope all queries under that transaction.
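To illustrate that explicit nesting (a sketch with a placeholder connection and a hypothetical users table): a transaction created from the parent transaction object is emitted as a savepoint, and rolling back the parent undoes it too:

import knex from 'knex';

const db = knex({ client: 'pg', connection: process.env.DATABASE_URL });

async function demo(): Promise<void> {
  try {
    await db.transaction(async (trx) => {
      // created from trx rather than db, so knex emits SAVEPOINT, not BEGIN
      await trx.transaction(async (inner) => {
        await inner('users').insert({ name: 'test' });
      });
      throw new Error('abort'); // rejecting the handler rolls back the outer transaction
    });
  } catch {
    // expected: the rollback undoes the savepoint work as well
  }
}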
Unfortunately, Strapi doesn't provide any way to be given a transaction nor to create a global one. Now, cover your eyes and I'll tell you how I hacked around this.
I leverage one nice aspect of Knex: its Transaction object behaves (mostly) the same as a Knex instance. I mercilessly replace Strapi's reference to its Knex instance with a Knex transaction and then roll it back in the afterEach hook. To make this not too easy, Strapi extends its knex instance with a getSchemaName function, so I extend the transaction in disguise too and proxy that call to the original.
This does it (note that I'm using Mocha, where this can be used to pass state between hooks and/or tests):
const Strapi = require("#strapi/strapi");
before(async function () {
// "Load" Strapi to set the global `strapi` variable.
await Strapi().load();
// "Listen" to register API routes.
await strapi.listen();
// Put aside Strapi's knex instance for later use in beforeEach and afterEach hooks.
this.knex = strapi.db.connection;
});
after(async function () {
// Put back the original knex instance so that Strapi can destroy it properly.
strapi.db.connection = this.knex;
await strapi.destroy();
});
beforeEach(async function () {
// Replace Strapi's Knex instance with a transaction.
strapi.db.connection = Object.assign(await this.knex.transaction(), {
getSchemaName: this.knex.getSchemaName.bind(this.knex),
});
});
afterEach(async function () {
strapi.db.connection.rollback();
});
it("Health-check is available.", async function () {
// Any changes made within here will get rolled back once the test finishes.
await request(strapi.server.httpServer).get("/_health").expect(204);
});
Lastly, it's worth noting that some Knex maintainers persistently discourage using transactions to isolate tests, so consider whether chasing this hacky setup is a good idea.

How to configure AWS CDK Account and Region to look up a VPC

I am learning the AWS CDK, and this is a problem I can't seem to figure out. JS/Node are not languages I use often, so if there is some obvious native thing that I am missing, please don't be too harsh. I'm trying to deploy a container to an existing VPC / new ECS Cluster. The following code isn't my whole script but is an important part. Hopefully, it gives the idea of what I'm trying to do.
//import everything first

const stack_name = "frontend";

class Frontend extends core.Stack {
  constructor(scope, id, props = {}) {
    super(scope, id);
    console.log("env variable " + JSON.stringify(props));
    const base_platform = new BasePlatform(this, id, props);

    //this bit doesn't matter, I'm just showing the functions I'm calling to set everything up
    const fargate_load_balanced_service = ecs_patterns.ApplicationLoadBalancedFargateService();
    this.fargate_load_balanced_service.taskDefinition.addToTaskRolePolicy();
    this.fargate_load_balanced_service.service.connections.allowTo();
    const autoscale = this.fargate_load_balanced_service.service.autoScaleTaskCount({});
    this.autoscale.scale_on_cpu_utilization();
  }
}

class BasePlatform extends core.Construct {
  constructor(scope, id, props = {}) {
    super(scope, id);
    this.environment_name = "frontend";
    console.log("environment variables " + JSON.stringify(process.env));

    //This bit is my problem child
    const vpc = ec2.Vpc.fromLookup(this, "VPC", {
      vpcId: 'vpc-##########'
    });

    //this bit doesn't matter, I'm just showing the functions I'm calling to set everything up
    const sd_namespace = service_discovery.PrivateDnsNamespace.from_private_dns_namespace_attributes();
    const ecs_cluster = ecs.Cluster.from_cluster_attributes();
    const services_sec_grp = ec2.SecurityGroup.from_security_group_id();
  }
}

const app = new core.App();
const _env = { account: process.env.CDK_DEFAULT_ACCOUNT, region: process.env.CDK_DEFAULT_REGION };
new Frontend(app, stack_name, { env: _env });
app.synth();
When I run CDK synth, it spits out:
Error: Cannot retrieve the value from context provider vpc-provider since the account/region is not specified at the stack level. Either configure "env" with explicit account and region when you define your stack or use the environment variables "CDK_DEFAULT_ACCOUNT" and "CDK_DEFAULT_REGION" to inherit environment information from the CLI (not recommended for production stacks)
But I don't know why. My usage here fits several other Stack Overflow answers to similar questions, it looks like the examples in the AWS docs, and when I console.log(process.env), it spits out the correct/expected values of CDK_DEFAULT_REGION and CDK_DEFAULT_ACCOUNT. When I log "env", it spits out the expected values as well.
So my question is, how do I configure my environment so ec2.Vpc.fromLookup knows my account info, or how do I pass the values properly to "env"?
As I understand it, you must specify an environment explicitly if you want to use environment specifics at synth time.
The AWS CDK distinguishes between not specifying the env property at all and specifying it using CDK_DEFAULT_ACCOUNT and CDK_DEFAULT_REGION. The former implies that the stack should synthesize an environment-agnostic template. Constructs that are defined in such a stack cannot use any information about their environment. For example, you can't write code like if (stack.region === 'us-east-1') or use framework facilities like Vpc.fromLookup (Python: from_lookup), which need to query your AWS account. These features do not work at all without an explicit environment specified; to use them, you must specify env.
If you want to share environment variables with the CLI, you can do it like this:
new MyDevStack(app, 'dev', {
  env: {
    account: process.env.CDK_DEFAULT_ACCOUNT,
    region: process.env.CDK_DEFAULT_REGION
  }
});
Pass the props with env to the parent construct constructor explicitly, as mentioned by Nick Cox:
class BasePlatform extends core.Construct {
  constructor(scope, id, props = {}) {
    super(scope, id, props);
Since I was not able to comment, I am posting my query here.
From the look of it, there is just a single stack, frontend. So I believe you can also try hard-coding the account ID and region in code and seeing if it works.
Also, I am curious what the output of this is:
console.log("environment variables " + JSON.stringify(process.env));
Replace super(scope, id) with super(scope, id, props);
The props need to be passed to super for the vpc-provider to use them.
The easiest way is to use the AWS CLI (aws configure).
You will need programmatic access for your user and to generate access keys from the AWS console.
AWS CDK documentation

Check if functions have been called in unit test

Hi, I'm trying to write some unit tests in Jest for a module I wrote, but I'm kind of stuck currently and need some advice on how to continue.
export const submitOrder = async (body, key) => {
  const clientRepo = new ClientRepository(db);
  const companyRepo = new CompanyRepository(db);

  const company = await getCompanyByKey(companyRepo, key);
  const client = await createClient(clientRepo, body);
  await addClientToCompany(companyRepo, client.id, company.id);

  // ... more things
};
I can easily test each function (getCompanyByKey, createClient & addClientToCompany) by passing down a mocked repository.
But I would also like to test the "flow" of the submitOrder function by checking whether my repository functions have been called. For that I would need the instance of each repository, which I don't instantiate until submitOrder runs.
I'd like something like this, which is similar to how I unit test my functions:
jest.mock('../repositories/ClientRepository');
jest.mock('../repositories/CompanyRepository');

test('should be able to submit an order', async () => {
  const apiKey = 'mocked-super-key';
  const body = getMockData();
  const result = await submitOrder(body, apiKey);

  expect(result).toMatchSnapshot();
  expect(CompanyRepository.findByKey).toHaveBeenCalled();
  expect(ClientRepository.create).toHaveBeenCalled();
  expect(CompanyRepository.addClient).toHaveBeenCalled();
});
Do you have any tips on how I can test whether my repositories have been called?
The problem you describe is one of the motivating factors behind dependency injection.
As a single example: your submitOrder() code uses new to directly instantiate a client repository of the specific implementation ClientRepository. Instead, it could declare that it has a dependency - it needs an object that implements the interface of a client repository. It could then allow for such an object to be supplied by the surrounding environment (a "dependency injection container" in buzzword-ese). Then during testing you would create and provide ("inject") a mock implementation instead of the real implementation.
This has the added benefit that if you ever have to be able to select between multiple "real" implementations, you're already set up to do that too.
There are many ways to achieve this. It can be as simple as a design pattern, or for a more complete solution you could use a dependency injection framework.
If you absolutely cannot refactor your code for this practice, then JavaScript is dynamic enough that you can probably cobble together a way to intercept the invocation of new and thereby simulate dependency injection.
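As a minimal sketch of the design-pattern route, reusing the names from the question (default parameters stand in for a full DI container):

// submitOrder declares its repository dependencies instead of constructing
// them; the defaults keep production callers unchanged, while tests inject mocks
export const submitOrder = async (
  body: any,
  key: string,
  clientRepo = new ClientRepository(db),
  companyRepo = new CompanyRepository(db),
) => {
  const company = await getCompanyByKey(companyRepo, key);
  const client = await createClient(clientRepo, body);
  await addClientToCompany(companyRepo, client.id, company.id);
  // ... more things
};

// in a test:
// await submitOrder(body, apiKey, mockClientRepo, mockCompanyRepo);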
You can pass a mock implementation factory as a second parameter to jest.mock, as described in the docs.
You can use this to mock out the methods that you want to check to have been called.
Try this:
jest.mock('../repositories/CompanyRepository', () => ({
  findByKey: jest.fn(),
  addClient: jest.fn()
}));

const mockCreate = jest.fn();
jest.mock('../repositories/ClientRepository', () => class {
  create(...args) {
    mockCreate(...args);
  }
});
test('should be able to submit an order', async () => {
  const apiKey = 'mocked-super-key';
  const body = getMockData();
  const result = await submitOrder(body, apiKey);

  expect(result).toMatchSnapshot();
  expect(CompanyRepository.findByKey).toHaveBeenCalled();
  expect(mockCreate).toHaveBeenCalled();
  expect(CompanyRepository.addClient).toHaveBeenCalled();
});
Since ClientRepository is instantiated with "new", we use a class definition in that case and pass in a mock function (mockCreate) that is called when the "create" method is invoked.

Where does one hold service instances in a react/redux application?

Suppose I am writing an application in Redux and I am tasked to add logging using a 3rd party library. Its API is as follows:
function createLogger(token) {
  // the logger has internal state!
  let logCount = 0;
  return {
    log(payload) {
      logCount++; // modify local state
      fetch('/someapi', { // ship payload to some API
        method: 'POST',
        body: payload
      });
    }
  };
}
I would then use the library something like this:
let logger = createLogger('xyz');
logger.log('foobar');
I definitely want to create the logger instance just once during application init. But then the question is: where do I store the logger instance?
First instinct is to put it somewhere in the store. But is that a good idea? As I have demonstrated in the code, the logger object is stateful; it stores a counter in its closure. I do not get a new instance like I would with an immutable object. As we know, state should only be modified via pure reducer functions.
Other possibilities are to create the instance somewhere in a Redux middleware closure, or just to create a global variable, which is obviously evil in terms of testability.
Is there a best practice for this (I would think) rather common scenario?
Since you are using ES6 modules, I would set up your logger as a module, export it, and import it wherever you plan to use it. I think logging from the actions is a solid plan, since it keeps the components unaware and doesn't pollute the store with side effects.
function createLogger(token) {
  // the logger has internal state!
  let logCount = 0;
  return {
    log(payload) {
      logCount++; // modify local state
      fetch('/someapi', { // ship payload to some API
        method: 'POST',
        body: payload
      });
    }
  };
}

const logger = createLogger('xyz');
export default logger;
Your action creators:

import logger from 'logger-module';
// ...
logger.log('somestuff');
Testing is still easily achievable by importing the logger and placing whatever spy/stub on its methods that you need to intercept.
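For example, with Jest (a sketch; someActionCreator is a hypothetical action creator that calls logger.log):

import logger from 'logger-module';

test('logs when the action runs', () => {
  // swap the real network call for a spy, just for this test
  const logSpy = jest.spyOn(logger, 'log').mockImplementation(() => {});

  someActionCreator();

  expect(logSpy).toHaveBeenCalledWith('somestuff');
  logSpy.mockRestore(); // put the real method back
});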
From the Redux documentation:
/**
 * Sends crash reports as state is updated and listeners are notified.
 */
const crashReporter = store => next => action => {
  try {
    return next(action)
  } catch (err) {
    console.error('Caught an exception!', err)
    Raven.captureException(err, {
      extra: {
        action,
        state: store.getState()
      }
    })
    throw err
  }
}
Raven being a third-party library.
If the library has its own state, then it shouldn't be an issue to use it in middleware (the state belongs to the library, not your app). If you're creating state for it for some reason, then that state should live in the Redux store, probably under store.logger or something.
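A sketch of that middleware option, reusing createLogger from the question (rootReducer is a placeholder for your app's reducer):

import { createStore, applyMiddleware } from 'redux';

// the stateful instance lives in this module's closure, not in the Redux store
const logger = createLogger('xyz');

const loggingMiddleware = (store: any) => (next: any) => (action: any) => {
  logger.log(JSON.stringify(action)); // the side effect stays out of the reducers
  return next(action);
};

const store = createStore(rootReducer, applyMiddleware(loggingMiddleware));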

Where should I initialize pg-promise

I just started to learn Node.js with Postgres and found the pg-promise package.
I read the docs and examples, but I don't understand where I should put the initialization code. I'm using Express and I have many routes.
Do I have to put the whole initialization (including the pg-monitor init) into every single file where I want to query the db, or do I need to include and initialize/configure it only in server.js?
If I initialize it only in server.js, what should I include in the other files where I need a db query?
In other words, it's not clear to me whether pg-promise and pg-monitor configuration/initialization is a global or a local action.
It's also unclear whether I need to create a db variable and end pgp for every single query:
var db = pgp(connection);
db.query(...).then(...).catch(...).finally(pgp.end);
You need to initialize the database connection only once. If it is to be shared between modules, then put it into its own module file, like this:
const initOptions = {
  // initialization options;
};

const pgp = require('pg-promise')(initOptions);

const cn = 'postgres://username:password@host:port/database';
const db = pgp(cn);

module.exports = {
  pgp, db
};
See supported Initialization Options.
UPDATE-1
And if you try creating more than one database object with the same connection details, the library will output a warning into the console:
WARNING: Creating a duplicate database object for the same connection.
at Object.<anonymous> (D:\NodeJS\tests\test2.js:14:6)
This points out that your database usage pattern is bad, i.e. you should share the database object, as shown above, not re-create it all over again. And since version 6.x it became critical, with each database object maintaining its own connection pool, so duplicating those will additionally result in poor connection usage.
Also, it is not necessary to export pgp, the initialized library instance. Instead, you can just do:
module.exports = db;
And if in some module you need to use the library's root, you can access it via property $config:
const db = require('../db'); // your db module
const pgp = db.$config.pgp; // the library's root after initialization
UPDATE-2
Some developers have been reporting (issue #175) that certain frameworks, like NextJS, manage to load modules in such a way that it breaks the singleton pattern, which results in the database module being loaded more than once and producing the duplicate-database warning, even though from the NodeJS point of view it should just work.
Below is a work-around for such integration issues, by forcing the singleton into the global scope, using Symbol. Let's create a reusable helper for creating singletons...
// generic singleton creator:
export function createSingleton<T>(name: string, create: () => T): T {
  const s = Symbol.for(name);
  let scope = (global as any)[s];
  if (!scope) {
    scope = {...create()};
    (global as any)[s] = scope;
  }
  return scope;
}
Using the helper above, you can modify your TypeScript database file into this:
import * as pgLib from 'pg-promise';

const pgp = pgLib(/* initialization options */);

interface IDatabaseScope {
  db: pgLib.IDatabase<any>;
  pgp: pgLib.IMain;
}

export function getDB(): IDatabaseScope {
  return createSingleton<IDatabaseScope>('my-app-db-space', () => {
    return {
      db: pgp('my-connect-string'),
      pgp
    };
  });
}
Then, in the beginning of any file that uses the database you can do this:
import {getDB} from './db';
const {db, pgp} = getDB();
This will ensure a persistent singleton pattern.
A "connection" in pgp is actually an auto-managed pool of multiple connections. Each time you make a request, a connection will be grabbed from the pool, opened up, used, then closed and returned to the pool. That's a big part of why vitaly-t makes such a big deal about only creating one instance of pgp for your whole app. The only reason to end your connection is if you are definitely done using the database, i.e. you are gracefully shutting down your app.
