Can cold starts be good in some cases? (Cloud Functions) - javascript

I am developing a project which will have 100+ cloud functions, so to make the work easier I am structuring all my functions like this:
/functions
  /src
    spotify/
      searchMusic.function.js
      ...
    auth/
      signUp.function.js
      ...
    ...
  index.js // Auto-exports every cloud function
In each file that ends with .function.js I export a unique cloud function. My question is: since I don't have more than one Google Cloud Function per file, should I use lazy imports/initializations?
For example, I know this is useful when a module contains two functions and one of them doesn't use a package required by the other:
let is_f1_admin_initialized = false; // `let`, since it is reassigned below

exports.f1 = functions.region... {
  const admin = require("firebase-admin");
  // Lazy initialization
  if (!is_f1_admin_initialized) {
    // Initialize the admin SDK
    ...
    is_f1_admin_initialized = true;
  }
}

exports.f2 = functions.region... {}
But in my situation, where I only have f1 in f1.function.js, will lazy imports/initializations reduce the cold start? Or would it be better to make all the imports in the global scope?
Thank you.
UPDATE
That's what I mean:
"use-strict";
const functions = require("firebase-functions");
const admin = require("firebase-admin");
// All my imports...
// The admin SDK can only be initialized once.
try {
const googleCloudServiceAccount = require("../../utils/json/googleCloudServiceAccount.json");
admin.initializeApp({
...
});
} catch (e) {}
// The unique cloud function in the module
exports.function = functions.region...
// All my helper functions...

But in my situation, where I only have f1 in f1.function.js, will lazy imports/initializations reduce the cold start?
No, it will make no difference.
In either case (running in either the global scope, or within the function), the code is in the critical path of serving the first request. Changing the scope doesn't change the time it takes for that code to run.
The only time that moving code out of the global scope will help is when you have multiple functions that do not share that code. Moving the execution of that code into the function simply ensures that other functions don't unnecessarily execute it.
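To illustrate that with a sketch (the function names and file layout here are hypothetical, not from the question): two functions deployed from the same file, where only one needs a heavy dependency. The lazy require keeps the other function's cold start lean, but it would gain nothing if the file contained only one function:

// twoFunctions.function.js (hypothetical example)
const functions = require("firebase-functions");

exports.light = functions.https.onRequest((req, res) => {
  // No heavy dependency here, so nothing extra loads on this function's cold start
  res.send("pong");
});

exports.heavy = functions.https.onRequest((req, res) => {
  // Lazily require firebase-admin so that `light` never pays for loading it
  const admin = require("firebase-admin");
  if (admin.apps.length === 0) {
    admin.initializeApp();
  }
  // ... use the admin SDK ...
  res.send("done");
});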


How to set environment variables like test environment for playwright-cucumber js

I've been using Playwright with Cucumber for e2e automation of a React web application. This repo has been my starting point, and it's been working out pretty well.
However, I'm looking for pointers on how to run these tests on different test environments, like development or QA, so that the target URLs and other params vary as per the environment passed. For example:
if (env === 'dev') {
  baseurl = dev_url;
} else {
  baseurl = qa_url;
}
The Cucumber documentation mentions the World parameter, and this issue looks similar; however, I'm skeptical of passing a different JSON for this task.
Can this be achieved only at a Cucumber level or is there a Playwright or Node way of doing this?
As you are already using Cucumber, you can define your world file like this:
First, segregate your properties files into properties-dev.json and properties-qa.json. The code below reads the properties file based on the env passed in the defaultOptions object and stores the entire properties file data on 'this'. Use 'this' in your hooks file and call this.keyNameForUrl to get the URL specific to the environment.
Note: 'this' is accessible only in the world and hooks files (see https://github.com/cucumber/cucumber-js/blob/master/docs/support_files/world.md). If you need this data in other files, create a separate class and declare public static variables in it; in the hooks' BeforeAll function, reassign the values from 'this' to the static variables in that class.
import { setWorldConstructor } from 'cucumber';
const fs = require('fs');

const defaultOptions = {
  env: 'qa'
};

function processOptions(options) {
  let envFile = 'properties-' + options.env + '.json';
  let environment = JSON.parse(fs.readFileSync(envFile));
  return Object.assign(options, environment);
}

function World(options) {
  // Merge the defaults, the matching properties file, and any passed-in options onto the world
  Object.assign(this, processOptions(Object.assign(defaultOptions, options)), options);
}

setWorldConstructor(World);
// https://github.com/cucumber/cucumber-js/blob/master/docs/support_files/world.md
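To show how the merged data is then consumed, here is a sketch of a step file (the baseUrl key and the page object are assumptions: it presumes properties-qa.json contains something like { "baseUrl": "https://qa.example.com" } and that a Playwright page was attached to the world in a Before hook):

// steps/navigation.steps.js (hypothetical)
const { Given } = require('cucumber');

Given('I open the home page', async function () {
  // `this` is the World instance, so keys read from the properties file are available here
  await this.page.goto(this.baseUrl);
});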

Correct place to declare many modules providing similar service in javascript

I am writing an interface which groups existing node modules providing the same service, e.g. geolocation. An app admin can set/choose only one module, which will provide the service for the entire app.
My question is: where should I put the require declarations?
a) At the beginning, all declarations together:
const _service1 = require('service1');
...
const _serviceN = require('serviceN');
or b) Within each case, one declaration at a time:

switch (serviceName) {
  case 'serviceOne':
    const _service1 = require('service1');
    ...
    break;
  ...
  case 'serviceN':
    const _serviceN = require('serviceN');
    ...
    break;
}
In the first case, all declarations run only once, at load time; in the second, a declaration runs each time the service is requested.
I have thought to use a workaround like:

let _serviceX; // declared once, outside the switch
...
case 'serviceX':
  if (!_serviceX) {
    _serviceX = require('serviceX');
  }
  ...
  break;
so that the require runs only once (the first time the service is requested), but I have seen this pattern used nowhere else, so I don't know if it is practically correct. Please advise. TIA
According to the Airbnb JavaScript style guide, you should always put your imports before non-import code.
See here for more information.
In JS, the most common style is to declare modules at the start of the file.
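As a sketch of how that advice could look here (the package names are placeholders): all requires sit at the top, and a plain object maps the configured service name to its module. Since Node caches modules after the first require, loading them all up front costs each module only one load per process:

// All requires at the top, per the style guide
const services = {
  serviceOne: require('service1'),
  serviceN: require('serviceN'),
};

function getService(serviceName) {
  const service = services[serviceName];
  if (!service) {
    throw new Error('Unknown service: ' + serviceName);
  }
  return service;
}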

Best approach to passing variables between multi-file Node.js modules?

I have a Node.js module that I have kept as a single file up to this point. It's getting rather large though, and has a lot of functionality in it that might be better separated into other modules. For example, separating out logging initialization and functionality into its own module.
My module has a lot of (I want to say "global" but not really) top-level variables that lots of different functions access, use and modify. If I separate out functionality into separate files/modules and require them into my primary module, what is the proper approach to passing those variables between the modules?
For example, with everything in one module/file, it's easy to do this:
const logger = (log, message) => { ... };

const makeRequestHandler = (url, filepath) => {
  ....
  logger.info('some message here')
  ....
}
So it's pretty easy to access top-level systems like the logger. But, if I decided to split my logger and makeRequestHandler into their own modules/files, how would I handle this?
let logger = require('./mylogger') // Custom module
let makeRequest = require('./makerequest') // Another custom module
makeRequest.handler(url, filepath, logger)
This would work, but it doesn't seem elegant or optimal. It would get even more weird if I have a lot of different variables that I needed to pass in:
makeRequest.handler(url, filepath, logger, profiler, reportingBuffer, compressionHandler)
I've also considered passing stuff into the modules when requiring:
let makeRequest = require('./makeRequest')(logger)
or better yet:
let makeRequest = require('./makeRequest')(this) // I can access all variables made in my primary/top-level module
Is there an approach here that is more proper and better/easier to maintain? Is the last one the best approach?
What about a global locator pattern, or the service locator/service provider pattern pointed out in the comments? You can have something like a service registry and include those services in any module you want to use them in.
Although I am not sure it is the best solution of all, it is easy to implement and feels like a neater solution than passing the this context around between modules.
// logger.js
const logger = (log, message) => { ... };
module.exports = logger;
Now, the app file is where you can initialize the logger and other service instances and register them in the global locator:
let logger = require('./mylogger') // Custom module

function init() {
  // Initialize and set the logger
  global.logger = new logger();
  ...
}
And this is how you can use it in makeRequest:
let logger = global.logger;

const makeRequestHandler = (url, filepath) => {
  ....
  logger.info('some message here')
  ....
}
What I feel is the problem with these solutions:

// Solution 1: As you pointed out yourself, this can get messy as the number of parameters increases, and it is not very readable or understandable.
let logger = require('./mylogger')
let makeRequest = require('./makerequest')
makeRequest.handler(url, filepath, logger)

// Solution 2: Passing around the `this` context is never a good idea, for keeping sensitive data independent and for scope isolation.
let makeRequest = require('./makeRequest')(this)
Note: this article explains some aspects of this solution in detail for your consideration. There are also some npm modules which provide these features, like Service Locator. HTH
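For concreteness, a minimal registry along those lines could look like this (a sketch; the file and method names are my own, not from any particular npm package):

// serviceLocator.js
const services = new Map();

module.exports = {
  register(name, instance) {
    services.set(name, instance);
  },
  get(name) {
    if (!services.has(name)) {
      throw new Error('Service not registered: ' + name);
    }
    return services.get(name);
  },
};

// app.js: register once at startup
const locator = require('./serviceLocator');
locator.register('logger', new Logger()); // Logger is whatever logger class you use

// makeRequest.js: look the service up where needed
const logger = require('./serviceLocator').get('logger');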

How to migrate legacy JS app to modules

I have a large (~15k LoC) JS app (namely a NetSuite app) written in the old-style, all-globals way. The app consists of 26 files, and the dependencies between them are totally unclear.
The goal is to gracefully refactor the app into smaller modules. By gracefully I mean not breaking/locking the app for a long time, but doing the refactoring in smaller chunks, with the app remaining usable after each chunk is completed.
An idea I have here is to concat all the JS files we have now into a single-file bundle. After that, some code could be extracted into modules, and the legacy code could start importing them. The modules & imports would be transpiled with webpack/whatever, while the legacy code remains in all-globals style. Finally, all this is packed into a single JS file and deployed.
My questions are:
Is there a better approach, maybe? This sounds like a typical problem.
Are there any tools available to support my approach?
I gave webpack a try and haven't managed to get what I want out of it. The export-loader and resolve-loader are not options because of the amount of methods/vars that need to be imported/exported.
Examples
The code currently looks like this:

function someGlobalFunction() {
  ...
}

var myVar = 'something';
// and the other 15k lines in 26 files like this
What I would ideally like to achieve is:

function define(...) { /* function to define a module */ }
function require(moduleName) { /* function to import a module */ }

// block with my refactored-out module definitions
define('module1', function () {
  // extracted, modularised code goes here
});
define('module2', function () {
  // extracted, modularised code goes here
});

// further down goes the legacy code, which can import the new modules
var myModule = require('myNewModule');

function myGlobalLegacyFunction() {
  // use myModule
}
I'm following an approach similar to that outlined here: https://zirho.github.io/2016/08/13/webpack-to-legacy/
In brief:
Assume that you can configure webpack to turn something like

export function myFunction() { ... }

into a file bundle.js that a browser understands. In webpack's entry point, you can then import everything from your module and assign it to the window object:
// using a namespace import to get everything exported from the file
import * as Utils from './utils'
// injecting every function exported from utils.js into the global scope (window)
Object.assign(window, Utils);
Then, in your html, make sure to include the webpack output before the existing code:
<script type="text/javascript" src="bundle.js"></script>
<script type="text/javascript" src="legacy.js"></script>
Your IDE should be able to help identify clients of a method as you bring it into a module. As you move a function from legacy.js to myNiceModule.js, check whether it still has clients that expect it globally; if it doesn't, it no longer needs to be globally available.
There is no good answer here so far, and it would be great if the person asking the question came back. I will pose a challenging answer: it cannot be done.
All module techniques end up breaking the sequential nature of execution of scripts in the document header.
All dynamically added scripts are loaded in parallel, and they do not wait for one another. Since you can be certain that almost all such horrible legacy JavaScript code depends on sequential execution, where each script can depend on the previous ones, loading those scripts dynamically can break it.
If you use some module approach (ES modules, require.js, or your own), you need to execute the code that depends on the loading having occurred inside a callback or Promise/then block. This destroys the implicit global context, so all these spaghetti coils of global functions and vars we find in legacy JavaScript files will not be defined in the global scope any more.
I have determined that only two tricks could allow a smooth transition:
Either some way to pause continuation of a script block until the import promise is resolved:
const promise = require("dep1.js", "dep2.js", "dep3.js");
await promise;
// legacy stuff follows
or some way to explicitly revert the scope of a block inside a function to the global scope:

with (window) {
  function foo() { return 123; }
  var bar = 543;
}
But neither wish was granted by the javascript fairy.
In fact, I read that even the await keyword essentially just packs the rest of the statements into a function to call when the promise is resolved:
async function() {
  ... aaa makes promise ...
  await promise;
  ... bbb ...
}

is just, I suppose, no different from

async function() {
  ... aaa makes promise ...
  promise.then(r => {
    ... bbb ...
  });
}
So this means the only way to fix this is to keep the legacy JavaScript statically in head/script elements and slowly move things into modules, while continuing to load them statically.
I am tinkering with my own module style:
(function(scope = {}) {
  var v1 = ...;
  function fn1() { ... }
  var v2 = ...;
  function fn2() { ... }
  // export the named items onto the given scope object
  return ['v1', 'fn1', 'v2', 'fn2']
    .reduce((r, n) => {
      r[n] = eval(n);
      return r;
    }, scope);
})(window)
By calling this "module" function with the window object, the exported items are put on it, just as legacy code would do.
I gleaned a lot of this by using knockout.js and working with the source readable file that has everything together but in such module function calls, ultimately all features are on the "ko" object.
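A possible usage note (my own illustration, not from the original post): passing a plain object instead of window keeps the exports namespaced rather than global:

const myModule = (function(scope = {}) {
  function fn1() { return 123; }
  // direct eval sees the local scope, so eval('fn1') resolves the function above
  return ['fn1'].reduce((r, n) => { r[n] = eval(n); return r; }, scope);
})({});

myModule.fn1(); // 123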
I hate using frameworks and "compilation". So, while I could quickly write something that generates the sequence of HTML tags to load the files in the correct order based on a topologically sorted dependency tree, I won't, because I do not want any "compilation" step, not even my own.
UPDATE: https://stackoverflow.com/a/33670019/7666635 gives the idea that we can just Object.assign(window, module), which is somewhat similar to my trick of passing the window object into the "module" function.

Difference between require('module')() and const mod = require('module') mod() in node/express

I have two files: server.js and db.js
server.js looks like this:

...
const app = express();

app.use('/db', db());

app.listen(3000, () => {
  console.log('Server started on port 3000')
});
...
and db.js like this:

...
function init() {
  const db = require('express-pouchdb')(PouchDB, {
    mode: 'minimumForPouchDB'
  });
  return db;
}
...
This works just fine, and I am able to reach the PouchDB HTTP API from my frontend. But before, I had const PouchDBExpress = require('pouchdb-express'); at the top of db.js, and the first line in init() looked like this: const db = PouchDBExpress(PouchDB, {. This gave an error in one of PouchDB's internal files saying cannot set property query on req which only has getters (paraphrasing).
So this made me copy the examples from pouchdb-server's GitHub examples, which require and invoke express-pouchdb directly, and everything worked fine. Is there an explanation for this? I'm glad it works now, but I'm sort of confused as to what could cause this.
The only difference between:
require('module')()
and
const mod = require('module');
mod();
is that in the second case, you retain a reference to the module exports object (perhaps for other uses) whereas in the first one you do not.
Both cases load the module and then call the exported object as a function. But, if the module export has other properties or other methods that you need access to then, obviously, you need to retain a reference to it as in the second option.
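For instance, using the modules from the question (a sketch; both forms load the same cached module, and the only difference is whether a reference is kept):

const PouchDB = require('pouchdb');

// keep a reference to the exported factory function, then call it
const expressPouchDB = require('express-pouchdb');
const db1 = expressPouchDB(PouchDB, { mode: 'minimumForPouchDB' });

// equivalent one-liner: call the export immediately, keeping no reference
const db2 = require('express-pouchdb')(PouchDB, { mode: 'minimumForPouchDB' });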
For us to comment in more detail about the code scenario that you said did not work, you will have to show us that exact code scenario. Describing what is different in words rather than showing the actual code makes it too hard to follow and impossible to spot anything else you may have inadvertently done wrong to cause your problem.
In require('module')(), you don't retain a reference of the module imported.
While in const mod = require('module'); mod(), you retain a reference and can use the same reference later in your code.
This problem might be due to some other reason, such as:
Are you using some other, global instance of the db, and your code works in the given case because you are making a local instance?
Some other code-dependent scenario.
Please provide more details.
