Import ipfs in TypeScript

Importing and initializing modules is generally simple in JavaScript/TypeScript using either require or import, but I'm having trouble running the basic example from the JS IPFS website to initialize ipfs.
If I follow the general instructions I get an error: Module parse failed: Cannot use keyword 'await' outside an async function (6:13)
This is the critical code:
const IPFS = require('ipfs-core');
const ipfs = await IPFS.create();
If I follow the suggestion to place the ipfs creation in an async function, I just delay the inevitable: calling such a function twice produces Unhandled Rejection (LockExistsError): Lock already being held for file: ipfs/repo.lock. I could hack around it by keeping a module-global variable initialized to null and testing whether ipfs has been created yet, but that would still be a hack.
How should I implement or refactor const ipfs = await IPFS.create(); without error?

Your Node version is probably older than 14 and doesn't support top-level await, so the await call has to be inside an async function. You can do something like:
const IPFS = require('ipfs')

async function main() {
  const ipfs = await IPFS.create()
  /* Your code here */
}

// and now you can tell node to run your async main function...
main()
Check https://v8.dev/features/top-level-await for more info about it in the V8 engine. I also found this post about the Node 14 support for it: https://pprathameshmore.medium.com/top-level-await-support-in-node-js-v14-3-0-8af4f4a4d478
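For reference, on Node 14.8+ top-level await works without the wrapper, but only in ES modules. A minimal sketch, assuming an .mjs file (or "type": "module" in package.json) and an ipfs-core version that exposes a named create export:
// index.mjs — top-level await is allowed because this file is an ES module
import { create } from 'ipfs-core'

const ipfs = await create()
/* Your code here */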

In my case, it was due to initializing IPFS unnecessarily many times in a row. After making sure the IPFS instance is only initialized once when my app starts, I was able to resolve the error.
// Module-level state so IPFS.create() only ever runs once
let ready = false
let ipfs
async function getIpfs() {
  if (!ready) {
    ipfs = await IPFS.create()
    ready = true
  }
  return ipfs
}
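Note that with a plain flag, two calls that arrive before the first create() resolves can still race and both try to create the node. A common variant is to memoize the promise instead of the instance; a sketch:
let ipfsPromise = null

function getIpfs() {
  // Every caller shares the same pending (or resolved) promise,
  // so IPFS.create() is only ever invoked once
  if (!ipfsPromise) {
    ipfsPromise = IPFS.create()
  }
  return ipfsPromise
}

// Usage: const ipfs = await getIpfs()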

In my case, user input goes to ipfs, and on each additional upload the "ipfs/repo.lock" error kept coming up.
After some research on the ipfs wiki, it appears this conflicts with how ipfs actually works. Randomizing the repo name is a very rough patch in this case:
const node = await IPFS.create({ repo: "ok" + Math.random() });

Related

GCP Cloud Functions Gen 2 - Cannot call through NodeJs or gcloud

I created a gen 2 cloud function in my project "project-x" with all the defaults and the Allow unauthenticated permission:
const functions = require('@google-cloud/functions-framework');
functions.http('helloHttp', (req, res) => {
  res.send(`Hello ${req.query.name || req.body.name || 'World'}!`);
});
This generated a URL for this function, e.g. https://my-function-bvskvwq11c-uc.a.run.app, which when I call unauthenticated (or visit on the browser) it works. I see the response.
Now here's the problem...
A. Using the npm package @google-cloud/functions I tried to call this endpoint with the following:
await functionsClient.callFunction({
  name: 'my-function',
})
This gives me a weird error of the format:
7 PERMISSION_DENIED: Cloud Functions API has not been used in project ********* before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/cloudfunctions.googleapis.com/overview?project=********* then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.
I say this is a weird error because the project ID provided in the error is not the ID of my project on GCP, and when I visit the link provided it says I do not have permissions to view this project.
Then I tried to list my functions by doing the following, but I get back an empty list.
const [functions] = await functionsClient.listFunctions({
  parent: `projects/project-x/locations/us-central1`,
});
// functions = [];
B. I thought to try gcloud this time, so first I listed all functions for my project:
gcloud functions list
This actually returned the correct function:
NAME STATE TRIGGER REGION ENVIRONMENT
my-function ACTIVE HTTP Trigger us-central1 2nd gen
I'm like cool! Let's try to call it now. So I run:
gcloud functions call my-function
And I get the following back:
ResponseError: status=[404], code=[Ok], message=[Function my-function in region us-central1 in project project-x does not exist]
Can someone please shed some light on all of this? Listing through the npm package yields different results than the gcloud command, and neither of them is able to call the function: one gives 404 and the other permission denied. What would be the best approach?
Answering my own question and summarizing.
To call a gen2 function from the gcloud command, use the --gen2 option, as stated on the gcloud documentation:
gcloud functions call my-function --gen2
To call gen2 functions from NodeJs, do not use the @google-cloud/functions package. As stated in Google's Rate Limits documentation:
The CALL API only applies to Cloud Functions (1st gen)... Please keep in mind that this API is meant for testing via Cloud Console or gcloud functions call CLI, and it cannot handle heavy traffic.
Instead, use the google-auth-library package, which also has an example of how to call a function:
const {GoogleAuth} = require('google-auth-library');

async function main() {
  // The target audience is the function's own URL
  const url = 'https://cloud-run-1234-uc.a.run.app';
  const auth = new GoogleAuth();
  const client = await auth.getIdTokenClient(url);
  const res = await client.request({url});
  console.log(res.data);
}

main().catch(console.error);

Mocha.js - How to save a global variable?

I'm working with Mocha.js for testing in a Node.js / Express.js / Firebase project.
I need a token from Firebase to access the API endpoints. I have a before hook in all my files, but after about 250 tests, probably because the authentication endpoint is called so many times, I'm getting rate limited by Firebase.
I want to get the token once and use it in all my tests.
The tests are spread across different files; I have an index.js that requires them all.
I'm aware of Root Level Hooks, but how can I save the token and use it in all my separate files?
Thanks!
You can create a function that gets the token, call it, and only then create your test suites:
function getToken(callback) {
  // Fetch the Firebase token here, then invoke callback(token)
}

// define tests
function allTests(token) {
  describe('my suite', function () {
    it('does something with the token', function () {
      // use `token` here
    })
  });
}

// start all
getToken(function (token) {
  allTests(token);
});
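One caveat: Mocha collects suites when the spec files are first loaded, so describe blocks registered later from an async callback need a delayed root suite. With Mocha's --delay flag you get a global run() to call once everything is registered, so the final call above would become:
// Run with: mocha --delay index.spec.js
getToken(function (token) {
  allTests(token);
  run(); // global provided by Mocha when --delay is used
});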
I managed to solve it myself. If anyone needs an answer on how to approach it, take a look at this.
We write our unit tests in multiple files and combine them in an index.spec.js that we execute for testing ($ mocha index.spec.js).
I created a utility file that looks like this:
let token;

// Kick off the token fetch once when this module is first required
(() => { token = getYourToken() })()

module.exports = {
  getToken: () => {
    return new Promise((resolve) => {
      // Poll until the token has been set, then resolve with it
      const interval = setInterval(() => {
        if (token) {
          clearInterval(interval);
          resolve(token);
        }
      }, 100);
    });
  }
};
Basically, it's a singleton. Requiring this file from index.spec.js executes getYourToken() once (add your own logic for getting the token there), stores the result in a variable, and exports a getter for it.
In the export I use an interval because my current code is not using promises; use whatever fits your setup best, but interval + Promise worked for me.
This way I require this file in my tests and reuse the token fetched at the beginning, avoiding rate limiting and any issues with Firebase.
Create a JSON file in your test root directory.
Import the file.
Append a token property with the token value.
Then import it anywhere to access the token property (a sketch of this follows below).
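A minimal sketch of that idea: it works because Node caches require()d JSON files, so every test file that requires the same file gets the same object back. The file name shared.json and the fetchToken() helper are placeholders:
// shared.json starts out as an empty object: {}

// In a root-level before hook (e.g. in index.spec.js):
const shared = require('./shared.json');

before(async function () {
  shared.token = await fetchToken(); // your Firebase auth call goes here
});

// In any other test file:
// const shared = require('./shared.json');
// ...read shared.token inside your it() blocks (it is only set
// once the root before hook has run)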

Importing "#daily-co/daily-js" into SvelteKit app throws "global is not defined" error

What I tried:
I tried to work around it via if (browser), more specifically if (!browser) { let DailyIframe = await import('@daily-co/daily-js') } in the load function inside <script context="module">, so the code is only executed on the server, and then passed the result as a prop to a component. However, although it worked on the server, the local dev environment re-runs the load function (which has to return an empty prop since it never imported anything) and overrides DailyIframe's value (might be a bug with Vite/SvelteKit).
I tried to import the library in an endpoint, e.g. api.json.js, instead, which is always executed on the server. However, it has to return JSON, and I can't pass an entire library variable through it.
After research
It seems like a combination of problems from Vite, SvelteKit and certain libraries where global is undefined (see SvelteKit With MongoDB ReferenceError: global is not defined).
But I cannot use that solution of putting it in an endpoint, because I need the DailyIframe and the mic audio stream from the client to create a video conference room.
Also, why do certain libraries like Daily (and, judging by other related Stack Overflow posts, MongoDB) throw this error in the first place, while other libraries are safe to use?
Any suggestion is appreciated!
Why?
Vite doesn't include shims for Node built-in variables such as global.
Read these for more background:
https://github.com/vitejs/vite/issues/728
https://github.com/angular/angular-cli/issues/9827#issuecomment-369578814
In index.html add:
<script>
var global = global || window;
</script>
Then, for example, in App.svelte:
<script>
  import { onMount } from 'svelte'
  import DailyIframe from '@daily-co/daily-js'

  onMount(async () => {
    let callObject = DailyIframe.createFrame()
    const stream = await navigator.mediaDevices.getUserMedia({ audio: true })
    let recorder = new MediaRecorder(stream)
    recorder.start()
  })
</script>
👉 Demo
https://stackblitz.com/edit/sveltekit-1yn6pz?devtoolsheight=33
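An alternative to patching index.html is Vite's define option, which replaces references to global at build time; a sketch, assuming your setup reads a standalone vite.config.js (older SvelteKit versions nest these options inside svelte.config.js instead):
// vite.config.js — substitute `global` with the standard globalThis
export default {
  define: {
    global: 'globalThis'
  }
}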

How to unit test a function that is calling other functions?

I am testing my REST API with Jest for the first time and I am having a hard time unit testing the controllers.
How should I go about testing a function that contains other function calls (npm modules as well as other controllers)? Here's pseudocode (I've tried mocking but can't seem to get it right):
async insertUser(uid, userObject) {
  // Check user role and permissions via another controller
  const isAllowed = await someotherController.checkPermissions(uid);
  // Hash the password using an npm module
  const pass = password.hash(userObject.password);
  // Persist the user
  const user = new User(userObject);
  user.save();
}
So basically, how do I test a function that contains all these different calls?
I have written tests for simple functions and they went fine, but I am stuck on these.
I would go with https://sinonjs.org/ and mock someotherController. Be careful with user.save(): it looks like you use some kind of persistence here. In case you use Mongoose, you should have a look at https://github.com/Mockgoose/Mockgoose.
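Since the question uses Jest, a rough equivalent with Jest's own module mocking could look like this sketch; every require path below (./userController, ./someotherController, ./password, ./User) is an assumption about how the project is laid out:
// userController.test.js — a sketch, paths are assumptions
jest.mock('./someotherController'); // permission check
jest.mock('./password');            // hashing helper
jest.mock('./User');                // persistence model

const someotherController = require('./someotherController');
const userController = require('./userController');

test('insertUser checks permissions for the given uid', async () => {
  // Stub the permission check to resolve with true
  someotherController.checkPermissions.mockResolvedValue(true);

  await userController.insertUser('uid-123', { name: 'Jane', password: 'secret' });

  expect(someotherController.checkPermissions).toHaveBeenCalledWith('uid-123');
});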

Node.js listen for module load

With RequireJS on the front-end, we can listen to see when modules get loaded into the runtime module cache using:
requirejs.onResourceLoad = function (context, map, depArray) {
  console.log('onResourceLoad>>>', 'map.id:', map.id, 'context:', context);
};
Can we do this with Node.js somehow? It would be useful for debugging, especially when servers load different files (or load them in a different order) based on configuration.
I assume this might be documented in
https://nodejs.org/api/modules.html
but I am not seeing anything.
If you look at the source code for require(), you will find this:
Module._load = function(request, parent, isMain) {
  if (parent) {
    debug('Module._load REQUEST %s parent: %s', request, parent.id);
  }
This shows that you can leverage the debug() call to get the information you need. Looking further, you will notice that the logger is set up with util.debuglog('module'). This means that you need to run your application with the NODE_DEBUG environment variable set to module. You can do that from the console like this:
NODE_DEBUG=module node main.js
This will log what you are looking for.
I'm not aware of a documented callback API for the purpose of module load callbacks (although a logging mechanism for module loading appears to exist).
Here's a quick workaround to the apparent lack of such a callback, by monkeypatching Module._load:
const Module = require('module');
const originalModuleLoad = Module._load;

Module._load = function() {
  // Call the original loader, log the arguments, then pass the result through
  const loadedModule = originalModuleLoad.apply(this, arguments);
  console.log("Loading with arguments " + JSON.stringify(arguments));
  return loadedModule;
}
I executed the above code in a REPL and then did require('assert'). Lo and behold:
> require('assert')
Loading with arguments {"0":"assert","1":{"id":"<repl>","exports":{},"filename":null,"loaded":false,"children":[],"paths":["/Users/mz2/Projects/manuscripts-endnote-promo/repl/node_modules","/Users/mz2/Projects/manuscripts-endnote-promo/node_modules","/Users/mz2/Projects/node_modules","/Users/mz2/node_modules","/Users/node_modules","/Users/mz2/.nvm-fish/v6.1.0/lib/node_modules","/Users/mz2/.node_modules","/Users/mz2/.node_libraries","/Users/mz2/.nvm-fish/v6.1.0/lib/node"]},"2":false}
Please don't use code like the above for anything but debugging purposes.
Because Node.js modules are imported (required) synchronously, simply having the require statement means the module is imported.
While RequireJS can import modules asynchronously, so event listening is an important feature there, native require in Node.js leaves that necessity out. This way, as you probably know:
const someModule = require('some-module') // any module you need
// You can use the module here, async or sync.
To add to that, not only is require synchronous, but in order to use a module it has to be explicitly required in the file where it's used. This can be bypassed in several ways, but best practice is to require a module in every file where you use it.
For specific modules which require async initialization, either the module should provide an event, or you can wrap the init function using a promise or a callback. For example, using a promise:
const someModule = require('some-module') // a module that exposes an init function

// Create a promise that initializes the module inside it:
const initialized = new Promise((resolve, reject) => {
  // Init the module inside the promise:
  someModule.init((error) => {
    if (error) {
      return reject(error)
    }
    // Resolve indicates a successful init:
    resolve()
  })
})

// Now, with init wrapped, proceed when done:
initialized
  .then(() => {
    // Module is initialized, do what you need.
  })
  .catch(error => {
    // Handle init error.
  })
