Can't export wsEndpoint created from puppeteer browser - javascript

I am trying to open a puppeteer browser upon startup and then exporting the wsEndpoint so that I may use the link to connect to the browser rather than opening a new browser every time I call the function.
Here is the code snippet in the file app.js that is the entry point for node.
const puppeteer = require("puppeteer")

const mainFunction = async () => {
  const browser = await puppeteer.launch()
  const wsEndpoint = browser.wsEndpoint()
  return wsEndpoint
}

mainFunction().then(async endpoint => {
  console.log(endpoint)
  module.exports = endpoint
})
Upon startup, the console log above prints a link, which I then export.
And here is the code snippet in the utility file equities.js
const puppeteer = require("puppeteer")
const endpoint = require("../../app.js")
module.exports = async (symbol) => {
  console.log(endpoint)
  const browser = await puppeteer.connect({connectWSEndpoint: endpoint})
}
Every time I call the function, the console log prints only an empty object, meaning that the export in app.js failed for some reason. I tried to google a few things and tried different ways of exporting, but none seem to work. Can someone help guide me? Thank you so much in advance.

A few things here seem amiss to me -- this code feels like it wasn't tested along the way, leading to multiple points of failure. Try to take smaller steps so you can isolate problems instead of accumulating them.
For starters, the mainFunction code abandons the browser object, creating a leaked subprocess that can never be closed.
I'd return or store the browser variable along with the endpoint so the client code can clean it up later. Or just return the browser and let the client pull the endpoint out of it if they want, as well as manage and close the resource.
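Here's a sketch of that shape, with a stub launch() standing in for puppeteer.launch() (the endpoint string is made up) so the cleanup path is visible:

```javascript
// Sketch: return the browser alongside the endpoint so the caller can
// close it. launch() is a stand-in for puppeteer.launch().
const launch = async () => ({
  wsEndpoint: () => "ws://127.0.0.1:9222/devtools/browser/abc",
  close: async () => {},
});

const mainFunction = async () => {
  const browser = await launch();
  return { browser, endpoint: browser.wsEndpoint() };
};

mainFunction().then(async ({ browser, endpoint }) => {
  console.log(endpoint);
  await browser.close(); // the subprocess can now be cleaned up
});
```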
Next, the export code:
mainFunction().then(async endpoint => {
  console.log(endpoint)
  module.exports = endpoint
})
I don't understand the motivation for this extra then wrapper that receives an async resolution function that never uses await. You may think Node awaits all of this code, then sets the module.exports value before the client file's require runs synchronously. That's not the case, as a simpler piece of code demonstrates:
app.js (in the same folder throughout this post for convenience):
const mainFunction = async () => 42;

mainFunction().then(async endpoint => {
  console.log("endpoint:", endpoint)
  module.exports = endpoint
})
index.js:
const endpoint = require("./app");
console.log("imported:", endpoint);
node index gives me:
imported: {}
endpoint: 42
The promise resolved after the require, which synchronously brought in the default blank object module.exports -- probably not what you expected.
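The race can even be reproduced in a single file by simulating the module object, which makes the ordering obvious (moduleShim is a hand-rolled stand-in for Node's real module object):

```javascript
// Simulating the require/then race: the "client" reads module.exports
// before the .then callback ever runs.
const moduleShim = { exports: {} };              // a fresh module.exports

// "app.js": the promise resolves, then reassigns exports -- too late
const pending = Promise.resolve(42).then(endpoint => {
  moduleShim.exports = endpoint;
});

// "index.js": require() is synchronous and runs first
const imported = moduleShim.exports;
console.log("imported:", imported);              // imported: {}

pending.then(() => console.log("endpoint:", moduleShim.exports)); // endpoint: 42
```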
If you have async code, it has to stay async forever, including exports and imports. Try exporting the promise directly, then awaiting it in the client:
app.js:
const mainFunction = async () => 42;
module.exports = mainFunction;
index.js:
const getEndpoint = require("./app");
getEndpoint().then(endpoint => console.log("imported:", endpoint));
Running node index gives me: imported: 42.
The client code in equities.js looks more reasonable because it exports a promise synchronously, but it's going to have to await the endpoint promise it imported anywhere it uses it.
Also, Puppeteer throws on puppeteer.connect({connectWSEndpoint: endpoint}), Error: Exactly one of browserWSEndpoint, browserURL or transport must be passed to puppeteer.connect. I'll leave that up to you to work out based on your goals.
Here's a rewrite sketch that fixes the promise problems, but is only a proof of concept which will need tweaks to do whatever you're trying to do:
app.js:
const puppeteer = require("puppeteer");

const browserPromise = puppeteer.launch();
const endpointPromise = browserPromise.then(browser => browser.wsEndpoint());

module.exports = {browserPromise, endpointPromise};
equities.js:
const puppeteer = require("puppeteer");
const {browserPromise, endpointPromise} = require("./app");
module.exports = async symbol => {
  const endpoint = await endpointPromise;
  console.log(endpoint);
  //const browser = await puppeteer.connect({connectWSEndpoint: endpoint}) // FIXME
  const browser = await browserPromise;
  await browser.close();
};
index.js:
const equitiesFn = require("./equities");
(async () => {
  await equitiesFn();
})();
Run node index and you should see the ws endpoint printed.
Note that you can wrap the exported promises in functions or as part of an object, which is a layer of abstraction more typical for the interface of a library. But this doesn't change the fundamental asynchrony: the client will call the exported functions and await the endpoint and/or browser promises through that extra layer of indirection,
require("./app").getBrowser().then(browser => /* */);
versus
require("./app").browserPromise.then(browser => /* */);
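If you go with the function form, a lazy, memoized wrapper also guarantees only one launch no matter how many callers there are. A sketch, with a counting async factory standing in for puppeteer.launch():

```javascript
// A lazy, memoized getBrowser(): the first call creates the promise, every
// later call reuses it. The factory here is a stand-in for puppeteer.launch().
function memoizeAsync(factory) {
  let cached;                        // the single shared promise
  return () => {
    if (!cached) cached = factory(); // create on first call only
    return cached;
  };
}

let launches = 0;
const getBrowser = memoizeAsync(async () => {
  launches += 1;
  return { wsEndpoint: () => "ws://stub-endpoint" };
});

const done = Promise.all([getBrowser(), getBrowser()]).then(() => {
  console.log(launches); // 1 -- only one launch despite two callers
});
```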
If you don't want to expose the browser object, that's fine, but I'd suggest exposing a function that closes the underlying browser so you can get a clean exit, e.g.
app.js:
const puppeteer = require("puppeteer");
const browserPromise = puppeteer.launch();
const closeBrowser = () => browserPromise.then(browser => browser.close());
module.exports = {closeBrowser};
index.js:
require("./app")
  .closeBrowser()
  .then(() => console.log("closed"));

Related

Electron await function and access to returned data

I've seen similar questions asked, but still having trouble grasping the concept of Async/await functions and access to main.js within an Electron app.
I have an Electron app with the following files:
main.js ( some of the un-related details missing here )
const { app, BrowserWindow, ipcMain } = require('electron')
const path = require('path')
const https = require('https')
const axios = require('axios')
const url = require('url')

// Handle creating/removing shortcuts on Windows when installing/uninstalling.
if (require('electron-squirrel-startup')) { // eslint-disable-line global-require
  app.quit();
}

const createWindow = () => {
  // Create the browser window.
  const mainWindow = new BrowserWindow({
    webPreferences: {
      contextIsolation: true,
      preload: MAIN_WINDOW_PRELOAD_WEBPACK_ENTRY,
      preload: path.join(__dirname, 'preload.js')
    },
    width: 1920,
    height: 1080,
  });

  ipcMain.handle("doSomethingAxios", async () => {
    const response = await axios.get('http://www.boredapi.com/api/activity/');
    console.log(response.data);
    return response.data;
  });
};
preload.js
const { contextBridge, icpMain, ipcRenderer } = require('electron')

let indexBridge = {
  doSomethingAxios: async () => {
    var result = await ipcRenderer.invoke("doSomethingAxios");
    var whattodo = document.getElementById("whattodo");
    whattodo.innerText = result.activity; // This works.
    // Can't access Jquery or other libraries here.
  }
}

contextBridge.exposeInMainWorld("indexBridge", indexBridge);
renderer.js
$("#callApi").click(function(){
  window.indexBridge.doSomethingAxios();
  // Can access Jquery libraries here, but can only return promise from this call.
});
index.html
<button id="callApi">I am bored</button>
<div id="whattodo"></div>
This works, and I am able to call the function and populate the results in the div tags within my index.html. But where I get confused is if I want to perform some post-processing of the data using jQuery.
In my preload.js, I can do some manipulation of the data with JS, but what if I wanted to use another JS library to modify the data?
For example, let's say that I wanted to use Jquery to modify the data after it is returned. As far as I can tell and have read, I can't include Jquery in my preload.js file. So, it seems all I can do is return the data to the preload.js, and only return a promise to the renderer.js.
Another example, in case that's not clear.
Sometimes I like to use the datatables plugin.
After data is populated into the table, with datatables, you need to call the reload() function to reload the table. In a normal web application, I would make an ajax call to a server side script, get the data, then return an ajax response to populate the table and call the reload function.
I guess I'm just hung up on how to make the below function like an ajax call, where I call a function on the server side, return the data to the calling function, and then have access to JS libraries to manipulate the data. Right now, it seems that I am only able to return data to preload.js, and if I try to return from preload.js to renderer.js, I can only return a promise, and not the actual data.
Any ideas of what general concept I am missing here?
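The missing concept is usually that the promise is the data, one await later: if the preload function returns the result instead of writing to the DOM itself, renderer.js can await it and then hand the plain data to jQuery or DataTables. A sketch, with a stub (and a made-up activity string) standing in for window.indexBridge.doSomethingAxios:

```javascript
// If the bridge function *returns* the data rather than touching the DOM,
// the renderer awaits it and can then use any renderer-side library.
// doSomethingAxios is a stub for window.indexBridge.doSomethingAxios.
const doSomethingAxios = async () => ({ activity: "Learn Express.js" });

const handled = (async () => {
  const data = await doSomethingAxios(); // actual data, not a promise
  console.log(data.activity);            // jQuery/DataTables can use it here
  return data;
})();
```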

Import ipfs in TypeScript

Importing and initializing modules is generally simple in JavaScript/TypeScript using either require or import. I'm having trouble running the basic example from the JS IPFS website to initialize ipfs.
If I follow the general instructions I get an error: Module parse failed: Cannot use keyword 'await' outside an async function (6:13)
This is the critical code:
const IPFS = require('ipfs-core');
const ipfs = await IPFS.create();
If I follow the suggestion to place the ipfs creation in an async function, I just delay the inevitable: if I call such a function twice, I get Unhandled Rejection (LockExistsError): Lock already being held for file: ipfs/repo.lock. It seems I could create a hack that tests whether ipfs is created or not, initializing it as a module-global null, but that would still be a hack.
How should I implement or refactor const ipfs = await IPFS.create(); without error?
Probably your Node version is prior to version 14 and doesn't support top-level await. You have to be in the context of an async function. You can do something like:
const IPFS = require('ipfs')

async function main() {
  const ipfs = await IPFS.create()
  /* Your code here */
}

// and now you can tell node to run your async main function...
main()
Check https://v8.dev/features/top-level-await for more info about it in the v8 engine. I also found this post about Node 14 support for it: https://pprathameshmore.medium.com/top-level-await-support-in-node-js-v14-3-0-8af4f4a4d478
In my case, it was due to me initializing IPFS unnecessarily many times in a row. After making sure the IPFS instance is only initialized once when my app starts, I was able to resolve the error.
let ready = false

if (!ready) {
  const ipfs = await IPFS.create()
  ready = true
}
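A module-level promise guard makes that "only once" idea hold even for concurrent callers, because the promise itself (not the instance) is cached. A sketch, with a counting stub in place of IPFS.create():

```javascript
// Cache the promise so even two overlapping calls share one create().
// create() is a counting stand-in for IPFS.create().
let createCalls = 0;
const create = async () => { createCalls += 1; return { id: "node" }; };

let ipfsPromise = null;
function getIpfs() {
  if (!ipfsPromise) ipfsPromise = create(); // first caller wins
  return ipfsPromise;
}

const ready = Promise.all([getIpfs(), getIpfs()]).then(([a, b]) => {
  console.log(createCalls); // 1 -- no second repo lock attempt
  console.log(a === b);     // true -- the same instance is shared
});
```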
In my case the user input goes to ipfs, and on each additional upload the "ipfs/repo.lock" error kept coming up.
After some research on the ipfs wiki, it appears that this conflicts with how ipfs actually works. Randomizing the repo name is a very rough patch in this case:
const node = await IPFS.create({ repo: "ok" + Math.random() });

how to create new ethereum/solidity contract for each test in javascript/truffle

background
I have written an ethereum smart-contract in the Solidity language. In order to test things, I can run a local node using Ganache and deploy my contract on it using truffle migrate.
requirements
I want to test my contract using JavaScript. I want to create a new instance of my contract for each test.
what i've tried
I created a test file tests/test.js in my project:
const expect = require('chai').expect
const Round = artifacts.require('Round')

contract('pledgersLength1', async function(accounts) {
  it('1 pledger', async function() {
    let r = await Round.deployed()
    await r.pledge(5)
    let len = (await r.pledgersLength()).toNumber()
    expect(len).to.equal(1)
  })
})

contract('pledgersLength2', async function(accounts) {
  it('2 pledgers', async function() {
    let r = await Round.deployed()
    await r.pledge(5)
    await r.pledge(6)
    let len = (await r.pledgersLength()).toNumber()
    expect(len).to.equal(2)
  })
})
I run it with truffle test. It's basically Mocha, but truffle defines artifacts for you with a JavaScript connection to the smart contracts.
The truffle contract function is almost the same as Mocha's describe function, with a small change that I don't understand! I assumed that contract would make my contract new each time. It doesn't. Perhaps I can use something like new Round() instead of Round.deployed(), but I just don't know how.
The solution does not have to use truffle.
Please note that .new and .deployed are not the same. See what I've found here.
Follow this example and it should solve your problem:
// Path of this file: ./test/SimpleStorage.js
var simpleStorage = artifacts.require("./SimpleStorage.sol");

contract('SimpleStorage', function(accounts) {
  var contract_instance;

  before(async function() {
    contract_instance = await simpleStorage.new();
  });

  it("owner is the first account", async function(){
    var owner = await contract_instance.owner.call();
    expect(owner).to.equal(accounts[0]);
  });
});
The .new keyword will deploy an instance of your contract in a new context.
But .deployed will use the contract that you deployed earlier, i.e. when you ran the truffle migrate command.
In the context of unit tests, it's better to use .new so that you always start with a fresh contract.
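The difference is easy to see with a plain-JS analogy, where a Round class stands in for the compiled contract artifact:

```javascript
// .deployed() hands back one shared instance; .new() a fresh one per call.
class Round {
  constructor() { this.pledgers = []; }
  pledge(n) { this.pledgers.push(n); }
  pledgersLength() { return this.pledgers.length; }
}

const migrated = new Round();          // created once, like truffle migrate
const deployed = async () => migrated; // .deployed(): same instance every time
const fresh = async () => new Round(); // .new(): isolated state per call

const demo = (async () => {
  (await deployed()).pledge(5);
  (await deployed()).pledge(6);
  console.log((await deployed()).pledgersLength()); // 2 -- state accumulated

  const r = await fresh();
  r.pledge(5);
  console.log(r.pledgersLength()); // 1 -- isolated
  return migrated.pledgersLength();
})();
```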

Getting around an async/await issue

I am creating a simple node script to learn the functionality of Cosmos DB. I want a way to avoid providing the following at the top of every async function (yes, I know I could chain the async calls, but that still means I have to use a new db instance at the top of every function). So, I want to do something like this:
const {database} = await client.databases.createIfNotExists({id: databaseId});
const {container} = await database.containers.createIfNotExists({id: containerId});
With that said, I've bumped my head on this for a few hours and can't find a way to create one database and one container for all my functions to share. The idea (but not the implementation, because it doesn't work) is to do something like this:
getConnections = async () => {
  const {database} = await client.databases.createIfNotExists({id: databaseId});
  const {container} = await database.containers.createIfNotExists({id: containerId});

  let connections = {};
  connections.db = database;
  connections.container = container;
  return connections;
};
But since the getConnections method is async (which it must be, because the methods that use it are, too), the function doesn't necessarily finish before the first insert is made in another function, thereby causing an exception.
Has anyone found a way to centralize these objects so I don't have to declare them in each async function of my app?
It sounds like you need to get these connections before the app does anything else. So why not simply make the loading of your app use async/await too?
async function init() {
  const connections = await getConnections();
  const app = initializeTheRestOfYourApp(connections); // now safe to do inserts
}

init();
This pretty much works now. Not sure why, as there is no blocking between this init() and the next async method in the call chain that uses the connection, but it's working. – David Starr - Elegant Code
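One way to see why no explicit blocking is needed: every consumer awaits the same cached promise, so nothing runs against the database before setup resolves. A sketch with placeholder values standing in for the Cosmos database and container objects:

```javascript
// Cache one connections promise at module scope; every function awaits it,
// so no insert can run before setup completes. The async body is a
// placeholder for the createIfNotExists calls.
let connectionsPromise;
function getConnections() {
  if (!connectionsPromise) {
    connectionsPromise = (async () => ({ db: "myDb", container: "myContainer" }))();
  }
  return connectionsPromise;
}

async function insertItem(item) {
  const { container } = await getConnections(); // instant after the first call
  console.log("inserting into", container, item.id);
  return container;
}

const result = insertItem({ id: 1 });
```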

How to pass modules inputs while calling it from index.js file

I am building some custom plugins for a product that uses proprietary npm module to setup connection and perform an operation(read, write and submit) on mainframe emulator.
I am setting up the connection in index.js and want to pass the terminal variable to a separate module when calling it.
index.js snippet
var terminal;
const mainframeTerminal = require('private_module');
const accountDetailsModule = require('./src/accountDetails');

terminal = private_module.connect('11.11.11.1:789');
let screen = await terminal.status();

// expose module from index.js file so that it can be consumed in product
export.getAccountDetails = accountDetailsModule.getAccountDetails(terminal)
accountDetails.js
module.exports.getAccountDetails = async function(terminal){
  // perform some operation with terminal var - passed from index file
  return data;
}
I am getting the error below:
exports.getAccountDetails = accountDetailsModule.getAccountDetails is not a function.
I also need to pass a data input, but for the time being it's not required.
I'd like to know how the node.js function will understand the mapping if I only need to pass one of the inputs.
Please share some pointers; I am new to coding.
Like this:
index.js
const mainframeTerminal = require('private_module');
const accountDetailsModule = require('./src/accountDetails');

const terminal = mainframeTerminal.connect('11.11.11.1:789');
// let screen = await terminal.status(); // await only works inside an async function

// expose the function's result from index.js
module.exports = accountDetailsModule(terminal);
accountDetails.js
const getAccountDetails = async (terminal) => {
  // perform some operation
  return data;
}

module.exports = getAccountDetails;
In CommonJS, modules are just a way to assign variables. So what you get in index.js when you require('./src/accountDetails') is exactly what you export in accountDetails.js (the value of getAccountDetails, which is an async function taking one argument).
runMyCode.js
Here's how you might call your code...
const mainframeTerminal = require('private_module');
const accountDetailsModule = require('./src/accountDetails');

const terminal = mainframeTerminal.connect('11.11.11.1:789');
// let screen = await terminal.status(); // await only works inside an async function

// run the code and print the result to the console
accountDetailsModule(terminal).then(console.log);
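Another option is a small factory: pass the terminal in once and get back an API object whose methods close over it. The names mirror the post, but the terminal object and the function body are placeholders:

```javascript
// Dependency injection via a factory: the terminal is captured once and
// every exported method can use it without re-passing it.
const makeAccountApi = terminal => ({
  getAccountDetails: async () => `details from ${terminal.host}`,
});

const api = makeAccountApi({ host: "11.11.11.1:789" });
const details = api.getAccountDetails();
details.then(console.log); // details from 11.11.11.1:789
```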
