I have a React application interacting with a Dropwizard REST API. In my code I noticed the following odd bug.
In attempting to set the values of two variables and then use them to set state, I had the following:
const [inventorRes, sellerRes] = await Promise.all([
  LoginTransport.getAllUsers('Inventory Owner'),
  LoginTransport.getAllUsers('Seller'),
]);
where LoginTransport.getAllUsers is a method that calls the external API. Now I noticed that in doing this, both inventorRes and sellerRes would be set to the value of LoginTransport.getAllUsers('Inventory Owner'), rather than to the results of their respective calls.
When I break it up into:
const inventorRes = await LoginTransport.getAllUsers('Inventory Owner');
const sellerRes = await LoginTransport.getAllUsers('Seller');
The behavior is correct.
My understanding from https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/all was that Promise.all returns a promise that resolves to an array of results. Isn't it the case that JavaScript would match the variables in the left-hand destructuring pattern with the elements of the right-hand array? And assuming that's true, if I insisted on using Promise.all(), how could I accomplish the task I want to do?
Now I noticed that in doing this, both inventorRes and sellerRes would be set to the value of LoginTransport.getAllUsers('Inventory Owner'), rather than to the results of their respective calls.
That would be the fault of LoginTransport.getAllUsers. Promise.all does what you are expecting it to do.
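For example, a minimal sketch with a stand-in getAllUsers (not the real LoginTransport) shows that destructuring the array resolved by Promise.all assigns each element to its own variable:

// Hypothetical stand-in for LoginTransport.getAllUsers, for illustration only.
const getAllUsers = async (role) => [{ role }];

(async () => {
  const [inventorRes, sellerRes] = await Promise.all([
    getAllUsers('Inventory Owner'),
    getAllUsers('Seller'),
  ]);
  console.log(inventorRes); // [{ role: 'Inventory Owner' }]
  console.log(sellerRes);   // [{ role: 'Seller' }]
})();

If both variables end up holding the 'Inventory Owner' result, the likely culprit is inside getAllUsers itself, for example a cached request or a shared mutable result object reused across calls.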
New to MongoDB, very new to Atlas. I'm trying to set up a trigger such that it reads all the data from a collection named Config. This is my attempt:
exports = function(changeEvent) {
  const mongodb = context.services.get("Cluster0");
  const db = mongodb.db("TestDB");
  var collection = db.collection("Config");
  config_docs = collection.find().toArray();
  console.log(JSON.stringify(config_docs));
}
The function is part of an automatically created Realm application called Triggers_RealmApp, which has Cluster0 as a named linked data source. When I go into Collections in Cluster0, TestDB.Config is one of the collections.
Some notes:
It's not throwing an error, but simply returning {}.
When I change context.services.get("Cluster0") to something else, it throws an error.
When I change "TestDB" to a db that doesn't exist, or "Config" to a collection that doesn't exist, I get the same output: {}.
I've tried creating new Realm apps, manually creating services, creating new databases and new collections, etc. I keep bumping into the same issue.
The MongoDB docs reference promises and await, which I haven't seen in any examples (link). I tried experimenting with that a bit and got nowhere. From what I can tell, what I've already done is the typical way of doing it.
(Images omitted: screenshots of the Collection and the Linked Data Source.)
I ended up taking it up with MongoDB directly; .find() is asynchronous and I was handling it incorrectly. Here is the reply, straight from the horse's mouth:
As I understand it, you are not getting your expected results from the query you posted above. I know it can be confusing when you are just starting out with a new technology and can't get something to work!
The issue is that the collection.find() function is an asynchronous function. That means it sends out the request but does not wait for the reply before continuing. Instead, it returns a Promise, which is an object that describes the current status of the operation. Since a Promise really isn't an array, your statement collection.find().toArray() is returning an empty object. You write this empty object to the console log and end your function, probably before the asynchronous call even returns with your data.
There are a couple of ways to deal with this. The first is to make your function an async function and use the await operator to tell your function to wait for the collection.find() function to return before continuing.
exports = async function(changeEvent) {
  const mongodb = context.services.get("Cluster0");
  const db = mongodb.db("TestDB");
  var collection = db.collection("Config");
  config_docs = await collection.find().toArray();
  console.log(JSON.stringify(config_docs));
};
Notice the async keyword on the first line, and the await keyword on the line that calls collection.find().
The second method is to use the .then function to process the results when they return:
exports = function(changeEvent) {
  const mongodb = context.services.get("Cluster0");
  const db = mongodb.db("TestDB");
  var collection = db.collection("Config");
  collection.find().toArray().then(config_docs => {
    console.log(JSON.stringify(config_docs));
  });
};
The connection has to be a connection to the primary replica set, and the login credentials have to be those of an admin-level user (one with cluster admin permission).
I'm using Sinon to stub some data retrieval methods during unit testing. Most of these data methods are async, so the resolves syntax has been handy so far. What I'm trying to achieve is to dynamically generate different test data based on Math.random(), to cover different branches of my code automatically without having to provide hardcoded sample input data for each case. However, I've realized that the stub function is actually only called once, upon initialization, so its return value stays fixed for the whole (Mocha-based) test run. Is there any way to provide different outcomes for a single stub? I've checked the onCall syntax, but it also provides fixed output, just selectable by call index, rather than truly dynamic output, which could ideally even be based on the args/params.
All ideas are welcome!
Current stubbing using Sinon:
sinon.stub(dynamodb, 'get').resolves(stubGet())
The stub itself:
function stubGet () {
  // Choose random repo
  const i = Math.round(Math.random() * sampleData.length)
  const repo = sampleData[i]
  // Should it have "new code/push date"?
  const isNew = Math.round(Math.random()) === 1
  if (isNew) {
    repo.pushed_at = { S: '1970-01-01T00:00:00Z' }
  }
  console.log('repo', repo)
  const item = { Item: repo }
  console.log(item)
  return item
}
The goal would be that each call actually gets a different random repo or isNew value.
Randomness is unpredictable. Test code should be predictable, including test data; otherwise, your tests could start failing on some random data someday.
We should write multiple test cases, each using fixed, as-simple-as-possible test data, to exercise each branch, scenario, etc. of the code, and assert whether the returned value meets your expectations.
You should make the test code and test data predictable. For more info, see Unpredictable Test Data.
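For instance, a minimal sketch of that approach (reusing the dynamodb stub and sampleData from the question; the fixture names are made up for illustration):

// Hypothetical fixed fixtures, one per branch under test.
const oldRepo = { ...sampleData[0] }
const newRepo = { ...sampleData[0], pushed_at: { S: '1970-01-01T00:00:00Z' } }

afterEach(() => sinon.restore())

it('handles a repo without new pushes', async () => {
  sinon.stub(dynamodb, 'get').resolves({ Item: oldRepo })
  // ... call the code under test and assert on the "old repo" branch
})

it('handles a repo with new pushes', async () => {
  sinon.stub(dynamodb, 'get').resolves({ Item: newRepo })
  // ... call the code under test and assert on the "new repo" branch
})

Each case then exercises exactly one branch with data you can reason about, instead of a branch chosen by Math.random().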
So, I have an async function which fetches an array of objects that I want to run tests against. But the problem is that each test refers to the same array, even if I fetch it individually before each test.
describe('Package parser mock status file tests', async () => {
  let packages: Package[] = [];

  beforeEach(async function () {
    packages = await PackageParser.fromStatusFile("tests/mockStatusFile.txt");
  });

  afterEach(async function () {
    packages.length = 0;
  });
This causes the array to be filled as many times as there are tests, even though I empty it between tests, and even if I assign the result to a different variable in each test. I'm aware of object and array referencing and async to some extent, but I don't understand how this is possible.
I'm using Mocha.
The problem was not related to Mocha, async, or testing. My PackageParser object contains an array that keeps track of processed packages, and this array was never emptied.
I had previously seen such weird things when working with async, object referencing and the like that I immediately assumed it was something else.
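As a minimal sketch of that kind of bug (the PackageParser internals here are hypothetical, not the real implementation), a module-level accumulator keeps growing across calls unless it is reset:

// Hypothetical stand-in for the real file parsing, for illustration only.
async function parseFile (path) {
  return [{ name: 'pkg-from-' + path }];
}

// Hypothetical parser with a module-level accumulator.
const processed = [];

const PackageParser = {
  async fromStatusFile (path) {
    const parsed = await parseFile(path);
    processed.push(...parsed); // bug: the accumulator is never cleared between calls
    return processed;          // every caller gets the same, ever-growing array
  }
};

The fix (in this hypothetical version) is to clear or recreate the accumulator at the start of each fromStatusFile call, or to return a fresh array instead of the shared one.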
In TypeScript, I have noticed that when I take a complex object and put it in an array, and then access that object from the array, it loses its type and instead simply becomes of type object.
For example
let myArray = await this.browser.evaluate((sel: string) => Array.from(document.querySelectorAll(sel)), selector)
document.querySelectorAll(sel) returns a NodeList<Element> which is ArrayLike. Array.from should convert the NodeList into an array of elements, but once the array is formed all of the array elements lose their Element type
I have a function that will only accept parameters of type Element, but when I try to pass in myArray[0] as a parameter to said function, I get the error: Error: Unsupported target type: object
I have tried so many different things to try and get the array to maintain its object type that it would be difficult to explain each and every one of them. I am wondering how can I create an array of Elements and have them continue to be Elements when accessed later instead of generic objects
Here is a little more context in the testing I've done
I am going to this page: https://www.w3schools.com/html/html_tables.asp
and the selector I am passing into evaluate is table[id="customers"] tbody tr. This should match the 6 rows that appear in the table.
let test = await this.browser.evaluate((sel: string) =>
  Array.from(document.querySelectorAll(sel)), selector)
console.log('testy1: ', test)
console.log('testy2: ', test[0])
console.log('testy3: ', typeof(test[0]))
When I run the above code this is the output I get in the console log:
testy1: [ {}, {}, {}, {}, {}, {}, {} ]
testy2: {}
testy3: object
It seems to be grabbing the elements from the page because it is returning the right number of elements, but maybe the issue is that the objects returned are empty? I am not sure.
I think my problem may be related to this question: puppeteer page.evaluate querySelectorAll return empty objects
but the solution to that question doesn't work for me because href isn't a property of object type Element
The problem here is that the function you are passing to page.evaluate is run inside the browser context (inside the browser page). To send the results from the browser context to the Node.js environment, the results are serialized.
See the return type in the docs for page.evaluate:
returns: Promise<Serializable> Promise which resolves to the return value of pageFunction
The Serializable here means that your data will be passed to the Node.js environment via JSON.stringify and there automatically parsed for you. This process will however remove any non-serializable properties of objects. This is the reason why you end up with many empty objects.
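You can see the same effect directly in the page context: DOM element properties live on the prototype chain, so serializing an element yields an empty object (a quick illustration, not Puppeteer-specific):

// In the browser console / page context:
JSON.stringify(document.querySelector('a')); // "{}"

// This is roughly what happens to each element returned from page.evaluate:
// the serialized "{}" is all that reaches the Node.js side.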
Get element handles in puppeteer
To get a handle on an element of the page, you need to use page.$, which creates an object (in your Node.js environment) that is linked to an element inside the browser context. These kinds of handles can also be passed to page.evaluate calls. To query for multiple elements, you can use the function page.$$.
Code sample
Here is an example, which first queries an element and then passes the element handle to an evaluate function to read an attribute.
const elementHandle = await page.$('a');
const result = await page.evaluate(el => el.href, elementHandle);
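And a sketch of the multiple-element case (assuming the same page object and the selector from the question):

// Element handles, one per matched row, usable from the Node.js side.
const rowHandles = await page.$$('table[id="customers"] tbody tr');

// Read something from each row by evaluating inside the browser context.
for (const row of rowHandles) {
  const text = await page.evaluate(el => el.textContent, row);
  console.log(text);
}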
Usage of TypeScript
The problem regarding TypeScript is that TypeScript is not able to predict the type correctly in this scenario. For the TypeScript compiler this looks like a normal function call, while in reality the function is sent to the browser to be executed. Therefore, you have to cast the type yourself in this case, as otherwise TypeScript will just assume any as the argument type:
const elementHandle = await page.$('a');
const result = await page.evaluate((el: { href: string }) => el.href, elementHandle);
I'm facing the same problem with the type definitions, and Element is not exported from any lib.
To get appropriate IntelliSense for the el methods, I changed my code as in the example below:
From:
...
await page.waitForTimeout(3000)
await page.waitForSelector('button[type=submit]')
await page.$eval('button[type=submit]', el => el.click())
await page.waitForNavigation({waitUntil: "load"})
await page.goto(url)
...
To this:
...
await page.waitForTimeout(3000)
await page.waitForSelector('button[type=submit]')
await page.$('button[type=submit]').then(el => {
  el.click()
})
await page.waitForNavigation({waitUntil: "load"})
await page.goto(url)
...
(Screenshot: Puppeteer element IntelliSense example.)
PS: I found the issue in the type definitions on the DefinitelyTyped GitHub: https://github.com/DefinitelyTyped/DefinitelyTyped/issues/24419
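As an aside, a slightly safer variant of the page.$ approach above (same selector; the null check guards against the button not being found, and the click is awaited):

await page.waitForSelector('button[type=submit]')
const submitButton = await page.$('button[type=submit]')
if (submitButton) {
  await submitButton.click()
}
await page.waitForNavigation({ waitUntil: 'load' })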
I would like to find a doc in a collection, and add items to a sub collection (which might not exist yet):
projects (collection)
  project (doc)
    cluster (collection) // might not exist
      node1 (doc) // might not exist
        statTypeA (collection) // might not exist
I was hoping for something like this:
// Know the doc:
db.ref(`projects/${projectId}/cluster/node1/${statType}`).add()
// Or filter and ref:
db.collection('projects').where(..).limit(1).ref(`cluster/node1/${statType}`).add()
I ended up solving it like this, but it's ugly, verbose and slow, as it has to perform a number of read ops first. Am I doing this right?
const projectRefs = await db.collection('projects')
  .where('licenseKey', '==', licenseKey)
  .limit(1)
  .get();

if (!projectRefs.docs) {
  // handle 404
}

const projectRef = projectRefs.docs[0].ref;

const cluster = await projectRef.collection('cluster')
  .doc('node1').get();

await cluster.ref.collection(statType).add({ something: 'hi' });
Edit:
The way I ended up handling this better was a combination of flattening into other collections and using arrays for the stats. Feels much better:
// projects
{
  projectId1
}

// instances (to-many relationship) (filter based on projectId)
{
  projectId
  statTypeA: []
  statTypeB: []
}
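As a sketch of writing a stat with that flattened layout (the instances collection and field names follow the structure above, this assumes the Firebase Admin SDK inside an async function, and FieldValue.arrayUnion appends to the array without rewriting it):

const admin = require('firebase-admin');
const db = admin.firestore();

// Find the single instances doc for this project and append a stat entry.
const snapshot = await db.collection('instances')
  .where('projectId', '==', projectId)
  .limit(1)
  .get();

if (!snapshot.empty) {
  await snapshot.docs[0].ref.update({
    statTypeA: admin.firestore.FieldValue.arrayUnion({ something: 'hi' }),
  });
}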
Your "nasty thing" is much closer to the way things work.
In your first attempt, you're trying to combine a query and a document creation in one operation. The SDK doesn't work like that at all. You are either reading or writing with any given bit of code, never both at once. You should do the query first, find the document, then use that to create more documents.
get() returns a promise that you need to use to wait on the results of the query. The results are not available immediately, as your code is currently assuming.
The documentation shows example code of how to handle the results of an asynchronous query. Since your code uses async/await, you can convert it as needed. Note that you have to iterate the QuerySnapshot obtained from the returned promise to see if a document is found.
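For example, a sketch of that pattern applied to the query from the question (empty is the QuerySnapshot's built-in check; the rest mirrors the original code, inside an async function):

const snapshot = await db.collection('projects')
  .where('licenseKey', '==', licenseKey)
  .limit(1)
  .get();

if (snapshot.empty) {
  // handle 404: no project matched the query
  return;
}

// Use the matched document's ref to create the nested document;
// the intermediate collection and doc don't need to exist beforehand.
const projectRef = snapshot.docs[0].ref;
await projectRef
  .collection('cluster')
  .doc('node1')
  .collection(statType)
  .add({ something: 'hi' });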