Since I'm still fairly new to JavaScript (and Node.js and React), I've got another question for the experts here. I'm trying to get some client/server communication set up, and while I think I have the sending side working, I'm not sure I have the receiving part correct. As best as I can figure, either the data isn't returning in time for me to pass it back to the calling code, or I don't have the order of operations correct. My code:
import React, { useEffect, useState } from 'react'; // WebSocket is a browser global, not a React export

const URL_WEB_SOCKET = 'wss://localhost:5000/ws';
const URL_WEB_LOGIN = 'wss://localhost:5000/ws';

let socketMessage = "";

function useWebSockets(message, isControl) {
    let data = "";
    React.useEffect(() => {
        const websocket = new WebSocket(isControl ? URL_WEB_LOGIN : URL_WEB_SOCKET);
        websocket.onmessage = function(event) {
            socketMessage = event.data.toString();
            // first attempt
            //data = event.data.toString();//JSON.parse(event.data);
            //return event.data;
        };
        websocket.onopen = () => {
            websocket.send(message);
        };
    }, []);
}

export function getData(message) {
    const ws = new useWebSockets(message, false);
    return ws.data;
}

export function Login(message) {
    let ws = new useWebSockets(message, true);
    return ws.data;
}
This is all in one file. I'm calling the Login and getData functions from another file, and they should be returning whatever data is passed back by the socket. However, I'm getting undefined errors. Do I have this set up correctly? Thanks.
Update
I have updated my code to the above, and I can now get back the websocket itself, but still no data (it's always blank). I think my code here is mostly correct, but from what I know about programming, I suspect JavaScript is dealing with this asynchronously and I need to wait for onmessage to fire before I can read the results. The server is sending data back fine, so how can I get the data and push it back through the Login and getData functions to the calling code?
You can use websocket.onclose on the front end, and close the websocket on the backend side.
You declared data (data = "";) as a string, while JSON.parse(event.data) returns an object. You should use let, or declare it inside the websocket.onmessage function.
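To actually get the data back to the calling code, the usual pattern is to keep the incoming message in React state inside the hook and return it; the component re-renders when onmessage fires. A minimal sketch, assuming the hook is called from a component (not invoked with new, and not from plain functions like getData); the names here are illustrative:

import { useEffect, useState } from 'react';

const URL_WEB_SOCKET = 'wss://localhost:5000/ws';

function useWebSocketData(message) {
    const [data, setData] = useState('');

    useEffect(() => {
        const websocket = new WebSocket(URL_WEB_SOCKET);
        websocket.onopen = () => websocket.send(message);
        // onmessage fires asynchronously; storing the payload in state
        // re-renders the component with the fresh value
        websocket.onmessage = (event) => setData(event.data.toString());
        return () => websocket.close(); // clean up on unmount
    }, [message]);

    return data;
}

// usage inside a component:
// const loginResult = useWebSocketData(loginMessage);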
I am learning JavaScript (Node.js - using the Pipedream platform). I have been writing scripts to help automate some little tasks in my day to day work.
I am creating one that generates a report on recent interactions with clients.
As part of this I am using axios to get "engagements" from the Hubspot API (basically a list of identifiers I will use in later requests).
The API returns paginated responses. I have encountered pagination previously and understand the principle behind it, but have never written a script to handle it. This is my first.
It works. But I feel it could be improved. Below I've commented how I've approached it.
The endpoint returns up to 100 values per page, along with a "hasMore":true flag and an "offset":987654321 value that can be passed as a query parameter in subsequent requests (if hasMore === true).
Example API response:
{"results":[1234,1235,1236],"hasMore":true,"offset":987654321}
My code:
import axios from 'axios';

//function to get each page of data
async function getAssoc(req){
    const options = {
        method: 'GET',
        url: `https://api.hubapi.com/${req}`,
        headers: {
            Authorization: `Bearer ${auths}`, // auth token supplied by an earlier step
        },
    };
    return await axios(options);
}

//declare array in which to store all 'associations'
const assocs = [];

//store the ID that I get in an earlier step
const id = vid;

//declare variable in which to temporarily store each request's response data
var resp;

//declare query parameter value; initially blank, but assigned a value on subsequent iterations of do...while
var offset = '';

do {
    //make request and store response in resp variable
    resp = await getAssoc(`crm-associations/v1/associations/${id}/HUBSPOT_DEFINED/9?offset=${offset}`);
    //push the results into my 'assocs' (associations) array
    resp.data.results.forEach(element => assocs.push(element));
    //store offset value for use in next iteration's request
    offset = resp.data.offset;
} while (resp.data.hasMore); //hasMore will be false when there are no more records to request

return assocs;
I feel it could be improved because:
The do...while loop, I believe, is making sequential requests. Is parallel a better/faster/more efficient option? (EDIT: Thanks @Evert, of course I cannot make parallel requests because of the offset!)
I'm reassigning new values to vars instead of using consts, which seems simple and intuitive to my beginner's mind, but I don't know a better way in this instance.
I would welcome any feedback or suggestions on how I can improve this for my own learning.
Thank you in advance for your time and any assistance you can offer.
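On the const concern, one common refactor is to contain the mutation inside a helper function so the caller only ever sees a const result. A sketch under the same hasMore/offset API described above, reusing the question's getAssoc:

// fetch every page; mutation of `offset` stays local to the helper
async function getAllAssocs(id) {
    const assocs = [];
    let offset = '';
    let hasMore = true;
    while (hasMore) {
        const resp = await getAssoc(
            `crm-associations/v1/associations/${id}/HUBSPOT_DEFINED/9?offset=${offset}`
        );
        assocs.push(...resp.data.results);
        offset = resp.data.offset;
        hasMore = resp.data.hasMore;
    }
    return assocs;
}

// the caller then sees only an immutable binding:
// const assocs = await getAllAssocs(vid);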
I'm trying to write a test that the download works, which requires checking whether the xhr response has status READY. I created a client function in TestCafe using promises, but it fails in the case of recursion.
How should I fix my code to handle this situation?
P.S. Many apologies for the newbie questions; I have just started my journey in automation testing.
import { Selector, ClientFunction } from 'testcafe';
// `logger` (a request logger), `admin` (a Role), and `url` are defined elsewhere

fixture`Download report works`;

test
    .requestHooks(logger) //connected a request hook, will wait for logger request
    ('I should be able to download PDF report from header of the page', async t => {
        //recursively check if response status is READY, and then go to assertions
        const waitForDownloadResponseStatus = ClientFunction((log) => {
            return new Promise((resolve, reject) => {
                const waitForStatus = () => {
                    const arrayFromResponse = JSON.parse(log.response.body);
                    const responseStatus = arrayFromResponse.status;
                    if (responseStatus == 'READY') {
                        resolve(responseStatus);
                    }
                    else {
                        waitForStatus();
                    }
                };
                waitForStatus();
            });
        });

        //page objects
        const reportTableRaw = Selector('div.contentcontainer').find('a').withText('April 2019').nth(0);
        const downloadPdfButton = Selector('a.sr-button.sr-methodbutton.btn-export').withText('PDF');

        //actions
        await t
            .navigateTo(url)
            .useRole(admin)
            .click(reportTableRaw) //went to customise your report layout
            .click(downloadPdfButton)
            .expect(logger.contains(record => record.response.statusCode === 200))
            .ok(); //checked if there is something in logger

        const logResponse = logger.requests[0];
        // const arrayFromResponse = JSON.parse(logResponse.response.body);
        // const responseStatus = arrayFromResponse.status;
        console.log(logger.requests);

        await waitForDownloadResponseStatus(logResponse).then((resp) => {
            console.log(resp);
            t.expect(resp).eql('READY');
        });
    });
When you pass an object as an argument or a dependency to a client function, it receives a copy of the passed object, so it cannot detect any changes made by external code. In this particular case, the waitForStatus function never reaches its termination condition because it can't see the changes made to the log object by the external request hook. That means the function will recurse indefinitely until it consumes all available stack memory, at which point it fails with a stack overflow error.
To avoid this situation, you can check that the response has status READY by changing the predicate argument of the contains function.
Take a look at the following code:
.expect(logger.contains(record => record.response.statusCode === 200 &&
        JSON.parse(record.response.body).status === 'READY'))
    .ok({ timeout: 5000 });
Also, you can use the timeout option. It's the time (in milliseconds) an assertion can take to pass before the test fails.
Hello, I am trying to access an HTTP request triggered by a React component from my JavaScript code, in order to test the URL.
Can anybody help, please?
[Screenshot of the HTTP request I want to access]
Here is an example of the unit test I'm running; I want to add another unit test that checks whether the HTTP request is called correctly.
.add('Search with "Occasion" keyword', () => {
    const result = search('Iphone Occasion');
    specs(() =>
        describe('SEO Navigation Links', () => {
            it('Should not contain "Occasion" keyword', () => {
                const searchValue = result.find(Search).node.state.value.toLowerCase();
                const contains = searchValue.includes('occasion');
                expect(contains).toBeTruthy();
            });
        }),
    );
    return result;
});
The best I can recommend is to "monkey patch" the fetch function (if the component uses it):
const realFetch = fetch;
fetch = (...args) => realFetch(...args).then(doStuff);
This creates a "middleware": when the website calls the fetch function, it will call yours instead.
Make sure you keep a copy of the original function to avoid infinite recursion.
If you install a Service worker, you can run some code client-side on all the requests your page makes. I'm not sure what you need to do in order to test the code you are talking about, but a Service Worker could report the request back to your own test code on the page, or respond with whatever content you want, or modify the server's response.
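A rough illustration of that approach; the file name and the message shape are made up, and the worker would be registered from the page with navigator.serviceWorker.register('/sw.js'):

// sw.js - runs for every network request the page makes once registered
self.addEventListener('fetch', (event) => {
    // report the intercepted URL back to any open pages (e.g. test code)
    event.waitUntil(
        self.clients.matchAll().then((clients) =>
            clients.forEach((c) => c.postMessage({ requestedUrl: event.request.url }))
        )
    );
    // fall through to the network; could instead respond with mock content
    event.respondWith(fetch(event.request));
});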
I have an app that uses Firebase, pretty much the whole stack: functions, database, storage, auth, messaging, the whole 9. I want to keep the client end very lightweight. So if a user comments on a post and "tags" another user, let's say using the typical "#username" style tagging, all of the heavy lifting is moved to Firebase functions. That way the client doesn't have to figure out the user ID based on the username and do everything else. It is set up using triggers, so when the above scenario happens I write to a "table" called "create_notifications" with some data like
{
    type: "comment",
    post_id: postID,
    from: user.getUid(),
    comment_id: newCommentKey,
    to: taggedUser
}
Where the taggedUser is the username, the postID is the active post, the newCommentKey is retrieved from .push() on the comments db reference, and the user.getUid() is from the firebase auth class.
Now in my Firebase functions I have an "onWrite" trigger for that specific table that gathers all of the relevant information and sends out a notification to the poster of the post with all the relevant details. All of that is complete. What I am trying to figure out is: how do I delete the incoming event, so that I don't need any sort of cron job to clear out this table? I can just grab the event, do my needed calculations and data gathering, send the message, then delete the incoming event so it never really exists in the database except for the small amount of time it took to gather the data.
A simplified sample of the firebase functions trigger is...
exports.createNotification = functions.database.ref("/create_notifications/{notification_id}").onWrite(event => {
    const from = event.data.val().from;
    const toName = event.data.val().to;
    const notificationType = event.data.val().type;
    const post_id = event.data.val().post_id;
    var comment_id, commentReference;
    if (notificationType == "comment") {
        comment_id = event.data.val().comment_id;
    }
    const toUser = admin.database().ref(`users`).orderByChild("username").equalTo(toName).once('value');
    const fromUser = admin.database().ref(`/users/${from}`).once('value');
    const referencePost = admin.database().ref(`posts/${post_id}`).once('value');
    return Promise.all([toUser, fromUser, referencePost]).then(results => {
        const toUserRef = results[0];
        const fromUserRef = results[1];
        const postRef = results[2];
        var newNotification = {
            type: notificationType,
            post_id: post_id,
            from: from,
            sent: false,
            create_on: Date.now()
        };
        if (notificationType == "comment") {
            newNotification.comment_id = comment_id;
        }
        return admin.database().ref(`/user_notifications/${toUserRef.key}`).push().set(newNotification).then(() => {
            //NEED TO DELETE THE INCOMING "event" HERE TO KEEP DB CLEAN
        });
    });
});
So in that function in the final "return" of it, after it writes the finalized data to the "/user_notifications" table, I need to delete the event that started the whole thing. Does anyone know how to do that? Thank you.
First off, use .onCreate instead of .onWrite. You only need to read each child when they are first written, so this will avoid undesirable side effects. See the documentation here for more information on the available triggers.
event.data.ref holds the reference where the event occurred. You can call remove() on the reference to delete it:
return event.data.ref.remove()
The simplest way to achieve this is by calling the remove() function offered by the Admin SDK.
You can get the notification_id through the event, i.e. event.params.notification_id, then remove the entry when need be with admin.database().ref('pass in the path').remove(), and you are good to go.
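Spelled out, under the same legacy event API as the question's trigger, that would look something like:

// inside the trigger, after the notification has been sent:
return admin.database()
    .ref(`/create_notifications/${event.params.notification_id}`)
    .remove();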
For newer versions of Firebase, use:
return change.after.ref.remove()
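For context, with the post-v1.0 (change, context) trigger signature, a sketch of the same cleanup looks roughly like:

exports.createNotification = functions.database
    .ref("/create_notifications/{notification_id}")
    .onWrite((change, context) => {
        // ... gather data and send the notification as before ...
        // (per the first answer, .onCreate avoids re-triggering on the delete)
        return change.after.ref.remove(); // delete the queue entry once handled
    });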
Goal: Front-end of application allows users to select files from their local machines, and send the file names to a server. The server then matches those file names to files located on the server. The server will then return a list of all matching files.
Issue: This works great if a user selects fewer than a few hundred files, but otherwise it can cause long response times. I do not want to limit the number of files a user can select, and I don't want to have to worry about the HTTP requests timing out on the front-end.
Sample code so far:
//html on front-end to collect file information
<div>
    <input (change)="add_files($event)" type="file" multiple>
</div>

//function called from the front-end, which then calls the profile_service add_files function
//it passes along the $event object
add_files($event){
    this.profile_service.add_files($event).subscribe(
        data => console.log('request returned'),
        err => console.error(err),
        () => {} //update view function
    );
}

//The following two functions are in my profile_service, which is dependency-injected into my component

//formats the event object for the eventual query
add_files(event_obj){
    let file_arr = [];
    let file_obj = event_obj.target.files;
    for(let key in file_obj){
        if (file_obj.hasOwnProperty(key)){
            file_arr.push(file_obj[key]['name']);
        }
    }
    let query_obj = {files: file_arr};
    return this.save_files(query_obj);
}

//here is where the actual request to the back-end is made
save_files(query_obj){
    let payload = JSON.stringify(query_obj);
    let headers = new Headers();
    headers.append('Content-Type', 'application/json');
    return this.http.post('https://some_url/api/1.0/collection', payload, {headers: headers})
        .map((res: Response) => res.json());
}
Possible Solutions:
Process requests in batches. Re-write the code so that the profile-service is only called with 25 files at a time, and upon each response call profile-service again with the next 25 files. If this is the best solution, is there an elegant way to do this with observables? If not, I will use recursive callbacks which should work fine.
Have the endpoint return a generic response immediately, like "file matches being uploaded and saved to your profile". Since all the matching files are persisted to a db on the backend, this would work, and then I could have the front-end query the db every so often to get the current list of matching files. This seems ugly, but I figured I'd throw it out there.
Any other solutions are welcome. Would be great to get a best-practice for handling this type of long-lasting query with angular2/observables in an elegant way.
I would recommend that you break up the number of files that you search for into manageable batches and then process more as results are returned, i.e. solution #1. The following is an untested but I think rather elegant way of accomplishing this:
add_files(event_obj){
    let file_arr = [];
    let file_obj = event_obj.target.files;
    for(let key in file_obj){
        if (file_obj.hasOwnProperty(key)){
            file_arr.push(file_obj[key]['name']);
        }
    }
    let self = this;
    let bufferedFiles = Observable.from(file_arr)
        .bufferCount(25); //Nice round number that you could play with

    return bufferedFiles
        //concatMap will make sure that each of your requests is not executed
        //until the previous one completes. Then all the data is merged into a single output
        .concatMap((arr) => {
            let payload = JSON.stringify({files: arr});
            let headers = new Headers();
            headers.append('Content-Type', 'application/json');
            //Use defer because http.post is eager;
            //this makes it only execute after subscription
            return Observable.defer(() =>
                self.http.post('https://some_url/api/1.0/collection', payload, {headers: headers}));
        }, resp => resp.json());
}
concatMap will keep your server from executing more than whatever the size of your buffer is, by preventing new requests until the previous one has returned. You could also use mergeMap if you wanted them all to be executed in parallel, but it seems the server is the resource limitation in this case if I am not mistaken.
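If parallel execution were acceptable, the swap is small. A sketch, reusing the names from the block above; the concurrency argument caps in-flight requests, and the value 4 is arbitrary:

return bufferedFiles
    .mergeMap((arr) => {
        let payload = JSON.stringify({files: arr});
        let headers = new Headers();
        headers.append('Content-Type', 'application/json');
        return Observable.defer(() =>
            self.http.post('https://some_url/api/1.0/collection', payload, {headers: headers}));
    }, 4); // at most 4 requests in flight at once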
I'd suggest using websocket connections instead, because they don't time out.
See also
- https://www.npmjs.com/package/angular2-websocket
- http://mmrath.com/post/websockets-with-angular2-and-spring-boot/
- http://www.html5rocks.com/de/tutorials/websockets/basics/
An alternative approach would be polling, where the client makes repeated requests in a defined interval to get the current processing state from the server.
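A bare-bones version of that polling idea, assuming a status endpoint exists on the server (the URL, response shape, and interval here are made up):

// ask the server every 5 seconds whether more matches have been persisted
const poll = setInterval(() => {
    this.http.get('https://some_url/api/1.0/collection/status')
        .map((res: Response) => res.json())
        .subscribe(state => {
            if (state.done) {
                clearInterval(poll); // stop once the server reports completion
            }
            // otherwise update the view with the matches found so far
        });
}, 5000);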
To send multiple requests and wait for all of them to complete:
getAll(urls: any[]): Observable<any[]> {
    let observables = [];
    for (var i = 0; i < urls.length; i++) {
        observables.push(this.http.get(urls[i]));
    }
    return Observable.forkJoin(observables);
}

someMethod(server: string) {
    let urls = [
        `${server}/fileService?somedata=a`,
        `${server}/fileService?somedata=b`,
        `${server}/fileService?somedata=c`];
    this.getAll(urls).subscribe(
        (value) => processValue(value),
        (err) => processError(err),
        () => onDone());
}