How to extract a 'div' element from a different web page? - javascript

I have an allure report webpage that provides a dashboard with a percentage on it. I have to create a completely separate webpage which should show that percentage.
How can I pull that div from that page?
Here's the picture of a div

Fetch it, then parse the result until you have the contents of that div.
For the fetch, something simple like this works (polyfill fetch, or use axios instead, if you're on Node.js):
fetch('https://jsonplaceholder.typicode.com').then(function (response) {
    // The API call was successful!
    return response.text();
}).then(function (data) {
    // This is the HTML text from our response
    console.log(data);
}).catch(function (err) {
    // There was an error
    console.warn('Something went wrong.', err);
});
And then for parsing, answers like this will be helpful:
Parse an HTML string with JS
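Once you have the raw HTML text, the parsing step is best done with a real parser (DOMParser in the browser, cheerio in Node). As a dependency-free sketch, though, even a crude string match can pull a simple div out of the response; the markup, the id, and the helper name below are all made up for illustration, and this regex approach only works for flat, non-nested markup:

```javascript
// Quick-and-dirty sketch: pull the text of a <div> by id from raw HTML.
// A real parser (DOMParser, cheerio) is far more robust; this breaks on
// nested divs and unusual attribute quoting.
function extractDivText(html, id) {
  const re = new RegExp('<div[^>]*id="' + id + '"[^>]*>([\\s\\S]*?)</div>');
  const match = html.match(re);
  // Strip any inner tags and surrounding whitespace from the captured body.
  return match ? match[1].replace(/<[^>]+>/g, '').trim() : null;
}

// Example: an Allure-dashboard-style snippet (invented markup).
const html = '<div id="percentage"><span>87%</span></div>';
console.log(extractDivText(html, 'percentage')); // "87%"
```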

How to correctly code an interchange data between server-side and client-side using JavaScript?

I'm working on a web application and need to incorporate editable tables into an HTML page. I found the awesome JS tool "Tabulator", but ran into a lack of basic knowledge about transferring data from server to client and back. Well, to the point.
On server-side:
Node.js file, let's say app.js
required data is stored in MongoDB in two collections, User and Resource
On client-side:
HTML file stat.html to embed the two Tabulator tables
JS file table.js
=== Stage 1. Send data from server to client ===
Step 1. app.js pulls data from MongoDB and sends it to table.js
//=== render stat.html page ===//
router.get('/stat', (req, res) => {
    res.render('stat');
});
//=== pull user data from mongo and send it to table.js ===//
router.get('/stat', (req, res) => {
    User.find({}, (err, users) => {
        if (err) throw err;
        res.send(users);
    });
});
//=== pull resource data from mongo and send it to table.js ===//
router.get('/stat', (req, res) => {
    Resource.find({}, (err, resources) => {
        if (err) throw err;
        res.send(resources);
    });
});
Here I've got several questions:
Question #1. Am I doing this right?
Question #2. Is it possible to render the page and send data to the client JS in one route? I tried but got an error.
Question #3. The output from mongoose's Collection.find is in JSON format, which is what Tabulator needs. Do I need to manipulate it in any way before sending it to the client side? For instance, using JSON.stringify?
Question #4. Is there any way to check whether the data is actually being sent?
Question #5. I need to send two different JSONs for two different tables. If they are sent in one route, how do I separate one from the other on the client side? I was thinking of using different paths, like /stat/user and /stat/resource, but then how do I trigger them?
Step 2. On the client side, the table.js file receives the data and feeds it to the Tabulator tables. Tabulator has its own option for requesting remote data: ajaxURL
//setup user table
var table = new Tabulator('#user-table', {
    ajaxURL: "stat/user",
});
//setup resource table
var table = new Tabulator('#resource-table', {
    ajaxURL: "stat/resource",
});
Questions:
Question #6. Since there are two tables on the page, I need to use two different URLs. And this brings us back to Question #5: how do I trigger these paths when rendering the page?
Question #7. Assuming there's no ajaxURL, how can the data be read in table.js? I tried the Fetch API, but with no success.
Fetch request in table.js:
fetch('/stat/user', {
    headers: {
        'Content-Type': 'application/json'
    },
    body: JSON.stringify(users)
})
    .then(response => response.json())
    .then(function (response) {
        if (response.ok) {
            console.log('got data: ', response.users);
        }
        throw new Error('Request failed.');
    })
    .catch(function (error) {
        console.log(error);
    });
Stage 2 was meant to be about sending data back from the client to the server. But I think this is already overwhelming, and perhaps it'd be better to put that part aside for a while.
I know there are a ton of articles on the internet and questions on Stack Overflow regarding this subject, and I've read many of them but still haven't grasped it, so please don't suggest something like the MDN manual.
I'll be very thankful for any help.
Sorted it out by myself. Special thanks to @jonrsharpe.
If anyone needs help on this subject, I'll be glad to help.

Twitter API Javascript extended mode

I am using the Twitter streaming API with JavaScript in order to collect tweets. The problem is that I am not using extended mode, so I don't get the full tweet when it is longer than 140 characters. I searched around and discovered that I need to pass tweet_mode=extended in the request, and the response will then contain a full_text field with the complete tweet. The problem is that I don't know where to write tweet_mode=extended. This is my code:
twitter.stream('statuses/filter', { track: word }, function (stream) {
    stream.on('data', function (tweet) {
        console.log(tweet.text);
    });
    stream.on('error', function (error) {
        console.log("error:", error);
    });
});
Unfortunately, the streaming API does not have the option to add the tweet_mode parameter.
Documentation here: https://developer.twitter.com/en/docs/tweets/tweet-updates
This paragraph is of note to your concern:
The Streaming API does not provide the same ability to provide query
parameters to configure request options. Therefore, the Streaming API
renders all Tweets in compatibility mode at this time.
...
Streaming API consumers should update their code to first check for
the presence of the extended_tweet dictionary, and use that in
preference to the truncated data as is applicable for their use case.
When extended_tweet is not present, they must fall back to using the
existing fields.
As Van said, tweets in the streaming API are mixed, so you can try this:
twitter.stream('statuses/filter', { track: word }, function (stream) {
    stream.on('data', function (tweet) {
        let text = tweet.extended_tweet ? tweet.extended_tweet.full_text
                 : tweet.full_text ? tweet.full_text
                 : tweet.text;
        console.log(text);
    });
    stream.on('error', function (error) {
        console.log("error:", error);
    });
});
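The fallback order the documentation describes (extended_tweet.full_text first, then full_text, then text) can be isolated in a small helper, which is easier to read than the inline ternary chain. getFullText is a hypothetical name, and the tweet objects below are invented for illustration:

```javascript
// Pick the longest available text variant from a streaming-API tweet,
// per the extended-tweets compatibility guidance.
function getFullText(tweet) {
  if (tweet.extended_tweet && tweet.extended_tweet.full_text) {
    return tweet.extended_tweet.full_text; // tweet longer than 140 chars
  }
  return tweet.full_text || tweet.text;    // fall back to classic fields
}

console.log(getFullText({ text: 'a short tweet' }));
console.log(getFullText({
  text: 'a truncated twe…',
  extended_tweet: { full_text: 'a truncated tweet, now in full' }
}));
```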

WEB SCRAPING - nightmare js and request

I am using a combination of nightmare, cheerio and request in Node.js to make a custom web-scraping bot... I did the authentication and filter setup with nightmare js, and now I need to call a function like:
request(URL, function (err, response, body) {
    if (err) console.error(err);
    var scraping = cheerio.load(body);
    // ...
But the problem is that I don't know how to forward the "body" loaded by nightmare. I can't use a URL, because the content (tables) is dynamically generated, which means the URL is always the same... I tried to use this instead of a URL, but it won't work.
Any suggestions?
Thank you
You don't need to use request. In fact, you shouldn't: Nightmare itself can pass the HTML data to cheerio.
Once you've logged in and navigated to your desired webpage in nightmare, use evaluate to get the HTML. You can do something like this:
nightmare
    .viewport(1280, 800)
    .goto(url)
    .wait('#emailselectorId')
    .type('#emailselectorId', 'theEmail\u000d')
    .type('#ap_password', 'thePassword\u000d')
    .click('#signInSubmit')
    // do something in the chain to go to your desired page.
    .evaluate(() => document.querySelector('body').outerHTML)
    .then(function (html) {
        const $ = cheerio.load(html);
        // do something with $
    })
    .catch(function (error) {
        console.error('Error:', error);
    });

How do I display response data in the front end?

I've made GET requests to the github API:
axios.get('https://api.github.com/users/roadtocode822')
    .then(function (response) {
        console.log(response.data);
    })
I get the response data. This function lives in the app.js file.
Also lives on the app.js file is the following code:
app.get('/', function (req, res) {
    Article.find({}, function (err, articles) {
        if (err) {
            console.log(err);
        } else {
            res.render('index', {
                title: "Articles",
                articles: articles
            });
        }
    });
});
I'm able to query data from my mongodb database through the Article.js mongoose model and send the data to my index.pug file.
I want to be able to take the GITHUB response data and also render it in one of my pug view files. I feel like I'm missing some sort of concept in Javascript that's preventing me from achieving this.
Thanks in advance.
Note that axios already parses the JSON for you, so response.data is a plain object (with raw fetch you would use JSON.parse()). You won't be able to use your .pug template on the front end, however: that template is interpreted on the server side and is sent from server to client as plain old HTML. If you're interested in front-end templating, check out something like handlebars.js.
axios.get('https://api.github.com/users/roadtocode822')
    .then(function (response) {
        console.log(response.data);
    })
From the code above, response.data will be HTML content, because your server returns res.render.
On the front end, you should use an anchor tag or a form post instead of an AJAX call.
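Putting the pieces together, the missing concept is to fetch the GitHub data on the server, inside the same route handler, and hand it to res.render alongside the articles. Below is a sketch: buildLocals is a hypothetical helper, and the commented-out route uses the names (Article, index, the GitHub URL) from the question, assuming an async-capable Express handler:

```javascript
// Pure helper: merge the Mongoose results and the GitHub response
// into one locals object for res.render.
function buildLocals(articles, githubUser) {
  return {
    title: 'Articles',
    articles: articles,
    github: githubUser, // e.g. whatever api.github.com/users/... returned
  };
}

/*
// Hypothetical route tying it together (Express + axios assumed):
app.get('/', async function (req, res) {
  const articles = await Article.find({});
  const response = await axios.get('https://api.github.com/users/roadtocode822');
  res.render('index', buildLocals(articles, response.data));
});
*/

console.log(buildLocals([{ title: 'First post' }], { login: 'roadtocode822' }));
```

The pug view can then use `github.login` (or any other field) exactly the way it already uses `articles`.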

Serving New Data With Node.js

There may already be an answer to this question, but I was unable to find it.
Let's say I have a Node.js webpage doing somewhat time-consuming API calls and computations:
var request = require('request'),
    Promise = require('es6-promise').Promise,
    is_open = require('./is_open');

// Fetch the name of every eatery
var api_url = 'url of some api';
request(api_url, function (error, response, body) {
    if (error) {
        console.log(error);
    } else if (response.statusCode == 200) {
        // Good to go!
        var results = JSON.parse(body).events;
        results.forEach(function (result) {
            // This line makes its own set of API calls
            is_open(result)
                .then(function (output) {
                    if (output == false) {
                        console.log('CLOSED\n');
                    } else {
                        console.log(output);
                        console.log();
                    }
                })
                .catch(console.error);
        });
    } else {
        console.log('Returned an unknown error.');
        console.log(response);
        console.log(body);
    }
});
(I haven't yet created an actual web server, I'm just running the app locally through the command line.)
I want the web server to serve a loading page first to every user. Then, once the API calls are complete and the data is ready, it should send that data in a new webpage to the same user.
The reason I think there's an issue is that, in order to serve a webpage, you must end with:
res.end();
thereby ending the connection to that specific user.
Thanks for the help!
You must conceptually separate static content from dynamic content (later you will serve the static content with nginx or apache, leaving only the dynamic content to node).
The best solution to your "problem" is to have the first webpage request the data via AJAX once it has loaded. Ideally, your node app will return JSON to an AJAX query from the first page, and the JS on the page will format the result, creating DOM nodes.
