How to remove text from JSON entry being appended onto webpage? - javascript

I have a webpage that loads a JSON file via var url = "myReport.json" and displays server information that I am monitoring.
The report covers all applications running on the server, but I filter it down to the ones whose names start with company_name_, which look like this:
company_name_app1
company_name_app2
company_name_app3
company_name_app4
company_name_app5
On the HTML page, I want to remove the company_name_ prefix so that the webpage just displays the following:
app1
app2
app3
app4
app5
Is there a way to remove a specified string from the JSON entries so I can display them as shown above?
EDIT:
<script>
  // define a function that will fetch status and set icon URL
  function setServerProcessesServer1761() {
    var url = "Server1761.json";
    var $svc = $("#Server1761-processes"); // get a reference to your <div>
    $.getJSON(url, function (data) {
      document.getElementById("Server1761-processes").innerHTML = data.processes
        .filter(s => s.Name.value.includes("company_name"))
        .map(s => `<tr><td>${s.Name.value}</td> <td>${s.Status}</td></tr>`)
        .join('\n');
      $('#Server1761-processes').html(function (_, html) { return html.replace(/runn1ng/g, "<img src='smallgreen.png' />"); });
      $('#Server1761-processes').html(function (_, html) { return html.replace(/fai1ed/g, "<img src='smallred.png' />"); });
    }); // you will have to concatenate the entire services request line on line 130
  }

  // special jQuery syntax to run when document is ready (like onload but better)
  $(function () {
    setServerProcessesServer1761();
  });
</script>

s.Name.value.replace("company_name_", "");
The above replaces the company_name_ part of the string with nothing (an empty string). It uses the replace method of the String object:
String.prototype.replace(pattern (String | RegExp), replacement (String | Function))
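Applied to the snippet from the question, the replace call can go directly inside the .map() callback so the prefix is stripped before each row is built. A minimal sketch, reusing the same data.processes / Name.value / Status structure shown above:

$.getJSON(url, function (data) {
  document.getElementById("Server1761-processes").innerHTML = data.processes
    .filter(s => s.Name.value.includes("company_name"))
    // strip the prefix before rendering the name cell
    .map(s => `<tr><td>${s.Name.value.replace("company_name_", "")}</td> <td>${s.Status}</td></tr>`)
    .join('\n');
});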

Related

cypress: comparing information in 2 sites

I'm starting with cypress and need to compare 2 different environments.
I wrote a script, but it is not working properly.
My goal is:
1 - search for a specific selector's value in 2 different environments.
2 - get its value (in both environments), and then compare them for equality.
The comparison below works, but it seems like very poor code: it stops at the first failing assert, and I can't query the reference page by selector, only by text.
Any help is appreciated.
describe('Testing Page', function() {
  // urls i need to test
  var relative_urls = [
    '/products/test1',
    '/products/test2',
  ]
  relative_urls.forEach((url) => {
    // each url is compared here...
    var productInfo = [];
    // here go the environment URLs.
    var testURL = 'https://www.myurl.com' + url;
    var referenceURL = 'http://test.myurl.com' + url;
    it('Comparing data from url:' + url, function() {
      cy.visit(testURL)
      // get data from selector and add it to array
      cy.get('body').find(".myselector h1").should(($input) => {
        productInfo.push($input.val())
      })
      cy.get('body').find(".myselector h2").should(($input) => {
        productInfo.push($input.val())
      })
      // requesting second url
      cy.request(referenceURL)
        .its('body').should(($input) => {
          for (var j = 0; j < productInfo.length; j++) {
            // expect works, but it compares against the whole page, and i need to search within a specific selector.
            // Also, when it hits the first error, it stops and does not check all elements of the array
            expect($input.includes(productInfo[j]), 'Notice: ' + productInfo[j]).to.be.true
          }
        })
    })
  })
})
From reading the documentation, cy.request is really making a plain HTTP request without doing any parsing, which means you basically have to parse the body of the response yourself. cy.visit will actually get the DOM elements so you can use Cypress's query syntax to navigate the page.
I think once you have the element values from the first page you should just do cy.visit again and parse the second page.
EDIT: Apparently you can't use cy.visit cross-domain. In that case, maybe you could try parsing the response body into a DOM node like this:
var el = document.createElement( 'html' );
el.innerHTML = response.body; // the response object is what cy.request yields
Then use el.querySelector to navigate the DOM tree using CSS syntax.
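Putting that together, here is a minimal sketch of comparing one field this way. The .myselector h1 selector and referenceURL come from the script above; the productInfo indexing is illustrative and assumes the earlier cy.get pushed the h1 value first:

cy.request(referenceURL).its('body').then((body) => {
  // parse the raw HTML string into a detached DOM node
  var el = document.createElement('html');
  el.innerHTML = body;

  // query the reference page with the same selector used on the test page
  var refH1 = el.querySelector('.myselector h1');
  expect(refH1, 'reference .myselector h1 exists').to.not.be.null;
  expect(refH1.textContent.trim(), 'h1 matches test environment').to.equal(productInfo[0]);
});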

Append new files

I have an API running that fetches all the file names in a directory and returns an array inside an array. Currently, I call it every second to check whether a new file has been added and, if so, append it to my div. The issue is that I have to empty the HTML every time and then re-append everything. Is there a better way to do this, so that I only append new filenames rather than all of them again?
setInterval(function () {
  $.ajax({
    url: 'getFiles',
    success: function (data) {
      $("#pics").html("");
      console.log(data);
      $.each(data, function (k, o) {
        $.each(o, function (key, obj) {
          $("#pics").append("<a href='#'>" + obj + "</a>");
        });
      });
    }
  });
}, 1000);
let images = []; // must be let (or var), because it gets reassigned below
setInterval(function () {
  $.ajax({
    url: 'getFiles',
    success: function (data) {
      const fetchedImages = data.images;
      if (images.length !== fetchedImages.length) { // the lists do not have the same number of elements
        images = fetchedImages;
        $("#pics").html("");
        const domImages = fetchedImages.map(image => "<a href='#'>" + image + "</a>");
        $("#pics").append(domImages.join(''));
      }
    }
  });
}, 1000);
From our discussion I was able to create this solution.
Since you know you only need a list of images, you can just get it directly.
Then you can check whether the locally saved list has the same number of elements as the one you got from the server.
If they do not match, the list must have changed (a caveat: if someone only renamed a file, the length would stay the same and the change would be missed).
Now we just empty the #pics HTML and create a new array in which each element is wrapped in an <a> tag.
Lastly, join converts the array to a string; passing '' means nothing is placed between the elements, so the string looks like this:
"<a href='#'>image1.jpg</a><a href='#'>image2.jpg</a><a href='#'>image3.jpg</a>"
In your case, I would suggest keeping the current implementation: clear everything and regenerate the list from the newly received data. The reasons are:
From a performance point of view, clearing and regenerating is faster than comparing each item to decide whether to keep it, remove an old item, or insert a new one.
The order of the items easily stays the same as in the received data, and won't get mixed up with the old list.
The rule I suggest is: if the new list is exactly the same, return without changing anything; if the list has changed, clear and rebuild it, as sketched below.
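A minimal sketch of that rule, reusing the polling code from above (the getFiles URL, data.images field and #pics element come from the question; the deep-equality check via JSON.stringify is illustrative):

let images = [];

setInterval(function () {
  $.ajax({
    url: 'getFiles',
    success: function (data) {
      const fetchedImages = data.images;
      // if nothing changed, return without touching the DOM
      if (JSON.stringify(fetchedImages) === JSON.stringify(images)) {
        return;
      }
      // otherwise clear and rebuild the whole list in the received order
      images = fetchedImages;
      $("#pics").html(fetchedImages.map(image => "<a href='#'>" + image + "</a>").join(''));
    }
  });
}, 1000);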

How to store and get data from a json file with javascript?

I would like to create a small kind of CMS. For that I want to write a function which automatically creates a list of the files that were created with the CMS. This list should be displayed in an HTML file and styled with CSS. For that I want to create a JSON file in which I store the title and the location of each new file.
Then it should look like this:
{
  "new_sites": {
    "title": "source",
    "otherTitle": "otherSource"
  }
}
Now I want to know how I can read (or store new) data from the JSON file and use it as variables in JavaScript, so that I can display it on the HTML page.
Given these three variables:
var myNewSite = 'newSite';
var myNewTitle = 'newTitle';
var myNewSource = 'newSource';
and your final json variable initialized in this way:
var myJson = {};
You could simply:
myJson[myNewSite] = {
  [myNewTitle]: myNewSource // computed property name, so the value of myNewTitle becomes the key
};
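For reference, serializing the result shows how the variable values end up as keys (output reflects the computed-key version above):

console.log(JSON.stringify(myJson, null, 2));
// {
//   "newSite": {
//     "newTitle": "newSource"
//   }
// }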
You can use the jQuery API to get data from a JSON file. Suppose you have a data.json file in the directory:
<script>
  var container = {};
  $.getJSON('data.json', function (data) {
    // note: $.getJSON is asynchronous, so container is only populated once this callback runs
    container = data;
  });
</script>
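Once the data is loaded, you can turn it into markup inside that callback. A minimal sketch, assuming the new_sites structure from the question and a placeholder <ul id="site-list"> element in the page (the list id and link markup are illustrative):

$.getJSON('data.json', function (data) {
  $.each(data.new_sites, function (title, source) {
    // each key is a title and each value is the file location
    $('#site-list').append('<li><a href="' + source + '">' + title + '</a></li>');
  });
});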

Write to an existing js file using javascript

Suppose I have a file file1.js on the server.
My file1.js contains the following code:
var arr = [ {'id':1, 'name':'one'},
{'id':2, 'name':'two'} ]; // array of two objects
There are two input boxes:
ID: <input id>
Name: <input name>
<button type="submit" action="xyz">
Now on clicking the submit button, I want to open file1.js and add {'id':<user_entered>, 'name':<user_entered>} i.e. an object to the existing array.
Is it doable? I don't want to involve any database here.
arr.push(..)
is only a temporary solution, i.e. if I open the web page on another PC or refresh the page, the rendered source will not contain an array of 3 objects... isn't that right?
If you are in a server-side environment you could first load the file using fs.readFile, parse the JSON into a normal JS array, manipulate the array however you need (i.e. append the extra object), and then write the serialized JSON back to the same path.
It would look something like this:
const fs = require('fs');     // Node.js file system module
const path = require('path'); // Node.js path helper

fs.readFile(path.join(__dirname, 'myFile.json'), 'utf-8', (err, data) => {
  if (err) {
    console.log('Error reading file');
    return;
  }
  const myArray = JSON.parse(data);
  myArray.push({ new: 'object' });
  fs.writeFile(path.join(__dirname, 'myFile.json'), JSON.stringify(myArray), e => {
    if (e) {
      console.log('Error writing file');
      return;
    }
    console.log('Success');
  });
});
If you are only building for the web, however, this sounds like a job for localStorage. It would work very similarly:
const savedArray = localStorage.getItem('my_saved_array');
const myArray = savedArray ? JSON.parse(savedArray) : []; // default to an empty array, since we push below
myArray.push({ new: 'object' });
localStorage.setItem('my_saved_array', JSON.stringify(myArray));
It is possible for JSON.parse to throw an error, so you may want to wrap this in a try-catch. This approach also gets around the issue of opening the page in a different tab, as localStorage is persisted per site within the same browser. If you were to open the site in Chrome and then in Safari, however, localStorage will not sync.
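A minimal sketch of that try-catch wrapping, using the same 'my_saved_array' key as above:

let myArray = [];
try {
  const savedArray = localStorage.getItem('my_saved_array');
  myArray = savedArray ? JSON.parse(savedArray) : [];
} catch (err) {
  // corrupted or hand-edited data: fall back to an empty array
  console.log('Could not parse saved array', err);
}
myArray.push({ new: 'object' });
localStorage.setItem('my_saved_array', JSON.stringify(myArray));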
Hope this helps.

Gmail API - Parse message content (Base64 decoding?) with Javascript

I'm trying to use the Gmail API to get a user's email, grab the message subject and body, and then display it on a webpage. I'll be doing other stuff with it, but this is the part that I am having difficulty with. I am using Angular.js.
Here is my API call:
function makeApiCall() {
  gapi.client.load('gmail', 'v1', function() {
    var request = gapi.client.gmail.users.messages.list({
      labelIds: ['INBOX']
    });
    request.execute(function(resp) {
      var content = document.getElementById("message-list");
      angular.forEach(resp, function(message) {
        var email = gapi.client.gmail.users.messages.get({'id': message.id});
        // var raw = email.payload.parts;
        // console.log(raw);
        content.innerHTML += JSON.stringify(email) + "<br>";
      })
    });
  });
}
So gapi.client.gmail.users.messages.list returns an array of my messages, with their ID numbers. That is working.
The call to gapi.client.gmail.users.messages.get({<specific message ID>}) outputs this - {"B":{"method":"gmail.users.messages.get","rpcParams":{},"transport":{"name":"googleapis"}}}.
Not sure what that is, but trying to get the message payload (email.payload.parts) results in undefined. So, how can I get the message content?
Also, I would assume that if I can get the message contents, I would then have to Base64 decode them to get some English out of it. Any suggestions for that would be of great help too. I've found this: https://github.com/kvz/phpjs, but since I'm not sure how to get the message contents in the first place, I'm not sure whether that php.js library is of any help in that regard.
Regarding the Base64 decoding, you can use
atob(dataToDecode)
For Gmail, you'll also want to replace some characters:
atob( dataToDecode.replace(/-/g, '+').replace(/_/g, '/') );
The atob function is built into JavaScript in the browser, so there is no need to install anything extra; I use it myself to decode Gmail messages. As an interesting tangent, if you want to encode a message to Base64, use btoa.
Now, for accessing your message payload, you can write a function:
var extractField = function(json, fieldName) {
  return json.payload.headers.filter(function(header) {
    return header.name === fieldName;
  })[0].value;
};
var date = extractField(response, "Date");
var subject = extractField(response, "Subject");
This is referenced from a previous SO question of mine. And to get the HTML body part:
var part = message.parts.filter(function(part) {
  return part.mimeType == 'text/html';
})[0]; // filter returns an array, so take the first matching part
var html = atob(part.body.data.replace(/-/g, '+').replace(/_/g, '/'));
Depending on what your emails look like (single text/plain part? multipart with text/html? attachments, etc?) you may or may not have any "parts" in your email.payload and instead you'll have what you're looking for in "email.payload.body.data" (for single-part messages). This is all assuming you're doing a message.get with the default format ("full"). If you instead want to get the entire email in the message.raw field and deal with it in email libraries for your language you can call message.get(format=raw).
For more info check out the "body" and "parts[]" field documentation for "Message" at https://developers.google.com/gmail/api/v1/reference/users/messages
Ah! I figured it out. parts is an array, so I should have been calling it like: gapi.client.gmail.users.messages.get({'id': <message ID>}).payload.parts[0].body.data
Now my problem is decoding the emails, which is proving successful in plain text emails, but failing in emails from non-personal locations (businesses, social media update emails, etc.). But I'll make a new question to get answers for that.
You need to search for where the body with a given MIME type is located; I have written a recursive function for that:
function searchBodyRec(payload, mimeType){
  if (payload.body && payload.body.size && payload.mimeType === mimeType) {
    return payload.body.data;
  } else if (payload.parts && payload.parts.length) {
    return payload.parts.flatMap(function(part){
      return searchBodyRec(part, mimeType);
    }).filter(function(body){
      return body;
    });
  }
}
So now you can call
var encodedBody = searchBodyRec(this.message.payload, 'text/plain');
See the flatMap method up there? It is a classic FP method that used to be missing from JS (modern engines now ship Array.prototype.flatMap natively); here is how to add it yourself for older environments (or you can use lodash.js or underscore.js if you don't want to mess with the native objects):
Array.prototype.flatMap = function(lambda) {
  return Array.prototype.concat.apply([], this.map(lambda));
};
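Putting the decoding pieces together, a short usage sketch based on the snippets in this thread (searchBodyRec and the atob character replacement come from the answers above; the message variable is assumed to be a full-format users.messages.get response):

// find the base64url-encoded body for the MIME type we want
var encodedBodies = searchBodyRec(message.payload, 'text/plain');
// single-part messages return a string, multipart messages return an array
var encodedBody = Array.isArray(encodedBodies) ? encodedBodies[0] : encodedBodies;

// convert from base64url to standard base64, then decode
var decoded = atob(encodedBody.replace(/-/g, '+').replace(/_/g, '/'));
console.log(decoded);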
