Output capitalized names from the randomUser.me API? - javascript

Forgive me if this isn't the right platform to ask this question. And let me preface by saying I'm a designer with very little API and JavaScript experience.
I'm using the randomUser API to generate a JSON file or URL I can feed into Invision's Craft tool for Sketch, so I can use real data in my designs. https://screencast.com/t/jAkwUpUja2. However, it gives the names in lowercase instead of title-case/capitalized.
I'm generating the JSON by typing the endpoints I need in the browser: https://screencast.com/t/E8Cmjk5XSSCk
So, is there a way I can force the API to give me capitalized names? Thanks!
EDIT: here is the JSON url: https://randomuser.me/api/?results=20&nat=us&inc=name,gender,picture&format=pretty

Here is the simplest way to capitalize a string with JS, as far as I know:
// let's assume that you have stored the last name as follows:
let last = 'rodney';
To transform the last name, you apply this pattern:
let capitalizedLast = last[0].toUpperCase() + last.slice(1);
last[0] returns the first letter of the string, r.
last.slice(1) gives the rest of the last name, odney.
toUpperCase() transforms the first letter, and + concatenates both into the final result.
You just need to iterate over the results from your API and transform the elements you need in that way.
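For example, a minimal sketch of that loop over an already-fetched results object (the sample names below are made up for illustration, not taken from the API):

```javascript
// Capitalize every name field in a results object shaped like the API's output.
function capitalize(text) {
  return !text ? text : text[0].toUpperCase() + text.slice(1);
}

// Made-up sample data mirroring the randomuser.me "results" shape.
const data = {
  results: [
    { name: { title: "mr", first: "rodney", last: "smith" } },
    { name: { title: "ms", first: "amy", last: "jones" } }
  ]
};

data.results.forEach(user => {
  for (const key in user.name) {
    user.name[key] = capitalize(user.name[key]);
  }
});

console.log(data.results[0].name); // { title: 'Mr', first: 'Rodney', last: 'Smith' }
```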

A quick look at the documentation suggests that there might not be a way to get the API to return capitalized names directly. So you're going to have to write some JavaScript to do the job for you.
This code should print out the data to the console with all names capitalized.
It iterates through the items in the result array, goes through all properties of the item's name property and capitalizes them.
The capitalize function gets the first character of the name, converts it to upper case and appends it to the rest of the name.
function capitalize(text) {
  return (!text || !text.length)
    ? text
    : text[0].toUpperCase() + text.slice(1);
}

$.get("https://randomuser.me/api/?results=20&nat=us&inc=name,gender,picture&format=pretty",
  function(data) {
    if (!data || !data.results)
      return;
    data.results.forEach(function(user) {
      if (user.name) {
        for (var name in user.name) {
          user.name[name] = capitalize(user.name[name]);
        }
      }
    });
    console.log(data);
  });
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>

Related

CSV string to array when there is \n in body [duplicate]

This question already has answers here:
How to parse CSV data that contains newlines in field using JavaScript
(2 answers)
Closed 10 months ago.
I'm trying to convert a CSV string into an array of arrays of objects. The issue is that there are a bunch of \n characters in the body of the incoming request, which are causing the rows to split and mess up all the code. I'm attempting to handle this even with \n in the body.
The string looks like this; all the messages that are strings in the incoming request start with \" and finish with \".
"id,urn,title,body,risk,s.0.id,s.1.id,s.2.id,a.0.id,a.1.id,a.2.id,a.3.id
302,25,\"Secure Data\",\"Banking can save a lot of time but it’s not without risks. Scammers treat your bank account as a golden target –
it can be a quick and untraceable way to get money from you\n\n**TOP TIPS**\n\n**Always read your banks rules.** These tips don’t replace your banks rules - \
in fact we fully support them. If you don’t follow their rules, you may not get your money back if you are defrauded \n\n**Saving passwords or allowing auto-complete.**
Saving passwords in your browser is great for remembering them but if a hacker is able to access your computer, they will also have access to your passwords.
When on your banking site the password box we recommend you don’t enable the auto-complete function – a hacked device means they are able to gain access using this method \n\n**Use a
PIN number on your device.** It’s really important to lock your device when you’re not using it.\",,2,20,52,1,2,3,4"
I have shortened it since there is a lot of content, but the incoming string is basically the above. The big string that is messing my code up starts at "Banking can save" and finishes at "not using it". I have several other pieces of data with the same type of body, always enclosed in \" body \". I have been attempting to write a function to separate the content of this CSV string into an array of arrays or an array of objects.
This is what I attempted:
function csv_To_Array(str, delimiter = ",") {
  const header_cols = str.slice(0, str.indexOf("\n")).split(delimiter);
  const row_data = str.slice(str.indexOf("\n") + 1).split("\n");
  const arr = row_data.map(function (row) {
    const values = row.split(delimiter);
    const el = header_cols.reduce(function (object, header, index) {
      object[header] = values[index];
      return object;
    }, {});
    return el;
  });
  // return the array
  return arr;
}
I have thought of using a regex too, where I would split on a comma or a \n, although if there is a \" it should wait and split at the next \":
array.split(/,/\n(?!\d)/))
Try this:
csvData.replace(/(\r\n|\n|\r)/gm, "");
Once you've used that to replace the new lines, or removed them, this code will help you get started with understanding how to build an array from the new CSV string:
const splitTheArrayAndLogIt = () => {
  const everySingleCharacter = csvData.split(""); // <-- this is a new array
  console.log(everySingleCharacter);
  const splitAtCommas = csvData.split(",");
  console.log(splitAtCommas);
}
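If you need to keep the newlines that end rows, a hedged alternative (a sketch only, assuming quotes are balanced and fields contain no escaped inner quotes) is to strip newlines inside quoted fields before splitting into rows:

```javascript
// Remove newlines only inside double-quoted fields, so the newlines
// that terminate rows survive. Assumes balanced, non-nested quotes.
function stripQuotedNewlines(csv) {
  return csv.replace(/"[^"]*"/g, field => field.replace(/(\r\n|\n|\r)/g, " "));
}

const sample = 'id,body\n1,"line one\nline two"\n2,"plain"';
const rows = stripQuotedNewlines(sample).split("\n");
console.log(rows); // [ 'id,body', '1,"line one line two"', '2,"plain"' ]
```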

How to parse and format strings out of irregular CSV in javascript?

I've scraped this allergy data string from a public website:
Cedar 679 gr/m3 High, Grass 20 gr/m3 Medium, Trees 80 gr/m3 Medium, Molds Low.
Sometimes the number of items is fewer, but the general format for trees and grasses is always like this, with commas separating each type of allergen:
TYPE AMOUNT g/m3 LEVEL
Molds is the exception; assume it will always be a string of text. Assume we don't require the molds data at all.
What library or technique would you use to parse this into a neat JSON object, for example:
{
  "Cedar": "679",
  "Grass": "20",
  "Trees": "80"
}
As Sam stated in the comments, it'd be ideal to utilize npmjs.com/package/csv-parser
However, if you want to use vanilla JS, I wrote a basic script that works given your input:
// function takes a csv string and returns an object,
// and only includes values with 4 parts
function parseCsv(csvString) {
  let out = {};
  let spacedValues = csvString.split(/,\s*/);
  let values = spacedValues.map(str => str.split(" "));
  values.forEach((value) => {
    if (value.length === 4) {
      // you can change the value from an object to value[1] if you only need the amount
      out[value[0]] = {
        AllergenAmount: value[1],
        AllergenUnits: value[2],
        AllergenLevel: value[3]
      };
    }
    // add an else if here if you want to keep values with more/less than 4 parts of the string
  });
  return out;
}
//wrapper that implements the builtin JSON.stringify method
const csvToJSONString = csvString => JSON.stringify(parseCsv(csvString));
To use it, just pass the csv string into the csvToJSONString function, and it will return a JSON string. You can also change the properties from an object to value[1] if you only needed the amount (commented in code).
I worked with the "csvtojson" module before in a similar situation and it helped a lot.
https://www.npmjs.com/package/csvtojson
You should try csv-parse. I'm using it in my current project and it works like a charm.

OQL in VisualVM v1.4.4 - Get A Class's Field Names

I would like to execute an OQL query in VisualVM (v1.4.4) to retrieve the (non-static) field names for an object.
The OQL documentation describes heap.findClass(className). This returns an object which includes a fields property (an array of field names).
When I execute the following OQL...
heap.findClass('java.io.ByteArrayInputStream').fields;
... it returns an array of 4 field objects (ByteArrayInputStream has 4 fields - buf, count, mark, and pos - I am assuming these are what are being returned):
org.netbeans.lib.profiler.heap.HprofField#56de8c
org.netbeans.lib.profiler.heap.HprofField#56de95
org.netbeans.lib.profiler.heap.HprofField#56de9e
org.netbeans.lib.profiler.heap.HprofField#56dea7
If I then try to manipulate this array, for example to access each field's name and signature properties (as described in the OQL docs), I get no results. I can't even get the length of the array. For example:
heap.findClass('java.io.ByteArrayInputStream').fields.length;
and:
heap.findClass('java.io.ByteArrayInputStream').fields[0];
Both of the above statements return <no results>.
What am I doing wrong? Probably something basic. I'm not very familiar with JavaScript - or with how data is displayed in VisualVM, for that matter.
You need to use the map() function. The following OQL retrieves the field names of the ByteArrayInputStream class:
select map(heap.findClass('java.io.ByteArrayInputStream').fields, 'it.name')
Just to add to the very helpful answer from @Tomas - which I have accepted.
Based on his insight, I can also now do things like this in OQL - using a callback instead of an expression string:
map(heap.findClass('java.io.ByteArrayInputStream').fields, function (it) {
  var res = '';
  res += toHtml(it.name) + " : " + toHtml(it.signature);
  return res + "<br>";
});
The above example is trivial, but it opens up more possibilities.
His answer also made me realize where I was going wrong: OQL uses a JavaScript expression language, which is not exactly the same as JavaScript.

Adding parameters to URL, putting array in query string

Objective
I've built an interactive where people can choose six players to make their all-star team. When they click share to Twitter, my hope is to have a URL containing parameters for all six players, something like website.com/?playerName=picked&playerName=picked, so that people can share their teams.
Question
What is the best way to append parameters to a URL?
How do you put an array into a query string?
You can pass an array in a URL, but you need to serialize it into a string, like this: player[]=one&player[]=two
Here is a little function to automate it.
When building URLs you should always use encodeURIComponent to encode any non-URL-friendly characters. The players are in an array, so we map over it to get a new array whose elements have been encoded.
After that, we simply need to join the array with &.
const players = [
  'player Name 1',
  'playerName2',
  'playerName3'
]

const parameterizeArray = (key, arr) => {
  arr = arr.map(encodeURIComponent)
  return '?' + key + '[]=' + arr.join('&' + key + '[]=')
}

console.log(parameterizeArray('player', players))
Edit: the only difference is the function declaration style; everything else is standard ES5.
function parameterizeArray(key, arr) {
  arr = arr.map(encodeURIComponent)
  return '?' + key + '[]=' + arr.join('&' + key + '[]=')
}
Cleaner:
website.com/?players=player1,player2,player3,player4
Then split the query to get result:
var arrayResult = query.players.split(",")
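A sketch of that comma-separated approach end to end, using the built-in URLSearchParams (the website.com URL is a placeholder from the question):

```javascript
// Build the query string; URLSearchParams percent-encodes each value for us.
const players = ["player 1", "player 2", "player 3"];
const params = new URLSearchParams({ players: players.join(",") });
const url = "https://website.com/?" + params.toString();

// Read it back on the other end.
const query = new URLSearchParams(new URL(url).search);
const arrayResult = query.get("players").split(",");
console.log(arrayResult); // [ 'player 1', 'player 2', 'player 3' ]
```

One caveat: this breaks if a player name itself contains a comma, in which case the player[]= form above is the safer choice.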

optimize search through large js string array?

If I have a large JavaScript string array that has over 10,000 elements, how do I quickly search through it?
Right now I have a JavaScript string array that stores the descriptions of jobs, and I'm allowing the user to dynamically filter the returned list as they type into an input box.
So say I have a string array like so:
var descArr = ["flipping burgers", "pumping gas", "delivering mail"];
and the user wants to search for: "p"
How would I be able to search a string array that has 10000+ descriptions in it quickly?
Obviously I can't sort the description array since they're descriptions, so binary search is out. And since the user can search by "p" or "pi" or any combination of letters, this partial search means that I can't use associative arrays (i.e. searchDescArray["pumping gas"]) to speed up the search.
Any ideas anyone?
Since regular expression engines in current browsers are extremely fast, how about doing it this way? Instead of an array, use one gigantic string and separate the entries with an identifier.
Example:
String "flipping burgers""pumping gas""delivering mail"
Regex: "([^"]*ping[^"]*)"
With the switch /g for global you get all the matches. Make sure the user does not search for your string separator.
You can even add an id into the string with something like:
String "11 flipping burgers""12 pumping gas""13 delivering mail"
Regex: "(\d+) ([^"]*ping[^"]*)"
Example: http://jsfiddle.net/RnabN/4/ (30000 strings, limit results to 100)
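A rough sketch of this technique (sample data only; the original jsfiddle is not reproduced here). One detail worth adding: the user's input should be regex-escaped before it goes into the pattern:

```javascript
// Pack the descriptions into one big string: each entry wrapped in quotes.
const descriptions = ["flipping burgers", "pumping gas", "delivering mail"];
const haystack = '"' + descriptions.join('""') + '"';

function search(term) {
  // Escape regex metacharacters so the user's input is matched literally.
  const safe = term.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
  const re = new RegExp('"([^"]*' + safe + '[^"]*)"', "g");
  const matches = [];
  let m;
  while ((m = re.exec(haystack)) !== null) matches.push(m[1]);
  return matches;
}

console.log(search("ping")); // [ 'flipping burgers', 'pumping gas' ]
```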
There's no way to speed up an initial array lookup without making some changes. You can speed up consecutive lookups by caching results and mapping them to patterns dynamically.
1.) Adjust your data format. This makes initial lookups somewhat speedier. Basically, you precache.
var data = {
  a : ['Ant farm', 'Ant massage parlor'],
  b : ['Bat farm', 'Bat massage parlor']
  // etc
}
2.) Setup cache mechanics.
var searchFor = function(str, list, caseSensitive, reduce){
  str = str.replace(/(?:^\s*|\s*$)/g, ''); // trim whitespace
  var found = [];
  // no 'g' flag: a global regex keeps lastIndex state between .test() calls
  var reg = new RegExp('^\\s?' + str, caseSensitive ? '' : 'i');
  var i = list.length;
  while(i--){
    if(reg.test(list[i])){
      found.push(list[i]);
      if(reduce) list.splice(i, 1); // only remove items that matched
    }
  }
  return found;
}

var lookUp = function(str, caseSensitive){
  str = str.replace(/(?:^\s*|\s*$)/g, ''); // trim whitespace
  if(data[str]) return data[str];
  var firstChar = caseSensitive ? str[0] : str[0].toLowerCase();
  var list = data[firstChar];
  if(!list) return (data[str] = []);
  // we cache on data since it's already a caching object.
  return (data[str] = searchFor(str, list, caseSensitive));
}
3.) Use the following script to create a precache object. I suggest you run this once and use JSON.stringify to create a static cache object. (or do this on the backend)
// we need the searchFor function from above; this might take a while
var preCache = function(arr){
  var chars = "abcdefghijklmnopqrstuvwxyz".split('');
  var cache = {};
  var i = chars.length;
  while(i--){
    // reduce is true, so we're destroying the original list here.
    cache[chars[i]] = searchFor(chars[i], arr, false, true);
  }
  return cache;
}
Probably a bit more code than you expected, but optimization and performance don't come for free.
This may not be an answer for you, as I'm making some assumptions about your setup, but if you have server side code and a database, you'd be far better off making an AJAX call back to get the cut down list of results, and using a database to do the filtering (as they're very good at this sort of thing).
As well as the database benefit, you'd also benefit from not outputting this much data (10000 variables) to a web based front end - if you only return those you require, then you'll save a fair bit of bandwidth.
I can't reproduce the problem. I created a naive implementation, and most browsers do the search across 10,000 15-character strings in a single-digit number of milliseconds. I can't test in IE6, but I wouldn't believe it to be more than 100 times slower than the fastest browsers, which would still be virtually instant.
Try it yourself: http://ebusiness.hopto.org/test/stacktest8.htm (Note that the creation time is not relevant to the issue, that is just there to get some data to work on.)
One thing you could do wrong is trying to render all results, that would be quite a huge job when the user has only entered a single letter, or a common letter combination.
I suggest trying a ready made JS function, for example the autocomplete from jQuery. It's fast and it has many options to configure.
Check out the jQuery autocomplete demo
Using a Set for large datasets (1M+) is around 3500 times faster than Array.includes().
You must use a Set if you want speed.
I just wrote a Node script that needs to look up a string in a 1.3M-element array.
Using Array's .includes() for 10K lookups: 39.27 seconds
Using Set's .has() for 10K lookups: 0.01084 seconds
Use a Set.
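A small sketch of that comparison (the sizes here are scaled down; the 1.3M/10K figures above are the poster's):

```javascript
// Compare linear Array.prototype.includes lookups (O(n) per call)
// against hashed Set.prototype.has lookups (O(1) on average).
const size = 100000;
const items = Array.from({ length: size }, (_, i) => "item" + i);
const itemSet = new Set(items);
const target = "item" + (size - 1); // worst case for the linear scan

console.time("Array.includes x1000");
for (let i = 0; i < 1000; i++) items.includes(target);
console.timeEnd("Array.includes x1000");

console.time("Set.has x1000");
for (let i = 0; i < 1000; i++) itemSet.has(target);
console.timeEnd("Set.has x1000");
```

Building the Set costs time and memory up front, so it pays off when you do many lookups against the same data.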
