Ok, so I am a bit of a noob with JavaScript and I need to read data from a CSV to make a bar chart with d3. The bar chart is no problem for me; reading from the CSV file is. This is my code:
var dataset;

d3.csv("gender_ratio.csv", function(data) {
    dataset = data;
    return dataset;
});

var add = function(year, total, males, females){
    var year = {
        year: year,
        total: total,
        males: males,
        females: females
    };
    newdata.push(year);
    return newdata;
};

for (var i = 0; i < dataset.length; i += 4){
    add(dataset[i], dataset[i+1], dataset[i+2], dataset[i+3]);
    return newdata;
};
Can someone tell me what is going wrong here? I am running this in Mozilla Firefox, so browser security isn't the problem here.
The call to load the csv data completes asynchronously. That means your for loop is run before the data has been loaded.
If you move the for loop into the callback function of the call to d3.csv then the data will be available.
You should also check what the returned data looks like for d3.csv. Your code assumes it is returning a flat array, whereas it actually returns an array of objects where each element represents a row. If you add a console.log in the callback function you'll get a better sense of what the data looks like.
You also have a return statement in that for loop which means it'll only process the first element of data before exiting the loop.
d3.csv("gender_ratio.csv", function(data) {
dataset = data;
// process data here
console.log(data);
});
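Putting both points together, here is a minimal sketch with the loop moved into the callback, keeping the question's one-argument callback style; drawBarChart is a hypothetical stand-in for the charting code, and the column names are taken from the add function above:

d3.csv("gender_ratio.csv", function(data) {
    var newdata = [];
    // each element of data is an object keyed by column name
    data.forEach(function(row) {
        newdata.push({
            year: row.year,
            total: +row.total, // coerce the string values to numbers
            males: +row.males,
            females: +row.females
        });
    });
    drawBarChart(newdata); // hypothetical drawing function
});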
First, d3's .csv function works asynchronously, so you need to call the actual bar chart drawing function within the .csv callback. If the CSV file has a header row with column names, you can use a row accessor together with the callback:
var dataset = [];

d3.csv("gender_ratio.csv", function(d) {
    return {
        year: d.year,
        total: d.total,
        males: d.males,
        females: d.females,
    };
}, function(error, rows) {
    dataset = rows;
    drawBarChart(); /* <<-- This would be the call to the drawing function. */
});
Related
The script must log postback information about call details from zvonok.com to Google Spreadsheets. I have written a function which only appends a row to the spreadsheet; nothing in the code updates or modifies any cell. During a few manual test calls the rows were appended correctly, but when my client began his usual call campaign, with calls and postbacks arriving very often one after another, the values in the last row began changing several times, and in some cases strange values were left behind.
This is the first time I have seen behavior like this, so I made a short video recording:
https://youtu.be/0_H_mVAbp4g
Here is one column with strange values:
2103052006092385
2,10305E+15
210305412464544
I have found 9 such cases among 248 rows.
The client has shown me an Excel export from his user cabinet; in total 5649 calls were made, so there should be 5649 rows in the Google spreadsheet instead of 248.
function getJsonFromUrl(url) {
    var query = url;
    var result = {};
    if (query == undefined) {
        return result;
    }
    query.split("&").forEach(function(part) {
        var item = part.split("=");
        result[item[0]] = decodeURIComponent(item[1]);
    });
    return result;
}
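For reference, a quick sketch of what that parser produces (the query string here is made up for illustration):

var parsed = getJsonFromUrl("a=1&b=two%20words");
// parsed is { a: "1", b: "two words" }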
function doGet(e) {
    const ctCompl = 'ct_completed';
    var doc = SpreadsheetApp.openById(SHEET_KEY);
    var sheet = doc.getSheetByName(SHEET_NAME);
    var row = [];
    if (typeof e !== 'undefined') {
        var mArr = getJsonFromUrl(e.queryString);
        for (var i in mArr) {
            if (i == ctCompl) {
                row.push(convTimeLong(mArr[i]));
            } else {
                row.push(mArr[i]);
            }
        }
        sheet.appendRow(row);
    } else {
        sheet.appendRow(['e undefined!']);
    }
    SpreadsheetApp.flush();
    return handleResponse(e);
}
function convTimeLong(dateTime) {
    try {
        // the timestamp arrives with '+' between date and time; make it ISO-like
        let dt = dateTime.replace('+', 'T');
        let d = new Date(dt);
        return Utilities.formatDate(d, "GMT+2", "dd.MM.yyyy HH:mm");
    } catch (e) {
        return dateTime;
    }
}
The executions dashboard shows status "completed" everywhere; the longest execution time is 1.688 s.
The client has set a delay of 5 seconds between calls. I don't know yet whether the percentage of lost postbacks decreased after the delay was set, but it is still very high.
In general, using Google Sheets as a database is a bad idea. It's not designed for this, so it could fail really badly. Using a proper database will make everything much, much easier. If you are using the spreadsheet to then cook the data, I'd advise using a function that imports data, like IMPORTXML (see reference).
That being said, if you insist on using Sheets, you could try using locks:
function appendRow(sheet, row) {
    const lock = LockService.getScriptLock()
    while (!lock.tryLock(100000)) /* Spin the lock until it gets acquired */;
    try {
        sheet.appendRow(row)
        SpreadsheetApp.flush()
    } finally {
        lock.releaseLock()
    }
}
To use it, you only need to pass the sheet and the values to add: change sheet.appendRow(row) to appendRow(sheet, row).
It will make sure that entries don't get overridden. Note that this will slow down the code a lot and the script can time out if there are a lot of requests.
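Applied to the doGet above, only the append calls change (a sketch; buildRowFromQuery is a hypothetical stand-in for the row-building loop already shown in doGet):

function doGet(e) {
    var doc = SpreadsheetApp.openById(SHEET_KEY);
    var sheet = doc.getSheetByName(SHEET_NAME);
    var row = buildRowFromQuery(e); // hypothetical helper wrapping the loop from above
    appendRow(sheet, row);          // lock-protected append instead of sheet.appendRow(row)
    // no separate SpreadsheetApp.flush() needed; appendRow already flushes
    return handleResponse(e);
}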
I'm working in JavaScript on an HTML page, trying to use Solr query data in a table.
So what I want to do is get a bunch of different responses to solr queries, and store them in an array as they come in. Then return each of these results to their relevant spot in a graphic chart to display my information.
What I am having trouble doing is storing the result from JSON in a variable for later use. However it won't save, as the results show up as undefined when checked. Yet if I assign the response to some HTML location like with $('#yoyo') from within function on_data(data), it seems to work just fine.
So I can get results live, but I can't store them for some reason?
From reading lots of other posts it seems it could be something to do with it being asynchronous. It feels to me like the functions are running out of time, like it's trying to return the answer before any value has been assigned.
Can somebody show me where I'm going wrong here?
if (typeof variable !== 'undefined') {
    var global_tick = -1
}

var results = []
var testB = 'query response failed';
$('#testB').prepend('<div>' + testB + '</div>');

function on_data(data) {
    results[global_tick] = parseInt(data.response.numFound);
    $('#yoyo').prepend('<div>' + results[global_tick] + '</div>');
    global_tick = global_tick + 1;
}

function partyQuery(party, region) {
    pQuery(party, region)
    var res = pResult()
    return res
}

function pQuery(party, region) {
    var url = 'http://localhost:8983/solr/articles/select?q=text:' + party + '+AND+region:' + region + '&wt=json&callback=?&json.wrf=on_data';
    $.getJSON(url);
}

function pResult() {
    return results[global_tick]
}

// Parties
var pqFG = partyQuery("\"Fine Gael\"", 'ROI');
var pqFF = partyQuery("\"Fianna Fail\"", 'ROI');

// Load the Visualization API and the corechart package.
google.charts.load('current', {'packages':['corechart']});

// Set a callback to run when the Google Visualization API is loaded.
google.charts.setOnLoadCallback(drawChart);

// Callback that creates and populates a data table,
// instantiates the pie chart, passes in the data and
// draws it.
function drawChart() {
    // Party data table.
    var data_party = new google.visualization.DataTable();
    data_party.addColumn('string', 'Party');
    data_party.addColumn('number', 'Mentions');
    data_party.addRows([
        ['Fine Gael', pqFG],
        ['Fianna Fáil', pqFF],
    ]);
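The hunch about asynchrony is right: $.getJSON returns immediately, so pResult() runs before on_data has stored anything. Below is a minimal sketch of one way to restructure this, drawing only after every response has arrived; collect and partyCounts are illustrative names, json.wrf=? lets jQuery generate the JSONP callback, and it assumes google.charts has already finished loading:

var partyCounts = {};
var pending = 2; // number of queries still in flight

function partyQuery(party, region, done) {
    // json.wrf=? makes jQuery substitute its generated JSONP callback name
    var url = 'http://localhost:8983/solr/articles/select?q=text:' + party +
              '+AND+region:' + region + '&wt=json&json.wrf=?';
    $.getJSON(url, function(data) {
        done(parseInt(data.response.numFound, 10));
    });
}

function collect(name) {
    return function(count) {
        partyCounts[name] = count;
        pending -= 1;
        if (pending === 0) {
            drawChart(); // draw once, after both counts are stored
        }
    };
}

partyQuery("\"Fine Gael\"", 'ROI', collect('Fine Gael'));
partyQuery("\"Fianna Fail\"", 'ROI', collect('Fianna Fáil'));
// drawChart would then read partyCounts['Fine Gael'] etc. instead of pqFG/pqFF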
This is a follow-up question of the question here.
I would like to load several datasets using d3.csv and d3.json and then combine those datasets using d3.zip. In the example below I use only two. The first dataset will be stored in xyData and the second one in colData. My goal is to call something like
var combinedData = d3.zip(colData, xyData);
however, since these datasets are only accessible inside the d3.csv and d3.json scope, respectively, that does not work. Is there any workaround for that? How would one deal with that if one has even more datasets to load?
The first dataset looks like this:
//xyData.csv
x,y
0,0.00e+000
0.6981317,6.43e-001
1.3962634,9.85e-001
My JSON dataset looks as follows:
//colData.json
{
    "el1": [
        {"color": "green"},
        {"color": "purple"},
        {"color": "brown"}
    ],
    "el2": [
        {"color": "black"},
        {"color": "red"},
        {"color": "yellow"}
    ],
    "el3": [
        {"color": "brown"},
        {"color": "yellow"},
        {"color": "blue"}
    ]
}
I read these datasets in as follows:
// using forEach
var xyData = [];
d3.csv("xyData.csv", function(myData) {
    myData.forEach(function(d) {
        d.x = +d.x; // convert strings to numbers
        d.y = +d.y;
    });
    console.log(myData[1]);
    xyData = myData;
    console.log(xyData[1]);
});
console.log(xyData); // this will be an empty array

// loading the JSON data
var colData = [];
d3.json("colData.json", function(error, jsonData) {
    if (error) return console.warn(error);
    colData = jsonData;
    console.log(colData);
    console.log(colData.el1[0]);
});
console.log(colData); // this will be an empty array

// my goal would be:
// var combinedData = d3.zip(colData, xyData);
My console.log looks like this:
Array [ ]
Array [ ]
Object { x: 0.6981317, y: 0.643 }
Object { x: 0.6981317, y: 0.643 }
Object { el1: Array[3], el2: Array[3], el3: Array[3] }
Object { color: "green" }
Which shows that loading the data works as expected. But storing them as global variables does not work due to the asynchronous nature of these data loaders (therefore, the two arrays are still empty).
My question is: What is the best way to combine two datasets to one dataset?
D3.js can actually process a JavaScript object instead of a file. If you replace the file name with the name of the variable storing the data (let's say, a JSON array), as in d3.json(myData){...}, it will have access to that data.
Let's say we are using jQuery and we also include a helper library called Papa Parse (it makes life easier).
Step 1. Turn your CSV data into JSON data and store it in a variable A:
var A = Papa.parse(yourCSV);
Step 2. Read your JSON data and store it in a variable called B
var B;
$(document).ready(function() {
    $.getJSON('yourJSON.json', function(json) {
        B = json;
    });
});
Step 3. Combine datasets A and B into variable C. IMPORTANT: you might need to reformat the CSV-derived JSON stored in A so it looks the way you expect before we give it to D3 later.
var C={};
$.extend(C, A, B);
Step 4. Give C to D3
d3.json(C, function(error, jsonData) {
    // Use data here to do stuff
});
I've used the above as a work around in my own projects.
You might be able to try calling D3.json within D3.csv, but I haven't tried this before:
d3.csv("A.csv", function(errorA, dataA) {
d3.json("B.json", function(errorB, dataB) {
// Use data to do stuff
});
});
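If that works, the innermost callback is the one place where both results are in scope, so the combination from the question can happen there. A sketch, mirroring the callback signatures used just above; pairing colData.el1 with the CSV rows is just one illustrative choice, since d3.zip wants arrays:

d3.csv("xyData.csv", function(errorA, xyData) {
    d3.json("colData.json", function(errorB, colData) {
        if (errorA || errorB) return console.warn(errorA || errorB);
        // both datasets are in scope here
        var combinedData = d3.zip(colData.el1, xyData); // d3.zip pairs arrays element-wise
        console.log(combinedData);
    });
});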
Since you said you have jQuery available (*), we can use its Deferred feature to manage the two asynchronous operations you are looking at.
We are doing this by converting D3's callback-based approach into a promise-based approach.
For that, we set up two helper functions that wrap D3's .csv and .json helpers and return jQuery promises:
d3.csvAsync = function (url, accessor) {
    var result = $.Deferred();
    this.csv(url, accessor, function (data) {
        if (data) {
            result.resolve(data);
        } else {
            result.reject("failed to load " + url);
        }
    });
    return result.promise();
};

d3.jsonAsync = function (url) {
    var result = $.Deferred();
    this.json(url, function (error, data) {
        if (error) {
            result.reject("failed to load " + url + ", " + error);
        } else {
            result.resolve(data);
        }
    });
    return result.promise();
};
Now we can invoke the requests in parallel and store them in variables. We can use .then() to transform the results on the fly, as well:
var colDataReq = d3.jsonAsync("colData.json");
var xyDataReq = d3.csvAsync("xyData.csv").then(function (data) {
    data.forEach(function (d) {
        d.x = +d.x;
        d.y = +d.y;
    });
    return data;
});
Finally, we use the $.when() utility function to wait on both resources and have them handled by a single callback.
$.when(xyDataReq, colDataReq).done(function (xyData, colData) {
    var combinedData = d3.zip(colData, xyData);
    // now do something with combinedData
}).fail(function (error) {
    console.warn(error);
});
This way we can avoid nesting (and therefore needlessly serializing) the two requests.
Also, since the requests are stored in variables, we can simply re-use them without having to change our existing functions. For example, if you wanted to log the contents of one of the requests, you could do this anywhere in your code:
xyDataReq.done(function (data) {
    console.log(data);
});
and it would run as soon as xyDataReq has returned.
Another consequence of this approach is that — since we have decoupled loading a resource from using it — we can perform the loading very early, even before the rest of the page has rendered. This can save additional time.
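For example, both requests can be started at the top of the script and consumed once the DOM is ready (a sketch; drawChart is a stand-in for whatever rendering function you use):

// start both downloads immediately, before the DOM has finished parsing
var colDataReq = d3.jsonAsync("colData.json");
var xyDataReq = d3.csvAsync("xyData.csv");

$(function () {
    // by the time the DOM is ready, the downloads have had a head start
    $.when(xyDataReq, colDataReq).done(function (xyData, colData) {
        drawChart(d3.zip(colData, xyData)); // stand-in rendering call
    });
});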
I'm learning FRP using Bacon.js, and would like to assemble data from a paginated API in a stream.
The module that uses the data has a consumption API like this:
// UI module, displays unicorns as they arrive
beautifulUnicorns.property.onValue(function(allUnicorns) {
    console.log("Got " + allUnicorns.length + " Unicorns");
    // ... some real display work
});
The module that assembles the data requests sequential pages from an API and pushes onto the stream every time it gets a new data set:
// beautifulUnicorns module
var curPage = 1
var stream = new Bacon.Bus()
var property = stream.toProperty()
property.onValue(function(){}) // You have to add an empty subscriber, otherwise future onValues will not receive the initial value. https://github.com/baconjs/bacon.js/wiki/FAQ#why-isnt-my-property-updated
var allUnicorns = [] // !!! stateful list of all unicorns ever received. Is this idiomatic for FRP?
var getNextPage = function() {
    /* get data for subsequent pages.
       Skipping for clarity */
}
var gotNextPage = function (resp) {
    Array.prototype.push.apply(allUnicorns, resp) // just adds the responses to the existing array reference
    stream.push(allUnicorns)
    curPage++
    if (curPage <= pageLimit) { getNextPage() }
}
How do I subscribe to the stream in a way that provides me a full list of all unicorns ever received? Is this flatMap or similar? I don't think I need a new stream out of it, but I don't know. I'm sorry, I'm new to the FRP way of thinking. To be clear, assembling the array works, it just feels like I'm not doing the idiomatic thing.
I'm not using jQuery or another ajax library for this, so that's why I'm not using Bacon.fromPromise
You also may wonder why my consuming module wants the whole set instead of just the incremental update. If it were just appending rows that would be OK, but in my case it's an infinite scroll that should draw data only if both: 1. the data is available, and 2. the area is on screen.
This can be done with the .scan() method. You will also need a stream that emits the items of one page; you can create it with .repeat().
Here is a draft (sorry, not tested):
var itemsPerPage = Bacon.repeat(function(index) {
    var pageNumber = index + 1;
    if (pageNumber <= PAGE_LIMIT) {
        return Bacon.fromCallback(function(callback) {
            // your method that talks to the server
            getDataForAPage(pageNumber, callback);
        });
    } else {
        return false;
    }
});

var allItems = itemsPerPage.scan([], function(allItems, itemsFromAPage) {
    return allItems.concat(itemsFromAPage);
});
// Here you go
allItems.onValue(function(allUnicorns) {
    console.log("Got " + allUnicorns.length + " Unicorns");
    // ... some real display work
});
As you noticed, you also won't need the .onValue(function(){}) hack or the external curPage state.
Here is a solution using flatMap and fold. When dealing with the network you have to remember that the data can come back in a different order than you sent the requests; that's why the code combines fold with a map that sorts by page.
var pages = Bacon.fromArray([1, 2, 3, 4, 5])

var requests = pages.flatMap(function(page) {
    return doAjax(page)
        .map(function(value) {
            return {
                page: page,
                value: value
            }
        })
}).log("Data received")

var allData = requests.fold([], function(arr, data) {
    return arr.concat([data])
}).map(function(arr) {
    // I would normally write this as a oneliner
    var sorted = _.sortBy(arr, "page")
    var onlyValues = _.pluck(sorted, "value")
    var inOneArray = _.flatten(onlyValues)
    return inOneArray
})

allData.log("All data")

function doAjax(page) {
    // This would actually be Bacon.fromPromise($.ajax...)
    // Math.random to simulate the fact that requests can return out
    // of order
    return Bacon.later(Math.random() * 3000, [
        "Page" + page + "Item1",
        "Page" + page + "Item2"])
}
http://jsbin.com/damevu/4/edit
I'm a bit clueless with JavaScript, so I would appreciate pointers on what is (not) happening here.
The following snippet is supposed to populate the data variable with the response (JSON) from a PHP backend. The response variable indeed contains the data (I confirmed with Firebug and a breakpoint):
[Object { identifier=0, value="clothing made in the us"}, Object { identifier=1, value="club penguin trading cards"}, Object { identifier=2, value="cobra quad bikes"}, 22 more...]
However, by the time the return data; line is reached, data contains nothing.
var data = [];
new response.each(function(identifier, item){
    this.include({value: identifier, text: item.text});
}, data);
return data;
I'm having difficulty mapping my knowledge of (eg) Perl's foreach loop onto what's happening here. I'd appreciate any pointers.
Thanks
Solved. The second argument to each is bound as this inside the callback (MooTools-style iteration), so this.include(...) appends each converted object onto data; the stray new keyword and the wrong callback signature were the problem:
var data = [];
response.each(function(obj) {
    this.include({identifier: obj.id, value: obj.descr});
}, data);
return data;
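For anyone mapping this back to Perl's foreach: a plain-JavaScript equivalent of the solved loop, assuming response is an array of {id, descr} objects, would be:

var data = response.map(function (obj) {
    // one {identifier, value} record per response row
    return { identifier: obj.id, value: obj.descr };
});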
I'll eventually get this JS.