I have a list containing folders, and I'm trying to get the count of the total number of files in these folders.
I manage to retrieve a ListItemCollection containing my folders. Then it starts being... picky.
ctx is my ClientContext, and collection my ListItemCollection.
function countFiles()
{
var enumCollection = collection.getEnumerator();
while(enumCollection.moveNext())
{
currentItem = enumCollection.get_current();
var folder = currentItem.get_folder();
if (folder === 'undefined')
return;
ctx.load(folder, 'ItemCount');
ctx.executeQueryAsync(Function.createDelegate(this, function()
{
totalCount += folder.get_itemCount();
}), Function.createDelegate(this, onQueryFailed));
}
}
So it works... half of the time. If I have 6 items in my collection, I get 3 or 4 "The property or field 'ItemCount' has not been initialized" exceptions, and obviously my totalCount is wrong. I just can't seem to understand why, since the executeQueryAsync callback should not fire before the folder is actually loaded.
I'm very new to Javascript, so it may look horrid and be missing some essential code I didn't consider worthy of interest, feel free to ask if it is so.
Referencing closure variables (like folder in this case) from an asynchronous callback is generally a big problem. Thankfully it's easy to fix:
function countFiles()
{
function itemCounter(folder) {
return function() { totalCount += folder.get_itemCount(); };
}
var enumCollection = collection.getEnumerator();
while(enumCollection.moveNext())
{
var folder = enumCollection.get_current().get_folder();
if (folder === undefined) // not a string!
return;
ctx.load(folder, 'ItemCount');
ctx.executeQueryAsync(itemCounter(folder), Function.createDelegate(this, onQueryFailed));
}
}
(You don't need that .createDelegate() call because the function doesn't need this.)
Now, after that, you face the problem of knowing when that counter has been finally updated. Those asynchronous callbacks will eventually finish, but when? You could keep a separate counter, one for each query you start, and then decrement that in the callback. When it drops back to zero, then you'll know you're done.
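For example, here is a minimal sketch of that pending-counter pattern (assuming totalCount and onQueryFailed exist as in the question, plus an onAllDone callback that you supply):
function countFiles(onAllDone)
{
    var pending = 0;
    function itemCounter(folder) {
        return function() {
            totalCount += folder.get_itemCount();
            if (--pending === 0) onAllDone(totalCount); // the last outstanding query just finished
        };
    }
    var enumCollection = collection.getEnumerator();
    while (enumCollection.moveNext())
    {
        var folder = enumCollection.get_current().get_folder();
        if (folder === undefined)
            return;
        pending++; // one outstanding query per folder
        ctx.load(folder, 'ItemCount');
        ctx.executeQueryAsync(itemCounter(folder), onQueryFailed);
    }
}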
Since SP.ClientContext.executeQueryAsync is an async function, the loop will most likely finish before the first callback completes, so the behavior of the code as written can be unexpected.
Instead, I would recommend another, cleaner approach for counting files (including files located under nested folders) using SharePoint JSOM.
How to count the total number of files in List using JSOM
The following function returns the number of files in a list:
function getItemsCount(listTitle, complete){
var ctx = SP.ClientContext.get_current();
var list = ctx.get_web().get_lists().getByTitle(listTitle);
var items = list.getItems(createQuery());
ctx.load(items);
ctx.executeQueryAsync(
function() {
complete(items.get_count());
},
function() {
complete(-1);
}
);
function createQuery()
{
var query = new SP.CamlQuery();
query.set_viewXml('<View Scope="RecursiveAll"><Query><Where><Eq><FieldRef Name="FSObjType" /><Value Type="Integer">0</Value></Eq></Where></Query></View>');
return query;
}
}
Usage
getItemsCount('Documents', function(itemsCount){
console.log(String.format('Total files count in Documents library: {0}',itemsCount));
});
Related
I made a JS file with the data in a const array, as below:
const messages = [
{ date: '2020-1-1', content:'message1'},
]
To make my file cleaner I decided to put the data in a JSON file, and I want to load that data in my JS so I can use it like before.
My JSON looks like this:
[
{
"date":"2020-1-1",
"content":"message1"
}
]
To import my JSON I used this code:
let messages = [];
$.getJSON("messages.json", function(data) {
messages = data;
console.log(messages);
});
The result is that my array is logged in the console, but the variable doesn't work outside the callback. I tried things with Object.keys, but with no better result. I don't use a framework, and I didn't find a solution in other questions here. Any help will be appreciated. Thank you very much!
I don't use a framework
You are using a library, though: $ === jQuery.
To make my file cleaner I decided to put the data in a JSON file
You can just define a constants.js file and load that before your other scripts.
For example,
constants.js
const messages = [
{ date: '2020-1-1', content:'message1'},
]
main.js
alert(messages);
index.html
<script src="constants.js"></script>
<script src="main.js"></script>
@Alvin Stefanus
Because x, z and messages were undefined and kept the code from working, I also added:
let messages,x,z = [];
And now it works perfectly with your solution.
I will also use it, as you suggested, for the other operations.
Thank you very much, it helped with my problem and gave me a new technique.
EDIT:
I also tried deleting this part:
var start = x;
$.getJSON('messages.json', function(data) {
messages = data;
}); <-- this
var end = z;
And it also works! That means the problem was not the async function itself, but that I didn't put the loop within the curly brackets of
$.getJSON('messages.json', function(data) {
//The data is finished being filled here
});
OK, this is probably the issue. $.getJSON() is an async function; the code will not wait at the closing curly bracket of the method:
var start = x;
$.getJSON('messages.json', function(data) {
messages = data;
}); <-- this
var end = z;
The code will run var end = z; before $.getJSON() has finished getting the result, because it is an asynchronous function. In other words, when the code reaches var end = z;, $.getJSON() is still working to get the data and has not finished. That is why messages = data has not been called yet.
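You can watch this ordering yourself with a few console.log calls (a minimal sketch, reusing your messages.json):
console.log('1: before getJSON');
$.getJSON('messages.json', function(data) {
    console.log('3: inside the callback, the data has arrived');
});
console.log('2: after getJSON, but the data is not ready yet');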
So here is what you want to do:
$.getJSON('messages.json', function(data) {
messages = data;
for (const item of messages) {
if (item.date === todayDay) {
console.log(item.content);
var newPara = document.createElement("p");
var textNode = document.createTextNode(item.content);
newPara.appendChild(textNode);
var nodeParent = document.getElementById("titre");
var nodeChild = document.getElementById("child1");
nodeParent.insertBefore(newPara, nodeChild); // insertBefore: appendChild ignores a second argument
}
}
});
Do all your needed operations within the curly brackets of
$.getJSON('messages.json', function(data) {
//The data is finished being filled here
});
This is a callback function. You can learn more about callback functions here.
Some Info
Yes, the loop has to be inside the callback function to run after the data has been retrieved. To give you a better understanding of async functions: you could also put your loop outside, within a timeout function, but you will never want this, because you will not know how long the operation that retrieves the data will take.
For example:
$.getJSON('messages.json', function(data) {
messages = data;
});
setTimeout(function() {
for (const item of messages) {
if (item.date === todayDay) {
console.log(item.content);
var newPara = document.createElement("p");
var textNode = document.createTextNode(item.content);
newPara.appendChild(textNode);
var nodeParent = document.getElementById("titre");
var nodeChild = document.getElementById("child1");
nodeParent.insertBefore(newPara, nodeChild); // insertBefore: appendChild ignores a second argument
}
}
}, 2000); //run after 2 seconds
I'm sure the code above will also work, as getting the data should not take longer than 2 seconds.
Again, this is not a correct way to do it; it is just to give you a better understanding of async functions.
I have a Meteor application which is very "slow", as there are a lot of API calls.
What I am trying to do is break the loading/calls apart.
What I just did is:
I have a loading template via iron-router
I waitOn until the first API call has finished
then I start the next API calls in the Template.myTemplate.rendered function
This was already a big benefit for the speed of my application, but I want to break it up even more, as the second call is in fact more like 5-25 API calls.
So what I am trying to do now, inside the rendered function, is a self-calling function which calls itself as long as there is more to do, and saves the responses inside a session. (For now it just overwrites; I can't even get to that point.)
Template.detail.rendered = function(){
//comma separated list of numbers for the API-Call
var cats = $(this.find(".extra")).attr('data-extra').split(',');
var shop = $(this.find(".extra")).attr('data-shop');
var counter = 0;
var callExtras = function(_counter){
var obj = {
categories : [cats[_counter]],
shop : shop
};
if(_counter <= cats.length){
Meteor.subscribe('extra', obj,function(result){
//TODO big todo... think it over again and do more research
//console.log(_counter);
Session.set('extra',Extra.find('extra').fetch()[0].results);
counter++;
callExtras(counter);
});
}
};
callExtras(counter);
Session.set('loading_msg', '' );
};
Now I again have problems with the reactive parts of my app, described here: Meteor: iron-router => waitOn without subscribe. I can't find a proper way to update my client-side, per-user collection. Also, the docs describe how the publish method creates a new collection (with the new document's ID) here: http://docs.meteor.com/#/full/publish_added
Here is the publish from the server:
Meteor.publish('extra', function(obj){
var that = this;
Meteor.call('extra', obj, function(error, result){
if (result){
//console.log(result);
that.added("extra", "extra", {results: result});
//that.changed('extra','extra',{results: result});
that.ready();
} else {
//that.ready();
}
});
});
So my question is: is there a better way to structure my code from scratch, i.e. to solve the problem differently? If not, how can I achieve this in the cleanest way? To my understanding, this is just a strange way to do it.
EDIT:
For example: can I make a per-user collection (maybe only client-side, like now), push data from the server, and just subscribe to that collection? But then how do I check when the async API call has finished, so I can start the next round and the view gets data piece by piece? I am just confused right now.
My mistake was as simple as I thought: you don't need to use subscribe.
I just added "error, result" in the callback of Meteor.call.
With only "result", the result is always undefined.
var cats = $(this.find(".extra")).attr('data-extra').split(',');
var shop = $(this.find(".extra")).attr('data-shop');
var counter = 0;
var callExtras = function(_counter){
var obj = {
categories : [cats[_counter]],
shop : shop
};
if(_counter < cats.length){ // < rather than <=, so the last call stays inside the array
Meteor.call('extra', obj,function(error,result){
var actual_session = Session.get('extra');
if(!actual_session){ // also covers undefined: Session.get returns undefined if the key was never set
actual_session = [];
}
actual_session = actual_session.concat(result);
Session.set('extra',actual_session);
counter++;
callExtras(counter);
});
}
};
callExtras(counter);
Then in the template helper
"extra" : function(){
return Session.get('extra');
},
I'm learning FRP using Bacon.js, and would like to assemble data from a paginated API in a stream.
The module that uses the data has a consumption API like this:
// UI module, displays unicorns as they arrive
beautifulUnicorns.property.onValue(function(allUnicorns){
console.log("Got "+ allUnicorns.length +" Unicorns");
// ... some real display work
});
The module that assembles the data requests sequential pages from an API and pushes onto the stream every time it gets a new data set:
// beautifulUnicorns module
var curPage = 1
var stream = new Bacon.Bus()
var property = stream.toProperty()
property.onValue(function(){}) // You have to add an empty subscriber, otherwise future onValues will not receive the initial value. https://github.com/baconjs/bacon.js/wiki/FAQ#why-isnt-my-property-updated
var allUnicorns = [] // !!! stateful list of all unicorns ever received. Is this idiomatic for FRP?
var getNextPage = function(){
/* get data for subsequent pages.
Skipping for clarity */
}
var gotNextPage = function (resp) {
Array.prototype.push.apply(allUnicorns, resp) // just adds the responses to the existing array reference
stream.push(allUnicorns)
curPage++
if (curPage <= pageLimit) { getNextPage() }
}
How do I subscribe to the stream in a way that provides me a full list of all unicorns ever received? Is this flatMap or similar? I don't think I need a new stream out of it, but I don't know. I'm sorry, I'm new to the FRP way of thinking. To be clear, assembling the array works; it just feels like I'm not doing the idiomatic thing.
I'm not using jQuery or another ajax library for this, so that's why I'm not using Bacon.fromPromise
You may also wonder why my consuming module wants the whole set instead of just the incremental update. If it were just appending rows that would be OK, but in my case it's an infinite scroll, and it should draw data only if both: 1. data is available, and 2. the area is on screen.
This can be done with the .scan() method. You will also need a stream that emits the items of one page at a time, which you can create with .repeat().
Here is draft code (sorry, not tested):
var itemsPerPage = Bacon.repeat(function(index) {
var pageNumber = index + 1;
if (pageNumber < PAGE_LIMIT) {
return Bacon.fromCallback(function(callback) {
// your method that talks to the server
getDataForAPage(pageNumber, callback);
});
} else {
return false;
}
});
var allItems = itemsPerPage.scan([], function(allItems, itemsFromAPage) {
return allItems.concat(itemsFromAPage);
});
// Here you go
allItems.onValue(function(allUnicorns){
console.log("Got "+ allUnicorns.length +" Unicorns");
// ... some real display work
});
As you may have noticed, you also won't need the .onValue(function(){}) hack, or the external curPage state.
Here is a solution using flatMap and fold. When dealing with the network you have to remember that the data can come back in a different order than the order in which you sent the requests; that's why I use the combination of fold and map.
var pages = Bacon.fromArray([1,2,3,4,5])
var requests = pages.flatMap(function(page) {
return doAjax(page)
.map(function(value) {
return {
page: page,
value: value
}
})
}).log("Data received")
var allData = requests.fold([], function(arr, data) {
return arr.concat([data])
}).map(function(arr) {
// I would normally write this as a oneliner
var sorted = _.sortBy(arr, "page")
var onlyValues = _.pluck(sorted, "value")
var inOneArray = _.flatten(onlyValues)
return inOneArray
})
allData.log("All data")
function doAjax(page) {
// This would actually be Bacon.fromPromise($.ajax...)
// Math random to simulate the fact that requests can return out
// of order
return Bacon.later(Math.random() * 3000, [
"Page"+page+"Item1",
"Page"+page+"Item2"])
}
http://jsbin.com/damevu/4/edit
I have a couple of .csv files that I need to compare against another large .csv file (over 300,000 rows), and I am running into an out-of-memory error on my server. The server has 4 GB of RAM, so I am not sure why this is happening. My code looks like this.
I am using ya-csv to read in the CSV lines:
var csv = require('ya-csv');
var fs = require('graceful-fs');
var async = require('async');
var first_silo = [];
var second_Silo = [];
var combined = [];
var reader = csv.createCsvFileReader('december_raw.csv', {columnsFromHeader:true,'separator': ','});
var first = csv.createCsvFileReader('first_data.csv', {columnsFromHeader:false,'separator': ','});
var second = csv.createCsvFileReader('second_data.csv', {columnsFromHeader:false,'separator': ','})
async.series([
//push data from other .csv files into arrays
function(callback){
first.addListener('data', function(data){
first_silo.push(data[0]);
})
first.addListener('end', function(){
callback();
})
},
function(callback){
second.addListener('data', function(data){
second_silo.push(data[0]);
});
second.addListener('end', function(data){
callback();
});
},
function(callback){
reader.addListener('data', function(data){
//compare the data from reader to each item in the first array and append the items that get a match to a .csv.
for(var i=0;i<first_silo.length;i++){
if(data[0] === first_silo[i]){
fs.appendFileSync('results.csv', data[0]+","+first_silo[i])
break;
}
}
});
},
function(callback){
reader.addListener('data', function(data){
//do the same with the first array as the second.
for(var i=0;i<second_silo.length;i++){
if(data[0] === second_silo[i]){
fs.appendFileSync('results.csv', data[0]+","+second_silo[i]);
break;
}
}
})
}
])
When I do this I don't get past the first_silo comparison. The node app just stops, and I can see an out-of-memory error when I run dmesg.
I have tried to run this program with this flag as well:
--max-old-space-size=3000
I still get the same error.
Is there a smarter way to do this? Any help would be greatly appreciated.
Here's an even more memory-efficient answer, without any assumptions.
In it, you make sure you pass the smallest CSV file as the first argument to a compareRows function.
This really makes sure you're being as memory efficient as possible, by keeping only the smallest possible set in memory.
var csv = require('ya-csv');
var fs = require('graceful-fs');
var smallFileName = ""; // used to see if we need to really reload the file again.
var smaller_silo = [];
compareRows('smaller.csv', 'larger.csv', function(){
    compareRows('smaller.csv', 'anotherLarger.csv', function(){
        smaller_silo = []; // done
    });
});
function compareRows(smallerFileName, largerFileName, callBack){
var reader;
if(smallerFileName !== smallFileName){
smallFileName = smallerFileName;
smaller_silo = []; // reset before loading a different small file
reader = csv.createCsvFileReader(smallerFileName, { columnsFromHeader: true, separator: ','});
reader.addListener('data', function(data){
smaller_silo.push(data[0]);
});
reader.addListener('end', function(){
compareSmallerToLarger(largerFileName, callBack);
});
}
else{
compareSmallerToLarger(largerFileName, callBack);
}
}
function compareSmallerToLarger(largerFileName, callBack){
var csvStream = csv.createCsvFileReader( largerFileName, { columnsFromHeader: false, 'separator':','});
csvStream.addListener('data', function(data){
for (var i = 0; i < smaller_silo.length; i++) {
if(data[0] === smaller_silo[i]){
fs.appendFileSync('results.csv', data[0]+","+smaller_silo[i]);
break;
}
}
});
csvStream.addListener('end', function(data){
if(callBack && typeof callBack === "function") callBack();
});
}
Anyway, I shouldn't obsess over things...
Your algorithm runs pretty inefficiently for a few reasons. Please forgive me, but I'm going to do this without the async.series call you're using. Hopefully it will still be useful.
First things first: I'm making an assumption. I'm assuming that the data size of your first file, december_raw.csv, is smaller than your second and third files. Even if this isn't the case, this should still work without running out of memory, as long as the file's contents don't exceed your memory limit.
Second, you're loading up two arrays at the same time instead of one at a time. This basically doubles your memory usage.
Third, my hunch is that when you run csv.createCsvFileReader, you're beginning the stream on all of them at the same time. You likely don't want this.
Because you're comparing two files to the contents of december_raw.csv, it might be better to load the contents of that file into memory completely, and then stream-compare the other two files against it in series, using a callBack and a universal comparison function.
var csv = require('ya-csv');
var fs = require('graceful-fs');
var reader_silo = []; // a variable that holds the rows of the main csv.
var reader = csv.createCsvFileReader('december_raw.csv', {columnsFromHeader:true,'separator': ','});
reader.addListener('data', function(data){
reader_silo.push(data[0]); // load each read in row into the array
});
reader.addListener('end', function(){
//start comparing with first csv file.
compareRows('first_data.csv', function(){
// compare with second data
compareRows('second_data.csv');
});
});
// the comparison function, takes in the filename, and a callBack if there is one.
function compareRows(csvFileName, callBack){
var csvStream = csv.createCsvFileReader(csvFileName, {columnsFromHeader:false,'separator': ','}); // begin stream
csvStream.addListener('data', function(data){
for (var i = 0; i < reader_silo.length; i++) {
if(data[0] === reader_silo[i]){
fs.appendFileSync('results.csv', data[0]+","+reader_silo[i]);
break;
}
}
});
csvStream.addListener('end', function(data){
// if there's a callBack then we can execute it.
// in this case the first time it is executed there is a callBack which executes this function again with the next file.
if(callBack && typeof callBack === "function") callBack();
});
}
PS. If your script continues beyond this, you might also want to consider zeroing out reader_silo when you're done with your comparisons. So your 'end' listener callBack would look like this:
reader.addListener('end', function(){
compareRows('first_data.csv', function(){
compareRows('second_data.csv', function(){
reader_silo = [];
});
});
});
I have a Single Page Application that is working pretty well so far, but I have run into an issue I am unable to figure out. I am using Breeze to populate a list of projects to be displayed in a table. There is way more info than what I actually need, so I am doing a projection on the data. I want to add a Knockout computed onto the entity. To accomplish this I registered an entity constructor like so...
metadataStore.registerEntityTypeCtor(entityNames.project, function () { this.isPartial = false; }, initializeProject);
The initializeProject function uses some of the values in the project to determine what the values of the computed should be. For example, if Project.Type == "P" then rowClass should be "Red".
The problem I am having is that all the properties of Project are null except for ProjNum, which happens to be the key. I believe the issue is the projection itself, because I have registered other initializers for other types and they work just fine. Is there a way to make this work?
EDIT: I thought I would add a little more detail for clarification. All the properties are set to Knockout observables; when I interrogate them using the JavaScript debugger in Chrome, the _latestValue of every property is null. The only property that is set is ProjNum, which is also the entity key.
EDIT2: Here is the client side code that does the projection
var getProjectPartials = function (projectObservable, username, forceRemote) {
var p1 = new breeze.Predicate("ProjManager", "==", username);
var p2 = new breeze.Predicate("ApprovalStatus", "!=", "X");
var p3 = new breeze.Predicate("ApprovalStatus", "!=", "C");
var select = 'ProjNum,Title,Type,ApprovalStatus,CurrentStep,StartDate,ProjTargetDate,CurTargDate';
var isQaUser = cookies.getCookie("IsQaUser");
if (isQaUser == "True") {
p1 = new breeze.Predicate("QAManager", "==", username);
select = select + ',QAManager';
} else {
select = select + ',ProjManager';
}
var query = entityQuery
.from('Projects')
.where(p1.and(p2).and(p3))
.select(select);
if (!forceRemote) {
var p = getLocal(query);
if (p.length > 1) {
projectObservable(p);
return Q.resolve();
}
}
return manager.executeQuery(query).then(querySucceeded).fail(queryFailed);
function querySucceeded(data) {
var list = partialMapper.mapDtosToEntities(
manager,
data.results,
model.entityNames.project,
'ProjNum'
);
if (projectObservable) {
projectObservable(list);
}
log('Retrieved projects using breeze', data, true);
}
};
and the code for the partialMapper.mapDtosToEntities function.
var defaultExtension = { isPartial: true };
function mapDtosToEntities(manager,dtos,entityName,keyName,extendWith) {
return dtos.map(dtoToEntityMapper);
function dtoToEntityMapper(dto) {
var keyValue = dto[keyName];
var entity = manager.getEntityByKey(entityName, keyValue);
if (!entity) {
extendWith = $.extend({}, extendWith || defaultExtension);
extendWith[keyName] = keyValue;
entity = manager.createEntity(entityName, extendWith);
}
mapToEntity(entity, dto);
entity.entityAspect.setUnchanged();
return entity;
}
function mapToEntity(entity, dto) {
for (var prop in dto) {
if (dto.hasOwnProperty(prop)) {
entity[prop](dto[prop]);
}
}
return entity;
}
}
EDIT3: Looks like it was my mistake. I found the error when I looked closer at initializeProject. Below is what the function looked like before I fixed it.
function initializeProject(project) {
project.rowClass = ko.computed(function() {
if (project.Type == "R") {
return "project-list-item info";
} else if (project.Type == "P") {
return "project-list-item error";
}
return "project-list-item";
});
}
The issue was with project.Type: I should have used project.Type(), since it is an observable. It's a silly mistake I have made too many times since starting this project.
EDIT4: Inside initializeProject some parts are working and others aren't. When I try to access project.ProjTargetDate() I get null, and the same with project.StartDate(). Because of the null value I get an error thrown from the moment library, as I am working with these dates to determine when a project is late. I tried removing the select from the client query and the call to the partial entity mapper, and when I did that everything worked fine.
You seem to be getting closer. I think a few more guard clauses in your initializeProject method would help and, when working with Knockout, one is constantly battling the issue of parentheses.
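For example, a guard clause along these lines keeps moment from ever seeing a null date (a sketch only: isLate is a hypothetical computed, and the parentheses on ProjTargetDate() are the crucial part):
function initializeProject(project) {
    project.isLate = ko.computed(function () {
        var target = project.ProjTargetDate(); // call the observable: note the parentheses
        if (!target) { return false; }         // guard: a partial entity may not have its dates yet
        return moment(target).isBefore(moment(), 'day');
    });
}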
Btw, I highly recommend the Knockout Context Debugger plugin for Chrome for diagnosing binding problems.
Try toType()
You're working very hard with your DTO mapping, following along with John's code from his course. Since then there's a new way to get projection data into an entity: add toType(...) to the end of the query like this:
var query = entityQuery
.from('Projects')
.where(p1.and(p2).and(p3))
.select(select)
.toType('Project'); // cast to Project
It won't solve everything, but you may be able to do away with the DTO mapping.
Consider DTOs on the server
I should have pointed this out first. If you're always cutting this data down to size, why not define the client-facing model to suit your client? Create DTO classes of the right shape(s) and project into them on the server before sending data over the wire.
You can also build metadata to match those DTOs so that Project on the client has exactly the properties it should have there ... and no more.
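A rough sketch of what hand-rolled client metadata for such a DTO might look like (this is only illustrative: the type and property names are hypothetical, and the exact configuration shape is described in Breeze's metadata documentation):
var DT = breeze.DataType;
metadataStore.addEntityType({
    shortName: 'ProjectDto',   // hypothetical DTO type name
    namespace: 'MyApp.Models', // hypothetical namespace
    dataProperties: {
        ProjNum:   { dataType: DT.String, isPartOfKey: true },
        Title:     { dataType: DT.String },
        Type:      { dataType: DT.String },
        StartDate: { dataType: DT.DateTime }
    }
});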
I'm writing about this now. Should have a page on it in a week or so.