I've been scouring the web over this one for quite some time now.
I'm prototyping an Angular service for an Ionic app. The purpose of this service is to download an image. Now this is a problem that, in standard JS, I'd like to solve with some recursive calls to avoid duplicate code.
I've tried writing it using Promises to get my feet wet with the concept, and it's giving me a hard time.
Consider the following code:
public getBgForName = (name: string) => {
    name = name.toLowerCase();
    var instance = this;
    var dir = this.file.dataDirectory;
    return new Promise(function (fulfill, reject) {
        instance.file.checkDir(dir, name).then(() => {
            // directory exists. Is there a bg file?
            dir = dir + '/' + name + '/';
            instance.file.checkFile(dir, 'bg.jpg').then(() => {
                console.log('read file');
                fulfill(dir + '/' + 'bg.jpg')
            }, (err) => {
                // dl file and re-call
                console.log('needs to download file!')
                instance.transfer.create().download(encodeURI('https://host.tld/' + name + '/bg.jpg'), dir + 'bg.jpg', true, {})
                    .then((data) => {
                        return instance.getBgForName(name).then((url) => { return url });
                    }, (err) => {
                        console.log(err)
                    })
            })
        }, (err) => {
            // create dir and re-call
            instance.file.createDir(dir, name, true).then(() => {
                instance.getBgForName(name).then((url) => { fulfill(url) });
            })
        })
    });
}
The promise, when called, never quite fully resolves. After reading this article, I think the problem lies in the resolution not being passed correctly back to the "original" promise chain, so that it resolves at some level, but not all the way to the top. This is supported by the promise resolving correctly when the following is assured:
the directory has already been created
the file has already been downloaded
So I reckon the return statements somehow break the chain here, leading to the promise not being resolved after its first recursive call.
What is the correct way to call a promise recursively, ensuring that the original caller receives the result when it is ready?
Edit: Outlining the desired result, as suggested by David B.
The code is a function that is called for each item in a list. For each item, there is a background image available, stored on a server. This background image is to be cached locally. The goal of using recursive calls here is that, no matter the state (downloaded, not downloaded), the function call will always return a URL to the image on the local filesystem. The steps for this are as follows:
create a directory for the current item
download the file to this directory
return a local URL to the downloaded file
Subsequent calls will then return the image straight from disk (after checking that it exists), with no more downloading.
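The steps above can be sketched in isolation (hypothetical names, no Ionic APIs): the crucial detail is that every branch returns its nested promise, so each recursion level chains back to the original caller.

```javascript
// `store` stands in for the local filesystem; `ensureBg` stands in for
// getBgForName. Every branch *returns* a promise, so the outermost caller's
// .then fires only once the recursion bottoms out.
function ensureBg(store, name) {
  if (store[name]) {
    // "file exists" case: resolve with the cached local URL
    return Promise.resolve(store[name]);
  }
  // "needs download" case: simulate the async download, then re-call.
  return Promise.resolve(`local://bg/${name}.jpg`).then((url) => {
    store[name] = url;
    return ensureBg(store, name); // returning this keeps the chain intact
  });
}
```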
After reading about the benefits of async / await over promises (and falling in love with the cleaner syntax) I rewrote it using async / await. The refactored (but not perfect!) code looks like this:
public getBgForName = async (name: string) => {
    name = name.toLowerCase();
    let instance = this;
    let dir = this.file.dataDirectory;
    try {
        await instance.file.checkDir(dir, name);
        dir = dir + name + '/';
        try {
            await instance.file.checkFile(dir, 'bg.jpg');
            return dir + 'bg.jpg';
        } catch (err) {
            // download file
            await instance.transfer.create().download(encodeURI('https://host.tld/' + name + '/bg.jpg'), dir + 'bg.jpg', true, {});
            return this.getBgForName(name);
        }
    } catch (err) {
        // not catching the error here since if we can't write to the app's local storage something is very off anyway.
        await instance.file.createDir(dir, name, true);
        return this.getBgForName(name);
    }
}
and works as intended.
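The same shape in miniature (a sketch with hypothetical names, not the Ionic APIs): with async/await, returning the recursive call is enough for the original caller to receive the value produced at the bottom of the recursion.

```javascript
// "Check, fix the precondition, re-call" with async/await. The recursive
// call is returned, so its eventual value propagates to the first caller.
async function ensureValue(store, key) {
  try {
    if (!(key in store)) throw new Error('missing'); // like checkFile failing
    return store[key];
  } catch (err) {
    store[key] = `value-for-${key}`; // like the download step
    return ensureValue(store, key);
  }
}
```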
Final? Update: Solution 1: I changed func1 to NOT return a value back to my JavaScript event, but instead to return the value directly to my calling method using
return DotNet.invokeMethodAsync('AppWASM', 'SetImageData', result);
from func1 (so I also changed the invocation of func1(file) to not assign to a variable). However, the limitation that the method had to be static had me going in circles. I could have used this to save the value directly to my database, but from the app's standpoint that wasn't a good solution.
Final Solution: I saved the value to local storage via
localStorage.setItem("thisimage", result);
which is in my revised code, below. For this app, that makes the most sense, since the idea is to hold the image to see whether the user wants to save it (or not). So I think I got around the Promise confusion by going a different route - which is what I found in the similar questions. The code (and image below) shows the value stored in localStorage.
=================================================================
1st Update: I revised my code to return the Promise eventArg as a JsonDocument (property declared as public object Promise { get; set; } in my EventArgs class). How do I extract the data from this Promise in C#? The JSON document returned does not have a Promise.result element.
I know there are similar questions - scripts can log the result of a Promise to the console, but the returned result is undefined - but the answers are to call the promise function from an async function (and many of those answers have comments saying they don't work). Since my "calling" script is for an event, I don't see how to make it async, so event execution continues without waiting for the result. I've tried assigning to a top-level variable, with the same result. The custom event is fired by a browser event, and the args are returned to a method in the Razor page. I'd like to send imageB64string (or globalimagevar) as a return arg. This is for .NET 6 Blazor WASM (and Server, once it works) apps.
async function func1(fileParam) {
    await blobToBase64(fileParam).then(result => {
        localStorage.setItem("thisimage", result);
        return;
    });
}

async function blobToBase64(blob) {
    const reader = new FileReader();
    reader.readAsDataURL(blob);
    return await new Promise((resolve, _) => {
        reader.onloadend = () => {
            resolve(reader.result);
        };
    });
}
Blazor.registerCustomEventType("pastemultimedia", {
    browserEventName: 'paste',
    createEventArgs: event => {
        let isMultimedia = false;
        let data = event.clipboardData.getData('text');
        let promise = "";
        const items = event.clipboardData.items;
        const acceptedMediaTypes = ["image/png"];
        for (let i = 0; i < items.length; i++) {
            const file = items[i].getAsFile();
            if (!file) { continue; }
            if (!items[i].type.includes('image')) { continue; }
            if (acceptedMediaTypes.indexOf(items[i].type) === -1) { continue; }
            isMultimedia = true;
            const url = window.URL || window.webkitURL;
            data = url.createObjectURL(file);
            func1(file);
        }
        return { isMultimedia, data };
    }
})
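The timing constraint in createEventArgs above can be seen in isolation: the factory must return synchronously, so a promise started inside it cannot affect the returned args (a minimal sketch, not the Blazor code):

```javascript
// The .then callback runs in a later microtask, after the synchronous
// return has already happened.
function createEventArgsSketch() {
  let data = 'initial';
  Promise.resolve('resolved').then((v) => { data = v; });
  return { data }; // still 'initial' at this point
}
```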
Promise result as json (partial shown):
Original result / execution continued before the promise was resolved:
imageB64string is: function toString() { [native code] }
ImagePasteEvent.js:40 globalimagevar is:
ImagePasteEvent.js:5 logged from func1: data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAVEAAAHaCAYAAAC5AXpZAAAgAElEQVR4nO3df3xU9YHv/xfJYZiEiTNhJmbCBJliiMFgDBo2CroGBbGyUlqxtqDfsqU+9F573W6rbRd7v9r7uHXd1nZdv2VXH1y37u1qa6tbyi60FCpxKWiWKJESDRBoEEISMyETMwzDMAnfP2byexICn0B....
Note: Credit for the base code for the pastemultimedia event goes to Felipe Gavilan's blog.
I've been learning about Promises in JS, and although it's been pretty exciting, it's also been a bit frustrating. I'm working on some code that will allow users to drag and drop files and folders into my web app. However, some processes are dependent on others completing first.
My idea is to capture all fileItems (for additional context, they're of type FileSystemEntry), then convert them to objects of type 'FileUploadObject' (my own custom class), then push them all into an array of the same type, and finally display them on the screen.
Here is my attempt to the solution:
First I want to create a new instance of an object so I can push it into the array.
async returnFileUploadObject(file): Promise<FileUploadObject> {
    let fileModTime;
    let fileSize;
    return new Promise((resolve, reject) => {
        try {
            file.getMetadata((metadata) => {
                fileModTime = metadata.modificationTime;
                fileSize = metadata.size;
                const fileUploadObj = new FileUploadObject(file.name, fileModTime, fileSize, String(this.uploadState.Spinner), false, file.fullPath);
                resolve(fileUploadObj);
            });
        } catch (e) {
            fileModTime = Date.now();
            fileSize = 0;
            const fileUploadObj = new FileUploadObject(file.name, fileModTime, fileSize, String(this.uploadState.Spinner), false, file.fullPath);
            resolve(fileUploadObj);
        }
    });
}
Then, I want to push that new object into an array.
async createFileObjectList(fileEntries): Promise<FileUploadObject[]> {
    return new Promise(async (resolve, reject) => {
        let list = [];
        for (let file of fileEntries) {
            let fileObj = await this.returnFileUploadObject(file);
            list.push(fileObj);
        }
        console.log('Time to resolve this promise', list);
        resolve(list);
    });
}
Once the array is finished being built, I want to pass it back to another list that will then display the files in HTML:
async dropFiles(items) {
    // createFileObjectList
    this.fileUploadList = await this.createFileObjectList(items);
    console.log('File Upload List', this.fileUploadList);
}
I thought I was doing everything correctly with Promises, but when I console.log the results, the arrays appear to have items (28 FileUploadObjects) yet actually have a length of 0. I also want to note that the statement console.log('Time to resolve this promise', list); sometimes does print out the items in the array - but only sometimes.
Any kind of help would be greatly appreciated! I am really trying my best to understand Promises and I want my code to finally work. Any tips for better coding practices when using Promises would be awesome too, thanks! If you need more clarification, I'd be happy to provide it.
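One likely explanation for the "items shown but length 0" symptom, for what it's worth: browser consoles log a live reference to the array, so items pushed later appear when the logged entry is expanded, even though the length really was 0 at the moment of logging. A minimal sketch:

```javascript
const list = [];
console.log('length at log time:', list.length); // 0
// Items pushed in a later task still show up when the logged array is
// expanded in browser DevTools (Node, which snapshots, prints [] instead).
setTimeout(() => list.push('FileUploadObject'), 0);
```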
So I have a function that is supposed to recursively return all the files in a folder, here it is:
async function getFiles(dir) {
    const subdirs = await fs.readdirSync(dir);
    const files = await Promise.all(
        subdirs.map(async (subdir) => {
            const res = resolve(dir, subdir);
            return (await stat(res)).isDirectory() && !subdir.startsWith("__")
                ? getFiles(res)
                : res;
        })
    );
    return files.reduce((a, f) => a.concat(f), files);
}
Looks great, right? Works fine too - except not always. I'm calling it in a pretty straightforward fashion, like getFiles("./directory"), and half the time it returns all the contents. But sometimes it will omit the contents of one subdirectory while returning all the others.
So if the given directory has 5 subdirectories, it will only return the contents of 4. This happens infrequently, and if there is some underlying pattern, I am not able to detect it. Please help!
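Separately from any timing issue, one concrete bug worth noting: the reduce is seeded with files itself instead of [], so every entry appears both unflattened and flattened. That produces duplicates rather than omissions, but it is worth fixing either way. A quick sketch of the seed difference:

```javascript
// files as produced by Promise.all: plain paths plus nested arrays
const files = ['a.txt', ['sub/b.txt', 'sub/c.txt']];

// Seeding with `files` starts from the unflattened array and appends the
// flattened entries on top of it:
const wrong = files.reduce((a, f) => a.concat(f), files);

// Seeding with [] yields only the flattened list:
const right = files.reduce((a, f) => a.concat(f), []);
```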
Your code is a bit misguided for a number of reasons:
You're mixing synchronous file I/O calls with promises. There's no reason to use promises if your code is entirely synchronous. That just makes things more complicated than needed.
It's unclear what the call to resolve(dir, subdir) is supposed to do. If you're trying to make a full path, you should be using path.join(dir, subdir).
You should be using the withFileTypes option with readdir(), as that saves extra roundtrips to the file system, so you can immediately check whether each entry is a file or a directory.
You don't use await with synchronous functions.
So, if you're doing a synchronous version, you can just do this:
const fs = require('fs');
const path = require('path');
function getFilesSync(dir, files = []) {
    const listing = fs.readdirSync(dir, { withFileTypes: true });
    let dirs = [];
    for (let f of listing) {
        const fullName = path.join(dir, f.name);
        if (f.isFile()) {
            files.push(fullName);
        } else if (f.isDirectory()) {
            dirs.push(fullName);
        }
    }
    for (let d of dirs) {
        getFilesSync(d, files);
    }
    return files;
}
let files = getFilesSync(somePath);
console.log(files);
If you wanted an asynchronous version using promises, then you can do this:
const fsp = require('fs').promises;
const path = require('path');
async function getFiles(dir, files = []) {
    const listing = await fsp.readdir(dir, { withFileTypes: true });
    let dirs = [];
    for (let f of listing) {
        const fullName = path.join(dir, f.name);
        if (f.isFile()) {
            files.push(fullName);
        } else if (f.isDirectory()) {
            dirs.push(fullName);
        }
    }
    for (let d of dirs) {
        await getFiles(d, files);
    }
    return files;
}

getFiles(somePath).then(files => {
    console.log(files);
}).catch(err => {
    console.log(err);
});
Note how using the fs.promises interface along with async/await allows the asynchronous version to be very, very similar to the synchronous version.
I see your code has a subdir.startsWith("__") test in it. I don't know exactly what you were trying to do with that, but you can add it into the logic I have if it's required.
I would have put this as a comment, but I do not have enough reputation :s
I'm not really clear on the async / await methods for promises, so I'm not really sure about what I'm saying!
So maybe an error is occurring, but you can't see it because you don't reject or catch anything.
I guess that with the async/await methods, an error would be rejected into your const, and then you can console.log() your const to see whether, when your function omits some files, it's because of an error that occurred.
And for your last await, you put it in a return; it would be interesting to console.log() it too.
///////////////////// edited later ////////////////////////////////////////
https://javascript.info/async-await
In real situations, the promise may take some time before it rejects. In that case there will be a delay before await throws an error.
We can catch that error using try..catch, the same way as a regular throw:
try {
    let response = await fetch('http://no-such-url');
} catch (err) {
    alert(err); // TypeError: failed to fetch
}
I want to make a request and cache it, in a functional style.
const req = (uri) =>
    (console.log(`requesting: ${uri}`), Promise.resolve({ status: 200 }));

const cache = (fn) => (...args) =>
    fn(...args).then((result) => { console.log('caching:', result) });

const cachedReq = cache(req);
cachedReq('example.com/foo');
Two questions:
Is this code idiomatic?
How can I supply logic to generate the cache key from the result, while maintaining separation of concerns? For example, I might use req to retrieve different kinds of resource which need different logic to generate the key to be used in the cache. How should I supply this key-generation logic to the cache function?
Edit:
In reality, the URI should be the key (thanks to @epascarello). I chose a poor example. But I'd like to ask about the more general case, where logic needs to be supplied "down the composition", while maintaining decent separation of concerns.
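One way to supply key-generation logic "down the composition" while keeping concerns separated is to make the key function a parameter of the caching decorator. A sketch (the names cacheWith and keyFn are mine, not from any library):

```javascript
// The caller decides how arguments map to a cache key; the decorator only
// knows about storing and reusing promises under that key.
const cacheWith = (keyFn, fn) => {
  const cache = new Map();
  return (...args) => {
    const key = keyFn(...args);
    if (!cache.has(key)) {
      cache.set(key, fn(...args)); // cache the promise, not just the value
    }
    return cache.get(key);
  };
};

// Hypothetical usage: key by URI
let calls = 0;
const req = (uri) => { calls += 1; return Promise.resolve({ status: 200, uri }); };
const cachedReq = cacheWith((uri) => uri, req);
cachedReq('example.com/foo');
cachedReq('example.com/foo'); // second call is served from the cache
```

Caching the promise rather than the resolved value also de-duplicates concurrent in-flight requests for the same key.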
You're close to achieving your goal, and you're heading in the right direction with the composition concept. Maybe this code can help you get there.
Let's simulate your req function like so:
var req = (uri) => {
    console.log("inside req", uri);
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            resolve({ status: 200 });
        }, 3000);
    });
}
then you have the cacheFunc version as:
var withCache = (promiseFunc) => {
    const cache = {};
    return (...args) => {
        // suppose first param is uri
        var uri = args[0];
        return new Promise((resolve, reject) => {
            if (cache.hasOwnProperty(uri)) {
                return resolve(cache[uri]);
            }
            promiseFunc(...args).then((data) => {
                cache[uri] = data;
                resolve(data);
            }).catch(reject);
        });
    }
}
As you can see, you need to create the cache object inside the outer function - this is a little similar to currying in JS. You wrap your req (which returns a promise) in another promise from the cached version: before executing the req function, you check whether a response already exists in the cache under the same uri key. If it does, resolve the promise immediately; otherwise execute the req function, and once you receive the response, cache it and resolve the cached promise version.
So you can use it like so:
var cacheReq = withCache(req);
cacheReq('https://anywhere.com').then(console.log.bind(null, 'response')).catch(console.log.bind(null, 'error response'));
You will notice that the first time, the promise waits 3 seconds to resolve the req; on the second call, the promise resolves ASAP because of the cache. If you try another uri, it will wait 3 seconds again and cache the response for next time.
Hope it can help you.
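A design note on the version above, for what it's worth: the extra new Promise wrapper isn't strictly necessary, and because it caches the resolved data, two calls made before the first response lands will both hit the network. Caching the in-flight promise itself avoids both (a sketch):

```javascript
// Store the promise returned by promiseFunc, not the data it resolves to.
// Concurrent callers for the same uri then share one underlying request.
var withCachedPromise = (promiseFunc) => {
  const cache = {};
  return (uri) => {
    if (!cache[uri]) {
      cache[uri] = promiseFunc(uri); // the promise itself is the cache entry
    }
    return cache[uri];
  };
};
```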
You can use a combination of a Map and the Request constructor:
// I'll be using ramda for object equality, but any
// deepEquals checker should work.
const R = window.R;

const genRequest = ((cache, eqComparator) => {
    return (url, fetchOpts = {}) => {
        const key = { url, fetchOpts };
        const alreadyHave = [...cache.keys()].find(x => eqComparator(x, key));
        if (alreadyHave) return cache.get(alreadyHave);
        const req = new Request(url, fetchOpts);
        cache.set(key, req);
        return req;
    };
})(new Map(), R.equals);

const req = genRequest('http://www.google.com');
fetch(req)
    .then(...)
    .catch(...);
Some nice properties fall out of this:
Each request is constructed only once but can be repeatedly fetched.
No side-effects until you fetch: creating the request and fetching it are separate.
...thus, concerns are about as separated as they can be.
You could re-jigger parameter application to easily support custom equality comparisons using the same cache.
You can use the same strategy to cache the results of a fetch, separately from caching the requests.
I am using the ssh2-sftp-client package to recursively read all the directories inside a given remote path.
Here is the code.
const argv = require('yargs').argv;
const Client = require('ssh2-sftp-client');
const server = new Client();

const auth = {
    host: '192.168.1.11',
    username: argv.u,
    password: argv.p
};

const serverRoot = '/sites/';
const siteName = 'webmaster.com';

// list of directories on the server will be pushed to this array
const serverPaths = [];

server.connect(auth).then(() => {
    console.log(`connected to ${auth.host} as ${auth.username}`);
}).catch((err) => {
    if (err) throw err;
});

server.list('/sites/').then((dirs) => {
    recursiveDirectorySearch(dirs, `${serverRoot}${siteName}/`);
})
.catch((err) => {
    if (err) throw err;
});
function recursiveDirectorySearch(dirs, prevPath) {
    let paths = dirs.filter((dir) => {
        // returns directories only
        return dir.type === 'd';
    });
    if (paths.length > 0) {
        paths.forEach((path) => {
            server
                .list(`${prevPath}${path.name}`)
                .then((dirs) => {
                    console.log(`${prevPath}${path.name}`);
                    recursiveDirectorySearch(dirs, `${prevPath}${path.name}`);
                    serverPaths.push(`${prevPath}${path.name}`);
                });
        });
    }
}
First, a connection is made to the server, and then whatever is under the '/sites/' directory is listed and passed to the recursiveDirectorySearch function. This function receives, as its first parameter, an array of whatever was found under the '/sites/' directory, which is filtered so it only contains directories. If one or more directories were found, a call to the server is made for each directory in the array, in order to retrieve everything under '/sites/' + the name of that directory. The same function is then called again with whatever the server returns, until no other directory is found.
Whenever a directory is found, its name is pushed as a string to the serverPaths array. As far as I can tell, this search is working and successfully pushing all the directory names to the array.
However, I can't think of a way to detect when this recursive search for all the directories is complete, so that I can do something with the serverPaths array.
I tried to take advantage of Promise.all(), but I don't know how to use it when the number of function calls is unknown.
You're simply lacking a couple of returns - add a Promise.all and an Array#map and you're done.
Note: I'm not using Promise.all on serverPaths; rather, I'm using the fact that returning a Promise inside .then results in the Promise returned by .then adopting the returned Promise (hmm, that isn't very well explained, is it - but it's Promises 101 stuff, really!).
server.list('/sites/').then((dirs) => {
    // added a return here
    return recursiveDirectorySearch(dirs, `${serverRoot}${siteName}/`);
})
.then(() => {
    // everything is done at this point,
    // serverPaths should be complete
})
.catch((err) => {
    if (err) throw err;
});

function recursiveDirectorySearch(dirs, prevPath) {
    let paths = dirs.filter((dir) => {
        // returns directories only
        return dir.type === 'd';
    });
    // added a return, Promise.all and changed forEach to map
    return Promise.all(paths.map((path) => {
        // added a return here
        return server
            .list(`${prevPath}${path.name}`)
            .then((dirs) => {
                console.log(`${prevPath}${path.name}`);
                // swapped the next two lines
                serverPaths.push(`${prevPath}${path.name}`);
                // added a return here, push the path before recursing
                return recursiveDirectorySearch(dirs, `${prevPath}${path.name}`);
            });
    }));
}
One of the main things jumping out at me is your initial if statement: if (paths.length > 0) { run recursion }. This appears to work really well for the first call, because you know that the data coming back will be an array full of directories.
Your function, however, does not appear to have logic for an array with a length of 0. In that scenario you could still end up with all of the directory names you are looking for, presented in the manner you want, but it also means that the calls higher up the tree are never able to resolve.
Try adding logic to handle the case of an empty array - if (paths.length === 0) return; - as a hard break out of the recursive calls higher up the stack.
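A side note on the empty case: in the Promise.all-based rewrite from the earlier answer, no explicit base case is needed, because Promise.all of an empty array resolves immediately. A quick check:

```javascript
// Mapping over zero subdirectories yields Promise.all([]), which settles
// immediately with an empty array, so the recursion bottoms out on its own.
Promise.all([]).then((results) => {
  console.log('resolved with', results);
});
```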