Using async/await with a forEach loop
(469 votes)
Are there any issues with using async/await in a forEach loop? I'm trying to loop through an array of files and await on the contents of each file.
import fs from 'fs-promise'

async function printFiles () {
  const files = await getFilePaths() // Assume this works fine

  files.forEach(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
}

printFiles()
This code does work, but could something go wrong with it? Someone told me that you're not supposed to use async/await in a higher-order function like this, so I wanted to ask whether there's any issue with that.
javascript node.js promise async-await ecmascript-2017
asked Jun 1 '16 at 18:55 by saadq; edited Dec 8 '17 at 17:20
12 Answers
Accepted answer (965 votes), answered Jun 1 '16 at 19:02 by Bergi (edited Jul 1 at 21:50 by Joshua Pinter)
Sure, the code does work, but I'm pretty sure it doesn't do what you expect it to do. It just fires off multiple asynchronous calls, and the printFiles function returns immediately after that.
If you want to read the files in sequence, you indeed cannot use forEach. Just use a modern for … of loop instead, in which await will work as expected:
async function printFiles () {
  const files = await getFilePaths();

  for (const file of files) {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  }
}
If you want to read the files in parallel, you indeed cannot use forEach either. Each of the async callback calls does return a promise, but you're throwing those promises away instead of awaiting them. Just use map instead, and await the array of promises that you get with Promise.all:
async function printFiles () {
  const files = await getFilePaths();

  await Promise.all(files.map(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  }));
}
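To make the difference concrete, here is a small self-contained sketch (not from the answer; the delay helper is a hypothetical stand-in for fs.readFile, so no real files are involved) that records the order in which tasks finish:

```javascript
// Simulated async task standing in for fs.readFile (hypothetical helper).
const delay = (ms, value) =>
  new Promise(resolve => setTimeout(() => resolve(value), ms));

// Sequential: each task starts only after the previous one has finished.
async function runSequentially(items) {
  const finished = [];
  for (const [name, ms] of items) {
    finished.push(await delay(ms, name));
  }
  return finished; // always input order
}

// Parallel: all tasks start at once; completion order depends on timing.
async function runInParallel(items) {
  const finished = [];
  await Promise.all(items.map(async ([name, ms]) => {
    await delay(ms, name);
    finished.push(name);
  }));
  return finished; // completion order, not input order
}
```

With items like [['slow', 30], ['fast', 5]], the sequential version finishes tasks in input order, while the parallel version finishes the fast one first, even though both versions await every task before returning.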
Could you please explain why for ... of ... works? – Demonbane, Aug 15 '16 at 18:04
OK, I know why... Using Babel will transform async/await to a generator function, and using forEach means that each iteration has an individual generator function, which has nothing to do with the others, so they execute independently with no shared next() context. A simple for() loop also works because the iterations all happen in one single generator function. – Demonbane, Aug 15 '16 at 19:21
@Demonbane: In short, because it was designed to work :-) await suspends the current function evaluation, including all control structures. Yes, it is quite similar to generators in that regard (which is why they are used to polyfill async/await). – Bergi, Aug 15 '16 at 23:28
@arve0 Not really; an async function is quite different from a Promise executor callback, but yes, the map callback returns a promise in both cases. – Bergi, Mar 29 '17 at 16:25
When you come to learn about JS promises, but instead use half an hour translating Latin. Hope you're proud, @Bergi ;) – Félix Gagnon-Grenier, May 16 '17 at 21:04
Answer (44 votes), answered Jun 15 at 11:17 by Francisco Mateo (edited Sep 10 at 20:58)
With ES2018, you can greatly simplify all of the above answers to:

async function printFiles () {
  const files = await getFilePaths()

  for await (const contents of files.map(file => fs.readFile(file, 'utf8'))) {
    console.log(contents)
  }
}
See spec: https://github.com/tc39/proposal-async-iteration
2018-09-10: This answer has been getting a lot of attention recently; see Axel Rauschmayer's blog post for further information about asynchronous iteration: http://2ality.com/2016/10/asynchronous-iteration.html
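A sketch of why this works (using a hypothetical simulatedRead in place of fs.readFile, so no real files are needed): for await … of also accepts a plain synchronous iterable of promises and awaits each element in input order.

```javascript
// Hypothetical stand-in for fs.readFile: resolves with fake contents.
const simulatedRead = name =>
  new Promise(resolve => setTimeout(() => resolve(`contents of ${name}`), 5));

async function printAll(names) {
  const seen = [];
  // for await...of unwraps each promise of the (sync) iterable, in order.
  for await (const contents of names.map(simulatedRead)) {
    seen.push(contents);
  }
  return seen;
}
```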
Upvoted; it would be great if you could put a link to the spec in your answer for anyone who wants to know more about async iteration. – saadq, Jun 15 at 16:40
Upvoted for showing the community a very new approach. – Pablo De Luca, Jul 27 at 19:38
Where is file defined in your code? – ma11hew28, Sep 3 at 7:12
Unsure where file is defined. This is ported 1:1 from the question above. – Francisco Mateo, Sep 8 at 14:13
How's the performance of this? – sorxrob, Sep 18 at 4:16
Answer (17 votes)
To me, using Promise.all() with map() is a bit difficult to understand and verbose, but if you want to do it in plain JS, that's your best shot, I guess.
If you don't mind adding a module, I implemented the Array iteration methods so they can be used in a very straightforward way with async/await. An example with your case:
const { forEach } = require('p-iteration');
const fs = require('fs-promise');

async function printFiles () {
  const files = await getFilePaths();

  await forEach(files, async (file) => {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  });
}

printFiles()
(The module is p-iteration on npm.)
Wow, p-iteration is so smooth. Saved my day! – Antonio Torres, Dec 7 '17 at 0:52
I like this as it has the same functions/methods as JS itself; in my case I needed some rather than forEach. Thanks! – mikemaccana, yesterday
Answer (12 votes)
Instead of Promise.all in conjunction with Array.prototype.map (which does not guarantee the order in which the Promises complete), I use Array.prototype.reduce, starting with an already-resolved Promise:
async function printFiles () {
  const files = await getFilePaths();

  await files.reduce(async (promise, file) => {
    // This line will wait for the last async function to finish.
    // The first iteration uses an already resolved Promise,
    // so it will continue immediately.
    await promise;
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  }, Promise.resolve());
}
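A runnable sketch of the same pattern with a simulated async task (the delay helper is hypothetical, standing in for fs.readFile) shows that each iteration starts only after the previous one has resolved:

```javascript
const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

async function processInOrder(items) {
  const started = [];
  await items.reduce(async (previous, item) => {
    await previous;          // wait for the prior iteration's chain
    started.push(item.name); // hence tasks start strictly in array order
    await delay(item.ms);
  }, Promise.resolve());
  return started;
}
```

Even with a slow task first and a fast one second, the recorded start order matches the array order.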
This works perfectly, thank you so much. Could you explain what is happening here with Promise.resolve() and await promise;? – parrker9, Mar 28 at 20:48
This is pretty cool. Am I right in thinking the files will be read in order and not all at once? – GollyJer, Jun 9 at 0:24
This is very clever! Thank you! – Micah Henning, Jun 15 at 17:13
@parrker9 Promise.resolve() returns an already-resolved Promise object, so that reduce has a Promise to start with. await promise; will wait for the last Promise in the chain to resolve. @GollyJer The files will be processed sequentially, one at a time. – Timothy Zorn, Jun 17 at 15:00
Answer (10 votes)
Here are some forEach async prototypes:
Array.prototype.forEachAsync = async function (fn) {
  for (let t of this) { await fn(t) }
}

Array.prototype.forEachAsyncParallel = async function (fn) {
  await Promise.all(this.map(fn));
}
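A usage sketch for the serial variant (the prototype is repeated here so the snippet stands alone; genuinely-async work is simulated with a timer):

```javascript
Array.prototype.forEachAsync = async function (fn) {
  for (let t of this) { await fn(t) }
}

async function collect(items) {
  const seen = [];
  // Runs the callbacks one at a time; resolves after the last one.
  await items.forEachAsync(async (item) => {
    await new Promise(resolve => setTimeout(resolve, 1));
    seen.push(item);
  });
  return seen;
}
```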
Although I'd hesitate to add things directly to the prototype, this is a nice async forEach implementation. – DaniOcean, Mar 28 at 13:55
As long as the name stays unique in the future (I'd use _forEachAsync), this is reasonable. I also think it's the nicest answer, as it saves a lot of boilerplate code. – mikemaccana, Apr 3 at 13:29
They should be standalone functions. We've got modules so we don't pollute globals with our personal things. – estus, Nov 5 at 6:25
Answer (2 votes)
Both of the solutions above work; however, Antonio's does the job with less code. Here is how it helped me resolve data from my database, from several different child refs: pushing each item into an array and dispatching once everything is done:
const allItems = [] // collected results (declared here so the snippet is complete)

Promise.all(PacksList.map((pack) => {
  return fireBaseRef.child(pack.folderPath).once('value', (snap) => {
    snap.forEach(childSnap => {
      const file = childSnap.val()
      file.id = childSnap.key;
      allItems.push(file)
    })
  })
})).then(() => store.dispatch(actions.allMockupItems(allItems)))
Answer (2 votes)
It's pretty painless to pop a couple of methods into a file that will handle asynchronous data in serialized order and give a more conventional flavour to your code. For example:
module.exports = function () {
  var self = this;

  // Awaits fn for each item, one at a time, so the items are
  // handled in serialized order.
  this.each = async (items, fn) => {
    if (items && items.length) {
      for (const item of items) {
        await fn(item);
      }
    }
  };

  this.reduce = async (items, fn, initialValue) => {
    await self.each(
      items, async (item) => {
        initialValue = await fn(initialValue, item);
      });
    return initialValue;
  };
};
Now, assuming that's saved at './myAsync.js', you can do something similar to the below in an adjacent file:
...
/* your server setup here */
...

var MyAsync = require('./myAsync');
var Cat = require('./models/Cat');
var Doje = require('./models/Doje');

var example = async () => {
  var myAsync = new MyAsync();
  var doje = await Doje.findOne({ name: 'Doje', noises: [] }).save();
  var cleanParams = [];

  // FOR EACH EXAMPLE
  await myAsync.each(['bork', 'concern', 'heck'],
    async (elem) => {
      if (elem !== 'heck') {
        await doje.update({ $push: { 'noises': elem } });
      }
    });

  var cat = await Cat.findOne({ name: 'Nyan' });

  // REDUCE EXAMPLE
  var friendsOfNyanCat = await myAsync.reduce(cat.friends,
    async (catArray, friendId) => {
      var friend = await Friend.findById(friendId);
      if (friend.name !== 'Long cat') {
        catArray.push(friend.name);
      }
      return catArray; // the reduce callback must return the accumulator
    }, []);

  // Assuming Long Cat was a friend of Nyan Cat...
  assert(friendsOfNyanCat.length === (cat.friends.length - 1));
}
Minor addendum: don't forget to wrap your await/async calls in try/catch blocks! – Jay Edwards, Sep 26 '17 at 9:08
Answer (1 vote)
Using Task, futurize, and a traversable List, you can simply do
async function printFiles() {
  const files = await getFiles();

  List(files).traverse(Task.of, f => readFile(f, 'utf-8'))
    .fork(console.error, console.log)
}
Here is how you'd set this up
import fs from 'fs';
import { futurize } from 'futurize';
import Task from 'data.task';
import { List } from 'immutable-ext';

const future = futurize(Task) // futurize, since fs.readFile is callback-style
const readFile = future(fs.readFile)
Another way to have structured the desired code would be
const printFiles = files =>
  List(files).traverse(Task.of, fn => readFile(fn, 'utf-8'))
    .fork(console.error, console.log)
Or perhaps even more functionally oriented
// 90% of encodings are utf-8, making that use case super easy is prudent
// handy-library.js
export const readFile = f =>
  future(fs.readFile)(f, 'utf-8')

export const arrayToTaskList = (list, taskFn) =>
  List(list).traverse(Task.of, taskFn)

export const readFiles = files =>
  arrayToTaskList(files, readFile)

export const printFiles = files =>
  readFiles(files).fork(console.error, console.log)
Then from the parent function
async function main() {
  /* awesome code with side-effects before */
  printFiles(await getFiles());
  /* awesome code with side-effects after */
}
If you really wanted more flexibility in encoding, you could just do this (for fun, I'm using the proposed Pipe Forward operator):

import { curry, flip } from 'ramda'

export const readFile = fs.readFile
  |> future
  |> curry
  |> flip

export const readFileUtf8 = readFile('utf-8')
PS - I didn't try this code on the console, might have some typos... "straight freestyle, off the top of the dome!" as the 90s kids would say. :-p
FWIW, ++1 on this. It's an elegant implementation. – Donald E. Foss, Apr 3 at 18:53
Answer (1 vote)
In addition to @Bergi's answer, I'd like to offer a third alternative. It's very similar to @Bergi's second example, but instead of awaiting each readFile individually, you create an array of promises and await them all at the end.
import fs from 'fs-promise';

async function printFiles () {
  const files = await getFilePaths();

  const promises = files.map((file) => fs.readFile(file, 'utf8'))
  const contents = await Promise.all(promises)

  // Wrap console.log so forEach's extra (index, array) arguments aren't logged.
  contents.forEach((content) => console.log(content));
}
Note that the function passed to .map() does not need to be async, since fs.readFile returns a Promise object anyway. Therefore promises is an array of Promise objects, which can be sent to Promise.all().
In @Bergi's answer, the console may log file contents out of order: if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array. In the method above, however, you are guaranteed the console will log the contents in the same order as the files array.
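That ordering guarantee can be sketched with simulated reads of different speeds (the delay helper is a hypothetical stand-in for fs.readFile):

```javascript
const delay = (ms, value) =>
  new Promise(resolve => setTimeout(() => resolve(value), ms));

async function orderedResults() {
  const settled = [];
  const results = await Promise.all([
    delay(30, 'slow').then(v => { settled.push(v); return v; }),
    delay(5, 'fast').then(v => { settled.push(v); return v; }),
  ]);
  // results follows input order even though 'fast' settled first.
  return { results, settled };
}
```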
Answer (0 votes)
One important caveat: the await + for .. of method and the forEach + async way actually have different effects.
Having await inside a real for loop makes sure all async calls are executed one by one. The forEach + async way fires off all the promises at the same time, which is faster but can sometimes be overwhelming (e.g. if you run DB queries or call web services with volume restrictions and do not want to fire 100,000 calls at a time).
You can also use reduce + promise (less elegant) if you do not use async/await and want to make sure files are read one after another.
files.reduce((lastPromise, file) =>
  lastPromise.then(() =>
    fs.readFile(file, 'utf8')
  ), Promise.resolve()
)
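A sketch of the same chain with a stubbed reader (fakeRead is hypothetical) that also collects the results, to show the reads are chained one after another:

```javascript
const fakeRead = name => Promise.resolve(`<${name}>`);

function readAllInOrder(files) {
  const results = [];
  return files
    .reduce(
      (lastPromise, file) =>
        lastPromise
          .then(() => fakeRead(file)) // the next read begins after the previous
          .then(contents => { results.push(contents); }),
      Promise.resolve()
    )
    .then(() => results);
}
```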
Or you can create a forEachAsync helper, which basically uses the same for loop under the hood:

Array.prototype.forEachAsync = async function (cb) {
  for (let x of this) {
    await cb(x);
  }
}
Have a look at How to define method in javascript on Array.prototype and Object.prototype so that it doesn't appear in for in loop. Also, you probably should use the same iteration as native forEach (accessing indices instead of relying on iterability) and pass the index to the callback. – Bergi, Nov 16 '17 at 13:57
You can use Array.prototype.reduce in a way that uses an async function. I've shown an example in my answer: stackoverflow.com/a/49499491/2537258 – Timothy Zorn, Mar 26 at 19:54
Answer (0 votes)
Similar to Antonio Val's p-iteration, an alternative npm module is async-af:
const AsyncAF = require('async-af');
const fs = require('fs-promise');

function printFiles() {
  // since AsyncAF accepts promises or non-promises, there's no need to await here
  const files = getFilePaths();

  AsyncAF(files).forEach(async file => {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  });
}

printFiles();
Alternatively, async-af has a static method (log/logAF) that logs the results of promises:
const AsyncAF = require('async-af');
const fs = require('fs-promise');

function printFiles() {
  const files = getFilePaths();

  AsyncAF(files).forEach(file => {
    AsyncAF.log(fs.readFile(file, 'utf8'));
  });
}

printFiles();
However, the main advantage of the library is that you can chain asynchronous methods to do something like:
const aaf = require('async-af');
const fs = require('fs-promise');

const printFiles = () => aaf(getFilePaths())
  .map(file => fs.readFile(file, 'utf8'))
  .forEach(file => aaf.log(file));

printFiles();
Answer (-2 votes)
I would use the well-tested (millions of downloads per week) pify and async modules. If you are unfamiliar with the async module, I highly recommend you check out its docs. I've seen multiple devs waste time recreating its methods, or worse, making difficult-to-maintain async code when higher-order async methods would simplify code.
const async = require('async')
const fs = require('fs-promise')
const pify = require('pify')

async function getFilePaths() {
  return Promise.resolve([
    './package.json',
    './package-lock.json',
  ]);
}

async function printFiles () {
  const files = await getFilePaths()

  await pify(async.eachSeries)(files, async (file) => { // <-- run in series
  // await pify(async.each)(files, async (file) => { // <-- run in parallel
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
  console.log('HAMBONE')
}

printFiles().then(() => {
  console.log('HAMBUNNY')
})
// ORDER OF LOGS:
// package.json contents
// package-lock.json contents
// HAMBONE
// HAMBUNNY
This is a step in the wrong direction. Here's a mapping guide I created to help get folks stuck in callback hell into the modern JS era: github.com/jmjpro/async-package-to-async-await/blob/master/…. – jbustamovej, Feb 20 at 6:24
As you can see here, I am interested in and open to using async/await instead of the async lib. Right now, I think each has a time and place. I'm not convinced that the async lib == "callback hell" and async/await == "the modern JS era". IMO, the async lib beats async/await for: 1. complex flow (e.g. queue, cargo, even auto when things get complicated); 2. concurrency; 3. supporting arrays/objects/iterables; 4. error handling. – Zachary Ryan Smith, Feb 21 at 1:54
protected by georgeawg Aug 16 at 17:41
12 Answers
12
active
oldest
votes
12 Answers
12
active
oldest
votes
active
oldest
votes
active
oldest
votes
up vote
965
down vote
accepted
Sure the code does work, but I'm pretty sure it doesn't do what you expect it to do. It just fires off multiple asynchronous calls, but the printFiles
function does immediately return after that.
If you want to read the files in sequence, you cannot use forEach
indeed. Just use a modern for … of
loop instead, in which await
will work as expected:
async function printFiles () {
const files = await getFilePaths();
for (const file of files) {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
}
}
If you want to read the files in parallel, you cannot use forEach
indeed. Each of the async
callback function calls does return a promise, but you're throwing them away instead of awaiting them. Just use map
instead, and you can await the array of promises that you'll get with Promise.all
:
async function printFiles () {
const files = await getFilePaths();
await Promise.all(files.map(async (file) => {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
}));
}
9
Could you please explain why doesfor ... of ...
work?
– Demonbane
Aug 15 '16 at 18:04
33
ok i know why... Using Babel will transformasync
/await
to generator function and usingforEach
means that each iteration has an individual generator function, which has nothing to do with the others. so they will be executed independently and has no context ofnext()
with others. Actually, a simplefor()
loop also works because the iterations are also in one single generator function.
– Demonbane
Aug 15 '16 at 19:21
9
@Demonbane: In short, because it was designed to work :-)await
suspends the current function evaluation, including all control structures. Yes, it is quite similar to generators in that regard (which is why they are used to polyfill async/await).
– Bergi
Aug 15 '16 at 23:28
2
@arve0 Not really, anasync
function is quite different from aPromise
executor callback, but yes themap
callback returns a promise in both cases.
– Bergi
Mar 29 '17 at 16:25
2
When you come to learn about JS promises, but instead use half an hour translating latin. Hope you're proud @Bergi ;)
– Félix Gagnon-Grenier
May 16 '17 at 21:04
|
show 28 more comments
up vote
965
down vote
accepted
Sure the code does work, but I'm pretty sure it doesn't do what you expect it to do. It just fires off multiple asynchronous calls, but the printFiles
function does immediately return after that.
If you want to read the files in sequence, you cannot use forEach
indeed. Just use a modern for … of
loop instead, in which await
will work as expected:
async function printFiles () {
const files = await getFilePaths();
for (const file of files) {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
}
}
If you want to read the files in parallel, you cannot use forEach
indeed. Each of the async
callback function calls does return a promise, but you're throwing them away instead of awaiting them. Just use map
instead, and you can await the array of promises that you'll get with Promise.all
:
async function printFiles () {
const files = await getFilePaths();
await Promise.all(files.map(async (file) => {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
}));
}
9
Could you please explain why doesfor ... of ...
work?
– Demonbane
Aug 15 '16 at 18:04
33
ok i know why... Using Babel will transformasync
/await
to generator function and usingforEach
means that each iteration has an individual generator function, which has nothing to do with the others. so they will be executed independently and has no context ofnext()
with others. Actually, a simplefor()
loop also works because the iterations are also in one single generator function.
– Demonbane
Aug 15 '16 at 19:21
9
@Demonbane: In short, because it was designed to work :-)await
suspends the current function evaluation, including all control structures. Yes, it is quite similar to generators in that regard (which is why they are used to polyfill async/await).
– Bergi
Aug 15 '16 at 23:28
2
@arve0 Not really, anasync
function is quite different from aPromise
executor callback, but yes themap
callback returns a promise in both cases.
– Bergi
Mar 29 '17 at 16:25
2
When you come to learn about JS promises, but instead use half an hour translating latin. Hope you're proud @Bergi ;)
– Félix Gagnon-Grenier
May 16 '17 at 21:04
|
show 28 more comments
up vote
965
down vote
accepted
up vote
965
down vote
accepted
Sure the code does work, but I'm pretty sure it doesn't do what you expect it to do. It just fires off multiple asynchronous calls, but the printFiles
function does immediately return after that.
If you want to read the files in sequence, you cannot use forEach
indeed. Just use a modern for … of
loop instead, in which await
will work as expected:
async function printFiles () {
const files = await getFilePaths();
for (const file of files) {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
}
}
If you want to read the files in parallel, you cannot use forEach
indeed. Each of the async
callback function calls does return a promise, but you're throwing them away instead of awaiting them. Just use map
instead, and you can await the array of promises that you'll get with Promise.all
:
async function printFiles () {
const files = await getFilePaths();
await Promise.all(files.map(async (file) => {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
}));
}
Sure the code does work, but I'm pretty sure it doesn't do what you expect it to do. It just fires off multiple asynchronous calls, but the printFiles
function does immediately return after that.
If you want to read the files in sequence, you cannot use forEach
indeed. Just use a modern for … of
loop instead, in which await
will work as expected:
async function printFiles () {
const files = await getFilePaths();
for (const file of files) {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
}
}
If you want to read the files in parallel, you cannot use forEach
indeed. Each of the async
callback function calls does return a promise, but you're throwing them away instead of awaiting them. Just use map
instead, and you can await the array of promises that you'll get with Promise.all
:
async function printFiles () {
const files = await getFilePaths();
await Promise.all(files.map(async (file) => {
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
}));
}
edited Jul 1 at 21:50
Joshua Pinter
23.3k8134159
23.3k8134159
answered Jun 1 '16 at 19:02
Bergi
356k55530846
356k55530846
9
Could you please explain why doesfor ... of ...
work?
– Demonbane
Aug 15 '16 at 18:04
33
ok i know why... Using Babel will transformasync
/await
to generator function and usingforEach
means that each iteration has an individual generator function, which has nothing to do with the others. so they will be executed independently and has no context ofnext()
with others. Actually, a simplefor()
loop also works because the iterations are also in one single generator function.
– Demonbane
Aug 15 '16 at 19:21
9
@Demonbane: In short, because it was designed to work :-)await
suspends the current function evaluation, including all control structures. Yes, it is quite similar to generators in that regard (which is why they are used to polyfill async/await).
– Bergi
Aug 15 '16 at 23:28
2
@arve0 Not really, anasync
function is quite different from aPromise
executor callback, but yes themap
callback returns a promise in both cases.
– Bergi
Mar 29 '17 at 16:25
2
When you come to learn about JS promises, but instead use half an hour translating latin. Hope you're proud @Bergi ;)
– Félix Gagnon-Grenier
May 16 '17 at 21:04
|
show 28 more comments
9
Could you please explain why doesfor ... of ...
work?
– Demonbane
Aug 15 '16 at 18:04
33
ok i know why... Using Babel will transformasync
/await
to generator function and usingforEach
means that each iteration has an individual generator function, which has nothing to do with the others. so they will be executed independently and has no context ofnext()
with others. Actually, a simplefor()
loop also works because the iterations are also in one single generator function.
– Demonbane
Aug 15 '16 at 19:21
9
@Demonbane: In short, because it was designed to work :-)await
suspends the current function evaluation, including all control structures. Yes, it is quite similar to generators in that regard (which is why they are used to polyfill async/await).
– Bergi
Aug 15 '16 at 23:28
2
@arve0 Not really, anasync
function is quite different from aPromise
executor callback, but yes themap
callback returns a promise in both cases.
– Bergi
Mar 29 '17 at 16:25
2
When you come to learn about JS promises, but instead use half an hour translating latin. Hope you're proud @Bergi ;)
– Félix Gagnon-Grenier
May 16 '17 at 21:04
9
9
Could you please explain why does
for ... of ...
work?– Demonbane
Aug 15 '16 at 18:04
Could you please explain why does
for ... of ...
work?– Demonbane
Aug 15 '16 at 18:04
33
33
ok i know why... Using Babel will transform
async
/await
to generator function and using forEach
means that each iteration has an individual generator function, which has nothing to do with the others. so they will be executed independently and has no context of next()
with others. Actually, a simple for()
loop also works because the iterations are also in one single generator function.– Demonbane
Aug 15 '16 at 19:21
ok i know why... Using Babel will transform
async
/await
to generator function and using forEach
means that each iteration has an individual generator function, which has nothing to do with the others. so they will be executed independently and has no context of next()
with others. Actually, a simple for()
loop also works because the iterations are also in one single generator function.– Demonbane
Aug 15 '16 at 19:21
9
9
@Demonbane: In short, because it was designed to work :-)
await
suspends the current function evaluation, including all control structures. Yes, it is quite similar to generators in that regard (which is why they are used to polyfill async/await).– Bergi
Aug 15 '16 at 23:28
@Demonbane: In short, because it was designed to work :-)
await
suspends the current function evaluation, including all control structures. Yes, it is quite similar to generators in that regard (which is why they are used to polyfill async/await).– Bergi
Aug 15 '16 at 23:28
2
2
@arve0 Not really, an
async
function is quite different from a Promise
executor callback, but yes the map
callback returns a promise in both cases.– Bergi
Mar 29 '17 at 16:25
@arve0 Not really, an
async
function is quite different from a Promise
executor callback, but yes the map
callback returns a promise in both cases.– Bergi
Mar 29 '17 at 16:25
2
2
When you come to learn about JS promises, but instead use half an hour translating latin. Hope you're proud @Bergi ;)
– Félix Gagnon-Grenier
May 16 '17 at 21:04
When you come to learn about JS promises, but instead use half an hour translating latin. Hope you're proud @Bergi ;)
– Félix Gagnon-Grenier
May 16 '17 at 21:04
|
show 28 more comments
up vote
44
down vote
With ES2018, you are able to greatly simplify all of the above answers to:
async function printFiles () {
const files = await getFilePaths()
for await (const file of fs.readFile(file, 'utf8')) {
console.log(contents)
}
}
See spec: https://github.com/tc39/proposal-async-iteration
2018-09-10: This answer has been getting a lot attention recently, please see Axel Rauschmayer's blog post for further information about asynchronous iteration: http://2ality.com/2016/10/asynchronous-iteration.html
3
Upvoted, would be great if you could put a link to the spec in your answer for anyone who wants to know more about async iteration.
– saadq
Jun 15 at 16:40
1
Upvoted for show to the community a very new approach
– Pablo De Luca
Jul 27 at 19:38
2
Where isfile
defined in your code?
– ma11hew28
Sep 3 at 7:12
Unsure wherefile
is defined. This is ported 1:1 from the question above.
– Francisco Mateo
Sep 8 at 14:13
How's the performance of this?
– sorxrob
Sep 18 at 4:16
edited Sep 10 at 20:58
answered Jun 15 at 11:17
Francisco Mateo
3,31221327
up vote
17
down vote
To me using Promise.all()
with map()
is a bit difficult to understand and verbose, but if you want to do it in plain JS that's your best shot I guess.
If you don't mind adding a module, I implemented the Array iteration methods so they can be used in a very straightforward way with async/await.
An example with your case:
const { forEach } = require('p-iteration');
const fs = require('fs-promise');
async function printFiles () {
const files = await getFilePaths();
await forEach(files, async (file) => {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
}
printFiles()
p-iteration
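For reference, the plain-JS Promise.all with map approach this answer alludes to looks roughly like this (fakeRead is a hypothetical stand-in for fs.readFile so the sketch is runnable on its own):

```javascript
// Hypothetical stand-in for fs.readFile(file, 'utf8').
const fakeRead = (file) => Promise.resolve(`contents of ${file}`);

async function printFiles (files) {
  // Start all reads at once; await them together. Promise.all keeps
  // the results in the same order as the input array.
  const allContents = await Promise.all(files.map((file) => fakeRead(file)));
  allContents.forEach((contents) => console.log(contents));
  return allContents;
}
```

Whether this or a library helper reads better is largely a matter of taste; the behavior (parallel reads, ordered results) is the same.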
4
Wow, p-iteration is so smooth. Saved my day!
– Antonio Torres
Dec 7 '17 at 0:52
1
I like this as it has the same functions / methods as JS itself - in my case I needed some rather than forEach. Thanks!
– mikemaccana
yesterday
edited Oct 16 '17 at 0:03
answered Jul 10 '17 at 8:15
Antonio Val
1,4931416
up vote
12
down vote
Instead of Promise.all
in conjunction with Array.prototype.map
(which does not guarantee the order in which the Promise
s are resolved), I use Array.prototype.reduce
, starting with a resolved Promise
:
async function printFiles () {
const files = await getFilePaths();
await files.reduce(async (promise, file) => {
// This line will wait for the last async function to finish.
// The first iteration uses an already resolved Promise
// so, it will immediately continue.
await promise;
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
}, Promise.resolve());
}
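To see the sequencing at work, here is a runnable variant of the same reduce chain with a hypothetical delay standing in for the file read; items are processed strictly one at a time, in order:

```javascript
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function processSequentially (items) {
  const order = [];
  await items.reduce(async (previousPromise, item) => {
    // Wait for the previous iteration's promise before starting this one.
    await previousPromise;
    await delay(5); // hypothetical async work, e.g. a file read
    order.push(item);
  }, Promise.resolve());
  return order;
}
```

Each async callback returns a promise that becomes the next accumulator, so the chain links every iteration to the one before it.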
1
This works perfectly, thank you so much. Could you explain what is happening here with Promise.resolve() and await promise;?
– parrker9
Mar 28 at 20:48
1
This is pretty cool. Am I right in thinking the files will be read in order and not all at once?
– GollyJer
Jun 9 at 0:24
1
This is very clever! Thank you!
– Micah Henning
Jun 15 at 17:13
1
@parrker9 Promise.resolve() returns an already resolved Promise object, so that reduce has a Promise to start with. await promise; will wait for the last Promise in the chain to resolve. @GollyJer The files will be processed sequentially, one at a time.
– Timothy Zorn
Jun 17 at 15:00
edited Jun 17 at 15:01
answered Mar 26 at 19:48
Timothy Zorn
698615
up vote
10
down vote
Here are some forEach async prototypes:
Array.prototype.forEachAsync = async function (fn) {
for (let t of this) { await fn(t) }
}
Array.prototype.forEachAsyncParallel = async function (fn) {
await Promise.all(this.map(fn));
}
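If you'd rather not extend Array.prototype, the same two helpers work as standalone functions (a sketch):

```javascript
// Sequential: await each callback before moving to the next element.
async function forEachAsync (array, fn) {
  for (const item of array) {
    await fn(item);
  }
}

// Parallel: start every callback at once, then wait for all of them.
async function forEachAsyncParallel (array, fn) {
  await Promise.all(array.map(fn));
}
```

Keeping them as module-level functions avoids any risk of clashing with a future built-in of the same name.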
Although I'd hesitate to add things directly to the prototype, this is a nice async forEach implementation
– DaniOcean
Mar 28 at 13:55
As long as the name is unique in the future (like I'd use _forEachAsync) this is reasonable. I also think it's the nicest answer as it saves a lot of boilerplate code.
– mikemaccana
Apr 3 at 13:29
They should be standalone functions. We've got modules to not pollute globals with our personal things.
– estus
Nov 5 at 6:25
answered Mar 22 at 15:11
Matt
684212
up vote
2
down vote
Both of the solutions above work; however, Antonio's does the job with less code. Here is how it helped me resolve data from my database, from several different child refs, pushing it all into an array and resolving the promise once everything is done:
Promise.all(PacksList.map((pack)=>{
return fireBaseRef.child(pack.folderPath).once('value',(snap)=>{
snap.forEach( childSnap => {
const file = childSnap.val()
file.id = childSnap.key;
allItems.push( file )
})
})
})).then(()=>store.dispatch( actions.allMockupItems(allItems)))
answered Aug 26 '17 at 10:47
Hooman Askari
630818
up vote
2
down vote
It's pretty painless to pop a couple of methods into a file that will handle asynchronous data in a serialized order and give a more conventional flavour to your code. For example:
module.exports = function () {
  var self = this;

  // Sequential: await each item's callback before moving to the next,
  // so the items really are handled in a serialized order.
  this.each = async (items, fn) => {
    for (const item of items || []) {
      await fn(item);
    }
  };

  this.reduce = async (items, fn, initialValue) => {
    await self.each(
      items, async (item) => {
        initialValue = await fn(initialValue, item);
      });
    return initialValue;
  };
};
Now, assuming that's saved at './myAsync.js', you can do something similar to the below in an adjacent file:
...
/* your server setup here */
...
var MyAsync = require('./myAsync');
var Cat = require('./models/Cat');
var Doje = require('./models/Doje');
var example = async () => {
  var myAsync = new MyAsync();
  var doje = await Doje.findOne({ name: 'Doje', noises: [] }).save();
  // FOR EACH EXAMPLE
  await myAsync.each(['bork', 'concern', 'heck'],
    async (elem) => {
      if (elem !== 'heck') {
        await doje.update({ $push: { 'noises': elem }});
      }
    });
  var cat = await Cat.findOne({ name: 'Nyan' });
  // REDUCE EXAMPLE
  var friendsOfNyanCat = await myAsync.reduce(cat.friends,
    async (catArray, friendId) => {
      var friend = await Friend.findById(friendId);
      if (friend.name !== 'Long cat') {
        catArray.push(friend.name);
      }
      return catArray;
    }, []);
  // Assuming Long Cat was a friend of Nyan Cat...
  assert(friendsOfNyanCat.length === (cat.friends.length - 1));
}
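Per the addendum in the comments, it's worth wrapping the awaits in try/catch so a rejected promise doesn't escape the async function; a minimal, self-contained sketch of that pattern:

```javascript
// Wrap an awaited step so a rejection is handled instead of propagating.
async function runSafely (step) {
  try {
    return await step();
  } catch (err) {
    console.error('step failed:', err.message);
    return null; // hypothetical fallback value
  }
}
```

Without the try/catch, an unhandled rejection inside the async function would surface to whoever awaits it (or as an unhandled-rejection warning if nobody does).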
2
Minor addendum, don't forget to wrap your await/asyncs in try/catch blocks!!
– Jay Edwards
Sep 26 '17 at 9:08
edited Sep 26 '17 at 9:07
answered Sep 22 '17 at 23:03
Jay Edwards
174112
up vote
1
down vote
Using Task, futurize, and a traversable List, you can simply do
async function printFiles() {
const files = await getFiles();
List(files).traverse( Task.of, f => readFile( f, 'utf-8'))
.fork( console.error, console.log)
}
Here is how you'd set this up
import fs from 'fs';
import { futurize } from 'futurize';
import Task from 'data.task';
import { List } from 'immutable-ext';
const future = futurize(Task)
const readFile = future(fs.readFile)
Another way to have structured the desired code would be
const printFiles = files =>
List(files).traverse( Task.of, fn => readFile( fn, 'utf-8'))
.fork( console.error, console.log)
Or perhaps even more functionally oriented
// 90% of encodings are utf-8, making that use case super easy is prudent
// handy-library.js
export const readFile = f =>
future(fs.readFile)( f, 'utf-8' )
export const arrayToTaskList = list => taskFn =>
  List(list).traverse( Task.of, taskFn )
export const readFiles = files =>
  arrayToTaskList(files)(readFile)
export const printFiles = files =>
readFiles(files).fork( console.error, console.log)
Then from the parent function
async function main() {
/* awesome code with side-effects before */
printFiles( await getFiles() );
/* awesome code with side-effects after */
}
If you really wanted more flexibility in encoding, you could just do this (for fun, I'm using the proposed Pipe Forward operator):
import { curry, flip } from 'ramda'
export const readFile = fs.readFile
  |> future
  |> curry
  |> flip
export const readFileUtf8 = readFile('utf-8')
PS - I didn't try this code on the console, might have some typos... "straight freestyle, off the top of the dome!" as the 90s kids would say. :-p
FWIW, ++1 on this. It's an elegant implementation.
– Donald E. Foss
Apr 3 at 18:53
edited Apr 3 at 22:51
answered Feb 28 at 4:41
Babak
FWIW, ++1 on this. It's an elegant implementation.
– Donald E. Foss
Apr 3 at 18:53
In addition to @Bergi’s answer, I’d like to offer a third alternative. It's very similar to @Bergi’s second example, but instead of awaiting each readFile individually, you create an array of promises, all of which you await at the end.
import fs from 'fs-promise';
async function printFiles () {
const files = await getFilePaths();
const promises = files.map((file) => fs.readFile(file, 'utf8'))
const contents = await Promise.all(promises)
contents.forEach(console.log);
}
Note that the function passed to .map() does not need to be async, since fs.readFile returns a Promise object anyway. Therefore promises is an array of Promise objects, which can be passed to Promise.all().
In @Bergi’s answer, the console may log the file contents out of order: if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array. However, in the method above, the console is guaranteed to log the contents in the same order as the files array.
add a comment |
up vote
1
down vote
In addition to @Bergi’s answer, I’d like to offer a third alternative. It's very similar to @Bergi’s 2nd example, but instead of awaiting each readFile
individually, you create an array of promises, each which you await at the end.
import fs from 'fs-promise';
async function printFiles () {
const files = await getFilePaths();
const promises = files.map((file) => fs.readFile(file, 'utf8'))
const contents = await Promise.all(promises)
contents.forEach(console.log);
}
Note that the function passed to .map()
does not need to be async
, since fs.readFile
returns a Promise object anyway. Therefore promises
is an array of Promise objects, which can be sent to Promise.all()
.
In @Bergi’s answer, the console may log file contents out of order. For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files
array. However, in my method above, you are guaranteed the console will log the files in the same order as they are read.
add a comment |
up vote
1
down vote
up vote
1
down vote
In addition to @Bergi’s answer, I’d like to offer a third alternative. It's very similar to @Bergi’s 2nd example, but instead of awaiting each readFile
individually, you create an array of promises, each which you await at the end.
import fs from 'fs-promise';
async function printFiles () {
const files = await getFilePaths();
const promises = files.map((file) => fs.readFile(file, 'utf8'))
const contents = await Promise.all(promises)
contents.forEach(console.log);
}
Note that the function passed to .map()
does not need to be async
, since fs.readFile
returns a Promise object anyway. Therefore promises
is an array of Promise objects, which can be sent to Promise.all()
.
In @Bergi’s answer, the console may log file contents out of order. For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files
array. However, in my method above, you are guaranteed the console will log the files in the same order as they are read.
In addition to @Bergi’s answer, I’d like to offer a third alternative. It's very similar to @Bergi’s 2nd example, but instead of awaiting each readFile
individually, you create an array of promises, each which you await at the end.
import fs from 'fs-promise';
async function printFiles () {
const files = await getFilePaths();
const promises = files.map((file) => fs.readFile(file, 'utf8'))
const contents = await Promise.all(promises)
contents.forEach(console.log);
}
Note that the function passed to .map()
does not need to be async
, since fs.readFile
returns a Promise object anyway. Therefore promises
is an array of Promise objects, which can be sent to Promise.all()
.
In @Bergi’s answer, the console may log file contents out of order. For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files
array. However, in my method above, you are guaranteed the console will log the files in the same order as they are read.
edited Apr 21 at 21:21
answered Feb 23 at 0:47
chharvey
One important caveat: the await + for..of approach and the forEach + async approach actually have different effects. Having await inside a real for loop makes sure the async calls are executed one by one. The forEach + async approach fires off all the promises at the same time, which is faster but can be overwhelming (for example, if you run database queries or call web services with volume restrictions and don't want to fire 100,000 calls at a time).
You can also use reduce + promise (less elegant) if you don't use async/await and want to make sure files are read one after another.
files.reduce((lastPromise, file) =>
lastPromise.then(() =>
fs.readFile(file, 'utf8')
), Promise.resolve()
)
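The reduce chain above sequences the reads but discards each file's contents. A sketch of the same pattern, extended so each step also collects its result (using a timer-based `fakeReadFile` as a stand-in so the block is self-contained):

```javascript
// reduce + promise chaining: work runs strictly one item at a time,
// and each step appends its result to the accumulated array.
const fakeReadFile = name =>
  new Promise(resolve => setTimeout(() => resolve(`<${name}>`), 5));

const readSequentially = files =>
  files.reduce(
    (chain, file) =>
      chain.then(results =>
        fakeReadFile(file).then(contents => [...results, contents])
      ),
    Promise.resolve([])
  );

readSequentially(['a.txt', 'b.txt']).then(results => console.log(results));
```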
Or you can create a forEachAsync helper, which basically uses the same for loop under the hood.
Array.prototype.forEachAsync = async function(cb){
for(let x of this){
await cb(x);
}
}
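To make the sequential behavior concrete, here is a self-contained demo of that helper (redeclared so the block runs on its own, with timers in place of real file reads; the delays are illustrative):

```javascript
// Same forEachAsync helper as above: a plain for..of with await.
Array.prototype.forEachAsync = async function (cb) {
  for (const x of this) {
    await cb(x); // each callback completes before the next one starts
  }
};

const delay = ms => new Promise(resolve => setTimeout(resolve, ms));

async function demo() {
  const order = [];
  // 'a' has the longest delay, yet still finishes first because
  // the iteration is strictly sequential.
  await [['a', 30], ['b', 10], ['c', 1]].forEachAsync(async ([name, ms]) => {
    await delay(ms);
    order.push(name);
  });
  return order;
}

demo().then(order => console.log(order.join(','))); // serial order: a,b,c
```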
Have a look at How to define method in javascript on Array.prototype and Object.prototype so that it doesn't appear in for in loop. Also you probably should use the same iteration as native forEach - accessing indices instead of relying on iterability - and pass the index to the callback.
– Bergi
Nov 16 '17 at 13:57
You can use Array.prototype.reduce in a way that uses an async function. I've shown an example in my answer: stackoverflow.com/a/49499491/2537258
– Timothy Zorn
Mar 26 at 19:54
answered Sep 24 '17 at 20:00
Leon li
Similar to Antonio Val's p-iteration, an alternative npm module is async-af:
const AsyncAF = require('async-af');
const fs = require('fs-promise');
function printFiles() {
// since AsyncAF accepts promises or non-promises, there's no need to await here
const files = getFilePaths();
AsyncAF(files).forEach(async file => {
const contents = await fs.readFile(file, 'utf8');
console.log(contents);
});
}
printFiles();
Alternatively, async-af has a static method (log/logAF) that logs the results of promises:
const AsyncAF = require('async-af');
const fs = require('fs-promise');
function printFiles() {
const files = getFilePaths();
AsyncAF(files).forEach(file => {
AsyncAF.log(fs.readFile(file, 'utf8'));
});
}
printFiles();
However, the main advantage of the library is that you can chain asynchronous methods to do something like:
const aaf = require('async-af');
const fs = require('fs-promise');
const printFiles = () => aaf(getFilePaths())
.map(file => fs.readFile(file, 'utf8'))
.forEach(file => aaf.log(file));
printFiles();
answered Jun 21 at 16:55
Scott Rudiger
I would use the well-tested (millions of downloads per week) pify and async modules. If you are unfamiliar with the async module, I highly recommend you check out its docs. I've seen multiple devs waste time recreating its methods, or worse, making difficult-to-maintain async code when higher-order async methods would simplify code.
const async = require('async')
const fs = require('fs-promise')
const pify = require('pify')
async function getFilePaths() {
return Promise.resolve([
'./package.json',
'./package-lock.json',
]);
}
async function printFiles () {
const files = await getFilePaths()
await pify(async.eachSeries)(files, async (file) => { // <-- run in series
// await pify(async.each)(files, async (file) => { // <-- run in parallel
const contents = await fs.readFile(file, 'utf8')
console.log(contents)
})
console.log('HAMBONE')
}
printFiles().then(() => {
console.log('HAMBUNNY')
})
// ORDER OF LOGS:
// package.json contents
// package-lock.json contents
// HAMBONE
// HAMBUNNY
This is a step in the wrong direction. Here's a mapping guide I created to help get folks stuck in callback hell into the modern JS era: github.com/jmjpro/async-package-to-async-await/blob/master/….
– jbustamovej
Feb 20 at 6:24
as you can see here, I am interested in and open to using async/await instead of the async lib. Right now, I think that each has a time and place. I'm not convinced that the async lib == "callback hell" and async/await == "the modern JS era". imo, when async lib > async/await: 1. complex flow (eg, queue, cargo, even auto when things get complicated) 2. concurrency 3. supporting arrays/objects/iterables 4. err handling
– Zachary Ryan Smith
Feb 21 at 1:54
answered Feb 4 at 16:03
Zachary Ryan Smith
protected by georgeawg Aug 16 at 17:41