getify / fasy
FP iterators that are both eager and asynchronous
License: MIT License
Hi @getify,
Wanted to comment on a part of the readme below:
Unfortunately, aside from being more verbose, this "fix" is fairly limited. It really only works for map(..) and not for something like filter(..). Also, since it assumes concurrency, there's no way to do the iterations serially (for any of various reasons).
Although "assumes concurrency" is true for async functions inside of Promise.all, you still have the for..of loop available, which would perfectly suit iterating over users (let it be an array or any other iterable) and which, given you await inside the loop, makes it native sequential processing.
So I'm not sure there's no way to do the iterations serially. Sure, it fits map only - it doesn't fit filter or reduce - but it's worth mentioning. IMHO it's the easiest approach for sequential processing within async functions, if you just map items sequentially.
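For example, the for..of-with-await pattern being described might look like this minimal sketch (the mapSerial helper and lookupProfile stand-in are hypothetical, just for illustration):

```javascript
// Sketch: sequential "map" inside an async function using a plain
// for..of loop with await -- each iteration waits for the previous one.
async function lookupProfile(user) {
  // hypothetical async lookup, simulated here
  return `profile:${user}`;
}

async function mapSerial(items, fn) {
  const results = [];
  for (const item of items) {
    // awaiting inside the loop forces serial processing
    results.push(await fn(item));
  }
  return results;
}

mapSerial(["a", "b", "c"], lookupProfile).then(console.log);
// -> [ 'profile:a', 'profile:b', 'profile:c' ]
```

With Promise.all instead, all three lookups would be in flight at once; the loop form guarantees one at a time.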
Anyway, good work, as always! :)
Add note about how itiriri-async handles lazy async iteration.
Going through the list of methods in FPO, I think we probably should add serial/concurrent versions (where applicable) of these:
- flatMap(..)
- pipe(..) (using reduce(..))
- compose(..) (using reduceRight(..))
- filter(..) alias to filterIn(..)
- filterOut(..) (the inverse of filter(..))
Hi again Kyle :)
I've got a question about API.compose/API.pipe. The only difference I see between them is that the former is right-to-left, whereas the latter is left-to-right. I'm not sure what the standard is for composing functional APIs (if you know, I'd appreciate a link/resource); I thought that you write functions left-to-right and it is the internal implementation that reverses it.
Anyway, I found what fasy offers right now somewhat confusing: compose and pipe differ only in order, yet the names are different even though logically they seem to be the same thing. I find reduce/reduceRight straightforward. Do you think compose/composeRight would make sense?
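For reference, the two orderings are conventionally implemented along these lines (a generic sketch, not fasy's actual source):

```javascript
// pipe(..) composes left-to-right via reduce(..);
// compose(..) composes right-to-left via reduceRight(..).
const pipe = (...fns) => (x) => fns.reduce((v, fn) => fn(v), x);
const compose = (...fns) => (x) => fns.reduceRight((v, fn) => fn(v), x);

const inc = (n) => n + 1;
const double = (n) => n * 2;

pipe(inc, double)(3);    // inc first: (3 + 1) * 2 = 8
compose(inc, double)(3); // double first: (3 * 2) + 1 = 7
```

So the two really are the same operation with the argument list reversed, which is why the reduce/reduceRight pairing feels like a natural naming analogy.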
If you do:
FA.concurrent(5).map(someTask);
This could be interpreted as "I want task batches of maximum size 5 to execute concurrently (in parallel)." But the question is, what happens when the first of those tasks completes? Does it immediately spin off another task, or does it wait for the batch to finish and then do another batch of size 5?
I could see either behavior being desirable, depending on the situation. So, we're going to handle it by letting you optionally specify a second number, a minimum-active threshold:
// FA.concurrent(5).map(someTask) --> means:
FA.concurrent(5,5).map(someTask);
This means: always try to keep 5 active tasks (as long as there are more tasks waiting in the pool). This is "continuous pooling".
But this:
FA.concurrent(5,3).map(someTask);
..means: "start a batch of 5, refill the batch once the active tasks count falls below 3."
And this:
FA.concurrent(5,1).map(someTask);
..means: "start a batch of 5, refill the batch once it's completely empty". In other words, this is standard "batch mode", whereas the previous (5,5) would be like continuously drawing in new active tasks from the pool.
The minimum-active threshold will default to the batch size (aka, (5) means (5,5) and (3) means (3,3)).
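To make the two refill behaviors concrete, here's a rough sketch (not fasy's implementation; concurrentMap and its signature are hypothetical) of a pool that starts batchSize tasks and only draws in more once the active count drops below minActive:

```javascript
// Hypothetical pool: start `batchSize` tasks; when a task finishes,
// only refill if the active count has fallen below `minActive`.
// (batchSize, batchSize) = continuous pooling; (batchSize, 1) = batch mode.
async function concurrentMap(items, fn, batchSize, minActive = batchSize) {
  const results = new Array(items.length);
  let next = 0;
  let active = 0;
  await new Promise((resolve, reject) => {
    function refill() {
      // top the pool back up to batchSize
      while (active < batchSize && next < items.length) {
        const idx = next++;
        active++;
        Promise.resolve(fn(items[idx])).then((v) => {
          results[idx] = v;
          active--;
          if (next >= items.length && active === 0) resolve();
          else if (active < minActive) refill(); // threshold check
        }, reject);
      }
    }
    if (items.length === 0) resolve();
    else refill();
  });
  return results;
}
```

With minActive equal to batchSize, every completion immediately pulls in a new task; with minActive of 1, the pool drains completely before the next batch of batchSize starts.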
Explore some way to combine/bring in CAF (cancelable async functions) capabilities so that fasy's async iterations are optionally cancelable.
This might need to be a separate namespace where the API methods assume the cancelable token being passed in, or it may just be that existing methods could be polymorphic in some way.
Also to be explored: should fasy re-implement the cancelation, or should there be an adapter that can be optionally applied which delegates to CAF (which would need to be present separately) when needed.
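As one exploratory sketch of what "optionally cancelable" could mean (this is not fasy's or CAF's actual API; cancelToken and mapSerialCancelable are made-up names), a serial iteration could consult a token between steps:

```javascript
// Hypothetical cancelation token, checked between serial iterations.
function cancelToken() {
  let canceled = false;
  return {
    cancel() { canceled = true; },
    get canceled() { return canceled; }
  };
}

async function mapSerialCancelable(token, items, fn) {
  const results = [];
  for (const item of items) {
    // bail out if cancelation was requested before this step
    if (token.canceled) throw new Error("canceled");
    results.push(await fn(item));
  }
  return results;
}
```

An adapter approach would presumably translate a CAF-style signal into something like this token, rather than fasy re-implementing cancelation itself.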
Like Ramda: https://ramdajs.com/docs/#reduced
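A minimal sketch of that early-exit idea, mirroring Ramda's reduced(..) in spirit (names here are illustrative, not a proposed fasy API):

```javascript
// Wrapping a value in reduced(..) signals the reduce loop to stop
// immediately and use that value as the final result.
const REDUCED = Symbol("reduced");
const reduced = (v) => ({ [REDUCED]: true, value: v });

function reduceWithEarlyExit(items, fn, initial) {
  let acc = initial;
  for (const item of items) {
    acc = fn(acc, item);
    if (acc && acc[REDUCED]) return acc.value; // short-circuit
  }
  return acc;
}

// stop summing as soon as the total passes 10
reduceWithEarlyExit(
  [3, 4, 5, 6],
  (acc, n) => (acc + n > 10 ? reduced(acc + n) : acc + n),
  0
);
// -> 12 (the 6 is never visited)
```

For fasy the same marker would let an async reduce stop issuing further calls once the sentinel comes back.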
Check out the iterators in itiriri-async and see if any of them would be useful to support in Fasy.
Provide a way to chunk concurrent request sets, say limited to maximum of 5 at a time, or whatever. Basically, serial is "concurrent with a limit of 1".
I'm thinking something like:
FA.concurrent(5).map(getFile,filenames);
This would limit to no more than five pending getFile(..) calls at once.
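A minimal sketch of such a limiter (the mapWithLimit helper is hypothetical, not fasy's API): a fixed set of workers share a cursor, so at most limit calls are pending, and a limit of 1 degenerates to plain serial processing:

```javascript
// Run fn(..) over items with at most `limit` calls pending at once.
async function mapWithLimit(items, fn, limit) {
  const results = new Array(items.length);
  let cursor = 0;
  async function worker() {
    while (cursor < items.length) {
      const idx = cursor++; // claim the next unprocessed index
      results[idx] = await fn(items[idx]);
    }
  }
  // spin up `limit` workers (or fewer, if there aren't enough items)
  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, worker)
  );
  return results;
}
```

Calling mapWithLimit(filenames, getFile, 5) would then match the FA.concurrent(5) shape described above, and mapWithLimit(filenames, getFile, 1) is the serial case.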
Async generators are stage 3, so fairly likely to land in JS. Seems like fasy should support them eventually, probably sooner rather than later.
[UPDATE]: They landed in ES2018.
For example:
FP.serial.map(async function *mapper(v){
if (await lookup(v)) v = yield something(v);
return v;
});
Implementation should be fairly straightforward, in _runner(..).
Note: I don't think the for-await-of loop will work to run the async-generator, because that construct doesn't let us send a "next value" back in after each iteration. But I think we can do a for-of loop with await inside it, to get almost the same syntax sugar but have the capability to drive the iterator as we need to. We'll look for Symbol.asyncIterator to know that we need to take this path.
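A minimal sketch of that manual-driving idea (illustrative names, not fasy's internals): calling it.next(sent) directly lets the runner decide what value resumes the generator, which for-await-of can't do:

```javascript
// Drive an async generator by hand so a transformed value can be
// sent back in at each `yield` via next(..).
async function runAsyncGenMapper(gen, v) {
  const it = gen(v); // async generators expose Symbol.asyncIterator
  let sent;
  while (true) {
    const res = await it.next(sent);
    if (res.done) return res.value !== undefined ? res.value : sent;
    // for this sketch, await the yielded value's resolution
    // and resume the generator with it
    sent = await res.value;
  }
}

async function* mapper(v) {
  v = yield v * 2; // runner resumes us with the processed value
  return v + 1;
}

runAsyncGenMapper(mapper, 10).then(console.log); // -> 21
```

The real _runner(..) would additionally need to settle the open questions below about promises and return values.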
However, before proceeding, we need to decide:
- When the generator yields out a promise, do we wait on it and resume with its resolution, like we do in the normal generator-runner pattern (_runner(..))?
- Usually, returned values from a generator would be thrown away. If the generator doesn't return anything (or returns undefined), should we assume the last yielded value (or its resolution, if it was a promise) as the overall completion of that function's promise (in _runner(..))?