From 25066363883ae73392544fe3b3d690dcc78965de Mon Sep 17 00:00:00 2001
From: Harendra Kumar
Date: Mon, 5 Aug 2019 07:19:45 +0530
Subject: [PATCH] Fix documentation for movement of APIs to prelude

Hide ltake and ltakeWhile as well.
---
 src/Streamly.hs            | 203 ++++++++++++++++++++++++-------
 src/Streamly/Fold.hs       |  87 ++++++++--------
 src/Streamly/Fold/Types.hs |   8 +-
 src/Streamly/Prelude.hs    |  75 +++++++-------
 4 files changed, 220 insertions(+), 153 deletions(-)

diff --git a/src/Streamly.hs b/src/Streamly.hs
index 8764f5d53..6d0866d6c 100644
--- a/src/Streamly.hs
+++ b/src/Streamly.hs
@@ -7,22 +7,49 @@
 -- Stability : experimental
 -- Portability : GHC
 --
--- Haskell lists provide a simple and powerful API but they are limited to
--- holding only pure values. Streamly streams are a logical extension of
--- Haskell lists to support monadic sequences and powerful declarative
--- concurrency. They provide almost the same API as lists (See
--- "Streamly.Prelude"). In fact, Haskell lists can be expressed as pure
--- streams, a special case of monadic streams, with similar or better
--- performance and almost drop-in replacement.
+-- Streamly is a general purpose programming framework using the concurrent
+-- data flow programming paradigm. It can be considered a generalization of
+-- Haskell lists to monadic streaming with concurrent composition capability.
+-- The serial stream type in streamly @SerialT m a@ is like the list type @[a]@
+-- parameterized by the monad @m@. For example, @SerialT IO a@ is a moral
+-- equivalent of @[a]@ in the IO monad. Streams are constructed very much like
+-- lists, except that they use 'nil' and 'cons' instead of '[]' and ':'.
 --
--- Streams are designed for high performance applications and do not exhibit
--- issues that are usually associated with lists. For example, streams
--- express strings as streams of 'Char' and bytestrings as streams of 'Word8'
--- without any issues, obviating the need for special purpose libraries like
--- bytestring and text, and various lazy and strict falvors of those. Streamly
--- arrays (See "Streamly.Array") complement streams for storing or buffering
--- data efficiently and facilitating efficient interfacing of streams with IO
--- systems.
+-- @
+-- > import "Streamly"
+-- > import "Streamly.Prelude" (nil, cons, consM)
+-- > import qualified "Streamly.Prelude" as S
+-- >
+-- > S.toList $ 1 \`cons` 2 \`cons` 3 \`cons` nil
+-- [1,2,3]
+-- @
+--
+-- Unlike lists, streams can be constructed from monadic effects:
+--
+-- @
+-- > S.'toList' $ 'getLine' \`consM` 'getLine' \`consM` S.'nil'
+-- hello
+-- world
+-- ["hello","world"]
+-- @
+--
+-- Streams are processed just like lists, with list-like combinators, except
+-- that they are monadic and work in a streaming fashion. Here is a simple
+-- console echo program example:
+--
+-- @
+-- > S.drain $ S.repeatM getLine & S.mapM putStrLn
+-- @
+--
+-- @SerialT Identity a@ is a moral equivalent of pure lists. Streamly utilizes
+-- fusion for high performance; therefore, we can represent and process strings
+-- as streams of 'Char', encode and decode the streams to/from UTF8 and
+-- serialize them to @Array Word8@, obviating the need for special purpose
+-- libraries like @bytestring@ and @text@.
+--
+-- For more details please see the "Streamly.Tutorial" module and the examples
+-- directory in this package.
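The module header above shows the constructions only as GHCi fragments. As a cross-check, here is a minimal, self-contained sketch that puts the same pieces ('cons', 'consM', 'nil', 'drain', 'repeatM', 'mapM') into one compilable program. It relies only on names already used in this patch; the exact module layout may differ in other streamly releases.

import Data.Function ((&))
import Streamly
import Streamly.Prelude (cons, consM, nil)
import qualified Streamly.Prelude as S

main :: IO ()
main = do
    -- A pure stream built with 'cons' and 'nil', collected into a list.
    xs <- S.toList (1 `cons` 2 `cons` 3 `cons` nil :: SerialT IO Int)
    print xs                          -- [1,2,3]
    -- A stream of monadic effects built with 'consM'.
    ys <- S.toList (getLine `consM` getLine `consM` nil)
    print ys
    -- The console echo loop from the haddock example above.
    S.drain $ S.repeatM getLine & S.mapM putStrLn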
{-# LANGUAGE CPP #-}
{-# LANGUAGE MultiParamTypeClasses #-}
@@ -35,12 +62,19 @@ module Streamly
     (
-    -- * Streams and Folds
-    -- $overview
-
-    -- * Streams Overview
+    -- * Concepts Overview
+    -- ** Streams
     -- $streams
 
+    -- ** Folds
+    -- $folds
+
+    -- ** Arrays
+    -- $arrays
+
+    -- * Module Overview
+    -- $streamtypes
+
     -- * Type Synonyms
       MonadAsync
@@ -165,43 +199,52 @@ import qualified Streamly.Streams.StreamK as K
 -- XXX provide good succinct examples of pipelining, merging, splitting etc.
 -- below.
 --
--- $overview
--- Streamly is a general purpose concurrent data flow programming engine
--- expressing a program as a network of streams and folds. Stream types (e.g.
--- 'SerialT') exported from this module represent streams or more specifically
--- stream producers while the fold type exported from "Streamly.Fold"
--- represents stream consumers. Both producers and consumers can be
--- transformed in many ways e.g. using map, scan, filter etc. Transformations
--- can be chained into a pipeline, multiple streams can be merged together or a
--- stream can be split into multiple independent data flows. All these
--- compositions can be combined together to form an arbitrary data flow network
--- expressing any kind of general purpose computing in a declarative manner.
+-- $streams
 --
--- The simplest way to compose a data processing pipeline is to generate a
--- stream of data (e.g. from a file or network), chain pipelines of
--- transformations on the stream and finally pass it to a fold. Folds represent
--- specific ways to consume a stream (e.g. producing a sum of numbers).
--- Transformations can also be applied on folds before they are attached with a
--- stream.
+-- A program is expressed as a network of streams and folds. A stream is a
+-- source or generator of data elements and a fold is a consumer of data
+-- elements that reduces multiple input elements to a single value.
 --
--- @
+-- In the following example, a 'Word8' stream is generated by using
+-- 'Streamly.FileSystem.Handle.read' on a file handle, then the
+-- 'Streamly.Prelude.splitOnSuffix' transformation splits the stream on
+-- newlines (ASCII value 10); it uses the 'Streamly.Fold.drain' fold to reduce
+-- the resulting lines to unit values (@()@), and the 'Streamly.Prelude.length'
+-- fold then counts the unit elements in the resulting stream, which gives us
+-- the number of lines in the file:
 --
--- ---Stream m a-->stream transforms-->fold transforms-->Fold m a b
--- @
+-- > S.length $ S.splitOnSuffix (== 10) FL.drain $ FH.read fh
 --
--- Stream producers can be appended, merged, zipped or nested to build an
--- arbitrarily complex composed stream producer. Transformations can be applied
--- at any point in the composition tree. See "Streamly.Prelude" module for more
--- details on the combinators.
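To make the line-counting fragment above concrete, here is a hedged, self-contained sketch. It assumes the module names referenced in this patch ("Streamly.Prelude", "Streamly.Fold", "Streamly.FileSystem.Handle") and the predicate-first argument order of 'splitOnSuffix' as defined in this patch's Streamly.Prelude; the file name is a placeholder.

import System.IO (IOMode(ReadMode), withFile)
import qualified Streamly.Prelude as S
import qualified Streamly.Fold as FL
import qualified Streamly.FileSystem.Handle as FH

-- Count the lines in a file: split the Word8 stream on newline bytes
-- (ASCII 10), drain each line to () and count the resulting elements.
main :: IO ()
main =
    withFile "input.txt" ReadMode $ \fh -> do    -- "input.txt" is a placeholder path
        n <- S.length $ S.splitOnSuffix (== 10) FL.drain $ FH.read fh
        print n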
+-- The following example folds the lines to arrays of 'Word8' using the
+-- 'Streamly.Mem.Array.writeF' fold and then wraps the lines in square
+-- brackets before writing them to standard output using
+-- 'Streamly.FileSystem.Handle.write':
 --
--- @
+-- > wrapLine ln = S.fromList "[" <> A.read ln <> S.fromList "]\n"
+-- > readLines = S.splitOnSuffix (== 10) A.writeF
+-- > FH.write stdout $ S.concatMap wrapLine $ readLines fh1
 --
--- -------Stream m a-->transform-->|
---                                 |
--- -------Stream m a-->transform-->|=>---transform-->Stream m a--->
---                                 |
--- -------Stream m a-->transform-->|
--- @
+-- One stream can be appended after another:
+--
+-- > FH.write stdout $ S.concatMap wrapLine $ readLines fh1 <> readLines fh2
+--
+-- The following example reads two files concurrently, merges the lines from
+-- the two streams and writes the resulting stream to standard output:
+--
+-- > FH.write stdout $ S.concatMap wrapLine $ readLines fh1 `parallel` readLines fh2
+--
+-- There are many ways to generate, merge, zip, transform and fold data
+-- streams. Many transformations can be chained in a stream pipeline. See the
+-- "Streamly.Prelude" module for combinators to manipulate streams.
+
+-- $folds
+--
+-- Just as the stream types in this module, like 'SerialT', represent data
+-- sources, the 'Fold' type from "Streamly.Fold" represents data sinks or
+-- reducers of streams. Reducers can be combined to consume a stream source in
+-- many ways. The simplest way is to reduce a stream source using a single
+-- fold, e.g.:
+--
+-- > S.runFold FL.length $ S.enumerateFromTo 1 100
 --
 -- Folds are consumers of streams and can be used to split a stream into
 -- multiple independent flows. Grouping transforms a stream by applying a fold
@@ -222,7 +265,37 @@ import qualified Streamly.Streams.StreamK as K
 -- @
 --
--- $streams
+-- $arrays
+--
+-- Streamly arrays (See "Streamly.Mem.Array") complement streams to provide an
+-- efficient computing paradigm. Streams are suitable for immutable
+-- transformations of /potentially infinite/ data using /sequential access/ and
+-- pipelined transformations whereas arrays are suitable for in-place
+-- transformations of /necessarily finite/ data using /random access/. Streams
+-- are synonymous with /sequential pipelined processing/ whereas arrays are
+-- synonymous with /efficient buffering and random access/.
+--
+-- In general, a data processing pipeline reads data from some IO device, does
+-- some processing on it and finally writes the output to another IO device.
+-- Streams provide the overall framework of a sequential processing pipeline in
+-- which arrays are used as buffering elements in the middle. In addition to
+-- buffering in the middle, arrays can also be used at the boundaries of the
+-- pipeline to efficiently interface with external storage systems like memory,
+-- files and network. If streams are the pipes in a water pipeline network,
+-- then arrays are like the storage tanks in the middle. On the input side,
+-- think of arrays as buckets to fetch water to feed the pipeline and on the
+-- output side as buckets to remove the processed water.
+--
+-- The 'ByteString' data type from the 'bytestring' package and the 'Text' data
+-- type from the 'text' package are special cases of arrays. 'ByteString' is
+-- like @Array Word8@ and 'Text' is like @utf16@ encoded @Array Word8@.
+-- Streamly arrays can be transformed as efficiently as @bytestring@ or @text@
+-- by using stream operations on them.
+
+-- Streams and arrays are equally important in computing.
They are computing +-- duals of each other. + +-- $streamtypes -- The basic stream type is 'Serial', it represents a sequence of IO actions, -- and is a 'Monad'. The type 'SerialT' is a monad transformer that can -- represent a sequence of actions in an arbitrary monad. The type 'Serial' is @@ -236,32 +309,22 @@ import qualified Streamly.Streams.StreamK as K -- explicit stream type combinators are used, the default stream type is -- inferred as 'Serial'. -- --- Here is a simple console echo program example: +-- This module exports stream types, instances and combinators for: -- --- @ --- > drain $ S.repeatM getLine & S.mapM putStrLn --- @ --- --- For more details please see the "Streamly.Tutorial" module and the examples --- directory in this package. --- --- This module exports stream types, instances and some basic operations. --- Functionality exported by this module include: --- --- * Semigroup append ('<>') instances as well as explicit operations for merging streams --- * Monad and Applicative instances for looping over streams --- * Zip Applicatives for zipping streams --- * Stream type combinators to convert between different composition styles --- * Some basic utilities to run and fold streams --- --- See the "Streamly.Prelude" module for comprehensive APIs for construction, --- generation, elimination and transformation of streams. +-- * converting between different stream types +-- * appending and concurrently merging streams +-- * Concurrency control +-- * Concurrent function application +-- * Stream rate control -- -- This module is designed to be imported unqualified: -- -- @ -- import Streamly -- @ +-- +-- See the "Streamly.Prelude" module for APIs for construction, +-- generation, elimination and transformation of streams. ------------------------------------------------------------------------------ -- Eliminating a stream diff --git a/src/Streamly/Fold.hs b/src/Streamly/Fold.hs index df2fe937e..5010b448d 100644 --- a/src/Streamly/Fold.hs +++ b/src/Streamly/Fold.hs @@ -15,14 +15,15 @@ -- Stability : experimental -- Portability : GHC -- --- A left fold consumes a stream and reduces it to a single value. Using the --- fold combinators in "Streamly.Prelude" module only one fold operation can be --- applied to a stream. This module provides a 'Fold' type that represents a --- left fold. Multiple such folds can be combined in different ways; a stream --- can then be supplied to the combined fold. --- For example, a distributive applicative composition distributes the input to +-- A left fold consumes a stream and reduces it to a single value. The fold +-- operations in "Streamly.Prelude" cannot be combined such that multiple of +-- them can run on the same stream. This module provides a 'Fold' type that +-- represents a left fold. Multiple such folds can be combined using +-- combinators; a stream can then be supplied to the combined fold and it would +-- distribute the input to constituent folds according to the composition. +-- For example, an applicative composition distributes the same input to -- the constituent folds and then combines the fold outputs. Similarly, a --- partitioning composition can partition the input among constituent folds. +-- partitioning combinator can divide the input among constituent folds. -- All the combinators in this module are of true streaming nature, stream -- elements are not unnecessarily buffered in memory, guaranteeing a constant -- memory consumption. 
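The applicative composition described above is easy to see in a small, self-contained sketch: two folds are combined so that a single pass over the stream feeds both. This is a minimal illustration using only combinators referenced in this patch ('runFold', 'sum', 'length', 'enumerateFromTo'); it is not the only way to combine folds.

import qualified Streamly.Prelude as S
import qualified Streamly.Fold as FL

-- Compute the sum and the average of a stream in one pass by combining
-- folds applicatively; the same input is distributed to both folds.
main :: IO ()
main = do
    let avg = (/) <$> FL.sum <*> fmap fromIntegral FL.length
    r <- S.runFold ((,) <$> FL.sum <*> avg)
                   (S.enumerateFromTo 1.0 (100.0 :: Double))
    print r   -- expected: (5050.0,50.5)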
@@ -32,18 +33,9 @@ -- sources in interesting ways whereas this module provides combinators that -- combine stream consumers in interesting ways. In other words, -- "Streamly.Prelude" provides stream merging capabilities while --- "Streamly.Fold" provides stream splitting capabilities. Both the modules --- are organized in the same way so that you can easily find the corresponding --- operations. +-- "Streamly.Fold" provides stream splitting capabilities. -- -- > import qualified Streamly.Fold as FL --- --- A left fold is represented by the type 'Fold'. @Fold m a b@ folds an --- input stream consisting of values of type @a@ to a structure of type --- @b@. The fold can be run using the 'foldl'' combinator and an input stream. --- --- >>> FL.foldl' FL.sum (S.enumerateFromTo 1 100) --- 5050 -- Also see the "Streamly.Sink" module that provides specialized left folds -- that discard the outputs. @@ -53,6 +45,13 @@ module Streamly.Fold ( -- * Fold Type + -- | + -- A 'Fold' can be run over a stream using the 'runFold' combinator in + -- "Streamly.Prelude": + -- + -- >>> S.runFold FL.sum (S.enumerateFromTo 1 100) + -- 5050 + Fold -- (..) -- , tail @@ -118,7 +117,7 @@ module Streamly.Fold -- Transformations can be applied either on the input side or on the output -- side. The 'Functor' instance of a fold maps on the output of the fold: -- - -- >>> FL.foldl' (fmap show FL.sum) (S.enumerateFromTo 1 100) + -- >>> S.runFold (fmap show FL.sum) (S.enumerateFromTo 1 100) -- "5050" -- -- However, the input side or contravariant transformations are more @@ -164,12 +163,12 @@ module Streamly.Fold , lreverse -} + {- -- * Parsing -- ** Trimming , ltake -- , lrunFor -- time , ltakeWhile - {- , ltakeWhileM , ldrop , ldropWhile @@ -194,7 +193,7 @@ module Streamly.Fold -- stream twice: -- -- >>> let avg = (/) <$> FL.sum <*> fmap fromIntegral FL.length - -- >>> FL.foldl' avg (S.enumerateFromTo 1.0 100.0) + -- >>> S.runFold avg (S.enumerateFromTo 1.0 100.0) -- 50.5 -- -- The 'Semigroup' and 'Monoid' instances of a distributing fold distribute @@ -202,7 +201,7 @@ module Streamly.Fold -- Semigroup instances of the output types: -- -- >>> import Data.Monoid (Sum) - -- >>> FL.foldl' (FL.head <> FL.last) (fmap Sum $ S.enumerateFromTo 1.0 100.0) + -- >>> S.runFold (FL.head <> FL.last) (fmap Sum $ S.enumerateFromTo 1.0 100.0) -- Just (Sum {getSum = 101.0}) -- -- The 'Num', 'Floating', and 'Fractional' instances work in the same way. @@ -339,7 +338,7 @@ _transform (Pipe pstep1 pstep2 pinitial) (Fold fstep finitial fextract) = -- | @(lmap f fold)@ maps the function @f@ on the input of the fold. -- --- >>> FL.foldl' (FL.lmap (\x -> x * x) FL.sum) (S.enumerateFromTo 1 100) +-- >>> S.runFold (FL.lmap (\x -> x * x) FL.sum) (S.enumerateFromTo 1 100) -- 338350 -- -- @since 0.7.0 @@ -547,7 +546,7 @@ stdDev = sqrt variance -- | Fold an input stream consisting of monoidal elements using 'mappend' -- and 'mempty'. -- --- > FL.foldl FL.mconcat (S.map Sum $ S.enumerateFromTo 1 10) +-- > S.runFold FL.mconcat (S.map Sum $ S.enumerateFromTo 1 10) -- -- @since 0.7.0 {-# INLINABLE mconcat #-} @@ -560,7 +559,7 @@ mconcat = Fold (\x a -> return $ mappend x a) (return mempty) return -- Make a fold from a pure function that folds the output of the function -- using 'mappend' and 'mempty'. 
-- --- > FL.foldl (FL.foldMap Sum) $ S.enumerateFromTo 1 10 +-- > S.runFold (FL.foldMap Sum) $ S.enumerateFromTo 1 10 -- -- @since 0.7.0 {-# INLINABLE foldMap #-} @@ -573,7 +572,7 @@ foldMap f = lmap f mconcat -- Make a fold from a monadic function that folds the output of the function -- using 'mappend' and 'mempty'. -- --- > FL.foldM (FL.foldMapM (return . Sum)) $ S.enumerateFromTo 1 10 +-- > S.runFold (FL.foldMapM (return . Sum)) $ S.enumerateFromTo 1 10 -- -- @since 0.7.0 {-# INLINABLE foldMapM #-} @@ -781,7 +780,7 @@ or = Fold (\x a -> return $ x || a) (return False) return -- | Include only those elements that pass a predicate. -- --- >>> FL.foldl (lfilter (> 5) FL.sum) [1..10] +-- >>> S.runFold (lfilter (> 5) FL.sum) [1..10] -- 40 -- -- @since 0.7.0 @@ -805,9 +804,9 @@ _lfilterM f (Fold step begin done) = Fold step' begin done -- | Take first 'n' elements from the stream and discard the rest. -- -- @since 0.7.0 -{-# INLINABLE ltake #-} -ltake :: Monad m => Int -> Fold m a b -> Fold m a b -ltake n (Fold step initial done) = Fold step' initial' done' +{-# INLINABLE _ltake #-} +_ltake :: Monad m => Int -> Fold m a b -> Fold m a b +_ltake n (Fold step initial done) = Fold step' initial' done' where initial' = fmap (Tuple' 0) initial step' (Tuple' i r) a = do @@ -821,9 +820,9 @@ ltake n (Fold step initial done) = Fold step' initial' done' -- | Takes elements from the input as long as the predicate succeeds. -- -- @since 0.7.0 -{-# INLINABLE ltakeWhile #-} -ltakeWhile :: Monad m => (a -> Bool) -> Fold m a b -> Fold m a b -ltakeWhile predicate (Fold step initial done) = Fold step' initial' done' +{-# INLINABLE _ltakeWhile #-} +_ltakeWhile :: Monad m => (a -> Bool) -> Fold m a b -> Fold m a b +_ltakeWhile predicate (Fold step initial done) = Fold step' initial' done' where initial' = fmap Left' initial step' (Left' r) a = do @@ -845,7 +844,7 @@ ltakeWhile predicate (Fold step initial done) = Fold step' initial' done' -- ---stream m a---| |---m (b,c) -- |-------Fold m a c--------| -- @ --- >>> FL.foldl' (FL.tee FL.sum FL.length) (S.enumerateFromTo 1.0 100.0) +-- >>> S.runFold (FL.tee FL.sum FL.length) (S.enumerateFromTo 1.0 100.0) -- (5050.0,100) -- -- @since 0.7.0 @@ -887,7 +886,7 @@ foldCons (Fold stepL beginL doneL) (Fold stepR beginR doneR) = -- ... -- @ -- --- >>> FL.foldl' (FL.distribute [FL.sum, FL.length]) (S.enumerateFromTo 1 5) +-- >>> S.runFold (FL.distribute [FL.sum, FL.length]) (S.enumerateFromTo 1 5) -- [15,5] -- -- This is the consumer side dual of the producer side 'sequence' operation. @@ -916,7 +915,7 @@ distribute (x:xs) = foldCons x (distribute xs) -- -- >>> import System.Random (randomIO) -- >>> randomly a = randomIO >>= \x -> return $ if x then Left a else Right a --- >>> FL.foldl' (FL.partitionByM randomly FL.length FL.length) (S.enumerateFromTo 1 100) +-- >>> S.runFold (FL.partitionByM randomly FL.length FL.length) (S.enumerateFromTo 1 100) -- (59,41) -- -- Send input to the two folds in a proportion of 2:1: @@ -932,7 +931,7 @@ distribute (x:xs) = foldCons x (distribute xs) -- -- main = do -- f <- proportionately 2 1 --- r <- FL.foldl' (FL.partitionByM f FL.length FL.length) (S.enumerateFromTo (1 :: Int) 100) +-- r <- S.runFold (FL.partitionByM f FL.length FL.length) (S.enumerateFromTo (1 :: Int) 100) -- print r -- @ -- @ @@ -971,7 +970,7 @@ partitionByM f (Fold stepL beginL doneL) (Fold stepR beginR doneR) = -- >>> let f = FL.partitionBy (\\n -> if even n then Left n else Right n) -- (fmap (("Even " ++) . show) FL.length) -- (fmap (("Odd " ++) . 
show) FL.length) --- in FL.foldl' f (S.enumerateFromTo 1 100) +-- in S.runFold f (S.enumerateFromTo 1 100) -- ("Even 50","Odd 50") -- @ -- @@ -1058,7 +1057,7 @@ demuxWith f kv = Fold step initial extract -- @ -- > let table = Data.Map.fromList [(\"SUM", FL.sum), (\"PRODUCT", FL.product)] -- input = S.fromList [(\"SUM",1),(\"PRODUCT",2),(\"SUM",3),(\"PRODUCT",4)] --- in FL.foldl' (FL.demux table) input +-- in S.runFold (FL.demux table) input -- One 1 -- Two 2 -- @ @@ -1116,7 +1115,7 @@ demuxWith_ f kv = Fold step initial extract -- > let prn = FL.drainBy print -- > let table = Data.Map.fromList [(\"ONE", prn), (\"TWO", prn)] -- input = S.fromList [(\"ONE",1),(\"TWO",2)] --- in FL.foldl' (FL.demux_ table) input +-- in S.runFold (FL.demux_ table) input -- One 1 -- Two 2 -- @ @@ -1136,7 +1135,7 @@ demux_ fs = demuxWith_ fst (Map.map (lmap snd) fs) -- -- @ -- > let input = S.fromList [(\"ONE",1),(\"ONE",1.1),(\"TWO",2), (\"TWO",2.2)] --- in FL.foldl' (FL.classify FL.toListRev) input +-- in S.runFold (FL.classify FL.toListRev) input -- fromList [(\"ONE",[1.1,1.0]),(\"TWO",[2.2,2.0])] -- @ -- @@ -1166,7 +1165,7 @@ classifyWith f (Fold step initial extract) = Fold step' initial' extract' -- -- @ -- > let input = S.fromList [(\"ONE",1),(\"ONE",1.1),(\"TWO",2), (\"TWO",2.2)] --- in FL.foldl' (FL.classify FL.toListRev) input +-- in S.runFold (FL.classify FL.toListRev) input -- fromList [(\"ONE",[1.1,1.0]),(\"TWO",[2.2,2.0])] -- @ -- @@ -1238,9 +1237,9 @@ unzip = unzipWith id -- accumulator. Thus we can resume the fold later and feed it more input. -- -- >> do --- > more <- FL.foldl (FL.duplicate FL.sum) (S.enumerateFromTo 1 10) --- > evenMore <- FL.foldl (FL.duplicate more) (S.enumerateFromTo 11 20) --- > FL.foldl evenMore (S.enumerateFromTo 21 30) +-- > more <- S.runFold (FL.duplicate FL.sum) (S.enumerateFromTo 1 10) +-- > evenMore <- S.runFold (FL.duplicate more) (S.enumerateFromTo 11 20) +-- > S.runFold evenMore (S.enumerateFromTo 21 30) -- > 465 -- -- @since 0.7.0 diff --git a/src/Streamly/Fold/Types.hs b/src/Streamly/Fold/Types.hs index 82a53ffaf..ee2d078b0 100644 --- a/src/Streamly/Fold/Types.hs +++ b/src/Streamly/Fold/Types.hs @@ -25,13 +25,14 @@ import Streamly.Strict (Tuple'(..)) -- | Represents a left fold over an input stream of values of type @a@ to a -- single value of type @b@ in 'Monad' @m@. +-- +-- @since 0.7.0 -- The fold uses an intermediate type @x@ as accumulator. The fold accumulator -- is initialized by calling the @init@ function and is then driven by calling -- the step function repeatedly. When the fold is done the @extract@ function -- is used to map the intermediate type @x@ to the final type @b@. This allows -- the state of the fold to be embedded in an arbitrary type @x@. --- data Fold m a b = -- | @Fold @ @ step @ @ initial @ @ extract@ forall x. Fold (x -> a -> m x) (m x) (x -> m b) @@ -46,9 +47,8 @@ instance Applicative m => Functor (Fold m a) where {-# INLINE (<$) #-} (<$) b = \_ -> pure b --- | The input (type @a@) of the composed fold is distributed to both the --- constituent folds. The outputs of the constituent folds (type @b@) are --- applied to a function. +-- | The fold resulting from '<*>' distributes its input to both the argument +-- folds and combines their output using the supplied function. 
 instance Applicative m => Applicative (Fold m a) where
     {-# INLINE pure #-}
     pure b = Fold (\() _ -> pure ()) (pure ()) (\() -> pure b)
diff --git a/src/Streamly/Prelude.hs b/src/Streamly/Prelude.hs
index a2c4beb57..4f917887a 100644
--- a/src/Streamly/Prelude.hs
+++ b/src/Streamly/Prelude.hs
@@ -170,16 +170,18 @@ module Streamly.Prelude
     -- the previous step. However, it is possible to fold parts of the stream in
     -- parallel and then combine the results using a monoid.
 
+    -- ** Right Folds
     , foldrM
     , foldrS
     , foldrT
     , foldr
 
+    -- ** Left Folds
     , foldl'
     , foldl1'
     , foldlM'
 
-    -- * Running Folds
+    -- ** Composable Left Folds
     -- $runningfolds
 
     , runFold
@@ -316,6 +318,7 @@ module Streamly.Prelude
     -- > foldl f z xs = last $ scanl f z xs
     -- > foldr f z xs = head $ scanr f z xs
 
+    -- ** Left scans
     , scanl'
     , scanlM'
     , postscanl'
@@ -325,7 +328,7 @@ module Streamly.Prelude
     , scanl1'
     , scanl1M'
 
-    -- ** Running Scans
+    -- ** Scan Using Fold
     , runScan
     , runPostscan
@@ -339,6 +342,7 @@ module Streamly.Prelude
     -- , lprescanl'
     -- , lprescanlM'
 
+    -- ** Indexing
     , indexed
     , indexedR
     -- , timestamped
@@ -460,7 +464,7 @@ module Streamly.Prelude
     -- , interposeBy
     -- , intercalate
 
-    -- ** Breaking
+    -- -- ** Breaking
 
     -- By chunks
     -- , splitAt -- spanN
@@ -490,14 +494,14 @@ module Streamly.Prelude
 
     -- , groupScan
 
-    -- *** Chunks
+    -- -- *** Chunks
     , chunksOf
     , sessionsOf
 
     -- , lchunksOf
     -- , lsessionsOf
 
-    -- *** Using Element Separators
+    -- -- *** Using Element Separators
     -- On == Dropping the separator
     , splitOn
     , splitOnSuffix
@@ -1346,19 +1350,20 @@ foldlM' step begin m = S.foldlM' step begin $ toStreamS m
 
 -- $runningfolds
 --
--- We can use the left folds in this module instead of the folds in
--- "Streamly.Prelude". For example the following two ways of folding are
--- equivalent in functionality and performance,
+-- The "Streamly.Fold" module defines composable left folds that can be
+-- combined together in many interesting ways. Those folds can be run using
+-- 'runFold'. The following two ways of folding are equivalent in functionality
+-- and performance,
 --
--- >>> FL.foldl' FL.sum (S.enumerateFromTo 1 100)
+-- >>> S.runFold FL.sum (S.enumerateFromTo 1 100)
 -- 5050
 -- >>> S.sum (S.enumerateFromTo 1 100)
 -- 5050
 --
--- However, left folds are push type folds. That means we push the entire input
--- to a fold before we can get the output. Therefore, the performance is
+-- However, left folds cannot terminate early even if they do not need to
+-- consume more input to determine the result. Therefore, the performance is
 -- equivalent only for full folds like 'sum' and 'length'. For partial folds
--- like 'head' or 'any' the folds in "Streamly.Prelude" may be much more
+-- like 'head' or 'any', the folds defined in this module may be much more
 -- efficient because they are implemented as right folds that terminate as soon
 -- as we get the result. Note that when a full fold is composed with a partial
 -- fold in parallel, the performance is not impacted as we anyway have to
 -- consume the whole stream.
 --
 -- >>> S.head (1 `S.cons` undefined)
 -- Just 1
--- >>> FL.foldl' FL.head (1 `S.cons` undefined)
+-- >>> S.runFold FL.head (1 `S.cons` undefined)
 -- *** Exception: Prelude.undefined
 --
 -- However, we can wrap the fold in a scan to convert it into a lazy stream of
 -- fold steps. We can then terminate the stream whenever we want.
For example, -- --- >>> S.toList $ S.take 1 $ FL.scanl' FL.head (1 `S.cons` undefined) +-- >>> S.toList $ S.take 1 $ S.runScan FL.head (1 `S.cons` undefined) -- [Nothing] -- -- The following example extracts the input stream up to a point where the @@ -1382,15 +1387,15 @@ foldlM' step begin m = S.foldlM' step begin $ toStreamS m -- > S.toList -- $ S.map (fromJust . fst) -- $ S.takeWhile (\\(_,x) -> x <= 10) --- $ FL.postscanl' ((,) \<$> FL.last \<*> avg) (S.enumerateFromTo 1.0 100.0) +-- $ S.runPostscan ((,) \<$> FL.last \<*> avg) (S.enumerateFromTo 1.0 100.0) -- @ -- @ -- [1.0,2.0,3.0,4.0,5.0,6.0,7.0,8.0,9.0,10.0,11.0,12.0,13.0,14.0,15.0,16.0,17.0,18.0,19.0] -- @ --- | Fold a stream using the supplied monadic fold. +-- | Fold a stream using the supplied left fold. -- --- >>> S.foldlStream FL.sum (S.enumerateFromTo 1 100) +-- >>> S.runFold FL.sum (S.enumerateFromTo 1 100) -- 5050 -- -- @since 0.7.0 @@ -2181,15 +2186,15 @@ mapMaybeMSerial f m = fromStreamD $ D.mapMaybeM f $ toStreamD m -- XXX Use a compact region list to temporarily store the list, in both reverse -- as well as in reverse'. -- +-- /Note:/ 'reverse'' is much faster than this, use that when performance +-- matters. +-- -- | -- > reverse = S.foldlT (flip S.cons) S.nil -- -- Returns the elements of the stream in reverse order. The stream must be -- finite. Note that this necessarily buffers the entire stream in memory. -- --- /Note:/ 'reverse'' is much faster than this, use that when performance --- matters. --- -- /Since 0.7.0 (Monad m constraint)/ -- -- /Since: 0.1.1/ @@ -2567,7 +2572,7 @@ intercalate :: (IsStream t, MonadAsync m) => t m a -> t m a -> t m a -- elements of its input are consumed by fold @f1@ and the rest of the stream -- is consumed by fold @f2@. -- --- > let splitAt_ n xs = FL.foldl' (FL.splitAt n FL.toList FL.toList) $ S.fromList xs +-- > let splitAt_ n xs = S.runFold (FL.splitAt n FL.toList FL.toList) $ S.fromList xs -- -- >>> splitAt_ 6 "Hello World!" -- > ("Hello ","World!") @@ -2656,7 +2661,7 @@ groupScan split fold m = undefined -- | Group the input stream into groups of @n@ elements each and then fold each -- group using the provided fold function. -- --- >> S.toList $ FL.chunksOf 2 FL.sum (S.enumerateFromTo 1 10) +-- >> S.toList $ S.chunksOf 2 FL.sum (S.enumerateFromTo 1 10) -- > [3,7,11,15,19] -- -- This can be considered as an n-fold version of 'ltake' where we apply @@ -2736,7 +2741,7 @@ _spanBy cmp (Fold stepL initialL extractL) (Fold stepR initialR extractR) = -- input as long as the predicate @p@ is 'True'. @f2@ consumes the rest of the -- input. -- --- > let span_ p xs = FL.foldl' (FL.span p FL.toList FL.toList) $ S.fromList xs +-- > let span_ p xs = S.runFold (S.span p FL.toList FL.toList) $ S.fromList xs -- -- >>> span_ (< 1) [1,2,3] -- > ([],[1,2,3]) @@ -2781,7 +2786,7 @@ span p (Fold stepL initialL extractL) (Fold stepR initialR extractR) = -- -- This is the binary version of 'splitBy'. -- --- > let break_ p xs = FL.foldl' (FL.break p FL.toList FL.toList) $ S.fromList xs +-- > let break_ p xs = S.runFold (S.break p FL.toList FL.toList) $ S.fromList xs -- -- >>> break_ (< 1) [3,2,1] -- > ([3,2,1],[]) @@ -2838,7 +2843,7 @@ _spanRollingBy cmp (Fold stepL initialL extractL) (Fold stepR initialR extractR) -- group is folded using the fold @f@ and the result of the fold is emitted in -- the output stream. 
-- --- >>> S.toList $ FL.groupsBy (>) FL.toList $ S.fromList [1,3,7,0,2,5] +-- >>> S.toList $ S.groupsBy (>) FL.toList $ S.fromList [1,3,7,0,2,5] -- > [[1,3,7],[0,2,5]] -- -- @since 0.7.0 @@ -2859,7 +2864,7 @@ groupsBy cmp f m = D.fromStreamD $ D.groupsBy cmp f (D.toStreamD m) -- comparison fails a new group is started. Each group is folded using the fold -- @f@. -- --- >>> S.toList $ FL.groupsByRolling (\a b -> a + 1 == b) FL.toList $ S.fromList [1,2,3,7,8,9] +-- >>> S.toList $ S.groupsByRolling (\a b -> a + 1 == b) FL.toList $ S.fromList [1,2,3,7,8,9] -- > [[1,2,3],[7,8,9]] -- -- @since 0.7.0 @@ -2878,7 +2883,7 @@ groupsByRolling cmp f m = D.fromStreamD $ D.groupsRollingBy cmp f (D.toStreamD -- -- Groups contiguous spans of equal elements together in individual groups. -- --- >>> S.toList $ FL.groups FL.toList $ S.fromList [1,1,2,2] +-- >>> S.toList $ S.groups FL.toList $ S.fromList [1,1,2,2] -- > [[1,1],[2,2]] -- -- @since 0.7.0 @@ -2895,7 +2900,7 @@ groups = groupsBy (==) -- stream before the sequence and the second part consisting of the sequence -- and the rest of the stream. -- --- > let breakOn_ pat xs = FL.foldl' (FL.breakOn pat FL.toList FL.toList) $ S.fromList xs +-- > let breakOn_ pat xs = S.runFold (S.breakOn pat FL.toList FL.toList) $ S.fromList xs -- -- >>> breakOn_ "dear" "Hello dear world!" -- > ("Hello ","dear world!") @@ -2931,7 +2936,7 @@ breakOn pat f m = undefined -- -- Let's use the following definition for illustration: -- --- > splitOn' p xs = S.toList $ FL.splitOn p (FL.toList) (S.fromList xs) +-- > splitOn' p xs = S.toList $ S.splitOn p (FL.toList) (S.fromList xs) -- -- >>> splitOn' (== '.') "" -- [""] @@ -2975,7 +2980,7 @@ splitOn predicate f m = -- ".--." => "" "--" -- @ -- --- > splitOnSuffix' p xs = S.toList $ FL.splitSuffixBy p (FL.toList) (S.fromList xs) +-- > splitOnSuffix' p xs = S.toList $ S.splitSuffixBy p (FL.toList) (S.fromList xs) -- -- >>> splitOnSuffix' (== '.') "" -- [] @@ -3021,7 +3026,7 @@ splitOnSuffix predicate f m = -- @["a","b"]@. In other words, it treats the input like words separated by -- whitespace elements determined by the predicate. -- --- > wordsBy' p xs = S.toList $ FL.wordsBy p (FL.toList) (S.fromList xs) +-- > wordsBy' p xs = S.toList $ S.wordsBy p (FL.toList) (S.fromList xs) -- -- >>> wordsBy' (== ',') "" -- > [] @@ -3045,7 +3050,7 @@ wordsBy predicate f m = -- | Like 'splitOnSuffix' but keeps the suffix attached to the resulting -- splits. -- --- > splitBySuffix' p xs = S.toList $ FL.splitBySuffix p (FL.toList) (S.fromList xs) +-- > splitBySuffix' p xs = S.toList $ S.splitBySuffix p (FL.toList) (S.fromList xs) -- -- >>> splitBySuffix' (== '.') "" -- [] @@ -3119,7 +3124,7 @@ splitBySuffix predicate f m = -- For illustration, let's define a function that operates on pure lists: -- -- @ --- splitOnSeq' pat xs = S.toList $ FL.splitOnSeq (A.fromList pat) (FL.toList) (S.fromList xs) +-- splitOnSeq' pat xs = S.toList $ S.splitOnSeq (A.fromList pat) (FL.toList) (S.fromList xs) -- @ -- -- >>> splitOnSeq' "" "hello" @@ -3184,7 +3189,7 @@ splitOnAny subseq f m = undefined -- D.fromStreamD $ D.splitOnAny f subseq (D.to -- | Like 'splitSuffixBy' but the separator is a sequence of elements, instead -- of a predicate for a single element. -- --- > splitSuffixOn_ pat xs = S.toList $ FL.splitSuffixOn (A.fromList pat) (FL.toList) (S.fromList xs) +-- > splitSuffixOn_ pat xs = S.toList $ S.splitSuffixOn (A.fromList pat) (FL.toList) (S.fromList xs) -- -- >>> splitSuffixOn_ "." 
"" -- [""] @@ -3234,7 +3239,7 @@ wordsOn subseq f m = undefined -- D.fromStreamD $ D.wordsOn f subseq (D.toStream -- -- | Like 'splitOnSeq' but splits the separator as well, as an infix token. -- --- > splitOn'_ pat xs = S.toList $ FL.splitOn' (A.fromList pat) (FL.toList) (S.fromList xs) +-- > splitOn'_ pat xs = S.toList $ S.splitOn' (A.fromList pat) (FL.toList) (S.fromList xs) -- -- >>> splitOn'_ "" "hello" -- > ["h","","e","","l","","l","","o"]