Karya, built on 2018-02-23T20:23:55 (patch cf8565b7ac832266878af99a942555d139065f12)

Safe Haskell: None




A Stream is a collection of LEvent.LEvents which is hopefully sorted in time order.



data Stream a Source #

A list seems inefficient, since each call appends to the stream. A block call will then append a bunch of events, which must be copied, and then recopied as part of the larger chunk for the next block call up. It's possible the head of the stream is also copied every time something is appended to it, but I'm not sure about that. It could also be that the number of events is low enough that all the inefficiency doesn't actually matter, but I'm not sure; I'd need profiling to tell.

TODO one possibility is a MergeList:

data MergeList a = Chunk [a] | Merge (MergeList a) (MergeList a)

This way I don't need to copy large chunks multiple times. Also, if I make sure there is no data dependency between the merge branches, I can evaluate them in parallel.
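The TODO idea above can be sketched as follows. `flatten` is a hypothetical helper, not part of the module; it uses an accumulator so each Chunk's contents are consed exactly once, rather than being recopied at every merge level.

```haskell
-- Sketch of the MergeList idea: O(1) append, one pass to flatten.
data MergeList a = Chunk [a] | Merge (MergeList a) (MergeList a)

instance Semigroup (MergeList a) where
    (<>) = Merge

-- Concatenate the chunks in order, threading an accumulator so no
-- chunk is copied more than once.
flatten :: MergeList a -> [a]
flatten ml = go ml []
    where
    go (Chunk xs) rest = xs ++ rest
    go (Merge a b) rest = go a (go b rest)
```

Since the two branches of a Merge share no data dependency, they could in principle also be forced in parallel before flattening.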

Each call generates a chunk [Event], and the chunks are then joined with (<>). This means every cons is copied once, but I think this is hard to avoid if I want to merge streams.

TODO the Functor and Traversable instances can destroy the order, but this isn't checked. Maybe I shouldn't have them?

Currently I don't actually track order, and just trust the callers.
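A minimal illustration of the problem on plain lists, where the "events" are just Ints ordered by value: if the mapped function changes whatever the elements are ordered by, fmap silently breaks the sorted invariant.

```haskell
-- An ascending list, mapped through a function that inverts the order.
broken :: [Int]
broken = fmap negate [1, 2, 3]
-- broken == [-1, -2, -3], no longer ascending.
```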


Functor Stream # 


fmap :: (a -> b) -> Stream a -> Stream b #

(<$) :: a -> Stream b -> Stream a #

Show InstrumentCalls # 
Show Builtins # 
Show Library # 
Semigroup NoteDeriver # 
Monoid NoteDeriver # 
Show a => Show (Stream a) # 


showsPrec :: Int -> Stream a -> ShowS #

show :: Stream a -> String #

showList :: [Stream a] -> ShowS #

Semigroup (Stream Signal.Control) # 
Semigroup (Stream PSignal) # 
Semigroup (Stream Score.Event) # 
Monoid (Stream Signal.Control) #

Signal.Control streams don't need sorted order.

Monoid (Stream PSignal) # 
Monoid (Stream Score.Event) # 
DeepSeq.NFData a => DeepSeq.NFData (Stream a) # 


rnf :: Stream a -> () #

Pretty.Pretty a => Pretty.Pretty (Stream a) # 
Callable (Transformer Pitch) # 
Callable (Transformer Control) # 
Callable (Transformer Note) # 
ToLibrary (Transformer Pitch) # 
ToLibrary (Transformer Control) # 
ToLibrary (Transformer Note) # 


from_sorted_events :: [a] -> Stream a Source #

Promise that the stream is really sorted.


partition :: Stream a -> ([a], [Log.Msg]) Source #

events_of :: Stream a -> [a] Source #


take_while :: (a -> Bool) -> Stream a -> Stream a Source #

drop_while :: (a -> Bool) -> Stream a -> Stream a Source #

cat_maybes :: Stream (Maybe a) -> Stream a Source #

catMaybes for Stream.

merge_asc_lists :: [Stream Score.Event] -> Stream Score.Event Source #

Merge sorted lists of events. If the lists themselves are also sorted, I can produce output without scanning the entire input list, so this should be more efficient than merge for a large input list.

This assumes all the streams are sorted. I could check first, but this would destroy the laziness. Instead, let it be out of order, and Convert will complain about it.
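A sketch of the merge on plain lists, where `key` stands in for the Score.Event start-time accessor (a placeholder, not the module's API). Each pairwise merge is lazy: it emits the earlier head without forcing the rest of either input, which is what preserves laziness over sorted streams.

```haskell
-- Merge ascending lists pairwise, ordered by a key function.
mergeAscBy :: Ord k => (a -> k) -> [[a]] -> [a]
mergeAscBy key = foldr merge2 []
    where
    merge2 [] ys = ys
    merge2 xs [] = xs
    merge2 (x:xs) (y:ys)
        | key x <= key y = x : merge2 xs (y : ys)
        | otherwise = y : merge2 (x : xs) ys
```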

specific transformations

first :: (a -> Bool) -> (a -> a) -> Stream a -> Stream a Source #

Apply to the first Event that matches the predicate.

first_last :: (a -> Bool) -> (a -> a) -> (a -> a) -> Stream a -> Stream a Source #

Apply to the first and last Events that match the predicate. If there are fewer than 2 such events, do nothing.
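The shape of `first` can be sketched on plain lists; `applyFirst` here is a hypothetical stand-in that transforms only the first matching element and leaves the rest untouched.

```haskell
-- Apply f to the first element satisfying the predicate.
applyFirst :: (a -> Bool) -> (a -> a) -> [a] -> [a]
applyFirst matches f = go
    where
    go [] = []
    go (x:xs)
        | matches x = f x : xs
        | otherwise = x : go xs
```

`first_last` is the same idea run from both ends, with the fewer-than-2 check up front.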


zip :: [a] -> Stream x -> Stream (a, x) Source #

zip_on :: ([a] -> [b]) -> Stream a -> Stream (b, a) Source #

zip3 :: [a] -> [b] -> Stream x -> Stream (a, b, x) Source #

zip3_on :: ([a] -> [b]) -> ([a] -> [c]) -> Stream a -> Stream (b, c, a) Source #

zip4 :: [a] -> [b] -> [c] -> Stream x -> Stream (a, b, c, x) Source #
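The `zip_on` family follows one pattern, sketched here on plain lists with a hypothetical `zipOn`: derive an annotation list from the whole input, then pair each annotation with its element. `zip3_on` and `zip4_on` extend this with additional derived lists.

```haskell
-- Annotate each element with a value computed from the whole list.
zipOn :: ([a] -> [b]) -> [a] -> [(b, a)]
zipOn f xs = zip (f xs) xs
```

For example, `zipOn (scanl (+) 0)` pairs each element with the running sum of the elements before it.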

misc util