Karya, built on Mon Jul 24 11:39:07 PDT 2017 (patch 33511aca01257b76b88de7c7a2763b7a965c084e)

Safe Haskell: None

Derive.Deriver.Monad

Contents

Description

Implementation for the Deriver monad.

This module should contain only Deriver and the definitions needed to implement it. Unfortunately that winds up being quite a lot. Deriver carries a State, which is depended upon directly by all derivation, so it's just as complicated as derivation itself, which is to say, very.

Since State only really requires data types, the majority of this module is data declarations, with the exception of a few constructors which are intimately concerned with the type they are constructing. The library of functions to manipulate these types is split out into Derive.Deriver.Lib.

This module is way too big. Unfortunately it's hard to split up because of circular imports. Anyone who directly or indirectly needs Deriver (e.g. calls) needs to import Derive. However, anything directly or indirectly used by State must be imported by Derive. Since State is the central type that must hold anything that persists beyond the evaluation of a single note, that winds up being a lot. At one point I tried to rein in the madness with hs-boot files, but I decided that hs-boot was worse.

Synopsis

Deriver

modify :: (st -> st) -> DeriveM.Deriver st err () Source #

gets :: (st -> a) -> DeriveM.Deriver st err a Source #

put :: st -> DeriveM.Deriver st err () Source #

run :: st -> DeriveM.Deriver st err a -> DeriveM.RunResult st err a Source #
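
A minimal usage sketch of these primitives, using a tiny hypothetical state record; the real Deriver instantiates st with the State type documented below:

-- Hypothetical state type, used only to illustrate the primitives above.
data Counter = Counter { count :: !Int }

bump :: DeriveM.Deriver Counter err ()
bump = do
    n <- gets count        -- read a projection of the state
    put (Counter (n + 1))  -- write the whole state back

-- Running it from an initial state:
-- run (Counter 0) bump :: DeriveM.RunResult Counter err ()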

error

data CallError Source #

Constructors

TypeError !ErrorPlace !EvalSource !ArgName !ValType.Type !(Maybe BaseTypes.Val) !(Maybe Error)

ErrorPlace, EvalSource, arg name, expected type, received val, derive error. If the derive error is present, the type check required running a Deriver and the Deriver crashed.

CallNotFound !Expr.Symbol

Couldn't even call the thing because the name was not found.

ArgError !Text

Calling error that doesn't fit into the above categories.

data ErrorPlace Source #

Where a type error came from. The arg number starts at 0.

data EvalSource Source #

Constructors

Literal

The value in error came from a literal expression.

Quoted !BaseTypes.Quoted

The value in error came from a BaseTypes.VQuoted bit of code.

derived types

class (Show d, Taggable d) => Callable d where Source #

Each kind of deriver looks in a different scope for its calls. By making this a class method, I can figure out which scope to look in just from the type.

data Tagged Source #

This is for ctx_prev_val. Normally the previous value is available in all its untagged glory based on the type of the call, but ValCalls can occur with all the different types, so they need a tagged ctx_prev_val.

state

data State Source #

All the state available during derivation.

Constructors

State 

Fields

  • state_threaded :: !Threaded

    Threaded state means deriving one event depends on the results of the previous event. This corresponds to StateT.

  • state_dynamic :: !Dynamic

    This data is modified in a dynamically scoped way, for sub-derivations. This corresponds to ReaderT.

  • state_collect :: !Collect

    This data is mappended. It functions like an implicit return value. This corresponds to WriterT.

  • state_constant :: !Constant

    This data is constant throughout the derivation.
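
Taken together, the four fields above correspond roughly to a conventional monad transformer stack. The sketch below is an analogy only, assuming Collect's mappend is its Monoid; Deriver is implemented directly rather than built from transformers:

import Data.Functor.Identity (Identity)
import qualified Control.Monad.Trans.Reader as Reader
import qualified Control.Monad.Trans.State as State
import qualified Control.Monad.Trans.Writer as Writer

-- Analogy only: not how Deriver is actually defined.
type DeriverAnalogy a =
    Reader.ReaderT (Dynamic, Constant)          -- state_dynamic, state_constant
        (State.StateT Threaded                  -- state_threaded
            (Writer.WriterT Collect Identity))  -- state_collect
        a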

data Threaded Source #

State which is threaded linearly. This destroys the ability to parallelize derivation, so it's not so great. However, the only threaded state is state_prev_val, which is only needed within a track, so sibling tracks can still be parallelized.

Constructors

Threaded 

Fields

  • state_prev_val :: !(Map (BlockId, TrackId) Tagged)

    Keep track of the previous value for each track currently being evaluated. See NOTE [prev-val].

  • state_event_serial :: !Stack.Serial

    This is used for Stack.Serial to ensure a unique stack for multiple generator calls within a single track event. It's reset on the evaluation of each uninverted track event, and incremented after every Score.Event is emitted. See NOTE [event-serial] for history.

data Dynamic Source #

This is a dynamically scoped environment that applies to generated events inside its scope.

Constructors

Dynamic 

Fields

data Inversion Source #

When a note call inverts, it stashes its actual note-generating code so it can re-invoke track evaluation on the control tracks below it. It's kind of like saving a continuation.

Previously I did it by copying the text of the inverting call to the generated track. The problem was that I therefore had to keep the evaluated expression around in the call Context, and if I forgot to clear it in the right places things would be very confusing when a later inversion executed unexpected code. under_invert transforms are now also stored as code rather than data, in state_under_invert.

Constructors

NotInverted

Pre-inversion.

InversionInProgress !NoteDeriver

After inversion, but not yet at the bottom. The inverted generator is captured here.

strip_dynamic :: Dynamic -> Dynamic Source #

Strip out fields that I don't need to remember in a TrackDynamic.

If I don't do this, I get a memory leak. Presumably the cause is that state_pitch_map has an unevaluated pitch derivation, which in turn somehow retains the previous derivation, and then the previous, and so on. This makes each derivation leak more space.

initial_controls :: BaseTypes.ControlMap Source #

Initial control environment.

default_dynamic :: Signal.Y Source #

A default dynamic that's not 0 is useful because otherwise you have to add dyn to everything. Since control tracks multiply by default, 1 is the most convenient value.

scope

data Library Source #

This is the library of built-in calls. The prio_library Scope fields are imported from this.

Constructors

Library 

Fields

type Scopes = ScopesT GeneratorScope TransformerScope TrackScope (ScopePriority ValCall) Source #

This represents all calls in scope. Different types of calls are in scope depending on the track type, except ValCalls, which are in scope everywhere. This is dynamic scope, not lexical scope.

Perhaps this should be called Namespaces, but Id.Namespace is already taken and Scopes is shorter.

data ScopesT gen trans track val Source #

TODO this could probably now do with a more general name: maybe CallType for this, and CallKind for Scope? This is the arg type, Scope is the return type; or maybe TrackType.

Calls are in scope by expression position (generator, transformer, track, val) and then by track type (note, control, pitch). Expression position also determines the argument type (generator: nothing, transformer: deriver, track: TrackTree.EventsTree), while track type determines the return type (Deriver Note, Deriver Control, Deriver Pitch).

Val calls are special in that they always have the same type (Args -> Val), and are in scope in val call expression position for all track types.

names: EScope, TScope for ExpressionScope and TrackScope? ExprScope, TrackScope? I'd want to update the names in CallDoc too.

Constructors

Scopes 

Fields

Instances

Show InstrumentCalls # 
Show (CallMaps d) # 
(Monoid gen, Monoid trans, Monoid track, Monoid val) => Monoid (ScopesT gen trans track val) # 

Methods

mempty :: ScopesT gen trans track val #

mappend :: ScopesT gen trans track val -> ScopesT gen trans track val -> ScopesT gen trans track val #

mconcat :: [ScopesT gen trans track val] -> ScopesT gen trans track val #

(Pretty.Pretty gen, Pretty.Pretty trans, Pretty.Pretty track, Pretty.Pretty val) => Pretty.Pretty (ScopesT gen trans track val) # 

Methods

pretty :: ScopesT gen trans track val -> Text Source #

format :: ScopesT gen trans track val -> Doc Source #

formatList :: [ScopesT gen trans track val] -> Doc Source #

s_generator :: (:->) (ScopesT gen trans track val) gen Source #

s_transformer :: (:->) (ScopesT gen trans track val) trans Source #

s_track :: (:->) (ScopesT gen trans track val) track Source #

s_val :: (:->) (ScopesT gen trans track val) val Source #

data Scope note control pitch Source #

Constructors

Scope 

Fields

Instances

DeepSeq.NFData (Scope a b c) # 

Methods

rnf :: Scope a b c -> () #

(Pretty.Pretty note, Pretty.Pretty control, Pretty.Pretty pitch) => Pretty.Pretty (Scope note control pitch) # 

Methods

pretty :: Scope note control pitch -> Text Source #

format :: Scope note control pitch -> Doc Source #

formatList :: [Scope note control pitch] -> Doc Source #

s_note :: (:->) (Scope note control pitch) (ScopePriority note) Source #

s_control :: (:->) (Scope note control pitch) (ScopePriority control) Source #

s_pitch :: (:->) (Scope note control pitch) (ScopePriority pitch) Source #

newtype ScopePriority call Source #

An instrument or scale may put calls into scope. If that instrument or scale is replaced with another, the old calls must be replaced with the new ones.

Priority is determined by get_scopes, which returns them in the fields' declaration order.

The reason this can't be accomplished just by arranging imports in the right order is that when an instrument or scale comes into scope, it needs to replace existing instrument or scale calls. To do that, I need to keep each category separate. Also, this way I can import the ky file once at the toplevel, and it will still override library imported calls.

Constructors

ScopePriority (Map CallPriority [LookupCall call]) 

data CallPriority Source #

Constructors

PrioOverride

Override calls shadow all others. They're useful when you want to prevent instruments from overriding calls, which the lilypond deriver needs to do.

PrioInstrument

These are instrument-specific calls implicitly imported by note tracks.

PrioBlock

Block calls are local definitions, so they should override library calls, but are still below instrument calls. Otherwise, it's easy to define a block that shadows a drum stroke and get confused. TODO there could be a mechanism to set PrioOverride in case I actually do want to shadow an instrument call.

PrioScale

This is for value calls introduced by a scale. They are implicitly imported by pitch tracks.

PrioLibrary

Calls imported from the Library.

add_priority :: CallPriority -> LookupCall call -> ScopePriority call -> ScopePriority call Source #

Add this call at this level of priority. It will shadow existing calls with the same name.

replace_priority :: CallPriority -> [LookupCall call] -> ScopePriority call -> ScopePriority call Source #

Replace all calls at this level of priority.
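
A minimal sketch of how these two could be implemented over the ScopePriority newtype above; the real definitions live in this module and may differ in detail:

import qualified Data.Map as Map

-- Prepending means the new call shadows existing calls at this priority.
add_priority :: CallPriority -> LookupCall call -> ScopePriority call
    -> ScopePriority call
add_priority prio call (ScopePriority scopes) =
    ScopePriority (Map.insertWith (++) prio [call] scopes)

-- Drop whatever was at this priority and install the new calls.
replace_priority :: CallPriority -> [LookupCall call] -> ScopePriority call
    -> ScopePriority call
replace_priority prio calls (ScopePriority scopes) =
    ScopePriority (Map.insert prio calls scopes)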

data DocumentedCall Source #

This is like Call, but with only documentation. (name, CallDoc)

data LookupCall call Source #

Constructors

LookupMap !(Map Expr.Symbol call) 
LookupPattern !Text !DocumentedCall !(Expr.Symbol -> Deriver (Maybe call))

Text description of the Symbols accepted. The function is in Deriver because some calls want to look at the state to know if the Symbol is valid, e.g. block calls.
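
A sketch of consulting a single LookupCall for a symbol, following the two constructors above; the helper name is hypothetical:

import qualified Data.Map as Map

lookup_call_in :: Expr.Symbol -> LookupCall call -> Deriver (Maybe call)
lookup_call_in sym (LookupMap calls) = return (Map.lookup sym calls)
lookup_call_in sym (LookupPattern _description _doc lookup_sym) = lookup_sym sym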

TrackCall

track_call :: Module.Module -> CallName -> Tags.Tags -> Doc.Doc -> TrackCallFunc d -> TrackCall d Source #

constant

data Constant Source #

Values that don't change during one derive run.

Constructors

Constant 

Fields

data Mode Source #

Derivation can run in a few distinct modes.

Constructors

Normal

Standard derivation.

ScoreDurationQuery

This indicates that I'm running the deriver just to find out its duration. There's a hack in Derive.Eval that will fill in collect_score_duration when it sees this mode. More detail in CallDuration.

RealDurationQuery 
Lilypond !Lilypond.Types.Config

Emit events intended for the lilypond backend. Calls that have corresponding staff notation (e.g. trills) emit special events with attached lilypond code in this mode.

mergers :: Map Expr.Symbol (Merger Signal.Control) Source #

The built-in set of control Mergers.

merge_scale :: Merger Signal.Control Source #

Unlike the rest, this one is not associative.

instrument

data Instrument Source #

Mostly the deriver just deals with instruments as strings, and doesn't understand anything else about them. However, it does need a few other things, which are expressed here to avoid excessive dependencies between the systems.

Constructors

Instrument 

Fields

type InstrumentCalls = ScopesT [LookupCall (Generator Note)] [LookupCall (Transformer Note)] [LookupCall (TrackCall Note)] [LookupCall ValCall] Source #

Some ornaments only apply to a particular instrument, so each instrument can bring a set of note calls and val calls into scope, via the Scope type.

This is like CallMaps, except that it has ValCalls.

control

data Merge sig Source #

How to merge a control into Dynamic.

Constructors

DefaultMerge

Apply the default merge for this control.

Merge !(Merger sig)

Merge with a specific operator.

Instances

Show (Merge sig) # 

Methods

showsPrec :: Int -> Merge sig -> ShowS #

show :: Merge sig -> String.String #

showList :: [Merge sig] -> ShowS #

DeepSeq.NFData (Merge a) # 

Methods

rnf :: Merge a -> () #

Pretty.Pretty (Merge a) # 

data Merger sig Source #

Combine two signals. The element should be an identity, like mempty. ControlMod uses it to avoid affecting signal outside of the modified range. The merge function is not obliged to be associative, so this isn't actually a monoid. TODO it's all the fault of merge_scale... do I lose something important with associativity?

Constructors

Merger !Text !(sig -> sig -> sig) !sig

name, merge function, identity element

Set

Replace the existing signal.
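
As a sketch of the constructor's shape, here is a hypothetical additive merger over plain Doubles (the built-in mergers operate on Signal.Control; assumes OverloadedStrings for the Text name):

-- Hypothetical: name, merge function, and the identity for that function.
add_merger :: Merger Double
add_merger = Merger "add" (+) 0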

Instances

Show (Merger a) # 

Methods

showsPrec :: Int -> Merger a -> ShowS #

show :: Merger a -> String.String #

showList :: [Merger a] -> ShowS #

DeepSeq.NFData (Merger a) # 

Methods

rnf :: Merger a -> () #

Pretty.Pretty (Merger a) # 
ShowVal.ShowVal (Merger a) # 

Methods

show_val :: Merger a -> Text Source #

collect

data Collect Source #

These are things that collect throughout derivation, and are cached in addition to the derived values. Effectively they are extra return values, which are combined with mappend. So this is the WriterT part of State.

Constructors

Collect 

Fields

type SignalFragments = Map (BlockId, TrackId) (Map TrackTime Signal.Control) Source #

These are fragments of a signal, which will be later collected into collect_track_signals. This is part of a complicated mechanism to evaluate TrackSignals only once. When the sliced fragments of a track are evaluated, they collect signal fragments. When the track is fully evaluated, they are sorted and merged into collect_track_signals. If the track is then evaluated again, the monoid instance will discard the duplicate.

The signal fragments are indexed by the slice position. Since Signal.merge makes the earlier signals win in case of overlaps, this ensures a trimmed earlier fragment won't replace a more complete later one.

data ControlMod Source #

This is a hack so a call on a control track can modify other controls. The motivating case is pitch ornaments that also want to affect the dynamics. The modifications are a secondary return value from control and pitch calls. The track deriver will extract them and merge them into the dynamic environment. [NOTE control-modification]

type TrackDynamic = Map (BlockId, TrackId) Dynamic Source #

Snapshots of the environ at each track. This is used by the Cmd layer to figure out what the scale and instrument are for a given track.

Originally this was a map from Stacks to Environ (and only the changed parts). The idea was that I could walk up the stack to find the Environ value in scope at a given point, and given Stack.Region, could even get e.g. per event instruments. Unfortunately, while it's easy to do that on the Derive side, it seems really complicated and somewhat expensive to try to retrace a complete stack on every cmd. Since this implementation doesn't store the entire stack, a track with a different instrument at different times will wind up with the last one.

This is a much simpler solution which will hopefully work well enough in practice.

NOTE [record-track-dynamics] One complication is that when I get controls from sliced tracks, the controls are also sliced. But I need the environ from the inverted version of the track so the common case of [>i, *scale] gets the correct scale. So I record TrackDynamic for both inverted and non-inverted tracks and prefer the inverted tracks, but take controls from the non-inverted versions.

data CallDuration a Source #

This is the logical duration of a call. This may be different from its actual duration (which is to say, the end time of the last event it emits). Also, while most calls adjust their duration to the duration of the event they are called from, some of them have their own intrinsic duration. For example, a block call may stretch to its calling event's duration, but it also has its own duration that is used to align the block's end, or to sequence blocks.

Since the call duration is sometimes used to place the call in the first place (e.g. to align its end), I want to evaluate the minimum amount necessary to find the duration. The implementation is that each generator call has a gfunc_score_duration field. When Derive.Eval is evaluating a generator call, if it sees that state_mode is ScoreDurationQuery, instead of calling gfunc_f, it will call gfunc_score_duration and return the result via collect_score_duration. You shouldn't stick your fingers into this machinery, but instead use Derive.get_call_duration to do the gefingerpoken for you.

I'm not very happy with this implementation, but I tried several approaches and this is the only one that worked. Historical details are in NOTE [call-duration].

Constructors

Unknown 
CallDuration !a 

calls

type CallMaps d = ScopesT [LookupCall (Generator d)] [LookupCall (Transformer d)] [LookupCall (TrackCall d)] () Source #

Previously, a single Call contained both generator and transformer. This turned out to not be flexible enough, because an instrument that wanted to override a generator meant you couldn't use a transformer that happened to have the same name. However, there are a number of calls that want both generator and transformer versions, and it's convenient to be able to deal with those together.

call_map :: [(Expr.Symbol, call)] -> [LookupCall call] Source #

Make LookupCalls whose calls are all LookupMaps. The LookupMaps are all singletons, since names are allowed to overlap when declaring calls. It is only when they are imported into a scope that the maps are combined.
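
A minimal sketch of what this could look like, given the LookupCall constructors above; the real definition may differ:

import qualified Data.Map as Map

call_map :: [(Expr.Symbol, call)] -> [LookupCall call]
call_map = map (LookupMap . uncurry Map.singleton)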

call_maps :: [(Expr.Symbol, Generator d)] -> [(Expr.Symbol, Transformer d)] -> CallMaps d Source #

Bundle generators and transformers up together for convenience.

data Context val Source #

Additional data for a call. This part is invariant for all calls on an event.

The events are not used for transform calls.

TODO make separate types so the irrelevant data need not be passed?

Constructors

Context 

Fields

  • ctx_prev_val :: !(Maybe val)

    Hack so control calls have access to the previous sample, since they tend to want to interpolate from that value.

    This used to be the only way a call could get the previous value, but now if the prev val is unset, then "Derive.Args.prev_val" will evaluate ctx_prev_events. But checking ctx_prev_val is cheaper, so I'll keep it around. The evaluation fallback has to exist because track slicing may snip off the previous event.

    See NOTE [prev-val] in Derive.Args for details.

  • ctx_event :: !Event.Event
     
  • ctx_prev_events :: ![Event.Event]
     
  • ctx_next_events :: ![Event.Event]
     
  • ctx_event_end :: !ScoreTime

    The extent of the note past its duration. Since notes have decay, it's important to capture control for that. Normally this is the next event's start. If there's no next event because it's the last event of the block, this is the block end; otherwise, if there's no next event because it was sliced off, this is where that event would have started.

    This is the same as the first element of ctx_next_events except of course it has a value even when there is no next event.

  • ctx_track_shifted :: !TrackTime
  • ctx_sub_tracks :: !TrackTree.EventsTree

    The track tree below note tracks. Not given for control tracks. TODO should this be Either with ctx_sub_events? I don't think I ever need both set.

  • ctx_sub_events :: !(Maybe [[(ScoreTime, ScoreTime, NoteDeriver)]])

    If present, sub_events will directly return these sub events instead of slicing sub-tracks. Track evaluation will never set this, but calls can set this to reapply a note parent. It should be Events, but isn't to avoid circular imports.

  • ctx_track_type :: !(Maybe ParseTitle.Type)

    This is needed by val calls that want to evaluate events around them. Since val calls are the same on all track types, they need to know explicitly what the track type is to evaluate events on it.

Instances

Functor Context # 

Methods

fmap :: (a -> b) -> Context a -> Context b #

(<$) :: a -> Context b -> Context a #

Pretty.Pretty val => Pretty.Pretty (Context val) # 

Methods

pretty :: Context val -> Text Source #

format :: Context val -> Doc Source #

formatList :: [Context val] -> Doc Source #

ctx_track_range :: Context a -> (TrackTime, TrackTime) Source #

Range of the event in TrackTime.

dummy_context :: ScoreTime -> ScoreTime -> Text -> Context a Source #

Transformer calls don't necessarily apply to any particular event, and neither do generators for that matter.

tag_context :: Taggable a => Context a -> Context Tagged Source #

Tag the polymorphic part of the Context so it can be given to a ValCall. Otherwise, ValCall would have to be polymorphic too, which would make it hard to write generic ones.

data Call func Source #

A Call will be called as either a generator or a transformer, depending on its position. A call at the end of a compose pipeline will be called as a generator while ones composed with it will be called as transformers, so in a | b, a is a transformer and b is a generator.

More details on this strange setup are in the Derive.Call haddock.

Constructors

Call 

Fields

newtype CallName Source #

Each call has an intrinsic name. Since call IDs may be rebound dynamically, each call has its own name so that error msgs are unambiguous. It's also used along with ArgName for argument defaulting, so if you want that to work it should be short and parseable by p_unquoted_str. The name is not necessarily unique, and in fact may be intentionally non-unique in order to share defaults with another call.

The documentation for all calls that differ only in name is grouped together, so it's easier to read if small modifications are reflected in the name only. If you put invalid identifiers in the name, it can't be used to set default arguments.

Constructors

CallName Text 

newtype ArgName Source #

Each call argument has its own name, which is used for documentation as well as argument defaulting, as documented in Derive.Sig.

Constructors

ArgName Text 

data CallDoc Source #

Documentation for a call. The documentation is in markdown format, except that a single newline will be replaced with two, so a single \n is enough to start a new paragraph. Also, single quotes are turned into links as per Util.TextUtil.haddockUrl.

data ArgParser Source #

These enumerate the different ways an argument can be parsed, and correspond to parsers in Derive.Sig.

data EnvironDefault Source #

This configures how an argument looks for a default in the environ.

Constructors

None

Don't default from environ at all.

Prefixed

Look for callname-argname.

Unprefixed

Look for argname. This is useful for generic parameters that should configure many calls simultaneously.

Both

First look for a prefixed key, then for an unprefixed one.
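
As a hypothetical illustration, for a call named tr with an argument speed, each variant would consult these environ keys (assumes OverloadedStrings; argument defaulting itself is handled in Derive.Sig):

environ_keys :: EnvironDefault -> [Text]
environ_keys deflt = case deflt of
    None -> []                     -- never consult the environ
    Prefixed -> ["tr-speed"]       -- callname-argname
    Unprefixed -> ["speed"]        -- argname only
    Both -> ["tr-speed", "speed"]  -- prefixed first, then unprefixed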

type WithArgDoc f = (f, [ArgDoc]) Source #

A value annotated with argument docs. This is returned by the functions in Derive.Sig, and accepted by the Call constructors here.

data PassedArgs val Source #

Data passed to a Call.

Constructors

PassedArgs 

Fields

generator

generator :: Module.Module -> CallName -> Tags.Tags -> Doc.Doc -> WithArgDoc (GeneratorF d) -> Generator d Source #

Create a generator that expects a list of derived values (e.g. Score.Event or Signal.Control), with no logs mixed in. The result is wrapped in LEvent.Event.

generator_events :: Module.Module -> CallName -> Tags.Tags -> Doc.Doc -> WithArgDoc (PassedArgs d -> Deriver [d]) -> Generator d Source #

Make a generator from a function which returns events in sorted order. TODO this just trusts that the events will be sorted. Is there a safer way?

generator1 :: Module.Module -> CallName -> Tags.Tags -> Doc.Doc -> WithArgDoc (PassedArgs d -> Deriver d) -> Generator d Source #

Since Signals themselves are collections, there's little reason for a signal generator to return a Stream of events. So wrap the generator result in a Stream singleton.

TODO call this signal_generator?
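
As a sketch of the plumbing, the WithArgDoc pair can be written out by hand (real calls obtain it from Derive.Sig, which also fills in the ArgDocs). The helper below is hypothetical and assumes Tags.Tags has a Monoid instance for the empty tags:

-- Hypothetical helper: wrap a function producing one derived value into a
-- Generator, with no tags and no documented arguments.
simple_generator :: Module.Module -> CallName -> Doc.Doc
    -> (PassedArgs d -> Deriver d) -> Generator d
simple_generator module_ name doc make =
    generator1 module_ name mempty doc (make, [])  -- (f, [ArgDoc]) is WithArgDoc f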

with_score_duration :: (PassedArgs d -> Deriver (CallDuration ScoreTime)) -> Generator d -> Generator d Source #

Set the gfunc_score_duration field to get ScoreTime CallDuration.

transformer

type TransformerF d = PassedArgs d -> Deriver (Stream.Stream d) -> Deriver (Stream.Stream d) Source #

args -> deriver -> deriver

transformer :: Module.Module -> CallName -> Tags.Tags -> Doc.Doc -> WithArgDoc (TransformerF d) -> Transformer d Source #

Just make_call with a more specific signature.

val

cache types

The cache types are nominally exported from Derive.Cache, but must be defined here to avoid circular dependencies.

newtype Cache Source #

Constructors

Cache (Map CacheKey Cached) 

newtype CacheKey Source #

Ideally, the cache would be keyed by all data that can affect derivation, which would mean all of Dynamic and Threaded. Effectively a deriver is a function that takes State as its input, and this would be memoizing that function. But in practice, there's too much junk in there, so I have to do an approximation.

The first approximation is the stack, which is a proxy for the things that are likely to affect derivation. Different calls in the stack are likely to result in a different environment, or a different Stack.Region likely means a different warp. Stack.Serial attempts to ensure that multiple generators within a single event also have unique stacks.

Constructors

CacheKey 

data Cached Source #

When cache entries are invalidated by ScoreDamage, a marker is left in their place. This is just for a nicer log msg that can tell the difference between never evaluated and damaged.

Constructors

Cached !CacheEntry 
Invalid 

data CacheEntry Source #

Since an entire track is one type but will have many different calls of different types, the deriver type division goes above the call type division.

data CallType d Source #

The type here should match the type of the stack it's associated with, but I'm not quite up to those type gymnastics yet.

Constructors

CallType !Collect !(Stream.Stream d) 

Instances

DeepSeq.NFData d => DeepSeq.NFData (CallType d) # 

Methods

rnf :: CallType d -> () #

damage

data ScoreDamage Source #

Modified ranges in the score.

Constructors

ScoreDamage 

Fields

newtype ControlDamage Source #

Control damage indicates that a section of control signal has been modified. It's dynamically scoped over the same range as the control itself, so that events that depend on it can be rederived.

util

scale

data Scale Source #

Constructors

Scale 

Fields

newtype LookupScale Source #

A scale can configure itself by looking in the environment and by looking up other scales.

type Transpose = Transposition -> BaseTypes.Environ -> Pitch.Step -> Pitch.Pitch -> Either BaseTypes.PitchError Pitch.Pitch Source #

Scales may ignore Transposition if they don't support it.

Transposition could almost always succeed, leaving the error reporting to scale_show. But for some scales it has to parse the Pitch.Key from the environ, which can fail. Parsing the key is pretty unfortunate, since it winds up getting repeated for scale_read and scale_show, but I don't want to make the Key type concrete, since each scale has a different one.

TODO could make the key an existential type and export scale_parse_key?

type Enharmonics = BaseTypes.Environ -> Pitch.Note -> Either BaseTypes.PitchError [Pitch.Note] Source #

Get the enharmonics of the note. The given note is omitted, and the enharmonics are in ascending order until they wrap around, so if you always take the head of the list you will cycle through all of the enharmonics.
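
A sketch of walking that cycle by repeatedly taking the head, as described above; the helper name is hypothetical and the take 4 is arbitrary:

some_enharmonics :: Enharmonics -> BaseTypes.Environ -> Pitch.Note -> [Pitch.Note]
some_enharmonics enharmonics env start = take 4 (go start)
    where
    -- Each step asks the scale for the enharmonics of the previous note and
    -- keeps only the first one, so the results walk around the cycle.
    go note = case enharmonics env note of
        Right (next : _) -> next : go next
        _ -> []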

type Layout = Vector.Unboxed.Vector Pitch.Semi Source #

The number of chromatic intervals between each Pitch.PitchClass, starting from 0, as returned by scale_read. The length is the number of degrees per octave. A diatonic-only scale will have all 1s, and a scale without octaves has an empty layout.

This is analogous to Layout, but is intended to be a minimal implementation that all scales can export, without having to support the full complexity of a chromatic scale.

Combined with scale_read and scale_show, I can use this to do math on scale degrees.
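
For example, a 7-degree scale laid over 12 chromatic steps with major-scale spacing could export the layout below. This is a hypothetical value, assuming Pitch.Semi is numeric and unboxable (which the Layout synonym above already implies):

import qualified Data.Vector.Unboxed as Vector.Unboxed

-- Hypothetical example: 7 degrees per octave with major-scale spacing.
-- A purely diatonic scale would be all 1s; a scale without octaves would
-- be the empty vector.
major_layout :: Layout
major_layout = Vector.Unboxed.fromList [2, 2, 1, 2, 2, 2, 1]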

merge

testing

invalidate_damaged :: ScoreDamage -> Cache -> Cache Source #

Clear the damaged portions out of the cache so they will rederive.

Block damage also clears track caches that are on that block.