Yes, this is somewhat similar to but also distinct from the video signal flow diagram post.
When I design systems for working with continuous signals, I like to define a set of primitives and a set of operations that either map primitives to themselves or to another primitive. When working with video signals, these are the basic things I think about.
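To make the primitives and operations concrete for the discussion below, here's a minimal TypeScript sketch. All of these names are made up for illustration, and the representations (a frame as a plain 2D array, channels normalized to 0–1) are simplifying assumptions, not a real implementation:

```typescript
// Hypothetical primitives, as plain data:
type Pixel = { r: number; g: number; b: number };        // color space data, 0..1
type Frame = Pixel[][];                                  // 2D array of pixels
type FrameRate = number;                                 // frames per second

// Operations map a primitive to itself or to another primitive:
type PixelOp = (p: Pixel) => Pixel;                      // Pixel -> Pixel
type FrameOp = (f: Frame) => Frame;                      // Frame -> Frame
type Sampler = (f: Frame, x: number, y: number) => Pixel; // Frame -> Pixel

// Lifting a pixel operation to a frame operation is itself an operation:
const liftToFrame = (op: PixelOp): FrameOp =>
  (frame) => frame.map((row) => row.map(op));
```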
Frame Rates: these can be modified directly, indirectly via sampling and holding (strobe), or globally by changing settings on monitors and in the signals themselves. Different frame rates can have significantly or subtly different effects depending on what kind of signals and what kind of displays you are working with.
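Here's a rough sketch of what I mean by sample-and-hold on the frame stream, using the `Frame` type from the sketch above (`holdEvery` is a made-up name, and this assumes a simple pull-per-frame model):

```typescript
// Sample-and-hold / strobe: latch every Nth incoming frame and repeat it,
// effectively dividing the frame rate of the stream.
function holdEvery(n: number): (incoming: Frame) => Frame {
  let held: Frame | null = null;
  let count = 0;
  return (incoming) => {
    if (count % n === 0) held = incoming; // sample: latch a new frame
    count += 1;
    return held ?? incoming;              // hold: repeat the latched frame
  };
}

// e.g. a 60 fps stream passed through holdEvery(4) only updates its visible
// content at 15 fps, producing the strobe-like stepping described above.
```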
Frames: arrays of pixel data held in some combination of continuous and discrete form. They get drawn to a screen, sent into some kind of processor or filter, can be mixed with other frames, can have geometric operations (both linear and nonlinear) performed on them, and can be stored in a buffer.
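Two of those frame operations sketched out, again with illustrative names and assuming both frames share the same dimensions:

```typescript
// Mixing: (Frame, Frame) -> Frame, a per-pixel crossfade with mix amount t.
const mixPixel = (a: Pixel, b: Pixel, t: number): Pixel => ({
  r: a.r * (1 - t) + b.r * t,
  g: a.g * (1 - t) + b.g * t,
  b: a.b * (1 - t) + b.b * t,
});

const mix = (a: Frame, b: Frame, t: number): Frame =>
  a.map((row, y) => row.map((p, x) => mixPixel(p, b[y][x], t)));

// Geometry: Frame -> Frame, here a linear transform (mirror along x).
const flipX = (f: Frame): Frame => f.map((row) => [...row].reverse());

// A nonlinear geometric operation would instead remap coordinates through a
// nonlinear function, e.g. sampling source positions warped by a sine.
```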
Pixels: this is where it gets interesting to contrast analog and digital signals. There is some degree of both continuity and discreteness to individual “pixel” data in an analog signal. Either way, the important things about a pixel are that it has an x/y location in a frame and that it contains color space data. You can perform color space transformations on a pixel.
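For example, a couple of color space operations as `PixelOp`s. Note that these touch one pixel at a time with no neighbors, which is exactly what separates them from convolution below:

```typescript
// Invert each channel (assuming values normalized to 0..1):
const invert: PixelOp = (p) => ({ r: 1 - p.r, g: 1 - p.g, b: 1 - p.b });

// Collapse to luma using Rec. 601 weights:
const toLuma: PixelOp = (p) => {
  const y = 0.299 * p.r + 0.587 * p.g + 0.114 * p.b;
  return { r: y, g: y, b: y };
};
```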
I think that rates, frames, and pixels make up one class of primitives. The basic operations are pretty clear from the descriptions above. The only thing I want to note specifically about operations is that I think of ‘filters’ in video space as being different from color space operations, because a filter is a convolution, which means it is an operation that takes a group of pixels and returns one pixel (see the sketch below). Certain kinds of effects that are similar to filters can certainly be done as individual pixel operations, so maybe ‘filter’ is misleading in this context and we should just say ‘convolution’ instead?
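A convolution sketch to make the distinction concrete. Each output pixel is computed from a weighted neighborhood of input pixels, so unlike the color space operations above it cannot be expressed as a `PixelOp`. Edge handling here (clamping) is just one simplifying choice:

```typescript
// Convolve a frame with a square kernel (e.g. 3x3); edges are clamped.
function convolve(frame: Frame, kernel: number[][]): Frame {
  const h = frame.length, w = frame[0].length;
  const k = kernel.length, half = Math.floor(k / 2);
  const clamp = (v: number, max: number) => Math.min(Math.max(v, 0), max);
  return frame.map((row, y) =>
    row.map((_, x) => {
      let r = 0, g = 0, b = 0;
      for (let ky = 0; ky < k; ky++) {
        for (let kx = 0; kx < k; kx++) {
          // Group of pixels in: one pixel out.
          const p = frame[clamp(y + ky - half, h - 1)][clamp(x + kx - half, w - 1)];
          const wgt = kernel[ky][kx];
          r += p.r * wgt; g += p.g * wgt; b += p.b * wgt;
        }
      }
      return { r, g, b };
    })
  );
}

// e.g. a box blur:
// convolve(frame, [[1,1,1],[1,1,1],[1,1,1]].map(row => row.map(v => v / 9)))
```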
I don’t want to go too much in depth about oscillators, partially because I don’t have that much personal experience working with video oscillators, and also because I think they belong to a different class of primitives than rates, frames, and pixels. There is one thing I wanted to start a discussion about, though. I was speaking with @kevinkripper and a bunch of other folks about this, and maybe @ojack, @schaferob and lone V can add some thoughts as well.

Video oscillators have an extra dimension to think about in contrast to audio oscillators. Terminology for audio oscillators is pretty standardized: frequency, amplitude, and phase all translate pretty well (well, phase seems to get abstracted all over the place, but shrug). In a video oscillator, on the other hand, you have frequency in terms of how many periods the oscillator completes within one frame, and you also have the rate of change between frames. It seems like everyone who works with video oscillators has a slightly different way to refer to rates of change within a frame versus rates of change over time. I was curious if anyone would be interested in talking about a useful standard that would make crossing over from analog video osc to vsynth to hydra to whatever an easier translation, to help folks communicate interesting patch ideas back and forth!
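To pin down the two dimensions I mean, here's a sketch with two deliberately separate parameters. The names `spatialFreq` and `temporalRate` are made up for this discussion, not anyone's standard:

```typescript
// A video oscillator with two distinct frequency-like parameters:
//   spatialFreq  - how many periods fit across one frame (space)
//   temporalRate - how far the phase advances per frame (time)
function videoOsc(spatialFreq: number, temporalRate: number) {
  return (width: number, height: number, frameIndex: number): Frame => {
    const phase = frameIndex * temporalRate; // rate of change between frames
    return Array.from({ length: height }, () =>
      Array.from({ length: width }, (_, x) => {
        // Periods across the frame (varying along x only, for brevity):
        const v = 0.5 + 0.5 * Math.sin(2 * Math.PI * (spatialFreq * x / width + phase));
        return { r: v, g: v, b: v };
      })
    );
  };
}

// spatialFreq = 4, temporalRate = 0 gives 4 static bars; raising temporalRate
// scrolls them. If I understand hydra correctly, these roughly correspond to
// osc()'s frequency and sync parameters, but that mapping is exactly the kind
// of thing I'd want the terminology discussion to settle.
```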