conditioned inference - archive

Production of simulacra

Simulators predict probability distributions. Simulacra are trajectories of words. An operation is needed to yield the latter from the former.

First, the operation is presented with minimal interpretation. See if an analogy comes to mind. Then I will give analogies.

(there are variations on this way of generating trajectories ... but this is a classic (in some ways ideal) setup)

Context is input to the model. The model outputs weights (probabilities) for each possible token. A token is sampled from the output probability distribution and appended to the context. Repeat.
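
A minimal sketch of this loop, assuming a causal language model served through Hugging Face transformers with GPT-2 (the library and model are my choices for illustration; the text does not specify any):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

# The context: a sequence of tokens (the prompt is an arbitrary example).
context = tokenizer("The simulacrum said", return_tensors="pt").input_ids

with torch.no_grad():
    for _ in range(20):
        logits = model(context).logits[:, -1, :]               # weights for each possible next token
        probs = torch.softmax(logits, dim=-1)                  # normalize into a probability distribution
        next_token = torch.multinomial(probs, num_samples=1)   # sample one token from the distribution
        context = torch.cat([context, next_token], dim=-1)     # append to the context and repeat

print(tokenizer.decode(context[0]))
```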

Interpretation: Stochastic time evolution operator and wavefunction collapse

The context could be thought of as a state or configuration which is evolved by the simulator’s stochastic dynamical law.

After each “propagation” step, a random “measurement” is made from the output probability distribution. A single token is selected - wavefunction collapse. The determinate value is added to the context to create the next state.
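
The same loop restated abstractly in propagate/collapse terms; all function names here are illustrative, not from the text, and the toy dynamical law is a stand-in for the simulator:

```python
import random

def propagate(state, law):
    # Apply the stochastic dynamical law: a distribution over next tokens given the state.
    return law(state)

def collapse(distribution):
    # "Measure" the distribution: sample a single determinate token.
    tokens, probs = zip(*distribution.items())
    return random.choices(tokens, weights=probs, k=1)[0]

def evolve(state, law, steps):
    # Alternate propagation and collapse, appending each measured token to the state.
    for _ in range(steps):
        state = state + (collapse(propagate(state, law)),)
    return state

def toy_law(state):
    # Ignores history for brevity; a real simulator conditions on the whole state.
    return {"the": 0.5, "cat": 0.3, "sat": 0.2}

print(evolve(("<start>",), toy_law, steps=5))
```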

Natural language version of https://github.com/mxgmn/WaveFunctionCollapse [but without the lowest-entropy heuristic: the next token in sequence is always the one collapsed next]