IO monad realized in 1965

 

 

Introduction

Monads turned out to be a convenient tool for representing and reasoning about computational effects -- such as IO, mutable state and exceptions -- in lambda calculi. In particular, monads let us clearly separate evaluation-order--insensitive simplifications from sequentially performed executions [Moggi-Fagorzi-2003].

The exponential explosion of `do-it-yourself' monad tutorials prompts the thought that monads, like the related continuations [Reynolds-1993], may be destined to be rediscovered time and time again. One naturally wonders what the first `monad tutorial' was.

An old (1994) paper on category theory monads and functional programming included a relevant historical side-note. It turns out that the essence of monads was fully grasped back in 1965, by at least one person. That person also discovered that imperative, control-flow--dependent computations can be embedded into a calculus by turning control flow into data flow. That person is Peter Landin. His 1965 paper [Landin-1965] anticipated not only the IO monad but also the State and Writer monads, call/cc, delayed evaluation and its connection with streams.

References
The IO monad is 45 years old
An early version of this essay, posted on the Haskell-Cafe mailing list on Wed, 29 Dec 2010 01:13:13 -0800 (PST)

 

The first monad tutorial and the `programmable semicolon'

Here is the historical aside, cited from [Hill-Clarke-1994] (Sec 3):
Monads are typically equated with single-threadedness, and are therefore used as a technique for incorporating imperative features into a purely functional language. Category theory monads have little to do with single-threadedness; it is the sequencing imposed by composition that ensures single-threadedness. In a Wadler-ised monad this is a consequence of bundling the Kleisli star and flipped compose into the bind operator.

There is nothing new in this connection. Peter Landin in his Algol 60 used functional composition to model semi-colon. Semi-colon can be thought of as a state transforming operator that threads the state of the machine throughout a program. The work of Peyton-Jones and Wadler has turned full circle back to Landin's earlier work as their use of Moggi's sequencing monad enables real side-effects to be incorporated into monad operations such as print. This is similar to Landin's implementation of his sharing machine where the assignandhold function can side-effect the store of the sharing machine because of the sequencing imposed by functional composition.

Landin defined that `Imperatives are treated as null-list producing functions' [In Landin's paper, () is the syntactic representation of the empty list]. The assignandhold imperative is subtly different in that it enables Algol's compound statements to be handled. The function takes a store location and a value as its argument, and performs the assignment to the store of the sharing machine, returning the value assigned as a result of the function. Because Landin assumed applicative order reduction, the K combinator was used to return (), and the imperative was evaluated as a side effect by the unused argument of the K-combinator. Statements are formed by wrapping such an imperative in a lambda expression that takes () as an argument. Two consecutive Algol-60 assignments would be encoded in the lambda calculus as:
    Algol 60        Lambda Calculus
    x:=  2;         ( (\() -> K () (assignandhold x 2)) .
    x:= -3;           (\() -> K () (assignandhold x (-3))) ) ()

By using a lambda with () as its parameter, () can be thought of as the `state of the world' that is threaded throughout a program by functional composition.
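The encoding above can be transcribed into modern Haskell. The following is a hedged sketch, not Landin's own code: `assignandhold`, the strict `k` combinator, and the use of an IORef with unsafePerformIO are my devices for mimicking the sharing machine's call-by-value side effects, and `semi` reads Landin's informally infixed dot as left-to-right ("semicolon") composition.

```haskell
import Data.IORef
import System.IO.Unsafe (unsafePerformIO)

-- Strict K combinator: returns its first argument but, as under
-- call-by-value, forces the second, so its side effect is performed.
k :: a -> b -> a
k x y = y `seq` x

-- Assign v to location r, side-effecting the store, and return v.
-- unsafePerformIO stands in for the sharing machine's real effects;
-- NOINLINE (and compiling without optimizations) keeps the sketch honest.
assignandhold :: IORef a -> a -> a
assignandhold r v = unsafePerformIO (writeIORef r v >> pure v)
{-# NOINLINE assignandhold #-}

-- `Semicolon' as left-to-right functional composition of statements.
semi :: (a -> b) -> (b -> c) -> (a -> c)
semi f g = g . f

-- x := 2;  x := -3   as a composition of ()-transformers
program :: IORef Int -> () -> ()
program x = (\() -> k () (assignandhold x 2))
            `semi` (\() -> k () (assignandhold x (-3)))

main :: IO ()
main = do
  x <- newIORef 0
  program x () `seq` (readIORef x >>= print)
```

Forcing `program x ()` pattern-matches each statement on the () produced by its predecessor, so the assignments run in order and the final store holds -3 -- sequencing by data flow, exactly as in the Hill and Clarke rendering.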

 

Landin's IO monad in modern eyes

The ()-passing trick is described in full on p.100 of Landin's paper, with remarkable clarity:
Statements. Each statement is rendered as a 0-list-transformer, i.e. a none-adic function producing the nullist for its result. It achieves by side-effects a transformation of the current state of evaluation. ... Compound statements are considered as functional products (which we indicate informally by infixed dots).

Contrast that with a more recent explanation [Peyton-Jones-2000]:

A value of type IO a is an "action" that, when performed, may do some input/output, before delivering a value of type a. This is an admirably abstract statement, and I would not be surprised if it means almost nothing to you at the moment. So here is another, more concrete way of looking at these "actions": type IO a = World -> (a, World) This type definition says that a value of type IO a is a function that, when applied to an argument of type World, delivers a new World together with a result of type a. The idea is rather program-centric: the program takes the state of the entire world as its input, and delivers a modified world as a result, modified by the effects of running the program. [p5]

Simon Peyton-Jones then goes on to explain [p14] that the functional world-passing realization of the IO monad is indeed the implementation of the monad in GHC. The IO monad operation (>>) is then functional composition -- just as it is in Landin's representation.
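The world-passing representation is easy to write down. This is a toy sketch with hypothetical names (IO', tick); in GHC the World token is abstract and zero-width, whereas here an Int counter stands in for it so the threading can be observed:

```haskell
-- World-passing model of IO: an action is a World transformer that
-- also delivers a result. The Int counter is a visible stand-in for
-- GHC's abstract World token.
newtype World = World Int deriving Show
newtype IO' a = IO' { runIO' :: World -> (a, World) }

returnIO :: a -> IO' a
returnIO a = IO' (\w -> (a, w))

bindIO :: IO' a -> (a -> IO' b) -> IO' b
bindIO m f = IO' (\w -> let (a, w') = runIO' m w in runIO' (f a) w')

-- (>>): for unit-valued actions this is just composition of the
-- underlying World -> World functions, as in Landin's encoding.
thenIO :: IO' a -> IO' b -> IO' b
thenIO m n = m `bindIO` const n

-- A sample primitive that visibly transforms the world.
tick :: IO' Int
tick = IO' (\(World n) -> (n, World (n + 1)))
```

For example, runIO' (tick `thenIO` tick) (World 0) yields (1, World 2): the first tick transforms the world, the second observes the transformed world, and sequencing falls out of the composition.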

There are some differences: World is abstract in GHC but is represented as the empty list () in Landin's paper. Whereas GHC threads the World through, Landin's encoding discards and re-generates the () token. GHC uses a non-strict evaluation strategy whereas Landin relied on call-by-value: although the K combinator discards the value of its second argument, the side-effects of its evaluation are performed nonetheless.

The differences are, however, superficial. Landin could have hidden the concrete `world' representation and the K-combinator trick by defining assignandhold' x v = \() -> K () (assignandhold x v) and using it as a primitive. The sequencing of compound statements is ensured by functional composition, as data flow, and hence does not depend on the evaluation strategy.

Strictly speaking, we ought to treat World as an abstract data type, to prevent the compiler from optimizing the composition of two World -> World functions and reordering computations. We also need the additional constraint that the World be used linearly, so that the compiler is not tempted to replicate World-producing computations. Landin's calculus, induced by his sharing machine, ensures these constraints. In GHC, World is treated specially and the duplication of redexes is prevented by ad hoc rules in the optimizer.

Incidentally, call-by-name and call-by-value were introduced ten years later, in [Plotkin-1975], in the context of Landin's ISWIM. Both evaluation strategies have the notion of value and both take lambda-abstraction to be a value. The body of a function is evaluated only when the function is applied to an argument. Landin understood it back in 1965: ``the use of lambda, and in particular (to avoid an irrelevant bound variable) of lambda (), to delay and possibly avoid evaluation is exploited repeatedly in our model of Algol 60. A function that requires an argument-list of length zero is called a non-adic function'' [p90]. We now call that a thunk.
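Landin's lambda-() thunk is a one-liner. In Haskell, which is already non-strict, the wrapper is redundant, but it makes the delaying explicit; Thunk, delay and force are illustrative names of my own, not Landin's:

```haskell
-- A "none-adic function": a computation delayed behind a lambda ().
type Thunk a = () -> a

delay :: a -> Thunk a    -- under call-by-value, x is not yet evaluated here
delay x = \() -> x

force :: Thunk a -> a    -- evaluation happens only on application to ()
force t = t ()

-- Evaluation may be avoided altogether: this diverging computation
-- is never run if its thunk is never forced.
bottom :: Int
bottom = bottom

useUnlessNeeded :: Thunk Int -> Int
useUnlessNeeded _ = 42
```

Thus useUnlessNeeded (\() -> bottom) returns 42: the delayed body is discarded unevaluated, which is exactly the "delay and possibly avoid evaluation" Landin describes.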

 

Landin's other discoveries

Peter Landin's 1965 paper contained many other discoveries. First, the reader will notice the where notation. Peter Landin even anticipated the debate on let vs where, saying ``The only consideration in choosing between let and where will be the relative convenience of writing an auxiliary definition before or after the expression it qualifies.''

Another remark anticipated the State and Writer monads as well:

However, input/output devices can be modeled as named lists, with special, rather restricted functions associated. ... Writing is modeled by a procedure that operates on a list, and appends a new final segment derived from other variables. (Alternatively, a purely functional approach can be contrived by including the transformed list among the results.)
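That `purely functional approach' -- returning the transformed output list among the results -- is precisely the Writer monad. A minimal sketch, with hypothetical names of my own (Out, write, bindOut):

```haskell
-- Output-accumulating computations: the list written so far is
-- carried along as part of the result, as Landin suggests.
newtype Out a = Out { runOut :: (a, [String]) }

write :: String -> Out ()
write s = Out ((), [s])

returnOut :: a -> Out a
returnOut a = Out (a, [])

bindOut :: Out a -> (a -> Out b) -> Out b
bindOut m f = let (a, w)  = runOut m
                  (b, w') = runOut (f a)
              in Out (b, w ++ w')   -- append the new final segment
```

For instance, runOut (write "a" `bindOut` \() -> write "b") yields ((), ["a","b"]): sequencing again appears as the composition that threads (here, grows) the output list.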

In the section on Streams, Landin wrote:

This correspondence [laws of head/tail/cons] serves two related purposes. It enables us to perform operations on lists (such as generating them, mapping them, concatenating them) without using an `extensive,' item-by-item representation of the intermediately resulting lists; and it enables us to postpone the evaluation of the expressions specifying the items of a list until they are actually needed. The second of these is what interests us here.... [and footnote 6] It appears that in stream-transformers we have a functional analogue of what Conway calls `co-routines.'
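Landin's streams, with their item-by-item representation avoided by thunking the tail, can be sketched directly; the names below are mine:

```haskell
-- A stream: a head available now, and a tail delayed behind lambda ().
data Stream a = SCons a (() -> Stream a)

shead :: Stream a -> a
shead (SCons x _) = x

stail :: Stream a -> Stream a
stail (SCons _ t) = t ()    -- only here is the next item computed

-- Generating: the naturals from n, with no extensive representation.
nats :: Int -> Stream Int
nats n = SCons n (\() -> nats (n + 1))

-- Mapping, without materializing intermediate lists.
smap :: (a -> b) -> Stream a -> Stream b
smap f (SCons x t) = SCons (f x) (\() -> smap f (t ()))

stake :: Int -> Stream a -> [a]
stake n s | n <= 0    = []
          | otherwise = shead s : stake (n - 1) (stail s)
```

Here stake 3 (smap (*2) (nats 0)) evaluates to [0,2,4], computing only the three items demanded -- the postponement Landin singles out as "what interests us here."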

 
