Most Influential ICFP Paper Award


Presented annually to the author(s) of a paper presented at the ICFP held 10 years prior to the award year. The award includes a prize of $1,000 to be split among the authors of the winning paper. The papers are judged by their influence over the past decade. This award was initiated in 2007 (for 1997).

Selection Committee
The award given in year N is for the most influential paper presented at the conference held in year N-10. The selection committee consists of the following members:

  • the current SIGPLAN Chair, ex officio,
  • a member of the SIGPLAN EC appointed as committee Chair by the SIGPLAN Chair,
  • the General Chair and Program Chair for ICFP N-10,
  • the General Chair and Program Chair for ICFP N-1, and
  • a member of the SIGPLAN EC appointed by the committee Chair.

The SIGPLAN Chair shall also adjudicate conflicts of interest, appointing substitutes to the committee as necessary.

Recipients

2014 Ralf Lämmel and Simon Peyton Jones

(for 2004) Scrap More Boilerplate: Reflection, Zips, and Generalised Casts

Citation:

In their TLDI 2003 paper, Lämmel and Peyton Jones introduced “Scrap Your Boilerplate” (SYB), a simple but powerful design pattern for generic programming in Haskell. SYB introduced a library of strategic combinators which one could use to compose generic custom traversals of complex data structures, such as syntax trees or XML documents. It was not initially clear, however, whether SYB would scale to handle more traditional applications of generic programming, such as serialization, deserialization, and generic equality, which require a form of reflection. The authors’ subsequent ICFP 2004 paper significantly extended their original SYB library with support for reflection, zips, and generalised casts, thus demonstrating the applicability of SYB to a much broader range of generic programming tasks. This broad applicability, combined with the ease with which programmers could pick up and use the library, led to its adoption in numerous Haskell developments, and contributed greatly to the burgeoning interest in statically typed, practically usable libraries for generic programming over the past decade.
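To give a feel for the style of programming the library enabled, here is a minimal sketch (our own example, not the paper's) using the Data.Generics interface of the syb package that grew out of this line of work; the Expr type and function names are purely illustrative:

    {-# LANGUAGE DeriveDataTypeable #-}
    import Data.Generics  -- from the "syb" package

    -- A small expression type with a derived Data instance.
    data Expr = Lit Int
              | Add Expr Expr
              | Neg Expr
              deriving (Show, Data, Typeable)

    -- One generic traversal rewrites every literal, wherever it occurs,
    -- with no per-constructor boilerplate.
    incLits :: Expr -> Expr
    incLits = everywhere (mkT bump)
      where
        bump :: Expr -> Expr
        bump (Lit n) = Lit (n + 1)
        bump e       = e

    main :: IO ()
    main = do
      print (incLits (Add (Lit 1) (Neg (Lit 2))))
      -- Reflection-style operations in the spirit of the 2004 extensions:
      putStrLn (gshow (Add (Lit 1) (Lit 2)))  -- generic show
      print (geq (Lit 3) (Lit 3))             -- generic equality

A single everywhere (mkT ...) call replaces hand-written recursion over every constructor, while gshow and geq illustrate the reflection-flavoured operations the 2004 paper added.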

2013 Didier Le Botlan and Didier Rémy

(for 2003) MLF: Raising ML to the Power of System F

Citation:

Le Botlan and Rémy’s paper on MLF presented a major technical breakthrough in type inference for first-class polymorphism. Remarkably, by going from ML to a system that is more expressive than System F, principal types could be recovered. It was the first paper, and remains the only one to date, to present a type inference and unification algorithm with all the “good” features of ordinary ML inference and unification, but over a richer universe of types and constraints that allows impredicative instantiation. The MLF system enjoys most general unifiers, completeness properties, and robustness under program transformations, with user type annotations providing the oracles needed to reach the expressiveness of System F and beyond. The MLF paper has been highly influential on subsequent work in type inference for first-class polymorphism, and is a must-read for anyone interested in the topic.

2012 Robert Findler and Matthias Felleisen

(for 2002) Contracts for higher-order functions

Citation:

Assertion-based contracts have proven very useful for dynamically enforcing first-order program invariants in procedural languages, but until 2002 they had not been supported in languages with higher-order functions. Findler and Felleisen’s paper filled this gap, presenting “the first assertion-based contract checker for languages with higher-order functions.” The paper also spawned a great deal of follow-on work on such topics as the semantics of blame assignment and the integration of static typing and dynamic contract checking. Higher-order contracts have become a central component of the Racket (formerly PLT Scheme) system – one of the major software artifacts of the functional-programming community – which serves as both a research testbed and an effective tool for programming-language education.
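The core idea can be sketched in a few lines. The following is our own illustrative Haskell rendering, not the paper's system (which is developed for Scheme, now Racket); the names flat and arrow are hypothetical. A contract on a function wraps it, checking the argument on the way in and blaming the caller, and checking the result on the way out and blaming the function itself:

    type Pred a = a -> Bool

    -- Flat contract: check a first-order value, blaming the named party on failure.
    flat :: Show a => String -> Pred a -> a -> a
    flat party p x
      | p x       = x
      | otherwise = error ("contract violated, blame " ++ party ++ ": " ++ show x)

    -- Function contract: wrap f so both its argument and its result are checked,
    -- with blame split between the caller and the function.
    arrow :: (Show a, Show b)
          => String -> String -> Pred a -> Pred b -> (a -> b) -> (a -> b)
    arrow caller callee pre post f =
      \x -> flat callee post (f (flat caller pre x))

    -- Example: a square root that promises non-negative results,
    -- provided the caller supplies a non-negative argument.
    safeSqrt :: Double -> Double
    safeSqrt = arrow "caller" "safeSqrt" (>= 0) (>= 0) sqrt

    main :: IO ()
    main = do
      print (safeSqrt 4)     -- passes both checks
      print (safeSqrt (-1))  -- the caller is blamed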

2011 Claudio Russo

(for 2001) Recursive Structures for Standard ML

Citation:

Various attempts to extend ML modules with recursion were made starting in the early 1990s. However, they all ran into a nasty typing issue later dubbed the “double-vision problem” by Derek Dreyer. Russo’s ICFP 2001 paper was the first to propose a correct and practical solution to this problem. The solution relied cleverly on the nonstandard formalization of ML modules developed in his thesis. Building on this insight, the paper developed a complete design for recursive modules in ML, which Russo implemented in Moscow ML and which was complete enough to handle many desirable examples. Russo’s design was the main source of inspiration for adding recursive modules to OCaml, taking recursive modules from an open research issue to a realised language feature that simply works.

2010 Koen Claessen and John Hughes

(for 2000) QuickCheck: a lightweight tool for random testing of Haskell programs

Citation:

This paper presented a very simple but powerful system for testing Haskell programs that has had significant impact on the practice of debugging programs in Haskell. The paper describes a clever way to use type classes and monads to automatically generate random test data. QuickCheck has since become an extremely popular Haskell library that is widely used by programmers, and has been incorporated into many undergraduate courses in Haskell. The techniques described in the paper have spawned a significant body of follow-on work in test case generation. They have also been adapted to other languages, leading to their commercialisation for Erlang and C.
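The usage the paper popularised is visible in a tiny example (the property names below are ours): QuickCheck's Arbitrary type class supplies random generators for the argument types, and quickCheck runs each property on, by default, 100 random cases:

    import Test.QuickCheck

    -- Properties are ordinary Boolean-valued functions; QuickCheck generates
    -- their arguments using the Arbitrary type class.
    prop_reverseReverse :: [Int] -> Bool
    prop_reverseReverse xs = reverse (reverse xs) == xs

    prop_reverseAppend :: [Int] -> [Int] -> Bool
    prop_reverseAppend xs ys = reverse (xs ++ ys) == reverse ys ++ reverse xs

    main :: IO ()
    main = do
      quickCheck prop_reverseReverse   -- e.g. "+++ OK, passed 100 tests."
      quickCheck prop_reverseAppend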

2009 Malcolm Wallace and Colin Runciman

(for 1999) Haskell and XML: Generic combinators or type-based translation?

Citation:

Malcolm Wallace and Colin Runciman’s 1999 ICFP paper “Haskell and XML: Generic combinators or type-based translation?” was one of the first papers to spell out the close connection between functional programming languages and XML. It described a typed correspondence between XML document type definitions (DTDs) and Haskell datatypes. This correspondence leads to a natural representation of XML data in functional languages and permits native functions to operate on this representation. The paper also described a generic encoding of XML trees together with a combinator language for querying and transforming XML. The paper led to a widespread awareness of the close connection between XML and functional programming and initiated a flurry of research on functional XML processing. Moreover, the accompanying implementation was widely distributed as part of a CD with various XML tools, thus making an impact in the problem domain beyond the functional programming community: a perfect example of functional programming solving real-world problems.
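The combinator style the paper described can be sketched in a few lines. The following is an illustrative reconstruction with our own minimal types, not the actual API of the accompanying HaXml library:

    -- A content tree in the spirit of the paper's generic XML encoding.
    data Content = Elem String [Content]   -- element with a tag and children
                 | Text String             -- character data
                 deriving Show

    -- A filter maps one node to a (possibly empty) list of selected nodes.
    type CFilter = Content -> [Content]

    children :: CFilter
    children (Elem _ cs) = cs
    children (Text _)    = []

    tag :: String -> CFilter
    tag t c@(Elem t' _) | t == t' = [c]
    tag _ _                       = []

    -- "Then below": apply f, then apply g to the children of f's results.
    (/>) :: CFilter -> CFilter -> CFilter
    f /> g = concatMap g . concatMap children . f

    doc :: Content
    doc = Elem "library"
            [ Elem "book" [Elem "title" [Text "TAPL"]]
            , Elem "book" [Elem "title" [Text "SICP"]] ]

    main :: IO ()
    main = print ((tag "library" /> tag "book" /> tag "title") doc)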

2008 Lennart Augustsson

(for 1998) Cayenne — a language with dependent types

Citation:

Lennart Augustsson’s 1998 ICFP paper “Cayenne — a language with dependent types” made dependently-typed programming accessible to non-type theorists. The language design was a bold one, both in its use of dependent types and in its adoption of an undecidable type system. Although the idea seemed quite radical at the time, it allowed future work to break out of the straitjacket of decidable type systems. It also pointed the way towards the merger of programming languages and proof systems that we are starting to see today in languages such as Agda.

2007 Conal Elliott and Paul Hudak

(for 1997) Functional Reactive Animation

Citation:

“Functional Reactive Animation” by Conal Elliott and Paul Hudak was the first published paper on functional reactive programming. It described a collection of data types and functions that comprised an embedded domain-specific language, called Fran, for composing interactive, multi-media animations. The key abstractions were first-class behaviors and events. Intuitively, a behavior is a value that varies with continuous time, while an event is its discrete counterpart, including time-varying predicates. The idea of regarding the entire lifetime of a time-varying quantity as a single first-class value was new and very surprising at the time, but the paper made it seem simple and obvious. The insight in the paper led to a significant number of follow-on projects including FranTk, Fruit, Pidgets, FrTime, Frob, FRP, Frappe, Frag, Fvision, Yampa, Feris, and work on embodying financial contracts in functional terms.
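Read through the simplest possible lens, the two abstractions can be sketched as below. This is our own didactic rendering (behaviors as functions of time, events as time-stamped occurrences), not Fran's actual implementation:

    type Time = Double

    -- A behavior is a value at every point in continuous time;
    -- an event is a time-ordered list of discrete occurrences.
    newtype Behavior a = Behavior (Time -> a)
    newtype Event a    = Event [(Time, a)]

    at :: Behavior a -> Time -> a
    at (Behavior f) = f

    time :: Behavior Time
    time = Behavior id

    -- Behaviors combine pointwise, so a whole animation is an ordinary value.
    lift1 :: (a -> b) -> Behavior a -> Behavior b
    lift1 op (Behavior f) = Behavior (op . f)

    lift2 :: (a -> b -> c) -> Behavior a -> Behavior b -> Behavior c
    lift2 op (Behavior f) (Behavior g) = Behavior (\t -> f t `op` g t)

    -- A value that oscillates between -1 and 1 over time.
    wiggle :: Behavior Double
    wiggle = lift1 (\t -> sin (pi * t)) time

    main :: IO ()
    main = print (map (wiggle `at`) [0, 0.5, 1.0, 1.5])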

2006 Julia L. Lawall and Harry G. Mairson

(for 1996) Optimality and inefficiency: what isn’t a cost model of the lambda calculus?

Citation:

Julia Lawall and Harry Mairson’s 1996 ICFP paper “Optimality and inefficiency: What isn’t a cost model of the lambda calculus?” exposed a fundamental problem with proposed algorithms for optimal reduction. Starting with Jean-Jacques Lévy’s seminal work in 1978, the goal of optimal reduction was to correctly normalize lambda-calculus terms without duplicating redexes. Various strategies were subsequently devised to realize optimal reduction, notably the solution of John Lamping at POPL 1990, which was later simplified and improved by Georges Gonthier, Martín Abadi, and Jean-Jacques Lévy at POPL 1992. Each solution used subtle bookkeeping mechanisms to control sharing.

Lawall and Mairson showed that these bookkeeping mechanisms introduced a complexity and inefficiency of their own. They discovered terms that could be normalized in linear time, but whose bookkeeping costs required exponential time. They further showed that Frandsen and Sturtivant’s cost model for lambda-calculus reduction, presented at FPCA 1991, needed to account for the size of intermediate terms, and that optimal-evaluation interpreters were at least exponentially slower than that cost model allowed. Lawall and Mairson concluded that the notion of optimality did not necessarily coincide with that of efficiency. As a consequence, different and possibly optimal evaluation strategies were still needed, as were more realistic cost models. Subsequent work in this area has focused on such cost models, on further analysis of the inherent complexity of optimal reduction, and on relaxing the optimality condition in exchange for lower bookkeeping overhead and greater overall efficiency.