Given by ACM SIGPLAN to recognize an individual (or individuals) who has (have) made a significant and lasting contribution to the field of Programming Languages. The contribution can be a single event or a lifetime of achievement. The award includes a prize of $5,000. The award is presented at SIGPLAN’s PLDI conference the following June.
All questions about the Programming Languages Achievement Award should be directed to the SIGPLAN Awards co-Chairs.
Nominations can be submitted at any time using the Web form at https://awards.sigplan.org. Nominations submitted on or before January 15th will be considered for the award that year. A nomination for the Achievement Award that is not selected will remain in consideration for a total of three years.
Award recipients are selected by a committee constituted as follows:
The current committee is:
Because this award is intended to recognize persons of major influence, it is likely that several members of the committee may have worked with, or co-authored articles with, the nominees, and may have a conflict of interest. The primary mechanism for handling such conflicts will be to declare them to the committee; once so declared, conflicts of interest shall not automatically prevent a committee member from taking part in the selection process. However, if a member of the committee, or the chair of the committee, feels that the association of a committee member with a nominee would interfere with impartial consideration of the nominees, that conflicted member shall be absented from the relevant parts of the discussion. If a committee member has conflicts of interest with more than one nominee, the Chair of the Committee may ask the constituency that appointed the committee member to select a replacement member. The SIGPLAN EC Chair will adjudicate as necessary.
Xavier Leroy has made fundamental contributions to both the theory and practice of programming languages on a range of topics, including type system and module system design, efficient compilation of functional programming languages, bytecode verification, verified compilation, and verified static analysis. He is known as the main architect of the OCaml programming language and the author of the CompCert verified compiler for C. Both of these projects have had enormous impact in the PL community and beyond, and they have significantly pushed the boundaries of what is perceived as possible.
Among Xavier Leroy’s many accomplishments, his OCaml language introduced numerous advances in language design and implementation and created a solid platform that enabled the work of thousands of others in both research and industry. OCaml has been used to build widely-used theorem provers (e.g., Coq) as well as operating system hypervisors (e.g., Xen). OCaml has also played a key role in a number of companies (e.g., Jane Street Capital) and has inspired numerous variants and follow-on designs (e.g., F# and Scala).
Another enormous feat of Xavier Leroy’s research is the development of the CompCert C compiler, which opened up a whole new area of activity and demonstrated the feasibility of formally verifying the full functional correctness of a compiler for a mainstream programming language. This accomplishment created a true change in people’s perceptions of what is possible and inspired many researchers to succeed at similar feats across a variety of domains.
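CompCert’s central guarantee can be stated, in simplified form, as a semantic-preservation theorem (a paraphrase for illustration, not CompCert’s exact Coq formalization):

```latex
\text{If } \mathit{Compile}(S) = \mathsf{OK}(C) \ \text{ and } \ C \Downarrow B, \ \text{ then } \ S \Downarrow B,
```

where \(S\) is the source program, \(C\) the generated code, and \(B\) an observable behavior: every behavior of the compiled code is a behavior permitted by the semantics of the source.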
Robert (Bob) Harper is widely known for his foundational contributions to our understanding of type theory and its use in the design, specification, implementation, and verification of modern programming languages. Bob demonstrated that sound type systems and operational semantics are not only appropriate vehicles for reasoning about idealized languages, but that this theory can be applied to complex, modern languages. His achievements include analysis of language features ranging from references to continuations to modules, the definition of a variety of new type systems, the idea of using types throughout the compilation process, the analysis of run-time system semantics and cost, a formal definition of Standard ML and its mechanization, the definition of the logical framework LF, and other logical frameworks based on homotopy type theory.
Some of Bob’s most influential work involved the design, theory and implementation of the TIL (Typed Intermediate Language) compiler system, which pioneered the idea that compilers can propagate type information to lower-level intermediate languages, where it can be used to guide optimizations and to catch compiler bugs. These ideas led directly to the development of proof-carrying code, typed assembly language, and a wide array of type-preserving compilers.
Bob Harper has also had a profound impact through his mentorship, teaching, and the influence of his books “Programming in Standard ML” and “Practical Foundations for Programming Languages.”
Hans Boehm has made fundamental, pioneering contributions on a number of topics, including garbage collection, programming language memory models, and thread semantics. His work exemplifies a style of pragmatically driven research with an emphasis on solving problems in widely-used languages; it has made life better for many practicing programmers.
Hans is the primary author of the Boehm-Demers-Weiser garbage collection (GC) library, the first to enable automatic deallocation of memory for most C programs, a surprising result at the time. Its low overhead made automatic GC widely accessible, enabling many language implementations to get off the ground quickly with competitive GC performance. Initially published in 1988, the library is still widely used. It likely accelerated the widespread adoption of GC significantly, and it also enabled many modern C/C++ memory-leak detectors for non-garbage-collected programs. Hans’ papers based on this library provided major new insights into both this “conservative” style of GC and GC in general: his sole-author 1993 PLDI paper, later recognized as most influential, showed how to greatly reduce the space cost of “conservative” GC, and his PLDI’91 paper introduced “sticky bit” collection, later adopted by mainstream Java collectors.
Hans is the leading authority on programming language memory models and thread semantics. The theory of memory models is complex, with subtleties still not well understood. Defining the model for a popular language is even harder, requiring a balance among conflicting constituencies: teachability for the masses, performance for advanced programmers, and compatibility with legacy and future hardware. Leading the C++11 process, Hans miraculously managed to satisfy all constituencies. He argued for the core model to be the teachable “sequential consistency for data-race-free” model, but additionally developed an alternate path for advanced programmers and non-compatible hardware using the concept of relaxed atomics. Along the way, Hans wrote several seminal papers: the PLDI’05 paper exposed problems with the then-prevalent approach of implementing threads as a library, the PLDI’08 paper described the C++ model, and a series of papers exposed hitherto unknown problems with benign data races, fences, seqlocks, and subtleties in the notorious out-of-thin-air (OOTA) issue. Hans was also a major contributor to the Java memory model effort, a precursor to the C++ effort, and his C++ work has influenced analogous OOTA-related open problems for Java. It is impossible to overstate Hans’ impact on the understanding and practice of language memory models, and he has had a profound influence on the boundary between the PL and computer architecture communities.
Alex Aiken has made fundamental, pioneering, and breakthrough contributions on a number of topics, including constraint-based program analysis, statistical bug localization, superoptimization, program synthesis, and parallel programming systems, among others. Alex’s research combines foundational theoretical results with practical implementations and thorough empirical evaluations.
Perhaps Alex’s most significant work is on constraint-based analysis. Starting with early work on soft typing for functional programs, he subsequently developed the theory and practice of constraint-based analysis for several key applications. He developed what was at the time the most scalable and precise points-to analysis for C and subsequently exploited distributed, SAT-based constraint solving to perform static analysis at a level of precision and scalability that had not been seen before. Alex’s research on cooperative bug isolation introduced the idea of using lightweight dynamic analysis at scale to find the causes of bugs in software systems that have been fielded. His work on data representation synthesis showed how to automatically construct low-level data structure implementations from high-level descriptions. His STOKE project revitalizes the old idea of superoptimization by using stochastic search to find significantly better code than general-purpose compilers. Finally, Alex is the developer of Moss (“a Measure of Software Similarity”), a web service that finds duplicate code, even if it has been alpha-converted or transformed in other ways. Most notably, Moss continues to be used for plagiarism detection by many Computer Science departments and others around the world.
Not only has Alex’s research led the field in important new directions, but his publications exemplify high-quality work in programming languages. To date, he is likely the most prolific author in POPL and PLDI, two flagship programming languages venues, and his publications have received many recognitions. Alex also stands out for his outstanding mentorship. He has graduated 24 PhD students, many of whom have won awards for their work and gone on to distinguished careers in academia and industry. Last but not least, Alex has provided dedicated service and strong leadership to the SIGPLAN community, serving as the program chair for both POPL and PLDI, the general chair for POPL, as an associate editor for TOPLAS, and chair and member on various SIGPLAN award committees.
Thomas Reps has made exceptional contributions to the field of programming languages, on a diverse range of topics that include incremental computation, program slicing and dataflow analysis, shape analysis, and analysis of binary code.
Tom’s dissertation research on generating language-based environments applies incremental attribute grammar evaluation to problems such as name analysis and type checking, and is broadly applicable to a wide range of languages. This work has been commercialized successfully by GrammaTech and it has deeply influenced research on integrated development environments. Tom’s work on program slicing and dataflow analysis builds on an elegant framework for solving context-free reachability problems. This work has been extremely influential in the academic community, and it has had huge impact in industry (e.g., IBM products for security analysis and Microsoft’s tools for verifying device drivers). Tom’s work on shape analysis relies on three-valued logic to solve difficult problems in shape analysis, and has had major impact on the verification community. His more recent work on analyzing x86 binary code has applied techniques from his previous work on dataflow analysis and slicing to the intensely challenging problem of analyzing machine code with significant success, and has been commercialized by GrammaTech.
A common thread in all of Tom’s research is that it provides elegant solutions to deep foundational problems. Many of these solutions have become widely adopted, resulting in an unusual level of practical and industrial impact. His publications exhibit an exemplary degree of scholarship, with precise exposition that facilitates adoption by the community. He has received numerous significant awards throughout his career, starting with the 1983 ACM Dissertation Award, and his citation impact is among the highest in the entire field of programming languages.
Simon Peyton Jones is widely known as an essential force behind the design, implementation, evolution, and remarkable success of the Haskell language and the Glasgow Haskell Compiler for more than twenty years. His role as a primary designer of the Haskell language and the GHC implementation of Haskell has produced a platform for hundreds of researchers and many more developers. Indeed, GHC is unparalleled for achieving what might otherwise be seen as an engineering impossibility: the leading implementation of an increasingly popular programming language for both research experimentation and serious industrial use. While the rise of Haskell and GHC – and its influence on other languages – is the shared result of a vibrant community of many, it is also fair to describe it as inextricable from Simon’s career.
Among Simon’s technical contributions to language design is his leading work on monadic I/O, type classes, generalized algebraic data types, composable transactional memory, generic programming via “scrap your boilerplate”, advances in type inference, and more. His work on functional-language implementation has been no less influential and wide-ranging, with a compiler design based on a typed intermediate language and optimizations expressed via declarative rewriting. In making Haskell a practical language, he collaborated to design and implement novel approaches to concurrency, foreign function interfaces, space profiling, and nested data parallelism.
Beyond all these technical advances and many more, Simon is widely recognized as a visionary leader in the field, an articulate writer, an extraordinarily effective speaker, an agent of change in computing education, a generous collaborator, and a peerless mentor. His remarkable contributions to the field of programming languages arise from technical sophistication, impeccable taste, boundless enthusiasm, and charismatic leadership.
Luca Cardelli has made exceptional contributions to the field of programming languages. He has a remarkable ability to re-invent himself every decade or so and yet continue to make true innovations. His contributions span many areas: software; language design, including experimental languages; programming language foundations; and the interaction of programming languages and biology.
He is a leading language designer. He was one of the designers of Modula-3, which played a major role in popularising the notion of type-safe programming, and was a key influence on Java and C#. Thus, his work helped lay the foundations for today’s near-universal industrial acceptance of type-safe programming languages. He also designed several important experimental languages, including Obliq, a distributed higher-order scripting language, and Polyphonic C#, a distributed extension of C#. He implemented the first compiler for ML.
He is a leading and hugely influential computer scientist. His most sustained research activity has been in establishing the semantic and type-theoretic foundations of programming languages. Two of his most prominent works are his 1985 paper “On understanding types, data abstraction, and polymorphism” (with Wegner) and his 1996 book “A Theory of Objects” (with Abadi). Other important works of his concern the semantics of multiple inheritance in object-oriented languages and explicit substitutions in the lambda-calculus.
More recently, he has focused on modeling global and mobile computation, via the Ambient Calculus and spatial logics. His work pioneered an influential way of describing, programming, and reasoning about global and mobile computation, in terms of formal calculi and language constructs. It also connected with an important line of work on semi-structured data, and led, indirectly, to his current interest in biology and stochastic systems.
In both his own work and his great scientific influence Cardelli has made lasting contributions to the field of programming languages.
Neil Jones is a pioneer of programming-language methods. He introduced control-flow analysis for higher-order programs, binding-time analysis to tame self-applicable partial evaluation, what is known today as “Jones optimality”, and size-change termination analysis. Neil is also noted for bridges he established between programming languages and complexity/computability theory, e.g., characterizing Turing’s Universal Machine as self-interpretation, Kleene’s S-m-n theorem as partial evaluation, Kleene’s second recursion theorem as reflection, and the expressive power of typed cons-free functional languages by means of complexity classes. His pioneering work on complexity theory included the development of P-completeness and solving the spectrum problem (the basis of finite model theory). Lastly, Neil is a tireless and inspiring mentor, and he has written several influential textbooks that are testaments to his thesis that programs are data objects and that programming languages are a cornerstone of computer science.
Patrick and Radhia Cousot are the co-inventors of abstract interpretation, a unifying theory of sound abstraction and approximation of structures involved in various domains of computer science, such as formal semantics, specification, proof, and verification. In particular, abstract interpretation has had a major impact on the development of the static analysis of software. In their original work, the Cousots showed how to relate a static analysis to a language’s standard semantics by means of a second, abstract semantics that makes precise which features of the full language are being modeled and which are being discarded (or abstracted), providing for the first time both a formal definition of and clear methodology for designing and proving the correctness of static analyses. Subsequently, the Cousots contributed many of the building blocks of abstract interpretation in use today, including chaotic iteration, widening, narrowing, combinations of abstractions, and a number of widely used abstract domains. This work has developed a remarkable set of intellectual tools and has found its way into practice in the form of widely used libraries and frameworks. Finally, the Cousots and their collaborators have contributed to demonstrating the utility of static analysis to society. They led the development of the Astrée static analyzer, which is used in the medical, automotive, and aerospace industry for verifying the absence of a large class of common programming errors in low-level embedded systems code. This achievement stands as one of the most substantial successes of program verification to date.
The SIGPLAN 2012 Achievement Award was won by Matthias Felleisen, an exemplary researcher whose work covers theory, practice, and education, with each reinforcing the others. He has made fundamental contributions across the entire spectrum of the programming language field. He introduced evaluation contexts as a notation for specifying operational semantics, and progress-and-preservation proofs of type safety, both of which are used in scores of research papers each year, often without citation. His other contributions include small-step operational semantics for control and state, A-normal form, delimited continuations, mixin classes and mixin modules, a fully-abstract semantics for Sequential PCF, web programming techniques, higher-order contracts with blame, and static typing for dynamic languages.
Matthias tests his ideas by building real systems, working in the framework of PLT Scheme, later renamed Racket, which includes interactive development environments, interactive debugging and flow analysis, documentation tools, and support for language levels and multiple languages. He has coauthored a series of influential texts, including four ‘little’ books—The Little Schemer, The Seasoned Schemer, A Little Java, and The Little MLer—the widely used textbook How to Design Programs, and a monograph on Semantics Engineering. He founded a series of workshops and projects to support K–12 teaching, including TeachScheme!, Reach Java, and Program by Design. Moreover, he has nurtured a new generation of researchers, who are now respected in their own right and pursue the same goals of excellence, rigor, and real-world impact that have characterized Matthias’s work.
Tony Hoare’s long career in computing is studded with seminal engineering and scientific contributions to Programming Languages; his views on programming language design have been recognized as profound even by those who declined to follow his advice.
Two contributions stand out as fundamental: the development of what is now known as Hoare logic, and Communicating Sequential Processes. Hoare logic is a system for reasoning about imperative programs. It was introduced in the 1969 article “An Axiomatic Basis for Computer Programming”, which is perhaps the most influential 6-page paper ever published in CACM. Building on earlier work by Robert Floyd, Hoare’s initial ideas have grown into an entire sub-area of computer science; many modern verification systems build on Hoare logic.
Only 9 years later, CACM published Hoare’s paper on Communicating Sequential Processes (CSP). Contemporary with Milner’s CCS, but pursuing complementary goals, CSP has been enormously influential. It provided the basis for the occam programming language and its realization in the Transputer; it has been used for modeling and verifying the concurrency properties of critical software systems; and it inspired a flowering of subsequent concurrency research.
Although either of these contributions would alone justify the achievement award, Hoare is doing more with his Unifying Theories research, which aims to unify theories of programming across paradigm, abstraction level and semantic style. Beyond all of this, Tony is renowned for his unfailing courtesy, his inspiration, and his dedication to his chosen calling. He is the epitome of a scholar and a gentleman.
Professor Gordon D. Plotkin has made fundamental advances in almost every area of the theory of programming languages. His contributions have helped to establish the mathematical foundations on which the scientific study of programming languages is based. His 1975 paper “Call-by-name, Call-by-value, and the λ-calculus” exposed the relationship between the reduction semantics of the λ-calculus and its operational semantics, as defined by Landin’s SECD machine. In the process, he defined what it meant for a calculus and a semantics to correspond: this launched the study of operational semantics as it is now understood. He invented Structural Operational Semantics as a technique for specifying the semantics of a wide range of programming languages, concurrent as well as sequential; this form of semantics is now one of the basic working tools of researchers developing new programming languages and type systems.
Plotkin’s contributions to the development of the mathematical theory of domains, and its applications to the denotational semantics of programming languages, have been of fundamental importance: they include his powerdomain construction, systematic development of the general theory of the solution of recursive domain equations, and his work on PCF and the full abstraction problem.
Plotkin’s work with Glynn Winskel on event structures is the basis for reasoning about distributed systems, process algebras, and reactive systems. Event structures have been enormously influential in the development of models of concurrency. He has also investigated the logical foundations of computer security, including logics for specifying authorization policies for computer systems. Plotkin continues to make bold and deep contributions, for example, in his current work on the algebraic theory of effects, and on languages and calculi for biochemical modelling. Taken together, Gordon Plotkin’s contributions over the past four decades exhibit a range and depth unmatched in the field.
Professor Rod Burstall has made deep, seminal contributions to the design of programming languages and the field of program verification. These contributions, which many of us now take for granted, include the introduction of algebraic datatypes coupled with pattern-matching clausal function definitions as found in Hope, ML, Haskell and Coq; the generalization and use of structural induction for proving properties of programs; the fold-unfold method for deriving efficient, provably-correct programs from easy-to-understand prototypes; mechanisms for reasoning about pointer-based, imperative programs that directly led to the development of separation logic; proof techniques and connections to modal logic for reasoning about concurrent programs; and the use of dependent types and algebraic specifications for constructing module systems that directly influenced SML and OCaml. Through these amazing contributions and his collaborations and mentorship, he helped build one of the most important centers of programming research at Edinburgh, which was eventually institutionalized as the Laboratory for Foundations of Computer Science.
Professor Barbara Liskov has had tremendous impact on the fields of programming languages, operating systems, distributed systems, and information security. Much of her early research focus was on data abstraction, modularity, and encapsulation as typified by the CLU programming language. At the time, CLU incorporated a number of advanced features (modular encapsulation of abstract data types, bounded polymorphism, exceptions, and iterator abstraction) that had a clear influence on successor languages including Ada, Modula-3, C++, and Java. Through CLU, the related books and articles, her work on behavioral subtyping, and her courses on programming methodology, Professor Liskov changed the way that a generation of engineers thought about and constructed large software systems. Professor Liskov’s work on the Argus project also brought to the fore the idea of integrating transactions and orthogonal persistence into a programming language with an aim towards building reliable distributed systems. More recently, her work on information flow control helped to start a research focus on end-to-end security using language-based mechanisms for enforcement.