POPL 2021
Sun 17 - Fri 22 January 2021 Online
Sun 17 Jan 2021 19:04 - 19:17 at LAFI - Session 2 Chair(s): Dougal Maclaurin

A wide variety of libraries written in Julia implement stochastic simulators of natural and social phenomena for the purposes of computational science. However, these simulators are not generally amenable to Bayesian inference, as they do not provide likelihoods for execution traces, support constraining of observed random variables, or allow random choices and subroutines to be selectively updated in Monte Carlo algorithms.

To address these limitations, we present Genify.jl, an approach to transforming plain Julia code into generative functions in Gen, a universal probabilistic programming system with programmable inference. We accomplish this via lightweight transformation of lowered Julia code into Gen’s dynamic modeling language, combined with a user-friendly random variable addressing scheme that enables straightforward implementation of custom inference programs.
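To make the idea concrete, here is a minimal sketch of what such a transformation enables. The toy simulator, the `genify(walk, Int, Float64)` call signature, and the `:ys => t` addresses are illustrative assumptions about the automatic addressing scheme, not the package's documented API; see the repository linked below for actual usage.

```julia
using Gen, Genify, Distributions

# A plain Julia simulator: a latent random walk observed with Gaussian noise.
function walk(T::Int, noise::Float64)
    xs, ys = zeros(T), zeros(T)
    for t in 1:T
        xs[t] = rand(Normal(t == 1 ? 0.0 : xs[t-1], 1.0))  # latent state
        ys[t] = rand(Normal(xs[t], noise))                  # noisy observation
    end
    return ys
end

# Hypothetical transformation into a Gen generative function
# (assumed call signature; consult the Genify.jl README for the real one).
genwalk = genify(walk, Int, Float64)

# After transformation, observed randomness can be constrained with a Gen
# choicemap, assuming rand calls are addressed by variable name and loop index.
observations = choicemap()
for t in 1:10
    observations[:ys => t] = 0.1 * t
end
trace, weight = Gen.generate(genwalk, (10, 0.5), observations)
```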

We demonstrate the utility of this approach by transforming an existing agent-based simulator from plain Julia into Gen, and designing custom inference programs that increase accuracy and efficiency relative to generic SMC and MCMC methods. This performance improvement is achieved by proposing, constraining, or re-simulating random variables that are internal to the simulator, which is made possible by transformation into Gen.
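As a rough illustration of how addressed internal randomness supports custom inference, the sketch below applies single-site Metropolis-Hastings moves to hypothetical per-step latent addresses (`:xs => t`) using Gen's `mh` and `select` operators. The address names and the simulator are assumptions carried over from the sketch above, not the agent-based example described in the abstract.

```julia
using Gen

# Sketch of a custom MCMC program that exploits addressed internal randomness:
# each Metropolis-Hastings move re-simulates only the latent choice at step t,
# rather than regenerating the entire execution trace.
function block_mh(trace, T::Int; iters::Int = 100)
    for _ in 1:iters
        for t in 1:T
            trace, _accepted = Gen.mh(trace, Gen.select(:xs => t))
        end
    end
    return trace
end

# Usage, continuing from the sketch above:
# trace = block_mh(trace, 10)
```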

Genify.jl is available at: https://github.com/probcomp/Genify.jl

Genify.jl: Transforming Julia into Gen to enable programmable inference (Extended Abstract) (genify-lafi21-extended-abstract.pdf, 561 KiB)

Sun 17 Jan
Times are displayed in time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna.

18:00 - 19:30: Session 2 (LAFI at LAFI)
Chair(s): Dougal Maclaurin (Google Research)
18:00 - 18:12
Talk
Enzyme: High-Performance Automatic Differentiation of LLVM
LAFI
William S. Moses (Massachusetts Institute of Technology), Valentin Churavy (MIT CSAIL)
18:12 - 18:25
Talk
Parametric Inversion of Non-Invertible Programs
LAFI
Zenna Tavares (Massachusetts Institute of Technology), Javier Burroni (University of Massachusetts Amherst), Edgar Minasyan (Princeton University), David Morejon (Massachusetts Institute of Technology), Armando Solar-Lezama (Massachusetts Institute of Technology)
File Attached
18:25 - 18:38
Talk
Bayesian Neural Ordinary Differential Equations
LAFI
Raj Dandekar (MIT), Vaibhav Dixit (Julia Computing), Mohamed Tarek (UNSW Canberra, Australia), Aslan Garcia Valadez (National Autonomous University of Mexico), Chris Rackauckas (MIT)
Pre-print · Media Attached · File Attached
18:38 - 18:51
Talk
On the Automatic Derivation of Importance Samplers from Pairs of Probabilistic Programs
LAFI
Alexander K. Lew (Massachusetts Institute of Technology, USA), Ben Sherman, Marco Cusumano-Towner (MIT CSAIL), Michael Carbin (Massachusetts Institute of Technology), Vikash Mansinghka (MIT)
Media Attached
18:51 - 19:04
Talk
Decomposing reverse-mode automatic differentiation
LAFI
Roy Frostig (Google Research), Matthew Johnson (Google Brain), Dougal Maclaurin (Google Research), Adam Paszke (Google Research), Alexey Radul (Google Research)
File Attached
19:04 - 19:17
Talk
Genify.jl: Transforming Julia into Gen to enable programmable inference
LAFI
Tan Zhi-Xuan (Massachusetts Institute of Technology), McCoy R. Becker (Charles River Analytics), Vikash Mansinghka (MIT)
Media Attached · File Attached
19:17 - 19:30
Talk
Probabilistic Inference Using Generators: the Statues Algorithm
LAFI
Pierre Denis (independent scholar)
Link to publication · DOI · Pre-print · Media Attached · File Attached