I am a programmer and researcher who wants to make computers more fun and empowering for students, researchers, and engineers. I thought the field was stuck, so I moved from computer science to neuroscience in search of ideas for making computers smarter. Then, to my surprise, the computers went and got smarter.

## Current project

Can we make it more fun to explore the space of new AI models? Could exploring deep-learning architectures and training regimes become more fun than playing Minecraft?

So far, two ML libraries have spun out from this project:

- Vexpr: use Lisp-like code transformation to make models readable, fast, and visualizable
- rows2prose: visualize models by rendering scalars into styled text
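To give a flavor of the "expressions as data" idea behind these libraries, here is a minimal conceptual sketch — not Vexpr's or rows2prose's actual API, just an illustration of treating an expression as a plain data structure that can be transformed (for speed) and rendered (for readability):

```python
# Conceptual sketch only: an expression is a nested tuple ("op", *args).
# This is NOT the Vexpr API, just an illustration of the general idea.

def simplify(expr):
    """Recursively fold constant additions, e.g. ("add", 2, 3) -> 5."""
    if not isinstance(expr, tuple):
        return expr
    op, *args = expr
    args = [simplify(a) for a in args]
    if op == "add" and all(isinstance(a, (int, float)) for a in args):
        return sum(args)
    return (op, *args)

def render(expr, indent=0):
    """Render an expression tree as indented text, one node per line."""
    pad = "  " * indent
    if not isinstance(expr, tuple):
        return f"{pad}{expr}"
    op, *args = expr
    lines = [f"{pad}{op}"] + [render(a, indent + 1) for a in args]
    return "\n".join(lines)

expr = ("mul", ("add", 2, 3), "x")
print(render(simplify(expr)))
```

Because the expression is ordinary data, transformation passes (like the constant fold above) and visualizers (like the text renderer) compose freely — the Lisp-flavored premise the project builds on.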

## Blog

- Expressions are Pragmatic Model Visualizations (2024-01-10)
- What happens when you vectorize wide PyTorch expressions? (2023-10-19)
- Gaussian Processes Extrapolate, Sometimes in Goofy Ways (2023-03-28)
- Maybe Bayesian Optimization Should Be Harder, Not Easier (2022-11-30)
- Imagine A Deep Network That Performs Successive Cheap Queries On Its Input (2022-07-08)
- Bayesian Optimization Is More Basis-Dependent Than You Might Think (2022-06-26)
- Likely ≠ Typical: A Viewpoint On Why We Perturb Neural Networks (2022-01-28)
- Some “Causal Inference” intuition (2021-11-04)

## Select talks / presentations

- Intro to Grid Cells + Quickly Forming Structured Memories (2021-06-07)
- Testing a possible explanation for grid cell distortions (2021-05-17)
- Journal Club: Hinton’s GLOM + Numenta’s TBT (after a year without a haircut) (2021-03-24)
- Using grid cells as a prediction-enabling basis (2020-12-22)
- The Minimum Description Length Principle, sparsity, and quantization (2020-04-01)

## Photo

*(Photo credit: Rosanne Liu, 2023)*

## Papers

- *Poster at 30th Annual Computational Neuroscience Meeting (2021)*
- *PLOS Computational Biology (2020)*
- *Front. Neural Circuits (2019)*
- *Front. Neural Circuits (2019)*