I am an engineer at Blockstack, where I'm building Free and Open Source tools intended to form the foundation of a new, more decentralized web. My interests lie broadly in distributed systems, security, and information privacy.
I have an academic background, which, in addition to meaning that I have a subscription to the New York Review of Books, means that I did some amount of research in Computer Science. In fact, I defended my PhD thesis at Princeton University pretty recently. My adviser was Mike Freedman and my research focused on exploiting regularity in the structure of modern web applications to improve different aspects of those applications.
If you are interested in reading my thesis, Exploiting the Structure of Modern Web Applications, it is available here. It looks at how we can automatically provide a measure of security through more intelligent and secure web frameworks, and increased performance through better caching for web applications. The key insight in this work is that today's web applications are typically implemented with an implicit structure, and frameworks can exploit this structure for better security on the one hand (see the Passe project), and better performance on the other (see Hyperbolic Caching).
Previously, I attended MIT and received a pair of bachelor's degrees. I also received an M.Eng. in 2011, advised by Barbara Liskov, while working in the Programming Methodology Group on Distributed Information Flow Control and Secure Audit Trail Analysis. My thesis was titled Analyzing Audit Trails in the Aeolus Security Platform.
Hyperbolic Caching investigates how web applications' data access patterns can inform cache eviction strategies. In particular, by incorporating varying decay rates for different items in the cache, the eviction strategy outperforms prior strategies on web-like workloads. However, these varying decays continuously reorder item priorities in ways that classical queue structures cannot efficiently track, so this work requires new approaches to managing item priorities. We implemented a prototype of our work as a modification of Redis. I spent the summer of 2015 at MSR New York working on this project with Sid Sen.
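The core idea can be sketched in a few lines of Python. This is a minimal illustration, not the Redis prototype: each item's priority is its access count divided by its time in the cache, so priorities decay at per-item rates, and since there is no fixed order to maintain in a queue, eviction instead samples a few items and removes the lowest-priority one. The class name, the sample size, and the simple count/age priority are my own illustrative choices.

```python
import random
import time

class HyperbolicCache:
    """Toy sketch of hyperbolic caching. Priority = access count / age,
    so every item's priority changes continuously; eviction samples a
    few items and evicts the minimum rather than maintaining a queue."""

    def __init__(self, capacity, sample_size=5, clock=time.monotonic):
        self.capacity = capacity
        self.sample_size = sample_size
        self.clock = clock
        self.store = {}  # key -> (value, access_count, insert_time)

    def _priority(self, key, now):
        _, count, inserted = self.store[key]
        age = max(now - inserted, 1e-9)  # avoid division by zero
        return count / age

    def get(self, key):
        if key not in self.store:
            return None
        value, count, inserted = self.store[key]
        self.store[key] = (value, count + 1, inserted)
        return value

    def put(self, key, value):
        if key not in self.store and len(self.store) >= self.capacity:
            now = self.clock()
            # Sample instead of sorting: no total order exists to keep.
            sample = random.sample(list(self.store),
                                   min(self.sample_size, len(self.store)))
            victim = min(sample, key=lambda k: self._priority(k, now))
            del self.store[victim]
        self.store[key] = (value, 1, self.clock())
```

A frequently accessed old item keeps a high priority, while a once-touched item's priority decays toward zero, which is the behavior classical LRU/LFU queues cannot express with per-item decay rates.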
Passe explores alternative programming models and mechanisms for securing application data. Many modern applications rely on a centralized and logically separate shared data store. By ensuring that only "normal" queries can execute on this data store, certain confidentiality and integrity guarantees can be provided for application data even when the application itself has been compromised by attackers. This model allows application programmers to use already familiar interfaces without any explicit security specifications; developers need only execute our analysis tool on their applications. I made a poster for this work which may serve as a handy infographic for those interested.
A paper on this work appeared at the 2014 IEEE Symposium on Security and Privacy. Source code for my modified version of PyPy to support taint tracking is available on GitHub, as is the web framework itself, Passe.
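The enforcement idea behind restricting a data store to "normal" queries can be sketched as follows. This is an illustrative toy, not Passe's actual interface (Passe learns query dependencies through dynamic analysis and enforces them at sandboxed view boundaries; here I reduce the idea to template whitelisting): an analysis phase records the query templates the application legitimately issues, and a guard in front of the data store rejects anything that doesn't match.

```python
import re

def normalize(query):
    """Strip literal values from a query so only its template remains."""
    query = re.sub(r"'[^']*'", "?", query)   # string literals -> ?
    return re.sub(r"\b\d+\b", "?", query)    # numeric literals -> ?

class QueryGuard:
    """Toy sketch: whitelist query templates observed during analysis,
    then reject any runtime query whose template was never observed,
    even if the application issuing it has been compromised."""

    def __init__(self):
        self.allowed = set()

    def learn(self, query):
        # Called during the offline analysis phase.
        self.allowed.add(normalize(query))

    def check(self, query):
        # Called at runtime, in front of the shared data store.
        return normalize(query) in self.allowed
```

An attacker who hijacks the application can still issue the learned queries with different arguments, but cannot make the store execute a structurally new query such as dumping a password column.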
As a research intern at MSR Cambridge with Miguel Castro and Manuel Costa in the summers of 2013 and 2014, I helped investigate the impact of garbage collection on performance-sensitive code. In systems with large memory capacity, high memory utilization, and small objects, modern GCs can introduce application stalls on the order of tens of seconds, which is unacceptable for systems applications such as key-value stores and databases. However, GCs are relied on to provide temporal memory safety. In my internship, I helped develop a prototype variant of C# with manual memory management and temporal memory safety achieved with compiler-inserted checks.
This work contributed to the paper "Simple, Fast, Safe Manual Memory Management," which appeared at PLDI 2017.
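To illustrate what "temporal memory safety via inserted checks" means in general, here is a toy sketch in Python. This is a generic illustration of generation-checked handles, not the mechanism from the PLDI paper: each allocation slot carries a generation counter, every handle records the generation it was created under, and every dereference is preceded by a check that the generations still match, so a use-after-free raises a deterministic error instead of silently reading reused memory.

```python
class Heap:
    """Toy heap with generation-checked handles. A handle is a
    (slot, generation) pair; freeing a slot bumps its generation,
    invalidating all outstanding handles to it."""

    def __init__(self):
        self.slots = []        # slot index -> stored value
        self.generations = []  # slot index -> current generation

    def alloc(self, value):
        self.slots.append(value)
        self.generations.append(0)
        return (len(self.slots) - 1, 0)  # handle = (slot, generation)

    def free(self, handle):
        idx, gen = handle
        if self.generations[idx] != gen:
            raise RuntimeError("double free")
        self.generations[idx] += 1  # invalidate outstanding handles
        self.slots[idx] = None

    def load(self, handle):
        idx, gen = handle
        # The check a compiler would insert before each dereference:
        if self.generations[idx] != gen:
            raise RuntimeError("use after free")
        return self.slots[idx]
```

The point of the sketch is the trade the prose describes: memory is reclaimed manually and promptly (no GC pauses), while the inserted check preserves the temporal safety a GC would otherwise provide.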