Blog

Literature Review Tricks

Conducting a thorough literature review for a new idea is a critical yet difficult research skill to learn. Even now, for areas I'd like to think I have a good grasp of, it's difficult not to miss something important; multiple times, I've done a literature review only to find a …

Read More

Reproducible Hyperparameter Sweeps in Machine Learning

In the past couple of weeks I've been working on writing machine learning code in Python with the following goals:

  1. Make experiments easy to reproduce (or retroactively debug). Primarily, this means saving the configuration and the code as it was at the time it was run.
  2. Make it easy to run …

Read More

Debugging Machine Learning Code

Developing new machine learning code is often error-prone and takes many iterations of the write-run-debug loop. In this context, I specifically mean saving time fixing errors that crash the program--not those that cause models to be incorrect in subtle ways (for that, see Andrej Karpathy's blog post …

Read More

Riot Games Data Science Internship Video

In the summer of 2017, I was a data science intern at Riot Games. It was an amazing experience for many reasons, and I'm excited to share the video that Riot Games produced featuring interviews with all the data science interns!

More about the internship, and applying at the …

Read More

Entropy and KL Divergence

Description: An intuitive introduction to what the KL divergence is, how it behaves, and why it's useful

Introduction

The KL divergence is short for the Kullback-Leibler divergence, introduced by Solomon Kullback and Richard Leibler in 1951. Semantically, a divergence is the amount by which something diverges, and to diverge in turn means to extend in different directions from a common point. In the case of the KL divergence we …

Read More