Alex Ratner

PhD Candidate in CS @ Stanford

Github | Google | Twitter
Email: ajratner at stanford.edu

One of the key bottlenecks in building machine learning systems today is labeling training data. I work on enabling users to program the modern ML stack in higher-level but noisier ways. These weak supervision approaches can lead to applications built in days or weeks, rather than months or years. On the applications side, much of my focus is in the biomedical domain as a Bio-X SIG Fellow. I’m very fortunate to work with Chris Ré and others in the Hazy, Info, StatsML, DAWN, and QIAI labs.

Research Projects


Data Programming + Snorkel

Snorkel enables users to quickly and easily generate training datasets by writing labeling functions, rather than labeling data by hand. These labeling functions can express a variety of weak supervision signals like heuristics, patterns, and distant supervision, which Snorkel automatically models and combines. Check out tutorials and more at snorkel.stanford.edu!
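The labeling-function idea can be sketched in a few lines of plain Python. This is a toy illustration, not the Snorkel API: each function votes positive, negative, or abstains, and the votes are combined by simple majority as a stand-in for the generative model Snorkel actually learns. The example texts, label constants, and the known entity pair used for distant supervision are all made up for illustration.

```python
import re

# Toy label space: +1 / -1, with 0 meaning "abstain".
POSITIVE, NEGATIVE, ABSTAIN = 1, -1, 0

def lf_keyword(text):
    # Heuristic: the word "causes" suggests a positive relation.
    return POSITIVE if "causes" in text.lower() else ABSTAIN

def lf_pattern(text):
    # Pattern: "no evidence ..." suggests a negative example.
    return NEGATIVE if re.search(r"no evidence", text, re.I) else ABSTAIN

def lf_distant(text, known_pair=("smoking", "cancer")):
    # Distant supervision: both entities of a (hypothetical) known
    # positive pair co-occur in the sentence.
    return POSITIVE if all(w in text.lower() for w in known_pair) else ABSTAIN

LFS = [lf_keyword, lf_pattern, lf_distant]

def label(text):
    # Majority vote over labeling-function outputs; abstentions drop out.
    score = sum(lf(text) for lf in LFS)
    return POSITIVE if score > 0 else NEGATIVE if score < 0 else ABSTAIN

examples = [
    "Smoking causes cancer in many patients.",
    "There is no evidence linking the drug to headaches.",
]
labels = [label(t) for t in examples]  # one noisy training label per example
```

In Snorkel itself, the combination step is not a majority vote: a generative model estimates each labeling function's accuracy and correlations from their agreements and disagreements, producing probabilistic training labels.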

Snorkel MeTaL is a new version of Snorkel focused on handling diverse multi-task supervision, using a scalable new matrix completion approach.

Publications
