The recent boom in machine learning has created an ideal situation for second-year Research Assistant Professor (RAP) Brian Bullins. Dr. Bullins, a 2019 Princeton PhD graduate, conducts research in optimization and machine learning alongside TTIC Professor Nathan Srebro.
“The way learning works is that there’s some model behind the scenes that is learned through an iterative process. That process by which these models are learned is where the optimization procedures come into the picture. I’m very interested in mathematical methods themselves, and viewing these problems as mathematical formulae and thinking about how to develop the methods to optimize them efficiently with the resources that we have,” said Dr. Bullins.
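The iterative process he describes can be made concrete with a small example. The sketch below is purely illustrative and is not Dr. Bullins's own method: it runs plain gradient descent on a simple least-squares objective, with the step size and synthetic data chosen only for demonstration.

```python
# Illustrative sketch (not Dr. Bullins's specific method): the iterative
# optimization loop described above, shown as plain gradient descent on a
# simple least-squares objective f(w) = ||Xw - y||^2 / (2n).
import numpy as np

def gradient_descent(X, y, step_size=0.1, num_steps=100):
    n, d = X.shape
    w = np.zeros(d)                      # the "model behind the scenes"
    for _ in range(num_steps):           # the iterative learning process
        grad = X.T @ (X @ w - y) / n     # gradient of the objective at w
        w -= step_size * grad            # one optimization step
    return w

# Example usage on synthetic data (values are arbitrary, for demonstration only)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.01 * rng.normal(size=200)
w_hat = gradient_descent(X, y)
```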
Recently, he has been particularly interested in the problem of parallel optimization. Machine learning models have grown dramatically in recent years, both in model size and in the size of the data sets they are trained on. It has become difficult to keep all of this information on a single machine, even one with extensive memory, so researchers have turned to spreading the optimization problem across multiple machines.
“Going from a single machine optimization problem to having it distributed over several machines comes with its own interesting set of problems. I’ve been working closely with Professor Nati Srebro and several of his students, including Blake Woodworth and Kumar Kshitij Patel. We want to optimize this function, and we have to do it where there is access to machines that contain different sets of information. We have to balance these constraints against various other things and the methods themselves, and ask how we can better design methods to take advantage of this structure,” said Dr. Bullins.
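The setting he describes, machines that each hold different data and must coordinate to optimize a shared objective, is often illustrated with local-update methods. The sketch below is a generic example of that pattern (local gradient steps followed by periodic averaging), not the specific algorithms developed by the group; the least-squares objective, the data split, and all parameters are assumptions chosen for demonstration.

```python
# Illustrative sketch of one common pattern in distributed optimization
# (local updates with periodic averaging); the objective, data split, and
# parameters are assumptions for demonstration, not the group's algorithm.
import numpy as np

def local_gd_with_averaging(shards, step_size=0.1, rounds=20, local_steps=5):
    """Each 'machine' holds its own data shard (X_m, y_m); machines take a few
    local gradient steps, then communicate to average their models."""
    d = shards[0][0].shape[1]
    w = np.zeros(d)                               # shared starting point
    for _ in range(rounds):                       # communication rounds
        local_models = []
        for X_m, y_m in shards:                   # the machines in this sketch
            w_m = w.copy()
            for _ in range(local_steps):          # cheap local computation
                grad = X_m.T @ (X_m @ w_m - y_m) / len(y_m)
                w_m -= step_size * grad
            local_models.append(w_m)
        w = np.mean(local_models, axis=0)         # the communication step
    return w

# Example: split synthetic data across 4 "machines"
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 5))
w_true = rng.normal(size=5)
y = X @ w_true + 0.01 * rng.normal(size=400)
shards = [(X[i::4], y[i::4]) for i in range(4)]
w_hat = local_gd_with_averaging(shards)
```

Even in this toy version, the tension is visible: taking more local steps saves communication, but lets the machines' models drift apart, a trade-off of the kind described above.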
When he first became involved with optimization, he was especially intrigued by the mathematical structure behind the problems. He wanted to work on the underlying mathematics and, hopefully, prove some interesting theorems. Soon after getting into optimization, Dr. Bullins found what he feels is the natural complement to this line of work: problems in machine learning, where these optimization methods are most readily applicable.
He finds it especially interesting that one can both develop practical methods and prove their properties. “Being able to write some code that will implement that method, and run it for these problems and see some improvement over other methods is really a nice end-to-end satisfying feeling that it has applicability,” said Dr. Bullins. “It’s not just math on paper, you’re seeing these models learn, and then several steps down the line, practitioners are able to use these methods and see improvements.”
Unlike most postdocs, RAPs at TTIC are in a unique position: they have many of the same privileges as faculty members without being required to teach. For example, they receive endowment-provided independent research funding and can serve as the principal investigator (PI) on grants. There are opportunities to gain teaching experience, but not having that obligation leaves much more time for their own research interests.
“In many ways, it’s the best of both worlds. What it has meant for me is the chance to really get to explore all of the research ideas that I’m interested in. There are also so many great faculty members and students. I’ve been able to establish really great collaborations. All of those things have come together to make for such a unique and amazing experience. I couldn’t be happier,” said Dr. Bullins. “The people at TTIC in general are absolutely fantastic. I’m especially lucky that there are such strong groups in the institute that work on problems related to what I work on, which is this interesting intersection of optimization and machine learning.”
Next year, Dr. Bullins will be starting his third and final year as an RAP. He is hoping to stay in academia, which means he will be applying for faculty positions starting in the fall. In terms of research, he hopes to continue investigating all of the curious directions that open up as progress is made in problems of distributed optimization. “Every time you think you’ve solved the problem, you realize that there are two or three other interesting directions, and you wonder, what could that mean, if we were to take that a few steps further?” said Dr. Bullins.
His time at TTIC as an RAP has been akin to a professorship in training. “It’s given me a chance to get a first sense of what it would be like to advise my own students. Working with Professor Srebro’s group, I have the chance to direct parts of projects. It’s good to develop a sense of how you determine a research agenda, and find interesting questions,” he said. Though there are plenty of challenges in his field of research, Dr. Bullins feels that he will be better equipped to handle them after his experience at TTIC.