I am a researcher and engineer who works primarily on Static Analysis, but my research interests span broadly across Compilers, Program Analysis and Verification. I recently graduated from UCLA with a Ph.D. in Computer Science, and I will be joining Uber's Programming Systems Group. At UCLA, I was part of the Compilers Group, where I worked with my advisor Jens Palsberg on adapting static analysis tools to better meet user expectations (also the title of my thesis) on false positives, analysis time and repairability. Before the Ph.D., I completed a five-year Dual Degree (Integrated Bachelor's & Master's) in Computer Science at the Indian Institute of Technology Madras. There I worked on the runtime design of the X10 task-parallel programming language, and on algorithms for graph matchings.
My research at UCLA centered on making static analysis tools more useful in practice. Traditionally, static analysis tools have been developed as verification tools. That is, they verify that a program is guaranteed not to have a certain kind of error (like cast-check errors or null-pointer errors). Since solving this problem precisely is undecidable, they overapproximate the set of possible program paths. Hence they never miss any errors, but they can report false positives. Scalability and a low false-positive rate were also important design considerations for these tools, but these goals were secondary to the safety property (i.e. never missing an error).
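As a small illustration of why overapproximation produces false positives (a hypothetical example of mine, not one from the thesis): a path-insensitive null-pointer analysis merges the branches of each conditional, so it can lose the correlation between two identical guards and warn about a dereference that can never fail at runtime.

```java
// Hypothetical example: a path-insensitive null-pointer analysis
// may warn on s.length() below, even though it is guarded.
public class Example {
    static String describe(Object o) {
        String s = null;
        if (o != null) {
            s = o.toString();
        }
        if (o != null) {
            // Safe at runtime: s is non-null exactly when o is non-null.
            // But an analysis that merges both branches of the first
            // 'if' only knows s is *possibly* null here, and reports
            // a false positive on this dereference.
            return "length " + s.length();
        }
        return "null input";
    }

    public static void main(String[] args) {
        System.out.println(describe("hi"));  // prints: length 2
        System.out.println(describe(null)); // prints: null input
    }
}
```

The analysis is still sound: it never misses a real null dereference. The cost of that guarantee is warnings like this one, which is exactly the trade-off discussed above.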
However, in practice, static analysis tools are more commonly used as bug-finding tools, not verification tools. That is, these tools are used to automatically identify bugs and security vulnerabilities that are hard to test for or were missed during testing. In the bug-finding setting, programmers do not expect a guarantee on the absence of errors. Further, since these verification tools cannot deal with dynamic features (e.g. reflection in Java), and most modern languages have such features, such guarantees are not even feasible for most real-world programs. Instead, users care more that these tools run in a few seconds (or at most minutes) during compilation or code review and have very low false-positive rates. Programmers will stop using tools that take too long or report more than the occasional false warning. The following empirical study elaborates on this point in more detail.
While existing verification-focused static analysis tools may not be the best fit for a bug-finding setting, they continue to be widely used because redesigning them from scratch, after the two decades of work by hundreds of researchers that built them, is simply too costly. In my opinion, the most practical approach in this situation is to adapt existing tools to the new bug-finding setting. Hence, my research at UCLA focused on pre-processing and post-processing existing static analysis tools to prioritize a low false-positive rate, scalability, and repairability over the safety (i.e. absence of errors) guarantee.
My research work at IIT Madras was in the fields of parallel-programming languages (with Professor V Krishna Nandivada) and graph-matching algorithms (with Professor Meghana Nasre).
One dimension of this question is about what kind of problems I like to work on, and I believe there are two kinds of researchers along this dimension. The first kind are driven by a particular topic or research problem and are only strongly motivated to work on that. The second kind love solving hard problems in general, and are happy to work on any challenging problem that matches their skill set. I think I'm the second kind of researcher, and my ideal problem is ill-specified, open-ended, and requires design thinking.
The second dimension is about the kind of solutions I like to find, and again there are two kinds of researchers here. The first kind like to develop a clean, elegant solution to their problem, even if that sometimes requires simplifying the problem or working on a variation of it that is amenable to theory. The second kind want solutions to real-world problems with real-world constraints, and are willing to accept solutions that are heuristic or empirically justified in nature, as long as they work well in practice. Again, I consider myself the second kind of researcher. I like to do whatever it takes, whether using theory or heuristics, to solve the problem at hand.
Since I find several areas interesting, I also happen to know a little bit about a lot of topics, like a jack of all trades. I can talk about programming languages, software engineering and computer architecture in depth, but I also know a decent amount about machine learning, natural language processing, graph algorithms, operating systems, computer security, biostatistics, and economics.
I love talking about philosophy (sometimes a bit too much), psychology and economics, and I also love reading about them. I like traveling to meet friends and family who don't live in my city.