CS graduate student Xin Lyu (advisors: Jelani Nelson and Avishay Tal) has won the Best Student Paper Award at the Computational Complexity Conference (CCC) 2022. The solo-authored paper, titled “Improved Pseudorandom Generators for AC^0 Circuits,” was one of two co-winners of the award. CCC is an annual conference on the inherent difficulty of computational problems in terms of the resources they require; organized by the Computational Complexity Foundation, it is the premier specialized publication venue for research in complexity theory.
Two of EECS Prof. Ren Ng's former graduate students, Pratul Srinivasan and Benjamin Mildenhall, jointly received an honorable mention for the 2021 Association for Computing Machinery (ACM) Doctoral Dissertation Award. This award is presented annually to the "author(s) of the best doctoral dissertation(s) in computer science and engineering." Srinivasan and Mildenhall, who both currently work at Google Research, were recognized "for their co-invention of the Neural Radiance Field (NeRF) representation, associated algorithms and theory, and their successful application to the view synthesis problem." Srinivasan’s dissertation, "Scene Representations for View Synthesis with Deep Learning," and Mildenhall’s dissertation, “Neural Scene Representations for View Synthesis,” addressed a long-standing open problem in computer vision and computer graphics called the "view synthesis" problem: If you provide a computer with just a few photographs of a scene, how can you get it to predict new images from any intermediate viewpoint? "NeRF has already inspired a remarkable volume of follow-on research, and the associated publications have received some of the fastest rates of citation in computer graphics literature—hundreds in the first year of post-publication."
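The NeRF representation at the heart of both dissertations can be sketched as a small neural network that maps a 3D position and viewing direction to a color and a volume density. Below is a minimal, untrained illustration in Python/NumPy; the layer sizes and function names (`init_mlp`, `nerf_field`) are illustrative assumptions, and a real NeRF adds positional encoding and renders images by integrating these queries along camera rays.

```python
import numpy as np

def init_mlp(sizes, rng):
    """Random weights for a small fully connected network (illustrative only)."""
    return [(rng.normal(0, 0.1, (m, n)), np.zeros(m)) for n, m in zip(sizes, sizes[1:])]

def nerf_field(xyz, view_dir, params):
    """Toy NeRF-style field: (x, y, z, viewing direction) -> (RGB color, density)."""
    h = np.concatenate([xyz, view_dir])      # 5D input: position plus two view angles
    for W, b in params[:-1]:
        h = np.maximum(W @ h + b, 0.0)       # ReLU hidden layers
    W, b = params[-1]
    out = W @ h + b
    rgb = 1.0 / (1.0 + np.exp(-out[:3]))     # sigmoid keeps colors in [0, 1]
    sigma = np.log1p(np.exp(out[3]))         # softplus keeps density non-negative
    return rgb, sigma

rng = np.random.default_rng(0)
params = init_mlp([5, 64, 64, 4], rng)
rgb, sigma = nerf_field(np.array([0.1, 0.2, 0.3]), np.array([0.5, 1.0]), params)
```

Training fits the network weights so that images rendered from this field reproduce the input photographs; view synthesis then simply queries the trained field from a new camera pose.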
A paper with lead author EECS postdoc Efrat Shimron and co-authors EECS graduate student Ke Wang, UT Austin professor Jonathan Tamir (EECS PhD ’18), and EECS Prof. Michael Lustig shows that algorithms trained using "off-label" or misapplied massive, open-source datasets are subject to integrity-compromising biases. The study, which was published in the Proceedings of the National Academy of Sciences (PNAS), highlights some of the problems that can arise when data published for one task are used to train algorithms for a different one. For example, medical imaging studies that use preprocessed images may produce skewed findings that cannot be replicated by others working with the raw data. The researchers coined the term “implicit data crimes” to describe research results that are biased because algorithms were developed using faulty methodology. “It’s an easy mistake to make because data processing pipelines are applied by the data curators before the data is stored online, and these pipelines are not always described. So, it’s not always clear which images are processed, and which are raw,” said Shimron. “That leads to a problematic mix-and-match approach when developing AI algorithms.”
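The effect the authors describe can be illustrated with a toy 1D experiment (a hypothetical sketch, not their MRI pipeline): a denoising algorithm benchmarked on data that a curator has already smoothed before publication reports a lower error than it actually achieves on raw data.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 512)
truth = np.sin(t)                             # ground-truth signal
raw = truth + rng.normal(0.0, 0.5, t.size)    # what a sensor would actually measure

def moving_average(x, k):
    return np.convolve(x, np.ones(k) / k, mode="same")

# Hidden curation step: the "open dataset" was smoothed before being posted.
curated = moving_average(raw, 5)

def denoise(x):
    """The algorithm under evaluation (a simple 3-point smoother)."""
    return moving_average(x, 3)

mse_raw = np.mean((denoise(raw) - truth) ** 2)          # honest benchmark
mse_curated = np.mean((denoise(curated) - truth) ** 2)  # optimistically biased
```

Because the curated data has already had much of its noise removed, `mse_curated` comes out lower than `mse_raw`, overstating how well the algorithm would perform in deployment on raw measurements.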
CS Profs. Alistair Sinclair and Shafi Goldwasser have won inaugural Test of Time awards at the 2021 Symposium on Theory of Computing (STOC), sponsored by the ACM Special Interest Group on Algorithms and Computation Theory (SIGACT). Sinclair won the 20 Year award for his paper, “A polynomial-time approximation algorithm for the permanent of a matrix with non-negative entries," which solved a problem that had been open for decades. Goldwasser won the 30 Year award for "Completeness theorems for non-cryptographic fault-tolerant distributed computation," which showed how to compute a distributed function even if up to one-third of the participants fail, misbehave, or act maliciously. The awards were presented at the 2021 STOC conference in June.
CS alumna and Prof. Shafi Goldwasser (Ph.D. '84, advisor: Manuel Blum) has won the 2021 Foundations of Computer Science (FOCS) Test of Time Award. This award "recognizes papers published in past Annual IEEE Symposia on Foundations of Computer Science (FOCS) for their substantial, lasting, broad, and currently relevant impact. Papers may be awarded for their impact on Theory of Computing, or on Computer Science in general, or on other disciplines of knowledge, or on practice." Goldwasser is among five co-authors who won the award in the 30 year category for their groundbreaking complexity theory paper "Approximating Clique is Almost NP-Complete," which used the classification of approximation problems to show that some problems in NP remain hard even when only an approximate solution is needed.
EECS Prof. Emeritus Lotfi Zadeh (1921 - 2017) is being honored with a Google Doodle feature today. In 1964, Zadeh conceived a new mathematical concept called fuzzy logic, which offered an alternative to rigid yes-no logic in an effort to mimic how people see the world. He proposed using imprecise data to solve problems that might have ambiguous or multiple solutions by creating sets whose elements have a degree of membership. Considered controversial at the time, fuzzy logic has been hugely influential in both academia and industry, contributing to, among other things, "medicine, economic modelling and consumer products such as anti-lock braking, dishwashers and elevators." Zadeh's seminal paper, "Fuzzy Sets," published in the journal Information and Control, was submitted for publication 57 years ago today.
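Zadeh's degree-of-membership idea is easy to state concretely: in classical logic a temperature is either "warm" or not, while in a fuzzy set it is warm to some degree between 0 and 1. The triangular membership function and the 15/25/35 °C breakpoints below are illustrative choices for this sketch, not taken from Zadeh's paper.

```python
def triangular(x, a, b, c):
    """Degree of membership: rises linearly from a to b, falls linearly from b to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def warm(temp_c):
    """Fuzzy set 'warm temperature' with illustrative breakpoints 15/25/35 C."""
    return triangular(temp_c, 15.0, 25.0, 35.0)

# Fuzzy logic replaces Boolean AND/OR with min/max over membership degrees.
def fuzzy_and(d1, d2):
    return min(d1, d2)
```

So 25 °C is fully warm (degree 1.0), 20 °C is warm to degree 0.5, and 10 °C is not warm at all, rather than forcing a single yes/no cutoff.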
CS Prof. Michael Jordan has co-written an article in Wired titled "The Turing Test Is Bad for Business" in which he argues that now that "computers are able to learn from data and...interact, infer, and intervene in real-world problems, side by side with humans," humans should not try to compete with them but "focus on how computers can use data and machine learning to create new kinds of markets, new services, and new ways of connecting humans to each other in economically rewarding ways." Jordan wrote the article because many AI investors are focusing on technologies with the goal of exceeding human performance on specific tasks, such as natural language translation or game-playing. “From an economic point of view, the goal of exceeding human performance raises the specter of massive unemployment,” he said. “An alternative goal for AI is to discover and support new kinds of interactions among humans that increase job possibilities.”
CS alumni Xiaoye Sherry Li (Ph.D. '96, advisor: James Demmel) and Richard Vuduc (Ph.D. '03, advisor: James Demmel) have, along with Piyush Sao of Georgia Tech, won the 2022 Society for Industrial and Applied Mathematics (SIAM) Activity Group on Supercomputing (AG/SC) Best Paper Prize. This prize recognizes "the author or authors of the most outstanding paper in the field of parallel scientific and engineering computing published in English in a peer-reviewed journal." Their paper, "A communication-avoiding 3D algorithm for sparse LU factorization on heterogeneous systems,” was published in 2018 in the proceedings of the IEEE International Parallel and Distributed Processing Symposium (IPDPS). Li is now a Senior Scientist at Lawrence Berkeley National Laboratory (LBNL), where she works on diverse problems in high performance scientific computation, including parallel computing, sparse matrix computations, high precision arithmetic, and combinatorial scientific computing. Vuduc, now an Associate Professor in the School of Computational Science and Engineering at Georgia Tech, is interested in high-performance computing, with an emphasis on algorithms, performance analysis, and performance engineering.
EECS alumna Hani Gomez (Ph.D. '20, advisor: Kris Pister) is the subject of a Berkeley Computing, Data Science, and Society (CDSS) profile titled "Hani Gomez, Ph.D.: Computing Pedagogy at the Nexus of Technology and Social Justice." Gomez was born in Bolivia and earned her B.S. in EE at the University of South Carolina before coming to Berkeley for her graduate studies. She has merged social justice and technology into a postdoctoral research position at Berkeley, split between EECS and the Human Contexts and Ethics (HCE) program in CDSS. Gomez helped develop the course "CS 194-100: EECS for All: Social Justice in EECS" last spring, was one of three presenters in a June HCE workshop titled "Towards Social Justice in the Data Science Classroom," and serves on the EECS Anti-Racism Committee. She says the preoccupation with perfectionism at Berkeley "doesn’t leave room [for you] to learn from your mistakes...You need to give yourself room to learn or unlearn, to grow and relearn.”
EECS alumnus Yang You (Ph.D. '20, advisor: James Demmel) received one of two honorable mentions for the 2020 ACM Special Interest Group in High Performance Computing (SIGHPC) Dissertation Award. You was selected for developing LARS (Layer-wise Adaptive Rate Scaling) and LAMB (Layer-wise Adaptive Moments for Batch training) to accelerate machine learning on HPC platforms. His thesis, “Fast and Accurate Machine Learning on Distributed Systems and Supercomputers,” focuses on improving the speed and accuracy of machine learning training to optimize the use of parallel programming on supercomputers. You made the Forbes 30 Under 30 2021 Asia list for Healthcare and Science in April and is now a Presidential Young Professor of Computer Science at the National University of Singapore.
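At its core, LARS scales each layer's step by a "trust ratio" of the layer's weight norm to its gradient norm, so no layer's weights are overwhelmed by disproportionately large gradients during very-large-batch training. The sketch below is a simplified version (omitting the momentum term of real implementations; `eta` is the trust coefficient):

```python
import numpy as np

def lars_step(w, grad, base_lr, eta=0.001, weight_decay=0.0):
    """One simplified LARS update for a single layer's weight vector."""
    update = grad + weight_decay * w
    w_norm = np.linalg.norm(w)
    u_norm = np.linalg.norm(update)
    # Layer-wise trust ratio: shrink the step relative to the weight magnitude.
    trust = eta * w_norm / u_norm if w_norm > 0 and u_norm > 0 else 1.0
    return w - base_lr * trust * update

w = np.ones(4)
step_small = w - lars_step(w, np.full(4, 0.01), base_lr=1.0)   # tiny gradient
step_large = w - lars_step(w, np.full(4, 100.0), base_lr=1.0)  # huge gradient
```

With `weight_decay=0`, the step norm works out to `base_lr * eta * ||w||` regardless of the gradient's magnitude, which is the property that keeps training stable across layers at very large batch sizes.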