News

Luke Strgar thinks blockchain can be used to track gun sales in America

Graduating CS senior Luke Strgar thinks he might have a solution for the fraught issue of guns in America: use blockchain to track gun sales. Strgar believes blockchain offers a balance of security, anonymity, and scale that could satisfy people on all sides of the gun-control debate. He spent two days in Washington, D.C. this month pitching the idea of a centralized, ultra-secure, online gun-sale database to legislative aides and think-tank analysts. Such a database could be monitored by everyone and could not be abused by the government. “The goal here is to find a solution that both parties can agree on,” Strgar said. “I am not interested in developing something for one side of the discussion, that people try to force down the throat of parties coming from the other side. One of the nice things about technology is that you can develop systems that work for people.”
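The article does not describe Strgar's actual design, but the core idea of a tamper-evident public sale ledger can be sketched with a simple hash chain, in which each record is bound to the hash of the one before it. The record fields and helper names below are purely illustrative:

```python
import hashlib
import json

def record_hash(record: dict, prev_hash: str) -> str:
    """Hash a sale record together with the previous block's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append_sale(chain: list, record: dict) -> None:
    """Link a new sale record to the end of the chain."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"record": record, "prev": prev,
                  "hash": record_hash(record, prev)})

def verify(chain: list) -> bool:
    """Anyone can re-hash the chain; a tampered record breaks every later link."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev or block["hash"] != record_hash(block["record"], prev):
            return False
        prev = block["hash"]
    return True
```

Because every hash depends on all earlier records, altering any past sale is detectable by anyone holding a copy of the chain, which is the property Strgar argues could satisfy both sides of the debate.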

Nick Carlini embeds hidden commands for Alexa and Siri in recordings of music and spoken text

CS graduate student Nicholas Carlini is featured in a New York Times article titled "Alexa and Siri Can Hear This Hidden Command. You Can’t." He and his advisor, David Wagner, have published a paper showing that they can embed audio instructions, undetectable by human beings, directly into recordings of music or spoken text. These hidden commands can secretly activate the artificial intelligence systems on smartphones and smart speakers, making them dial phone numbers or open websites. In the wrong hands, the technology could be used to unlock doors, wire money, or buy things online, simply with music playing over the radio. “We want to demonstrate that it’s possible,” he said, “and then hope that other people will say, ‘O.K. this is possible, now let’s try and fix it.’ ” Carlini was among a group of researchers who showed in 2016 that they could hide commands in white noise played over loudspeakers and through YouTube videos to get smart devices to turn on airplane mode or open a website.
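The real attack uses gradient descent on a speech-recognition network's transcription loss to find a perturbation too quiet to notice. As a toy illustration only, the same idea can be shown against a stand-in linear "model," where the minimal perturbation that flips the model's decision has a closed form; all names, sizes, and numbers here are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a speech model: a linear scorer over audio samples.
# score > 0 by more than the margin means the model "hears" the command.
w = rng.normal(size=16000)       # stand-in model weights (1 s of 16 kHz audio)
music = rng.normal(size=16000)   # the carrier recording

def score(audio):
    return float(w @ audio)

# Closed-form minimal L2 perturbation pushing the score exactly to the margin.
# (The actual attack optimizes against a deep network, with an audibility
# constraint keeping the change imperceptible to humans.)
margin = 1.0
delta = (margin - score(music)) / (w @ w) * w
adversarial = music + delta
```

The perturbation is tiny relative to the carrier audio, which is the heart of the result: the music sounds unchanged to a person while the model's output is fully controlled.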

Jacque Garcia graduates a champion

Graduating CS senior Jacque Garcia, the president of Cal Boxing, is the focus of a Berkeley News article titled "Longtime fighter graduates as a champion." Garcia, who grew up in Compton and is known for her “mental toughness, determination, dedication and positive attitude,” won the 2018 132-pound National Collegiate Boxing Association (NCBA) championship belt, an Outstanding Boxer Award, and a Cal Boxing women's third-place team award. She was also both a Code2040 Fellow and a CircleCI software engineering intern in 2017, and worked in the Hybrid Ecologies Lab in 2016, helping Ph.D. student Cesar Torres develop features of a 2.5D Computer Aided Design (CAD) tool that reduces the complexity of digital modeling by using grey-scale height maps. Garcia credits the student organization Code the Change for her decision to eventually major in Computer Science. “Graduation is going to be very emotional,” says Garcia. “I didn’t start thinking about college until I was in the eighth grade. I didn’t know if I was going to go to college, I didn’t know how I was going to pay for it. It’s going to be a surreal moment. I can’t believe it’s happening.”
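The grey-scale height-map idea behind that 2.5D tool is straightforward to sketch: each pixel's brightness becomes an extrusion height, so a flat image encodes a relief surface. The helper below is a hypothetical illustration, not code from the actual tool:

```python
import numpy as np

def heightmap_to_surface(gray: np.ndarray, max_height: float = 10.0) -> np.ndarray:
    """Map 8-bit grey values (0-255) to extrusion heights (here in mm).
    Darker pixels stay low; lighter pixels rise toward max_height."""
    return gray.astype(float) / 255.0 * max_height
```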

HäirIÖ: Human Hair as Interactive Material

CS Prof. Eric Paulos and his graduate students in the Hybrid Ecologies Lab, Sarah Sterman, Molly Nicholas, and Christine Dierk, have created a prototype of a wearable color- and shape-changing braid called HäirIÖ. The hair extension is built from a custom circuit, an Arduino Nano, an Adafruit Bluetooth board, shape-memory alloy, and thermochromic pigments. The Bluetooth chip allows devices such as phones and laptops to communicate with the hair, causing it to change shape and color, as well as respond when the hair is touched. Their paper, "Human Hair as Interactive Material," was presented at the ACM International Conference on Tangible, Embedded and Embodied Interaction (TEI) last week. They have posted a how-to guide and instructional videos which include comprehensive hardware, software, and electronics documentation, as well as information about the design process. "Hair is a unique and little-explored material for new wearable technologies," the guide says. "Its long history of cultural and individual expression make it a fruitful site for novel interactions."

Allan Jabri named 2018 Soros Fellow

CS graduate student Allan Jabri has been named a 2018 Paul & Daisy Soros Fellow. Soros Fellowships are awarded to outstanding immigrants and children of immigrants from across the globe who are pursuing graduate school in the United States. Recipients are chosen for their potential to make significant contributions to US society, culture, or their academic fields, and will receive up to $90K in funding over two years. Jabri was born in Australia to parents from China and Lebanon and was raised in the US. He received his B.S. from Princeton, where his thesis focused on probabilistic methods for egocentric scene understanding, and worked as a research engineer at Facebook AI Research in New York before joining Berkeley AI Research (BAIR). He is interested in problems related to self-supervised learning, continual learning, intrinsic motivation, and embodied cognition. His long-term goal is to build learning algorithms that allow machines to autonomously acquire visual and sensorimotor common sense. During his time at Berkeley, he also hopes to mentor students, contribute to open source code projects, and develop a more interdisciplinary perspective on AI.

Stephen Tu wins Google Fellowship

EE graduate student Stephen Tu (advisor: Ben Recht) has been awarded a 2018 Google Fellowship.  Google Fellowships are presented to exemplary PhD students in computer science and related areas to acknowledge contributions to their chosen fields and provide funding for their education and research. Tu's current research interests "lie somewhere in the intersection of machine learning and optimization" although he previously worked on multicore databases and encrypted query processing.  Tu graduated with a CS B.A./ME B.S. from Berkeley in 2011 before earning an EECS S.M. from MIT in 2014.

Making computer animation more agile, acrobatic — and realistic

Graduate student Xue Bin “Jason” Peng (advisors: Pieter Abbeel and Sergey Levine) has made a major advance in realistic computer animation, using deep reinforcement learning to recreate natural motions, even for acrobatic feats like break dancing and martial arts. The simulated characters can also respond naturally to changes in the environment, such as recovering from tripping or being pelted by projectiles. “We developed more capable agents that behave in a natural manner,” Peng said. “If you compare our results to motion-capture recorded from humans, we are getting to the point where it is pretty difficult to distinguish the two, to tell what is simulation and what is real. We’re moving toward a virtual stuntman.” Peng will present his paper at the 2018 SIGGRAPH conference in August.
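Peng's approach rewards the policy for matching motion-capture reference frames, so natural movement emerges from imitation rather than hand-tuned controllers. A minimal numpy sketch of such a pose-imitation reward term, simplified here to plain joint-angle vectors with an illustrative scale constant, might look like:

```python
import numpy as np

def imitation_reward(sim_pose: np.ndarray, ref_pose: np.ndarray,
                     scale: float = 2.0) -> float:
    """Reward in (0, 1]: 1.0 when the simulated joint angles exactly match
    the motion-capture reference, decaying exponentially with squared error.
    (A simplified scalar stand-in for the pose term in imitation rewards.)"""
    err = np.sum((sim_pose - ref_pose) ** 2)
    return float(np.exp(-scale * err))
```

In training, this term is evaluated at every simulation step against the time-aligned reference frame, so the agent is continuously pulled toward the recorded motion while physics handles perturbations like projectiles.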


Ashokavardhanan, Jung, and McConnell named KPCB Engineering Fellows

Undergraduate students Ganeshkumar Ashokavardhanan (EECS + Business M.E.T.), Naomi Jung (CS BA), and Louie McConnell (EECS + Business M.E.T.) have been selected to participate in the 2018 KPCB Engineering Fellows Program, named one of the top five internship programs by Vault. Over the course of a summer, KPCB Engineering Fellows join portfolio companies, where they develop their technical skills and are each mentored by an executive within the company. The program offers students an opportunity to gain significant work experience at Silicon Valley startups, collaborating on unique and challenging technical problems.


AI training may leak secrets to canny thieves

A paper released on arXiv last week by a team of researchers including Prof. Dawn Song and Ph.D. student Nicholas Carlini (B.A. CS/Math '13) reveals just how vulnerable deep learning is to information leakage. The researchers labelled the problem “unintended memorization” and explained that it arises when miscreants gain access to a model and apply a variety of search algorithms. That is not an unrealistic scenario, considering that the code for many models is available online, and it means that text messages, location histories, emails, or medical data can be leaked. The team doesn't “really know why neural networks memorize these secrets right now,” Carlini says. “At least in part, it is a direct response to the fact that we train neural networks by repeatedly showing them the same training inputs over and over and asking them to remember these facts.” The best way to avoid the problem is to never feed secrets in as training data. But when that is unavoidable, developers will have to apply differentially private learning mechanisms to bolster security, Carlini concluded.
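The search-based leak can be illustrated with a toy stand-in for a trained language model: the attacker never reads the training data, but simply ranks candidate secrets by the model's likelihood, and a memorized secret floats to the top. The strings and scores below are entirely invented:

```python
# Toy stand-in for a trained language model's log-likelihood. Here the
# "model" has memorized one training secret; a real attack queries the
# actual network's per-sequence probability instead.
MEMORIZED = "my SSN is 281-09-1234"

def log_likelihood(text: str) -> float:
    # Memorized strings get inflated probability: the leak being measured.
    return -1.0 if text == MEMORIZED else -20.0 - len(text) * 0.1

def extract_secret(candidates):
    """Rank candidate completions by model likelihood; the memorized
    secret wins without the attacker ever seeing the training data."""
    return max(candidates, key=log_likelihood)
```

Differentially private training counters exactly this: it bounds how much any single training example, such as one person's SSN, can shift the model's probabilities.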

Ling-Qi Yan helps to improve computer rendering of animal fur

CS graduate student Ling-Qi Yan (advisors: Ravi Ramamoorthi/Ren Ng) and researchers at U.C. San Diego are the subject of a TechXplore article titled "Scientists improve computer rendering of animal fur." Yan is part of a team that developed a method for dramatically improving the way computers simulate fur, and more specifically, the way light bounces within an animal's pelt. The researchers use a neural network to apply the properties of a concept called subsurface scattering to quickly approximate how light bounces around fur fibers. The neural network only needs to be trained with one scene before it can apply subsurface scattering to all the different scenes with which it is presented, resulting in simulations that run 10 times faster than the current state of the art. "We are converting the properties of subsurface scattering to fur fibers," said Yan. "There is no explicit physical or mathematical way to make this conversion. So we needed to use a neural network to connect these two different worlds." The researchers recently presented their findings at the SIGGRAPH Asia conference in Thailand.
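As a rough illustration of the approach (not the team's actual model), a tiny network can be trained once on samples from a slow light-transport computation and then used as a fast drop-in approximation at render time. The falloff function, network size, and training details below are invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for an expensive scattering computation: how much
# light survives after travelling distance d through a medium.
def scattering(d):
    return np.exp(-1.5 * d)

# Training data: the slow computation is sampled once, offline.
d = rng.uniform(0.0, 2.0, size=(256, 1))
y = scattering(d)

# One-hidden-layer MLP trained by plain full-batch gradient descent.
W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(3000):
    h = np.tanh(d @ W1 + b1)              # hidden activations
    pred = h @ W2 + b2                    # network output
    grad = 2 * (pred - y) / len(d)        # d(MSE)/d(pred)
    gW2 = h.T @ grad; gb2 = grad.sum(0)
    gh = grad @ W2.T * (1 - h ** 2)       # backprop through tanh
    gW1 = d.T @ gh;  gb1 = gh.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

def fast_scattering(dist):
    """Cheap learned approximation, usable in place of the slow computation."""
    dist = np.asarray(dist, dtype=float).reshape(-1, 1)
    return (np.tanh(dist @ W1 + b1) @ W2 + b2).ravel()
```

The payoff mirrors the article's claim: the expensive computation is paid once during training, and every subsequent query is a few matrix multiplies.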