
In the Spotlight: Mingmin Zhao and Building a Bridge Between Machine Learning and Monitoring Health

In the past year, the Department of Computer and Information Science has welcomed an unprecedented number of academic professionals to Penn’s faculty. One of the Assistant Professors who joined both CIS and ESE this past fall is Mingmin Zhao, an MIT graduate whose PhD focused on building wireless sensing systems with artificial intelligence.

The collaboration between CIS and a number of departments at Penn is what encouraged Zhao to further his research and teaching career here.

“Penn provides a fertile ground for interdisciplinary research not only within the CIS department but also with other departments, including ESE, medical school, nursing school, etc.,” said Zhao. “I am very excited about collaborating with people at Penn and working on highly impactful interdisciplinary research.”

Zhao’s research interests include building wireless sensing systems that can perceive people and their activities through physical barriers. He explains that his research “uses machine learning to interpret and analyze wireless reflections to detect humans through walls, track their movements, and recognize their actions, enabling a form of x-ray vision.”

“Through-Wall Human Pose Estimation Using Radio Signals”
Mingmin Zhao, Tianhong Li, Mohammad Abu Alsheikh, Yonglong Tian, Hang Zhao, Antonio Torralba, Dina Katabi
Massachusetts Institute of Technology

With these wireless sensing systems, he has also developed a way for healthcare professionals to track a person’s physiological functions, including sleep, respiration, and heart rate. “These technologies allow us to continuously monitor people’s health without wearable sensors or physical contact with the user.” Zhao said that the startup he joined after graduating is building upon his research to “work with pharmaceutical companies to run clinical trials in people’s homes.”

“Learning Sleep Stages from Radio Signals: A Conditional Adversarial Architecture”
Mingmin Zhao, Shichao Yue, Dina Katabi, Tommi Jaakkola, Matt Bianchi
MIT & Massachusetts General Hospital

When asked what drives the work that he does, Zhao explained that he is passionate about developing sensing technology focused on better understanding humans and their wellbeing.

“New sensing technologies (e.g., contactless monitoring of physiological signals) could help doctors understand various diseases and how patients are doing after taking medications,” said Zhao. “They could enable new digital health and precision medicine solutions that improve people’s life.”

Mingmin Zhao is currently teaching CIS 7000, which focuses on wireless mobile sensing and building AIoT (artificial intelligence of things) systems. He looks forward to teaching his students to apply what they have learned about building “hardware-software systems” to practical problems that can impact the world.


In the Spotlight: Osbert Bastani and Integrating Machine Learning into Real-world Settings

Osbert Bastani, Assistant Professor in the Computer and Information Science Department in the School of Engineering, University of Pennsylvania

Many students and faculty alike may recognize the face above as Osbert Bastani. That’s because this Assistant Professor is not a new member of the Penn Engineering team: Osbert joined the Computer and Information Science Department as a Research Assistant Professor in 2018, specializing in programming languages and machine learning.

“Penn has a great group of faculty working on interesting research problems, and they are all incredibly supportive of junior faculty. I’ve been fortunate enough to collaborate with Penn CIS faculty in a range of disciplines, from programming languages to NLP to theory, and I hope to have the chance to collaborate with many more.” (Osbert Bastani)

Osbert began his research career in programming languages. A major challenge in this area is “verifying correctness properties for software systems deployed in safety-critical settings.” He explains that because machine learning is progressively being incorporated into these systems, verification has become even more difficult. In his research, he is tackling the overarching question: “How can one possibly hope to verify that a neural network guiding a self-driving car correctly detects all obstacles?” While progress has been made in trustworthy machine learning, there is still a long road ahead to finding solid solutions.

Working with Ph.D. students on various topics and research projects is what he has looked forward to most as he entered this new role at the start of the fall semester. Since the school year began, he has been teaching Applied Machine Learning (CIS 4190/5190) with Department Chair Zachary Ives. When asked how the semester is going, Osbert replied:

“I’ve been very fortunate to have strong students with very diverse interests, meaning I’ve had the opportunity to learn a great deal from them on a variety of topics ranging from convex duality for reinforcement learning to graph terms in linear logic. An incoming PhD student and I are now learning about diffusion models in deep learning, which are really exciting!” (Osbert Bastani)

While teaching, Osbert is also involved in several research projects dealing with trustworthy machine learning in real-world settings. One project, which raises several questions about fairness and interpretability, involves “building a machine learning pipeline to help allocate limited inventories of essential medicines to health facilities in Sierra Leone.” In addition, one of Osbert’s students, during a summer internship at Meta, has been “developing deep reinforcement learning algorithms that can learn from very little data by pretraining on a huge corpus of human videos.”

Osbert Bastani wears many hats in the CIS Department. Not only is he involved in teaching and research projects with students, he is also a member of several groups within the department, including PRECISE, PRiML, PLClub, and the ASSET Center. He encourages all students to attend the seminars that each group holds for the opportunity to learn about research both within and outside their own areas.

Just as Osbert works to solve problems in the classroom and in his research, he does much the same outside of work as well! He is an avid board game player and frequents “The Board and Brew,” a restaurant just down the street from Penn. He and his wife have played through the restaurant’s entire collection of the game “Unlock!”. The Board and Brew has great food and several hundred games to choose from, and it comes highly recommended by Osbert himself!

https://www.theboardandbrew.com/game/

In the Spotlight: Eric Wong and Developing Debuggable AI Systems

What happens when AI goes wrong? Probably not the Terminator or the Matrix, despite what Hollywood suggests, but rather something that could still harm a human, such as a self-driving car that gets into an accident or an algorithm that discriminates against certain people. Fortunately, Penn has innovative researchers like Eric Wong who build tools to make sure AI works correctly!

You may have already seen Eric on campus, perhaps teaching his advanced graduate class. Just like the Class of 2026, who are quickly learning their way around Levine Hall, Eric is one of the CIS Department’s newest faculty members. An Assistant Professor working in machine learning, Eric is a Carnegie Mellon Ph.D. graduate and a former postdoctoral researcher in MIT’s Computer Science and Artificial Intelligence Laboratory.

With this semester in full swing, Eric Wong is busy at work teaching course 7000-05: Debugging Data and Models. When asked what he is looking forward to most about teaching at Penn Engineering, Eric stated,

“One of the key skills that students will learn is how to tinker with AI systems in order to debug and identify their failure modes. I’m excited to see the new ways in which Penn Engineering students will break AI systems, as well as the innovations they come up with to repair them!”

The initiatives that Penn Engineering has launched in recent years are what drew Eric to the CIS Department, specifically the ASSET Center. “Penn Engineering is well-situated to ensure that the tools and systems we develop as computer scientists actually satisfy the needs and requirements of those that want to use them,” said Eric. He will be one of many faculty members working with ASSET to develop reliable and trustworthy AI systems, work that coincides with his own research.

Some of Eric’s specialized interests in this field include “verifying safety properties of an AI-system, designing interpretable systems, and debugging the entire AI pipeline (i.e. the data, models, and algorithms).” His research aims to make AI systems debuggable, so that users can understand a system’s decision process and learn how to inspect its defects. Eric is also interested in the interdisciplinary work of connecting these methods to fields outside of engineering. Collaborators in medicine, security, autonomous driving, and energy would “ensure that the fundamental methods we develop are guided by real-world issues with AI reliability.”

As AI is being developed and deployed at a rapid rate, Eric worries that “it is only a matter of time before the ‘perfect storm’ induces a catastrophic accident for a deployed AI system.” By teaching methods for debugging AI systems, he strives to give his students the tools and knowledge to build safer and more trustworthy AI for the future. He hopes that, through his research and his teaching in the classroom, students will take the time to “critically examine their own system” before sending it out into the world.

When Eric is not making sure AI systems are at the top tier of trustworthiness and reliability, he enjoys trying to recreate the recipes of meals that he orders at restaurants. Trying to “reverse engineer its creation process” is harder than it might seem. Eric mentioned that “It does not always look the same as the original, nor does it always taste as good, but sometimes it works!” Maybe someday that too will be something an AI can do (correctly)!


Secure Imprecision: Professor Andreas Haeberlen speaks on the importance of Differential Privacy

Left: Andreas Haeberlen
Right: “Informal Definition of Differential Privacy,” courtesy of the National Institute of Standards and Technology
October is Cybersecurity Awareness Month. This article is part of a cybersecurity-focused series.

Last year, during the peak of the COVID-19 pandemic in the US, testing and contact tracing failed to quell the spread. Many circumstances, including decades of underfunding of state health departments and a slow build-up of the tracing workforce, contributed to this outcome.

However, according to Department of Computer and Information Science Professor Andreas Haeberlen, one of the main reasons contact tracing wasn’t more successful is simple: people don’t feel comfortable sharing their information.

“It’s really scary to think of people knowing all the things that you type in your phone,” said Haeberlen. “Like what you’ve had for breakfast, or your medical information, or where you’ve been all day or who you’ve met. All of that data is super super sensitive.” 

Haeberlen, whose research centers on distributed systems, networking, security, and privacy, believes that differential privacy could be the solution.

“Differential privacy is a way to process private information so that you can really guarantee that somebody can’t later learn something sensitive from this information,” said Haeberlen. “[It] has a very solid mathematical foundation.”

The National Institute of Standards and Technology (NIST) defines differential privacy as a mathematical property rather than a particular technique. “It is not a specific process, but a property that a process can have,” NIST states on its website. “For example, it is possible to prove that a specific algorithm ‘satisfies’ differential privacy.”

In other words, if an analysis of a database that includes Joe Citizen’s individual data and the same analysis of a database without it yield (nearly) indistinguishable results, then differential privacy is satisfied. “This implies that whoever sees the output won’t be able to tell whether or not Joe’s data was used, or what Joe’s data contained,” NIST explains.
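For readers who want the precise statement, the standard formal definition (not quoted in the article, but the one used throughout the research literature) says that a randomized algorithm M is ε-differentially private if, for any two databases D and D′ that differ in one person’s record, and for any set of possible outputs S,

Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S].

The smaller the privacy parameter ε, the harder it is to tell the two cases apart, which is exactly the “indistinguishable results” guarantee described above.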

Haeberlen argues that, with widespread application of differential privacy, user trust is no longer a barrier because it is no longer required: surrendering our sensitive information to large corporations such as Apple would no longer demand a leap of faith.

Building the tools

A popular industry approach to protecting individual user data involves adding imprecision to results to purposefully skew them. Challenges to this approach, according to Haeberlen, include the ongoing debate among experts about whether it satisfies the differential privacy definition, and its lack of scalability.

“Fuzzi: A Three-Level Logic for Differential Privacy,” a paper by Haeberlen and fellow researchers Edo Roth, Hengchu Zhang, Benjamin C. Pierce and Aaron Roth, is one of many works in Haeberlen’s oeuvre focused on developing tools that can do the work for us. The paper presents a prototype called Fuzzi, whose top level of operational logic “is a novel sensitivity logic adapted from the linear-logic-inspired type system of Fuzz, a differentially private functional language,” according to the abstract.

Essentially, a researcher would input data into the tool, define what that data means, and specify what output they are looking for. The tool can then state whether that output satisfies differential privacy and, if not, how much imprecision would need to be added in order to meet the specification.
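To make the role of calibrated imprecision concrete, here is a minimal sketch in Python of the classic Laplace mechanism, a standard building block of differentially private analysis. This is not the Fuzzi tool itself, and the function and data names are hypothetical; the sketch only illustrates how the amount of noise follows from the query’s sensitivity and the privacy parameter epsilon.

import numpy as np

def dp_count(records, predicate, epsilon):
    # A counting query changes by at most 1 when one person's record is
    # added or removed (sensitivity = 1), so Laplace noise with scale
    # 1/epsilon is enough to satisfy epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical example: count flu-positive respondents in a small survey.
survey = [{"name": "Joe Citizen", "positive": True},
          {"name": "Jane Doe", "positive": False}]
print(dp_count(survey, lambda r: r["positive"], epsilon=0.5))

With or without Joe Citizen’s record, the true count differs by at most one, and the added Laplace noise makes the two cases statistically hard to distinguish; choosing a smaller epsilon adds more imprecision and therefore more privacy.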

“The way that we did that was by baking differential privacy into a programming language,” said Haeberlen. “As a practitioner you don’t have to understand what differential privacy is, you also don’t have to be able to prove it.” 

In the world of science, imprecision usually means error and gross miscalculation. However, in the more specific realm of differential privacy, imprecision equals security. 

“Imprecision is good because it causes the adversary to make mistakes,” said Haeberlen. In this case, the “adversary” is any person or system trying to gain access to sensitive information. 

All tools developed by Haeberlen and his team have been made available under open-source licenses, and companies such as Uber and Facebook are currently releasing data sets using differential privacy.

Visit Professor Andreas Haeberlen’s page to learn more about his current projects and recent publications.  

Professor Susan Davidson honored with VLDB Women in Database Research Award

Professor Susan Davidson of the Department of Computer and Information Science (CIS), who has spent nearly 40 years teaching in the department, has been awarded the 2021 VLDB (Very Large Data Bases) Women in Database Research Award.

Sponsored by the non-profit organization VLDB Endowment Inc., the award focuses on the cumulative lifetime work of the researcher. Davidson was specifically honored “for groundbreaking work in the areas of data integration, data provenance and her efforts in cross-disciplinary research, namely bridging databases and biology.”

“Really, it was more that I was one of the early people to help define what interesting topics there were in bioinformatics,” said Davidson.

The former Department Chair of CIS wrote an award acceptance speech titled “It’s not just Cookies and Tea” that blended the focal points of her life’s work (data integration, provenance, and concurrency) with her personal life. The two are often inextricable.

“I talked about my parents and how they influenced where I am today: that was provenance,” said Davidson. “I talked about how I’ve built programs to recruit, retain and promote women in engineering and computer science. You have to integrate, as well as have cookies and tea.”

Davidson’s advocacy for other women, both within the engineering field and beyond, has also been a defining facet of her professional career. The founder of Advancing Women in Engineering (AWE) at Penn hoped her speech would also serve as a source of motivation.

“I was also really trying to encourage other women,” said Davidson. “I know that it’s been extremely hard for women with young children during the pandemic.”

The Women in Database Research Award is one of many presented at the annual VLDB Conference, hosted this year in hybrid format from August 16 to 20 in Copenhagen, Denmark. According to the VLDB site, “this series is perhaps the most international (in terms of participation, technical content, organization, and location) among all comparable events.”