
Professor, Team Wins Prize for Privacy Concept Used by Google

May 26, 2017 – Kobbi Nissim, the McDevitt Chair in Computer Science at Georgetown, and his colleagues at Harvard and Penn State have won the 2017 Gödel Prize for creating a new privacy concept now used by Google and Apple.

The prize, named for the late Austrian-American mathematician Kurt Gödel, is sponsored by the European Association for Theoretical Computer Science and the Association for Computing Machinery’s Special Interest Group on Algorithms and Computation Theory. 

The Gödel Prize recognizes outstanding papers in theoretical computer science published within the past 14 years.

Differential Privacy

“Putting privacy in computation on solid theoretical grounds is a main interest of my work,” Nissim says, “and in 2006 my colleagues and I introduced a concept that became central to this endeavor.” 

The concept was differential privacy, a rigorous mathematical framework for protecting individual privacy in statistical analysis and online data collection.

With sensitive personal information constantly being collected and analyzed, privacy is more important than ever, he says, but even the published outcomes of such analyses can reveal details that harm people's privacy.

“Let’s say ‘Alice’ computed the average of Georgetown salaries on Sunday and again on Monday but didn’t know that between those days I had joined the university,” he explains. “Given the two averages, if you know I joined in between, you can determine my salary. This is a risk to my privacy that would be mitigated by differential privacy.”
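The differencing attack Nissim describes can be shown in a few lines. The salary figures below are invented purely for illustration; the point is that two exact averages plus the headcounts are enough to pin down the newcomer's salary:

```python
# Hypothetical salary figures, invented to illustrate the differencing attack.
sunday_salaries = [90_000, 110_000, 100_000]           # before the new hire
monday_salaries = sunday_salaries + [120_000]          # after the new hire

sunday_avg = sum(sunday_salaries) / len(sunday_salaries)   # 100,000
monday_avg = sum(monday_salaries) / len(monday_salaries)   # 105,000

# Alice sees only the two averages and the two headcounts, yet she can
# recover the newcomer's exact salary by subtracting the two totals:
n_sun, n_mon = len(sunday_salaries), len(monday_salaries)
recovered = monday_avg * n_mon - sunday_avg * n_sun
print(recovered)  # 120000.0
```

Differential privacy blunts this attack by ensuring the released averages are randomized enough that the two releases no longer determine any one person's value.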

Census Bureau-Georgetown Project

“Differential privacy is a requirement of statistical analyses,” Nissim says. “An analysis is differentially private if the likelihood of every potential outcome is almost the same whether my information is included in the analysis or not.

“This implies that if I contribute my information to a differentially private analysis, my chances of incurring harm, such as being denied insurance, would be almost the same as if my information were not used. A similar protection is provided simultaneously to all those whose information is used.”
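One standard way to meet this requirement is the Laplace mechanism: add noise calibrated so that one person's record can barely shift the likelihood of any outcome. The sketch below is a minimal illustration, not Nissim's own implementation; the salary bounds and the epsilon value are assumptions chosen for the example.

```python
import random

def dp_average(values, epsilon, lower, upper):
    """Release an epsilon-differentially-private mean via the Laplace mechanism."""
    n = len(values)
    # Clamp each value so one record's influence on the mean is bounded.
    clamped = [min(max(v, lower), upper) for v in values]
    true_avg = sum(clamped) / n
    # Changing one record moves the clamped mean by at most (upper - lower) / n.
    sensitivity = (upper - lower) / n
    scale = sensitivity / epsilon
    # A Laplace(scale) sample is the difference of two i.i.d. exponentials.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_avg + noise

salaries = [90_000, 110_000, 100_000, 120_000]   # illustrative data
print(dp_average(salaries, epsilon=0.5, lower=0, upper=200_000))
```

Smaller epsilon means more noise and stronger privacy; whether anyone's record is included or excluded, every possible released value stays almost equally likely.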

The U.S. Census Bureau, Google’s Chrome browser, and Apple’s iOS 10 platform all now use differentially private computations to protect the personal information of citizens and customers, the professor says.

He notes that he is leading a project with the Census Bureau that focuses on both quantitative and legal standards for the protection of privacy in the collection and use of data.

Influencing Policy

“A lot of my research is in collaboration with social scientists, ethicists and lawyers,” Nissim says of his work.

The connection between the university and the Census Bureau recently expanded when Georgetown’s McCourt School of Public Policy opened the first Research Data Center (RDC) in Washington, a joint project of the Census Bureau and the school’s Massive Data Institute.

“Georgetown is near great institutions in D.C. that influence policy and the adoption of strong privacy tools that have a good theoretical basis,” Nissim says.