Affiliated Event: Beyond Crypto: A TCS Perspective (August 19)
Happy Hour, 5pm-7pm, UCSB Lagoon Plaza, right outside Corwin Pavilion
The foundational study of cryptography has had far-reaching impact in theoretical computer science and beyond. In turn, cryptography has benefitted greatly from advances and challenges raised in a host of disciplines. This deep, rich, and mutually-beneficial web of connections spans varied fields such as complexity theory, machine learning, coding theory, game theory, distributed computing, the study of privacy and fairness in statistical data analysis, quantum computation, data structures and more.
Developing such connections, and unearthing new ones, is vital to the future success of cryptography. The goal of this workshop is highlighting work and research avenues that can broaden the reach and impact of cryptography and open the door to fruitful interactions and connections between the foundations of cryptography and other fields in and beyond theoretical computer science.
Towards this end, the workshop program will include a carefully curated collection of tutorials and talks about recent developments in related fields that raise challenges that would benefit from study through the "cryptographic lens," as well as questions and results in cryptography that have the potential for broad impact in and beyond the field.
This event is part of the DIMACS/Simons Collaboration in Cryptography, supported by the National Science Foundation under grant number CNS-1523467.
Organizers:
- Yuval Ishai (Technion)
- Guy Rothblum (Weizmann)

Speakers:
- Scott Aaronson (UT Austin)
- Boaz Barak (Harvard)
- Cynthia Dwork (Harvard and Radcliffe Institute for Advanced Study)
- Aleksander Madry (MIT)
- Virginia Vassilevska Williams (MIT)
- Mary Wootters (Stanford)
Workshop Reception: Saturday night, August 18th (before the workshop), 5:00-7:00pm, Corwin Pavilion.
Talks will take place in Corwin Pavilion West.
"On Optimal Algorithms and Assumption Factories", by Boaz Barak.
Throughout most of history, cryptographic schemes have been developed through a "cat and mouse" process whereby a cryptographer would propose a scheme, a cryptanalyst would break it, a new scheme would be proposed, and so on.
Starting in the 1970s, a new paradigm of "provable security" emerged, where the security of cryptographic schemes would be based on a few basic computational assumptions or "axioms". However, given our current knowledge, coming up with these axioms is still more art than science, and recent years have seen a partial return to the "cat and mouse" process, with a proliferation of both assumptions and attacks.
In this talk I will describe an alternative approach for coming up with axioms, which is arguably more "disciplined", using conjectures on the optimality of classes of algorithmic techniques. We will illustrate the approach in the context of recent proposed constructions of indistinguishability obfuscators.
Part of the talk will be based on joint works with Zvika Brakerski, Ilan Komargodski and Pravesh Kothari, as well as with Sam Hopkins, Pravesh Kothari, Aayush Jain, and Amit Sahai.
"Machine Learning and Security: The Good, the Bad, and the Hopeful", by Aleksander Madry.
Machine learning has made tremendous progress over the last decade. In fact, many now believe that ML techniques are a “silver bullet”, capable of making progress on any real-world problem they are applied to.
But is that really so?
In this talk, I will discuss a major difficulty in the real-world deployment of ML: making our ML solutions robust and secure. After briefly surveying some of the key challenges in this context, I will focus on one of the most pressing issues: the widespread vulnerability of state-of-the-art deep learning models to adversarial misclassification (aka adversarial examples). I will describe a framework that enables us to reason about this vulnerability in a principled manner and to develop methods for alleviating the problem it poses.
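To make the notion of adversarial examples concrete, here is a minimal toy sketch (my illustration, not from the talk): a fast-gradient-sign-style perturbation against a linear classifier sign(w · x). The weights and input below are hypothetical.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def sign(t):
    return 1 if t > 0 else -1

w = [1.0, -2.0, 0.5]   # hypothetical trained weights
x = [0.3, 0.1, 0.2]    # input the classifier labels +1
assert sign(dot(w, x)) == 1

# Fast-gradient-sign-style step: for a linear model, the gradient of the
# score with respect to x is w, so the worst-case L-infinity perturbation
# of size eps is -eps * sign(w) coordinate-wise.
eps = 0.2
x_adv = [xi - eps * sign(wi) for xi, wi in zip(x, w)]

print(sign(dot(w, x_adv)))  # prints -1: a small perturbation flips the label
```

For deep networks the same idea applies with the gradient computed by backpropagation, and the perturbation can be imperceptibly small while still flipping the prediction.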
"Theory for Society: Crypto on Steroids", by Cynthia Dwork.
Modern cryptography has gifted us with an approach to creating digital manifestations of social phenomena in the physical world. As algorithms increasingly supplant human decision making and shape our view of the world, and as computers are increasingly leveraged to attack data analytic processes, the need for cryptographic thinking grows apace. In short, we need crypto on steroids.
This talk begins with differential privacy, a definition of privacy tailored to privacy-preserving analysis of large datasets together with a collection of techniques for achieving this goal and complementary lower bounds. We will see that the concept also protects against "p-hacking" (adaptive data analysis), allowing the construction of a reusable holdout set. We explore the cryptographic approach to algorithmic fairness, and see two roles for differential privacy in this endeavor. Finally, and time permitting, we touch on a tight connection to a problem in quantum computing.
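As a concrete reference point for differential privacy (a minimal sketch of my own, not material from the talk): the Laplace mechanism releases a query answer plus noise calibrated to the query's sensitivity, which for a counting query is 1.

```python
import math
import random

def laplace_mechanism(true_count, sensitivity, epsilon, rng=random):
    """Release true_count plus Laplace(sensitivity/epsilon) noise.

    A counting query has sensitivity 1: one person's data changes the
    count by at most 1, so noise of scale 1/epsilon hides any individual
    while keeping the answer accurate for large datasets.
    """
    scale = sensitivity / epsilon
    # Sample Laplace(scale) by inverse transform on a uniform in (-1/2, 1/2).
    u = rng.random() - 0.5
    noise = -scale * (1 if u >= 0 else -1) * math.log(1 - 2 * abs(u))
    return true_count + noise

print(laplace_mechanism(1000, 1, 0.1))  # roughly 1000, give or take ~10
```

The noisy answer is unbiased, so averaging many independent releases concentrates around the true count, while any single release is epsilon-differentially private.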
"A Fine-Grained Approach to Algorithms and Complexity", by Virginia Vassilevska Williams.
A central goal of algorithmic research is to determine how fast computational problems can be solved in the worst case. Unfortunately, for many central problems, the best known running times are essentially those of their classical algorithms from the 1950s and 1960s. For years, the main tool for explaining computational difficulty has been NP-hardness reductions, basing hardness on P ≠ NP. However, if one cares about exact running time (as opposed to merely polynomial vs non-polynomial), NP-hardness is not applicable, especially if the problem is already solvable in polynomial time. In recent years, a new theory has been developed, based on "fine-grained reductions" that focus on exact running times. In this talk I will give an overview of current progress in this area, and will highlight some new developments and their relationship to cryptography.
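To illustrate the flavor of fine-grained complexity (my example, not from the talk): Orthogonal Vectors is a canonical problem in this theory. The obvious algorithm runs in O(n² · d) time, and under the Strong Exponential Time Hypothesis no n^(2−ε) algorithm exists; many quadratic-time lower bounds for polynomial-time problems are proved by fine-grained reduction from it.

```python
from itertools import combinations

def has_orthogonal_pair(vectors):
    """Return True if some pair of 0/1 vectors u, v satisfies <u, v> = 0.

    Brute force over all pairs: O(n^2 * d). Fine-grained complexity asks
    whether the exponent 2 can be beaten, not merely whether the problem
    is in P (it clearly is).
    """
    return any(
        all(a * b == 0 for a, b in zip(u, v))
        for u, v in combinations(vectors, 2)
    )

print(has_orthogonal_pair([(1, 0, 1), (0, 1, 0), (1, 1, 0)]))  # True
print(has_orthogonal_pair([(1, 1), (1, 0), (1, 1)]))           # False
```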
"Cryptography, Local Decoding, and Distributed Storage", by Mary Wootters.
In this talk, I’ll give an overview of locality in error correcting codes, highlighting connections in cryptography. Error correcting codes encode data to protect it from noise, and broadly we say that an error correcting code exhibits “locality” if it is possible to recover a piece of information without looking at very much of the encoded data. Connections between locality and cryptography go back a long way; a famous example is the tight relationship between locally decodable codes and private information retrieval. This talk will survey these connections, but with a focus on newer notions of locality. These newer notions—inspired primarily by distributed storage—have already found a few connections to cryptography, but have potential for many more.
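The textbook example of locality mentioned above can be sketched concretely (my illustration, not from the talk): the Hadamard code lists ⟨m, x⟩ mod 2 for every x in {0,1}^k, and any message bit m_i = ⟨m, e_i⟩ can be recovered from just two positions of the codeword.

```python
import random
from itertools import product

def inner(u, v):
    return sum(a * b for a, b in zip(u, v)) % 2

k = 4
m = [1, 0, 1, 1]  # hypothetical message

# The Hadamard codeword as a lookup table: position x -> <m, x> mod 2.
codeword = {x: inner(m, x) for x in product((0, 1), repeat=k)}

def locally_decode_bit(i, rng=random):
    """Recover m_i with two queries: <m, r> xor <m, r + e_i> = <m, e_i>.

    Because r is random, each query lands on a uniformly random position,
    which is what lets the decoder tolerate a small fraction of corruptions.
    """
    r = tuple(rng.randrange(2) for _ in range(k))
    r_plus_ei = tuple((b + (1 if j == i else 0)) % 2 for j, b in enumerate(r))
    return codeword[r] ^ codeword[r_plus_ei]

print([locally_decode_bit(i) for i in range(k)])  # recovers [1, 0, 1, 1]
```

The price of this extreme locality is rate: the codeword has length 2^k. Much of the modern work the talk surveys is about trading locality against rate, motivated by distributed storage.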
"Certified Randomness from Quantum Supremacy", by Scott Aaronson.
I’ll describe a novel application for near-term quantum computers with 50-70 qubits: namely, generating cryptographic random bits, whose randomness can be certified even if the quantum computer is untrusted (e.g., has been backdoored by an adversary). Unlike schemes based on Bell inequality violation, ours requires only a single device able to solve classically hard sampling problems. Our protocol harvests the outputs of the sampling process and feeds them into a randomness extractor, while occasionally verifying the outputs using exponential classical time. I’ll also compare to the beautiful independent work of Brakerski et al., who proposed a scheme for the same problem that has much more efficient verification, but that probably can’t be implemented on near-term devices. Paper still in preparation. As time permits, I’ll also say something about a new connection between gentle measurement of quantum states and differential privacy (joint work with Guy Rothblum, paper still in preparation).
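To give a feel for the extraction step alone (a toy of my own, far simpler than the seeded extractor the protocol actually requires): a von Neumann extractor turns independent biased coin flips into unbiased bits by keeping one bit per unequal pair.

```python
def von_neumann_extract(bits):
    """Debias independent coin flips: for each pair, emit the first bit
    only when the two differ. P(01) = P(10) for independent flips, so
    the output bits are unbiased even if the source is biased."""
    out = []
    for a, b in zip(bits[0::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

print(von_neumann_extract([0, 0, 1, 0, 0, 1, 1, 1, 1, 0]))  # [1, 0, 1]
```

The protocol's setting is harder: the quantum samples are only guaranteed to contain min-entropy (conditioned on passing verification), so a seeded extractor is applied to many accumulated samples rather than this pairwise trick.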