# Mathematics Department

## Colloquium

### Upcoming Seminar

March 21, 4:40 PM in Eliot 314

**Reasoning About Cryptosystems in a Computational Logic**
Josh Gancher, Cornell University

In modern cryptography, any flaw in a security proof is a potential source of an attack on the cryptosystem. As cryptographic schemes become more complex over time, their proofs are growing larger and more subtle, making them difficult to verify manually. A promising answer to this problem comes from computer-aided cryptography, which aims to apply tools from formal methods to certify cryptographic security proofs. In this talk, I will present a tool from this space, AutoLWE, which delivers partially automated mechanized proofs of security for a variety of group- and lattice-based cryptosystems. No prior knowledge about cryptography will be assumed.

Most Thursday afternoons during the academic year, the Reed College Department of Mathematics hosts a math talk. The talks are directed to our mathematics majors but are usually accessible at a variety of levels.

### 2018-19 Schedule

#### Fall

4:40-5:30pm in Eliot 314 (unless marked otherwise).

#### Spring

4:40-5:30pm in Eliot 314 (unless marked otherwise).

Jan 31 (Please note the change in time and location.)
**The Shape of the Quantum Realm**
Spyridon Michalakis, California Institute of Technology
Time: 4:30 PM. Location: Vollum Lecture Hall

At the turn of the century, a list of thirteen significant open problems at the intersection of math and physics was posted online by the president of the International Association of Mathematical Physics. But with problems such as Navier-Stokes on the list, quick progress seemed unreasonable. Indeed, a decade later, with only one problem partially solved, and despite that progress yielding two Fields Medals, the list was all but forgotten. Then, in 2008, as a young mathematician at Los Alamos National Lab, I was tasked with solving the second problem on the list, which asked for a rigorous explanation of the Quantum Hall effect, an important phenomenon in physics with applications to quantum computing and beyond. The solution, which would involve a deep connection between topology and quantum physics, came a year later. I want to share that journey with you, focusing on insights gained along the way about the relationship of mathematics to physics.

**How to obtain parabolic theorems from their elliptic counterparts**
Blair Davey, City College of New York

Experts have long realized the parallels between the elliptic and parabolic theories of partial differential equations. It is well known that elliptic theory may be considered a static, or steady-state, version of parabolic theory. In particular, if a parabolic estimate holds, then by eliminating the time parameter one immediately arrives at the underlying elliptic statement. Producing a parabolic statement from an elliptic statement is not as straightforward. In this talk, we demonstrate a method for producing parabolic theorems from their elliptic analogues. Specifically, we show that an $L^2$ Carleman estimate for the heat operator may be obtained by taking a high-dimensional limit of $L^2$ Carleman estimates for the Laplacian. Other applications of this technique will be discussed.

**Quantum Learning from Symmetric Oracles (or: How to multiply two matrices when you only know one of them)**
Jamie Pommersheim, Reed College

The study of quantum computation has been motivated, in part, by the possibility that quantum computers can perform certain tasks dramatically faster than classical computers. Many of the known quantum-over-classical speedups, such as Shor's algorithm for factoring integers and Grover's search algorithm, can be framed as oracle problems or concept learning problems. In one model of concept learning, a student wishes to learn a concept from a teacher by asking questions, called queries, to which the teacher responds. In the interest of efficiency, the student wishes to learn the concept by making as few queries as possible. For any such concept learning problem, there is a corresponding quantum concept learning problem. In the quantum version, the student is allowed to ask a superposition of queries – mathematically, a linear combination of queries – and the teacher answers with the corresponding superposition of the responses. After making this idea precise, we will examine several concept learning problems and their quantum analogues. We will discuss recent joint work with former Reed student Daniel Copeland, in which we show how tools from representation theory can be used to precisely analyze any quantum learning problem with sufficient symmetry.

**What is a perfectoid space and what is it good for?**
Matthew Morrow, Institut de Mathématiques de Jussieu-Paris Rive Gauche

The subject of $p$-adic arithmetic geometry is concerned with number theory which is "close to" a fixed prime number $p$. There has been incredible progress in recent years, largely thanks to perfectoid spaces: their creation and the development of the subsequent theory was one of the reasons that Peter Scholze was awarded a Fields Medal last year. In the talk I will attempt to explain some of the main ideas around perfectoids and how we use them.

**Machine Translation: 350 years of progress and new challenges in the connectionist age**
Jonathan May, University of Southern California

Methods and mechanisms for algorithmically converting between human languages have been pursued since at least the 17th century, and translation was one of the first applications devised for computers in the post-war era. The availability of large corpora of translation examples and methods for extracting and wrangling corpus statistics in the 1990s led to a proliferation of higher-quality output that could be used by anyone. Today's neural network-driven MT systems are far more fluent and flexible than their predecessors but also far more data-hungry, and have opened the door to a series of new challenges and opportunities:

- Building good-quality translation systems when our training examples are unreliable, when we only have a small number of examples, or when we have no examples at all
- Building translation systems that behave more like human translators
- Understanding what our new systems are learning, and characterizing the limits of their learning potential

In this talk I'll provide a selective history of machine translation, introduce core methods and ideas of the field, and discuss work that I and others have done to address these new challenge areas.

**New Points on Curves**
Dino Lorenzini, University of Georgia

Let $X/\mathbb{Q}$ be the plane curve defined by a polynomial $f(x,y)$ with coefficients in the field $\mathbb{Q}$ of rational numbers. Thus a point $(a,b)$ in the complex plane is on this plane curve if and only if $f(a,b)=0$. A rational point on this curve is a point $(a,b)$ where both $a$ and $b$ are rational numbers. It is easy to construct points on this curve which are not rational: for instance, choose any rational number $b$, and then solve the equation $f(x,b)=0$ to obtain a point $(a,b)$ on the curve where most often the coordinate $a$ is not a rational number. To any point $(a,b)$ we associate the smallest field extension of $\mathbb{Q}$ generated by $a$ and $b$, and when this field extension $\mathbb{Q}(a,b)$ is a finite-dimensional $\mathbb{Q}$-vector space, we call its dimension $d$ (the degree of this field over $\mathbb{Q}$). With this definition, a rational point has degree $d=1$. In the case of the Fermat curve $x^n+y^n=1$ with $n>2$, setting $b=2$ produces a point $(a,b)$ on that curve with $a$ the $n$-th root of $1-2^n$. The degree of $L:=\mathbb{Q}(a,b)$ is then $n$. So the Fermat curve has points of degree 1 and of degree $n$. Can one find other points on this curve with degree $1 < d < n$?
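The degree construction in the last abstract can be checked concretely for a small case. Taking $n=3$ and $b=2$ on the Fermat curve $x^3+y^3=1$ gives $a^3 = 1-2^3 = -7$, so $a$ is the real cube root of $-7$, whose minimal polynomial over $\mathbb{Q}$ has degree 3. This is a sketch using SymPy (not part of the talk itself), chosen only to illustrate the computation:

```python
from sympy import Symbol, Rational, real_root, simplify, minimal_polynomial, degree

x = Symbol('x')

# Fermat curve x^3 + y^3 = 1 with b = 2: then a^3 + 8 = 1, so a^3 = -7.
a = real_root(-7, 3)   # the real cube root of -7
b = Rational(2)

# Confirm the point (a, b) lies on the curve: f(a, b) = a^3 + b^3 - 1 = 0.
assert simplify(a**3 + b**3 - 1) == 0

# Minimal polynomial of a over Q is x^3 + 7, so [Q(a, b) : Q] = 3 = n,
# matching the degree-n claim in the abstract.
p = minimal_polynomial(a, x)
print(p)             # x**3 + 7
print(degree(p, x))  # 3
```

Since $b=2$ is already rational, $\mathbb{Q}(a,b)=\mathbb{Q}(a)$, and the degree of the point equals the degree of the minimal polynomial of $a$.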