Events 2009-2010

Logic Colloquium

August 28, 2009, 4:10 PM (60 Evans Hall)

Paolo Mancosu
Professor of Philosophy, University of California, Berkeley

Measuring the size of infinite collections of natural numbers: Was Cantor’s theory of infinite number inevitable?

Cantor’s theory of cardinal numbers offers a way to generalize arithmetic from finite sets to infinite sets using the notion of one-to-one correspondence between two sets. As is well known, all countably infinite sets have the same ‘size’ on this account, namely the cardinality of the natural numbers. However, throughout the history of reflections on infinity another powerful intuition has played a major role: if a collection A is properly included in a collection B then the ‘size’ of A should be less than the ‘size’ of B (the part-whole principle). This second intuition was not developed mathematically in a satisfactory way until quite recently. In this talk I begin by reviewing the contributions of some thinkers who argued in favor of the assignment of different sizes to infinite collections of natural numbers. Then I present some recent mathematical developments that generalize the part-whole principle to infinite sets in a coherent fashion. Finally, I show how these new developments are important for a proper evaluation of a number of positions in philosophy of mathematics which argue either for the inevitability of the Cantorian notion of infinite number (Gödel) or for the rational nature of the Cantorian generalization as opposed to the one, offered by Bolzano, based on the part-whole principle (Kitcher).
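
A standard illustration of the tension (not necessarily the talk’s own example): the even numbers are properly included in the natural numbers, so the part-whole principle demands that they form a strictly smaller collection; yet doubling is a one-to-one correspondence, so Cantor’s criterion assigns both collections the same size:

\[ E = \{0, 2, 4, \dots\} \subsetneq \mathbb{N}, \qquad n \mapsto 2n \ \text{is a bijection } \mathbb{N} \to E, \qquad \text{so } |E| = |\mathbb{N}| = \aleph_0. \]

The two intuitions cannot both be kept in full generality; the recent developments mentioned in the abstract keep the part-whole principle and instead give up the invariance of size under arbitrary one-to-one correspondence.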

Logic Colloquium

September 11, 2009, 4:10 PM (60 Evans Hall)

Theodore A. Slaman
Professor and Chair of Mathematics, University of California, Berkeley

Degree Invariant Functions

We will discuss the issues surrounding Martin’s conjectured characterization of the functions on reals which are invariant with respect to Turing degree. We will give a recent application, obtained jointly with Montalban and Reimann, to the question of whether Turing equivalence is universal among countable Borel equivalence relations.
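
For background, two standard definitions presupposed above (roughly stated, and not taken from the talk itself): a function f on the reals is degree invariant when Turing-equivalent inputs yield Turing-equivalent outputs,

\[ x \equiv_T y \;\Longrightarrow\; f(x) \equiv_T f(y), \]

and Martin’s conjecture asserts, roughly, that under determinacy hypotheses every such function is, on a cone of Turing degrees, either constant or equivalent to an iterate of the Turing jump. A Borel equivalence relation is universal among countable Borel equivalence relations when every countable Borel equivalence relation Borel-reduces to it.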

Logic Colloquium

September 25, 2009, 4:10 PM (60 Evans Hall)

Dana S. Scott
University Professor, Emeritus, Carnegie Mellon University, and Visiting Scholar, University of California, Berkeley

Mixing Modality and Probability

Orlov first [1928] and Gödel later [1933] pointed out the connection between the Lewis system S4 and intuitionistic logic. McKinsey and Tarski gave an algebraic formulation and proved completeness theorems for propositional systems using as models topological spaces, with the interior operator corresponding to the necessitation modality. Earlier, Tarski and Stone had each shown that the lattice of open subsets of a topological space models intuitionistic propositional logic. Expanding on a suggestion of Mostowski about interpreting quantifiers, Rasiowa and Sikorski used the topological models to model many first-order logics. After the advent of Solovay’s recasting of Cohen’s independence proofs in terms of Boolean-valued models, topological models for modal higher-order logic were studied by Gallin and others. For Boolean-valued logic, the complete Boolean algebra M = Meas([0,1])/Null of measurable subsets of the unit interval modulo sets of measure zero gives every proposition a probability. Perhaps not as well known is the observation that the measure algebra also carries a nontrivial S4 modality, defined with the aid of the sublattice Open([0,1])/Null of open sets modulo null sets. This sublattice is closed under arbitrary joins and finite meets in the measure algebra, but it is not the whole of the measure algebra.

By analogy with the construction of Boolean-valued models for ZF, we can construct over M a model of a modal ZF (MZF) in which the membership and equality predicates have interesting and natural modal properties. In such a universe the real numbers correspond to random variables, and – following a suggestion of Alex Simpson (Edinburgh) – there is also a well-motivated modeling of random reals. A modal set theory, however, requires a reexamination of comprehension principles, and much work remains to be done to organize methods of proof to take account of the new distinctions encountered. It is also possible to recast well-known theorems of ergodic theory as principles about this modal universe, and the question to be considered is whether the new perspective can lead to some new results.
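
From the facts just stated, the modality can be made concrete as follows (a sketch of the natural construction, not necessarily the talk’s exact development): since the sublattice L = Open([0,1])/Null is closed under arbitrary joins in M, every element of M has a largest open approximation from below, and this interior operation validates the S4 laws:

\[ \Box m = \bigvee \{\, u \in L : u \le m \,\}, \qquad \Box m \le m, \qquad \Box\Box m = \Box m, \qquad \Box(m \wedge n) = \Box m \wedge \Box n, \qquad \Box 1 = 1. \]

The last two laws use precisely the stated closure of L under finite meets and the fact that the top of M is the class of an open set.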

Logic Colloquium

October 09, 2009, 4:10 PM (60 Evans Hall)

Nate Ackerman
Visiting Scholar in Mathematics, University of California, Berkeley

Trees, Sheaves, and Definition by Recursion

We will show there is a topological space for which presheaves are the same thing as trees. We will further show that there is a sheaf on this topological space which has an important relationship with Baire space.
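
One candidate for such a space, offered only for orientation (the talk’s actual construction may differ): take the points to be the natural numbers, with the open sets being the finite initial segments together with the whole space and the empty set. A presheaf F then assigns a set to each initial segment, with a restriction map from each level to the one below, and these data read exactly as the levels and predecessor maps of a tree:

\[ T_n = F(\{0, \dots, n\}), \qquad \mathrm{pred}_n = F\big(\{0,\dots,n-1\} \subseteq \{0,\dots,n\}\big) : T_n \to T_{n-1}. \]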

We will then use these connections to show how a definition by transfinite recursion can be thought of as an operation on sheaves, and how the well-definedness of such a definition can be thought of as a property of the sheaf we are working on.

This will then allow us to define a second-order tree as a sheaf on a tree and to expand our notion of definition by transfinite recursion to all well-founded second-order trees (even those which are ill-founded as normal trees).

We will then mention how these techniques can be used to prove a variant of the Suslin-Kleene Separation theorem.

Logic Colloquium

October 23, 2009, 4:10 PM (60 Evans Hall)

Lara Buchak
Assistant Professor of Philosophy, University of California, Berkeley

Different Decision or Different Decision Theory?: Modeling Risk Aversion

Decision theory is supposed to accommodate any preferences that a rational decision maker might have. I show that standard decision theory – expected utility theory – cannot accommodate common preferences that stem from the way apparently rational decision makers account for risk. I propose a generalization of expected utility theory that can accommodate these preferences, and show that it has all the theoretical power of the standard theory: in particular, it has a representation theorem that allows us to derive an agent’s beliefs, desires, and attitudes towards risk from his preferences. I then explore a classic strategy for responding to counterexamples of the type I present: re-characterizing the choice problem facing an agent, by individuating outcomes more finely. This introduces an important question in the methodology of decision theory: under what circumstances should we interpret the agent as facing a different choice problem rather than adopt a different theory to describe his behavior? I approach this question formally, and show that we cannot individuate the outcomes to make his behavior compatible with expected utility theory without also making his behavior compatible with many other opposing theories: thus, I suggest, we cannot save the standard theory.
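
In rough outline, the shape of the generalization (as it appears in Buchak’s published work on risk-weighted expected utility; recalled here as a sketch, not the talk’s own notation): where a gamble g yields outcome x_i with probability p_i and u(x_1) \le \cdots \le u(x_n), the standard theory evaluates

\[ EU(g) = \sum_{i=1}^{n} p_i\, u(x_i), \]

while the generalization adds a risk function r, with r(0) = 0 and r(1) = 1, that weights the probability of getting at least each successive level of utility:

\[ REU(g) = u(x_1) + \sum_{j=2}^{n} r\Big(\sum_{i=j}^{n} p_i\Big)\big(u(x_j) - u(x_{j-1})\big). \]

Taking r(p) = p recovers expected utility; a convex r models aversion to risk over and above any curvature of u.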

Logic Colloquium

November 06, 2009, 4:10 PM (60 Evans Hall)

Martin Davis
Professor Emeritus of Mathematics and of Computer Science, Courant Institute, New York University, and Visiting Scholar in Mathematics, University of California, Berkeley

Reflections on Hilbert’s 10th Problem with a New Conjecture

Hilbert’s 10th Problem asked for an algorithm to determine whether a given polynomial equation with integer coefficients has an integer solution. In this talk I will discuss the negative solution to the problem as well as various extensions.
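
For context, the standard background behind the negative solution (the new conjecture of the title is left to the talk): the MRDP theorem (Matiyasevich, building on work of Davis, Putnam, and Robinson) identifies the Diophantine subsets of the natural numbers with the recursively enumerable ones,

\[ S \text{ is r.e.} \iff \text{there is } p \in \mathbb{Z}[x, y_1, \dots, y_k] \text{ with } S = \{\, x : \exists y_1 \cdots \exists y_k\; p(x, y_1, \dots, y_k) = 0 \,\}, \]

and since some r.e. sets are undecidable, no algorithm can decide whether an arbitrary Diophantine equation has a solution.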

Logic Colloquium

November 20, 2009, 4:10 PM (60 Evans Hall)

Denis R. Hirschfeldt
Professor of Mathematics, University of Chicago

Lowness Properties and Cost Functions

One of the byproducts of recent developments in the theory of algorithmic randomness is an increased interest in notions of computability-theoretic lowness. We will focus on two such notions: K-triviality, which is a natural notion of “randomness-theoretic weakness” with a number of interesting characterizations, and strong jump traceability, which has a more combinatorial definition but is also strongly connected to algorithmic randomness. In particular, we will see how these notions can be characterized in terms of two other powerful ideas arising from the theory of algorithmic randomness: cost functions and diamond classes.
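
For reference, the standard definition of the first notion (the talk’s own formulations may differ): with K denoting prefix-free Kolmogorov complexity, a set A is K-trivial when its initial segments are no more complex than their lengths,

\[ \exists c\, \forall n\; K(A \upharpoonright n) \le K(n) + c. \]

A cost-function construction, in turn, builds a Delta^0_2 set while keeping the total cost of all changes to its approximation finite; the connection alluded to above is that K-triviality coincides with obedience to a particular standard cost function.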

We will use these themes to explain some of the issues of current interest in the rapidly developing area of algorithmic randomness, including a discussion of some major open questions.

Logic Colloquium

December 04, 2009, 4:10 PM (60 Evans Hall)

Sherrilyn Roush
Associate Professor of Philosophy, University of California, Berkeley

Rational Self-Doubt

If one is highly confident that Rosco is the murderer from having seen the crime, and then learns of substantial evidence from experimental psychology that eyewitness testimony is highly unreliable, is one thereby obligated to reduce one’s confidence about Rosco? Current Bayesian rationality theory forbids any such revision because of idealizations that I argue are unnecessary. I develop a more general probabilistic framework that shows how we can learn about the world from information about our own unreliability, through higher-order updating that re-calibrates us. It shows why taking doubt about one’s own judgment seriously does not end up in incoherence or runaway skepticism, and what the added value of this kind of evidence is.
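
One simple way to picture the higher-order updating (a gloss on the idea, not necessarily the framework developed in the talk): treat one’s own credence as a further piece of evidence and condition on it together with the reliability data,

\[ P_{\mathrm{new}}(H) \;=\; P\big(H \,\big|\, C(H) = c,\ E\big), \]

where C(H) = c records that one’s first-order confidence in H is c and E is the psychological evidence about how often confidence of that kind tracks the truth. Re-calibration can then lower the posterior without any incoherence in the underlying probability function.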

Logic Colloquium

January 29, 2010, 4:10 PM (60 Evans Hall)

John MacFarlane
Associate Professor of Philosophy, University of California, Berkeley

Fuzzy Epistemicism

It is taken for granted in much of the literature on vagueness that semantic and epistemic approaches to vagueness are fundamentally at odds. If we can analyze borderline cases and the sorites paradox in terms of degrees of truth, then we don’t need an epistemic explanation. Conversely, if an epistemic explanation suffices, then there is no reason to depart from the familiar simplicity of classical bivalent semantics. I question this assumption, showing that there is an intelligible motivation for adopting a many-valued semantics even if one accepts a form of epistemicism. The resulting hybrid view has advantages over both classical epistemicism and traditional many-valued approaches.
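
For reference, the sorites reasoning at issue (in its standard formulation): where F is a vague predicate and a_0, ..., a_n is a sequence of cases each differing imperceptibly from the next,

\[ F(a_0), \qquad \forall i < n\; \big( F(a_i) \rightarrow F(a_{i+1}) \big), \qquad \therefore\ F(a_n), \]

with the premises seemingly true and the conclusion plainly false. A degree-theoretic semantics blocks the argument by letting each conditional fall just short of full truth, so that repeated modus ponens gradually bleeds away truth value.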

Logic Colloquium

February 12, 2010, 4:10 PM (60 Evans Hall)

Jennifer Chubb
Assistant Professor of Mathematics, University of San Francisco

Degree Spectra of Relations

We consider algorithmic properties of additional relations on computable structures. For example, for a computable linear ordering we may consider the successor relation, which does not have to be computable. The degree spectrum of a relation on a computable structure is the set of degrees (Turing or other) of images of that relation under isomorphism to another computable copy of the structure. We’ll see a variety of examples of degree spectra of definable relations, and will use algorithmic information theory to analyze the strong degree spectra of the omega-type initial segment of computable linear orderings of type omega + omega*. I will discuss some general results, and present some examples from recent collaborative projects.  
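
For reference, the definition just sketched, in symbols (standard usage; the notation here is one common choice):

\[ \mathrm{DgSp}_{\mathcal{A}}(R) \;=\; \big\{\, \deg(f(R)) \,:\, f \text{ is an isomorphism from } \mathcal{A} \text{ onto a computable structure } \mathcal{B} \,\big\}. \]

In the linear-order example, the successor relation is computable in some computable copies of a given order type and noncomputable in others, and the spectrum records exactly this spread of possibilities.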

Logic Colloquium

February 26, 2010, 4:10 PM (60 Evans Hall)

Lotfi A. Zadeh
Professor in the Graduate School and Director of the Berkeley Initiative in Soft Computing, University of California, Berkeley

Toward a Logic of Everyday Reasoning

In one form or another, attempts to construct a logic of everyday reasoning go back to antiquity. The classical, Aristotelian, bivalent logic may be viewed as a product of such attempts. However, Aristotelian logic does not qualify as a logic of everyday reasoning because it does not come to grips with the core issue — the intrinsic imprecision of everyday reasoning. In modern times, logical systems — driven by a quest for the ultimate in precision, rigor and depth — have become more and more estranged from everyday reasoning. But as we move further into the age of machine intelligence and automated decision-making, the need for a logic of everyday reasoning and natural language understanding becomes increasingly apparent.

Today, no logic of everyday reasoning exists. In my lecture, I sketch two attempts to construct one. The first attempt relates to what may be called precisiable everyday reasoning. A very simple example is: icy roads are slippery; slippery roads are dangerous; therefore icy roads are dangerous. In this case, it can be shown that the seemingly reasonable conclusion is incorrect, and that a correct conclusion can be inferred. A simple example of unprecisiable reasoning is the following. I hail a cab and ask the driver to take me to address A the fastest way. The driver takes me to A, but it is not possible to prove or disprove that the route he chose is the fastest way. Construction of a logic of unprecisiable everyday reasoning requires a radical departure from traditional approaches to logical reasoning.
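
One standard way to see why chained “almost-true” generalizations degrade once imprecision is taken seriously (an illustration in Lukasiewicz fuzzy logic, chosen here for concreteness; Zadeh’s own precisiation machinery is richer): implications in that calculus compose with a loss,

\[ v(A \rightarrow C) \;\ge\; v(A \rightarrow B) + v(B \rightarrow C) - 1, \]

so if “icy roads are slippery” and “slippery roads are dangerous” each hold only to degree 0.9, the inference guarantees “icy roads are dangerous” only to degree 0.8, and a longer chain of such premises may guarantee nothing at all.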

Logic Colloquium

April 02, 2010, 4:10 PM (60 Evans Hall)

Seth Yalcin
Assistant Professor of Philosophy, University of California, Berkeley

Modus Tollens Denied

I present and defend a number of counterexamples to Modus Tollens, construing Modus Tollens as a pattern of inference expressible in ordinary language. I ask what these counterexamples teach us about the meaning of conditional sentences and about the concept of logical consequence appropriate to understanding natural language inference. Since, under classical assumptions, Modus Tollens can be proved from Modus Ponens, I ask in particular whether we should abandon Modus Ponens (as Vann McGee has independently argued we should), or give up aspects of the classical reasoning that allows us to prove Modus Tollens from Modus Ponens. I develop the latter approach.
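
For reference, the classical derivation in question: given A → B and ¬B, suppose A; Modus Ponens yields B, contradicting ¬B; reductio then discharges the supposition and concludes ¬A. Schematically,

\[ A \rightarrow B,\ \neg B,\ [A] \vdash B \ \text{(MP)}, \qquad B \wedge \neg B \ \text{is absurd}, \qquad \therefore\ \neg A \ \text{(reductio)}. \]

A counterexample to Modus Tollens therefore forces a choice between giving up Modus Ponens and giving up part of this classical background, such as reductio or the discharge of suppositions it relies on.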

Logic Colloquium

April 12, 2010, 4:10 PM (60 Evans Hall)

Grigor Sargsyan
National Science Foundation Postdoctoral Fellow, and Assistant Adjunct Professor, University of California, Los Angeles

Descriptive Inner Model Theory

It is by now a well-known theorem of Martin, Steel, and Woodin that if there is a measurable cardinal above omega-many Woodin cardinals then AD holds in L(R). Since this theorem was proved, many theorems of a similar nature have appeared in set theory. For instance, Steel recently showed that PFA, the proper forcing axiom, implies that AD holds in L(R). Many results that followed the Martin-Steel-Woodin theorem, including Steel’s aforementioned result, used a substantial amount of inner model theory to prove that all sets in a certain “canonical” collection of sets of reals are determined. In this talk we will take “canonical” to stand for universally Baire. Letting Gamma be the collection of universally Baire sets, many of the theorems proven after the Martin-Steel-Woodin result are about the size of Gamma; in particular, the earlier results just show that under certain hypotheses all sets of reals in L(R) are in Gamma. Descriptive inner model theory can then be defined as the study of the properties of Gamma in various mathematically rich extensions of ZFC, using methods from inner model theory. Inner model theory enters into the study of the properties of Gamma through a conjecture known as the Mouse Set Conjecture, which allows one to translate questions about sets of reals into questions about iteration strategies for certain mice known as hod mice. In this talk, we will explain all this terminology and will outline some of the more recent theorems, including the following.

Theorem (S.). If there is a Woodin limit of Woodin cardinals, then there is a Gamma^* which is a subset of Gamma such that L(Gamma^*, R) satisfies AD_R + “Theta is regular”.

We will also explain some applications in calibrating lower bounds of consistency strength and in the inner model problem.
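
For reference, the standard definition of the canonicity notion used above (due to Feng, Magidor, and Woodin; supplied here as background): a set of reals A is universally Baire when all of its continuous preimages in compact Hausdorff spaces have the property of Baire,

\[ A \text{ is universally Baire} \iff f^{-1}(A) \text{ has the Baire property for every continuous } f : K \to \mathbb{R} \text{ with } K \text{ compact Hausdorff}; \]

equivalently, A has tree representations that continue to project to A and to its complement in all set-forcing extensions.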

Logic Colloquium

April 16, 2010, 4:10 PM (60 Evans Hall)

Umesh V. Vazirani
Strauch Professor of Electrical Engineering and Computer Sciences, University of California, Berkeley, and Director of the Berkeley Quantum Computation Center

Is There a Quantum Analog of the PCP (Probabilistically Checkable Proofs) Theorem?

Logic Colloquium

April 23, 2010, 4:10 PM (60 Evans Hall)

Richard A. Shore
Professor of Mathematics, Cornell University

The Limits of Determinacy in Second-Order Arithmetic