Recent seminars

Room P5.18, Mathematics Building

Gláucia Murta, Heinrich-Heine-Universität Düsseldorf
Secure Anonymous Conferencing in Quantum Networks

Users of quantum networks can securely communicate via so-called (quantum) conference key agreement, although in doing so their identities become publicly known. In certain circumstances, however, communicating users demand anonymity. In this talk, I will introduce a security framework for anonymous conference key agreement with different levels of anonymity and present efficient and noise-tolerant protocols exploiting multipartite Greenberger–Horne–Zeilinger (GHZ) states.
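For reference (standard background, not part of the abstract), the n-party GHZ states exploited by these protocols are the maximally entangled states

```latex
% n-qubit Greenberger–Horne–Zeilinger (GHZ) state
\lvert \mathrm{GHZ}_n \rangle
  \;=\;
  \frac{1}{\sqrt{2}}
  \left( \lvert 0 \rangle^{\otimes n} + \lvert 1 \rangle^{\otimes n} \right).
```

Measuring every qubit in the computational basis yields perfectly correlated random bits among all n parties, the correlation resource underlying conference key agreement.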

We compare the performance of our protocols in noisy and lossy quantum networks with protocols that only use bipartite entanglement to achieve the same functionalities. Our simulations show that GHZ-based protocols can outperform protocols based on bipartite entanglement and that the advantage increases for protocols with stronger anonymity requirements. Our results strongly advocate the use of multipartite entanglement for cryptographic tasks involving several users.


Room P5.18, Mathematics Building

Chrysoula Vlachou, Instituto de Telecomunicações
Quantum weak coin flipping with arbitrary levels of security: A geometric and an algebraic analytical solution

Weak coin flipping (WCF) is a fundamental cryptographic primitive for two-party secure computation, where two distrustful parties need to remotely establish a shared random bit whilst having opposite preferred outcomes. It is the strongest known primitive with arbitrarily close to perfect security in the quantum case, while classically its security is completely compromised (unless one makes further assumptions, such as computational hardness).
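In the standard formulation (stated here for context, not taken from the abstract), a WCF protocol has bias ε if neither cheating party can force their own preferred outcome with probability exceeding 1/2 + ε:

```latex
% P_A^*: maximal probability with which a cheating Alice forces her preferred outcome,
% against an honest Bob; P_B^* is defined likewise for a cheating Bob.
% Honest executions output each value of the shared bit with probability 1/2.
P_A^{\ast} \;\le\; \tfrac{1}{2} + \varepsilon,
\qquad
P_B^{\ast} \;\le\; \tfrac{1}{2} + \varepsilon .
```

"Arbitrarily close to perfect security" means quantum protocols exist for every ε > 0, whereas classically (without further assumptions) a cheater can always force their preferred outcome with certainty, i.e. ε = 1/2.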

The existence of almost perfectly secure quantum WCF protocols was established in the seminal work of C. Mochon in 2007; however, his proof was partially non-constructive, which set back the proposal of explicit protocols. More recently, Arora, Roland and Weis designed an algorithm that numerically constructs unitary matrices corresponding to WCF protocols with arbitrarily small bias [STOC'19, pp. 205–216].

In this talk, we present new techniques which yield a fully analytical construction of WCF protocols with different levels of security including nearly perfect security, thus achieving a solution that has been missing for more than a decade. In particular, we study the quantum WCF problem from both an algebraic and a geometric perspective and propose two different solutions, respectively. Furthermore, our new techniques lead to a simplified proof of existence of WCF protocols by circumventing the non-constructive part of Mochon’s proof.

Room P5.18, Mathematics Building

M. Hamed Mohammady, Université Libre de Bruxelles
Quantum measurements constrained by the third law of thermodynamics

In the quantum regime, the third law of thermodynamics implies the unattainability of pure states. In this talk, I will first introduce an operational formulation of the third law as a constraint on the class of realisable channels, such that the availability of a channel not so constrained is both necessary and sufficient for the attainability of pure states. Subsequently, I will show how the third law thus construed limits measurements of general observables, or POVMs. In particular, it is shown that while the third law categorically precludes ideal or repeatable measurements, non-disturbing measurements can be achieved if the measured observable is sufficiently "unsharp".
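As background (standard definitions, not spelled out in the abstract), a POVM is a family of effects, i.e. positive operators summing to the identity, and an effect is unsharp when its spectrum lies strictly inside the interval [0, 1]:

```latex
% Observable (POVM) E with outcome set X and effects E_x
E = \{ E_x \}_{x \in X},
\qquad
0 \le E_x \le \mathbb{1},
\qquad
\sum_{x \in X} E_x = \mathbb{1}.
```

Sharp observables correspond to projective measurements, where each effect is a projection (E_x² = E_x); the unsharp observables discussed in the talk relax this condition.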

Room P3.10, Mathematics Building

Jean-Yves Béziau, UFRJ, Brazil
Classical Propositional Logic without Atoms

Classical propositional logic (like other propositional logics) is generally presented with atomic formulas, owing to a philosophical idea of Wittgenstein (1921) that Bertrand Russell baptized "Logical Atomism" (1924). This philosophical view is controversial, and from a mathematical point of view it is possible to construct propositional logic without atoms. This is what we will show here in the case of classical propositional logic. Surprisingly enough, this has not yet been studied in detail. On the one hand, Gödel very succinctly discussed it in an informal way (1929/1930). On the other hand, Suszko developed what he called "abstract logic", a general theory of propositional logics without the atomic assumption, but did not study particular cases in detail.

We will present a precise mathematical definition of classical propositional logic without atoms, give a semantics for it and a sequent calculus, and prove the completeness theorem using a very general abstract version of this theorem (Béziau 2001).


  • J.-Y. Béziau, "Sequents and bivaluations", Logique et Analyse, 44 (2001), pp. 373–394.
  • K. Gödel, "Eine Eigenschaft der Realisierungen des Aussagenkalküls", Ergebnisse eines mathematischen Kolloquiums, 2 (1929/30), pp. 20–21.
  • B. Russell, "Logical Atomism", in J. H. Muirhead (ed.), Contemporary British Philosophers, London: Allen and Unwin, 1924, pp. 356–383.
  • R. Suszko (with S. L. Bloom and D. J. Brown), "A note on abstract logics", Bulletin de l'Académie Polonaise des Sciences, 18 (1970), pp. 109–110.
  • L. Wittgenstein, "Logisch-Philosophische Abhandlung", Annalen der Naturphilosophie, 14 (1921).

Room P5.18, Mathematics Building

Charles Bédard, Università della Svizzera Italiana, Switzerland
Emergence, Sophistication and Epistemology

A quantitative and objective notion of emergence is proposed, using algorithmic information theory as a basis for an objective framework in which a bit string encodes observational data. A plurality of drops in the Kolmogorov structure function of such a string is seen as the hallmark of emergence. The definition offers some theoretical results and extends the notions of coarse-graining and boundary conditions.

Sophistication and logical depth are two quantitative measures of the non-trivial organization of an object. Although apparently different, these measures have been proven equivalent (by Prof. André Souto and collaborators) when logical depth is renormalized by the busy beaver function. However, if the measures are relativized to auxiliary information y, it turns out that the ability of y to solve the halting problem introduces a distortion between Soph(x|y) and Depth(x|y).

Finally, I share my latest thoughts about epistemology (the philosophy of knowledge), more precisely, about the problem of making it rigorous. I explain the gap between Bayesian and Popperian epistemology and speculate on some possible resolutions through algorithmic information theory. Note that this problem is not 'merely' philosophical – it is the key problem that needs to be solved to have proper thinking machines.