2000 seminars


Room P4.35, Mathematics Building

João Rasga, SQIG-IT / IST-UTL

Quantifier elimination revisited

It is well known that quantifier elimination can play a relevant role in proving decidability of theories. We present new compositional techniques for establishing quantifier elimination in a semantic way, capitalising on the fact that a 1-model-complete theory with algebraically prime models has quantifier elimination. We propose two ways of showing that a theory has algebraically prime models, namely iteration and adjunction, and we provide a guided way to prove that a theory is 1-model-complete. The applicability of these new conditions is illustrated on the theories of the natural numbers with successor, term algebras and algebraically closed fields. This talk is based on ongoing joint work with Cristina Sernadas.
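As a standard illustration of the phenomenon (not specific to the techniques of the talk), consider the theory of the natural numbers with zero and successor, \( (\mathbb{N}, 0, S) \): the formula \( \exists x\, (y = S(x)) \) is equivalent to the quantifier-free \( y \neq 0 \), and \( \exists x\, (y = S(x) \wedge z = S(x)) \) to \( y \neq 0 \wedge y = z \), so existential quantifiers can be eliminated clause by clause in favour of quantifier-free conditions.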


Room P3.10, Mathematics Building

Manuel Martins, U Aveiro

Algebraic Kripke frames

Reconfigurable systems behave differently in different modes of operation and switch between them over their lifetime. Such different behaviours can be modelled by imposing some sort of structure upon the states of a transition system expressing the overall system's dynamics. We take this approach by endowing the states of standard Kripke frames with algebras, each of them modelling a local configuration. An equational hybrid logic, admitting infinitary formulae, is proposed to express a broad range of properties of those structures, including liveness requirements. The paper also develops a number of preliminary results on its semantics, including a discussion of a suitable notion of bisimulation, generalizing standard invariance results to this broader setting.
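The idea can be conveyed by a toy sketch in Python (the two modes and all names are illustrative assumptions, not the paper's formalism): a transition system whose states each carry an algebra, so the same operation is interpreted differently depending on the current configuration.

```python
# Toy "algebraic Kripke frame": each state holds an algebra (here just one
# binary operation) plus its accessibility edges.  Assumed example, not
# the paper's construction.
frame = {
    "saving": {"op": min, "next": ["normal"]},            # degraded mode
    "normal": {"op": max, "next": ["saving", "normal"]},  # full mode
}

def run(state, pairs):
    # Interpret each input pair with the algebra of the current state,
    # then move along the first accessibility edge.
    out = []
    for a, b in pairs:
        out.append(frame[state]["op"](a, b))
        state = frame[state]["next"][0]
    return out

run("normal", [(1, 2), (3, 4)])  # max in "normal", then min in "saving"
```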

The talk reports on ongoing joint work with Luís S. Barbosa and A. Madeira.

Funded by the Brazilian Research Council and the GeTFun project (Marie Curie – European Commission).


Room P3.10, Mathematics Building

J.-Y. Béziau, UFRJ - Brazil

Non Truth-Functional Semantics

After giving a general overview of truth-functionality and of different kinds of semantics, I will present various non-truth-functional semantics, both bivalent and many-valued, explaining the advantages and disadvantages of such techniques.

Funded by the Brazilian Research Council and the GeTFun project (Marie Curie – European Commission).


Room P3.10, Mathematics Building

Manuel Biscaia, SQIG - Instituto de Telecomunicações

Satisfiability of exogenous temporal logics and their applications to planning

Dealing with uncertainty in the context of planning is a natural research subject in Artificial Intelligence. Many of the techniques used capitalize on the existence of very efficient SAT-solvers for propositional logic. We propose a temporal extension of a probabilistic logic which, on the one hand, allows us to deal with quantities/probabilities and, on the other hand, allows us to simplify the temporal aspect of planning. We present an algorithm for deciding the satisfiability problem for this logic, together with a complete Hilbert calculus for it. The algorithm is intended to profit from the fast advances occurring in LTL SAT-solvers. We will then examine some of the limitations of our logic regarding the specification of properties, discussing some important open problems on Markov chains.
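The SAT-based view of planning that the logic builds on can be sketched in miniature (a hypothetical toy, not the talk's encoding): states and actions become time-indexed propositional variables, the dynamics become axioms, and a plan is a satisfying assignment.

```python
from itertools import product

# Toy planning-as-satisfiability: one state bit s_t, one action bit a_t,
# effect axiom s_{t+1} = s_t XOR a_t, initial axiom s_0 = False, and
# goal axiom s_horizon = True.  Assumed example for illustration only.
def plans(horizon=2):
    for actions in product([False, True], repeat=horizon):
        s = False                   # initial state axiom
        for a in actions:
            s = s != a              # effect axiom (XOR)
        if s:                       # goal axiom satisfied
            yield actions

list(plans())  # the satisfying plans are exactly those with one toggle
```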


Room P3.10, Mathematics Building

André Souto, SQIG - Instituto de Telecomunicações

Kolmogorov one-way functions

We prove several results relating injective one-way functions, time-bounded conditional Kolmogorov complexity, and time-bounded conditional entropy. The idea is to use an expected-value approach to time-bounded Kolmogorov complexity.

We prove a separation result: based on the concept of time-bounded Kolmogorov complexity, we find an interval in which every function f is necessarily a weak, but not a strong, one-way function.

We propose an individual approach to injective one-way functions based on Kolmogorov complexity, defining Kolmogorov one-way functions, and prove some relationships between the new proposal and the classical definition of one-way functions. We explore how Kolmogorov one-way functions are related to the conjecture of polynomial-time symmetry of information.
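Though not part of the talk, Kolmogorov complexity has a familiar computable handle that makes the notion concrete: any compressor yields an upper bound on K(x), up to an additive constant.

```python
import os
import zlib

def K_upper(x: bytes) -> int:
    # The length of any effective description of x bounds K(x) from above
    # (up to an additive constant); a compressor provides one such
    # computable description.
    return len(zlib.compress(x, 9))

regular = K_upper(b"a" * 1000)       # highly regular string, compresses well
random_ = K_upper(os.urandom(1000))  # incompressible with high probability
```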

Finally, we relate our definitions and results with two forms of time-bounded entropy.

Work based on:

L. Antunes, A. Matos, A. Pinto, A. Souto and A. Teixeira, One-Way Functions Using Algorithmic and Classical Information Theories, Theory of Computing Systems, 52(1):162–178, Springer, 2013.


Room P3.10, Mathematics Building

Paulo Mateus, SQIG-IT / IST-UTL

Improving classical authentication over a quantum channel

We introduce a quantum protocol to authenticate classical messages that can be used to replace Wegman-Carter's classical authentication scheme in quantum key distribution (QKD) protocols. We show that the proposed scheme achieves greater conditional entropy of the seed, given the intruder's (quantum) observation, than the classical case. Our protocol improves on a classical scheme by Brassard by taking advantage of properties of the quantum channel. It is motivated by information-theoretical results; moreover, by adopting it, QKD becomes a fully quantum protocol. We prove that quantum resources can improve both the secrecy of the key generated by the PRG and the secrecy of the tag obtained with a hidden hash function. We conclude that the proposed quantum encoding offers more security than the classical scheme and, by applying a classical result, we show that it can be used over noisy quantum channels. Joint work with F. Assis, A. Stojanovic and Y. Omar.
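For context, the classical Wegman-Carter construction the protocol replaces can be sketched as follows (field size, hash family and all names are illustrative assumptions): hash the message with a secretly keyed almost-universal hash, then encrypt the tag with a one-time pad so the hash key can be reused.

```python
import secrets

P = (1 << 61) - 1  # a Mersenne prime; toy field size for illustration

def poly_hash(key: int, msg: list[int]) -> int:
    # Evaluate the message polynomial at the secret point `key` mod P:
    # a standard almost-universal hash family.
    acc = 0
    for chunk in msg:
        acc = (acc * key + chunk) % P
    return acc

def wc_tag(hash_key: int, otp: int, msg: list[int]) -> int:
    # Wegman-Carter tag: universal hash, then one-time pad on the tag.
    return poly_hash(hash_key, msg) ^ otp

hash_key = secrets.randbelow(P)    # long-lived hash key
otp = secrets.randbelow(1 << 61)   # fresh pad per message
tag = wc_tag(hash_key, otp, [10, 20, 30])
```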


Room P3.10, Mathematics Building

João Rasga, SQIG-IT / IST-UTL

Completeness and interpolation of almost-everywhere quantification over finitely additive measures

We give an axiomatization of first-order logic enriched with the almost-everywhere quantifier over finitely additive measures. Using an adapted version of the consistency property adequate for dealing with this generalized quantifier, we show that such a logic is both strongly complete and enjoys Craig interpolation, relying on a (countable) model existence theorem. We also discuss possible extensions of these results to the almost-everywhere quantifier over countably additive measures. The talk reports on joint work with Wafik Lotfallah and Cristina Sernadas.


Room P3.10, Mathematics Building

Diogo Poças, Instituto Superior Técnico

Complexity with costing and stochastic oracles

We study the idea of coupling a Turing machine with an analog device as oracle, thus defining a new class of machines called analog-digital Turing machines. An example of such an analog device is the Scatter Machine (SME), proposed by Beggs and Tucker in 2007 and then coupled with a Turing machine in a model proposed by Beggs, Costa, Loff and Tucker in 2008. Subsequent studies of different physical measurement experiments led to the conclusion that analog devices can be categorized into one of three different types, namely two-sided, threshold and vanishing, corresponding to three different oracles for Turing machines. (The SME is an example of the two-sided type.) We analyze the computational power of such hybrid systems subject to polynomial resources. We shall present techniques to prove lower and upper bounds, considering different degrees of precision in the measurement procedure. Joint work with José Félix Costa.
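The flavour of a measurement oracle can be conveyed by a small sketch (an assumed toy, not the SME's actual physics): a two-sided experiment answers only left/right queries about an unknown position, so the number of oracle calls bounds the precision the machine can extract.

```python
# Toy two-sided measurement: binary search for an unknown z in [0, 1],
# where each oracle call reveals only whether z lies left of the query
# point.  Assumed illustration of precision-bounded measurement.
def measure(z: float, queries: int) -> float:
    lo, hi = 0.0, 1.0
    for _ in range(queries):
        mid = (lo + hi) / 2
        if z < mid:
            hi = mid      # two-sided answer: z lies to the left
        else:
            lo = mid      # ... or to the right
    return (lo + hi) / 2  # error at most 2**-(queries + 1)

measure(0.3, 20)  # within 1e-6 of 0.3
```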


Room P3.10, Mathematics Building

Carolina Blasio & João Marcos, IFCH - UNICAMP - Brazil & LoLITA - DIMAp - UFRN - Brazil

Do not be afraid of the Unknown

Semantically, if an agent asserts or refutes some sentence, it is commonplace to identify these attitudes with truth-values. We propose instead a useful model for representing and reasoning about information that accommodates distinct cognitive attitudes of acceptance and rejection by an agent. From that model we obtain a generous four-place notion of entailment, henceforth called B-entailment, that generalizes the well-known approaches to multiple-conclusion entailment. One advantage of this model is that it provides a natural interpretation for non-classical logical values characterizing ‘the unknown’: a sentence may be unknown to a given agent because the agent has reasons both to accept it and to reject it; alternatively, it may be unknown because the agent simultaneously has reasons not to accept it and reasons not to reject it. As we will show, a further advantage of our generalized notion of entailment and of its underlying many-dimensional structure of truth-values is that it provides a simple and appealing framework for the uniform representation of many known logics. The talk reports on joint work with João Marcos.


Room P3.10, Mathematics Building

João Marcos, LoLITA - DIMAp - UFRN - Brazil

Negative modalities, consistency and determinedness

We study a modal language for negative operators — an intuitionistic-like negation and its paraconsistent dual — added to (bounded) distributive lattices. For each non-classical negation an extra operator is introduced in order to allow for standard logical inferences to be restored, whenever appropriate. We characterize the minimal normal logic and a few other basic logics with such negative modalities and their companions. The talk reports on joint work with Adriano Dodó.


Amphitheatre Qa02.1, South Tower

Alexandra M. Carvalho, SQIG-IT / IST-UTL

Complete MDL for Bayesian networks

A Bayesian network (BN) is a probabilistic graphical model that efficiently represents a joint probability distribution. It encodes specific conditional independence properties pertaining to the joint distribution via a directed acyclic graph. Learning an unrestricted BN from a sample consists in finding the BN that minimizes the relative entropy, also known as the Kullback-Leibler (KL) divergence, between the distribution represented by the BN and the one induced by the observed frequency estimates (OFE). This problem is known to be NP-hard; therefore, optimal and efficient learning of BNs is restricted to tree-like structures. Using the KL divergence for learning BNs tends to favor complete network structures, since adding an edge never increases the KL divergence. This phenomenon leads to overfitting, which is usually avoided by adding a complexity penalty. Common penalties include the minimum description length (MDL) and the stochastic complexity, which are computable upper bounds of the Kolmogorov complexity of a BN. The MDL version based on integer universal coding has redundancies that can be exploited to compress it further, yielding what is called complete MDL. Although MDL for BNs has been derived, that is not the case for complete MDL. In this work we obtain the complete MDL expression for tree BNs, discuss its extension to other structures, and compare it with other approaches in the context of learning BNs. Joint ongoing work with M. A. Figueiredo.
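The penalized-likelihood idea behind MDL scoring can be sketched in miniature (a hypothetical toy, not the complete-MDL expression of the talk): the score of a structure is the negative log-likelihood of the data under the OFE plus (log2 N)/2 bits per free parameter, and on perfectly correlated binary data the tree x -> y beats the empty graph.

```python
import math
from collections import Counter

# Toy MDL score for a BN over binary variables.  structure maps each
# variable to its tuple of parents; data is a list of dict samples.
def mdl(data, structure):
    n = len(data)
    score = 0.0
    for var, parents in structure.items():
        counts = Counter(
            (tuple(row[p] for p in parents), row[var]) for row in data
        )
        parent_counts = Counter(tuple(row[p] for p in parents) for row in data)
        # Data term: negative log-likelihood under observed frequencies.
        for (pa, v), c in counts.items():
            score -= c * math.log2(c / parent_counts[pa])
        # Model term: (log2 n)/2 bits per free parameter
        # (a binary variable has one free parameter per parent config).
        score += (math.log2(n) / 2) * (2 ** len(parents))
    return score

data = [{"x": 0, "y": 0}, {"x": 0, "y": 0}, {"x": 1, "y": 1}, {"x": 1, "y": 1}]
empty = mdl(data, {"x": (), "y": ()})       # independent model
tree = mdl(data, {"x": (), "y": ("x",)})    # tree with edge x -> y
```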



Room P3.10, Mathematics Building

Cristina Sernadas, SQIG-IT / IST-UTL

Reasoning about unreliable circuits

A complete extension of classical propositional logic is proposed for reasoning about circuits with unreliable gates. The pitfalls of extrapolating classical reasoning to such unreliable circuits are extensively illustrated. Several metatheorems are shown to hold under additional provisos. Applications are provided in the verification of logic circuits and in improving their reliability. The talk reports on joint work with Amílcar Sernadas, João Rasga and Paulo Mateus.
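A tiny illustration of why classical reasoning breaks down (a hypothetical noise model, not the calculus of the talk): if each gate in a chain of NOT gates independently flips its output with probability eps, the output is correct only when an even number of flips occurred, so reliability decays toward 1/2 with depth.

```python
def chain_reliability(eps: float, k: int) -> float:
    # P(even number of flips among k independent flips with prob. eps),
    # by the standard parity identity: (1 + (1 - 2*eps)**k) / 2.
    return (1 + (1 - 2 * eps) ** k) / 2

[round(chain_reliability(0.05, k), 3) for k in (1, 5, 20)]  # 0.95, 0.795, 0.561
```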


Room P3.10, Mathematics Building

Filipe Casal, SQIG - Instituto de Telecomunicações

On Nelson-Oppen techniques

The Nelson-Oppen method allows the modular combination of quantifier-free satisfiability procedures of first-order theories into a quantifier-free satisfiability procedure for the union of the theories, provided that these theories satisfy certain restrictions. In this talk, we will present the Nelson-Oppen techniques for stably infinite, shiny and strongly polite theories, and analyse the relationship between shiny and strongly polite theories in the one-sorted case. We show that a shiny theory with a decidable quantifier-free satisfiability problem is strongly polite, and we provide two different sufficient conditions for a strongly polite theory to be shiny. Based on these results, we derive a combination method for the union of a polite theory with an arbitrary theory. Joint work with João Rasga.


Room P3.10, Mathematics Building

Amaury Pouly, École Polytechnique

Computational Complexity of the GPAC

In 1941, Claude Shannon introduced a model of the Differential Analyzer, called the General Purpose Analog Computer (GPAC). Originally it was presented as a model based on circuits, in which several units performing basic operations (e.g. sums, integration) are interconnected. However, Shannon himself realized that the functions computed by a GPAC are nothing more than solutions of a special class of differential equations of the form \(y'=p(y)\), where \(p\) is a vector of polynomials. Analog computers have since been replaced by their digital counterparts. Nevertheless, one can wonder whether the GPAC could in theory have super-Turing computational power.

A few years ago, it was shown that the Turing-based paradigm and the GPAC have the same computational power. So switching to analog computers would not enable us to solve the Halting problem or anything else uncomputable, but we can nonetheless compute everything a Turing machine does (given enough resources), and a return to analog technology would at least not mean a loss of computational power. However, this result did not shed any light on what happens at the computational complexity level: can an analog computer (GPAC) solve a problem faster (modulo polynomial reductions) than a digital computer (Turing machine)?

I will provide some elements which show that some reasonable restrictions of the GPAC are actually equivalent, at the complexity level, to P (polynomial time) and NP (nondeterministic polynomial time), and that there are strong links with Computable Analysis, which studies the complexity of real functions.
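The dynamics \(y'=p(y)\) are easy to experiment with numerically; here is a minimal sketch (forward Euler integration, illustrative only and unrelated to the complexity results) with \(p(y)=y\), whose exact solution from \(y(0)=1\) is \(e^t\):

```python
def simulate(p, y0: float, t: float, steps: int = 100_000) -> float:
    # Forward Euler integration of the GPAC-style system y' = p(y)
    # from y(0) = y0 up to time t.
    h = t / steps
    y = y0
    for _ in range(steps):
        y += h * p(y)
    return y

simulate(lambda y: y, 1.0, 1.0)  # close to e = 2.71828...
```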


Room P3.10, Mathematics Building

Paulo de Sousa Mendes, U Lisboa

A semantic model for representing causation, and the need for logico-legal criteria in the attribution of causality in criminal law

Causality, as an element of a criminal offence, should not be confused with evidentiary problems. We are not concerned with the empirical laws of causality relevant to proof, but only with causality as a category of the understanding and a general law of the intelligible world. This general law of causality applies in the same way to all types of crime whose consummation requires a result (homicide, bodily harm, fraud, etc.). According to the legal theory of conditions, any fact without which the typical result would not have occurred is a condition, equivalent to all the others, and counts singly as a cause, just like all the others. It is therefore enough for an agent to have contributed a condition to the result to be regarded as having caused it. Under this theory, the causal nexus is determined through the conditio formula, which consists essentially in a hypothetical counterfactual reasoning. The conditio formula cannot, however, satisfactorily resolve cases of causal overdetermination of the result by the actions of several agents, when they have acted independently of one another. A formal semantic model of the evolution of the world, built on branching temporal logics, can help in understanding the causal connections between individual actions and the relevant result. In the end, the model will allow us to see that, even in situations with no factual uncertainty, doubts may still remain about attributing causality to particular agents. We conclude that the attribution of causality is a logico-legal problem which, for that very reason, must be resolved by appeal to equally logico-legal criteria. Nevertheless, causality must be clearly distinguished from the imputation of the typical result to the agent. This talk reports on joint work with José Carmo.