**Mélodie Andrieu: Complexities of hypercubic billiard words**

Abstract: Sturmian words form a class of infinite words over the binary alphabet which sheds light, through their equivalent definitions, on remarkable interactions between combinatorics, dynamical systems, and number theory. These definitions give rise to various classes of words over the d-letter alphabet, and a broad research program aims to determine which of these classes coincide, and which properties of Sturmian words they still satisfy.

In this talk, we will focus on one dynamical representation of Sturmian words: as words generated by a billiard on a square table, which generalizes to a billiard in the cube and in the hypercube of dimension d; and on two combinatorial quantities which characterize Sturmian words: the *complexity* and the *abelian complexity*.
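Both quantities mentioned in the abstract are easy to compute experimentally. The sketch below, a non-authoritative illustration, uses the Fibonacci word (a standard Sturmian word, fixed point of the substitution 0 → 01, 1 → 0) and checks the characteristic values: factor complexity p(n) = n + 1 and abelian complexity identically 2.

```python
def fibonacci_word(iterations=20):
    # Fibonacci word: fixed point of the substitution 0 -> 01, 1 -> 0,
    # a standard example of a Sturmian word.
    w = "0"
    for _ in range(iterations):
        w = "".join("01" if c == "0" else "0" for c in w)
    return w

def complexity(w, n):
    # Factor complexity: number of distinct factors (substrings) of length n.
    return len({w[i:i + n] for i in range(len(w) - n + 1)})

def abelian_complexity(w, n):
    # Number of distinct letter-count (Parikh) vectors among factors of
    # length n; over a binary alphabet the count of "0"s determines it.
    return len({w[i:i + n].count("0") for i in range(len(w) - n + 1)})

w = fibonacci_word()
for n in range(1, 9):
    print(n, complexity(w, n), abelian_complexity(w, n))
```

On a sufficiently long prefix, the printed values are n + 1 and 2, the two properties that characterize Sturmian words among aperiodic words.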

**Émilie Charlier: Alternate base numeration systems**

Abstract: Alternate base numeration systems generalize the real base numeration systems defined by Rényi. Here, real numbers are represented using finitely many bases, applied periodically. Such systems arise naturally when considering linear numeration systems without a dominant root. As it happens, many classical results generalize to these numeration systems with multiple bases, but some do not. In this talk, I will present a quick survey of the work done so far on their combinatorial, dynamical, and algebraic aspects, with a focus on the language-theoretical point of view. This study has been led in collaboration with several co-authors: Célia Cisternino, Karma Dajani, Savinien Kreczman, Zuzana Masáková and Edita Pelantová.
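The greedy algorithm behind such expansions can be sketched in a few lines; this is an illustrative generalization of Rényi's greedy β-expansion to a periodic tuple of bases, not code from the work surveyed in the talk: at step i the base bases[i mod p] is applied, the digit is the integer part, and the remainder is carried forward.

```python
def alternate_base_expansion(x, bases, n_digits=10):
    # Greedy expansion of x in [0, 1) with the bases applied periodically:
    # at step i, beta = bases[i % p]; emit digit floor(beta * x) and
    # carry the remainder beta * x - digit to the next step.
    digits = []
    for i in range(n_digits):
        beta = bases[i % len(bases)]
        d = int(beta * x)
        digits.append(d)
        x = beta * x - d
    return digits

# Example with the alternating pair (golden ratio, 2):
print(alternate_base_expansion(0.5, [(1 + 5 ** 0.5) / 2, 2], 12))
```

By construction, x = d_0/β_0 + d_1/(β_0β_1) + d_2/(β_0β_1β_2) + …, so truncating the digit sequence gives successive approximations of x.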

**Ismael Jecker: Transducers and the Power of Delay**

Abstract: Transducers are theoretical machines computing functions: they read input words and answer with output words.

Several variations of the standard transducer model have been developed to compute different classes of functions.

This talk surveys three fundamental classes:

- *Sequential functions*, the basic class computed by the simplest model;
- *Regular functions*, a versatile class recognized by a variety of models;
- *Polyregular functions*, the most expressive but most complex class.

The central theme of the talk is the notion of delay, a powerful tool that reduces problems about transducers to problems about automata.
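As a concrete taste of the simplest class, here is an illustrative sequential function (this specific example is the author of this note's choice, not taken from the talk): binary increment on numbers written least-significant-bit first. A deterministic transducer reads the input left to right, keeps a one-bit carry as its state, emits one output bit per input bit, and appends a final output when the input ends.

```python
def sequential_increment(bits):
    # Sequential transducer computing binary increment, LSB first.
    # State = the carry bit, initially 1 (we are adding one).
    out = []
    carry = 1
    for b in bits:
        out.append(b ^ carry)   # output bit for this step
        carry = b & carry       # next state
    out.append(carry)           # final output emitted at end of input
    return out

# 6 = [0, 1, 1] (LSB first); incrementing gives 7 = [1, 1, 1, 0]
print(sequential_increment([0, 1, 1]))
```

Note that the same function read most-significant-bit first is *not* sequential (the first output bit depends on the whole input), a typical example of why the richer classes, and tools like delay, are needed.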

**Jussi Karlgren: When the Map is More Exact than the Terrain**

Abstract: Computational models of the world are often designed to conform both to our intuitions about important qualities of the world and to computational convenience. These two design principles can be at cross purposes. Hypotheses about, e.g., human information processing, based on observations of effective and efficient human behaviour, can be quite useful as inspiration for how a computational model should be put together. But those intuitions and guiding principles, and the metaphors they provide, can later turn out to limit innovation and development, especially if they are catchy and easily communicated to the outside world. This talk will give examples related to neural processing models, and specifically vector space models of the semantics of human language.

**Shuo Li: On the number of squares in a finite word**

Abstract: A square is a word of the form uu, where u is a finite word. The problem of determining the number of distinct squares in a finite word was initially explored by Fraenkel and Simpson in 1998. They proved that the number of distinct squares, denoted Sq(w), in a finite word w of length n is upper bounded by 2n, and conjectured that Sq(w) is no larger than n. In this talk, we review some old and new findings concerning the square-counting problem and prove that Sq(w) ≤ n − Θ(log2(n)).
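The quantity Sq(w) is straightforward to compute by brute force, which is handy for checking the bounds above on small words. This is a naive illustrative sketch (roughly cubic time), not one of the linear-time algorithms from the literature:

```python
def distinct_squares(w):
    # Brute-force count of distinct squares uu occurring as factors of w.
    squares = set()
    n = len(w)
    for i in range(n):
        for half in range(1, (n - i) // 2 + 1):
            u = w[i:i + half]
            if u == w[i + half:i + 2 * half]:
                squares.add(u + u)
    return len(squares)

print(distinct_squares("aaaa"))  # squares "aa" and "aaaa" -> 2
```

Running it over random or structured words of length n lets one observe experimentally that Sq(w) stays below n, as the conjecture and the bound in the abstract predict.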

**Will Merrill: Formal Languages and the NLP “Black Box”**

Abstract: The empirical success of deep learning in NLP and related fields motivates understanding neural nets on an abstract computational level: what kind of string patterns can they be sensitive to, and what kinds of representations can they build? Luckily, formal language theory provides a useful framework for analyzing neural networks, which I will overview in this talk. I will cover the classical Turing completeness result for infinite-precision RNNs, separations of different RNN architectures with limited precision, and recent work characterizing transformers as language recognizers using circuits and logic. I may also cover applications of this work, such as the extraction of discrete models from neural networks. Hopefully, the tutorial will synthesize different analysis frameworks and findings about neural networks into a coherent narrative, and provide a call to action for open questions that formal language theorists can engage with.

**Richard Mörbitz: The M-monoid Parsing Problem and its Application to Constituency Parsing**

Abstract: In this talk, I will present a general framework for weighted parsing, called M-monoid parsing, which is built on top of grammar-based language models and employs flexible weight algebras. It generalizes previous work in that area (semiring parsing, weighted deductive parsing) and also covers applications outside the classical scope of parsing, e.g., algebraic dynamic programming. I will present an algorithm which terminates and is correct for a large class of weighted grammar-based language models. Finally, I will discuss current research on a particular constituency parsing problem where the set of possible constituent trees is given by a new automaton model, called the constituent tree automaton. This problem, too, turns out to be an instance of the M-monoid parsing problem.
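The semiring-parsing special case that M-monoid parsing generalizes can be sketched with a CKY-style chart parser parameterized by the weight operations. Everything here (the grammar encoding, the function names) is an illustrative assumption, not the talk's framework: swapping in a different `plus`/`times`/`zero` changes what is computed (derivation counts, Viterbi scores, inside probabilities) without touching the parsing logic.

```python
def cky(word, unary, binary, plus, times, zero):
    # Generic semiring CKY over a grammar in Chomsky normal form.
    # unary:  {(A, terminal): weight}   binary: {(A, (B, C)): weight}
    # chart[i][j][A] = semiring sum over derivations of word[i:j] from A.
    n = len(word)
    chart = [[{} for _ in range(n + 1)] for _ in range(n + 1)]
    for i, a in enumerate(word):
        for (A, b), w in unary.items():
            if b == a:
                chart[i][i + 1][A] = plus(chart[i][i + 1].get(A, zero), w)
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for (A, (B, C)), w in binary.items():
                    if B in chart[i][k] and C in chart[k][j]:
                        val = times(times(w, chart[i][k][B]), chart[k][j][C])
                        chart[i][j][A] = plus(chart[i][j].get(A, zero), val)
    return chart[0][n]

# Counting semiring (+, *, 0): weights count the parse trees.
# Ambiguous grammar S -> S S | a; "aaa" has Catalan(2) = 2 trees.
result = cky("aaa", {("S", "a"): 1}, {("S", ("S", "S")): 1},
             plus=lambda x, y: x + y, times=lambda x, y: x * y, zero=0)
print(result["S"])  # -> 2
```

Replacing the counting semiring with (max, +, −∞) turns the same routine into Viterbi parsing, which is precisely the kind of reuse that richer weight algebras, such as M-monoids, push further.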

**Markus Schmid: On Structural Tractability Parameters for Hard String Problems**

Abstract: For intractable graph problems, many structural “tractability parameters” exist (such as treewidth and other width parameters), which, if bounded by a constant, yield polynomial-time solvable cases or even fixed-parameter tractability. Indeed, consulting the well-known “Information System on Graph Classes and their Inclusions” (https://www.graphclasses.org/), the knowledge on graph parameters and their corresponding graph classes (both algorithmically as well as combinatorially) seems overwhelming, especially to a non-expert in algorithmic graph theory.

For (intractable) problems on strings, the situation is quite different. Most parameters that naturally come to mind have a non-structural flavour like alphabet size, number of input strings, size of language descriptors, etc. In any case, structural tractability parameters for strings seem to be highly dependent on the actual problem at hand: some structural restriction of strings may substantially decrease the complexity of one specific problem, while it has virtually no impact for any other string problem.

In this talk, we will survey some results that stem from the task of discovering structural parameters for strings that can make hard problem instances tractable. We will focus on the approach of translating string problems into graph problems in order to make use of algorithmic meta-theorems from algorithmic graph theory.