Structural questions in neural structures

July 1-3, 2002


Introduction

The original purpose of the event was to give an overview of the techniques - borrowed from algebra, combinatorics, logic, and linguistics - used for studying neural networks. We did not intend to cover the techniques traditionally used for continuous and discrete dynamical systems. While organizing the conference, we eventually focused on inviting lecturers from whom we could learn a lot and enjoy doing so, even if their research interests are only loosely connected to the topics above.

Organizers:
János Tóth and Krisztina Szalisznyó

Location

We are not expecting any non-Hungarian-speaking participants, so the location description is given in Hungarian only.

Program

July 1, Monday
chair: Krisztina Szalisznyó

09:15 János Tóth Opening Remarks
09:30 Andrea Székely The organization of the CNS: from brain to synapse 60 min
10:30 János Tóth Mathematica and Computer Science 60 min
11:30 lunch
12:30 Elemér Lábos About the Sloane encyclopedia 20 min
13:00 Fülöp Bazsó Boolean function calculus with some applications 60 min
14:00 Attila Pethő Experimental number theory 90 min
15:30 discussion

July 2, Tuesday
chair: János Tóth

09:30 László Négyessy Reverberation, perseveration, distractibility: questions about the function of neural networks of the prefrontal cortex 60 min
10:30 Béla Paláncz Symbolic Neural Networks Computations 30 min
11:30 Zoltán Somogyvári Dynamics of random Boolean networks 45 min
12:30 lunch
13:30 Lajos Rónyai Indexing, texts, Internet, algebra, algorithms 90 min
15:00 Gábor Proszéky Decision situations in language processing 60 min
16:00 discussion

July 3, Wednesday
chairs: László Zalányi, Fülöp Bazsó

09:30 Lajos Lóczi Stephen Wolfram: A New Kind of Science 20 min
10:30 Erzsébet Csuhaj-Varjú Networks of language processors: bio-inspired models of computing 60 min
11:30 lunch
13:30 Péter Érdi River drift of a fashion, or about the small-world structure of the human language 60 min
14:30 Kriszta Szalisznyó Symbolic dynamics and formal languages 20 min
15:30 László Kálmán Linguistic Capacity: Associative or Combinatory? 60 min

Abstracts


Boolean function calculus with some applications
Fülöp Bazsó
MTA KFKI RMKI
Department of Biophysics
H-1121 Budapest, Konkoly Thege u. 29-33.
http://cneuro.rmki.kfki.hu
bazso@sunserv.kfki.hu

Dynamic phenomenological models of neural and other networks can be created by iterating Boolean functions, and the basic dynamical properties of such networks can be understood in this framework. It can be shown that in dynamical systems defined this way, the notions and tools of continuous dynamics find their parallels.

We illustrate the introduced notions through several examples.
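As a minimal sketch of this idea (the three-node network and its update rules below are invented for illustration and are not taken from the talk), the following Python snippet iterates a Boolean network and lists the attractor reached from every initial state; fixed points and cycles play the role that equilibria and periodic orbits play in continuous dynamics.

from itertools import product

# Hypothetical update rules of a 3-node Boolean network: node 0 copies node 2,
# node 1 is the AND of nodes 0 and 2, node 2 is the negation of node 1.
def step(state):
    x0, x1, x2 = state
    return (x2, x0 & x2, 1 - x1)

def attractor(state):
    """Iterate until a state repeats; return the cycle reached from `state`."""
    seen = []
    while state not in seen:
        seen.append(state)
        state = step(state)
    return seen[seen.index(state):]        # the periodic part of the trajectory

for s in product((0, 1), repeat=3):
    cyc = attractor(s)
    kind = "fixed point" if len(cyc) == 1 else f"cycle of length {len(cyc)}"
    print(s, "->", kind, cyc)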


Networks of language processors: bio-inspired models of computing
Erzsébet Csuhaj-Varjú
Computer and Automation Research Institute
Hungarian Academy of Sciences
H-1111 Budapest, Kende u.13-17.
csuhaj@sztaki.hu

Networks of language processors (NLP systems) are unconventional models of computing, introduced for describing the behaviour of communities of dynamically changing and communicating agents in terms of formal grammars and languages.

A network of language processors is a virtual graph with a language-identifying mechanism (a grammar, an automaton, etc.) and a set or a multiset of words located at each node. These language processors work in a synchronized manner by performing rewriting steps and communication steps. At a rewriting step, each language processor rewrites the strings (or some of them) found at its node at that moment, and then a communication step follows. At this step the language processors, according to the communication protocol of the network, communicate the previously rewritten strings to the other nodes. A sequence of alternating rewriting and communication steps determines a computation in the network. The result of the computation can be defined as the set or multiset of words found at a dedicated node during the computation or at some previously fixed step of it.
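A toy Python sketch of this alternating regime is given below; the three nodes, their single rewriting rules and the communication filter (every word containing the letter 'c' is broadcast) are invented for illustration and are not taken from the references.

# Each node holds a set of words and one string-rewriting rule.
nodes = {
    "A": {"rule": ("a", "ac"), "words": {"a"}},
    "B": {"rule": ("c", "cb"), "words": set()},
    "C": {"rule": ("c", "ab"), "words": set()},
}

def rewriting_step():
    # every node rewrites each of its words once with its own rule
    for node in nodes.values():
        left, right = node["rule"]
        node["words"] = {w.replace(left, right, 1) for w in node["words"]}

def communication_step():
    # words satisfying the (invented) filter are copied to every other node
    outgoing = {name: {w for w in node["words"] if "c" in w}
                for name, node in nodes.items()}
    for name, node in nodes.items():
        for other, words in outgoing.items():
            if other != name:
                node["words"] |= words

for t in range(4):                          # a short computation
    rewriting_step()
    communication_step()
    print(t, {name: sorted(node["words"]) for name, node in nodes.items()})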

Networks of language processors are not only tools for computation (determining languages) but can also be used to study questions such as the description of the dynamics of the string multisets at the nodes. If the words are considered descriptions of organisms or identifiers of DNA sequences, then NLP systems can be useful for describing the behaviour of interacting populations with dynamically changing members.

In this talk we discuss two variants of NLP systems, networks of Lindenmayer systems and networks of Watson-Crick Lindenmayer systems, both demonstrating the power of this framework. In the case of networks of Lindenmayer systems the component language processors are grammars with totally parallel rewriting, modelling developmental systems. In the case of Watson-Crick Lindenmayer systems the totally parallel rewriting (the development) is triggered by a fundamental property of DNA computing, Watson-Crick complementarity. We show that both models, even with very simple language processors and communication protocols, are as powerful as Turing machines; moreover, owing to the massive parallelism, some well-known NP-complete problems can be solved by these constructs in linear time. Finally, we demonstrate a method for describing the dynamics of the string multisets occurring during computation in these networks.

References:

  • E. Csuhaj-Varjú: Networks of Language Processors. EATCS Bulletin 63 (1997), 120-134. Also appears in Gh. Păun, G. Rozenberg, A. Salomaa (eds.), Current Trends in Theoretical Computer Science, World Scientific, Singapore, 2001, 771-790.
  • E. Csuhaj-Varjú and A. Salomaa: Networks of parallel language processors. In: New Trends in Computer Science, Cooperation, Control, Combinatorics (Gh. Păun and A. Salomaa, eds.), LNCS 1218, Springer-Verlag, Berlin-Heidelberg-New York, 1997, 299-318.
  • E. Csuhaj-Varjú and A. Salomaa: Networks of Watson-Crick D0L systems. TUCS Report 419, Turku Centre for Computer Science, Turku, 2001. To appear in Proc. Third International Conference on Words, Languages, and Combinatorics, Kyoto, 2000 (M. Ito, ed.), World Scientific, Singapore, 2002.
  • G. Păun, G. Rozenberg and A. Salomaa: DNA Computing. New Computing Paradigms. Springer-Verlag, Berlin, Heidelberg, New York, 1998.
  • Handbook of Formal Languages, Vol. I-III (G. Rozenberg and A. Salomaa, eds.), Springer-Verlag, Berlin, Heidelberg, New York, 1997.

Linguistic Capacity: Associative or Combinatory?
László Kálmán
Mindmaker Ltd., Budapest
kalman@mindmaker.hu

Chomskyan models of natural-language grammars, based on formal language theory, crucially rely on the assumption that linguistic knowledge contains two different components, namely a combinatory system (syntax) and one that contains knowledge about the use of elementary building blocks (the lexicon). Post-Chomskyan theories of grammar (in particular, Construction Grammar) posit that words and syntactic structures (and whatever is in between) must be treated in a homogeneous manner, since they are all alike in that they consist of associations of formal properties with semantic ones. In accordance with this, constructionism says that in linguistic communication we do not use linearly ordered sequences of lexical building blocks, but complex and possibly overlapping patterns (constructions) arising from the simultaneous satisfaction of various constraints.

Mindmaker Ltd. is trying to implement such a post-Chomskyan grammar within the paradigm of 'integrated NLP'. In this implementation, constructions constitute a network, and they activate each other through associations. Yet each carries essentially symbolic information, and the task is essentially combinatory. It is unclear for the moment how familiar sub-symbolic representation and learning systems (e.g., connectionism) can account for this double character of the NLP task.


Stephen Wolfram: A New Kind of Science
Book review by
Lajos Lóczi
Budapest University of Technology and Economics, Faculty of Sciences
Department of Differential Equations
1521 Budapest
lloczi@math.bme.hu

The long-awaited book by Stephen Wolfram has been completed. This massive, 1197-page tome is a great intellectual achievement: one of his friends suggested it should be called Principia Computatus. Using the early results of his investigations into the behaviour of cellular automata, and the technical computing system Mathematica he created, the book presents a huge number of examples from various scientific disciplines where simple rules generate immensely complex results. The book culminates in the Principle of Computational Equivalence, which implies that at some level of complexity everything is exactly as complex as anything else.
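To give a flavour of "simple rules, immensely complex results", the short Python sketch below simulates one of the book's recurring examples, the elementary cellular automaton rule 30: each cell is updated from just itself and its two neighbours, yet the evolution from a single black cell looks effectively random.

RULE = 30                # Wolfram's numbering of elementary cellular automata
WIDTH, STEPS = 63, 30    # a small periodic lattice and a few time steps

def next_row(row):
    # the new value of cell i is bit (4*left + 2*centre + right) of RULE
    return [(RULE >> (4 * row[(i - 1) % WIDTH] + 2 * row[i] + row[(i + 1) % WIDTH])) & 1
            for i in range(WIDTH)]

row = [0] * WIDTH
row[WIDTH // 2] = 1                         # start from a single "on" cell
for _ in range(STEPS):
    print("".join("#" if c else "." for c in row))
    row = next_row(row)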

A New Kind of Science is a readable book that shares his ideas with scientists and nonscientists alike. Throughout the book, emphasis is placed on the algorithm rather than the equation. Wolfram predicts it will have unprecedented implications for science and scientific thinking.


Reverberation, perseveration, distractibility: questions about the function of neural networks of the prefrontal cortex
László Négyessy
Semmelweis University
Department of Anatomy
Neurobiology Research Group
negyessy@ana.sote.hu

Understanding the function of the prefrontal cortex (PFC) is crucial for revealing the biological mechanisms of thought and mood disorders, and it is also a central question in cognitive neuroscience. It is widely accepted that the PFC plays an essential role in working memory, especially in executive functions, a main component of working memory. From a neurobiological perspective, the function of working memory is to temporarily maintain neural representations of different kinds of sensory or learned information in an activated form for on-line processing. Manipulation of the information held in working memory by executive functions is a necessary step toward goal-directed behavior.

A characteristic feature of neuronal activity in the PFC is the vigorous response in tasks requiring short-term retention of information, such as delayed-response tasks. Disturbance of this activity results in faulty behavioral responses, indicating its central role in normal function as well as in clinical symptoms. It is reasonable to assume that optimal regulation of delay-related activity is essential for flexible and, at the same time, accurate behavior. Accordingly, excessive persistence or instability of this activity could result in perseveration or distractibility, respectively, which are characteristic symptoms of prefrontal dysfunction. The original idea about the neuronal mechanism underlying delay-related activity was that it arises from reverberation of activity between interconnected neurons. However, this mechanistic proposal has limited explanatory power considering the diversity of circuits integrated in the PFC. The purpose of this talk is to give an introduction to the state of our knowledge regarding the organizational principles of the cortical and subcortical networks whose activities need to be integrated in a comprehensive model of the PFC.


Symbolic Neural Networks Computations
Béla Paláncz
Department of Photogrammetry and Geoinformatics, Civil Engineering Faculty
Budapest University of Technology and Economics
H-1521 Budapest, Hungary
Palancz@epito.bme.hu

An overview is given of Neural Networks, a recently developed but not yet released application package for Mathematica. The most significant feature of this package is that the symbolic form of a trained network can be produced; consequently, it is easy to implement the trained network in other applications.

Examples from different fields of science will demonstrate the use of the different types of networks available in the Neural Networks application.
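The idea of a symbolic network can be illustrated independently of the package; the Python/sympy sketch below uses made-up weights and does not use the Neural Networks application's API. Once the weights of a small feed-forward network are fixed, the whole network collapses into one closed-form expression in its inputs, which can then be exported to, and evaluated in, any other environment.

import sympy as sp

x1, x2 = sp.symbols("x1 x2")
# hypothetical weights of a "trained" 2-2-1 network (input->hidden, hidden->output)
W1 = [[0.8, -1.2], [0.5, 0.3]]
b1 = [0.1, -0.4]
W2 = [1.5, -0.7]
b2 = 0.2

hidden = [sp.tanh(W1[j][0] * x1 + W1[j][1] * x2 + b1[j]) for j in range(2)]
output = W2[0] * hidden[0] + W2[1] * hidden[1] + b2

print(output)                                     # the network as one symbolic expression
print(output.subs({x1: 0.3, x2: -0.1}).evalf())   # evaluate it like any formula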


Experimental Number Theory
Attila Pethő
University of Debrecen
Department of Computer Science
pethoe@math.klte.hu

Experiments have a long tradition in number theory, although they were rather called examination of tables or numerical testing of conjectures. I cite here only two classical examples: the first is Gauss' conjecture on the distribution of primes, which was proved nearly a hundred years later by de la Vallée Poussin and Hadamard; the second is Riemann's conjecture, which is still unproved. Both Gauss and Riemann did thorough numerical investigations with "pencil on paper" before stating their conjectures.
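The kind of experiment that led to Gauss' conjecture is easy to repeat today; the short Python sketch below counts the primes up to x with the sieve of Eratosthenes and compares the count pi(x) with the approximation x / ln x.

from math import log

N = 1_000_000
sieve = bytearray([1]) * (N + 1)            # sieve of Eratosthenes up to N
sieve[0] = sieve[1] = 0
for p in range(2, int(N ** 0.5) + 1):
    if sieve[p]:
        sieve[p * p :: p] = bytearray(len(sieve[p * p :: p]))

for x in (10 ** k for k in range(2, 7)):
    pi = sum(sieve[: x + 1])                # pi(x) = number of primes up to x
    print(f"x = {x:>9,d}   pi(x) = {pi:>7,d}   x/ln x = {x / log(x):>10.1f}")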

In the 20th century, with the advent of computers and the development of the algorithmic point of view, more and more researchers became interested in the representation of mathematical objects and in the algorithmic aspects of operations. The investigations of Derek and Emma Lehmer, Zassenhaus and Cassels mark a transition from the precomputer to the computer age. They belong to those scientists who realized that computers may become experimental facilities for mathematics. One of the most important early results, from the 1950s, was the Birch and Swinnerton-Dyer conjecture, according to which the analytic and geometric ranks of elliptic curves are equal. It, too, was stated only after long and careful numerical tests, but this time the tests were done by computers.

The solution of Diophantine equations has been an interesting branch of number theory since ancient times. A systematic theory has existed only since the beginning of the 20th century, in our opinion since Hilbert's address at the Second International Congress of Mathematicians in Paris in 1900. In Debrecen, investigations in this direction started in the early 1980s. We developed algorithms for, among others, the solution of Thue equations, index form equations and elliptic equations, implemented them, and applied them to large sets of input data. Our most important finding was that such equations usually have only small solutions. Our numerical results influenced the theoretical investigations; the explicit determination and the size of constants nowadays play an important role, and the proofs of several theorems were simplified. Twenty years ago one could publish a paper in a good journal about the solution of a single Thue or elliptic equation; today this is a routine problem for well-chosen computational number theory software. When examining the results one should be careful, because the usual answer is that the problem has only trivial solutions. The situation is similar to primality tests: if a number survives fast probabilistic tests, then it is almost certainly a prime, but a certificate of its primality is hard to find.
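To illustrate the closing remark, the sketch below runs the standard Miller-Rabin probabilistic test: a number that survives many rounds is almost certainly prime, but the test by itself provides no certificate of primality (the two test numbers were chosen only for illustration).

import random

def is_probable_prime(n, rounds=20):
    """Miller-Rabin test: True means n is very probably prime."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:                 # write n - 1 = 2^s * d with d odd
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False              # a witnesses that n is certainly composite
    return True                       # no witness found: n is very probably prime

print(is_probable_prime(2**89 - 1))   # a Mersenne prime: survives the test
print(is_probable_prime(2**89 + 1))   # composite (divisible by 3): rejected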


Indexing, texts, Internet, algebra, algorithms
Lajos Rónyai
Computer and Automation Research Institute
Hungarian Academy of Sciences
H-1111 Budapest, Lágymányosi u. 11.
ronyai@sztaki.hu

We live in the Information Age. One of our major concerns in this regard is to find the needle (the piece of data) in the huge haystack of information around us. We shall discuss two related results. One is Latent Semantic Indexing for document retrieval; the other is Kleinberg's algorithm for ranking the hits of a search.

A common feature of the two approaches is that they both employ models of a linear algebraic nature. In quite different ways, both of them are connected to the singular value decomposition (SVD) of matrices.
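A minimal sketch of the Latent Semantic Indexing idea (the tiny term-document matrix, the terms and the query below are invented for illustration): a rank-k truncated SVD maps documents and queries into a low-dimensional "concept" space, where relevance is measured by cosine similarity.

import numpy as np

# rows = terms, columns = documents d1..d4 (illustrative counts)
A = np.array([[2, 1, 0, 0],      # "neuron"
              [1, 2, 0, 0],      # "synapse"
              [1, 1, 0, 1],      # "cortex"
              [0, 0, 2, 1],      # "grammar"
              [0, 0, 1, 2]])     # "syntax"

k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Uk, sk = U[:, :k], s[:k]
docs_k = Vt[:k, :].T                  # each row: one document in concept space

query = np.array([1, 0, 1, 0, 0])     # the query "neuron cortex"
q_k = (query @ Uk) / sk               # fold the query into the same space

cos = docs_k @ q_k / (np.linalg.norm(docs_k, axis=1) * np.linalg.norm(q_k))
for i, c in enumerate(cos, start=1):
    print(f"d{i}: cosine similarity {c:+.2f}")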

Meanwhile we shall have an opportunity to speak about Cornelius Lánczos, the genius of linear algebraic computations, and the fortuitous kind of circularity which is present in certain patterns of reasoning.


Dynamics of random Boolean networks
Zoltán Somogyvári
MTA KFKI RMKI
Department of Biophysics
H-1121 Budapest, Konkoly Thege u. 29-33.
http://cneuro.rmki.kfki.hu
soma@sunserv.kfki.hu

Random Boolean networks are typical examples of simple systems exhibiting complex behaviour; they were first introduced and applied to biological networks by Kauffman. They have become model systems of many complex biological networks and have been applied as abstract models of numerous nonlinear complex systems: systems of interacting catalysts at the origin of life, self-regulatory genomic systems, coupled co-evolutionary ecosystems, and neural networks. The behaviour of such networks is also closely related to classical models of statistical physics such as the Ising model and percolation theory. These networks show a typical phase transition depending on the number of connections: above the critical threshold they show "chaotic" behaviour, or at least a discrete analogue of a chaotic system, while below the threshold connectivity they show stable periodic behaviour. The aim of our work was to provide an analytical approximation for the statistical parameters of the lengths of attractors and transients, and to give a scheme for understanding the principles underlying the behaviour of this structurally simple but dynamically complex system.
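A small numerical experiment in this spirit (the parameters below are chosen only for illustration): generate Kauffman-type random Boolean networks with N nodes and K randomly chosen inputs per node, iterate each from a random initial state, and record the transient and attractor lengths; K = 1 and K = 3 sit on the two sides of the critical connectivity K = 2.

import random

def random_network(n, k):
    """Return the update map of a random Boolean network with k inputs per node."""
    inputs = [random.sample(range(n), k) for _ in range(n)]
    tables = [[random.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    def step(state):
        return tuple(tables[i][sum(state[j] << b for b, j in enumerate(inputs[i]))]
                     for i in range(n))
    return step

def transient_and_period(step, state):
    seen = {}
    t = 0
    while state not in seen:
        seen[state] = t
        state = step(state)
        t += 1
    return seen[state], t - seen[state]   # (transient length, attractor length)

random.seed(1)
N = 12
for K in (1, 3):
    stats = []
    for _ in range(50):
        step = random_network(N, K)
        s0 = tuple(random.randint(0, 1) for _ in range(N))
        stats.append(transient_and_period(step, s0))
    avg_t = sum(t for t, _ in stats) / len(stats)
    avg_p = sum(p for _, p in stats) / len(stats)
    print(f"K = {K}: mean transient {avg_t:.1f}, mean attractor length {avg_p:.1f}")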


Symbolic dynamics and formal languages
Kriszta Szalisznyó
MTA KFKI RMKI
Department of Biophysics
H-1121 Budapest, Konkoly Thege u. 29-33.
http://cneuro.rmki.kfki.hu
szali@sunserv.kfki.hu


The organization of the CNS: from brain to synapse
Andrea D. Székely
Semmelweis University
Department of Anatomy, Histology and Embryology
Tűzoltó u. 58.
H-1094 Budapest, Hungary
adszekely@ana.sote.hu

A major role of the central nervous system is the perception, processing and storage of information deriving from the external environment, while generating an adequate response. The ultimate equipment for this task is an intricate neuronal complex which, during phylogeny, has gained both a midline and a cranial position. The CNS of mammals can be subdivided into cerebrum, cerebellum, brain stem and spinal cord.

The fine structure of the mammalian brain is rather conserved, and the emergence of the cerebral cortex counts as a novelty compared to other vertebrate classes. The cortex, according to the traditional histological description, consists of six layers; however, certain cortical fields may differ owing to their developmental origin or even their functional profile. The functional modules of the cortex are represented by a columnar system, such as the orientation columns or the ocular dominance columns in the visual cortex.

The morphological units of the nerve tissue are embodied by neurones (generally cells with processes), which may be classified as excitatory (projection) or inhibitory (local interneurons), although some further neuronal types exist. The neurons establish contacts via synapses, which may also be classified as excitatory-inhibitory or asymmetrical-symmetrical on the basis of their morphology.

While the basic elements are identical in all vertebrate classes, one of the most exciting questions in comparative neuroanatomy is to find out how two completely differently organised neural organs can successfully fulfil the same role, such as vocalisation or spatial orientation in mammals and birds.


Mathematica and Computer Science
János Tóth
Department of Mathematical Analysis, Faculty of Sciences
Budapest University of Technology and Economics
H-1521 Budapest, Hungary
jtoth@math.bme.hu

Our aim is to show that the mathematical program package Mathematica is useful not only for solving problems involving symbolic and numerical calculations, graphics, animation, sound generation and publication on paper and on the web, but also for teaching elements of computer science such as list processing, pattern matching, functional programming, recursion, λ-calculus, etc.

References:

  • Gray, J. W.: Mastering Mathematica. Programming Methods and Applications. AP Professional, Boston etc., 1994.
  • Maeder, R. E.: Computer Science with Mathematica. Theory and Practice for Science, Mathematics, and Engineering. Cambridge University Press, Cambridge, 2000.
  • Szili, L., Tóth, J.: Mathematics and Mathematica. ELTE Eötvös Kiadó, Budapest, 1996. (In Hungarian.)