The non-obvious relationship between syntax and semantics

Ferdinand de Saussure described the relationship between syntax and semiotics/semantics (he does not draw the distinction) as two coded forms of the same message, written, if you will, on opposite sides of the same piece of paper. The three diagrams in Figure D.1 below depict the successive development of this concept. They are taken directly from de Saussure's textbook [1] and clearly illustrate his linguistic ideas. But they are wrong, as indicated by the watermarks in Figure D.1 below.


Figure D.1: (a) top, (b) middle, (c) bottom


To see how wrong Saussure is in diagrams D.1 (a) and D.1 (b), I include my own (also wrong) mental image of the interplay of syntax and semantics in Figure D.1 (c). In the bottom diagram, the words pass from the 'message (syntax)' condition to the right of my reading point, to the 'meaning (semantics)' condition to the left of my reading point. The syntactic string on the right 'shrinks' as the semantic 'bubble' on the left grows, until the final symbol ('delicatessen') has been subsumed within it, and the reader understands the meaning contained within the message. 

My original idea was that syntax consisted of a string of separate language units, the constituents of the message, where the message is the (non-information-bearing) vehicle which 'contains' (particle/object analog) or 'carries' (wave/signal analog) the meaning or semantics, which is the (information-bearing) cargo or payload. Whereas the message/vehicle/container is a string of components or constituents, taken perhaps from a generic 'parts bin' of plug-and-play Lego-like units, the meaning is a 'gestalt', an indivisible whole, or semantic 'bubble'.

If language is considered primarily as a tool of thought, and only secondarily as a communication mechanism, a different picture emerges. What seems like a string of symbols (ie syntax, a permutational code) is actually the list of nodes visited during a descending traversal of a hierarchical memory tree. There is never any kind of positional/permutational code. At any given level of the i-linguistic (ie Marr-Chomsky) memory hierarchy, the semantics is given by the combination of symbols present: THIS IS BECAUSE MEANING IS A COMBINATORIAL CODE. The recursive traversal of a semantic hierarchy yields a semantic representation (data structure) with a pseudo-linear structure.
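
To make the contrast concrete, here is a minimal Python sketch (the tree, the symbols and the function names are all hypothetical, not GOLEM code) in which the semantics at each level is the order-free combination of symbols present, while the syntax is the pseudo-linear list of nodes visited by a descending traversal:

```python
# Minimal sketch with hypothetical names: semantics as a nested, order-free
# combination of symbols; syntax as the node list of a descending traversal.

memory_tree = {
    "meal": {
        "sandwich": {"bread": {}, "pastrami": {}},
        "drink": {"coffee": {}},
    }
}

def semantics(node):
    """Combinatorial code: an unordered set of (symbol, sub-semantics) pairs."""
    return frozenset((sym, semantics(children)) for sym, children in node.items())

def syntax(node, visited=None):
    """Pseudo-linear string: the symbols in the order the traversal visits them."""
    visited = [] if visited is None else visited
    for sym, children in node.items():
        visited.append(sym)
        syntax(children, visited)
    return visited

print(syntax(memory_tree))     # ['meal', 'sandwich', 'bread', 'pastrami', 'drink', 'coffee']
print(semantics(memory_tree))  # nested frozensets: insertion order no longer matters
```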

Consider Miller's magic number, which is usually given as between 5 and 9, with an average value of 7. This is the maximum number of digits (eg of a telephone number) that an average person can remember easily, ie without excessive rehearsal or other specific mnemonic techniques. An important caveat when applying this rule is to remember that it operates recursively, which is why we automatically 'chunk' phone numbers into trigrams or sometimes tetragrams, and also why large decimal numbers have a comma marking off every third power of 10, eg 5,010 and 191,000,445. The point I am making, and the fact that links this to the arguments above, is that the human cerebral cortex has six layers. If we adopt the Marr-Chomsky model, then this neural structure most likely consists of three 'descending' recursive layers (Chomsky's productions) and three 'ascending' recursive layers (Marr's representations).
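
As a toy illustration of this recursive chunking (not a cognitive model; the constant and function names are invented for this sketch), the Python below groups a long digit string into chunks of three and re-chunks until the top level fits within Miller's span:

```python
# Toy illustration only: recursive chunking keeps every level of the
# hierarchy within Miller's ~7-item span.

MILLER_SPAN = 7

def chunk(items, size=3):
    """Group a flat sequence into sub-lists of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def chunk_recursively(items, size=3):
    """Re-chunk until the top level fits comfortably within the span."""
    level = list(items)
    while len(level) > MILLER_SPAN:
        level = chunk(level, size)
    return level

print(chunk_recursively("191000445"))
# [['1', '9', '1'], ['0', '0', '0'], ['4', '4', '5']]  -- cf. 191,000,445
```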

Chomsky's recent models rely on two 'Merge' operations: a merging of two child nodes on the same recursive level (External Merge) and a merging of a parent node with one of its children (Internal Merge). As you can see from the neural models used by GOLEM theory (see sections 5 and 6), these linguistic operations are easily implemented in memory; that is, they are integral to the neural hierarchy.
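
The two Merge operations, as described above, can be sketched as elementary operations on nested sets; the Python below is only an illustration of that idea, not of Chomsky's formalism or of any actual GOLEM module:

```python
# Illustrative sketch only: syntactic objects as nested frozensets,
# built by the two Merge operations described in the text.

def external_merge(a, b):
    """Merge two independent objects (two children on the same level)."""
    return frozenset({a, b})

def internal_merge(parent, child):
    """Re-merge a parent with one of its own constituents."""
    assert child in parent, "Internal Merge re-uses material already inside the object"
    return frozenset({parent, child})

vp = external_merge("gave", "umbrella")   # {gave, umbrella}
moved = internal_merge(vp, "umbrella")    # {{gave, umbrella}, umbrella}
print(vp, moved, sep="\n")
```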

The point is that by acknowledging that these descending and ascending traversals of the memory hierarchy are syntactic forms, we are in a better position to establish their TRUE relationship to semantics, which we have defined as the memory hierarchies being traversed. As a common-sense check, think about programming semantics, which we can (simplistically) conceptualise as the construction of complex data types from simple ones.
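
As a sketch of that common-sense check (the type names are purely illustrative), complex 'semantic' types can be assembled from simpler ones:

```python
# Illustrative only: programming semantics as compound data types
# built from simpler ones.

from dataclasses import dataclass

@dataclass(frozen=True)
class Person:
    name: str

@dataclass(frozen=True)
class Item:
    label: str

@dataclass(frozen=True)
class Possession:
    """A compound type assembled from the simpler types above."""
    holder: Person
    item: Item

before = Possession(Person("John"), Item("Umbrella"))
after = Possession(Person("Mary"), Item("Umbrella"))
```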

By these arguments, the true picture emerges: syntax describes the process of 'cutting out' a sub-tree (one corresponding to the subject's current situation) from the memory super-tree of all possible experiences. Syntax is how we change our own (or someone else's) semantic state. These operations are similar whether they are local to self, and used to plan and execute thought and behaviour, or extended to non-self, and used to plan and produce spoken/written communications.
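
A hedged sketch of this 'cutting out' operation, using the same kind of hypothetical nested-dictionary memory tree as before (the symbols are borrowed from the John and Mary narrative later in this section):

```python
# Hypothetical super-tree of possible experiences; the subject's current
# situation corresponds to one sub-tree within it.

super_tree = {
    "shelter": {
        "house": {"fire": {}, "umbrella": {}},
        "rain": {},
    },
    "shop": {"food": {}},
}

def cut_subtree(tree, root_symbol):
    """Return the sub-tree rooted at the first node matching root_symbol."""
    for symbol, children in tree.items():
        if symbol == root_symbol:
            return {symbol: children}
        found = cut_subtree(children, root_symbol)
        if found is not None:
            return found
    return None

print(cut_subtree(super_tree, "house"))
# {'house': {'fire': {}, 'umbrella': {}}}
```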


Figure D.2: (a) top, (b) bottom


So where should we start? How about this statement: we inherit the ability to create and understand semantics, but we learn the ability to create and understand syntax. When we say human infants are born with the 'capacity for language', what we really mean is that we are born with instinctual semantic capabilities. In other words, we already know what stuff is, and what it means.

We need to delve a little deeper into the system that produces language, ie the human mind. We can show that semantics is a combinatorial code using the following reasoning: the meaning/semantics of a situational description depends on the nature of the situation (ie the elements that are present; mathematically speaking, their combination), not the nature of the description (ie the presentation of those elements; mathematically speaking, their permutation).

English syntax employs the passive voice, which allows writers to vary word/phrase order (indicial rank, or thematic importance) in the string while preserving meaning [4]. To see how this feature is consistent with the GOLEM language model, consider the following narrative. John and Mary must take shelter from heavy rain in an abandoned house. Mary volunteers to walk to the nearest shop, while John gets a fire going. John finds an umbrella, and gives it to Mary so she won't get wet.

Round brackets indicate a combination, set or semantic state, s.t. (A, B) = (B, A) - commutativity
Square brackets indicate a sequence, array or string, s.t. [A, B] != [B, A] - non-commutativity

Semantic state before: ((John, Umbrella), Mary)
Semantic state after: (John, (Umbrella, Mary))

The number of permutations of three items taken three at a time is P(3,3) = 3! = 6:

[J U M] - John gave the Umbrella to Mary. - Active Voice
[J M U] - John gave Mary the Umbrella. - A.V.
[U M J] - The Umbrella was given to Mary by John. - Passive Voice
[U J M] - The Umbrella owned by John was given to Mary. - P.V.
[M J U] - Mary was given John's Umbrella. - P.V.
[M U J] - Mary was given the Umbrella by John. - P.V.
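
The distinction can be checked mechanically. In the toy Python below (an illustration only), the round-bracket combinations above are modelled as frozensets and the square-bracket strings as ordered tuples, reproducing the two semantic states and the P(3,3) = 6 count:

```python
from itertools import permutations

# Round brackets above ~ frozensets (order-free combinations);
# square brackets ~ ordered tuples (strings). Toy check only.

J, U, M = "John", "Umbrella", "Mary"

before = frozenset({frozenset({J, U}), M})   # ((John, Umbrella), Mary)
after = frozenset({J, frozenset({U, M})})    # (John, (Umbrella, Mary))

# The semantic state is insensitive to presentation order ...
assert frozenset({M, frozenset({U, J})}) == before
assert before != after                       # ... but the two states differ.

# ... whereas the syntactic strings are permutations: P(3,3) = 3! = 6 of them.
strings = list(permutations([J, U, M]))
assert len(strings) == 6
assert (J, U, M) != (J, M, U)                # order matters for strings
```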

1. de Saussure, F. Course in General Linguistics. Edited by Charles Bally and Albert Sechehaye; translated from the French, with an introduction and notes, by Wade Baskin. McGraw-Hill.

2. c/- ResearchGate

3. c/- Princeton courses

4. Pinker, S. (2020). Zombie nouns and the passive voice. Lecture delivered to the Royal Society. https://youtu.be/OV5J6BfToSw

