Brains of Sand: L. Linguistic schismatics

[Figure L.1: the GOLEM/COGESE knowledge map]

[Figure L.2: LCCM theory's Lexical Concept and Cognitive Model]

There is a dangerously wrong assumption undermining contemporary linguistics. Yet no less a person than Noam Chomsky was sucked in by it, and taken for a ride. The assumption starts with the uncontroversial premise that structure determines function. No mistake there; any of us can think of at least a half dozen examples where the truth of this principle is beyond any reasonable doubt. Now the next bit. The structure of language is syntax, and its function is semantics; therefore, semantics is based on syntax. Find the rules for the construction of syntactical expressions, and you will have the rules for encoding semantics, or meaning. What could possibly be wrong with this view of linguistics?

Let's start with a rather obvious point: Noam Chomsky, surely the most talented linguist of the modern era and an undisputed genius of the first order, developed this view of linguistics into something called the 'minimalist' program. It was as inconclusive as it was disappointing. Was he wrong, or was he laboring under a false set of premises, a bad idea? An answer can be obtained by applying Peirce's Hook (retroduction). In other words, we ask: what is the most likely cause of Chomsky's failure? (A) That he did bad work using a good model, or (B) that he did good work using a bad model? The latter scenario (B) is the more likely of the two, given Chomsky's track record of successful discoveries in related sub-disciplines.

TDE theory uses a view of linguistics (schizolinguistics, or linguistic schismatics) which separates semantics from syntax. This idea is quite different from the traditional view; recall it was Ferdinand de Saussure who likened semantics and syntax to codes printed on either side of the same piece of paper: different but inseparable. That view has been around for over a century, and it has been singularly unproductive. Not that it hasn't motivated the authorship of thousands of academic papers over the years; rather, it hasn't borne fruit in non-research domains. Take machine translation: arguably we are as far from an efficient 'true' algorithm as we ever were. The only change over the years is that computers have become so much smaller and more powerful that even relatively weak algorithms*, used in combination with one another, can be cobbled together into a usable device.

The idea that syntax encodes meaning suggests (wrongly) that cognitive structure can be inferred from grammatical structure. Consider the (on the face of it, quite reasonable) separation of words into closed and open classes. Closed class words, those used to 'prime and frame the canvas', include articles, determiners, tense suffixes like -ed, number markers like -s, and pronouns marking the three or so noun cases^. Open class words include the nouns and verbs which 'paint the picture'. This is the same basic idea as separating programming syntax into a central core of reserved words, around which the user can define and name their own customised variables/objects, as sketched below.
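To make the analogy concrete, here is a minimal sketch in Python (chosen purely for illustration): the language's reserved words form a small, fixed 'closed class' the programmer cannot extend, while identifiers form an unbounded 'open class' coined freely by the user.

```python
# The closed/open class analogy, in programming terms.
import keyword

# Closed class: a small, fixed inventory baked into the language.
# Users cannot add to it, just as speakers cannot coin new articles or pronouns.
print(keyword.kwlist[:5])      # ['False', 'None', 'True', 'and', 'as']
print(len(keyword.kwlist))     # roughly 35 reserved words in recent Python versions

# Open class: names the user defines freely, like coining new nouns and verbs.
canvas_width = 640             # a new 'noun'

def paint_picture():           # a new 'verb'
    return "picture painted"

print(paint_picture())
```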

LCCM theory is one of the more significant recent attempts to infer cognitive structure from grammatical structure. LCCM theory^^ takes its name from its two central theoretical constructs: the Lexical Concept and the Cognitive Model (see figure L.2). Figure L.1 gives the GOLEM/COGESE knowledge map, which is almost** identical to Tulving's division of memory, in which (1) memory splits into Declarative and Procedural levels, and (2) the Declarative level is further divided into Episodic and Generic knowledge sub-types.
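As a rough rendering only (the labels follow the description above, not any published GOLEM/COGESE code), the knowledge map can be written as a nested structure:

```python
# The memory division described above, written out as data.
# Node names follow Tulving's taxonomy as summarised in the text;
# this is illustrative, not an official GOLEM/COGESE data structure.
knowledge_map = {
    "Declarative": {                       # upper, semantic layer
        "Episodic": "specific remembered events",
        "Generic": "general facts about the world",
    },
    "Procedural": "skills, 'how' knowledge",   # lower, syntactic layer
}
```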

*arguably, people are less 'literate' than they once were; that is, there is a wider tolerance of grammatical errors where they don't change the intended meaning. Insisting on grammatical correctness to demonstrate one's erudition (evidence of a private college education, perhaps) rather than merely to convey information is a thing of the past.

^subjective (he, she, it), genitive/possessive (his, hers, theirs), dative/objective (him, her, them), and genitive/generic (marked by to- or -ing)

^^The Theory of Lexical Concepts and Cognitive Models

**Tulving's main idea was Episodic memory, which is recognised as one of the major memory systems. In Tulving's words: "It (ie episodic memory) shares many features with semantic memory, out of which it grew (Tulving 1984), but it also possesses features that semantic memory does not (Tulving & Markowitsch 1998)". That is, episodic memory is a kind of semantic memory (note the layers: semantics above, syntax below, in the memory map in figure L.1). This represents an independent point of agreement between Tulving's work and the TDE/GOLEM/COGESE research. The division between semantics and syntax in the COGESE memory map arises from a totally separate logical thread, one based on the view of the TDE as a (Turing-equivalent) Finite State Automaton (FSA), where the states are semantic and the transitions are syntactic. Another independent point of agreement with Tulving is the division (orthogonal to the semantics/syntax split) of the COGESE memory map (c.f. the human brain) into INTRAsubjective/serial computations in the left cortex and INTERsubjective/parallel computations in the right cortex.
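A minimal sketch of that FSA picture, with hypothetical state names and vocabulary: states stand for meanings (semantic), and transitions are driven by word tokens (syntactic), so a grammatical sentence is a token sequence that walks the machine to a complete meaning.

```python
# A toy FSA in the spirit described above: states are semantic
# (they stand for meanings), transitions are syntactic (token driven).
# States and vocabulary here are invented, for illustration only.
transitions = {
    ("START", "the"): "EXPECT_NOUN",        # closed-class word frames the canvas
    ("EXPECT_NOUN", "dog"): "AGENT_KNOWN",  # open-class word paints the picture
    ("AGENT_KNOWN", "barked"): "EVENT_KNOWN",
}

def run(tokens):
    """Walk the FSA; each syntactic token moves us to a new semantic state."""
    state = "START"
    for tok in tokens:
        state = transitions.get((state, tok), "REJECT")
    return state

print(run(["the", "dog", "barked"]))   # EVENT_KNOWN: a complete meaning reached
print(run(["dog", "the"]))             # REJECT: the syntax fails, no meaning
```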
