2013, TC World
…
The principles of minimalism, as developed by John Carroll in The Nurnberg Funnel and subsequent works, are based on solid principles of learning theory derived from cognitive science. Too often, we forget that our “users” are, in reality, learners.
On Nature and Language
I. The roots of the Minimalist Program. AB & LR: To start from a personal note, let us take the Pisa Lectures as a point of departure [1]. You have often characterized the approach that emerged from your Pisa seminars, 20 years ago, as a major change of direction in the history of our field. How would you characterize that shift today? NC: Well, I don't think it was clear at once, but in retrospect there was a period, of maybe 20 years preceding that, in which there had been an attempt to come to terms with a kind of paradox that emerged as soon as the first efforts were made to study the structure of language very seriously, with more or less rigorous rules, an effort to give a precise account of the infinite range of structures of language. The paradox was that in order to give an accurate descriptive account it seemed necessary to have a huge proliferation of rule systems of a great variety, different rules for different grammatical constructions. For instance, relative clauses look different from interrogative clauses, and the VP in Hungarian is different from the NP, and they are all different from English; so the system exploded in complexity. On the other hand, at the same time, for the first time really, an effort was made to deal with what later came to be called the logical problem of language acquisition. Plainly, children acquiring this knowledge do not have that much data. In fact you can estimate the amount of data they have quite closely, and it's very limited; still, somehow children are reaching these states of knowledge which have apparently great complexity, differentiation, and diversity … and that can't be. Each child is capable of acquiring any such state; children are not specially designed for one or the other, so it must be that the basic structure of language is essentially uniform and is coming from inside, not from outside.
But in that case it appears to be inconsistent with the observed diversity and proliferation, so there is a kind of contradiction, or at least a tension, a strong tension between the effort to give a descriptively adequate account and the effort to account for the acquisition of the system, what has been called explanatory adequacy. Already in the nineteen fifties it was clear that there was a problem, and there were many efforts to deal with it; the obvious way was to try to show that the diversity of rules is superficial, that you can find very general principles that all rules adhere to, and if you abstract those principles from the rules and attribute them to the genetic endowment of the child, then the systems that remain look much simpler. That's the research strategy. That was begun around the nineteen sixties, when various conditions on rules were discovered; the idea is that if you can factor the rules into the universal conditions and the residue, then the residue is simpler and the child only has to acquire the residue. That went on for a long time, with efforts to reduce the variety and complexity of phrase structure grammars, of transformational grammars, and so on in this manner [2]. So for example X-bar theory was an attempt to show that phrase structure systems don't have the variety and complexity they appear to have, because there is some general framework that they all fit into, and you only have to change some features of that general system to get the particular ones. What happened at Pisa is that somehow all this work came together for the first time in the seminars, and a method arose for sort of cutting the Gordian knot completely: namely, eliminate rules and eliminate constructions altogether. So you don't have complex rules for complex constructions because there aren't any rules and there aren't any constructions. There is no such thing as the VP in Japanese or the relative clause in Hungarian.
Rather there are just extremely general principles like "move anything anywhere" under fixed conditions that were proposed, and then there are options that have to be fixed, parametric choices: so, head of the construction first or last, null subject or not a null subject, and so on. Within this framework of fixed principles and options to be selected, the rules and the constructions disappear; they become artifacts. There had been indications that there was something wrong with the whole notion of rule systems and constructions. For example, there was a long debate in the early years about constructions like, say, "John is expected to be intelligent": is it a passive construction like "John was seen", or is it a raising construction like "John seems to be intelligent"? And it had to be one or the other, because everything was a construction, but in fact they seemed to be the same thing. It was the kind of controversy where you know you are talking about the wrong thing, because it doesn't seem to matter what you decide. Well, the right answer is that there aren't any constructions anyway, no passive, no raising: there is just the option of dislocating something somewhere else under certain conditions, and in certain cases it gives you what is traditionally called the passive, and in other cases it gives you a question, and so on, but the grammatical constructions are left as artifacts. In a sense they are real; it is not that there are no relative clauses, but they are a kind of taxonomic artifact. They are like "terrestrial mammal" or something like that. "Terrestrial mammal" is a category, but it is not a biological category. It's the interaction of several things, and that seems to be what the traditional constructions are like: VPs, relative clauses, and so on. The whole history of the subject, for thousands of years, had been a history of rules and constructions, and transformational grammar in the early days, generative grammar, just took that over.
So the early generative grammar had a very traditional flair. There is a section on the Passive in German, and another section on the VP in Japanese, and so on: it essentially took over the traditional framework, tried to make it precise, asked new questions and so on. What happened in the Pisa discussions was that the whole framework was turned upside down. So, from that point of view, there is nothing left of the whole traditional approach to the structure of language, other than taxonomic artifacts, and that's a radical change, and it was a very liberating one. The principles that were suggested were of course wrong, parametric choices were unclear, and so on, but the way of looking at things was totally different from anything that had come before, and it opened the way to an enormous explosion of research in all sorts of areas, typologically very varied. It initiated a period of great excitement in the field. In fact I think it is fair to say that more has been learned about language in the last 20 years than in the preceding 2000 years. AB & LR: At some point, some intuitions emerged from much work within the Principles and Parameters approach that economy considerations could have a larger role than previously assumed, and this ultimately gave rise to the Minimalist Program [3]. What stimulated the emergence of minimalist intuitions? Was this related to the systematic success, within the Principles and Parameters approach and also before, of the research strategy consisting in eliminating redundancies, making the principles progressively more abstract and general, searching for symmetries (for instance in the theoretically driven typology of null elements), etc.? NC: Actually all of these factors were relevant in the emergence of a principles and parameters approach. 
Note that it is not really a theory; it's an approach, a framework that accelerated the search for redundancies that should be eliminated and provided a sort of new platform from which to proceed, with much greater success, in fact. There had already been efforts, of course, to reduce the complexity, eliminate redundancies, and so on. This goes back very far; it's a methodological commitment that anyone tries to honor, and it accelerated with the principles and parameters (P&P) framework. However, there was also something different, shortly after this system began to crystallize in the early 80s. Even before the real explosion of descriptive and explanatory work, it began to become clear that it might be possible to ask new questions that hadn't been asked before. Not just the straightforward methodological questions: can we make our theories better, can we eliminate redundancies, can we show that the principles are more general than we thought, develop more explanatory theories? But also: is it possible that the system of language itself has a kind of optimal design; in other words, is language perfect? Back in the early 80s that was the way I started every
Cosmological Theories of Value
Scientific Minimalism

4.1 Background and Overview

The provocative speculation of some of the precedents previously explored in Chap. 2 can be balanced by what might be called "scientific minimalism." As shown in Fig. 1.1, the science explored here can help inform and lead into cosmological worldviews that leverage much contemporary science. Scientific minimalism can be characterized for our purposes by a few basic interrelated methodological ideas and interpretations: epistemological prudence, functionalism, and increasing verisimilitude. We will briefly touch on some high-level value-theory implications of scientific minimalism and then follow with a fairly in-depth exploration of some key contemporary science, such as quantum theory and its potential implications for the more extreme applications to cosmic origins and cosmic evolution, followed by a brief foray into the psychology of science.

4.2 Epistemological Prudence

Epistemological prudence can be thought of as having two parts. The first is often referred to as "Occam's razor," which suggests that, all things being equal, the fewer assumptions a theory makes, the better it is likely to be when compared to others. So in general, among competing theories, the one that makes the fewest assumptions is preferable. This is a kind of parsimony argument, perhaps even an aesthetic one, that values simplicity over complexity (and relates to Daniel Dennett's description of functionalism in the next section). This view reflects a desire to keep theories as simple as possible, but no simpler.
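The parsimony idea above can be made concrete with a standard model-selection device. The following is a minimal, illustrative Python sketch (the data points and function names are invented for this example, not taken from the chapter): it scores a one-parameter constant model against a two-parameter linear model on the same data using an AIC-style penalty, so the model with more assumptions only wins if its better fit outweighs its extra parameters.

```python
import math

# Toy data that happens to be roughly linear (invented for illustration).
xs = [0, 1, 2, 3, 4]
ys = [0.1, 0.9, 2.1, 2.9, 4.2]

def rss_constant(xs, ys):
    """Residual sum of squares for the best constant model (the mean)."""
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys)

def rss_line(xs, ys):
    """Residual sum of squares for the least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))

def aic(rss, n, k):
    """AIC-style score: fit term plus a penalty of 2 per parameter."""
    return n * math.log(rss / n) + 2 * k

n = len(xs)
scores = {
    "constant (1 param)": aic(rss_constant(xs, ys), n, 1),
    "line (2 params)": aic(rss_line(xs, ys), n, 2),
}
best = min(scores, key=scores.get)  # lowest score wins
```

On this data the line's much better fit more than pays for its extra parameter; on noisier, flatter data the penalty would tip the balance toward the simpler constant model, which is the razor at work.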
Linguistics and Philosophy, 1997
In the past several years Chomsky has published a series of papers in which he proposes what he terms a minimalist view of syntax.2 The model of grammar that emerges from this and related work represents a significant departure from the Government-Binding (GB) theory of syntax, which dominated the Principles and Parameters (P&P) approach throughout the 1980s.3 On the GB view, the syntactic well-formedness of a sentence depends upon the satisfaction of constraints that apply at one or more levels of representation in a derivation. The standard GB account recognizes a sequence of three syntactic levels of
1 Much of the research for this paper was done during the summer of 1995, when the alphabetically second author visited the first as a guest academic researcher in the
Adaptive Behavior, 2019
This special issue highlights the work of some recent participants of a series of Minimal Cognition workshops held at the University of Wollongong in 2018. The goal of these workshops has been to showcase the interdisciplinary work being done in what might be called 'minimal cognition research.' Our aim was, and continues to be, to bring biologists, cognitive scientists, philosophers, and computer scientists into dialogue on how we might understand the minimal criteria for cognition and cognitive behavior. This special issue, titled "Approaching Minimal Cognition," is intended to open up further exchange between the different fields converging on this richly interesting area of research, and to spur debate and discussion about the roots of cognition, the indicators of cognitive behavior, collective cognition, and the possibility of life-mind continuity.
Context-sensitivity and semantic …, 2007
ACM SIGDOC Asterisk Journal of Computer Documentation, 1999
SEMINARI DE RECERQUES LINGÜISTIQUES, Universitat de Girona, 19 June 2009
Minimalism: A Look from Below
Ángel J. Gallego (CLT – Universitat Autònoma de Barcelona) <[email protected]>
1. Minimalism: the role of the 3rd factor
(1) Two tensions within GG
a. In the GB approach: descriptive vs. explanatory adequacy
b. In the MP approach: SMT [= language is an optimal solution to interface demands] vs.
Trends in Cognitive Sciences, 2003
Much of perception, learning and high-level cognition involves finding patterns in data. But there are always infinitely many patterns compatible with any finite amount of data. How does the cognitive system choose 'sensible' patterns? A long tradition in epistemology, philosophy of science, and mathematical and computational theories of learning argues that patterns 'should' be chosen according to how simply they explain the data. This article reviews research exploring the idea that simplicity does, indeed, drive a wide range of cognitive processes. We outline mathematical theory, computational results, and empirical data underpinning this viewpoint.
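The simplicity principle described in this abstract is often formalized as two-part minimum description length (MDL): among the patterns compatible with the data, prefer the one that minimizes the cost of stating the hypothesis plus the cost of encoding where the hypothesis gets the data wrong. The Python sketch below is a toy illustration of that idea (the function names and the flat 8-bits-per-symbol cost are invented for this example): a short repeating rule beats a literal listing on regular data because its description is far shorter and it makes no errors.

```python
def two_part_code_length(hypothesis, data, generate):
    """Crude two-part MDL score: bits to state the hypothesis itself,
    plus bits to patch each position where it mispredicts the data."""
    predicted = generate(hypothesis, len(data))
    errors = sum(1 for a, b in zip(predicted, data) if a != b)
    return 8 * len(hypothesis) + 8 * errors  # toy cost: 8 bits per symbol

def repeat_gen(unit, n):
    """Interpret a hypothesis as 'repeat this unit', truncated to length n."""
    return (unit * (n // len(unit) + 1))[:n]

data = "abababababababab"
hyp_simple = "ab"     # hypothesis: "repeat 'ab'"
hyp_literal = data    # hypothesis: "spell the whole string out"

score_simple = two_part_code_length(hyp_simple, data, repeat_gen)
score_literal = two_part_code_length(hyp_literal, data, repeat_gen)
# score_simple is far lower, so the compact pattern is preferred.
```

If the data contained a few irregular symbols, the short rule would pay an error cost per exception, and at some level of irregularity the literal listing (or a richer hypothesis) would win; this trade-off is the mechanism by which simplicity-based accounts avoid both over- and under-fitting the data.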
Frontiers in Psychology
Unpublished, 2020
IEEE Transactions on Professional Communication, 1990
Adaptive Behavior
Context-Sensitivity and Semantic Minimalism (eds. Preyer G, Peter G), 2007
Discourse Processes, 1994
The Western Ontario Series in Philosophy of Science, 2011
Modelling and Using Context 2017, 2017
Esse arts + opinions, 2007
TOWARD A MINIMALIST EDUCATION: NEW PROPOSAL FOR THE EDUCATIONAL SYSTEM, 2024
Granthaalayah Publications and Printers, 2023
The Quarterly Journal of Experimental Psychology A, 1999