It is more than fifty years since Christopher Alexander enunciated his ideal of an architectural project firmly grounded on a rigorous, mathematical formalization, capable of simultaneously controlling an enormous number of variables, far beyond the power of the most gifted, talented and skilled designers ever to have worked within an architectural tradition. Peter Eisenman defended his mythical doctoral thesis at nearly the same time: the logic of architectural form was that of a coherent linguistic system, combining exact functioning with unlimited capacities for variation and complexity. Soon afterwards, he described the autonomous nature of the system as a major break with all previous architecture, a break that opened and characterized a new epistemological era; and he designed, and eventually built, structures in which ever more concurrent logical procedures were set to work in the most mechanical way.

Alexander wanted to satisfy every imaginable requirement, while Eisenman demanded the strictest internal congruence: their criteria of excellence could not be more different. The pattern language that Reyner Banham could describe as the timeless element common to all past and future architectures and the self-referential sign specific to contemporary architecture were obviously contradictory and incompatible. But both ideas were eager to control an extraordinary multiplicity, and both were developed theoretically, and also methodologically, within the limits of the T-square and the adjustable set square: parametric architecture had a theory (or some draft theories) long before it became a real possibility.
The computer finally arrived in the eighties; in those days, nobody cared to refine, improve or reconsider those theories. Whatever firm is taken into consideration, all witnesses offer the same evidence: the story is always one of a distrustful and suspicious principal, making continual statements against any kind of naive fascination with the new auxiliary tool while asking his prosthetic, newly arrived expert in its use to provide some not-really-important secondary material; and of the tool growing, on practical grounds, to take over the whole process.
The effort of collecting chronicles of this sort is being made just now, and the idea that the history of the process must be recorded is part of a larger theoretical endeavor at last under way. This endeavor includes the reappearance of the question of a new architecture for a new time (both specifically based on the computer); the idea of parametricism as a style (be it the only possible avant-garde or the latest of neo-Gothics); the organization of congresses and conferences intended to completely redefine the status of architecture; and the repeated statement that the existence of a theory is a necessary requisite for finding one's way among so much valueless production.
The course will critically study the theoretical materials produced during the different stages of this process.
- Progress in understanding architectural theory in the 21st century and its roots.
- Ability to read a text slowly in order to interpret it properly and critically; identification of relevant terminology.
- Ability to communicate information in oral and written form; management of the time allocated to presentations.
- Ability to work collectively on theoretical and practical matters.
Students will be required to prepare two papers on two texts, to discuss their papers with the teachers, and to present them to the class; the use of complementary bibliography will be encouraged.
Analysis of two texts and presentation to the class (40% + 25%)
Participation in discussion (25%)
Regular class attendance (10%)