import/export | Christian Marc Schmidt | anything(at)christianmarcschmidt.com



Adaptive Landscapes
PDF

Design can be adaptive, demonstrating a potential for change, as the conditions and context within...

This bibliography includes the reading material that was influential for the development of my...

My thesis begins with several opposing concepts: the ideal and the empirical; order and disorder;...

Modular Systems
PDF

In the early 1930s, American type designer William Addison Dwiggins began working on a ‘roman’...

Phonetic alphabets came from the necessity for clear communication over radio lines, and are...

Self-Organizing Networks
PDF

Warchalking exposes wireless Internet networks through hand-inscribed chalkmarks in urban...

A series of postcards is lasercut with haikus, written by the participants of an online forum, in...

In Rimbaud’s ‘ville’ of ‘Les Illuminations’, flows are structural, and steady-states are ephemeral....

Swarms, consisting of independent agents, exhibit spontaneous expansive and contractive macro...

Programme & Mutable Form
PDF

I am interested in exploring program and mutable form within the urban environment, both as an...

Markets are becoming global, while transitory population segments are growing. Cities are becoming...

A process book summarizes my early thesis direction involving computation. Formed around the text...

Computation & Complexity
PDF

This piece appears as a 16-page section in Multi-Purpose, a book produced by the graphic design MFA...

The similarity between language and landscapes is expressed through ‘linguistic’ mark-making within...

The Club of Rome’s famous 1975 article predicted, based on a computer simulation, that the world...

Globalization may encourage the homogenization of cultures, by accelerated processes of selection...

The statement “You Can’t Fall in Love with a Growth Curve” originated as a piece of graffiti, the...

Entropia is a typeface that responds to sound. Entropia can be set to various states of...

Urban Artifacts
PDF

“New York Places & Pleasures” by Kate Simon, published in 1959, is considered among the best...

Ten of the most familiar utopian city grids—from humanistic plans, based on proportions of the...

My proposal for a new railing along the pedestrian and bicycle walkway of the Williamsburg Bridge...

According to Aldo Rossi, urban artifacts are stable moments in the constantly shifting composition...

The site of the MTA Hudson Yards is located at the Hudson River terminus of the 34th Street...

1/ The Grid in City Planning

2/ The Roots of Modernism

3/ Power, Politics and Architecture

4/ Postmodernism & the Death of the Author

5/ The Empirical Model

6/ The Image of the City & The Urban Artifact

7/ Evolutionary Theory & Self-Organizing Networks

8/ Emergence & Sited Information

9/ Computation & Custom Software

10/ Programme & Complexity

11/ Prefabrication & Templating

12/ Standardization

13/ Ornament & Cryptography

14/ Towards an Adaptive Design Methodology

Notes My thesis begins with several opposing concepts: the ideal and the empirical; order and disorder; top-down and bottom-up; hierarchical and flat; static and dynamic. Extremes within a multidimensional coordinate system, these ideas represent the poles between which any form may be situated. Instead of categorizing, they direct. This approach has helped me arrive at a personal method, of which adaptivity forms the core.

The Grid in City Planning The grid in American city planning was initially a political instrument. When Philadelphia was founded in 1682, William Penn, who was working on a document prefiguring the Constitution of the United States, gave instructions to lay out the town so that its streets would be uniformly arranged in a checkerboard pattern. Thus, the rectilinear grid, which was later adopted as the primary form of city planning in North America, is rooted in a democratic sense of equality and commercial pragmatism.

The Roots of Modernism The Modernist movement in the 20th century was preceded by the Italian Futurists, led by Marinetti, who, in his founding manifesto, claimed that “Art, in fact, can be nothing but violence, cruelty, and injustice.” The Futurists, who detested history and knowledge, glorified nationalism, war, and the industrial machine. Futurism was one of the first avant-garde movements, and many of its members went on to join the Fascist movement in Italy in the period before the Second World War. Though Modernism was less radical, it carried forward many of Futurism’s central tenets: its ahistoricism, its glorification of the industrial process, and its belief in profound change. Although the Futurists would have used military might to assimilate other nations, the conquest of Modernism took place on a much subtler level—through its proliferation, still ongoing, within the disciplines of architecture, art and design. And while Futurism is endless—an all-engulfing, accelerating whirlwind of change—Modernism tends towards a steady state, an end of history.

Power, Politics and Architecture When Le Corbusier wrote ‘The City of To-Morrow,’ he described two types of path: the ‘pack-donkey’s way’, which meanders, following the line of least resistance, and ‘man’s way’, which is straight and determined. His city plans for Algiers and for Chandigarh, India (the latter of which was realized), were two among many which utilized the rectilinear grid as their basic organizing principle. His approach to city planning originated from the differentiation between the physician—the planner who attempts to revitalize city neighborhoods by changing one variable at a time—and the surgeon, who revitalizes by demolishing the entire neighborhood and rebuilding from zero. Needless to say, Le Corbusier favored the latter, which is apparent in his famous plan for modernizing Paris. Without exception, and oblivious to scale, his architecture consisted of broad strokes of geometric simplicity—the work of a man who considered himself a universal artist, never prioritizing any one discipline, be it painting, graphic design or architecture. Despite its bold vision and beauty, Chandigarh failed as a city plan. With its large blocks and wide avenues, it neglected the human scale in favor of the automobile. Today, Chandigarh is ridden with crime and disrepair.

In 1934, Robert Moses was appointed parks commissioner of New York City. Driven by an idealism carried on from his days as a student at Yale, his ambition was to elevate New York City to the standards of a modern metropolis by building highways, recreation facilities and parks for the working classes, as well as adequate housing for the poor. Through a variety of political moves, he was able to attain a position of unrivaled political power, able to determine the outcome of every building project undertaken by the city. During his highly productive administration, Moses built bridges, highways, parks, housing projects, beaches and cultural centers such as Lincoln Center, adhering to modernist principles that prioritized the automobile. His housing projects lining the Lower East Side of Manhattan, as well as innumerable sections of Harlem, Brooklyn and the Bronx, bear a strong resemblance to the high-rises of Le Corbusier’s Radiant City. The results of Moses’ city planning were catastrophic for the city, fragmenting communities and heightening crime rates. It was in this era of modernist planning that New York City lost one of its greatest architectural achievements, the original Penn Station, demolished in 1963.

In the case of both Le Corbusier and Robert Moses, authoritarian means were employed to enforce one individual’s philosophy—a collision between idealism and reality, in which idealism gained the upper hand. Little consideration was given to outside variables pertaining to particularities of the site. Rather, the site was formed to the will of a single person, often with tragic results.

Postmodernism & the Death of the Author The beginnings of Postmodernism are found in literary theory. Barthes, Foucault and Derrida, the movement’s key figures, introduced the idea that no piece of authored writing was, in fact, original. Rather, it was situated along a vector of writing in its entirety. The authored text was based on systems, codes and traditions, “woven from threads of the ‘already written,’ the ‘already read’” (Graham Allen). In response to a growing disillusionment with the idealistic goals of modernism, and to a reality of unbridled commercialism transforming America’s urban landscapes into strip malls, billboards and generic building façades, Robert Venturi began developing a language of appropriation in the 1960s, using the American vernacular as a kit of parts to recombine and repurpose. While Postmodernism advocated the appropriation of a culture which grew independently out of societal and commercial predispositions, as opposed to the supposedly elitist ideals of order and pure geometry, Modernism, which originated in Futurism and its contempt for history and knowledge, attempted to start from a clean slate, turning instead to technology and the industrial machine. The key to understanding their disparity, therefore, is empiricism.

The Empirical Model Since economic theory became an academic discipline, there has been an ongoing dispute between followers of the empirical model, represented by Neo-Ricardian theory, and the equilibrium model, represented by Neo-Classical theory. The former, derived from the writings of David Ricardo, advocated the formulation of market theories based solely on the existence of prior statistical evidence. Neo-Classicism, on the other hand, postulated the existence of a universal equilibrium state, based upon which markets could be explained. It assumes perfect knowledge on the part of all market participants, which, in practice, makes it difficult to apply without adjusting for the far-from-perfect conditions of the real world. The empirical model, on the other hand, which has all but disappeared in contemporary economics, is resurfacing in other disciplines.

The Image of the City & The Urban Artifact In the 1960s, architect and urban planner Kevin Lynch conducted experiments in creating a collective image of the modern city. He asked survey participants in three American cities to draw a map of their city from memory, following a common graphic language. The results strongly indicated that the mental image of a city held by its inhabitants was vastly different from the one held by architects, urban planners, and those with the political authority to shape it. These findings coincided with the theories put forth by Jane Jacobs, a scholar of urban planning, whose model was based largely on empirical observation. Jacobs became a public advocate for diversity and mixed uses within city neighborhoods, states achievable only through an empirical process of development within the city. Opposing the practice of isolating and grouping functions initiated by the modernists, she became a key figure challenging the policies of Robert Moses, most famously in the disputes over Washington Square Park and the Lower Manhattan Expressway. In 1966, Aldo Rossi published ‘The Architecture of the City,’ in which he put forth the theory that cities are, in fact, historical constructs, growing out of their urban artifacts, the constants within a changing urban fabric. His theoretical discourse criticizes naive functionalism for its disregard of the empirical layers within a city context, failing to integrate new architecture into the city as a whole.

Evolutionary Theory & Self-Organizing Networks In 1997, Manuel De Landa described processes creating either meshworks or hierarchies—both historical constructs—as the key to understanding cultural, technological and biological evolution. Stable states, according to De Landa, are temporary manifestations of matter or information, based not on a process of optimization, but one of adaptation. He opposes the equilibrium theory held by followers of Darwin’s ‘survival of the fittest’ maxim, claiming that search spaces—the spaces in which the ‘probe-head’ of evolution operates—are characterized by selective adaptation to existing environmental circumstances, not a continued thrust towards maximum fitness. Underlying De Landa’s evolutionary theory, which he bases on Deleuzian philosophy, is an abstract machine of sorting and stratification, which, similar to a river bed, first collects matter or information, then sorts it hierarchically, in a vertically stratified meshwork. De Landa is interested in moments of historical bifurcation, those instances in which the limits of meshworks (relatively stable states) are reached and a shift from one stratum to another occurs, always with historically defining consequences.

The context for De Landa’s hypothesis is ‘Universal Darwinism’—the term coined by Richard Dawkins, author of ‘The Selfish Gene.’ Universal Darwinism refers to the application of Darwin’s evolutionary theory, originally formulated for organisms, to non-organic structures in disciplines such as anthropology, psychology and computer science, as long as there exists an alternative replicator. (To test whether Darwin’s theory could actually be generalized, Richard Lewontin suggested replacing the word ‘organism’ in Darwin’s ‘Origin of Species’ with ‘individual’, and to examine whether the book still made sense. It did.) Universal Darwinism led Dawkins to his theory of memetics, postulating the existence of memes—self-replicating electrical impulses in our brains, which are, in fact, the basis of our thoughts and entire cultures. Similar to genes, memes are subject to selection and adaptation.

Emergence & Sited Information In his book ‘A New Kind of Science’, Stephen Wolfram, creator of the popular software ‘Mathematica’, explains cellular automata, a phenomenon studied since at least the 1950s, in which simple rulesets govern the behavior of bits, represented by pixels and their alternate on or off states. Early experiments uncovered that certain rulesets could lead to endlessly repeating (fractal) and endlessly non-repeating (chaotic) patterns. Wolfram’s research claims, expansively, that cellular automata are in fact capable of simulating nature, and that everything is reducible to binary operations. As in chaos theory, Wolfram observes a tipping point, at which an otherwise stable system tips over into a chaotic one, an observation directly linked to De Landa’s historical bifurcations. The ‘Game of Life’, invented by John Conway decades earlier, is a two-dimensional cellular automaton which simulates a living organism. Its elements seem to reproduce, move around the screen and ‘die’ after a certain passage of time, all again based on a relatively simple, unmodified set of rules.
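Conway’s rules are compact enough to state in full. The following is a minimal sketch in Python, assuming a small bounded grid with dead cells beyond its edges; the function names and the ‘blinker’ example are illustrative, not drawn from Wolfram or Conway:

```python
# A minimal sketch of Conway's Game of Life on a small bounded grid.
# Cells beyond the edges are treated as permanently dead.

def step(grid):
    """Apply one generation of the Game of Life rules to a 2D grid of 0/1."""
    rows, cols = len(grid), len(grid[0])

    def live_neighbors(r, c):
        return sum(grid[r + dr][c + dc]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0)
                   and 0 <= r + dr < rows and 0 <= c + dc < cols)

    # A live cell survives with 2 or 3 neighbors; a dead cell is born with 3.
    return [[1 if (live_neighbors(r, c) == 3
                   or (grid[r][c] and live_neighbors(r, c) == 2)) else 0
             for c in range(cols)] for r in range(rows)]

# The 'blinker' oscillates between a horizontal and a vertical bar.
blinker = [[0, 0, 0],
           [1, 1, 1],
           [0, 0, 0]]
print(step(blinker))  # [[0, 1, 0], [0, 1, 0], [0, 1, 0]]
```

Applying step repeatedly flips the blinker back and forth indefinitely: lifelike, recurring behavior from an unmodified rule set.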

Cellular automata exhibit what author Steven Johnson calls ‘emergence.’ Johnson observes self-organization leading to emergent behavior in various other phenomena, including cells, ant colonies, cities and the Internet. Just as Kevin Kelly before him analyzed the structure of a beehive in search of its ‘spirit’, Johnson looks at the ant colony as an example of a decentralized system. Its recurring behavior patterns on a macro scale are undoubtedly more complex than any of its individual participants could comprehend.

In ‘Smart Mobs’, Howard Rheingold recognizes that emergent behavior patterns exist in groups empowered by mobile technology, in particular the Internet-capable cellular phone. Through technology enabling location-sensitive, decentralized networks, he argues, groups of cell-phone users harness their collective potential—for political purposes, among others—on an unprecedented scale. Rheingold, who foresaw the popularity of the Internet years before its advent, recognizes location-based information services as the next big, society-altering technology. This is mirrored, interestingly, by the research into augmented reality systems currently conducted by Steven Feiner, a professor of computer science at Columbia, who is investigating non-immersive interfaces for experiencing multiple layers of information in real (as opposed to virtual) environments. Annotated space, furthermore, is currently a popular idea, under development in several incarnations. The fundamental idea is that written annotations are linked to geographic locations and can be retrieved by means of a handheld computer or cellular phone upon arriving in their proximity. The aspect of sited information is what is important here: location-specific, as opposed to location-independent, as has traditionally been the case with the Internet.
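The principle of annotated space reduces to a few lines: notes are keyed to coordinates and surface only when the reader is physically close. The coordinates, annotation texts and function below are invented for illustration, not taken from any actual annotation system:

```python
# A toy sketch of 'annotated space': notes pinned to coordinates, retrieved
# when a reader comes within range. All coordinates and texts are invented.
import math

annotations = [
    ((40.7506, -73.9935), "Site of the original Penn Station, demolished 1963"),
    ((40.7132, -73.9712), "Williamsburg Bridge pedestrian walkway"),
]

def nearby(lat, lon, radius_m=500):
    """Return annotation texts within radius_m meters of (lat, lon),
    using a flat-earth approximation adequate at city scale."""
    found = []
    for (alat, alon), text in annotations:
        dy = (alat - lat) * 111_320                               # m per degree latitude
        dx = (alon - lon) * 111_320 * math.cos(math.radians(lat)) # shrink with latitude
        if math.hypot(dx, dy) <= radius_m:
            found.append(text)
    return found

print(nearby(40.7505, -73.9934))  # standing near Penn Station
```

The essential move is the inversion Rheingold describes: the query is not a keyword but a position, and the information is sited rather than location-independent.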

The Internet itself was born from security concerns. Initially called the ARPANET, it was developed by the US Department of Defense’s Advanced Research Projects Agency as a vessel for the exchange of intelligence. Its lack of a centralized data server protected the network from attacks and single points of failure. In all of these cases, standard views of the ways in which our world and society operate are challenged by contemporary thinking around decentralized systems.

Computation & Custom Software Conceptual artist Sol LeWitt was among the first to use programmatic algorithms in his work. In his words, “when an artist uses a conceptual form of art, it means that all the planning and decisions are made beforehand and the execution is a perfunctory affair. The idea becomes a machine that makes the art.”

Architect Greg Lynn uses computation to create animate buildings. Through computational means, he converts site-specific information flows into three-dimensional form. Lynn’s architecture is a freeze-frame of a particular moment in time, situated empirically and programmatically. Around the same time in the early 1990s, German architect Christian Moeller began experimenting with genetic algorithms for generating architectural form. His process entailed programming a computer to generate hundreds of possible shapes, from which the most promising forms are chosen. The chosen iterations are fed back into the program, and the process is repeated until the ideal forms emerge. Numerous other architects, designers and artists have since employed this process of optimization.
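The generate-select-feed-back loop described here is, in essence, a genetic algorithm. The sketch below shows the schema in Python; the numeric ‘form’, the target and all names are stand-ins of my own, not Moeller’s actual process:

```python
# A schematic genetic algorithm: generate candidate forms, keep the fittest,
# mutate the survivors, repeat. The 'form' is just a list of numbers and the
# fitness an arbitrary closeness-to-target measure -- a stand-in only.
import random

TARGET = [0.2, 0.8, 0.5, 0.9]  # an invented 'ideal form'

def fitness(form):
    # Negative squared distance to the target: higher is better.
    return -sum((a - b) ** 2 for a, b in zip(form, TARGET))

def mutate(form, rate=0.1):
    # Perturb each parameter slightly, clamped to [0, 1].
    return [min(1.0, max(0.0, g + random.uniform(-rate, rate))) for g in form]

random.seed(1)
population = [[random.random() for _ in TARGET] for _ in range(100)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    survivors = population[:20]                      # selection
    population = [mutate(random.choice(survivors))   # variation, fed back in
                  for _ in range(100)]
best = max(population, key=fitness)
print([round(g, 2) for g in best])  # approaches TARGET
```

The loop mirrors the passage exactly: hundreds of candidates, a choice of the most promising, and recursion until satisfactory forms emerge.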

Software can be either proprietary or individually customized. The latter, while optimal, is typically also more expensive and has traditionally been limited to those with the necessary resources. Others generally employ cheaper, but less optimal, proprietary solutions. The limitations of proprietary software have made it difficult for artists and designers to realize projects which fall outside the boundaries of what is offered. John Maeda of the Aesthetics and Computation Group at the MIT Media Lab made a first attempt at creating a programming language geared towards visual artists: ‘Design by Numbers’ uses a simplified syntax for programming visually oriented pieces. This initial effort was more recently followed by Processing, a language and environment built on a simplified form of Java, an object-oriented programming language.

Programme & Complexity As John Maeda writes: “Complexity results when you attempt to reduce an already simple situation of form. Each action or motion leads incrementally to a set of less simple possibilities.” Indeed, the use of computation as a tool enables, through reduction, a complexity unobtainable manually. This is a prerequisite for animate or mutable form. ‘Animate form’ is the term coined by Greg Lynn, in which structures are singular moments situated along a hypothetical timeline. This idea originates with the Futurists, interested in capturing the speed and change of modern times, and is apparent in the work of Umberto Boccioni, or in the famous painting ‘Nude Descending a Staircase’ by Marcel Duchamp. Barthes, in his formidable essay on plastics, relates the latter to the inherent idea of mutability: “[...] more than a substance, plastic is the very idea of its infinite transformation; as its everyday name indicates, it is ubiquity made visible.”

Though mutability has been explored exhaustively in conjunction with the notion of entropy by the land artists in the 1960s and 70s, I am interested primarily in programmatic form. Programmatic does not have to mean computational. As early as the 1960s, Karl Gerstner published ‘Designing Programmes’, in which he analyzed programmatic systems in the arts and design. Analogous to Sol LeWitt, I use the term ‘programme’ to signify two components—the code, or designation of rules, on the one hand, and the compiler, or executable component on the other. Cartography is a case in point: Maps, as Denis Wood, author of ‘The Power of Maps’ points out, are highly complex ‘supersigns’, a synthesis or system of many lesser signs of a specific function. Similar to code, maps also have a legible and an executable state. Cartography, therefore, can also be considered a type of programming.

The programmatic process, primarily conceptual, has the potential to lead to unpredictable results—perhaps the strongest dynamic inherent in this method. Unpredictability can be obtained through randomness, algorithms, and dynamic data. Randomness was explored in the early twentieth century by the Dadaists, in their wanderings through Paris and the French countryside. Although randomness can lead to unpredictable results, it does not lead to emergence: it stands in opposition to a probabilistic theory of evolution. Randomness means equiprobability, which by definition cannot drive a process of optimization. For optimization to occur, combinatorial constraints are required: certain elements must have a higher probability of combining than others, a relationship which reinforces itself over time.
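The distinction can be made concrete. Below, a Pólya-urn-like process (my own illustration, not drawn from the sources above), in which each selection raises an element’s future probability, is contrasted with pure equiprobable choice; only the former produces the self-reinforcing structure just described:

```python
# Equiprobable choice vs. self-reinforcing choice (a Pólya-urn-like process).
import random

random.seed(0)

# Reinforced: each pick increases that element's weight, so early choices
# are amplified and the outcome is path-dependent.
weights = {"a": 1, "b": 1, "c": 1}
for _ in range(1000):
    pick = random.choices(list(weights), list(weights.values()))[0]
    weights[pick] += 1
print(weights)

# Uniform: every pick is equiprobable; no structure accumulates, and the
# counts remain comparable in expectation.
uniform = {"a": 0, "b": 0, "c": 0}
for _ in range(1000):
    uniform[random.choice(list(uniform))] += 1
print(uniform)
```

The reinforced process is a minimal model of the combinatorial constraint the text calls for: a bias that feeds on its own history.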

The algorithm, mentioned earlier, involves a predefined process or particular set of rules. Algorithms can become emergent when their rules are formulated in a certain way, as in the case of cellular automata. Genetic algorithms, furthermore, can lead to evolutionary systems through recursion—generated results are fed back into the programme to produce new and increasingly optimized results. The Situationist International, an influential group in the 1960s, utilized the algorithm in its psychogeographic experiments, in which a repeating set of directions determined one’s walking course within the city. More recently, and based on Situationist theory, generative psychogeography makes use of genetic algorithms to optimize walking paths in urban environments. Finally, parameters can lead to unpredictable—and emergent—outcomes when translated into graphic or sculptural form. New York City zoning regulations are just one example, in which laws have directly influenced the shapes of architecture in the city for nearly a century, producing the complex landscape we are familiar with today.
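A Situationist-style walking algorithm can be sketched as a repeating instruction cycle traced over an idealized street grid. The specific cycle (‘straight, straight, left’) and all names below are invented for illustration:

```python
# A toy psychogeographic algorithm: a short instruction cycle, repeated
# indefinitely, determines a walking course on a rectilinear street grid.
from itertools import cycle

HEADINGS = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # N, E, S, W as (dx, dy)

def walk(instructions, steps=8):
    """Follow a repeating cycle of 'S' (straight), 'L' (left), 'R' (right)
    moves for the given number of steps; return the visited grid points."""
    x, y, h = 0, 0, 0          # start at the origin, heading north
    path = [(x, y)]
    for move in cycle(instructions):
        if move == "L":
            h = (h - 1) % 4    # rotate heading counterclockwise
        elif move == "R":
            h = (h + 1) % 4    # rotate heading clockwise
        dx, dy = HEADINGS[h]
        x, y = x + dx, y + dy
        path.append((x, y))
        if len(path) > steps:
            break
    return path

print(walk(["S", "S", "L"]))  # a drifting, inward-spiraling course
```

Even this trivial rule produces a course no flâneur would plan deliberately, which is precisely the point of the dérive.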

Prefabrication & Templating As designers, we are accustomed to using templates, in the sense that the various media we work in have clearly defined parameters and boundaries that we either accept or attempt to overcome. Standardization is central to our profession—when we are not using predetermined standards, we are establishing our own.

As a case in point, programming is initially non-proprietary. It is inherently conceptual, inherently abstract. Any program, however, requires an appropriate expression, and through its visualization, it too becomes subject to standardization. In a sense, while programming may appear to free us from the constraints of the proprietary, by constructing programs, or systems, we are, in fact, creating new templates.

As literary theory has pointed out, true originality does not exist—whether unconsciously or consciously, as creators we are always continuing ideas put forth by others. The more we absorb, therefore, the less we may ‘invent’—the drive towards innovation and differentiation, while perhaps ultimately futile, is paradoxically unavoidable.

Ultimately, the template is of vital significance to the notion of adaptability. Any designed object is in a sense a template, or a container, when it constitutes a system which is specific not to any particular content, but only to a particular type of content. Here, design acts as a flexible framework with the ability to accommodate a variety of inputs.

Standardization Before the onset of industrialization, ornament was a highly valued practice saturated with accumulated cultural knowledge, knowledge which was evocative, rather than intellectual. Over the centuries since Gutenberg, through increasing automation and the decline of manual labor, ornamentation gradually began to disappear, as did the value associated with craftsmanship. In bookmaking, ornament reached its peak in the illuminated manuscripts of the Middle Ages; in architecture, it may have peaked with the Rococo. In both bookmaking and architecture, automation, in the form of the printing press with movable type and of steel fabrication in the building industry, led to a decline of ornament which has continued to this day.

It appears that what happened was essentially a paradigm shift from one form of standardization—the inherited knowledge and skill of the craftsman—to another—the standards imposed in the name of increased efficiency through automation. The fundamental difference, however, is that the former was locally situated, while the latter tended towards ever broader standards, standards which have become global in our age. As we have seen, standards on a local scale are better suited to representing the variety of cultures existing in the world today: local customs and rituals, beliefs and languages. Standards on a global scale are more efficient and practicable, to the point of necessity in the interest of business—herein lies the inherent conflict of globalization.

At the onset of the 20th century and the proliferation of mass production, efforts were underway to impose international standards in every sector of industry. Coordinated production was essential to the high-volume, nationwide exchange of goods. Furthermore, standardization was encouraged not only in production, but also in conception. As Le Corbusier attempts to demonstrate in his ‘Modulor’, compositional standards lead to innovation within set limitations. As an example, he cites musical notation, which limited composition to a fixed number of tones. This tempered scale was supposedly instrumental in bringing forth the great composers, by focusing their creative output within a fixed, harmonious set of combinational boundaries. In another sense, standards evolve over time, as an accumulation of evidence gained through trial and error, in combination with the selective activity of market forces. Through study, we continually attempt to understand, re-contextualize and improve upon past works. These standards include certain types—the practice of a certain craft, building types, or the textbook—types which are constantly under scrutiny and gradually evolve alongside changing cultural values. As designers searching for tension and conflict in the work we produce, we are aware that standardization is a force to either embrace or counter.

Ornament & Cryptography The three fundamental requirements of ornament are (1) that it is an evolutionary product, (2) that, consequently, it embodies a form of inherited, cultural knowledge, and (3) that it requires an object of ornamentation. The second requirement is essentially a byproduct of the first; in other words, the two are intertwined to a degree that makes them inseparable. The central concept is that ornament is a container for information. This offers a new way of understanding its implications. The information contained within an ornament is recoverable only to those with a fundamental understanding of the context within which it originated. By studying the history of a particular type of ornament over the period of its use, we can begin to understand its meaning. Its meaning evolves—through a process of selection and sedimentation—to a degree that it becomes either iconic, in other words representational, or evocative and abstract.

Recently, information visualization has emerged as a practice situated in the liminal area between the disciplines of computer science and the graphic arts. As a visual representation of data, it too can be either abstract or representational in the way it conveys information. The fundamental goal is to offer insight into data through visual means, making it possible to recognize trends, tendencies, and other characteristics. Information visualization is primarily conceptual, and requires interpretation through application to an object, whether based in static or dynamic media.

Cryptography is closely related to information visualization. Information is rendered indecipherable through encoding, retrievable only through certain knowledge. Codification can occur visually, as in the case of barcodes: here, information is rendered illegible by conversion to a graphic system, decipherable only with decoding software. In a sense, information visualization functions along the same lines as cryptography, embedding data within a graphic form, though with the intent to communicate rather than to obscure. Here, I make the connection to ornament. If these methods are applied to an object, the three requirements listed above may be fulfilled: when the data is an evolutionary product, and thereby embodies a form of cultural knowledge, then through its visual application to an object, it becomes ornamental.
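The barcode analogy can be made literal in a few lines: text encoded as a row of marks, legible only to a matching decoder. This is not a real barcode symbology, merely bits drawn as bars for illustration:

```python
# A minimal sketch of visual codification: ASCII text rendered as a
# barcode-like row of marks, recoverable only with the matching decoder.

def encode(text):
    """Render each character's 8 bits as a bar (1) or a gap (0)."""
    bits = "".join(format(b, "08b") for b in text.encode("ascii"))
    return "".join("█" if bit == "1" else " " for bit in bits)

def decode(bars):
    """Reverse the encoding: read bars back into bits, then characters."""
    bits = "".join("1" if ch == "█" else "0" for ch in bars)
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

marks = encode("ornament")
print(decode(marks))  # "ornament"
```

To a viewer without the decoder the marks read as abstract pattern; to one with it, as text. This is exactly the double state, evocative and informational, claimed for ornament above.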

My tenet of ornament is therefore its renewal through computational means, a method of bringing it back into the cultural vocabulary as a visual container for information. Ornament, understood as a carrier of information, is subject to the evolutionary processes of selection, replication and mutation, the contained data changing along with the visual form.

Towards an Adaptive Design Methodology I am proposing a design methodology interested primarily in a bottom-up, as opposed to top-down, approach. This implies adjusting parameters while observing the results after every modification, in a recursive process of trial and error, until the aesthetic and semantic factors of the design object stand in an appropriately balanced relationship. There are two possible scenarios for a bottom-up methodology. The first—a directed formation of dynamic flows—is akin to wind-tunnel optimization, achieving a predetermined outcome at the end of the process. Although optimization is one possible scenario (analogous to the Darwinian model of evolution), the other possibility is one in which there exists no desired outcome. Instead, an outcome is attained through the process itself—it is a historical construct which stands merely at the end of a process, rather than at its peak (this model is described by Manuel De Landa, as well as by the cultural selectionists). This second process—an undirected formation of dynamic flows—is decentralized, rather than hierarchical, and carries the potential for creating surprising or even emergent outcomes.

A bottom-up methodology assumes preexisting datasets or behavior patterns within the environment, and aims to transform them directly into objects of design. The data can be either site-independent (isolated), or it can originate with the various informational subsets of a particular spatial environment (situated). Whether isolated or situated, an object designed based on preexisting data is, at least in theory, mutable, changing as the forces affecting its parameters change. It is in a state of constant becoming, in itself a reflection of process.

Similar to the collage city proposed by Colin Rowe and Fred Koetter, a theory of adaptive design presupposes the combination of both a bottom-up and a top-down methodology. Adaptability finds its practical application in an open-ended system, a framework which permits change to occur. It is, therefore, a predisposition to the traditional design process, primarily a choice of method and tools. Within this system, yet not contained by it, expression and authorship can find their places. And while the notion of design authorship changes with an adaptive process, it may remain intrinsically present, though perhaps not as clearly stated. Here, authorship may become more of a collaborative effort, an invitation to modify or add to. Adaptability may also inform the way in which we define our practice as design professionals. Instead of following the traditional single-signature model, practices can become collaborative and interdisciplinary. In another sense, adaptability empowers anyone to be a designer. As a passage from Mieke Gerritzen’s ‘Everyone Is A Designer’ prescribes: “Digital systems demand an aesthetic of unfinish. Design moves from the realm of visual problem solving to the fluid fields of experience creation, often unbounded in both time and form.” Adaptivity can become a position as well as a methodology, informing work which is deliberately and self-consciously sited in a world in which change is the only certainty.

© Copyright 2007 Christian Marc Schmidt. All rights reserved