Exploring DeLanda's Perspective on Physics and Science's Dynamics

In examining the realm of physics, it is essential to recognize that contradiction and disunity are intrinsic to all fields, science included. Throughout its history, however, science has been treated as a unified discipline possessing an essence of its own, a picture that sits uneasily with a flat, experimental ontology. The essentialism and typological thinking embedded in this picture hinder an accurate assessment of scientific practice as experimental, disunified, and varied.

A productive entry point into this discussion is the concept of fundamental laws. Within classical mechanics, Newton's laws are often regarded as foundational truths from which everything else is deduced. This reliance on essences turns singularities into fixed categories while disregarding the productive processes and causal relationships from which they emerge. Traditional notions of causation have been shaped by linearity: effects follow mechanically from their causes according to exceptionless fundamental laws. This linguistic representation perpetuates an essentialist reading of classical physics. Linear causality oversimplifies how systems respond to external causes; non-linear and statistical forms of causation, by contrast, leave phenomena underdetermined by those causes, revealing the autonomy inherent in self-organization. Fixed and uniform representations of matter, inherited from Aristotle's philosophy of essence, exemplify this linearity. Deleuze, in Difference and Repetition and (with Guattari) A Thousand Plateaus, advocates a materialist energeticism that embraces the movement of matter, adding "variable intensive affects" to formal essence and thereby subverting essentialism rather than merely opposing it.

DeLanda asserts:

“Additivity and externality presuppose, as I said, a matter obedient to laws and constituting an inert receptacle for forms imposed from the outside. Matter under non-linear and non-equilibrium conditions is, on the other hand, intensive and problematic, capable of spontaneously giving rise to form drawing on its inherent tendencies (defined by singularities) as well as its complex capacities to affect and be affected” (ISVP 170).

The prevailing approach often overlooks dynamic and productive causes in favor of constant regularities. The constant-conjunction theory of causation, which holds that A causes B only in the sense that A is regularly followed by B, was proposed by David Hume and echoed in post-Newtonian physics. As a result, scientists began to prioritize the search for regularities in nature over the investigation of causal relationships and experimental problems. The accompanying deductive method aimed to derive particular truths from general regularities, as Ronald N. Giere illustrates in Explaining Science: A Cognitive Approach with the simplification of a two-dimensional pendulum into a one-dimensional model through deduction from Newton's principles.
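
A minimal sketch of this kind of idealization (the standard small-angle treatment of the pendulum, offered here as an illustration rather than as Giere's own derivation): the planar pendulum obeys a non-linear equation, and the tractable textbook model is carved out by assuming the oscillations are small,

\[
\ddot{\theta} + \frac{g}{\ell}\,\sin\theta = 0
\quad\xrightarrow{\;\sin\theta \,\approx\, \theta\;}\quad
\ddot{\theta} + \frac{g}{\ell}\,\theta = 0,
\qquad T = 2\pi\sqrt{\ell/g}.
\]

The exceptionless generality of the linear law on the right is bought precisely by idealizing away everything the approximation excludes.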

The reality is that the intricacies of physics practices are far too complex to be effectively approached through a deductive-nomological framework, which seeks explanations through propositions. This method begins with a law expressed in language, which is then tested for its truth or falsity in a laboratory setting, ultimately categorizing phenomena under a general principle. This axiomatic structure leads to deductions that convey truth or falsehood, where axioms themselves are treated as essences, suggesting that the truth must be inherent within them. In Difference and Repetition, Gilles Deleuze utilizes the Kantian notion of 'shortest distance' as a schema that dictates spatial understanding in line with concepts, paralleling the "minimum principles" discussed by Manuel DeLanda and Morris Kline, which dictate how phenomena are related and ordered.
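
In schematic form (Hempel's textbook deductive-nomological pattern, supplied here for illustration), an explanation is a deduction of the phenomenon from a universal law together with antecedent conditions:

\[
\underbrace{\forall x\,\big(F(x) \rightarrow G(x)\big)}_{\text{universal law}},
\qquad
\underbrace{F(a)}_{\text{antecedent conditions}}
\;\;\vdash\;\;
\underbrace{G(a)}_{\text{phenomenon explained}}
\]

Everything in this schema is propositional: the law is a sentence, the explanation a derivation, and truth or falsity the only currency.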

Scientific inquiry should be populated with models that imply no completion, unity, or finality, but instead emerge from historical experimentation, accumulation, and interaction, shaped by the contingencies of those processes. This requires that fields remain open rather than closed, forming a non-axiomatic population of models. Fundamental laws achieve generality at the cost of accuracy, since their generality rests on the assumption that all other conditions remain constant. The population approach reverses the trade: it gains descriptive accuracy about phenomena and their complex causal interactions in classical mechanics while sacrificing generality. Physicists have often used statistical models merely to organize data rather than engaging directly with raw data through experimentation, where axioms confront reality. That confrontation is an active causal intervention, in contrast to the passive observation that merely organizes the population of models. Finally, the study of science should avoid teleological, goal-directed methodologies, along with explanations that rest solely on final and efficient causes.

Manuel DeLanda's approach in Intensive Science and Virtual Philosophy advocates for recognizing the heterogeneous and variable population of scientific models and the productive relationships between phenomena. The prioritization of linguistic truth must be reevaluated, bridging the gap between mathematical models and linguistic laws. This approach emphasizes well-defined problems rather than an axiomatic framework.

“For problems-Ideas are by nature unconscious: they are extra-propositional and sub-representative, and do not represent the affirmations to which they give rise” (DR 267).

This distribution unfolds within a non-essential multiplicity. Newton's laws should not be perceived as revealing an objective and universal truth; rather, they should be understood as well-posed problems.

“These questions are those of the accident, the event, the multiplicity” (DR 188).

However, the problem should not be subordinated to its solution or to its mere solvability. A genuinely problematic question remains linguistic while elucidating the underlying reasons for phenomena rather than merely describing regularities, a distinction that permits more accurate assessments of which factors are relevant to a phenomenon and which are not. Two individuals may pose the same question about a phenomenon, yet their inquiries can carry entirely different meanings depending on context. This variance is captured in what Alan Garfinkel terms "contrast spaces," analogous to "state spaces" in physics. The contrast/state space is a space of possibilities, whether the geometric possibilities of a physical system or the alternatives at stake in an everyday question, and in this sense it is "metalinguistic."

“In a typical nonlinear state space, subdivided by multiple attractors and their basins of attraction, the structure of the space of possibilities depends not on some extrinsically defined relation but on the distribution of singularities itself. The trajectories in state space, defining possible sequences of states, are spontaneously broken into equivalence classes by the basins of attraction: if the starting point or initial condition of two different trajectories falls within a given basin, both trajectories are bound to end up in the same state, and are equivalent in that respect” (ISVP 160).
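
A minimal numerical sketch of the passage above (my own illustration, not DeLanda's): the one-dimensional system dx/dt = x − x³ has singularities at x = −1, 0, and +1; the outer two are attractors, and the repelling point at x = 0 separates their basins. Trajectories starting in the same basin end in the same state.

```python
# Sketch: trajectories starting in the same basin of attraction
# converge to the same singularity (attractor).
# System: dx/dt = x - x**3, with attractors at x = -1 and x = +1
# and a repelling singularity at x = 0 dividing the two basins.

def simulate(x0, dt=0.01, steps=2000):
    """Integrate dx/dt = x - x**3 with forward Euler, starting from x0."""
    x = x0
    for _ in range(steps):
        x += dt * (x - x**3)
    return x

# Two different initial conditions inside the same basin (x > 0) ...
a, b = simulate(0.05), simulate(1.9)
# ... and one from the other basin (x < 0).
c = simulate(-0.3)

print(f"x0 = 0.05  -> {a:+.4f}")   # ~ +1.0
print(f"x0 = 1.90  -> {b:+.4f}")   # ~ +1.0  (same equivalence class)
print(f"x0 = -0.30 -> {c:+.4f}")   # ~ -1.0  (different basin)
```

The equivalence classes are produced by the distribution of singularities itself, not by any relation imposed from outside the dynamics.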

The central thesis is that an approach aimed at correctly distributing the relevant singularities reworks the concept of objectivity, locating it in the distinction between true and false problems. A poorly posed question betrays itself in its contrast space, which is either too vague or too rigid. An overdetermined question packs in excessive alternatives and thereby imports irrelevant factors. Establishing relevance and irrelevance within contrast spaces is therefore critical: contrast spaces and their underlying assumptions are the key to differentiation and to objective validity in the formulation of questions. DeLanda illustrates this with the example of a convection cell and its cyclic behavior, where irrelevant "micro-causal" descriptions (e.g., collisions of individual molecules) may be raised; the appropriate explanation hinges instead on macro-causal concepts such as temperature and density gradients along with gravitational interactions.

“Causal problems should be framed at the correct level given that each emergent level has its own causal capacities, these capacities being what differentiates these individuals from each other” (ISVP 162).

In classical physics, laws are articulated through differential equations, presupposing physical quantities that shape the dimensions of state spaces, while the distribution of singularities within the basins of attraction constitutes the contrast space.
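
As a concrete illustration (a standard example, not drawn from DeLanda's text), consider the damped pendulum's law,

\[
\dot{\theta} = \omega,
\qquad
\dot{\omega} = -\frac{g}{\ell}\,\sin\theta - \gamma\,\omega .
\]

The two physical quantities \(\theta\) and \(\omega\) fix a two-dimensional state space; its singularities sit at \((0, 0)\), an attractor collecting every trajectory that settles to rest, and at \((\pi, 0)\), the unstable inverted position. It is this distribution of singularities, not the equation's linguistic form, that structures the space of possibilities.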

This discussion is deeply intertwined with Deleuze's ontology and epistemology. Within a population of models, some establish causal relations with actual events, while others forge quasi-causal links with virtual singularities. In the interplay between explanatory problems and individual solutions (the former virtual, the latter actual), questions and problems can be explicated to yield a multiplicity of solutions: they explicate the virtual in the actual. The relationship between the two must be one of isomorphism, shifting the picture of physics from a field that produces true linguistic propositions mirroring reality to one in which these two dimensions are interconnected yet distinct. Can individuality be maintained while ensuring such isomorphism?

“The philosopher must become isomorphic with the quasi-causal operator, extracting problems from law-expressing propositions and meshing the problems together to endow them with that minimum of autonomy which ensures their irreducibility to their solutions” (ISVP 165).

Another fundamental aspect of DeLanda's model is the transition from studying properties to exploring capacities. Individuality must be understood through a process of individuation that dismisses static notions, allowing us to recognize individuals in actuality by observing their effects in heterogeneous assemblages alongside other entities. To innovate in a more experimental and accurate manner, we must isolate the desirable properties (partial objects) for discovery. Scientific knowledge should emerge from a dynamic multiplicity of relationships: science progressively differentiates relevant from irrelevant factors, keeping its accuracy in flux and its perspective open-ended.

Notes:

[1] It is also worth noting that this approach has connections to René Descartes, who anticipated that scientific laws would derive from a unified truth of the universe.

[2] Verificationist theories of meaning not only seek the truth of statements but also entirely disregard statements that cannot be tested by their criteria, a notable limitation.

Works Cited:

DR: Gilles Deleuze, Difference and Repetition, Columbia University Press, 1995.

ISVP: Manuel DeLanda, Intensive Science and Virtual Philosophy, Bloomsbury Academic, revised edition, 2013.

ATP: Gilles Deleuze and Félix Guattari, A Thousand Plateaus: Capitalism and Schizophrenia, Volume 2, University of Minnesota Press, 1987.