Scientific Models

P=NP?

Question:

Does P=NP?

Response:

http://en.wikipedia.org/wiki/P_%3D_NP_problem
“Is P equal to NP?
In a 2002 poll of 100 researchers, 61 believed the answer is no, 9 believed the answer is yes, 22 were unsure, and 8 believed the question may be independent of the currently accepted axioms, and so impossible to prove or disprove.[5]”

However, if the question is examined from a broader perspective, the nature of the problem becomes clearer: computational complexity theory formally addresses the inherent difficulty of algorithmic problems.

http://fr.wikipedia.org/wiki/Th%C3%A9orie_de_la_complexit%C3%A9_des_algorithmes
“[translation] Clearly, P ⊆ NP, because a deterministic algorithm is a particular kind of non-deterministic algorithm; in simple terms, this means that if a solution can be computed in polynomial time, then it can be verified in polynomial time. The converse, NP ⊆ P, which is the real difficulty of the P = NP equality, is a central open question in theoretical computer science. In 1970, this question was raised independently by Stephen Cook and Levin[5]. Most experts assume that NP-complete problems are not solvable in polynomial time, and on this basis several approaches have been attempted.”
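
The distinction the quoted passage draws – computing a solution versus merely verifying one – can be made concrete with a small example. The Python sketch below is my own illustration (none of its names come from the sources above), using Boolean satisfiability, the canonical NP-complete problem: checking a proposed truth assignment takes time proportional to the size of the formula, while the only obvious way to find one is to try all 2^n assignments.

from itertools import product

# A formula in conjunctive normal form: a positive integer i means variable i,
# and -i means its negation.
# Here: (x1 or x2) and (not x1 or x3) and (not x2 or not x3).
formula = [[1, 2], [-1, 3], [-2, -3]]

def verify(assignment, formula):
    """Check a proposed solution in polynomial time (linear in the formula size)."""
    return all(any((lit > 0) == assignment[abs(lit)] for lit in clause)
               for clause in formula)

def solve_brute_force(formula, n_vars):
    """Find a solution by trying all 2^n truth assignments: exponential time."""
    for bits in product([False, True], repeat=n_vars):
        assignment = {i + 1: bits[i] for i in range(n_vars)}
        if verify(assignment, formula):
            return assignment
    return None

print(solve_brute_force(formula, 3))  # {1: False, 2: True, 3: False}

Whether some cleverness can always replace that exhaustive search with a polynomial-time procedure is exactly the open question.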

If the computers of the future offer greater possibilities for computation and algebraic representation than today’s technologies do, we may be able to resolve this problem conclusively.

On the basis of the approaches I have seen and that have been brought to my attention, the answer is no.

I think that a future answer to this question will depend on contributions from both theoretical and technological advances, to the problem itself as well as to its demonstration and solution.

The considerable value of this effort must also be weighed against the efforts required in other areas that will be essential to the society of the future.

- Paul Labbé


The Law-of-Nature Model

Question:

This is a question about the law-of-nature model used in science. Eugene Wigner argued, in an article written in the sixties and available online, that only relatively simple (not easy) things are the subject of natural laws, because laws can be articulated only for those things whose relevant conditions are itemizable and thus controllable. Complexity thus presents a challenge to a laws-of-nature model. (I’m not thinking of stochastics, obviously.) A law of nature is a statement of a pattern of nature that leaves out any reference to the initial conditions (ICs) of a place, because it can. But it can because there aren’t any relevant ICs. There are plenty of phenomena, therefore, for which a law-of-nature model is not appropriate. It seems that scientists (physicists and non-systems biologists) aren’t giving much thought to these deep paradigmatic implications of complexity. My question is: is the reason for this lacuna that they’re locked into a reductionist paradigm, or is it the spin that they have to do for simple non-scientists who pour out research funds in R&D projects? A related question is about “fundamental particles.” Is it really possible to explain human society as the effect of little bits of matter bumping up against each other collectively, or does the collectiveness add many other dimensions of causality?

Answer:

This is a great question, and of course that means it’s not an easy one to answer. But here is my view in two parts.

First, I think we must make a distinction between a ‘law of nature’ and predictability. For example, consider the weather, which is a fully deterministic system – by which I mean that all the variables involved (e.g. temperature, humidity, density, condensates and particles, land/water masses, etc.) interact according to well-known and well-understood ‘laws of nature’ (e.g. thermodynamics, classical physics, etc.). Yet despite the weather being fully deterministic, we cannot predict it beyond a short horizon, because of what is known (and perhaps can itself be considered a law of nature) as sensitivity to initial conditions. This is popularly known as the ‘butterfly effect’: the flapping of a butterfly’s wings in China could ultimately result in a hurricane in Florida. Thus a fully deterministic series of causes and effects (like the weather, an ecosystem, or a market) based on ‘laws of nature’ can remain completely unpredictable – and therefore appear to be an ‘unlawful’ system. It is important to note that improvements in weather forecasting are based not so much on ‘prediction’ as on better methods of observing the current state, through such things as satellite imagery.
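
Since ‘deterministic yet unpredictable’ can sound paradoxical, here is a minimal Python sketch of my own (a toy illustration, not a weather model) showing the same effect in the logistic map, a textbook chaotic system: two starting states that agree to ten decimal places follow the identical deterministic rule, yet part company completely within a few dozen steps.

# Sensitivity to initial conditions in the logistic map (illustrative only).
def logistic(x, r=4.0):
    """One step of the fully deterministic rule x -> r * x * (1 - x)."""
    return r * x * (1 - x)

a, b = 0.2, 0.2 + 1e-10  # two initial states differing by one part in 10^10
for step in range(1, 61):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step:2d}: a={a:.6f}  b={b:.6f}  |a-b|={abs(a - b):.2e}")
# By roughly step 40 the two trajectories bear no resemblance to each other,
# even though every step followed exactly the same deterministic law.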

This means that we essentially cannot know in advance how small a difference will make a difference – or, conversely, how big a difference will make no difference at all.

Another consideration is the fact that observation itself can change or influence the phenomena under study – the most famous example being that the way in which we observe light determines whether we observe it as a particle or as a wave. In a significant way, it is the observer who organizes reality – who determines the boundary/outline of a system. Once we accept and understand the ultimate interdependence of all with all, reductionism becomes limited in its utility – though it remains useful for many things.

However, this addresses only part of your question. The other part concerns the paradigm of reductionist approaches to understanding nature. Here again we must distinguish between what science has found as ‘laws of nature’ and the models/theories through which we come to understand the application and consequences of these laws. Reductionism, to put it simply, means that if I understand the parts of a system, I will eventually understand the whole of the system. Essentially, this is a view of nature as a type of machine.
What chaos and complexity (recently articulated by some scientist-philosophers as ‘chaoplexity’) tell us is that the whole is greater than the sum of its parts – that the relationships between parts must be included in our understanding. This ‘greater than the sum of its parts’ phenomenon – what you refer to as ‘collectiveness’ – is called emergence.

Emergent phenomena remain dependent on their constituent parts and yet are ontologically real; that is, they cannot be understood or analyzed through the parts alone. For example, I can know all there is to know about hydrogen and oxygen, but I cannot predict from that knowledge what water (H2O) is. Even more, one water molecule, or even 1,000 water molecules (a drop of water contains on the order of 10²² molecules), gives me no way to anticipate and understand wetness, viscosity, turbulence, or even surface tension. Wetness, although completely dependent on hydrogen and oxygen combined in vast numbers, emerges (it can’t be found in the parts) at a certain scale of interacting parts. Thus to study wetness we must study at the appropriate level. As another example, one can’t understand the rules of grammar through the study of synapses: while each human has language-processing functions in the brain, particular languages are determined through social interaction. Thus, although each act of languaging necessarily depends on the synaptic events occurring in the brain, one could not determine which language is being processed by studying the synaptic events alone.
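
Emergence can feel abstract, so here is a minimal Python sketch of my own (not from the original text), using Conway’s Game of Life: the rules below mention only single cells and their immediate neighbours, yet a ‘glider’ – a coherent five-cell shape that travels across the grid – exists only at the level of the whole pattern, a toy analogue of wetness emerging from molecules.

from collections import Counter

def step(live):
    """Apply the local birth/survival rules to a set of live (x, y) cells."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A dead cell with exactly 3 live neighbours is born; a live cell
    # with 2 or 3 live neighbours survives; everything else dies.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
cells = glider
for _ in range(4):
    cells = step(cells)
# After 4 steps the same five-cell shape reappears, shifted diagonally by
# (1, 1): the motion belongs to the pattern, not to any individual cell.
print(cells == {(x + 1, y + 1) for (x, y) in glider})  # True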

To conclude: laws of nature continue to be discovered, explored and exploited by science, but that does not necessarily translate into a direct capacity to predict or to control. However, humans are excellent at figuring out how to exploit what is uncontrollable to certain ends.

And finally, complexity (chaoplexity) represents the path of life: ever-increasing levels of emergence (wholes that become greater than the sum of their parts, and in turn become parts of larger wholes). Each new ‘emergent’ represents new capabilities and new domains of action, and so must be studied as a new field with its own level of ‘natural laws’ (just as the laws of chemical interaction in synaptic events will not lead you to the rules of grammar in a particular language, nor will the laws governing the molecular configuration of H2O lead you to the laws governing turbulence).

I know I’ve answered your question with examples from traditional scientific disciplines, often referred to as the ‘hard sciences’. As a scientist, I consider the social sciences the truly ‘complex’ sciences – the domains where complexity and emergence must form the fundamental research paradigm, since the human domain is greater than the sum of its psychological, social, cultural, economic, political and environmental parts.

Your question, and I hope my answer, underscores a fundamental strength of the scientific project: its ability to undertake radical transformation of its own theories, frameworks and paradigms while continuing to provide a unifying body of knowledge and method.

- John Verdon
