There is the smell of lungwort flowers
Among the forget-me-nots
In the fact that I,
My distracted rigorous mind,
Am the root of the Non-unit,
Hiding the split point
Towards what was
And what will be. The pillar.

V. Khlebnikov
INTRODUCTION

In this article we try to analyze some logical and philosophical aspects of quantum theory and of Gödel's incompleteness theorem. It is shown that logic must develop concepts of relativity (up to the introduction of logical "reference systems" similar to physical reference systems), and that nonpredicative notions must become the foundation of logical constructions. Model theory must individualize elements of sets and their features more flexibly. The relation between modern mathematics and the philosophical concepts of Gödel and Berkeley is shown. There are some shortcomings in the article. For example, we did not discuss some generally known aspects of Gödel's theorem and of quantum phenomena. Philosophical themes are presented very briefly. Category theory is only briefly touched upon as well. It is desirable (but not necessary) that, despite our century of narrow specialization, the reader be acquainted with all these subjects. We describe in detail only those points on which we managed (or it seemed to us that we managed) to say something new on the above-mentioned questions.

1. RELATION BETWEEN THE SUBJECT AND PREDICATE IN QUANTUM THEORY

The superposition principle and the general form of the time dependence of the wave function Ψ are important for our article. If a physical system can be in the states Ψ1 and Ψ2, it can also be in the state αΨ1 + βΨ2, where α and β are complex constants. The dependence of Ψ(t) on time is described by the generalized Schrödinger equation

    iħ ∂Ψ/∂t = ĤΨ,

which is valid in any of the quantum theories; they differ only in the specific construction of the Hamiltonian operator Ĥ. Many deep investigations have been devoted to the semantic analysis of the Ψ-function. We should like to
pay attention to one detail which could be missed in these investigations. The fact is that the Schrödinger equation is a field equation, that is, a differential equation for some potential (in this case, a complex-valued vector potential in Hilbert space). In all prequantum physical theories such equations described what philosophers call matter, substance, a medium. Physicists call it a field: some essence which transfers energy and exerts a force effect on matter. The strangeness of the quantum theory consists in the fact that in it the field equation describes only the informational Ψ-function. The sesquilinear form (Ψ*ĜΨ), where Ĝ is a Hermitian operator, gives the probability density for the observed values of some physical quantity (for example, for the particle coordinate). Such a nonlinear dependence of the probabilities on linearly dependent states (the superposition principle) has some deep logical consequences. For example, if there are several alternative states of the object which cannot be distinguished by our device (that is, simply by a macro-situation), then the states lose their alternativeness. The object behaves as if it could be in several of these states at the same time. In other words, the electron is smeared like a medium over the whole space until the device measures its coordinate, which will nevertheless be "point-like" at the moment of measurement and will differ from the other possible alternatives of the coordinate. The indistinguishability of unobservable alternatives follows from the superposition principle for states and the quadratic expressions for the probabilities. Properly speaking, this is demonstrated by the well-known two-slit thought experiment on the interference of probability waves. In front of the electron source there is an obstacle with two slits. If the macro-situation is made such that the device fixes both the slit through which the electron has passed and the place where the electron hits the receiving screen, then the probability distribution is described by the sum of the squared amplitudes. Otherwise, when the device does not fix the electron's path, the probability is the square of the sum of the amplitudes, that is, the electron distribution gives an interference picture.
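This difference can be seen numerically. Below is a minimal sketch (our own toy illustration: the wavenumber k, the slit positions and the screen distance L are invented parameters, not taken from the text), contrasting the square of the summed amplitudes with the sum of the squared amplitudes:

```python
import numpy as np

# Complex amplitudes for "electron via slit 1/2" arriving at screen point x.
k = 20.0                                  # toy wavenumber
slit1, slit2, L = -0.5, 0.5, 5.0          # toy geometry
x = np.linspace(-3, 3, 601)

d1 = np.sqrt(L**2 + (x - slit1)**2)       # path lengths to the screen
d2 = np.sqrt(L**2 + (x - slit2)**2)
a1 = np.exp(1j * k * d1) / d1
a2 = np.exp(1j * k * d2) / d2

p_path_unknown = np.abs(a1 + a2) ** 2             # square of the sum: fringes
p_path_fixed = np.abs(a1) ** 2 + np.abs(a2) ** 2  # sum of the squares: smooth

# p_path_unknown oscillates around p_path_fixed: the interference picture.
print(p_path_unknown.max() / p_path_fixed.max())  # noticeably greater than 1
```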
So the informational, logical structure (the Ψ-function) manifests itself physically as a field. It is usually stated that we have no right to discuss which of the two alternative paths the electron has chosen if they cannot be distinguished by observation. But this is not quite so: we do speak not about results of observation but about these paths themselves when we calculate and sum their amplitudes. So the electron, being an empirical point, is at the same time a "logical field" of statements not verified empirically, whose truth status is a wave process in a certain physical environment. Ψ is a vector field whose scalar quadratic function plays an exclusively logical part. The truth measure of a statement corresponds semantically to its degree of "presence" in the given sphere of individuals, and in the quantum theory it is a field intensity. For ordinary physical fields |Ψ|² would be an energy; in the quantum case it is the probability measure of the
statement. It should be noted that the empirical undeterminability of a sentence does not mean that this sentence loses its sense at all: it reveals itself indirectly, through the observed facts. We can still discuss the unobservable electron trajectories. It is the superposition of the amplitudes of these paths that creates the interference picture on the screen. Here lies the principal difference between the quantum uncertainty relations and any classical relation between measurement errors. If a classical molecule in a volume V exerts on the walls of the volume a pressure consisting of local hits, then an electron localized in some volume will not have any definite coordinate, but will distribute itself through the whole volume and, therefore, exert an even pressure on the walls (which will occupy some force interval, since for the momentum we have ΔpΔx ≥ h/4π). This "blurring" effect we observe physically. And it is just this that lies at the basis of the tunnel transition and the principle of least action, as well as of processes involving virtual particles.
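A minimal numerical sketch of this relation (our own illustration, with ħ = 1 and an arbitrary packet width s): a Gaussian wave packet saturates the bound, giving Δx·Δp = 1/2, the minimum allowed by ΔpΔx ≥ h/4π.

```python
import numpy as np

N = 2 ** 14
x = np.linspace(-40.0, 40.0, N)
dx = x[1] - x[0]
s = 1.3                                        # arbitrary packet width
psi = np.exp(-x ** 2 / (4 * s ** 2))           # position amplitude

prob_x = np.abs(psi) ** 2
prob_x /= prob_x.sum() * dx
delta_x = np.sqrt(np.sum(x ** 2 * prob_x) * dx)   # <x> = 0 by symmetry

p = 2 * np.pi * np.fft.fftfreq(N, d=dx)        # momentum grid (hbar = 1)
phi = np.fft.fft(psi)                          # momentum amplitude
dp = 2 * np.pi / (N * dx)
prob_p = np.abs(phi) ** 2
prob_p /= prob_p.sum() * dp
delta_p = np.sqrt(np.sum(p ** 2 * prob_p) * dp)

print(delta_x * delta_p)                       # ~0.5, i.e. hbar/2
```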
So, in the quantum theory, the density of the probability distribution of a point particle is in principle indistinguishable from the density of a medium present all over the space. Logically this means a certain nonpredicativity: the measure of truth of a sentence about an object is itself a statement of some other admissible theory. In the classical case, too, the experiment does not distinguish a point having a coordinate determined with the error Δx from a medium distributed in the interval Δx. But the classical theory
adopts the unconstructive, inductively established idealization of the material point. The latter tacitly supposes that the density of the probability distribution of a point object is not identical with the distribution density of a medium. What is more, both classical and constructive logic adopt an analogous idealization when they proclaim the consistency tautology of the kind A&(~A)=0. From the standpoint of the quantum concept of truth, it is wrong in the case when the sentence A is unestablishable by the available effective means. Perhaps we must consider some "reference systems" in quantum logic, formed by a definite choice of a disjunctive system of events.
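The standard quantum-logic illustration of how such a choice behaves (our own sketch, swapped in for concreteness, not taken from the text): in the lattice of subspaces of a two-dimensional state space, the distributive law of classical logic fails, so each choice of a "disjunctive system" (a basis) fixes its own Boolean fragment.

```python
import numpy as np

def proj(v):
    """Projector onto the line spanned by v."""
    v = np.asarray(v, float)
    v = v / np.linalg.norm(v)
    return np.outer(v, v)

def join(P, Q):
    """Projector onto the span of the two subspaces (lattice join)."""
    U, s, _ = np.linalg.svd(np.hstack([P, Q]))
    B = U[:, : int(np.sum(s > 1e-10))]
    return B @ B.T

def meet(P, Q, n=400):
    """Projector onto the intersection (von Neumann's alternating limit)."""
    R = np.linalg.matrix_power(P @ Q, n)
    return np.where(np.abs(R) < 1e-8, 0.0, R)

A, B, C = proj([1, 0]), proj([0, 1]), proj([1, 1])
lhs = meet(A, join(B, C))                 # A ∧ (B ∨ C) = A
rhs = join(meet(A, B), meet(A, C))        # (A ∧ B) ∨ (A ∧ C) = 0
print(np.allclose(lhs, rhs))              # False: distributivity fails
```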
For a better understanding of the difference between the quantum and the classical situations, let us give one graphic example.
Suppose that a heap of sand is poured out on a flat surface so that its mass density over the surface is distributed, for instance, normally. And suppose there is one black grain of sand in this heap. Then the relative (i.e., normalized) weight of the sand over a unit of area will represent both the density of some medium and the density of the probability of finding the black grain over the given unit of area. Nevertheless, the medium is one thing in the classical case, and the information (the probability of finding the black grain) is quite another. But in the quantum case the situation is quite different. If the black grain is somewhere in the depth of the heap, then it cannot be observed directly, and it does not even have to be a grain: it will behave as a medium, distributed according to the probability. And the whole heap will take on a blackish tint, thickening towards the top. But if we begin to mix the sand, all this blackness will concentrate at one point when the grain of sand comes out and becomes observable. Summing up, we can draw the following conclusions:

1) Wave-particle
duality. Wave-particle duality is not a problem in itself. It is only a consequence of a deeper dualism between the medium and the predicate, the material and the ideal. There is no paradox in the unity of the wave and corpuscular descriptions at all. That is why physics searched so persistently for the electromagnetic ether in nature at the beginning of the century. Any medium, discrete or continuous, possessing the simplest elasticity, is a carrier of wave processes, as we see in acoustics, for example. And vice versa, with the help of wave packets we can model any classical corpuscles.

2) Dualism of the subject and
predicate. The real paradox is another dualism: what was the density of probability of some event under one set of conditions becomes a physical substance under others. And these conditions lie in the way the outer macro-situation distinguishes the range of disjunctive micro-states of the object. The same Ψ-vector describes both the field properties of the object and the informational, logical properties of sentences about the object. In the end, this state of things is caused by the fact that empirically an object is distinguished by us only through the presence or absence of some properties in it, these properties being manifested in nature neither worse nor better than the objects possessing them. "To be an object" is also one of the properties. It is only in logic that set elements are strictly separated from the predicates modelled on this set. As we can see, nature is built somehow differently. And that is why the strict division of the world into elements of the individual sphere and predicates outwardly applied to sequences of these elements is a shortcoming of modern formal logic. The conclusion can be made that the future development of logic must proceed by a more delicate study of modelling processes and a more flexible solution of the individuation problem. Namely, the relation (x ∈ M) must be determined effectively by some means of individuation. The same holds for the relation of truth, A(x1,...,xn) = 1, of the sentence A to the sequence (xi), xi ∈ M, i = 1,...,n, of elements of the set
M. As for set theory itself, the plurality of molecules in a glass of water undoubtedly makes up one physical object and has a physical reality. This cannot be so simply said, for example, about the Boolean (the power set) of this set. Nature, apparently, tends to "turn on and off" the notorious membership relation ∈. In any case, in the quantum theory only those sets of micro-states are real which are alternatively distinguished by a macro-situation. Now let us pass on to the problems of effectiveness. The first attempt
to substantiate mathematics by empirical realities was constructivism. But all constructivist appeals to intuitive clarity come down to the feasibility of empirical processes of calculation. And only empirical practice makes these processes intuitively clear. The quantum theory shows that empirical means do not reduce to calculation processes alone. There are many processes whose effectiveness constructivism groundlessly supposes to be self-evident. We have already pointed out the violation of the tautology A&(~A)=0, which is true both in Boolean and in Heyting logic. Analogous is the case with the self-evident effectiveness of the reflexivity property of equality, X=X, in the constructive and classical theories. Its inner contradictoriness was noticed in ancient times. To establish X=X experimentally, it is necessary to take the same thing twice for comparison with itself, which is impossible. The modern method of considering inscriptions (an unlimited stock of copies of the given object or symbol)
simply sidesteps the problem, considering it solved a priori. However, the nontrivial individuation of quantum objects does not admit of an effective determination of reflexivity. For the same reasons it is impossible to consider admissible such idealizations as "a point in a continuum" or, in general, "an element of a set", if this concept is defined independently both of the set and of the internal structure of the element. These idealizations cease to work at least because of the basic integrity of the quantum description: the Ψ-function specifies a state of the whole Universe, not of a set of separate objects. The complementarity principle points to the fact that effective means are relative and constitute certain equally valid "reference systems", much more complex than the systems of bases in a linear space. This relativity of the logical theory is still to be studied. It seems that Brouwer, in intuitionism, tried to reach deeper ideas than a constructive "inspection" of Cantor's constructions...
3) Quanta and
probability. Despite the unavoidable error of measurement, the classical theory ascribes to a physical quantity a point value within the interval of this error. This idealization is the main difference between classical physics and quantum physics. The idealization is based on an induction: we can always construct a more precise device and repeat the measurement. The nonrepeatability of the measurement act in the quantum theory prohibits this induction. But the medium distributed in the error interval of a single measurement is indistinguishable from the point which could be fixed in this interval by some other, ideal measurement. This is because we have only one measurement. And the thing which in some other measurement would be a point, an element of the set of values of a physical quantity, is now an interval, a subset. To be more precise, not even a subset, but the measure of truth of some predicate. Nature knows no strictly self-identical elements; it individualizes them anew each time, depending on the distinguishing of their features in some basic set (the "reference system" of the device).
Now let some physical quantity be determined through the set of values of another quantity. Then the point localization of one of them will lead to the loss of localization of the other, and "the value" of this other quantity will become some set of its possible values, that is, a predicate on the sphere through which it runs. Such is the case with coordinates and momenta: as is known, the values of velocity, being derivatives, are determined through open intervals of coordinate values (recall the Fourier transform used in deriving the uncertainty relations). Hence arises the principal uncertainty of the quantum description. Here one should pay attention to the fact that physics could have faced the violation of the idealization of point values of quantities not necessarily in the microcosm, and not simply because of the necessity of very precise measurements. The idealization under examination has not a physical but a logical and statistical meaning. That is why it is necessary to clarify the logical foundations of mathematical statistics in order to understand quantum phenomena. Probability theory is applied in statistics in a somewhat nonpredicative manner: statistics strives to ground the axioms of probability empirically and at the same time postulates their empirical
truth beforehand. So "quantum" phenomena may arise in statistics not only because of high measurement precision but for other reasons as well, for example, when the sample volume is increased. (The latter, in fact, follows from the quantum theory itself: a rubber ball thrown at a concrete wall will necessarily make a tunnel transition through it if an astronomically great number of throws is made. However, not everything here is so simple. Both classical and quantum physics describe the same world, and the quantum theory does not deny any classical laws; on the contrary, it only follows them. Actually, all their distinctions are purely statistical: to the events that occur unequivocally or are impossible in the classical theory, there correspond distributions of probabilities. And the quantum description always needs classical verification. In other words, the physics of Newton and Einstein, without rejecting its laws and without paying any attention at all to the phenomena of the microcosm, would all the same have somehow to explain the abnormal behaviour of the ball.) It is not classical physics but the modern theory of probability that is an approximate description of nature. Only the concrete value of the Planck constant should be attributed to the specificity of the phenomena of the microcosm. When the probability density is indistinguishable from the density of a medium, even the simple motion of an object will be described by a wave process, and it is natural to expect that we shall arrive at some field equation of the Schrödinger type. As for the empirical
shortcomings of modern statistics, a consistently carried out probability approach should consider metaprobabilities, that is, probability measures over the distribution laws themselves. All situations with beforehand unknown laws of distribution lead to such an approach. Indeed, it is precisely because we do not know the probabilities that we begin to speak about them empirically. And in a consistent theory, the frequency of some event must give information about distribution laws not of the first but of higher metalevels. But this is the subject of another work.
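A minimal sketch of such a metaprobability (our own illustration, with invented names and a Beta density assumed as the second-level measure): the unknown probability q of an event is itself given a distribution, which the observed frequencies update.

```python
import random

random.seed(0)
a, b = 1.0, 1.0            # Beta(1,1): complete ignorance about q
true_q = 0.3               # the "real" probability, unknown to the observer

for _ in range(1000):      # a sequence of observed events
    if random.random() < true_q:
        a += 1             # the event occurred
    else:
        b += 1             # the event did not occur

mean = a / (a + b)                                   # centre of the metadistribution
spread = (a * b / ((a + b) ** 2 * (a + b + 1))) ** 0.5
print(mean, spread)        # ~0.3, with the spread shrinking as data accumulate
```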
In conclusion, let us examine one more famous thought experiment: the Einstein-Podolsky-Rosen experiment, that very teleportation of properties which physicists have now learnt to realize experimentally. Two photons flying in opposite directions are created as a result of the annihilation of an electron and a positron. However far they fly apart by the moment of observation, determining the polarization of one of the photons will cause exactly that polarization of the other which is required for the conservation of the angular momentum of the whole system. It turns out that some
physically important information can spread at any speed, even exceeding the speed of light. There is no contradiction here: a luminous spot on the screen of a long enough kinescope can also move at an arbitrarily high speed. The thought experiment simply affirms that not all physical phenomena are caused by the propagation of an energy-carrying signal. That is, purely phase characteristics of the Ψ-function describe observable physical effects. But the Ψ-function is such a "truth measure" whose phase changes describe the qualitative peculiarity of sentences, their interrelation and their dependence on time. The world woven by the movable light spot has quite different informational characteristics than classical physics supposes. In the classical world, information exists only in Shannon's sense, as a measure of the nonuniformity of a material substance: some medium or energy-carrying signal. But in the world woven by the light spot, we have information as a self-sufficient ideal essence, from which, purely teleologically, as if by a drawing or plan, the matter is constructed. The Einstein-Podolsky-Rosen experiment, the explanation of the principle of least action through wave packets, the tunnel transition, the merging of indistinguishable alternatives, virtual effects: all this says that the real world does have such teleological features, though not always and not in everything. This has some
deep philosophical and logical consequences. We have tried to show that the understanding of quantum phenomena should be sought in logic. Namely, it is required to study the relations between the individual sphere and the sphere of predicates more flexibly and in more detail, and to solve the individuation problem nontrivially. The individuation of elements must be effectively defined by the available means, which form something like reference systems. This also requires some reconstruction of probability theory as the base.

2. A LITTLE OF PHILOSOPHY

The foundations of
mathematical logic, especially in the semantic sphere and in model theory, are inseparable from certain global philosophical questions, which we shall examine briefly.

1) Material and Ideal. This question is directly connected with our main subject: the logical relation between objects and their properties. Classical philosophy paid much attention to the material and the ideal. However, these notions themselves have not reached a sufficient level of strictness. By the ideal, philosophers understand the intuitive image of psychical processes, which begin in sensation and end somewhere at the level of thinking and ideas. Philosophy treats the ideal generally, abstracting from purely human qualities and discerning similar phenomena in the whole of nature. As for the material, here the initial image was that aspect of sensation which goes beyond the limits of the psychical and belongs to the
outer world. Nevertheless, classical philosophy left us two clear-cut aspects of the material and the ideal. First: the ideal has no other essence except reflection, modelling, homomorphism. The material is that which is reflected, thus appearing as something external to the ideal. These results are formulated most fully by Marx and express certain functional properties of the material and the ideal. We deliberately abstract from the specifics of Marxism and do not state, after Marx, that "the material exists outside and independently of the ideal", or that "the material is primary, the ideal is secondary". Second: the ideal plays the role of a predicate, a property, and in this sense has its foundation in itself. The material is the subject of a predicate, a carrier of properties, the main of which (the axioms) are externally applied to matter. And in this sense the material has its foundation out of itself, in something else. This aspect was most exactly formulated by Hegel. And again, we abstract from the rest of the specific properties which Hegel ascribes to the idea and to matter, and stop at these logical properties. Now
let us pay attention to the fact that physics and mathematics have brought much that is new to this question and have studied both the first and the second aspects more strictly. The point is that in modern science both these aspects characterize the ideal simply as information, and the material as that which models, reflects the given information. Attention should be paid to the fact that information is understood here in a wide sense, as some measure on properties, expressing their qualitative differences and the quantitative characteristics of their truth. But logic has not yet developed such a concept of information. Shannon's definition is only a particular, purely quantitative case: a comparison of the information of qualitatively uniform sentences having real truth values in the interval [0,1]. So the philosophical notion of the ideal corresponds to sentences, their probabilities, their measures of truth in different logics, as well as to the quantities depending on them. All the above-mentioned arguments of the quantum theory concern the relationship between the material and the ideal (the medium and the probability). As we have seen, nature is built so that these poles are two sides of the same medal. The material and the ideal are unstable notions, relativized in certain reference systems, determining and completing each other. In the quantum theory a single concept describes them both: the Ψ-function. So the logical problem of the subject and the predicate has a direct philosophical accent.

2) What
is Dialectics? Nowadays this word seems somehow incomprehensible; it is interpreted almost like "sophistry", i.e. toying with paradoxes, as ungrounded attempts to draw strict conclusions in nonformal and even nonformalizable situations. In the past, philosophers often argued whether the laws of logic are objective or not. But now it is almost evident that the laws of logic are just the laws of our thinking and nothing more. And these laws are universal for the whole world just because they are the laws of making language constructs. And they are objective in so far as they describe some part of the work of our brain. That is why many different logics can be constructed which will describe the same theoretical content. Logic collides with reality only in the processes of verification and modelling, which are empirical in the long run. But reality itself does not have to obey any logical laws at all. Moreover, everything shows that reality is not logical altogether. For example, there is the impossibility of simultaneous completeness and consistency of formal arithmetic, against the empirical completeness and feasibility of all real counting processes. And our theoretical cognition itself, turning to experiment, is forced to give up old formal theories and create new, more precise ones. And, judging by everything, this process is infinite. That means that no formal theory is capable of being empirically complete, i.e., the world has no formal model. In other words, the world is logically contradictory! But it is contradictory logically only. These formal contradictions are not only resolved in their content but, possibly, do not even arise there. We approach the formal contradictoriness of any
real content for purely formal reasons as well. Let us again turn to Gödel's incompleteness theorem. Any formal theory rich enough in empirical content includes arithmetic. Hence, being consistent, such a theory cannot be complete. Whereas in reality this theory can have a complete model (arithmetic, for instance). Hence follow both the contradictoriness of reality and the necessity of the constant turning of a formal theory to its empirical model for deciding statements which are unestablishable formally. Thus, the real world is contradictory formally, but this is not the world's problem, it is our problem, because formal logic is the law of our thinking. All this is a direct consequence of the fact that statements unsolvable formally are always solvable by their content, but not vice versa. And hence it follows that the substantial object is simply nonformalizable: it cannot exist formally. But in logic (and not only in logic) incapability of existing is a synonym of contradictoriness. Yet substantial objects do exist really. There is a difference between actual and logical existence. So in reality everything in nature is logically contradictory. That is why we have to model the contradictory noncontradictorily, reconciling ourselves to idealizations, to the relativity of our truths and to the groundlessness of our induction methods. And cognition is just the constant pulling of the straitjacket of logic onto the evident absurdity of reality. Therefore, dialectics is a logical contradictoriness which is realized logically through a noncontradictory method. An excellent illustration of it is the example from the quantum theory: the identification of nonobservable alternatives. In fact, all science consists of such examples. And a point in the continuum is just an attempt to connect the full totality and lack of individuality of the medium with the "elementness", the strict individualization, of discrete objects. Thus, the logically contradictory world is projected noncontradictorily into our logic. This is just dialectical contradictoriness. This is cognition.

3)
Relativity. The problem of relativity was put forward by Berkeley. It is very important that for him the relativity of truth consisted just in the presence of effective means of determining it. And this relativity was total: every object was created only through correlations of sensations. "To exist is to be perceived." Only physics took part in the further development of relativity; so reference systems appeared. Thus excessive anthropomorphism was excluded, "observation" being replaced even by "interaction", and "to exist" became a synonym of "to be perceived and to perceive". The relativity itself has a purely dialectical nature. Since reality is contradictory, besides the truth of some sentence A, (~A) must be true as well. One of the methods of logically noncontradictory modelling of this situation is the truth of A in one reference system and the truth of (~A) in another, each reference system finitizing the world, "carving" a finite part out of it and effectively choosing not an arbitrary but a particular negation, i.e. a sentence B such that A&B=0. As for the concrete
character of the relativity, modern science gives here two models: the Einsteinian and the quantum one. These models differ in essence, though in both cases it is a question of transformations of the basis of a linear space. The reference systems of the theory of relativity are built on the passive act of observation, which leads to the coexistence of different reference systems exchanging information, and to the possibility of changing our system rather at will. But in the quantum theory the Ψ-function always describes the situation as a whole; it is always the only one, unique at the given moment of time. The act of observation is replaced by the act of interaction, and there is no changing the choice of the complete system of really measurable quantities. There is no observer from another reference system who would measure momenta while I am measuring the coordinate. The quantum observer is always one, as is the instrument he has chosen for measuring the characteristics of a particular particle in a given state. Thus, Einstein's relativity speaks of different sides of the real. The quantum relativity treats of different variants of the possible, with complete absoluteness of the choice among these possibilities. The choice is only one, and once made it cannot be changed. As we see, the differences have a modal character. But in both cases the relativity is modelled mathematically by transforming the bases of some space. In the Einsteinian case it is a metric space; in the quantum case we have different representations of the state vector in Hilbert space. And the
identification of nonobservable alternatives gets a very interesting formulation: "The class of possibilities not realized by a given reality presents itself as a single and indivisible reality. And vice versa, any reality is the class of possibilities which are not realized by it." Just this happens with the Ψ-function of the electron when it "gropes" for its real way with superluminal phase speed so as to fulfil the principle of least action. And again we see that the cause of the quantum phenomena lies not in physics, but in logic and the theory of probability. But modern logic lags behind these ideas; it does not model the relativity of sentences. This has happened because the effectiveness problem has been developed by mathematics without any relation to the relativity problem. Thus, different models of a given predicate only conjunctively add to its properties a new list chosen from a strictly fixed set of properties coinciding with the initial ones. The concept of relative solvability developed by constructivism is marked by an analogous scholasticism. The relativity of modern physics requires that the model of a predicate should not regulate in a trivially strict way the relation of this predicate to the elements of the individual sphere. And the truth value itself must also be relativized.

4) Hegel. Now let us
remember Hegel's dialectics. At the basis of the Universe he puts the idea, the notion, which in the absolute is identical with God. The world of phenomena is presented as a not quite adequate embodiment of the idea, almost as a parody of it. Thus matter is determined negatively, as a negation of the notion, and, at the same time, matter has only the notion as its basis. So matter is that which has its cause in something else. The notion, on the contrary, has its foundation only "in itself and for itself"; no more characteristic properties does this dialectical pair have. And the direct self-evidence of an axiom cannot, in fact, be the self-validity of the notion, for it is empty and in reality conceals the basis of the axiom in belief or in practical needs. In the same way the notion creates itself and deduces itself through self-denial in its other being, in matter. Two properties of Hegel's dialectics strike the eye. The first one is relativity. All these expressions of the kind "in itself", "for itself", "for the other" are simply indicators of different reference systems. But with Hegel the interrelation of these systems is rather peculiar: they are unstable and constantly flow into one another. Just in this flowing consists both the process of meditation on the nature of the notion and the evolution of the notion itself. "In this aspect it looks such that it requires quite a different aspect, in which it looks quite different..." is a typical turn of Hegel's thought. It should be noted that nature behaves in the same way, constantly performing, according to Schrödinger's equation, a unitary transformation of the Ψ-function. And the second, the main characteristic of the notion is nonpredicativity, self-applicability. And Hegel, as Russell does in the 20th century, arrives at the self-negation of such a notion. To avoid misunderstanding, further on we shall use inverted commas to distinguish Hegel's notion, the "concept", from the notion in general.

3. NON-PREDICATIVE CONCEPTS IN CONTEMPORARY LOGIC

We have an
interesting situation. Formal logic can model Hegel's ideas only nonpredicatively. At the same time, Russell drove all nonpredicative statements out of logic. But this is not quite so. As we have mentioned, logic presents the contradictory noncontradictorily. Russell's paradox (if X ≡ {α | ~(α ∈ α)}, then (X ∈ X) ↔ ~(X ∈ X); or the nominal variant: "In the village lives only one barber; he shaves all the men who do not shave themselves. Who, then, shaves the barber?", whence we can see that the point lies not in the specifics of set theory but in our thinking in general) is "solved" by a simple refusal to contemplate such sets, for instance, by introducing proper classes, which are not sets.
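The self-referential core of the paradox can even be typed into an interpreter. A toy sketch of ours, modelling membership by function application: the predicate "does not apply to itself", asked about itself, never settles on a truth value.

```python
# X = {a | ~(a in a)}, with "in" modelled as function application:
russell = lambda p: not p(p)

# russell(russell) would have to equal (not russell(russell)):
try:
    russell(russell)
except RecursionError:
    print("no stable truth value: the evaluation never terminates")
```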
Further, if the above-mentioned classes are considered within the limits of the theory of types, the nonpredicative concept is still present in the theory, though not manifestly, being "diffused" all over the hierarchy of types. But even without turning to the theory of types, we come to the concept of a universal set (a set closed under all set-forming operations) together with the axiom that any set belongs to a universal one. In this case, this universal set will be one of the models of the theory of sets as a whole. That is, if logic refuses to consider the nonpredicative object "the set of all sets" directly, it all the same introduces its more strict model, a universal set. That is why one should not insist that such philosophical generalizations as "the World as a whole" are logically groundless. The universal set is just one of the formal models of the Universe. How adequate this model is to its prototype, or whether the Universe should be considered an open or a closed system, are different questions; but the fact is that the theory of sets has not renounced the concept "the set of all sets", but just proved that this concept should be modelled nontrivially, i.e. as a universal set. The same holds for the other absolute concepts: logic does not forbid their formal models at all, but models them indirectly, nontrivially. On the contrary, these general concepts can be interpreted scientifically only in a formal and logical way. Remember, for
instance, Hegel's definition: the infinite is something self-identical due to its self-difference. As a matter of fact, it coincides with the "concept" as that which is "self-affirmed through its self-denial" (we again quote Hegel freely). When formalizing such an infinity trivially, we come down to the absurdity (X=X) ↔ (X≠X). But replacing equality here by equivalence, Cantor arrives at the following: "the set X is infinite if there exists a one-to-one immersion f: X→X such that f(X) ≠ X."
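On the natural numbers this definition is witnessed by the successor map, as in the following sketch (our own illustration; the finite window merely samples the infinite set):

```python
def f(n):
    """A one-to-one immersion f: N -> N that is not onto."""
    return n + 1

window = range(10_000)                    # a finite window into N
image = {f(n) for n in window}
assert len(image) == len(window)          # injective on the window
assert 0 not in image                     # 0 never occurs: f(N) != N
```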
It is clear now that Hegel, after all, correctly defined the infinity. And if we take a closer look at the situation, it becomes clear that empirically established equality is always a coincidence of a limited class of properties, i.e. some effective equivalence.
In this case, one should not simply substitute for the classical definition (X=X) & [(X=Y) → (A(X) → A(Y))] (where A is any statement) the definition (X=Y) ≡ [A(X) ↔ A(Y)] (where A ranges over an effectively limited class of statements). As has been mentioned above, the non-effectiveness of reflexivity should somehow be taken into account. As we see, the problem of infinity is inseparable from the problem of individuation. But with all this coincidence of the
concepts of infinity according to Hegel and Cantor, it should not be forgotten that Cantor's definition is only a formal image of Hegel's substantial concept, and an image far from perfect at that, for it is created by classical mathematics, with its trivially clear understanding of an element, and of a function as a set of ordered pairs of elements with single-valuedness of the formula. But for Hegel's approach all those are accidental factors, not justified by anything. Meanwhile, modern mathematics has only Cantor's definition of the infinite. At the same time, Russell's paradox and the troubles with the axiom of choice illustrate the incompleteness of this
definition. The next, and perhaps the only, step in formalizing the infinity of the "concept" was made by Robinson in his theory of nonstandard mathematics. The basis here is the fact that a formal theory having a countable model has a model of any infinite cardinality. What if the first model is a subset of the second one? What if the theorems of the first model result from the theorems of the second one by restricting the sphere of action of quantifiers and the range of the variables? By using the compactness theorem, ultrafilters and directed relations, such a pair of models can be built for any formal theory. And then the following situation arises. The theorems true in the standard sphere (the individual sphere of the first model) can get broken in the expanded set, which includes nonstandard elements as well. But the same theorems are still true in the expanded set; only their standard interpretation is broken. Here we have a substantial nonpredicativity which is expressed quite predicatively and noncontradictorily in a formal way. The theory is modelled so that its theorems have a "truth which is realized through the moment of falsity". Of course, formally it looks quite different, but the content here is the same as in the "concept". As to reality, the infinity of the field mass of a particle (where the mass in fact proves to be measurable, though formally it is an infinite number) is more naturally explained by nonstandard methods than by renormalization. Analogous is the situation with the zero probabilities of point events that form a continuum. But here as well we have
but an incomplete model of the "concept". Making Cantor's infinity nonstandard, we still cannot eliminate the aforementioned shortcomings. For instance, the standardness predicate remains here something external to the theory, related only to its model. But the "concept" requires the modelling process (being the embodiment of the idea into matter, according to Hegel) to be not something separate, but something included organically into the construction of the logical language and theory. Another instance of the real action of nonpredicative structures in mathematics is the apparatus of recursive functions. Here we also avoid any direct application of the nonpredicativity through some variation of the term in
the right-hand and left-hand parts of the equation f(n+1) = g[f(n)], where the natural number n is changed to (n+1). As a matter of fact, it should be mentioned that nonpredicativity in modern mathematics always causes much trouble, even if it does not lead to a contradiction. The reasons become clear if we consider, for example, the functional equation f(x) = g[f(x)], where x is real or complex. The situation is typical of mathematical analysis. The equation presents itself as a definition of the function f(x). But to acquire full information about it, one should solve the equation, that is, find an expression of f(x) through other functions and so exclude the nonpredicativity.
Such a striving for determinism and for the irrelativity of cognition shows that mathematics is still too classical, still lagging behind the newest parts of physics. We are trying to show that the mathematical logic to come should have principles of complementarity, relativity, etc. of its own. The best result of applying nonpredicative methods was given by Gödel in constructing the proof of his theorems. But that is a theme for a separate discussion.

4. GÖDEL'S INCOMPLETENESS THEOREM

Gödel proved the
nonpredicativity of arithmetic concealed in the metatheory. The essence of the incompleteness theorem consists precisely in the possibility of coding numerically the logical symbols in sentences about numerals and, by means of recursive functions, the logical relations between these symbols.
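A toy version of such a coding (our own illustration, far simpler than Gödel's actual scheme): a sequence of symbol codes is packed into one number via prime exponents, and unique factorization makes the coding reversible, so that relations between symbols become arithmetical relations between numbers.

```python
PRIMES = [2, 3, 5, 7, 11, 13, 17, 19]

def godel_number(codes):
    """Encode positive symbol codes c1..ck as 2**c1 * 3**c2 * 5**c3 * ..."""
    n = 1
    for p, c in zip(PRIMES, codes):
        n *= p ** c
    return n

def decode(n):
    """Recover the codes by factoring out each prime in turn."""
    codes = []
    for p in PRIMES:
        c = 0
        while n % p == 0:
            n //= p
            c += 1
        if c == 0:
            break
        codes.append(c)
    return codes

codes = [3, 1, 4, 1]                  # codes of some four symbols
n = godel_number(codes)               # 2**3 * 3 * 5**4 * 7 = 105000
assert decode(n) == codes
```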
But this time the content relates to the metalogic. The fact that this coding makes it possible to formulate the liar paradox (one of the modifications of Russell's paradox) is just a consequence of the nonpredicativity exhibited by Gödel. Thus, arithmetic is nonpredicative in the sense that sentences about numerals are expressible by these numerals, and metasentences by sentences. As we see, this is a finer nonpredicativity than the one considered by Russell; Russell's nonpredicativity is its consequence. But the character of the "derivation" of this consequence goes beyond the boundaries of the formalism in which we construct arithmetic, and makes us study a whole class of other formalisms. What sentences are
unsolvable in arithmetic? Precisely those for which there exists a system of Gödel numbering in which they turn out to be the liar paradox (becoming nonpredicative according to Russell). This class of numberings is virtually the whole class of formal languages isomorphic to the one in which arithmetic is built. The proof of this isomorphism is the main point of Gödel. If in one of these languages the sentence is interpreted as the liar paradox, it will be unsolvable. An impression is created that within the boundaries of classical mathematics Gödel uses methods radically different from the formal and constructivist ones. A statement is regarded here not as something rigidly formulated in a single formal language, but as a pattern of statements, as a function on the class of isomorphic languages. As a result, it is proved that this function has some properties invariant with respect to the class as a whole. The consequence of this is the unsolvability of the statements with a nonpredicative interpretation in one of the languages. Moreover, this class of languages necessarily contains elements nonpredicative in the sense that they are built just of the "letters" of the individual sphere which models the theory. The fact that this property holds for any theory within whose boundaries arithmetic can be modelled (and this is just the property of any theory we are interested in) points clearly to the following general
conclusions. First, Gödel's theorems describe the correlation of the "concept" and its material embodiment according to Hegel's philosophy. It is precisely the ideal structure (the formal theory) that is reflected in the material structure which it describes (the numerals, the individual sphere) and, owing to this, reflects itself. Here we have the self-denying, autoreflexive nature of the "concept". Thus, the infinite set together with the induction and recursion principles presents a formal model of the "concept". As Gödel has proved, it is incomplete and extendable, and its noncontradictoriness cannot be established practically. As to the further refinement of the model, the global character of the "concept" requires all cardinality limitations to be removed; besides, as we have mentioned above, the nonstandard variant of its formalism looks preferable. That is why the proper class of all ordinals of the nonstandard set theory with the axiom of limitation is to be considered the most complete formalization of the "concept". The axiom of limitation is equivalent to the statement that any set is obtained from Ø by taking power sets and unions a transfinite number of times.
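In the usual notation of the von Neumann cumulative hierarchy (a standard rendering, supplied by us):

```latex
V_0 = \emptyset, \qquad
V_{\alpha+1} = \mathcal{P}(V_\alpha), \qquad
V_\lambda = \bigcup_{\alpha<\lambda} V_\alpha \quad (\lambda \text{ a limit ordinal}),
```

and the axiom asserts that V is the union of the V_α over all ordinals α.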
the "concept" in Hegel's philosophy. We have not discussed yet
the truth of this philosophy, an adequate formalization of the
"concept" has been considered. And this formalization, as
mentioned above, is still far from being
perfect. Second, in the modern logic, Gödel's theorem
pops up suddenly and accidentally, like a trick; within arithmetic itself no incompleteness is evident at all. This incompleteness poses many more problems than it solves. Therefore, Gödel's theorem shows that modern logic has somehow exhausted itself and needs to be revised. And it is the way the theorem is proved that shows the direction of further development. The process of constructing a formal language (as well as of the logic and theories in this language) cannot be separated from the process of its modelling. A structure uniting two faces, the predicate and the individual, should be built. The formalism, as well as its model, should lose their rigid unambiguity. The main apparatus must embrace some class of isomorphic languages and models. Taking any aspect of this structure, we should obtain a model of a formalism or a formalization of a model. It is difficult to say anything more concrete now about this future logic. But the fact mentioned earlier can serve as a confirmation of just this tendency of development: mathematics discerns and registers individuals through the feasibility of certain predicates. At the same time, the predicates themselves are defined in the long run as functions from the individual sphere into a Boolean (or Heyting) algebra, i.e. only through their own models. Therefore, the predicate cannot be considered separately from the individual, and the problem of their correlation cannot be considered solved a priori, as modern logic does. All the problems arise just with this correlation. The Boolean (as well as the intuitionistic) algebra of "pure" statements is complete, decidable and noncontradictory. The problems arise just in the predicate calculus, in modelling first-order theories. And nature itself, as was shown for the quantum theory, does not correlate individuals and predicates in a trivially external way. Therefore, nonpredicative methods acquire a special meaning. And despite the impossibility of using them directly, just they will serve as the basis for the development of logic and the key to understanding Hegel's dialectics.

5. FORMALISM AND CONSTRUCTIVISM

Thus, the uncertainty
and complementarity principles of the quantum theory arise, possibly, from the same source as Gödel's incompleteness in mathematics. In the quantum theory these principles arise almost as a statement of empirical facts, and in mathematics as a sudden and not quite pleasant surprise. Thus, to find their deeper grounds is a matter for the future. But maybe the constructive trend in mathematics could help here? Despite all its theoretical and practical significance, constructivism does not solve the global problems of logic. The modern formal systems are not able to substantiate themselves. Unsolvable questions can be formulated in them, and paradoxes arise: concepts not contradictory in themselves turn out contradictory in the formalism. The paradoxes are eliminated by accumulating more and more rather artificial axioms. The trouble is that these systems do not reach the deep reasons of the origin of a paradox. The algorithmic method is not free from these problems either. The constructive solvability problem is itself unsolvable. That means that there will always exist problems whose solvability cannot be clarified by the algorithm theory; this is not simply the impossibility of establishing the truth, it is meta-undecidability. And if such a problem is unsolvable owing to some concealed reasons, the algorithmic method will never find this out. So constructivism could not really get rid of such problems. These two trends in mathematics began with violent discussions, and now they are trying to come to mutual understanding. And they do come to it. Any formal language has a strict algorithm of its construction from initial symbols.
The introduction of topological methods makes it possible to convert pseudo-Boolean algebras into Boolean algebras, and vice versa. Constructivism becomes a logic of problem solving, and formalism a logic of mathematical deduction. Both approaches have become, as a result, equivalent and treat the same subject in two different aspects. Probably Brouwer himself was after something quite different. Let us examine these problems more closely. Constructivism is considered to drive the infinite out of mathematics. But this is not quite so. The potential infinity is a non-explicit application of the actual one. As we have seen, an analogous situation arises in set theory: proper classes are introduced to avoid Russell's paradox. But having refused to consider nonpredicative concepts, we force them into the sphere of the metatheory, and then we are not immune to surprises, either acceptable (of the universal-set kind) or not quite acceptable (of the incompleteness-theorem kind). The question of the rightfulness of applying the
concept of infinity comes down only to the following: how far is the induction principle rightful in empirical practice? Just by spreading a generalization of what is observed in many experiments onto all other experiments, one comes to the ideas of infinity, running the risk of erring and arriving at a contradiction. And one does err, making the first inductive conclusions more precise with the help of subsequent ones, which are also inductive. But there is no cognition without induction at all. Without induction we could not designate classes of objects by words; we could not unite a number of sensations of a certain magnitude, hardness, form and weight by the word "stone". And this induction, erroneous by definition (a conclusion based on a finite number of cases is groundless for the whole infinite class of cases), is the main tool of theoretical cognition. Induction itself represents this cognition without being able to substantiate itself. It is possible that just in the practice of induction lies the cause of the self-negation properties of Hegel's "concept". But the main thing for us now is that induction is simply unavoidable for thinking. The infinity is unavoidable for thinking as well. By denying the actual infinity, the constructivists act unconstructively, for they have arguments neither in favour of its rejection nor in favour of its maintenance. The only argument that remains is that the infinity is not perceivable empirically. But many other things are not perceived empirically either, such simple notions as "stone" (or "number"), for instance. Simply by being a notion, it has a purely inductive nature, being linked with experience only indirectly. If we refine the concept of counting very strictly, the empirical process of counting will prove to be only an unconscious sorting of items. The concept of number is impossible without experience, and yet it is a purely inductive, extra-experiential entity. In their time, Berkeley and Kant demonstrated this well. We have now turned to the sphere of empirical
facts not accidentally. As we have mentioned, the empirical counting process is the foundation of constructivism. In essence, it is an experimental construction of the same classical mathematics, and formalism follows a way fully equivalent to the constructive one, because formalism builds its constructions out of discretely discernible symbols using certain combinatorial rules. But the perceived world is more complex than counting processes or collections of discretely discernible elements. Observing the smooth movement of the pointers of his instruments, the physicist draws an inductive conclusion about an everywhere-dense order of the values of the measured quantity. Nevertheless, he has to graduate these values by the discrete marks of the natural series in order to draw the maximum of information out of the positions of the pointers. All this is an idealization made by our intellect and our reference systems. But if formal logic is occupied with the noncontradictory interpretation of contradictions, intuitionism is engaged in a constructive interpretation of the nonconstructive world.

6. THE CONTENT AND THE FORM

In Section 5 the subject of infinity was considered. As to the real World, we
should admit its infinity at least because of its logical contradictoriness and principal nonpredicativity. But our knowledge is always finite, for it is always expressed through a finite number of symbols of a formal language. And it just turns out that we cognize the infinite by finite means, somehow reducing, limiting the infinity in our imagination. We think that the logical "reference systems" within whose limits we imagine the future logic will just reflect and limit in different ways the infinity of the World by their finite means. Speaking of the infinity of the World
and of the finiteness of the knowledge, as a matter of fact,
we speak of the substantial and the formal. Because under the
content, the formal theory semantics mean always not obviously
its real empirical model. The term "content" has simply no
other meaning. The formal arithmetic studies the properties of
numbers expressible by the formal and logical method. The
substantial arithmetic is occupied with real count processes.
Being preoccupied with the form exclusively, the mathematics
applies in the end for the content to the empirical practice
only. But just in the end, because there is a relativity of
terminological and modal character. On the whole, the group
theory can be called the form, and such properties as
commutativity, cyclicity, etc., attributed to the content. But
this will be the content which can be formalized, i.e. the
formal one. It is done so in the mathematics by modelling a
theory in the individual sphere of another one. For all that,
the "true", absolute content rests, nevertheless on the
empirical reality. The algorithmical language is of value only
in that one can write an algorithm down here and now, on this
sheet of paper and then apply it to the really written
symbols. Thus, the "true" content is a complete model
of the formal theory. Much can be said about this model, but it exists only in reality, as a concrete object, virtually having no formal existence. The abstract, formal object, on the contrary, exists as a language structure. Gödel's theorem just proves the inexhaustibility of the content by the form. Thus, the content is an infinite gradation of forms, realized really but not formally. The form presents, in the infinite plurality of contents, exactly as much as can be embraced by the formal theory and realized formally. All the terms of any language are formal in the sense that they designate an abstract "object in general" of the kind "stone", "house", "variable", etc. But a real house is rich in content, having an infinite plurality of properties, and any axiomatic description of it will be incomplete. From any concrete project of architects and builders an infinite number of houses, copies of its formal plan, can be constructed. Hence, for any formal means some content-rich objects are indiscernible and even manifest themselves as coinciding (as in the quantum theory), although they are alternatives. Consequently, the problem of the individualization of objects is far from trivial from the empirical point of view, contrary to what is supposed by set theory with respect to elements, or by algorithm theory with respect to the totality of inscriptions representing similar objects. Besides, when studying modelling and semantic problems, mathematics inevitably turns to empirical practice, which requires a revision of the initial mathematical postulates.

7. METAMATHEMATICS

Despite the fact that
the modern mathematics understand the truth of statements
rather absolutely, some relativity moment is still inevitable.
Therefore, any theory or logic imply metatheory and metalogic
nonmanifestly. A formal language cannot be built on nothing. A
metalanguage is necessary for this. And the number of these
prefixes: "meta-", in fact, should be built up infinitely. So
that operating on the finite objects only, the constructivism
simply ignores the infinite number of the metalevels of its
determinations. This sequence is usually cut off artificially,
trying to build the theory so that the metastatements were as
trivial as possible; then all the necessary metainformation is
reduced to the thesis that the letters of the language are
discernible and there is an unlimited stock of inscriptions
for any letter. But Gödel's theorem has destroyed this
seeming simplicity as well. Figuratively speaking, a "short
circuit" between the individual sphere and the sphere of
statements of them is created. This nonpredicativity pierces
all the infinite sequence formed by the prefix "meta-". All
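This "short circuit" can be shown in miniature. The sketch
below is ours and purely illustrative, and the coding scheme
is chosen arbitrarily: formulas of a toy language are assigned
numbers by a Gödel-style numbering, so that statements about
the language re-enter the very individual sphere the language
speaks about.

    -- A toy Goedel numbering (our illustration; the coding is arbitrary):
    -- formulas are data, and each formula receives a number, so that
    -- statements about formulas can re-enter the realm of the numbers
    -- the formulas themselves speak about.
    data Formula = Var Int | Neg Formula | Imp Formula Formula

    -- Cantor pairing: an injective map of pairs of naturals to naturals.
    pair :: Integer -> Integer -> Integer
    pair m n = (m + n) * (m + n + 1) `div` 2 + n

    -- A constructor tag (mod 3) plus pairing makes the coding injective.
    code :: Formula -> Integer
    code (Var i)   = 3 * pair (fromIntegral i) 0
    code (Neg f)   = 3 * pair (code f) 0 + 1
    code (Imp f g) = 3 * pair (code f) (code g) + 2

    main :: IO ()
    main = print (code (Imp (Var 1) (Neg (Var 1))))  -- a number "about" a formula

Of course, the real content of Gödel's construction lies in
making the coded statements assert things about their own
codes; the sketch shows only this first, mechanical step.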
All the more is it necessary, then, to construct the formalism
so that all metalevels and logical reference systems are taken
into account in the foundations of mathematics
itself. Though, even leaving Gödel's theorem aside,
formal logic has to touch upon the metalevels, because
semantically any logical operation can be interpreted only as
a metastatement. And this is precisely the deep basis of the
constructive criticism of classical implication and negation.
Namely: (A→B) is a metastatement about the relation between
the truth values of the statements A and B. That is why the
constructivists insist that, without a real derivation of B
from A, the implication is simply a logically groundless
formal trick of Boolean algebra.
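The constructivist point admits a compact illustration. The
following sketch is ours (the standard Curry-Howard reading,
not a construction from this article; the names Proof, Even
and plusTwo are invented for the example): constructively, a
proof of (A→B) must be an actual derivation, a function
transforming evidence for A into evidence for B, whereas
classically the implication is exhausted by a truth table.

    -- Classical implication: a mere truth table on Boolean values.
    classicalImp :: Bool -> Bool -> Bool
    classicalImp a b = not a || b

    -- Constructive implication (the Curry-Howard reading): a proof of
    -- A -> B is an actual derivation, a function turning evidence for A
    -- into evidence for B.
    type Proof a b = a -> b

    -- Toy evidence: the constructor is meant (unchecked in this sketch)
    -- to carry an even number.
    data Even = Even Int

    -- A real derivation: from evidence that n is even, evidence that
    -- n + 2 is even is actually constructed.
    plusTwo :: Proof Even Even
    plusTwo (Even n) = Even (n + 2)

    main :: IO ()
    main = do
      print (classicalImp False True)  -- True "for free", with no derivation
      let Even m = plusTwo (Even 4)
      print m                          -- 6: the evidence was actually transformed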
Analogously, the quantum theory calls the classical operations
(A&B) and (A∨B) into question as well. Incidentally, it is not
only the quantum theory in physics that contains the
nonpredicativity of objects and their properties. The same
occurs in the general theory of relativity. Being a relation
between material objects, space-time itself becomes a material
object, possessing even a nonzero stress-energy tensor (more
accurately, a pseudotensor, but that is immaterial here).
Space-time can possess a curvature of its own, not induced by
matter yet manifestly influencing the behavior of matter; so
here as well nonpredicativity is obtained. But the real
nonpredicativity of nature means that everything appearing in
our formalism as metalogic must have an analogue in physical
reality. The whole sequence of metalevels of any properties
should somehow be realized physically. It is only formally
that we can agree not to consider this sequence; in reality it
must be real. In this connection, it is necessary to
return to the problem of the correlation between the material
and the ideal. Such properties as "to be material" and "to be
ideal" are deeply relative. Matter is that which exists
outside of, and independently of, the reference system
perceiving it (let us venture to exclude anthropomorphism from
Marx's definition). I.e., it is what is reflected in the given
reference system but has its cause in another system.
Therefore, matter always exists "outside itself". Here we come
close to Hegel's ideas. According to Marx, the ideal is
defined as a reflection of the material in the material. But
Marx does not take the reference system into account. Where is
it, this ideal? It is where it is present as a reflection,
i.e., in a third object perceiving both the reflecting and the
reflected. But then a fourth object, etc., is necessary for
the existence of the ideal, for the metareflection itself must
be ideal. And all this unlimited "meta-" sequence can become
something whole (for a whole reference system) only in
self-reflection. So it turns out that the ideal exists in
itself and for itself; its nature is autoreflection; it is
something that perceives itself and is perceived by itself.
And it is only in itself that the ideal is ideal. The
information, the reflected image, is not in the mirror or in
the reflected object but in the reflection itself, taken from
the side of its self-definition. If some physical and
mathematical abstractions are excluded, one must admit the
nonlinearity of all reflections and interactions existing in
nature. But nonlinearity is a synonym of self-action. Material
objects exist only through interaction. Therefore, matter
always creates information, the ideal. And vice versa: the
ideal, existing only in itself, simply does not exist for
anything else; but the ideal, possessing reality for something
else (different from itself), thereby materializes. Matter is
that which exists (perceives and is perceived) outwardly, in
the external (outside itself). In this sense, matter is
limited, finite. And the ideal is infinite precisely in the
sense of autoreflection (let us remember Cantor). And now let
us
turn to the mathematical aspect of the subject. We say that
the set Y contains information about the set X if there is a
structural homomorphism f: X→Y. In reality, the information is
the whole triple (X, Y, f), and it is contained in Y only
relatively. Actually, the information is contained in the
Galois correspondence G(f)[X, Y] describing this structure,
which forces us to consider the induced mappings of the
Booleans (the power sets) 2^X and 2^Y, and so on. But this, in
turn, loses all its sense without its own metatheory, etc.; we
obtain an infinite sequence of Booleans. The objective
realization of reflection demands the objective realization of
the Booleans of all orders, and in nature this metastructure
is present in complete form. The only way to define the
location of the information exactly is the requirement of
Gödel-type incompleteness for (X, Y, f): the statement of this
structure should be encoded in itself, and it must be an
integral part of the structure, not an amusing incident
arising after the fact of its construction. We suppose that
logic must develop in just this direction.
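For illustration only, here is a sketch of ours, restricted to
finite sets and with invented names: the triple (X, Y, f)
induces the direct-image and preimage maps between the
Booleans 2^X and 2^Y, and the Galois condition checked below
is exactly the correspondence in which, on our reading, the
information resides.

    import qualified Data.Set as Set
    import Data.Set (Set)

    -- Direct image: f[A] for A, a subset of X.
    image :: (Ord a, Ord b) => (a -> b) -> Set a -> Set b
    image f = Set.map f

    -- Preimage over a finite carrier X: the inverse image of B under f.
    preimage :: (Ord a, Ord b) => Set a -> (a -> b) -> Set b -> Set a
    preimage carrier f b = Set.filter (\x -> Set.member (f x) b) carrier

    -- The Galois condition: image f a <= b  iff  a <= preimage carrier f b.
    galois :: (Ord a, Ord b) => Set a -> (a -> b) -> Set a -> Set b -> Bool
    galois carrier f a b =
      (image f a `Set.isSubsetOf` b) == (a `Set.isSubsetOf` preimage carrier f b)

    main :: IO ()
    main = do
      let x = Set.fromList [0 .. 5 :: Int]
          f n = n `mod` 3                 -- the homomorphism f : X -> Y
          a = Set.fromList [1, 4]
          b = Set.fromList [1, 2]
      print (galois x f a b)              -- True: the correspondence holds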
8. THEORY of CATEGORIES.

Let us again formulate
briefly the requirements on the future development of
mathematics and logic which we have tried to ground here.
Virtually, it is one requirement: total relativity, brought to
the level of reference systems. Here the following moments of
relativity are the key ones: 1) self-relativity, meaning the
nonpredicativity of the formalism in essence; 2) relativity of
truth; 3) relativity of effective means; 4) relativity of the
individualization of the objects of cognition. But we should
like to present this future logic more concretely. From our
point of view, the theory of categories is a forerunner of the
new mathematics, the first attempt to build it. Why? The
theory of categories is an abstraction of purely functional
relations between objects. It considers not sets and mappings
themselves but their purely algebraic properties; hence its
nontrivial view of the individualization of elements. In fact,
the categorical abstraction embraces not only sets and
mappings but also any class of mathematical structures and
their homomorphisms. The central structure of the theory
becomes not the objects with their elements but the
homomorphisms of objects, the arrows. This leads to a
strengthening of the moment of relativity and to a loss of the
rigid individualization of objects typical of set theory and
algorithm theory. Any arrow is a representative of the class
of arrows isomorphic to it, the class formed by composing it
with iso-arrows. Isomorphic arrows of a category are
indiscernible with respect to their main properties.
But all this is not as simple as that. The main definitions of
category theory are constructed so that the selection of an
arrow from its isomorphism class depends on the analogous
selection from other classes. For instance, the limit of a
diagram is unique only up to isomorphism (yet there is exactly
one arrow through which a given cone over the diagram
factors). Thus, the individualization of certain arrows
strictly determines the individualization of others. This
happens because, in category theory, the algebraic approach
(an associative law of composition) is combined with the
combinatorial-geometrical one (the partiality of the
composition law; the presence of a left and a right unit for
each arrow).
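This combination can be made tangible in a few lines. The
sketch below is ours (Haskell's standard Control.Category
class has essentially the same shape): composition is
associative, every arrow has a left and a right unit, and the
partiality of composition is enforced by the typing of sources
and targets.

    import Prelude hiding (id, (.))

    -- The algebraic side: associative composition with a unit for every
    -- arrow. The geometrical side (partiality of composition) is enforced
    -- by the types: g . f exists only when the target of f is the source of g.
    class Category arr where
      id  :: arr a a
      (.) :: arr b c -> arr a b -> arr a c

    -- Ordinary functions form the prototypical category.
    newtype Fun a b = Fun (a -> b)

    instance Category Fun where
      id = Fun (\x -> x)
      Fun g . Fun f = Fun (\x -> g (f x))

    apply :: Fun a b -> a -> b
    apply (Fun f) = f

    main :: IO ()
    main = do
      let f = Fun (+ 1) :: Fun Int Int
      print (apply (id . f) 41)  -- 42: id is a left unit
      print (apply (f . id) 41)  -- 42: id is a right unit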
It is interesting as well that the categorical approach allows
the most complete formulation of nonstandard methods, as in
sheaf theory. Moreover, the concepts of limit, topos and sheaf
come down to different particular cases of one concept, that
of categorical adjunction. Adjunction is the most accurate
model of Hegel's "concept". Substantiation of this statement
would require too detailed a semantic analysis of Hegel's
works and would exceed the bounds of this article; therefore
we ask the reader, if he is interested, to settle the truth of
our statement on his own.
We stress only that the apparatus of arrows and functors
allows one to express formally the semantics of the
philosophical term "reflection", which has a purely
informational nature. For instance, such formations as the
slice category K↓a, the category of all arrows of K into some
object a of it, allow one to speak of peculiar "reference
systems" in logic. If K is a topos, then K↓a is a topos as
well and, just like K, has a subobject classifier and a logic
of its own.
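By way of illustration, here is a sketch of ours of such a
"reference system" in the category of finite sets, with
invented names: an object of the slice over a fixed base A is
a map f: X→A, and an arrow (X, f)→(Y, g) is a map h that
commutes over the base, g∘h = f.

    -- An object of the slice of Set over A is a finite carrier X with a
    -- map down to the base A; an arrow (X, f) -> (Y, g) is h : X -> Y
    -- commuting over the base: g (h x) = f x for all x in X.
    data Slice x a = Slice [x] (x -> a)

    isSliceArrow :: Eq a => Slice x a -> Slice y a -> (x -> y) -> Bool
    isSliceArrow (Slice xs f) (Slice _ g) h = all (\x -> g (h x) == f x) xs

    main :: IO ()
    main = do
      let obX = Slice [0 .. 5 :: Int] even      -- X = {0..5} over A = Bool, via "even"
          obY = Slice [0, 1 :: Int] (== 0)      -- Y = {0,1} over A = Bool
          h n = if even n then 0 else 1 :: Int  -- candidate arrow X -> Y
      print (isSliceArrow obX obY h)            -- True: h commutes over the base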
Thus, using the language of categories, one can create working
models of our main requirements on logic: nonpredicativity,
relativity, flexibility of individualization. But it cannot be
maintained that these are good models, and category theory
itself is in many respects equivalent to set theory. Yet
deeper principles, which may be developed in the future,
probably stand behind the construction of category theory.