
[mizar] Weak types



Dear Freek,

I have changed the subject because it seems that the subject 'local DB' is overused. Still, I do not know if the new subject is right either, because this seems to have nothing to do with the 'weak types' of Fairouz.


Freek Wiedijk wrote:

I would like to be more precise about what we are talking about. It looks to me that the whole problem is that you (and Josef and maybe some others) want to be allowed to write

     for h being Isomorphism of G,H holds ...

instead of
     for h being Homomorphism of G,H st h is_an_isomorphism_of G,H holds ...

Or do you really want weak types?

It is very natural to have weak types.  Types often will be
"weak" when defined in the most natural way.  It is clumsy not
to allow that.
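
To make the contrast concrete: to avoid a weak type here one is forced into a permissive definition with an existence assumption, roughly like this (only a sketch, reusing the notation from your example; the existence condition is then exactly the assumption):

 :: a sketch only; the mode and predicate names follow the notation above
 definition let G,H be Group;
  assume
 A: ex h being Homomorphism of G,H st h is_an_isomorphism_of G,H;
  mode Isomorphism of G,H -> Homomorphism of G,H means
   it is_an_isomorphism_of G,H;
  existence by A;
 end;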

The controversy is well known in logic. E.g. Andrzej Grzegorczyk in his 'Logika matematyczna' argues (I do not have his book here, so take it cum grano salis) that logic should deal with formulas valid in non-empty domains. So, if we treat types as a means to restrict the domain, they should be non-empty.

And _I_ do not understand why one would _not_ allow weak
types.  It just would make CHECKER a bit more complicated,
right?  So why object to that?  Or is there something else
that I don't see that would be a difficulty?

If introducing weak types resulted in faster growth of the MML, I would be for introducing them. Sorry if I am blunt, but for me Mizar is not something to be appreciated for its own sake but a tool for developing the repository. I have tried some introspection, and I think the main reason I do not want weak types is that texts become so obscure when we use them.

For me the real difficulty is not with the CHECKER but rather with the REASONER. Even if I have no problem with
    for x being element of A holds ...
(I use 'element' for the weak type) when A is empty, I am rather reluctant to write
    let x be element of A;
in such a case. I am not certain that I understand the impact of weak types on natural deduction. Anyway, I do not like the idea of local constants with an empty type, and I definitely do not like the idea of ground terms that do not exist. The rule of dictum de omni is not valid anymore: to use
               for x being Theta holds P[x]
             ------------------------------------
                       P[a]
I have to prove 'ex x st x is Theta' or something like this. So what use do we have for universal sentences if we cannot use them :-)
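That is, with weak types the rule would have to carry the existence premise explicitly, something like

               for x being Theta holds P[x]        ex x st x is Theta
             ---------------------------------------------------------
                                    P[a]

for a term a of type Theta.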
Am I right?  The logic of weak types is so strange to me.

Of course people write better formalizations if you force them
to prove their existence correctness conditions, because that
way you force them to think hard about their defining
properties.  But that could be part of MML policy, right?  One
could decide not to accept articles in which modes are kept
"weak" for which it is obvious (to a human :-)) that they
actually never will be empty.

Also, it should be a policy to remove from articles theorems of the form
    for x being Theta holds ...
if Theta is empty (if we know that Theta is empty? or if we can prove it?). I am not very happy with, for example, Amokrane Saibi's formalization of category theory. I do not want what I wrote to be misconstrued: I appreciate his work (I was on his PhD committee).

Still, when we use 'Morphism of o1,o2' we should know that the corresponding hom-set is non-empty. When I look at the formalization of category theory in Coq I feel as if I were in a realm of ghosts: what is real, and when are we talking about nothing?
But of course this is just my feeling, or a prejudice.

Ah.  I understand, I think.  It would mean that "Element of"
really should be defined in the following way:

 definition let X be set;
  assume
 Z: X is non empty;
  mode Element of X -> set means
 :Def1: it in X;
  existence by Z,XBOOLE_0:def 1;
 end;

(The "X is non empty" here is a synonym of
"meaningfulness_of_Element_for X" :-))  That would follow the
scheme that you describe, right?  So would this definition be
as practical to use everywhere throughout the MML as the
one that's currently in MML?

It is the original definition. Afterwards we learned that breaking the permissiveness makes life much easier.
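
For comparison, the definition used now avoids the assumption by handling the empty case in the definiens itself; from memory it looks roughly like this (a sketch only, see SUBSET_1 for the exact formulation; the correctness proofs are omitted):

 definition let X be set;
  mode Element of X -> set means
   it in X if X is non empty otherwise it is empty;
  existence;   :: proof omitted in this sketch
  consistency; :: likewise omitted
 end;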

I think that the predicate 'meaningfulness_of_Element_for X' is rather
         ex IT being set st IT in X

But even this is crazy!  "Element of {}" is a perfectly
natural thing to write; there's no good reason to forbid it.
And it really should be the empty type: anything else is
silliness.

Natural for whom? I believe that a regular mathematician would say 'what are you talking about?'. It is of course a criticism of Mizar, too. The IMPS logic is closer to mathematical practice, but I believe they pay too much for it. Mathematicians quite often treat meaningless sentences as false (which may cause some problems with Mizar). So maybe
some mathematicians would claim that
        for x being element of {} holds x = x
is false, contrary to the logic.

So, I think a natural reaction to the sentence
       for h being Isomorphism of G,H holds ....
is rather:  how do you know that G,H are_isomorphic?
However, mathematical language is so sloppy; maybe it is assumed by default that G and H are isomorphic.
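In Mizar notation that implicit reading would be something like (a sketch only):
       for G,H being Group st G,H are_isomorphic
       for h being Isomorphism of G,H holds ....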

By the way, a question: I understand from the finite fields
example that you can "redefine" a "permissive" definition in a
way that it doesn't need an assumption anymore.  Do you do
this by repeating the "means" part, even if it actually stays
exactly the same?  (I experimented a bit, and that's how it
seems to work.)

You are right. The assumption is actually a part of the definiens, and so the definiens must be repeated. I am sorry for my mistake.

So in the case of "Element of" this redefinition would be:

 definition let X be non empty set;
  redefine mode Element of X means
 :Def2: it in X;
  compatibility by Def1;
 end;

Right again.
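
So with Def2 the defining property is available for a non empty set without any extra assumption to discharge; inside a proof one can simply write (a trivial sketch):

   let A be non empty set;
   let a be Element of A;
   thus a in A by Def2;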

Greetings,
Andrzej