[om] DefMP elements
Andreas Strotmann
Strotmann at rrz.uni-koeln.de
Mon Dec 8 19:43:19 CET 2003
Jacques Carette wrote:
>I find that Andreas Strotmann's email eloquently voices many concerns that
>are similar to mine. Thanks!
>
>
You're welcome :-)
> On the specific recommendations:
>
>I definitely agree with those first 3.
>
>As to the fourth:
>d) use signatures in definitions: this might prevent a definition from
>being misapplied by an application that doesn't know any better. This
>may well be done simply by using universal quantifiers and set
>memberships inside the defining FMPs, so that it could simply be added
>as a recommendation to CD writers.
>============
>Could you be more precise? I believe I agree, but perhaps an example would
>help a lot.
>
>My understanding is that you mean for the definition to explicitly
>mention the concepts that need to be defined for it to be meaningful.
>So a complex definition of exp would need the notions of complex
>numbers, series, and convergence over the complex numbers to exist.
>This would be a nasty-looking signature if expanded, but if it were all
>expressed in terms of CD references, it would probably be quite compact.
>
>
>On the other hand, aren't you saying that perhaps OMDoc should be the
>'main' method of communication, with OpenMath serving OMDoc in the same
>way that XML serves OpenMath?
>
>
This is all one big question, so let me try to answer it as such.
I think I understand what you're saying: to be really precise, one would
need to tag a definition with the concepts it requires in order to hold
true, but, as you say, that might very well require a system like OMDoc
to realize -- OpenMath as it stands is not really designed to handle that.
But then, a lot of applications of OpenMath would not be able to handle
that kind of meta-reasoning about their own capabilities anyway, which
is probably exactly the problem that James is trying to find a way to
solve (and I'm very sorry that I can't agree with the solution as-is).
In that sense, an exact specification of the concepts required feels
like yet another Bourbaki fallacy to me (to use your own terminology,
Jacques) -- like trying to find the most general set of circumstances
under which a concept is still recognizably the same.
I was thinking of something much simpler, really, and thus something a
bit more along James's lines, I hope. For example:
for all x in R, sin(x) := series(...)
for all z in C, sin(z) := <James's definition>
for all angle in Angles, sin(angle) := quotient of sides of appropriate right triangle
for all (A,B) in Segments x Segments, sin((A,B)) := ...
for all x in R, csc(x) := <quotient definition>
for all x in ]a,b[, csc(x) := series(...) <with appropriate real constants a and b>
for all z in C, csc(z) := <quotient definition>
for all F in Fields and for all y in F, csc(y) := <quotient definition>
for all R in Rings and for all y in R, csc(y) := series(...)
for all z in C, exp(z) := series(...)
As you can see, none of these claim to completely define the concepts of
sine or cosecant, respectively. All use simple type restrictions
(reals, ...) to delimit a domain on which they are guaranteed to work. It
would be reasonable to expect that fairly simple applications could
discover useful definitions by simple pattern-matching against a rough
characterization of their own capabilities (e.g. "use only definitions
that involve functions over the reals"). Some applications may even be
able to use the definitions that involve general rings and fields, but I
doubt that there would be very many of those around. All definitions,
however, are supposed to be guaranteed to be correct (though not
necessarily useful) on their stated domains -- there is no danger
(well: less danger ;-) of inappropriate over-generalization (unless
perhaps performed by some AI system, like Steve Linton et al.'s
mathematical discoverer, that explicitly tries to discover such valid
generalizations).
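Just to make the suggestion concrete: the guard in the first of these
definitions can be written with nothing beyond the existing quant1,
logic1, set1 and setname1 CDs. A rough, untested sketch follows -- the
series body itself is elided just as above and stood in for by a
placeholder variable of my own invention, and whether such an object
would live in an ordinary FMP or in James's proposed DefMP element is
exactly the question under discussion:

  <FMP>
    <OMOBJ>
      <OMBIND>
        <OMS cd="quant1" name="forall"/>
        <OMBVAR><OMV name="x"/></OMBVAR>
        <OMA>
          <OMS cd="logic1" name="implies"/>
          <OMA>
            <OMS cd="set1" name="in"/>
            <OMV name="x"/>
            <OMS cd="setname1" name="R"/>
          </OMA>
          <OMA>
            <OMS cd="relation1" name="eq"/>
            <OMA>
              <OMS cd="transc1" name="sin"/>
              <OMV name="x"/>
            </OMA>
            <!-- placeholder for the defining series, elided as above -->
            <OMV name="series_for_sin"/>
          </OMA>
        </OMA>
      </OMBIND>
    </OMOBJ>
  </FMP>

An application looking for "definitions that only involve functions over
the reals" merely has to pattern-match on the forall / in-R guard to
decide whether this particular definition is safe for it to use.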
Definitions like this one should also be allowed:
sin:function := the[f : C^2] (f'' = -f and f(0) = 0 and f'(0) = 1)  <!-- C^2 as in twice-differentiable functions -->
but the idea is that simple applications would usually do well to shy
away from such a definition, because it does not specify the kind of
signature they can reason with. James's idea of classifying definitions
might help put up warning signs on definitions like this one.
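(An aside, to spell out why this implicit description really does pick
out the sine function and nothing else: any C^2 solution of f'' = -f on
the reals is of the form

  f(x) = A sin(x) + B cos(x),

and f(0) = 0 forces B = 0 while f'(0) = 1 forces A = 1. So the
definition is perfectly sound -- it is just useless to an application
that cannot reason about differential equations, which is exactly where
such a warning sign would earn its keep.)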
What I'm trying to say, I suppose, is that there may be a reasonably
simple way to do what James wants to do (namely, to provide definitions
that applications might find useful) while still being honest about the
limitations of the information we provide. Beyond that, yes, I suppose
OMDoc or something like it would be more appropriate for carrying the
complex higher-order concepts needed to formulate definitions that are
as general as we can make them, but I doubt that there will be many
software systems able to make use of those -- and those that could would
profit from implementing full-blown OMDoc anyway.
-- Andreas
--