
warm fuzzies

As I was saying about doing our homework... The discussion of
fuzziness seems to have involved about three and a half different
notions, which might be related but need not be -- and whose connections
have mainly just been assumed in the discussion.  (Although McCawley does not do
as good a job as he might at making the relevant distinctions in the
obvious section, he does so elsewhere.)

First, there is fuzzy set theory, which starts from ordinary set theory by
replacing the usual characteristic functions of sets, functions from
objects into {0,1}, with functions into [0,1], the closed interval of the
reals. Thus, rather than an object just being a member (1) of a set or not
(0), it might be said to be a member to any degree within the range of the
function.  This is a device to try to deal with vague concepts, where
things might be more or less like clear-cut (typical?) members of a set
(holders of a property).  (I will pass over -- mainly because the data is
so inconclusive -- whether this is either a reasonable or an accurate way
to deal with these concepts, indeed, whether there is a single -- or
finitely partitioned -- notion of vagueness to which this approach might
apply.)  From this system three formats follow: one that simply states the
function values, "x is F to degree 0.n", and two that make use of modifier
notions, "very", "somewhat", and the like (these for adjectival F in
English; nominal Fs require different vocabulary that can -- it is said --
be mapped onto this).  In one of these formats, "very F" refers to a sort
of subset of F, one with a characteristic function that assigns non-0
values only to things that the characteristic function of F does, but generally
assigns lower values to objects and, in particular, assigns 1s only to some
of the things to which the F function assigns 1.  "Somewhat F" is rather
the reverse, giving non-0 values perhaps to some things to which F gives
0, but within F giving typically higher values than F does.  The exact
mathematical relations between these various modified F functions and the
base one were once a favorite game.  The second modifier version takes off
from the first format: "x is very F" means "x is F to a degree falling
in the 'very' range."  That is, the [0,1] interval is divided into a
number of subintervals, labeled, say, "scarcely," "slightly," "somewhat,"
"moderately,"  "clearly," "very," and "absolutely" (not seriously
proposed).  These subintervals are themselves typically taken as fuzzy
sets, so that one can with almost equal justification assign a borderline
case to either of two adjoining classes.  Mathematicians tend to feel most
comfortable with the straight functional form; linguists developed the
first modifier form most fully; engineers have worked mainly with the
third, realized by analog devices or analogization (fuzzily, of course)
of binary data.
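
For the programmatically inclined, the first modifier format can be
sketched in a few lines of Python, using Zadeh's classic concentration
and dilation operators for "very" and "somewhat."  The shape of the
"tall" function below is invented purely for illustration, not a claim
about English:

```python
# Fuzzy membership with Zadeh-style hedges.  The membership function
# for "tall" is an assumed shape, chosen only to make the arithmetic visible.

def tall(height_cm):
    """Characteristic function into [0,1]: 0 below 160 cm, 1 above 190 cm,
    linear in between (an invented shape, for illustration only)."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30

def very(f):
    """Zadeh's concentration: squaring lowers every value strictly between
    0 and 1, keeps the 0s and 1s -- so "very F" gets 1 only where F does."""
    return lambda x: f(x) ** 2

def somewhat(f):
    """Zadeh's dilation: the square root raises every value strictly
    between 0 and 1, so "somewhat F" holds to a higher degree than F."""
    return lambda x: f(x) ** 0.5

h = 175  # membership 0.5 in "tall"
print(tall(h))            # 0.5
print(very(tall)(h))      # 0.25 -- "very tall" holds to a lesser degree
print(somewhat(tall)(h))  # ~0.707 -- "somewhat tall" holds to a greater degree
```

Note that dilation never assigns non-0 values to things F gives 0 to;
capturing that further feature of "somewhat" would need a different
operator, which is part of what made the exact mathematical relations a
favorite game.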

Then there is fuzzy logic.  This is basically a logic that assigns truth
values from [0,1] rather than {0,1} (or {T,F}), with some restrictions on
how to compute the values of compound wffs (in particular, restrictions
that distinguish this system from probability-valued logics, which start
with the same sort of assignments and go through parallel developments).
McCawley is rather good on the problems of building such computations
in a way that gives a reasonable logic.  Two relations with fuzzy set
theory suggest themselves immediately: we could take the truth value of
each atomic sentence, Fa, as just the value of the characteristic function
of F at a or we could take the truth value assignments as the values of
the characteristic function of the fuzzy predicate on sentences, "is true."
These are two independent ideas, and fuzzy logic can use either, both, or
neither (the most complex form I know of offhand is the one that has an
independent fuzzy truth-value assignment, so that the claim that a is F to
degree 0.n itself has some value between 0 and 1 inclusive -- hopefully with
some consistency requirement that successive suggested degrees of a as
an F get progressively higher truth values up to a maximum and then
fade smoothly away).  The "very true," etc. truth values seem to come
from the fuzzy partition of the range of the characteristic function
version of the second approach, probably combined with the first. But I
did see at least one comment that suggested the related sets notion of
various levels of truth.
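
The standard (Zadeh) computations for compound wffs can be sketched
briefly; the min/max/complement choices below are the usual ones, and the
comments show where a probability-valued logic, starting from the same
assignments, would diverge:

```python
# Zadeh-style truth-value computations for compound wffs, values in [0,1].
# The particular atomic values (0.7, 0.4) are invented for illustration.

def f_not(p):
    return 1.0 - p

def f_and(p, q):
    return min(p, q)   # a probability logic would use p*q (for independence)

def f_or(p, q):
    return max(p, q)   # a probability logic would use p + q - p*q

Fa, Gb = 0.7, 0.4      # atomic sentences valued by characteristic functions
print(f_and(Fa, Gb))        # 0.4
print(f_or(Fa, f_not(Fa)))  # 0.7 -- excluded middle does not reach 1
```

The last line is one of the problems McCawley worries about: with min/max,
"Fa or not-Fa" comes out only as true as the more true disjunct, so
classical tautologies hold only to a degree.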

Finally into this mix is added scalarity.  The old philosophical histories
of science talked about the development of theoretical concepts from the
qualitative to the comparative to the scalar to the quantitative: this is
hot, this is hotter than that, this is of the third degree of hotness,
this is 250 degrees centigrade -- the classic example (one of the few that
actually shows all stages in extensive treatment).  The move from
comparative to scalar is actually rather complex.  It involves first a
number of comparisons, to get a variety of phenomena lined up along a
single dimension.  Then this ordering, assuming that it is about a
subjective factor like hotness, must be checked for intersubjective
agreement: do others order the phenomena the same way?  Then some of the
phenomena in the
ordering are taken as exemplars of certain regions of the ordering --
again with an intersubjective check.  Finally (though not essentially in
itself but only as a step toward quantitative concepts) the regions or
their exemplars are numbered in order.  The step to quantitative concepts
requires that there be an objective correlate of this subjective ordering
that provides (typically) continuous values.  That objective version then
gets assigned real numbers, typically with important exemplars in the old
scalar form getting important numbers (I forget why Fahrenheit screwed up
so badly).
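
The progression can be sketched in code, using temperature as the classic
example; the particular samples, cutoffs, and exemplar choices below are
invented for illustration:

```python
# A sketch of the comparative -> scalar -> quantitative progression for
# "hotness."  All sample values and region cutoffs are invented.
import bisect

# Comparative stage: pairwise comparisons yield an ordering of phenomena
# (here we cheat and sort by the measured temperature directly).
samples = {"ice water": 0, "tap water": 15, "bath water": 40, "boiling water": 100}
ordered = sorted(samples, key=samples.get)

# Scalar stage: exemplars mark off numbered regions of the ordering.
cutoffs = [10, 30, 60]  # assumed boundaries between degrees 1..4

def degree_of_hotness(temp_c):
    """Assign a phenomenon to a numbered region of the scalar ordering."""
    return 1 + bisect.bisect(cutoffs, temp_c)

# Quantitative stage: the objective correlate (degrees centigrade) replaces
# the discrete scale, with the old exemplars landing on salient numbers
# (freezing at 0, boiling at 100).
for name in ordered:
    print(name, "-> degree", degree_of_hotness(samples[name]))
```

The intersubjective checks, of course, are exactly what this sketch
cannot show: they happen between speakers, not inside the computation.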

Again, there are some natural and some apparent relations between
scalar properties and fuzziness.  It is reasonable, for example, to take the
degrees on the scale as fuzzy sets, with the exemplars getting 1s and
other things getting less as they move away from that central value (this
is before the scale gets reinterpreted in the quantitative form, as the
old 1-10 chili scale got reworked as a partition of the range of Scoville
units or the Beaufort scale in terms of wind velocity).  What will not work,
however, is to take the scale as an integer-ation of the fuzzy values or
characteristic function.  Typically, something has to get pretty close to 1
on the characteristic function of a property to even get into the running
on the scale at level 1 and most of the items on the scale are uniformly 1
on the characteristic function.  Nothing that is not hot, plain and
simple, is going to be called hot degree 2 or greater, say.  Even "very
hot" or "absolutely hot" is going to have its 1 cut in well before, say,
molten steel comes into the picture.  That is, the two patterns do not fit
well together
and, indeed, belong to different conceptual types.
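
The relation that does work -- degrees on the scale as fuzzy sets peaked
at their exemplars -- can be sketched as follows; the exemplar
temperatures and widths are invented, and the triangular shape is just
the simplest choice:

```python
# Sketch: each degree of a pre-quantitative "hotness" scale as a fuzzy set
# over temperature, peaked at an exemplar.  All numbers are invented.

def triangular(exemplar, width):
    """Membership 1 at the exemplar, falling linearly to 0 at +/- width."""
    return lambda t: max(0.0, 1 - abs(t - exemplar) / width)

# Adjacent degrees overlap, so a borderline case belongs almost equally
# to either of two adjoining classes.
degree = {1: triangular(40, 15), 2: triangular(55, 15), 3: triangular(70, 15)}

t = 47.5  # halfway between the exemplars of degrees 1 and 2
print(degree[1](t), degree[2](t))  # equal membership in adjoining degrees
```

Note that these are fuzzy sets over the underlying dimension, not a
discretization of the membership function for "hot" itself -- which is
exactly the mismatch described above.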

I think that there is at least occasional use for each of these various
systems in any advanced language.  I also think that Lojban is pretty well
equipped to make those uses when called upon and to do so in distinctive
ways.  It is also in pretty good shape for doing all the corresponding
things for probabilities and, should the need arise, for other such
systems as may be presented.  I welcome various expositions about how
Lojban might do these things with the tools at hand and tend to be a bit
put out with those who say we cannot do some of these things or that we
should do more of one, especially before the proclaimer gives it a good
try.