On Some Properties of Entropy of Fuzzy Numbers
Author: Mamoni Dhar
Journal: International Journal of Intelligent Systems and Applications (IJISA)
In issue: Vol. 5, No. 3, 2013.
Here we first give a brief history of the development of fuzzy entropy. Then new measures for the entropy of fuzzy sets in continuous cases are introduced. Our main purpose in this article is to show that the entropy of a fuzzy number depends strongly on the selection of intervals. Another important observation from the cases discussed is that the entropy of triangular fuzzy numbers is the same for the same choice of interval length, whereas for non-triangular fuzzy numbers this property does not hold.
Normal Fuzzy Number, Fuzzy Distance, Left Reference Function, Right Reference Function, Glivenko-Cantelli Theorem
Short address: https://sciup.org/15010390
IDR: 15010390
Published Online February 2013 in MECS
I. Introduction

Entropy is a concept first used in the second law of thermodynamics. It measures the spontaneous dispersal of energy: how much energy is spread out in a process. It was introduced into communication theory by Shannon [1] following the rapid development of communication, and it is used to measure the efficiency of information transferred through a noisy communication channel.
The theory of fuzzy sets, first proposed by Zadeh [2], has gained considerable importance in various fields in recent times. Uncertainty exists in real systems. Fuzziness, a feature of uncertainty, arises from an element being or not being a member of a set. Probability has traditionally been used to model uncertainty. A measure of fuzziness widely used and cited in the literature is fuzzy entropy, also first mentioned by Zadeh [3]. The name entropy was chosen due to an intrinsic similarity with Shannon's entropy; however, the two functions measure fundamentally different types of uncertainty. Basically, Shannon's entropy measures the average uncertainty, in bits, associated with the prediction of outcomes in a random experiment. Fuzzy entropy, in contrast, measures the fuzziness of a fuzzy set, and thus has an important position in fuzzy systems such as fuzzy decision-making systems, fuzzy control systems, fuzzy neural network systems, and fuzzy management information systems. In other words, measuring the fuzziness of a fuzzy set is an important step in fuzzy systems. Further, entropy is a well known concept within physics, information theory, and fuzzy set theory; depending on its context, it is used to quantify the amount of disorder, of information, or of fuzziness, usually defined within either a statistical or a fuzzy framework. Entropy plays a very important role in the fuzzy literature.
Until now, the calculation of fuzzy entropy has been studied from different angles, which has led to several methods for computing it.
Zadeh suggested a definition of the entropy of a fuzzy set that takes both the probability distribution and the membership into consideration. It is defined as follows:
$$H(A) = -\sum_{i=1}^{n} \mu_A(x_i)\, p(x_i) \log \big( \mu_A(x_i)\, p(x_i) \big) \qquad (1)$$

where $A$ is the fuzzy set, $\mu_A(x_i)$ is the membership function of the element $x_i$, and $p(x_i)$ is its probability.
De Luca and Termini [4] suggested a measure of fuzziness of a fuzzy set, which they defined as follows. The entropy as a measure of fuzziness is

$$D(A) = H(A) + H(\bar{A}) \qquad (2)$$

with

$$H(A) = -K \sum_{i=1}^{n} \mu_A(x_i) \ln \mu_A(x_i) \qquad (3)$$

where $n$ is the number of elements in the support of $A$, $\bar{A}$ is the complement of $A$, and $K$ is a positive constant.
In the first stage, entropy was defined through distance: Kaufmann [5] proposed measuring the fuzzification of a fuzzy set by the fuzzy distance between the fuzzy set and its nearest non-fuzzy set, as

$$H(A) = -\frac{1}{\ln n} \sum_{i=1}^{n} \Phi_A(x_i) \ln \Phi_A(x_i) \qquad (4)$$

where

$$\Phi_A(x) = \frac{1}{n \ln 2} \sum_{i=1}^{n} S_n(\mu_A(x_i)) \qquad (5)$$

and $S_n(\mu_A(x))$ is Shannon's function

$$S_n(\mu_A(x)) = -\mu_A(x) \ln \mu_A(x) - (1 - \mu_A(x)) \ln (1 - \mu_A(x)) \qquad (6)$$
Another way of measuring entropy was proposed by Yager [6]: measuring the distance between the fuzzy set and its complement.
Yager defined a measure of fuzziness of A in the following manner:
$$f_p(A) = 1 - \frac{D_p(A, A^c)}{\|\mathrm{supp}(A)\|} \qquad (7)$$

where

$$D_p(A, A^c) = \left[ \sum_{i=1}^{n} \left| \mu_A(x_i) - \mu_{A^c}(x_i) \right|^p \right]^{1/p} \qquad (8)$$

and $A^c$ stands for the complement of the set $A$.
Another method is to use an entropy function to calculate the entropy. But in all these proposals we can see the use of the notion that fuzzy sets do not obey the laws of the excluded middle. This arose either from the initial definition of complementation of fuzzy sets,

$$\mu_{A^c}(x) = 1 - \mu_A(x),$$

where $A^c$ denotes the complement of the fuzzy set $A$, or from defining possibility over the same space on which probability is defined.
But here we are not in a position to use this definition of complementation of fuzzy sets, as there are shortcomings in it, as found by Baruah [7, 8, 9]; these have been discussed many times in our previous works [10, 11, 12, 13, 14, 15]. Further, the earlier proposals did not take into account the fact that a normal fuzzy number can be defined with the help of two distribution functions, from which entropy can be calculated. It can also be said that the links between probability and possibility established earlier are not free from defects.
Various methods have been developed from time to time, but here we present a different method for calculating the entropy of fuzzy sets, and we specify some of the properties associated with the new result.
In this article, we are interested in finding the entropy of fuzzy numbers with the help of the Randomness-Fuzziness Consistency Principle introduced by Baruah [16], which resulted from the application of superimposition of uniformly fuzzy intervals together with the Glivenko-Cantelli theorem on order statistics. This principle plays a key role in finding the entropy of fuzzy numbers, and hence, before proceeding further, we outline it briefly in the next section.
The paper is organized as follows: Section II considers the Randomness-Fuzziness Consistency principles which play an important role in this article. Section III provides the definition of entropy as proposed by Shannon. Section IV deals with some numerical examples of the process proposed here. Finally, Section V presents our conclusions.
II. Randomness-Fuzziness Consistency Principles
Baruah [16] introduced a framework for reasoning about the link between probability and possibility. The development of this principle focused mainly on the existence of two laws of randomness that are required to define a law of fuzziness. In other words, not one but two laws of randomness, defined on two disjoint spaces, are required to construct a fuzzy membership function. Fundamental to this approach is the idea that a possibility distribution can be viewed as a combination of two distributions, of which one is a probability distribution and the other is a complementary probability distribution.
The consistency principle introduced in this manner can be stated mathematically in the following form:
For a normal fuzzy number of the type N = [α, β, γ] with membership function

μN(x) = Ψ1(x), if α ≤ x ≤ β,
      = Ψ2(x), if β ≤ x ≤ γ,
      = 0, otherwise,

with Ψ1(α) = Ψ2(γ) = 0 and Ψ1(β) = Ψ2(β) = 1, the partial presence of a value x of the variable X in the interval [α, γ] is expressible as

μN(x) = θ Prob[α ≤ X ≤ x] + (1 - θ){1 - Prob[β ≤ X ≤ x]},   (9)

where θ = 1 if α ≤ x ≤ β and θ = 0 if β ≤ x ≤ γ.
Thus, from the above principle, it can be seen that a possibility space is a combination of two probability spaces: one to the left of the point of maximum membership and the other to its right.
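To make the decomposition concrete, the following minimal sketch (ours, not from the source; the function name and the choice of uniform laws of randomness are assumptions made purely for illustration) evaluates equation (9) for a triangular fuzzy number:

```python
def membership(x, alpha, beta, gamma):
    """Equation (9): membership built from a distribution function on
    [alpha, beta] and a complementary distribution function on [beta, gamma].
    Uniform laws of randomness are assumed here purely for illustration."""
    if alpha <= x <= beta:                         # theta = 1
        return (x - alpha) / (beta - alpha)        # Prob[alpha <= X <= x]
    if beta < x <= gamma:                          # theta = 0
        return 1 - (x - beta) / (gamma - beta)     # 1 - Prob[beta <= X <= x]
    return 0.0

# For N = [2, 3, 5] this reproduces the triangular membership used in Section IV:
print(membership(2.5, 2, 3, 5))  # 0.5, rising to the left of the peak
print(membership(4.0, 2, 3, 5))  # 0.5, falling to the right of the peak
```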
This principle provides us with a meaningful criterion for selecting an appropriate measure of entropy of fuzzy numbers. It is important to mention, however, that in finding the entropy of fuzzy numbers, besides this consistency principle between probability and possibility, Shannon's entropy index also plays a vital role. For this purpose, we briefly discuss Shannon's entropy in the next section.
III. Shannon’s Entropy
This section introduces the mathematical definition of entropy given by Shannon.
In probability theory, entropy is a well known concept. If $p$ is a probability distribution on $X = \{x_1, x_2, \ldots, x_n\}$, where $p_i$ is the probability of $x_i$, then the entropy of $p$ is defined as

$$H(p) = -\sum_{i=1}^{n} p_i \ln p_i$$
Other formalizations of entropy exist, but the above expression, called Shannon's entropy, is the most widely used. It gives the average amount of information obtained by observing a single output of the source: as its magnitude increases, more uncertainty, and thus more information, is associated with the source. It is well known that entropy measures the uncertainty associated with the probability distribution p. It is important to note here that Shannon's entropy is interval dependent, and since we obtain the entropy of fuzzy numbers with the help of Shannon's entropy, the resulting entropies of fuzzy numbers are also interval dependent.
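For instance (our own illustration, not from the paper), the entropy of a uniform distribution over n outcomes is ln n, a value that reappears in the examples of Section IV:

```python
import math

def shannon_entropy(probs):
    """H(p) = -sum p_i ln p_i, with the natural logarithm as used in this paper."""
    return -sum(p * math.log(p) for p in probs if p > 0)

print(shannon_entropy([0.2] * 5))    # ln 5 ~ 1.609437912
print(shannon_entropy([0.125] * 8))  # ln 8 ~ 2.079441542
```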
Accordingly, the left reference function of a normal fuzzy number, which is nothing but a distribution function, would lead to an entropy E1 (say). In a similar manner, the right reference function of the normal fuzzy number, which is nothing but a complementary distribution function, would lead to an entropy E2 (say). The pair (E1, E2) so found can rightly be called fuzzy entropy in the classical sense of defining Shannon's entropy for a discrete law of randomness. Discretizing a law of randomness for a continuous variable poses no great difficulty, and this in turn can be used to define the fuzzy entropy [E1, E2], where E1 and E2 are the Shannon entropies for the left reference function and the right reference function respectively. This was discussed in more detail in Dhar [17, 18, 19].
To illustrate the introduced concept, let us consider a very simple numerical example in the following section to make the matter clear and complete.
IV. Numerical Examples
In this section, we cite numerical examples to show how the entropy of fuzzy numbers can be calculated. For this, let us first consider the following triangular fuzzy number. Suppose A = [2, 3, 5] is a fuzzy number with membership function
$$\mu_A(x) = \begin{cases} x - 2, & 2 \le x \le 3 \\ \dfrac{5 - x}{2}, & 3 \le x \le 5 \\ 0, & \text{otherwise} \end{cases}$$
Then we have the following results. The entropy computation for the left reference function of the fuzzy number A is shown in Table 1.
Table 1: Shannon entropy for the left reference function

| x     | F(x) | p  | ln p         | p ln p       |
|-------|------|----|--------------|--------------|
| 2     | 0    | .2 | -1.609437912 | -0.321887582 |
| 2.2   | .2   | .2 | -1.609437912 | -0.321887582 |
| 2.4   | .4   | .2 | -1.609437912 | -0.321887582 |
| 2.6   | .6   | .2 | -1.609437912 | -0.321887582 |
| 2.8   | .8   | .2 | -1.609437912 | -0.321887582 |
| 3.0   | 1    |    |              |              |
| Total |      |    |              | -1.609437912 |
Then the entropy computation for the right reference function is:

Table 2: Shannon entropy for the right reference function

| x     | G(x) | 1-G(x) | p  | ln p         | p ln p       |
|-------|------|--------|----|--------------|--------------|
| 3     | 1    | 0      | .2 | -1.609437912 | -0.321887582 |
| 3.4   | .8   | .2     | .2 | -1.609437912 | -0.321887582 |
| 3.8   | .6   | .4     | .2 | -1.609437912 | -0.321887582 |
| 4.2   | .4   | .6     | .2 | -1.609437912 | -0.321887582 |
| 4.6   | .2   | .8     | .2 | -1.609437912 | -0.321887582 |
| 5.0   | 0    | 1      |    |              |              |
| Total |      |        |    |              | -1.609437912 |
Thus, with five equal intervals, the discretized points of x for the left reference function and the right reference function are

A = {(2, 0), (2.2, 0.2), (2.4, 0.4), (2.6, 0.6), (2.8, 0.8), (3, 1), (3.4, 0.8), (3.8, 0.6), (4.2, 0.4), (4.6, 0.2), (5, 0)}

Hence the Shannon entropy pair in this case is (1.609437912, 1.609437912).
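The computations in Tables 1 and 2 can be reproduced programmatically. The sketch below is our own illustration (the function name fuzzy_entropy is an assumption; shannon_entropy is the helper defined in Section III): it splits each side of the fuzzy number into n equal subintervals and applies Shannon's entropy to the increments of the left reference function F and of the complementary right reference function 1 - G.

```python
def fuzzy_entropy(F, G, a, b, c, n):
    """Entropy pair (E1, E2) of a normal fuzzy number [a, b, c].
    F: left reference function (a distribution function on [a, b])
    G: right reference function (a complementary distribution function on [b, c])
    n: number of equal subintervals on each side."""
    xs = [a + i * (b - a) / n for i in range(n + 1)]
    p_left = [F(xs[i + 1]) - F(xs[i]) for i in range(n)]               # increments of F
    ys = [b + i * (c - b) / n for i in range(n + 1)]
    p_right = [(1 - G(ys[i + 1])) - (1 - G(ys[i])) for i in range(n)]  # increments of 1 - G
    return shannon_entropy(p_left), shannon_entropy(p_right)

# A = [2, 3, 5] with five equal intervals, as in Tables 1 and 2:
print(fuzzy_entropy(lambda x: x - 2, lambda x: (5 - x) / 2, 2, 3, 5, n=5))
# (1.609437912..., 1.609437912...)
```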
With the above example, we draw attention to the fact that the entropy of a fuzzy number is interval dependent: it changes with the length of the interval chosen, so different interval choices yield different entropies for the same fuzzy number. This leads us to conclude that the entropy of a fuzzy number is interval dependent. Another noticeable property is that triangular fuzzy numbers yield the same entropy whenever the same interval length is chosen. In the following tables, we consider the same fuzzy number as above, but now with eight equal intervals, to illustrate that the entropy of a triangular fuzzy number changes when a different interval length is chosen.
Table 3: Shannon entropy for the left reference function (eight equal intervals)

| x     | F(x) | p    | ln p         | p ln p       |
|-------|------|------|--------------|--------------|
| 2     | 0    | .125 | -2.079441542 | -0.259930192 |
| 2.125 | .125 | .125 | -2.079441542 | -0.259930192 |
| 2.250 | .250 | .125 | -2.079441542 | -0.259930192 |
| 2.375 | .375 | .125 | -2.079441542 | -0.259930192 |
| 2.500 | .500 | .125 | -2.079441542 | -0.259930192 |
| 2.625 | .625 | .125 | -2.079441542 | -0.259930192 |
| 2.750 | .750 | .125 | -2.079441542 | -0.259930192 |
| 2.875 | .875 | .125 | -2.079441542 | -0.259930192 |
| 3.00  | 1    |      |              |              |
| Total |      |      |              | -2.079441542 |
Now we shall calculate the entropy for the right reference function
Table 4: Shannon entropy for the right reference function (eight equal intervals)

| x     | G(x) | 1-G(x) | p    | ln p         | p ln p       |
|-------|------|--------|------|--------------|--------------|
| 3     | 1    | 0      | .125 | -2.079441542 | -0.259930192 |
| 3.25  | .875 | .125   | .125 | -2.079441542 | -0.259930192 |
| 3.50  | .750 | .250   | .125 | -2.079441542 | -0.259930192 |
| 3.75  | .625 | .375   | .125 | -2.079441542 | -0.259930192 |
| 4.0   | .500 | .500   | .125 | -2.079441542 | -0.259930192 |
| 4.25  | .375 | .625   | .125 | -2.079441542 | -0.259930192 |
| 4.50  | .250 | .750   | .125 | -2.079441542 | -0.259930192 |
| 4.75  | .125 | .875   | .125 | -2.079441542 | -0.259930192 |
| 5.0   | 0    | 1      |      |              |              |
| Total |      |        |      |              | -2.079441542 |
Hence the required entropy in this case is (2.079441542, 2.079441542)
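Rerunning the sketch from the previous example with n = 8 reproduces this value:

```python
# Same fuzzy number A = [2, 3, 5], now with eight equal intervals (Tables 3 and 4):
print(fuzzy_entropy(lambda x: x - 2, lambda x: (5 - x) / 2, 2, 3, 5, n=8))
# (2.079441542..., 2.079441542...), i.e. (ln 8, ln 8)
```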
Thus, from the above example, it is clear that for a different choice of interval the entropy is different. Further, it can be concluded that the entropy of a fuzzy number always increases as the number of intervals increases.
Again, we claim that the entropy of all triangular fuzzy numbers is the same for the same choice of interval.
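This claim admits a short justification: the reference functions of any triangular fuzzy number are linear, so dividing each side into $n$ equal subintervals yields equal increments $p_i = 1/n$, and therefore

$$E_1 = E_2 = -\sum_{i=1}^{n} \frac{1}{n} \ln \frac{1}{n} = \ln n,$$

independent of the particular triangular fuzzy number chosen. This also explains why the entropy grows as the number of intervals increases.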
Suppose B = [-5, -2, 1] is a triangular fuzzy number defined by the membership function
$$\mu_B(x) = \begin{cases} \dfrac{x + 5}{3}, & -5 \le x \le -2 \\ \dfrac{-x + 1}{3}, & -2 \le x \le 1 \\ 0, & \text{otherwise} \end{cases}$$
Table 5: Shannon entropy for the left reference function

| x     | F(x) | p  | ln p         | p ln p       |
|-------|------|----|--------------|--------------|
| -5    | 0    | .2 | -1.609437912 | -0.321887582 |
| -4.4  | .2   | .2 | -1.609437912 | -0.321887582 |
| -3.8  | .4   | .2 | -1.609437912 | -0.321887582 |
| -3.2  | .6   | .2 | -1.609437912 | -0.321887582 |
| -2.6  | .8   | .2 | -1.609437912 | -0.321887582 |
| -2    | 1    |    |              |              |
| Total |      |    |              | -1.609437912 |
Table 6: Shannon entropy for the right reference function

| x     | G(x) | 1-G(x) | p  | ln p         | p ln p       |
|-------|------|--------|----|--------------|--------------|
| -2    | 1    | 0      | .2 | -1.609437912 | -0.321887582 |
| -1.4  | .8   | .2     | .2 | -1.609437912 | -0.321887582 |
| -.8   | .6   | .4     | .2 | -1.609437912 | -0.321887582 |
| -.2   | .4   | .6     | .2 | -1.609437912 | -0.321887582 |
| .4    | .2   | .8     | .2 | -1.609437912 | -0.321887582 |
| 1     | 0    | 1      |    |              |              |
| Total |      |        |    |              | -1.609437912 |
From the above examples, and from others we may consider, we can say that the entropy of triangular fuzzy numbers is the same for the same choice of interval length. But for non-triangular fuzzy numbers, the entropies are not the same for the same interval length, whereas choosing a different interval length gives a different entropy for the same fuzzy number.
We cite the following example for illustration purposes.
Let X= [4, 16, 25] be a fuzzy number with membership function defined as:
$$\mu_X(x) = \begin{cases} \dfrac{\sqrt{x} - 2}{2}, & 4 \le x \le 16 \\ 5 - \sqrt{x}, & 16 \le x \le 25 \\ 0, & \text{otherwise} \end{cases}$$
Here we find Shannon's entropy for the left reference function and for the right reference function of the above fuzzy number, which will make our proposed definition of entropy clearer.
Table 7: Shannon entropy for the left reference function

| x     | F(x)     | p        | ln p       | p ln p     |
|-------|----------|----------|------------|------------|
| 4     | 0        | .2649111 | -1.3283610 | -0.3518957 |
| 6.4   | .2649111 | .2183286 | -1.5217540 | -0.3322424 |
| 8.8   | .4832397 | .1900804 | -1.6603081 | -0.3155920 |
| 11.2  | .6733201 | .1705888 | -1.7684993 | -0.3016862 |
| 13.6  | .8439089 | .1560911 | -1.8573155 | -0.2899104 |
| 16    | 1        |          |            |            |
| Total |          |          |            | -1.5913267 |
Table 8: Shannon entropy for the right reference function

| x     | G(x)     | 1-G(x)   | p        | ln p       | p ln p     |
|-------|----------|----------|----------|------------|------------|
| 16    | 1        | 0        | .2190046 | -1.5186625 | -0.3325940 |
| 17.8  | .7809954 | .2190046 | .2081841 | -1.5693325 | -0.3267100 |
| 19.6  | .5728113 | .4271887 | .1988247 | -1.6153317 | -0.3211678 |
| 21.4  | .3739866 | .6260134 | .1906244 | -1.6574503 | -0.3159504 |
| 23.2  | .1833622 | .8166378 | .1833622 | -1.6962918 | -0.3110358 |
| 25    | 0        | 1        |          |            |            |
| Total |          |          |          |            | -1.6074580 |
Thus, with five equal intervals, the discretized points of x for the left and right reference functions are A = {(4, 0), (6.4, .26), (8.8, .48), (11.2, .67), (13.6, .84), (16, 1), (17.8, .78), (19.6, .57), (21.4, .37), (23.2, .18), (25.0, 0)}.
The pair of Shannon entropies here is found to be (1.591327, 1.607459).
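As a cross-check (again using the fuzzy_entropy sketch introduced earlier, which is our own illustration), the non-triangular reference functions give unequal entropies on the two sides:

```python
import math

# Non-triangular fuzzy number X = [4, 16, 25] from the example above:
E1, E2 = fuzzy_entropy(lambda x: (math.sqrt(x) - 2) / 2,  # left reference function
                       lambda x: 5 - math.sqrt(x),        # right reference function
                       4, 16, 25, n=5)
print(round(E1, 6), round(E2, 6))  # ~1.591327, ~1.607458 -- the two sides differ
```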
Let us consider another fuzzy number Y = [16, 25, 36], defined by the membership function
$$\mu_Y(x) = \begin{cases} \sqrt{x} - 4, & 16 \le x \le 25 \\ 6 - \sqrt{x}, & 25 \le x \le 36 \\ 0, & \text{otherwise} \end{cases}$$

According to the process discussed above, the entropy of this fuzzy number takes the following form:
Table 9: Shannon entropy for the left reference function

| x     | F(x)        | p           | ln p         | p ln p       |
|-------|-------------|-------------|--------------|--------------|
| 16    | 0           | 0.219004621 | -1.518662449 | -0.332594094 |
| 17.8  | 0.219004621 | 0.208184103 | -1.56933248  | -0.326710074 |
| 19.6  | 0.427188724 | 0.198824678 | -1.615331858 | -0.321167836 |
| 21.4  | 0.626013402 | 0.190624429 | -1.657450127 | -0.315950484 |
| 23.2  | 0.816637831 | 0.183362169 | -1.696292016 | -0.311035783 |
| 25    | 1           |             |              |              |
| Total |             |             |              | -1.607458271 |
Table 10: Shannon entropy for the right reference function

| x     | G(x)        | 1-G(x)      | p           | ln p         | p ln p       |
|-------|-------------|-------------|-------------|--------------|--------------|
| 25    | 1           | 0           | 0.215361925 | -1.535435294 | -0.3306743   |
| 27.2  | 0.784638075 | 0.215361925 | 0.20681476  | -1.575931766 | -0.325925933 |
| 29.4  | 0.577823315 | 0.422176685 | 0.199211044 | -1.613390494 | -0.321405204 |
| 31.6  | 0.378612271 | 0.621387729 | 0.192389013 | -1.648235847 | -0.317102467 |
| 33.8  | 0.186223258 | 0.813776742 | 0.186223258 | -1.680809013 | -0.31300573  |
| 36    | 0           | 1           |             |              |              |
| Total |             |             |             |              | -1.608113634 |
Here the required Shannon entropy pair is (1.607458271, 1.608113634).
Hence our claim: the entropy of triangular fuzzy numbers is the same for the same choice of interval length, while the entropies of non-triangular fuzzy numbers are not the same even for the same choice of interval length. This is one difference between triangular and non-triangular fuzzy numbers in the amount of entropy obtained.
V. Conclusions
In this article, our main contribution is a new definition of the entropy of fuzzy sets, together with the derivation of some of its properties. It is important to mention that the new measure of entropy of fuzzy sets, based on the Randomness-Fuzziness Consistency Principle, is introduced for continuous cases. It was observed that the entropy of fuzzy numbers is interval dependent: if we increase the number of intervals, the amount of entropy increases, and vice versa. It was also noticed that the entropy of all triangular fuzzy numbers remains the same for the same choice of intervals, whereas it differs for non-triangular fuzzy numbers.
References
- Shannon C.E., A Mathematical Theory of Communication, Bell System Technical Journal, 1948, 27: 379-423, 623-656.
- Zadeh L.A., Fuzzy Sets, Information and Control, 1965, 8: 338-353.
- Zadeh L.A., Fuzzy Sets as a Basis for a Theory of Possibility, Fuzzy Sets and Systems, 1978, 1: 3-28.
- De Luca A., Termini S., A Definition of a Non-probabilistic Entropy in the Setting of Fuzzy Set Theory, Information and Control, 1972, 20: 301-312.
- Kaufmann A., Introduction to the Theory of Fuzzy Subsets, Academic Press, New York, 1975.
- Yager R.R., A Procedure for Ordering Fuzzy Subsets of the Unit Interval, Information Sciences, 1981, 24: 143-161.
- Baruah H.K., Fuzzy Membership with respect to a Reference Function, Journal of the Assam Science Society, 1999, 40(3): 65-73.
- Baruah H.K., Towards Forming a Field of Fuzzy Sets, International Journal of Energy, Information and Communications, 2011, 2(1): 16-20.
- Baruah H.K., Theory of Fuzzy Sets: Beliefs and Realities, International Journal of Energy, Information and Communications, 2011, 2(2): 1-22.
- Dhar M., On Hwang and Yang's Definition of Entropy of Fuzzy Sets, International Journal of Latest Trends in Computing, 2011, 2(4): 496-497.
- Dhar M., On Separation Index of Fuzzy Sets, International Journal of Mathematical Archive, 2012, 3(3): 932-934.
- Dhar M., On Geometrical Representation of Fuzzy Numbers, International Journal of Energy, Information and Communications, 2012, 3(2): 29-34.
- Dhar M., On Fuzzy Measures of Symmetry Breaking of Conditions, Similarity and Comparisons: Non-statistical Information for the Single Patient, accepted for publication in International Journal of Mathematical Archive, 2012.
- Dhar M., A Note on Subsethood Measure of Fuzzy Sets, accepted for publication in International Journal of Energy, Information and Communications, 2012.
- Dhar M., A Note on Existing Definitions of Fuzzy Entropy, International Journal of Energy, Information and Communications, 2012, 3(1): 17-21.
- Baruah H.K., The Randomness-Fuzziness Consistency Principle, International Journal of Energy, Information and Communications, 2010, 1(1): 37-48.
- Dhar M., Probability-Possibility Transformation: A Brief Revisit, accepted for publication in AFMI Journal, Korea, 2012.
- Dhar M., A Note on Variable Transformations, accepted for publication in AFMI Journal, Korea, 2012.
- Dhar M., Probability-Possibility Transformations: An Overview, accepted for publication in IJMA Journal, India, 2012.