AN UNORTHODOX PARAMETRIC MEASURE OF INFORMATION AND CORRESPONDING MEASURE OF INTUITIONISTIC FUZZY INFORMATION

A new parametric function

v_a(P) = log(1 + a) − ∑_{i=1}^{n} p_i log(1 + a p_i), a > 0   (2.1.1)

is proposed for the probability distribution P = (p_1, p_2, p_3, ..., p_n), and its properties are studied. The proposed function is twice differentiable and is used to obtain the related measure of directed divergence, a measure of intuitionistic fuzzy entropy, and a measure of intuitionistic fuzzy directed divergence. We also investigate the monotonic character of the proposed function.


Introduction
In the present paper we draw our inspiration for obtaining a new parametric measure of entropy, which is the joint effect of the measures of information due to Kapur [6] and Burg [2]. In 1948, C.E. Shannon [8] gave the measure

H(P) = − ∑_{i=1}^{n} p_i log p_i   (1.1)

to measure the uncertainty, or entropy, of a probability distribution P = (p_1, p_2, ..., p_n). It can also be regarded as a measure of the equality of p_1, p_2, p_3, ..., p_n among themselves.
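For concreteness, Shannon's measure is easy to evaluate numerically. The following minimal Python sketch (the function name is ours, purely for illustration) computes H(P) in natural logarithms:

    import math

    def shannon_entropy(p):
        # Shannon's measure (1.1): H(P) = -sum p_i log p_i, in nats
        return -sum(pi * math.log(pi) for pi in p if pi > 0)

    print(shannon_entropy([0.25] * 4))            # log 4 = 1.3863..., the maximum for n = 4
    print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))  # 0.0 for a degenerate distribution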
Later, in 1972, J.P. Burg [2] gave the measure

B(P) = ∑_{i=1}^{n} log p_i,   (1.2)

and J.N. Kapur [6] gave a related one-parameter measure. Shannon's and Burg's measures do not have any parameter, while Kapur's measure has one parameter. When maximized by Lagrange's method, subject to linear constraints on the probabilities, the measures due to Shannon, Burg, and Kapur always give non-negative probabilities. Shannon's measure has been the most successful and most widely used measure. Burg's measure has also been successful, but it is always negative and as such is hard to interpret as a measure of uncertainty. It can, however, be used for entropy-maximization purposes, and it has been so used. Moreover, its maximum value decreases with n, which is not a desirable property for a measure of entropy.
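The undesirable behaviour of Burg's maximum is easy to see numerically. A small Python sketch of our own (not from the paper) evaluates B(P) at the uniform distribution, where it attains its maximum −n log n, for increasing n:

    import math

    def burg_entropy(p):
        # Burg's measure (1.2): B(P) = sum log p_i; always negative on the simplex interior
        return sum(math.log(pi) for pi in p)

    # The maximum value -n log n decreases as n grows:
    for n in (2, 4, 8):
        print(n, burg_entropy([1.0 / n] * n))   # -1.39, -5.55, -16.64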
In the present discussion, we modify Burg's and Kapur's measures to obtain a new parametric measure of entropy. We study the properties of this measure and also investigate the related directed divergence, motivated by Kullback and Leibler [7]. We then introduce the corresponding measure of intuitionistic fuzzy entropy and the corresponding measure of intuitionistic fuzzy directed divergence.
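For reference, the Kullback-Leibler directed divergence that motivates the construction is D(P‖Q) = ∑_{i=1}^{n} p_i log(p_i / q_i). A minimal sketch:

    import math

    def kl_divergence(p, q):
        # Kullback-Leibler directed divergence D(P||Q) = sum p_i log(p_i / q_i)
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # 0.51... > 0
    print(kl_divergence([0.5, 0.5], [0.5, 0.5]))  # 0.0; the divergence vanishes iff P = Q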
Here we define an intuitionistic fuzzy set as given by Atanassov (1983) [1]: an intuitionistic fuzzy set A on a universe X is A = {⟨x, μ_A(x), ν_A(x)⟩ : x ∈ X}, where μ_A(x) and ν_A(x) are the degrees of membership and non-membership of x in A, with 0 ≤ μ_A(x) + ν_A(x) ≤ 1. We then discuss the conditions a measure of intuitionistic fuzzy entropy should satisfy; the last two of these, illustrated numerically below, are:
5. It should be an increasing function of μ_A(x) in the range 0 ≤ μ_A(x) ≤ 0.5 and a decreasing function of μ_A(x) in the range 0.5 ≤ μ_A(x) ≤ 1.
6. It should be a concave function of μ_A(x).
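To make conditions 5 and 6 concrete, the following sketch checks both numerically for the classical De Luca-Termini-type per-element entropy h(μ) = −μ log μ − (1 − μ) log(1 − μ); this familiar function is used here only as a stand-in, and is not the paper's measure (2.2.1):

    import math

    def h(mu):
        # classical per-element fuzzy entropy, a stand-in example only
        return -sum(t * math.log(t) for t in (mu, 1.0 - mu) if t > 0)

    grid = [i / 100 for i in range(101)]
    vals = [h(m) for m in grid]
    increasing = all(vals[i] <= vals[i + 1] for i in range(50))       # condition 5 on [0, 0.5]
    decreasing = all(vals[i] >= vals[i + 1] for i in range(50, 100))  # condition 5 on [0.5, 1]
    concave = all(vals[i - 1] - 2 * vals[i] + vals[i + 1] <= 0
                  for i in range(1, 100))                             # condition 6
    print(increasing, decreasing, concave)                            # True True True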

Some Properties of the New Measures of Information
The measure is defined by (2.1.1). It has the following properties:
1. It is a continuous function of p_1, p_2, p_3, ..., p_n, so that it changes only by a small amount when the probabilities change by small amounts (2.1.8).
Differentiating with respect to the parameter a,

∂v_a/∂a = 1/(1 + a) − ∑_{i=1}^{n} p_i^2/(1 + a p_i) ≥ 0,   (2.1.10)

since p_i/(1 + a p_i) ≤ 1/(1 + a) for each i, so that v_a(P) is monotonically increasing in a. Thus v_a(P) satisfies all the important properties satisfied by Shannon's measure of entropy except additivity and recursivity. However, these properties are unimportant for entropy-maximization purposes, and hence v_a(P) is an entropy.
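Since (2.1.1) is reconstructed above from a damaged source, the following Python sketch should be read under that assumption. It evaluates v_a(P) = log(1 + a) − ∑ p_i log(1 + a p_i) and illustrates the non-negativity, the maximum at the uniform distribution, and the monotonic increase in a:

    import math

    def v(p, a):
        # reconstructed measure (2.1.1): v_a(P) = log(1+a) - sum p_i log(1 + a p_i)
        return math.log(1 + a) - sum(pi * math.log(1 + a * pi) for pi in p)

    a = 2.0
    print(v([1.0, 0.0, 0.0], a))     # 0.0 for a degenerate distribution
    print(v([0.7, 0.2, 0.1], a))     # 0.40... positive for non-degenerate P
    print(v([1/3, 1/3, 1/3], a))     # 0.58... maximal at the uniform distribution
    print(v([0.7, 0.2, 0.1], 1.0))   # 0.27... smaller than at a = 2: v_a increases with a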

Corresponding Measure of Intuitionistic Fuzzy Information
The measure (2.2.1) has the following properties: in particular, it is a concave function of μ_A(x). Since all the conditions for a measure of intuitionistic fuzzy entropy are satisfied, (2.2.1) is a valid measure of intuitionistic fuzzy entropy.
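The exact form of (2.2.1) is not legible in the source. Purely as an illustrative sketch, assume the natural intuitionistic fuzzy analogue of (2.1.1), applied to the membership degree μ_A(x), the non-membership degree ν_A(x), and the hesitancy π_A(x) = 1 − μ_A(x) − ν_A(x); this exact form is our assumption, not necessarily the author's:

    import math

    def if_entropy(mus, nus, a):
        # hypothetical IF analogue of (2.1.1), summed over the elements x_i:
        # log(1+a) - mu log(1+a mu) - nu log(1+a nu) - hes log(1+a hes),
        # where hes = 1 - mu - nu is the hesitancy degree
        total = 0.0
        for mu, nu in zip(mus, nus):
            hes = 1.0 - mu - nu
            total += math.log(1 + a) - sum(
                t * math.log(1 + a * t) for t in (mu, nu, hes) if t > 0)
        return total

    a = 2.0
    print(if_entropy([1.0], [0.0], a))   # 0.0 for a crisp element
    print(if_entropy([0.5], [0.5], a))   # 0.40... positive when mu = nu = 0.5
    print(if_entropy([1/3], [1/3], a))   # 0.58... maximal when mu = nu = hesitancy = 1/3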