
1
BITTOR: A New Foundation for Logical & Numeric Uncertainty (Technical Overview)
Markov Monoids as a Mathematical Foundation for a New Theory of Logical & Numerical Uncertainty, Replacing Current Computer Logic & Computation Infrastructures
January 4, 2006
Joseph E. Johnson, Ph.D., Professor of Physics, University of South Carolina, Columbia SC, 29208
jjohnson@sc.edu, 803.777.6431

2
Discussion I: Background: It is well known that computers are too exacting and do not perform 'estimates' or 'approximations' unless explicitly programmed. A computer could not order food for a party without knowing how many people would come or how much they would eat, yet a person can easily operate with such extremely limited information. Many scientists also recognize that all observations are estimates and, furthermore, that most intelligent decision-making consists of making optimal decisions from very limited knowledge. A good example is the field of pharmacy and medicine, where drug dosages, risk analyses, and procedures carry large errors. Likewise one thinks of the complex error analysis in engineering and architecture, with unknown strengths of materials, temperatures, loads, and stresses, and of the business investment and financial domains with their uncertain currencies, interest rates, supply, and demand. It is also known that while mathematics has developed a rigorous system of integer, rational, real, and complex numbers, the current practice of truncating real numbers is not a rigorous method of managing uncertainty; thus the entire field of statistics is overlaid on traditional mathematics to manage these uncertainties. Finally, one notes that all values in the sciences and engineering (length, time, mass, ...) are assumed to be 'real' numbers, in spite of the fact that measuring such a number exactly would take infinite effort, carry an infinite amount of information, and violate quantum theory, and is thus impossible. Real numbers cannot truly represent the results of observations.

3
Discussion II: Our line of thought: A natural (and almost mandatory) line of thought is to generalize the fundamental piece of information, the bit with values 1 and 0, to continuous probability values. But how? My work with continuous Markov transformations led me to realize that the values (the representation space) (x_1, x_0) upon which they act could provide the natural generalization of 1 and 0, with '1' = (1, 0) and '0' = (0, 1), thus allowing all intermediate values when x_1 and x_0 range over [0, 1] with x_1 + x_0 = 1. This occurred to me because the (linear) Markov transformations preserve both the sum of the values (i.e. x_1 + x_0 = 1) and their non-negativity, thus allowing the values to be interpreted as probabilities.

4
Discussion III - Postulates. Postulate 1: The bittor is the fundamental entity. I postulate that the fundamental entity of information is the bit vector (x_1, x_0), the 'bittor' (i.e. the pair of numbers), which is formally the representation space of the Markov Lie monoid (part of the general linear group of continuous real transformations), with x_1 + x_0 = 1 and each x_i non-negative. This allows continuous values between 1 and 0 that are interpreted as the probability x_1 that '1' is the correct value and the probability x_0 that the value is '0'. Thus '1' = (1, 0), '0' = (0, 1), and (1/2, 1/2) represents equal probability for a 1 or a 0 and thus zero information. These bittors become the fundamental objects of the new mathematics proposed below.
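A minimal sketch of Postulate 1 in Python (not part of the original proposal; the names are illustrative): a bittor is a pair of non-negative components summing to unity.

```python
# A bittor is a pair (x1, x0) of non-negative probabilities with x1 + x0 = 1.

def is_bittor(x, tol=1e-9):
    """Check the two bittor conditions: non-negativity and unit sum."""
    x1, x0 = x
    return x1 >= 0 and x0 >= 0 and abs((x1 + x0) - 1.0) <= tol

ONE  = (1.0, 0.0)   # the classical bit '1'
ZERO = (0.0, 1.0)   # the classical bit '0'
HALF = (0.5, 0.5)   # equal probability of 1 or 0: zero information

assert all(is_bittor(b) for b in (ONE, ZERO, HALF))
assert not is_bittor((0.7, 0.7))   # does not sum to 1
```

The classical bits appear as the two exact corner points of the continuous range of bittors.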

5
Discussion III - Postulates. Postulate 2: Definition of logic. The next most foundational concept is that the fundamental entities of a system must have rules of combination, and that these rules must generalize Boolean logic (the combinational product rules for AND, OR, NOR, NOT, etc.). Since the components of x represent probabilities, and since independent probabilities multiply, the rule must be a product z = xy such that if x and y are bittors then z is also a bittor (components non-negative and summing to unity). We postulate z_i = c^(α)_ijk x_j y_k, generalizing the normal logic (computer) operations of AND, OR, NOR, NAND, etc., where α = 1, 2, ..., 16 labels the 16 different ways of partitioning the four products x_j y_k between the two components (e.g. for AND we have z_1 = x_1 y_1 and z_0 = x_1 y_0 + x_0 y_1 + x_0 y_0). The unary NOT operation reverses (x_1, x_0) into (x_0, x_1), with an obvious symmetric off-diagonal matrix form.

6
Discussion III - Postulates. Postulate 2a: Linear combination. A second operation, similar to a sum, is defined as the weighted linear combination x = a_1 x_1 + a_2 x_2 + ... + a_n x_n, where the x_i are different bittors and the a_i form an n-dimensional Markov Lie monoid representation (i.e. they sum to unity and are non-negative, and are thus themselves a larger-dimensional bittor). This operation gives a weighted linear combination of bittors, by bittors. The new mathematics thus has 16 independent products and one method of linear 'addition' (combination). Note that the bittor objects close under these two operations.
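The weighted combination can be sketched as follows (an illustrative Python sketch; bittors are tuples (x1, x0) and the weights are required to form a higher-dimensional bittor):

```python
def combine(weights, bittors):
    """Weighted linear combination of bittors; weights must be
    non-negative and sum to 1, so the result is again a bittor."""
    assert all(a >= 0 for a in weights)
    assert abs(sum(weights) - 1.0) < 1e-9
    x1 = sum(a * x[0] for a, x in zip(weights, bittors))
    x0 = sum(a * x[1] for a, x in zip(weights, bittors))
    return (x1, x0)

# Three estimates of the same bit, weighted 0.5 / 0.3 / 0.2:
r = combine([0.5, 0.3, 0.2], [(1.0, 0.0), (0.9, 0.1), (0.5, 0.5)])
print(r)   # (0.87, 0.13) up to floating-point rounding
```

Closure follows because each component of the result is a convex combination of non-negative values, and the two components still sum to unity.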

7
Discussion III - Postulates. Postulate 3: Bittor numbers. Just as binary reals are defined as a sequence of binary digits with a radix point (e.g. 1101.101), we now define a bittor number to be the outer product of several two-dimensional Markov monoid representations, (x_1, x_0) ⊗ (y_1, y_0) ⊗ .... Thus a number is an outer product of Markov monoid representations and is therefore itself such a representation, in the product space of Markov transformations. One need only write the upper of the two values for each digit. Also, since only limited accuracy is needed for the 'error', each probability can be stored as a short binary value such as 01101. A bittor number can thus take an abbreviated form, more succinctly 110.1(01101), where (1) would mean an x_1 value that rounds to 100000.
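A bittor number can be sketched as a list of digit bittors (an illustrative Python sketch, not the slides' notation); rounding each digit recovers the most probable classical binary string:

```python
def round_bittor_number(digits):
    """Collapse a bittor number (list of (x1, x0) digit pairs) to its
    most probable classical binary string by rounding each digit."""
    return "".join("1" if x1 >= 0.5 else "0" for (x1, x0) in digits)

# Four uncertain digits, most likely reading '1101':
num = [(0.9, 0.1), (0.8, 0.2), (0.3, 0.7), (1.0, 0.0)]
print(round_bittor_number(num))   # "1101"
```

The full bittor number retains the per-digit probabilities that this rounding discards.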

8
Discussion III - Postulates. Postulate 4: Bittor arithmetic. Arithmetic is defined with bittors in exactly the same schema as with binary numbers. For example, in adding two bits: 1+0 = 0+1 = 1 and 1+1 = 0+0 = 0 (with a carry of 1 in the case 1+1). The sum digit is the XOR (exclusive OR) of the bittors: z_1 = x_1 y_0 + x_0 y_1, showing that the probability of getting a one is the probability that one value is 1 and the other is 0. Likewise, z_0 = x_1 y_1 + x_0 y_0 gives the probability of getting a 0 upon addition of the two bittors. The other part of addition is the carry, computed as z_1 = x_1 y_1, i.e. the AND operation, since both values must be 1 in order to have a carry digit.
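The digit-plus-carry rule above is exactly a half-adder; a minimal Python sketch (function names are mine):

```python
def b_xor(x, y):
    # Sum digit: probability that exactly one input is 1.
    z1 = x[0] * y[1] + x[1] * y[0]
    return (z1, 1.0 - z1)

def b_and(x, y):
    # Carry digit: probability that both inputs are 1.
    z1 = x[0] * y[0]
    return (z1, 1.0 - z1)

def half_add(x, y):
    """Add two bittor digits, returning (sum, carry)."""
    return b_xor(x, y), b_and(x, y)

s, c = half_add((1.0, 0.0), (1.0, 0.0))   # 1 + 1
print(s, c)   # sum (0.0, 1.0) = '0', carry (1.0, 0.0) = '1'
```

Chaining half-adders with a carry input in the usual ripple fashion would extend this to multi-digit bittor numbers, as in ordinary binary arithmetic.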

9
Discussion III - Postulates. Postulate 5: Information defined. Shannon's definition assigns an information value of 1 bit to a well-defined binary choice, i.e. when the probability P of a value is 0 or 1. It can be shown that the smooth generalization of Shannon information via Rényi's form, I = log_2(a(x_1^b + x_0^b)), determines the constants uniquely as a = 2 and b = 2, which is already present in the extended bittor logic as the operation EQV of the bittor with itself. Thus I = log_2(2(x_1^2 + x_0^2)) (the log base 2 of self-equivalence) is defined to be the information content of a bittor. The information value of an entire bittor number is the sum of the information in each component bittor.
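The information formula can be checked at the limiting cases directly (a short Python sketch; the function name is illustrative):

```python
import math

def info(x):
    """Information content of a bittor: I = log2(2 * (x1^2 + x0^2)).
    Exact bittors (1,0) and (0,1) give 1 bit; (0.5, 0.5) gives 0 bits."""
    x1, x0 = x
    return math.log2(2.0 * (x1 * x1 + x0 * x0))

print(info((1.0, 0.0)))   # 1.0  (a definite bit)
print(info((0.5, 0.5)))   # 0.0  (total uncertainty)
print(info((0.9, 0.1)))   # between 0 and 1
```

The formula interpolates monotonically between the two extremes, and summing it over the digit bittors gives the information content of a whole bittor number.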

10
Discussion IV Observation: Bittor operations, numbers, and information smoothly reduce to the standard numbers, logic, and binary system in traditional use when the bittors are exact (1,0) or (0,1). Consequently the proposed system is a smooth generalization of the current logic and arithmetic. Furthermore, the bittor structures include the full Boolean logic, binary values, existing number system (integer, rational, real, and complex numbers) with existing mathematical operations – all as a special case.

11
Technical Summary – I: Generalized Logic. 1. Basic entity: I suggest that probability is not a scalar but a component of an ordered 'vector' (an n-tuple, a representation space) that behaves as a 'bit vector' or 'bittor' under the Markov Lie monoid of transformations (e.g. {P, (1-P)} = {x_1, x_0} in two dimensions). These objects generalize the fundamental concepts of '1' and '0' of binary logic and arithmetic, allowing continuous intermediate states of logic. 2. A new mathematics: Boolean logic is generalized to a set of 16 product operations z_i = c^(α)_ijk x_j y_k, generalizing the normal logic (computer) operations of AND, OR, NOR, NAND, NOT, etc., where α = 1, 2, ..., 16 labels the 16 possible partitions of the four products x_j y_k into two parts. A second operation is the weighted linear combination x = a_1 x_1 + a_2 x_2 + ... + a_n x_n, where the x_i are different bittors and the a_i form an n-dimensional Markov Lie monoid representation (i.e. they sum to unity and are non-negative, thus a larger-dimensional bittor).

12
Technical Summary – II: Generalized Arithmetic. 3. New bittor generalized numbers: Binary numbers are now defined as outer products of these bittor objects, where '1' = {1, 0} and '0' = {0, 1}. Total uncertainty is thus {0.5, 0.5}. It is only necessary to show the upper component explicitly to express a value, e.g. (1101.101). 4. Arithmetic: Arithmetic operations (+, -, *, /, ^) are now defined by the natural Boolean generalizations between the bittors that comprise a 'number'.

13
Technical Summary – III: Generalized Information. 5. Shannon information is normally defined as 1 bit for a definitely known binary value of 1 or 0. We extend this for a bittor as I = log_2(2(P_1^2 + P_0^2)) via the second-order Rényi entropy. 6. The probability values in a bittor need not have the accuracy of a real number but only that of a binary number of the desired length (perhaps 5 or 6 digits), e.g. x = (00110, 01010).

14
Discussion Summary: The fundamental objects of observational information are proposed to be two-dimensional Markov Lie monoid representations (called bittors, short for 'bit vectors'), with the components of a bittor interpreted as the probabilities to be 1 and 0 respectively. The generalized logic defined by z_i = c^(α)_ijk x_j y_k provides an entirely new kind of mathematics among the bittor objects, consisting of 16 different products plus bittor-weighted linear combinations. Bittors and bittor-type numbers are capable of automated management of uncertainty and constitute a new kind of mathematical structure that generalizes the existing number systems and contains them as special cases.

15
Proposal: We propose that this new type of mathematics be used to partially manage logical and numerical uncertainty by building it into computers in a fundamental way. We propose that this be done in three stages: (1) simulation in C/C++ or Java code for demonstrations and testing, (2) integral incorporation for automatic use in C/C++ and Java programming, and (3) incorporation in hardware as a chip accelerator to speed the processing. We propose that experts from different areas consider the impact of such an extension to our current mathematics, identify problems, and find new ways of utilizing these structures.

16
End Technical Overview
