#### Imprecise Reasoning

The human senses interpret external information that is incomplete and imprecise, and from it they form the reasoning vital for survival. Fuzzy set theory provides a system for dealing with such information linguistically: it performs numerical computation using linguistic labels stipulated by *membership functions*. A selection of fuzzy *if-then* rules forms the key component of a *fuzzy inference system* (FIS), which can appropriately model human expertise in a specific application. An FIS has a structured knowledge representation in the form of if-then rules, but it lacks the adaptability to deal with changing external environments. When an FIS is combined with a neural network or with evolutionary computation (such as genetic algorithms), the resulting hybrid system is adaptive.
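The membership-function idea can be sketched in a few lines of Java. This is a minimal illustration, not any particular FIS library: the triangular shape, the temperature ranges, and the rule are all invented for the example.

```java
// Minimal sketch of linguistic labels stipulated by membership functions,
// and of evaluating one fuzzy if-then rule. All names and ranges here are
// invented for illustration; this is not a real FIS library.
public class FuzzySketch {

    // Triangular membership function: the degree rises linearly from a to
    // the peak b, then falls linearly from b to c; it is zero outside [a, c].
    static double triangular(double x, double a, double b, double c) {
        if (x <= a || x >= c) return 0.0;
        return (x <= b) ? (x - a) / (b - a) : (c - x) / (c - b);
    }

    public static void main(String[] args) {
        double temp = 27.0;
        double warm = triangular(temp, 15, 25, 35); // "temperature is warm"
        double hot  = triangular(temp, 25, 40, 55); // "temperature is hot"

        // Fuzzy rule: IF temp is warm AND temp is hot THEN ... .
        // AND is commonly modeled as the minimum of the antecedent degrees.
        double ruleStrength = Math.min(warm, hot);

        System.out.println("warm=" + warm + " hot=" + hot
                + " rule strength=" + ruleStrength);
    }
}
```

A full FIS would fire many such rules and combine their weighted conclusions, but the core computation is no more than this: map crisp inputs to degrees in [0, 1] and combine them.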

#### Soft Computing

According to Lotfi A. Zadeh, the founder of Fuzzy Set and Fuzzy Logic: "Soft Computing is an emerging approach to computing which parallels the remarkable ability of the human mind to reason and learn in an environment of uncertainty and imprecision."

Soft Computing constructs intelligent systems (software) that are supposed to possess humanlike knowledge and expertise within a specific domain, adapt themselves and learn to do better in changing environments, and explain how they make decisions or take actions. Conventional *artificial intelligence* (AI) computes by performing symbolic manipulation, while Soft Computing relies mainly on numerical computation. A hybrid Neural Network and Fuzzy Inference System (NeuroFuzzy) relies heavily on high-speed number-crunching computation to find rules or regularity in data sets. Soft Computing complements symbolic AI so that the boundary between the two is becoming indistinct. The long-term goal of Soft Computing and AI is the creation and understanding of *machine intelligence*.

The major components of Soft Computing are:

- *Fuzzy Set and Fuzzy Logic* -- the subject of this discussion.
- *Artificial Neural Network* -- the modeling of the brain as a continuous-time, nonlinear dynamic system in connectionist architectures that are expected to mimic brain mechanisms and simulate intelligent behavior.
- *Evolutionary Computation* -- simulating complex biological evolutionary processes leads to an understanding of how living systems acquired higher-level intelligence. The *Genetic Algorithm* (GA) is based on the evolutionary principle of natural selection. *Immune Modeling* is based on the assumption that chemical and physical laws may be able to explain living intelligence. *Artificial Life* is a discipline similar to Immune Modeling, but it also attempts to realize lifelike behavior by imitating the processes that occur in the development of life.
- *Bayesian Learning and Statistical Reasoning* -- Bayesian reasoning gives inference a probabilistic character. A Bayesian model assumes that the quantities of interest are governed by probability distributions and that optimal decisions can be made by reasoning about these probabilities together with observed data. Bayesian reasoning provides the basis for learning algorithms that directly manipulate probabilities, as well as a framework for analyzing the operation of other algorithms that do not explicitly manipulate probabilities.
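The Bayesian-reasoning component rests on Bayes' rule, P(H|E) = P(E|H)P(H) / P(E). A tiny sketch with made-up numbers (the 1% prior and the test's error rates are invented for illustration):

```java
// Bayes' rule sketch: posterior = P(E|H)P(H) / P(E), where
// P(E) = P(E|H)P(H) + P(E|~H)P(~H). All numbers are made up.
public class BayesSketch {

    static double posterior(double pEgivenH, double pH, double pEgivenNotH) {
        double pE = pEgivenH * pH + pEgivenNotH * (1.0 - pH);
        return pEgivenH * pH / pE;
    }

    public static void main(String[] args) {
        // Hypothetical: 1% prior, 99% true-positive rate, 5% false-positive
        // rate. Despite the accurate test, the posterior is only about 1 in 6
        // -- exactly the kind of result probabilistic inference makes explicit.
        System.out.println(posterior(0.99, 0.01, 0.05));
    }
}
```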

The field of Soft Computing is changing and evolving rapidly, with new techniques and applications constantly being proposed. Although software can be developed with any one of the individual components of Soft Computing, there is a tendency to combine two or more components so that one complements the shortcomings of another. Hybrid systems such as *Neuro-Fuzzy* (Neural Net and Fuzzy Systems), *Genetic Neural Network* (Neural Net and Genetic Algorithm), and *Fuzzy-Bayesian Network* (Fuzzy Logic and Bayesian Belief Network) are common these days.

#### Where Is Java Now in Computational Intelligence and Soft Computing?

Java has become very popular as a language for writing computational- and machine-intelligence software. It is fast approaching parity with traditional artificial intelligence languages such as Lisp and Prolog as a first choice for writing AI-based software. There is currently an important draft at the Java Community Process (JCP), Java Specification Request 73, an API for data mining. The proposed name for this package is *javax.datamining*, but it has not yet been finalized. The specification lead for this expert group is from Oracle, and it is excellent to see leaders in statistical software such as SPSS and the SAS Institute involved in drafting this specification.

What is data mining? Its main goal is to automate the extraction of hidden predictive information and patterns from (large) databases. Data mining applies algorithms from *machine learning* (computational intelligence) and soft computing, such as artificial neural networks, decision trees and belief networks, fuzzy-logic if-then rules, and rule induction. There has been confusion in the IT community about the meaning of data mining: it is not data warehousing, SQL queries, or report and data visualization.
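Rule induction, one of the techniques just listed, can be shown with a toy example: learn a single threshold rule from a handful of labeled records. The attribute, the data, and the one-rule form are all invented for illustration; real rule-induction algorithms handle many attributes and score candidate rules far more carefully.

```java
// Toy rule-induction sketch: from labeled examples, find the threshold t
// for which the rule "IF income > t THEN buyer" classifies the most
// records correctly. Data and attribute names are invented.
public class RuleInduction {

    static double bestThreshold(double[] x, boolean[] label) {
        double best = x[0];
        int bestCorrect = -1;
        for (double t : x) {                    // candidate thresholds
            int correct = 0;
            for (int i = 0; i < x.length; i++)
                if ((x[i] > t) == label[i]) correct++;
            if (correct > bestCorrect) {        // keep the best-scoring rule
                bestCorrect = correct;
                best = t;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        double[] income = {20, 35, 50, 65, 80};
        boolean[] buyer = {false, false, true, true, true};
        // Learns the rule "IF income > 35 THEN buyer", which fits all rows.
        System.out.println("IF income > " + bestThreshold(income, buyer)
                + " THEN buyer");
    }
}
```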

Data mining is a major component of today's enterprise software, such as ERP (Enterprise Resource Planning) and CRM (Customer Relationship Management) systems. Expert commentary in *Intelligent Enterprise* (a Web site for enterprise business intelligence) predicts that business-intelligence enterprise software, such as a CRM system without analytical functionality, will not compete well in the market. And a CRM whose analytics are limited to statistical analysis is not as good as one that combines statistics with soft computing and computational intelligence.

The underlying algorithms of data mining involve number-crunching numeric computation, so it is a good move by Sun to develop an API that makes development easier for mid-level or even entry-level Java developers who need to be involved in a data-mining project. By reading the API docs, Java developers need only understand what parameters to pass to a specific method, which removes the need to understand the complex mathematics implemented in a data-mining algorithm.

Data-mining projects have always involved people with a background and deep knowledge in mathematics, statistics, and artificial intelligence at the Ph.D. or M.Sc. level. The upcoming *javax.datamining* API package will pull Java developers of all levels, from expert down to entry level, into data-mining projects. One mathematician will be enough to lead a group, eliminating the need to assemble a team of Ph.D. developers. A number of free Java programs and APIs for soft computing and machine learning are already available online as GPL open source, with new ones appearing almost daily. This shows the explosive popularity of Java in the field of machine intelligence and soft computing.

#### Evolution of Logic

The following are the different types of logic and their implications or potential applications for technology:

- *Bivalent Logic:* the conventional "either-or," "true-false" logic formulated by Aristotle; it is the logic of our modern-day computers. A logic gate's output can be either 1 or 0, with no value in between such as 0.7. There is no such thing as uncertainty or imprecision in bivalent logic.
- *Multi-valued Logic (Fuzzy Set and Fuzzy Logic):* although modern computers and software operate using bivalent logic, it is inefficient for modeling human concepts, which tend to be imprecise and uncertain. Fuzzy Logic allows truth values to take any value between 0 and 1. ("X is a beautiful person. Y is more beautiful than X. Z is very, very beautiful.")
- *Quantum Logic:* quite different from bivalent and fuzzy logic in that truth values interfere with each other, so that different values can coexist at the same time. A quantum logic gate can exist in two or more states concurrently, and quantum computation exploits this massive parallelism. What was science fiction decades ago is becoming science fact in today's technology. In 1994, Peter Shor of AT&T invented a quantum algorithm showing that a quantum computer could factor a large integer (a 400-digit number or more) into primes very quickly (in around a year) compared with the billions of years required by today's fastest supercomputers. Since the emergence of Shor's algorithm, financial institutions and government agencies such as the NSA have been aware of this technology's potential threat, and it is no surprise that the U.S. government is at the forefront of research into quantum cryptography and encryption. Even Microsoft has established a working group to research this alternative model of computing. When quantum computing matures, branches of software engineering such as *Data Warehousing* may become obsolete, because quantum computers will search millions or even billions of database records and produce reports in a matter of seconds.
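The linguistic hedges in the fuzzy-logic example ("very, very beautiful") have a common, if idealized, numeric treatment: "very" is often modeled as squaring the membership degree, concentrating it toward 0. A sketch, with the 0.8 starting degree invented for illustration:

```java
// Sketch of a fuzzy linguistic hedge: "very" modeled as squaring the
// membership degree. The 0.8 degree is an invented example value.
public class HedgeSketch {

    static double very(double mu) {
        return mu * mu; // concentrates degrees below 1 toward 0
    }

    public static void main(String[] args) {
        double beautiful = 0.8;                     // "X is beautiful" to degree 0.8
        double veryBeautiful = very(beautiful);     // "very beautiful" ~ 0.64
        double veryVery = very(veryBeautiful);      // "very, very beautiful" ~ 0.41
        System.out.println(veryBeautiful + " " + veryVery);
    }
}
```

Each application of the hedge tightens the claim, so "very, very beautiful" demands a higher underlying degree to remain true to the same extent, which matches the linguistic intuition.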

Speed limits the application of machine intelligence and soft computing on today's computers; one day, quantum computing will remove that limitation. In the field of *Computer Vision* (software trained, for example, to recognize the difference between a bicycle and a tree in an image), today's computers are not yet fast enough to recognize figures in an image. The pattern matching of current vision technology is reasonable when the number of images to be matched is reasonably low, but a search over a massive image database is going to be slow. Java is fast establishing itself in all areas of technical computation, from scientific and engineering to business. Since the release of Java Advanced Imaging and Java3D, I have seen Java GPL projects that use soft computing and Computer Vision for scientific and medical imaging.