The human senses interpret external information that is incomplete and imprecise, and from it form the reasoning vital for survival. Fuzzy set theory provides a system for dealing with such information linguistically; it performs numerical computation using linguistic labels stipulated by membership functions. The selection of fuzzy if-then rules is the key component of a fuzzy inference system (FIS), which can model human expertise in a specific application. A FIS has a structured knowledge representation in the form of if-then rules, but it lacks the adaptability to deal with changing external environments; when a FIS is combined with a neural network or evolutionary computation (such as genetic algorithms), the resulting hybrid system is adaptive.
According to Lotfi A. Zadeh, the founder of Fuzzy Set and Fuzzy Logic: "Soft Computing is an emerging approach to computing which parallels the remarkable ability of the human mind to reason and learn in an environment of uncertainty and imprecision."
Soft Computing constructs intelligent systems (software) that are supposed to possess humanlike knowledge and expertise within a specific domain, adapt themselves and learn to do better in changing environments, and explain how they make decisions or take actions. Conventional artificial intelligence (AI) computes by performing symbolic manipulation, while Soft Computing relies mainly on numerical computation. A hybrid Neural Network and Fuzzy Inference System (NeuroFuzzy) relies heavily on high-speed number-crunching computation to find rules or regularity in data sets. Soft Computing complements symbolic AI so that the boundary between the two is becoming indistinct. The long-term goal of Soft Computing and AI is the creation and understanding of machine intelligence.
The major components of Soft Computing are:
- Fuzzy Set and Fuzzy Logic -- the subject of this discussion.
- Artificial Neural Network -- The modeling of the brain as a continuous-time, nonlinear dynamic system, using connectionist architectures that are expected to mimic brain mechanisms and simulate intelligent behavior.
- Evolutionary Computation -- Simulating complex biological evolutionary processes leads to an understanding of how living systems acquire higher-level intelligence. The Genetic Algorithm (GA) is based on the evolutionary principle of natural selection. Immune modeling is based on the assumption that chemical and physical laws may be able to explain living intelligence. Artificial Life is a similar discipline to immune modeling, but it also attempts to realize lifelike behavior by imitating the processes that occur in the development of life.
- Bayesian Learning and Statistical Reasoning -- Bayesian reasoning is an approach that gives inference a probabilistic character. A Bayesian model assumes that the quantities of interest are governed by probability distributions and that optimal decisions can be made by reasoning about these probabilities together with observed data. Bayesian reasoning provides the basis for learning algorithms that directly manipulate probabilities, as well as a framework for analyzing the operation of other algorithms that do not explicitly manipulate probabilities.
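To make the last bullet concrete, here is a minimal sketch of Bayes' rule at work in Java: updating the probability of a hypothesis after observing evidence. The scenario and all the numbers are invented for illustration.

```java
// Bayes' rule: P(hypothesis | evidence) =
//   P(evidence | hypothesis) * P(hypothesis) / P(evidence)
public class BayesDemo {

    public static double posterior(double likelihood, double prior, double evidence) {
        return likelihood * prior / evidence;
    }

    public static void main(String[] args) {
        double prior = 0.01;          // P(disease): 1% of the population
        double sensitivity = 0.95;    // P(positive test | disease)
        double falsePositive = 0.05;  // P(positive test | no disease)

        // Total probability of a positive test, over both hypotheses.
        double pPositive = sensitivity * prior + falsePositive * (1 - prior);

        // Belief in the disease after seeing one positive test: about 0.16.
        System.out.println(posterior(sensitivity, prior, pPositive));
    }
}
```

Note how the posterior (about 16%) differs sharply from the test's 95% sensitivity; this kind of reasoning from priors and observed data is exactly what a Bayesian learner automates.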
The field of Soft Computing is changing and evolving rapidly, with new techniques and applications constantly being proposed. Although software can be developed with any one of the individual components of soft computing, there is a tendency to combine two or more so that each complements the shortfalls of the others. Hybrid systems such as Neuro-Fuzzy (neural network and fuzzy system), Genetic Neural Network (neural network and genetic algorithm), and Fuzzy-Bayesian Network (fuzzy logic and Bayesian belief network) are common these days.
Where Is Java Now in Computational Intelligence and Soft Computing?
Java has become very popular as a language for writing computational and machine intelligence software. It is moving fast to join traditional artificial intelligence languages such as Lisp and Prolog as a first choice for writing AI-based software. There is currently an important draft at the Java Community Process (JCP), Java Specification Request 73, an API for data mining. The proposed name for this package is javax.datamining, but it has not yet been finalized. The specification lead for this expert group is from Oracle, and it is excellent to see leaders in statistical software such as SPSS and the SAS Institute involved in drafting this specification.
What is data mining? Its main goal is to automate the extraction of hidden predictive information and patterns from (large) databases. Data mining applies algorithms from machine learning (computational intelligence) and soft computing, such as artificial neural networks, decision trees and belief networks, fuzzy if-then rules, and rule induction. There has been confusion in the IT community about the meaning of data mining: it is not data warehousing, SQL queries, or report and data visualization.
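As a toy example of the rule induction mentioned above, the sketch below finds the single numeric threshold that best separates two classes -- the core step in growing a decision tree. The data set is invented for illustration.

```java
import java.util.Arrays;

// Induces a one-condition rule of the form "x >= t => class 1"
// by trying every observed value of x as the threshold t.
public class StumpDemo {

    public static double bestThreshold(double[] x, int[] label) {
        double[] candidates = x.clone();
        Arrays.sort(candidates);
        double best = candidates[0];
        int bestErrors = Integer.MAX_VALUE;
        for (double t : candidates) {
            int errors = 0;
            for (int i = 0; i < x.length; i++) {
                int predicted = (x[i] >= t) ? 1 : 0;
                if (predicted != label[i]) errors++;
            }
            if (errors < bestErrors) {  // keep the threshold with fewest mistakes
                bestErrors = errors;
                best = t;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        double[] income = {20, 25, 30, 60, 70, 80};  // hypothetical attribute
        int[] buys      = {0,  0,  0,  1,  1,  1};   // hypothetical class labels
        // Induced rule: IF income >= 60 THEN buys = yes
        System.out.println(bestThreshold(income, buys)); // prints 60.0
    }
}
```

Real data-mining algorithms apply this kind of search recursively over many attributes and millions of records, which is where the number crunching comes from.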
Data mining is a major component of today's enterprise software, such as ERP (Enterprise Resource Planning) and CRM (Customer Relationship Management). Expert commentary in Intelligent Enterprise (a Web site for enterprise business intelligence) predicts that business-intelligence enterprise software such as CRM without analytical functionality will not compete well in the market; a CRM with only statistical analysis is not as good as one with both statistics and soft computing/computational intelligence.
The underlying algorithms of data mining involve heavy numerical computation, and it is a good move by Sun to develop an API that makes development easier for mid-level or even entry-level Java developers who need to be involved in a data-mining project. By reading the API docs, Java developers need only understand what parameters to pass to a specific method, which removes the need to understand the complex mathematics implemented in a data-mining algorithm.
Data-mining projects have traditionally involved people with deep knowledge of mathematics, statistics, and artificial intelligence at the Ph.D. or M.Sc. level. The upcoming javax.datamining package will pull Java developers at all levels, expert down to entry level, into data-mining projects: one mathematician is enough to lead a group, which eliminates the need to assemble a team of Ph.D. developers. A number of free Java tools and APIs in soft computing and machine learning are already available online as GPL open source, with new ones appearing almost daily. This shows the explosive popularity of Java in the field of machine intelligence and soft computing.
Evolution of Logic
The following are the different types of logic and their implications or potential applications for technology:
- Bivalent Logic: The conventional "either-or", "true-false" logic formulated by Aristotle; it is the logic of our modern-day computers. A logic gate's output can be either 1 or 0, with no value in between such as 0.7. There is no such thing as uncertainty or imprecision in bivalent logic.
- Multi-valued Logic (Fuzzy Set and Fuzzy Logic): Although modern computers and software operate on bivalent logic, it is poorly suited to modeling human concepts, which tend to be imprecise and uncertain. Fuzzy logic allows a truth value to be any number between 0 and 1. ("X is a beautiful person. Y is more beautiful than X. Z is very, very beautiful.")
- Quantum Logic: This differs from both bivalent and fuzzy logic in that truth values interfere with each other, allowing different values to co-exist at the same time. A quantum logic gate can exist in two states at once, or even in more states concurrently, and Quantum Computation exploits this for massively parallel computing. What was science fiction decades ago is becoming science fact in today's technology. Peter Shor of AT&T invented his quantum factoring algorithm in 1994 and showed that factoring a large integer (a 400-digit number or more) into primes could be done very quickly on a quantum computer (on the order of a year) compared with billions of years on today's fastest supercomputer. Since the emergence of Shor's algorithm, financial institutions and government agencies such as the NSA have been aware of the potential threat of this technology. It is no surprise that the U.S. government is at the forefront of research into quantum cryptography and encryption; even Microsoft has established a working group to research this alternative model of computing. When Quantum Computing matures, branches of software engineering such as data warehousing will become obsolete, because quantum computers will search millions or even billions of database records and produce reports in a matter of seconds.
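The contrast between the first two kinds of logic above can be shown in a few lines of Java. Bivalent logic admits only 0 and 1; fuzzy logic admits any truth value in [0, 1], with min and max as the standard fuzzy AND and OR operators. The truth values here are invented for the example.

```java
// Standard fuzzy logic operators over truth values in [0, 1].
public class FuzzyLogicDemo {

    public static double and(double a, double b) { return Math.min(a, b); }
    public static double or(double a, double b)  { return Math.max(a, b); }
    public static double not(double a)           { return 1.0 - a; }

    public static void main(String[] args) {
        double xIsTall  = 0.7;  // partial truth: X is fairly tall
        double xIsHeavy = 0.4;  // partial truth: X is somewhat heavy

        System.out.println(and(xIsTall, xIsHeavy)); // prints 0.4
        System.out.println(or(xIsTall, xIsHeavy));  // prints 0.7
        System.out.println(not(xIsTall));           // roughly 0.3
    }
}
```

Restricting the inputs to exactly 0.0 and 1.0 recovers ordinary bivalent logic; everything in between is where fuzzy reasoning lives.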
Speed limits the application of machine intelligence and soft computing on today's computers; one day, quantum computing will remove that limit. In the field of Computer Vision (software trained to tell a bicycle from a tree in an image, for example), today's computers are not yet fast enough to recognize figures in an image. The pattern matching of current vision technology is reasonable when the number of images to match is low, but retrieval from a massive image database is slow. Java is fast establishing itself in all areas of technical computation, from scientific and engineering to business; with the release of Java Advanced Imaging and Java3D, I have seen Java GPL projects that use soft computing and Computer Vision for scientific and medical imaging.
Fuzzy Logic and Expert Systems
An expert system is a computer program that uses expert knowledge to attain high levels of performance in a narrow problem area. It imitates a human expert in the sense that it knows almost everything about almost nothing.
Expert systems have facilities for representing existing expert knowledge, accommodating existing databases, learning and accumulating knowledge during operation, learning new pieces of knowledge from existing databases, making logical inferences, making decisions and giving recommendations, and communicating with users in a friendly way (often in a restricted natural language), explaining their decisions and behavior. The explanation facility often helps users understand and trust the decisions an expert system makes. Learning in expert systems can be achieved with machine learning and soft computing. Expert systems have been used successfully in almost every area of human activity, including education and training, manufacturing, medicine, science, engineering, agriculture, business, and finance. Using existing information technologies, expert systems for difficult and important tasks can be deployed quickly, maintained cheaply, improved easily, and refined during operation to accommodate new situations and facts.
One of the earliest expert systems was Mycin (in medicine), developed at Stanford in the Seventies. Its task was to diagnose and recommend treatment for certain blood infections. Diagnosing properly involved growing cultures of the infectious organism. Unfortunately, this takes around 48 hours, and if doctors waited until it was complete, their patient might be dead. So doctors had to make quick guesses about likely problems from the available data, and use those guesses to provide a "covering" treatment, in which drugs are given to deal with any possible problem.
Mycin was developed partly in order to explore how human experts make these rough (but important) guesses based on partial information. There are lots of junior or non-specialized doctors who sometimes have to make such a rough diagnosis, and if there were an expert tool available to help them, then this might allow more effective treatment. In fact, Mycin was never actually used in practice. This wasn't because of any weakness in its performance -- in tests it outperformed members of the Stanford medical school. It was more because of ethical and legal issues related to the use of computers in medicine (if it gives the wrong diagnosis, whom do you sue?).
Fuzzy techniques have been incorporated into expert systems so that they can perform approximate reasoning and represent incomplete, ambiguous, corrupted, or contradictory data and knowledge. Fuzzy systems are rule-based expert systems built on fuzzy rules and fuzzy inference. Fuzzy rules represent, in a straightforward way, commonsense knowledge and skills, or knowledge that is subjective, ambiguous, vague, or contradictory; commonsense knowledge is usually acquired from long experience over many years. Fuzzy expert systems have also found their way into database queries (the fuzzy query). A SQL database query is efficient only when the information to store and retrieve is exact: when the information for one attribute is fuzzy, vague, or missing, the whole record is usually not stored, and the information is lost.
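Here is one fuzzy if-then rule carried end to end in Java: a crisp input is fuzzified through a membership function for a linguistic label, and the rule's firing strength scales a crisp output. The membership shape and all the numbers are illustrative, not taken from any particular fuzzy toolkit.

```java
// A single Mamdani-style fuzzy rule:
//   IF temperature IS hot THEN fan speed IS high
public class FuzzyRuleDemo {

    // Membership function for the linguistic label "hot":
    // ramps from 0 at 20 degrees C up to 1 at 40 degrees C.
    public static double hot(double tempC) {
        if (tempC <= 20) return 0.0;
        if (tempC >= 40) return 1.0;
        return (tempC - 20) / 20.0;
    }

    // The rule's firing strength scales the output (max fan speed 3000 rpm).
    public static double fanSpeed(double tempC) {
        return hot(tempC) * 3000.0;
    }

    public static void main(String[] args) {
        // At 30 degrees C, "hot" is half true, so the fan runs at half speed.
        System.out.println(fanSpeed(30)); // prints 1500.0
    }
}
```

A real FIS combines many such rules (aggregating their scaled outputs and defuzzifying the result), but each rule works exactly like this one.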
A number of Java-based commercial tools are available for building knowledge-based (rule-based) and expert systems. IBM has released CommonRules as a free download, including source code. JRules from ILOG is a commercial tool for building a business-rule engine in Java. J.E.S.S. (the Java Expert System Shell) from Sandia National Laboratories is the most popular Java tool for building rule engines and expert systems. J.E.S.S. is a Java version of CLIPS (C Language Integrated Production System), which is written in C and originated at NASA in 1985.
Sun Microsystems jumped into rule engines with a draft specification request (JSR-94) for an API, which is still in the Java Community Process. The author of J.E.S.S. is part of this expert group and has announced to the jess-user group that J.E.S.S. will support the upcoming Java rule-engine API. The proposed name for this package is javax.rules, although it is not finalized yet.
Road Ahead for Java
As the debate heats up between J2EE and .NET for writing enterprise-wide applications, in my opinion these two computing platforms will both do the job for most business requirements. However, I can see J2EE having an edge because of its multiple-vendor support, portability across platforms, security, scalability, and so on. One of the next battles between these competing platforms will center on which has the richer set of APIs for writing applications that use machine intelligence, and in this regard Java and J2EE should be hundreds of miles ahead of .NET.
I believe the buzzword capturing the imaginations of IT managers and developers now is "Web services": services that can locate, link to, and communicate with entities outside your own system across the Internet. A Web service with no machine intelligence will be inferior to one that accommodates computational intelligence. It is quite easy to train Java developers to understand the .NET platform in a fairly short time, but it takes longer to train them in the computational intelligence paradigm.
As Sun drafts APIs that use machine learning and computational intelligence, I predict that when the next version of J2EE fully supports Web services, Java developers at all levels will be able to build intelligent Web services applications without being formally trained in machine intelligence. Java's upcoming computational-intelligence APIs will document how to call their class methods without requiring knowledge of how the underlying algorithms are implemented.
Software developers are very good at understanding how to pass parameters to a method of a class and get a return object, but it is always hard to comprehend how the method was implemented, especially in a library written by someone else. These upcoming Java APIs will remove that worry. JSR-87, an API for distributed autonomous Java multi-agent systems, will be available for public review on May 19, 2002.
What is an autonomous agent? It is a software agent that perceives its environment through sensors and acts upon that environment through effectors; Java agents perceive through messages in an Agent Communication Language. If an agent's behavior is based solely on built-in knowledge, so that it need pay no attention to its percepts, then the agent lacks autonomy. This is the equivalent of a .NET application that seems intelligent but only does what its designer programmed it to do; it cannot act according to its own experience.
Agent behavior can be based on both the agent's own experience and the built-in knowledge used in constructing it for the particular environment in which it operates; if its behavior is determined by its own experience, the agent is said to be autonomous. This Java agent API is designed to be compliant with FIPA (the Foundation for Intelligent Physical Agents), a non-profit organization of companies drafting an agreed protocol for multi-agent software systems. This Java agent framework will have an impact on the development of intelligent Web services, mobile Java devices (mobile agents), and e-commerce.
There are currently some GPL-licensed, FIPA-compliant Java agent platforms, notably J.A.D.E. (Java Agent Development Framework) and L.E.A.P. (Lightweight Extensible Agent Platform), as well as proprietary ones such as O.A.A. (Open Agent Architecture) from SRI International and Aglets from IBM. It is very important for IT managers to base decisions about future projects on the advantages a computing platform brings in the long run. Java will unleash APIs that use machine learning and symbolic AI, which will elevate the promise of Web services many fold from its current level.
An example is an intelligent software agent that bids on an online auction. You can create an agent and give instructions to it as to the type of item to look for and bid on, the highest price it can bid, etc. You then leave the agent to the task of bidding, while you go to town to see your bank manager or do personal stuff. You do not need to sit at your machine all the time during the online auction. The agent will mimic a real human bidder and act according to its built-in knowledge of the auction. A human bidder from the other side of the world would not know that he/she is bidding against a machine.
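The auction agent's built-in knowledge can be sketched in a few lines: react to each new high bid, never exceed the owner's ceiling. This is a hypothetical simulation of the bidding policy only; it is not any real auction site's API.

```java
// A minimal bidding policy for an autonomous auction agent.
public class BiddingAgent {

    private final double maxPrice;   // the owner's instruction: highest allowed bid
    private final double increment;  // how much to raise each counter-bid

    public BiddingAgent(double maxPrice, double increment) {
        this.maxPrice = maxPrice;
        this.increment = increment;
    }

    // Called whenever a rival raises the price (the agent's percept).
    // Returns the agent's counter-bid, or -1 if it drops out of the auction.
    public double onRivalBid(double currentHigh) {
        double next = currentHigh + increment;
        return (next <= maxPrice) ? next : -1;
    }

    public static void main(String[] args) {
        BiddingAgent agent = new BiddingAgent(100.0, 5.0);
        System.out.println(agent.onRivalBid(80.0)); // prints 85.0
        System.out.println(agent.onRivalBid(98.0)); // prints -1.0 (ceiling reached)
    }
}
```

A production agent would wrap this policy in FIPA-style Agent Communication Language messages and add learned knowledge of rival behavior, but the autonomy boundary -- act on percepts, within owner-given constraints -- is already visible here.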
IBM demonstrated this at the end of 2001, though with a trading system rather than an online auction. Six humans and six software agents traded against each other, and the humans were not told that machines were involved. At the end of the session, the machine agents had all out-traded the humans.
Overall, the road ahead for Java looks brighter than ever, especially in soft computing. As it stands now, Java is a complete, general-purpose language for scientific computation.
FuzzyJ is the fuzzy-logic API I use in my articles. It is a tool from the National Research Council of Canada, and the library binary is downloadable. FuzzyJ also works with J.E.S.S., which can be downloaded separately.
- Welcome to BISC
- Online Workshop on Soft Computing
- Soft Computing Home Page
- World Federation on Soft Computing
- Internet's Resources for Neuro-Fuzzy and Soft Computing
- Centre for Quantum Computation
- Quantum Computing with Molecules
- Quantum Computation/Cryptography at Los Alamos
- The Quantum Computer
Fuzzy Expert Systems:
- Fuzzy Sets and Systems
- General sources of fuzzy information and research groups
- Fuzzy Logic Research and Life
- Expert Systems for Lithic Analysis
- What Is Fuzzy Logic?
- Introduction to Fuzzy Logic Course
- FANGroup Fuzzy Logic Resources
- Expert Systems: Principles and Programming (Third Edition), by Joseph Giarratano, PWS Publishing Co.
- Intelligent Java Applications for the Internet and Intranets (Java-based book), by Mark Watson, Morgan Kaufmann.
- Intelligent Systems and Soft Computing: Prospects, Tools and Applications (Java-based book), by Behnam Azvine, Nader Azarmi, and Detlef D. Nauck, Springer Lecture Notes in Artificial Intelligence.
- Intelligent Control Systems Using Soft-Computing Methodologies, by Ali Zilouchian and Mo Jamshidi, CRC Press.
- Uncertain Rule-Based Fuzzy Logic Systems: Introduction and New Directions, by Jerry M. Mendel, Prentice Hall.
- Computer Vision and Fuzzy-Neural Systems, by Arun D. Kulkarni, Prentice Hall.
- Fuzzy Logic: Intelligence, Control, and Information, by John Yen and Reza Langari, Prentice Hall.
- Fuzzy and Neural Approaches in Engineering, by Lefteri H. Tsoukalas and Robert E. Uhrig, John Wiley & Sons.
- Soft Computing for Knowledge Discovery: Introducing Cartesian Granule Features, by James G. Shanahan, Kluwer International Series in Engineering and Computer Science (Volume 570).
- Soft Computing and Its Applications, by Rafik Aliev and Rashad Aliev, World Scientific Publishing Co.
About the Author
Sione Palu has developed software for Publishing Systems, Imaging, and Web Applications. Currently, Palu develops (in Swing) his own software application for Symbolic and Visualization Mathematics for high-school students. Palu graduated from the University of Auckland, New Zealand, with a science degree in mathematics and computing. He has a personal interest in applying Java and mathematics in the fields of mathematical modeling and simulations, expert systems, symbolic AI and soft computing, wavelets, digital signal processing, and control systems.