It is known that there are few adequate MATLAB programs for neuro-fuzzy classifiers. Generally, ANFIS is used as a classifier, but ANFIS is a function approximator, and its use for classification is unfavorable. For example, suppose there are three classes labeled 1, 2 and 3. The ANFIS outputs are not integers, so they are rounded to determine the class labels. Sometimes, however, ANFIS gives class labels of 0 or 4, which are not acceptable. As a result, ANFIS is not suitable for classification problems. In this study, I prepared different adaptive neuro-fuzzy classifiers.
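A tiny illustration of the rounding problem described above; the output values here are hypothetical, not produced by ANFIS itself:

y_anfis = [1.12 2.47 3.61 0.38];   % hypothetical continuous outputs for four samples of classes 1-3
labels  = round(y_anfis)           % gives 1 2 4 0; the labels 0 and 4 do not exist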
In all of the programs given below, the k-means algorithm is used to initialize the fuzzy rules; for that reason, the user should give the number of clusters for each class. Also, only the Gaussian membership function is used to describe the fuzzy sets, because of its simple derivative expressions.
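A minimal sketch of this per-class k-means initialization, assuming training data X (N x D), class labels y, and a vector nClust of clusters per class; these variable names are assumptions, not the actual interface of the programs:

centers = []; sigmas = [];                        % one row per fuzzy rule
for c = 1:max(y)
    Xc = X(y == c, :);                            % samples of class c
    [idx, C] = kmeans(Xc, nClust(c));             % k-means from the Statistics Toolbox
    for k = 1:nClust(c)
        centers(end+1, :) = C(k, :);                           % Gaussian centers
        sigmas(end+1, :)  = std(Xc(idx == k, :), 0, 1) + eps;  % Gaussian widths
    end
end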
The first of them is scg_nfclass.m. This classifier is based on Jang's neuro-fuzzy classifier [1]. The differences concern the rule weights and the parameter optimization: the rule weights are adapted according to the number of samples in each rule, and the scaled conjugate gradient (SCG) algorithm is used to determine the optimum values of the nonlinear parameters. SCG is faster than steepest descent and some second-order derivative-based methods, and it is also suitable for large-scale problems [2].
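A sketch of the assumed rule evaluation for one sample x (1 x D), using the centers and sigmas from the k-means step above and nRuleSamples, the number of training samples assigned to each rule; the Gaussian/product form is a common choice and is only assumed here:

R     = size(centers, 1);
mu    = exp(-0.5 * ((repmat(x, R, 1) - centers) ./ sigmas).^2);  % per-feature memberships (R x D)
alpha = prod(mu, 2);                                             % rule firing strengths (product t-norm)
w     = nRuleSamples(:) / sum(nRuleSamples);                     % rule weights from rule sample counts
beta  = w .* alpha;                                              % weighted firing strengths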
The second program is scg_nfclass_speedup.m. This classifier is similar to scg_nfclass; the difference is in the parameter optimization. Although it is based on the SCG algorithm, it is faster than traditional SCG because it uses a least-squares estimation method to estimate the gradient without using all of the training samples. The speed-up is noticeable for medium- and large-scale problems [2].
The third program is scg_power_nfclass.m. Linguistic hedges are applied to the fuzzy sets of the rules and are adapted by the SCG algorithm. In this way, some distinctive features are emphasized by their power values, while irrelevant features are damped. The power effect on a given feature is generally different for different classes. Using linguistic hedges increases the recognition rates [3].
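A small sketch of what a hedge (power) value does to a Gaussian membership function; the hedge values 2 and 0.05 are hypothetical, chosen only to show the two effects:

xAxis = 0:0.1:4;
mu    = exp(-0.5 * (xAxis - 2).^2);   % a Gaussian membership over one feature
muEmphasized = mu .^ 2;               % hedge > 1 concentrates the set: the feature becomes more selective
muDamped     = mu .^ 0.05;            % hedge near 0 flattens the set toward 1: the feature loses influence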
The last program is scg_power_nfclass_feature.m. In this program, the powers of the fuzzy sets are used for feature selection [4]. If the linguistic hedge values of the classes for a feature are bigger than 0.5 and close to 1, that feature is relevant; otherwise it is irrelevant. The program builds a feature selection and rejection criterion from the power values of the features.
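A sketch of this selection criterion on a hypothetical matrix of learned hedge values (one row per class, one column per feature); the numbers and the exact decision rule are assumptions based on the 0.5 threshold mentioned above:

P = [0.93 0.08 0.71;                 % hypothetical hedge values of class 1 for three features
     0.88 0.12 0.64];                % hypothetical hedge values of class 2
relevant = all(P > 0.5, 1);          % a feature is kept only if every class hedge exceeds 0.5
selectedFeatures = find(relevant)    % here: features 1 and 3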
References:
[1] C.-T. Sun, J.-S. R. Jang (1993). A neuro-fuzzy classifier and its applications. Proc. of IEEE Int. Conf. on Fuzzy Systems, San Francisco, 1:94–98.
[2] B. Cetişli, A. Barkana (2010). Speeding up the scaled conjugate gradient algorithm and its application in neuro-fuzzy classifier training. Soft Computing 14(4):365–378.
[3] B. Cetişli (2010). Development of an adaptive neuro-fuzzy classifier using linguistic hedges: Part 1. Expert Systems with Applications 37(8):6093–6101.
[4] B. Cetişli (2010). The effect of linguistic hedges on feature selection: Part 2. Expert Systems with Applications 37(8):6102–6108.
e-mail: bcetisli@mmf.sdu.edu.tr
bcetisli@gmail.com