DBC rule
2021-07-01 13:01:37 303KB DBC_rule
A small demo project built with the Drools rule engine, suitable for beginners to learn from and use for demonstrations.
2021-06-30 11:01:08 102KB drools springboot
This book describes how to write new business rules and extend existing business rules. Business rules are code segments that customize, extend, or integrate shared object modules that use the Brooks Automation Distributed Transaction Processing Framework (DTPF).
2021-06-19 14:09:50 2.43MB Factoryworks FW
A spam-filtering program based on a Bayesian model, written in Java; concise, efficient, and easy to understand.
2021-06-14 21:25:51 290KB spam filter bayes rule
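The Bayesian spam-filter entry above can be illustrated with a minimal sketch. This is not the listed program's actual code: the class name, the whitespace tokenization, and the add-one (Laplace) smoothing are all assumptions, shown only to make the multinomial naive Bayes idea concrete.

```java
import java.util.*;

// Hypothetical sketch of a multinomial naive Bayes spam classifier.
public class NaiveBayesSpam {
    private final Map<String, Integer> spamCounts = new HashMap<>();
    private final Map<String, Integer> hamCounts = new HashMap<>();
    private int spamTotal = 0, hamTotal = 0;   // word counts per class
    private int spamDocs = 0, hamDocs = 0;     // message counts per class

    public void train(String message, boolean isSpam) {
        for (String w : message.toLowerCase().split("\\W+")) {
            if (w.isEmpty()) continue;
            if (isSpam) { spamCounts.merge(w, 1, Integer::sum); spamTotal++; }
            else        { hamCounts.merge(w, 1, Integer::sum);  hamTotal++; }
        }
        if (isSpam) spamDocs++; else hamDocs++;
    }

    // True if P(spam|words) > P(ham|words), computed in log space
    // with add-one smoothing over the joint vocabulary.
    public boolean isSpam(String message) {
        Set<String> vocab = new HashSet<>(spamCounts.keySet());
        vocab.addAll(hamCounts.keySet());
        int v = vocab.size();
        double logSpam = Math.log((double) spamDocs / (spamDocs + hamDocs));
        double logHam  = Math.log((double) hamDocs  / (spamDocs + hamDocs));
        for (String w : message.toLowerCase().split("\\W+")) {
            if (w.isEmpty()) continue;
            logSpam += Math.log((spamCounts.getOrDefault(w, 0) + 1.0) / (spamTotal + v));
            logHam  += Math.log((hamCounts.getOrDefault(w, 0) + 1.0) / (hamTotal + v));
        }
        return logSpam > logHam;
    }

    public static void main(String[] args) {
        NaiveBayesSpam nb = new NaiveBayesSpam();
        nb.train("win free money now", true);
        nb.train("free prize claim now", true);
        nb.train("meeting agenda for monday", false);
        nb.train("lunch on monday with the team", false);
        System.out.println(nb.isSpam("free money prize")); // true
        System.out.println(nb.isSpam("monday meeting"));   // false
    }
}
```

Working in log space avoids underflow when multiplying many small word probabilities, which is why real Bayesian filters sum log-likelihoods rather than multiplying raw probabilities.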
ATP-EMTP Rule Book, Chinese-language edition.
2021-05-18 13:12:40 25.66MB ATP-EMTP
The goal of this project is to create a conversion utility that translates custom Snort rules into a format usable on Cisco IDS/IPS devices.
2021-05-06 12:04:10 473KB open-source software
Reference table of trace length-matching requirements for high-speed signal layout.
2021-04-22 21:03:23 183KB layout
An implementation of the FP-Growth algorithm in MATLAB. The data set is stored in mydata; run main to execute the algorithm.
2021-04-16 14:07:50 5KB matlab association rules
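The FP-Growth entry above centers on the FP-tree data structure. The sketch below (not the listed MATLAB code; class and method names are illustrative) shows only FP-Growth's first phase: count item supports, drop infrequent items, sort each transaction by descending support, and insert it into a tree that shares common prefixes. The recursive mining of conditional pattern bases is omitted.

```java
import java.util.*;

// Hypothetical sketch of FP-tree construction, the first phase of FP-Growth.
public class FpTree {
    static class Node {
        final String item;
        int count = 0;
        final Map<String, Node> children = new HashMap<>();
        Node(String item) { this.item = item; }
    }

    final Node root = new Node(null);

    static FpTree build(List<List<String>> transactions, int minSupport) {
        // Pass 1: count the support of every item.
        Map<String, Integer> support = new HashMap<>();
        for (List<String> t : transactions)
            for (String item : t) support.merge(item, 1, Integer::sum);

        // Pass 2: filter, sort, and insert along shared prefixes.
        FpTree tree = new FpTree();
        for (List<String> t : transactions) {
            List<String> filtered = new ArrayList<>();
            for (String item : t)
                if (support.get(item) >= minSupport) filtered.add(item);
            // Descending support; ties broken alphabetically for determinism.
            filtered.sort(Comparator.comparing((String i) -> -support.get(i))
                                    .thenComparing(i -> i));
            Node cur = tree.root;
            for (String item : filtered) {
                cur = cur.children.computeIfAbsent(item, Node::new);
                cur.count++;
            }
        }
        return tree;
    }

    public static void main(String[] args) {
        List<List<String>> db = Arrays.asList(
            Arrays.asList("a", "b", "c"),
            Arrays.asList("a", "b"),
            Arrays.asList("a", "d"));
        FpTree tree = build(db, 2);
        // All three transactions share the most frequent item "a" as a prefix.
        System.out.println(tree.root.children.get("a").count); // 3
        System.out.println(tree.root.children.get("a").children.get("b").count); // 2
    }
}
```

Sorting every transaction by global support is what makes prefixes overlap, so the tree compresses the database into far fewer nodes than there are transactions.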
Top-Down FP-Growth for Association Rule Mining.pdf (a top-down FP-Growth algorithm for association rule mining)
2021-04-04 16:10:00 247KB association rule mining
Decision trees are particularly promising in symbolic representation and reasoning due to their comprehensible nature, which resembles the hierarchical process of human decision making. However, their drawbacks, caused by the single-tree structure, cannot be ignored. A rigid decision path may cause the majority class to overwhelm the other classes when dealing with imbalanced data sets, and pruning removes not only superfluous nodes but also entire subtrees. The proposed learning algorithm, flexible hybrid decision forest (FHDF), mines information implicated in each instance to form logical rules on the basis of a chain rule of local mutual information, then forms different decision tree structures and, subsequently, decision forests. The most credible decision path from the decision forest can be selected to make a prediction. Furthermore, functional dependencies (FDs), which are extracted from the whole data set based on association rule analysis, perform embedded attribute selection to remove nodes rather than subtrees, thus helping to achieve different levels of knowledge representation and improve model comprehension in the framework of semi-supervised learning. Naive Bayes replaces the leaf nodes at the bottom of the tree hierarchy, where the conditional independence assumption may hold. This technique reduces the potential for overfitting and overtraining and improves the prediction quality and generalization. Experimental results on UCI data sets demonstrate the efficacy of the proposed approach.
2021-03-28 17:07:16 269KB decision forest; naive Bayes;
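The FHDF abstract above builds its logical rules on a chain rule of local mutual information. As background, the sketch below (an illustrative example, not code from the paper) computes the basic quantity involved, the mutual information I(X;Y) = Σ p(x,y) log2( p(x,y) / (p(x)p(y)) ), from a discrete joint distribution.

```java
// Illustrative computation of mutual information between two discrete
// variables, given their joint probability table joint[x][y].
public class MutualInfo {
    static double mutualInformation(double[][] joint) {
        int nx = joint.length, ny = joint[0].length;
        double[] px = new double[nx], py = new double[ny];
        // Marginals: p(x) = sum_y p(x,y), p(y) = sum_x p(x,y).
        for (int x = 0; x < nx; x++)
            for (int y = 0; y < ny; y++) { px[x] += joint[x][y]; py[y] += joint[x][y]; }
        double mi = 0.0;
        for (int x = 0; x < nx; x++)
            for (int y = 0; y < ny; y++)
                if (joint[x][y] > 0)  // 0 * log 0 is taken as 0
                    mi += joint[x][y] * Math.log(joint[x][y] / (px[x] * py[y])) / Math.log(2);
        return mi;
    }

    public static void main(String[] args) {
        // Perfectly correlated binary variables: knowing X fixes Y, so I = 1 bit.
        double[][] correlated = {{0.5, 0.0}, {0.0, 0.5}};
        // Independent uniform binary variables: I = 0 bits.
        double[][] independent = {{0.25, 0.25}, {0.25, 0.25}};
        System.out.println(mutualInformation(correlated));  // 1.0
        System.out.println(mutualInformation(independent)); // 0.0
    }
}
```

Mutual information measures how much observing one attribute reduces uncertainty about another, which is why it is a natural scoring function for ordering attributes along a decision path.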