scikit-learn-Cookbook-Second-Edition: scikit-learn Cookbook, Second Edition, published by Packt

Uploader: 42120541 | Upload time: 2024-02-17 17:47:23 | File size: 33.77MB | File type: ZIP
scikit-learn Cookbook, Second Edition. This is the code repository for the book, published by Packt. It contains all the supporting project files necessary to work through the book from start to finish.

About the Book

Owing to its simplicity and flexibility, Python has quickly become the language of choice for analysts and data scientists, and within the Python data ecosystem scikit-learn is the clear choice for machine learning. This book includes walkthroughs of, and solutions to, both common and uncommon problems in machine learning, and shows how to use scikit-learn to carry out a variety of machine learning tasks efficiently. The second edition begins with methods for assessing the statistical properties of data and for generating synthetic data for machine learning modeling. As you work through the chapters, you will encounter recipes that teach you to implement techniques such as data preprocessing, linear regression, logistic regression, k-NN, Naive Bayes, classification, decision trees, ensembles, and more. You will also learn to optimize models through multiclass classification, cross-validation, and model evaluation, and dig into implementing deep learning with scikit-learn. Beyond the models themselves, the book also covers the scikit-learn API and its classifiers, regressors, and estimators, among other topics.
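The recipes in the notebooks follow the standard scikit-learn estimator workflow (fit, predict, score). As a rough, illustrative sketch of that workflow, and not code taken from the book's notebooks, a cross-validated logistic regression on the built-in Iris dataset might look like this:

    # Illustrative sketch only; not taken from the book's notebooks.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split, cross_val_score
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.linear_model import LogisticRegression

    # Load a built-in toy dataset.
    X, y = load_iris(return_X_y=True)

    # Hold out a test set for final evaluation.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0, stratify=y)

    # Chain preprocessing and the estimator so scaling is learned
    # only from the training folds.
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

    # Estimate generalization accuracy with 5-fold cross-validation.
    scores = cross_val_score(model, X_train, y_train, cv=5)
    print("CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))

    # Fit on the full training set and report held-out accuracy.
    model.fit(X_train, y_train)
    print("Test accuracy: %.3f" % model.score(X_test, y_test))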


Resource Details

[{"title":"( 182 个子文件 33.77MB ) scikit-learn-Cookbook-Second-Edition:Packt发行的scikit-learn Cookbook第二版","children":[{"title":"dtree.clf <span style='color:#111;'> 2.54KB </span>","children":null,"spread":false},{"title":"Writing+A+Stacking+Agreggator+with+Scikit-Learn+Extra(2).html <span style='color:#111;'> 284.74KB </span>","children":null,"spread":false},{"title":"Tuning a Decision Tree.ipynb <span style='color:#111;'> 892.89KB </span>","children":null,"spread":false},{"title":"Multiclass SVC Classifier.ipynb <span style='color:#111;'> 663.88KB </span>","children":null,"spread":false},{"title":"Quantizing an image with KMeans clustering-checkpoint.ipynb <span style='color:#111;'> 277.20KB </span>","children":null,"spread":false},{"title":"Quantizing an image with KMeans clustering.ipynb <span style='color:#111;'> 277.20KB </span>","children":null,"spread":false},{"title":"Optimizing a Support Vector Machine-checkpoint.ipynb <span style='color:#111;'> 204.05KB </span>","children":null,"spread":false},{"title":"Optimizing a Support Vector Machine.ipynb <span style='color:#111;'> 204.05KB </span>","children":null,"spread":false},{"title":"Classifying data with a Linear Support Vector Machines-checkpoint.ipynb <span style='color:#111;'> 191.09KB </span>","children":null,"spread":false},{"title":"Classifying data with a Linear Support Vector Machines.ipynb <span style='color:#111;'> 191.09KB </span>","children":null,"spread":false},{"title":"Reducing dimensionality with PCA-checkpoint.ipynb <span style='color:#111;'> 180.64KB </span>","children":null,"spread":false},{"title":"Reducing dimensionality with PCA.ipynb <span style='color:#111;'> 180.64KB </span>","children":null,"spread":false},{"title":"Dimensionality Reduction with Manifolds tSNE-checkpoint.ipynb <span style='color:#111;'> 148.48KB </span>","children":null,"spread":false},{"title":"Dimensionality Reduction with Manifolds tSNE.ipynb <span style='color:#111;'> 148.48KB </span>","children":null,"spread":false},{"title":"Using Gaussian processes for regression-checkpoint.ipynb <span style='color:#111;'> 128.50KB </span>","children":null,"spread":false},{"title":"Using Gaussian processes for regression.ipynb <span style='color:#111;'> 128.50KB </span>","children":null,"spread":false},{"title":"Visualize a Decision Tree with pydot-checkpoint.ipynb <span style='color:#111;'> 124.64KB </span>","children":null,"spread":false},{"title":"Visualize a Decision Tree with pydot.ipynb <span style='color:#111;'> 124.64KB </span>","children":null,"spread":false},{"title":"Using k-NN for regression-checkpoint.ipynb <span style='color:#111;'> 113.36KB </span>","children":null,"spread":false},{"title":"Using k-NN for regression.ipynb <span style='color:#111;'> 113.36KB </span>","children":null,"spread":false},{"title":"Plotting with Numpy and Matplotlib-checkpoint.ipynb <span style='color:#111;'> 108.00KB </span>","children":null,"spread":false},{"title":"Plotting with Numpy and Matplotlib.ipynb <span style='color:#111;'> 108.00KB </span>","children":null,"spread":false},{"title":"Using KMeans to cluster data-checkpoint.ipynb <span style='color:#111;'> 104.16KB </span>","children":null,"spread":false},{"title":"Using KMeans to cluster data.ipynb <span style='color:#111;'> 104.16KB </span>","children":null,"spread":false},{"title":"Pipelines Testing Methods to Reduce Dimensionality-checkpoint.ipynb <span style='color:#111;'> 95.27KB </span>","children":null,"spread":false},{"title":"Pipelines Testing Methods to Reduce Dimensionality.ipynb <span 
style='color:#111;'> 95.27KB </span>","children":null,"spread":false},{"title":"Assessing cluster correctness-checkpoint.ipynb <span style='color:#111;'> 93.17KB </span>","children":null,"spread":false},{"title":"Assessing cluster correctness.ipynb <span style='color:#111;'> 93.17KB </span>","children":null,"spread":false},{"title":"Decomposition to classify with DictionaryLearning-checkpoint.ipynb <span style='color:#111;'> 92.37KB </span>","children":null,"spread":false},{"title":"Decomposition to classify with DictionaryLearning.ipynb <span style='color:#111;'> 92.37KB </span>","children":null,"spread":false},{"title":"Viewing the Pima Indians Diabetes Dataset with Pandas-checkpoint.ipynb <span style='color:#111;'> 76.10KB </span>","children":null,"spread":false},{"title":"Viewing the Pima Indians Diabetes Dataset with Pandas.ipynb <span style='color:#111;'> 76.10KB </span>","children":null,"spread":false},{"title":"Taking a more fundamental approach to regularization with LARS-checkpoint.ipynb <span style='color:#111;'> 75.81KB </span>","children":null,"spread":false},{"title":"Taking a more fundamental approach to regularization with LARS.ipynb <span style='color:#111;'> 75.81KB </span>","children":null,"spread":false},{"title":"Probabilistic clustering with Gaussian Mixture Models.ipynb <span style='color:#111;'> 69.51KB </span>","children":null,"spread":false},{"title":"Doing basic classifications with Decision Trees.ipynb <span style='color:#111;'> 61.55KB </span>","children":null,"spread":false},{"title":"Using KMeans for outlier detection-checkpoint.ipynb <span style='color:#111;'> 59.90KB </span>","children":null,"spread":false},{"title":"Using KMeans for outlier detection.ipynb <span style='color:#111;'> 59.90KB </span>","children":null,"spread":false},{"title":"Using LDA for classification.ipynb <span style='color:#111;'> 58.32KB </span>","children":null,"spread":false},{"title":"A linear model in the presence of outliers.ipynb <span style='color:#111;'> 55.17KB </span>","children":null,"spread":false},{"title":"A linear model in the presence of outliers-checkpoint.ipynb <span style='color:#111;'> 55.17KB </span>","children":null,"spread":false},{"title":"Varying the Classification Threshold in Logistic Regression.ipynb <span style='color:#111;'> 45.89KB </span>","children":null,"spread":false},{"title":"Optimizing the ridge regression parameter.ipynb <span style='color:#111;'> 43.43KB </span>","children":null,"spread":false},{"title":"Using ridge regression to overcome linear regression's shortfalls-checkpoint.ipynb <span style='color:#111;'> 41.17KB </span>","children":null,"spread":false},{"title":"Using ridge regression to overcome linear regression's shortfalls.ipynb <span style='color:#111;'> 41.17KB </span>","children":null,"spread":false},{"title":"Tuning a Random Forest-checkpoint.ipynb <span style='color:#111;'> 40.76KB </span>","children":null,"spread":false},{"title":"Tuning a Random Forest.ipynb <span style='color:#111;'> 40.76KB </span>","children":null,"spread":false},{"title":"Evaluating the linear regression model.ipynb <span style='color:#111;'> 35.75KB </span>","children":null,"spread":false},{"title":"Tuning Gradient Boosting Trees.ipynb <span style='color:#111;'> 34.94KB </span>","children":null,"spread":false},{"title":"Decision Trees for Regression-checkpoint.ipynb <span style='color:#111;'> 34.55KB </span>","children":null,"spread":false},{"title":"Decision Trees for Regression.ipynb <span style='color:#111;'> 34.55KB 
</span>","children":null,"spread":false},{"title":"Cross-validation with ShuffleSplit.ipynb <span style='color:#111;'> 31.90KB </span>","children":null,"spread":false},{"title":"Tuning a Decision Tree-checkpoint.ipynb <span style='color:#111;'> 31.59KB </span>","children":null,"spread":false},{"title":"Writing A Stacking Agreggator with Scikit-Learn-checkpoint.ipynb <span style='color:#111;'> 27.15KB </span>","children":null,"spread":false},{"title":"Writing A Stacking Agreggator with Scikit-Learn.ipynb <span style='color:#111;'> 27.15KB </span>","children":null,"spread":false},{"title":"Finding the closest object in the feature space-checkpoint.ipynb <span style='color:#111;'> 25.28KB </span>","children":null,"spread":false},{"title":"Finding the closest object in the feature space.ipynb <span style='color:#111;'> 25.28KB </span>","children":null,"spread":false},{"title":"Scaling data to the standard normal.ipynb <span style='color:#111;'> 24.74KB </span>","children":null,"spread":false},{"title":"Random Forest Regression-checkpoint.ipynb <span style='color:#111;'> 23.25KB </span>","children":null,"spread":false},{"title":"Random Forest Regression.ipynb <span style='color:#111;'> 23.25KB </span>","children":null,"spread":false},{"title":"ROC Analysis-checkpoint.ipynb <span style='color:#111;'> 22.49KB </span>","children":null,"spread":false},{"title":"Creating sample data for toy analysis.ipynb <span style='color:#111;'> 21.78KB </span>","children":null,"spread":false},{"title":"Viewing the Iris Dataset with Pandas-checkpoint.ipynb <span style='color:#111;'> 21.59KB </span>","children":null,"spread":false},{"title":"Viewing the Iris Dataset with Pandas.ipynb <span style='color:#111;'> 20.57KB </span>","children":null,"spread":false},{"title":"Writing A Stacking Agreggator with Scikit-Learn Extra.ipynb <span style='color:#111;'> 20.36KB </span>","children":null,"spread":false},{"title":"Stacking with a Neural Network-checkpoint.ipynb <span style='color:#111;'> 19.85KB </span>","children":null,"spread":false},{"title":"Stacking with a Neural Network.ipynb <span style='color:#111;'> 19.85KB </span>","children":null,"spread":false},{"title":"Optimizing the number of centroids-checkpoint.ipynb <span style='color:#111;'> 19.77KB </span>","children":null,"spread":false},{"title":"Optimizing the number of centroids.ipynb <span style='color:#111;'> 19.77KB </span>","children":null,"spread":false},{"title":"ROC Analysis.ipynb <span style='color:#111;'> 16.22KB </span>","children":null,"spread":false},{"title":"Putting it All Together UCI Breast Cancer Data Set-checkpoint.ipynb <span style='color:#111;'> 15.29KB </span>","children":null,"spread":false},{"title":"Putting it All Together UCI Breast Cancer Data Set.ipynb <span style='color:#111;'> 15.29KB </span>","children":null,"spread":false},{"title":"Fitting a line through data-checkpoint.ipynb <span style='color:#111;'> 13.49KB </span>","children":null,"spread":false},{"title":"Fitting a line through data.ipynb <span style='color:#111;'> 13.49KB </span>","children":null,"spread":false},{"title":"Using truncated SVD to reduce dimensionality.ipynb <span style='color:#111;'> 11.77KB </span>","children":null,"spread":false},{"title":"Using stochastic gradient descent for regression.ipynb <span style='color:#111;'> 11.45KB </span>","children":null,"spread":false},{"title":"Create a Simple Estimator-checkpoint.ipynb <span style='color:#111;'> 11.44KB </span>","children":null,"spread":false},{"title":"Create a Simple Estimator.ipynb <span 
style='color:#111;'> 11.44KB </span>","children":null,"spread":false},{"title":"Numpy Basics.ipynb <span style='color:#111;'> 11.41KB </span>","children":null,"spread":false},{"title":"Viewing the Iris Dataset-checkpoint.ipynb <span style='color:#111;'> 11.02KB </span>","children":null,"spread":false},{"title":"Viewing the Iris Dataset.ipynb <span style='color:#111;'> 11.02KB </span>","children":null,"spread":false},{"title":"Feature selection.ipynb <span style='color:#111;'> 10.86KB </span>","children":null,"spread":false},{"title":"Fitting a line through data with Machine Learning-checkpoint.ipynb <span style='color:#111;'> 10.03KB </span>","children":null,"spread":false},{"title":"Fitting a line through data with Machine Learning.ipynb <span style='color:#111;'> 10.03KB </span>","children":null,"spread":false},{"title":"Classification Metrics-checkpoint.ipynb <span style='color:#111;'> 8.96KB </span>","children":null,"spread":false},{"title":"Classification Metrics.ipynb <span style='color:#111;'> 8.96KB </span>","children":null,"spread":false},{"title":"Perceptron Classifier-checkpoint.ipynb <span style='color:#111;'> 8.85KB </span>","children":null,"spread":false},{"title":"Perceptron Classifier.ipynb <span style='color:#111;'> 8.85KB </span>","children":null,"spread":false},{"title":"Plotting an ROC Curve without Context-checkpoint.ipynb <span style='color:#111;'> 8.23KB </span>","children":null,"spread":false},{"title":"Plotting an ROC Curve without Context.ipynb <span style='color:#111;'> 8.23KB </span>","children":null,"spread":false},{"title":"Classifying documents with Naive Bayes.ipynb <span style='color:#111;'> 7.97KB </span>","children":null,"spread":false},{"title":"Randomized Search with scikit-learn-checkpoint.ipynb <span style='color:#111;'> 7.73KB </span>","children":null,"spread":false},{"title":"Randomized Search with scikit-learn.ipynb <span style='color:#111;'> 7.73KB </span>","children":null,"spread":false},{"title":"Working with categorical variables-checkpoint.ipynb <span style='color:#111;'> 7.01KB </span>","children":null,"spread":false},{"title":"Working with categorical variables.ipynb <span style='color:#111;'> 7.01KB </span>","children":null,"spread":false},{"title":"Imputing missing values through various strategies-checkpoint.ipynb <span style='color:#111;'> 6.87KB </span>","children":null,"spread":false},{"title":"Imputing missing values through various strategies.ipynb <span style='color:#111;'> 6.87KB </span>","children":null,"spread":false},{"title":"Machine Learning with Logistic Regression.ipynb <span style='color:#111;'> 6.85KB </span>","children":null,"spread":false},{"title":"Regression Metrics-checkpoint.ipynb <span style='color:#111;'> 6.80KB </span>","children":null,"spread":false},{"title":"Regression Metrics.ipynb <span style='color:#111;'> 6.80KB </span>","children":null,"spread":false},{"title":"......","children":null,"spread":false},{"title":"<span style='color:steelblue;'>文件过多,未全部展示</span>","children":null,"spread":false}],"spread":true}]

