
Index
Dedication
Title Page
Copyright Page
PREFACE
Acknowledgements

Chapter 1 - BACKGROUND
  1.1 CLASSIFIERS AS PARTITIONS
  1.2 USE OF DATA IN CONSTRUCTING CLASSIFIERS
  1.3 THE PURPOSES OF CLASSIFICATION ANALYSIS
  1.4 ESTIMATING ACCURACY
  1.5 THE BAYES RULE AND CURRENT CLASSIFICATION PROCEDURES

Chapter 2 - INTRODUCTION TO TREE CLASSIFICATION
  2.1 THE SHIP CLASSIFICATION PROBLEM
  2.2 TREE STRUCTURED CLASSIFIERS
  2.3 CONSTRUCTION OF THE TREE CLASSIFIER
  2.4 INITIAL TREE GROWING METHODOLOGY
  2.5 METHODOLOGICAL DEVELOPMENT
  2.6 TWO RUNNING EXAMPLES
  2.7 THE ADVANTAGES OF THE TREE STRUCTURED APPROACH

Chapter 3 - RIGHT SIZED TREES AND HONEST ESTIMATES
  3.1 INTRODUCTION
  3.2 GETTING READY TO PRUNE
  3.3 MINIMAL COST-COMPLEXITY PRUNING
  3.4 THE BEST PRUNED SUBTREE: AN ESTIMATION PROBLEM
  3.5 SOME EXAMPLES
  APPENDIX

Chapter 4 - SPLITTING RULES
  4.1 REDUCING MISCLASSIFICATION COST
  4.2 THE TWO-CLASS PROBLEM
  4.3 THE MULTICLASS PROBLEM: UNIT COSTS
  4.4 PRIORS AND VARIABLE MISCLASSIFICATION COSTS
  4.5 TWO EXAMPLES
  4.6 CLASS PROBABILITY TREES VIA GINI
  APPENDIX

Chapter 5 - STRENGTHENING AND INTERPRETING
  5.1 INTRODUCTION
  5.2 VARIABLE COMBINATIONS
  5.3 SURROGATE SPLITS AND THEIR USES
  5.4 ESTIMATING WITHIN-NODE COST
  5.5 INTERPRETATION AND EXPLORATION
  5.6 COMPUTATIONAL EFFICIENCY
  5.7 COMPARISON OF ACCURACY WITH OTHER METHODS
  APPENDIX

Chapter 6 - MEDICAL DIAGNOSIS AND PROGNOSIS
  6.1 PROGNOSIS AFTER HEART ATTACK
  6.2 DIAGNOSING HEART ATTACKS
  6.3 IMMUNOSUPPRESSION AND THE DIAGNOSIS OF CANCER
  6.4 GAIT ANALYSIS AND THE DETECTION OF OUTLIERS
  6.5 RELATED WORK ON COMPUTER-AIDED DIAGNOSIS

Chapter 7 - MASS SPECTRA CLASSIFICATION
  7.1 INTRODUCTION
  7.2 GENERALIZED TREE CONSTRUCTION
  7.3 THE BROMINE TREE: A NONSTANDARD EXAMPLE

Chapter 8 - REGRESSION TREES
  8.1 INTRODUCTION
  8.2 AN EXAMPLE
  8.3 LEAST SQUARES REGRESSION
  8.4 TREE STRUCTURED REGRESSION
  8.5 PRUNING AND ESTIMATING
  8.6 A SIMULATED EXAMPLE
  8.7 TWO CROSS-VALIDATION ISSUES
  8.8 STANDARD STRUCTURE TREES
  8.9 USING SURROGATE SPLITS
  8.10 INTERPRETATION
  8.11 LEAST ABSOLUTE DEVIATION REGRESSION
  8.12 OVERALL CONCLUSIONS

Chapter 9 - BAYES RULES AND PARTITIONS
  9.1 BAYES RULE
  9.2 BAYES RULE FOR A PARTITION
  9.3 RISK REDUCTION SPLITTING RULE
  9.4 CATEGORICAL SPLITS

Chapter 10 - OPTIMAL PRUNING
  10.1 TREE TERMINOLOGY
  10.2 OPTIMALLY PRUNED SUBTREES
  10.3 AN EXPLICIT OPTIMAL PRUNING ALGORITHM

Chapter 11 - CONSTRUCTION OF TREES FROM A LEARNING SAMPLE
  11.1 ESTIMATED BAYES RULE FOR A PARTITION
  11.2 EMPIRICAL RISK REDUCTION SPLITTING RULE
  11.3 OPTIMAL PRUNING
  11.4 TEST SAMPLES
  11.5 CROSS-VALIDATION
  11.6 FINAL TREE SELECTION
  11.7 BOOTSTRAP ESTIMATE OF OVERALL RISK
  11.8 END-CUT PREFERENCE

Chapter 12 - CONSISTENCY
  12.1 EMPIRICAL DISTRIBUTIONS
  12.2 REGRESSION
  12.3 CLASSIFICATION
  12.4 PROOFS FOR SECTION 12.1
  12.5 PROOFS FOR SECTION 12.2
  12.6 PROOFS FOR SECTION 12.3

BIBLIOGRAPHY
NOTATION INDEX
SUBJECT INDEX