Imperial Library

Index
Cover Page
Table of Contents
Title Page
Copyright
Preface
About the Software
0 Basic Prerequisite Knowledge
0.1 Distributions: Normal, t, and F
0.2 Confidence Intervals (or Bands) and t-Tests
0.3 Elements of Matrix Algebra
1 Fitting a Straight Line by Least Squares
1.0 Introduction: The Need for Statistical Analysis
1.1 Straight Line Relationship Between Two Variables
1.2 Linear Regression: Fitting a Straight Line by Least Squares
1.3 The Analysis of Variance
1.4 Confidence Intervals and Tests for β0 and β1
1.5 F-Test for Significance of Regression
1.6 The Correlation Between X and Y
1.7 Summary of the Straight Line Fit Computations
1.8 Historical Remarks
Appendix 1A Steam Plant Data
Exercises are in “Exercises for Chapters 1–3”
2 Checking the Straight Line Fit
2.1 Lack of Fit and Pure Error
2.2 Testing Homogeneity of Pure Error
2.3 Examining Residuals: The Basic Plots
2.4 Non-normality Checks on Residuals
2.5 Checks for Time Effects, Nonconstant Variance, Need for Transformation, and Curvature
2.6 Other Residuals Plots
2.7 Durbin–Watson Test
2.8 Reference Books for Analysis of Residuals
Appendix 2A Normal Plots
Appendix 2B MINITAB Instructions
Exercises are in “Exercises for Chapters 1–3”
3 Fitting Straight Lines: Special Topics
3.0 Summary and Preliminaries
3.1 Standard Error of Y
3.2 Inverse Regression (Straight Line Case)
3.3 Some Practical Design of Experiment Implications of Regression
3.4 Straight Line Regression When Both Variables Are Subject to Error
Exercises for Chapters 1–3
4 Regression in Matrix Terms: Straight Line Case
4.1 Fitting a Straight Line in Matrix Terms
4.2 Singularity: What Happens in Regression to Make X′X Singular? An Example
4.3 The Analysis of Variance in Matrix Terms
4.4 The Variances and Covariance of b0 and b1 from the Matrix Calculation
4.5 Variance of Y Using the Matrix Development
4.6 Summary of Matrix Approach to Fitting a Straight Line (Nonsingular Case)
4.7 The General Regression Situation
Exercises for Chapter 4
5 The General Regression Situation
5.1 General Linear Regression
5.2 Least Squares Properties
5.3 Least Squares Properties When ε ~ N(0, Iσ²)
5.4 Confidence Intervals Versus Regions
5.5 More on Confidence Intervals Versus Regions
Appendix 5A Selected Useful Matrix Results
Exercises are in “Exercises for Chapters 5 and 6”
6 Extra Sums of Squares and Tests for Several Parameters Being Zero
6.1 The “Extra Sum of Squares” Principle
6.2 Two Predictor Variables: Example
6.3 Sum of Squares of a Set of Linear Functions of Y’s
Appendix 6A Orthogonal Columns in the X Matrix
Appendix 6B Two Predictors: Sequential Sums of Squares
Exercises for Chapters 5 and 6
7 Serial Correlation in the Residuals and the Durbin–Watson Test
7.1 Serial Correlation in Residuals
7.2 The Durbin–Watson Test for a Certain Type of Serial Correlation
7.3 Examining Runs in the Time Sequence Plot of Residuals: Runs Test
Exercises for Chapter 7
8 More on Checking Fitted Models
8.1 The Hat Matrix H and the Various Types of Residuals
8.2 Added Variable Plot and Partial Residuals
8.3 Detection of Influential Observations: Cook’s Statistics
8.4 Other Statistics Measuring Influence
8.5 Reference Books for Analysis of Residuals
Exercises for Chapter 8
9 Multiple Regression: Special Topics
9.1 Testing a General Linear Hypothesis
9.2 Generalized Least Squares and Weighted Least Squares
9.3 An Example of Weighted Least Squares
9.4 A Numerical Example of Weighted Least Squares
9.5 Restricted Least Squares
9.6 Inverse Regression (Multiple Predictor Case)
9.7 Planar Regression When All the Variables Are Subject to Error
Appendix 9A Lagrange’s Undetermined Multipliers
Exercises for Chapter 9
10 Bias in Regression Estimates, and Expected Values of Mean Squares and Sums of Squares
10.1 Bias in Regression Estimates
10.2 The Effect of Bias on the Least Squares Analysis of Variance
10.3 Finding the Expected Values of Mean Squares
10.4 Expected Value of Extra Sum of Squares
Exercises for Chapter 10
11 On Worthwhile Regressions, Big F’s, and R²
11.1 Is My Regression a Useful One?
11.2 A Conversation About R²
Appendix 11A How Significant Should My Regression Be?
Exercises for Chapter 11
12 Models Containing Functions of the Predictors, Including Polynomial Models
12.1 More Complicated Model Functions
12.2 Worked Examples of Second-Order Surface Fitting for k = 3 and k = 2 Predictor Variables
12.3 Retaining Terms in Polynomial Models
Exercises for Chapter 12
13 Transformation of the Response Variable
13.1 Introduction and Preliminary Remarks
13.2 Power Family of Transformations on the Response: Box–Cox Method
13.3 A Second Method for Estimating λ
13.4 Response Transformations: Other Interesting and Sometimes Useful Plots
13.5 Other Types of Response Transformations
13.6 Response Transformations Chosen to Stabilize Variance
Exercises for Chapter 13
14 “Dummy” Variables
14.1 Dummy Variables to Separate Blocks of Data with Different Intercepts, Same Model
14.2 Interaction Terms Involving Dummy Variables
14.3 Dummy Variables for Segmented Models
Exercises for Chapter 14
15 Selecting the “Best” Regression Equation
15.0 Introduction
15.1 All Possible Regressions and “Best Subset” Regression
15.2 Stepwise Regression
15.3 Backward Elimination
15.4 Significance Levels for Selection Procedures
15.5 Variations and Summary
15.6 Selection Procedures Applied to the Steam Data
Appendix 15A Hald Data, Correlation Matrix, and All 15 Possible Regressions
Exercises for Chapter 15
16 Ill-Conditioning in Regression Data
16.1 Introduction
16.2 Centering Regression Data
16.3 Centering and Scaling Regression Data
16.4 Measuring Multicollinearity
16.5 Belsley’s Suggestion for Detecting Multicollinearity
Appendix 16A Transforming X Matrices to Obtain Orthogonal Columns
Exercises for Chapter 16
17 Ridge Regression
17.1 Introduction
17.2 Basic Form of Ridge Regression
17.3 Ridge Regression of the Hald Data
17.4 In What Circumstances Is Ridge Regression Absolutely the Correct Way to Proceed?
17.5 The Phoney Data Viewpoint
17.6 Concluding Remarks
Appendix 17A Ridge Estimates in Terms of Least Squares Estimates
Appendix 17B Mean Square Error Argument
Appendix 17C Canonical Form of Ridge Regression
Exercises for Chapter 17
18 Generalized Linear Models (GLIM)
18.1 Introduction
18.2 The Exponential Family of Distributions
18.3 Fitting Generalized Linear Models (GLIM)
18.4 Performing the Calculations: An Example
18.5 Further Reading
Exercises for Chapter 18
19 Mixture Ingredients as Predictor Variables
19.1 Mixture Experiments: Experimental Spaces
19.2 Models for Mixture Experiments
19.3 Mixture Experiments in Restricted Regions
19.4 Example 1
19.5 Example 2
Appendix 19A Transforming k Mixture Variables to k – 1 Working Variables
Exercises for Chapter 19
20 The Geometry of Least Squares
20.1 The Basic Geometry
20.2 Pythagoras and Analysis of Variance
20.3 Analysis of Variance and F-Test for Overall Regression
20.4 The Singular X′X Case: An Example
20.5 Orthogonalizing in the General Regression Case
20.6 Range Space and Null Space of a Matrix M
20.7 The Algebra and Geometry of Pure Error
Appendix 20A Generalized Inverses M⁻
Exercises for Chapter 20
21 More Geometry of Least Squares
21.1 The Geometry of a Null Hypothesis: A Simple Example
21.2 General Case H0: Aβ = c: The Projection Algebra
21.3 Geometric Illustrations
21.4 The F-Test for H0, Geometrically
21.5 The Geometry of R²
21.6 Change in R² for Models Nested Via Aβ = 0, Not Involving β0
21.7 Multiple Regression with Two Predictor Variables as a Sequence of Straight Line Regressions
Exercises for Chapter 21
22 Orthogonal Polynomials and Summary Data
22.1 Introduction
22.2 Orthogonal Polynomials
22.3 Regression Analysis of Summary Data
Exercises for Chapter 22
23 Multiple Regression Applied to Analysis of Variance Problems
23.1 Introduction
23.2 The One-Way Classification: Standard Analysis and an Example
23.3 Regression Treatment of the One-Way Classification Example
23.4 Regression Treatment of the One-Way Classification Using the Original Model
23.5 Regression Treatment of the One-Way Classification: Independent Normal Equations
23.6 The Two-Way Classification with Equal Numbers of Observations in the Cells: An Example
23.7 Regression Treatment of the Two-Way Classification Example
23.8 The Two-Way Classification with Equal Numbers of Observations in the Cells
23.9 Regression Treatment of the Two-Way Classification with Equal Numbers of Observations in the Cells
23.10 Example: The Two-Way Classification
23.11 Recapitulation and Comments
Exercises for Chapter 23
24 An Introduction to Nonlinear Estimation
24.1 Least Squares for Nonlinear Models
24.2 Estimating the Parameters of a Nonlinear System
24.3 An Example
24.4 A Note on Reparameterization of the Model
24.5 The Geometry of Linear Least Squares
24.6 The Geometry of Nonlinear Least Squares
24.7 Nonlinear Growth Models
24.8 Nonlinear Models: Other Work
24.9 References
Exercises for Chapter 24
25 Robust Regression
25.1 Least Absolute Deviations Regression (L1 Regression)
25.2 M-Estimators
25.3 Steel Employment Example
25.4 Trees Example
25.5 Least Median of Squares (LMS) Regression
25.6 Robust Regression with Ranked Residuals (rreg)
25.7 Other Methods
25.8 Comments and Opinions
25.9 References
Exercises for Chapter 25
26 Resampling Procedures (Bootstrapping)
26.1 Resampling Procedures for Regression Models
26.2 Example: Straight Line Fit
26.3 Example: Planar Fit, Three Predictors
26.4 Reference Books
Appendix 26A Sample MINITAB Programs to Bootstrap Residuals for a Specific Example
Appendix 26B Sample MINITAB Programs to Bootstrap Pairs for a Specific Example
Additional Comments
Exercises for Chapter 26
Bibliography
True/False Questions
Answers to Exercises
Tables
Normal Distribution
Percentage Points of the t-Distribution
Percentage Points of the χ²-Distribution
Percentage Points of the F-Distribution
Index of Authors Associated with Exercises
Index
