Assessing Teacher Dispositions: Five Standards-Based Steps to Valid Measurement Using the DAATS Model

Contents
List of Tables, Figures, and Boxes x
List of Abbreviations and Acronyms xv
Disposition Assessments Aligned With Teacher Standards (DAATS) Steps and Worksheets xvi
Foreword Richard C. Kunkel xix
Preface xxi
Acknowledgments xxv
About the Authors xxvi
What Are Dispositions, and Why Should We Measure Them? 1
The Importance of Measuring Dispositions 2
The Challenge 3
What Are Standards-Based Dispositions? 4
Hierarchical Relationships Among Knowledge, Skills, and Dispositions 6
Remembering Bloom 7
Dispositions and Accreditation: Requirements and Definitions 7
Measuring Dispositions: Sources of Confusion 10
Measuring Dispositions: Morals, Ethics, or Standards Based? 11
Different Construct, Different Assessments, Similar Assessment Design Process 14
Wrap-Up 15
Questions for Exploration 16
What Have You Noticed? 17
Assessment Belief Scale: Beliefs About Assessment 18
Cognitive, Affective, and Psychomotor Objectives and Assessments 20
Methods for Assessing Dispositions 21
A Conceptual Framework for Measuring Dispositions 21
Measuring Teacher Dispositions: The State of the Art 23
Back to Basics: Bloom and Krathwohl 24
Available Methods for Measuring Dispositions or Affect 26
Selected-Response Methods 27
Constructed-Response Methods 28
Observed Performance 30
Projective Techniques 31
The Importance of Inference in Measuring Dispositions 31
Wrap-Up 34
Questions for Exploration 35
Bloom and the INTASC Principles 36
Field Work 37
Review Your Feelings 38
Where We Have Been So Far 40
DAATS Step 1: Assessment Design Inputs 41
Why Are Purpose, Use, Propositions, and Content So Important? 43
Define the Purpose(s) and Use(s) of the System 43
Define the Propositions or Principles That Guide the System 46
Define the Conceptual Framework or Content of the System 48
Review Local Factors That Impact the System 52
Wrap-Up 53
Purpose, Use, Propositions, Content, and Context Checksheet 55
Purpose, Use, and Content Draft 57
Propositions 58
Contextual Analysis 59
Where We Have Been So Far 60
DAATS Step 2: Planning With a Continuing Eye on Valid Assessment Decisions 61
Analyze Standards and Indicators 62
All Those Indicators 62
Why Bother? 65
Visualize the Teacher Demonstrating the Affective Targets 66
Select Assessment Methods at Different Levels of Inference 69
Build an Assessment Framework Correlating Standards and Methods 71
Wrap-Up 74
Organizing for Alignment (Version 1) 76
Organizing for Alignment (Version 2) 77
Visualizing the Dispositional Statements 78
Selecting Assessment Methods for INTASC Indicators 79
Assessment Methods for INTASC Indicators: Blueprint 80
Cost-Benefit and Coverage Analysis of Assessment Methods 81
Where We Have Been So Far 82
DAATS Step 3: Instrument Development 83
Draft Items and Directions for Each Instrument 84
Thurstone Agreement Scales 84
Questionnaires, Interviews, and Focus Groups 91
Observed Performance 96
Thematic Apperception Tests or Situation Reflection Assessment 100
Review Items for Applicability to Values, Domain Coverage, and Job Relevance 103
Wrap-Up 106
Creating Scales 108
Creating Questionnaires, Interviews, or K-12 Focus Group Protocols 109
Creating an Affective Behavior Checklist 110
Creating an Affective Behavior Rating Scale 111
Creating a Tally Sheet for Affective Observation 111
Checklist for Reviewing Scale Drafts 113
Review Sheets for Questionnaires and Interviews 114
Review Sheets for K-12 Focus Group Protocols 115
Checklist for Reviewing Observations and Behavior Checklists 116
Coverage Check 117
Rating Form for Stakeholder Review 118
Where We Have Been So Far 120
DAATS Step 4: Decision Making and Data Management 121
Develop Scoring Rubrics 122
Dichotomous Response Scoring Keys 122
Rating Scale Rubrics 123
Determine How Data Will Be Combined and Used 127
Need for Shared Data 127
Data Storage 127
Data Aggregation 128
Develop Implementation Procedures and Materials 134
Preponderance of the Evidence Versus Cut Scores 134
Advising and Due Process 135
Scoring Procedures 137
Implementation 138
Wrap-Up 140
Explanation of Dichotomous Scoring Decisions 142
Rubric Design 143
Sample Format for Candidate/Teacher Tracking Form 144
Format for Data Aggregation 145
Sample Disposition Event Report 146
Management Plan 147
Where We Have Been So Far 148
DAATS Step 5: Credible Data 149
What Is Psychometric Integrity, and Why Do We Have to Worry About It? 150
Create a Plan to Provide Evidence of Validity, Reliability, Fairness, and Utility 151
Elements of a Plan 151
Purpose and Use 152
Construct Measured 153
Interpretation and Reporting of Scores 154
Assessment Specifications and Content Map 155
Assessor/Rater Selection and Training Procedures 156
Analysis Methodology 156
External Review Personnel and Methodology 157
Evidence of Validity, Reliability, and Fairness (VRF) 157
Implement the Plan Conscientiously 173
Wrap-Up 174
Assessment Specifications 176
Analysis of Appropriateness of Decisions for Teacher Failures 177
Analysis of Rehire Data 178
Program Improvement Record 179
Expert Rescoring 180
Fairness Review 181
Analysis of Remediation Efforts and Equal Opportunity (EO) Impact 182
Psychometric Plan Format 183
Logistic Ruler for Content Validity 184
Computation of the Lawshe (1975) Content Validity Ratio (CVR) 186
Disparate-Impact Analysis 187
Computation of Cohen's (1960) Kappa for Inter-Rater Reliability 190
Two Pearson Correlation Coefficients and Scatterplots: Disposition Scores Correlated With Praxis and Portfolio Scores 192
Spearman Correlation Coefficient and Scatterplot: Disposition Scores Correlated With Principal Ratings 195
Correlation Matrix and Scatterplots for Knowledge, Impact, Dispositions, Skills (KIDS) 197
t-Test Comparing Dispositions of Mathematics and Science Teachers 199
DIP Analysis for Programs 201
Using Teacher Scores for Continuous Improvement 203
Reasons Why We Use the Rasch Model 204
The Classical Approach 206
A Quick Overview of Where Rasch Fits Into the Grand Scheme of IRT Models 208
Rasch: The Basics 208
Getting Started 210
Differences That Item Writers Make 211
Guttman Scaling 211
A Sample Rasch Ruler 213
From Pictures to Numbers 214
The Fit Statistic 219
Gain Scores: Real or Imagined? 221
Ratings and Raters 221
Learning More About Rasch 227
Wrap-Up 228
A Decision-Making Tool for Measurement 229
Legal and Psychometric Issues: The Return of the Pied Piper 231
Why Not Portfolios? 232
Why the Pied Piper? 232
What If? A Legal Scenario: Mary Beth JoAnne Sues XYZ University 233
MBJ Helps Us to Understand the Convergence of Psychometrics and Legal Requirements 233
Background "Facts" 234
Scenario #1 236
Scenarios #2, #3, and #4 237
Psychometric Issues and Legal Challenges in the Real World 238
Legal Issues and Precedents 240
Three Landmark Dispositions Cases in Two Years 241
Tide Changing in NCATE 242
Standards Are the Vanguard! 243
MBJ Revisited 244
End Note 245
INTASC Principles and Disposition Indicators 247
Glossary 251
References 261
Index 267