Principles of Assessment in Medical Education Tejinder Singh, Anshu
INDEX
360° Assessment (see Multisource feedback)
A
Admission procedures 248-251
Medical College Admission Test (MCAT) 249-251
multiple mini-interviews (MMI) 249, 251
National Eligibility-cum-Entrance Test (NEET) 252-255, 257
UK Clinical Aptitude Test (UKCAT) 249-250
AETCOM 69, 174-175, 290
Angoff method (see Standard setting)
Assessment tools
acute care assessment tool (ACAT) 126, 194-195
direct observation-based assessment of clinical skills 114-137
directly observed procedural skills (DOPS) 125-126, 185-186, 189, 225
ethics (see Ethics) 169-171
long case (see Long case) 83-90
mini-clinical evaluation exercise (see Mini-clinical evaluation exercise) 117-125, 185-186, 188-189, 225
mini-peer assessment tool (mini-PAT) 128-129, 185-186, 193
multiple choice questions (see Multiple choice questions) 47-64
multiple mini-interview (MMI) 111, 143-147, 249-251
multisource feedback (360° assessment) 127-129, 170-171, 185-186, 192-193
objective structured clinical examination (see Objective structured clinical examination) 91-113
objective structured long examination record (OSLER) 87, 129-130
online assessment (see Online resources for assessment) 300-301, 371-383
oral examination (see Oral examination) 138-148
patient management problem 301-303
portfolios (see Portfolios) 151-164
professionalism (see Assessment of Professionalism) 169-171
selection type questions (see Selection type questions) 39-46
structured oral examination 142-143, 144-146
team assessment of behavior (TAB) 129, 185-186, 193-194
viva voce (see also Oral examination) 138-148
workplace-based assessment (WPBA) (see also Workplace-based assessment tools) 184-196
written assessment (see Written assessment) 32
Assessment
assessment as learning 20, 26-28, 263
assessment for learning (see Assessment for learning) 20, 25, 28, 233-246, 263, 266
assessment of learning 20-25, 234, 263
assessment versus evaluation 2
attributes of good assessment 6
basic concepts 1-17
clinical competence 18-29, 84, 117, 120, 179, 275, 301, 353, 359
community-based (see Community-based assessment) 221-232
competency-based (see Competency-based assessment) 206-220
COLE framework 16
criterion-referenced versus norm-referenced 5
difference between assessment of learning and assessment for learning 234
end of training 288
ethics 165-177
expert judgment 28, 122, 262, 267, 357, 359
for selection (see Admission procedures) 247-260
formative (see Assessment for learning) 236
in-training 288
measurement versus assessment 2
objective 352-353, 357
objective versus subjective 359-361
online (see Online assessment) 296-307
professionalism (see Professionalism) 165-177
programmatic (see Programmatic assessment) 243-244, 261-277
purposes of 3
reducing assessment stress 15
subjective (see Subjective expert judgment) 353, 357
summative assessment limitations 234-235, 279-280
summative versus formative 36
test versus tool 3
triangulation of data 267
types 3
utility 15, 262
Vleuten's formula 15, 23-24, 262-263 (see also Utility of assessment)
workplace-based assessment (WPBA) (see Workplace-based assessment) 178-205
written 30-39
Assessment as learning 5, 20, 26, 263, 297
Assessment for learning 20, 25, 28, 233-246, 263, 266
attributes 237-240
cycle 236
effect size of feedback 235
faculty development for 242-245
methods 240-241
strengths 235
SWOT analysis 242
Assessment of learning 20-25, 234, 263
B
Bloom's levels 33
taxonomy 33
MCQ writing 56-58
Blueprinting 24
OSCE 101-102, 117
question paper setting 71-72
C
Checklists versus global ratings 358-359
Clinical competence 18
Newble's model 31
COLE framework 16
Community-based assessment 221-232
4R model 223-226
clinical axis 223-224
evidence axis 223-224
personal axis 223-224
social axis 223-224
methods 224-226
direct observation of field skills 225
direct observation of professional skills 225, 230-231
directly observed procedural skills 225
family study assessment 225
logbook 225
mini-CEX 225
multisource feedback 225, 231
objective structured clinical examination 225
observation by community stakeholders 225
portfolios 225, 226
project assessment 225, 229-230
reflective writing 225, 228-229
assessment 228-229
rubrics 228-229
self-assessment of professional skills 225, 227
Paul Worley's framework 223-226
principles 222-223
Community-oriented medical education 221-223
Competency frameworks 19
Competency-based assessment 206-220
design 212-216
prerequisites 210
principles 210-212
Competency-based medical education (CBME) 206, 264
assessment, competency-based 206-220
Competency
core competencies 19
definition 206
Dreyfus and Dreyfus model 207
frameworks
ACGME competencies 19, 207
CanMEDS competencies 19, 207
five-star doctor 19
General Medical Council competencies 19, 207
Indian Medical Graduate (IMG) 19
Medical Council of India 19, 207
Tomorrow's Doctors 19, 207
ideal doctor 19
milestones 208
roles of Indian Medical Graduate 19
sub-competencies 207
Construct 9, 353
construct formulation 353
construct-irrelevant variance 10, 353
construct underrepresentation 9, 353
Constructivism 267
Contrasting groups method (see Standard setting) 326-327
Cronbach's alpha (see Reliability)
D
Direct observation-based assessment of clinical skills 114-137
360° team assessment of behavior (TAB) 129
acute care assessment tool (ACAT) 126
directly observed procedural skills (DOPS) 125-126, 185-186, 189, 194
mini-clinical evaluation exercise (mini-CEX) 117-125, 188-189, 225, 356
mini-peer assessment tool (mini-PAT) 128-129
multisource feedback (360° assessment) 127-129
OSCE 90-113, 116, 131, 299, 356, 359
OSLER 129-130
professionalism mini evaluation exercise (PMEX) 127
tools (see Assessment tools) 115
Directly observed procedural skills (DOPS) 125-126, 185-186, 189, 225
Dreyfus and Dreyfus model 207-210, 213-216
E
Educational environment 239-240
Educational feedback (see Feedback to students)
Educational impact 14, 262
Educational system 236
Entrustable professional activities (EPA) 207, 213-219
designing EPAs 208, 213-217
EPA versus specific learning objectives (SLO) 210
stages of entrustment 213
Ethics
AETCOM 174-175
attributes 166
autonomy 166
beneficence 166
dignity 166
justice 166
non-maleficence 166
difference between professionalism and ethics 166
narratives 173-174
Evaluation 2
Evaluation of teaching (see Student ratings of teaching effectiveness) 342-351
Expert judgment (see Subjective expert judgment) 357, 359-361
F
Faculty development for better assessment 364-370
assessment for learning 242-245
for better assessment 364-370
formal approaches 367
informal approaches 367
model program for training 368-370
objective structured clinical examination 100-101
transfer-oriented training 368
workplace-based assessment (WPBA) 199
Feasibility of assessment 14
Feedback (see also Feedback, educational; Feedback, from students; Feedback, to students) 25-26, 238-239
Feedback, educational 329-341
attributes 332-334
definition 329-330
descriptive 334-335
feedback loop 330-331
immediate feedback assessment technique 337
issues 338-339
models 335-336
feedback sandwich 335
PCP model 335
Pendleton model 335
reflective model 336
SET-GO model 336
STAR model 336
stop-start-continue model 336
opportunities 336-338
self-monitoring 337-338
strategies for improvement 339-340
types 332-333
benchmarking 332-333
correction 332-333
diagnosis 332-333
longitudinal development 332-333
reinforcement 332-333
Feedback, from students (see Student ratings of teaching effectiveness) 342-351
Feedback, to students (see Feedback, educational) 6, 15, 25, 26, 63, 86, 92, 103, 115, 117, 123, 130, 170, 180, 188, 211, 234, 237, 263, 266, 268, 282, 329-341
Feed forward 25-26
H
Hofstee method (see Standard setting) 327
I
Internal assessment 278-284
1997 MCI regulations 278-279
2019 MCI regulations 279
formative or summative 280-281
issues 286-288
principles 283
quarter model (see Quarter model) 281, 285-295
reliability 281-282
strengths 255-256, 278-284, 286
validity 282
Item analysis 308-318
item statistics 308-313
difficulty index 309-310
discrimination index 309-311
distractor efficiency 309, 311
facility value 309-310
point biserial correlation 311-313
test analysis 313-318
methods of estimating reliability 314-318
equivalent-forms reliability 314
internal consistency reliability 314-315
Cronbach's alpha 315-316
KR-20 formula 315
Kuder-Richardson formula 315
split-half method 315
standard error of measurement 317-318
parallel-forms reliability 314
test-retest reliability 314
reliability coefficient 284, 313-314
K
Knowledge
assessment of knowledge (see Written assessment) 30-46
free response type questions 30-39
multiple choice questions (see Multiple choice questions) 47-64
selection type questions (see Selection type questions) 39-46
type A (declarative) 31
type B (procedural) 31
Kolb's learning cycle 155, 330-331
Kuder-Richardson formula (see Item analysis) 315
L
Logbook 185-187, 225
Long case 83-90
comparison with mini-CEX 131-133
comparison with OSCE 131-133
issues 84-85
OSLER 83, 260
process 83-84
strategies for improvement 85-89
M
Mentoring 26, 153, 244-245, 266, 271
Miller pyramid 20-23, 30-31, 92, 114-115
Mini-clinical evaluation exercise (mini-CEX) 117-125, 185-186, 188-189, 225
comparison with long case 131-133
comparison with OSCE 131-133
form 119-120
process 118-121
strengths 122
Mini-peer assessment tool (mini-PAT) 128-129, 185-186, 193
Modified essay questions (MEQ) 31, 35-36
Multiple choice questions (MCQs) 47-64
challenges of using MCQs 48-49
conducting MCQ tests 48
guidelines for writing MCQs 50-55
negative marking 60-61
optical mark reading scanners 59
scoring MCQs 58-60
standard setting 62
strengths of MCQs 48
structure of an MCQ 49
Multiple mini-interview (MMI) 111, 143-147, 249-251
Multisource feedback (360° assessment) 127-129, 170-171, 185-186, 192-193
N
Narratives 173-174
critical incident technique 173
portfolios 173-174
O
Objectification 357
Objective structured clinical examination (OSCE) 91-113
admission OSCE 111
blueprinting 101-102, 117
checklists versus global ratings 105, 116, 358-359
comparison with long case 131-133
comparison with mini-CEX 131-133
computer assisted OSCE (CA-OSCE) 110
examiner training 100-101
factors affecting utility 106-108
feasibility 106-107
group OSCE (GOSCE) 109
key features 93
modifications and innovations 109-111
multiple mini-interview (see Multiple mini-interview) 111
objectivity 107
reliability 108
remote OSCE (ReOSCE) 110
resources to conduct OSCE online 380-381
setup 97-103
simulated patients 100-101
standard setting (see Standard setting) 103-104
team OSCE (TOSCE) 109-110
telemedicine OSCE (TeleOSCE) 110
types of stations 93-97, 116
procedure stations 94, 95-97
question stations 94-95
rest station 97
validity 107-108
Objectivity 2, 108, 212, 282
reliability versus objectivity 352-363
Observable practice activities (OPA) 210
Online assessment 296-307
automation 298
cheating 305-306
consortia 306
designing 297-304
electronic patient management problem 301-303
implementation 304-307
methods 300-301
open-book exams 305
plagiarism 305
question formats 299-300
sharing resources 306
skill labs 306
take-home exams 305-306
triage 306-307
types of questions 300-301
Online resources for assessment 371-383
e-portfolios 379-380
for creating, distributing and grading assessments 373
for high-stakes examinations 381-382
for online collaboration 379
gamification apps 376-377
interactive tools for formative assessment 373-375
learning management systems 372-373
online security 381-382
proctor devices 381-382
quiz apps 376-377
to conduct online OSCE 380-381
to conduct online simulations 380-381
to create interactive videos 377-378
to create online polls 378-379
to create online surveys 378-379
to create rubrics 381
to enhance student engagement 373-375
Oral examination (viva voce) 138-148
cost-effectiveness 141-142
examiner training 147-148
flaws 139
halo effect 140
objectivity 139-140
reliability 140-141
strategies for improvement 142-148
strengths 139
structured oral examination 142-143, 144-146
validity 141
OSLER (see Direct observation-based assessment of clinical skills) 83, 129-130, 260
P
Patient management problem 301-303
Portfolios 151-164, 185-186
advantages 159-160
challenges 162-163
contents 152-154
definition 151-152
e-portfolios 379-380
for assessment 157-159, 173-174
for learning 152-157, 241
implementation 161-162
limitations 160-161
reflective writing 154-157
workplace-based assessment 185-186
Professionalism 165-177
AETCOM 174-175
altruism 166
assessment methods 169-171
principles 167-169
attributes 166
challenges 167-169
conscientiousness index 175
definition 165-167
difference between professionalism and ethics 166
multisource feedback 170-171
narratives 173-174
critical incident technique 173
portfolios 173-174
patient assessment 170, 171
peer assessment 170, 241
professional competence 166
professional identity formation 174-175
professionalism mini-evaluation exercise (PMEX) 127, 172-173
self-assessment 169-170
supervisor ratings 170, 172
Professionalism mini-evaluation exercise (PMEX) 127, 172-173
Programmatic assessment 11, 26, 243-244, 261-277
CBME 264, 270
challenges 273-276
components 264-270
implementation 271-276
principles 264-267
rationale 262-264
traditional assessment versus programmatic assessment 268-270
triangulation of data 267
Q
Quarter Model 281, 285-295
format 289
implementation 289-293
Question banking 318-321
steps 319
uses 320-321
Question paper setting 65-82
blueprinting 71-72
determining weightage 68-70
item cards 73-76
limitations of conventional practices 66
moderation 77-78
steps for effective question paper setting 67-78
R
Reflections (see Reflective practice)
Reflective practice
for assessment for learning 26, 166, 173, 241, 266
models 155
reflective writing 154-157, 225, 228-229
rubrics 228-229
Reliability 12-13, 262, 354-356
equivalent-forms reliability 314
methods of estimating reliability 313-318
internal consistency reliability 314-315
Cronbach's alpha 315-316
KR-20 formula 315
Kuder-Richardson formula 315
split-half method 315
standard error of measurement 317-318
parallel-forms reliability 314
test-retest reliability 314
reliability coefficient 313-314
versus objectivity 352
S
Selection type questions 39-46
assertion-reason questions 41-42
computer-based objective forms 45
extended matching questions 43-44
key feature test 44
matching questions 42
multiple choice questions 40
multiple response questions 40
ranking questions 41
true-false questions 40
Self-monitoring 337-338
Self-directed learning 266
Short answer questions (SAQ) (see Written assessment) 31, 36
Simulated patients 100-101
Specific learning objectives 210
Standard error of measurement (see Item analysis) 317-318
Standard setting 24, 322-328
absolute standards 323-324
compensatory standards 324
conjunctive standards 324
criterion-referenced 323-324
effect on learning 324-325
MCQs 62
methods 325-328
for clinical skills 327-328
for knowledge tests 325-327
Angoff method 326
contrasting groups method 326-327
Hofstee method 327
relative method 325
need 323
norm-referenced 323-324
OSCE 103-104
relative standards 323-324
workplace-based assessment (WPBA) 184
Student ratings of teaching effectiveness 342-351
design of instrument 343-344
Dr Fox effect 347
generalizability 347
interpretation of data 345-346
logistics 344-345
misconceptions 342-343
misuses 342-343, 348
process 343-346
professional melancholia 346
purposes 348
reliability 346-347
validity 346-347
Subjective expert judgment 28, 122, 262, 267, 357, 359-361
T
Triage in medical education 306-307
U
Utility of assessment 15, 262
Vleuten's formula 15, 23-24, 262-263
V
Validity 6-11, 353-354
consequence-related evidence 8, 10, 262
construct-related evidence 8, 9-10, 353
content-related evidence 8
criterion-related evidence 8-9
factors which lower validity 11
Kane's arguments 354
key concepts 10
W
Web resources for assessment (see Online resources for assessment)
Workplace-based assessment (WPBA) 170-171, 178-205
difference from traditional assessment 180
faculty development 199
implementation steps 181-184
need 178-179
prerequisites to implementation 179-181
direct observation 180-181
feedback 181
practice 181
tasks 180
problem areas 199-202
quality parameters 197-199
role of assessors 196-197
role of trainee 197
standard setting 184
strengths 182
tools 184-196
acute care assessment tool (ACAT) 194-195
assessment of performance 194
case-based discussion (CbD) 185-186, 190-191
clinical encounter cards (CEC) 185-188
directly observed procedural skills (DOPS) 185-186, 189
discussion of correspondence (DOC) 185-186, 192
evaluation of clinical events (ECE) 185-186, 191-192
handover assessment tool (HAT) 195-196
LEADER case-based discussion (LEADER CbD) 195
logbook 185-187
mini-clinical evaluation exercise 185-186, 188-189
mini-peer assessment tool (mini-PAT) 185-186, 193
multisource feedback (360° assessment) 185-186, 192-193, 225, 231
patient satisfaction questionnaire 185-186
portfolio 185-186, 225-226
procedure based assessment (PbA) 185-186, 189-190
safeguarding case-based discussion 196
Sheffield assessment instrument for letters (SAIL) 185-186, 192
supervised learning events 194
team assessment of behaviour (TAB) 185-186, 193-194
weaknesses 182
Written assessment 31-38
closed-ended questions 32
context-poor questions 32
context-rich questions 32
essay questions 31, 34-35
modified essay questions (MEQ) 31, 35-36
multiple choice questions 47-64
open-ended questions 32
short answer questions (SAQ) 31, 36
best response type 37
completion type 37
open SAQ 37-38
structured essay questions (SEQ) 31


Principles of ASSESSMENT IN MEDICAL EDUCATION
Second Edition
Tejinder Singh MD DNB MAMS FIMSA FIAP MSc (Health Professions Education) (Maastricht; Hons) MA (Distance Education); PG Dip Higher Education (Gold Medal); Diploma Training and Development (Gold Medal); PG Diploma in Human Resource Management (Gold Medal); Certificate Course Evaluation Methodology and Examinations (AIU); FAIMER Fellow, IFME Fellow, IMSA Fellow
Professor, Department of Pediatrics and Medical Education
SGRD Institute of Medical Sciences and Research, Amritsar, Punjab, India
Anshu MD DNB, MNAMS MSc (Health Professions Education) (Maastricht; Hons.) FAIMER Fellow, IFME Fellow, Commonwealth Fellow
Professor, Department of Pathology
Mahatma Gandhi Institute of Medical Sciences, Sevagram, Wardha, Maharashtra, India
Forewords
John Dent
Lambert Schuwirth
Jaypee Brothers Medical Publishers (P) Ltd
Headquarters
Jaypee Brothers Medical Publishers (P) Ltd
EMCA House, 23/23-B
Ansari Road, Daryaganj
New Delhi 110 002, India
Landline: +91-11-23272143, +91-11-23272703
+91-11-23282021, +91-11-23245672
Corporate Office
Jaypee Brothers Medical Publishers (P) Ltd
4838/24, Ansari Road, Daryaganj
New Delhi 110 002, India
Phone: +91-11-43574357
Fax: +91-11-43574314
Overseas Office
J.P. Medical Ltd
83 Victoria Street, London
SW1H 0HW (UK)
Phone: +44 20 3170 8910
Fax: +44 (0)20 3008 6180
© 2022, Jaypee Brothers Medical Publishers
The views and opinions expressed in this book are solely those of the original contributor(s)/author(s) and do not necessarily represent those of editor(s) of the book.
All rights reserved. No part of this publication may be reproduced, stored or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission in writing of the publishers.
All brand names and product names used in this book are trade names, service marks, trademarks or registered trademarks of their respective owners. The publisher is not associated with any product or vendor mentioned in this book.
Medical knowledge and practice change constantly. This book is designed to provide accurate, authoritative information about the subject matter in question. However, readers are advised to check the most current information available on procedures included and check information from the manufacturer of each product to be administered, to verify the recommended dose, formula, method and duration of administration, adverse effects and contraindications. It is the responsibility of the practitioner to take all appropriate safety precautions. Neither the publisher nor the author(s)/editor(s) assume any liability for any injury and/or damage to persons or property arising from or related to use of material in this book.
This book is sold on the understanding that the publisher is not engaged in providing professional medical services. If such advice or services are required, the services of a competent medical professional should be sought.
Every effort has been made where necessary to contact holders of copyright to obtain permission to reproduce copyright material. If any have been inadvertently overlooked, the publisher will be pleased to make the necessary arrangements at the first opportunity.
Inquiries for bulk sales may be solicited at: jaypee@jaypeebrothers.com
Principles of Assessment in Medical Education
First Edition: 2012
Second Edition: 2022
ISBN: 9789354652479
Printed at
Cover design: Dinesh N Gudadhe
Contributors
Foreword
“Students can, with difficulty, escape the effects of poor teaching, but they cannot (by definition, if they want to graduate) escape the effects of poor assessment.”
Boud, 1995
Unfortunately, authentic assessment can be the weak link in our curriculum. We tend to use approaches with which we are familiar and to assess too much, but at the same time fail to assess what is important for future clinical practice. In seeking to change our approach to assessment, our questions may include:
What should we assess? Do we just aim to measure retention of factual knowledge or to measure clinical reasoning? At what level of the Miller pyramid are we assessing? Do we focus assessment on agreed learning outcomes, including attitudes?
When should we assess? At the end of the course, during the course, throughout the course, or with an annual progress test?
How should we assess? What instruments should we choose to put in our assessment “toolkit”?
Who should assess? Should assessment be by faculty or should it be a wider, 360-degree, workplace-based assessment?
And, what about assessment becoming part of learning?
Finding answers to these questions may lead us in different directions, but here, the second edition of Principles of Assessment in Medical Education provides an accessible, comprehensive and convenient source of answers. In this book, the editors present a readable digest of key topics, covering both the underlying theories of assessment and practical illustrations of how to apply a range of assessment instruments, as well as the crucial role of feedback.
The topics are clearly presented in 28 chapters introduced by Key Points. There are six new chapters, and others have been extensively updated and rewritten. The format is clear, the pages are not too crowded, and the book is not over-long or too large or heavy to handle.
There is a great deal that we should know about the extensive and sometimes perplexing topic of assessment. But if we want to help our students by providing good assessment, I am confident that you will find this book a valuable “go-to” resource to learn more about assessment in medical education. But remember, our aim in assessment should be less about trying to trip students up, and more about giving them the chance to shine!
John Dent MMEd MD FAMEE FHEA FRCS (Ed)
International Relations Officer
Association for Medical Education in Europe
Hon Reader in Postgraduate Medicine, University of Dundee, UK
Foreword
What a rich history the research and development of assessment in medical education have! Since the 1960s, an increasingly productive stream of ideas, developments, methods and research projects has found its way into the medical education literature.
This is not surprising. Assessment is important. It is not only what guides, steers and drives student learning but it also helps to certify the quality of our graduates and reassure the public.
Developing assessment in medical education is certainly not an easy task; medical competence has many facets, and all are important at various times. It is, therefore, only logical that our views on what constitutes good assessment have evolved. Originally, assessment was seen purely as a measurement process. Competence was treated the same way that test psychology approached personality characteristics, with structured and standardized testing. So, in that perspective, the measurement characteristics of assessment were the hallmark of quality: reproducibility or reliability, and construct validity.
Around the mid-1990s, the views changed. Instead of purely looking at assessment as a measurement process, it was acknowledged that every assessment involves human judgment. Even the most structured multiple-choice test is preceded by phases in which human judgments are used: blueprinting, standard setting, and the selection of items, options and specific wordings are all based on human judgments.
The distinguishing feature, though, was that in authentic and workplace-based assessment, human judgment must take place simultaneously with the observation of the candidate, in real time, so to speak. Consequently, the focus of much research and development shifted to the examiner, because it was recognized that even the best designed rubrics or scales could not replace examiner expertise. So, a significant amount of research was now focused on human decision-making, assessment literacy and how to combine the multiple perspectives of different assessors. Where, for example, in the measurement perspective, different views of assessors on the same candidate were seen as error and had to be eliminated, they are now seen as complementary and as a logical phenomenon given the multifaceted nature of competence, as long as they can be relied upon as well-informed, expert judgments. This assessor expertise has been and still is the focus of a considerable amount of research.
This evolved further in the mid-2000s, and our current views are best described as seeing assessment as a system or total program. Now, much of our research and development is focused on understanding how quantitative and qualitative information, formative and summative assessment, and assessment FOR and assessment OF learning can be combined in an integral system, and how this can be used to optimally ‘diagnose’ competence and the development of each student in a more bespoke manner. In this systems thinking, the assessment-as-measurement and assessment-as-judgment views come together in a true synthesis.
This book provides a comprehensive overview of all the issues around assessment and will greatly support any reader who wants to develop a system of assessment, ensuring that they are able to base it on the best available evidence.
Lambert Schuwirth
MD PhD FANZHPE
Professor of Medical Education
Director, Prideaux Centre for Research in Health Professions Education
College of Medicine and Public Health
Flinders University, South Australia
Professor for Innovative Assessment
Department of Educational Development and Research
Maastricht University, Maastricht, The Netherlands
Distinguished Professor of Medical Education
Chang Gung University, Taiwan
Professor of Medicine (Adjunct)
Uniformed Services University of the Health Sciences
Bethesda, Maryland, USA
Preface to the Second Edition
It is with a sense of pride and satisfaction that we present this second edition of Principles of Assessment in Medical Education. The first edition of this book was received extremely well by teachers of health professions in India. The present edition has been revised to address assessment issues related to competency-based medical education, especially in the Indian setting. Many chapters have been rewritten, and many new ones have been added. A number of graphics have also been added to make concepts clear. We do hope that this book will continue to serve its intended purpose.
We are grateful to Professor John Dent and Professor Lambert Schuwirth for contributing Forewords to this edition. We are also grateful to the Editors of Indian Pediatrics and National Medical Journal of India for allowing us to reproduce some of our earlier works, which have been acknowledged at appropriate places in the book.
We would be happy to receive suggestions to make this book better.
Tejinder Singh
Anshu
Preface to the First Edition
The earlier book, Principles of Medical Education, has been very well received, going by the fact that it has entered its third edition. The book was targeted towards orienting medical teachers to the art and science of educational methods. To a great extent, it also served as a ‘how to’ manual for various educational tasks required of a teacher. However, it was increasingly felt that it may not satisfy the academic appetite of many readers, more so with the spotlight shifting to the science of medical education in recent times.
The present book is a sequel to the earlier book. It focuses on the specific area of student assessment, especially on using assessment as a tool for learning. The emphasis has shifted from ‘how’ to ‘why’ for most of the tools, with the presumption that the readers of the book have already received a basic orientation to assessment methods. Plenty of literature support has been provided to help readers take a broader view of the practice of assessment, and a number of further readings have also been added. A chapter on evaluation of teaching by students, titled "Student feedback", has also been included in the belief that it will help to improve the standard of assessment and teaching.
A number of international and national experts have shared their expertise in this area and we are extremely grateful to them for letting us use their work. We are also grateful to the editors of Indian Pediatrics and the National Medical Journal of India for allowing us to reproduce some of the chapters published earlier in their journals. We hope that the readers will benefit from seeing more than one perspective on assessment. Let us hasten to add that it is not a treatise on assessment. It has a very focused audience, i.e. those from India, and a very focused objective, i.e. to make the teachers competent in the use of assessment for learning.
The book shares some of the problems of multiauthor books. Readers may find occasional repetitions in some chapters. Although, as editors, we could have cut many of these, we deliberately retained them for some important topics, like validity and reliability or Miller's pyramid, to let readers get a multifaceted view of these important concepts.
We do hope that the book will be accepted like its predecessor and help to raise the level of knowledge and skills of medical teachers regarding student assessment. We also hope that better assessment will ultimately translate to better learning for students and better health care for the masses.
Comments and suggestions to make this book more useful are welcome.
Tejinder Singh
Anshu