
dc.contributor.author: Griffiths, DFR
dc.contributor.author: Melia, J
dc.contributor.author: McWilliam, LJ
dc.contributor.author: Ball, RY
dc.contributor.author: Grigor, K
dc.contributor.author: Harnden, P
dc.contributor.author: Jarmulowicz, M
dc.contributor.author: Montironi, R
dc.contributor.author: Moseley, R
dc.contributor.author: Waller, M
dc.contributor.author: Moss, S
dc.contributor.author: Parkinson, MC
dc.date.accessioned: 2018-09-04T13:10:35Z
dc.date.issued: 2006-05
dc.identifier: 6
dc.identifier.citation: HISTOPATHOLOGY, 2006, 48, pp. 655 - 662
dc.identifier.issn: 0309-0167
dc.identifier.uri: https://repository.icr.ac.uk/handle/internal/2550
dc.identifier.doi: 10.1111/j.1365-2559.2006.02394.x
dc.description.abstract: Aims: To test the effectiveness of a teaching resource (a decision tree with diagnostic criteria based on published literature) in improving the proficiency of Gleason grading of prostatic cancer by general pathologists. Methods: A decision tree with diagnostic criteria was developed by a panel of urological pathologists during a reproducibility study. Twenty-four general histopathologists tested this teaching resource. Twenty slides were selected to include a range of Gleason score groups 2-4, 5-6, 7 and 8-10. Interobserver agreement was studied before and after a presentation of the decision tree and criteria. The results were compared with those of the panel of urological pathologists. Results: Before the teaching session, 83% of readings agreed within +/- 1 of the panel's consensus scores. Interobserver agreement was low (kappa = 0.33) compared with that for the panel (kappa = 0.62). After the presentation, 90% of readings agreed within +/- 1 of the panel's consensus scores and interobserver agreement amongst the pathologists increased to kappa = 0.41. Most improvement in agreement was seen for the Gleason score group 5-6. Conclusions: The lower level of agreement among general pathologists highlights the need to improve observer reproducibility. Improvement associated with a single training session is likely to be limited. Additional strategies include external quality assurance and second opinion within cancer networks.
dc.format.extent: 655 - 662
dc.language: eng
dc.language.iso: eng
dc.title: A study of Gleason score interpretation in different groups of UK pathologists; techniques for improving reproducibility
dc.type: Journal Article
rioxxterms.versionofrecord: 10.1111/j.1365-2559.2006.02394.x
rioxxterms.licenseref.startdate: 2006-05
rioxxterms.type: Journal Article/Review
dc.relation.isPartOf: HISTOPATHOLOGY
pubs.notes: researcherid-numbers: Waller, Michael/R-6231-2016; orcid-numbers: Waller, Michael/0000-0002-1050-4574, Howell, Simon/0000-0001-8467-1466; unique-id: ISI:000237015000003
pubs.notes: Not known
pubs.organisational-group: /ICR
pubs.organisational-group: /ICR/Primary Group
pubs.organisational-group: /ICR/Primary Group/ICR Divisions
pubs.organisational-group: /ICR/Primary Group/ICR Divisions/Closed research teams
pubs.organisational-group: /ICR/Primary Group/ICR Divisions/Closed research teams/Cancer Screening Evaluation Unit (DoH)
pubs.volume: 48
pubs.embargo.terms: Not known
icr.researchteam: Cancer Screening Evaluation Unit (DoH)
dc.contributor.icrauthor: Moss, Susan Mary
dc.contributor.icrauthor: Melia, Jane

