Show simple item record

dc.contributor.author | Griffiths, DFR | en_US
dc.contributor.author | Melia, J | en_US
dc.contributor.author | McWilliam, LJ | en_US
dc.contributor.author | Ball, RY | en_US
dc.contributor.author | Grigor, K | en_US
dc.contributor.author | Harnden, P | en_US
dc.contributor.author | Jarmulowicz, M | en_US
dc.contributor.author | Montironi, R | en_US
dc.contributor.author | Moseley, R | en_US
dc.contributor.author | Waller, M | en_US
dc.contributor.author | Moss, S | en_US
dc.contributor.author | Parkinson, MC | en_US
dc.date.accessioned | 2018-09-04T13:10:35Z
dc.date.issued | 2006-05 | en_US
dc.identifier | 6 | en_US
dc.identifier.citation | HISTOPATHOLOGY, 2006, 48, pp. 655 - 662 | en_US
dc.identifier.issn | 0309-0167 | en_US
dc.identifier.uri | https://repository.icr.ac.uk/handle/internal/2550
dc.identifier.doi | 10.1111/j.1365-2559.2006.02394.x | en_US
dc.description.abstract | Aims: To test the effectiveness of a teaching resource (a decision tree with diagnostic criteria based on published literature) in improving the proficiency of Gleason grading of prostatic cancer by general pathologists. Methods: A decision tree with diagnostic criteria was developed by a panel of urological pathologists during a reproducibility study. Twenty-four general histopathologists tested this teaching resource. Twenty slides were selected to include a range of Gleason score groups 2-4, 5-6, 7 and 8-10. Interobserver agreement was studied before and after a presentation of the decision tree and criteria. The results were compared with those of the panel of urological pathologists. Results: Before the teaching session, 83% of readings agreed within +/- 1 of the panel’s consensus scores. Interobserver agreement was low (kappa = 0.33) compared with that for the panel (kappa = 0.62). After the presentation, 90% of readings agreed within +/- 1 of the panel’s consensus scores and interobserver agreement amongst the pathologists increased to kappa = 0.41. Most improvement in agreement was seen for the Gleason score group 5-6. Conclusions: The lower level of agreement among general pathologists highlights the need to improve observer reproducibility. Improvement associated with a single training session is likely to be limited. Additional strategies include external quality assurance and second opinion within cancer networks. | en_US
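
Note on the kappa values reported in the abstract: this record carries no computational details, so the sketch below is illustrative only. It implements Fleiss' kappa, the standard agreement statistic for more than two raters (the paper may have used a different variant), on a hypothetical slides-by-categories count matrix built from the abstract's four Gleason score groups; all numbers in the example are invented.

```python
import numpy as np

def fleiss_kappa(counts: np.ndarray) -> float:
    """Fleiss' kappa for a (subjects x categories) count matrix.

    counts[i, j] = number of raters who assigned subject i to category j;
    every subject must be rated by the same number of raters.
    """
    n_raters = counts.sum(axis=1)[0]
    # Observed agreement per subject: proportion of concordant rater pairs.
    p_i = (np.sum(counts ** 2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_bar = p_i.mean()
    # Chance agreement from the marginal category proportions.
    p_j = counts.sum(axis=0) / counts.sum()
    p_e = np.sum(p_j ** 2)
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical data: 5 slides scored by 6 pathologists into the abstract's
# four Gleason score groups (2-4, 5-6, 7, 8-10).
counts = np.array([
    [4, 2, 0, 0],
    [0, 5, 1, 0],
    [0, 1, 4, 1],
    [0, 0, 2, 4],
    [5, 1, 0, 0],
])
print(f"Fleiss' kappa = {fleiss_kappa(counts):.2f}")  # ~0.37 for this toy data
```
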
dc.format.extent | 655 - 662 | en_US
dc.title | A study of Gleason score interpretation in different groups of UK pathologists; techniques for improving reproducibility | en_US
dc.type | Journal Article
rioxxterms.versionofrecord | 10.1111/j.1365-2559.2006.02394.x | en_US
rioxxterms.licenseref.startdate | 2006-05 | en_US
rioxxterms.type | Journal Article/Review | en_US
dc.relation.isPartOf | HISTOPATHOLOGY | en_US
pubs.notes | researcherid-numbers: Waller, Michael/R-6231-2016 orcid-numbers: Waller, Michael/0000-0002-1050-4574 Howell, Simon/0000-0001-8467-1466 unique-id: ISI:000237015000003 | en_US
pubs.notes | Not known | en_US
pubs.organisational-group | /ICR
pubs.organisational-group | /ICR/Primary Group
pubs.organisational-group | /ICR/Primary Group/ICR Divisions
pubs.organisational-group | /ICR/Primary Group/ICR Divisions/Closed research teams
pubs.organisational-group | /ICR/Primary Group/ICR Divisions/Closed research teams/Cancer Screening Evaluation Unit (DoH)
pubs.volume | 48 | en_US
pubs.embargo.terms | Not known | en_US
icr.researchteam | Cancer Screening Evaluation Unit (DoH) | en_US
dc.contributor.icrauthor | Moss, Susan Mary | en_US
dc.contributor.icrauthor | Melia, Jane | en_US


Files in this item

There are no files associated with this item.
