dc.contributor.author | Griffiths, DFR | |
dc.contributor.author | Melia, J | |
dc.contributor.author | McWilliam, LJ | |
dc.contributor.author | Ball, RY | |
dc.contributor.author | Grigor, K | |
dc.contributor.author | Harnden, P | |
dc.contributor.author | Jarmulowicz, M | |
dc.contributor.author | Montironi, R | |
dc.contributor.author | Moseley, R | |
dc.contributor.author | Waller, M | |
dc.contributor.author | Moss, S | |
dc.contributor.author | Parkinson, MC | |
dc.date.accessioned | 2018-09-04T13:10:35Z | |
dc.date.issued | 2006-05 | |
dc.identifier | 6 | |
dc.identifier.citation | HISTOPATHOLOGY, 2006, 48, pp. 655 - 662 | |
dc.identifier.issn | 0309-0167 | |
dc.identifier.uri | https://repository.icr.ac.uk/handle/internal/2550 | |
dc.identifier.doi | 10.1111/j.1365-2559.2006.02394.x | |
dc.description.abstract | Aims: To test the effectiveness of a teaching resource (a decision tree with diagnostic criteria based on published literature) in improving the proficiency of Gleason grading of prostatic cancer by general pathologists. Methods: A decision tree with diagnostic criteria was developed by a panel of urological pathologists during a reproducibility study. Twenty-four general histopathologists tested this teaching resource. Twenty slides were selected to include a range of Gleason score groups 2-4, 5-6, 7 and 8-10. Interobserver agreement was studied before and after a presentation of the decision tree and criteria. The results were compared with those of the panel of urological pathologists. Results: Before the teaching session, 83% of readings agreed within +/- 1 of the panel’s consensus scores. Interobserver agreement was low (kappa = 0.33) compared with that for the panel (kappa = 0.62). After the presentation, 90% of readings agreed within +/- 1 of the panel’s consensus scores and interobserver agreement amongst the pathologists increased to kappa = 0.41. Most improvement in agreement was seen for the Gleason score group 5-6. Conclusions: The lower level of agreement among general pathologists highlights the need to improve observer reproducibility. Improvement associated with a single training session is likely to be limited. Additional strategies include external quality assurance and second opinion within cancer networks. | |
dc.format.extent | 655 - 662 | |
dc.language | eng | |
dc.language.iso | eng | |
dc.title | A study of Gleason score interpretation in different groups of UK pathologists; techniques for improving reproducibility | |
dc.type | Journal Article | |
rioxxterms.versionofrecord | 10.1111/j.1365-2559.2006.02394.x | |
rioxxterms.licenseref.startdate | 2006-05 | |
rioxxterms.type | Journal Article/Review | |
dc.relation.isPartOf | HISTOPATHOLOGY | |
pubs.notes | researcherid-numbers: Waller, Michael/R-6231-2016; orcid-numbers: Waller, Michael/0000-0002-1050-4574, Howell, Simon/0000-0001-8467-1466; unique-id: ISI:000237015000003 | |
pubs.notes | Not known | |
pubs.organisational-group | /ICR | |
pubs.organisational-group | /ICR/Primary Group | |
pubs.organisational-group | /ICR/Primary Group/ICR Divisions | |
pubs.organisational-group | /ICR/Primary Group/ICR Divisions/Closed research teams | |
pubs.organisational-group | /ICR/Primary Group/ICR Divisions/Closed research teams/Cancer Screening Evaluation Unit (DoH) | |
pubs.volume | 48 | |
pubs.embargo.terms | Not known | |
icr.researchteam | Cancer Screening Evaluation Unit (DoH) | en_US |
dc.contributor.icrauthor | Moss, Susan Mary | en |
dc.contributor.icrauthor | Melia, Jane | en |