NIST Special Publication 500-215: The Second Text REtrieval Conference (TREC-2)
Appendix A: TREC-2 Results
National Institute of Standards and Technology
D. K. Harman
Results - PRC, Inc.
Summary Statistics

Run Number:      prceol-full, manual
Num of Queries:  50
Total number of documents over all queries
    Retrieved:   50000
    Relevant:    10185
    Rel_ret:     3323
Recall Level Averages          Document Level Averages
Recall   Precision             Precision
0.00     0.5963                At    5 docs:  0.4120
0.10     0.3270                At   10 docs:  0.4000
0.20     0.2306                At   15 docs:  0.3920
0.30     0.1430                At   20 docs:  0.3140
0.40     0.0866                At   30 docs:  0.3521
0.50     0.0425                At  100 docs:  0.2122
0.60     0.0323                At  200 docs:  0.1914
0.70     0.0246                At  500 docs:  0.1090
0.80     0.0209                At 1000 docs:  0.0665
0.90     0.0131
1.00     0.0041

Average precision over all relevant docs
    non-interpolated:  0.1120

R-Precision (precision after R docs retrieved, where R is the number of relevant documents)
    Exact:  0.1809
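The measures tabulated above follow the standard TREC definitions. As a minimal sketch, the following Python computes precision at a cutoff ("At k docs"), R-precision, non-interpolated average precision, and interpolated precision at a recall level for a single query; the ranking and relevance judgments are toy data invented for illustration, not the actual TREC-2 run.

```python
def precision_at(ranking, relevant, k):
    """Precision after k documents are retrieved ("At k docs")."""
    return sum(1 for d in ranking[:k] if d in relevant) / k

def r_precision(ranking, relevant):
    """Precision after R docs retrieved, where R is the number of relevant docs."""
    return precision_at(ranking, relevant, len(relevant))

def average_precision(ranking, relevant):
    """Non-interpolated average precision over all relevant docs:
    precision at each relevant document's rank, averaged over all
    relevant documents (contributing 0 for those never retrieved)."""
    hits, total = 0, 0.0
    for rank, doc in enumerate(ranking, start=1):
        if doc in relevant:
            hits += 1
            total += hits / rank
    return total / len(relevant)

def interpolated_precision(pr_points, level):
    """Interpolated precision at a recall level (the 'Recall Level
    Averages'): the maximum precision at any recall >= that level."""
    return max((p for r, p in pr_points if r >= level), default=0.0)

# Hypothetical ranked output and judgments for one query:
ranking = ["d3", "d7", "d1", "d9", "d4"]
relevant = {"d3", "d9", "d8"}

print(precision_at(ranking, relevant, 5))    # 2 hits in top 5 -> 0.4
print(r_precision(ranking, relevant))        # top 3 has 1 hit -> 0.333...
print(average_precision(ranking, relevant))  # (1/1 + 2/4 + 0)/3 = 0.5
```

Per-query values like these are averaged over the 50 queries to produce the table entries above.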
Fallout Level Averages
Fallout   Recall
0.00      0.2103
0.20      0.2531
0.40      0.2811
0.60      0.3006
0.80      0.3166
1.00      0.3294
1.20      0.3335
1.40      0.3335
1.60      0.3335
1.80      0.3335
2.00      0.3335
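Recall and fallout, as plotted in the fallout-recall curve below, can be sketched as the following ratios; the worked example uses the run-level totals reported in the summary statistics above, while the fallout call uses assumed toy counts (the collection's non-relevant total is not given on this page).

```python
def recall(rel_retrieved, rel_total):
    """Fraction of the relevant documents that were retrieved."""
    return rel_retrieved / rel_total

def fallout(nonrel_retrieved, nonrel_total):
    """Fraction of the non-relevant documents that were retrieved."""
    return nonrel_retrieved / nonrel_total

# Over all 50 queries, 3323 of the 10185 relevant documents were retrieved:
print(round(recall(3323, 10185), 4))  # 0.3263

# Hypothetical counts for a single query, for illustration only:
print(fallout(10, 1000))              # 0.01
```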
[Figure: Recall-Precision Curve — precision (0 to 1) versus recall (0 to 1).]
[Figure: Fallout-Recall Curve — recall versus fallout x 100 (0 to 0.2), plotted on normal-deviate scales (probability labels 0.001 to 0.98).]