                                                                 1
 1                      UNITED STATES OF AMERICA
 2                    NUCLEAR REGULATORY COMMISSION
 3                                 ***
 4              ADVISORY COMMITTEE ON REACTOR SAFEGUARDS
 5                                 ***
 6                  SUBCOMMITTEE ON PLANT OPERATIONS
 7                         478th ACRS MEETING
 8
 9
10                              U.S. NRC
11                        11545 Rockville Pike
12                             Room T-2B3
13                         Rockville, Maryland
14
15                     Wednesday, December 6, 2000
16
17              The above-entitled meeting commenced, pursuant to
18    notice, at 8:35 a.m., the HONORABLE DR. DANA A. POWERS,
19    presiding.
20
21    MEMBERS PRESENT:
22              DR. DANA A. POWERS, ACRS Member
23              DR. GEORGE APOSTOLAKIS, ACRS Member
24              DR. THOMAS S. KRESS, ACRS Member
25              MR. JOHN D. SIEBER, ACRS Member
.                                                                 2
 1              DR. GRAHAM B. WALLIS, ACRS Member
 2              DR. ROBERT L. SEALE, ACRS Member
 3              DR. WILLIAM J. SHACK, ACRS Member
 4              DR. ROBERT E. UHRIG, ACRS Member
 5              DR. MARIO V. BONACA, ACRS Member
 6              DR. GRAHAM LEITCH, ACRS Member
 7              MS. MAGGALEAN WESTON, Designated Federal Official
 8
 9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
.                                                                 3
 1    PARTICIPANTS:
 2              FRANK GILLESPIE
 3              MIKE JOHNSON
 4              TOM BOYCE
 5              DON HICKMAN
 6              DOUG COE
 7
 8
 9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
.                                                                 4
 1                        P R O C E E D I N G S
 2                                                     [8:35 a.m.]
 3              CHAIRMAN SIEBER:  The meeting will now come to
 4    order.  This is a meeting of the ACRS Subcommittee on Plant
 5    Operations.  I am John Sieber, Chairman of the Subcommittee
 6    on Plant Operations.
 7              ACRS members in attendance are Dr. George
 8    Apostolakis, Dr. Mario Bonaca, Dr. Thomas Kress, Mr. Graham
 9    Leitch, Dr. Dana Powers, Dr. Robert Seale, Dr. William
10    Shack, Dr. Robert Uhrig, and Dr. Graham Wallis.
11              The purpose of this meeting is to discuss the
12    changes to the revised Reactor Oversight Process since
13    implementation of the pilot program.  The subcommittee will
14    gather information, analyze relevant issues and facts, and
15    formulate proposed positions and actions as appropriate for
16    deliberation by the full committee.
17              Maggalean W. Weston is the Designated Federal
18    Official and Cognizant ACRS Staff Engineer for this meeting.
19              The rules for participation in today's meeting
20    have been announced as part of the notice of this meeting
21    previously published in the Federal Register on November
22    15th, 2000.
23              A transcript of this meeting is being kept and
24    will be made available as stated in the Federal Register
25    notice.
.                                                                 5
 1              It is requested that speakers first identify
 2    themselves and speak with sufficient clarity and volume so
 3    that they can be readily heard.  I also request that
 4    speakers use the microphones provided, so as to aid
 5    the subcommittee members' understanding of the information
 6    that you are providing and also to aid the Court Reporter in
 7    obtaining an adequate transcript of the proceedings of this
 8    meeting.
 9              We have received no written comments from members
10    of the public regarding today's meeting.
11              We originally had scheduled a presentation from
12    NEI, but NEI is unable to make a presentation at this
13    meeting because there are, it turns out, a lot of meetings
14    going on with the NRC and NEI has a limited staff.
15              I would like to also welcome Rich Janati, who is
16    an old friend of mine, and works for the State of
17    Pennsylvania in the Department of Environmental Resources
18    and is the State's representative, one of the State's
19    representatives for the inspection of certain aspects of
20    power reactors in Pennsylvania.
21              As you are all aware, the NRC Staff has developed
22    a power reactor licensee oversight process which is intended
23    to mesh with the concepts of risk-informed,
24    performance-based regulation.  Over the past few years the
25    Commission has directed that the Staff develop and implement
.                                                                 6
 1    risk-informed, performance-based concepts on their own
 2    volition and in keeping with the recommendations of the
 3    General Accounting Office.
 4              The direction that the Commission has taken is
 5    also in concert with modern management concepts used
 6    throughout private industry.
 7              These concepts have indeed led to general
 8    improvements in productivity, safety, economic viability,
 9    and competitiveness for many businesses in the United
10    States.  The nuclear industry in general supports these
11    management concepts and has supported them in the past, so
12    it is prudent that the Commission has embraced these
13    concepts and recognized that the institutional concepts that
14    make up our current business atmosphere are successful and
15    they are also applicable to Government regulation.
16              In today's program the NRC Staff plans to present
17    information on the implementation of the pilot program,
18    performance indicators and the significance determination
19    process.
20              I would like now to invite any members of the
21    subcommittee to express any views that they may have at this
22    time.
23              DR. APOSTOLAKIS:  Are we going to have a detailed
24    presentation of the SDP, give us an example?  We never
25    really did that.
.                                                                 7
 1              MR. BOYCE:  Well, we have got two slides.  I am
 2    not sure how detailed two slides can be.
 3              DR. APOSTOLAKIS:  Is it in this package?
 4              MR. BOYCE:  It is in that package.
 5              DR. APOSTOLAKIS:  I think it is too high level.
 6              CHAIRMAN SIEBER:  I think what you are looking for
 7    is working through a specific case.
 8              DR. APOSTOLAKIS:  Yes, that was the idea of the
 9    subcommittee meeting.
10              MR. BOYCE:  Okay.  We could have done that,
11    George, because I mean we just came off doing that with
12    Indian Point 2 in immense detail, which is kind of our near
13    term, worst case effort you might say.
14              MR. JOHNSON:  We would be more than happy to do it
15    at not this meeting perhaps, but at a future time.  We would
16    be more than happy to sit down and go through --
17              CHAIRMAN SIEBER:  And I would be just fascinated
18    to go through and understand how the fire protection SDP
19    process works.
20              DR. APOSTOLAKIS:  There is an example here in the
21    document I received from Ms. Weston on operator
22    requalification.  To read it by myself is agony but if
23    somebody explains it that would be great.
24              So there is no way we can do this today?
25              MR. GILLESPIE:  Doug, are you familiar enough with
.                                                                 8
 1    that one --
 2              DR. APOSTOLAKIS:  Or anyone.  Do you have any
 3    other example that is already on transparencies?
 4              MR. COE:  I don't have anything already on
 5    transparencies.  If you wanted to give me an hour, I can go
 6    back to the office and try to put something together.  I
 7    don't know if it would meet your needs or not.
 8              If you are interested specifically in the fire
 9    protection area, I am probably less able to provide a good
10    example in that area because, as you are well aware, it is
11    very complex.
12              DR. APOSTOLAKIS:  Any example that will be the
13    easiest for you to put together in an hour and come back. 
14    We will try to ask questions so it will delay the whole
15    thing for you.
16              [Laughter.]
17              MR. COE:  Are you more interested in the Reactor
18    Safety SDP with the Phase 2 worksheets and that sort of
19    thing?
20              DR. APOSTOLAKIS:  Yes.  I would like to see those
21    worksheets, what decisions the inspectors have to make,
22    because we have never done this.
23              All of a sudden, you know --
24              MR. COE:  Give me an hour and I can try to put
25    something together.
.                                                                 9
 1              DR. APOSTOLAKIS:  If that is okay with you guys?
 2              CHAIRMAN SIEBER:  Well, this is an appropriate
 3    time too, since NEI isn't here.  We will have time.  We have
 4    the room.  The meeting is intended to last until Noon.
 5              MR. JOHNSON:  I guess I ought to mention that one
 6    of the reasons NEI is not here is because we have -- our
 7    branch is meeting with NEI, one of our regularly-scheduled
 8    meetings.
 9              CHAIRMAN SIEBER:  On the same issue.
10              MR. JOHNSON:  Yes, on the same issue, so Don
11    Hickman, for example, who is here to talk about performance
12    indicators, we were really trying to enable him to talk
13    early on so they can get back and support that other
14    meeting, so there are some additional time constraints on
15    us.
16              MR. GILLESPIE:  Doug, pick a typical one --
17              MR. COE:  Reactor safety.
18              MR. GILLESPIE:  Reactor safety SDP -- something at
19    least proceeded through the Phase 2 worksheets to a white or
20    a decision at that point?
21              DR. APOSTOLAKIS:  Yes, or maybe led to a more
22    detailed analysis.
23              MR. GILLESPIE:  Or it goes to Phase 3.
24              DR. APOSTOLAKIS:  Yes.
25              DR. SEALE:  Could I make another suggestion?
.                                                                10
 1              CHAIRMAN SIEBER:  Yes, sir.
 2              DR. SEALE:  I think it would be perhaps
 3    appropriate to put Mr. -- what was your friend's name
 4    again --
 5              CHAIRMAN SIEBER:  Janati.
 6              DR. SEALE:  -- Janati on notice that if he has any
 7    comments that he feels that would be appropriate for the
 8    committee to hear from a perspective of the state system,
 9    when we get through I think we would be very interested to
10    ask him to make those comments.
11              THE REPORTER:  Can't hear you.
12              DR. APOSTOLAKIS:  You have to come to a
13    microphone.
14              CHAIRMAN SIEBER:  There is a microphone right
15    here -- on the side.
16              MR. JANATI:  My name is Rich Janati and I am with
17    Pennsylvania Department of Environmental Protection.  I am
18    pleased to be here today.  I am here as an observer.  I have
19    not prepared any comments or was not planning to make any
20    comments, but I will be happy to ask questions if you would
21    allow me to do that.
22              [Laughter.]
23              MR. JANATI:  Either during the presentation or
24    after the meeting.
25              DR. SEALE:  Well, just any comments at the end
.                                                                11
 1    that you might be --
 2              MR. JANATI:  I would be happy to do that.
 3              DR. APOSTOLAKIS:  Instead of asking a question,
 4    start by saying "I wonder why" --
 5              [Laughter.]
 6              DR. APOSTOLAKIS:  That's a comment.
 7              MR. JANATI:  Really, we are learning the process. 
 8    It's a new process and unfortunately there were no pilot
 9    plants in Pennsylvania so it is going to take us some time
10    to learn the process.  Thank you very much.
11              CHAIRMAN SIEBER:  Thank you.  Any other comments?
12              I noticed in a recent edition of Inside NRC they
13    talked about applying the oversight process to steam
14    generators, and if, in the course of your presentation,
15    you have any insights that you could tell us about what
16    your intentions are in that area, I think that I would
17    appreciate hearing that.
18              MR. BOYCE:  Okay, perhaps after we go through the
19    SDP, reactor safety SDP, that discussion in question should
20    come up because I think it will be a little more clear,
21    particularly in light of what we are doing with IP-2 so let
22    me try and cover it during that part.
23              CHAIRMAN SIEBER:  And I understand that that is
24    all prospective at this time and no real decisions have been
25    made, and so that ought to be on the record, but I think it
.                                                                12
 1    would be interesting for the subcommittee to hear about the
 2    potential plans you might have in that area.
 3              MR. BOYCE:  Yes, we can touch upon it, but the
 4    first thing is steam generators are in fact covered by the
 5    Reactor Safety SDP today.
 6              CHAIRMAN SIEBER:  Right, okay.
 7              MR. BOYCE:  So I wouldn't want to imply that they
 8    aren't, and in fact it worked at Indian Point.  We ran it
 9    through the process.  There were multiple degraded cornerstones. 
10    We went into reactive inspection.  Reactive inspection ended
11    up coming up with a Red finding, but then at the end we
12    can address what the future plans may be and what the pluses
13    and minuses are, because no one has really decided where it is
14    going.
15              One concept to think about is if for everything
16    that comes up you decide you want to have a set of
17    indicators, you pretty soon will be overwhelmed with the
18    number of indicators you could have.
19              One of the recommendations is have indicators for
20    steam generator leakage.  Right now we do have an
21    indicator -- that is 75 percent of tech specs.  Is the
22    problem the indicator or is the problem the tech spec is too
23    high?
24              Those are the kinds of questions that are getting
25    asked as a result of the Lessons Learned Task Force.
.                                                                13
 1              CHAIRMAN SIEBER:  Well, it would be good if you
 2    could review that for us, so with that what I would like to
 3    do is ask the Staff now to begin.
 4              MR. GILLESPIE:  Let me say a few words.  We are
 5    keeping everything as static as possible, so we are here to
 6    report we have had minimal change from the last time we
 7    talked to you.
 8              It has been difficult to do, by the way.  Don is
 9    going to come on first and talk about the most visible thing
10    that you have heard complaints about or seen issues on, and
11    that is on the PI for unavailability.  He can kind of give
12    you a status on what is happening there.
13              People didn't like the scram PI so we have got a
14    slightly different PI with a slightly different name but
15    with a similar focus and that is being trialed, and so Don
16    can address some of the PI things and then he's got to get
17    back over to the NEI people who couldn't come here because
18    they are going to talk to him over there on exactly some of
19    those issues.
20              The reason we have been keeping the program
21    static, if you remember one of the major criticisms of the
22    pilot program was it was a small sample and nothing happened
23    and so because nothing happened we couldn't judge, if you
24    would, in a severe case how the program would have reacted,
25    and so this first year of initial implementation has
.                                                                14
 1    actually been an expanded sample size to include the whole
 2    industry.
 3              We have a comprehensive program that got put in
 4    place about a month ago to collect information and data
 5    which really won't yield any results until about February or
 6    March.  We are now collecting information.
 7              We do have anecdotes and they are strictly
 8    anecdotes of various instances, and Doug will probably cover
 9    one in the SDP of where we applied it and where it appears
10    to actually quite honestly have worked, but we have kept it
11    very static, so we are not here to tell you of a lot of
12    drastic changes because if we changed in midstream we
13    wouldn't be able to appraise how the program was working and
14    we felt we needed the entire industry for a full year with
15    some stabilization in it, to then step back and say now what
16    would we change in an integral way, because the whole thing
17    is linked.
18              So you will be disappointed if you are thinking you
19    are going to hear a lot of revolutionary changes because we
20    deliberately haven't done them but we do have them
21    backlogged but we want to see how to do it.
22              MR. LEITCH:  These changes that you are going to
23    talk about, are they changes since the full implementation
24    in April or prior to the full implementation?
25              MR. GILLESPIE:  Since the full implementation in
.                                                                15
 1    April.
 2              We have a Commission paper up, for example, on the
 3    Reactor Safety SDP.  As a result of a Quad Cities inspection
 4    it was fatally flawed, so that is the one fatal flaw that we
 5    found that caused us to go back to the Commission and say,
 6    hey, this one needs to be changed and we offered the
 7    Commission a temporary solution while we get with the
 8    stakeholders again and try to rework the right kind of
 9    ground.
10              That was the one major, major flaw that did come
11    out of it.  The others we have had some tweaks to improve
12    inspection report format, to resolve smaller questions
13    around the edges, and the team will kind of go through what
14    some of those were -- but no revolutionary changes.
15              We really have tried to keep the basic program
16    stable.
17              In inspection space we have not altered scope,
18    depth and frequency of inspections.  We have really tried to
19    maintain them and take the heat for where it is not perfect
20    until we can get a full year cycle in and be able to step
21    back.
22              With that -- Tom?
23              MR. BOYCE:  Well, actually, we are going to go
24    slightly out of order.  I was just going to present an
25    overview, but I think Don Hickman is going to go first and
.                                                                16
 1    talk about performance indicators and dive right into
 2    that so that Don can get back and meet with NEI and keep
 3    working some of these issues that we are trying to take care
 4    of.  Don?
 5              MR. GILLESPIE:  While Don is walking up, I think
 6    you are going to see that one of the problems that has
 7    really manifested itself with the PIs, and it happened with
 8    unavailability and in the scram PI and in some of the
 9    others, everyone seems to want to have a custom PI.  In
10    fact, PG&E -- Pacific Gas & Electric -- is coming in and
11    they are going to be talking individually to different
12    people because they don't like the new scram PI alternative
13    we are testing because for their facility seaweed twice a
14    year gets backed up in their screens.  They have to do a
15    rapid down-power and they sit there and wait.
16              So even the new PIs that we are trying have people
17    already before we have tried them saying they don't like
18    them, which means we envision a future -- and George
19    will remember this from an earlier presentation -- where at some
20    point we have to evolve to something that is likely going to
21    have to be more plant-specific, otherwise we are going to
22    get overwhelmed with exceptions, so I will give you that one
23    upfront.
24              You were right.  We knew we had to get there and
25    it is almost like it is starting to develop on how we might
.                                                                17
 1    do it and where are the first places that it might best be
 2    tried, so there is a longer term vision we still have in
 3    mind on that one.
 4              DR. APOSTOLAKIS:  In fact, I believe there is work
 5    by others in the Agency trying to develop plant-specific PIs.
 6              MR. GILLESPIE:  The risk-based PIs or risk-based
 7    data --
 8              DR. APOSTOLAKIS:  Yes, yes.
 9              MR. GILLESPIE:  -- and that is again a three-year
10    vision.
11              DR. APOSTOLAKIS:  Right.
12              MR. GILLESPIE:  And so I think we are pretty much
13    in synch that there is something that we need to do now in
15    sort of evolving in that direction.
15              DR. APOSTOLAKIS:  Okay, very good.
16              MR. HICKMAN:  Good morning.  I am Don Hickman.  I
17    am the Task Lead for Performance Indicators and Reactor
18    Oversight Process.
19              I am here today to talk a bit about the experience
20    we have gained in the last year with the pilot program and
21    now with initial implementation and some of the things that
22    we have done with the PIs.
23              We have a process that we call the Frequently
24    Asked Question process, perhaps a bit of a misnomer. 
25    Typically they are questions asked once that we address.
.                                                                18
 1              We have over 230 of those that we have officially
 2    answered and a number in line to be answered.  Many of those
 3    resulted in clarification for the guidelines.  A few
 4    resulted in some changes.  We recognize the need to make
 5    some changes to what we were doing.  We continue to get
 6    these and we will continue to get these for a number of
 7    reasons.
 8              The first bullet on my slide talks about the
 9    complicating factors that have generated a lot of these
10    questions.  One is the variety of plant designs.  An example
11    of that would be the post-accident recirc mode of what we
12    have been calling the residual heat removal system, which is
13    a typical Westinghouse design, and it turns out the CE
14    plants have a very different means for performing
15    post-accident recirc, and so we had to address that issue
16    and come up with a definition of how they should report the
17    equipment that performs that function.
18              Another variable is the tech specs, a lot of tech
19    spec requirements that vary from plant to plant.  An example
20    of that is the way that licensees measure reactor coolant
21    system leakage.  Some plants measure total leakage and
22    unidentified leakage.  Other plants measure identified
23    leakage and unidentified leakage.  Some PWRs include primary
24    to secondary leakage in the totals.  Some don't.
25              With Indian Point it just so happens that they did
.                                                                19
 1    include primary to secondary leakage in the totals so the PI
 2    for Indian Point did fall out the bottom.  In another plant
 3    that might not have happened.
 4              Another factor is a difference in operating
 5    procedures.  We came across this with regard to the scram
 6    with loss of normal heat removal indicator and what we said
 7    was that if the indicator would count those events where the
 8    normal heat removal path through the main condenser was lost
 9    and that path includes main feedwater, and we found at least
10    one plant -- a boiler -- where they have been
11    instructed, when they hit double-0 on a scram, if HPCI and
12    RCIC start, to just let it go, or even if they don't start and
13    the feedwater pumps continue to run, to let the level fill
14    up until they hit the trip point and trip the main feedwater
15    pumps and that they should concentrate on the other scram
16    recovery actions, so that is something that they were
17    trained to do.  They would lose main feedwater and so we had
18    to address that issue -- should it count or should it not
19    count.
20              We told them that if that is their procedure and
21    that is the way they operate the plant that that would not
22    count.
23              Those are just some of the issues and, as I say,
24    there's a couple hundred of those.
25              Another factor here is the licensee response to
.                                                                20
 1    the program.  I guess as we might have expected, we have set
 2    these thresholds and so the licensees will set internal
 3    thresholds below those to ensure they don't exceed the
 4    threshold and go White.
 5              Of course, for every layer of management there is
 6    another lower threshold and what we wind up with is
 7    licensees who want to have zero events.  They want to keep
 8    the unplanned power changes.  They want to try to not have
 9    to count every unplanned power change, every safety system
10    function failure -- all of these kinds of events, and that
11    is a problem.
12              The program wasn't designed to work that way.  The
13    program was designed to provide this normal operating band
14    in which they could operate the plant and have a certain
15    number of events that we would leave to their corrective
16    action program and we would not get involved, so those kinds
17    of issues, this desire to minimize or reduce number of
18    events to zero, has generated a lot of questions as well.
19              Probably the most significant factor, however, is
20    the unintended consequences, and those are perceived
21    differently I think by different groups.
22              The scram issue, the fact that we count manual
23    scrams, and of course INPO has not done this in the years
24    that they have been collecting their performance indicators,
25    that rose to a very high level in industry.  A few high
.                                                                21
 1    level people are concerned about the unintended consequences
 2    of counting manual scrams and the effect it might have on
 3    operators, inhibiting them from performing a manual scram
 4    when necessary.
 5              We have not been terribly concerned with that.  We
 6    have not included manual scrams in the AEOD PIs, but we
 7    always track manual scrams.  We watched those over the
 8    years.  They have remained relatively constant.
 9              Even though the automatic scrams have come down
10    significantly, the manual scrams have come down only about
11    20 or 30 percent where the automatic scrams, as you know,
12    have come down significantly more than that.
13              We will continue to watch the manual scrams.  We
14    really aren't terribly concerned about an operator not
15    manually scramming the reactor when he thinks it is
16    necessary, but we have to address the industry's concern, so
17    what we are doing is --
18              DR. APOSTOLAKIS:  I am not sure it is the
19    industry's concern, you know?
20              I have seen letters from very senior people
21    complaining about this, but then I have been in meetings
22    where actual operators are and other experienced people and
23    they just dismiss the issue, that the operators will never
24    think that way, so I don't know where the senior people got
25    this idea that this will inhibit the operators.
.                                                                22
 1              MR. GILLESPIE:  George, there is a difference, as
 2    we were going through this, talking to different senior
 3    people.  The industry is definitely not a monolith on this
 4    question.
 5              DR. APOSTOLAKIS:  Exactly.
 6              MR. GILLESPIE:  There were several very, very
 7    senior, very large utilities -- Southern Company was most
 8    notable -- that came in and said we object to this, so in
 9    this case I would not want to say we had a monolithic
10    industry.
11              DR. APOSTOLAKIS:  Exactly.
12              MR. GILLESPIE:  There was an industry concern in
13    one segment of the industry but then there was Dave Garchow,
14    people at other utilities, who said this is no big deal, our
15    guys are trained.  They are going to do the right thing and
16    you have given us three scrams.  It is not zero.
17              So there is a difference of opinion out there but
18    Don's stuck addressing it.
19              DR. APOSTOLAKIS:  And it also seems to me, and you
20    touched upon it, we cannot talk about removing manual scrams
21    from the PI without at the same time -- to finish the
22    sentence -- talking about the numbers you have put forth.
23              That three has to change, in my view.  The numbers
24    were set, the way I understand it, having certain PIs in
25    mind.
.                                                                23
 1              MR. GILLESPIE:  Right.
 2              DR. APOSTOLAKIS:  Now if you start removing things
 3    are you keeping the three and the six and the 25 --
 4              MR. GILLESPIE:  Well, let me let Don -- those --
 5    one is normal operations, the other is risk-informed.  In
 6    this case normal operations is a better indicator than
 7    risk-informed maybe, but he's got a different alternative. 
 8    He can touch upon that.
 9              DR. APOSTOLAKIS:  Very good.
10              MR. HICKMAN:  You are absolutely right.
11              If we change the indicator, we have to address the
12    thresholds and we will do that.
13              DR. APOSTOLAKIS:  Sure.
14              MR. HICKMAN:  What we are doing right now is we
15    have begun a pilot program with 21 utilities -- or 21 plants
16    to try out the replacement indicator that's been agreed
17    upon.  It is called Unplanned Reactor Shutdowns.
18              This is what Frank was referring to earlier.  It,
19    unfortunately, has a time limit in it.  It talks about 15
20    minutes from the time of commencing the insertion of
21    negative reactivity until reactor shutdown.
22              The idea is to capture the same information that
23    we captured with the old definition, which said very clearly
24    manual and automatic scrams, but without using the word
25    "scram" -- that is the concern to the senior people in the
.                                                                24
 1    industry.
 2              We will collect the first data December 21st for
 3    two months and we will run it for a total of six months and
 4    we will look at what we have.  As Frank mentioned already,
 5    Pacific Gas & Electric and Diablo Canyon have complained
 6    that 15 minutes is too short.  On the other hand, we have
 7    received complaints from another utility who said 15 minutes
 8    is too long, so we are kind of stuck in here in the middle.
 9              My gut feeling is that whenever we have a time
10    limit we set ourselves up for problems and that we are going
11    to probably see that in this indicator.  It was the best
12    that we could come up with given the constraints and the
13    time we had to do it, so we will see what happens and we
14    will keep you involved.
15              MR. LEITCH:  Will that new indicator also include
16    therefore automatic scrams, since they are instantaneous?  I
17    mean they are less than 15 minutes obviously.
18              MR. HICKMAN:  Yes.  It is intended to capture all
19    automatic and manual scrams.  The concern was it might
20    capture a few others, but the intent was to try to make it
21    capture exactly the same data as the old definition.
22              DR. APOSTOLAKIS:  So it is 15 minutes now?
23              MR. HICKMAN:  That is the way that it is worded.
24              It says 15 minutes from the time of insertion of
25    reactivity until -- negative reactivity until you are shut
.                                                                25
 1    down.
 2              Oh -- that would also include -- we have two
 3    indicators that count scrams.  One is just the scrams, what
 4    used to be called unplanned reactor scrams, and the scrams
 5    with loss of normal heat removal.  Both of those have been
 6    changed to be worded the same way and we'll see what the
 7    results of that will be.
 8              The other indicator in the initiating events
 9    cornerstone was unplanned power changes, and we have always
10    had more concern about that indicator.
11              Again, it has numbers and the numbers start
12    causing problems.
13              It has the 72 hour rule, and the way the indicator
14    is worded is that what is important is whether the licensee
15    has planned the shutdown or not, and we use a 72 hour rule
16    which simply says that if 72 hours have elapsed from the
17    time you have identified an off-normal condition until you
18    begin the reactor shutdown, then we consider it planned.
19              Now it does not matter what degree of planning has
20    occurred during that time.  It is simply 72 hours.
21              The other aspect of that, the other numerical
22    aspect of that is that it talks about a greater than 20
23    percent power change.
24              We've had a number of questions, a number of
25    issues about both the 72 hours and the 20 percent.  We think
.                                                                26
 1    that there have been some cases where this indicator has
 2    influenced licensees to do things they would not have done
 3    otherwise, either to perhaps wait 72 hours and try to ride
 4    the problem out, or to come down less than 20 percent in
 5    power, see if they can fix it, and if they can, then they're
 6    off free, and if they can't, they'll have to come down
 7    further to fix it.
 8              Those are the kinds of issues that we're concerned
 9    about there.  We are discussing alternatives with the
10    Industry Working Group, NEI and their representatives from
11    the utilities, and we intend to pilot an alternate indicator
12    to replace the unplanned power changes.  We would hope that
13    that would start relatively soon.
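          A minimal sketch, for illustration only, of the reporting criterion described above: a power change counts against the indicator if it exceeds 20 percent of full power and the shutdown began less than 72 hours after the off-normal condition was identified.  The function name and record of the White threshold of eight are taken from this discussion; everything else is an assumption, not program documentation.

```python
# Illustrative sketch of the unplanned power changes criterion as described
# in the discussion above.  Names and structure are illustrative assumptions.

def counts_as_unplanned(power_change_pct: float, hours_since_identified: float) -> bool:
    """Return True if the power change would count against the PI."""
    return power_change_pct > 20.0 and hours_since_identified < 72.0

# Example: a 30 percent down-power begun 24 hours after the problem was found
# counts; the same change deferred past 72 hours is treated as planned.
assert counts_as_unplanned(30.0, 24.0) is True
assert counts_as_unplanned(30.0, 80.0) is False

# Per the later discussion, the indicator threshold is eight such changes
# before the PI goes White.
```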
14              I should mention that we have a formal process for
15    making changes to the program.  We've made some changes
16    early on that maybe weren't thought out completely, and in a
17    few cases, have caused a few more problems.
18              What we've tried to do now is to have a very
19    structured and formal process that would be very deliberate
20    so that the changes we make are thoroughly understood and
21    piloted, and we know -- we have some idea of how they're
22    going to work before we make the changes.
23              DR. LEITCH:  In the 72 hours, I would think that a
24    lot of times, a situation comes up, an elective power
25    reduction that could be done, say, within 24 hours, or could
.                                                                27
 1    be done beyond 72 hours.  Maybe the system operator makes
 2    that decision as far as wait till the weekend.
 3              So if this situation occurs on a Friday, he might
 4    say, well, might as well do it right now, although it could
 5    have been deferred for 72 hours.  Does that enter into this
 6    expression, that it could have been deferred for 72 hours? 
 7    Or is it actually deferred for 72 hours?
 8              MR. HICKMAN:  The guidelines say that the degree
 9    of planning is measured only by the 72 hours, and not by
10    whatever paperwork or anything else that you've done.  But
11    that issue has come up.
12              We had early in the pilot program, a licensee who
13    had a problem that had occurred in the past.  He had all the
14    paperwork to fix it then available.
15              He felt that he could go out and fix it right away
16    and it would be planned.  He already planned it.  So he
17    started the shutdown before 72 hours.  And what he was told
18    was that it's the 72 hours that's the determining factor,
19    not what planning you have done in the past.  That is an
20    issue.
21              There are many cases where they conceivably could
22    do it very well, have it well controlled and well planned in
23    much less than 72 hours.
24              MR. GILLESPIE:  This is one of those cases where
25    you have to keep in mind that the indicator isn't zero.
.                                                                28
 1              DR. LEITCH:  Right.
 2              MR. GILLESPIE:  It allows for this, but the
 3    institutions have driven themselves to say, well, we don't
 4    want to use that allowance, just in case something happens
 5    later.
 6              DR. LEITCH:  Sure.
 7              MR. GILLESPIE:  And so we've got a disconnect
 8    between the theory and how we thought it would apply and how
 9    the institutions are actually applying it and driving things
10    to zero inappropriately.
11              DR. LEITCH:  This could be an unintended
12    consequence if you defer maintenance for 72 hours and you're
13    all ready to go in 24, but it's an issue of managing the
14    indicators.  That's what happens with performance
15    indicators.  People manage the indicators.
16              MR. GILLESPIE:  And they're managing them because
17    it's a just-in-case; in case something comes up I haven't
18    anticipated in six months, I don't want to already be at one
19    or two.
20              CHAIRMAN SIEBER:  It seems to me that under some
21    circumstances, an indicator like that that people are
22    trying to manage, too, might be adverse to safety or good
23    operating practice.
24              MR. GILLESPIE:  It's interesting that the
25    inspectors haven't seen this, and we get it anecdotally, and
.                                                                29
 1    it's almost like every utility talks about how another utility
 2    would do it.
 3              So, it's a tough one to get your arms around,
 4    because we don't have specific instances that we can sit
 5    down and analyze and say, well, does that mean the threshold
 6    should be adjusted, if, in a practical sense, there are two
 7    or three of these a year, shouldn't we make allowances for
 8    that?
 9              And that level of detail and discussion doesn't
10    seem to quite gel on this one yet.
11              CHAIRMAN SIEBER:  On the other hand, I don't think
12    that you specifically need examples of situations that have
13    occurred that have brought decisions one way or the other.
14              It seems to me that logic would indicate that
15    there might be or could be some situations where this would
16    be adverse to safety.
17              MR. GILLESPIE:  We agree, and what we're grappling
18    with is what's the alternative to it that still gets to the
19    safety meaning we are trying to get to, but allows the
20    plants to operate.
21              CHAIRMAN SIEBER:  I can think of instances in my
22    own career where we reduced reactor power to reduce
23    radiation dose because we had sent people into the
24    containment.  And, you know, to me that was the right thing
25    to do.
.                                                                30
 1              Under this system, you get penalized for it.
 2              MR. HICKMAN:  Another issue has been brought up by
 3    NEI that under deregulation, licensees may reduce power at
 4    night at low load times to fix minor problems so that they
 5    can be sure to operate before the peak load the next day.
 6              And this might change -- might result in more
 7    power changes.
 8              CHAIRMAN SIEBER:  And you have the other situation
 9    which we had frequently where the system operator would call
10    you and say, the voltage is too high on the system, and you
11    have to cut back.  And we did that every weekend.  I don't
12    know how that fits into the grand scheme of things, but, you
13    know, we followed the letter.
14              MR. HICKMAN:  Well, this indicator has a fairly
15    high threshold.  It's set at eight, and so we allow for it. 
16    That means you have to exceed eight to go white, so we have
17    allowed a significant number.
18              CHAIRMAN SIEBER:  That would take me eight weeks.
19              DR. BONACA:  It seems that in the example you gave,
20    you know, the operator coming down at night to fix something,
21    it is almost the definition -- the 72 hours as part of
22    the definition -- that really is the problem, rather than the
23    concept.
24              DR. KRESS:  Particularly for things that routinely
25    they do a lot.  You know, it's the same process and if they
.                                                                31
 1    do it routinely, you could probably relax that 72 hours and
 2    still call it a planned power change.
 3              MR. GILLESPIE:  That's why we're talking to them. 
 4    Do you want to address the alternatives?
 5              MR. HICKMAN:  Well, this indicator, we like it
 6    because we benchmarked this two years ago, and it showed a
 7    pretty good correlation with watchlist plants, trend-letter
 8    plants, and -- but what we used at that time was the best
 9    information we had, which was from the monthly operating
10    reports.
11              And in the MORs, licensees had to tell us about
12    changes in average daily power level of greater than 20
13    percent from one day to the next.
14              That's actually what we benchmarked, and so that's
15    one of our proposals.  Now, we've got two of them, and
16    that's one that we go back to exactly what we benchmarked,
17    exactly what licensees had been reporting in the monthly
18    operating reports for many years.
19              And that's changes in average daily power level. 
20    And we would count those any time that those occurred.
21              Those are a little more difficult to manipulate,
22    because we're not talking about a rapid power change and
23    then coming right back; we're talking about being down in
24    power long enough to make a change of 20 percent in average
25    power.
.                                                                32
 1              DR. KRESS:  That would smooth out your time
 2    problem.
 3              MR. HICKMAN:  Yes, yes.  That would solve a few of
 4    the problems.
 5              DR. LEITCH:  Does it depend on whether you do
 6    maintenance during that power reduction?  In other words,
 7    say, on a Sunday night, you just come down 20 percent
 8    because of a load-carrying situation and that's one situation?
 9              But say you're in that period of time and you also
10    decide, well, the water box differential looks a little
11    high, and we'll clean the tube sheet while we're down.  Does
12    that make a difference as to whether that's an event or not
13    an event?
14              MR. HICKMAN:  It could.  In the monthly operating
15    reports, they had to report any change in average daily
16    power level, but for each one, they had to indicate whether
17    it was forced or scheduled.
18              Now, if it was an equipment problem that forced
19    them to come down, as opposed to a scheduled power
20    reduction, we could tell the difference.
21              And actually, what we benchmarked was just the
22    forced power reductions, so we could do it either way; we
23    count them all, or we could identify just the forced power
24    reductions and count those.
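          A sketch of the benchmarked alternative Mr. Hickman describes: flag changes in average daily power level of greater than 20 percent from one day to the next, optionally counting only those marked as forced rather than scheduled.  The record layout and names below are illustrative assumptions, and only power reductions are counted here.

```python
# Sketch of the monthly-operating-report style alternative described above.
# Assumption: only day-to-day reductions in average daily power greater than
# 20 percent of full power are counted, optionally restricted to forced ones.

from dataclasses import dataclass
from typing import List

@dataclass
class DailyPower:
    avg_power_pct: float   # average daily power, percent of full power
    forced: bool           # True if the reduction that day was forced

def count_power_level_changes(days: List[DailyPower], forced_only: bool = False) -> int:
    count = 0
    for prev, curr in zip(days, days[1:]):
        drop = prev.avg_power_pct - curr.avg_power_pct
        if drop > 20.0 and (curr.forced or not forced_only):
            count += 1
    return count

# Example: 100% -> 70% overnight is a >20% change in average daily power;
# a brief dip that barely moves the daily average would not register.
history = [DailyPower(100.0, False), DailyPower(70.0, True), DailyPower(98.0, False)]
assert count_power_level_changes(history, forced_only=True) == 1
```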
25              DR. LEITCH:  I see.
.                                                                33
 1              MR. HICKMAN:  Safety system unavailability is
 2    probably the most contentious indicator.  It's got the most
 3    issues associated with it.
 4              We borrowed this one from WANO for a couple of
 5    reasons:  It was available; licensees had been reporting it
 6    for many years to WANO.
 7              But as we got into it, we found some problems.  A
 8    lot of these, I think, had to do with the closer scrutiny
 9    that we were giving to the reported data with the inspectors
10    doing verification inspection.
11              So licensees became much more concerned about
12    getting this data accurate, and so a number of -- many of
13    the licensees have changed the way they calculate the -- the
14    way they collect the data, to make it conform.
15              Probably the two biggest issues in the indicator
16    are the use of fault exposure hours.  WANO does not have an
17    unreliability indicator.  So they got some unreliability
18    information into the unavailability indicator by counting
19    fault exposure hours for surveillance test failures.
20              And that causes a problem.  We've had three
21    licensees in initial implementation fail an 18-month
22    surveillance test, and so if you take half the time, they
23    have nine months of unavailable hours for the system.
24              And it's really not representative, I guess, of
25    the risk significance of that failure.  In some cases, they
.                                                                34
 1    could have recovered relatively easily.
 2              That's an issue; what do we do with fault exposure
 3    hours?
 4              The other significant issue with this indicator
 5    has to do with -- oh, the WANO indicator allows licensees
 6    not to count unavailable hours when the train is not
 7    required, even though the system may be required, the
 8    function may be required.
 9              And an example of that is -- the best example is
10    emergency diesel generators when you're shut down.  If
11    you're in cold shutdown or refueling or de-fueled, you're
12    only required to have one diesel.
13              Therefore, the other diesel can be taken out and
14    you can do anything with it that you want to, and you don't
15    have to count it.
16              That means that licensees who do diesel generator
17    maintenance while shut down, don't have to count the biggest
18    hunk of the unavailable hours of the diesel, the overhaul.
19              But those that are doing it online would have to
20    count it by those rules.  Within the last few years in the
21    90s, we have issued a couple of dozen extended AOT tech
22    spec changes to extend the AOT to allow licensees to do that
23    work online, and they have justified that with a risk
24    analysis and the requirement in Reg Guide 1.177 says on the
25    order of five times ten to the minus seventh, incremental
.                                                                35
 1    conditional core damage probability.
 2              And they have shown that, and the argument is, if
 3    it's no risk here to do it online, then when I'm shutdown,
 4    why would I have to count it here and not have to count it
 5    there?
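          The on-the-order-of five-times-ten-to-the-minus-seventh figure cited above is the incremental conditional core damage probability guideline in Reg Guide 1.177; roughly, the increase in core damage frequency while the equipment is out of service multiplied by the duration of the outage.  A small illustrative calculation follows; the numerical inputs are made up, not taken from any licensee submittal.

```python
# Illustrative check against the Reg Guide 1.177 ICCDP guideline cited above.
# ICCDP is approximately (conditional CDF increase while the equipment is out
# of service) x (out-of-service duration).  Input values are assumptions.

HOURS_PER_YEAR = 8760.0
ICCDP_GUIDELINE = 5.0e-7          # on the order of 5e-7, per the discussion

def iccdp(delta_cdf_per_year: float, outage_hours: float) -> float:
    return delta_cdf_per_year * (outage_hours / HOURS_PER_YEAR)

# Example: taking a diesel out on line raises CDF by an assumed 2e-5 per year
# for a 7-day (168-hour) allowed outage time.
value = iccdp(2.0e-5, 168.0)      # ~3.8e-7
print(f"ICCDP = {value:.2e}, within guideline: {value < ICCDP_GUIDELINE}")
```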
 6              We've addressed that issue and we have made an
 7    exemption for people who have shown by risk analysis that
 8    there is, in fact, no difference in this.
 9              But that's opened the doors and we've gotten all
10    kinds of other requests now for other things, and once you
11    open the doors, that --
12              DR. KRESS:  Well, that gets you back to this
13    plant-specific PI, I think.
14              MR. HICKMAN:  Exactly.
15              DR. KRESS:  And I think that's what it's going to
16    end up being.
17              MR. GILLESPIE:  It's surprising, and what we've
18    done in trying to -- we talked about this way, way back in
19    the beginning -- in trying to apply a consistent oversight
20    process and consistent measures to all the utilities, what
21    it's done is started to expose the inconsistencies in the
22    requirements themselves.
23              Some people have this tech spec; some people have
24    that tech spec; some people have this leakage, some people
25    have that leakage, and what it's doing is, the warts are
.                                                                36
 1    coming -- the warts in our requirements are coming out via
 2    the oversight program to some extent.
 3              And now we're trying to deal with the warts, and I
 4    think we have to be very careful that we don't deal with
 5    them incorrectly; that if the underlying problem is in the
 6    requirement, that we don't do too much adjusting on
 7    oversight and not get at the right problem.
 8              So there's a caution there and a real reason why
 9    we have a very deliberate slow process for change.  It's an
10    interesting measure to come up, and we always thought the
11    requirements would now have to consistently catch up with
12    us.
13              And they really are and that kind of insight is
14    starting to come out of this.
15              DR. APOSTOLAKIS:  Also, I think you will remember
16    that we've had the problem in the past with the definition
17    of unavailability.
18              The industry simply takes the down time divided by
19    the total time, but in PRAs that's not the unavailability;
20    that's only a component, and it includes the unreliability,
21    I believe, that the industry has.
22              And for some reason, people don't want to develop
23    a consistent set of definitions.  I don't know why, but --
24              MR. HICKMAN:  And that's a big issue within
25    industry, too.  They want to have -- to report one set of
.                                                                37
 1    data, one time, but we have a variety of different
 2    definitions between WANO --
 3              DR. APOSTOLAKIS:  Do you have any idea why they
 4    don't want a uniform set of definitions?  I don't understand
 5    that.
 6              I had a telephone conversation with some folks at
 7    Research two or three weeks ago, and they were adamant that
 8    we shouldn't do it.
 9              It's a mystery to me why we shouldn't even try.  I
10    guess these definitions are so entrenched within the
11    industry that if you try to change them, it would be a major
12    task, so it's probably easier to change the PRA, guys, and
13    say that at least in the PRA, we start talking about it.
14              But there's such a reluctance to have a uniform
15    definition that we had it also with an appendix that was in
16    one of the NEI documents, I remember, related to the
17    maintenance rule, where the definition, again, was
18    inconsistent with --
19              DR. BONACA:  And we recommended, in fact, that
20    there would be a definition provided, and that -- because it
21    was brought to us as a presentation that the lack of a
22    consistent definition, in fact undermines the benefits of
23    certain rule implementation for station blackout.
24              MR. GILLESPIE:  Yes, in this case there now is a
25    major industry effort with us, but actually the industry is
.                                                                38
 1    driving it to come up with a uniform definition.  As it
 2    happens, the INPO-WANO piece of this is more awkward for
 3    them than the NRC piece.  In fact, the maintenance rule and
 4    the oversight definition are -- we're going to get the words
 5    changed, but they're going to be exactly the same, and
 6    there's no problem on the NRC side of things with making
 7    those two exactly the same.
 8              And the industry likes our definition better than
 9    they like INPO's and WANO's.  So they're going against their
10    own institutional momentum.
11              They were participants in developing ours, much
12    more recently than the INPO and WANO development took place,
13    and so now they're trying to harmonize the other interests
14    to the industry and the NRC's.
15              DR. APOSTOLAKIS:  So your definition, then, is
16    what, of unavailability?
17              MR. HICKMAN:  We use the definition that came out
18    of the WANO indicator, which says that it is the unavailable
19    hours divided by the hours the train was required to be
20    operable.
21              DR. APOSTOLAKIS:  So it still does not include the
22    probability of the failure to respond?
23              MR. HICKMAN:  It does not, no.
24              DR. APOSTOLAKIS:  Even if it's available.
25              MR. HICKMAN:  We feel -- I personally feel a great
.                                                                39
 1    responsibility to try to work on common definitions in
 2    everything that we do in this process.
 3              And we've done that in a few areas, with the
 4    maintenance rule people, with the PRA people, people in
 5    Research, with the reporting requirements people, and in our
 6    program in a few areas.
 7              This is a big one.  The industry is pushing all of
 8    us to do something here.  The real difficulty is working
 9    with WANO.
10              INPO represents the U.S. utilities, and we can
11    work with them better than we can work with WANO where they
12    have the influence from the European and Japanese plants.
13              That is more difficult, but it's something that we
14    will pursue.  We need to pursue that.
15              DR. APOSTOLAKIS:  Now, if you have an interval
16    between tests of T, and it's unavailable for tau, then you
17    take one-half of tau over T or tau over T for the
18    unavailability?  Do you remember?  If you don't remember,
19    it's okay.
20              MR. HICKMAN:  We used T over two in those cases
21    where you have no knowledge of when it failed.  You only
22    know that I ran the test and it didn't work.  The last time
23    I know that it worked was 18 months ago.
24              DR. APOSTOLAKIS:  So you divide by two?
25              MR. HICKMAN:  So you divide it by two.
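          A worked illustration of the T-over-two convention just described, using the 18-month surveillance test failure mentioned earlier: with no other information, half the interval is charged as fault exposure hours, which is where the roughly nine months of unavailable hours comes from, and if the failure time can be established only the actual hours would be charged.  The numbers below are illustrative.

```python
# Worked illustration of the T/2 fault exposure convention described above.
# If a surveillance test fails and the time of failure is unknown, half the
# surveillance interval T is charged as fault exposure (unavailable) hours.

HOURS_PER_MONTH = 730.0            # rough average, for illustration

surveillance_interval = 18 * HOURS_PER_MONTH      # ~13,140 hours
fault_exposure_hours = surveillance_interval / 2  # ~6,570 hours, about 9 months

# Contribution to the unavailability indicator (unavailable hours divided by
# hours the train was required to be operable), assuming the train was
# required for the whole interval:
unavailability = fault_exposure_hours / surveillance_interval
print(f"fault exposure ~ {fault_exposure_hours:.0f} h, "
      f"unavailability contribution = {unavailability:.2f}")   # 0.50

# If root cause work showed the failure occurred, say, 16 hours before the
# test, only those 16 hours would be charged instead:
print(f"known-downtime case: {16 / surveillance_interval:.1e}")  # ~1.2e-3
```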
.                                                                40
 1              DR. APOSTOLAKIS:  Now, the division by two is to
 2    -- if you average the unavailability over time, and you do
 3    it, you know, in the PRA context for a long time horizon,
 4    then it makes sense to divide by two.
 5              But if you go actually and find that this
 6    component was down during this interval for a certain period
 7    of time, I don't know that you need to divide by two.  Why
 8    should I take the average?  I can find the exact
 9    availability.  It was down for 16 hours and the interval,
10    you know, was 700 hours.
11              DR. KRESS:  And if you don't know how long the
12    interval was, you'd want to multiply it by something, rather
13    than divide.
14              DR. APOSTOLAKIS:  I don't know why you have to
15    divide by two.
16              DR. KRESS:  I don't either.
17              DR. APOSTOLAKIS:  It's these details that somehow
18    we -- I mean, this comes from a PRA.  We don't like the
19    definition of PRA, so PRA is different.  But then we see the
20    one over two in the PRA, we like that, we take it and use
21    it.
22              This is like picking and choosing from
23    risk-informed regulations.  We use what we like.
24              I think we need to have a group sit down and write
25    down what exactly we mean by all these things, and why we
.                                                                41
 1    divide by two, why we don't divide by two, and have a common
 2    understanding of these things.
 3              MR. HICKMAN:  Well, you're right and you hit on a
 4    good point, that the T over two tends to dominate the
 5    statistics.
 6              DR. APOSTOLAKIS:  Sure.
 7              MR. HICKMAN:  It only takes one failure, and it
 8    dominates the statistics.
 9              DR. APOSTOLAKIS:  But its purpose was different. 
10    It was the long-term unavailability, you know, to eliminate
11    transients.
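
          As an illustrative note on the convention being debated (not a
statement of the program's definitions): if a failure is found only at a
periodic test and the failure time is taken as uniformly distributed over the
test interval \(T\), the expected undetected downtime is

    \[
    \mathbb{E}[\text{downtime}] \;=\; \int_0^T (T - t)\,\frac{dt}{T} \;=\; \frac{T}{2},
    \]

which is the T-over-two charge described above.  If the actual downtime
\(\tau\) over the required hours \(H\) is known, the exact contribution is
\(\tau/H\); with the numbers used above, \(16/700 \approx 0.023\), whereas a
\(T/2\) charge over that same 700-hour interval would be \(350/700 = 0.5\),
which is why a single such failure can dominate the statistic.
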
12              MR. HICKMAN:  The tradeoff comes with the amount
13    of root cause analysis you have to do.  How much time and
14    effort do I want to put into this failure to try to figure
15    out exactly why it happened, so that I can determine with a
16    high degree of probability, when the failure actually
17    occurred, rather than taking the T over two.
18              We've had licensees go to great lengths, trying to
19    do that.
20              DR. APOSTOLAKIS:  Were they successful?
21              MR. HICKMAN:  It was a random resistor failure. 
22    They sent it to a lab to try to determine the time of
23    failure of the resistor, spent quite a bit of money, spent
24    quite a bit of time, and they came up with what they thought
25    was the most probable time.
.                                                                42
 1              They had to de-energize the circuit, and when they
 2    re-energized it, they think that that's probably when it
 3    blew, and they were probably right, but they couldn't prove
 4    it, of course.
 5              MR. GILLESPIE:  George, this is one where Pat
 6    Baranowski and his team are supporting us to try to come up
 7    with reliability indicators of some kind, so that we can
 8    get rid of this term.  No one feels comfortable with the
 9    term, versus risk-based PIs in total.  He had a near-term
10    deliverable to us to try to see if he could come up with a
11    scheme on reliability.
12              So we're hoping we come up with that, and then
13    we're kind of more in tune with the real number we want to
14    look at, how reliable is it when you go to turn it on?
15              The other piece that's interesting is that if the
16    component is that important, why do we let the surveillance
17    interval be that long?
18              Well, because the component probably isn't all
19    that important.  So that's another aspect that kind of gets
20    into are we measuring the right thing, are we judging it the
21    right way?
22              DR. LEITCH:  In the area of unintended
23    consequences, if there's too much pressure to drive down
24    unavailability, it can have an adverse effect on
25    reliability; a lot of this unavailability is for maintenance
.                                                                43
 1    on systems.
 2              MR. HICKMAN:  You're right.
 3              DR. LEITCH:  If there's too much pressure to not
 4    do that, to push that out into the future, that can have a
 5    negative impact on reliability.
 6              MR. HICKMAN:  Yes, industry has mentioned that. 
 7    Of course, this is an indicator where zero is not good.
 8              DR. LEITCH:  Right.
 9              MR. HICKMAN:  You don't want to be zero,
10    necessarily.  You want to have some preventive maintenance,
11    and we understand that.
12              We used the WANO data from the last three years to
13    set the thresholds.  When we got into the program, licensees
14    began to realize that they had not been reporting completely
15    to WANO.
16              So the thresholds may or may not be set
17    appropriately.  What we've told them is, we will continue to
18    monitor the thresholds, and if we see more up-to-date, more
19    realistic reporting pushing the thresholds, we'll look to
20    see what we can do.
21              But we need the data.  We don't know what
22    threshold to set if we don't have the right data.
23              DR. LEITCH:  We're not trying to drive this to
24    zero.
25              MR. HICKMAN:  Definitely not.  In fact, WANO, when
.                                                                44
 1    they set their goals for that, they have a lower goal and an
 2    upper goal, and they set a band.
 3              DR. LEITCH:  Right.
 4              MR. HICKMAN:  Let me go on to the next one, the
 5    safety system functional failures.  This is an indicator,
 6    copied exactly after the AEOD indicator called Safety System
 7    Failures.  We've been using that one for 15 years.
 8              It correlated very well with the watchlist plants
 9    and the declining trend plants.  It picked out every one of
10    them.
11              We're trying to reproduce the same thing with
12    licensees, making that determination on their own.  We had
13    the Idaho lab reading LERs and coding these events.
14              And now the licensees are doing that.  Probably
15    the biggest issue in this indicator is the reactor core
16    isolation cooling system.
17              That system is reportable by about a third of the
18    boiling water reactors.  The others have not in the past
19    been reporting it.
20              It is a risk-important system.  It shows up and
21    ranks very highly.  And, in fact, at some of the plants,
22    it's right behind emergency diesel generators and
23    high-pressure coolant injection.
24              So in a risk-informed program like ours, we need
25    to count RCIC, but in the past, we would not know about all
.                                                                45
 1    the RCIC failures because they weren't in LERs.  As I said,
 2    only about a third of the plants put them in there.
 3              We're now discussing with NEI in our meetings
 4    about adding RCIC to the list for reporting safety system
 5    functional failures.
 6              The way it's worded now is, to capture exactly
 7    what we did in the past, we ask them to report to our
 8    program, any of the failures that required reporting in
 9    accordance with 10 CFR 50.73, the section that says any
10    event or condition that could prevent fulfillment of a
11    safety function.
12              But some licensees didn't report RCIC failures
13    under that at all.  So we're adding that one or we would
14    like to add that one.
15              MR. GILLESPIE:  Again, an example where you try to
16    do something uniform, the inconsistencies of the past start
17    showing up; that some facilities -- I don't know if you're
18    familiar with this one, but some facilities did not account
19    or did not need to take RCIC into account in their accident
20    analysis, so they said it's not a safety system we need to
21    report, and other facilities did.
22              Well, it's safety-significant at all those
23    facilities, independent of how they did that licensing
24    analysis.  So again, so Don's caught betwixt and between
25    now.
.                                                                46
 1              DR. KRESS:  That one strikes me as not being the
 2    same character as the other unintended consequences.
 3              MR. GILLESPIE:  This one doesn't have a down side.
 4              MR. HICKMAN:  The last bullet is on the problem
 5    PIs.  All of the indicators in those two cornerstones, the
 6    barrier integrity and the physical protection, are under
 7    review.
 8              We eliminated the containment leakage performance
 9    indicator that used to be in the barrier integrity
10    cornerstone, for the reason that Frank has talked about
11    here; that is that the requirements to measure and record
12    containment leakage varied so much, and in some cases, they
13    were not even required to report as-found leakage, only
14    as-left, and, of course, as-left had to be good or they
15    couldn't start up.  So it didn't tell us a whole lot.
16              We have similar kinds of issues with RCS activity
17    and RCS leakage.  They're measured different ways, with
18    different procedures, over different time intervals and all
19    of this.
20              And the activity measurement is something we
21    expected would tell us something about integrity of the fuel
22    clad barrier.
23              But, in fact, it was set to go white at 50 percent
24    and white/yellow at 100 percent of the tech spec limit, and
25    what we're seeing is that when you have a significant number
.                                                                47
 1    of fuel pins leaking, that you don't get very close to 50
 2    percent.  You maybe get ten or 15 percent.
 3              So the threshold is set pretty high.  And if you
 4    do have leakers, you're going to do whatever you can to
 5    minimize that by maybe reducing flux or making slow power
 6    changes.  You try to manage that and try to drive that
 7    number down.
 8              So that indicator, we don't think tells us a whole
 9    lot.
10              And I already mentioned about the RCS leakage, how
11    it's measured differently at different places, and some
12    things are included at one plant and not at another.
13              The physical protection cornerstone --
14              DR. KRESS:  On the leakage, on the barrier
15    integrity, do you have an indicator -- I forget what they
16    all are, but do you have one on the unfiltered in-leakage
17    into the control room envelope?
18              MR. HICKMAN:  No.
19              DR. KRESS:  That's not one of them?
20              MR. HICKMAN:  No, that's not.
21              DR. KRESS:  Okay.
22              DR. WALLIS:  It does seem though that leakage is
23    an important thing to know about, but because it's difficult
24    to do, you have to think more about how to do it.
25              MR. HICKMAN:  I believe that we could make a
.                                                                48
 1    useful indicator out of RCS leakage, but we don't have it
 2    now.  Right now, it measures only one of three parameters
 3    that could cause the plant to have to shut down.
 4              We measure either total or identified leakage, and
 5    we had one plant that had that at zero at one time.  They
 6    had some unidentified leakage that was slowly going up.
 7              It never got really close to the tech spec limit,
 8    but if it had, we would have had a PI that showed zero, and
 9    then the plant shuts down because of RCS leakage.
10              So it's not structured very well, and perhaps if
11    we restructure it, we can get something.
12              DR. WALLIS:  That doesn't mean to say you should
13    throw away the idea.
14              MR. HICKMAN:  No, I agree.
15              CHAIRMAN SIEBER:  It seems to me that the Staff
16    about 10 or 15 years ago had an inspection module on
17    leakage, RCS leakage, and in the process of applying that
18    inspection module to PWR plants, I think a lot of utilities
19    tried to adapt their calculational methods so that they
20    would be consistent with the NRC guidance.
21              Is there an opportunity to go and sort of redo
22    that or inform licensees as to what the Staff's position is
23    on calculating leakage?
24              MR. HICKMAN:  That might be a useful thing to do. 
25    I'm not really familiar, I guess, with what you're talking
.                                                                49
 1    about.
 2              CHAIRMAN SIEBER:  Yes, it goes back quite a number
 3    of years.
 4              MR. HICKMAN:  That may be something we should look
 5    at.
 6              DR. SHACK:  Those leakage requirements are in
 7    their tech specs, so it's not simply a matter of issuing the
 8    guide.  I mean, you literally have to change the tech spec.
 9              MR. GILLESPIE:  Yes, and I think this is one of
10    the things that's going to come out of the IP-2 lessons
11    learned report.  One of the recommendations was that we
12    should develop some PIs for steam generator leakage.
13              Well, then you have to ask why?  Well, because
14    everyone reports it all over the place right now, by tech
15    spec.  It's different ways, as Don said.
16              So then I've challenged this, and my question is,
17    if the problem is the reporting requirement and the tech
18    specs, then let's fix that problem and not have a de facto
19    fix through PIs.  If we have a reporting requirement that's
20    not right, let's fix the reporting requirement and don't do
21    a de facto fix by manipulating it through oversight.
22              And so that's kind of my question that I have on
23    the table, and we're all going to be looking at those kinds
24    of questions.  Indian Point brought it to a head.  What's
25    the right leakage number?  Is it consistent?  Do we have
.                                                                50
 1    apples and apples?
 2              And the answer to all of those things was no.
 3              We shouldn't fix it the wrong way, and I think
 4    that is going to be very important.  So this is going to
 5    be the first test of can we fix it the right way in the next
 6    year or so.
 7              DR. LEITCH:  I would think in some cases, plant
 8    instrumentation may be different plant-to-plant, so it may
 9    not be just as easy as --
10              MR. GILLESPIE:  Oh, you may not be able to monitor
11    what -- the right parameters.
12              DR. LEITCH:  Exactly.  Yeah, right, exactly.
13              DR. KRESS:  This barrier integrity cornerstone
14    seems to be a lot different animal than all your other PIs. 
15    You don't count numbers of incidents, you count amounts of
16    leakage or amounts of activity, or how close you approach
17    some figure of merit like a PTS or something.
18              It seems to me like all the plants are going
19    through some sort of process of leakage occasionally and
20    fatigue on the things, and approaching some sort of
21    degeneration of barriers to some extent, aging.  It seems to
22    me like when you are measuring performance of the plant, you
23    shouldn't be asking how many times it leaks or how much it
24    leaks, you should be asking what the licensee did about it. 
25    What was their response to it?
.                                                                51
 1              Is there some possibility of changing the nature
 2    of those PIs so that it really looks at -- I think you used
 3    to look at the response to it, how did they react to it. 
 4    Did they find it?  Did they find it in time?  Did they shut
 5    down and fix it?
 6              DR. APOSTOLAKIS:  I thought that was in the action
 7    matrix.  Was that in the action matrix, what the licensee
 8    proposed to do?
 9              MR. GILLESPIE:  Yeah.  The answer to both
10    questions is yes.  And if you go back to the first time we
11    came and talked about this PI, the PIs for containment are
12    kind of like a different animal.  It is almost like a
13    measure of how much you love your containment process, okay,
14    it really is.  It is how much do you love it on a day-to-day
15    basis.  The real risk issue is, do I have a big hole and is
16    the big hole going to be there when the accident occurs?
17              DR. KRESS:  Big holes you don't know about?
18              MR. GILLESPIE:  Yeah, big holes I don't know about
19    so I can't measure them.  And we haven't gotten there, the
20    damper that is open.  In fact, 10 years ago we used to have
21    a lot of dampers left open and valve lineups.  We are
22    actually not seeing a lot of those now.  They love their
23    containments more now and they seem to be more careful about
24    those things.
25              But, yeah, it is the ongoing -- what is the
.                                                                52
 1    ongoing measure to make sure that I don't have a hole in my
 2    containment that would be there in an accident.  The only
 3    containment that gives us an easy answer to that is probably
 4    the sub-atmospherics, the couple that there are.  Because if
 5    you can maintain a vacuum, you are probably okay.
 6              We don't have a similar parameter for the other
 7    containments.  In fact, that would be the ideal parameter
 8    would be to say if you can maintain some sense of
 9    atmospheric differential, the pressure inside and outside of
10    containment, I don't have a big hole, I am in good safety
11    space.  We haven't gotten there, though.
12              DR. KRESS:  Well, let me ask you a question about
13    that.  Dana and I are familiar with the big tanks up at
14    Hanford, and one of the things they do is measure the
15    diurnal pressure changes outside and inside, and there is a
16    lag between those, and the lag has to do with the leakage,
17    how fast, how much they are leaking.  Is that a possibility
18    for containments?  I mean you would have a continuous, sort
19    of a continuous measure, rather than a vacuum.  It is the
20    same thing as looking at --
21              MR. GILLESPIE:  I honestly can say I raised that
22    and was basically ignored by my staff, early on.
23              DR. KRESS:  Shame, shame on them.
24              [Laughter.]
25              MR. GILLESPIE:  No, I did raise that, because
.                                                                53
 1    everyone measures pressure, they have got the inches of
 2    mercury, water, whatever, inside and outside.  I would like
 3    to still keep that on the table, because to me that is a
 4    real safety measure, and on a continuous basis, so I have
 5    some integrity in my containment.
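
          A rough numerical illustration of the diurnal-lag idea raised above
(a sketch only: the first-order leak model, the rate constants, and the
pressure signal are invented for the example and are not a qualified
containment model):

    # Illustrative sketch only: infer an effective leak time constant from
    # the lag between outside and inside diurnal pressure swings.  The model
    # and every number here are assumptions made for the example.
    import numpy as np

    def simulate_inside_pressure(p_out, dt, k):
        """Integrate dP_in/dt = k * (P_out - P_in) with a simple Euler step."""
        p_in = np.empty_like(p_out)
        p_in[0] = p_out[0]
        for i in range(1, len(p_out)):
            p_in[i] = p_in[i - 1] + dt * k * (p_out[i - 1] - p_in[i - 1])
        return p_in

    def lag_hours(p_out, p_in, dt):
        """Estimate the lag (hours) that maximizes the cross-correlation."""
        a = p_out - p_out.mean()
        b = p_in - p_in.mean()
        corr = np.correlate(b, a, mode="full")
        lag_steps = corr.argmax() - (len(a) - 1)
        return lag_steps * dt

    dt = 0.1                                   # hours
    t = np.arange(0, 24 * 10, dt)              # ten days of data
    p_out = 1013.0 + 3.0 * np.sin(2 * np.pi * t / 24.0)   # diurnal swing, mbar
    for k in (0.05, 0.5):                      # "tight" vs "leaky", 1/hours
        p_in = simulate_inside_pressure(p_out, dt, k)
        print(f"k = {k:4.2f} /h -> inside lags outside by "
              f"{lag_hours(p_out, p_in, dt):.1f} h")

In this toy model a tighter containment (smaller k) shows a longer lag and a
more attenuated inside swing, which is the kind of continuous signature being
discussed.
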
 6              I have talked to Steve Floyd at NEI about it on
 7    several occasions and he keeps kind of saying, well, I will
 8    ask about it.  But then the first thing someone is going to
 9    ask, is that a safety grade barometer?  And I think there
10    are some people --
11              DR. KRESS:  Is this also a backfit?
12              MR. GILLESPIE:  Is this also a backfit?  So there
13    are some other institutional issues.  But I think that is
14    the right thing we should be getting to.  And I haven't
15    quit, I am very patient.
16              DR. WALLIS:  There is a great virtue of
17    simplicity.
18              MR. GILLESPIE:  Yeah.
19              DR. KRESS:  Can we help you?
20              MR. GILLESPIE:  Well, I think it is there.  I
21    think the guys are going to be -- that we will be talking
22    more about it, and just see if we can get there, because for
23    the oversight process, it doesn't have to be safety grade. 
24    It just has to be realistic.
25              CHAIRMAN SIEBER:  It would seem to me, though,
.                                                                54
 1    that there's a lot of things that are going on in
 2    containment.  You know, it is like RCS leakage, when you go
 3    to measure that, you have to hold the plant steady.  You
 4    can't be changing power, and you meter everything and then
 5    you come up with a calculation.
 6              On the other hand, you know, the biggest energy
 7    input into containment is the ambient temperature, the heat
 8    input, and that changes from time to time depending on plant
 9    power level and all that, and that would overwhelm some
10    subtle change that is caused by the sun shining or not
11    shining.
12              We used to be able to tell when you turned the
13    lights on in containment, that would change the pressure.
14              MR. GILLESPIE:  But, see, that could be a good
15    enough indicator.  I think Joe Murphy told me one time many,
16    many years ago that if you have anything less than about a
17    five inch diameter hole, you don't really have a problem. 
18    It was interesting that, in fact, you might get more clean
19    energy out early in the accident and may, in fact, be better
20    off.
21              So there are some concepts here.  But the idea --
22    you don't care what the pressure is, what you are trying to
23    do is see, is there always a difference?  Because it is that
24    difference in pressure that says you are not communicating
25    adversely with the outside environment.  I kind of like it,
.                                                                55
 1    but I haven't gotten anyone to totally buy in on it yet, but
 2    I haven't quit.
 3              CHAIRMAN SIEBER:  I have worked mostly with
 4    sub-atmospheric plants.  It was very easy to tell what was
 5    going on in containment.
 6              DR. LEITCH:  In a more general sense, though, it
 7    seems to me as though a lot of these performance indicators
 8    are really lagging indicators.  And I guess what I think we
 9    need to be more concerned about is a licensee's performance. 
10    And I think that is what you get at when you say, do they
11    love the containment, do they love a lot of other things as
12    well.
13              And I guess I am concerned that I don't see any
14    performance indicators on some of these cross-cutting
15    issues, which I think are more of an indication of -- more
16    of an anticipatory indication of licensee performance.  I am
17    thinking about, I think there are three cross-cutting
18    issues, human performance, safety culture, and corrective
19    action programs.  And I guess what I am wondering, is there
20    any intention of developing performance indicators on some
21    of these things?
22              I mean I think the licensee's corrective action
23    program is a very, very important part of this, and it
24    addresses some of what Dr. Kress was saying earlier, in
25    other words, how thorough has the licensee's corrective
.                                                                56
 1    action been, how prompt has it been?  And there are a lot of
 2    ways that that performance could be measured.  But is there
 3    any thought of doing that type of thing so that we try to
 4    get in an anticipatory mode rather than a lagging mode?
 5              MR. GILLESPIE:  Well, I have to question, what are
 6    we trying to anticipate?  Because one of the underlying
 7    premises of the program was our thresholds are set low
 8    enough that something is going to show up.  Crossing a white
 9    threshold is not the end of the world, yet it has become the
10    definition of significant.  It is not, in fact, safety
11    significant.
12              So our indication -- and I give you an example,
13    Kewaunee.  Kewaunee had problems with the sirens; they went
14    yellow.  The result of that was that the sirens had a
15    problem that they didn't put in the corrective action
16    program; it had not been dealt with in a timely manner.  A
17    reactor inspection took place and identified the corrective
18    action program.  And, in fact, that site ended up going outside
19    their system to get help to improve the corrective action
20    program.
21              Which was one of the anecdotal cases which, to us,
22    kind of said, well, you know, our system maybe kind of
23    works.  It wasn't that it was the corrective action program
24    for the sirens, it was the plant corrective action program,
25    that when we reacted to that indicator, was the broken
.                                                                57
 1    piece.
 2              DR. LEITCH:  Right.  But I guess what I am saying
 3    is, if you had direct indications -- performance indicators
 4    of the health of the corrective action program -- wouldn't
 5    that be more of an anticipatory thing than waiting to see
 6    that the sirens didn't work and tracing it back to the
 7    corrective action program?
 8              MR. GILLESPIE:  Well, at some point there is the
 9    threshold between the regulator and the guy running the
10    plant, and that is what our program is really groping for. 
11    And we want an indication that we are protecting the health
12    and safety of the public.  And we are not trying to
13    anticipate that.
14              DR. LEITCH:  You can only do that in the lagging
15    sense.
16              MR. GILLESPIE:  Well, lagging --
17              DR. BONACA:  But you are inspecting the corrective
18    action program.
19              MR. GILLESPIE:  We are inspecting it.
20              DR. KRESS:  Yeah, you take care of that end on the
21    inspection.
22              MR. GILLESPIE:  In fact, we have an annual
23    inspection that runs around 240 hours, which is actually a
24    lot of man-weeks, and 10 to 15 percent of every inspection
25    procedure focuses on the corrective action programs in the
.                                                                58
 1    area of the procedure.  So it is the most inspected area we
 2    have.
 3              DR. KRESS:  And those are subjective judgments,
 4    and I think the principles of these PIs here was to try to
 5    get that subjectivity out of it as much as you could.
 6              MR. HICKMAN:  Yeah, but it is not all out.
 7              DR. KRESS:  You can't get it all out.
 8              MR. HICKMAN:  I mean the biggest increment of our
 9    program is actually individually focused on corrective
10    action programs.  Now, do we have a measure for it?  No.  Do
11    we have an insight into it?  Yes.
12              DR. BONACA:  And you do monitor for that, for
13    example, the number of condition reports per year?
14              MR. HICKMAN:  Yeah.
15              DR. BONACA:  And you are looking at the fraction
16    of those which are in the top tier.
17              MR. HICKMAN:  And how many are repeats, which
18    systems they affect.
19              DR. BONACA:  And the corrective actions that the
20    utility takes for the corrective action program, actual
21    improvements and so on?
22              MR. HICKMAN:  Yes.
23              DR. BONACA:  So you do monitor that?
24              MR. HICKMAN:  Yeah.  But we don't have a measure
25    for it, that's true.
.                                                                59
 1              DR. LEITCH:  And a lot of those types that you
 2    just mentioned, repeats, age of corrective actions and so
 3    forth, would lend themselves to a performance indicator.
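
          The corrective action program measures just mentioned -- volume of
condition reports, the fraction in the top significance tier, repeats against
the same system, and the age of open items -- reduce to simple tallies once
the data exist.  The sketch below is illustrative only; the record layout and
numbers are invented and it is not a proposed NRC indicator.

    # Illustrative only: tallying candidate corrective-action-program measures
    # from a toy list of condition reports.  The field names and records are
    # hypothetical, not an NRC or industry data format.
    from datetime import date

    condition_reports = [
        {"id": "CR-01", "tier": 1, "system": "EDG",
         "opened": date(2000, 1, 10), "closed": date(2000, 2, 1)},
        {"id": "CR-02", "tier": 3, "system": "RCIC",
         "opened": date(2000, 3, 5), "closed": None},
        {"id": "CR-03", "tier": 2, "system": "EDG",
         "opened": date(2000, 6, 20), "closed": None},
        {"id": "CR-04", "tier": 3, "system": "EDG",
         "opened": date(2000, 9, 2), "closed": date(2000, 9, 30)},
    ]

    def cap_measures(reports, today):
        total = len(reports)
        top_tier_fraction = sum(r["tier"] == 1 for r in reports) / total
        # "Repeats" here just means more than one report against one system.
        by_system = {}
        for r in reports:
            by_system[r["system"]] = by_system.get(r["system"], 0) + 1
        repeats = {s: n for s, n in by_system.items() if n > 1}
        open_ages = [(today - r["opened"]).days
                     for r in reports if r["closed"] is None]
        return {
            "report_count": total,
            "top_tier_fraction": round(top_tier_fraction, 2),
            "repeat_systems": repeats,
            "oldest_open_days": max(open_ages) if open_ages else 0,
        }

    print(cap_measures(condition_reports, today=date(2000, 12, 6)))
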
 4              MR. GILLESPIE:  And then you have to have all the
 5    utilities define their aging process and their priority
 6    system the same way.  And I am just giving you a sense that
 7    it is probably the biggest backfit we could ever try.  I
 8    mean that is getting right to the heart of how people manage
 9    their facility.  And to try to regulate at that level, I am
10    not sure we would be successful.
11              Right now we say you have to have a corrective
12    action program, you have to follow Appendix B.  There is a
13    lot of freedom utility-to-utility within those regulations.
14              I am not saying it wouldn't be a neat idea to do,
15    but as a practical sense, and within our mission, I am not
16    sure really how to do it.
17              DR. BONACA:  More and more it is becoming, when
18    you look for license renewal, so many commitments are going back
19    to the corrective action program.  And so it becomes a more
20    and more important tool for the regulator, too, I mean not
21    only for the licensee.  I understand it is a management tool
22    for the licensee, but so many of the commitments to the
23    regulator are made through it, it is becoming really a
24    shared system it seems to me.
25              MR. GILLESPIE:  Yeah, I agree, and it dominates
.                                                                60
 1    the inspection piece, we just don't have a PI for it.
 2              DR. BONACA:  No, but the question I had is, since
 3    the inspection also ends up with the color grades and so on
 4    and so forth, that also should have some visibility, right,
 5    in the process?
 6              MR. GILLESPIE:  Oh, it does.  If you look at our
 7    web page and look right underneath the little graphic that
 8    has got the cornerstones, there is the PIs, and then the
 9    next line is the inspection results.  And it has got the
10    latest color, the latest applicable color under each one.
11              DR. APOSTOLAKIS:  Isn't the concept of a leading
12    indicator a relative one?
13              MR. GILLESPIE:  It is relative to what you are
14    trying to indicate, George.  And that is what I am really
15    saying is what we are trying to indicate.
16              DR. APOSTOLAKIS:  All of these performance
17    indicators are really leading, aren't they?
18              MR. GILLESPIE:  We think these are leading in the
19    sense of public protection.
20              DR. APOSTOLAKIS:  Right.  That is my
21    understanding.
22              MR. GILLESPIE:  But they aren't leading in the
23    sense of economic protection for getting an indicator that
24    turns yellow or an indicator that turns white, or an
25    inspection finding that is white.
.                                                                61
 1              DR. APOSTOLAKIS:  That's right.
 2              MR. GILLESPIE:  And what ramifications that may
 3    have on the utility, that is their responsibility.  So it is
 4    -- that is what I am saying, it is exactly -- it is relative
 5    to what you think you are protecting from.
 6              DR. APOSTOLAKIS:  Now, if you wanted to have the
 7    indicators that Mr. Leitch talked about, then it seems to me
 8    you have to rely on the significance determination process a
 9    lot, because you would be talking about programs.  So if you
10    find something, then you go through the SDP to determine the
11    significance of that, isn't that true?
12              MR. GILLESPIE:  That's true.
13              DR. APOSTOLAKIS:  Okay.
14              MR. GILLESPIE:  And that is one of the difficult
15    things on corrective action programs.  If you take any one
16    thing, any one thing by itself, there is no synergism with
17    other things.
18              Part of the fix to this, by the way, may be that A4
19    was implemented; A4 became effective as part of the Maintenance
20    Rule, which is fundamentally asking the question, is my
21    plant safe today in its current configuration?  That is
22    going to help, because if your corrective action program has
23    a number of out of spec, inoperable, non-functional systems
24    in it and they are backing up, you are now going to be doing
25    a daily analysis in a risk sense that sets your threshold
.                                                                62
 1    for operation that day, which will include all those
 2    backups, because it is the actual configuration of the
 3    facility on any given day.
 4              So I am coming at it a little different way.  That is
 5    to say, if you have a big backup of safety significant things
 6    in your corrective action program, where things are
 7    inoperable, you are going to start bumping up against A4 and
 8    the recommended limits within A4.  And that is kind of one
 9    of our backstops to the whole thing.
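
          A toy calculation in the spirit of the daily configuration check
described above (an assumption-laden sketch: the cutsets, basic-event
probabilities, initiator frequency, and action threshold are invented and are
not the Maintenance Rule methodology or any plant's model):

    # Illustrative sketch only: a configuration risk check in the spirit of
    # 10 CFR 50.65(a)(4).  All values are invented for the example and carry
    # no regulatory meaning.
    BASELINE_UNAVAILABILITY = {"EDG-A": 0.01, "EDG-B": 0.01,
                               "HPCI": 0.02, "RCIC": 0.05}
    # Minimal cutsets: every component in a set must fail for the sequence.
    CUTSETS = [{"EDG-A", "EDG-B"}, {"HPCI", "RCIC"}]
    INITIATOR_FREQUENCY = 1e-2     # per year, hypothetical
    DAILY_THRESHOLD = 1e-4         # per year, hypothetical action level

    def _product(values):
        result = 1.0
        for v in values:
            result *= v
        return result

    def configuration_cdf(out_of_service):
        """Rare-event estimate of CDF for today's plant configuration."""
        q = dict(BASELINE_UNAVAILABILITY)
        for component in out_of_service:
            q[component] = 1.0     # treat out-of-service equipment as failed
        return sum(INITIATOR_FREQUENCY * _product(q[c] for c in cutset)
                   for cutset in CUTSETS)

    for today in ([], ["RCIC"], ["EDG-A", "HPCI"]):
        cdf = configuration_cdf(today)
        flag = "exceeds" if cdf > DAILY_THRESHOLD else "within"
        print(f"out of service {today or 'none'}: "
              f"CDF ~ {cdf:.1e}/yr ({flag} threshold)")

The point of the sketch is only that equipment sitting inoperable in the
backlog shows up directly in that day's configuration-specific estimate.
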
10              DR. LEITCH:  Are we going to hear a little more
11    about what you did at Indian Point 2?  Is that part of your
12    presentation today or later, or could we talk about that
13    now?
14              CHAIRMAN SIEBER:  Well, it seemed to me that --
15              MR. GILLESPIE:  No, I think we need the right crew
16    of people, because that one went all the way, George, this
17    one went all the way to Phase 3, went through the process of
18    the interchange between us and the licensee.
19              DR. LEITCH:  I guess my fundamental question was,
20    was that work done after the steam generator tube rupture or
21    was it predicting that they were in trouble prior to the
22    steam generator tube rupture?
23              MR. GILLESPIE:  After.  After.
24              DR. LEITCH:  It was reactive.
25              MR. GILLESPIE:  Yeah.
.                                                                63
 1              CHAIRMAN SIEBER:  I think what we ought to do on
 2    the discussion of Indian Point would be to do that after the
 3    break, you know, because we asked them to get a few things
 4    together for us.
 5              MR. GILLESPIE:  Yeah.  This is not a cop-out, but
 6    Doug has been involved in most of the panels with the
 7    process.  On Indian Point, it was Steve Long who really
 8    chaired the panel and was kind of full-time on it and acted
 9    as kind of the headquarters PRA/SRA expert on it.  And I
10    have been to three or four presentations, and whatever we
11    said would be hearsay, and we really should get the right
12    people to kind of go over that I think.
13              CHAIRMAN SIEBER:  All right.
14              MR. JOHNSON:  If I can just add a point.  There is
15    nothing in the -- to come at the question a different
16    way, there is nothing in the process that we set up that
17    would try to be leading of events.  That is to say that we
18    expect that there will continue to be events, and that you
19    will have PIs that react to those events.
20              The process, though, was set up with PIs and the
21    series of thresholds to enable us to step back and look at
22    what the PIs and the inspection findings were telling us in
23    a timely manner to enable us to take action to really
24    address subsequent performance declines that would result in
25    that plant being unacceptable.
.                                                                64
 1              So, and we were discussing that a little bit
 2    earlier, but I just wanted to make sure that we were clear. 
 3    There is nothing -- you know, we don't want to leave you
 4    with the impression that we tried to set up a series of PIs
 5    that would forecast, if you will, upcoming events or enable
 6    us to make sure that those events didn't happen, because we
 7    know that events will continue to occur, but what we are
 8    really focused in on is the performance of the plants and
 9    the actions, regulatory actions that we take, and licensee
10    actions that are taken to prevent that plant's performance
11    from becoming unacceptable.
12              DR. LEITCH:  I guess I thought I heard earlier
13    that you were quite impressed with these performance
14    indicators in light of the Indian Point 2 event.  And I
15    guess that's good.  It was retrospective.  I would have been
16    more impressed if somehow the indicators gave you some kind
17    of a tip-off that there may be some kind of a problem that
18    will occur at Indian Point 2, and you are saying that that
19    did not occur, right?
20              MR. GILLESPIE:  That did not occur.
21              DR. LEITCH:  So that the indicators, in
22    retrospect, were useful.
23              MR. GILLESPIE:  The indicators did not prevent the
24    leakage event from occurring, but it did give us reasonable
25    assurance that if you did have an event like that, your
.                                                                65
 1    safety systems and your crews could react appropriately and
 2    protect the public.  In that case, I mean the whole event
 3    was over in a matter of hours, if not minutes.  All the
 4    right appropriate actions were taken, all the safety systems
 5    worked.  So we are not trying to predict that leakage event,
 6    but we are trying to say that the plant will not have a
 7    public protection problem when those events occur.
 8              There is a leakage event, what, someone told me
 9    every seven years, and it really doesn't matter who does
10    anything about it.  It just seems that about seven years
11    there is one of these things.
12              DR. BONACA:  So, do you mean that that kind of
13    performance that you indicate then has nothing to do with
14    the judgment that you have on the licensee?
15              MR. GILLESPIE:  Then you go in and you look and
16    you say now, was there a program failure?  But now you are
17    after the fact.  Was there a program failure that
18    contributed to this event, that if we don't correct it,
19    would make this event happen again?  And in the case of
20    Indian Point, it was.
21              DR. BONACA:  We have discussed this before, I mean
22    as far as some of the judgments we expressed, that maybe the
23    indicators were not as useful as, for example, the
24    inspection program.  That, because the inspection program,
25    if you went really into the paths of the corrective action
.                                                                66
 1    program, it will give you insight into how the organization
 2    thinks and operates, and it may be a predictor of possible
 3    troubles in the future, while these indicators really are
 4    simply a statement of certain events, of facts, and then you
 5    have to analyze them to determine whether or not there is
 6    anything to do with the reports.
 7              MR. GILLESPIE:  Yeah, but it is a total program.  The
 8    indicators, you can't have indicators of inspection.
 9              DR. BONACA:  As you know, I was more critical at
10    the time, and I am beginning to see some of the benefits of
11    the program as-is, but, still, there is an issue of the fact
12    that these are not leading indicators of anything in
13    particular.
14              One thing that, by the way, I wanted to ask you
15    was, before, Dr. Kress, I think, raised the issue that this
16    doesn't really recognize performance.  For example, I mean
17    you have got an example on barrier integrity, you may have a
18    border where there are examples of that, you have one pin
19    failure, and many licensees have taken the conservative
20    action of shadowing the fuel rod even early in the cycle and
21    losing, literally, four to five months of operation because
22    of that.
23              And yet if you do not shadow that pin, most likely
24    you will not even exceed these ranges of activity.  Okay. 
25    You will have probably, however, contamination that
.                                                                67
 1    licensees don't like to have.  But really, I think it is a
 2    very important indication of licensee philosophy and safety
 3    commitment, a decision to shadow or not shadow.
 4              And is there anything in the inspection program
 5    that allows for this kind of recognition to be given to
 6    certain decisions?  Or is it totally, you know, blind to
 7    those kind of decisions?
 8              MR. GILLESPIE:  No.  In fact, one of the recent
 9    changes, the most recent change to what goes into an
10    inspection report, Manual Chapter 610, now actually allows
11    or causes the inspectors to put those types of things into
12    the reports.  It is not subjective, it is not going back to
13    the old way we used to write reports, but there is now a
14    lower level of thresholds for documentation of those, so
15    that we don't lose those insights.
16              And, in fact, that is one of the indicators that
17    Don said has kind of got a problem.
18              DR. BONACA:  Because the industry struggles with
19    those issues.  And then if they see that there is no payback
20    whatsoever from a regulatory standpoint, and there is no
21    recognition whatsoever, why would you sacrifice four months
22    of operation or more just because you want to be as clean as
23    you can, you know, to be as responsible as you can to your
24    people.  I mean it is good motivation, but I think that
25    there should be some regulatory incentive; that is always
.                                                                68
 1    important.
 2              MR. GILLESPIE:  That causes them to go that way.
 3              DR. BONACA:  Yes.  Do the right thing.
 4              MR. GILLESPIE:  Yeah.  I agree.  But our reports
 5    now do recognize that a lower level, and that was a change a
 6    month ago, I think, when we put out 0610-Star.
 7              MR. HICKMAN:  Let me just go very quickly.  The
 8    concerns about the security, the physical protection
 9    cornerstone -- there are three indicators there.  One is
10    the protected area security equipment performance index. 
11    We use the compensatory hours, the guard hours, to
12    compensate for equipment that is out of service as the
13    measure, so it is kind of a surrogate.
14              There has been some desire on the part of some to
15    actually count the number of hours that the equipment is
16    unavailable rather than the number of hours that a guard is
17    stationed.  That was done largely at the instigation of the
18    industry, because that information is readily available. 
19    They know how much time they paid the guard force for
20    overtime to do the compensatory hours.
21              We are looking at that one and what we can do to
22    make that perhaps a better indicator.  There are a number of
23    problems with it, but the funny thing about it, it is the
24    same thing I noticed when I worked in the old AEOD PI
25    program, was that despite all the flaws that some of these
.                                                                69
 1    indicators appear to have, the results came out pretty good,
 2    about as we expected.  And, in fact, we have had some good
 3    results.  We have had about two or three licensees at the
 4    beginning of initial implementation that were yellow in that
 5    indicator, and really had no idea how their equipment was
 6    performing until they saw that.  And they made quick changes
 7    to fix it.  So it has had a good impact on licensees who did
 8    not have adequate programs for security of the perimeter.
 9              But there are still complaints from industry on
10    that.  We are still looking at that one to see if there is
11    maybe a better way to do it.  Although, as I say, we had
12    identified the plants that we thought had poor programs and
13    needed to address them.
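
          For reference, the surrogate just described reduces to simple
bookkeeping of posted compensatory hours; the log entries, the normalization
by clock hours in the quarter, and the resulting index below are hypothetical,
since the exact definition is not spelled out here.

    # Illustrative only: the surrogate measure described here is compensatory
    # guard hours posted for out-of-service security equipment.  The log,
    # normalization, and any threshold are hypothetical.
    comp_log_hours = {"camera 12": 96.0, "E-field zone 3": 40.0,
                      "gate monitor": 8.0}

    quarter_hours = 24 * 91                 # hours in the reporting quarter
    index = sum(comp_log_hours.values()) / quarter_hours
    print(f"compensatory-hours index for the quarter: {index:.3f}")
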
14              CHAIRMAN SIEBER:  It seems to me, though, that
15    when you are told your licensee applies a compensatory
16    measure for a failed piece of security equipment, the
17    assumption is that the degree of physical protection has not
18    declined because the comp measure was applied.  And so you
19    aren't really measuring, for example, the effectiveness of
20    physical protection, you are measuring the availability of
21    the equipment.
22              MR. GILLESPIE:  Yeah, and that is one of the
23    reasons we were able to go with the PI, because it didn't
24    measure reduced security.  We actually got comments from
25    some other government agencies involved in security of
.                                                                70
 1    infrastructure that said, well, you don't want to publish
 2    something that someone says this plant has poor security,
 3    because that kind of sets them up.
 4              CHAIRMAN SIEBER:  Right.  Here is the unlocked
 5    door.
 6              MR. GILLESPIE:  The attribute you just described
 7    made this okay as a public PI because it really was getting
 8    not at the security status of the facility, but trying to
 9    get at the operability of the equipment that was expected to
10    be there, that they were actually licensed in the security
11    plan as the norm.  So compensatory measures are becoming
12    norm.  That really wasn't consistent with the way we thought
13    the place was operating.
14              So it is an interesting quandary.  It made it
15    useful as a public PI because of that attribute.
16              CHAIRMAN SIEBER:  Right.  Now, let me ask another
17    question.  I have been to some sites where, on a regular
18    basis, the fog rolls in in the morning, and so the cameras
19    are not as effective as they might be, and sometimes the
20    security supervisor will post out because of the degree of
21    fog, so he can protect the isolation zone.  Is that counted
22    as part of the --
23              MR. GILLESPIE:  No.  Environmental factors are not
24    counted against them.
25              CHAIRMAN SIEBER:  Okay.
.                                                                71
 1              MR. GILLESPIE:  It is really trying to focus on
 2    failures.
 3              DR. LEITCH:  Just a couple of real specific
 4    questions about performance indicators.  In our handout, we
 5    got a sheet that looked like this, with various plants, and
 6    just a number of keys; the letters N, I and U are used on
 7    this.  Do you know what they stand for?  I just couldn't
 8    find that.
 9              MR. HICKMAN:  Some of them are not applicable. 
10    For example, the reactor core isolation cooling system --
11    there are plants that have isolation condensers and don't
12    have RCIC, the BWR/2 plants.  So an N should be not applicable. 
13    I, incomplete -- are there some on there?  That would indicate
14    that the licensee hasn't reported all the data.
15              DR. LEITCH:  The U and the Y.
16              MR. HICKMAN:  The U.
17              DR. LEITCH:  And the Y, yeah.
18              MR. HICKMAN:  Yes.  There are some thresholds that
19    are under development.  For example, and maybe we are a bit
20    tardy at this, you might see that under Oconee.  Oconee has
21    a very different emergency power system.  We haven't yet
22    established the thresholds for Oconee with their hydro
23    units.
24              MR. JOHNSON:  Incidentally, that page is a
25    printout, really, of what is on the external web. 
.                                                                72
 1    It is a summary rolled up of all of the performance
 2    indicators that are available.  And so it is just a picture,
 3    if you will, for any stakeholder to get on and see where the
 4    performance indicators stand with respect to the plants. 
 5    And we are developing an additional page that does the same
 6    thing for inspection findings, so you can take this same
 7    kind of a look.
 8              In addition to that, you can look for an
 9    individual plant to see what that plant's performance
10    indicators and inspection findings are.
11              DR. LEITCH:  Are these columns labeled on the web?
12              MR. JOHNSON:  Yes.  In fact, the reason why I
13    mentioned that was to make mention of the fact that the
14    labels are there, and the explanations are there.
15              DR. LEITCH:  Okay.  Good.  Thank you.  And just
16    one other specific question.  Well, I am not really sure, we
17    had some information in our handout concerning simulator
18    operational evaluation.  Is there a performance indicator
19    under development there?
20              MR. HICKMAN:  There is not a performance indicator
21    under development.  We do use simulator evaluations in some
22    areas.  We use it under an emergency preparedness for
23    drills.  We allow them to use simulator drills.  That may be
24    what it is referring to.  I am not sure what you are looking
25    at.
.                                                                73
 1              DR. LEITCH:  I just noted that if I am
 2    interpreting this chart correctly, that if eight out of
 3    eight crews failed, that is a yellow, and I guess -- I don't
 4    know what would drive it to a red.
 5              MR. COE:  That is the significance determination
 6    process that has been proposed.  We have discussed it with
 7    industry, and we expect to add it to the Manual Chapter in
 8    the near future.  But that is requalification, and it is
 9    there to monitor the success and/or non-success of the
10    requalification program in testing their crews.
11              DR. LEITCH:  Am I interpreting it correctly that
12    if eight out of eight fail, you consider that a yellow?  It
13    says Y.
14              MR. COE:  Yes, Y is yellow, that's correct.
15              DR. LEITCH:  I would think that would be pretty
16    red if eight out of eight failed.  Pretty bright red.
17              MR. COE:  Right.  I think you have -- I can't go
18    through all of the basis for each of those.  The way that
19    that chart was set up, the old SDP was run, we would have to
20    have somebody from the operator licensing group that worked
21    on that come in here and explain that in more detail to you.
22              MR. BOYCE:  At this point, we can -- I was
23    actually going to present an overview of where we were and
24    set the stage.
25              I'm Tom Boyce, I'm the Acting Section Chief in the
.                                                                74
 1    Inspection Program Branch of NRR.  We went slightly out of
 2    order this morning, because we need to get Don Hickman to
 3    the NEI meeting that's ongoing right now.
 4              I was going to provide an overview, and I can do
 5    that right now, or if you wanted to take a break, we could
 6    do that, too.  And that would essentially set the stage for
 7    Doug Coe to talk about the SDP after that.
 8              CHAIRMAN SIEBER:  Yes, why don't we take a
 9    15-minute break now and come back at 20 after 10:00.
10              [Recess.]
11              CHAIRMAN SIEBER:  Let us resume the meeting.
12              MR. BOYCE:  As I said, I'm Tom Boyce, and I'm the
13    Acting Inspection Chief in the Inspection Program Branch of
14    NRR.  And today we wanted to give you an overview of several
15    things, and then lead into a discussion of PIs and SDP.
16              I want to talk about just initial implementation
17    status, some highlights of program feedback and how we're
18    going about obtaining some of that feedback.  The two issues
19    that we called selected issues were PIs and SDP.
20              I will then give you a glimpse of where we're
21    going in the future.
22              We've completed eight months of initial
23    implementation.  The program formally kicked off on April
24    2nd, 2000, and we've been exercising every aspect of the
25    oversight process since then.
.                                                                75
 1              On the inspection side, I think we've worked
 2    through almost all of the baseline procedures.  We've had an
 3    opportunity to work through a lot of the supplemental
 4    procedures.
 5              And we've even had a chance to perform some of our
 6    special and infrequent inspections.  We've had generally
 7    positive feedback.
 8              I'm not sure that everyone shares that view,
 9    because you only hear about the highly negative things, but
10    in general, when we've done some of the surveys, the answers
11    have come back in a positive tone; the majority of the
12    answers have come back that way.
13              That doesn't mean it's uniform across the board,
14    and it doesn't mean we don't have things we need to work on,
15    but I wanted to say, in context, it's been fairly well
16    received by both internal and external stakeholders.
17              And I'll go more into how we're obtaining some of
18    that feedback.  Finally, we just completed our mid-cycle
19    assessments that are performed in November.  As you may
20    remember, we have two assessments of plant performance every
21    year where we issue a formal letter to licensees, and we
22    just completed those, and letters have just gone out, and
23    we're in the process of posting them on our external
24    website.
25              Those are our selected issues.  We've really had a
.                                                                76
 1    lot of feedback, and we've gone out of our way, I think, to
 2    try to get some of that feedback.  I was just taking a look
 3    through the package that you had been handed, and this kind
 4    of illustrates the methods we're using to collect the
 5    feedback.
 6              There's a form in here that shows what we're
 7    asking our internal stakeholders to submit comments to us
 8    on.  We've established a formal feedback process.  We don't
 9    want anecdotal comments given to us -- well, we do, but we
10    would prefer them to be formally documented, because we have
11    a group of people that are dedicated to analyzing the
12    comments, the basis for the comments, taking a look at what
13    our current program does, checking it against data, and
14    finally making changes.
15              And that's illustrated in this simplified drawing
16    right here in your handout.  This is what we've set up to
17    try and handle feedback.  And it looks complicated, but the
18    intent is to make it a very controlled process, as was
19    alluded to earlier, where you are trying to have a stable
20    process and do the right thing, based on good data, and not
21    anecdotal information.
22              There's a copy of the feedback form that we asked
23    stakeholders to fill out, and it's not just, well, we
24    think there's a problem with this procedure; we saw a
25    problem on one plant.  It's, we saw this problem on one
.                                                                77
 1    plant and here's some of the background; here are the
 2    specific aspects of the procedure that we think ought to be
 3    changed, and a recommendation on where to go with it.
 4              We've got right now --
 5              DR. SHACK:  Excuse me, but how much of that kind
 6    of response have you gotten?
 7              MR. BOYCE:  Fairly good.  I don't have hard
 8    numbers, but I want to say that we've had 100.
 9              DR. SHACK:  It's on that order?
10              MR. BOYCE:  It's on that order of 100, and the way
11    the Regions are -- the Regions are the primary people giving
12    us that input.  And the Regions are establishing their own
13    internal processes to get us good feedback.
14              Typically, a Resident Inspector who is on the
15    front line, will identify an issue; he'll work it through
16    his Regional branch chief; Regional management will discuss
17    the issue, flesh it out, apply the right context to the
18    issue, and decide whether or not it needs to go forward for
19    a program change, and then it gets forwarded to us at
20    headquarters for us to look at.
21              DR. SHACK:  Now, does an Inspector use the same
22    kind of form if he sees a problem with the process?
23              MR. BOYCE:  Right, in general, he fills that out
24    and sends that in to the Regions, although some of the
25    Regions have adopted a different strategy, but the form we
.                                                                78
 1    get ultimately is this form.  But many of the Regions
 2    have started with the Resident Inspector on this form.
 3              We have a monthly meeting with NEI where we go
 4    through a variety of issues.  There is in your handout, a
 5    sample agenda from a recent meeting, I think, as well as
 6    some of the meeting minutes.  And that can give you a sense
 7    as to the sorts of things we discuss there.
 8              We found that very, very useful in getting a sense
 9    as to where industry is, particularly on things like
10    performance indicators.  And a lot of the things Don was
11    saying, we would hear NEI's position on, we'd be able to
12    move forward with a good resolution that way.
13              We've got something called an Initial
14    Implementation Evaluation Panel.  The Commission directed us
15    to have an independent panel take a look at our
16    implementation during this first year.  It's very similar to
17    what we did in the pilot program where we had a pilot
18    program evaluation panel.
19              And that report was published.  I think this panel
20    reviewed that, but I don't know for sure.
21              There are four meetings.  It's chaired by a
22    Regional Division Manager.  It's got representatives from
23    the two states, industry representatives, Regional
24    representatives, and several Headquarters representatives.
25              We've also had a variety of public forums.  This
.                                                                79
 1    says mid-cycle, but actually we conducted what are called
 2    the end-of-cycle reviews on our nine pilot plants back in
 3    April, and we actually issued an end-of-cycle letter as if
 4    they were under the reactor oversight process.
 5              And then we held a public forum to say this is our
 6    evaluation of the plant.  In addition, we have gone out to
 7    every site and held a meeting in the vicinity of the site
 8    where we gave the public an overview of the reactor
 9    oversight process, and the opportunity to comment and
10    provide us with input.
11              We're also in the process of conducting Regional
12    site visits where members of our Branch go out and actually
13    visit the Resident Inspectors at the sites, talk to the
14    licensees, talk to members of the public, and try to obtain
15    firsthand feedback.
16              We're going to be issuing a Federal Register
17    Notice very shortly, where we're specifically soliciting
18    feedback on a variety of issues associated with the
19    oversight program.
20              And we're going to conduct lessons learned
21    workshops, one for the public, external, and also one
22    internal for our internal stakeholders.  So I think the
23    message is, we're really trying to beat the bushes and get
24    as much input as we can to improve the process during this
25    first year.
.                                                                80
 1              And here's a look at where we're going in the
 2    future.  You're going to hear a lot more, I hope, on SDP
 3    improvements and enhancements.
 4              You've heard about the Scram PI Pilot Program from
 5    Don Hickman, and also unplanned power changes and
 6    unavailability PIs.  We're trying something a little bit new
 7    on this bullet right here, where we're talking about
 8    industry trend assessment.
 9              The thinking here is that we've done a lot to
10    modify our oversight program recently.  And we're trying to
11    say, okay, let's take a step back.  Are we really making a
12    difference?
13              Is industry performance improving, holding
14    constant, or being degraded as a result of our new oversight
15    process?  You know, there is an underlying assumption that
16    industry is doing a good job and that their programs are
17    effective at maintaining safety at the plants.
18              Well, it's not enough that we inspect each plant
19    and that we've got performance indicators on each plant; we
20    want to take a holistic view and come up with an answer that
21    we're comfortable with that says we're doing well.
22              And there are several things that we're looking
23    at.  This is the early stages.
24              One is the AEOD PIs, because they have been around
25    for 15 years, and we're going to continue that program under
.                                                                81
 1    NRR.  And we're going to watch.  All the trend lines have
 2    come down -- well, most of the trend lines have come down on
 3    those PIs, and we're going to continue to monitor those and
 4    we're going to look for any adverse trends.
 5              We're taking a look at the accident sequence
 6    precursor program.  We're trying to develop a means to take
 7    our current set of PIs that we've developed for the reactor
 8    oversight process, and try and get a way to aggregate those
 9    PIs and look for trends.
10              Right now, they're individual, based on, say,
11    plant power changes or unavailability PIs.  Does it make
12    sense to try and pull those together and develop a whole
13    other set of PIs?
14              Anytime you try and do that aggregation, you run
15    into problems.  An example might be diesel generators.  Not
16    every plant has the same number of diesel generators, and so
17    when you try and aggregate them, your data might be skewed
18    in a direction we don't fully understand.
19              So this is something we're going to be looking at
20    the data on and trying to develop.  There are other ways that
21    we can count beans and give us an indication.
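
          As a rough sketch of the aggregation concern described above,
consider two hypothetical plants with different numbers of emergency
diesel generators: simply summing unavailability hours weights the
plants unevenly, while normalizing by train-hours keeps the comparison
even.  The plant names, train counts, and hours below are invented for
illustration only.

```python
# Hypothetical illustration of the aggregation issue described above.
# Plant names, train counts, and unavailability hours are made up.
plants = {
    "Plant A": {"dg_trains": 2, "unavailable_hours": 60.0},
    "Plant B": {"dg_trains": 4, "unavailable_hours": 90.0},
}

HOURS_PER_QUARTER = 2190.0  # roughly three months of calendar hours

for name, p in plants.items():
    # A raw sum of hours skews toward plants with more trains;
    # dividing by train-hours gives a per-train unavailability fraction.
    train_hours = p["dg_trains"] * HOURS_PER_QUARTER
    fraction = p["unavailable_hours"] / train_hours
    print(f"{name}: raw hours = {p['unavailable_hours']}, "
          f"per-train unavailability = {fraction:.4f}")
```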
22              You were alluding earlier to that matrix which
23    shows performance indicators on a sheet, and that gives you
24    a nice overview of how the industry is doing.  Mike
25    Johnson says we're going to develop one for inspection
.                                                                82
 1    findings as well, so we'll have an analogous matrix for
 2    those.
 3              When you have performance indicators and
 4    inspection findings, you can combine those and show where
 5    any plant is on the action matrix.  So we're going to show
 6    the action matrix and show what column of the action matrix
 7    each plant is in.
 8              And that will give you a view of the whole process
 9    and the whole industry on our website.  That's where we
10    think we're going with that.
11              Another thing we're contemplating doing and we're
12    going to be discussing with NEI this afternoon is, you can,
13    from our website, count the number of plants that are in the
14    regulatory response column, the licensee response column,
15    the degraded cornerstone column, and you can come up with
16    bean counts.
17              We have 80 percent of the plants in the licensee
18    response column; we have 15 percent of the plants in the
19    regulatory response column, et cetera, and you can come up
20    with a mosaic of where all the plants are in the industry.
21              If we have that today, we can track that sort of
22    thing down the road, and if we start to see a lot of plants
23    migrating from the licensee response column over to the
24    degraded cornerstone column, we'll be able to say
25    definitively that we have a problem.
.                                                                83
 1              So those are the sorts of things we're looking at
 2    to try and develop a method for industry trends.
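
          A minimal sketch of the "bean count" idea just described:
tally the number of plants in each action matrix column from a
quarterly snapshot and watch how the distribution shifts over time.
The column names follow the discussion above; the plant assignments
below are hypothetical.

```python
from collections import Counter

# Hypothetical snapshot: plant -> action matrix column (invented data).
snapshot_q1 = {
    "Plant A": "licensee response",
    "Plant B": "licensee response",
    "Plant C": "regulatory response",
    "Plant D": "degraded cornerstone",
}

counts = Counter(snapshot_q1.values())
total = len(snapshot_q1)
for column, n in counts.items():
    print(f"{column}: {n} plants ({100.0 * n / total:.0f}%)")

# Comparing successive snapshots would show migration between columns,
# e.g., plants moving from licensee response toward degraded cornerstone.
```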
 3              DR. SEALE:  You earlier made note of the fact that
 4    the industry has its own programs of concern with safety,
 5    presumably they're in the excellence rather than the minimum
 6    compliance business.
 7              And so that makes the things they do somewhat
 8    different from the things you do.  At the same time,
 9    undoubtedly, they have their own version of the industry
10    trends assessment process.
11              Is there ever an opportunity for you, in the
12    integral form, not the plant-specific form, because I know
13    they always get nervous when you talk plant-specific, but in
14    the integral form, is there a way for you to compare your
15    industry assessments with their industry assessments?
16              MR. BOYCE:  Are you speaking of the WANO
17    indicators?
18              DR. SEALE:  More than that.  I mean, that's the
19    high level up here.  I'm talking about now that we get down
20    to the nitty-gritty assessment kind of detail.
21              MR. BOYCE:  I think if you -- the one that's
22    really available is the INPO industrywide indicators where
23    they take all of theirs and aggregate them.  I think that
24    once we get our process and we do something similar, there's
25    going to be no way to avoid analyzing any similarities or
.                                                                84
 1    differences.
 2              DR. SEALE:  Yes.
 3              MR. BOYCE:  It's not that we plan on doing it, but
 4    there is no way we're going to be able to avoid it.
 5              DR. SEALE:  Somebody's going to do it for you, and
 6    if you don't have some coherence and a story as to why it
 7    isn't, then you're going to have -- both of you will have a
 8    problem.
 9              MR. GILLESPIE:  I think that when we step back --
10    we're still trying to figure out how to do ours.  So we're
11    still up at bat here.
12              DR. SEALE:  I understand that.
13              MR. GILLESPIE:  But, yes, anytime there's a
14    difference, we're going to have to do an analysis of some
15    kind of the difference.
16              And also you've said something that we kind of
17    have to watch where we go.
18              DR. SEALE:  Yes.
19              MR. GILLESPIE:  We are not pushing the same as
20    INPO is when they set industrywide goals.  They have
21    excellence goals and things like that on exposure,
22    unavailability, et cetera.
23              We have regulatory goals, and how we articulate
24    our trends is also, I think, going to be very important, to
25    distinguish between those two.
.                                                                85
 1              MR. JOHNSON:  In fact, if I could add to what
 2    Frank's saying, you know, we have the NRC performance goals,
 3    and this goes directly to one of our performance goals,
 4    which is that we're going to maintain safety; that is, that
 5    we're not going to have any significant adverse trends in
 6    industry performance.
 7              And we will end up reporting on this on an annual
 8    basis as a part of the Agency's -- we talk about it in the
 9    performance plan, the green book.  We talk about it in the
10    report that we owe to the Congress and the President at the
11    end of the year that talks about our accomplishment of our
12    performance goals.
13              And so this is one that has high visibility, and
14    we're trying to measure and make sure that we maintain
15    safety.
16              DR. SEALE:  It's going to be interesting when
17    certain elements of the public become aware of the fact that
18    excellence is not a part of the objective of the NRC.
19              MR. BOYCE:  This is where you say I wonder?
20              [Laughter.]
21              DR. SEALE:  I don't wonder; I think there's a
22    shock out there someplace.
23              MR. GILLESPIE:  This gets back to George's earlier
24    comments when he says we're trying to be predictive of what? 
25    And we did not predict the leak at Indian Point, but given
.                                                                86
 1    the leak, and the process we went through, we did find a
 2    program failure which might have been predictive of a
 3    multiple tube failure, which, in fact, would be a much more
 4    severe public hazard, which is why it was a red finding.
 5              So, in that case, we feel, from the oversight
 6    point, to some degree that it was a success; the system
 7    worked.
 8              Yet people would say, well, the system failed
 9    because you didn't predict the leak.  Well, our threshold
10    really wasn't set at that level, and right now it still
11    isn't.
12              So, that's the difference between excellence and
13    public protection, and we're trying to keep that in mind,
14    yes.
15              CHAIRMAN SIEBER:  On the other hand, I think it
16    would be near impossible for you to predict a leak, and so
17    it may not be worth the effort.
18              MR. GILLESPIE:  Well, there were some lessons
19    learned.  Now let me backtrack on what I said, because there
20    were some lessons about how we conducted the inspections
21    from two and three years ago when they did their steam
22    generator inspections and how we oversaw it.  Was there some
23    correction available?
24              It may not fix the problem.  I don't know that it
25    would have predicted the leak, but there were some things
.                                                                87
 1    where we could tighten up on how we do some reviews in the
 2    field, and the experience level of the people doing the
 3    reviews.
 4              And so that was a positive thing that we do need
 5    to fix.  And that may or may not fix the problem, but it
 6    will help, I think.
 7              MR. BOYCE:  Risk-based performance indicators:  We
 8    have alluded to that.  We just got done reviewing Research's
 9    Phase I risk-based PI development report and providing them
10    comments.  I think Research is on your agenda for early next
11    year to present this.  I don't know exactly what month, but
12    we think that effort has got an awful lot of potential.
13              There are some serious pitfalls that may or may
14    not be show-stoppers, but definitely need to be addressed. 
15    For example, the data that you're talking about for
16    unavailability, how do you gather all of that, and how do
17    you do it consistently.
18              Right now, we have 18 performance indicators in
19    our current process.  Research identified in this early
20    document, 31 potential performance indicators.
21              And getting data to feed that right now is an
22    issue.  NEI has gone on record as saying, well, we've got 18
23    that work, you know, it's a voluntary program and why do you
24    want us to collect and submit data for 31?  That's a huge
25    increase in burden, and if we're going to go in that
.                                                                88
 1    direction, we want a corresponding decrease in the
 2    inspection that we get.
 3              And so that's one early position that we need to
 4    look at closely.  And we need to say whether or not the
 5    benefit that we're getting from risk-based PIs warrants that
 6    sort of response.
 7              But we don't know where that's going, but that's
 8    one issue.
 9              DR. APOSTOLAKIS:  Now, risk-based means
10    plant-specific?
11              MR. GILLESPIE:  It is very close to
12    plant-specific.  It depends on -- what it depends on is the
13    modeling used.  Research is using what are called the SPAR
14    models.  And to the extent that those are driven all the way
15    down to the plant-specific level, you'll get your answer.
16              I think right now there's 30 SPAR models, and they
17    are on their way to 70.  But that development is again --
18              DR. APOSTOLAKIS:  I wonder what NEI means when
19    they say that the system works?  Is it that 99 percent are
20    green; is that what it means?  Well, if I were they, I would
21    say it works very well.
22              So I'm not sure that the system works is really
23    something that we all agree to.  If you have very high
24    threshold levels, there are lots of greens.  If you're the
25    industry, of course it works.
.                                                                89
 1              It's nice, and that's the whole idea of having
 2    plant-specific indicators, and then you will have some way
 3    to discriminate.
 4              In fact, I remember several months ago, I read
 5    some comments from states and other non-industry groups
 6    where they expressed amazement that so many greens were
 7    collected.  So, I don't know that the argument that the
 8    system works, carries much weight with me, at least.
 9              DR. KRESS:  That brings to mind Graham Wallis's
10    standard question:  what's the measure of whether or not this
11    is working correctly?  How are you going to measure
12    that?
13              DR. APOSTOLAKIS:  And if those guys have 31
14    indicators and you are using now, 18, it doesn't look to me
15    like it's an insurmountable problem.
16    I mean, maybe they can lump some of theirs and
17    come up with a smaller number.
18              MR. BOYCE:  Your concerns are very valid.
19              DR. SEALE:  They're crying some crocodile tears
20    here, too, you know.
21              DR. APOSTOLAKIS:  Who is?
22              DR. SEALE:  The industry is, because I think that
23    anybody will tell you that if you go to a real plant that's
24    doing things well, that there are a heck of a lot more than
25    18 indicators they're keeping track of.
.                                                                90
 1              DR. APOSTOLAKIS:  My understanding is that INPO
 2    has made --
 3              DR. SEALE:  Yes, and so there's a little bit of
 4    disingenuousness in this added burden thing.
 5              CHAIRMAN SIEBER:  Well, that's probably true, but
 6    the difficulty is that each plant has its own set with
 7    different definitions.
 8              DR. SEALE:  I fully agree with that.
 9              CHAIRMAN SIEBER:  And trying to get them to agree
10    is 98 percent of the problem, as opposed to just doing it.
11              DR. APOSTOLAKIS:  They're getting some benefit
12    from this, Jack.
13              DR. APOSTOLAKIS:  Yes.  And that's the way you
14    manage your plant well.
15              DR. SEALE:  That's right.
16              DR. APOSTOLAKIS:  You said this was voluntary?
17              MR. GILLESPIE:  This is voluntary.  Our whole
18    program is voluntary.  The PI portion is voluntary.
19              DR. APOSTOLAKIS:  The PI portion?
20              MR. GILLESPIE:  The PI portion.  They don't get to
21    volunteer whether they want inspection or not.
22              DR. APOSTOLAKIS:  Yes, but the inspections are
23    different.  You don't just do the baseline.
24              MR. GILLESPIE:  If someone would un-volunteer,
25    then we're committed to doing a different inspection program
.                                                                91
 1    to make up for the differences.
 2              DR. APOSTOLAKIS:  Which is what you had been doing
 3    before.
 4              MR. GILLESPIE:  Right.
 5              CHAIRMAN SIEBER:  Sounds like volunteering is a
 6    good deal.
 7              DR. APOSTOLAKIS:  So are they volunteering, all
 8    units are volunteering?
 9              MR. GILLESPIE:  Oh, yes, there were no dissenters.
10              DR. APOSTOLAKIS:  No dissenters, so this is a
11    warmly-embraced volunteer program.
12              DR. BONACA:  I have a question about this now. 
13    You're going to look at the baseline inspection and assign
14    colors to those inspections, too.
15              And the question I have is, for this PI, you had
16    some criteria you used to determine the number that you're
17    accepting for each indicator.  That was essentially where
18    you didn't see any change in risk associated with the
19    value.
20              What about the inspection?  How are you going to
21    set the criteria to go from, you know, green to white, and
22    will the result be consistent in the sense that only 99
23    percent will be green, or is it going to be a different kind
24    of spread?
25              MR. GILLESPIE:  Maybe we've just jumped into
.                                                                92
 1    Doug's presentation where he was going to go through the
 2    significance determination process.
 3              CHAIRMAN SIEBER:  Yes, but let me ask one final
 4    question.  You talked about 31 future PIs.  Does that
 5    include the 18 that already exist, or was that an addition
 6    to the 18?
 7              MR. BOYCE:  Well, there is some overlap, but, no,
 8    it does not include those.  They're largely distinct and
 9    different.
10              CHAIRMAN SIEBER:  Okay, so that's 49?
11              MR. BOYCE:  Right, which we would not -- we have
12    -- that's one of the huge issues with implementation, is how
13    do you transition, if that's the right thing to do, and all
14    the data collection mechanisms are in place, how do you do
15    that transition?
16              MR. GILLESPIE:  One of the things that you have to
17    do is not get hung up on the word PI in the risk-based
18    PIs.  It's risk-based data.
19              I don't know if you heard the hesitancy in my
20    voice when I said this before.  That would be a different
21    program than we have today.  It would be different.
22              CHAIRMAN SIEBER:  This is not the green, white,
23    yellow, red?
24              MR. GILLESPIE:  It may have colors assigned to it,
25    but you've got a different data collection process which
.                                                                93
 1    doesn't exist today, that has to be put in place as
 2    infrastructure.
 3              You've got plant-specific thresholds like the
 4    maintenance rule kind of thresholds that might be useful.  I
 5    mean, the data is there; the thresholds are there.
 6              Consistency in reporting, we need a data
 7    dictionary so that the data elements are reported
 8    consistently.
 9              It's like the next evolution of this program.  It
10    may not, in fact, be these PIs.
11              And I say that because these PIs have baggage with
12    them.  Part 50.9, and is it willful if you make a mistake,
13    and all those arguments came up.
14              If we are actually going to have a window into all
15    the operating data at a facility on all the safety systems,
16    is it fair to expect the utilities to submit that to us
17    under 50.9 with all the expense involved in that, or is it
18    data similar to what an inspector gets and writes up in an
19    inspection report, and it's the best available data?  And
20    you then get it confirmed if you're going to use it for an
21    official purpose.
22              And if it's like inspection data, but available
23    long distance, that's a whole different approach to
24    monitoring reactor safety than we're in today.  It's a
25    different scheme, and I would suggest that you can't think
.                                                                94
 1    of it in the same context of this program.
 2              You almost have to think of it as the next
 3    incremental jump.
 4              DR. APOSTOLAKIS:  It's moving to homo sapiens,
 5    right?
 6              MR. GILLESPIE:  Yes, we're still kind of walking
 7    on twos and threes, and so it would be a different vision;
 8    it would be a different mix of program.
 9              DR. APOSTOLAKIS:  But it seems to me, Frank, that
10    you said something earlier that's very relevant.  There is a
11    perception among the licensees that no matter what colors
12    you give them, they have to be green.
13              And as you said earlier, being white is not the
14    end of the world.
15              MR. GILLESPIE:  It's not.
16              DR. APOSTOLAKIS:  And I think, in fact, having the
17    vast majority of the indicators being green works against
18    the program.  Now, people might say, well, what's wrong with
19    that?  The industry is working very well.
20              Well, yes, but every now and then you have
21    something that upsets that perception.
22              DR. KRESS:  Maybe we need light green and dark
23    green here.
24              DR. APOSTOLAKIS:  I think it is
25    perfectly legitimate to find green and white, and then you
.                                                                95
 1    correct things and so on.
 2              CHAIRMAN SIEBER:  I think that the problem is that
 3    you have to look at it from the Chief Nuclear Officer's
 4    standpoint.  As soon as you get a white, you're talking to
 5    the Directors and the CEO and the financial people and
 6    everybody in the world.
 7              DR. APOSTOLAKIS:  You do not distort the
 8    inspection process because you have to do that.
 9              MR. GILLESPIE:  And we haven't.  That's why I
10    would really like to get to Doug.  Doug can go through an
11    example, having had a whole hour to prepare several case
12    studies for presentation to the graduate panel here, and
13    we'll see if he gets his degree or not.
14              [Laughter.]
15              MR. GILLESPIE:  I think that will illustrate the
16    leveling, because the SDP process is really the leveling
17    process for the inspection results to bring consistency.
18              And this is one of the real successes.  This was a
19    real innovation.  We had PIs before; we just didn't use them
20    the same way.
21              The real innovation is the measurement yardstick
22    for inspection results.
23              DR. APOSTOLAKIS:  I have one question before we
24    get into that:  There are some numbers that are now in the
25    books that are kind of puzzling.
.                                                                96
 1              For example, to get into the red for the
 2    initiating events, you have to exceed 25 scrams.  I mean,
 3    where did this 25 come from?
 4              MR. GILLESPIE:  Actually, I wish Don were here to
 5    take the blame.  No, it wasn't Don.
 6              That came from our risk people talking to their
 7    risk people, and focusing actually very, very narrowly,
 8    which is why there is a difference between the normal
 9    C-level, which is the green/white threshold at three, and
10    the risk level at 25.
11              If nothing else is wrong and you strictly vary the
12    number of scrams in a PRA, and everything else works, it is
13    not very risk-significant.  And, therefore, you can have a
14    whole lot of scrams and everything else is working.
15              And that was the difference between people who
16    wanted to have a measure and people who wanted to have an
17    indicator.  Those of us who wanted an indicator and not a
18    measure like the normal C-green/white threshold and say,
19    hey, that's the real meaningful threshold, when you depart
20    from normalcy, we should inquire more.
21              And the risk-informed thresholds that are lower
22    for that parameter, are just out of sight, because way
23    before someone has 25 scrams in a year, they're going to fix
24    it.
25              DR. APOSTOLAKIS:  Right, and presumably you will
.                                                                97
 1    do something.
 2              MR. GILLESPIE:  Presumably we'd be on their case
 3    pretty heavily by that time, too.
 4              That's the reason, and that is the difference
 5    between a risk measure which is purely done as a parametric
 6    kind of study for risk, with a single parameter being
 7    varied, and the normal C-level, which is kind of the
 8    green/white threshold.  And I don't know that we will ever
 9    fix that; I mean, we are kind of stuck with it the way
10    it is right now, but that is where it came from.
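
          A small sketch of the threshold logic described above for the
unplanned-scram indicator, using the green/white threshold of three and
the red threshold of 25 stated in the discussion; the intermediate
white/yellow boundary of six used here is an assumption for
illustration only.

```python
def scram_pi_color(scrams_per_year: int) -> str:
    """Map an annual unplanned-scram count to a PI color band.

    The thresholds of 3 (green/white) and 25 (red) come from the
    discussion above; the yellow boundary of 6 is assumed here
    purely for illustration.
    """
    if scrams_per_year > 25:
        return "red"
    if scrams_per_year > 6:   # assumed white/yellow boundary
        return "yellow"
    if scrams_per_year > 3:
        return "white"
    return "green"

print(scram_pi_color(2))   # green -- normal performance
print(scram_pi_color(5))   # white -- departure from normalcy, inquire more
```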
11              I would like to give Doug some time because this
12    is -- he is actually a better speaker than the rest of us.
13              MR. BOYCE:  To segue into your question, Doug will
14    try and address how we separated out the wheat from the
15    chaff with our inspection findings into the more significant
16    ones.
17              MR. COE:  Thank you.  For the record, my name is
18    Doug Coe, I am with the Inspection Programs Branch at NRR.  I
19    am pleased to be here.  In fact, I am real excited to be
20    here, because, as Frank mentioned, I think the SDP is in
21    fact an innovation and a quantum step forward in our ability
22    to risk-inform people's thinking.  I am going to come back
23    to that point in just a moment.
24              But to answer the question that you asked, about
25    2-1/2 years ago, Mike Johnson approached me.  He said, we
.                                                                98
 1    have got these great indicators, performance indicators, and
 2    we have got this concept and we want to set these
 3    thresholds, he said, but we have a real problem.  He said,
 4    we need something to match inspection findings in terms of
 5    their significance so that we can add these things together,
 6    in effect, in the action matrix to decide what kind of
 7    action to take.  We want these things to be more or less on
 8    the same footing and the same scale.
 9              So the performance indicators, at least in the
10    reactor safety area were built around this concept of the
11    change in core damage frequency.  And so the only real
12    answer for Mike -- of course, Mike wanted a simple tool, but
13    we would give the exact answer that was, you know, the most
14    accurate answer.  So that was the genesis of the SDP.  And,
15    of course, you know, what it meant was we had to design a
16    tool that would estimate the risk change in terms of a delta
17    core damage frequency metric, so that it can be compared to
18    the thresholds that have been established or were being
19    established at that time for the PIs.
20              You know, we also, of course, in the larger sense,
21    there are other cornerstones that aren't related to risk, as
22    you well know.  And so the idea there, as time progressed
23    and as we began developing those other cornerstone SDPs, we
24    were -- I think the idea was to try to represent the
25    significance in a way that caused the NRC to react with what
.                                                                99
 1    we thought was the appropriate amount of response relative
 2    to what we would be responding to and reacting to in the
 3    reactor safety area.  So there was a commensurate level of
 4    response, even though we couldn't actually quantify the risk
 5    value for some of those other cornerstones.
 6              But I want to make a couple of points real strong,
 7    and as I go through the couple of examples that I have
 8    provided today, I hope that it illustrates these points. 
 9    Inasmuch as the SDP was originally intended to be a backend
10    tool for an inspection finding, in other words, the
11    inspector goes out, does the inspection program, gathers up
12    a bunch of findings, brings the findings back, plugs them
13    into the SDP, turns the crank and out comes a number.  That
14    was kind of the original idea.
15              However, a number of us saw, who had been
16    struggling for a long time, feeling a little bit like the
17    guy with the bamboo cane that is kind of whacking at the
18    elephant's leg, you know, trying to get the elephant to move
19    in the risk-informed direction, we saw some real value in
20    helping to risk-inform people's thinking because, at least
21    personally, my assessment, having immersed myself in this
22    business since 1995, was that the real bottleneck, the real
23    problem was that the people who were deciding things on the
24    basis of risk insights really didn't own the risk analysis. 
25    They hadn't participated in it and, in fact, they were
.                                                               100
 1    simply looking to other people, specialists, analysts, to
 2    provide an answer, a risk answer.
 3              And, so, in fact, one way to answer Mike's
 4    question 2-1/2 years ago was to say, okay, Mike, that's great. 
 5    You have established a threshold already, we will just hire
 6    a bunch of risk analysts and we will just put them to work. 
 7    And all the inspectors will give their findings to the risk
 8    analyst.  The risk analyst will be the oracle and they will
 9    come back and they will say, well, this is a green, and this
10    is a white.
11              DR. APOSTOLAKIS:  So what is wrong with that?
12              [Laughter.]
13              MR. COE:  What's wrong with that?  I will tell you
14    what is wrong with that.  Part of the effort that we have
15    tried mightily, I think, and now with greater success
16    because of the SDP, is to risk-inform inspectors' thinking. 
17    Because if they don't know, if they are not sensitive to the
18    plants, the things at a particular plant that drive risk
19    significance at that plant, then, you know, their effort and
20    their success in finding the most significant issues is
21    hampered.
22              And in the past we have tried, we have given them
23    training, and I even helped create a course, a two-week
24    course, at Frank's behest a number of years ago, which I
25    think has been somewhat successful, although it still fell
.                                                               101
 1    short.  We sent them forward.  Two weeks of training later,
 2    they knew how PRAs were done.  They had had some exposure to
 3    the IPEs, okay, because we brought those into the classroom,
 4    and we sent them out.  And we said, well, make the best use
 5    of this, do what you can.  You know, take the risk insights
 6    that you have gotten out of the IPEs, which, by the way, are
 7    10 years old, so they are not very useful anymore, and you
 8    will have to get new insights from the licensee, and use
 9    these to help find good issues.
10              Well, we sent them forward without any tools.  We
11    sent them forward without any -- well, I mean the
12    computer-based tools, we weren't able to give them training
13    on in a two-week course.
14              So what the SDP does is it provides a link between
15    the sophisticated, computer-based analyses that we have
16    created over time, and the need for the inspector to have at
17    least a conceptual understanding on a high level, on a
18    functional level as to what drives risk at their plant.
19              I think we have struck the right balance, because,
20    on the one hand, our inspectors looked at our initial SDPs
21    and said this is hopelessly complex.  The analysts, on the
22    other hand, looked at our initial SDP worksheets and said
23    this is hopelessly crude.  So, given the two extremes and
24    the two perspectives, I think we have probably hit about the
25    right mark in the middle.
.                                                               102
 1              And so we have got a tool that isn't perfect, it
 2    is a gross estimator, but it forces the user to think in
 3    risk terms.  And I can't tell you how it has warmed my heart
 4    to hear not only inspectors, not only some NRC managers, but
 5    licensee PRA people that have come to me and said, this is a
 6    tool that I can use to help explain risk to my management
 7    and I have never been able to do that until now.
 8              And so, you know, I don't want to overplay this.
 9              DR. APOSTOLAKIS:  I have a question on this.
10              MR. COE:  Sure.
11              DR. APOSTOLAKIS:  The SDP documents are
12    plant-specific?  Because you said that they have to
13    appreciate the risks, so, you know, the dominant
14    contributors and all that.  So an inspector at, say, a
15    Davis-Besse would have matrixes that would reflect somewhat
16    that plant?
17              MR. COE:  Yes.  It is intended that the plant
18    equipment, the equipment that is designed for that plant,
19    and, of course the type of plant that it is, whether it is
20    PWR or BWR, and so forth, it is intended that, in fact, the
21    SDP be a fairly accurate, high level representation of the
22    plant-specific PRA that the licensee has performed, and that
23    we also try to capture in our SPAR model.
24              DR. APOSTOLAKIS:  And this is based on the IPE?
25              MR. COE:  Well, initially, we created -- you see,
.                                                               103
 1    this is a problem because we created -- the SDP initial
 2    concept was developed from the IPEs, because that is the
 3    only thing that we had available to us.  We have legal
 4    restrictions on going out and asking licensees for broad --
 5    you know, making broad requests for information to supply us
 6    with their up-to-date, most current models.
 7              So we started with that knowing that we would have
 8    to go get, on a plant-specific basis, an update, and we did
 9    that.  We formulated, early in the program -- actually, we
10    didn't even have all of the initial worksheets developed in
11    April when we started initial implementation, but we
12    finished shortly thereafter, and we engaged in a series of
13    site visits -- a visit to every site.
14              We sent one of our risk analysts, and sometimes
15    more than one.  We had some help from our contractor.  And
16    we gathered -- and we allowed the licensee to comment on the
17    worksheets and to incorporate things that we knew were
18    missing from the IPEs.
19              DR. APOSTOLAKIS:  So right now they are not
20    based on the IPEs?
21              MR. COE:  Now they have progressed beyond the IPEs
22    and we have incorporated, or are in the process of
23    incorporating the comments and feedback that we got from
24    doing the site visits.
25              DR. APOSTOLAKIS:  Perhaps you are aware that there
.                                                               104
 1    is a letter from the Union of Concerned Scientists to the
 2    Commission to direct the staff, as I recall, not to use
 3    the IPEs because we have told them so many times that the
 4    IPEs are not very good.
 5              MR. COE:  Right.
 6              DR. APOSTOLAKIS:  So if the Commission says don't
 7    use these IPEs, this program is not --
 8              MR. GILLESPIE:  Right.  I am watching the clock
 9    because I want Doug to go through some details, otherwise,
10    we will never really answer your questions.
11              They are plant-specific, we did have plant visits
12    with the contractors.  We have shared the booklets, and
13    there is a booklet for every plant, with the licensees.  And
14    the licensees know we are measuring them against that.  So I
15    think every licensee has done his best to make sure it
16    fairly reflects his facility.  So it hasn't been
17    unilaterally the NRC taking 10-15 year old data now.
18              Yeah, at a high level, I think we have got really
19    pretty good information.
20              DR. APOSTOLAKIS:  Okay.  So the booklets then are
21    plant-specific, but the ultimate comparison with the
22    thresholds is generic?
23              MR. GILLESPIE:  Yes.
24              DR. APOSTOLAKIS:  And we are planning to fix that.
25              MR. GILLESPIE:  Fortunately, though, we are
.                                                               105
 1    dealing with delta risk and not absolute risk.  Changes.
 2              MR. COE:  And it is important, too, to answer the
 3    concerns that are still out there and will stay out there, I
 4    think, for a while, that, you know, the quality of the
 5    analysis that we do is only justified based on whether or
 6    not we agreed upon the influential assumptions that were
 7    used in the analysis.  And part of the process that we were
 8    seeing happen here, I think, which is one of the intents, is
 9    that the influential assumptions of a particular risk
10    analysis be more broadly understood by a wider population
11    of people, including the inspectors in the field, who are in
12    a position to challenge those assumptions with the
13    information that they gain on a
14    day-to-day basis.
15              So, you know, it is a question of improving the
16    understanding at all levels in the organization of what is a
17    risk analysis, and what it isn't.
18              So let me just make a couple of points real quick. 
19    I have got a couple of overview slides in here.  And let me
20    just point out, you talk about a lot of green findings, and
21    that is true; however, eight months into the program now, in
22    the initial implementation, we have 16 greater than green
23    issues that have been processed by what we call the SERP,
24    which is the SDP and Enforcement Review Panel.  This
25    is the headquarters panel that meets to look at every issue
.                                                               106
 1    that the regions send us as a white issue or greater.
 2              And these are the cornerstones along this side
 3    here.  And not too surprisingly, of course, we have got a
 4    lot of whites, a couple of yellows, and, of course, you know
 5    about the Indian Point 2 red.  The note at the very bottom
 6    is supposed to reference the physical protection cornerstone
 7    and reflects the Commission's review currently of an interim
 8    SDP that is going to be used.  So that one white issue may
 9    change after we hear from the Commission.
10              DR. APOSTOLAKIS:  So what is the first bullet, 16
11    greater than green?
12              MR. COE:  Sixteen.  If you count up these issues,
13    you will find 16 issues here, and these are all greater than
14    green, and they have all been processed by the headquarters
15    panel.  And this is simply a high level tally, the stats to
16    date, since April.  Okay.
17              And my only point, the only take-away on this is
18    simply that we have had --
19              DR. WALLIS:  I found it greater than confusing,
20    because, to me, grades go up and greater than green would
21    mean better than green, not worse than green.
22              MR. COE:  I'm sorry, I was using a mathematical
23    symbol, I probably should have used language.
24              DR. WALLIS:  Or non-green or something.
25              MR. COE:  Non-green is better.  Non-green is
.                                                               107
 1    better.
 2              DR. SHACK:  One thing that has been surprising to
 3    me is those two yellows that plug in there for what seems
 4    like a long time.  They were yellow when I first looked at
 5    them.  You know, they are yellow, you know, months later.
 6              MR. COE:  Yellow inspection findings do stay
 7    yellow for four quarters, that is a characteristic.
 8              DR. SHACK:  So a corrective action may well have
 9    been taken.
10              MR. COE:  Yes.
11              DR. SHACK:  I see, but its color hangs in there.
12              MR. COE:  Right.  It is not real time like a
13    performance indicator, or it is relatively less real time. 
14    But the idea was there that --
15              MR. BOYCE:  One second.  You need to carry that
16    for four quarters, because if you have one and then it is
17    fixed, and then one the next day and then it is fixed, you
18    can do that every day.  And the process is built on if you
19    have got more than one, you end up degrading your
20    cornerstone and that takes you into the action matrix where
21    you get increased regulatory attention.  That is the basis
22    for it.
23              MR. COE:  And, also, a yellow inspection finding
24    does represent some underlying causal factors that would
25    more likely than not take
.                                                               108
 1    some time to fix, and to be assured that it wasn't going to
 2    recur.  So a year, for better or for worse, seemed
 3    like about the right amount of time.
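
          A minimal sketch, under assumed data structures, of the
carry-over logic just described: a greater-than-green inspection
finding counts against its cornerstone for four quarters, and more
than one open input degrades the cornerstone and moves the plant
along the action matrix.  The finding records below are invented and
do not reflect the program's actual implementation.

```python
# Hypothetical findings: (cornerstone, color, quarter issued). Invented data.
findings = [
    ("mitigating systems", "white", 3),
    ("mitigating systems", "yellow", 5),
    ("initiating events", "white", 2),
]

CARRY_QUARTERS = 4  # a finding counts against its cornerstone for four quarters

def open_findings(findings, current_quarter):
    return [f for f in findings
            if current_quarter - f[2] < CARRY_QUARTERS and f[1] != "green"]

def degraded_cornerstones(findings, current_quarter):
    per_cornerstone = {}
    for cornerstone, color, _ in open_findings(findings, current_quarter):
        per_cornerstone.setdefault(cornerstone, []).append(color)
    # More than one open greater-than-green input degrades the cornerstone.
    return [c for c, colors in per_cornerstone.items() if len(colors) > 1]

print(degraded_cornerstones(findings, current_quarter=6))
# ['mitigating systems'] -- the white from quarter 3 and the yellow
# from quarter 5 are both still within the four-quarter window.
```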
 4              Okay.  And we talked a little bit about some of
 5    the key issues earlier in the meeting.  The risk-informed,
 6    plant-specific, again, plant-specific, risk-informed
 7    notebooks, I am going to illustrate their use with these two
 8    examples that I have here for reactor safety cornerstones.
 9              Fire protection, I know we have talked about that. 
10    I know your interest in that area, and it is an area where we
11    acknowledge that, because of the nature of fire protection
12    issues and the spatial complexities that are inherent in
13    that, it is a more complex SDP, and there is essentially no
14    way to get around that.  But we do need to make the tool
15    usable, and we are working to improve its usability.
16              Safeguards, I have talked about, and the
17    Commission is reviewing the current interim SDP.  Containment:
18    We really haven't made as much progress as I would like to
19    have made on containment.  We have a basis document that is
20    in place, and the Risk Analysis Branch of NRR has taken on
21    the task of improving that process from a usability
22    standpoint for the inspectors' use.
23              Shutdown, we have a checksheet basically that goes
24    through some of the industry guidelines and it
25    is a Phase 1 kind of a process where it prompts the
.                                                               109
 1    inspector to send the issue to headquarters for further
 2    analysis if certain thresholds within that checksheet are
 3    met.
 4              So, with that, I think I am ready to dive into an
 5    example, unless there are any questions at this point.
 6              [No response.]
 7              MR. COE:  All right.  I have given you a package
 8    that incorporates two examples.  I have to point out that
 9    these examples we used for the rollout training prior to
10    April and so the worksheets have changed slightly since
11    April, since we have gone into initial implementation.  And
12    what I hope to achieve today is simply to illustrate to you
13    the concept.
14              First, just as an overview, let me say that the
15    SDP for reactor safety is divided into three phases.  And,
16    Dr. Apostolakis, it actually parallels very closely the work
17    that you and Budnitz did on the fire and seismic.  You
18    actually had three levels, I believe, we call them phases. 
19    But I think it paralleled -- what we have done here
20    parallels very closely the thoughts and concepts that you
21    articulated in that NUREG on how to assess.  And I know your
22    task at that time was to assess LERs for their possible
23    addition to the ASP process, the Accident Sequence Precursor
24    process.  But the concept is very similar.
25              So Phase 1 is essentially a screening where we can
.                                                               110
 1    call as green those issues which have a very high likelihood
 2    of not reaching the 1E-6 delta core damage
 3    frequency threshold.
 4              Phase 2, if something does not pass that screen,
 5    and we can't just say that it is definitely, you know, of
 6    that low of significance, then we have to do a little
 7    further analysis on a higher, on kind of a gross level,
 8    functional level.  But that is the process that I believe
 9    will give us the most gains in helping to risk-inform our
10    inspectors' thinking, and that is what we are going to go
11    through here.
12              The Phase 3 is the point at which the inspector
13    has to hand-off the issue for either verification that it is
14    a risk significant issue, to a risk analyst, or because the
15    SDP process simply can't accommodate the particular issue
16    the inspector has, and that should become apparent as the
17    inspector tries to work through the process.  So it is an
18    automatic acknowledgement that there's limitations to Phase
19    2, and if you exceed those limitations, you need help.
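
          A schematic sketch of the three-phase flow just described,
using the 1E-6 delta core damage frequency screening value mentioned
above; the function name, dictionary keys, and the way the Phase 2
estimate is represented are assumptions for illustration, not the
actual SDP worksheets.

```python
def sdp_reactor_safety(finding):
    """Schematic of the three-phase reactor-safety SDP flow described above.

    `finding` is a hypothetical dict; the keys used here are assumptions
    made for illustration only.
    """
    # Phase 1: screen out issues very likely below 1E-6 delta-CDF.
    if finding["cornerstones_affected"] == 1 and finding["screens_green"]:
        return "green"

    # Phase 2: the inspector makes a gross, functional-level estimate
    # using the plant-specific risk-informed notebook.
    delta_cdf = finding.get("phase2_delta_cdf_estimate")
    if delta_cdf is None or finding.get("beyond_phase2_scope", False):
        # Phase 3: hand off to a risk analyst for verification, or for
        # issues the Phase 2 worksheets cannot accommodate.
        return "phase 3 - risk analyst review"

    return "greater than green" if delta_cdf >= 1e-6 else "green"
```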
20              This part of the worksheet right here is just
21    basically setting up the problem.  And in this particular
22    case, we have an accumulator in a PWR, one of four, that has
23    not met its required tech spec level for a period of time. 
24    And so this is very important, because, as we have
25    experienced with our initial rollout of this technique,
.                                                               111
 1    inspectors really have to write down very clearly what the
 2    deficiency is, because, as they proceed through a
 3    risk-informed process, there is a strong tendency to change
 4    the rules, or change the initial conditions as you go.  And
 5    you have to resist that, and so we have to be very clear.
 6              So, basically, this is just information that helps
 7    identify the problem, and the lower half of the worksheet
 8    simply helps the inspector identify that one of the
 9    cornerstones that we are going to address in this process
10    has been affected, there has been an impact or a change.
11              DR. WALLIS:  Well, this looks like a cascade.  I
12    mean I would think you would check core decay heat removal,
13    because that is what it has to do with.  Then you would
14    check initial, and then you would check primary, and then
15    you would check one of the pressures.  Instead of just
16    checking one of those things, you would check probably four
17    of those.
18              MR. COE:  Yes, you are quite right.
19              DR. WALLIS:  It is a cascade of reasoning.
20              MR. COE:  That is exactly it.  It is a cascade of
21    reasoning, that is the way it was developed.  And, in fact,
22    that is a very good point.  And we have seen inspectors use
23    it in that fashion as well.
24              But given that the logic should be apparent here,
25    it would be clear, I hope, that if you check this, that it
.                                                               112
 1    really relates to these other two higher order functional
 2    statements.
 3              DR. APOSTOLAKIS:  What does it mean that for the
 4    Maintenance Rule, it is a risk significant event?  I mean
 5    they can tell that right away?
 6              MR. COE:  The Maintenance Rule required licensees
 7    to categorize the systems and components into two
 8    categories, risk significant and non-risk significant -- or
 9    high risk significance, low risk significance.
10              DR. APOSTOLAKIS:  Oh, it is a categorization of
11    the accumulators.
12              MR. COE:  It is where the accumulators
13    are in the licensee's scheme.
14              DR. SEALE:  That ties it all together.
15              MR. COE:  And that information is needed for the
16    Phase 1 screening question, which we will get to on the next
17    page, actually.
18              The next page is the Phase 1 screening process,
19    and this is the series of questions that are related to
20    trying to -- you know, if we have an issue that is --
21              DR. WALLIS:  So this value is a cross instead of a
22    checkmark, so his vote doesn't count.
23              [Laughter.]
24              MR. GILLESPIE:  Well, it depends on what happens
25    to the chad.
.                                                               113
 1              MR. COE:  This process here, Phase 1 starts by
 2    checking off which cornerstones are affected.  And, again,
 3    we have updated this a little bit, clarified it a little bit
 4    more.  But the basic idea here is that if more than one
 5    cornerstone is affected, then we go ahead and go on into
 6    Phase 2.  We are not confident, if more than one cornerstone
 7    -- typically, this might be something that would affect both
 8    initiating event frequency and the mitigating system
 9    capability.
10              DR. APOSTOLAKIS:  Can you give me an example of
11    something that affects no cornerstone?  It seems to me you
12    have covered just about everything there.
13              MR. COE:  Yes.  A licensee fails to put an SSC
14    into Category A1 of the Maintenance Rule, an SSC has reached
15    its performance criteria, and so they begin to take -- well,
16    they are taking corrective actions on a failure-by-failure
17    basis, but the licensee fails to recognize that it has
18    exceeded its performance criteria and should now have been put into
19    that category of the Maintenance Rule.  The deficiency is that they failed to
20    take an administrative action that would then help give more
21    attention and, presumably, improve the reliability of that
22    component.
23              This is an issue currently because those kinds of
24    things that can't be related to a cornerstone, and that is a
25    good example of one, end up as what we call a no-color
.                                                               114
 1    finding.  In other words, the licensee failed to comply with
 2    the Maintenance Rule, but there was no -- for the deficiency
 3    that I described, there was no effect on plant reliability.
 4              DR. APOSTOLAKIS:  So, primarily, administrative
 5    kinds of things.
 6              MR. COE:  Yes.
 7              DR. WALLIS:  There is one thing that is missing
 8    here, and that is none of the above.  And maybe the
 9    inspector has a very good reason to know he ought to go to
10    Phase 2, although it doesn't fit in one of these boxes.
11              MR. COE:  And that is a good point, and what we
12    have stated in our implementing guidance, and what I stress
13    when I talk to inspectors, is that they should never at any
14    time feel constrained not to proceed into a Phase 2
15    analysis, because, as I mentioned, and as I stress over and
16    over again, it is like anything else, the more you use it,
17    the better you understand it, and that is, in fact, one of
18    the goals.  Okay.  So that is a good point.
19              If, however, only one cornerstone is affected,
20    Phase 1 screening questions try to get a little bit more
21    definition of the potential significance.  And, so, in this
22    particular case, the mitigation system cornerstone was
23    affected.  We asked a series of questions.  And, as you can
24    see here, Question Number 3 was the one that was checked
25    yes, and it was that the issue represented actual loss of a
.                                                               115
 1    safety function of a single train for greater than tech spec
 2    allowed outage time.
 3              I would point out that these screening questions
 4    are in sync with or came out of the accident sequence
 5    precursor screening criteria.  So we drew from that to try,
 6    number 1, to be consistent, and -- well, that is primarily
 7    it, is to be consistent with that.  And we might adjust
 8    these screening criteria as time goes on, as we gain more
 9    experience.  If we had found an issue that passed through
10    this and none of the questions were answered yes and it came out
11    green, and we subsequently found it to be of significance,
12    we would certainly want to revisit this.  But to date, we
13    have not encountered that.
14              So we say, okay, this has greater than --
15    potentially greater significance than green.  So we want to
16    continue the analysis.  At that point we ask the inspector
17    to look at what are the accident sequences that are of
18    concern here.  And for an accumulator, its function, as was
19    represented in the worksheet, the first page of that
20    worksheet, was to provide core reflood capability upon a
21    design basis LOCA.
22              And so we look at large LOCA and we look at the
23    frequency, and we look at how long the condition existed. 
24    In this case, the condition existed --
25              DR. BONACA:  How does the --
.                                                               116
 1              MR. COE:  What did I say?  Ninety hours.  I said
 2    90 hours.
 3              DR. BONACA:  How does the inspector know that you
 4    don't rely on this for a small LOCA?  I mean does he have
 5    sufficient understanding of that?  What kind of effort --
 6              MR. COE:  No, actually, the process requires the
 7    inspector to write down what the function of the system is. 
 8    And I believe -- I think we might have even changed this -- so
 9    it doesn't just focus on the licensing basis.
10              MR. GILLESPIE:  How does he know which sequence to
11    pick for his plant?
12              DR. BONACA:  Well, I mean here, you know, if you
13    understand the LOCA, you know, for a small break LOCA, you
14    can use the accumulators.  But for a medium-break LOCA, it
15    depends on the plant, and for a large-break LOCA, all of them need it.
16              MR. GILLESPIE:  Right.
17              DR. BONACA:  So the question is -- you know, that
18    is quite a time-consuming effort to go back into this.
19              DR. APOSTOLAKIS:  That is why they take the
20    two-week course.  Is that what you teach them?
21              MR. GILLESPIE:  That's the key to it.  Plus they
22    had a week of additional training specifically on the
23    applicability of this.
24              But, Doug, do you want to?
25              MR. COE:  Yeah, we have a table in the
.                                                               117
 1    plant-specific workbooks, and what they will do, for every
 2    system, in this case, accumulators would be listed on the
 3    left column of that table.  You would go over to the right
 4    column and it would tell you which accident sequences, and
 5    each accident sequence that is represented there would have
 6    a separate worksheet.  And we are going to go through a
 7    couple of those in just a moment.
 8              Now, I am noting here that what I gave you here
 9    was that we should have had a 90 hour time period, which
10    would actually have put us over here in the -- well,
11    actually, in the middle column.  Okay.  And so this isn't
12    exactly right, and I think that has been fixed in our
13    subsequent presentations.
14              DR. BONACA:  We have different --
15              MR. COE:  Three days.  Three days is what, 72
16    hours.  And so what we really have is we have a
17    time period that is greater than three days and less than
18    30.  So what we really should be in is this block right
19    here.
20              DR. BONACA:  Our slides, in fact, are correct.
21              DR. APOSTOLAKIS:  A little different.
22              DR. BONACA:  They show the F for Category 5 and
23    the G for 6, so they are correct.
24              MR. COE:  Well, the time period here is specified
25    as 90 hours.
.                                                               118
 1              DR. WALLIS:  We have 5 and you have 6.  Why is
 2    that?
 3              DR. SHACK:  Medium LOCA and large LOCA.
 4              MR. COE:  Okay.  All right.  You have -- I think
 5    you have the updated.
 6              DR. BONACA:  We have the updated version.
 7              MR. GILLESPIE:  You gave them the right answer.
 8              MR. COE:  I gave you the right answer, I have got
 9    the wrong slide.  Okay.
10              [Laughter.]
11              DR. APOSTOLAKIS:  So let me understand, it is a G,
12    is that what it is?
13              MR. COE:  For large LOCA is G.
14              DR. APOSTOLAKIS:  And what does G mean?
15              MR. COE:  G is simply the value here, multiplied
16    by this in terms of years.  It gives you an estimated
17    likelihood that that event will occur during that period of
18    time.
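              [Editor's illustration:  a short Python sketch of the
      arithmetic Mr. Coe describes, in which the Table 1 letter indexes the
      likelihood that the initiating event occurs during the exposure
      window, i.e., its frequency multiplied by the exposure time expressed
      in years, and the time columns follow the three-day and 30-day
      boundaries mentioned a moment ago.  The large-LOCA frequency in the
      example is an assumed placeholder, not a value from the worksheets.]

          HOURS_PER_YEAR = 8760.0

          def exposure_column(exposure_hours):
              """Bin the exposure time into the worksheet's time columns."""
              days = exposure_hours / 24.0
              if days < 3:
                  return "< 3 days"
              if days <= 30:
                  return "3 to 30 days"      # the 90-hour example lands here
              return "> 30 days"

          def event_likelihood(freq_per_year, exposure_hours):
              """Frequency (per year) times exposure time (in years)."""
              return freq_per_year * (exposure_hours / HOURS_PER_YEAR)

          print(exposure_column(90.0))          # 3 to 30 days
          print(event_likelihood(1e-5, 90.0))   # ~1.0e-7 with an assumed 1e-5/yr frequency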
19              MR. GILLESPIE:  It is an index to get to the next
20    table.
21              DR. APOSTOLAKIS:  Okay.  The next table, okay.
22              MR. GILLESPIE:  Yeah, let me continue.  Now we are
23    at the simplified tool stage.
24              DR. APOSTOLAKIS:  Yeah.
25              MR. GILLESPIE:  To avoid a calculation.
.                                                               119
 1              MR. COE:  If you want to just jump ahead to this
 2    in your first example, this is where we are headed.  This
 3    initiating event likelihood of G represents that value, that
 4    likelihood that that event would occur during that period of
 5    time.  And the remaining part of the risk equation is what
 6    is the likelihood that, given that event during that time,
 7    that the plant's mitigation functions would completely fail? 
 8    And so that depends on how many mitigation functions are
 9    available to address that initiating event.
10              And so our final, what we are shooting for here is
11    which one of these cells are we in for any of the initiating
12    event sequences that were affected by that particular
13    deficiency.
14              To get to that point we need to know what the
15    plant's capability is, and what we have represented here is
16    simply a high level representation of accident sequences
17    beginning with, in this case, the large LOCA initiating
18    event.
19              So this worksheet is going to take the value of
20    the large LOCA initiating event probability during that
21    period of time, which we have represented as "G" --
22    correctly on your handouts, "G" -- as an entry point to that
23    table that I just showed you, but we document on this
24    worksheet as "G" for these large LOCA sequences.
25              We note that the exposure time was 90 hours and in
.                                                               120
 1    other words we know where that G came from because it could
 2    have come from a couple of different places in Table 1.
 3              In the next section here --
 4              DR. WALLIS:  Just look at that, because it seems
 5    to say that four out of four accumulators are needed.  Isn't
 6    that what it says there?
 7              MR. COE:  That's correct.
 8              DR. WALLIS:  But you have lost one.
 9              MR. COE:  Right.
10              DR. WALLIS:  So why is it not a bad thing to lose
11    one?
12              MR. COE:  These definitions up here are there to
13    define these functions down here, and of course these are
14    going to represent the accident sequences that we are going
15    to be concerned about in this analysis.
16              Therefore, the early inventory accumulator
17    function is evaluated in terms of the full plant capability
18    to satisfy that function.  The requirement, and
19    this is actually a restatement of the success criteria, is
20    that four out of four accumulators are required to function,
21    okay, to satisfy the EIAC safety function.
22              DR. WALLIS:  He's lost one.  Hasn't he lost one?
23              MR. COE:  You have indeed lost one.
24              DR. WALLIS:  So why is a zero in there?
25              MR. COE:  Because what you do down here is, first
.                                                               121
 1    of all, can you recover the failed train, and you either get
 2    a credit of zero if you can't or you get a credit of one if
 3    you do --
 4              DR. WALLIS:  And so zero is bad?
 5              MR. COE:  Well, zero is simply a reflection that
 6    you cannot recover that train --
 7              DR. WALLIS:  I was puzzled by that, because zero
 8    equals Green on the right.
 9              MR. COE:  I'll get there.  You are jumping ahead
10    of me.
11              DR. WALLIS:  Okay, I'm sorry.
12              MR. COE:  Let's walk through this as logically as
13    I can.
14              The EIAC function has been affected, okay, and
15    that is the only function here that has been affected.  In
16    other words, all of these sequences are large LOCA
17    sequences, but the only sequence likelihood that has changed
18    is this one up here, because that is the function that was
19    affected.
20              The function requires four out of four
21    accumulators.  You have less than four.  Therefore, over
22    here you get zero credit for that function, zero credit
23    here.  We ask the supplemental question in almost all cases
24    could this function be recovered by manual operator action? 
25    What the user of this SDP has said here is that in their
.                                                               122
 1    judgment you couldn't do it fast enough.
 2              In other words, a large LOCA is going to happen
 3    and there is no time -- there is no opportunity to regain
 4    that function.
 5              DR. WALLIS:  You are not going to refill an
 6    accumulator while there is a LOCA.
 7              MR. COE:  Precisely, and so that is why that is
 8    zero and so this is zero, and so we come over here for this
 9    sequence.  We essentially have lost all functional
10    capability and therefore the risk significance for that
11    sequence is solely dependent upon the initiating event
12    frequency or the likelihood that that initiating event would
13    occur during that period of time.
14              That is how we get to Table 2 -- this table
15    here -- where we have this G, initiating event likelihood
16    during that period of time.  We come out over here to zero,
17    zero capability, and so what this cell represents is a range
18    of delta CDF values that basically reflect the value of
19    that --
20              DR. APOSTOLAKIS:  So what saves you is that the
21    frequency of the large LOCA is small enough.
22              MR. COE:  That's correct.
23              DR. BONACA:  Is small enough -- and the frequency
24    accumulated over the 90 hours is very small.
25              DR. WALLIS:  So it is small enough you could have
.                                                               123
 1    lost all the accumulators and you still wouldn't care?
 2              MR. COE:  For that particular accident initiator.
 3              Now what the table would tell the inspector to do
 4    would be that not just the large LOCA takes credit for
 5    accumulators, which is design basis and really that's all
 6    the design basis looks at, but from a PRA standpoint the
 7    medium break LOCAs take credit for accumulator reflood,
 8    okay? -- so we also have to look at --
 9              DR. BONACA:  Before we go to that though, the
10    interesting thing it seems to me that if, however, it was
11    more than 30 days, going back -- then you would be an F --
12              MR. COE:  Yes.
13              DR. BONACA:  -- and you would be White.
14              MR. COE:  Yes.
15              DR. BONACA:  So then because the frequency of the
16    LOCA and --
17              MR. COE:  Time duration --
18              DR. BONACA:  Exactly.
19              MR. COE:  So what we have done is we have allowed
20    the user to begin to see the interplay between time,
21    initiating event frequency, and remaining mitigation
22    capability, and to do so in a simple fashion that does not
23    require them to sit at a computer terminal and understand
24    the complexities of what is happening in software.
25              DR. APOSTOLAKIS:  But if we did or if they did
.                                                               124
 1    only what you said, then even if all four accumulators were
 2    unavailable the result would be the same, but the Agency
 3    would have reacted in a different way, I hope.
 4              MR. COE:  Well, from a risk standpoint, if we are
 5    using that as the yardstick for our response, what we are
 6    saying is in this case, and for the large LOCA we would have
 7    the same -- the change in core damage frequency we would
 8    estimate to be less than 1E to the minus 6, but it is a good
 9    point.
10              Actually, when you look at the next accident
11    sequence, medium LOCA -- actually, I don't know the answer
12    right now, but let's see if that makes a difference.
13              Now in a medium LOCA the success criteria is that
14    you only need two out of the four accumulators, okay, to
15    satisfy that function, same function.  Essentially you have
16    got the same series of sequences here except that the medium
17    LOCA frequency might be a little bit higher, but now you can
18    satisfy the medium LOCA, which is generally going to be less
19    than a six-inch break, with two out of the four
20    accumulators, so if you have either three or four
21    accumulators you can take credit for a multi-train system in
22    terms of the reliability of the remaining accumulators to
23    satisfy that function.
24              If you had lost two accumulators, you could not --
25    well, you would take credit for one train.  You would
.                                                               125
 1    essentially have one train of capability.
 2              DR. APOSTOLAKIS:  Let's say you lost all four.
 3              MR. COE:  If you lost all four, then this would
 4    be, instead of three here, this would be zero, and now here,
 5    as was pointed out a moment ago, you have a higher
 6    initiating event frequency --
 7              DR. APOSTOLAKIS:  So it would be White.
 8              MR. COE:  Yes, but it is because of this
 9    rationale, this logic, which is grounded in probabilistic
10    risk principles and forms the framework for our judgment
11    that that is a more significant issue now.
12              The loss of three accumulators out of four would
13    put this into a category where we would be more concerned. 
14    In fact, two accumulators would only give you a credit of
15    two for a single train, and as I am sure you are aware, that
16    value of two represents 10 to the minus 2 failure upon
17    demand.  It is simply the negative logarithm.
18              DR. APOSTOLAKIS:  Then it would be a Green, right? 
19    If you have lost two --
20              MR. COE:  I can never do this from memory.
21              DR. APOSTOLAKIS:  I already can.
22              [Laughter.]
23              MR. COE:  If you were F and you had two, you're
24    right, you would be Green.
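              [Editor's illustration:  a hypothetical Python sketch of the
      kind of counting rule the Phase 2 worksheets embody, as described in
      the exchange above.  Each remaining-mitigation credit is treated as
      the negative base-ten logarithm of a failure probability (a credit of
      two is roughly 1E-2 per demand), the Table 1 letter is treated as an
      order-of-magnitude exponent, and the color thresholds are
      illustrative values consistent with the 1E-6 greater-than-Green
      figure cited later in the discussion.  The exponents assigned to F
      and G and the multi-train credit of three are assumptions for
      illustration only.]

          # Assumed order-of-magnitude exponents for the Table 1 letters.
          EVENT_EXPONENT = {"F": 6, "G": 7}    # e.g. G ~ 1e-7 likelihood (assumed)

          def mitigation_credit(available, required,
                                single_train_credit=2, multi_train_credit=3):
              """Credit for one mitigation function, per its success criteria."""
              if available < required:
                  return 0                     # function lost: no credit
              if available == required:
                  return single_train_credit   # ~1e-2 failure on demand
              return multi_train_credit        # margin beyond the success criteria

          def sequence_color(event_letter, credits):
              """Combine event likelihood and remaining mitigation credits."""
              exponent = EVENT_EXPONENT[event_letter] + sum(credits)
              delta_cdf = 10.0 ** (-exponent)
              if delta_cdf >= 1e-4:
                  return "Red"
              if delta_cdf >= 1e-5:
                  return "Yellow"
              if delta_cdf >= 1e-6:
                  return "White"
              return "Green"

          # Large LOCA, 4-of-4 accumulators required, one lost, no recovery.
          print(sequence_color("G", [mitigation_credit(3, 4)]))   # Green
          # Same condition for more than 30 days (letter F), as Dr. Bonaca noted.
          print(sequence_color("F", [mitigation_credit(3, 4)]))   # White
          # "If you were F and you had two ... you would be Green."
          print(sequence_color("F", [2]))                         # Green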
25              DR. APOSTOLAKIS:  Something which may be obvious
.                                                               126
 1    to everyone, here you seem to be making decisions using the
 2    PRA results in some way.  You are going to severe accident
 3    stuff.
 4              MR. COE:  Yes.
 5              DR. APOSTOLAKIS:  It is not just design basis,
 6    right?
 7              MR. COE:  Right.  We reflect the dominant
 8    accident sequences.
 9              DR. APOSTOLAKIS:  Changing the rules?
10              DR. BONACA:  Wait a minute.  Why are you saying
11    severe accidents?
12              DR. APOSTOLAKIS:  They are looking at PRA --
13              DR. BONACA:  -- insights.
14              DR. APOSTOLAKIS:  Yes.  I mean from the design
15    basis point of view, did you really look at multiple
16    failures?  You did not.  Just one.
17              MR. COE:  We are essentially trying to establish
18    the significance.
19              DR. BONACA:  The significance, the safety
20    significance --
21              MR. COE:  Of a finding, okay?
22              DR. BONACA:  -- of a PRA, a PRA reading of the safety
23    significance of the condition.
24              DR. APOSTOLAKIS:  No, but would there be a case
25    where from the design basis perspective there is no problem
.                                                               127
 1    but from the PRA perspective there is?
 2              MR. COE:  Yes, and if that condition were to
 3    arise, and we identified that the licensee's performance was
 4    deficient in some fashion, even though they were meeting
 5    their design basis, we may conclude that there was a change
 6    in core damage frequency sufficient to warrant our action.
 7              MR. GILLESPIE:  George, we had a specific example
 8    of flooding at Sequoyah.
 9              DR. APOSTOLAKIS:  Yes?
10              MR. GILLESPIE:  Where the water in the whole site
11    drained downhill and they had to put a transformer building
12    right where all the water drained, and the water during a
13    hundred year flood would bubble up out of the drain instead
14    of draining down it, and that was considered a White finding
15    and there was an NOV issued against the maintenance rule on
16    it and it was not covered in their design basis
17    documentation at all.
18              DR. APOSTOLAKIS:  So does this imply now that if
19    the licensee volunteers to accept this system that licensee
20    is saying that it is willing to be regulated by things that
21    go beyond their licensing basis?
22              MR. COE:  No.  We do not, absolutely do not, say
23    that the licensee has to do anything but meet our
24    requirements.  However, the actions that we take, the level
25    of management interaction that we engage in with the
.                                                               128
 1    licensee, is going to be informed by this yardstick.
 2              DR. SHACK:  He violated a tech spec.
 3              DR. BONACA:  Yes, he violated a tech spec.
 4              MR. COE:  Well, in this particular example.  I
 5    think George is asking a larger question.
 6              MR. GILLESPIE:  What we did, George, on that --
 7    that was a big question early on.  If we find something that
 8    is not covered in the design basis but we find it is
 9    risk-significant, then we are going to call it according to
10    the SDP and call it what it is, and we will write it up that
11    way, but the resolution of it will be different.
12              The licensee could either choose to fix it -- I
13    mean literally voluntarily -- or we have a backfit decision
14    to make and under the backfit rule --
15              DR. SHACK:  How would you get into this process if
16    he hadn't done something wrong by your licensing basis?
17              MR. COE:  Actually, the entry condition is not
18    tied only to a violation of regulatory requirements.  If a
19    licensee -- if we evaluate that a licensee has had some kind
20    of deficient performance that has caused a change in core
21    damage frequency evaluated through our SDP process to be
22    greater than 1E to the minus 6 per year, then that becomes a
23    greater than Green issue, and we can process that issue.
24              As Frank just noted, we had one issue where that
25    was pretty much the case.
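              [Editor's illustration:  a one-function Python sketch of the
      entry condition Mr. Coe states, assuming it reduces to "an identified
      performance deficiency whose evaluated change in core damage
      frequency exceeds 1E-6 per year is processed as greater than Green."]

          GREATER_THAN_GREEN = 1e-6     # delta CDF per year, from the discussion

          def processable_issue(deficiency_identified, delta_cdf_per_year):
              """Greater-than-Green entry test as described above."""
              return deficiency_identified and delta_cdf_per_year > GREATER_THAN_GREEN

          print(processable_issue(True, 5e-6))    # True
          print(processable_issue(False, 5e-6))   # False: no deficiency, baseline risk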
.                                                               129
 1              CHAIRMAN SIEBER:  In fact, it doesn't even have to
 2    be a performance issue.  It could be a condition that was
 3    discovered, for example, by a resident inspector.  Is that
 4    not true?
 5              MR. COE:  Well, actually, we have imposed the
 6    requirement that the Staff should be capable of articulating
 7    what the performance deficiency is as an input to a
 8    significance determination process because this question
 9    arises.  A piece of equipment fails and the licensee has
10    done everything correctly.  Their QA program has worked
11    exactly the way they anticipated that it would.  It was just
12    one of those things -- if you will excuse the expression --
13    a "random failure" -- okay?
14              Under those conditions if we cannot identify a
15    performance deficiency, then what the program says is that
16    this is just part of the baseline risk of a large, complex
17    industrial facility that will sometimes have components that
18    fail.
19              CHAIRMAN SIEBER:  I am thinking more of a design
20    defect that deviates from, for example, the construction
21    code that went undiscovered during the construction process
22    and the preservice testing, surveillance tests, and so
23    forth, did not reveal its existence until somebody began to
24    think about it and said oh-oh, this probably wouldn't work
25    under these kinds of conditions.
.                                                               130
 1              MR. COE:  And that is a good question and we have
 2    struggled with that one a little bit, because the question
 3    is has the opportunity arisen for the licensee to detect
 4    this and if they missed it, they had the opportunity and
 5    they missed it, you could call that a performance
 6    deficiency.
 7              On the other hand, we struggle a little bit
 8    with -- you know, we want licensees to be aggressive in
 9    finding these things too, and so we treat those very
10    carefully on a case-by-case basis so as hopefully not to
11    discourage licensee initiative.
12              CHAIRMAN SIEBER:  One example of that could be for
13    an MOV to fail under dynamic conditions, accident
14    conditions, because of high differentials or what have you.
15              DR. APOSTOLAKIS:  Is it true that we are pushing a
16    little bit the envelope of the licensing basis now?
17              MR. COE:  I want to say absolutely not, because we
18    have, as Frank mentioned, very clearly told the licensees we
19    will not engage in anything that would not conform to the
20    backfit process that we have established.
21              We have choices that we have to make about the
22    focus and the nature of our inspection, and those can be
23    done independently of whether the licensee is violating
24    anything or not.  If we find performance deficiencies that
25    rise to a level of significance that causes us to think
.                                                               131
 1    about focusing our inspection program differently, that is
 2    within our purview, but we will not impose requirements on
 3    the licensee in addition to those that are
 4    already required.
 5              If a licensee chooses to correct performance
 6    deficiencies by making the changes that then lower that
 7    delta risk back to what we would consider the nominal level,
 8    great, and that is in fact what happened in the case that
 9    Frank mentioned.  If they choose not to, then we have a
10    choice to make.  Either it is significant enough that we do
11    a backfit analysis or if it is not we may have to drop it,
12    or we may choose to modify our assessment tools, our risk
13    assessment tools of that plant to now acknowledge that there
14    is a new additive value that is related to this issue that
15    they are not going to correct and that has now become part
16    of the baseline risk, and in the future they might have more
17    susceptibility to a higher significance issue.
18              DR. APOSTOLAKIS:  But in my mind this whole
19    process is a small but a very significant step towards
20    totally risk-informed regulation.
21              MR. COE:  So far I haven't drawn a strong
22    connection to it, George --
23              DR. APOSTOLAKIS:  Why?
24              MR. COE:  Why?
25              DR. APOSTOLAKIS:  You have sequences.  You have
.                                                               132
 1    the frequency of the initiator -- fundamental in your
 2    decision.
 3              MR. GILLESPIE:  This is where at some point the
 4    rules have to catch up with the oversight.
 5              DR. APOSTOLAKIS:  Exactly.
 6              MR. GILLESPIE:  The maintenance rule, because --
 7    maybe it is easier because it is a smaller piece, is kind of
 8    there -- A(4) is:  is the plant safe today, and it is kind of
 9    integrated in, and we have got some links.
10              DR. APOSTOLAKIS:  Right.
11              MR. GILLESPIE:  And it's kind of got risk and
12    guidance on how you apply it.
13              The other rules are not necessarily as in synch,
14    and that is the whole risk-informing Part 50 kind of thing,
15    and it's just got to catch up.
16              DR. APOSTOLAKIS:  A different topic.  This is
17    based entirely on the sequences and this is a perfectly good
18    thing to do.
19              I was wondering whether for some SSCs that are not
20    as big items as the accumulators, the importance measures
21    would have a role to play and maybe lead you to the colors
22    without -- especially if you have a component that appears
23    in, you know, 66 different sequences it would be a little
24    difficult to go through every one of them, but if you have
25    an importance measure that takes into account the
.                                                               133
 1    frequencies or the initiator and everything maybe that could
 2    play a role in the decision of whether it is White or Red.
 3              MR. COE:  You are quite right, and historically
 4    inspectors, as I said, having been given two weeks of training
 5    and thrown back out into the field and told to go do your best
 6    work, will typically gravitate toward the risk achievement worth
 7    and the other risk importance measures.
 8              What that doesn't do is give them a sense of why
 9    that is important and that is what we are trying to achieve
10    here.
11              Secondly, we have one other step to complete.  We
12    are in the process now of issuing these notebooks with the
13    best information that we have been able to glean from the
14    licensee.  There is one more step in our process, and that
15    is through the experience of using these and through a
16    systematic effort we hope to test the SDP worksheets against
17    the licensee models to assure ourselves that in fact the
18    dominating sequences that we're representing are in fact the
19    dominating sequences that the licensees believe that they
20    have at their plant and that there's some consistency at a
21    high level.
22              We acknowledge that there will not be necessarily
23    consistency at all levels but this is something that we
24    expect we will improve over time as we gain experience with
25    each one of these.
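              [Editor's illustration:  the conventional PRA definitions of
      the importance measures mentioned above, risk achievement worth and
      Fussell-Vesely, in a short Python sketch.  These are the standard
      textbook definitions, not formulas or values taken from the SDP
      notebooks; the numbers in the example are assumed.]

          def risk_achievement_worth(cdf_component_failed, cdf_baseline):
              """RAW: factor by which CDF rises if the component is assumed failed."""
              return cdf_component_failed / cdf_baseline

          def fussell_vesely(cdf_baseline, cdf_component_perfect):
              """FV: fraction of baseline CDF that involves the component's failure."""
              return (cdf_baseline - cdf_component_perfect) / cdf_baseline

          print(risk_achievement_worth(8e-5, 2e-5))   # 4.0
          print(fussell_vesely(2e-5, 1.8e-5))         # ~0.1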
.                                                               134
 1              DR. BONACA:  I have another question that I would
 2    like to ask you about just to understand the whole thought
 3    process of the inspection program.
 4              For example, you said -- you postulated there was
 5    a miscalibration of a level, and this was 90 hours, and it
 6    could have been over 30 days with so high a level
 7    change.
 8              Assume that you have discovered that this happened
 9    because the training was not that detailed and the operators
10    don't understand the importance of the accumulators.  They
11    use the water to wash their car in the yard or something --
12    I don't know -- so you discover that kind of issue.
13              You would still go to this tabulation and find the
14    Green.  How does it kick into the significance, to the
15    training program or whatever are the reasons which led to
16    this kind of condition?
17              MR. COE:  The agency has agreed that or has
18    formulated this whole program, the framework of the program,
19    such that Green is called the licensee response band.
20              For issues that arise and are of Green significance,
21    our responsibility under our program is to ensure that
22    the licensee has captured that issue in their corrective
23    action program.
24              We will not engage further in terms of root cause
25    analysis of the failure.  We have an opportunity to come in and
.                                                               135
 1    look at Green issues kind of in the aggregate during our
 2    annual PI&R, Problem Identification and Resolution
 3    inspection, so we have an opportunity to do that, but on a
 4    single issue basis, we would not go to the depth that you
 5    have suggested.
 6              DR. BONACA:  I mean if you really had an
 7    indication, that example I gave you, I mean the total
 8    failure of the training program or preparation of the people
 9    behind an operation, anything of the kind, wouldn't you have
10    a means for an inspector to flag it as an issue that has to
11    be pursued?
12              MR. GILLESPIE:  Yes.
13              DR. BONACA:  I am trying to understand how the
14    whole thing works together.  You know, you have a window on
15    the world here coming from these kinds of inspections and
16    also from the Significance Determination Process, and you should
17    not lose those insights.
18              MR. GILLESPIE:  One of the reasons we needed a year
19    to implement this was that people were telling us you could have
20    a site where absolutely everything is working perfectly and
21    is all Green, yet has a totally failed training program,
22    corrective action program and every other program.
23              Our premise was if those programs were totally
24    failed it would show up either in an SDP evaluation as a
25    White or in one of the PIs.  That is a premise we are
.                                                               136
 1    testing this first year, and since you said a total failure
 2    of the training program, do we really believe that a total
 3    failure of the training program for the trades at a reactor
 4    would only result in one miscalibrated accumulator --
 5              DR. BONACA:  I understand what you are saying
 6    and --
 7              MR. GILLESPIE:  We are testing that and that is
 8    something we might just have to agree to disagree with other
 9    people about, but we think it's going to show.
10              DR. BONACA:  I meant to say that wouldn't it be
11    prudent to have some process in the training that says if
12    something really significant transpires in the
13    evaluation and is not coming out, you know, that at the minimum
14    some note is made somewhere.
15              MR. BOYCE:  There is.  Let me try to address the
16    process a little bit.  In the inspection reports Frank had
17    alluded to, we have made a change in our threshold so that
18    you can document the sort of issue you described in the
19    inspection report, and even if the issue comes out as Green,
20    that is documented.
21              That means we don't take action on any individual
22    finding but what happens is that there is an annual look at
23    the problem identification and corrective action system of
24    the licensees, and they will go back through and look at all
25    the significance determination process findings and look for
.                                                               137
 1    those sorts of commonalities, so in that case you describe
 2    there could be an individual failure.  The guy took the
 3    garden hose out and watered his lawn, but is that pervasive?
 4              Now you would hopefully see that in your annual
 5    look.  That is captured in a document called "The Plant
 6    Issues Matrix."  That plant issues matrix is put up on the
 7    website for everybody to see, so that anybody can do the
 8    sort of review that I just described at any time.  It isn't
 9    just incumbent on the NRC to take a look at it.
10              DR. BONACA:  I mean, for example, for an event like
11    this the licensee would go back and try to understand in
12    fact if it's a violation of tech specs.  It would probably
13    be a Level 1.  He would perform a root cause, try to
14    understand it.
15              He may determine that there are some fundamental
16    issues there why this was lost.  I don't think that the NRC
17    process should be ignoring all those steps that we are
18    requiring of licensees.
19              MR. BOYCE:  Well, we are not.  We are hesitant
20    though to put our own label on the cause unless we know
21    exactly what it was and that sort of thing, but also in the
22    assessments that we do on a quarterly basis and for the
23    mid-cycle and end-of-cycle reviews, we go and look for those
24    sorts of issues.
25              If there are enough of them and they rise to a
.                                                               138
 1    level, that becomes a cross-cutting issue.  We are
 2    documenting those cross-cutting issues in our assessment
 3    letters that we send out at the mid-cycle and the
 4    end-of-cycle, at the end of those reviews.  We send them to
 5    licensees and we say we think we have got a human
 6    performance issue.
 7              Now the problem is that we are not quite sure what
 8    to do with that.  Once we say we think we have got a human
 9    performance issue based on these things it doesn't mean we
10    have a path to take definitive action.  We have identified
11    it or are still wrestling with "so what?"  So what if we
12    have a human performance problem here?  Does it show up in
13    the plant operation?  Does it show up in the performance
14    indicators?  We have got a whole working group trying to
15    take a look at these sorts of issues during the first year
16    of implementation.
17              We may come out with we may have a "so what" there
18    and take a hard action based on that, but right now it is
19    just a question we have been wrestling with since we started
20    implementing the program.
21              DR. BONACA:  My sense would be, for example, if
22    you go back now and you look at what the licensee did and
23    put in his Corrective Action Program and took proper
24    resolution and et cetera, it would become a non-issue.
25              MR. BOYCE:  That's right.  You would hope that if
.                                                               139
 1    all these things show up in the PIM and it's publicly
 2    available, that scrutiny would drive the utility into fixing
 3    it.
 4              DR. BONACA:  Okay.
 5              MR. BOYCE:  But that is not what we are regulating
 6    towards.  We are not using that as our tool.  That is just a
 7    means where we are trying to be transparent to the public.
 8              MR. GILLESPIE:  Let me ask Doug because he has got
 9    a second example here --
10              DR. LEITCH:  Let me ask a similar question.  What
11    about repetitive Green issues.  Do they ever become White
12    based on the repetitive nature of them or does that rather
13    point to a problem with the Corrective Action Program?
14              MR. BOYCE:  Well, I think it is more the latter,
15    because the SDP relies on the risk on each individual
16    finding and that becomes one of the problem identification
17    resolution type of issues that we would look at in the
18    annual look.
19              We would issue an inspection report that said you
20    have had three repetitive deficiencies in exactly the same
21    area, and that would be documented in our inspection report
22    which would show up in the assessment letter sent to each
23    licensee.
24              Again, it comes down to do we have a specific
25    process that acts on that information.  We don't have one at
.                                                               140
 1    this point -- it's just one of those issues we are wrestling
 2    with:  how to handle those cross-cutting issues.
 3              MR. COE:  I can go through the second example, but
 4    at a high level it is essentially the same process that I
 5    have just described here.
 6              It involves a much more substantial loss of
 6    function, in a system that has implications for a larger
 7    number of sequences, and so it is intended to give the
 9    audience a sense that there are some issues that are fairly
10    invasive in terms of the risk at the plant and affect a
11    number of things, and that the process of working through
12    the Phase 2 worksheets might be commensurately greater, but
13    the process ends up being the same.
14              If you look on the very last page of that
15    example --
16              MR. GILLESPIE:  Go to the highlight worksheets.
17              MR. COE:  At least get to the plant-specific one
18    and what it is applicable to.
19              DR. POWERS:  The examples you have chosen all
20    involve fairly non-controversial questions.  Is the train
21    there or is it not?  Is the system there or is it not?
22              MR. COE:  That's right.
23              DR. POWERS:  When I look at the significance
24    determination on the fire protection I find a more
25    subjective sort of decision that has to be made.  Is this
.                                                               141
 1    protection system degraded, and if so, is there a lot of
 2    degradation, a little bit of degradation, and how do I make
 3    that judgment?  How do I decide whether it is a lot or a
 4    little bit?
 5              MR. COE:  Right.
 6              DR. POWERS:  Because those judgments then are
 7    translated into numbers that turn out to be exponents in a
 8    risk metric and I have no idea where those exponents come
 9    from, but I do know that whether I decide it is medium or
10    low degradation makes a big difference on those numbers.
11              MR. COE:  Yes.
12              DR. POWERS:  I mean decades there.
13              MR. COE:  Yes.
14              DR. POWERS:  How do I do that?
15              MR. COE:  Well, for one, the very fact that the
16    analysis points to the things that are most influential is a
17    big feature, is a significant feature and an improvement
18    because it fosters communication and focus on those
19    particular aspects.
20              There's certain aspects that you will see in these
21    analyses that won't make a difference whether you judge it
22    to be this way or that way.  I mean they won't change the
23    final result necessarily.  However, there are some that
24    will, and those are the ones that get focused on most
25    intently.
.                                                               142
 1              The cases that you are describing, fire
 2    protection, we know we have some problems with guidance in
 3    there, some lack of guidance on how to be consistent in
 4    defining those levels of degradation, and in fact we are
 5    preparing a revision to that guidance right now that I hope
 6    to issue before the holidays that will provide additional
 7    guidance for certain of those characteristics in the fire
 8    protection SDP.
 9              Ultimately speaking, it boils down -- you're
10    right -- to a judgment, and what I stress when I talk to any
11    number of different audiences about the SDP process is that
12    this process does not produce, you know, numbers
13    automatically.  They are all the result of judgments that
14    are made along the way and what the process does is it helps
15    focus on those judgments that are most influential, and then
16    what happens is all of the -- well, primarily all of the
17    discussion focuses on the validity of those influential key
18    assumptions.
19              Sometimes it takes an engineering analysis to
20    discern whether the degradation was enough to warrant
21    changing the reliability value that it is given or its
22    availability value, and sometimes it just simply boils down
23    to a matter of judgment, but in all cases the Staff's
24    responsibility is to formulate the basis for that judgment
25    and how it relates to the degradation that they have chosen
.                                                               143
 1    and it is our challenge over time to make that process
 2    consistent.
 3              That's the best answer I can give you.
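              [Editor's illustration:  a small Python sketch of the point
      Dr. Powers raises, that a qualitative degradation judgment maps to an
      exponent and therefore moves the result by whole decades.  The
      mapping below is assumed for illustration; it is not the mapping in
      the fire protection SDP.]

          # Assumed mapping from a degradation judgment to decades of credit.
          DEGRADATION_EXPONENT = {"low": 2, "moderate": 1, "high": 0}

          def degraded_failure_probability(degradation):
              """Each step in the judgment shifts the failure probability by a decade."""
              return 10.0 ** (-DEGRADATION_EXPONENT[degradation])

          for level in ("low", "moderate", "high"):
              print(level, degraded_failure_probability(level))
          # low 0.01, moderate 0.1, high 1.0 -- a low-versus-moderate call
          # changes the contribution by a factor of ten, which is the concern.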
 4              DR. POWERS:  I mean, what you've said, I think, is
 5    that you're going to try to improve the guidance and at
 6    least we'll have this judgment that has the potential of
 7    being reasonably consistent.
 8              The next question is, where did the numbers come
 9    from?
10              MR. COE:  Where did the numbers come from?
11              DR. POWERS:  Yes.
12              MR. COE:  It's very simple:  a train is valued at
13    about a ten to the minus two failure per demand.  An
14    operator action under normal stress conditions, typically is
15    also valued at about a ten to the minus two.
16              DR. POWERS:  I was thinking more of the fire
17    protection numbers.
18              MR. COE:  In the fire protection arena, I
19    couldn't, off the top of my head, give you a good basis for
20    how those values were -- why those values were put in there
21    the way they were.
22              We have one individual on our staff that's been
23    deeply involved in formulating that SDP.  We've worked
24    closely with him, but he's the real expert, and he's the guy
25    that you really should probably have in front of you to
.                                                               144
 1    explain that.
 2              DR. POWERS:  It would be helpful if the
 3    documentation on this process explained those sorts of
 4    things.  It would inspire confidence.
 5              Because my looking at the fire protection SDP said
 6    this is totally subjective.  I mean, if I know what -- well,
 7    as soon as I figure out that these are exponents in a risk
 8    metric, I know what answer I want to get.
 9              And I go in and fill the matrix out.
10              MR. COE:  Well, we have to guard against --
11              DR. POWERS:  Even if there's a typographical error
12    in the thing and it takes forever to figure it out, but once
13    you figure that out, the typographical error, then you can
14    go through and I can get the answer that I want at the end.
15              MR. COE:  I can tell you from personal experience,
16    from serving on these panels that meet with regard to -- and
17    we've had a couple of fire protection issues -- that I have
18    not detected any intent at all to try to get a particular
19    answer.
20              I think what I have stated is, in fact, true, that
21    it focuses our attention on the things that are most
22    influential, and we have great discussions and great debates
23    about, you know, how that -- what we know to be true and
24    what we know we don't know, should be reflected in the value
25    of degradation that we choose, the exponent on those powers
.                                                               145
 1    of ten that we choose.
 2              So, you know, one of the objectives of the SDP was
 3    to foster greater communication, greater insight.  And I
 4    think because of what I have seen in these kinds of
 5    discussions that we have, I think it's achieving that.
 6              And as I said, our greater challenge over the long
 7    run is to make the process more consistent.  I would say
 8    that this process, although it's going to -- it has retained
 9    subjectivity and it will continue, I think, to have some
10    element of subjectivity in it, it is far less subjectivity
11    than what we had before this process existed, and that this
12    process is constraining our judgments and our subjectivity
13    to within the logical framework of a risk analysis
14    methodology.
15              So in that sense, I think we're on an improving
16    track.  I'd be happy to answer any of the questions about
17    this example or anything else.
18              DR. APOSTOLAKIS:  Are the SPAR models going to
19    play some role eventually?
20              Are the computerized SPAR models going to play
21    some role at some point after the inspectors become more
22    familiar with them?
23              MR. COE:  They already do, to some extent, those
24    that have been improved to the point where we've started to
25    get some confidence with them.
.                                                               146
 1              We have the Phase III part of our process which
 2    acknowledges that these worksheets have limitations and we
 3    may need to go beyond these to verify the results that we're
 4    getting or modify them.
 5              We've used licensee analyses.  We've interacted
 6    with the licensee, they have -- of course, they produced
 7    their own analyses.
 8              The short-term answer is that I would expect SPAR
 9    models to be used for Phase III analysis as time goes on,
10    more and more, as they become more available.
11              In the long run -- and now I'm speculating far out
12    -- is, maybe the SDP process, as it's currently provided in
13    Phase II, is just a stepping stone, and maybe ultimately
14    we'll have inspectors out there that have access and
15    training and the ability to appropriately utilize more
16    sophisticated tools.
17              But believe me, I can only tell you how pleased
18    that I've been that people are now starting to think in
19    these terms on a day-to-day basis, whereas in the past, they
20    simply turned to me or they turned to some analyst and said,
21    what's the number?  We're beyond that.
22              MR. GILLESPIE:  George, when you look at these
23    things, this is an obvious candidate for some kind of
24    computerization.  I mean, you're filling in tables and
25    answering questions with yes's and no's and check marks.
.                                                               147
 1              The value to this in our training phase, in the
 2    initial phase, was getting people to think logically and
 3    think in terms of sequences.
 4              Whether we'll get to the point where inspectors use
 5    SPAR models or not, will be driven by the practicality of
 6    it, and is it practical to train all of our inspectors to
 7    the level of training that if they can't see it on a check
 8    sheet, will we lose the understanding of that next inspector
 9    who didn't grow up with the program?
10              DR. WALLIS:  Well, you anticipated my question. 
11    It seems that all these check sheets could be put in a nice
12    little hand calculator.
13              MR. GILLESPIE:  Oh, yes, the Palm Pilot could take
14    care of this.
15              DR. WALLIS:  And certainly the younger people
16    might even be happier with that.
17              MR. GILLESPIE:  You can electronic-ize the check
18    sheets, but that's different than not having the questions
19    there to --
20              DR. WALLIS:  But then you can upgrade at any time.
21              MR. GILLESPIE:  And the question is, would we want
22    to upgrade?  I'm not sure.  The number of findings that go
23    beyond white are very limited when you look at all the
24    inspection we do.
25              And this shouldn't be surprising if you look at
.                                                               148
 1    the LER program.  Generally there will be 1200 findings, and
 2    there are only 12 or fewer that actually break the ten to the
 3    minus six kind of threshold.  I know that's a CDP threshold,
 4    but the fact that you have a lot of findings and only a few
 5    that pass this kind of risk screening, should not surprise
 6    us because we have years of evidence of that.
 7              And, therefore, do we want to upgrade every
 8    inspector to a computer analyst?
 9              DR. WALLIS:  The inspector can learn, too.  I
10    mean, if you have this little hand thing, you can say, oh,
11    I'm curious about what happens with this; I'll try it.
12              MR. GILLESPIE:  Oh, yes, yes.
13              CHAIRMAN SIEBER:  On the other hand, looking at
14    the manual sheets allows one to actually see what the
15    process is, and what the relationships are amongst the
16    parameters that are in there as opposed to answering yes and
17    no, and all of a sudden, it comes up yellow.
18              MR. COE:  We've started to explore the very
19    concept that you're discussing.  We've actually been
20    experimenting with some spreadsheets that can do this.
21              And the key, as I think you just noted, is that
22    you can't lose the intellectual engagement.  And
23    once you lose the intellectual engagement, we're back to
24    where we were five years ago.
25              MR. BOYCE:  The other thing I would add is, you
.                                                               149
 1    could be driven to a bottom-line number as your main
 2    criterion, and that's not really the point.  We're trying to
 3    stay within bands, so that we can make good regulatory
 4    decisions.
 5              DR. WALLIS:  It's always puzzled me as to why
 6    there is this aversion to having a number.
 7              MR. GILLESPIE:  What we are really avoiding is
 8    comparing bottom-line risk numbers,
 9    plant-to-plant-to-plant-to-plant.
10              MR. COE:  It's consumer confidence.
11              DR. WALLIS:  Every time you put in these
12    judgmental things, you're really fuzzying up the decision;
13    aren't you?
14              MR. COE:  But you're doing it in the daylight. 
15    You're no longer hiding the assumptions that are influential
16    in a model in some software somewhere.
17              DR. WALLIS:  That's right, but I get very worried
18    about the cop who stopped me and used a lot of fuzzy logic
19    to tell whether I was going over the speed limit or not.
20              MR. GILLESPIE:  Let me take that to the next
21    point, because I don't want to lose sight of this:  Once you
22    get through, the inspector gets through Phase II, and if he
23    comes up with a white finding, it right now gets reviewed by
24    a panel, a separate panel.
25              The licensee then gets a proposed finding of
.                                                               150
 1    white.  The licensee then gets an opportunity to see what
 2    your logic was, to see the writeup of the detail that's
 3    contained in these sheets, and, if he so wishes, to put in a
 4    counter argument which now focuses him in on the specifics
 5    of what was wrong.
 6              This includes fire protection.  It may be
 7    subjective, but it is within the system of checks and
 8    balances, so that the regulator isn't unilaterally subjective.
 9              DR. WALLIS:  So it allows argument.
10              MR. GILLESPIE:  It allows for structured argument. 
11    This sets up the structure.  What do you disagree with in my
12    analysis?
13              DR. WALLIS:  This was designed by lawyers, rather
14    than engineers.
15              MR. GILLESPIE:  Well -- so there is that element
16    of checks and balances, and, in fact, once the proposed one
17    goes out and then there's some give-and-take, and normally
18    there would be a regulatory conference on it where the facts
19    would get on the table, then even when the final one goes
20    out, if there is an enforcement attached to it, there is yet
21    another opportunity to rebut it.
22              So the checks and balances on ourselves and on the
23    system is, in fact, with the licensee, but this gives us a
24    structure to argue over the points.  And, in fact, when you
25    get to a Phase III, which is a phase Doug didn't talk about,
.                                                               151
 1    when you're beyond the check sheets, and if a licensee
 2    challenges it, then you're in Phase III.  You've got the
 3    analysts involved with their analysts.
 4              In fact, then you're arguing over things like
 5    assumptions on human reliability and other assumptions.
 6              DR. WALLIS:  But if you had -- you wouldn't need the
 7    inspector at all.  All the licensee needs to do is go
 8    through the same thing.
 9              MR. BOYCE:  Licensees, of course, have their own
10    models and can do that, yes.
11              MR. COE:  And they should be.  I think that's part
12    of the communication process that we'd like to foster.
13              But it's not sufficient for just the licensee to
14    do it.  We need intelligent inspectors that are out there,
15    not only assessing the significance of their finding, but as
16    I said earlier, risk-informing their approach so that they
17    are more likely to find the bigger issues that are out
18    there.
19              DR. LEITCH:  How do you deal with an issue such as
20    an inspector finds a locked high-rad door unlocked?  Is
21    there a similar -- I mean, how do you deal with that?
22              MR. COE:  There is a significance determination
23    process.
24              DR. LEITCH:  There is a significance
25    determination?
.                                                               152
 1              MR. COE:  Yes, there is.  There's a significance
 2    determination process for each of the cornerstones, and that
 3    particular cornerstone would be radiation protection,
 4    occupational safety cornerstone.  And I believe that locked
 5    doors and such are reflected in that SDP.  I don't have it
 6    with me.
 7              MR. GILLESPIE:  Philosophically, though, all of
 8    those follow the same kind of barrier approach.  The Table II that
 9    Doug put up, number of recovery systems, if you find an
10    unlocked door in a high-rad area, did it lead to an
11    overexposure, yes or no?  It's got those kinds of questions.
12              Was there a training failure?  Why was it
13    unlocked?  Was it loss of control of all the keys?  Did
14    someone just leave it behind?
15              So, it's got the same Phase I, Phase II kind of
16    approach, which is looking for kind of barriers of
17    protection.  Did another barrier of protection that would
18    have prevented overexposure also break down?  Was there a
19    loss of control of the keys besides one high-rad area
20    unlocked?
21              Was there a problem in the training program so
22    that you've lost two barriers of protection?
23              So, philosophically, that was used in all the
24    areas.
25              DR. LEITCH:  Okay, thanks.
.                                                               153
 1              DR. APOSTOLAKIS:  One other question:  The various
 2    colors, yellow, white, under various -- in different
 3    contexts, especially when it comes to the action matrix, do
 4    they all represent the same kind of core damage frequency
 5    change, or sometimes they do and sometimes they don't?
 6              MR. COE:  Well, we can't relate some of the
 7    cornerstone colors to core damage frequency, so what we did
 8    is, we tried to make sure that our level of response for
 9    that particular issue would be appropriate; in other words,
10    that it would be consistent with what we would expect to
11    respond to an issue of similar significance in the reactor
12    safety area.
13              In other words, there was no quantitative attempt
14    to link those, but there was a qualitative one.
15              DR. APOSTOLAKIS:  So if I look at the action
16    matrix, and it says, you know, when you have two yellows, do
17    this, but if you have one yellow and a red, do something
18    else, that the fundamental basis for that is really the
19    judgment of the people who developed it, in looking at what
20    red might mean and so on and say, well, gee, this is the
21    appropriate level of response?
22              I shouldn't be looking for more quantitative
23    justification?
24              MR. GILLESPIE:  No.  In initiating events and
25    mitigating systems -- the mitigation cornerstone and the
.                                                               154
 1    initiating cornerstone, which is really reactor safety that
 2    Doug's been talking about, there is some sense of
 3    equivalency between PIs and inspection, because we could
 4    deal with it there.
 5              But in radiation protection, we really had to look
 6    at the data.  It's occupational protection we're really
 7    worried about there.
 8              MR. COE:  And emergency planning.
 9              MR. GILLESPIE:  And emergency planning and EP. 
10    You can't relate those to a delta CDF or a LERF number.  You
11    really kind of conceptually have a defense-in-depth, and so
12    in EP, you know, there's a training PI.  Everyone has to
13    have practiced in something, and 80 percent within every two
14    years.
15              So we're kind of -- and then there's the siren, so
16    we're thinking more in terms of number of -- equivalent
17    numbers of barriers to not -- to do the function that needs
18    to be performed.
19              So at that point, it's strictly subjective, but
20    traditionally, our reaction, a yellow reaction to EP should
21    bear some resemblance to the Agency's reaction to a yellow
22    in reactor safety.  So it was by reaction.
23              MR. COE:  These SDPs were created by a group of
24    people with industry participation, public participation,
25    and so it's got -- there's a lot of thinking that's gone
.                                                               155
 1    into these.
 2              Not that they are perfect; as we gain more
 3    experience, we do find that we need to make adjustments, and
 4    we continue to do that.  It should be noted that, with all
 5    the hassle you're hearing about in the press about PIs and
 6    the comments about PIs from the industry, everything that
 7    we've put in place for the first year of implementation was
 8    a 100 percent consensus between industry and NRC.
 9              It's kind of an interesting concept.  And, in
10    fact, they're critiquing something they have already agreed
11    on, which is okay, but there's a certain perspective.
12              The regulator didn't just think all this up, and
13    we weren't necessarily smart enough, just ourselves, to put
14    it in place.  It was actually a very tortuous
15    consensus-building process to get there.
16              And sometimes that gets lost in the mix, as if,
17    well, the NRC just said we had to do it this way -- but NEI
18    took their 80 percent vote and all that kind of stuff that
19    they go through to say this is a consensus.
20              DR. LEITCH:  I was just curious.  I think that at
21    the outset, you mentioned that during the pilot program,
22    what I think you termed a fatal flaw was discovered at Quad
23    Cities.  Could you say a word or two more about that?
24              MR. GILLESPIE:  Yes.  Physical security was
25    unique.  We actually all agreed with industry on this SDP
.                                                               156
 1    for physical security.  And what we did was, if you ran an
 2    exercise and the adversary was successful in getting to a
 3    certain point, you asked the question, what equipment did
 4    the adversary get to and compromise, and you took that
 5    equipment and you went through the SDP -- really, the one
 6    that Doug has here.
 7              In essence, you did that.  What we forgot to stay
 8    cognizant of is that success in the physical security area
 9    is actually, in an adversary attack, having at least one
10    train left.
11              Well, if I go into the SDP with only one train of
12    cooling left to prevent core damage, which is the success
13    criterion of the exercise, I am very far to the right to
14    start with.
15              And we kind of didn't recognize that ourselves,
16    that that's the design criterion for a physical security
17    system.  And so the design criterion, when put into this
18    process, would already be considered extreme.
19              So, when we tried to apply it, it didn't work.
20    And it was inappropriate to the level of reaction and
21    corrective action we'd be looking for.  And none of us saw
22    it, us or industry, when we came to a consensus on how to
23    apply this.  And that was the fatal flaw.
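[The mismatch described here can be summarized in a short sketch: if significance is graded by how many mitigation trains remain, and the security exercise's own success criterion is exactly one surviving train, then even a successful exercise enters the process at its most severe starting point.  The grading below is an assumption for illustration, not the actual SDP worksheet.]

# Minimal sketch of the fatal flaw described above, with assumed
# significance bands.  Fewer remaining mitigation trains -> higher
# significance, so an exercise that succeeds by its own design
# criterion (one train of cooling left) still screens far to the
# right.

def remaining_train_significance(trains_remaining):
    """Assumed grading by surviving mitigation trains."""
    if trains_remaining >= 3:
        return "green"
    if trains_remaining == 2:
        return "white"
    return "red"  # one train or none left: the extreme end of the scale

if __name__ == "__main__":
    # A "successful" security exercise, by design, leaves one train.
    print(remaining_train_significance(trains_remaining=1))  # -> red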
24              So we've got kind of another, more subjective
25    scale, quite honestly, up with the Commission, and we're
.                                                               157
 1    telling the Commission, on a temporary basis, we think this
 2    makes sense, and then we need to step back and figure out,
 3    is there a better way to do it in the long run?
 4              CHAIRMAN SIEBER:  If there are no further
 5    questions, I would point out that this afternoon's meeting
 6    will start on time at 1:00, and I would like to thank the
 7    Staff for responding to a last-minute, total readjustment of
 8    their -- well, usually it happens sort of randomly, as
 9    opposed to structurally.
10              On the other hand, the presentation was very good. 
11    Do you expect a letter from us?
12              MR. GILLESPIE:  No.  At this point, in fact, this
13    was kind of a status briefing.
14              CHAIRMAN SIEBER:  Right.
15              MR. GILLESPIE:  I'm going to expect that -- we're
16    holding a major public/private workshop, an internal
17    workshop, in the January/February timeframe, and then a
18    major lessons-learned public workshop at the end of March.
19              And at that time, we're going to have this massive
20    evaluation document.  It's massive, if you look at all the
21    parameters we're looking at.
22              At that time, it may be appropriate to come back
23    and say, here's our picture of the first year.  Here are the
24    major things that we need to work on.  Were there any fatal
25    flaws like the one in security that jumped out, and which
.                                                               158
 1    ones we're doing first and why.
 2              So I would say we'd probably be coming back, maybe
 3    getting back to the Subcommittee and then the Full
 4    Committee, in kind of the April timeframe.
 5              CHAIRMAN SIEBER:  Right.
 6              MR. GILLESPIE:  That's a precursor to a report we
 7    owe the Commission in June on the evaluation of the first
 8    year, so I would suggest April/May-ish, maybe May, so we can
 9    get all our ducks in a row.  And then we'd probably be
10    looking for a letter to be a precursor to the June
11    Commission meeting.
12              CHAIRMAN SIEBER:  Okay, thank you.  Again, we
13    appreciate your presentation, the time you took, and your
14    versatility.
15              MR. COE:  It's our pleasure.
16              CHAIRMAN SIEBER:  So at this time, I'd like to
17    adjourn the Subcommittee meeting.
18              [Whereupon, at 12:10 p.m., the meeting was
19    adjourned.]