
Official Transcript of Proceedings

NUCLEAR REGULATORY COMMISSION

Title: Results of Reactor Oversight Process
Initial Implementation

Docket Number: (not applicable)

Location: Rockville, Maryland

Date: Friday, July 20, 2001



Pages 1-133


UNITED STATES OF AMERICA

+ + + + +

NUCLEAR REGULATORY COMMISSION

+ + + + +

BRIEFING ON
RESULTS OF REACTOR OVERSIGHT PROCESS
INITIAL IMPLEMENTATION

+ + + + +

FRIDAY, JULY 20, 2001

+ + + + +

ROCKVILLE, MARYLAND

+ + + + +


		The briefing was held at the Nuclear Regulatory Commission, One
White Flint North, Room 1F16, 11555 Rockville Pike, at 9:30 a.m., Richard A.
Meserve, Chairman, presiding.
PRESENT:
	RICHARD A. MESERVE, Chairman
	GRETA JOY DICUS, Commissioner
	EDWARD McGAFFIGAN, JR., Commissioner
	JEFFREY S. MERRIFIELD, Commissioner



	C-O-N-T-E-N-T-S
                  AGENDA ITEM                   	PAGE
Panel 1 Presentation	4
Panel 2 Presentation	70
Questions and Answers	94
Adjourn	133

	P-R-O-C-E-E-D-I-N-G-S
	(9:29 a.m.)
		CHAIRMAN MESERVE:  Good morning.  On behalf of the Commission,
I'd like to welcome you all to today's briefing on the results of the Reactor
Oversight Process Initial Implementation.
		As I think you are all aware, the implementation of the ROP
occurred in April 2000, after the completion of a six-month pilot.  We
undertook -- we use the word "initial implementation" I think with the clear
intent that this would be a work in progress.  In fact, I think the first
year's effort has suggested that the revised oversight process has been
implemented in a fashion that went much more smoothly than I think any of us
would have anticipated.
		Nonetheless, there clearly are things that need to be examined, and
we committed at the outset that we would do so.  And there are clearly some
things that we need to consider changing.
		This morning's briefing is a follow-on, obviously, from our
briefing yesterday about the outcomes with regard to plants arising from the
first year's implementation.  Today's meeting is a focused examination on the
process itself and what the evaluation of it has been and what changes we ought
to consider.
		With that, why don't we -- let me see if any of my colleagues
have an opening statement.
		COMMISSIONER MERRIFIELD:  Yes.  Mr. Chairman, I just want to make
a comment.  I know obviously we are pleased with the success that we've had so
far in our efforts to implement this new process.  There's a lot of people who
have taken a large part in making that happen.  Obviously, our staff are very
notable in that respect.  
		We have others today from the industry, individuals who have been
very active, and also Ray Shadis has made a significant time commitment on his
part.  A lot of people are providing a lot of help, and I just wanted to
recognize that.
		Thank you, Mr. Chairman.
		CHAIRMAN MESERVE:  Dr. Travers, would you  like to proceed?
		DR. TRAVERS:  Thank you, Chairman.  I think you've set the stage
for our presentation today.  I'll just note quickly that we have been involved
in a number of important initiatives the last several years, and certainly the
new reactor oversight program has been one of the most significant efforts.
		From its inception through the first year of initial
implementation, the program really has benefitted from significant and
extensive internal and external involvement.  Today we will provide a summary
of our experience over the past year and highlight some of the most significant
issues that have been identified, and, accordingly, the challenges we face
going forward.
		As you have pointed out, Mr. Chairman, an element or a hallmark
of the program really is this idea of continual self-assessment and
improvement, and certainly we expect that to continue as we move on in the
program.  And there are a number of processes that we expect will act to help
to facilitate that.
		Today's briefing represents the culmination of a tremendous
amount of staff effort that has been recognized by you, and, as I indicated, in
no small measure been affected by the frequent and numerous interactions we've
had with external stakeholders throughout the development and this first year
of implementation.  And you're going to be hearing directly from some of those
stakeholders in just a moment.
		At the table with me today are Bill Kane, my Deputy for Reactor
Programs; Jon Johnson and Mike Johnson from the Office of Nuclear Reactor
Regulation; Ellis is here, Ellis Merschoff from Region IV.  I should also point
out that in your second panel, Loren Plisco, who is the Chair of the Initial
Implementation Evaluation Panel, is going to be with you and giving you some
information on that panel.
		Lastly, let me just briefly note that Tony McMurtray, a senior
resident who is one of those on the staff who is responsible for implementing
this new reactor oversight process on a day-to-day basis onsite is here in the
audience as well, in the gallery, and available to answer questions if you have
them.
		And with that, let me turn the briefing over to Jon.
		MR. JON JOHNSON:  Thanks, Bill.
		Good morning, Chairman, Commissioners.  The purpose of today's
briefing is to discuss the results of the initial implementation of the
oversight process.  There's been a tremendous amount of effort and coordination
by the NRR Inspection Program Branch staff to get here.  This staff was
formerly led by Bill Dean and now by Michael Johnson, and both the NRR
executive team and the regional management team have had confidence in this
transition.  It has gone smoothly.
		I'd be remiss if I didn't recognize the regional office staff and
management.  They have worked hand in hand with NRR in this process to make it
work.
		I'd also like to point out the supporting efforts of the Office
of Research.  They have provided some of the fundamental bases for the
performance indicator work and the risk determination process, and these
efforts are continuing.
		The reactor oversight process is a living program.  It's not
static.  We've learned a lot, but we can continue to make improvements.  Some
of the issues we've identified are problems with timeliness in the SDP process. 
We also have some issues with the guidance for what to document in inspection
reports.
		This has presented a dilemma.  We focus on risk-significant issues
in the inspection reports, but it raises a question as to what types of
cross-cutting issues -- and other minor issues -- we could put in
inspection reports.
		Michael Johnson will discuss the major topics in the Commission
paper.  These include -- could I have slide 2, please?  These include feedback
from internal and external stakeholders, the overall results and lessons
learned, and a discussion about resources, what kind of resources it took to
implement this program.
		Michael?
		MR. MICHAEL JOHNSON:  Thank you, Jon.
		Good morning, Chairman, Commissioners.  
		Can I have the next slide, please?  
		First, by way of background, let me just remind us that we have
traveled a tremendous distance in the last two or three years from the concept
development, through a pilot test, and through successful completion of the
first year of initial implementation.
		Next slide, please.  
		In addition to taking on the substantial task of implementing the
ROP at all of our operating reactors, we conducted numerous activities to
interface with both internal and external stakeholders and to evaluate our
activities through a self-assessment process, to identify out-of-tolerance
conditions, and to be able to take action based on those conditions.
		Through activities such as weekly conference calls with the
division directors, visits to sites, monthly NRC industry working group
meetings, and the Federal Register notice, we collected feedback from our
internal and external stakeholders.  In addition, from early in the concept
development, in the pilot, we established a set of criteria, measures and
criteria, and we used those measures and criteria to evaluate the effectiveness
of the pilot program.  
		We continued those -- that concept.  We developed measures and
criteria for initial implementation, and we used those in the self-assessment
process.
		We, at the direction of the Commission, established a FACA panel,
the IIEP, to provide oversight.  And, in addition, we briefed the ACRS in a
number of briefings to provide them the opportunity to be aware of where we
were with respect to the oversight process.
		And so I'll discuss the results from the feedback on
self-assessment activities in a few minutes, but first let me highlight the
overall results.
		Next slide, please.
		At the start of initial implementation, we had had the
opportunity to pilot -- pilot test several aspects of the program, although we
hadn't had an opportunity to test all of the aspects of that program.  And, in
fact, there were still a number of the staff who had not had an opportunity to
directly implement the program at the start of initial implementation.
		Since then, we've come a great ways.  We've exercised almost a
full range of the process, and in doing that we've learned valuable lessons
about the process.  For example, with respect to IP-2, we exercised for the
first time the action matrix for a plant that was in the multiple repetitive
degraded cornerstone column.  We conducted the 95003 inspection procedure and
learned lessons.
		IP-2 taught us what we knew, what we already knew, and that is no
matter what oversight process you have, if you have a plant that has
significant performance problems, it's going to take extensive resources,
direct inspection resources, to follow up those issues.  It's going to take
extensive other direct resources, such as inspection-related travel and
interface with external stakeholders.  So we learned lessons based on IP-2.
		With respect to Kewaunee, for example, Kewaunee taught us
valuable lessons about what the program provides with respect to what we will
do if we do a supplemental inspection and find that the licensee hasn't taken
actions that are appropriate in our view to address significant performance
issues.  And so we went back and looked at the procedures that we had in place,
and we strengthened those procedures.
		So, and I could give you other examples, but the point I'm trying
to make is in each case, in every case, we didn't wait.  We fixed the program,
and we went forward.  Most of us believe the program represents a significant
improvement over the previous process and that the program will achieve and has
achieved the goals that -- the Commission's goals with respect to the ROP.
		And so we're not at a point where we're asking, can the process
work?  But we're asking, how can we make the process work better?
		Next slide, please.
		The next slide -- and, in fact, the next three slides -- I won't
spend much time on them at all.  They simply convey the results, first of all,
of the inspection findings across thresholds.  You can see that we had findings
across the -- of varying significance across thresholds.
		Next slide, please.
		Also, with respect to the performance indicators, we have
performance indicators that obviously cross thresholds. 
		Next slide.
		And, finally, with respect to the action matrix, there were
concerns at the beginning of the program that the program wouldn't be
responsive to differing levels of performance.  And you can see, based on the
action matrix results, that, in fact, we did have plants whose performance fell
not just in the licensee response column but also in other columns of
the action matrix.  So the program, in fact, was responsive to differing levels
of performance.
		Next slide, please.
		I'd like to shift gears slightly to focus briefly on the feedback
that we got from stakeholders and the self-assessment metrics.  First of all,
with respect to internal stakeholders and what they told us about the oversight
process, as I indicated, we conducted a variety of activities to get their
insights.  We got consistent results based on those activities.
		So let me just focus in on the survey, because it provides sort
of an illustration of what we found.  The survey was generally positive and
dramatically so.  For example, 68 percent of the staff agreed, and an
additional 20 percent strongly agreed, that the program provides assurance that
plants are operated safely.  We had similar results with respect to whether the
program was objective and risk-informed and an improvement over the previous
process, and many other areas.
		In addition, the survey demonstrates that we made progress in
many areas from the previous survey that was conducted in 1999.  One of the
things that concerned us following that 1999 survey was that only 24 percent of
the staff believed that the program had the ability to provide an indication of
declining performance before there were significant degradations in
performance.
		That has doubled based on the percentages that came back in this
most recent survey, and we made significant gains in other areas.  For example,
in 1999, 41 percent of the staff believed that the program provided appropriate
attention on performance issues.  That's up to 74 percent based on this most
recent survey.  So, again, we believe that the survey demonstrates that we made
significant progress.  
		Finally, despite the positive view from the survey, the survey
really did point to areas that we need to improve on.  We'll talk about them
more in a minute.  They talk to ease of use of the SDP and the timely handling
of feedback -- internal feedback basically and how we were -- how timely we
were in dealing with that particular feedback.
		Next slide, please.
		Again, I won't spend much time on this next slide.  You'll hear
firsthand from external stakeholders regarding their views.  From our
perspective, the majority of the feedback was positive.  However, as would be
expected, external stakeholders identified areas that we know we need to work
on, specifically the performance indicator refinement and the SDP.  And I'll
talk, again, more about that in a few minutes.
		Next slide, please.
		As I indicated earlier, we established a systematic approach to
objectively measure the ROP through a process, a metric process if you will. 
We looked at the NRC's four performance goals.  But in addition to that we
looked at goals that we had established for the process with respect to whether
it is understandable and objective, risk-informed and predictable.
		We used agency data, data from RPS, data as a result of audits
conducted by NRR, but also by Research's Operating Experience Risk Analysis
Branch, and we folded in feedback from external and internal stakeholders into
the metrics to be able to populate those metrics to provide insights to us
regarding the effectiveness of the program and meeting those goals.
		The current results were factored into the ROP assessment, and we
continued to refine that self-assessment metrics process to make it better.
		Next slide, please.
		We used the feedback and the insights from the metrics to
identify improvements in each of the major areas of the ROP.  I will discuss
those -- each of those areas very briefly hopefully.  I'm going to focus in on
successes, and then I'll talk about the improvement areas.
		With respect to the inspection program, although we recognize
that we need to continue to evaluate the quality of inspections that are done,
we believe that the inspection program has identified significant safety issues
and provides an improved focus on risk-significant areas.
		In addition, we think it's a significant accomplishment that
despite all of the challenges that we have with respect to startup, we
were, with very few minor exceptions, able to complete the inspection program
this first year.
		Having said that, there are improvement areas.  During initial
implementation, we changed the guidance to clarify the thresholds for
documentation and our expectations for documentation of the significance of
findings in inspection reports.  Having said that, we still find areas where we
know we need to continue to improve with respect to how well we document our
rationale for the significance of findings in inspection reports.
		In addition to that, we've identified several inspection
procedures that we know we need to make changes to.  For example, with respect
to the maintenance rule inspection procedure, we've been told and recognize
that the maintenance rule inspection procedure is too frequent.  
		It causes us to focus in programmatic areas that are not
risk-informed, and it focuses us on licensee implementation of the maintenance
rule and not necessarily on the effectiveness of maintenance activities.  And
so we will take that issue on, and I'll talk about how.
		Next slide, please.
		Among the actions that we're going to take based on those
improvement areas, we plan to continue evaluating and revising Inspection
Manual Chapter 0610, the guidance for inspection report documentation.  In
addition, we're issuing a newsletter shortly to provide examples of findings
that are correctly documented to help the staff understand the expectations
with respect to documentation.
		As I indicated, we are making significant changes to some
procedures.  For example, we're revising the ISI procedure based on lessons
learned from the IP-2 steam generator tube rupture.  In addition, we are
revising the PIR inspection procedure, the problem identification and
resolution inspection procedure, to make it more effective.
		Next slide, please.
		With respect to the performance indicators, as an interesting
success, despite concerns that the performance indicators would result in
potential unintended consequences, we found an area, at least one area in the
performance indicators, that, in fact, we believe resulted in improving --
licensees improving their performance in an important area.  
		And I'm speaking specifically of the area of EP.  If you look at
the EP performance indicators, we've got an EP drill performance indicator, an
EP drill participation performance indicator.  If a licensee wants to improve
their performance in those areas, they have to run more drills, and they have
to do a better job at those particular drills. 
		And we found cases where licensees have, in fact, gone after
those improvements, and we think that benefits the performance in this area
that is particularly important.
		In addition, you no doubt remember that at the start of initial
implementation there were concerns, we had concerns, the external stakeholders
had concerns regarding the accuracy of PI reporting.  I'm happy to report that
those concerns regarding reporting accuracy were less than anticipated.  In
fact, in only two instances did we find that PIs as initially reported had to be
revised by a subsequent report, and those subsequent reports caused those PIs to
cross the threshold.  That's a success.
		With respect to improvement areas, we recognized -- in fact, the
industry pointed out prior to the start of initial implementation their
concerns regarding the SCRAM performance indicator.  We had concerns regarding
the unplanned power changes performance indicator -- again, both with respect
to potential for unintended consequences.
		Lastly, if you look at the safety system unavailability
performance indicator and the frequently asked questions, those frequently
asked questions are questions raised by licensees but also by internal
stakeholders regarding interpretation of those issues.  The single biggest area
by far of frequently asked questions deals with the safety system
unavailability indicator.
		The definition is complex.  There are differing applications
between INPO, WANO, the ROP PIs, the PRA application, and the maintenance rule,
and that causes some confusion and inefficiency.  So those are areas that we
need to work on.
		Next slide, please.
		To address those concerns, we piloted a replacement SCRAM
performance indicator.  That pilot has been completed.  We've had a number -- I
guess two meetings with the NRC industry working group to evaluate the results
of that pilot against preestablished criteria.  We're finalizing where we think
we ought to come out on that particular SCRAM indicator, and we'll be making
progress and resolving that as we go forward.
		With respect to the potential replacement for the unplanned
transients PI, we are making good progress on that.  I'm happy to report we
hope to have something that we can pilot in the near future.  And we've had a
number of meetings on the safety system unavailability performance indicator. 
And, again, we're making good progress I think in addressing the concerns
associated with that to identify a standard definition of unavailability.
		Next slide, please.
		We considered the significance determination process really to be
one of the ROP's most important achievements.  We believe it has enabled us to
separate those issues that are truly important from those that are not.  The
SDP has improved inspectors' awareness of plant-specific risk and enabled
licensees and us to focus on areas that are most -- of greatest significance.
		I should note that, as has been pointed out earlier, we have
received valuable assistance from the senior reactor analysts, from NRR's
Probabilistic Safety Assessment Branch, and also from Research's Operating
Experience Risk Analysis Branch, in implementing the SDP process.
		Despite the successes, we truly do have a concern with the
timeliness of the SDP.  In addition to that --
		Next slide, please.
		-- we recognize that there are several SDPs that we need to
improve.  For example, the fire protection -- with respect to the fire
protection SDP, the lack of written guidance for fire scenario development
requires extensive time by the SRAs and fire protection engineers to enable us
to be able to resolve those significance determination process issues.
		In addition, we're conducting benchmarking to ensure the accuracy
of the worksheets that are used in the SDP process.  We found some instances
where improvements are warranted.
		Next slide, please.
		To address those concerns, with respect to timeliness, we truly
do expect to do fewer Phase III evaluations because of the availability of
Phase II worksheets.  You'll remember the last time we talked we had a
significant number of those Phase II worksheets that we needed to get out.  We
are near complete with those Phase II worksheets.
		We are looking to improve the significance -- significance and
enforcement review panel process, the process that enables us to review and
arrive at the significance of the SDP issues.  And in addition to that, we are
working to put issues that potentially require some elevated attention into the
NRR process, the TIA process, task interface agreement process, to make sure
that we provide the visibility and the tracking to be able to resolve those
issues in a timely manner.
		However, I should point out that as we become more risk-informed
the SDP causes us to focus in on uncertainties.  There are influential
assumptions, and arriving at convergence on those important assumptions is
important to openness and defensibility of the process.  And so we really do
need to look at the goals that we have and make sure that those goals are
realistic and adjust them as appropriate.
		We are improving tools for assessing fire scenarios, as I
mentioned.  And we will continue to upgrade the Phase II notebooks as we go
forward.
		Next slide.
		I won't mention -- I won't spend time on this slide, except --
because many of the points on this slide are very similar to the SDP because
those processes are coordinated, are in sync if you will.  I will point out,
however, that as we -- when we went to implementation on the maintenance rule
we established maintenance rule effectiveness review panels to help ensure
consistency.
		We believe that we have been able to ensure that consistency, and
that the SDP process provides for appropriate consistency as we go forward. 
And so the Office of Enforcement plans to suspend maintenance rule panels.
		Next slide, please.
		With respect to the assessment program, the assessment program we
believe truly is more predictable.  That's a major achievement.  It's more
objective.  Subjectivity is not a central part of that process, and that was
one thing that we were really trying to go after.
		However, having said that, we did find we do have some concerns. 
A question has been raised regarding how we should deal with historical issues
that have significance but are not reflective of current performance.  
		We have an issue with respect to no color findings.  No color
findings are those findings that are greater than minor, but that you can't run
through an SDP and get a colorized result, and that aren't subject to
traditional enforcement.  And so they get documented as no color findings.  In
a process that is colorized, that potentially causes a concern because it
doesn't communicate the significance of those findings.
		Lastly, we have -- there is an issue that was raised by external
stakeholders and also by the IIEP that deals with the dwell time for inspection
findings.  We talked yesterday about inspection findings lasting four quarters. 
There is a question for us to consider.  Do we want to phase that dwell time
for inspection findings based on the significance?  Where a red finding would
last longer, for example, than a white finding potentially.
		Next slide, please.
		And so we are improving guidance regarding the treatment of
historical issues.  We want to reflect the significance of those historical
issues, but we also do not want to create a disincentive for the licensees to
go out and aggressively find those issues.
		We're working on -- we're evaluating a graded reset for
inspection findings, and we're developing program modifications to address no
color findings.
		Next slide, please.  Resource slide, please.
		We provided considerable attention to the area of resources
during the year of initial implementation.  We developed estimates based on
expert judgment in meetings that we had in headquarters and in the regions with
the regional division directors.
		We think we did a good job.  The actual expenditures compare
favorably with those estimates, and we believe they are generally appropriate.
		Expenditures -- and I should caveat my -- this next statement
with the statement that we -- it's problematic to compare the 52 weeks prior to
initial implementation from a resource perspective with the 52 weeks after
initial implementation.  Neither of those periods were standard or typical. 
And, in fact, the programs differ considerably from the old program to the new
program.
		But when you make that comparison, the expenditures that we used
for initial implementation were slightly greater than they were for the prior
program.
		Next slide, please.
		Although we believe it's premature to implement further
reductions in the program, as I've indicated, there are areas that we believe
can be targeted for future efficiencies in the ROP.  For example, we believe
that there are efficiencies with respect to the documentation that could be
achieved, for example, through implementation of quarterly inspection reports.
		I've talked about the SDP and our focus in those areas.  The
availability of Phase II worksheets we believe will result in some efficiencies.
		We are establishing a focus group to identify efficiencies, and
we'll modify the program to implement those efficiencies, balanced, of course,
with future challenges for the reactor oversight process.
	Jon?
		MR. JON JOHNSON:  Thank you, Michael.
		Slide 24, please.
		About a year and a half ago, the Commission approved the
transition in resident inspector staffing from what we call N+1 to N.  The
staff has evaluated this change and its effect on our ability to implement and
complete the baseline inspection program.
		We believe that the baseline program can be done with some
assistance from the region.  If there's a vacancy in a resident inspector position at a
site, we're definitely going to need to support that vacancy with assistance
from the region-based inspectors and project engineers.  What that means is
there's more travel time than we assumed in the planning -- planning for
inspections, the travel that it takes to go out to the site, and there is also an
added burden on the supervisors to keep track of making sure that there's a
qualified inspector on-site.
		The early indications are that this also will challenge the
training and rotations and professional development of the resident inspectors. 
And there's not quite as much flexibility as we thought the regional
administrators would have with the transition from N+1 to N -- those resources
would presumably go back to the regional office and provide the regional
administrator with more flexibility on inspection resources.
		But what we found was that when we did transition some of these
positions, it was due to promotions or attrition, and so the result was not
always a greater number of qualified inspectors in the region; instead, the
region would have to hire some new people.  And so then they would have to
go through a two-year training program to train up and qualify these
inspectors.
		The staff -- our staff plans to establish some criteria to look
at the allocation of resources, to take into account this conversion from N+1
to N, and also to look at some unique assignments where we would have different
types of technology at a site such as a PWR and a BWR.  It does take some extra
amount of resources to provide the training and requalification for the staff.
		We're also going to monitor parametrics, metrics such as
overtime, the amount of training opportunities that the resident inspectors
have, to make sure that they have the same opportunities that the other
inspectors have.
		Could I have Slide 26, please?
		In conclusion, we believe that the reactor oversight process has
met the goals that the Commission established.  We believe that the process is
more objective, that it's predictable, more understandable, and definitely more
risk-informed.
		We continue to learn and improve.  We are in transition now from
primarily developing the program to refining it and making improvements, and
we're going to continue to try to identify resource efficiencies.
		DR. TRAVERS:  Chairman, that completes the staff's presentation,
and we'll be happy to take your questions.
		CHAIRMAN MESERVE:  Thank you very much for a very helpful
presentation.  There's obviously an enormous amount of work that you have
undertaken in not only implementing the program but also in evaluating it.  And
as Commissioner Merrifield indicated, we appreciate the assistance of the
industry and of others -- Mr. Shadis -- for their participation in that effort.
		Let me turn to Commissioner Dicus to start the questioning.
		COMMISSIONER DICUS:  Okay.  Thank you.
		Let's go to Slide 9.  And refresh my memory on, when you're
talking about generally positive feedback -- and I appreciate the fact that
you've done this -- and I do agree, I think the program is successful and it --
but it is a work in progress.
		What percentage did you say was generally positive?
		MR. MICHAEL JOHNSON:  I gave an example, Commissioner, of
providing assurance that plants are operated safely; 68 percent of the staff
agreed.  An additional 20 percent strongly agreed.  And my statement was that
for those --
		COMMISSIONER DICUS:  An additional over the 68?
		MR. MICHAEL JOHNSON:  Yes.
		COMMISSIONER DICUS:  So it's 88 --
		MR. MICHAEL JOHNSON:  Yes.
		COMMISSIONER DICUS:  -- agreeing.
		MR. MICHAEL JOHNSON:  Absolutely.
		COMMISSIONER DICUS:  That's the number I'm trying to get to.
		MR. MICHAEL JOHNSON:  That's right.  And similar percentages for
the next two bullets and then some other areas.
		COMMISSIONER DICUS:  Okay.  Of the ones who disagree, other than
the SDP -- well, I shouldn't say SDP process -- but the SDP and the timely
handling of feedback, what were some of the other concerns?  And were these
resident inspectors?  I mean, what staff are we talking about?
		MR. MICHAEL JOHNSON:  Let me answer the second part of your
question first.
		COMMISSIONER DICUS:  Okay.
		MR. MICHAEL JOHNSON:  We sampled resident inspectors,
region-based inspectors.  We sampled folks who were involved in the regions and
in headquarters with implementation of the ROPs.  So there was a cross-section
of respondents.
		The reason why I highlight these points, the use of the SDP and
timely handling of feedback, is -- and I want to make this clear -- the
majority of respondents believe that the SDP was not easy to use, and the
percentages were 49 percent disagreed.  An additional 11 percent strongly
disagreed that the SDPs were easy to use, and that was not just the reactor
SDPs.  Those were also the non-reactor SDPs.
		So what I've done is highlight the two most prevalent areas of
concern.  The second area, this timely handling of feedback, we also had a
majority of respondents who believed that even though we solicited input from
them we did not do a good enough job in either turning that feedback around in
a timely manner or getting back to the staff on the results of that feedback.
		Those are the two most prevalent.  There were other areas, but
these were the ones that were -- that the majority disagreed with the program.
		COMMISSIONER DICUS:  Okay.  I've heard some concerns raised --
and it's the background of my question -- about -- and it may be in the timely
handling of feedback.  But that some concern raised at the regional level in
management, not necessarily RAs but in management, that we're somewhat more
limited in enforcement.  Did that feedback come back through, that maybe there
are issues that we find but we don't have an enforcement tool to deal with
them?
		MR. MERSCHOFF:  I can address that.  I wouldn't agree with that
statement, in that nothing in the new program has changed the regulations or
our approach in decisions on whether or not to enforce an issue.  So anything
that was a violation before this new program is a violation in the program.
		What has changed, of course, is how escalated enforcement is
dealt with, fines versus the colors, but we certainly don't feel that we're
limited in our ability to enforce issues.
		COMMISSIONER DICUS:  That has surfaced in some discussions, and
so that's good feedback that I have because that did concern me.  So you feel
that --
		MR. JON JOHNSON:  One thing, Commissioner, in terms of evaluating
the risk, we have -- even though the SDP process is more complex in a lot of
cases, we still have a tremendous amount of information that we're using now. 
And so we are able to look in some cases and look at the actual risk
significance of it.  
		So in the past where we may have had maybe a more severe type of
enforcement action, now for a similar case if using the PRA it actually shows
that this is not quite as risk significant, then the actual enforcement may not
seem as severe.  But Ellis is correct in terms of this.  This has not changed
any of our regulations.
		COMMISSIONER DICUS:  Okay.  Let me ask you, then, another
question.  For the -- some 12 percent who still have concerns, are you
comfortable that we have the framework in place to continue to address these
concerns and to evaluate any additional concerns that might be raised, other
than surveys?  What are -- I'm not -- I think this is good.  But what kind of
framework do we have in place?
		MR. JON JOHNSON:  Well, as I indicated, the development -- the
program office now is in a transition from primarily creating the program to
improving it and monitoring its implementation.  And we've also -- I know Bill
Borchardt and Bruce Boger are starting an initiative to have Bruce be more of a
-- I guess a coordinator and communicator with the regional offices.  He has
already set up a visit to Region IV.
		So we intend to continue the dialogue and feedback with the
inspectors.  And we've always had a system to be able to comment on our
inspection procedures, and we expect that to continue.
		MR. MICHAEL JOHNSON:  Also, Commissioner, if I could add, we -- I
talked about that self-assessment metrics very quickly as I went through that
slide, because I wanted to get through all of the slides.  But we don't just
rely on the internal feedback that we can gain through surveys and those kinds
of things, although that is certainly a major part of what we do to see if we
need to make -- continue to make improvements.
		As a part of those metrics, we look at each aspect of the
program, the inspection program, the assessment, the SDP.  For example, one of
the things that we do is we audit findings that are greater than green and
compare them against the ASP result to make sure that we came out in the right
spot.
		We have a number of metrics that enable us to reach conclusions
regarding whether we need to make changes on the program or not.
		MR. KANE:  From a more general standpoint, we are expecting to
increase our sensitivity to internal communications and make sure that we're
communicating well to everybody and addressing and understanding those kinds of
findings.  And that effort will continue and grow.
		DR. TRAVERS:  Not to pile on, but --
		COMMISSIONER DICUS:  But you are piling on, though.
		DR. TRAVERS:  -- I mentioned an element of the program which I
think is an important one, and that's the process for self-assessment.  We've
actually got a Manual Chapter 0307 which addresses this in the most formal
sense.  And it envisions -- in fact, it specifies that on a periodic basis the
self-assessment program would collect information, including -- and there's a
whole host of things that are listed, but including stakeholder surveys, which
I assume includes information from our internal stakeholders as well as we
proceed to --
		COMMISSIONER DICUS:  Internal and --
		DR. TRAVERS:  And external.  We would expect to continue to
refine the program with the benefit of a host of inputs.  This is just one.
		COMMISSIONER DICUS:  Okay.  Yes, I know the SDP is an issue I
think with everyone and working toward improving it.
		I want to bring up a -- go back to a question that I asked
yesterday and see if we can continue to refine what our response is.  And the
question that I asked had to do with the statistically significant adverse
trends, and whether or not they are always a regulatory concern or whether some
are maybe less of a regulatory concern.
		And as we discussed it further, there is some uncertainty when we
report these things to Congress that we do maybe prioritize, or whatever, to
the significance of what that really is.  So would you care to address that?
		DR. TRAVERS:  Yes.  Thanks for the opportunity to add something
to that discussion.  We are developing a system where we would identify at this
early stage statistically significant adverse trends.  And I think your
question goes to the heart of a concern I have about the perspective and
characterization you might get to that.
		Later, this program may, in fact, have thresholds that will be
risk-informed, or potentially risk-based, that would define when you might
actually have a regulatory concern.  At the moment, I think this program
envisions that you could identify a statistically significant adverse trend,
i.e. in the wrong direction.  But what would occur at that point would be
further consideration of what this trend is and what it's telling you or what
it's not telling you.
		Presumably, there would be no regulatory concern associated with
certain potentially statistically significant adverse trends.  But I think the
important element here is that what it would trigger is a further evaluation to
determine its significance in the absence of good or further enhanced
risk-informed or risk-based thinking.
		COMMISSIONER DICUS:  Okay.  I appreciate that clarification.
		Mr. Chairman, I have no further questions.
		CHAIRMAN MESERVE:  Thank you.
		Commissioner McGaffigan?
		COMMISSIONER McGAFFIGAN:  Thank you, Mr. Chairman.
		Let me run through several things.  We talked about the
timeliness of the SDPs yesterday in the context of the Farley potentially
yellow physical protection standard.  Have you considered a timeliness goal?  I
will throw this out.  Since the Commission has created timeliness goals for you
in the past in licensing actions, of something like 90 percent of SDPs will be
done within 90 days and 100 percent within 150 days, is that a possible -- is
that a possible goal that you could, you know, with relatively few exceptions
meet?
		MR. MICHAEL JOHNSON:  We actually have timeliness goals.  The
timeliness goals are 90 percent within 90 days, 100 percent within I believe --
		COMMISSIONER McGAFFIGAN:  So you actually have goals.
		MR. MICHAEL JOHNSON:  We actually have goals, timeliness goals.
		COMMISSIONER McGAFFIGAN:  I just made these up, so --
		MR. MICHAEL JOHNSON:  Well, you came incredibly close to what
they are. 
		And we try to manage to those goals.  And, in fact, I was looking
at an audit that was done by the Office of Enforcement in looking at
timeliness, and on average I think they would say that, because they're
involved in the significance -- enforcement review panel and keeping track of
those kinds of things, about 90 days is the average for an issue that gets into
the process.
		And that's -- there are some cases where we do turn these around
very quickly.  There are also some cases where it takes us an incredibly long
time.  There are some areas that are --
		COMMISSIONER McGAFFIGAN:  Fire protection.
		MR. MICHAEL JOHNSON:  -- fire protection issues, it takes a long
time.  
		Ellis, do you want to --
		MR. MERSCHOFF:  The issues that tend to take a long time are the
ones that involve policy issues or particularly difficult -- EQ is an example,
where EQ is a common mode failure, doesn't fit well into a PRA.  Security
issues have taken us a while, and fire protection was mentioned.
		On the other hand, we are seeing some that are relatively
straightforward and that licensees have declined to have a regulatory
conference to discuss, where it was well enough known at the exit of the
inspection to allow us to proceed promptly.  And I've had two of those cases in
Region IV.  So I think there is some hope as we work through these
first-of-a-kind efforts.
		COMMISSIONER McGAFFIGAN:  I just hope we can meet some of these
timeliness goals more frequently as we move forward.
		On the Phase II notebooks, you mentioned in one of the slides
that you are working on them.  In the paper itself -- and I understand
anecdotally when we do these Phase II notebooks and we go out at two site
visits per month, we try to benchmark them against what the licensee has, we
find problems.  And those problems have to be fixed.
		MR. MERSCHOFF:  Right.
		COMMISSIONER McGAFFIGAN:  And at two a month, where we have
60-odd sites, that's going to take 30 months.  You've been working on it a
while.  But I just wonder, do we have enough resources going into fixing these
notebooks, getting them properly benchmarked?  
		This also relates, as I understand it, to the SPAR models that
may well feed into these notebooks, and whether, when those get benchmarked,
everything matches up -- Dr. Apostolakis was talking about some problems we are
running into there as well in terms of matching up with the licensee's latest
PRA.
		MR. JON JOHNSON:  I think we're pretty well along, aren't we?
		MR. MICHAEL JOHNSON:  Actually -- well, let me -- with respect to
this benchmarking effort, let me just talk about that for a second.  We had --
we periodically have counterpart meetings with the division -- DRS and DRP
regional division directors, and this was an issue that we talked about.  
		We have -- as you are well aware, we've done five or six of these
benchmarkings now.  And you're right, we have found in one or two cases where
we needed to go back and revise the Phase II worksheets to strengthen them.  In
one case, in fact, the Phase II worksheet was probably overly conservative.  In
the other case it wasn't conservative enough.
		We think that we can do a smart sample.  That is, when we went
out and we did the original Phase II worksheets and we went and visited the
sites, some sites gave us a lot of feedback with respect to the accuracy of the
Phase II worksheets.  We tend to have a greater degree of comfort with respect
to those.  
		Some sites gave us very little feedback with respect to the
accuracy of the Phase II worksheets and additional input, what systems we were
missing and those kinds of things.  And so our smart sample would be to start
with those.
		We've got them programmed, and we'll look -- our commitment is
that we'll finish this fiscal year at the current rate that we have.  We're
going to look at what we find, and we'll make decisions based on whether we
need additional resources to strengthen that.
		MR. JON JOHNSON:  We have alternatives if we need some
assistance.  NRR, Rich Barrett, Risk Assessment Branch, and also the Office of
Research can provide assistance to the region.
		COMMISSIONER McGAFFIGAN:  Is the area that you're describing less
than 50 percent and with -- I mean, I'm just kidding, but that's one of the
areas you're trying to also improve.
		MR. JON JOHNSON:  Right.  But it does take longer because these
are typically the complex problems.
		I also want to mention that our staff worked with our training
group in Chattanooga and put together an instructional guide to assist the
resident inspectors in some examples that we have gone through and we have
done, so they don't have to reinvent the wheel.  And it's web-based, and they
can go through the process and basically learn how someone else has gone
through and evaluated the risk in real typical situations.  And that's been an
assistance.
		COMMISSIONER McGAFFIGAN:  I've got several questions, so I'm
going to leave that one and go on to the next.  The web page -- one of the --
it relates to this issue of how long a shadow do inspection findings have.  I
honestly thought -- and it's only recently when I was looking at a Region II
press release did I understand that we only have the latest quarter on the web
page.
		If I go to majorleaguebaseball.com, I can get for the entire 20th
century, any season, how somebody batted, and, you know, what their ERA was, or
whatever.  I did that recently for my son -- with my son.  He was doing it more
for me.
		Are we at some point going to have essentially every quarter ever
under the revised oversight process available on the web page, so that you can
just look and see?
		MR. MICHAEL JOHNSON:  Yes is the answer.
		COMMISSIONER McGAFFIGAN:  Good answer.  Good answer.
		(Laughter.)
		MR. MICHAEL JOHNSON:  I would just say that what you see when you
look at the web page now is not just the recent quarter.  It's the current
performance that looks back a year.  But what you don't have is previous views
and that --
		COMMISSIONER McGAFFIGAN:  But for the PIs it's only the
performance for the quarter.  You can click on it.  But isn't it just the
quarter?  There's somebody shaking his head at the back.
		MR. MICHAEL JOHNSON:  Well, what you have, for example, is SCRAMs
per 7,000 critical hours.  And so that 7,000 critical hours is all --
		COMMISSIONER McGAFFIGAN:  Last year.  Okay.
		MR. MICHAEL JOHNSON:  That's right.
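		(As a point of reference only, and as a sketch rather than the
governing definition, which resides in the performance indicator guidance and
not in this transcript: assuming the scram indicator is normalized over the
trailing four quarters, as the exchange above suggests, the arithmetic would be

	$\text{scrams per 7,000 critical hours} = \dfrac{\text{unplanned scrams, previous 4 quarters}}{\text{hours critical, previous 4 quarters}} \times 7000$

so, for example, 3 unplanned scrams over 6,500 critical hours would report as
roughly 3.2.)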
		COMMISSIONER McGAFFIGAN:  But they can rotate off.  I mean, like
Farley rotated out of its --
		MR. MICHAEL JOHNSON:  That's right.
		COMMISSIONER McGAFFIGAN:  -- system and into green, and it's now
a column one plant because they rotate off.  And when you -- when I clicked,
there was Luis Reyes' press release where he was going to go down and conduct
the meeting at Farley, referenced four performance and performance indicators,
mitigating systems, or something to that effect.  
		I clicked on the page and it was green.  I said, you know, so why
is he going to Farley?  You know, if I'd been smart enough to have the previous
quarters there, I would have been able to figure that out.  But I was able to
figure it out after some conversation.
		So you are going to -- the web page will look back at previous
quarters --
		MR. MICHAEL JOHNSON:  Yes.
		COMMISSIONER McGAFFIGAN:  -- at some point.
		MR. MICHAEL JOHNSON:  Yes.
		COMMISSIONER McGAFFIGAN:  That's the goal.
		MR. MICHAEL JOHNSON:  It's just a question of time and effort in
doing it, time and resources.
		COMMISSIONER McGAFFIGAN:  The fatigue -- the paper that we have
currently before us, and that's been released for -- while we're voting on it
-- has a sentence in it on page 2 that says, "The control of working hours in
accordance with these technical specifications was monitored through routine,
periodic inspections, but was discontinued with the implementation of the
revised reactor oversight process.  
		"The change continues to be considered appropriate and consistent
with the general design of the revised reactor oversight process, which is to
identify indications of plant performance problems," etcetera.  
		Given that you are in this paper essentially suggesting that we go
about a rulemaking, it's a rulemaking planned to do one of -- several options,
this recommended option for a rulemaking on fatigue, basically, I read this to
say we once inspected in this area.  I mean, the paper also says there is lots
and lots of exceptions made.  I mean, the tech spec allows it, and in some
cases thousands of exceptions are made per year.
		But is this an area that we should be looking at in the revised
oversight process?  Because we are essentially not enforcing -- although it may
not be enforceable, this goal that we have for hours worked.
		MR. MERSCHOFF:  Can I address that, Jon?
		We do look at that in the revised reactor oversight process, but
it's on a performance-based approach, and that is when an event occurs, when an
incident occurs, we'll look at the causative factors to that.  And if excessive
overtime, if fatigue is a causative factor, then we'll address it and deal
with it, but we don't have a routine inspection module that looks at it on a
fixed periodicity.
		MR. JON JOHNSON:  I think when we worked on this fatigue paper,
we realized that fatigue is just one element of being fit for duty and being
alert and knowledgeable as a worker in the safety-related activities.  And we
see a tie to access authorization and security, and we see a tie to risk, as
Ellis mentioned.  But I think the --
		COMMISSIONER McGAFFIGAN:  Do we inspect those areas?  Do we
inspect fitness for duty as part of the revised oversight process?
		MR. JON JOHNSON:  Well, what we do is we concentrate on the
results of workers' efforts.  And if there's an event in the plant, we'll look
at that event and follow up, if it's a risk-significant event, and try to
follow what the root cause of that was.  
		And one of the reasons that the transition -- instead of looking
at it from a procedure standpoint, or a prescriptive standpoint in terms of
working hours, we are more looking at it from a standpoint of, what is the
result of that effort?  So it is an indirect way of inspecting that.
		COMMISSIONER McGAFFIGAN:  I have some more questions.  Do you
want me to ask now or continue?  Whichever way you --
		CHAIRMAN MESERVE:  We have another panel, so --
		COMMISSIONER McGAFFIGAN:  Let me try to run through.  There's a
PI for -- that Bruce -- that Ontario Power uses.  There's a couple PIs that
Ontario Power uses that we don't use at the moment.  One is radiation exposure
to the public.  They have a goal -- I've got their latest quarterly report that
Ontario Power puts out, and they have a goal in the first quarter at Darlington
of 1.9 microsieverts, which would be two-tenths of a millirem, or, no,
two-hundredths of a millirem.  Sorry.
		I'm not doing that well enough.  It would be -- one microsievert
is a tenth of a millirem, so it would be about two-tenths of a millirem.  And
they overachieved that by a factor of 10, so they had about two-hundredths of a
millirem in the first quarter there.
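		(The unit conversion being worked out aloud here: 1 sievert equals
100 rem, so 1 microsievert equals 0.1 millirem, and

	$1.9\ \mu\mathrm{Sv} \times 0.1\ \tfrac{\mathrm{mrem}}{\mu\mathrm{Sv}} = 0.19\ \mathrm{mrem} \approx 0.2\ \mathrm{mrem}$

that is, the Darlington goal of 1.9 microsieverts is about two-tenths of a
millirem, and beating it by a factor of 10, as stated, gives roughly 0.02
millirem, or about two-hundredths of a millirem.)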
		I understand -- and I went back -- when I saw this I went back
and discovered that up through '92 or so we used to do NUREGs on dose
commitments due to radioactive releases from power plants, and we still get
annual reports.  Here is Fitzpatrick's in '99, and San Onofre's in 2000.  So it
sounds like we have the data with which we could do something like this.
		And I know it's -- you probably -- I mean, from a safety
perspective, telling the public they're getting less than a tenth or a
hundredth of a millirem per quarter, if -- and I'll read the figures.  This
figure is an estimate, so it's an estimate of the radiation dose people would
receive if they live just outside the station boundary at their residences, 24
hours a day, drank local water and milk, and ate local fish and produce.
		The only reason I raise it is Lochbaum -- David Lochbaum always
used to tell us in a license renewal context one of the things he wanted us to
look at was, what are the doses people are getting?  And we have the Tooth
Fairy Project running around doing bad science, trying to convince people that
there is a dose effect from the plants.
		Is there -- was there ever any discussion about having an
indicator like this, if not quarterly, annually, consistent with these reports,
and have somebody rack it up and say, although Ontario does it quarterly, that,
you know, this is what we believe the dose is at the site boundary?
		MR. MICHAEL JOHNSON:  I understand the question.  I honestly
can't -- I can't recall.
		MR. JON JOHNSON:  Well, one of the -- public radiation exposure
is one of our cornerstones, one of our key cornerstones, and we do have a
performance indicator in that area.  But it may be related to release rates as
opposed to actual dose.  
		COMMISSIONER McGAFFIGAN:  Well, it's an estimated dose.  So it
would actually -- I'm sure what they do is they take the effluent reports and
just try to guesstimate what one would get if one were at the site boundary. 
But it's data that we apparently do collect.
		MR. JON JOHNSON:  We'll have to get that information and get back
to you.
		MR. MICHAEL JOHNSON:  Yes.  As Jon indicates, we do have
performance indicators that look at occurrences, you know, effluent release
occurrences, and those kinds of things.  But I just don't remember what --
		COMMISSIONER McGAFFIGAN:  The last issue I'll just mention, the
N+1 to N.  I was the Commissioner that wasn't very enthusiastic about that at
the time.  I do hope that you are trying to keep a qualified resident at these
sites, and then we don't get -- stay at N-1 very long.  But a lot of these
issues are the ones that I was very fearful of, especially during summers when
people are taking vacations, both at the regions and at headquarters.
		I was told last year the flexibility was going to be that
everybody was going to be in the region and we'd be able to dispatch people
out, and that everything would be fine.  
		Now you're discovering that flexibility isn't there, so I hope
there's a mechanism to -- so we always have at least one qualified, and I'm
underlining "qualified" -- not in training -- resident.  And I am disappointed
that it's hurting the training of new residents that -- because we've lost
flexibility that we used to have under N+1.
		MR. KANE:  Well, we certainly will, and that's obviously -- we
have that sensitivity.  It's an issue that has to be managed.  We've explained
some of the issues with -- the difficulties perhaps in managing that, but I
have every confidence that the regional administrators will be able to manage
that.
		MR. MERSCHOFF:  The program office has guidance that a site will
not be uncovered, without at least one qualified resident, for more than 72
hours, three days in a row.  We're able to meet that.
		If you look at the N+1 change, while we're at N+1, that really
only helped us in that aspect for two-unit sites.  The single-unit sites always
had the challenge that you currently described in terms of having coverage at
the site.  So where we have a lot of experience and we're skilled at backing up
the residents with inspectors from the region, occasionally we get help from
project managers from NRR, to assure that we have a presence and that we don't
go more than 72 hours without a qualified person there.
		CHAIRMAN MESERVE:  Commissioner Merrifield?
		COMMISSIONER MERRIFIELD:  The first question I have is you talked
earlier about issues associated with the SDP process in terms of its
timeliness, its simplicity, the quality of how we're engaged in that.  And,
clearly, I think from the presentation today that remains a significant issue
for us.
		Do you think we have, at this point, an action plan that captures
the issues in this area and has a methodology and appropriate metrics for making the determination down the road as to whether we have resolved them?
		MR. MICHAEL JOHNSON:  I'm only pausing because you asked about an
action plan.  We have a number of actions that are going -- that we've taken to
address this issue.  Now, whether they're documented in a single plan, I don't
believe that's the case.  
		But, yes, I believe that we are making progress.  We've
identified the kinds of things that we need to fix.  For example, we will shortly be issuing a draft revision to the SDP manual chapter that clearly defines roles and the way we conduct the SERP panel, to make it more efficient.
		For example, we've mentioned the fact that we're making the SDP
instructional guide available to -- have made it available, and that will
provide -- address the concerns with respect to some of the ease of use of the
SDP.  So we've got a number of actions that we've taken to address the
concerns.
		MR. JON JOHNSON:  But I would like to add also that we -- we coordinated with the regions and set up a program to provide additional training for backups -- they're called kind of backup senior reactor analysts.
		And for these positions we worked with Human Resources to make a fair solicitation and selection and provided additional training courses,
so that if we lose a senior reactor analyst in a regional slot that there are a
number of people that are right behind them that have already had some of the
training, so they would be able to more quickly fill into that slot and be able
to perform some of these analyses and assessments for the regional
administrator.
		COMMISSIONER MERRIFIELD:  Well, obviously, there are some generic
concerns, one of them being the resources it takes for us to deal with these
issues, the transparency with which we are making our determinations, and the
predictability and consistency that we're making those determinations.
		And since a lot of things revolve around the SDP, obviously it is
important.  And so I encourage the staff to put the appropriate resources to
that to make -- to resolve those issues moving forward.
		On Slide 15, the staff speaks to the issues associated with the
standard definition for safety system unavailability.  And in the paper that
came up on June 25th, page 7, you make references about that as well.  Can you
give me a little better understanding of specifically how you are planning on
responding to the stakeholder concerns associated with the SSU indicators?
		MR. MICHAEL JOHNSON:  Yes, I can.  We established -- we have
established for some time a focus group specifically to deal with the SSU PI,
performance indicator, and arriving at a standard definition.  That group has
already had a couple of meetings.  We met earlier this month.  We have an
additional meeting in August, and we'll -- and we have a plan to get to a pilot for, for example, replacement performance indicators for the unavailability PI.
		So we've got a well-orchestrated approach that involves the NRC,
folks from the various communities, maintenance rule folks, PRA folks, regional
folks who understand what it means to implement the inspection program, but, in
addition, external stakeholders, and some healthy involvement to try to come to
a standard definition on these issues.
		I should have mentioned INPO also.  INPO was involved in that, because INPO and WANO are among the organizations that we know we need to get in line with.
		COMMISSIONER MERRIFIELD:  Okay.  The last area I want to get into
is the issue of no color findings.  That's referenced on page 11 of the paper
and page 7 of Attachment 5.  You discuss a little bit about what some of the
concerns are relative to no color findings in the oversight process.
		You know, for my own part, I do have -- you know, I share some of the concerns out there about no color findings, and, again, the inconsistencies relative to their use, and a perception that it would
demonstrate some instability in terms of a regulatory process.  And so I'm
interested in getting your views in terms of how we're going to resolve those
concerns going down the line and what plan we have for that.
		MR. MICHAEL JOHNSON:  Okay.  We actually got feedback on this
issue.  We had -- we engaged external stakeholders on the discussion -- on the
issue of no color findings at the External Lessons Learned Workshop that we had
in March.  Coming out of that workshop, we have had a number of internal
meetings to discuss the issue of no color findings and to propose a resolution
to that.
		At our last NRC industry working group meeting, we raised the
issue of no color findings and put on the table at that time for discussion
with the external stakeholders the resolution of that issue.  
		We really believe that we need to do something with respect to no
color findings, but the way that -- something that we do has to recognize that
once you get past minor, unless we're willing to develop SDPs that are -- multiple SDPs that can cover every eventuality, we're always going to be faced
with those issues that get around the SDP.  And so how do we deal with those in
a way that is scrutable and understandable?
		COMMISSIONER MERRIFIELD:  Okay.  Just one other thing I do want
to raise.  Mr. Shadis, in the testimony he is going to be providing to us in
the next panel, raises a number of concerns about design issues and where those
fit in our inspection process.  I just was wondering if, having reviewed that,
whether you had any comments on Mr. Shadis' assessment and whether we are
comfortable with the level of oversight that we have in the design area.
		MR. MICHAEL JOHNSON:  I have had a chance to read Mr. Shadis'
paper, and I look forward to his comments.  You know, when we did the
framework, we looked at what we should look at in terms of inspectable areas
and what we should do in terms of performance indicators.  The mitigating
system cornerstone has placed in prominent view the recognition that we need to
look at design mods, and we need to look at additional design.
		And so we've got -- and from that we developed inspectable areas. 
We have inspection procedures.  We looked at permanent mods.  We looked at
temporary mods.  We have a biennial inspection that looks at safety system
design and performance.  And so we've got specific areas built into the
baseline to try to address those inspectable areas related to giving us
insights with respect to design.
		And so it's not something that we left out of the baseline.  It
very much is a part of the baseline.  I was looking at Ellis to see if he had
something to add.
		MR. MERSCHOFF:  And I agree, Mike.  In terms of the inspection
procedures and the oversight of design, the levels are appropriate and are
working.  There are some areas still under consideration like units with
diverse NSSS systems, and should there be more engineering there or not. 
		But by and large, the engineering procedures and programs in
place give us a good look.  The question on the table is a good one, and that
is, when you find a design problem that was from the very initial construction
and design, can you or should you recognize a licensee's self-assessment
program that identified that problem, or, rather, deal with it as a mature industry would -- as a problem that's found today -- and work it in the action matrix. 
That's a good question, and it's one that we need to look at further.
		MR. JON JOHNSON:  I would like just to add also that one issue
that we've been basically putting on hold for a while is to take credit for a
licensee's own audits and self-assessments.  And we weren't willing to address
that, or we didn't feel it was appropriate to in the first year of
implementation.
		But we do want to encourage utilities to continue their own
audits and design reviews and to keep conducting those.  And as Ellis
indicated, and Mike did, we do have an inspection procedure that causes us to
go through this with a significant amount of effort.  
		And so in the future, we are going to be looking at the efforts
the utilities are taking themselves and looking at the impact on our inspection
program.  That might be one area for efficiency in the future, but we didn't
feel that in this first year that we wanted to go into that in detail.
		COMMISSIONER MERRIFIELD:  That's a fair point.  I think we should
have -- at the end of the day we should have a program that does allow for an
inspection of what may be latent issues, and not allow ourselves or our
licensees to be lured into the belief that we've got a program that's working
now and everything going forward is fine.  There may be things out there
lurking from the days of early operation, and we should encourage them to
continue to find those.
		Thank you, Mr. Chairman.
		CHAIRMAN MESERVE:  Thank you.
		I'd like to ask you a little bit about Slides 6 and 7, which is
your summary of the inspection results.  And sort of ask what your analysis of
this is in terms of its implications for the program.  I mean, it's quite
striking when one examines that slide, that the hits are in -- very
significantly with regard to both inspection findings and performance
indicators on mitigating systems.
		And a surprising -- to me, a surprising number of hits are on
emergency preparedness.  And it raises a question, I guess, for this context
whether -- maybe this belonged in yesterday's discussion, whether there's a
trend issue or -- of course, it maybe is not a trend, but an issue that we
ought to be worried about that's reflected there, or whether it says something
about the thresholds.
		And I just don't know what the philosophy is that has guided the
staff on this.  I mean, the slide is here.  The information is here.  We've
defined these cornerstones, and we're seeing a great disparity in the results
from one cornerstone to the next.  And maybe the thresholds are too high. 
Maybe in some areas they're too low.
		It depends on what the philosophy is.  Is it driven by risk?  Is
it driven by challenging the industry from where they are?  I mean, I'd just
sort of be interested in how you -- how you interpret this slide, and what
implications it has for the oversight program.
		MR. MICHAEL JOHNSON:  I guess I'll just start off and say I --
that was a question that actually we got also from the ACRS when we last
briefed them.  And we are deciding what we think the results tell us based on
how they're spread out across the cornerstones.
		Obviously, when we looked at the emergency preparedness area, we
had a different metric, if you will, for setting up those thresholds through
the SDP.  We looked at planning standards, we looked at risk-significant
planning standards, to decide the significance of an emergency preparedness
issue.  And you're right, we have a high number -- a relatively high number of
issues that came back in the emergency preparedness area.
		CHAIRMAN MESERVE:  Well, and also mitigating systems.  I mean,
there's a huge number there as compared to the others.
		MR. MICHAEL JOHNSON:  Yes.  I think the actual -- the mitigating
systems area is more explainable.  We actually do a large portion of our inspection in the initiating events area, but primarily in mitigating systems inspection.  So there's a lot of effort that would cause you to have a finding that you would link primarily to the mitigating systems cornerstone.
		So I think that probably is more of the rationale for why that
area is as high as it is.  Emergency preparedness is one that I think we need
to think about.
		MR. JON JOHNSON:  One thing that I think we all learned was that
we have found some things by focusing on our risk.  In the PRAs, we found some
things that were not in the old inspection program.  Some of those have shown
up in some flooding issues -- the emergency preparedness, but also radiation
protection, occupational radiation protection, some of the findings on the
ALARA programs.  I think at one of the sites in Region IV, Ellis probably could speak to that in detail.
		But there are some things that we weren't focusing on that I
think we've learned, and part of the program shows us that there are some
things we can learn about inspecting, and so forth.
		In the mitigating systems, we have -- the PRAs point out that
auxiliary feedwater and diesel generators are some important equipment, and we
have a number of findings in those areas.  Also, the performance indicators
have pointed out some of the initial -- I believe some of the initial
indicators in terms of out-of-service times.
		The calculations for the performance indicators require you to go
into, how long was this piece of equipment out of service?  And there's a
standard of calculating that fault exposure time, and it's basically a judgment
in terms of how long we're going to assume in this calculation this equipment
was out of service.  
		And basically, on average, it uses half the time between when you last tested it and could demonstrate it was working.  So in some cases the equipment
may not have been out of service as long as the calculation shows, but our
concern is to get it fixed and find out why, and make sure it doesn't happen
again.
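		A minimal worked illustration of the half-interval convention described above (the convention is as the staff state it; the symbols and the 30-day figure are assumed here only for illustration): if a component last passed its surveillance test at time $t_{\text{test}}$ and the failure is discovered at time $t_{\text{found}}$, the assumed fault exposure time is

$$T_{\text{fault}} \approx \tfrac{1}{2}\left(t_{\text{found}} - t_{\text{test}}\right),$$

so a component last tested 30 days before the failure was discovered would be assumed out of service for roughly 15 days, whatever the actual (unknown) failure time within that interval.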
		But there are some things we've learned in terms of calculating
these risks, and we're working with the Office of Research.  They do a study on
accident sequence precursors, and a fairly in-depth study of those, and we've
learned that our initial risk assessments that go with our inspection findings
need to be coordinated with their more in-depth long-term studies, because we
don't want to have two assessments of the same event basically coming out in
different areas.  So we're working with Research to basically strengthen our
risk assessments.
		MR. KANE:  Just to add a comment.  It's hard to provide a
comparison to what preceded this, of course, but I think one of the
opportunities here over time is to take a look at this information and
potentially make adjustments of resources within your baseline inspection
program that we're trying to -- based on what this is telling you, whether you
have a low number -- continue to have a very low number of findings in an area
perhaps that would suggest that maybe you can scale back there, and other areas
perhaps you need to --
		CHAIRMAN MESERVE:  Well, you took away the low number of --
		MR. KANE:  Well, and then -- you know, but I think it is useful
from that perspective.
		MR. MERSCHOFF:  If I can add just a thought on this.  We hire our
staff and train them for a healthy skepticism and questioning attitude.  And
going into this process, the general feeling in the region that I shared was
that these thresholds were too high, and that we wouldn't be able to engage the
licensees when we needed to when problems were identified.
		I think one of the reasons that we're seeing the improvement from
the survey in this program is the fact that it has allowed us to engage, as you
can see in this spectrum of findings, where we needed to engage.  We have
certainly found in Region IV, in the areas of ALARA, in the areas of EP, this
has given us a tool and a visibility to correct long-standing problems that
were difficult to get to in the old program.
		So I find this number of findings and thresholds crossed to be very encouraging, and it is proof that is helping convince the staff that the program works.
		CHAIRMAN MESERVE:  I wasn't suggesting otherwise.  It was the
differences among the various cornerstones which was interesting.  
		MR. MERSCHOFF:  Neither have we come across a cornerstone where we had a problem that really bothered us but didn't have a tool to address it.  So I don't have a great concern personally about the ones that don't have numbers in them.
		CHAIRMAN MESERVE:  When we first went into this process, there
was a fair amount of public questioning about resources that were going to be
spent on this project, and we responded that it was our intention to apply the
resources -- exactly the same resources that we had before but to deploy them
in a different fashion, but that we -- we would reexamine the resource issue at
the end of the first year.
		In fact, as it turned out, we employed slightly more resources to
do this than we had in the previous years.  So it has ended up increasing the
resources slightly.
		Your recommendation on Slide 23 is to basically defer, again, the
resource question.  This comes to mind because we are now, as you know, putting
together the fiscal year 2003 budget.  And you do see the opportunity for some
-- perhaps some future efficiencies.  And I'd just sort of like to get some sense of whether you have a better feel now for what the right size of this activity is, a year out from now.
		MR. KANE:  I'd like to address that, although I'm not sure I can
answer that question directly.  I think we have to be cautious here in terms of
-- and target the areas that we can look at, and we've identified, of course,
some of the areas.  In the area of preparation and documentation, I think there
are opportunities there that we will look at.
		But we've noted there's also the need to increase the work in the
significance determination process, make sure that we can meet timeliness goals
there, which perhaps have resource implications.  So I think -- for this next
period I think we'll just be targeting our efficiencies or looking for
efficiencies at these limited areas that we've discussed in the paper.
		DR. TRAVERS:  But I think, as a baseline answer to that -- and it's a little bit predictive -- we've felt for some time that we wouldn't expect significant changes, even with the experience and even with some of the normalization of what we're doing at the front end.
		But as Bill mentioned, we have, in fact, identified specific
areas where we would expect some efficiencies to be obtained.  We just don't
feel we're in a good position right now to give you -- especially since we
don't expect them to be very large relative to the overall -- we don't expect
to be able to give you, with precision, any estimate of where that is, without
some additional experience.
		CHAIRMAN MESERVE:  One area in which we have -- there has been
very favorable public response to this program has been the fact that we have
performance indicators that are available, that the public has access to on the
web, the financial community has access to.  Where are we in terms of the
development of new performance indicators?  
		I don't mean as replacements for what we have.  We have some of those that we're working on fine-tuning.  But are there some
other types of performance indicators that we're pursuing?  One that would be
very attractive -- I don't know if this is feasible -- I mean, the core of this
program really is the Corrective Action Program and being comfortable that for
the plants that are green that there is a process in place with the -- at the
-- among our licensees.  
		Is there an indicator we could develop for that?  I mean, I'm
just sort of curious where we are and what kind of process you have underway to
think about and develop and pilot other performance indicators?
		MR. MICHAEL JOHNSON:  The example that you use is actually one
that we're taking on as a part of the PI&R focus group.  We're looking, for example, at the possibility of establishing an objective way to be able to measure licensees' effectiveness in the corrective action or PI&R area.  So that's one example.  That's an example that we're actually working on.
		We are, in fact, working to develop -- continuing to develop new
PIs.  Of course, we have a formal process to make changes in the PI program. 
One of the things that I know you're aware of is the work that Research has
done with respect to risk-based performance indicators.  
		An important potential area that we're looking at that could come
out of that is reliability indicators.  We recognize and very much want a suite
of reliability indicators to complement the unavailability indicators that we
have now.
		And that -- so we are working on that process to -- continually working to develop new performance indicators.  Of course, when we add those, we need to look at what we have in place to make sure that we're doing something that is effective and efficient, that it adds additional information, and that it does what we intend it to do with respect to the overall framework.
		CHAIRMAN MESERVE:  Good.
		COMMISSIONER MERRIFIELD:  Mr. Chairman, if I may make a
suggestion.  I know Commissioner McGaffigan brought up an indicator that
Ontario Power had.  I know in times past I've asked about -- Finland has some
new performance indicators that our counterparts there are using for their
plants.  
		It may be worthwhile to task the staff at some point to come back
to us and document to us some of the different areas that they've taken a look
at, just so we can get a sense of the breadth in which we're trying to do peer
reviews of others who are using performance indicators.
		CHAIRMAN MESERVE:  Good idea.
		COMMISSIONER DICUS:  That follows up on the question that I asked
yesterday about -- it was with the industry trends, but I suggested
internationally what are you looking at.  I think your response was positive,
so I think that backs that up.
		MR. MICHAEL JOHNSON:  If I can add one last thing on performance indicators.  The agency's person on performance indicators, this guy named Don Hickman, is really recognized as a world expert on performance indicators.  And, in fact, he interacts internationally with respect to performance indicators -- exchanges with Finland, and those kinds of things.  
		So we'd be happy to get back to the Commission with what we've
done and what we've considered and what we ought to consider as we go forward.
		CHAIRMAN MESERVE:  Let me close with just one final comment. 
That the tone of the presentation on the change from N+1 to N might leave some of us with the impression that, gee, we made a mistake, and it has created a big problem, or maybe not a big problem.
		I think that if the resources we've applied are the same, then the effect of the change was to increase the flexibility; and if you're having difficulties within N, you'd have them even more, I think, if you were at N+1, because you'd have the same work to be done, but you'd have people deployed maybe in the wrong places.
		Have I misunderstood what your presentation -- 
		MR. JON JOHNSON:  Oh, it's not that bleak.  We wanted to just
point out that it does cause more careful managing, as Bill Kane indicated, of
the travel, the planning and scheduling of inspections, and the training.  And
it requires our supervisors and branch chiefs in the regions to look ahead and
plan and hire staff ahead of time, so that they are -- they've gone through the
training and they're qualified ahead of time.
		And, you know, I think --
		MR. MERSCHOFF:  Let me address this.  You're exactly right,
Chairman.  No resources were lost.  The decision was made that that N+1
resource would be put in the region where it was more easily fungible and
usable.  The work wasn't at the site, so that person would have to travel from
the site.
		Now, the fact that many of the N+1 residents were lost to the
inspection program wasn't really, in my mind, a function of the change to N+1. 
Since we achieved N+1 through attrition, every one of those that --
		CHAIRMAN MESERVE:  Achieves N, you mean.
		MR. MERSCHOFF:  Achieved N, yes, sir.  Through attrition.  Each
one of those inspectors was scheduled to leave anyway and would have left to
the place that he or she ended up going.  So it has given us an opportunity to
bring more people into the program to achieve some EEO goals in the process. 
So the net effect from my personal point of view has been positive in moving
from N+1 to N.
		DR. TRAVERS:  One thing we were trying to highlight I think is an
expectation that we'd have experienced people available to the regional
administrators in the region.  And to some extent that hasn't materialized,
because of some other good things that have happened -- promotions, movement
into the program office, and headquarters, and other things.
		And so the challenge was to develop new people as the same
resource in terms -- and that's a bigger challenge than having experienced
people available to the RAs as a function of this change.
		CHAIRMAN MESERVE:  It sounds to me like you have an even greater
challenge if we had not made that move, because you now have -- you'd have to
do that anyway.  These people are going to leave or move on.  And you at least
have the flexibility to now move people around to where there's the work and
where there's the need.
		MR. JON JOHNSON:  Yes, sir.  I agree with that.
		COMMISSIONER MERRIFIELD:  Mr. Chairman, unlike Commissioner
McGaffigan, I have been a supporter of our change in that area.  I had not
inferred from the Commission -- from the staff's presentation that particular
view, but it's a fair point one could -- given Commissioner McGaffigan's
comments, I guess one could have gone either way.
		COMMISSIONER McGAFFIGAN:  Could I get a word in edgewise?
		(Laughter.)
		COMMISSIONER MERRIFIELD:  You already did.  I was really
responding to you when I --
		(Laughter.)
		COMMISSIONER McGAFFIGAN:  It strikes me that maybe the only
problem is we don't have enough qualified residents as a total at the moment in
the sites and in the headquarters, in which case we did what some of our
licensees do at times with senior reactor operators and reactor operators.  
		We didn't have enough classes and we weren't anticipating --
although you -- the staff had been asking to go from N+1 to N really for some
period of time, I don't think they had fully thought through the implications. 
I think you're a little short on qualified residents at the moment.  You're
going to try to make it up, and in doing so you'll meet some EEO goals, and
that's great.  But I'm not sure that this was as easy a transition as it was
predicted to be.
		CHAIRMAN MESERVE:  I'd like to thank the staff.  This is
obviously an enormously important program to the agency, and it's been a very
thoughtful presentation.  We very much appreciate your work.
		We now have a second panel that is -- of people who have been
involved in the evaluation of the reactor oversight process.  Let's give them
time to come to the table.  
		We have a second panel that was spawned from the Initial Implementation Evaluation Panel, which was a FACA panel that was created to systematically evaluate the program.  And from that panel we have four individuals:  Loren Plisco, who is the Chairman of the IIEP and the
Director of the Division of Reactor Projects in Region II; Steve Floyd, who is
Director of Regulatory Reform and Strategy for the Nuclear Energy Institute;
Richard Hill, General Manager, Support, for the Farley Project, Southern
Nuclear Operating Company; and Raymond Shadis, from the New England Coalition
on Nuclear Pollution.
		Welcome, and we very much appreciate your joining us today.  Mr.
Plisco, why don't you proceed.
		MR. PLISCO:  Thank you.  Good morning.  I'm here today with three
other members of the Initial Implementation Evaluation Panel to discuss our conclusions regarding the first year's implementation of the
reactor oversight process.
		This was the second Federal Advisory Committee Act panel to
review the reactor oversight process.  The pilot program evaluation panel reviewed
the results of the six-month pilot program at eight sites in 1999.  But we had
the advantage of evaluating experiences from the year-long nationwide
implementation, where we exercised many more elements of the process.
		The makeup of this second panel was very similar to the first
panel.  We had 16 members, including NRC managers from each regional office, a
director in the Office of Enforcement, four utility managers, two state
representatives -- California and Georgia -- the Nuclear Energy Institute, two
public stakeholders, and an NRC senior resident inspector and senior reactor
analyst.  Three of these panel members were members of the previous panel and
provided some continuity for us.  
		The panel had six meetings from November of 2000 to April of
2001.  The panel members brought their own experiences with the oversight
process through the first year's implementation.  They brought the experiences
of their organizations that they represented.  
		But we also invited other groups to present their views about the
process to the panel.  For example, we heard from the Union of Concerned
Scientists, the states of Illinois, Pennsylvania, New Jersey, Vermont, the
Nuclear Energy Institute, and we had a panel of public affairs officials and
union representatives, a panel of senior reactor analysts, and a panel of
senior resident inspectors.
		We also had many discussions with the NRR staff regarding the
status of the oversight process and a self-assessment program.  I did want to
note in the senior resident inspector panel we made sure we had some of the 12
percent in that group that we talked about earlier.
		We had three objectives.  The first was to determine whether the
reactor oversight process is achieving the agency's goals.  The second is
whether the more significant problem areas have been identified.  And, third,
was whether the NRC has developed a sound self-assessment program to look at
the program in the future.
		Overall, the panel concluded that the new program is a notable
improvement over the previous licensee performance assessment program and that
it should be continued.  We also found that the program has made progress
toward achieving the agency's four performance goals, and that the process is
more objective, risk-informed, predictable, and understandable.
		As you would expect by the members that were on the panel, we
focused our efforts really on how the staff could improve the program.  We
provided 25 recommendations to improve the reactor oversight process in our
report.
		Although the panel reached consensus on the recommendations in
our report, I must say that the reasons for each individual's agreement may be
quite different from another individual's.  In most cases, these same problem
areas had also been identified by the staff through the self-assessment process
and stakeholder feedback that was received.
		We concluded that the self-assessment program has the necessary
elements to evaluate the oversight process in the future.  However, we couldn't
evaluate the effectiveness of the program given that much of the assessment
data wasn't available to us by the time we were concluding our review.
		As we evaluated all of the issues raised by the panel members and
the presenters, we noted three common themes which the panel termed "tensions"
that contributed to many of the issues regarding the process.  And I want to
take a minute to discuss those.
		The first was maintaining safety rather than improving safety. 
The staff designed a process to maintain safety as specified in the NRC
strategic plan.  However, some public stakeholders stated that they did not
believe current nuclear industry performance is sufficient, and others stated
that the NRC should continue to strive for more improvements, and some even
recommended we strive for excellence in industry performance.
		This disagreement with the agency's goal could limit the
confidence of some members of the public in that process and really led to some
of what I call "rubs" between the views of the public stakeholders and where
the program is going.
		The second area is applying risk-informed regulation rather than
a deterministic regulation process.  The reactor oversight process is ahead of
many other regulatory processes in the use of risk insights.  The licensees and
inspectors have had practical difficulties in carrying out the risk-informed
reactor oversight process in a deterministic regulatory framework.  Over the
long term, the staff efforts to risk-inform the regulations should close this
gap.
		The third area was using indicative measures of performance
rather than predictive measures of performance.  The reactor oversight process
is structured on the premise that a licensee's corrective action program can
address low-level issues without NRC involvement, and that performance
degradations will progress across the action matrix, allowing NRC involvement,
rather than jump from the licensee response column to the unacceptable
performance column.
		Many of the internal and external concerns regarding the
cross-cutting issues and inspection report thresholds that you heard about come
from skepticism about these assumptions from some stakeholders.
		These tensions, since they are created by the fundamental
unknowns of the oversight process, are likely to limit what some may consider
complete success with regard to achieving all of the agency goals across the board.  On the other hand, the panel discussed that in
some respects this tension is beneficial because it really is a forcing
function for continued questioning and evaluation of the oversight process and
the premises behind the process.
		In closing, I'd like to recognize the dedicated effort by the
panel members, the NRC staff who supported the panel, and the many stakeholders
who presented their views to the panel.
		CHAIRMAN MESERVE:  Unless there are specific questions for me, I
will turn the discussion over to Stephen Floyd, and when we finish the
statements, then we will have a round of questions directed at all of you.
		Mr. Floyd.  
		MR. FLOYD:  Good morning, Mr. Chairman, and Commissioners.  I
will give you my bottom line first.  The industry does believe that the new
reactor oversight process is a significant improvement over the previous
process.  
		We find it to be far more repeatable, and far more predictable,
with the objective evaluation tools that are imbedded in it, and it is much more
risk informed, which we think is probably one of the most important aspects of
the new process.
		With respect -- if I could have slide two.  With respect to the
initial implementation evaluation process, while it was painful to sit through
the length of some of the meetings that we had, and the number of meetings that
we had, I did find overall that it was a very effective vehicle for addressing
divergent views.
		And we did have a lot of divergent views and a lot of divergent
opinions about the various topics that we discussed in the meeting.  But
nonetheless I thought -- I was very impressed with the professionalism of all
of the members on the panel, and I think that everybody on the panel had a more
than ample opportunity to raise their opinions.
		And I thought the
rest of the members of the panel were very willing to listen and try to
understand the opposing views, and to try and come up with a final report, with
a set of recommendations that addressed everybody's views, and I think overall
that objective was met.  
		For slide number three, I would like to switch now to some of the
key areas for improvement.  As has been mentioned several times, this is a work
in progress, and while it is much improved, it certainly is not perfect, and
never will be. 
		But we are very pleased to see in the SECY paper that the staff
recognizes the need for continued periodic assessments of the effectiveness of
the process, and to constantly look for improvements in it.  We think that is a
key element in this.
		One of the issues that we think is important to look at is the
parity of the significance thresholds that are used in both the performance indicators and the significance determination process.
		While a lot of effort went into the early construct of this
program, it still is not -- I don't think there is complete parity, obviously, between what a yellow means in some of the more qualitative significance determination processes and what a yellow means in, say, the reactor safety performance indicator results, which can be much more quantified.
		There are some disconnects there which we ought to continue to
work on so that we don't send mixed messages out.  With respect to performance
indicators for Corrective Action Programs, we really believe that there are a
number of performance indicators already imbedded in the program for a
corrective action program.
		The combination of the 18 performance indicators, and the 28, 4
times 7, cornerstone areas, gives a good sense for what is going on in the
Corrective Action Program.  
		We took a look at the data from the first year of implementation
of the program, and what we looked for were negative comments in inspection
reports regarding deficiencies in licensees' corrective action programs.
		We looked at those by action matrix columns, and what we are
finding is that for the plants that are in the Licensee Response column, there
are about one-and-a-half negative comments per plant in the inspection reports regarding
corrective action programs.
		And if you move over to the next column, the regulator response
column, that jumps to about an average of about three comments.  If you go to
the next column over, it jumps up to about six comments per plant.
		And we only had one plant that was in the multiple degraded
cornerstone column, but they had 10 negative comments over the course of the
inspection year regarding the corrective action program.
		Most of those negative comments by the way were not on the
subject which caused them to trip the threshold.  They were on green finding
areas, where the corrective action program was found to be a contributing
factor to that condition.
		So we actually think imbedded in the program and in the construct
of the program, the performance indicator results, and the inspection finding
results serve as a performance indicator for the corrective action program.
		And we are not sure how you would develop a metric that would
actually do a much better job than actually looking at what the purpose of the
corrective action program is in the first place, and that is trying to find
problems early on, and address them, and take care of them before they become
significant.
		The next issue we have is resolving inconsistencies amongst the
unavailability definitions.  This really has been a topic throughout the entire
program, and it has come up at every workshop, and every NRC public workshop.  
		We have had three meetings so far this year devoted just to this
topic, three public meetings.  We think that all of the issues that need to be factored into a decision are now on the table, and we really encourage the need to get on with making a decision.
		We think that a decision one way or the other could be made in
relatively short order, and if one could be made favorably to try to come up
with a common definition, we would like to shoot for a pilot beginning January
of next year to start piloting that effort, and we think that is achievable.
		I won't comment on the next point as that has already been
discussed by the staff.  The  next slide is the consideration of licensee
self-assessments.
		We think that this is an area where there could be some
efficiency improvements put into the program.  This was an element that was
part of the previous oversight program that the staff had, where under certain
limited conditions, with a set of criteria, and where the staff looked at the qualifications of the licensees' staff that was going to do the self-assessment, the scope of the inspection, and the likely conduct of the inspection, as well as overviewing the results, the staff made a determination whether or not they needed to come in and do an inspection that would have largely looked at the same areas that the licensee had just done.
		And we think that there are a number of opportunities where the
staff could use a similar process in the new oversight process.  We understand
the logic for the first year in not doing this, and that you wanted to
establish a base line, and treat everybody very uniformly during the first year
to see what the program was telling you with regards to the effectiveness of
the new program.
		But we think now that you could introduce some efficiencies by crediting licensee self-assessments under a well-defined set of circumstances. 

		On the significance determination process area, the fire
protection one does in our view need to be simplified even further.  There have
been some recent changes made to the fire protection SDP, and the fire
protection and PRA people in the industry tell us that it is significantly
improved and much easier to use than the previous one.	
		The area of biggest improvement that probably still needs to be
made is a better determination of what is the fire initiating event frequency
which needs to be factored in and how do you measure that.
		The security one, as we all know, that one was broken from the
get-go with the original one that was tied to the reactor safety one.  The
interim SDP that has been recently promulgated provides some near term
stability to the process, but we are looking forward to a final SDP that
mirrors the resolution of the rule making which is ongoing in that area.
		And in the ALARA area, the biggest concern there for the industry
-- and I think we are on a path to come up with an improvement in this area as
well -- is that the current SDP treats a plant differently if they happen to be
in the fourth quartile with respect to total dose exposure at the site.
		And what we are really seeing is that unlike in the past history,
today there is not a significant difference between the plants that are in the
first and third quartile, and never mind the third and fourth quartile, in
terms of total dose exposure.
		And a single unexpected leaking fuel assembly can easily move a plant from the second quartile to the fourth quartile.  And we don't think that ought to be an influencing factor on how good an ALARA program they should be running.
		We think that people who are in all quartiles, in terms of total
dose exposure, ought to have an effective ALARA program, and be assessing that,
and looking for improvements in it, and correcting deficiencies.
		So I think the thrust of the new ALARA SDP should really focus on how good a job a licensee is doing in carrying out their program, and, when deficiencies are found, how effective management oversight is in getting those deficiencies resolved and corrected, and focus less on what the total exposure at the plant is.
		I think just a word about the Phase Two work sheets if I could. 
The initial round of those Phase Two work sheets that came out did have some
significant deficiencies.  
		The feedback that we are getting now is that on the enhanced Phase Two work sheets that are being promulgated now -- and in fact most of them are out -- the licensees think they are significantly improved and are seeing far fewer disconnects between their PRAs at the site and the enhanced Phase Two work sheets.
		And in our industry workshops and in our meetings with our chief
nuclear officers, we have urged the licensees to take a good hard look at those
enhanced Phase Two work sheets as they get promulgated, and to please flag very
early to the SRAs at the regions any disconnects that they see so that they can
get resolved and addressed before they have to be applied.  It is kind of hard
once you reach that stage.
		The last slide, our overall conclusions.  We agree with the
comments that were made at the outset that the first year of implementation
exceeded expectations, and it really did exceed industry expectations as well.
		We think that a tremendous amount of credit needs to go to the
staff and the management of the staff for putting in place as expansive a
program as this.
		It was done very professionally.  I think it was done with the
interest of genuinely trying to get as much stakeholder involvement as
possible, and to try and get a fair hearing of everybody's views on that.
		We think overall that the program is meeting the agency
objectives.  The industry is very committed to making the process work.  One of
the key elements in the new process is the importance of a corrective action
program and self-assessment capability at the site.
		And we have taken a number of measures within the industry to
bolster that activity and put more attention on that, and I think that is
starting to pay dividends as well.
		As I mentioned, it is a work in progress.  There are further refinements to come, but I think the defined process that is in Manual Chapter 0608 for evaluating future changes to the program -- and that is a very disciplined process -- will ensure again the same give and take, and the same consideration of the diversity of views, that set the original program in place.
		And that will also be addressed through any changes that are put
in place, and that concludes my remarks.  Thank you.   
		CHAIRMAN MESERVE:  Thank you very much.  Mr. Hill.
		MR. HILL:  Good morning, Mr. Chairman, and Commissioners.  I
agree with Mr. Floyd's comments that the reactor oversight process is a notable
improvement over the previous licensee performance assessment program.
		I also agree that the initial implementation evaluation panel
that I was on was an effective vehicle for addressing divergent views, and that
there are some areas of improvement as identified by Mr. Floyd.
		However, there are two areas of concern that I would like to take
this opportunity to address.  Southern Nuclear opposes the use of the current
unplanned power change performance indicator, as well as the replacement that is under consideration.
		In the past the industry would postpone corrective maintenance on
certain equipment deficiencies, and continue acceptable operation based on risk.  
		However, in today's competitive generation environment, the industry places more emphasis than ever before on improved plant reliability for optimum performance during peak electrical demand periods.  Utilizing a performance indicator to monitor decisions that are based on competitive market reasons seems to be an inappropriate way of assessing performance within a regulatory framework.
		The second area of concern is that Southern Nuclear agrees with
the industry position taken in the May 19th, 2000 letter from Messrs. Pate,
Rhodes and Collins to the Chairman, which states, "There is a significant level
of concern within the industry over the possibility of unintended consequences
that may result from the use of the performance indicator that counts SCRAMs.  We continue to oppose the counting of annual SCRAMs due to the possibility of unintended consequences."
		I appreciate the opportunity to participate on the panel, as well
as the opportunity to address these two specific concerns that Southern Nuclear
has with the reactor oversight process.  Thank you.
		CHAIRMAN MESERVE:  Thank you.  Mr. Shadis.
		MR. SHADIS:  Thank you.  Good morning, Mr. Chairman and
Commissioners.  As you know, I replaced or at least took the seat of David
Lochbaum on the panel.  Mr. Lochbaum left feeling -- I think something like a
minority of one with respect to the orientation of the panel as regards
pro-safety or in getting on with the program.
		I don't have a problem serving as a minority of one.  I serve as
a minority of one on our local citizens advisory panel in decommissioning.  I
am the only person of anti-nuclear persuasion there, and I am kind of getting
used to it.
		The panel was something of a surprise to me and a departure from my previous experience with various NRC activities, in that panel members, including the NRC support staff, were quite solicitous of getting my input.  
		They were quite tolerant of my opposing view comments, and that
was much appreciated.  In addition to that -- and this is maybe the most
outstanding difference, but in the past in many NRC activities, we submitted
comments, and then we failed to see them reflected anywhere in any subsequent
documents.
		The report of the panel, upon my reading of it, reflects not only
my own input in various areas, but the input of other stakeholders, external
stakeholders; the State people that came in, Mr. Lochbaum. 
		So I was really pleased to see that reflected in the document. 
The reactor oversight process itself is problematic for us, and a part of this
may be just the cultural shift.  
		We are asked to compare the previous process, the SALP process,
with the reactor oversight process, and everyone agrees that the reactor
oversight process is an improvement.
		The question is, is that damnation with faint praise, because many of us had almost zero respect for the previous process.  Now, we may be on
the road to somewhere, but even making those comparisons is difficult.
		And that was illustrated in yesterday's meeting, in that we have
a previous assessment process using specific terminology, and specific
methodology to see where we were with reactor oversight.
		And we shift to a new process and now we have a new way of
measuring.  We have a new set of terminology, and the comparisons are difficult
to make.
		And one specific example of that that interested me was that in
reading NUREG 1275 on design basis issues, that document drew a fairly tight
correlation between the number of engineering and design inspection hours
expended, and the number of design basis issues that emerged.
		It makes sense.  If you look, you are going to find stuff, and if
you don't look, you definitely are not going to find stuff.  So one of the
inquiries we made toward the close of the IIEP process was whether or not in
the current round, the first year of experience, we had increased or decreased
the number of engineering inspection hours.
		And that information was not readily available.  For one thing,
the group that put together NUREG 1275 had their set of criteria for how you
define engineering inspection hours and design inspection hours.
		That set of criteria was not being used in any case, and couldn't
be found in the review process for the reactor oversight process, and as of 3
or 4 days ago, we still had not combed out enough information to make a
comparison, or at least I didn't.
		I had been asking for it, and the NRC support staff had been
looking for it, and it had not come into our hands.  So the elemental question
with respect to these design basis issues is then:  are we looking, and are we looking as hard as we ever used to, or should we be looking harder?
		And that is hard to get a grip on because of the change in methodology and terminology, and not to be too facetious about it, but I was going to suggest perhaps the inauguration of an Office of Policy Terminology Harmonization and Reconciliation.
		COMMISSIONER McGAFFIGAN:  What's the --
		MR. SHADIS:  Well, if you shifted those around, you could
probably come up with some new vulgar acronym for it.
		COMMISSIONER DICUS:  Not ESP?
		MR. SHADIS:  Well, if you were interested in convincing the
public that you are doing a good job on the reactor oversight process, your
activities have to be translatable to a public kind of common sense.
		And I heard some laudatory things about science.  But I ran into a definition of science the other day from Mr. Einstein, and he said it was an extension of everyday thinking, a refinement of everyday thinking.
		And people out there should not assume that things have to be
explained in simplistic terms, or that people need to be talked down to in
order to get an explanation to them.  
		It may be that, in their own way, the general run of the public, as disinterested as they are in this issue, may have a better handle on the language than what happens to the language when we try to bring in every single little consideration and build a technical nomenclature.
		Dealing with the PRAs, and the SDPs, and the kind of reasoning that gets wrapped up in them, reminds me of an attorney that we had, who said that this area of the law is vague and murky.  
		It is not as crystal clear from the outside as you might guess.  And so, in trying to follow these various initiatives -- going back to our own experience in Maine with the Independent Safety Assessment Team, and stepping forward to the reactor oversight process -- what I ran into was a continual set of hurdles in changing vocabulary, in changing designations for various activities, of augmented inspection teams, of diagnostic evaluation teams and so on, and shifting policy also.
		It was all very difficult to track, and I am suggesting to you,
and I don't know the real answer to this, but I think something really needs to
be done seriously, in terms of reconciling what has gone on in the past, and
what is going on now, in order to make the transitions understandable and
scrutable.  It is not happening.
		One of the things that we tried to get to in our written comments
was the notion that a little experience can replace an awful lot of theory, and
also an awful lot of theoretical analysis can be replaced by just a little bit
of experience.	
		And if that were not the case, there would be no problems with
the Osprey vertical take-off aircraft.  There would be probably no problems
with the Firestone tire/Ford Explorer controversy, because someone at some
point pushed a button on their computer and came up with an analysis that said
that those problems wouldn't happen.
		And I think ultimately there is nothing like taking a look and
finding what is actually physically in front of us.  The example was brought up
yesterday when the question was raised as to whether there was interaction between
the NRC staff and foreign nuclear operators with respect to certain
experiences.
		And the example that was brought up here was about interaction
with France on control rod drive mechanisms, and the cracking around vessel head penetrations.
		I get accused of digging up ancient history, but that is ancient
history.  That was first brought to our attention as activists by Greenpeace in
1995, I think.
		Shortly thereafter, we saw an NRC paper pop up on it, and that
issue has been kicking around.  The interesting thing for me in that is that
the French -- and you may know the history of this, and bear with me if I am
repeating stuff you know.
		But the French found the first indications of cracking in the
reactor head penetrations by pressure testing.  Not doing an ASME-approved code
-- you know, computer examination, but actually physically pressure testing to
in excess of operating pressures.
		And I don't know if they went up by -- I think they went up by a
factor of two if I recall correctly, and then they discovered the cracks.  In
fact, some elements within the French reactor community were complaining that
that pressure testing in excess of anything that they could expect in operating
pressure had caused the cracks, and that operating pressure they had not
defined.
		Our sense was that -- and we had pushed by the way, and this is
also ancient history, but it sets an example.  We had pushed in the Maine
Yankee experience when they did their tube sleeving in the steam generators.  
		We had pushed, before restart, that they ought to physically
do a hydrostatic pressure test, and it was roundly refused not only by the
utility, but by the NRC.
		Our sense is that there is a reason that we need on occasion to
check physically as to whether or not our calculations and our theories are
correct.
		And it is more than just record keeping.  We have a very nice set
of numbers now, and even though the curves -- I noticed that they were drawn
with a certain amount of artistic liberty, and this downhill run of curves that
indicates that the industry is doing a much better job.
		But this is reporting, and the question is what are the
parameters of reporting, and what are the categories?  Are they set up so as to give us predictable results?
		And to our sense, the only way to really prove this is to take a
long hard look.  In our written comment -- and I am done, but in our written
comment we raised the question of the Maine Yankee ISAT, which is somewhere
back at the beginning of the history of this long trail of evolution.
		It was one of those watershed events, and that particular
inspection took 17,000 man-hours, and 4,500 hours were expended on-site
physically examining the plant.
		It was confined to two systems in their entirety.  Let's see. 
What were those?  Yes, service water and the HPSI systems were done in their entirety, and partial examinations on two more systems, the auxiliary feedwater and emergency diesel generators.
		And the raft of stuff that came out was simply overwhelming -- on four systems out of roughly 30.  So when you are talking now about doing whatever the new word for augmented inspections is -- the inspection, for example, at IP-2.
		You are not even in the same ballpark as what was done there, and
it comes down to the very simple argument that if you want to find stuff, which
ought to be one of the principal -- or at least I think, one of the principal
occupations of regulators, if you want to find stuff, you have to look hard.
		And if we are not going to look hard, physically look hard and examine, we can convince ourselves that we are moving right along, and making great improvements every day.
		Commissioner Peter Bradford recently spoke in Vermont, and his
comment was that the current atmosphere in the agency was deja vu, and it
reminded him very much of the pre-1979 era.  
		So with that cautionary note, I am going to close.  Thank you.  
		CHAIRMAN MESERVE:  Thank you.  Commissioner McGaffigan.  
		COMMISSIONER MCGAFFIGAN:  I always get a tough time to start. 
Mr. Shadis, why don't I start where you left off.  I obviously disagree with
former Commissioner Bradford.
		And the other comment I would make is that we do have these
curves and that most of them are flatlined, and we were talking yesterday about
exponential decay curves, which we had some discussion about.
		But licensees follow far more performance indicators than
that.  I mean -- and I think their own experience is that they are striving,
and in many cases achieving, and in their own performance indicators you have
better performance.
		They do have an economic interest in these plants.  Almost all of them are seeking license renewal, and perhaps all of the existing plants will.
		So if they plan to operate them for 60 years and do it well and
economically, there is a nexus between safety and economics.  So I couldn't see
-- I mean, this is by no means a situation of pre-1979.  
		So why don't I ask you to flesh out -- I mean, you say we could
do more.  We could require massive inspections of these plants, and that would
require massive dollars and massive resources.  I don't know that we have a
basis.
		I mean, we had a pretty good basis back in '79 after TMI.  I
mean, the industry itself would say that their performance indicated things
were pretty miserable at that point.  But what would be the basis for massive
inspections today? 
		MR. SHADIS:  The direct answer to your question is that we are
not suggesting massive numbers of inspections, although it certainly would go a
long way to proving what you have in hand if you were to do a few random
inspections.
		And there was that interim independent safety assessment done in
1996, and not so far back, and that was done I believe under political
motivation, which is a good reason to do things.
		COMMISSIONER MCGAFFIGAN:  The interim what?
		MR. SHADIS:  The interim period between 1979 and the present.
		COMMISSIONER MCGAFFIGAN:  Right.  We did what?
		MR. SHADIS:  You did an independent safety assessment at Maine
Yankee.
		COMMISSIONER MCGAFFIGAN:  Right.
		MR. SHADIS:  And that is the massive, or single massive --
		COMMISSIONER MCGAFFIGAN:  Well, we did other massive inspections
at D.C. Cook, and at --
		MR. SHADIS:  Millstone.
		COMMISSIONER MCGAFFIGAN:  At Millstone, et cetera, where we
thought there were significant problems that had self-identified themselves, or
that our inspectors found.
		I mean, if we do find -- and we went through a 50.54(f) process, licensees invested massive resources in the 1996/1997 time frame, and you are questioning all of that.
		But we have to follow, I think, the indicators where they lead
us, and if we are -- I mean, I just don't know that we are in anything like the
situation we were with Maine Yankee, or Millstone, or in fact we did do as the
staff would say, but we did do -- and not in every plant, but we did some
additional design basis inspections.
		The only place we found significant problems was D.C. Cook.  I
forget how many there were in addition to the ones that got on the pages of the
paper, and not in the depth.  I mean, we weren't looking at every system.
		But we did design basis inspections, and we weren't finding
anything except D.C. Cook, and we as a Commission, I think before the
Chairman's time, decided to terminate the effort and roll it back into the
normal inspection process, because we thought we had turned -- you know, we had
made a judgment that we had turned up what we were going to turn up.
		And that those are judgments that we have to make with finite
resources.  Why don't I go on to Mr. Floyd.  In terms of -- well, this
look-back issue that you talked about, and the staff was talking about, what is
your proposal with regard to look-back?
		If you get what I want, which is every quarter on the webpage,
just like every year of major league baseball on the webpage, the public is
going to be able to look back at the previous quarter anyways.  
		So would you just carry -- I guess the issue is for PIs, but it is for inspection findings.  Would you still carry the inspection findings?
		I mean, the red one would carry forward how many quarters, and
the yellow how many, and the white how many, and the green how many.  Do you
have a proposal?
		MR. FLOYD:  No, we don't have a concrete proposal.  I know that
some folks in the industry think that maybe the red ought to stay on there for
four quarters, and maybe the yellow would stay on there for three quarters, and
the white would stay on for two quarters.  I think we would have to take a look
at that, and see.
		COMMISSIONER MCGAFFIGAN:  Okay.  There is a -- it is almost like
a -- well, I sense a little bit of deja vu since it would be the old SALP
process in some sense, because people are anxious to get these things off the
page.
		In fact, Callaway, I believe, helped -- you know, they wanted the
world to know that their ALARA white, three white findings, were going to
rotate as of August 8th when the next indicators go up, and they will have a
green board at that point, at least on the inspection findings.
		So I think people are pointing to it.  As I said, I think it is a
fairly moot issue if we can in fact get all the quarters on the webpage, and
then people can just look back and see when the event occurred, and see what we
graded it at the time.
		MR. FLOYD:  I think one of the challenges in trying to decide
when to roll off the inspection findings is the fact that not every inspection
module gets examined every quarter, unlike the PIs, where every quarter you do
update the PI information.
		So in some cases it is very appropriate to keep it on for four
quarters because it may be the only time during the year that that area was
looked at and inspected.
		On the performance indicators, you are certainly right that on
the top block of the very first page, you are only seeing the most recent
outcome of that performance indicator.
		But if you click on it, you can see at least a 12-month, and in some cases at least a 36-month, look at what the indicator result would have been in
previous quarters.  But it does require drilling down one level.
		COMMISSIONER MCGAFFIGAN:  It requires drilling down, and as a former busy Congressional staffer, or whatever, I would prefer to just be able to click back on quarters, and I think that is probably the way the public is.
		They just want to see what it was like for a previous quarter without having to pull the information together themselves.  In light of the time, Mr.
Chairman, I think I will just leave it like that. 
		CHAIRMAN MESERVE:  Commissioner Merrifield.
		COMMISSIONER MERRIFIELD:  Yes.  Mr. Shadis, I appreciate your
comments regarding the language we use and the way in which we use it around
here.  
		I think there is a balance that we try to achieve, and we
obviously are a very technical agency, with  highly skilled people, who can
talk at an extremely high level.  
		I think the Commission has encouraged our staff through our plain
English initiative to try to capture those in a way which is understandable to
an average member of the public.
		Now, obviously one has to be careful about not overreaching in that respect, and talking down, or using language that's too base.  But I just wanted to comment on that, and I appreciate your comments, sir, and I think it is a continuing evolution; we have to make sure that we are getting it right.
		I do appreciate -- and I noted before -- the time that you spent on the IEP.  You made some very positive comments about the process itself, and
similarly we received very positive comments about the activities of all the
participants, including you.
		Looking forward, one of the decisions that we are going to have
to make is what is the appropriate nexus for having a continuing ability to
sample and judge our process going forward.
		One way is to do it using a FACA panel, such as this, which can be quite expensive and time consuming for our staff.  There are obviously
other ways of doing that which would engage stakeholders, including yourself,
and/or others.
		Any sense of whether it has got to be FACA-like going forward, or whether there are some other ways we can achieve the same results without the same complexity and costs?
		MR. SHADIS:  Well, I think that a loose poll of our panel members
would tell you that it would be pretty hard to get them to serve again.  
		(Laughter.)
		MR. PLISCO:  We did take a vote on who would be in the next
panel.
		MR. SHADIS:  We all had a good time, thank you very much, but it
is time consuming and extreme, and I have two banker boxes full of paper at
home as a result of involvement with this panel.
		So it is burdensome and it may not be the most efficient way
either of doing things, and I am not sure what the answer is, but we can do
better with our electronic communications certainly.
		And we ought to think about doing some of these meetings with
some sort of live electronic hookup so that people don't have to travel and can
still comment.
		And the other thing that would help, too, would be trying to
apply those plain language initiatives to the documentation as it moves forward
so that it is a little easier to follow.  
		And those issues that are high profile things, we would like to
be able to get a handle on them a little quicker and a little better. 
Monticello, for example, and the recent bellows compression thing.  And we
would be interested to see how that is rated in the new program.
		COMMISSIONER MERRIFIELD:  That's a fair comment.  As you go back to Maine and enjoy the summer, which is much more pleasant than the summers we have here in D.C., if you have further reflections on how we may improve our process, either as it relates to these panels and the stakeholder involvement, or the way in which we communicate, it would be helpful to receive further comment from you.
		Mr. Floyd, we had some specific comments from Mr. Hill that were supportive of where NEI was in the testimony that you made, but with some refinements and some concerns that Southern Nuclear had in particular about a couple of the performance indicators.
		And in both of those cases, those are areas where I think the
Commission has engaged quite rigorously previously, and the staff has engaged
with NEI to try to see if we can resolve that through pilots and through delving into some other issues.
		I guess I would turn the question back around since Mr. Hill
thought it was important to characterize those as an opinion of Southern, and
distinguish it from NEI.
		And I am wondering on the flip side what is the official NEI
position regarding some of the issues that Mr. Hill has raised?
		MR. FLOYD:  Well, I would say that where we are right today is that there is a process that has been established, the pilot process, and it has established evaluation criteria.
		What needs to be done now is to step back and take a look at what
the evaluation against the criteria tells us about the replacement indicators.
		Both the replacement for the SCRAM one and the in-plant power change one, which has yet to be piloted, but nonetheless there is an effort to try to initiate a pilot.
		As I mentioned, Manual Chapter 0608, which the staff has
developed, I think provides a very disciplined process, and requires the
establishment of performance criteria against which to evaluate changes to the
program.
		And our encouragement is that the staff needs to follow the process and let the answer be whatever comes out when you do follow the process.
		COMMISSIONER MERRIFIELD:  Do you think our staff is being
prejudgmental in terms of its analysis in that area, or is it really trying to
see if we can identify different ways of solving this --
		MR. FLOYD:  Oh, I think they are being very open to looking at alternatives, and I don't think there is any prejudice on their part on either of those indicators.
		We have had some very frank discussions on both of those
indicators, and an extensive give and take over the last year on both of those,
and I haven't seen any reluctance to consider alternatives at all.
		COMMISSIONER MERRIFIELD:  I don't know if I am going to have the last word on this particular one, but I do have to say that we had a discourse about resident inspectors earlier, and I do want to get a parting shot in, and that is that I have had the pleasure of meeting at this point over a hundred of our resident inspectors.
		And I think we all recognize the value they provide, in terms of being the sentinels of safety in this agency.  I want to compliment
our regional administrators, in terms that they have brought a -- you know, in
terms of the changeover that we have had -- and obviously those are areas where
we do get some new people.
		But the high quality of those individuals and the degree of
increasing diversity we have among those individuals is I think reflective of a
significant effort on the part of our regional administrators to make sure that
those people are of the highest quality.
		And I think that they may have to try harder to make sure that we fill those slots, which is what Commissioner McGaffigan has asserted.  But in terms of
the people that we are actually getting, I think they are terrific.  
		MR. FLOYD:  Thank you, Mr. Chairman.
		CHAIRMAN MESERVE:  The SECY paper associated with this meeting, of course, attached not only your report, but also included, as Attachment 5, the staff's response to your report.
		And I would be interested in whether you have any reactions to
the staff's response, and is there anything in there that disappoints you, or
suggests that the staff had not understood what you said, or intended to say,
or do you have any comments on the staff's reaction to all the work that you have done?
		MR. PLISCO:  I can say that I have read through the response, and
my reading of their response is that they understood clearly what the issues
are, and their response is reflective of our comments.
		And as I said earlier, we worked closely with the staff all
through our meetings because they were at most of our meetings, and I think we
spent a lot of time explaining to them the perspectives of the panel members of
what our issues are.
		So I think they had a very good understanding of what our concerns and issues were, and of the different perspectives of the panel members.
		CHAIRMAN MESERVE:  Well, the response not only indicates whether
they understood what you said, but what they intend to do about it.  Are you
all comfortable with that?
		MR. PLISCO:  I can say that they were responsive, and as an
example, there are a number of recommendations that you didn't see because the
staff responded to them long before we wrote our report.	
		And as a panel, we elected not to include those in our report, because they were taken care of.  So many of our recommendations through our
six months were handled, and we were happy with the resolution of those, and so
we didn't include those in our report.  So I think they have been very
responsive, and that those comments are responsive.
		CHAIRMAN MESERVE:  Any of the others, if you want to react to
that?
		MR. FLOYD:  I think there is good alignment, again, between the issues that were identified and what the industry thinks is important.  You
can't prejudge what the resolution of them will be, but I think the actions
that they have laid out to address each one of those are the right actions to
be taken, and we have no disagreement with those.
		CHAIRMAN MESERVE:  Okay.  
		MR. SHADIS:  I would like to comment on that if I may.  I think
the input, especially the critical input of external stakeholders, went through
a kind of filter process.  It had to in composing the panel report.
		And in general it was not reduced to single objective statements;
subject, predicate, object analysis.  A lot of it was qualitative in its view,
and that does not appear to me to be dealt with in full in the staff response.
		And I realize that it would be difficult, because the staff was
looking for specific chores to do, and they detailed out what they were going
to do.  
		But I think it bears, and it would probably be fruitful actually
to go back through some of the transcripts of the meetings, and some of the
comments that were submitted by those external stakeholders -- and the States
in particular -- and see if the staff can't wrestle through the creative
language, and get down to something that they can attack point by point.
		CHAIRMAN MESERVE:  Good.  Thank you.  Mr. Floyd, I want to pursue
one thing that you raised that was not on your slides, and it was very
interesting.  
		You indicated that when you went back through inspection reports you saw a correlation between the number of comments relating to the Corrective Action Program and where plants fell in the columns of the action matrix, and that more comments were correlated with the position on the action matrix.
		And from that you drew the conclusion that we were getting at the corrective action program adequately through the existing mechanism.  It seems
to me that there is another conclusion that one could draw, which is that maybe
we have stumbled on a predictive indicator.
		That we look at the corrective action program, and we are finding
something that correlates with risk, and you indicated that in fact you saw
some of the comments didn't relate to the areas that were the ones that caused
the plant to be in a given column.
		And I am just curious.  It seems to me that one could draw an
entirely different conclusion from the data that you provided than you did.
		MR. FLOYD:  I don't think that is inconsistent, and in fact we
think the entire construct of the oversight process is in and of itself a
predictive indicator, because I was a little bit struck by yesterday's
conversation at the meeting about the need for a predictive indicator.
		And the first question you have to ask yourself is predictive to
what, and if you are looking for an indicator that is predictive to when you
are going to have a SCRAM, and when you are going to have an unavailability
situation on a system, that is probably very difficult.
		But if you are looking for an indicator of when we have a significantly increased likelihood of a significant exposure to the public as a result of a problem at a nuclear power plant, the entire construct of the oversight process is set up to look for the erosion of margins in providing that level of protection, and trying to predict when that event might happen.
		So I think that is highly consistent that if you take a look at
the outcomes of the action matrix, and look at the importance of the corrective
action program, it is indeed a predictive indicator in that respect.
		CHAIRMAN MESERVE:  Let me ask this.  All of you spent an enormous
amount of time dealing with the program, and one of the issues that -- and as I
think I indicated with the earlier panel -- that we are worried about, and not
right now, is resources.
		And I would like to get your impressions of whether our allocation -- if you think our allocation of resources to this effort is appropriate; too great, too little.  And I think we have heard from Mr. Shadis on this point already.
		And, Mr. Merschoff, your comment was that perhaps we ought to dig
deeper in certain places.  But I would like to get your views.
		MR. PLISCO:  I'll start --
		CHAIRMAN MESERVE:  It is a little unfair to ask you.  
		MR. PLISCO:  Well, yes.  Well, I'll talk as the Chairman with the
Chairman's hat, first, of the panel; and as a panel, we really didn't spend a
lot of time looking at resources, because a lot of that information was not
available to us until the very end.  
		We did have some discussions in specific areas.  We had some stakeholders that raised issues about resources, specifically in the ALARA area, and the concern had to do with -- well, if you look at the performance indicators, and if you look at exposure clear across the industry over the last 10 years, there has been significant improvement.
		Yet, if you compare how many resources we are spending in the new
program compared to the old one, we are actually spending more in that area,
and that didn't seem to make sense.  
		We heard those comments from some stakeholders, but overall we
really didn't spend a lot of time looking at that.  Now, my regional hat, I
think the resources are about right.  
		We are making some minor changes here and there with experience,
with specific procedures -- and I am talking about a low level of detailed
minor changes, but overall I think right now the resources are right.
		CHAIRMAN MESERVE:  Mr. Floyd.
		MR. FLOYD:  We think that the way the program was developed, the resources are probably pretty close to being correct.  There was a lot of effort made in trying to look at what areas were risk significant in each of the seven cornerstones, and whether the elements in the inspection modules are necessary so that the objectives of the cornerstone could be measured, and you could draw a conclusion as to whether or not they are being met.
		We think that there are some efficiencies certainly that can come
into the program, and we are hopeful, and I hope not optimistically hopeful,
that the Phase Two work sheets will reduce some of the resources that have been
expended in the reactor safety findings area.
		As I mentioned in my remarks, I think we could take more
advantage of licensee self-assessments for those licensees who the NRC has good
confidence in that they do have a good self-assessment and corrective action
program capability, and there could be some efficiencies there.
		I would comment that if you look at the results that have been achieved in the program, it seems to me that we are looking at pretty well even.  If you look at slide seven of the staff's presentation, where I believe it was you, Mr. Chairman, that made the comment about the number of thresholds that have been crossed in the PIs, and in what areas.
		You have to remember, I think that the white threshold being
crossed is a departure from the norms of industry performance, and not
necessarily a risk-significant departure.  So there is a difference between
crossing the green and white threshold, and crossing the white and yellow
threshold in terms of risk significance.
		So what we are really seeing, I think, in the green and white
threshold column, where a preponderance of the indicators are, is where some
plants are just starting to deviate from where the rest of the industry
currently is.
		So it is identifying the smaller set of plants that have specific
problems in some focused areas.
		CHAIRMAN MESERVE:  Good.  Mr. Shadis, do you have any further
comments?
		MR. SHADIS:  Well, I think I know where you can get more
resources.  But I am just going to suggest that I believe you have to look at
the allocation of resources, and you have to look at all the programs.  
		It is a puzzle to the public why the agency spent resources to put three generic reactor designs on the approval shelf, when maybe nobody will ever use them.
		It is a puzzle the way that we do reach out for some of these
things when we have operating plants, and we are concerned about recruiting
inspectors, and the number that we have available and trained, and so on, and
it seems to us to misplace the focus.
		CHAIRMAN MESERVE:  As a regulator, we are required to respond to
applications that are submitted.  Commissioner Dicus.
		COMMISSIONER DICUS:  Okay.  Thank you.  Let me address the issue
of indicative versus predictive indicators.  I raised the issue yesterday, and
so I am going to go back to it.  
		And of course it is one of the tensions that has been listed in your report.  And I thought it was a curious comment that the indicators that we have now might all be considered predictive.
		But they are after the fact, and the Chairman brought up the
issue of whether or not the corrective action programs are really a predictive
indicator.  I would like for you to expand on that.
		MR. FLOYD:  Sure.  I don't think that the indicators in the
programs themselves are individually predictive.  What I meant to say was that
the entire program, the construct of the entire program itself, is predictive
in nature because it is looking at margins to when a plant might have a threat
which might be significant to public health and safety.
		The only two indicators that historically -- and we agreed with
the staff evaluation on this -- that did have some correlation with the past
plants that had significant problems, and that had some leading capability,
were the safety system functional failures, and the unplanned power change PI.
		COMMISSIONER DICUS:  Would you care to comment, Mr. Shadis?
		MR. SHADIS:  Well, if your local bookmaker gives odds on a horse,
I call that predictive; and therefore your probablistic risk assessment is all
predictive, in the sense it says these are the odds of the sequence of events
happening.  
		And the rest of it is not, and it is indicative.  I don't know
that we can really find a way to get into predictive space.
		COMMISSIONER DICUS:  That sort of goes to some comments that you have made in your submitted testimony about whether we are finding everything that we need to look for, and you quoted me in a question that I asked in 1997, I think it was.
		And if we looked at all the plants in an in-depth review, would
we find the same thing that we found at Maine Yankee; and so that's why I come
back to the predictive question.  
		The other thing that I wanted to just briefly review -- and I know that the time has gotten away from us -- has to do with what somebody has already brought up, the plain language issue.
		I know -- and again looking at your testimony, we don't -- one of
the things that we are criticized for is not talking in plain language.  And
that is not to say that the language needs to be simplified or whatever.
		But we have got to be able to talk in terms of when we are
talking to someone that may not be well versed on a technical issue, for
example.  That we can accurately and clearly explain the situation.
		And that is what we attempt to do, as you know.  I had the opportunity last week when I was in Connecticut to have a breakfast meeting -- unfortunately we didn't have enough time, and it got abbreviated -- and local officials, and public interest groups, and Mr. Shadis, and quite a few people from Maine actually were there.
		And we talked about some of these things, and about how we could better communicate, and that is one of the issues that the Commission is looking at.
		We also talked about another issue that you brought up verbally, about participation with external stakeholders that represent public interest groups, and that represent the public, and the difficulty that it presents.
		And one of the things that we discussed last week is funding for
various groups, and how this should be.  Would you care to elaborate some on
that from any thoughts that you have?  
		This came up yesterday with Dr. Lyman as well, the difficulty
that activist groups may have in being able to attend meetings, and to be part
of them.  And that is of concern to us, and interest to this Commission.
		MR. SHADIS:  My loose polling of activists -- I have tried to get local and regional activists to, for example, come down and participate in the regulatory information conference.
		And many of them just don't want to have anything to do with NRC
processes.  They have made their judgment, and they don't see anything on the
horizon that is going to convince them that the NRC isn't a glove on the hand
of the industry.
		And that is their perception, and so these things are
problematic, in terms of reestablishing trust.  One of the things about any
independent advisory board is that it ought to have its own independently
arrived at structure. 
		It ought to have resources allocated to it so that it can
independently select and call forward expertise, and expend that money. 
Secondly, if you are going to involve citizen activists, you have to realize I
think -- I sort of hate to use the words "have to."
		But it is imperative that you have to realize that people have to
earn a living somewhere and most people are not paid to do this kind of thing. 
So some sort of compensation really should be provided to panel members.
		And I realize that all this stuff is problematic, and it all
needs to be worked out, but what I am pushing for here essentially is
independence in the structure, and the place that any panel may be coming from.
		I mean, it was my take, and I joined this group, and a fine group
it was, too.  But my take was that I was coming into a room where most of the
fellows involved were speaking the same language, and coming from the same
common set of experiences.
		And that cultural cohesion really blurred the distinction between
regulator and licensee, and that it was sort of a foregone conclusion that the
program is working pretty good and ought to continue.  
		Well, I could have written that on the first day, but we went a long way to get there.
		COMMISSIONER DICUS:  Well, you make that comment in your
submittal, but you also make the comment that you thought that the experience
was quite positive. 
		MR. SHADIS:  Oh, yes, very much so.
		COMMISSIONER DICUS:  That's all.  Thank you.
		CHAIRMAN MESERVE:  I would like to thank the panel.  I know that this
was an enormous amount of work for you to come to the meetings that you came
to, and endure all of the assessment that you had to undertake to draft a
report. 
		It is very, very much appreciated, and we appreciate your effort. 
I would like to thank both panels for their participation this morning. This
has been very interesting and very helpful.  With that, we are adjourned.
		(Whereupon, the meeting was concluded at 11:59 a.m.)