
Re: Event reconstruction times in STAF as a function of impact parameter



"Maguire, Charles F" wrote:

>         A set of STAF runs has been done to obtain CPU times as a
> function of impact parameter for Central Arm event reconstruction.
> The input PISA hits files at fixed impact parameter and at fixed Z=0 cm
> were actually generated with HIJING in late August as part of the
> descoping studies.

Super!  This will help a lot in the exercise we're going through with the
RCF, revisiting our CPU estimates.  Last week, since I didn't have these
numbers at hand, I had to take my best guess at the CPU difference between
handling a min bias sample and handling a central sample.  Thanks!
 
>         3) A second order polynomial fit for the CPU time as a
>            function of mean multiplicity gives, just for fun,
> 
>            T =   29    +    0.0061*M  +  0.0000042*M**2
>                +/- 4    +/- 0.0047   +/- 0.0000007
> 
>            with a chisquare/deg ~1 assuming 5% errors on the times.

Since the events occur with a geometrically weighted impact parameter
distribution (I'm presuming nobody actually knows the multiplicity
distribution yet!), I fit your raw numbers as a function of impact parameter
instead.  I've attached a little ROOT macro (bdist.C) which shows what I get.

However, what I really want is the CPU need averaged over a min bias sample.
Since the CPU time per event is a nearly linear function of b (not quite, but
close enough, and interesting in its own right), I'll assume that
<cpu(b)> ~ cpu(<b>).
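
(How good is that shortcut?  For a quadratic fit cpu(b) = p0 + p1*b + p2*b**2
the average works out exactly to

    <cpu(b)> = cpu(<b>) + p2 * (<b**2> - <b>**2),

i.e. the shortcut is off by the quadratic coefficient times the variance of b.
For dN/db ~ b cut off at bmax, that variance is bmax**2/18.  Taking the fit
coefficients that appear in the second macro below, that's about
3.3 * 14**2 / 18 ~ 36 seconds at bmax = 14 fm, so the shortcut underestimates
the average by roughly 6% of the central time.)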

Then I'll assume that the impact parameter distribution of the events we accept
is unmodified from the geometrically weighted one except for a `bmax' that our
trigger imposes.  Of course, the right thing to do is convolve with the
acceptance of our trigger.  Under these assumptions, <b> = 2/3 * bmax.
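
(The 2/3 follows directly from the geometric weighting: with dN/db ~ b from 0
to bmax,

    <b> = integral(b * b db) / integral(b db)
        = (bmax**3/3) / (bmax**2/2)
        = 2/3 * bmax.)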

Next, I made a little plot of the CPU need as a function of our bmax cutoff.
I did this relative to the CPU need for central events.  I've attached a
second ROOT macro that shows that plot.
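
As a cross-check on the <b> shortcut, here's a quick sketch of a third macro
(not one of the two attachments).  It computes the relative CPU need both
ways: evaluating the fit at <b> = 2/3 * bmax, and doing the b-weighted
average exactly.  The coefficients 591, -86, 3.3 are just the (rounded) pol2
fit values that appear in the second macro, and the sharp bmax cutoff is the
assumption above, so the output only roughly matches the 15% and 30% numbers
quoted below; the exact average comes out a few percent higher than the
shortcut.

{
gROOT->Reset();

// quadratic fit to the CPU time per event, coefficients from bdist.C
TF1 *cpu = new TF1("cpu", "591 - 86*x + 3.3*x*x", 0, 15);
// integrand for the exact geometric average: b * cpu(b)
TF1 *bcpu = new TF1("bcpu", "x*(591 - 86*x + 3.3*x*x)", 0, 15);

Double_t central = cpu->Eval(0);

for (Double_t bmax = 10; bmax <= 14; bmax += 4) {
  // shortcut: evaluate the fit at <b> = 2/3 * bmax
  Double_t approx = cpu->Eval(2*bmax/3) / central;
  // exact: integral(b*cpu(b) db) / integral(b db), with a sharp bmax cutoff
  Double_t exact = bcpu->Integral(0, bmax) / (bmax*bmax/2) / central;
  printf("bmax = %4.1f fm: shortcut %4.1f%%, exact %4.1f%%\n",
         bmax, 100*approx, 100*exact);
}
}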

Anyway, the gist is this: if I assume a bmax of 14 fm, the CPU need is 15% of
a central sample; if I assume a bmax of 10 fm, the CPU need is 30% of a central
sample.  Just another one of those places where a factor of two uncertainty is
easy to come by!

So, do we know what kind of bias there is in our min bias trigger?  Is a 10fm
cutoff reasonable?  Is there a strong rolloff in acceptance for peripheral
events? 

Dave

-- 
David Morrison  Brookhaven National Laboratory  phone: 516-344-5840
                Physics Department, Bldg 510 C    fax: 516-344-3253
                          Upton, NY 11973-5000  email: dave@bnl.gov
{
gROOT->Reset();

// CPU time (seconds) per event from the STAF runs, measured at impact
// parameters b = 0, 2, 4, ..., 14 fm
Int_t i;
Float_t values[8] = {532, 439, 302, 177, 97, 60, 47, 30};
Float_t x, e;

gStyle->SetOptFit(1);

c1 = new TCanvas("c1","The Fit Canvas",200,10,700,500);
TF1 *func = new TF1("fitf","pol2(0)",0,15);

// 8 bins from -1 to 15 fm so the bin centers fall exactly on
// b = 0, 2, ..., 14 fm, where the times were measured
hpx = new TH1F("hpx", "CPU vs impact parameter", 8, -1, 15);
hpx->SetFillColor(2);

for (i = 0; i < 8; i++) {
  x = values[i];
  e = 0.05*values[i];       // assume 5% errors on the times
  hpx->Fill(2*i, x);        // enters the CPU time as the bin content
  hpx->SetBinError(i+1,e);
}

// fit using the 5% bin errors; the "W" option would ignore them
hpx->Fit("fitf");

hpx->Draw();
c1->Modified();
c1->Update();

}

{
gROOT->Reset();
c1 = new TCanvas("c1","CPU vs min bias cutoff",200,10,700,500);

// CPU need relative to central as a function of the trigger's bmax cutoff:
// evaluate the pol2 fit from bdist.C at <b> = 2*bmax/3 and divide by its
// value at b = 0 (591 seconds)
fun1 = new TF1("fun1","(591-86*(2*x/3)+3.3*(2*x/3)*(2*x/3))/591",0,14);
fun1->Draw();

c1->Update();
}