<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2//EN">
<HTML>
<HEAD>
<META HTTP-EQUIV="Content-Type" CONTENT="text/html; charset=us-ascii">
<META NAME="Generator" CONTENT="MS Exchange Server version 6.5.7036.0">
<TITLE>RE: [mpich2-maint] #403: mpiexec kills the remote login shell</TITLE>
</HEAD>
<BODY>


<P><FONT SIZE=2> Hi,<BR>
Can you try out the attached patch? It contains some extra debug statements that will help us narrow down your problem.<BR>
<BR>
Applying the patch:<BR>
---------------------<BR>
1) Change directory to the top level of the MPICH2 source.<BR>
2) patch -p0 &lt; mpich2_1_0_8_Korebot.patch<BR>
3) Re-compile & re-install MPICH2.<BR>
<BR>
Now re-run smpd & mpiexec in debug mode with a simple MPI program, hellow.c (smpd -d > smpd.log / mpiexec -verbose -n 1 ./hellow > mpiexec.log).<BR>
<BR>
Regards,<BR>
Jayesh<BR>
<BR>
-----Original Message-----<BR>
From: Yu-Cheng Chou [<A HREF="mailto:cycchou@ucdavis.edu">mailto:cycchou@ucdavis.edu</A>]<BR>
Sent: Thursday, February 12, 2009 1:33 PM<BR>
To: Jayesh Krishna<BR>
Subject: Re: [mpich2-maint] #403: mpiexec kills the remote login shell<BR>
<BR>
Hi,<BR>
For the first question, I am not able to get a core dump for mpiexec/hellow/ssh on the Korebot because of its limited memory.<BR>
<BR>
For the second question, I can run such a program with fflush(stdout) and fflush(stderr) statements on the Korebot.<BR>
<BR>
Thank you<BR>
<BR>
On Thu, Feb 12, 2009 at 11:14 AM, Jayesh Krishna &lt;jayesh@mcs.anl.gov&gt; wrote:<BR>
> Hi,<BR>
> I have yet to make the debug module (shouldn't take much time). The<BR>
> answers to the questions in my previous email will help me put in the<BR>
> right debug statements.<BR>
><BR>
> Regards,<BR>
> Jayesh<BR>
><BR>
> -----Original Message-----<BR>
> From: Yu-Cheng Chou [<A HREF="mailto:cycchou@ucdavis.edu">mailto:cycchou@ucdavis.edu</A>]<BR>
> Sent: Thursday, February 12, 2009 12:23 PM<BR>
> To: Jayesh Krishna<BR>
> Subject: Re: [mpich2-maint] #403: mpiexec kills the remote login shell<BR>
><BR>
> Hi,<BR>
> Could you give me the debug module directly?<BR>
> Thank you<BR>
><BR>
> On Thu, Feb 12, 2009 at 10:15 AM, Jayesh Krishna &lt;jayesh@mcs.anl.gov&gt; wrote:<BR>
>> Hi,<BR>
>> Do you get a core dump of mpiexec/hellow/ssh? (If yes, what does it<BR>
>> show?)<BR>
>> Can you run a simple non-MPI C program with fflush(stdout) &<BR>
>> fflush(stderr) in it?<BR>
>> If the above suggestions don't narrow down the problem, I will give<BR>
>> you a debug module (a patch with some extra printfs) to help us narrow<BR>
>> down the problem.<BR>
>><BR>
>> (PS: I looked into the code, but cannot think of anything that might<BR>
>> fail in your environment.)<BR>
>><BR>
>> Regards,<BR>
>> Jayesh<BR>
>><BR>
>> -----Original Message-----<BR>
>> From: Yu-Cheng Chou [<A HREF="mailto:cycchou@ucdavis.edu">mailto:cycchou@ucdavis.edu</A>]<BR>
>> Sent: Thursday, February 05, 2009 4:27 PM<BR>
>> To: Jayesh Krishna<BR>
>> Cc: mpich2-maint@mcs.anl.gov<BR>
>> Subject: Re: [mpich2-maint] #403: mpiexec kills the remote login<BR>
>> shell<BR>
>><BR>
>>> Hi,<BR>
>>> The debug outputs look normal (the problem could be with the part of<BR>
>>> the code at mpiexec's exit(), which has no debug statements). I have<BR>
>>> added this to our bug tracking list.<BR>
>>> Meanwhile,<BR>
>>><BR>
>>> # Can you send us your ".smpd" config file ?<BR>
>><BR>
>> The ".smpd" file contains only the following single line.<BR>
>><BR>
>> phrase=123<BR>
>><BR>
>>> # Did you modify the MPICH2 code to run on Korebot (Please send us<BR>
>>> your configure command & any env settings set to configure/make<BR>
>>> MPICH2)?<BR>
>><BR>
>> I did not modify the MPICH2 source code.<BR>
>> The configure command that I used is listed below.<BR>
>><BR>
>> ./configure LDFLAGS=-L/tmp/korebot/mpich2-1.0.8/korebot_openssl/lib<BR>
>> --host=arm-linux --with-cross=crosstype --with-pm=smpd --with-mpe=no<BR>
>> --disable-f90 --disable-f77 --disable-cxx<BR>
>> --prefix=/tmp/korebot/mpich2-1.0.8/korebot_mpich2<BR>
>><BR>
>> The "korebot_openssl/lib" directory contains the libraries needed for building smpd.<BR>
>><BR>
>> The content of the file "crosstype" is listed below.<BR>
>><BR>
>> CROSS_SIZEOF_FLOAT_INT=8<BR>
>> CROSS_SIZEOF_DOUBLE_INT=12<BR>
>> CROSS_SIZEOF_LONG_INT=8<BR>
>> CROSS_SIZEOF_SHORT_INT=8<BR>
>> CROSS_SIZEOF_2_INT=8<BR>
>> CROSS_SIZEOF_LONG_DOUBLE_INT=16<BR>
>><BR>
>> Thank you<BR>
>><BR>
>><BR>
>>> > -----Original Message-----<BR>
>>> > From: Yu-Cheng Chou [<A HREF="mailto:cycchou@ucdavis.edu">mailto:cycchou@ucdavis.edu</A>]<BR>
>>> > Sent: Wednesday, February 04, 2009 2:32 PM<BR>
>>> > To: Jayesh Krishna<BR>
>>> > Cc: mpich-discuss@mcs.anl.gov<BR>
>>> > Subject: Re: [mpich-discuss] mpiexec kills the remote login shell<BR>
>>> ><BR>
>>> > > Hi,<BR>
>>> > > Does smpd abort when you run your MPI job ?<BR>
>>> ><BR>
>>> > No.<BR>
>>> ><BR>
>>> > ><BR>
>>> > > Regards,<BR>
>>> > > Jayesh<BR>
>>> > ><BR>
>>> > > -----Original Message-----<BR>
>>> > > From: Yu-Cheng Chou [<A HREF="mailto:cycchou@ucdavis.edu">mailto:cycchou@ucdavis.edu</A>]<BR>
>>> > > Sent: Wednesday, February 04, 2009 1:56 PM<BR>
>>> > > To: Jayesh Krishna<BR>
>>> > > Cc: mpich-discuss@mcs.anl.gov<BR>
>>> > > Subject: Re: [mpich-discuss] mpiexec kills the remote login shell<BR>
>>> > ><BR>
>>> > > Hi<BR>
>>> > ><BR>
>>> > > I can cross-compile the program and then simply run the executable on<BR>
>>> > > Korebot with no errors.<BR>
>>> > ><BR>
>>> > ><BR>
>>> > >> Hi,<BR>
>>> > >> Can you try running (without mpiexec) a simple C program with<BR>
>>> > >> exit(-1) on Korebot ?<BR>
>>> > >><BR>
>>> > >> ========================================<BR>
>>> > >> #include &lt;stdlib.h&gt;<BR>
>>> > >> int main(int argc, char *argv[])<BR>
>>> > >> {<BR>
>>> > >>     exit(-1);<BR>
>>> > >> }<BR>
>>> > >> ========================================<BR>
>>> > >><BR>
>>> > >> Regards,<BR>
>>> > >> Jayesh<BR>
>>> > >> ________________________________<BR>
>>> > >> From: mpich-discuss-bounces@mcs.anl.gov<BR>
>>> > >> [<A HREF="mailto:mpich-discuss-bounces@mcs.anl.gov">mailto:mpich-discuss-bounces@mcs.anl.gov</A>] On Behalf Of Jayesh<BR>
>>> > >> Krishna<BR>
>>> > >> Sent: Wednesday, February 04, 2009 1:04 PM<BR>
>>> > >> To: 'Yu-Cheng Chou'<BR>
>>> > >> Cc: mpich-discuss@mcs.anl.gov<BR>
>>> > >> Subject: Re: [mpich-discuss] mpiexec kills the remote login shell<BR>
>>> > >><BR>
>>> > >> Hi,<BR>
>>> > >> Can you also attach the corresponding smpd debug output ?<BR>
>>> > >><BR>
>>> > >> Regards,<BR>
>>> > >> Jayesh<BR>
>>> > >><BR>
>>> > >> -----Original Message-----<BR>
>>> > >> From: Yu-Cheng Chou [<A HREF="mailto:cycchou@ucdavis.edu">mailto:cycchou@ucdavis.edu</A>]<BR>
>>> > >> Sent: Wednesday, February 04, 2009 1:02 PM<BR>
>>> > >> To: Jayesh Krishna<BR>
>>> > >> Cc: mpich-discuss@mcs.anl.gov<BR>
>>> > >> Subject: Re: [mpich-discuss] mpiexec kills the remote login shell<BR>
>>> > >><BR>
>>> > >> Hi,<BR>
>>> > >><BR>
>>> > >> Firstly, the previously attached mpiexec verbose output is a wrong one.<BR>
>>> > >> I've attached the correct one to this email.<BR>
>>> > >><BR>
>>> > >> Secondly, I want to point out that as long as mpiexec is initiated<BR>
>>> > >> from Korebot to run a program, no matter whether it's an MPI or non-MPI<BR>
>>> > >> program, and no matter whether the program can be found or not, as soon<BR>
>>> > >> as mpiexec is finished, the ssh connection to Korebot will be gone.<BR>
>>> > >><BR>
>>> > >> Thank you<BR>
>>> > >><BR>
>>> > >><BR>
>>> > >>> Hi,<BR>
>>> > >>> The mpiexec output shows the following error when running hellow,<BR>
>>> > >>> ==================<BR>
>>> > >>><BR>
>>> > >>> Unable to exec 'hello' on korebot<BR>
>>> > >>><BR>
>>> > >>> Error 2 - No such file or directory<BR>
>>> > >>><BR>
>>> > >>> ==================<BR>
>>> > >>><BR>
>>> > >>> Please provide the debug output of smpd (smpd -d 2>&1 | tee<BR>
>>> > >>> smpd.out) along with mpiexec (mpiexec -verbose -n 2 ./hellow 2>&1 |<BR>
>>> > >>> tee mpiexec.out).<BR>
>>> > >>><BR>
>>> > >>> # Can you run simple C programs (without using mpiexec) on Korebot ?<BR>
>>> > >>> # Is the ssh connection aborted when you run non-MPI programs<BR>
>>> > >>> (mpiexec -n 2 hostname) ?<BR>
>>> > >>> # Can you send us your ".smpd" config file ?<BR>
>>> > >>> # Did you modify the MPICH2 code to run on Korebot (Please send us<BR>
>>> > >>> your configure command & any env settings set to configure/make<BR>
>>> > >>> MPICH2)?<BR>
>>> > >>><BR>
>>> > >>> Regards,<BR>
>>> > >>> Jayesh<BR>
>>> > >>><BR>
>>> > >>> ________________________________<BR>
>>> > >>> From: mpich-discuss-bounces@mcs.anl.gov<BR>
>>> > >>> [<A HREF="mailto:mpich-discuss-bounces@mcs.anl.gov">mailto:mpich-discuss-bounces@mcs.anl.gov</A>] On Behalf Of Jayesh<BR>
>>> > >>> Krishna<BR>
>>> > >>> Sent: Wednesday, February 04, 2009 8:41 AM<BR>
>>> > >>> To: 'Yu-Cheng Chou'<BR>
>>> > >>> Cc: mpich-discuss@mcs.anl.gov<BR>
>>> > >>> Subject: Re: [mpich-discuss] mpiexec kills the remote login shell<BR>
>>> > >>><BR>
>>> > >>> Hi,<BR>
>>> > >>> I will take a look at the debug logs and get back to you.<BR>
>>> > >>> Meanwhile, can you run simple C programs without using mpiexec on<BR>
>>> > >>> Korebot ?<BR>
>>> > >>> MPICH2 currently does not support heterogeneous systems (so you<BR>
>>> > >>> won't be able to run your MPI job across ARM & other architectures).<BR>
>>> > >>><BR>
>>> > >>> -----Original Message-----<BR>
>>> > >>> From: Yu-Cheng Chou [<A HREF="mailto:cycchou@ucdavis.edu">mailto:cycchou@ucdavis.edu</A>]<BR>
>>> > >>> Sent: Tuesday, February 03, 2009 7:52 PM<BR>
>>> > >>> To: Jayesh Krishna<BR>
>>> > >>> Cc: mpich-discuss@mcs.anl.gov<BR>
>>> > >>> Subject: Re: [mpich-discuss] mpiexec kills the remote login shell<BR>
>>> > >>><BR>
>>> > >>>> # Can you run non-MPI programs using mpiexec (mpiexec -n 2 hostname) ?<BR>
>>> > >>> Yes.<BR>
>>> > >>><BR>
>>> > >>>> # Can you compile and run the hello world program<BR>
>>> > >>>> (examples/hellow.c) provided with MPICH2 (mpiexec -n 2 ./hellow)?<BR>
>>> > >>> Yes.<BR>
>>> > >>><BR>
>>> > >>>> # How did you start smpd (the command used to start smpd) ? How did<BR>
>>> > >>>> you run your MPI job (the command used to run your job)?<BR>
>>> > >>> I have a ".smpd" file containing one line of information, which is<BR>
>>> > >>> "phrase=123".<BR>
>>> > >>> Thus, I started smpd using "smpd -s".<BR>
>>> > >>> Then I used "mpiexec -n 1 hellow" to run hellow on Korebot.<BR>
>>> > >>><BR>
>>> > >>>> # How did you find that mpiexec kills the sshd process (We<BR>
>>> > >>>> typically ssh to unix machines and run mpiexec without any<BR>
>>> > >>>> problems) ?<BR>
>>> > >>> I logged in to Korebot with two terminals.<BR>
>>> > >>> From terminal #1, I checked all the processes running on Korebot.<BR>
>>> > >>> From terminal #2, I started smpd and ran hellow using the commands<BR>
>>> > >>> mentioned above.<BR>
>>> > >>> After hellow was finished, the connection to Korebot via terminal #2<BR>
>>> > >>> was closed.<BR>
>>> > >>> From terminal #1, I saw that the sshd process associated with<BR>
>>> > >>> terminal #2 was gone.<BR>
>>> > >>><BR>
>>> > >>>> Can you run smpd/mpiexec in debug mode and provide us with the<BR>
>>> > >>>> outputs (smpd -d / mpiexec -n 2 -verbose hostname) ?<BR>
>>> > >>> The first attached text file is the output from running hellow in<BR>
>>> > >>> mpiexec's verbose mode.<BR>
>>> > >>><BR>
>>> > >>><BR>
>>> > >>> There is another issue.<BR>
>>> > >>> This time, I used two machines. One is the Korebot mentioned above,<BR>
>>> > >>> and the other is a laptop running the Ubuntu Linux OS.<BR>
>>> > >>> I started smpd with the same ".smpd" file and command as mentioned<BR>
>>> > >>> above on both the Korebot and the laptop.<BR>
>>> > >>> There is a machine file called "hostfile" on the Korebot. The file<BR>
>>> > >>> contains the names of the two machines:<BR>
>>> > >>><BR>
>>> > >>> korebot<BR>
>>> > >>> shrimp<BR>
>>> > >>><BR>
>>> > >>> Then from the Korebot, I ran cpi using the following command.<BR>
>>> > >>><BR>
>>> > >>> mpiexec -machinefile ./hostfile -verbose -n 2 cpi<BR>
>>> > >>><BR>
>>> > >>> But the value of pi is a huge number. I think it is related to<BR>
>>> > >>> "double type variables" being transferred between processes running<BR>
>>> > >>> on an ARM-based Linux machine and a general Linux machine.<BR>
>>> > >>><BR>
>>> > >>> The second attached text file is the output from running cpi in<BR>
>>> > >>> mpiexec's verbose mode.<BR>
>>> > >>><BR>
>>> > >>><BR>
>>> > >>>><BR>
>>> > >>>> I am cross-compiling mpich2-1.0.8 with smpd for the Khepera III<BR>
>>> > >>>> mobile robot.<BR>
>>> > >>>><BR>
>>> > >>>> This mobile robot has a Korebot board, which is an ARM-based<BR>
>>> > >>>> computer with a Linux operating system.<BR>
>>> > >>>><BR>
>>> > >>>> The cross-compilation was fine.<BR>
>>> > >>>><BR>
>>> > >>>> Firstly, I logged in to Korebot through ssh.<BR>
>>> > >>>> Secondly, I started smpd.<BR>
>>> > >>>> Thirdly, I ran mpiexec to execute an MPI program (cpi) that comes<BR>
>>> > >>>> with the package.<BR>
>>> > >>>><BR>
>>> > >>>> The result was correct, but when mpiexec was finished, the ssh<BR>
>>> > >>>> connection to the Korebot was closed.<BR>
>>> > >>>> I found that mpiexec kills the sshd process through which I was<BR>
>>> > >>>> remotely connected to Korebot.<BR>
>>> > >>>><BR>
>>> > >>>> I've been looking for the cause, but still have not found any clues.<BR>
>>> > >>>><BR>
>>> > >>>> Could you give me any ideas to solve this problem?<BR>
>>> > >>>><BR>
>>> > >>>> Thank you,<BR>
>>> > >>>><BR>
>>> > >>>> Yu-Cheng<BR>
>>> > >>>><BR>
>>> > >>><BR>
>>> > >><BR>
>>> > ><BR>
>>> ><BR>
>>> }}}<BR>
>>><BR>
>>><BR>
>>> --<BR>
>>> Ticket URL: &lt;<A HREF="https://trac.mcs.anl.gov/projects/mpich2/ticket/403">https://trac.mcs.anl.gov/projects/mpich2/ticket/403</A>&gt;<BR>
>>><BR>
>><BR>
><BR>
</FONT>
</P>

</BODY>
</HTML>