Hi,

I have tried to compile Molpro 2012.1 using Intel's composer_xe_2013.1.117 (C compiler, Fortran compiler and the MKL library) and OpenMPI, both versions 1.6.2 and 1.4.3. However, every time I run the test jobs I get the same error message (see below). Both Molpro and OpenMPI were compiled on the frontend.

I configured Molpro this way:

  ./configure -icc -i8 -mpp -openmpi -mppbase /software/kemi/openmpi-1.4.3/include/ -prefix /kemi/obessal/MOLPRO -batch

then built with GNU Make 3.81 and ran "make test".

How can I solve this problem?


Running job aims.test
CMA: no RDMA devices found

 GLOBAL ERROR fehler on processor 0
 0: fehler 0 (0).
 0: In mpi_utils.c [MPIGA_Error]: now exiting...
--------------------------------------------------------------------------
[[44089,1],0]: A high-performance Open MPI point-to-point messaging module
was unable to find any relevant network interfaces:

Module: OpenFabrics (openib)
 Host: fend03.dcsc.ku.dk

Another transport will be used instead, although this may result in
lower performance.
--------------------------------------------------------------------------
Received signal 11 Segmentation violation
 0: fehler 0 (0).
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 0.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun has exited due to process rank 0 with PID 11981 on
node fend03.dcsc.ku.dk exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
**** PROBLEMS WITH JOB aims.test
aims.test: ERRORS DETECTED: non-zero return code ... inspect output
**** For further information, look in the output file
**** /users/kemi/obessal/MOLPRO/Molpro/testjobs/aims.errout


Bonus question: is it also important to compile Molpro and OpenMPI with the same compiler?


Best regards,
- Lasse
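P.S. In case it helps, here is the build and test sequence written out as a shell sketch. The two export lines are assumptions I have not actually verified: the PATH line only reflects my guess that the mpirun matching the -mppbase OpenMPI should be the one found first, and the OMPI_MCA_btl line is just an idea for ruling out the openib warning; neither comes from the Molpro documentation.

  # assumption: put the OpenMPI 1.4.3 tree given to -mppbase first in PATH
  export PATH=/software/kemi/openmpi-1.4.3/bin:$PATH
  # assumption: restrict Open MPI to shared-memory/TCP transports so openib is not tried
  export OMPI_MCA_btl="self,sm,tcp"

  ./configure -icc -i8 -mpp -openmpi -mppbase /software/kemi/openmpi-1.4.3/include/ -prefix /kemi/obessal/MOLPRO -batch
  make        # GNU Make 3.81
  make test   # currently fails in aims.test with the output above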