[molpro-user] ARMCI test errors after compiling with GA/MPICH2, gfortran, ATLAS on Ubuntu 11.04
Gregory Magoon
gmagoon at MIT.EDU
Fri May 20 07:17:26 BST 2011
Thanks very much for the tips, Andy. I get different test output now, but it
still doesn't seem to be working properly, in essentially the same places (see
below). I noticed that ga_BLAS_SIZE = 4 appears in the configure script output
(even though INTEGER=8 is set in the CONFIG file); does this indicate a
problem? I'll look into some of your other suggestions tomorrow.
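In case it is useful to anyone following along: if the 4-byte BLAS interface
reported by GA really is being mixed with the INTEGER=8 Molpro build, the
quickest way to rule that out is probably the -noblas/-nolapack route
suggested below, i.e. reconfiguring with the same options as before but with
the external BLAS/LAPACK disabled (an untested sketch on my side):
  ./configure -batch -gcc -gfortran -mpp -auto-ga-tcgmsg-mpich2 \
      -noblas -nolapack -instroot /usr/local/molpro2010.1
Molpro should then fall back to its internal BLAS/LAPACK routines, which
should be built with the same integer size as the rest of the code.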
Below are:
1. partial test output
2. configure script output
make[1]: Entering directory `/home/user/Molpro/testjobs'
Running job Cs_DKH10.test
Running job Cs_DKH2.test
Running job Cs_DKH2_standard.test
Running job Cs_DKH3.test
Running job Cs_DKH4.test
Running job Cs_DKH7.test
Running job Cs_DKH8.test
Running job Cs_nr.test
Running job allene_opt.test
Running job allyl_cipt2.test
Running job allyl_ls.test
Running job ar2_dk_dummy.test
Running job au2o_optdftecp1.test
Running job au2o_optdftecp2.test
Running job au2o_optecp.test
2:2:fehler:: 1
(rank:2 hostname:kamet pid:828):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
3:3:fehler:: 1
(rank:3 hostname:kamet pid:829):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
4:4:fehler:: 1
(rank:4 hostname:kamet pid:830):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
1:1:fehler:: 1
(rank:1 hostname:kamet pid:827):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
5:5:fehler:: 1
(rank:5 hostname:kamet pid:831):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
6:6:fehler:: 1
(rank:6 hostname:kamet pid:833):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
7:7:fehler:: 1
(rank:7 hostname:kamet pid:834):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
=====================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= EXIT CODE: 256
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
=====================================================================================
**** PROBLEMS WITH JOB au2o_optecp.test
**** For further information, look in the output file
**** /home/user/Molpro/testjobs/au2o_optecp.errout
Running job aucs4k2.test
Running job b_cidft.test
Running job basisinput.test
Running job bccd_opt.test
Running job bccd_save.test
Running job benz_nlmo.test
Running job benzol_giao.test
Running job big_lattice.test
Running job br2_f12_multgem.test
5:5:fehler:: 1
(rank:5 hostname:kamet pid:9652):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
2:2:fehler:: 1
(rank:2 hostname:kamet pid:9649):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
6:6:fehler:: 1
(rank:6 hostname:kamet pid:9653):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
7:7:fehler:: 1
(rank:7 hostname:kamet pid:9654):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
=====================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= EXIT CODE: 256
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
=====================================================================================
**** PROBLEMS WITH JOB br2_f12_multgem.test
**** For further information, look in the output file
**** /home/user/Molpro/testjobs/br2_f12_multgem.errout
Running job c2f4_cosmo.test
Running job c2h2_dfmp2.test
Running job c2h4_c1_freq.test
Running job c2h4_ccsd-f12.test
2:2:fehler:: 1
(rank:2 hostname:kamet pid:10664):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
3:3:fehler:: 1
(rank:3 hostname:kamet pid:10665):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
4:4:fehler:: 1
(rank:4 hostname:kamet pid:10666):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
5:5:fehler:: 1
(rank:5 hostname:kamet pid:10667):ARMCI DASSERT fail.
src/armci.c:ARMCI_Error():276 cond:0
=====================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= EXIT CODE: 256
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
=====================================================================================
**** PROBLEMS WITH JOB c2h4_ccsd-f12.test
**** For further information, look in the output file
**** /home/user/Molpro/testjobs/c2h4_ccsd-f12.errout
Running job c2h4_ccsdfreq.test
user at kamet:~/Molpro$ ./configure -batch -gcc -gfortran -mpp
-auto-ga-tcgmsg-mpich2 -instroot /usr/local/molpro2010.1
machine type recognized as x86_64 (Generic 64-bit)
kernel recognized as Linux
user request compiler gfortran
GNU Fortran Compiler, Version 4.5.2
FC=/usr/bin/gfortran
user request compiler gcc
GNU Compiler Collection, Version 4.5.2
CC=/usr/bin/gcc
Use BLAS library - Automatically Tuned Linear Algebra Software (ATLAS)
BLASLIB=-L/usr/lib64 -lf77blas -lcblas -latlas
Use LAPACK library - undetermined LAPACK library, probably same type as BLAS
LAPACKLIB=-L/usr/lib64 -llapack
starting auto-build of prerequisites
building MPICH2 version 1.3.3rc1, each step could take a few minutes
./configure --prefix=/home/user/Molpro/src/mpich2-install
--with-device=ch3:nemesis --with-pm=hydra --enable-f77 F77=/usr/bin/gfortran
--enable-fc FC=/usr/bin/gfortran --enable-cc CC=/usr/bin/gcc --enable-cxx
CXX=/usr/bin/g++
make
make install
MPICH2 built, if you want to test the MPICH2 build, please run:
cd /home/user/Molpro/src/mpich2-1.3.3rc1; make testing
building Global Arrays version 5-0-2, each step could take a few minutes
./configure --prefix=/home/user/Molpro/src/ga-install --with-scalapack=no
--enable-f77 F77=/usr/bin/gfortran CC=/usr/bin/gcc CXX=/usr/bin/g++
--with-tcgmsg --with-mpi='/home/user/Molpro/src/mpich2-install/lib -lmpich
-lopa -lmpl -lrt -I/home/user/Molpro/src/mpich2-install/include'
make
make install
Global Arrays built, if you want to test the Global Arrays build, please run:
cd /home/user/Molpro/src/ga-5-0-2; make checkprogs; make check
MPIEXEC="/home/user/Molpro/src/mpich2-install/bin/mpiexec -np 4"
./configure -batch "-gcc" "-gfortran" "-mpp" "-instroot"
"/usr/local/molpro2010.1" -mppbase /home/user/Molpro/src/ga-5-0-2
machine type recognized as x86_64 (Generic 64-bit)
kernel recognized as Linux
user request compiler gfortran
GNU Fortran Compiler, Version 4.5.2
FC=/usr/bin/gfortran
user request compiler gcc
GNU Compiler Collection, Version 4.5.2
CC=/usr/bin/gcc
Use BLAS library - Automatically Tuned Linear Algebra Software (ATLAS)
BLASLIB=-L/usr/lib64 -lf77blas -lcblas -latlas
Use LAPACK library - undetermined LAPACK library, probably same type as BLAS
LAPACKLIB=-L/usr/lib64 -llapack
ga_GA_MP_LIBS = -lmpich -lopa -lmpl -lrt
ga_GA_MP_LDFLAGS = -L/home/user/Molpro/src/mpich2-install/lib
ga_GA_MP_CPPFLAGS = -I/home/user/Molpro/src/mpich2-install/include
ga_TARGET = LINUX64
ga_MSG_COMMS = TCGMSGMPI
ga_prefix = /home/user/Molpro/src/ga-install
ga_ARMCI_NETWORK_LDFLAGS =
ga_ARMCI_NETWORK_LIBS =
ga_BLAS_SIZE = 4
Use MPP library - Global Arrays version 5 or higher
MPPLIB=-L/home/user/Molpro/src/ga-install/lib -lga -larmci
MPILIB=-L/home/user/Molpro/src/mpich2-install/lib -lmpich -lopa -lmpl -lrt
parallel=mpich2
parse-Linux-x86_64-i8.o is your object
CONFIG file created; proceed to compilation
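For reference, the standalone Global Arrays checks suggested in the configure
output can be run against the same MPICH2 that was just built, which might
show whether the ARMCI asserts come from the GA/MPICH2 layer rather than from
Molpro itself. A sketch using the paths from my tree, with -np 8 to match the
process count at which the test jobs fail (assuming the check target honours
an MPIEXEC override like the one printed above):
  cd /home/user/Molpro/src/ga-5-0-2
  make checkprogs
  make check MPIEXEC="/home/user/Molpro/src/mpich2-install/bin/mpiexec -np 8"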
Quoting Andy May <MayAJ1 at cardiff.ac.uk>:
> Greg,
>
> I can't seem to replicate this problem using options as close as
> possible to yours.
>
> One thing I have noticed is that linking only the routines that come
> with ATLAS no longer works with Molpro. I have fixed this, and there
> will be an update sometime in the future, but it leads me to conclude
> that the -llapack you are using is either a supplemented ATLAS library
> or a system one.
>
> I don't think the BLAS/LAPACK will be the problem, but perhaps
> configure has not correctly determined the integer size in these
> libraries; you can always try using Molpro's internal ones with the
> -noblas -nolapack options.
>
> There may be problems with the MPICH2 version that ships with Ubuntu;
> certainly there were problems with hydra around 1.3.1, but if you say
> that ifort works then maybe it's okay. To rule this out I would
> suggest using a newer version (no need to do this by hand; see below).
>
> Also, we are now at PL21 (there is a new tarball).
>
> Can I therefore suggest you first try the following with 2010.1.21:
>
> ./configure -batch -gcc -gfortran -mpp -auto-ga-tcgmsg-mpich2
> -instroot /usr/local/molpro2010.1
>
> and see if this then gives you a working build.
>
> Best wishes,
>
> Andy