./configure
which creates the file CONFIG. This file contains machine-dependent parameters, such as compiler options. Normally CONFIG will not need changing, but you should at least examine it and change any configuration parameters you deem necessary. For further information, see the comments in the CONFIG file.
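As an illustration, CONFIG holds shell-style variable assignments. The excerpt below is only a sketch: the variable names and values shown are examples, and the file configure actually generates will differ from machine to machine.

```shell
# Illustrative CONFIG excerpt -- the actual variables and values are
# generated by configure and differ between machines.
FC="ifort"                    # Fortran compiler (example value)
FOPT="-O2"                    # optimisation flags (example value)
BLASLIB="-L/usr/lib -lblas"   # BLAS linkage (example value)
```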
The configure procedure may be given command-line options, and it normally also prompts for a number of parameters:
At present, Molpro supports several different cases: the GA library can be built on top of tcgmsg, mpi, or myrinet; on the IBM SP platform it can also be built with a GA library made with the LAPI target. configure prompts for the type (default tcgmsg) and then for the directory holding the associated libraries. Normally tcgmsg is recommended, since it is the most efficient on most systems and also the most easily installed. If a Myrinet network is available, myrinet should be chosen; this requires, in addition to the usual MPI libraries, the gm library and mpirun_gm rather than mpirun. At present the myrinet option has been tested only on Linux systems. The name of the MOLPRO executable is generated from the program version number, the library type and the machine architecture, so different versions can be installed simultaneously in the same MOLPRO tree; see section A.3.4.
When building Global Arrays on Linux the default is tcgmsg. You should build with something similar to:
make TARGET=...

where TARGET is LINUX on a 32-bit Linux system and LINUX64 on a 64-bit system. On other platforms, consult the README for a list of valid targets. The parallel job launcher needed to start molpro can be found at tcgmsg/ipcv4.0/parallel; it should be copied into your PATH, or its location specified in configure.
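The choice of TARGET can be scripted from the machine architecture. The mapping below is only a sketch covering the Linux cases mentioned above; other platforms should be looked up in the GA README.

```shell
# Pick a GA TARGET from the machine architecture (Linux cases only;
# for other platforms consult the GA README for valid targets).
case "$(uname -m)" in
  x86_64|amd64) TARGET=LINUX64 ;;   # 64-bit Linux
  i?86)         TARGET=LINUX   ;;   # 32-bit Linux
  *)            TARGET=""      ;;   # not covered here; see the README
esac
echo "would run: make TARGET=$TARGET"
```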
In some cases you will need to specify the compiler to use when building molpro:

make TARGET=... FC=...

where, for example, FC=ifort for the Intel compiler, FC=pgf90 for Portland, or FC=pathf90 for Pathscale.
When building with MPICH you should use something similar to:
export MPI=/opt/mpich   # or equivalent
export PATH=$PATH:$MPI/bin
export MPI_LIB=$MPI/lib
export MPI_INCLUDE=$MPI/include
export LIBMPI=-lmpich
make TARGET=... FC=... USE_MPI=yes

The details will vary from system to system. When running configure -mpp -mpptype mpi you should specify the location of the GA libraries and mpirun when prompted. When asked for the location of the MPI library
Please give both the -L and -l loader options needed to access the MPI library

it is necessary to give both the directory and the library name, even if the library would be found automatically by the linker, for example:
-L/opt/mpich/lib -lmpich

where the directory /opt/mpich/lib will vary between platforms. If any extra libraries are needed to link the MPI library, they should not be specified here but added manually to the LIBS entry in CONFIG. After configure you should see something similar to this in your CONFIG file:
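One way to append such an extra library to the LIBS entry is a small in-place edit of CONFIG. The sketch below uses a stand-in file, and both its content and the -lpthread example are purely illustrative; inspect your real CONFIG before and after any such edit.

```shell
# Append an extra library (-lpthread, as an example) to the LIBS entry.
# CONFIG.example stands in for the real CONFIG file here.
printf 'LIBS="-lblas"\n' > CONFIG.example
sed -i 's/^LIBS="\(.*\)"/LIBS="\1 -lpthread"/' CONFIG.example
cat CONFIG.example   # LIBS="-lblas -lpthread"
```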
MPI_LIB="-L/opt/mpich/lib -lmpich"
MPPNAME="mpi"
MPITYPE="mpich"
MPIBASEDIR="/opt/mpich/"
A special situation arises if 64-bit integers are in use (-i8), since on many platforms the system BLAS libraries support only 32-bit integer arguments. In such cases (e.g., IBM, SGI, SUN), either 0 or 4 can be given for the BLAS level. BLAS=0 should always work and means that the MOLPRO Fortran BLAS routines are used. On some platforms (IBM, SGI, SUN) BLAS=4 will give better performance; in this case some 32-bit BLAS routines are used from the system library, called through wrapper routines that convert 64-bit to 32-bit integer arguments. Note that this might cause problems if more than 2 GB of memory is used.
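The 2 GB figure matches the range of a signed 32-bit integer: any offset or length that exceeds 2^31 - 1 cannot survive the 64- to 32-bit conversion in the wrapper routines. Reading the limit as a byte count is our interpretation, not something stated in the manual; the arithmetic itself can be checked directly:

```shell
# A signed 32-bit integer tops out at 2^31 - 1; offsets or array lengths
# beyond that overflow in the 64->32-bit wrapper conversion.
limit=$(( (1 << 31) - 1 ))
echo "largest 32-bit signed value: $limit"   # 2147483647
# Interpreted as a byte count (our assumption), that is 2 GB:
echo "as gigabytes: $(( (limit + 1) / 1024 / 1024 / 1024 ))"   # 2
```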
For good performance it is important to use appropriate BLAS libraries; in particular, a fast implementation of the matrix multiplication dgemm is very important for MOLPRO. Therefore you should use a system tuned BLAS library whenever available.
Specification of BLAS libraries can be simplified by placing any relevant downloaded libraries in the directory blaslibs; configure searches this directory (and then, with lower priority, some standard system directories) for libraries appropriate to the hardware, including that specified by a -p3, -p4, -athlon, -amd64 or -em64t command-line option.
For Intel and AMD Linux systems we recommend the following BLAS libraries:
The latter two parameters are relevant only if the documentation is also going to be installed from this directory (see below).
The following command-line options are recognized by configure.