

A.3.3 Configuration

Once the distribution has been unpacked, identify the root directory that was created (normally molpro2006.1). In the following description, all directories are given relative to this root. Having changed to the root directory, check that the directory containing the Fortran compiler you want to use is in your PATH. Then run the command

./configure

which creates the file CONFIG. This file contains machine-dependent parameters, such as compiler options. Normally CONFIG will not need changing, but you should at least examine it and change any configuration parameters you deem necessary. For further information, see the comments in the CONFIG file.
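As an illustration, a typical unpack-and-configure sequence might look like the following (the archive name and compiler path are examples only and will differ on your system):

```shell
# Unpack the distribution and change to the root directory it creates
# (archive and directory names below are examples only)
tar -xzf molpro2006.1.tar.gz
cd molpro2006.1

# Ensure the Fortran compiler you intend to use is in PATH
# (the compiler location shown is an example)
export PATH=/opt/intel/fc/bin:$PATH

# Run configure, which writes the CONFIG file
./configure
```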

The configure procedure accepts command-line options and, in addition, normally prompts for a number of parameters:

  1. On certain machines it is possible to compile the program to use either 32 or 64 bit integers, and in this case configure may be given a command-line option -i4 or -i8 respectively to override the default behaviour. Generally, the 64-bit choice allows larger calculations (files larger than 2Gb, more than 16 active orbitals), but can be slower if the underlying hardware does not support 64-bit integers (e.g., some IBM RS6000 hardware). Note that if -i4 is used then large files (greater than 2Gb) are supported on most systems, but even then the sizes of MOLPRO records are restricted to 16 Gb since the internal addressing in MOLPRO uses 32-bit integers. If -i8 is used, the record and file sizes are effectively unlimited.
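    As a sketch, the integer width described above can be selected explicitly when running configure; which choice is appropriate depends on your hardware and file-size requirements:

```shell
# 64-bit integers: record and file sizes effectively unlimited
./configure -i8

# 32-bit integers: may be faster on hardware without native 64-bit
# integer support, but MOLPRO records are limited to 16 Gb
./configure -i4
```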

  2. In the case of building for parallel execution, the option -mpp or -mppx must be given on the command line. For the distinction between these two parallelism modes, please refer to the user manual, section 2.

    At present, Molpro supports several different cases: the GA library can be built on top of tcgmsg, mpi or myrinet; on the IBM SP platform, it can also be built with a GA library made with the LAPI target. configure prompts for the type (default tcgmsg), and then for the directory holding the associated libraries. Normally tcgmsg is recommended, since it is the most efficient on most systems and also the most easily installed. If a myrinet network is available, myrinet should be chosen; in addition to the usual MPI libraries, this requires the gm library, and mpirun_gm rather than mpirun. At present, the myrinet option has been tested only on Linux systems. The name of the MOLPRO executable is generated from the program version number, the library type and the machine architecture. It is therefore possible to install different versions simultaneously in the same MOLPRO tree; see section A.3.4.

    When building Global Arrays on Linux the default is tcgmsg. You should build with something similar to:

    make TARGET=...
    
    where TARGET is LINUX on a 32-bit Linux system and LINUX64 on a 64-bit system. On other platforms, consult the README for a list of valid targets. The parallel job launcher needed to start molpro can be found at tcgmsg/ipcv4.0/parallel; it should be copied into a directory in your PATH, or its location specified when running configure.

    In some cases you will need to specify the compiler to be used when building molpro:

    make TARGET=... FC=...
    
    where for example FC=ifort for the Intel compiler, FC=pgf90 for Portland or FC=pathf90 for Pathscale.
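    Putting the above together, a Global Arrays build on a 64-bit Linux system with the Intel compiler might look like this (the target, compiler and destination directory are examples only):

```shell
# Build Global Arrays with the default tcgmsg message-passing layer
# (LINUX64 and ifort are examples; consult the GA README for your system)
make TARGET=LINUX64 FC=ifort

# Copy the parallel job launcher into a directory in your PATH
# (alternatively, give its location to configure when prompted)
cp tcgmsg/ipcv4.0/parallel $HOME/bin/
```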

    When building with MPICH you should use something similar to:

    export MPI=/opt/mpich # or equivalent
    export PATH=$PATH:$MPI/bin
    export MPI_LIB=$MPI/lib
    export MPI_INCLUDE=$MPI/include
    export LIBMPI=-lmpich
    make TARGET=... FC=... USE_MPI=yes
    
    The details will vary from system to system. When running configure -mpp -mpptype mpi you should specify the location of the GA libraries and of mpirun when prompted. When asked for the location of the MPI library,
    Please give both the -L and -l loader options needed to access the MPI library
    
    it is necessary to give both the directory and the library name, even if the library would be found automatically by the linker, for example:
    -L/opt/mpich/lib -lmpich
    
    where the directory /opt/mpich/lib will vary between platforms. If any extra libraries are needed to link in the MPI library, they should not be specified here but added manually to the LIBS entry in CONFIG. After running configure you should see something similar to the following in your CONFIG file:
    MPI_LIB="-L/opt/mpich/lib -lmpich"
    MPPNAME="mpi"
    MPITYPE="mpich"
    MPIBASEDIR="/opt/mpich/"
    

  3. If any system libraries are in unusual places, it may be necessary to specify them explicitly as the arguments to a -L command-line option.

  4. configure asks whether you wish to use system BLAS subroutine libraries. MOLPRO has its own optimised Fortran version of these libraries, which can safely be used. On most machines, however, it will be advantageous to use a system-tuned version instead. For BLAS, you should enter a number between 0 and 3; if, for example, you specify 2, the system libraries will be used for level 2 and level 1 BLAS, but MOLPRO's internal routines will be used for level 3 (i.e., matrix-matrix multiplication). Normally, however, one would choose either 0 or 3. If a system BLAS is chosen, you will be prompted to enter appropriate linker options (e.g. -L/usr/lib -lblas) to access the libraries.

    A special situation arises if 64-bit integers are in use (-i8), since on many platforms the system BLAS libraries support only 32-bit integer arguments. In such cases (e.g., IBM, SGI, SUN) either 0 or 4 can be given for the BLAS level. BLAS=0 should always work and means that the MOLPRO Fortran BLAS routines are used. On some platforms (IBM, SGI, SUN) BLAS=4 will give better performance; in this case some 32-bit BLAS routines are used from the system library (these are called via wrapper routines, which convert 64-bit to 32-bit integer arguments; note that this might cause problems if more than 2 GB of memory is used).

    For good performance it is important to use appropriate BLAS libraries; in particular, a fast implementation of the matrix multiplication dgemm is very important for MOLPRO. You should therefore use a system-tuned BLAS library whenever one is available.

    Specification of BLAS libraries can be simplified by placing any relevant downloaded libraries in the directory blaslibs; configure searches this directory (and then, with lower priority, some potential system directories) for libraries relevant to the hardware, including the hardware specified by a -p3, -p4, -athlon, -amd64 or -em64t command-line option.

    For Intel and AMD Linux systems we recommend the following BLAS libraries:

    mkl
    The Intel Math Kernel Library (mkl), version 8.0 or higher http://www.intel.com/cd/software/products/asmo-na/eng/perflib/mkl
    atlas
    The Atlas library http://math-atlas.sourceforge.net. You must use the atlas library specific to your processor:
    Pentium III: Linux_PIIISSE1
    Pentium 4, Xeon: Linux_P4SSE2
    AMD Athlon: Linux_ATHLON
    AMD Opteron: Linux_HAMMER64SSE2_2 (64 bit)
    When using Atlas, MOLPRO automatically compiles in the extra LAPACK subroutines which do not come with the package by default, so the liblapack.a supplied with Atlas is sufficient. The appropriate linker options are:
    -L blasdir -lcblas -lf77blas -latlas
    acml
    For Opteron systems, the ACML library http://developer.amd.com/acml.aspx is the preferred BLAS library.
    On SGI Altix the scsl library is preferred. HP platforms can use the mlib math library, and IBM Power platforms the essl package.
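    As an illustration, the linker options entered at the BLAS prompt for the libraries above might look like the following; all directories shown are examples and will vary between systems:

```shell
# ACML on an Opteron system (the library directory is an example)
-L/opt/acml/gnu64/lib -lacml

# Atlas, where blasdir is the directory holding the Atlas libraries
# (the path shown is an example for the Opteron build of Atlas)
-L/opt/atlas/Linux_HAMMER64SSE2_2/lib -lcblas -lf77blas -latlas
```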

  5. configure prompts for the destination directory (INSTBIN) for final installation of the MOLPRO executable. This directory should be one normally in the PATH of all users who will access MOLPRO, and its specification will depend on whether the installation is private or public.

  6. configure prompts for the destination directory (INSTLIB) for installation of ancillary files which are required for program execution.

  7. configure will attempt to contact the Molpro webserver and download an appropriate licence key if it does not find a token in the file $HOME/.molpro/token. This token will be copied to INSTLIB/.token during installation.

  8. configure prompts for the destination directory for documentation. This should normally be a directory that is mounted on a worldwide web server.

  9. configure prompts for the destination directory for the CGI scripts that control the delivery of documentation. This might be the same directory as in item 8, but some web servers require a particular special directory to be used.

The latter two parameters are relevant only if the documentation is also going to be installed from this directory (see below).

The following command-line options are recognized by configure.

-batch
disables the prompting described above.

-i8 |-i4
forces the use of 8- or 4-byte integers respectively.

-L lib
specifies any additional directories containing system libraries to be scanned at link time.

-blas 0|1|2|3|4
specifies system BLAS level, as described above.

-mpp |-nompp
controls whether compilation is to be for MPP parallelism (see above).

-ifort |-pgf |-path
controls whether the Intel (ifort), Portland (pgf) or Pathscale (path) compiler is to be used on Linux systems.

-f ftcflag
adds a token to the specifiers for the Fortran preprocessor ftc.

-largefiles |-nolargefiles
controls whether large-file (> 2 Gb) support is wanted. This option is not relevant or used on all architectures. All modern Linux distributions should support large files.

-p3 |-p4 |-athlon |-amd64 |-em64t
specifically identifies a particular hardware in order to force appropriate run-time libraries where possible. These options are supported only on Linux systems. If any of these options is given, the MOLPRO executable will be named molpro_p3.exe,
molpro_p4.exe, or molpro_athlon.exe (in the mpp case, e.g., molpro_p3_tcgmsg.exe). It is possible to install different platform variants simultaneously in the same MOLPRO tree; see section A.3.4.
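Combining several of the documented options above, a non-interactive configuration for a 64-bit parallel build on a Pentium 4 system might be invoked as follows; this is a sketch only, and the option values must match your own system:

```shell
# -batch:  no interactive prompts
# -i8:     64-bit integers
# -mpp:    build for MPP parallelism
# -blas 3: use the system BLAS for all levels
# -p4:     select Pentium 4 run-time libraries
./configure -batch -i8 -mpp -blas 3 -p4
```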




molpro@molpro.net
Oct 10, 2007