To install the software, you first have to satisfy the following requirements. In particular, you need:
- a Fortran 77 compiler
- a C compiler
- python (2.5 or higher): make sure you also install python-dev, which provides the needed headers such as Python.h and arrayobject.h
- numpy (available here)
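The requirements above can be checked with a short shell sketch before building; the `PY` variable is just a convenience introduced here, not part of the package (point it at the interpreter you plan to build the module with):

```shell
# Quick sanity check for the requirements listed above (a sketch).
PY=${PY:-python}
command -v "$PY" >/dev/null 2>&1 || PY=python3   # fall back if `python` is absent

# Python version (the text asks for 2.5 or higher)
"$PY" -c "import sys; assert sys.version_info >= (2, 5)" \
    && echo "python version OK"

# python-dev: Python.h must be in the interpreter's include directory
"$PY" -c "import os, sysconfig; print('Python.h present:', os.path.exists(os.path.join(sysconfig.get_paths()['include'], 'Python.h')))"

# numpy
"$PY" -c "import numpy; print('numpy', numpy.__version__)" 2>/dev/null \
    || echo "numpy NOT found -- install it before building"
```

If any check fails, install the missing piece before proceeding.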
Additional packages that are useful, but not compulsory:
- pylab: included in the matplotlib package, which can be found here, or which can be installed by means of apt-get for linux users.
- GOTOBLAS2: optimized BLAS libraries, which can be downloaded here.
- mpich2: for code parallelization, available here. NOTE: on linux machines, when configuring, type the following: ./configure --enable-sharedlibs=gcc . This way, when installing the NanoTCAD ViDES module, you can compile with the -fPIC flag without any error. On OS X instead, type: ./configure --prefix=/usr/local --enable-mpi-f77 --enable-mpi-f90 --enable-sharedlibs=gcc . Do not forget, at the end of the installation, to export the path: export PATH=$PATH:/usr/local
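The mpich2 steps above, collected into one sketch; the source directory name is hypothetical (adjust it to the version you downloaded), and the OS X variant is shown as a comment:

```shell
# Hedged sketch of the mpich2 build described above.
MPICH_SRC=./mpich2-src    # hypothetical directory name -- adjust to your tarball

if [ -d "$MPICH_SRC" ]; then
    cd "$MPICH_SRC"
    # Linux: build shared libs so the NanoTCAD ViDES module compiles with -fPIC
    ./configure --enable-sharedlibs=gcc
    # On OS X use instead:
    # ./configure --prefix=/usr/local --enable-mpi-f77 --enable-mpi-f90 --enable-sharedlibs=gcc
    make && make install
    cd - > /dev/null
fi

# Do not forget to export the path at the end of the installation:
export PATH=$PATH:/usr/local
```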
- mpi4py: a python/MPI interface, which can be downloaded here. For the installation, please go to the following page. Tips for installing mpi4py on an HPC: if the HPC you use has the PBS queue management system, do the following:
- look at the available modules (module avail)
- load the python module. For example, module load python/2.6.5
- load the mpich2 module. For example, module load mpich2-intel64
- In your home directory, untar the mpi4py tarball.
- Go to the untarred directory
- Type: ./configure
- python setup.py build
- python setup.py install --user
- Check that the installation is correct by running a demo program: python ./demo/helloworld.py
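The PBS steps above can be gathered into one shell sketch. The module names are the examples from the text (list the real ones with module avail), and the tarball file name is hypothetical, so substitute the one you actually downloaded:

```shell
# Hedged sketch of the HPC installation steps above.
module load python/2.6.5      || true   # example module name from the text
module load mpich2-intel64    || true   # example module name from the text

TARBALL=mpi4py.tar.gz                   # hypothetical file name
if [ -f "$TARBALL" ]; then
    tar xzf "$TARBALL"
    cd mpi4py*/
    ./configure
    python setup.py build
    python setup.py install --user
    # verify the installation with the demo program:
    python ./demo/helloworld.py
    cd - > /dev/null
fi
```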
- The mpi4py you have installed at this point runs only with the mpich2-intel64 module, so every time you want to use the MPI capabilities you first have to load that module. If you want to use another module, you have to go through the installation procedure once again.
To run parallelized simulations, please refer to the “Run MPI” section.