QMMM workflow using GROMACS and VOTCA-XTP

What is this tutorial about

In this tutorial, we will learn how to set up and perform excited state calculations using the VOTCA-XTP library. We will use methane as our QM region.

Requirements

  • You will need to install VOTCA using the instructions described here

  • Once the installation is complete, you need to activate the VOTCA environment by running the VOTCARC.bash script installed in the bin subfolder of the installation path you chose above, as shown in the sketch below
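
A minimal sketch, assuming VOTCA was installed into ~/votca (replace the prefix with your own installation path):

# activate the VOTCA environment in the current shell session
source ~/votca/bin/VOTCARC.bash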

Interacting with the XTP command line interface

The XTP package offers the following command line interface that the user can interact with:

  • xtp_map

  • xtp_parallel

  • xtp_run

  • xtp_tools

Run the following command to view the help message of xtp_tools:

[1]:
!xtp_tools -h
/bin/sh: 1: xtp_tools: not found

Note

  • In Jupyter the ! symbol means: run the following command as a standard Unix command

  • In Jupyter the command %env sets an environment variable (see the one-line example below)
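
For example, a notebook cell could set a (purely illustrative) variable like this:

%env OMP_NUM_THREADS=2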

Setting the environment

Remove previous hdf5 file

[2]:
!rm -f state.hdf5

Generate the topology from the Gromacs file

This command runs the mapping from MD coordinates to segments and creates an hdf5 file. You can explore the generated state.hdf5 file with a graphical HDF5 viewer, e.g. HDFView. In Python, you can use the h5py library; a short sketch follows the command below.

[3]:
!xtp_map -t MD_FILES/topol.tpr -c MD_FILES/conf.gro -s system.xml -f state.hdf5
/bin/sh: 1: xtp_map: not found
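
As mentioned above, the state file can also be inspected from Python with h5py. A minimal sketch, assuming the file was created by the previous command (the printed group names depend on which calculators have already run):

import h5py

# open the state file read-only and print every group/dataset it contains
with h5py.File("state.hdf5", "r") as f:
    f.visit(print)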

Check the mapping

Let us first output .pdb files for the segments, QM molecules and classical segments in order to check the mapping. Use xtp_run -d mapchecker to see all the options the mapchecker calculator takes. We use the -c option to change one option on the command line.

In the mapchecker section of the manual you can find a table with the mapchecker input variables and their corresponding defaults. Finally, the following command runs the check:

[4]:
!xtp_run -e mapchecker -c map_file=system.xml -f state.hdf5
/bin/sh: 1: xtp_run: not found

Neighborlist Calculation

The following step is to determine the neighbouring pairs for exciton transport. See the neighborlist options for further information.

Finally, we can run the calculation using 4 threads

[5]:
!xtp_run -e neighborlist -c exciton_cutoff=0.5 constant=0.6 -f state.hdf5 -t 4
/bin/sh: 1: xtp_run: not found

Read reorganization energies

In this step we will read in the site reorganization energies and store them in the state.hdf5 file. We just need to copy the input file and execute the calculation. The site energies have to be calculated by the user beforehand and put into an xml file; we have added them to system.xml.

[6]:
!xtp_run -e einternal -c energies_file=system.xml -f state.hdf5
/bin/sh: 1: xtp_run: not found

Compute site energy

In this step we will perform some QMMM calculations to compute the site energies. The qmmm_mm.xml file contains some predefined settings to perform the MM calculations. Let us first write the job file using these settings. Instead of using the -c option, we now use the -o option to read the options from an xml file.

[7]:
!xtp_parallel -e qmmm -o qmmm_mm.xml -f state.hdf5 -j "write"
/bin/sh: 1: xtp_parallel: not found

The previous command generates a qmmm_mm_jobs.xml file containing 4000 MM jobs to compute. If you examine that file, it should look something like:

<jobs>
  <job>
    <id>0</id>
    <tag>Methane_0:n</tag>
    <input>
      <site_energies>0:n</site_energies>
      <regions>
        <region>
          <id>0</id>
          <segments>0:n</segments>
        </region>
      </regions>
    </input>
    <status>AVAILABLE</status>
  </job>
...

Let us run just the first 4 jobs by setting the status of all jobs to COMPLETE except for the first four. This can easily be done with sed: the first command marks every job COMPLETE, and each subsequent command flips the first remaining COMPLETE back to AVAILABLE.

[8]:
!sed -i "s/AVAILABLE/COMPLETE/g" qmmm_mm_jobs.xml
!sed -i '0,/COMPLETE/s/COMPLETE/AVAILABLE/' qmmm_mm_jobs.xml
!sed -i '0,/COMPLETE/s/COMPLETE/AVAILABLE/' qmmm_mm_jobs.xml
!sed -i '0,/COMPLETE/s/COMPLETE/AVAILABLE/' qmmm_mm_jobs.xml
!sed -i '0,/COMPLETE/s/COMPLETE/AVAILABLE/' qmmm_mm_jobs.xml
sed: can't read qmmm_mm_jobs.xml: No such file or directory
sed: can't read qmmm_mm_jobs.xml: No such file or directory
sed: can't read qmmm_mm_jobs.xml: No such file or directory
sed: can't read qmmm_mm_jobs.xml: No such file or directory
sed: can't read qmmm_mm_jobs.xml: No such file or directory
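
The same bookkeeping can also be done from Python with the standard xml.etree module; a minimal sketch, assuming the <jobs>/<job>/<status> layout shown above:

import xml.etree.ElementTree as ET

# keep the first four jobs AVAILABLE and mark the rest COMPLETE
tree = ET.parse("qmmm_mm_jobs.xml")
for i, job in enumerate(tree.getroot().findall("job")):
    job.find("status").text = "AVAILABLE" if i < 4 else "COMPLETE"
tree.write("qmmm_mm_jobs.xml")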

Now we can run the jobs and save the results in the state file

[9]:
!xtp_parallel -e qmmm -o qmmm_mm.xml -f state.hdf5 -x 2 -j "run"
!xtp_parallel -e qmmm -o qmmm_mm.xml -f state.hdf5 -j "read"
/bin/sh: 1: xtp_parallel: not found
/bin/sh: 1: xtp_parallel: not found

Site energy and pair energy analysis

In this step we generate a histogram and compute the correlation function of site energies and pair energy differences.

[10]:
!xtp_run -e eanalyze -f state.hdf5
/bin/sh: 1: xtp_run: not found

You should now see a set of files prefixed with eanalyze containing the histograms and correlation functions.

[11]:
!ls eanalyze*
ls: cannot access 'eanalyze*': No such file or directory
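
A minimal plotting sketch, assuming the eanalyze output files are plain-text numeric tables whose first two columns hold the binned quantity and its histogram/correlation value (check the column meaning and units against the file headers):

import glob
import numpy as np
import matplotlib.pyplot as plt

# plot the first two columns of every eanalyze output file
for name in sorted(glob.glob("eanalyze*")):
    try:
        data = np.genfromtxt(name, comments="#")
    except ValueError:
        continue  # skip files that are not simple numeric tables
    if data.ndim == 2 and data.shape[1] >= 2:
        plt.plot(data[:, 0], data[:, 1], label=name)
plt.legend()
plt.xlabel("energy (assumed eV, check file header)")
plt.show()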

QM energy calculation

Our next task is to perform the QM calculations for each segment that we have stored in the hdf5 file. The calculations take place in 3 stages: write the jobs to a file, perform the computation, and finally save the results to the state file. We provide a small options file to make the computation cheaper.

[12]:
!cat eqm.xml
cat: eqm.xml: No such file or directory

We set the GWBSE mode to G0W0, the ranges to full, and the basisset and auxbasisset to 3-21G and aux-def2-svp, respectively. For more information, check the eqm calculator options. For the sake of computational time, let us compute just the GW approximation and the singlets. You can also request the triplets or all states.

First, we will write the jobs to a file and enable only the first 2:

[13]:
!xtp_parallel -e eqm -o eqm.xml -f state.hdf5 -s 0 -j "write"
!sed -i "s/AVAILABLE/COMPLETE/g" eqm.jobs
!sed -i '0,/COMPLETE/s/COMPLETE/AVAILABLE/' eqm.jobs
!sed -i '0,/COMPLETE/s/COMPLETE/AVAILABLE/' eqm.jobs
/bin/sh: 1: xtp_parallel: not found
sed: can't read eqm.jobs: No such file or directory
sed: can't read eqm.jobs: No such file or directory
sed: can't read eqm.jobs: No such file or directory

Now, let us run these 2 jobs:

[14]:
!xtp_parallel -e eqm -o eqm.xml -f state.hdf5 -x 2 -s 0 -j run -q 1
/bin/sh: 1: xtp_parallel: not found

QM calculation for pairs

In the following step we will run QM calculations for each pair in the hdf5 file. As in the previous step, we will first write the jobs to a file, then run them, and finally store the results in the state file. First, we need to copy the input to our local folder.

As in the previous section, we set the GWBSE mode to G0W0, the ranges to full, and the basisset and auxbasisset to 3-21G and aux-def2-svp. But we compute only the GW approximation, as the BSE is formed only once in the coupling step and we do not have to diagonalize it. For more information, check the iqm calculator options. We only compute the singlet couplings.

Before running the calculations, we need to specify in the iqm input which states to read into the jobfile for each segment type.

[15]:
!cat iqm.xml
cat: iqm.xml: No such file or directory

Now, let’s write the jobs to the file

[16]:
!xtp_parallel -e iqm -o iqm.xml -f state.hdf5 -s 0 -j "write"
/bin/sh: 1: xtp_parallel: not found

From the jobs that we have just written, let us make only the first job available:

[17]:
!sed -i "s/AVAILABLE/COMPLETE/g" iqm.jobs
!sed -i '0,/COMPLETE/s/COMPLETE/AVAILABLE/' iqm.jobs
sed: can't read iqm.jobs: No such file or directory
sed: can't read iqm.jobs: No such file or directory

Now we can run the jobs and store the results:

[18]:
!xtp_parallel -e iqm -o iqm.xml -f state.hdf5 -x 2 -s 0 -j run -q 1
/bin/sh: 1: xtp_parallel: not found

Finally, we read the results into the state file:

[19]:
!xtp_parallel -e iqm -o iqm.xml -f state.hdf5 -j "read"
/bin/sh: 1: xtp_parallel: not found

Coupling

We can now compute the classical coupling of the transitions in the aforementioned three stages.

We need to change the map_file option in the iexcitoncl input and add the states. Check all the available options of the iexcitoncl calculator. We do this via the command line using the -c option.

[20]:
!xtp_parallel -e iexcitoncl -c map_file=system.xml states=Methane:n2s1 -f state.hdf5 -j "write"
/bin/sh: 1: xtp_parallel: not found
[21]:
!head -n 15 exciton.jobs
head: cannot open 'exciton.jobs' for reading: No such file or directory

Now we can run and save the jobs. For demo purposes we will run only the first job

[22]:
!sed -i "s/AVAILABLE/COMPLETE/g" exciton.jobs
!sed -i '0,/COMPLETE/s/COMPLETE/AVAILABLE/' exciton.jobs
!xtp_parallel -e iexcitoncl -c map_file=system.xml states=Methane:n2s1 -f state.hdf5
sed: can't read exciton.jobs: No such file or directory
sed: can't read exciton.jobs: No such file or directory
/bin/sh: 1: xtp_parallel: not found
[23]:
!xtp_parallel -e iexcitoncl -c map_file=system.xml -f state.hdf5 -j "read"
/bin/sh: 1: xtp_parallel: not found

Coupling analysis

Using the couplings computed in the previous steps, we will generate a histogram of the squared couplings on a logarithmic scale:

[24]:
!xtp_run -e ianalyze -c states=e,h,s -f state.hdf5
/bin/sh: 1: xtp_run: not found

QMMM calculations

Finally, let us run a proper QMMM calculation using the qmmm calculator:

[25]:
!cat qmmm.xml
cat: qmmm.xml: No such file or directory
[26]:
!xtp_parallel -e qmmm -o qmmm.xml -f state.hdf5 -j "write"
/bin/sh: 1: xtp_parallel: not found

Let us run just the first job:

[27]:
!sed -i "s/AVAILABLE/COMPLETE/g" qmmm_jobs.xml
!sed -i '0,/COMPLETE/s/COMPLETE/AVAILABLE/' qmmm_jobs.xml
!sed -i '0,/COMPLETE/s/COMPLETE/AVAILABLE/' qmmm_jobs.xml
!xtp_parallel -e qmmm -o qmmm.xml -x 2 -f state.hdf5 -j run
sed: can't read qmmm_jobs.xml: No such file or directory
sed: can't read qmmm_jobs.xml: No such file or directory
sed: can't read qmmm_jobs.xml: No such file or directory
/bin/sh: 1: xtp_parallel: not found

Finally, save the results. We could read them into the state file, but that is a bit pointless here. Instead, you might check out how to turn a checkpoint file into an orb file (look at the scripts) and visualise it with the gencube tool.

[28]:
#!xtp_parallel -e qmmm -o OPTIONFILES/qmmm.xml -f state.hdf5 -j "read"
[ ]: