Running Vorpal on a Windows HPC Cluster

Note

Prior to running GSim with Windows HPC Cluster tools, please ensure that GSim is properly installed on your Windows Cluster. (See the “Installation” Manual.)

Setting up the Simulation Directory

The following example shows how to run a GSim simulation on a Windows Cluster. In this example, the UNC Share Path set up on all nodes in the Cluster, including the headnode, is:

\\hpcheadnode\scratch

This path should be replaced by the path to your shared drive, whatever it might be. UNC (Universal Naming Convention) paths should be used, i.e. \\machine_name\name_of_share\directory rather than S:\directory.
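If you are unsure whether a given path is already in UNC form, a short Python check can tell. This is a convenience sketch using only the standard library; `is_unc` is a hypothetical helper, not part of GSim:

```python
import ntpath  # Windows path semantics; works on any platform

def is_unc(path: str) -> bool:
    """Return True when path is a UNC path (\\\\server\\share\\...)."""
    drive, _tail = ntpath.splitdrive(path)
    # splitdrive returns "\\\\server\\share" as the drive for UNC paths,
    # and "S:" for mapped drive letters.
    return drive.startswith("\\\\")

print(is_unc(r"\\hpcheadnode\scratch\mySim"))  # True: UNC form
print(is_unc(r"S:\mySim"))                     # False: mapped drive letter
```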

To start, we create a new simulation in a directory on the shared drive. To do this, run GSimComposer on the headnode; GSim must be installed on the shared drive to run Vorpal on a Windows Cluster, as Fig. 82 shows.

GSim installed on the shared drive

Fig. 82 GSim must be installed in the shared drive to run Vorpal on a Windows Cluster.

In GSimComposer, follow these actions:

saving a new simulation on the shared drive

Fig. 83 Choose the shared drive in the Directory field so that the subsequent New Folder is on that drive.

Continue with:

choosing a directory for the new simulation

Fig. 84 Choose the name of the containing directory of your new simulation.

Now:

  • Click the Create button in the New Folder dialog.

  • Click the Save button in the Choose Simulation Name dialog.

  • At this point, you would normally change the setup to suit your simulation needs and possibly run it locally to check if you are on the right track, but for now, just exit GSimComposer.

You should now have a simulation directory on your shared drive and be able to see it in Windows Explorer (see Fig. 85).

Simulation directory on the shared drive

Fig. 85 The simulation directory needs to be on the shared drive to run a simulation on a Windows Cluster.

Create a New Cluster Job

Jobs can be started from the command line and from the Cluster Manager tool. We will show how to create and submit a job from the Cluster Manager interface and leave it to the user to follow the Microsoft documentation on how to save the Job XML and run subsequent jobs from the command line.
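As a rough starting point for the command-line route, a submission might look like the sketch below. The option names should be verified against the Microsoft HPC Pack documentation, and the simulation name, output files, and vorpal executable location are placeholders, not confirmed paths:

```
job submit /numnodes:2 /workdir:\\hpcheadnode\scratch\mySim ^
    /stdout:mySimOut.txt /stderr:mySimErr.txt ^
    mpiexec -np 4 \\hpcheadnode\scratch\GSim\Contents\engine\bin\vorpal.exe mySim.pre
```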

Start the Cluster Manager tool, usually located at

C:\Program Files\Microsoft HPC Pack 2012\Bin\HpcClusterManager.exe

This should present a window as shown in Fig. 86.

Cluster Manager GUI

Fig. 86 The Microsoft HPC Pack Cluster Manager tool used to create and submit jobs to the cluster.

In the Cluster Manager, complete the following steps:

Cluster Manager New Job dialog

Fig. 87 The Cluster Manager New Job dialog is a wizard to help create and submit a job.

Continue with:

Cluster Manager New Jobs tasks

Fig. 88 Adding a task in the Cluster Manager New Job dialog.

Edit the task details as follows:

  • Edit the Task name field if desired.

  • Enter the command for running vorpal via MPI into the Command line field. Here you need the full path to the vorpal executable and to the input file for your simulation. All the normal vorpal command-line arguments apply.

  • The Working directory field must be set to the simulation directory we created above.

  • The Standard output and Standard error fields are optional, but they make it easier to organize the output.

  • Finally, select the number of nodes you would like to run on. The “-np 4” argument, together with the Minimum value of 2, says that we would like to run 4 MPI processes across 2 nodes.
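As a concrete illustration, the Command line field for this example might contain something like the following. The location of the vorpal executable under the GSim installation and the input-file name are assumptions to adapt to your own setup:

```
mpiexec -np 4 \\hpcheadnode\scratch\GSim\Contents\engine\bin\vorpal.exe \\hpcheadnode\scratch\mySim\mySim.pre
```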

The job Task Details dialog is shown in Fig. 89.

Cluster Manager New Jobs task parameters

Fig. 89 Creating parameters for the task in the Cluster Manager New Job dialog.

Once the job task details are set, we need to do one last set of steps:

Cluster Manager New Jobs environment

Fig. 90 Editing Job Properties for the task in the Cluster Manager New Job dialog.

In the Environment Variables dialog (Fig. 91),

Cluster Manager New Jobs Environment Variables dialog

Fig. 91 The Environment Variables dialog for the task allows the user to see and add variables.

click the Add button. This will bring up the Add Environment Variable dialog (see Fig. 92).

Cluster Manager New Jobs environment variable dialog

Fig. 92 The Add Environment Variable dialog, used here to add PYTHONPATH.

In this dialog, enter PYTHONPATH for the Name and the following for Value:

\\hpcheadnode\scratch\GSim\Contents\engine\share\scripts;\\hpcheadnode\scratch\GSim\Contents\engine\lib\site-packages;\\hpcheadnode\scratch\GSim\Contents\engine\lib;\\hpcheadnode\scratch\GSim\Contents\bin\lib\site-packages;\\hpcheadnode\scratch\GSim\Contents\bin\lib\python\lib;\\hpcheadnode\scratch\GSim\Contents\bin\lib\python\DLLs;\\hpcheadnode\scratch\GSim\Contents\bin;.

Replace the example UNC Share Path (\\hpcheadnode\scratch) with your own UNC Share Path throughout the PYTHONPATH value. The PYTHONPATH variable is required for all vorpal simulations to run; other variables, such as SIM_DATA_PATH, may be needed for a few select simulations.
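Because every entry in this value must begin with a double backslash, assembling it by hand is error-prone. A small Python sketch (a convenience, not part of GSim) can generate the value from your share root; the subdirectory layout below is taken from the example above:

```python
# Build the PYTHONPATH value from a UNC share root so that no entry
# is missing its leading double backslash.
share = r"\\hpcheadnode\scratch"  # replace with your own UNC Share Path
subdirs = [
    r"GSim\Contents\engine\share\scripts",
    r"GSim\Contents\engine\lib\site-packages",
    r"GSim\Contents\engine\lib",
    r"GSim\Contents\bin\lib\site-packages",
    r"GSim\Contents\bin\lib\python\lib",
    r"GSim\Contents\bin\lib\python\DLLs",
    r"GSim\Contents\bin",
]
# Join the entries with semicolons and append "." (the current directory).
pythonpath = ";".join(share + "\\" + d for d in subdirs) + ";."
print(pythonpath)
```

Paste the printed value into the Value field of the Add Environment Variable dialog.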