Help with ANSYS HPC Workflows in the Nimbix Cloud
In this article, we show how to get started with HPC workflows using ANSYS in the Nimbix Cloud. ANSYS and Nimbix have deployed the ANSYS software portfolio on the Nimbix cloud platform, offering a high-performance, low-cost cloud solution for ANSYS applications. This article reviews where your user data is stored in the Nimbix Cloud and how to configure HPC workflows for various ANSYS applications.
ELECTRONICS DESKTOP (using RSM)
For advanced HPC configuration of Electronics Desktop, refer to the manuals on the ANSYS Customer Portal.
You will need to follow these steps for each new job session that you run. To configure HPC with the RSM Manager in Electronics Desktop, follow the steps below. Buttons are highlighted in the image and labeled with the corresponding step number.
- Open the HPC configuration menu.
- Select your Design Type from the drop-down, such as HFSS.
- Click “Add” to open the panel to create a new configuration.
- Name this configuration “JARVICE”. Note: if JARVICE exists from a previous session, delete it and re-create it, because the MACHINES.txt file changes with each job.
- Delete any existing hostnames from the list, then click “Import Machines from File.”
- Use the file /home/nimbix/MACHINES.txt, which contains the hostnames and number of cores for your session. Then click “OK” to close out of these menus.
- Finally, from the dropdown, select “JARVICE” as your active configuration.
You may now right-click your project in the left navigation panel and click “Analyze All”, or, if you prefer “Submit Job,” import the JARVICE configuration to submit your job to RSM.
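Before importing MACHINES.txt into RSM, it can be useful to inspect it from a terminal and confirm the host list looks sane. The sketch below is illustrative only: it assumes each line has the form `hostname:cores`, which may not match the actual layout of your session's file, so check the file yourself first.

```shell
#!/bin/sh
# Sketch: summarize a machine list before importing it into RSM.
# ASSUMPTION: each line is "hostname:cores" -- the real MACHINES.txt
# layout may differ; inspect the file on your own session first.
sum_cores() {
  total=0
  while IFS=: read -r host cores; do
    [ -n "$host" ] || continue        # skip blank lines
    total=$((total + ${cores:-0}))
  done < "$1"
  echo "$total"
}

# On a live session you would point this at the session's machine file:
#   sum_cores /home/nimbix/MACHINES.txt
```

The total reported should match the core count you expect from the machine type you launched; a mismatch is a sign the JARVICE configuration from a previous session is stale.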
CFX
CFX is available on the Nimbix Cloud in both graphical and batch modes. We recommend using graphical mode for pre- and post-processing and batch mode for long-running simulations. For simulations run in batch mode, no further HPC configuration is needed; it is handled automatically by JARVICE, the software powering the Nimbix Cloud.
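For reference, a distributed batch solve with CFX is typically launched through `cfx5solve` with the `-def`, `-par-dist`, and `-partition` options. JARVICE builds this invocation for you, so the sketch below only assembles and echoes the command rather than executing it; the definition file name and host names are placeholders, not real files on your session.

```shell
#!/bin/sh
# Sketch only: JARVICE launches batch CFX runs automatically, but this
# is roughly what a distributed cfx5solve invocation looks like.
# "model.def" and the host names are placeholders.
HOSTS="node1,node2"       # comma-separated host list from your session
PARTITIONS=32             # typically one partition per core

CMD="cfx5solve -def model.def -par-dist $HOSTS -partition $PARTITIONS"
echo "$CMD"               # echoed rather than executed in this sketch
```

The partition count here plays the same role as the partition setting in the graphical Define Run dialog described below: one partition per available core.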
If your workflow does not permit batch operations, you can configure HPC for the CFX Solver from the graphical interface. We have already configured your user’s environment with a list of hosts, but you must manually add those hosts to your HPC configuration.
Follow these steps to manually configure HPC for your run definition:
- Open File->Define Run.
- Select Run Mode: Intel MPI Distributed Parallel.
- Clear the existing host(s) from the host list.
- Press the Insert Host button, which will display a list of available hosts for your HPC environment.
- Select all of the hosts listed.
- Set the partition count to 16 to use one core per partition.
- Configure the working directory to be in a project directory stored in your /data or /home/nimbix/data directory. This storage space is shared across all nodes.
- Start Run to begin your HPC analysis.
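Because only /data (also visible as /home/nimbix/data) is shared across nodes, a working directory anywhere else will not be visible to all solver processes. The helper below is a hypothetical convenience, not part of the CFX interface: it simply checks whether a path sits under the shared storage before you press Start Run.

```shell
#!/bin/sh
# Sketch: sanity-check that a chosen working directory lives on the
# shared storage (/data or /home/nimbix/data) so every node can see it.
# This helper is illustrative and not part of CFX or JARVICE.
on_shared_storage() {
  case "$1" in
    /data/*|/home/nimbix/data/*) return 0 ;;
    *) return 1 ;;
  esac
}

# Example:
#   on_shared_storage /data/myproject && echo "OK: visible to all nodes"
```

A directory under /tmp or a node-local scratch path would fail this check, and a distributed run started there would fail or silently lose partition data.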
For more information on tuning your applications for HPC use (memory consumption, cores, tasks, etc.), refer to the ANSYS Customer Portal at https://support.ansys.com/portal/site/AnsysCustomerPortal.
If you have any questions, please contact sales at email@example.com, or for technical assistance, email firstname.lastname@example.org. Our team will assist with standard and customized workflows to better serve your simulation needs.