The Deans and faculties of KSAS and WSE have partnered to create a Homewood High Performance Cluster (HHPC). The HHPC integrates the resources of many PIs to create a powerful and adaptive shared facility designed to support large scale computations on the Homewood Campus.
The HHPC is managed under the aegis of IDIES and operates as a Co-Op. Hardware is provided by users, who then get a proportional share of the pooled resources. The networking infrastructure and a systems administrator, Jason Williams, are provided by the Deans of the Whiting School of Engineering and the Krieger School of Arts and Sciences.
The first cluster, HHPCv1, came online in December 2008. It currently contains 130 compute nodes with over 1000 cores connected by a DDR Infiniband switch. A new cluster, HHPCv2, came online in the newly renovated Bloomberg 156 Data Center in November 2011. It currently has over 200 nodes, each with 12 cores and 48GB of RAM. The QDR Infiniband switch can accommodate up to 400 additional nodes. Both clusters are connected by high-speed links to other clusters in Bloomberg 156, including the Datascope and the 100TFlop Graphics Processor Laboratory. Bloomberg 156 has 10 gigabit connections to the rest of JHU and Internet2, and these are being upgraded to 100 gigabit connections with funding from the National Science Foundation.
Researchers on the Homewood campus are encouraged to consider contributing hardware to the HHPC. Contact Mark Robbins or Jason Williams if you are interested in learning more. The minimum buy-in is 8 nodes, or about $33,000 at the present time.
Under the management plan for the HHPC, 10 percent of the compute time can be allocated by the Deans to faculty on the Homewood campus. The priorities for this time are: meeting temporary surges in compute needs for research projects at Homewood, providing access to new hires and new contributors before the nodes they have ordered arrive, and allowing potential members of the HHPC to "kick the tires". For more information, and to apply for time on the HHPC, see the HHPC application below.
For more detailed information about the cluster, see the HHPC wiki.
All users are encouraged to be affiliates of IDIES.
Please attach a .pdf file addressing the following questions: