
[Infographic] Specializing in Workflow-Driven Storage


Modeling efforts of staggering scale, simulations performed at high-definition resolutions and big data initiatives are all driving data-intensive research projects in contemporary high-performance computing environments. At Cray, we are responding to rising storage requirements across a variety of storage workflows, from ingest to archive. We bring real experience to the task: our user base of customers running Lustre systems is so large that more than 120 petabytes of data capacity have been deployed on Cray® supercomputers and big data systems. Furthermore, we co-founded OpenSFS and are among the partners in the open-source Lustre movement. We have the experience to give you the storage functionality you need for your HPC … [Read more...]

iVEC and Pawsey Centre Taking Full Advantage of Cray Systems


The Pawsey Centre in Perth is one of the preeminent supercomputing facilities in Australia. The centre was built specifically to host supercomputing systems and to support the advanced scientific research these technological powerhouses make possible. Managed under the auspices of iVEC, the Pawsey Centre plays host to several supercomputers, including two Cray® XC30™ systems nicknamed “Magnus” and “Galaxy”. iVEC, an unincorporated joint venture between CSIRO, Curtin University, Edith Cowan University, Murdoch University and the University of Western Australia that is supported by the Western Australian Government, has been taking advantage of the resources at the Pawsey Centre to complete … [Read more...]

NERSC’s “Edison” Unleashed


Now that our new flagship supercomputer Edison, a 2.5-petaflop Cray® XC30™, has been in full production mode for a couple of months, it seems like a good time to check in and see how scientists are using it. At the top of the list of hours used are teams of scientists studying the fundamentals of the standard model of particle physics, the structure of the Earth's subsurface, clustering of matter in the early universe, fusion energy, clean combustion, how salts bind to water, nano-characteristics of catalysts, table-top accelerators, carbon sequestration, and extreme climate events. If that seems like a diverse and intriguing array of topics, that's because it is.  NERSC, the National Energy Research Scientific Computing Center, … [Read more...]

Discussions at the Rice Conference Highlight Role of HPC Systems in Oil & Gas


I just came back from the annual Oil and Gas High Performance Computing (HPC) Workshop hosted by Rice University, and I can say the event has grown rapidly into a broad investigation of the use and role of HPC in the Oil & Gas segment. In terms of attendance, the conference impressed with more than 300 registrants. For the first time, it also featured “Lightning Talks”: those rapid-fire, few-slide, big-font, special-effects presentations so beloved by the West Coast/Silicon Valley computer industry. (Perhaps next year we’ll see some VCs showing up? The HPC industry in Oil & Gas is certainly big enough, and innovating dramatically enough, to warrant their presence.) At such events, it’s customary … [Read more...]

MSU Turns to Liquid-Cooled Cluster Supercomputer


Mississippi State University needed a powerful, efficient new primary supercomputing system for its High Performance Computing Collaboratory (HPC2), a coalition focused on advancing the state of the art in computational science and engineering using high performance computing. The university chose the Cray® CS300™ liquid-cooled cluster supercomputer. Nicknamed "Shadow," MSU's new Cray CS300-LC cluster delivers 316.1 teraflops of peak performance while using minimal energy. This efficiency comes in part from a hybrid architecture featuring Intel® Xeon® processors and Intel® Xeon Phi™ coprocessors, and in part from the system's use of warm water for cooling. Almost four years ago, the University installed its first … [Read more...]