Thin Clients, VDI and Linux integration from the front lines.... Raw and sometimes unedited notes based on my experiences with VMware, Thin Clients, Linux etc.

With over 60,000 CPUs in this beast, it certainly packs some punch. And if you have a task that needs serious computing like this, you can actually submit a request to get your jobs processed (submit proposals here) - how cool is that?

At 500 teraflops, though, this is quite a significant increase in global computing power, and it makes you wonder just how much further it can go. What if someone were able to build and prove a quantum computer? How much would that influence these sorts of projects?
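Just as a quick back-of-envelope check of those two headline numbers (the 500 teraflops and the 60,000-plus CPUs are the figures from this post; real per-core peak performance will differ):

```python
# Back-of-envelope: average rate per core if Ranger's ~500 teraflops
# were spread evenly over ~60,000 cores (figures as quoted in the post).
total_flops = 500e12   # ~500 teraflops
cores = 60_000         # "over 60,000 CPUs"

per_core_gflops = total_flops / cores / 1e9
print(f"{per_core_gflops:.1f} GFLOPS per core")  # prints "8.3 GFLOPS per core"
```

So each core only needs to sustain on the order of 8 GFLOPS - it's the sheer scale, not exotic per-core speed, that gets you to half a petaflop.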

It does make me wonder whether there might be some breakthrough in how we manage the input side of things. Here we are with all this enormous computing power, and yet we are still effectively stuck with a keyboard? :-)

The World's Largest Supercomputing Cloud

I had no idea the Hubble telescope could see only 12 billion years into the past.

Frankly, I'd never really thought about telescopes looking into the past until Dr. Michael Norman, a researcher from UCSD, gave me a basic education in astronomy - and explained that the Hubble looks at celestial bodies whose light is just now reaching us. But it can "only" see 12 billion years into the past - and that was a veil he'd like to pierce. (When I asked him what he did for a living, he said, "I simulate the universe." Trump that job description.)


I was asked to give a keynote to celebrate Ranger's opening, and this was only one example of the flood of basic research and science that will now be performed on the world's largest open computing platform. Open? The facility was funded by the National Science Foundation, and is committed to providing large scale supercomputing as a service to any researcher or scientist within the US (submit proposals here). Ranger is built entirely on Sun - to dip into geekspeak for a moment, here are the stats:

  • In around 6,000 square feet of datacenter space, consuming less than 3 megawatts...
  • More than 4,000 quad-core Sun/Opteron blades, 120+ TB of DRAM, running CentOS
  • Delivering more than 500 teraflops computing capacity
  • Jobs scheduled by Sun's Grid Engine
  • Interconnected by two 100-terabit non-blocking Magnum switches (horns optional)
  • Data managed by the Lustre file system, on Thumpers
  • More than 2 petabytes of storage
  • Managed by our hierarchical data management SAM-FS product, archived to Sun tape platforms
  • With overall systems managed and monitored by xVM OpsCenter (the world's largest installation).
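Since jobs on Ranger are scheduled by Sun's Grid Engine, a submission looks roughly like the sketch below. The script name, parallel environment, queue name and core counts here are illustrative placeholders, not Ranger's actual configuration; the `#$` lines are standard Grid Engine directives, and `ibrun` is the MPI launcher used on TACC systems:

```shell
#!/bin/bash
# hypothetical_job.sh - illustrative Grid Engine submission script
#$ -N my_simulation        # job name
#$ -pe 16way 64            # parallel environment and core count (example values)
#$ -q normal               # queue name (illustrative)
#$ -l h_rt=01:00:00        # wall-clock limit: 1 hour
#$ -cwd                    # run from the submission directory
ibrun ./my_mpi_binary      # launch the MPI executable across the allocated cores
```

You would then hand it to the scheduler with `qsub hypothetical_job.sh` and watch its progress with `qstat`.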

Jonathan Schwartz's Blog

Posted on Saturday, March 8, 2008 12:51 PM

Comments on this post: Supercomputing made easy with 123 Terabytes of RAM - typo? No, that's 123TB of RAM; the storage is around 2 petabytes

Copyright © Dave Caddick