
NASA Upgrades Weather Research Supercomputer

Cowards Anonymous writes "NASA's Center for Computational Sciences is nearly tripling the performance of a supercomputer it uses to simulate Earth's climate and weather, and our planet's relationship with the Sun. NASA is deploying a 67-teraflop machine that takes advantage of IBM's iDataPlex servers, new rack-mount products originally developed to serve heavily trafficked social networking sites."
This discussion has been archived. No new comments can be posted.

  • Big Question: (Score:4, Interesting)

    by Penguinisto ( 415985 ) on Wednesday September 24, 2008 @07:57PM (#25145193) Journal

    ...what are they doing to improve the algorithms used to calculate the results? And if they're transparent (e.g. open for public inspection) - bonus!

    (yes, I know that there are only a few folks in the Human race that would even know how to read the things. That said, it would be nice to have something educational, and at the same time open for public scrutiny so as to avoid political accusation, you know?)

    /P

    • I know that there are only a few folks in the Human race that would even know how to read the things

      That sounds like just the kind of challenge that can motivate any slashdotter into becoming an armchair mathemagician.

      • In the Greek mathematical Forum
        Young Euclid was present to bore'em.
        He spent most of his time
        Drawing circles sublime
        And crossing the pons asinorum.*

        wait, you said armchair mathemetrician, right?

        *not my poem [virginia.gov].

    • I would like to think that, in addition to throwing more hardware at the problem, the folks are smart enough to update the algorithms as much as they can. I would be curious, as well, about the general concepts behind their implementation.

    • by EmbeddedJanitor ( 597831 ) on Wednesday September 24, 2008 @08:39PM (#25145591)
      These are the models predicting Global Warming etc. These need to be open to peer review due to the significant impact of getting these models wrong.

      Faster does not mean better. I'd rather have fewer iterations per day on a good model than many on a crap model.

      • Re: (Score:2, Informative)

        by Anonymous Coward

        What makes you think they are not? Anyone can download both the source and the detailed documentation for any of the current or previous generation models. We use a coarse-resolution but full-physics model when we teach climatology. You can go to www.ccsm.ucar.edu/models to download, compile, and test the current generation climate model on your own. You may choose to reduce the resolution to shorten the run times, but that's up to you. This openness is in contrast to The Viscount Monckton of Benchley's

      • Re: (Score:3, Insightful)

        I'd rather have fewer iterations per day on a good model than many on a crap model.

        Well, as long as the simulation doesn't go slower than the weather itself. Sounds silly, but it's a relevant point.

      • Who needs models? Just call the Russians and ask them.
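
        The "slower than the weather itself" point above is just arithmetic: a forecast is only useful if the model outruns real time by a wide margin. A minimal sketch in Python; the lead time and wall-clock budget below are illustrative assumptions, not figures from the article:

          # A forecast is only useful if the model runs well ahead of
          # real time. Both numbers are illustrative assumptions.
          forecast_days = 5.0           # lead time we want to simulate
          wallclock_budget_hours = 3.0  # time before the forecast is stale

          speedup = forecast_days * 24.0 / wallclock_budget_hours
          print(f"Need to run {speedup:.0f}x faster than real time")
          # -> Need to run 40x faster than real time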

    • Is this really NASA's job? Isn't there some other organisation in USA that does weather prediction etc?

      No wonder they're not getting anywhere replacing aging shuttle fleets if they are playing with rubber ducks and earth climate modelling.

      • by khallow ( 566160 ) on Wednesday September 24, 2008 @09:08PM (#25145841)

        From the National Aeronautics and Space Act (which authorizes NASA and its activities):

        (d) The aeronautical and space activities of the United States shall be conducted so as to contribute materially to one or more of the following objectives:

        (1) The expansion of human knowledge of the Earth and of phenomena in the atmosphere and space;

        (4) The establishment of long-range studies of the potential benefits to be gained from, the opportunities for, and the problems involved in the utilization of aeronautical and space activities for peaceful and scientific purposes;

        (5) The preservation of the role of the United States as a leader in aeronautical and space science and technology and in the application thereof to the conduct of peaceful activities within and outside the atmosphere;

        • by vought ( 160908 )

          Thank you for looking it up and posting it. I know most people think NASA just does airplanes and spaceships, but that'd be a sad and narrow scope for an agency that is charged not only with doing missions once we're in space, but with getting through the atmosphere in the first place.

          Problem is, those airplanes and spaceships gotta get off the earth. And to do that, they often fly through weather - so yes, it's part of NASA's purview. In fact, the Marshall Space Flight Center often provides imagery of the m

        • (1),(4),(5)? What happened to (2) and (3), are they still on Mars exploring?

          They must have some pretty advanced counting methods over there at NASA. Either that or they royally screwed up the 1, 2, 5 -- no 3! gag from Holy Grail.
        • (1)The expansion of human knowledge of the Earth and of phenomena in the atmosphere and space;

          Still, NASA should be launching the satellites that collect the data. NOAA should be crunching the numbers.

          This only reaffirms my belief that there needs to be a massive restructuring of our government. There are 16 intelligence agencies. [wikipedia.org] 16! How does the Director of National Intelligence [wikipedia.org] keep track of it all?

          How does one coordinate a government this big? I guarantee there are multiple government agencies working on the same projects, while some projects get left behind because no single agency has

          • There are 16 intelligence agencies. [wikipedia.org] 16! How does the Director of National Intelligence [wikipedia.org] keep track of it all?

            Probably another supercomputer.
      • Re: (Score:3, Interesting)

        Perhaps NASA conducts so much peripheral research because there's no dedicated government agency for general scientific research.

        I know that we have NOAA for atmospheric research, but perhaps there needs to be an overarching government agency for scientific research in general. NASA, NOAA, and probably NIST would be branches or departments under such an agency, and all research that is pertinent to our societal advancement, but does not have a dedicated agency such as NASA or NOAA, would be conducted un

        • NASA = National Aeronautics and Space Administration
          NOAA = National Oceanic and Atmospheric Administration
          Administrations are just that: administrations, run by administrators - primarily to the advantage of administrators. Adding another level of administration won't fix that.
      • Is this really NASA's job? Isn't there some other organisation in USA that does weather prediction etc?

        No wonder they're not getting anywhere replacing aging shuttle fleets if they are playing with rubber ducks and earth climate modelling.

        The parent post was not insightful.

        Anything that can affect aviation and spaceflight is critical to understand. Furthermore, their charter charges them not only with understanding these phenomena, but also with advancing scientific knowledge for applications in these arenas.

    • Re:Big Question: (Score:4, Informative)

      by toby34a ( 944439 ) on Wednesday September 24, 2008 @09:26PM (#25145935)
      You know, a lot of the climate and weather prediction models are open source. You can download the source code if you want, and run it on your own PC if you have the right compilers. Some links for your own perusal:
      Community Climate Model [ucar.edu]
      NASA GISS Model [nasa.gov]
      Weather Research and Forecasting Model [wrf-model.org]
      Regional Atmospheric Modeling System [atmet.org]
      As long as you have access to a Linux/Unix machine, you can get these models yourself. If you want to contribute, you can do so. Just know that you probably need graduate-level coursework in numerical methods, and a real grasp of the physical terms in the model, to make changes that mean something. Science in this case is rather open. People can easily download these models and make changes to improve them if needed (or to test sensitivity, etc.).
    • Re:Big Question: (Score:4, Informative)

      by Leebert ( 1694 ) * on Wednesday September 24, 2008 @10:31PM (#25146457)

      Lots of the models are open. There's a nice site at: http://modelingguru.nasa.gov/ [nasa.gov]

    • Re: (Score:3, Funny)

      by SL Baur ( 19540 )

      No, no, no!

      This is slashdot and the big questions are:
      does it run Linux?
      FTFA:

      IBM said its new server, which runs Linux and is based on Intel's quad-core Xeon processors

      W00t!

      Can you imagine a Beowulf cluster of these things, and can you give me a car analogy of how fast they run?

      I, for one, welcome our new IBM iDataPlex overlords.

      • Re: (Score:2, Funny)

        by Kardos ( 1348077 )
        You must be new here.

        This is slashdot and the big question is: does it run Vista?

        In Soviet Russia, supercomputers simulate you!

        There, fixed it for you.

    • I think Al Gore was involved...
    • by StevisF ( 218566 )
      Much of the simulation is CFD (computational fluid dynamics). The formulas governing fluid movement [wikipedia.org] have been well established for a long time. The model uses a 3D grid of cells; when something (air, water) moves across a cell boundary, the equations have to be evaluated. The equations are computationally expensive, though. You can always use a smaller cell size, which will increase the accuracy of the model. A good example of this is turbulence [wikipedia.org]. It's not directly numerically computed even though
      • by S3D ( 745318 )
        Well, I worked with numerical simulation of liquids (not water, melted semiconductors), so I had some exposure to different numerical methods for CFD. There are a lot of different numerical methods, so while the equations are well established, the numerical methods for solving them are by no means settled. There are multiresolution methods, methods that numerically conserve different quantities (various combinations of energy, momentum, entropy, etc.), adaptive meshes, methods with different tricks to fight numerical instability
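
        To make the grid-size trade-off in this sub-thread concrete, here is a toy sketch in Python (with numpy): 1D linear advection solved with a first-order upwind scheme at several resolutions. It is a deliberately simplified stand-in for real CFD; the scheme, resolutions, and error measure are illustrative choices, not anything from NASA's models. Finer grids give smaller errors, at the price of more cells and more time steps:

          import numpy as np

          # Toy 1D advection du/dt + c*du/dx = 0 on a periodic domain,
          # first-order upwind scheme. Halving the cell size roughly
          # halves the error here; accuracy costs grid points, and grid
          # points cost flops.
          def max_error(n_cells, c=1.0, t_end=0.5):
              x = np.linspace(0.0, 1.0, n_cells, endpoint=False)
              dx = 1.0 / n_cells
              dt = 0.4 * dx / c                     # CFL-limited step
              steps = int(round(t_end / dt))
              u = np.exp(-200.0 * (x - 0.25) ** 2)  # Gaussian bump
              for _ in range(steps):
                  u = u - (c * dt / dx) * (u - np.roll(u, 1))
              exact = np.exp(-200.0 * (x - 0.25 - c * steps * dt) ** 2)
              return float(np.abs(u - exact).max())

          for n in (100, 200, 400, 800):
              print(f"{n:4d} cells: max error {max_error(n):.3f}")
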
  • by the eric conspiracy ( 20178 ) on Wednesday September 24, 2008 @07:59PM (#25145205)

    Does this mean that the forecasting simulation for tomorrow's weather will run in less than 24 hours?

  • NASA Engineer: "You know, chief, I've been thinking. I bet we could just about triple the performance of this thing if we supercooled it."
    Manager: "Super what?"
    Engineer: "Chilled it to absolute zero, like in the large hadron supercollider. Speeds up the electrons."
    Manager: "What would you need to do that?"
    Engineer: "Oh, I don't know, maybe... a ton of liquid helium?"

  • GPGPU (Score:3, Interesting)

    by sdemjanenko ( 1296903 ) on Wednesday September 24, 2008 @08:29PM (#25145497) Homepage
    Seeing as 67 teraflops is going to be the new processing power for this machine, I wonder if an NVIDIA CUDA implementation has been considered. Their Tesla systems are designed for this kind of high-performance computing, offer a significant amount of processing power, and are relatively easy to parallelize code for. I know that oil companies use these high-powered systems to find oil deposits, but I guess it's less likely for weather forecasting since there is less money in it. Still, it would be interesting to see these cards used to model hurricanes and determine their expected strength and path more accurately.
    • by Junta ( 36770 ) on Wednesday September 24, 2008 @11:15PM (#25146795)

      When an xhpl score says '67 teraflops' and NVIDIA/AMD GPUs spout off about the ludicrous number of gigaflops they have, it simply isn't the same measurement.

      For example, the PS3 variant of the Cell processor claims 410 gigaflops. Its HPL score, however, would be about 6-9 gigaflops. Even the new Cell processors 'only' get 200 gigaflops by xhpl count.

      32-bit precision scores aren't directly comparable to 64-bit operations.
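
      The gap the parent describes is easy to see for yourself: time a double-precision matrix multiply (the kernel Linpack-class benchmarks spend their time in) and compare the achieved rate with your CPU's advertised peak. A rough sketch in Python with numpy; the matrix size is arbitrary, and the result depends entirely on your BLAS and hardware:

        import time
        import numpy as np

        # Time a double-precision matmul and report achieved GFLOP/s.
        # Compare against your CPU's advertised peak; the gap is the
        # peak-vs-sustained distinction described above.
        n = 2000
        a = np.random.rand(n, n)
        b = np.random.rand(n, n)

        start = time.perf_counter()
        c = a @ b
        elapsed = time.perf_counter() - start

        flops = 2.0 * n ** 3  # multiplies + adds in an n-by-n product
        print(f"Achieved {flops / elapsed / 1e9:.1f} GFLOP/s (64-bit)")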

  • Use rubber duckies [slashdot.org] in the cooling pool?
  • Is tripling the performance really that newsworthy? Either they update their hardware infrequently (every few years or more) and are falling behind the rest of the computer industry, or they are doubling or tripling every year or two, which makes this an unremarkable event. An order-of-magnitude jump would be a scale of real accomplishment - or, even better, tripling the accuracy of the weather predictions.
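
    Whether tripling is impressive depends on the upgrade cycle, and the arithmetic is quick to check. A sketch in Python; the three-year cycle and the 18-month doubling period are both assumptions for illustration, not figures from the article:

      # Convert "tripled over one upgrade cycle" into an annual rate
      # and compare with doubling every 18 months. The cycle length
      # is an assumed figure.
      cycle_years = 3.0
      tripling = 3.0 ** (1.0 / cycle_years) - 1.0
      doubling_18mo = 2.0 ** (1.0 / 1.5) - 1.0
      print(f"Tripling over {cycle_years:.0f} years: {tripling:.0%}/year")
      print(f"Doubling every 18 months: {doubling_18mo:.0%}/year")
      # -> ~44%/year vs ~59%/year: tripling over a multi-year cycle
      #    roughly keeps pace with commodity hardware trends.
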
  • I wonder if they take into account the heating that toy generates when making those calculations! ^_^
  • Comment removed based on user account deletion
  • Isn't this small potatoes compared to the power of the distributed project climateprediction.net?

    duke out

  • Not to knock NASA (I'm rather fond of them), but what are they doing in the weather prediction biz, anyway? Last I checked, weather and climate studies were in NOAA's domain.

    Cross-agency collaborations are great and appropriate, but in general I'd just as soon see NASA's budget dollars stay invested in space research.

  • Eeek (Score:4, Informative)

    by Zashi ( 992673 ) on Thursday September 25, 2008 @09:21AM (#25150141) Homepage Journal

    iDataPlex? Really? I am a tester at IBM. We've just started to qualify various hard drives and IO cards for the iDataPlex systems. They're very oddly designed and in general suck. The firmware (BIOS/uEFI) is really crappy, but it usually is at this stage of testing; I'm sure it will get better over time. The thing that most likely will not get better is the horrible, horrible physical design (which was specially requested by Facebook). I would say the reason is unknown, but from what I've heard, Facebook didn't want to upgrade their racks/rails, so they had IBM design servers to fit them.

    There are lots of curious and pointless design features. They're almost like big-ass blades, designed to slide out of a larger outer housing that contains the PSU and fans, but several cables and wires connect the machine to the outer housing, making it impossible to remove without also removing the outer housing from the rack. In one variant, the PCI slot is literally in the middle of the system (imagine a card slot in the middle of your motherboard that, when a card is inserted into it, acts as a locking bar).

    All the ports are in the front of the system: VGA, USB, Ethernet. Except for power. Power is in the back, attached to the external shell. There are also PS/2 ports (a rarity among newer servers), but they are completely blocked by the faceplate.

    My overall reaction: meh.

  • As much as I hate the thought, your funding is about to go bye bye.

  • plz provide a fps spec for Crysis. ~:-)
