Einstein@Home Set To Break Petaflops Barrier

Unknown Lamer posted about 2 years ago | from the onward-upward dept.

Supercomputing 96

hazeii writes "Einstein@home, the distributed computing project searching for the gravitational waves predicted to exist by Albert Einstein, looks set to breach the 1 petaflops barrier around midnight UTC tonight. Put in context, if it were in the Top500 supercomputers list, it would come in at number 24. I'm sure there are plenty of Slashdot readers who can contribute enough CPU and GPU cycles to push them well over 1,000 teraflops — and maybe even discover a pulsar in the process." From their forums: "At 14:45 we had 989.2 TFLOPS with an increase of 1.3 TFLOPS/h. In principle that's enough to reach 1001.1 TFLOPS at midnight (UTC), but very often, like yesterday, between 22:45 and 22:50 there occurs a drop of about 5 TFLOPS. So we will very likely have hit 1 PFLOPS by early morning tomorrow."
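
A quick sanity check of that projection, as a minimal Python sketch (the 14:45 reading and the 1.3 TFLOPS/h growth rate are from the quote; the linear extrapolation is an assumption):

    # Linear extrapolation of the throughput figures quoted above.
    rate_at_1445 = 989.2              # TFLOPS measured at 14:45 UTC
    growth_per_hour = 1.3             # TFLOPS gained per hour
    hours_to_midnight = 24.0 - 14.75  # 14:45 -> 24:00 UTC is 9.25 hours

    projected = rate_at_1445 + growth_per_hour * hours_to_midnight
    print(f"Projected at midnight UTC: {projected:.1f} TFLOPS")
    # -> 1001.2 TFLOPS, in line with the ~1001.1 TFLOPS figure in the quote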

10001.1 TFLOPS, eh? (0)

Anonymous Coward | about 2 years ago | (#42451889)

/.: We Have Editors, But Damned If We Know Why!

Re:10001.1 TFLOPS, eh? (0)

Anonymous Coward | about 2 years ago | (#42451911)

It was a cut and paste from the original article; I doubt Slashdot editors can edit a page on the http://einstein.phys.uwm.edu/ [uwm.edu] forum it came from.

Re:10001.1 TFLOPS, eh? (0)

Anonymous Coward | about 2 years ago | (#42451969)

It was a cut and paste from the original article; I doubt Slashdot editors can edit a page on the http://einstein.phys.uwm.edu/ [uwm.edu] forum it came from.

But they damn well can -- and should -- point out that it's an error in the original source, and what the obvious meaning is.

Re:10001.1 TFLOPS, eh? (3, Informative)

Unknown Lamer (78415) | about 2 years ago | (#42451999)

I don't think the poster is a native speaker and I fixed a bunch of other obvious typos... but missed that extra zero there.

Re:10001.1 TFLOPS, eh? (5, Funny)

JustOK (667959) | about 2 years ago | (#42452411)

A zero is nothing, therefore you missed nothing.

Re:10001.1 TFLOPS, eh? (2)

Abstrackt (609015) | about 2 years ago | (#42453831)

But he could have missed it to a higher degree of precision!

Re:10001.1 TFLOPS, eh? (0)

Anonymous Coward | about 2 years ago | (#42454407)

Amazing, Unknown Lamer gets a +5 informative while timothy ends up at -1 overrated.
I wonder why...

Re:10001.1 TFLOPS, eh? (0)

Anonymous Coward | about 2 years ago | (#42456755)

Like you said, it was a quote and a very obvious error. Why make such a BFD about it then?

Re:10001.1 TFLOPS, eh? (0)

Anonymous Coward | about 2 years ago | (#42453203)

/.: We Have Editors, But Damned If We Know Why!

To provide flops.

E@H success (1, Informative)

Anonymous Coward | about 2 years ago | (#42451919)

Although Seti@Home is probably the best-known project (or used to be), E@H is probably the most successful one from a pure science perspective. They have actually managed to discover new pulsars that nobody had seen before, and unlike some slightly shady DC projects (some of them actually for-profit), their data is accessible. Good job, E@H team!

Re:E@H success (0)

Anonymous Coward | about 2 years ago | (#42454595)

Maybe because SETI@home is fucking retarded and pointless.

Re:E@H success (0)

Anonymous Coward | about 2 years ago | (#42454699)

Maybe because SETI@home is fucking retarded and pointless.

I don't know, it seems to have been a decent proving ground for the "run this distributed computing process in your spare time" model of super-computing.

Re:E@H success (1)

Macrat (638047) | about 2 years ago | (#42454787)

Although Seti@Home is probably the best-known project (or used to be), E@H is probably the most successful one from a pure science perspective.

Unlike the pure science of Folding@home?

Re:E@H success (0)

Anonymous Coward | about 2 years ago | (#42458271)

True, and that only heightens the irony that the supposed 'lead' program at E@H, the LIGO project, has made precisely ZERO gravity wave observations.
IMO, LIGO has become a billion-dollar black hole; the never-ending 'upgrade$' and promises of 'Big Things to Come!' are a perfect example of Big Science run amok.

I run E@H ONLY because of the Fermi and Arecibo data and the real science results they produce.

Top500 doesn't work that way (1)

Anonymous Coward | about 2 years ago | (#42451923)

You have to run the Linpack benchmark and report that.

Re:Top500 doesn't work that way (3, Insightful)

kasperd (592156) | about 2 years ago | (#42452655)

You have to run the Linpack benchmark and report that.

And I guess no distributed computing platform is ever going to score in the top 500 according to that benchmark. Communication performance between nodes is very important to most parallel algorithms, and any decent benchmark takes that into account. A real supercomputer has much faster communication between nodes than you can achieve across the Internet; both throughput and latency matter. There are some specific problems which can be split into parts that nodes can compute independently, without communication between them, but most supercomputers are used for tasks that do not fall into that class.
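
To make the communication point concrete, here is a rough back-of-envelope sketch in Python; all the latency, bandwidth, and workload numbers are illustrative assumptions (order-of-magnitude guesses for a cluster interconnect vs. a home Internet link), not measurements:

    # Toy model of one iteration of a parallel job: each node computes on its
    # chunk of data, then exchanges boundary data with its neighbours.
    def iteration_time(work_flops, node_flops, msg_bytes, latency_s, bandwidth_Bps):
        compute = work_flops / node_flops
        communicate = latency_s + msg_bytes / bandwidth_Bps
        return compute + communicate

    work = 1e8    # flops of work per node per iteration (assumed)
    speed = 1e10  # ~10 GFLOPS sustained per node (assumed)
    msg = 1e6     # 1 MB exchanged per iteration (assumed)

    cluster = iteration_time(work, speed, msg, 2e-6, 5e9)       # ~2 us, ~40 Gbit/s link
    internet = iteration_time(work, speed, msg, 50e-3, 1.25e6)  # ~50 ms, ~10 Mbit/s link

    print(f"cluster:  {cluster * 1e3:.1f} ms/iteration")   # ~10.2 ms: compute-bound
    print(f"internet: {internet * 1e3:.1f} ms/iteration")  # ~860 ms: communication-bound

Under these assumptions the same algorithm runs almost two orders of magnitude slower over the Internet, which is why volunteer projects stick to embarrassingly parallel workloads.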

At some point I heard the rule of thumb that when you are building a supercomputer, you divide your funds into three equal parts. One of those was to be spent on the interconnect. I don't recall what the other two were supposed to be spent on.

Re:Top500 doesn't work that way (3, Funny)

Anonymous Coward | about 2 years ago | (#42452883)

At some point I heard the rule of thumb that when you are building a supercomputer, you divide your funds into three equal parts. One of those was to be spent on the interconnect. I don't recall what the other two were supposed to be spent on.

Hookers and blow.

Re:Top500 doesn't work that way (1)

BarfooTheSecond (409743) | about 2 years ago | (#42457685)

I agree. That's what makes all the difference between such distributed systems or simple clusters and real massively parallel supercomputers, with a single kernel instance running on thousands of CPU cores accessing the same distributed shared memory. Opportunistic distributed batch jobs are nothing like that; the two don't compare.

folding@home (2, Insightful)

etash (1907284) | about 2 years ago | (#42451929)

genuine question:

wouldn't it be wise, for practical* reasons, for people to offer more power to folding@home instead of einstein@home?

* = has more chances of helping humanity (curing diseases, etc.)

Re:folding@home (0)

Anonymous Coward | about 2 years ago | (#42451971)

No, it wouldn't.

Re:folding@home (0)

Anonymous Coward | about 2 years ago | (#42452021)

* = has more chances of helping humanity (curing diseases, etc.)

*shrug* I've never been that big a fan of humanity, so I don't really care about that. Gravity waves though... flying cars, gravity guns... how can anyone resist?

Re:folding@home (1)

I_am_Jack (1116205) | about 2 years ago | (#42452443)

*shrug* I've never been that big a fan of humanity, so I don't really care about that. Gravity waves though... flying cars, gravity guns... how can anyone resist?

But you kind of need humanity to a) develop new gravity wave technologies and b) then actually enjoy the benefits of said new technologies. Unless of course you see these new gravity wave devices as a lagniappe for your robot and AI friends.

Re:folding@home (-1)

Anonymous Coward | about 2 years ago | (#42452677)

lagniappe
haha last i heard that word i was in nawlins rebuilding after katrina
that word wasnt ever used in a good way--i dont speak french /creole wtf it is but that word brought back a lotta memories---some of them actualy good
thank you
captcha =relative(einstien haha)

Re:folding@home (1)

Jafafa Hots (580169) | about 2 years ago | (#42454129)

There are just under 7 billion people on Earth.
We are projected to hit 8 billion in around ten years or so,
and 9 billion an even shorter time after that.

We are not facing a shortage.

Re:folding@home (0)

Anonymous Coward | about 2 years ago | (#42452027)

Shouldn't we devote our resources to curing poverty in Africa instead of spending time trying to go to Mars?

Re:folding@home (0)

etash (1907284) | about 2 years ago | (#42452239)

finding a permanent cure for a genetic disease is not like providing food to Africa.
giving someone an aspirin for his pain = giving a person a fish
finding a permanent cure for a genetic disease = teaching someone how to fish (but for a totally different problem than eating)

Re:folding@home (0)

Anonymous Coward | about 2 years ago | (#42452567)

You mistake my meaning of the word "cure," and are therefore completely missing the point.

Re:folding@home (2, Interesting)

Anonymous Coward | about 2 years ago | (#42452651)

No. But if we wanted, we could do both.

But play a little mind game. Imagine that you are a super genius who could create a magical box within an hour. This box could create anything from nothing: even another similar box, a cure for everything, or food.

Would you rather spend your whole life helping Africa than invent this box? Remember that with the box, you could help Africa too.

If yes, how about if it took 2 hours? 4? A year?

But you are not a genius, and the box is not a box. The box might be a robot based on technology that was invented when we tried to get to Mars. That is why we need to go to Mars rather than help Africa with all we've got.

Re:folding@home (0)

Anonymous Coward | about 2 years ago | (#42452939)

Who gives a shit about africa?

Re:folding@home (1)

synapse7 (1075571) | about 2 years ago | (#42453515)

How about paving roads in Michigan?

Re:folding@home (1)

Farmer Pete (1350093) | about 2 years ago | (#42454387)

Roads? Where we're going, we don't need roads! Sorry, with flying cars, roads would be a waste.

Re:folding@home (0)

Anonymous Coward | about 2 years ago | (#42452041)

Gravity afflicts 100% of the world's human population, so I think it's a better way to help humanity.

Re:folding@home (1, Interesting)

hawguy (1600213) | about 2 years ago | (#42452053)

genuine question:

wouldn't it be wise, for practical* reasons, for people to offer more power to folding@home instead of einstein@home?

* = has more chances of helping humanity (curing diseases, etc.)

Or, to put it another way: why waste resources studying astronomy when there are so many sick people in the world? Wouldn't it be better for humanity to put our resources into curing disease?

Re:folding@home (0)

Anonymous Coward | about 2 years ago | (#42452147)

Or maybe you are too short-sighted to see that this could potentially help everyone on the planet instead of a microcosm of it.
It could lead to a cure for limited resources, which, if I'm not mistaken, is part of the cause of the poverty in Africa.

Re:folding@home (1)

hawguy (1600213) | about 2 years ago | (#42452395)

Or maybe you are too short-sighted to see that this could potentially help everyone on the planet instead of a microcosm of it.
It could lead to a cure for limited resources, which, if I'm not mistaken, is part of the cause of the poverty in Africa.

Well no, not yet: humanity could feed everyone on the planet today if it could find a way past the artificial political barriers that are preventing it. And if there were no political and/or religious barriers against promoting education and contraception, we could continue to house everyone on this planet indefinitely.

It would be nice for humans to be able to escape the planet some day to ensure survival even if there's a catastrophic event here, but saying that we need to leave the planet because of limited resources ignores the obvious (and arguably cheaper and easier, though less "sexy") solution: use policy and technology to reduce the resource drain on this planet.

This will be necessary even if we do colonize other planets, since I don't see space travel ever being practical enough to move a few billion people off the planet to make more room for the rest of us.

Re:folding@home (0)

Anonymous Coward | about 2 years ago | (#42454633)

You don't see space travel as being practical enough because you are short-sighted.

Re:folding@home (0)

Anonymous Coward | about 2 years ago | (#42457319)

I don't see us packaging meat up for shipping around the stars. Better to pack the brains up and clone some new bodies when you get there. Much easier to ship ten thousand colonists packed into a shipping container than a Queen Mary-class liner.

Problems become easier when you change the question.

Re:folding@home (1)

Anonymous Coward | about 2 years ago | (#42454757)

You are right: how dare DARPA and PARC create all this silly technology. There is no way that the internet or computers could improve the lives of everyone in the world... [end sarcasm]

That is essentially your argument, because you fail to see how this new technology could be used to help people. Just because you don't see it doesn't mean it isn't useful.

I never meant to imply that this would only be used for getting us off this rock in case of catastrophe. That is one possible use. Maybe we could bring extra resources from elsewhere instead of migrating to those resources. They think that most of the water here is from elsewhere anyway (comets, etc.). What if we create some type of gravity ray from this and can pull in passing comets for when water becomes as valuable as gold?

So once again, without trying to offend, I say that you are short-sighted.
(Hey Galileo, quit looking at the stars, there are people starving. Does it matter if we aren't the center of the universe?)

Re:folding@home (0)

Anonymous Coward | about 2 years ago | (#42454767)

Why should we feed everyone? Personally, I don't care that Africans are dying. That's life. People die.

Re:folding@home (1)

Anonymous Coward | about 2 years ago | (#42452199)

Here, I'll tee it up for you.

        We're all going to die. Why bother doing anything at all?

Pretty easy statement to refute; the same refutation works for all the other flavors as well.

But to address the OP: they are my flops, and I can do with them as I please, just as you can do whatever you like with yours.

Re:folding@home (1)

etash (1907284) | about 2 years ago | (#42452327)

IMHO you are comparing apples to oranges. Giving food to poor people in Africa does not solve any problem; the problem would be solved if they had a strong economy, education, etc. The equivalent of this in the field of medicine, with the added benefit that it SOLVES (instead of postpones) a particular set of problems for all of humanity (instead of just Africa), is FINDING a permanent cure for diseases.

Re:folding@home (1)

WaffleMonster (969671) | about 2 years ago | (#42452429)

Or, to put it another way: why waste resources studying astronomy when there are so many sick people in the world? Wouldn't it be better for humanity to put our resources into curing disease?

Protein folding simulation is such a large and basic need globally that there ought to be enough large-scale interest to make development of specialized ASICs for these problems cost-effective and exceedingly useful for all who need to run such simulations. A quick check of Google shows such chips do in fact exist, with unbelievable performance figures that kick the snot out of countless tens of thousands of CPUs/GPUs. There is no shortage of funding for medical research, so it raises the question: why waste CPU/GPU resources on folding simulations?

I still run SETI and MilkyWay@home, because there are no resources allocated for SETI, and MilkyWay@home is interesting to me personally.

Re:folding@home (2)

camperdave (969942) | about 2 years ago | (#42453541)

Why not use a bacterial or yeast culture and let them fold actual proteins?

Re:folding@home (2)

pszilard (1681120) | about 2 years ago | (#42455857)

Protein folding simulation is such a large and basic need globally that there ought to be enough large-scale interest to make development of specialized ASICs for these problems cost-effective and exceedingly useful for all who need to run such simulations. A quick check of Google shows such chips do in fact exist, with unbelievable performance figures that kick the snot out of countless tens of thousands of CPUs/GPUs. There is no shortage of funding for medical research, so it raises the question: why waste CPU/GPU resources on folding simulations?

I still run SETI and MilkyWay@home, because there are no resources allocated for SETI, and MilkyWay@home is interesting to me personally.

First of all, protein folding is not the only thing they do; the Folding@HOME infrastructure is used by many groups for a variety of bio-molecular studies [stanford.edu].

Secondly, custom ASIC-based machines like Anton [wikipedia.org] and MDGRAPE [wikipedia.org] (which are AFAIK the only such machines around these days) consist of much more than a custom chip; they use specialized interconnects, memory, software, etc., and cost a lot. The MDGRAPE-4, the coming version of the RIKEN-developed custom molecular simulation machine, costs $10M + $4M (development + manufacturing) [www.cscs.ch], which poses serious financial limitations. Moreover, these specialized machines are only able to run a handful of molecular dynamics algorithms and, while fast, they are nowhere near as versatile as general-purpose codes like AMBER, GROMACS, NAMD, etc. Although it is true that these specialized machines are a few orders of magnitude faster in terms of absolute performance (i.e. time to solution, not Flops), due to their limitations and the way they are used, some researchers argue [acm.org] that they employ a "brute force" approach [nvidia.com] to molecular simulations which is not cost-effective from the point of view of science per dollar delivered. I personally wouldn't call machines like Anton and MDGRAPE a complete waste; they achieve impressive advances in hardware, software, and science results in a specific direction: pushing the limits of how fast one can run a single simulation. There are certainly other (some would say better) ways to get amazing results with general-purpose (super)computers, be it using massive clusters or cycles donated to Folding@HOME.

Finally, let me explain why there is a compute-resource shortage in the (bio-)molecular simulation field, one which will remain for the foreseeable future no matter how much money governmental and non-governmental agencies pour into it. Molecular dynamics is extremely compute-intensive: a single iteration of the MD algorithm requires 10^8-10^10 Flops (not LINPACK Flops!), repeated millions of times during a single simulation of a bio-molecular system (and such a simulation can take weeks even on a big machine). And that's still a few orders of magnitude short of what would be needed to simulate the timescales at which biological processes take place. Therefore, any available compute resource can be harnessed for molecular simulation research, and Folding@HOME does a decent job of utilizing donated cycles. Admittedly, there are some in the community who think that Folding@HOME is wasteful, but that's a topic for another discussion.
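
To put those numbers in perspective, a small back-of-envelope sketch in Python; the per-step cost is the mid-range of the figure above, while the ~2 fs timestep and the sustained rates are illustrative assumptions:

    # Cost of simulating one microsecond of a biomolecule, using the figures above.
    flops_per_step = 1e9    # mid-range of the 1e8-1e10 flops/iteration above
    timestep = 2e-15        # a typical ~2 femtosecond MD timestep (assumption)
    steps = 1e-6 / timestep               # 5e8 iterations for one simulated microsecond
    total_flops = steps * flops_per_step  # 5e17 floating point operations

    for label, sustained in (("ideal 1 PFLOPS", 1e15), ("10 TFLOPS per run", 1e13)):
        print(f"{label}: {total_flops / sustained / 3600:.1f} hours")
    # -> ~0.1 hours at an ideal petaflop, ~14 hours at 10 TFLOPS. The catch is
    # that timesteps are sequential, so a single trajectory cannot be spread
    # across a whole machine; sustained per-simulation rates sit far below
    # peak, which is why single runs take weeks, as noted above.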

Disclaimer: I am involved in the development of the GROMACS open-source molecular simulation package, which is in fact one of the computational engines used by Folding@HOME. Still, I believe I have not been biased in the way I presented Folding@HOME and molecular dynamics in general.

Re:folding@home (1)

Anonymous Coward | about 2 years ago | (#42452081)

Sure. But wouldn't it be wise for practical reasons for you to go out and help out at a local soup kitchen instead of posting on Slashdot?

People are going to do what they want to do. Of the people who share CPU/GPU cycles, more are interested in Einstein.

Re:folding@home (0)

Anonymous Coward | about 2 years ago | (#42452093)

Possibly, but the same goes for a lot of charity work. You can't control what people want to give to, so you just have to be happy with what you can get.

Re:folding@home (4, Interesting)

gQuigs (913879) | about 2 years ago | (#42452109)

Discovery is not usually a straight line.

I donate to SETI@Home, Einstein@Home, LHC@Home, and a bunch of projects at WorldCommunityGrid; BOINC and GridRepublic make this easy. I believe Folding@Home is a separate standalone project, so it's all or nothing. In addition, there are a LOT of protein folding projects. I'd really like to see them work together, or explain why they are different.

Re:folding@home (3, Interesting)

Enderandrew (866215) | about 2 years ago | (#42452365)

Someone who only knows physics might not be able to help medical research, so scientific resources aren't entirely fungible. But CPU cycles are, so contributing to one particular distributed computing project does carry an opportunity cost of not supporting another.

Going off on a tangent here: while I echo your sentiment that people should be free to support whatever distributed computing project they want, I'm not sure people realize that SETI has basically already failed. They've covered their entire spectrum numerous times and have been listening for decades without finding anything. The entire project operates on the assumption that interstellar communication by another intelligent life form would occur over radio waves.

Requisite XKCD:

http://xkcd.com/638/ [xkcd.com]

If someone is contributing cycles to it and not to protein folding, then valuable medical research (which has been proven worthwhile) might be suffering literally out of ignorance. That is worth pointing out.

Re:folding@home (1)

e_hu_man (1277028) | about 2 years ago | (#42452649)

Going off on a tangent here: while I echo your sentiment that people should be free to support whatever distributed computing project they want, I'm not sure people realize that SETI has basically already failed. They've covered their entire spectrum numerous times and have been listening for decades without finding anything. The entire project operates on the assumption that interstellar communication by another intelligent life form would occur over radio waves.

well, the seti@home project may be in disarray, but it's a bit early to say that seti (the search for extra-terrestrial intelligence) in general has failed, isn't it? a few decades of silence from potential civilizations that may be thousands, millions or even billions of light years away can hardly be construed as strong evidence.

Re:folding@home (2)

Enderandrew (866215) | about 2 years ago | (#42452765)

The concept of searching for extra-terrestrial life hasn't failed, but their project of just scanning radio waves basically has. If another civilization used radio for interstellar broadcasts, we'd see steady, regular broadcasts. When we blanket a spectrum from a physical direction and don't see anything, it suggests no one is broadcasting radio waves.

There may be technologically advanced life forms out there broadcasting by other means, but repeatedly checking radio waves probably won't offer any real benefit.

Re:folding@home (0)

Anonymous Coward | about 2 years ago | (#42453471)

The concept of searching for extra-terrestrial life hasn't failed, but their project of just scanning radio waves basically has. If another civilization used radio for interstellar broadcasts, we'd see steady, regular broadcasts

I don't think this is necessarily true. Signals would likely be sent from highly directional antennas which vary their direction with time if intentionally operating as broadcast beacons. This maximizes the effect of limited broadcast energy, but it means there is no constant signal. For highly directional broadcasts, the orientation of the planet and the detection antennas translates to quite spotty reception.

Re:folding@home (2)

e_hu_man (1277028) | about 2 years ago | (#42454503)

i don't think this is true at all. scanning radio waves seems just as viable a means as any other to me. my point is that we need to wait for far more than a few decades of silence before the statement "seti is a failure" should even enter our thinking. there may be a civilization making radios identical to ours right now, and maybe they have been for as long as we have. but if they're 1,000 light years away (not very far in interstellar terms), decades of silence is the expected result.

Re:folding@home (3, Informative)

Aqualung812 (959532) | about 2 years ago | (#42454357)

I'm not sure people realize that SETI has basically already failed. They've covered their entire spectrum numerous times

The entire spectrum? We've only looked at one frequency range on 20% of the sky:

SETI@home is basically a 21-cm survey. If we haven't guessed right about the alien broadcasters' choice of hailing frequency, the project is barking up the wrong tree in a forest of thousands of trees. Secondly, there has been little real-time followup of interesting signals. Lack of immediate, dedicated followup means that many scans are needed of each sky position in order to deal with the problem of interstellar scintillation if nothing else.

With its first, single-feed receiver, SETI@home logged at least three scans of more than 67 percent of the sky observable from Arecibo, amounting to about 20 percent of the entire celestial sphere. Of this area, a large portion was swept six or more times. Werthimer says that a reasonable goal, given issues such as interstellar scintillation, is nine sweeps of most points on Arecibo's visible sky.

Quoted from http://www.skyandtelescope.com/resources/seti/3304561.html?page=5&c=y [skyandtelescope.com]

Also, when there is no work to be done, your computer can look at other things.

I donate my time to several medical studies that will likely find some results that will help all people. I also donate some time to climate research that has less of a chance of helping EVERYONE. I also donate some time to SETI which has a very, very small chance of changing the world.

It is called hedging your bets. I spend some CPU on things with low risk and low reward, and others on things with high risk and high reward.

Re:folding@home (1)

hazeii (5702) | about 2 years ago | (#42454479)

The XKCD reminds me of a story that, back in the 18th century, people wanting to know if the Moon was inhabited used their telescopes to look for signal fires.

Re:folding@home (0)

Anonymous Coward | about 2 years ago | (#42457473)

...so scientific resources aren't entirely fungible.

What's wrong with using the pre-existing (and real) word "interchangeable"? It's recognised by just about everyone who speaks English, whereas this fungible bullshit is recognised by what amounts to a small, probably elitist and ego-driven community.

Start using real words and we won't have to work to figure out what you mean. You also won't look like a dick.

Re:folding@home (1)

the gnat (153162) | about 2 years ago | (#42452407)

I believe Folding@Home is a separate standalone project, so it's all or nothing. In addition, there are a LOT of protein folding projects. I'd really like to see them work together, or explain why they are different.

Not only are there a lot of projects like this, most of them - whatever their intrinsic scientific merit - have very little direct application to fighting disease. Sure, the people directing the projects like to claim that they're medically relevant, but this is largely because the NIH is the major source of funding. It's also really difficult to explain the motivations for such projects to a general audience without resorting to gross oversimplifications. (This isn't a criticism of protein folding specifically, it's all biomedical basic research that has these problems.) My guess is that it will take decades for most of the insights gleaned from these studies to filter down to a clinical setting.

The project that is arguably more relevant to disease is Rosetta@Home, but that's because of the protein design aspect, not the structure prediction. (In fact, Rosetta doesn't even do "protein folding" in the sense that Folding@Home does - it is a predictive tool, not a simulation engine like Folding@Home.)

Re:folding@home (1)

robotkid (681905) | about 2 years ago | (#42468231)

I believe Folding@Home is a separate standalone project, so it's all or nothing. In addition, there are a LOT of protein folding projects. I'd really like to see them work together, or explain why they are different.

Not only are there a lot of projects like this, most of them - whatever their intrinsic scientific merit - have very little direct application to fighting disease. Sure, the people directing the projects like to claim that they're medically relevant, but this is largely because the NIH is the major source of funding. It's also really difficult to explain the motivations for such projects to a general audience without resorting to gross oversimplifications. (This isn't a criticism of protein folding specifically, it's all biomedical basic research that has these problems.) My guess is that it will take decades for most of the insights gleaned from these studies to filter down to a clinical setting.

The project that is arguably more relevant to disease is Rosetta@Home, but that's because of the protein design aspect, not the structure prediction. (In fact, Rosetta doesn't even do "protein folding" in the sense that Folding@Home does - it is a predictive tool, not a simulation engine like Folding@Home.)

Someone please mod this up, as a researcher in the same field as F@H I can attest this is all quite correct.

First, I should preface this by saying I've interacted with several of the F@H folks professionally, and they do excellent work. And the NIH is under no pretense when it funds this work that cures will magically pop out tomorrow; they think of it as a seed investment for a decade or two in the future. In terms of tax dollars spent, it's a good investment, considering many biomedical labs spend more just keeping their mice alive every year than all the F@H lab's salaries combined (especially since the computing time is donated by volunteers).

That said, I've always been disappointed that they do not use their unique standing with the public enthusiast computing community to educate and provide context for what it is they are actually doing and how it is unique among the literally hundreds of other similar protein folding research groups out there. I don't think it's hypocritical to claim basic research can have an impact on real-world problems, but providing the proper context for an individual researcher's findings is sadly often at odds with their PR goals (in this case, convincing people to donate cycles to F@H and not to other similar projects). But so goes all of biomedical research, as the poster shrewdly notes, despite this being taxpayer-funded research performed largely at non-profit, educational institutions.

FYI, federal grants are public record, and you can search brief descriptions of currently funded research to get at least an idea of how much larger the field is than any one research group. Try one of the links below with the search term "protein folding" if you want a sense of how big this field truly is (and note that it does actually include projects run by actual doctors seeing actual patients). Considering that the overall research budget amounts to less than 2 cents of every tax dollar collected, it's not a bad ROI (obviously I'm biased, as a federally funded researcher myself).

http://projectreporter.nih.gov/reporter.cfm [nih.gov]
http://www.nsf.gov/funding/ [nsf.gov]

Re:folding@home (3, Interesting)

hAckz0r (989977) | about 2 years ago | (#42453161)

Different? Ok, "Go Fight Against Malaria" and "Say No To Schistosoma" are both trying to cure the #1 and #2 parasitic diseases worldwide.

Malaria is known to exist in the US, and there are several medications to treat it. The CDC will tell you that Schistosoma does not even exist in the US, but I acquired it at the age of 10, and it wasn't until I purchased my own lab equipment around the age of 50 that I finally got an answer to all my bizarre health problems. Statistically I should be dead, several times over. Over 200,000 people die from it every year, and I am clearly one of the lucky ones.

There is currently only one drug (praziquantel) to "cure" (with 60% efficacy) Schistosoma, and it is quickly losing its effectiveness. There is no other substitute. None. After visiting many pharmacies in my area, it took me three days to locate the drug in the USA and tell the pharmacy where they could get it for me. Yes, it's that bad. Funny thing is, I can buy it off the shelf for my dog, with a prescription, but I couldn't buy it anywhere for human consumption. Clearly we need more options, and SNTS protein folding analysis will help with that goal.

If you have a few extra CPU cycles to spare, please sign up for one of these two worthy causes!

More info on Schistosomiasis
https://en.wikipedia.org/wiki/Schistosomiasis [wikipedia.org]
https://en.wikipedia.org/wiki/Praziquantel [wikipedia.org]

Re:folding@home (2)

mspohr (589790) | about 2 years ago | (#42453549)

Drugs for dogs are just as "pure" as those for humans.
On the Internet, no one knows you're a dog.

Re:folding@home (2)

RicktheBrick (588466) | about 2 years ago | (#42453101)

I have read that the latest supercomputers can do a gigaflop per watt. I do not think any home computer comes close to that figure. If someone were to calculate the total cost of producing these 1,000 TFLOPS, I would think it would be very close to the cost of purchasing and running a supercomputer. 1,000 TFLOPS is around 5% of the fastest supercomputer today, so one would assume the supercomputer could do a lot more than just this program. Before 2020, 1,000 TFLOPS will be one tenth of one percent of the computing power of the fastest supercomputer.

I am familiar with other distributed computing programs, and they usually require more than one volunteer to do any piece of work; they compare the results from each volunteer to make sure they have the correct results. Therefore 1,000 TFLOPS is probably at best equivalent to 500 TFLOPS. When one looks at all the factors, I would think it would be better to ask for donations of money and purchase time on a supercomputer. Even if each volunteer donated only what they would save in electricity, I believe it would be enough.
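
The electricity side of that argument is easy to put rough numbers on. A minimal Python sketch, where the ~1 GFLOPS/W supercomputer figure is from the comment above and the home-fleet efficiency and electricity price are illustrative assumptions:

    # Back-of-envelope: yearly electricity cost of sustaining 1000 TFLOPS.
    KWH_PRICE = 0.10  # USD per kWh, assumed

    def yearly_power_cost(gflops_per_watt):
        watts = 1000.0 * 1e3 / gflops_per_watt     # 1000 TFLOPS = 1e6 GFLOPS
        return watts / 1e3 * 24 * 365 * KWH_PRICE  # kW * hours/year * $/kWh

    print(f"supercomputer at 1 GFLOPS/W:   ${yearly_power_cost(1.0):,.0f}/year")  # ~$876,000
    print(f"home machines at 0.2 GFLOPS/W: ${yearly_power_cost(0.2):,.0f}/year")  # ~$4,380,000
    # The gap is real, but volunteers donate the electricity, so the
    # project itself pays neither bill.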

Re:folding@home (0)

Anonymous Coward | about 2 years ago | (#42453163)

Genuine question:

Do you offer cycles to either?

Re:folding@home (1)

dimeglio (456244) | about 2 years ago | (#42453959)

I was thinking, why not Bitcoins? At least there's a probability of getting money for those spare cycles.

Not just that (-1)

Anonymous Coward | about 2 years ago | (#42452099)

My arsehole is just about to break the 1 petafraps barrier!

At that rate, the entire population of New York City could be driven away in under 13 seconds!

That's nothing.. (1)

Anonymous Coward | about 2 years ago | (#42452143)

The Bitcoin network's combined processing power is 287.78 petaflops, roughly 300 times larger.

Re:That's nothing.. (0)

Anonymous Coward | about 2 years ago | (#42452373)

The Bitcoin network's combined processing power is 287.78 petaflops, roughly 300 times larger.

So? I bet I could twiddle my thumbs faster than you can code.

Re:That's nothing.. (1)

Anonymous Coward | about 2 years ago | (#42452731)

You better start now, because I got a hell of a lead.

Re:That's nothing.. (2)

euyis (1521257) | about 2 years ago | (#42452485)

Isn't Bitcoin's FLOPS number just an estimate, and a grossly inaccurate one, based on the wrong assumptions that there's a single formula for estimating FLOPS from integer performance alone, and that the formula is applicable to all platforms?

Re:That's nothing.. (1)

Anonymous Coward | about 2 years ago | (#42454417)

"Isn't Bitcoin's FLOPS number just an estimate, and a grossly inaccurate one based on the wrong assumptions that there's a single formula for estimating FLOPS with just integer performance, and the formula's applicable to all platforms?"

All true.

But even if 287 petaflops is grossly inaccurate, it's *still* gonna be more than 1 petaflop. I'm almost sure about that...

Yeah no thanks (0)

oic0 (1864384) | about 2 years ago | (#42452181)

Donate my electricity costs and hardware wear and tear to someone else's research, or put them towards production of a non-government-controlled currency standard and make a little spare change in the process. Not a hard choice. Bitcoin wins ;)

Re:Yeah no thanks (2, Informative)

Anonymous Coward | about 2 years ago | (#42452387)

You're not actually producing bitcoins, you're just competing to win them. That is, nothing new is created by your participation in the bitcoin network; the best you can hope for is that you'll receive something which otherwise would've gone to someone else. It helps you personally, but it's zero-sum for the world.

With einstein@home and folding@home, you are helping to solve big science problems. These problems will be solved faster with your help than without it. It is a net gain for science and humanity.

Re:Yeah no thanks (1)

Anonymous Coward | about 2 years ago | (#42452737)

You mean put it towards a scam.

Found /. through distributed.net back in the day (2)

gatzke (2977) | about 2 years ago | (#42452481)

The reason I found Slashdot back in the 90s was the team's performance on the distributed.net tasks. So they do turn those cycles into something useful!

One Petaflop, uh? (1)

Psicopatico (1005433) | about 2 years ago | (#42452687)

I'm sorry, but that's 1024 Teraflops to me.
You're still off by ~20 hours.

Re:One Petaflop, uh? (0)

Anonymous Coward | about 2 years ago | (#42453153)

That would be TiFLOPS, not TFLOPS, duh.

Also... does anyone else find 1.3 TFLOPS/h an odd measure? i.e., 1.3 TFLOP per second per hour, or 4680 TFLOP/h^2... computer acceleration?

Re:One Petaflop, uh? (1)

hazeii (5702) | about 2 years ago | (#42454623)

Good call on the meaning of 'tera' :)

As to the rate, it's increasing at 1.3 Tflop/s per hour, so yes, it's an acceleration. Even if it has slowed a bit... it's now about 0.8 Tflop/s/h (i.e. the acceleration is decreasing).

Re:One Petaflop, uh? (0)

Anonymous Coward | about 2 years ago | (#42453261)

Sounds like 1 pebiFLOPS is what you want. Unless you insist on clinging to an incorrect use of SI prefixes...

Proprietary license (0)

Anonymous Coward | about 2 years ago | (#42452763)

Guarantees my non-participation. (The BOINC middleware is free software, but the vast majority of the actual programs it downloads, e.g. Einstein@home's, are proprietary...)

Re:Proprietary license (1)

hazeii (5702) | about 2 years ago | (#42454539)

E@h is GPL v2 [uwm.edu], actually. Thinking of something else?

Any suggestions for a distributed client? (1)

djh101010 (656795) | about 2 years ago | (#42452915)

So, I just bought a new 4/8-core i7 Mac and told Folding@home to use 50% of my cores. It persisted in using 100% of my cores, despite what I told it to do, until I uninstalled it. Is there a distributed project whose client will honor my request to donate only half of my resources? Bonus points for one which lets me say which hours of which days it can run. If none of them can, I'll let ElectricSheep provide the eye candy; I really don't care. But I'd rather help out a cause that behaves as I specify on my hardware. Anyone?

Re:Any suggestions for a distributed client? (0)

Anonymous Coward | about 2 years ago | (#42452999)

Is there a distributed project whose client will honor my request to donate only half of my resources? Bonus points for one which lets me say which hours of which days it can run.

Resource management used to be the OS's job. What happened?

Re:Any suggestions for a distributed client? (1)

djh101010 (656795) | about 2 years ago | (#42456069)

Wow, THAT's helpful. Oh wait, what's that other word.

Re:Any suggestions for a distributed client? (1)

djh101010 (656795) | about 2 years ago | (#42456111)

See, when an app provides a control panel mechanism that claims to care what percent of a given resource pool it will use, yet it uses 100% of the resources, it leads to questions such as mine. But thanks never so much for being useless. Let me guess: you want to pretend MacOS Wotever Cat is the problem, when in reality it's some misbehaving app that I tried to help with but which won't play nice. Huh.

Re:Any suggestions for a distributed client? (1)

Kyusaku Natsume (1098) | about 2 years ago | (#42457247)

I have an iMac (late 2009) with a Core i7 processor, running Mountain Lion. It runs BOINC Manager 7.0.31 and the projects Collatz, Einstein, SETI and climateprediction.net. Of those, only Collatz uses my ATI GPU for OpenCL computations, but BOINC respects the resource shares of CPU and GPU time. For NVidia you can run Einstein, Collatz and PrimeGrid.

Also, the latest BOINC clients can be configured not to use your CPU at all while you are running processor-intensive apps like Final Cut, iPhoto or iMovie. Useful when you leave the machine alone to render.

Best regards.

Re:Any suggestions for a distributed client? (2, Informative)

Anonymous Coward | about 2 years ago | (#42453191)

I think Folding@home uses its own specialized client. I've never used it, so I can't help you there. Most of the other distributed (grid) projects out there use the BOINC [berkeley.edu] client. BOINC allows you to schedule processor time for when you want it to run, allows the stoppage of distributed processes once CPU usage reaches a certain (user-definable) level, and all sorts of other things. I don't think Folding@home allows the BOINC client to connect, however.

I think what is happening in your case is that the folding client is taking what you said quite literally and treating the hyper-threaded cores as real cores. It filled up your 4 physical cores, causing your system to show 100% CPU usage while not utilizing your hyper-threaded cores. I think your OS and folding client are performing exactly as intended. If you truly want only two cores (plus their hyper-threaded siblings) to fire, you'll either have to manually set the affinity on the folding tasks or simply tell the folding client to use only two cores.
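
For what it's worth, on Linux you can pin a process to chosen cores yourself; a minimal sketch using Python's os.sched_setaffinity. It's Linux-only (macOS, which the OP is on, has no public affinity API), and the PID here is hypothetical:

    import os

    pid = 12345  # hypothetical PID of the folding worker process (e.g. from pgrep)

    # Restrict the process to two logical CPUs. On a 4-core/8-thread CPU the
    # mapping of logical CPU numbers to physical cores varies, so check
    # lscpu or /proc/cpuinfo before picking the IDs.
    os.sched_setaffinity(pid, {0, 1})
    print(os.sched_getaffinity(pid))  # -> {0, 1}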

You can try raising the issue in the help forums on Folding@home and see if they have a better solution.

Re:Any suggestions for a distributed client? (1)

tlhIngan (30335) | about 2 years ago | (#42453485)

So, I just bought a new 4/8-core i7 Mac and told Folding@home to use 50% of my cores. It persisted in using 100% of my cores, despite what I told it to do, until I uninstalled it. Is there a distributed project whose client will honor my request to donate only half of my resources? Bonus points for one which lets me say which hours of which days it can run. If none of them can, I'll let ElectricSheep provide the eye candy; I really don't care. But I'd rather help out a cause that behaves as I specify on my hardware. Anyone?

Last I checked, the BOINC-based ones do. You can join multiple projects and even allocate 1 core to one, 2 to another, etc. And yes, I believe it supports hourly scheduling, and it even has a "back off" protocol: if it sees the CPU busier than some amount (you specify), it won't even bother starting a task; it will sleep for an hour and try again later.

http://boinc.berkeley.edu/wiki/Preferences [berkeley.edu]

Many projects use BOINC... including Einstein@home.
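
As a concrete example, the BOINC client reads an optional global_prefs_override.xml file from its data directory that overrides the web-based preferences. A sketch of the kind of settings described above; the tag names are recalled from BOINC's global preferences and should be double-checked against the wiki page linked above before use:

    <!-- global_prefs_override.xml, in the BOINC data directory -->
    <global_preferences>
       <max_ncpus_pct>50.0</max_ncpus_pct>         <!-- use at most half the cores -->
       <suspend_cpu_usage>25.0</suspend_cpu_usage> <!-- back off if other load exceeds 25% -->
       <start_hour>22.0</start_hour>               <!-- only compute between 22:00... -->
       <end_hour>6.0</end_hour>                    <!-- ...and 06:00 -->
    </global_preferences>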

Re:Any suggestions for a distributed client? (1)

Anonymous Coward | about 2 years ago | (#42453795)

Whilst perhaps not as efficient, I find that running resource-hungry applications like these inside a virtual machine, limited to the number of cores you want them to have, is a good way to throttle the amount of resources they can consume.

Re:Any suggestions for a distributed client? (0)

Anonymous Coward | about 2 years ago | (#42457563)

See this thread: http://boinc.berkeley.edu/dev/forum_thread.php?id=5459

added to my Dotsch/UX farm, want something better (0)

Anonymous Coward | about 2 years ago | (#42453081)

Just added the project to my 16-computer BOINC farm running Dotsch/UX. It used to be dedicated solely to seti@home; now it's 50/50.

It's not all that great, though. I wish there were better diskless folding-farm software out there. Anyone know of any?

Contributing spare cycles at the lowest CPU clock (2, Interesting)

Anonymous Coward | about 2 years ago | (#42453565)

I would like to contribute my spare CPU cycles, but without causing my CPU to speed up (in this case, via Intel's SpeedStep) from the lowest setting of 800 MHz. Otherwise, my laptop gets hot and loud. How can I do that?

Re:Contributing spare cycles at the lowest CPU clo (1)

Anonymous Coward | about 2 years ago | (#42454219)

# As root: tell the "ondemand" cpufreq governor to ignore load from niced
# processes (BOINC runs its tasks at low priority), so they no longer
# cause the clock speed to ramp up:
for i in /sys/devices/system/cpu/cpu[0-9]*; do echo 1 > $i/cpufreq/ondemand/ignore_nice_load; done
# (On some kernels the tunable is global: /sys/devices/system/cpu/cpufreq/ondemand/ignore_nice_load)

Not a barrier (2)

Lost Race (681080) | about 2 years ago | (#42453931)

1 PFLOPS is an arbitrary threshold or milestone. It's not a barrier because nothing special happens at that point. The speed of light is a barrier. Even the speed of sound is a barrier. 10^n somethings per whatever is rarely if ever a barrier for any positive integer n.

Re:Not a barrier (0)

Anonymous Coward | about 2 years ago | (#42454465)

Factors of 1000 are interesting, because we use base 10 and people have an understanding of the jump by 10^3. Go and get laid, you won't be so miserable.

Glad I left my computer burning last night (0)

Anonymous Coward | about 2 years ago | (#42454099)

This morning I woke up to discover I'd left my computer on overnight. Usually this bothers me, since I don't like wasting energy and computing cycles, but last night I was running Einstein@home. Besides the other BOINC projects, what volunteer computing projects are there for us nerds? I already run Tor... what else is there?

Power-efficiency (1)

enriquevagu (1026480) | about 2 years ago | (#42459529)

If it were at position 24 in the Top500, it would likely be 3x as power-efficient as all these individual computers. These sorts of initiatives are impressively inefficient (but very effective); this is why the "cloud" model won the battle over the "grid" model. It only works because the computing power is donated, not paid for. On the other hand, the equivalent supercomputer would likely cost 3-8x as much as the aggregate (with respect to the sum of the costs of all these computers), because it is custom-made.

1000.2 TFLOPS reached! (1)

flowerp (512865) | about 2 years ago | (#42459703)

I added two nVidia GTX 260 cards and one nVidia GT 240 card to Einstein@Home, and voila, this morning's stats show:

Page last updated 3 Jan 2013 8:50:02 UTC
Floating point speed (from recent average credit of all users) 1000.2 TFLOPS

For a BOINC novice it can be quite daunting to figure out how to make it use all GPUs and not accept any CPU-only work units. Editing XML files in some data directory isn't exactly user-friendly.

Re:1000.2 TFLOPS reached! (2)

hazeii (5702) | about 2 years ago | (#42459853)

2x GTX 260 is in theory about 1.5 TFLOPS, so that's welcome fuel on the fire :)

You can also configure common settings using the BOINC preferences [uwm.edu] and Einstein@home preferences [uwm.edu] pages. It seems common to use "location" to set up different preferences for different hosts; e.g. I use the "home" setting for machines which are only good for GPU work, "work" for CPU-only systems, and the "default" setting for both CPU/GPU (plus the "school" setting for experimentation).

Also, AIUI the latest client will use all your GPUs as long as they have identical capabilities, so it should use both your GTX 260s. You do have to twiddle the XML for stuff like mixed GPU usage, but I've never found the drivers stable enough for that to work well (at least on my ragtag fleet of PCs). I'd hazard a guess it would get tricky if you throw "GPU utilization" into the mix (i.e. running multiple work units on the same GPU, which can speed things up; see the benchmarks thread [uwm.edu]). Anyway, it sounds like you're doing more advanced stuff after one night than anything I've attempted to date.
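
For reference, the "GPU utilization" tweak is the XML twiddling mentioned above: BOINC 7.x clients read an optional per-project app_config.xml. A sketch of the idea; the app name here is hypothetical, and the exact tags should be checked against the BOINC documentation:

    <!-- app_config.xml in the project's folder under the BOINC data directory -->
    <app_config>
       <app>
          <name>einsteinbinary_BRP4</name>  <!-- hypothetical; use the app name your client reports -->
          <gpu_versions>
             <gpu_usage>0.5</gpu_usage>     <!-- each task claims half a GPU: two run at once -->
             <cpu_usage>0.2</cpu_usage>     <!-- CPU share reserved per GPU task -->
          </gpu_versions>
       </app>
    </app_config>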
