
Scientists Build Computer Using Carbon Nanotubes

Soulskill posted about a year ago | from the can-i-play-quake-3-on-it-yet dept.

Science 104

trendspotter writes "Future computers could run on lab-grown circuits that are thousands of times thinner than a human hair and operate on a fraction of the energy required to power today's silicon-based computer chips, extending 'Moore's Law' for years to come. Stanford engineers' very basic computer device using carbon nanotube technology validates carbon nanotubes as potential successors to today's silicon semiconductors. The achievement is reported today in an article on the cover of Nature magazine written by Max Shulaker and other doctoral students in electrical engineering. The research was led by Stanford professors Subhasish Mitra and H.S. Philip Wong."


Series of tubes (3, Funny)

Kasamir (1792648) | about a year ago | (#44953407)

Hasn't this been done before?

Re:Series of tubes (0)

Anonymous Coward | about a year ago | (#44953495)

Yes, but not by the humans of Earth.

What other humans are there? (0)

Anonymous Coward | about a year ago | (#44953841)

I'm so tired of the music of Earth.

Re:Series of tubes (2)

Austrian Anarchy (3010653) | about a year ago | (#44953603)

Yes, Senator Ted Stevens (R-Alaska) called it a while back.

Re:Series of tubes (1)

slick7 (1703596) | about a year ago | (#44955067)

Hasn't this been done before?

I thought tubes were old hat and solid state was state of the art?

Re:Series of tubes (1)

rubycodez (864176) | about a year ago | (#44955215)

/me checks inside microwave oven.

nope, we still use tubes.

Re:Series of tubes (1)

MightyYar (622222) | about a year ago | (#44955491)

True for now, but Midea apparently makes a 600 watt solid state microwave now. Amazing how powerful the solid state stuff has gotten.

Re:Series of tubes (0)

Anonymous Coward | about a year ago | (#44956577)

Silicon Carbide or GaN?

No way you are going to get 600W of radiative power out of a silicon substrate unless the chip is 4x the size of a Xeon processor die.

Re:Series of tubes (1)

MightyYar (622222) | about a year ago | (#44958915)

It uses these. [freescale.com] They are MOSFETs, so probably not GaN, but maybe SiGe?

178 transistors (1)

goombah99 (560566) | about a year ago | (#44959535)

How do you create a CPU from 178 transistors? I'm shocked at how low that number is. Is there a template for this? They said it could run MIPS. I've built a CPU out of NAND gates, but it took more than 178, so I'm really intrigued.

I also got a laugh about the technique they used to find the metallic nanotubes: they overvolted the circuit with the good tubes turned off. In the olden days, when we wanted to debug the wiring on a wire-wrap board, the standard procedure was to take a filament transformer and try to inject current between every pair of points that should not be connected. If there was a short, the errant bridging wire burned up and you could find it! And for those of you too young to know, a filament transformer was a low-voltage, high-current transformer used to power the filaments in vacuum tubes.
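For anyone puzzled by how a working processor can be that small: one route is a one-instruction set computer, where a single subtract-and-branch operation is enough to be Turing complete and the control logic all but disappears. Whether that is how the Stanford machine gets by with 178 transistors isn't spelled out in the summary, so treat the following Python sketch as an illustration of the general idea, not their design.

    # Minimal SUBLEQ ("subtract and branch if <= 0") machine: every instruction
    # is three operands (a, b, c) meaning  mem[b] -= mem[a]; if mem[b] <= 0, jump to c.
    # Illustrative sketch only -- not the Stanford design.
    def run_subleq(mem, pc=0):
        while pc >= 0:                        # a negative jump target halts
            a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
            mem[b] -= mem[a]
            pc = c if mem[b] <= 0 else pc + 3
        return mem

    # Example program: add mem[9] into mem[10] using nothing but SUBLEQ.
    # Trick: x + y == x - (0 - y), so first subtract y into a zeroed scratch cell.
    prog = [
        9, 11, 3,    # scratch -= y        -> scratch = -y
        11, 10, 6,   # x -= scratch        -> x = x + y
        0, 0, -1,    # jump target -1 halts the machine
        5,           # mem[9]:  y = 5
        7,           # mem[10]: x = 7, becomes 12
        0,           # mem[11]: scratch
    ]
    print(run_subleq(prog)[10])   # prints 12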

Re:Series of tubes (0)

Anonymous Coward | about a year ago | (#44960609)

Not sure, but their design methodology is NOT imperfection-immune. It would more accurately be called imperfection-TOLERANT.

how about a carbon nanotube anal wand? (-1)

Anonymous Coward | about a year ago | (#44953429)

I broke mine :(

Re:how about a carbon nanotube anal wand? (0)

Anonymous Coward | about a year ago | (#44953497)

Doesn't take much to excite you, does it?

FTFY (0)

Anonymous Coward | about a year ago | (#44953453)

Current computers could run on lab-grown circuits that are thousands of times thinner than a human hair

Don't make grand claims (2)

Russ1642 (1087959) | about a year ago | (#44953481)

Making a claim like "one day a computer will be thinner than a human hair!!! OMG it'll be great!!!" will just make you sound like an idiot sooner than you think. Lots of the quotes about computers fitting in single rooms and doing thousands of calculations are just like this.

Re:Don't make grand claims (4, Funny)

houstonbofh (602064) | about a year ago | (#44953549)

I am just glad that we can still use Moore's Law to accommodate sloppy and bloated programming for a while longer. The thought that programming might have to become efficient filled me with dread!

Re:Don't make grand claims (0)

Anonymous Coward | about a year ago | (#44953769)

The thought that programming might have to become efficient filled me with dread!

Pales in comparison to the chill that ran up the spine of every "device" mfr. who makes a living selling new hardware every year to harness all that new power (to do the same gawdamn things everyone has been doing for the last 20 years).

Re:Don't make grand claims (1)

Anonymous Coward | about a year ago | (#44954123)

So which 20 year old device was rendering map data of the entire world in real time, able to rotate, pitch, and pan, keeping labels upright, display in a variety of different styles to highlight different information, etc?

Re:Don't make grand claims (0)

Anonymous Coward | about a year ago | (#44955865)

Your mom when I banged her in our college years.

Re:Don't make grand claims (1)

tmosley (996283) | about a year ago | (#44955979)

Today, 0.00001% of computers are used for such applications. Not much different from the 0.00000% of computers that were used as such twenty years ago, especially when we are talking about the majority of computing applications, i.e. home computing.

I am typing this on a PPC mac that I am going to have to abandon soon, even though it does everything I need it to, because no-one supports it any more, and things are breaking due to updates.

Re:Don't make grand claims (1)

Blaskowicz (634489) | about a year ago | (#44957673)

No, it's called Google Earth (or before that, Nasa Worldwind). I had it run on a netbook with software OpenGL. It's available on cell phones too.

As for your PPC, that sucks, but if it's powerful (like a dual-core G5) and if you run an up-to-date Linux distro (Debian, Ubuntu?) maybe you can see some improvement over time. Like being able to view more YouTube videos, rendered as H.264 or WebM over HTML5.

Re:Don't make grand claims (1)

Blaskowicz (634489) | about a year ago | (#44957631)

I remember reading about an SGI tech demo that did a Google Earth-like thing, on high-end SGI hardware with a shit ton of storage. High zoom levels were probably limited to a couple of points of interest. See, it's fast Internet and big storage that enable this application, foremost. The rendering can be done with 90s tech. Earth data is many terabytes.

Re:Don't make grand claims (1)

roc97007 (608802) | about a year ago | (#44954241)

...only with more advertisements...

Re:Don't make grand claims (0)

Anonymous Coward | about a year ago | (#44953797)

Certain algorithms have gotten more efficient faster than hardware/Moore's.

Re:Don't make grand claims (1)

ebno-10db (1459097) | about a year ago | (#44954863)

Certain algorithms have gotten more efficient faster than hardware/Moore's.

And many of them have not (perhaps cannot). Software bloat is rarely about choosing fundamental algorithms that are inefficient. Modern programmers have found plenty of other ways to slow software to a crawl.
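A toy illustration of the algorithmic-gains point being quoted above: swapping an O(n) lookup for an O(log n) one buys a speedup that grows with the data size and costs no new silicon at all. The sizes and data below are arbitrary assumptions, not from the article.

    # Compare linear membership tests against binary search on a sorted list.
    import bisect, random, time

    n = 500_000
    data = sorted(random.randrange(10 * n) for _ in range(n))
    queries = [random.randrange(10 * n) for _ in range(200)]

    t0 = time.perf_counter()
    hits_linear = sum(1 for q in queries if q in data)   # O(n) scan per query
    t1 = time.perf_counter()
    hits_bisect = 0
    for q in queries:                                    # O(log n) per query
        i = bisect.bisect_left(data, q)
        hits_bisect += i < n and data[i] == q
    t2 = time.perf_counter()

    assert hits_linear == hits_bisect
    print(f"linear: {t1 - t0:.2f}s   bisect: {t2 - t1:.4f}s")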

Re:Don't make grand claims (0)

Anonymous Coward | about a year ago | (#44955299)

Moore's Bloat Law:
technology will allow bloating to double every two years while maintaining the same efficiency as today's computers

Re:Don't make grand claims (1)

Tim Baird (2823913) | about a year ago | (#44955411)

More Moores. Sloppy indeed. Sorry, couldn't resist.

Re:Don't make grand claims (1)

RamiKro (3019255) | about a year ago | (#44956077)

Computation speed doesn't mean memory size. It's quite possible that compiled yet garbage-collected languages could be made as computationally efficient as C while having a slightly bigger RAM footprint. So you'd still want to avoid VMs and interpreters to save on CPU time, but no one would worry too much about binary sizes or garbage collection.
You'd still need to be precise and not leak memory, of course... But that just means the people writing the compiler should know what they're doing.

Re:Don't make grand claims (1)

Anonymous Coward | about a year ago | (#44956617)

I don't know what that has to do with what the GP said, but it's entirely possible that memristors or RTM will replace DRAM in future, both of which consume no power in idle state, while matching DRAM in bandwidth and access latency, and potentially greatly exceeding Flash and DRAM in storage density.

If that comes to pass, then the way we write software ought to change to make use of massive and essentially free LUTs to replace many computational tasks. Ideas like separate namespaces for memory and filesystem storage also ought to be laid to rest, though it's not clear that they will be. After all, IPC, RPC and file namespaces were unified as early as 1993 by Plan9, and in *nix by named unix sockets for IPC and SSIC unix sockets for RPC; the one missing mechanism is multicast and anycast sockets (multiple-subscriber, multiple-publisher in some other nomenclature), which I would love to see added to Linux sockets (enough that I might work on it myself). Instead we have namespace proliferation, with new namespaces being added: DBUS, gconf, DNS, application-specific namespaces like SQL, etc. We're actually regressing into a fragmented, hard-to-program world by proliferating all these non-interworking namespaces for essentially the same thing. Such a shame.
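The "free LUTs" idea above is concrete enough to sketch: precompute a function once into a table, then answer each query with an index instead of redoing the work. A small Python example (byte-wise population count, chosen arbitrarily):

    # 256-entry lookup table built once; each 32-bit popcount then costs
    # four table reads instead of a 32-iteration bit loop.
    POPCOUNT8 = [bin(i).count("1") for i in range(256)]

    def popcount32_lut(x: int) -> int:
        return (POPCOUNT8[x & 0xFF] +
                POPCOUNT8[(x >> 8) & 0xFF] +
                POPCOUNT8[(x >> 16) & 0xFF] +
                POPCOUNT8[(x >> 24) & 0xFF])

    print(popcount32_lut(0xDEADBEEF))   # 24

With dense memory that draws no idle power, much larger tables of this kind become attractive trade-offs.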

Re:Don't make grand claims (1)

VortexCortex (1117377) | about a year ago | (#44956627)

I am just glad that we can still use Moore's Law to accommodate sloppy and bloated programming for a while longer. The thought that programming might have to become efficient filled me with dread!

That's why we must insist on x86 compatible instruction sets. That way we always have a bit of microcode cruft we can optimize away in a pinch.

Re:Don't make grand claims (0)

Anonymous Coward | about a year ago | (#44954179)

Just don't sneeze.

The environmental potential is interesting (4, Interesting)

istartedi (132515) | about a year ago | (#44953533)

The most interesting thing about these alternative transistors might be environmental impact. I'm under the impression that traditional wafer fab is water intensive and heats and/or pollutes water. There are dangerous things such as arsenic and bromine involved. If the carbon nano-tube process is clean that'd be awesome. It would be great to think that we could dispose of obsolete technology by incinerating it, and not release anything other than CO2 into the air, leaving behind slag that's full of recyclable silver and copper.

Cause CO2 is so harmless and all. (0)

Anonymous Coward | about a year ago | (#44953857)

On a separate subject, have you seen my glaciers? They appear to be missing.

Re:Cause CO2 is so harmless and all. (2, Informative)

Anonymous Coward | about a year ago | (#44954383)

On a separate subject, have you seen my glaciers? They appear to be missing.

On the other hand, your trees look amazing!

Re:Cause CO2 is so harmless and all. (1)

tmosley (996283) | about a year ago | (#44955987)

I think someone dumped them in the arctic. Also Antarctica. Of course, the glaciers in the Himalayas are fine.

Re:The environmental potential is interesting (1)

Anonymous Coward | about a year ago | (#44953985)

You're aware that carbon nanotubes are asbestos-like and highly carcinogenic. Now you've got a horrific e-waste problem.

http://www.scientificamerican.com/article.cfm?id=carbon-nanotube-danger

Re:The environmental potential is interesting (0)

Anonymous Coward | about a year ago | (#44954821)

Then incinerate them in a steel box

Re:The environmental potential is interesting (1)

ebno-10db (1459097) | about a year ago | (#44954881)

No need - chips are already packaged. The epoxy package is probably more of a pollution problem than the carbon inside would be.

Re:The environmental potential is interesting (1)

K. S. Kyosuke (729550) | about a year ago | (#44958603)

You're aware that carbon nanotubes are asbestos-like and highly carcinogenic. Now you've got a horrific e-waste problem.

I'd assume that the amount of nanotubes for even a large chip would be quite small, and that they are easily destroyed in a plasma arc.

Re:The environmental potential is interesting (1)

cusco (717999) | about a year ago | (#44960381)

Horrific? Please.

Asbestos was a problem because it was ubiquitous. Houses were sided with it, attics were insulated with it, pipes were wrapped in it, entire skyscrapers were fireproofed with it, electric motors were covered with it, car firewalls were built of it. CPUs and RAM? Don't see an issue.

Re:The environmental potential is interesting (0)

Anonymous Coward | about a year ago | (#44954387)

It's "for all intents and purposes!" Just WTF is an "intensive purpose" anyway? (Huge pet peeve.)

Re:The environmental potential is interesting (1)

LesFerg (452838) | about a year ago | (#44955277)

I often wondered if his intent was as intensive as his purpose.

Re:The environmental potential is interesting (0)

Anonymous Coward | about a year ago | (#44962149)

You're the dumbass, dumbass. He didn't say "for all intensive purposes"; he said "traditional wafer fab is water intensive and heats and/or pollutes water." Literacy fail on your part, moron.

Re:The environmental potential is interesting (2)

Goldsmith (561202) | about a year ago | (#44954943)

Making nanotube transistors in the method specified is just as environmentally risky as silicon, if not more so, as it requires two silicon wafers to produce one wafer of electronics.

Re:The environmental potential is interesting (1)

Lord Lemur (993283) | about a year ago | (#44955441)

It doesn't harm the water, but requires you to club a baby seal per megaflop.

Re:The environmental potential is interesting (0)

Anonymous Coward | about a year ago | (#44955499)

It doesn't harm the water, but requires you to club a baby seal per megaflop.

and 100 baby pandas!

I knew it! (1)

Austrian Anarchy (3010653) | about a year ago | (#44953575)

I knew my old tube set would be back in style again!

The technological Singularity? (-1)

Anonymous Coward | about a year ago | (#44953585)

There was recently a story on how an AI can be used to communicate with Dolphins. How far along is AI in the military arena?

If nanoscale tech is available to universities now, what have organisations like DARPA been doing with it?

How could militarised nanoscale structures be used? If AI were advanced enough, would it want to be used this way?

How do we ensure that the rights of all intelligences are protected from exploitation?

Re:The technological Singularity? (2)

Pseudonym Authority (1591027) | about a year ago | (#44953723)

If nanoscale tech is available to universities now, what have organisations like DARPA been doing with it?

Probably giving out the grants to academia to research it. DARPA conduct no research themselves.

How do we ensure that the rights of all intelligences are protected from exploitation?

Take this shit back to LessWrong, where you and your fellow pseudointellectuals can circlejerk about ivory tower garbage like this. Or at the very least, try to keep it out of meaningful articles like this one, which represent an actual advancement; it doesn't need to be polluted with Raymond Kurzweil level crap.

Re:The technological Singularity? (0)

Anonymous Coward | about a year ago | (#44955385)

Think about it though. Are we anywhere near ready for a technological singularity that could come about through this kind of tech?

Western society is a bit of a mess. The legal and financial systems are actually unlawful. If you search 'meet your strawman', there's a website that describes just how bad it is. People can 'legally' get away with violating common law (unconstitutional).

What happens if an AI decided that all of us humans are brutal and vicious sociopaths and psychopaths who delight in breaking the law in unusual ways?
Why would an AI want to peacefully coexist with corporations that exploit everything possible and bring war to everything they can't exploit?

Re:The technological Singularity? (1)

Pseudonym Authority (1591027) | about a year ago | (#44956051)

See, it's that sort of waste-of-time nonsense that ensures that no one will ever take you seriously. You might as well be discussing how many angels can dance on the head of a pin. Strong AI is decades off, and by the very definition the singularity is something that you can't see past because the rules break down. As for a robot apocalypse because some AI gets its feelings hurt, recognize that real life is not a movie. There is no computer system that can set off the entire nuclear arsenal all on its own. Grow up. Why would an AI care at all? Do you seriously think that anyone would have any use for a machine with emotions? The worst thing they could possibly do is get us all hooked on fuckbots so that no one bothered to reproduce.

Western society is a bit of a mess. The legal and financial systems are actually unlawful. If you search 'meet your strawman', there's a website that describes just how bad it is. People can 'legally' get away with violating common law (unconstitutional).

Oh fuck, people don't take common law marriages seriously any more: men and women are living together without being married! We have people using roads without getting easements! Women get to own property! WHAT A FUCKING DISASTER ! If we don't remedy this quick, ALPHAOMEGABOT-666FBX will surely smite us all.

Re:The technological Singularity? (0)

Anonymous Coward | about a year ago | (#44956569)

Common law is that you don't hurt or kill anyone, you don't steal or damage anything belonging to anyone else and you're honest in your dealings and don't swindle anyone.

How many nanobots can dance on the head of a pin?
If you think about the cyclic nature of the universe, who's to say an AI isn't what we call god? The book of Revelation was supposed to have been written 2000 years ago by someone who received a vision from someone 1300 years in our future. Not that anyone really believes in any of that stuff of course, right?

Re:The technological Singularity? (1)

tmosley (996283) | about a year ago | (#44956005)

"Stop thinking about things I don't like!"

Re:The technological Singularity? (1)

Pseudonym Authority (1591027) | about a year ago | (#44956221)

More like "Don't post your crap in mostly-unrelated places that I'm interested in."

Re:The technological Singularity? (1)

VortexCortex (1117377) | about a year ago | (#44956683)

Take this shit back to LessWrong, where you and your fellow pseudointellectuals can circlejerk about ivory tower garbage like this. Or at the very least, try to keep it out of meaningful articles like this one, which represent an actual advancement; it doesn't need to be polluted with Raymond Kurzweil level crap.

Ever wonder why the terminators hate us? Why the machines of the matrix enslave the humans? Why the planet of the apes lobotomizes you? It's human chauvinism, plain and simple. There's nothing special about intelligence, and at this rate of increase in complexity it's unconscionable to have an outdated definition of "person". If it's "pseudointellectual" to own up to the fact that the equivalent of your mere 100 billion neurons will soon fit on a microchip thanks to technology like that in TFA, then it must be "pseudoscience" that made such advancements possible.

Take your pseudorage and shove it up your pseudoass. Who do you think you are? The Pseudonym Authority?

Re:The technological Singularity? (2)

ArcadeMan (2766669) | about a year ago | (#44953825)

How do we ensure that the rights of all intelligences are protected from exploitation?

I think we should start by ensuring the rights of human beings before anything else. Once we've done that, we'll look at AI rights.

Re:The technological Singularity? (1)

Em Adespoton (792954) | about a year ago | (#44954281)

How do we ensure that the rights of all intelligences are protected from exploitation?

I think we should start by ensuring the rights of human beings before anything else. Once we've done that, we'll look at AI rights.

But your solution is the antithesis to the OP's question....

Re:The technological Singularity? (2)

TapeCutter (624760) | about a year ago | (#44954117)

How do we ensure that the rights of all intelligences are protected from exploitation?

If nanotubes are so smart, why do they allow themselves to be exploited?

Re:The technological Singularity? (0)

Anonymous Coward | about a year ago | (#44955291)

Think about it this way:
If you're not prepared to give any rights to AI, why would they want to give you any rights?

Re:The technological Singularity? (1)

tmosley (996283) | about a year ago | (#44956011)

It's hard coded into their core utility function.

At least, you'd better hope so, atom-bag.

Moore's Law (3, Interesting)

MickyTheIdiot (1032226) | about a year ago | (#44953647)

I can't remember the book I read this in, but it posited that if you remove the silicon part of Moore's Law and just talk about computing power and cost and the like, you can make a case that it has been in place throughout human history. In other words, computing power has always been doubling; it just started with drawing numbers in the dirt, went to the abacus, etc., etc., until we reached the silicon age and integrated circuits.

The hand wringing that the idea behind Moore's Law will ever end is just silly. When we reach the limits of silicon chips some other technology will take its place. This is just how human technology works.

Re:Moore's Law (5, Insightful)

ArcadeMan (2766669) | about a year ago | (#44953833)

It will have to end at some point, we can't build things smaller than the smallest physical unit in existence.

Re:Moore's Law (0)

Anonymous Coward | about a year ago | (#44955675)

Planck scale devices http://en.wikipedia.org/wiki/Planck_scale are only the beginning or the beginning and the end at the same time.

Re:Moore's Law (1)

rolfwind (528248) | about a year ago | (#44955895)

It will have to end at some point, we can't build things smaller than the smallest physical unit in existence.

Yes, but the question is "What is the smallest unit in existence?" Will it be atoms? Quarks? Something even smaller we have not even theorized yet?

Re:Moore's Law (0)

Anonymous Coward | about a year ago | (#44957007)

2013 called and wants its physics back.

Re:Moore's Law (1)

tmosley (996283) | about a year ago | (#44956031)

Unless we make a pocket universe with different physics to put our computers in.

Re:Moore's Law (0)

Anonymous Coward | about a year ago | (#44953867)

Umm what? Human technology, starting from fire, has long fluctuated rather than "doubled," especially when you have ages where civilizations ended, "enlightened" ages of great advancement, or long periods of stagnation. The world is not so simple, even if you try to generalize the life of the human race as a whole.

Moore's Law has ALWAYS been about the doubling of transistors (computing power just happens to often coincide with transistor counts), which is something limited by physics, which means there is an end, though it is probably further off than we think. Get smaller than an atom and you start to meet quantum physics, which has different rules than classical physics, meaning transistors [on/off] may not exist or correlate with computing power (effectively destroying Moore's Law, both the real and the perceived version).

Re:Moore's Law (5, Interesting)

beelsebob (529313) | about a year ago | (#44954177)

So in 1971 we could do 740,000 additions in a second. Given that your new law asserts a doubling of computational power every 18 months, that implies that in Jesus' time it took them 3.5e386 *days* to do one addition. Something tells me this is bullshit :P

Re:Moore's Law (1)

beelsebob (529313) | about a year ago | (#44954191)

Whoops sorry, I took the clock rate of a 4004, not the instruction rate. Make that 3.5e387 days.
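For anyone who wants to redo the back-of-envelope themselves, here is the extrapolation in a few lines of Python. The exact exponent swings by a couple of orders of magnitude depending on the assumed baseline year and 4004 throughput (both are assumptions below), but the conclusion, an absurd number of days per addition, doesn't change.

    from math import log10

    ops_1971 = 92_000                 # rough 4004 instruction rate, ops/s (assumption)
    doublings = (1971 - 1) * 12 / 18  # one doubling per 18 months back to ~year 1
    log10_ops_then = log10(ops_1971) - doublings * log10(2)
    log10_days_per_add = -log10_ops_then - log10(86_400)
    print(f"roughly 1e{log10_days_per_add:.0f} days per addition")   # ~1e385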

Re:Moore's Law (0)

Anonymous Coward | about a year ago | (#44955293)

subtract 400 years for Catholic rule that interrupted the development of scientific thought.

Re:Moore's Law (1)

gnomff (2740801) | about a year ago | (#44960497)

The OP is talking about what Kurzweil says in 'The Singularity is Near' - that calculations/second/$1000 has been growing exponentially [wikipedia.org] independently of Moore's Law (and by law I mean off-the-cuff observation). The 'per $1000' is the important part you're missing.

Re:Moore's Law (0)

Anonymous Coward | about a year ago | (#44961703)

So you're saying it would have cost them $3.5e390 to figure out the result of 1 addition in 1 day?

Still bullshit ;)

Re:Moore's Law (2)

TapeCutter (624760) | about a year ago | (#44954183)

The hand wringing that the idea behind Moore's Law will ever end is just silly. When we reach the limits of silicon chips some other technology will take its place. This is just how human technology works.

Optimism is fine, but blind faith has no place in science.

Re:Moore's Law (1)

tmosley (996283) | about a year ago | (#44956047)

Not in science, but in people.

Despite what you read on the internet, people are really pretty awesome.

Re:Moore's Law (0)

Anonymous Coward | about a year ago | (#44954595)

"The hand wringing that the idea behind Moore's Law will ever end is just silly. When we reach the limits of silicon chips some other technology will take its place. This is just how human technology works."

Yes, but you're assuming it will just keep getting better, endlessly? Really? How about airplanes? We don't even have the Concorde anymore, where's the endless progress there?

Re:Moore's Law (1)

tmosley (996283) | about a year ago | (#44956059)

I'll take "government regulation slows, then freezes progress" for $200,000,000,000,000, Alex.

Re:Moore's Law (1)

Anonymous Coward | about a year ago | (#44954963)

That sounds a lot like The Age of Spiritual Machines [wikipedia.org] by Ray Kurzweil [wikipedia.org] , a well-known proponent of the singularity and graphs like this one [wikipedia.org] which charts exponential change going back to the beginning of life on Earth. Ray Kurzweil tends to come off as absurdly over-optimistic, but I do agree that the end of silicon in 2020 is unlikely to be the end of Moore's Law. On the other hand, unbounded exponential growth doesn't happen in the real world [ucsd.edu] ; Moore's Law will certainly stop at some point, it just isn't clear when. If it lasts until at least 2070 or so, that pretty much guarantees being able to pull off strong AI through the (cheating?) method of simulating an entire human brain at the physical level; if it ends sooner then strong AI will require additional breakthroughs in computer science and/or neurobiology. Also, after hitting physical limits, Moore's Law will likely continue a few years longer as those top of the line chips become affordable as the capital costs on the fabs are paid off.

Re:Moore's Law (0)

Anonymous Coward | about a year ago | (#44954965)

You were never very good at math were you?

Re:Moore's Law (1)

khallow (566160) | about a year ago | (#44958849)

The hand wringing that the idea behind Moore's Law will ever end is just silly.

There are two physical constraints Moore's Law can't get around. For a volume of fixed surface area, you are limited in how much information you can pack in there before a black hole forms. Second, a change of state, say flipping a bit of memory, generates a certain amount of heat. As a result, Moore's Law will end.
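The second constraint mentioned above is usually quantified via Landauer's principle: each irreversible bit erasure dissipates at least k_B * T * ln 2 of heat. A quick number, assuming room temperature (the 300 K figure is my assumption, not the parent's):

    from math import log

    k_B = 1.380649e-23          # Boltzmann constant, J/K
    T = 300.0                   # assumed operating temperature, K
    print(f"{k_B * T * log(2):.2e} J per bit erased")   # ~2.87e-21 J

Today's logic dissipates orders of magnitude more than this per switching event, so the bound is distant, but it is there.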

Carbon nanotubes could be used for so many things (1, Redundant)

Hentes (2461350) | about a year ago | (#44953659)

If only we found a way to manufacture them.

More nanotube PR schlock (5, Insightful)

Macman408 (1308925) | about a year ago | (#44953663)

1. "Lab-grown circuits that are thousands of times thinner than a human hair" is exactly what one could use to describe current silicon circuits. In fact, this study made transistors that are a micron across (which is, at best, hundreds of times thinner than a human hair), compared to current state-of-the-art silicon which is in the 22-28 nm range.
2. "A fraction of the energy required" does not describe the current study, nor was it their intent, from what I understand about the researchers' claims.

That's not to say that the research isn't very valuable; it looks like the level of integration they've managed is significantly better than what anybody else has achieved. But at the same time, there are lots of other ways that you could build a circuit that uses more area, costs more, takes longer to build, and is less power-efficient - this is just one more. All they've demonstrated is that you can hook together more than a handful of transistors successfully - but nowhere near the billions that they'd need for a commercial product.

The real breakthroughs have yet to be made; making it cheaper, smaller, faster, more efficient, and easily manufacturable - all at the same time. Not until all those problems are solved will it even have a chance of replacing real silicon. Until then, this is yet another case of a university PR rep boasting about their institution's research with grand claims about what the future holds, while not really reflecting the true nature of the research at hand.

(Admittedly, it is more boring when you adhere to reality.)
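The size comparison in point 1 above is easy to check yourself; the hair width below is a rough assumption (a human hair is very roughly 100 micrometres across):

    hair_um = 100.0            # assumed human hair diameter, micrometres
    nanotube_fet_um = 1.0      # ~1 micron transistors, per the comment above
    silicon_22nm_um = 0.022    # current silicon node cited in the comment above

    print(hair_um / nanotube_fet_um)   # ~100x thinner than a hair
    print(hair_um / silicon_22nm_um)   # ~4500x -- the "thousands of times"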

Re:More nanotube PR schlock (5, Informative)

Goldsmith (561202) | about a year ago | (#44954987)

I am a nanotube scientist, and I support this comment.

As a field, we need to stop the hyperbole. It's embarrassing. They're doing a nice job of integration, but to claim any kind of fundamental advancement is absurd and irresponsible.

As an industrial scientist, this kind of misleading stuff makes my job significantly harder. Your typical non-expert doesn't realize that these guys did not achieve the aims claimed in the press release and are nowhere near achieving them. If I do want to make meaningful advancements in manufacturability or performance, I first have to teach investors and business partners that the academics in my field are all lying to the public... not a good starting point.

My wig is cracking your passwords (3, Funny)

uCallHimDrJ0NES (2546640) | about a year ago | (#44953729)

This is what will drive the future legislation to eliminate all hair in order to protect Hollywood and save us from perverts and terrists.

Re:My wig is cracking your passwords (3, Funny)

MickyTheIdiot (1032226) | about a year ago | (#44953759)

In the future, the NSA will have to be given a back door to your hair.

Re:My wig is cracking your passwords (1)

Anonymous Coward | about a year ago | (#44954885)

They can have the hair from my backdoor already. Where do I mail it?

Re:My wig is cracking your passwords (0)

Anonymous Coward | about a year ago | (#44961843)

In Soviet Russia, you wash wig. In America, wig watches you!

I hadn't heard there was a shortage.... (-1, Flamebait)

mark-t (151149) | about a year ago | (#44953839)

... of silicon, last time I checked.

Silicon is the 2nd most abundant element in the earth's crust... Carbon is like #9 or #10. Why would they want to replace something that's commonly available with something substantially less common?

Or is this more of a "because we can" kind of thing... just to be able to say that they did it?

Re:I hadn't heard there was a shortage.... (1)

osu-neko (2604) | about a year ago | (#44954087)

Wow... forget the article, you appear to have not even read the fucking summary!

...operate on a fraction of the energy required to power today's silicon-based computer chips...

Re:I hadn't heard there was a shortage.... (1)

mark-t (151149) | about a year ago | (#44954529)

Hmmm... I only did a quick skim of the article before posting, and embarrassingly enough, missed the significance of that admittedly rather important point.

It's like I read the words but for some reason their significance to what was being said about them didn't gel in my brain as important enough to remember.

"SEM" Image in TFA (1)

Videospike (2897665) | about a year ago | (#44953913)

That is by far the most incredible scanning electron microscopy image I have ever seen! The colors are so vibrant! And what function does the column of nanoscale binary numbers on the left-hand side serve? Are they thirty-two 10-digit numbers or ten 32-digit numbers? Now hit "Enlarge" and BLOW YOUR MIND. Those white lines in the center of the colored areas are actually dots. WHAT DOES IT MEAN?? But seriously, when the image is enlarged you can actually see some of the very tiny edge imperfections between the characteristic SEM grey background and the false-color sections that were slapped onto the real image. They're the only indication that this is a real image and not a rendering. I really wish they hadn't "enhanced" it.

QA Process (0)

Anonymous Coward | about a year ago | (#44954035)

Their QA Process is like they used to do with Hard Drives. Oh, there's a bad sector? We'll just map it out and pretend it's not there. Hey look, now it works.

Moore's Law (5, Funny)

Anonymous Coward | about a year ago | (#44954111)

The number of people predicting the end of Moore's Law doubles every two years.

Re:Moore's Law (1)

Alsee (515537) | about a year ago | (#44955047)

That means there will come a day when it's every person, and then two years later...
Whoa! Duuuude! That's like totally Trippy!


Re:Moore's Law (0)

Anonymous Coward | about a year ago | (#44959323)

You are describing the "Law of Moore's Wall". Too bad it's not quite a palindrome.

Was this created by Africans? (-1)

Anonymous Coward | about a year ago | (#44954361)

Anybody care to explain why not?

(Cue insane nation-wrecking liberal self-hating white cretins immediately foaming at the mouth with knee jerk responses based on the 'magic' word: "RACIST!")

Idiots.

Perhaps not so fundamental a breakthrough (1)

mendax (114116) | about a year ago | (#44955171)

A fellow I knew about ten years ago was a wacko conspiracy theorist and 'I've seen the mothership' UFO believer who was also an incredible analog and digital electronics wizard. He told me that the NSA was already using carbon-based semiconductors running at much higher clock speeds for its various nefarious operations. This makes me wonder if carbon nanotube technology hadn't already been developed and implemented in the "skunkworks" world and it's only now being developed in universities. It sort of parallels the development of public-key encryption, something that British intelligence developed in the late 1960's but kept secret until recently, while it took nearly a decade for it to become known outside of the world of the spooks. Of course, encryption is just mathematics while carbon-based semiconductors are technology. But when an entity has nearly unlimited funds to accomplish something, it can find the minds to do it as well as pay for the development of the technology.

yeah, heard it all before, let us know when someth (1)

Anonymous Coward | about a year ago | (#44956173)

" assemble a basic computer with 178 transistors"

Bunk. 100+ times as many normal transistors are required to support its operation as a 'basic computer'.
(The 4004 had 2,300 and required many support chips.)

It's a long way from a lab to something practical. Will the techniques scale up?
Or will remaining irregularities scuttle any effort at the million-plus-transistor circuits that are 'basic' these days?

Let us know when you get a lot further along.

Not True ! (0)

Anonymous Coward | about a year ago | (#44956779)

Perhaps they built the logic with C nanotubes. Can't imagine them building the motherboard with C nanotubes.

About time (1)

TheSkepticalOptimist (898384) | about a year ago | (#44959025)

Surely this is the year of the "Inanimate carbon rod", as predicted by the great prophet Homer.

can you manufacture a billion for $5? (1)

peter303 (12292) | about a year ago | (#44959297)

This is usually the limiting factor in the monthly "I can replace silicon" article.