Science Technology

New Material Sets Stage For All-Optical Computing 53

An anonymous reader writes with this excerpt from the International Business Times: "Researchers have made a new material that can be used to guide waves of light, a breakthrough that could lead to ultra-fast computing. Georgia Tech scientists are using specially designed organic dyes that can process and redirect light without the need to be converted to electricity first. ... 'For this class of molecules, we can with a high degree of reliability predict where the molecules will have both large optical nonlinearities and low two-photon absorption,' said [Georgia Tech School of Chemistry professor Seth] Marder." According to the article, using an optical router could lead to transmission speeds as high as 2,000 gigabits per second, five times faster than current technology.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • "2000 gigabits per second"

    GigaBITs? Wow!

    • Re: (Score:2, Insightful)

      by Serilleous ( 1400333 )
Yes, it's true: transmission speeds and routing capacity are usually measured in bits rather than megabytes or kilobytes. (This probably has something to do with the whole KB/MB-not-following-powers-of-ten thing.)
      • so you're saying not only are they bits, but they're power-of-ten giga-?

      • Re: (Score:2, Insightful)

        Probably has more to do with the fact that historically some hardware had byte and word sizes that weren't multiples of 8.

        E.g. see http://en.wikipedia.org/wiki/36-bit [wikipedia.org]

        • No, it has more to do with rates of serial transmission only making sense in bits/sec. The size of the shift register on the other end (bytes, words, whatever) doesn't impact the speed of the transmission system, just how often you have to load/unload it. The reason bytes are used when referring to memory is that that is usually the smallest addressable space (there are exceptions).
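The point above can be sketched numerically: the line rate is fixed in bits/sec, and the word size only changes how often the receiving shift register has to be unloaded. A minimal illustration (the 2,000 Gb/s figure is from the article; the word sizes are just examples):

```python
def shift_register_loads_per_sec(line_rate_bps: float, word_bits: int) -> float:
    """How often a receiving shift register of `word_bits` bits must be
    unloaded. The line rate itself is independent of word size."""
    return line_rate_bps / word_bits

rate = 2_000e9  # the article's 2,000 gigabits per second
for word in (8, 36, 64):  # a byte, a 36-bit word (see the link above), a 64-bit word
    print(f"{word:2d}-bit words: {shift_register_loads_per_sec(rate, word):.3e} loads/sec")
```

Same transmission speed in every case; only the load/unload frequency changes.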
Sure -- it's for routers, a basic component of data transmission, and a bit is the most fundamental piece of data. We typically think in "BYTES", but that's really an OS abstraction, right? At the lowest level of the network stack it makes sense to talk in bits -- anything above that can interpret them as it will (as a byte, word, long, etc.)
      • by Anonymous Coward

        Where did you learn it was an OS abstraction? That's just ... sigh.

        Bytes are the smallest addressable unit of memory the CPU can handle. It doesn't matter if the memory controller only does cache line fills or whatever, memory addresses are in units of bytes.
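One way to see the "smallest addressable unit" idea without dropping into C is Python's buffer protocol: a buffer of multi-byte words can be re-viewed byte by byte, mirroring how memory addresses come in units of bytes even when the hardware moves whole words or cache lines. (A small sketch, not anything from the article.)

```python
import array

# Two 32-bit integers, exposed word-by-word by the array type...
words = array.array('I', [0x11223344, 0x55667788])

# ...but a memoryview cast to 'B' addresses the very same buffer
# one byte at a time.
as_bytes = memoryview(words).cast('B')
print(len(as_bytes))     # typically 8: two 4-byte words
print(hex(as_bytes[0]))  # the first addressable byte (0x44 on little-endian machines)
```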

    • by Mashdar ( 876825 )
      Oddly, in the future (when this technology is available):
      2000 / 1000 = 5

      Not to mention that if 1000Gb/s connections are widely commercially available today, I have to assume that faster connections are available for specialized purposes.
      • Commercially available connections are measured in the tens of Gb/s. It's called gigabit ethernet, not terabit ethernet.
    • Re: (Score:3, Funny)

      by belthize ( 990217 )

Gah, I hate these lame random units... gigabits/second.

Could somebody translate that to the more standard Libraries of Congress per fortnight, please?
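Happy to oblige, with the usual caveat that "one Library of Congress" is a famously squishy unit; the ~10 TB figure below is just the number most often quoted, not anything official:

```python
# Tongue-in-cheek unit conversion. Assumes ~10 terabytes per Library
# of Congress, a commonly quoted (and much-disputed) estimate.
LOC_BITS = 10e12 * 8               # 10 TB expressed in bits
FORTNIGHT_SECONDS = 14 * 24 * 3600 # two weeks of seconds

def loc_per_fortnight(rate_bps: float) -> float:
    return rate_bps * FORTNIGHT_SECONDS / LOC_BITS

print(f"{loc_per_fortnight(2_000e9):,.0f} LoC/fortnight")  # at the article's 2,000 Gb/s
```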

    • I am guessing that you did not see it coming, or going.
  • by Anonymous Coward on Friday March 05, 2010 @07:40AM (#31369418)
    "five times faster than current technology." Reminds me of being a teenager and discovering lotion...
  • by Anonymous Coward

I could not take my eyes off that advertise-y pic [ibtimes.com].

    Must...read...article.

  • by distantbody ( 852269 ) on Friday March 05, 2010 @07:55AM (#31369496) Journal
    EETimes has "IBM Research claimed a keystone achievement in on-chip optical communications Wednesday (March 3), saying its 40-gigabit-per-second (Gbps) germanium avalanche photodetector completes what it calls the nanophotonic toolkit." (link) [eetimes.com] (A few days before announcing 2,500 layoffs, hmmm...)

    ...And the same news from Semiconductor Intl [semiconductor.net].
    • (A few days before announcing 2,500 layoffs, hmmm...)

      Simple. IBM no longer needed help because it invented awesome.

  • Optocouplers (Score:4, Interesting)

    by derGoldstein ( 1494129 ) on Friday March 05, 2010 @08:03AM (#31369538) Homepage
    I'd like to benchmark this against graphene [technologyreview.com]. Since optical signals don't have to be converted to electrical first, then (I think) the bottleneck would be the optoelectronics.
    • Re:Optocouplers (Score:4, Informative)

      by Hurricane78 ( 562437 ) <deleted&slashdot,org> on Friday March 05, 2010 @08:38AM (#31369730)

      optoelectronics

      If they don’t have to be converted to electricity first, then where are the electronics in this?

      A better name is “photonics”. :)

      • More verbose description --
        Given that the signals arrive in optical form, you (will) have two choices:
        1) Convert them into electrical signals, using optoelectronics, process the data, and then (sometimes) convert the signal back to optical.
        2) Keep the signals in optical form and process them using these new materials.

Just because the first option has more steps doesn't mean it's slower. If you have very fast converters, and then very fast transistors (like the graphene-based ones linked above), then you could still come out ahead overall.
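A toy latency model of the two options above makes the point; every number here is purely illustrative, not a measurement from either article:

```python
# Option 1: convert to electrical, process, convert back to optical.
def electrical_path_ns(convert_ns: float, process_ns: float) -> float:
    return convert_ns + process_ns + convert_ns

# Option 2: stay optical the whole way.
def all_optical_path_ns(process_ns: float) -> float:
    return process_ns

# With fast enough converters and fast (e.g. graphene-based) transistors,
# the three-step path can still beat a slower one-step path:
print(f"{electrical_path_ns(convert_ns=0.01, process_ns=0.05):.2f} ns")
print(f"{all_optical_path_ns(process_ns=0.2):.2f} ns")
```

More steps, but the total is what matters.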
      • by sam0737 ( 648914 )

        optoelectronics

        If they don’t have to be converted to electricity first, then where are the electronics in this?

        A better name is “photonics”. :)

Well, that's until your computer is processed entirely in light: CPU, motherboard, display card and network interface included. I can imagine the future power supply being a giant LED bulb, with every component inside just a light path, to the point that the light is processed and redirected straight to the monitor panel as a lit pixel. That would be very awesome, yet I don't see it coming very soon :P. Hence we will still be stuck with electronics somewhere for a while.

Even better, make the optoelectronics in graphene too.

Would love a quick turnaround on technologies such as this, even if just for the novelty factor at first.

Even something as powerful as a 486 would suffice, although the MHz would probably not be an issue.

  • +1 for Georgia Tech, twice in one week (Spanish botnet taken down)
  • I've been reading headlines for the past 20 years that claim "breakthroughs" in all-photonic computing. Where are the all-photonic routers?

  • Power & Heat (Score:3, Interesting)

    by CraigoFL ( 201165 ) <slashdot&kanook,net> on Friday March 05, 2010 @10:54AM (#31370924)
It's funny... when the tech industry first started talking about switching to light instead of electricity for the chip internals, the biggest motivating factor was speed. How much faster (usually measured in clock speed, even) can we make a chip if we can use photons instead of electrons? These days, I'm more interested in other factors:
    • How much electricity (per unit of performance) does it use?
    • How much heat does it put out?
    • How much smaller can we make the chip and its supporting components?

    This is a result of the highly-clustered, highly-mobile computing age we live in today. A single fast chip isn't as applicable any more. Give us tiny and low-power.

  • by Curmudgeon420 ( 1092149 ) on Friday March 05, 2010 @11:04AM (#31371078)
The big issues in designing optical switches are their switching time and minimum switch pulse width. My group and I built what was probably the first all-optical computer, in the early '90s. We used lithium niobate switches, which limited the machine's clock frequency to 100 MHz. It's hard to find the original article, which is in the Feb. 18 issue of Science Express (subscription required, unfortunately). In that article the authors say nothing about switching time or minimum switch pulse. It looks like a good piece of research, but eons away from anything practical.
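The switching-time limit described above is simple to sketch: if each clock cycle must fit one full switching event, the switch time puts a hard ceiling on the clock rate. The 10 ns figure below is an assumed value chosen to match the 100 MHz machine described, not a number from the comment:

```python
def max_clock_hz(switch_time_s: float) -> float:
    """Upper bound on clock rate if each cycle must accommodate one
    full switching event (a simplification; real designs can pipeline)."""
    return 1.0 / switch_time_s

# An assumed 10 ns lithium niobate switching time caps the clock
# near 100 MHz, consistent with the machine described above.
print(f"{max_clock_hz(10e-9) / 1e6:.0f} MHz")
```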
Obviously the ramifications for emissions security (TEMPEST, though that's a simplification) are huge, but what is this likely to do for heat and component size? I can see this being a great opportunity for a lot of military applications even if the speed is only a few times better than what we have now.

It's my understanding that fiber optic speeds are limited by the electronic conversions on either side. Is that the issue here as well? How well would this (for lack of a better term) internal light mesh with external (fiber optic) light?
  • So when I first read the headline and article, my initial question was, "Could similar dyes be used to route light around an object, hence creating a cloaking device?"

    Unfortunately, the article didn't hint at this possibility at all. However, I did pick this up:

    The research was funded by the National Science Foundation (NSF), the Defense Advanced Research Projects Agency (DARPA) and the Office of Naval Research (ONR).

    So DARPA's helping fund it eh? In answer to my own question then, "Yes!"

    Leave it to DARPA to fund the development of a cloaking device and play it off as a computer breakthrough. I, for one, am stoked.

  • Smoke and mirrors: A new material that can be used to guide waves of light
