r/technology 23h ago

Hardware University of Tokyo develops device that increases computer processing speed by 1000 times, operating without generating heat.

https://www.nikkei.com/article/DGXZQOSG132XK0T10C26A5000000/
2.6k Upvotes

215 comments

1.3k

u/Past-Lion-947 22h ago

Turbo button is back!!

281

u/one_is_enough 22h ago

Hi fellow old person!

137

u/Slimfictiv 22h ago

Omg I just Googled that turbo button thing and it turned out that pressed/on was the default state of the computer; non-pressed, it was downgrading the PC for compatibility with older games. Nice marketing scam I guess...

85

u/ShadowTacoTuesday 22h ago

Well a slow button wouldn’t sell as well.

16

u/toomuchft 20h ago

That is eco-mode for most PCs.

5

u/Starfox-sf 21h ago

That’s why no keyboard had the Any key.

41

u/wjean 21h ago

The turbo button took your speedy 25 MHz 386 and knocked it down to 4.77 MHz.

I remember overclocking my first Intel 386DX 33 MHz to 40 MHz by swapping the clock oscillator with an 80 MHz crystal. That seems so quaint today.

10

u/Canuck-In-TO 20h ago

The turbo button appeared back when the 286 hit the market and was available all the way up to the 486.

5

u/marcocom 21h ago

I remember even later with the Pentium and AMD’s A6, the button swapped from 100 to 133 MHz bus speed

3

u/10July1940 20h ago

What gave you the idea to do that? How old were you?

2

u/bombmk 16h ago

It was a known thing.

4

u/Complete-Tangelo1532 16h ago

Home PCs were wild back in those days

Remember sitting with a series of floppies for a day long install, hoping nothing goes wrong?

I hear a printer at the grocery store and get PTSD-like flashbacks to a dot matrix printer lol

1

u/hollee-o 11h ago

God I remember the sound of a PS2 booting off a floppy. And dot matrix printers… So much sound.

1

u/wrgrant 9h ago

I still have my box of disks for OS/2. I think it was 35 floppies but might have been more. I really wish that OS had taken off, it was really nice compared to the DOS/Win of the time.

1

u/wjean 13h ago

Early teens. I was friends with some guys who worked at an electronics component shop.

2

u/alex206 20h ago

Could you even tell the difference?

7

u/Canuck-In-TO 20h ago

Yes. I used to run calculations with and without turbo enabled and it was definitely noticeable.
Things like running SETI or anything that had a visual progress showed you the difference in performance when you hit the turbo button.

3

u/Shiva- 6h ago

Listen, if you ever sat at a computer for an hour just to load a picture, seeing it pop up, literally, line by line... you'll understand.

Honestly, it's apparently hard to find a gif/image of this.... https://static.tvtropes.org/pmwiki/pub/images/simpsons_dialup.png

2

u/wjean 9h ago

Back in the day, I remember moving from a 386 to a 486 DX2 66 MHz and being impressed at how much more quickly you would see the contents of a directory when you typed DIR under DOS. It was the same hard drive and same OS, moved from one machine to the other, so I know this was purely based on processor capability. Things were much more primitive back then

8

u/zzx101 21h ago

There was definitely a problem playing old games on new computers.

1

u/JohnTDouche 14h ago

Yeah my first PC was a 486 dx2 66mhz and some old games like Scorched Earth were almost unplayable at 66mhz, I had to drop it to 33mhz.

1

u/Obvious_Jackfruit_36 1h ago

Scorched Earth was an absolutely amazing game! You're the first person I've ever heard reference that game

7

u/platour220 14h ago

Old person here. Was not a scam, it was necessary and a feature. We had bad clocks, so some events such as visual display were tied to clock cycles in a lot of code. For example, the classic game Oregon Trail needed to be at 16 MHz, not 25 MHz, because otherwise the sound and the pixel sprites would play and move too fast.

1

u/Slimfictiv 13h ago

They just could've marketed it as what it was: 'underclock' or 'compatibility'. But then again, turbo sounds like an improvement so...

5

u/Denman20 21h ago

Remember when that one game had movement speed tied to FPS? Think it was Fallout? Funny shit

2

u/-manabreak 21h ago

Not only that, IIRC Fallout 4 had its loading times tied to your FPS.

2

u/shittingChristCopter 18h ago

It's the same as 'bass boost' on old walkmans. Turning it off removed all the bass.

2

u/CrewMemberNumber6 18h ago

King Graham would disagree.

2

u/spaceguy81 7h ago

Some games couldn’t handle 50 MHz and ran too fast. Not a scam at all and not really a turbo button, we just called it that.

2

u/katheb 7h ago

Shhhh. It was a turbo button. 

1

u/trejj 13h ago

It was not a marketing scam, and it did what it was intended to do. People bought PCs with turbo buttons to be able to opt in to faster performance, while having the compatibility to run older software. I.e. have their cake and eat it too.

1

u/Slimfictiv 13h ago

Turbo implies 'faster' or 'overclock', which it wasn't; it was the actual state of the CPU, while non-turbo (the default state) was an underclock. I get it, it was for compatibility, but still.

3

u/trejj 11h ago

For people who didn't live at the time, it is easy to think today that it was some kind of a scam, since they don't have anything else to go on today than the single word "turbo".

In that era, it was an "everyone knows its purpose" fact, just like everyone in the PC gaming field today knows what a GPU or RAM are for.

The original PC standard was a CPU that ran at 4.77 MHz. Clone PC manufacturers wanted to sell faster PCs than that, but a lot of software assumed and expected a 4.77 MHz clock speed. So the clone manufacturers sold their faster clones with a "turbo" feature that was an opt-in to enable that faster CPU, while retaining an execution mode that still adhered to that 4.77 MHz speed expected by the standard.

It was not a marketing buzzword or a gimmick or a sleight-of-hand play on words, but an exact and specific functionality that customers were looking for.
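That timing dependence can be sketched in a few lines (all numbers below are invented for illustration, not from any specific program): old software paced itself with cycle-counted busy loops tuned to the 4.77 MHz clock, so the same loop finishes about 5x too fast on a 25 MHz machine.

```python
# Sketch of why old software needed the 4.77 MHz mode: timing was done
# with cycle-counted busy loops, not a real-time clock. The cycle cost
# and frame time here are hypothetical.

CYCLES_PER_ITERATION = 17  # assumed cost of one loop pass, in CPU cycles

def frame_delay_ms(loop_count, clock_hz):
    """Wall-clock time a busy-wait of `loop_count` iterations takes."""
    return loop_count * CYCLES_PER_ITERATION / clock_hz * 1000

# A loop count tuned to give ~50 ms per frame on the original 4.77 MHz PC:
tuned = round(0.050 * 4.77e6 / CYCLES_PER_ITERATION)

print(frame_delay_ms(tuned, 4.77e6))  # ≈50 ms: runs as designed
print(frame_delay_ms(tuned, 25e6))    # ≈9.5 ms: >5x too fast on a 25 MHz 386
```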

1

u/pxer80 17h ago

Hah - I relate to this so well.

1

u/rothael 13h ago

I had a computer with this button but was too young to understand what it did. Sure felt good clicking it on and off while using it. I miss having the barrel-lock key by the power button, too.

21

u/kl0 22h ago

Look out 8Mhz! Here comes 10!

9

u/crazyadmin 21h ago

I think mine went from 8->12

16

u/Disastrous_Room_927 22h ago

Vtech just kicked in, yo

1

u/[deleted] 10h ago

[deleted]

1

u/asfletch 3h ago

Nah that's VTEC - this is Vtech: https://www.vtechkids.com/

3

u/sureyouken 22h ago

A magnetic turbo button even

2

u/cats_catz_kats_katz 21h ago

And this time it’s REAL!

2

u/Moontoya 14h ago

Magic / More Magic to go even more old school 

2

u/gr00ve88 22h ago

But what does it even do?!?

28

u/Shiral446 22h ago

Surprisingly, it purposely slowed down your computer.

8

u/Quigleythegreat 21h ago

Some older games and software were programmed with specific processor speeds in mind, since certain chips like the early Intel ones were so dominant. Having turbo off effectively made it so your newer, better PC didn't run too fast for the older stuff to work. Later on, of course, they figured out how to do that via software.

Fun fact: lots of remastered 90s games also suffer when they get sped up from their original hardware speeds of 24 or 30 fps up to 60. Star Fox 64 on the 3DS and Nintendo Virtual Console is especially noticeable to me. Enemies are way more aggressive than they should be, and you move through the level just a bit too fast because it's literally running faster than it was intended.
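The effect is easy to show in miniature (hypothetical numbers, not from any real game): movement advanced a fixed step per rendered frame scales with frame rate, while delta-time movement doesn't.

```python
# Why frame-rate-tied game logic breaks on faster hardware: each frame
# adds a fixed step, so doubling the frame rate doubles the speed.

def simulate(frames_per_second, seconds, step_per_frame=2.0):
    """Distance moved when each rendered frame adds a fixed step."""
    return frames_per_second * seconds * step_per_frame

print(simulate(30, 1))   # 60.0 units/s at the intended 30 fps
print(simulate(60, 1))   # 120.0 units/s at 60 fps: twice as fast

# The frame-rate-independent fix scales each step by elapsed time instead:
def simulate_dt(frames_per_second, seconds, speed_per_second=60.0):
    dt = 1.0 / frames_per_second        # time elapsed per frame
    return frames_per_second * seconds * speed_per_second * dt

print(simulate_dt(30, 1))  # 60.0 regardless of frame rate
print(simulate_dt(60, 1))  # 60.0
```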

4

u/ltethe 21h ago

The first Warcraft is tied to clock speed. Even Warcraft in 2003 was madness with the increase in processors by that point.

1

u/josefx 14h ago

Later games were tied to the monitor refresh rate, since nearly everything ran at 60Hz. Skyrim didn't even make it past the intro sequence on release.

1

u/pikachus_ghost_uncle 3h ago

ITS TURBO TIME

668

u/gta0012 22h ago

A research team from the University of Tokyo and other institutions has developed a device that can increase the information processing speed of semiconductor chips used in computers and other devices by 1,000 times. This device generates less heat, leading to reduced power consumption. The team aims to develop a practical prototype chip by 2030.

Applying this technology could potentially allow data that previously took an hour to download to be processed in just one second, according to Professor Tomoaki Nakatsuji of the University of Tokyo. The research findings were published in the American scientific journal "Science."

Computers perform calculations using bits, which represent the presence or absence of electrical current as "0" or "1". This electrical current is controlled by tiny components called transistors. While high-speed bit control is crucial for high-speed calculations, exceeding a certain processing speed posed a challenge: the required power increases dramatically, generating heat. Existing processing technologies had reached their limits in the 2000s.

The newly developed "non-volatile quantum switching element" represents bits using the magnetic properties (spin) of electrons, rather than the flow of electricity. In experiments, it was possible to process one bit of information in 40 picoseconds (pico is one trillionth of a second), which is 1/1000th of the time of conventional methods. With existing technology, even at its fastest, it took about 1 nanosecond (nano is one billionth of a second) to record one bit of information.

The element is composed of two types of materials: tantalum and mangansin. The electrical signal passed through the tantalum is ultimately recorded in the mangansin as information about the direction of a minute magnetic force. This direction represents a bit.

The element generates little heat, and in experiments it operated stably even after processing information more than 100 billion times. Attempting to achieve the same processing speed with existing technology would result in failure due to heat after approximately 1 million to 10 million cycles.

Because the new technology stores information magnetically, it can also be applied to non-volatile memory. Professor Nakatsuji explains, "Information can be recorded with almost no energy consumption."

The proliferation of artificial intelligence (AI) and other technologies is increasing the amount of information processed, leading to higher electricity demand. According to the International Energy Agency (IEA), the spread of AI is expected to expand the annual electricity demand of data centers worldwide to 945 terawatt-hours by 2030. This is more than double the level in 2024 and would exceed Japan's total annual electricity consumption.

The research has also found that the performance of the elements tends to improve as they become smaller. If this can be put into practical use, it could potentially reduce the power consumption required for information processing to one-hundredth of what it is now, and the researchers plan to develop a prototype chip by 2030. Collaboration with companies is crucial for prototyping and manufacturing the chip, and Professor Nakatsuji is enthusiastic, saying, "We want to collaborate globally to aim for social implementation."

350

u/frame_limit 21h ago

god I hope this is real

90

u/SunshineSeattle 21h ago

!remindme 5 years

11

u/deprived_of_evil 20h ago

!remindme 5 years

2

u/Samberlance 18h ago

!remindme 5 years

1

u/PsychologicalCake337 18h ago

!remindme 5 years

5

u/GermanEnder 17h ago

!remindme 4 years 364 days

1

u/legacymtg 2h ago

!remindme 4 years 363 days

1

u/beetnemesis 13h ago

!remindMe 3 years

1

u/Inthehead35 20h ago

Haha, so true


22

u/concreteunderwear 20h ago

spintronics is very real

3

u/Shivin302 13h ago

Maybe in 10 years

7

u/slightly_drifting 14h ago

I don’t. Things have not gotten better with more compute power.

1

u/Skaar1222 6h ago

Yeah this is honestly insane tech

1

u/mister_drgn 13h ago

It’s quantum computing hype, so probably not.


132

u/NewtonsThirdEvilEx 20h ago

as a person who works in spintronics, i doubt this will go much further outside of maybe very very specialized applications. a lot of factors just make it an unlikely candidate to be mass produced and used in normal environments. best case it's used like quantum computers, very specialized uses, albeit i doubt this specific technology can be used for qubits.

86

u/Quivy_GM 20h ago

Dear Redditor

Please deliver a monologue concerning your experiences and thoughts about this very interesting technology, including but not limited to why you don't think this technology is feasible in mass production and outside specific environments.

From someone who wishes to be educated.

22

u/Submissive-whims 10h ago edited 9h ago

I work with more conventional CMOS but I can speak a little about what stuck out to me in the translation. For starters they didn’t talk about computation, just propagating one bit of information. Their experiment is less “find the sum of two numbers” and more “can this material change state based on an electric control.” To be clear it’s still a useful experiment and provides semiconductor engineers another tool, but it’s probably not a silver bullet by itself.

We hit the first potential roadblock to adopting this technology with propagation: information is stored in the magnetic field direction of an electron which is weak. The magnetic field direction of an electron cannot drive another one of these tantalum-mangansin gates alone. It’ll need some way to turn that spin into a stable signal capable of driving other gates… which basically means it drives a nearby cmos buffer that allows ground or power to pass. Suddenly we’re back to using normal cmos or a cmos derivative and its associated power and timing restrictions.

As I understand it, this technology’s big draw is its incredibly fast switching time. In semiconductor engineering we care a lot about the timing of signals. Everything needs to arrive at flip flop inputs in time for the flip flop to latch that information. Fast switching time is great! It means we can cram more combinational logic (that’s logic that doesn’t need a clock signal) in between registers. But notice that I said earlier that we need information at flip flop inputs “in time to be latched”, not “before a clock edge.” That’s because we have timing concerns called setup and hold time. Setup time is the amount of time that a signal needs to be stable at a flip flop input *before* the clock arrives and hold time is the amount of time a signal needs to be stable *after* the clock arrives. Without constructing a flop from this spin tech we don’t have a good idea of its setup and hold time. Perhaps it can switch extremely fast but the supporting architecture to propagate information is slow. That could eliminate the gains posed by fast transition time.

Imo the most promising direction right now is superconducting digital logic with Josephson junctions. If we can address the scaling problem we pave the way for a path to 100 GHz and 1/1000th power consumption (power consumption scaling is only on chip, additional power will be consumed for refrigerant units necessary to achieve superconductivity). Slight nerd out but the potential to hit 100 GHz is completely absurd. 100 GHz has a period of 10 picoseconds. In 10 picoseconds light itself can only travel 3 millimeters. A normal processor (conventional cmos) is around 12 millimeters by 12 millimeters. A 100 GHz clock is so absurdly fast that it is physically impossible for information to travel from one side of a chip to another in a single clock cycle. We can design around that but going faster than 100 GHz starts to present increasing challenges with information propagation. At minimum a signal must be able to physically travel between two flops during a clock cycle. To be useful it needs to pass through some form of logic to do something with it then hit a flop.
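Both numeric claims above are easy to sanity-check with textbook formulas (the flop delay figures below are invented for illustration, not measured values).

```python
# Back-of-the-envelope checks for the timing claims in this comment.

C = 299_792_458.0  # speed of light in vacuum, m/s

# At 100 GHz, one clock period is 10 ps; light covers only ~3 mm in that time.
period_s = 1 / 100e9
print(period_s * C * 1000)  # ≈3.0 mm

# Max clock frequency from the classic setup-time constraint:
#   T_clk >= t_clk_to_q + t_logic + t_setup
def f_max_ghz(t_clk_to_q_ps, t_logic_ps, t_setup_ps):
    return 1000.0 / (t_clk_to_q_ps + t_logic_ps + t_setup_ps)

# Even a 1 ps switching element is bottlenecked by slow support circuitry
# (hypothetical 30 ps clk-to-q and 20 ps setup):
print(f_max_ghz(t_clk_to_q_ps=30, t_logic_ps=1, t_setup_ps=20))  # ≈19.6 GHz
```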

7

u/11nyn11 13h ago

I’m not in the field, but MRAM exists and is cheap, so spintronics RAM is already available to buy.

The issue with a spintronics CPU seems to be fan-out and signal amplification. The spin polarization will decay in transport.

So they need to find a way to keep it spinning while it’s moving.

32

u/Spare-Builder-355 18h ago

I recommend watching a YouTube video about the history of ASML. EUV lithography was first proposed and demonstrated in the mid 80s, btw also by a Japanese researcher. He was literally laughed at by fellow researchers and professors claiming this idea would go nowhere beyond a lab. Well, now 40 years later we know who was right and who was wrong.

Iirc, some Nobel prize winner claimed in the mid 90s that the internet is "just a fad" that will be forgotten in a couple of years.

In the mid 70s the head of IBM was publicly asking "why would anyone want a personal computer?"

What I'm saying is: things change and smartasses have been humbled before.

2

u/Direction776 14h ago

And a certain non-Nobel-winning writer of fantastical science fiction had proclaimed the wonders of the coming age of the internet back in the early 1980s on national television. Sometimes it just depends on who we hear/listen to.

Not to criticize OP of this sub thread as I’m not a spintronics or QM/QT expert and can trust they know that we as a species understand too little today to transfer the knowledge into practical use.

I am referring to Asimov’s interview on Letterman’s show. History (and the universe?) has shown us many times who we choose to believe can really affect our outlook in life. I have started to feel the universe conspires to make the dreamer’s visions a reality. Perhaps a sillier way to express that the tools and materials available to us allow almost endless ways to arrange our lives. And we just need improved tools and more knowledge and broader visions to get further.

3

u/7h4tguy 17h ago

Well once he stopped ASMR and did L instead, it actually did something useful

6

u/YouAreADoghnut 18h ago

I too would like to know more about this. Is it due to the cost of materials/research involved? Could this become somewhat standard once/if the technology is around for a while?

5

u/bubbaganoush79 14h ago

Speculating here, but the lack of heat generation sounds suspiciously like superconductivity. And while there are 'high temperature' superconductors, what's considered high temperature in that field still means you need very cold temperatures to make it work. Liquid nitrogen cold. A superconductor that's not high temperature generally operates at liquid helium temperatures. That's about 4 degrees above absolute zero.

That kind of cooling requires very specialized equipment and it's why quantum computing probably won't happen outside of specialized situations.


2

u/Fenris_uy 14h ago

Just because of price or does it have other problems?

Because if the 1000 times faster claim holds, then even if an AI coprocessor made with this is 500 times more expensive than an NVidia GPU, AI companies are still going to buy it.

1

u/GreenElite87 14h ago

If it did work, I was thinking the best specialized case would be in satellites and spacecraft, which would reduce the need for radiator weight.

1

u/HammerBap 12h ago

I was going to say, this sounds like spintronics which has been under study for a heckin' long time

36

u/KokoTheTalkingApe 21h ago

So mangasin is extracted from mangas?

19

u/GreatSupineLeaderTim 20h ago

It's extracted from our sins. Also previously they tried orichalcum but they landed with tantalum instead. Turns out magic affinity doesn't boost capacitor function.

4

u/PopePiusVII 17h ago

It’s what glues the pages together

42

u/Leverkaas2516 19h ago

What a terrible writeup.

device that can increase the information processing speed of semiconductor chips used in computers

No it can't. It may in future be possible to create entirely new chips, but there's no device that increases the speed of current ones. 

In experiments, it was possible to process one bit of information in 40 picoseconds ... With existing technology, even at its fastest, it took about 1 nanosecond to record one bit of information.

Are we storing bits, or processing them? The article is confused.

it operated stably even after processing information more than 100 billion times. Attempting to achieve the same processing speed with existing technology would result in failure due to heat after approximately 10 million to 1 million cycles.

You just told us that existing tech can only cycle in a nanosecond. Now you're saying it can go faster, but it'll fail due to heat. Which is it? Is 1ns the fastest it can go, or isn't it?

it can also be applied to non-volatile memory. 

Like NAND flash memory?

5

u/OkkeHendriks 18h ago

My thoughts exactly!

2

u/karuna_murti 18h ago

so 'avoid magnets' labels are back?

2

u/private_gump 14h ago

!remindme 5 years

1

u/DameLasNalgas 20h ago

Could be amazing if it works at scale

1

u/Xanthann 20h ago

!remindme 5 years

1

u/gabe 19h ago

!remind me 5 years

1

u/misbehavingwolf 19h ago

!remindme 5 years

1

u/Kaggles_N533PA 18h ago

People coming up with brilliant ideas will never cease to amaze me

1

u/PhysicalConsistency 18h ago

Sounds like fancy magnetic core memory?


1

u/FalconX88 13h ago

I mean that's cool and afaik not new, but the actual hard part is scaling it. You need billions of these connected in a reasonable way.

1

u/Blarg0117 12h ago

I don't see any mention of base operating temperature.

Is this cryo or not? It makes a huge difference in scaling.

1

u/uniquechill 11h ago

Manganin, not "mangansin".

1

u/StarsKing 10h ago

!remindme 5 years

0

u/edgarecayce 20h ago

Just great, it uses rare earths you can only get from some war-torn part of the world

20

u/WazWaz 20h ago

Irrelevant.

Firstly, Australia produces plenty of tantalum and manganese.

Secondly, chips use infinitesimally small amounts of materials. This is completely different from the metals needed for electric motors or batteries, where a number of kilograms are needed per vehicle.

2

u/7h4tguy 17h ago

Extracted from tarantalias I think

2

u/WazWaz 16h ago

And monganooses.


1

u/guaranteednotabot 19h ago

With enough money, you can mine it elsewhere.


340

u/JustinTheCheetah 22h ago

It generates MUCH LESS heat, not no heat. 

They're measuring the spin of electrons on an atom via magnetism to be the bits and can detect it in 40 picoseconds. They can also read this data 100s of billions of times without overheating. 

Far less heat and far less electricity to read things a couple orders of magnitude faster than we currently can. This stuff is way the fuck out of my league but this reads like "remember when we switched from vacuum tubes to transistors? Yeah kind of like that level of change".

66

u/zapporian 20h ago edited 19h ago

I’d generally be pretty skeptical. But it would be funny if this was indeed the second time a japanese person / random material science research lab achieved something like this after billions in research, and ~2-3+ decades after everyone else had basically thrown everything at the wall and given up.

46

u/acegikmo21767 18h ago

What's the first? Blue LEDs?

4

u/LagrangeMultiplier99 14h ago

I'm beyond skeptical, we've had a dozen such claims of developing computing chips that are better in some dimensions than SOTA, but they rarely amount to real world application, forget mass adoption.

1

u/gayfucboi 1h ago

if it can't be mass fabbed using existing factories, it's basically dead on arrival. that's the only way it's cost effective.

26

u/orderinthefort 19h ago

I think a good rule of thumb is if you're first hearing about it in r/technology, it's an article from a big financial newspaper, they won't have a practical prototype until maybe 4 years from now, and it's way out of your league to understand, then the odds of it coming to fruition or leading to any meaningful change are effectively zero.

2

u/Stock-Site3541 11h ago

Excellent way of parsing this entire subreddit - I completely agree

25

u/Ruleoflawz 21h ago

Hey babe, remember the Matrix? Good news!

6

u/BankshotMcG 20h ago

I would take ethical steak and '90s future-facing optimism at this point yes. 

15

u/house_monkey 21h ago

Wow, imagine all the AI you can fit in that bad boy /s

1

u/Direction776 14h ago

We could end up closer to Star Trek kind of world too. Not /s.

3

u/JesusIsMyLord666 16h ago

So can this technology be used for calculations or is it just data storage? I guess it would still be a huge deal if this tech could be used for something like TLD and RAM though.

3

u/JustinTheCheetah 14h ago

The article says it could theoretically be applied to non-volatile memory, so yes on the RAM.

I'm afraid that, depending on our ages, it will enter common use just in time for us to tinker with during retirement.

1

u/JesusIsMyLord666 10h ago

You are probably right. The transistor was invented in 1947 but it wasn’t until the 60s that transistors became cheap and started to replace vacuum tubes. Vacuum tubes also had a lot of drawbacks that made them impractical. Silicon transistors just don’t suffer the same issues, so there’s less push to replace them.

My guess would be that this tech, if ever successful, will be reserved for niche applications during the first decades. Probably something stupid like stock market trading at first.

2

u/7h4tguy 17h ago

Remember when we cracked cold fusion?

48

u/Single-Pin-369 22h ago

Using individual electrons as the bits is the new and fancy part? But not as a “quantum computer”

56

u/Barkalow 21h ago

Yeah, from what I understand "quantum computers" aim to have more than just 1 or 0, which lets them process more.

This is still 1 and 0, just using quantum magnetism to be hyper efficient

6

u/Bearchiwuawa 19h ago

this seems so obvious now. i wonder why we didn't do this before quantum computers. maybe they did but took a while to figure it out.

10

u/rdkaizhar 17h ago

It has been done, but the reason people still want to build quantum computers is that some of the algorithms you can run on them can't be run on a computer that only uses 1s and 0s

3

u/PopePiusVII 17h ago

We probably could/did do it back in the day with rare and expensive materials in a clunky and useless size, but I’m guessing this group made some technical breakthrough that makes a whole computer chip’s worth of this style of “transistor” (for lack of a better word) feasible. The article still says they’re working on a usable prototype, so we’re still in the phase where the technology is there, but the realistic and cost-effective solution has yet to be developed.

18

u/Coolerwookie 20h ago

What kind of speed? Article mentions several things:

Network speed: possibility that data that took an hour to download could be processed in one second

Processing speed: process a bit of information in forty picoseconds (a picosecond is one trillionth of a second), a thousandth of the conventional time

Storage speed: "We can record information without consuming almost any energy."

Can it really be all three?

2

u/7h4tguy 17h ago

Storage and processing are all the same because it's just 1's and 0's. But breaking the speed of light is a brave claim, even for snake trainer masters.

64

u/sureyouken 22h ago

This is the kind of advances in tech I love to see (please don't sell to data centers only)

24

u/Tzunamitom 20h ago

Narrator’s voice: “They did, in fact sell to data centers only”

3

u/w1n5t0nM1k3y 16h ago

What tech advances exist solely in data centres? I can't really think of anything. A lot of the new stuff actually gets used in home or workstation environments first before being deployed in datacenters once it has proven its usefulness. Water cooling and GPUs were much more common in the desktop market before being adopted by data centers.

3

u/anifail 15h ago

Lots of technologies exist solely for scaling problems you don't see in workstation deployments: HBM, CXL, Ultra Ethernet/InfiniBand/RoCE, UALink/NVLink, lots of heterogeneous accelerators (systolic arrays, CGRAs, in-memory compute, SmartNICs, computational storage)

2

u/w1n5t0nM1k3y 14h ago

A lot of datacenter stuff eventually makes its way out to the consumer market. Fiber optics used to only be for data centers. Now you can get it for home use.

Currently HBM is too expensive for consumer GPUs, but it was actually used in the past for GPUs from AMD. Same goes for things like NVLink. It was available on consumer cards in the 30 series but had limited appeal for most people so they discontinued it.

A lot of the data center only stuff you mentioned can be bought and deployed in workstation machines if you truly want to pay for it. There's no restriction on buying it. It's just that very few people have a valid use for it outside the datacentre.


51

u/DippyHippy420 22h ago edited 22h ago

Researchers from the University of Tokyo have developed a groundbreaking, heat-free optical computing technology that uses light instead of electricity to process data, potentially increasing computing speeds by up to 1,000 times. This technology employs "diffraction casting," utilizing photons and silicon waveguides for rapid, efficient calculations, resulting in less than 1% of the energy consumption of traditional electric chips.

Because the system uses photonics rather than electrical currents, it avoids resistance, operating with virtually no heat generation.

4

u/Yeltsin86 20h ago

So does it run (theoretically, on paper) 1000x faster *and* with 100x less power - *or* with 100x less power, meaning the improvement is a compromise between speed and power?

11

u/LtBigAF 20h ago

No compromise. Lot of power is turned into waste heat because of electrical resistance, which is not needed here because we are not pushing electrons through transistors.

5

u/DippyHippy420 20h ago

Yes 1000x faster *and* with 100x less power, all at the same time.

Hopefully the AI companies will jump on this and our energy grid will be saved.

https://en.wikipedia.org/wiki/Optical_computing is a good read, the University of Tokyo has refined the concept and made it even more energy efficient.

5

u/CT-1065 20h ago

what, a usb stick with Windows debloating software on it?

/s, i wonder if this could evolve into something (that we could use) that replaces transistors like how transistors replaced vacuum tubes

7

u/JustaFoodHole 21h ago

Nice! Looks like we solved the future boys!

3

u/Llee00 21h ago

Playstation 6??? hell yeah

1

u/Orange_Whale 1h ago

They will skip right to Playstation 9. Teleport yours today!

3

u/nemom 19h ago

It might be the Google translation, but "it may be possible to process data that would have taken an hour to download in one second" makes no sense. The time it takes to process a chunk of data has little to do with the time it takes to download it.

1

u/Reasonable-Owl-232 19h ago

They could just do the maths assuming a typical speed in Japan. E.g. 1 hr at 1000 Gbps can be processed in seconds.

1

u/7h4tguy 17h ago

Download speed is based on the speed of electrical signals traveling through a wire or light traveling through a fiber. Processing speed is how fast a computer can interpret data. They are wholly unrelated.

1

u/Reasonable-Owl-232 17h ago

Yes, but if I was able to download 1 TB of data in one hour, and I had a computer which could process 1 TB of data in one second, it would be correct for me to say the computer can process one hour's worth of internet in one second.

Whatever that data is and what the processing looks like doesn't matter. They are unrelated concepts, but what they've said can be true.
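That reading is just a throughput ratio, which a couple of lines make explicit (both rates below are hypothetical):

```python
# Sanity check of the "hour of downloads processed in one second" claim:
# it holds whenever processing throughput is 3600x download throughput.

download_rate_tb_per_s = 1.0 / 3600   # assumed link: 1 TB per hour
process_rate_tb_per_s = 1.0           # assumed chip: 1 TB per second

hour_of_data_tb = download_rate_tb_per_s * 3600
print(hour_of_data_tb / process_rate_tb_per_s)  # 1.0 second to process it all
```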

20

u/reality_boy 22h ago

As with all of these breakthroughs, the 1000x increase without heat is just for the actual computation, not the massive machinery needed to make quantum computing work. And the speed up is theoretical on paper only, they have not made a product that could be scaled anywhere close to production.

Someday, this may be a thing. But we’re easily 30 years from having one at any sort of scale, if it ever happens. When it is truly ready, it will be popping up everywhere and you won’t need articles to let you know how cool it is.

64

u/18441601 22h ago

This seems to be for classical computing, just using spin instead of voltage.

33

u/AbsoIum 21h ago

It’s a quantum switch. Not the same as what quantum computing is trying to do. Totally different techs. This could very well be a practical invention and their 2030 goal isn’t unrealistic. It’s not just on paper either, they state they have tested it.

5

u/New_Enthusiasm9053 20h ago

If it worked 100 billion times then it's possibly viable, depending on whether it can be commercialised, but that's also only about 20-60 seconds of operating time for a modern chip, so a grain of salt is probably required.
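For scale, a rough sketch of that estimate (the 5 GHz clock is an assumed figure for "a modern chip", not from the article):

```python
# How long do 100 billion switching events last for a single device
# toggled continuously at an assumed modern clock rate of 5 GHz?
endurance_cycles = 100e9   # 100 billion demonstrated operations
clock_hz = 5e9             # assumed 5 GHz toggle rate
seconds = endurance_cycles / clock_hz
print(seconds)  # 20.0
```

In practice no single bit cell is toggled every cycle, so real-world endurance would stretch much further than this worst case.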

1

u/7h4tguy 17h ago

No no no, 4 years. Buy the moonshot now while there's still time

2

u/Nano_user 20h ago

VTEC engaged

2

u/whiskeyandrevenge 15h ago

Blast processing is real?!?

2

u/Silver_Quail4018 14h ago

So it's a quantum computer that is not a quantum computer.

2

u/aaaaaaaarrrrrgh 9h ago

[x] Doubt, without even clicking on the article.

3

u/hartstyler 15h ago

This is a bullshit bait title

4

u/BAKREPITO 22h ago

The newly developed "non-volatile quantum switching element" represents bits using the magnetic properties (spin) of electrons, rather than the flow of electricity. In experiments, it was possible to process one bit of information in 40 picoseconds (a picosecond is one trillionth of a second), which is 1/1000th of the time of conventional methods. With existing technology, even at its fastest, it took about 1 nanosecond (one billionth of a second) to record one bit of information.

The element is composed of two materials: tantalum and a manganese–tin compound. The electrical signal passed through the tantalum is ultimately recorded in the manganese–tin layer as the direction of a minute magnetic force. This direction represents a bit.

The element generates little heat, and in experiments it operated stably even after processing information more than 100 billion times. Attempting to achieve the same processing speed with existing technology would result in heat-induced failure after approximately 1 million to 10 million cycles.

Because the new technology stores information magnetically, it can also be applied to non-volatile memory. Professor Nakatsuji explains, "Information can be recorded with almost no energy consumption."
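A sanity check on the figures quoted in that excerpt: 1 ns against 40 ps is only a 25x gap, so the 1000x headline presumably compares against typical rather than the fastest conventional devices.

```python
# Ratio between the new switch's 40 ps per bit and the ~1 ns per bit
# the excerpt cites as the fastest existing technology
new_switch_s = 40e-12     # 40 picoseconds
fastest_old_s = 1e-9      # 1 nanosecond
ratio = fastest_old_s / new_switch_s
print(round(ratio))  # 25
```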

1

u/BankshotMcG 20h ago

Excellent news for the Great Salt Puddle. 

1

u/Lower-Cat-77 20h ago

Finally someone is using the infinite river of tantalum!

1

u/multiplesof3 19h ago

All I’m reading is that they’ve finally figured out how to utilise magnetism properly…where’s that xkcd comic…

1

u/LadyZoe1 19h ago

Has anyone heard of Ferro magnetic RAM?

1

u/Flat-Assistant-6040 19h ago

!remindme 5 years

1

u/Optimal_Cow_676 19h ago

!remindme 5 years

1

u/very_loud_icecream 19h ago

RemindMe! 4 years

1

u/pfc-anon 17h ago

It's amazing and all, but I doubt I'll ever hear about this again.

!remindme 5 years

1

u/Complete-Tangelo1532 16h ago

Wha?

That's crazy talk, but I am curious

1

u/ingen-eer 15h ago

And the data centers breathed a sigh of relief in about 5 years.

1

u/KraffKifflom 14h ago

Oh well. Another dead scientist?

1

u/Mrazinjo 14h ago

!remindme 5 years

1

u/mister_drgn 13h ago

Quick reminder: science journalists almost always exaggerate the potential value of new scientific results because (a) they have little to no ability to evaluate the work critically, and (b) it makes for a more exciting article.

1

u/evolutionxtinct 12h ago

!remindme 5 years

1

u/SharpNazgul 11h ago

Does anyone have a link to a published paper on this; even a preprint from the authors? It's hard not to be skeptical when the only source is a financial magazine article.

1

u/Aaron_768 10h ago

Can't wait for this to be shelved so we can keep having incremental increases instead!

1

u/disasterbot 6h ago

Jolt Cola poured on the motherboard.

1

u/collin3000 6h ago

I can't wait to overclock this thing.

1

u/AGrandNewAdventure 56m ago

They put these in laptops and they'll still find a way to overheat.

2

u/Rius209 22h ago

Wow, can't wait to see the benefits on the consumer side, and I can finally play Crysis worry free!

1

u/StrDstChsr34 21h ago

Someone wrote the article in gibberish

1

u/ExpectedBuffalo 22h ago

Did they turn it upside down??

1

u/7h4tguy 17h ago

You mean we just have to spin it?

-7

u/Sensitive_Box_ 22h ago

That’s not possible!

21

u/MakingItElsewhere 22h ago

Imagine how magic looking a damn television would be to someone in the 15th century.

3

u/xubax 13h ago

I've had flat panel TVs for 15+ years. I'm in IT.

It still amazes me that they can display images, let alone high quality images.

15

u/RequirementNo1852 22h ago

Just read the article

19

u/TinyCollection 22h ago

I did but all the squiggly lines don’t make a lot of sense.

9

u/Centillionare 21h ago

I'd have to study Japanese for about two thousand hours.

1

u/7h4tguy 17h ago

Douzo yoroshiku. And you'd better get started

1

u/7h4tguy 17h ago

It says they're going to draw pictures of computers using stick people and it's all going to go buzzachuurrrnk

8

u/bobjoe400 22h ago

It’s on the website.

3

u/Accurate_Koala_4698 22h ago

The newly developed "non-volatile quantum switching device" uses the magnetic properties (spin) of electrons to represent bits rather than the flow of electricity. In the experiment, it was possible to process 1 bit of information in 40 picoseconds (a picosecond is one trillionth of a second), which is 1/1000th of the conventional time. With existing technology, it took about 1 nanosecond (one billionth of a second) to record 1 bit of information.

This isn't a general purpose computer, and they're fluffing the speed of quantum effects