This is indeed very promising, but not necessarily a breakthrough yet. There's still the issue of operating temperature. The researcher said the next step is to try to get this to work with fancier superconductors that operate at 77 K, which would "only" require liquid nitrogen cooling. If that is achieved, then I see this having real applications, but niche ones. More revolutionary applications would require room-temperature, normal-pressure superconductors, which AFAIK haven't yet been found and might not be possible at all (it's an open question). Even if they were found, there's of course also the question of whether the cost of those materials can be brought down enough to be worth the gains in efficiency etc.
dgoldstein0 685 days ago [-]
Yup.
One thing I'm wondering: superconductors are supposed to be more efficient because they waste no power as heat. But doesn't it take power to maintain such cold temperatures, even for "high temperature" liquid nitrogen temperatures? How does this compare to the energy we save from the lack of resistance in the superconductor?
willis936 685 days ago [-]
It depends on how much current you want. I worked on a 1.2 m major-radius, 1 T copper stellarator that used 10 MW of power for the confinement coils. The coils are water cooled, but even then the shots are limited to 1 second to avoid overheating (and to save on the engineering cost of the power system).
Any real fusion reactor would need 5+ T confinement fields and be 2+ times the radius of this machine. You would be using many GW in ohmic losses in copper confinement coils for hundreds of MW of fusion power: it simply could never work. A cryoplant for an LTS fusion reactor would use on the order of 50 MW, a bit less for HTS.
So for the big magnet application superconductors are an incredible win.
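The arithmetic behind that win can be made explicit. A minimal sketch, using the figures quoted in the comment above (the exact values below are assumptions loosely based on those figures, not a design study):

```python
# Rough wall-plug comparison for a reactor-scale machine.
# All numbers are illustrative assumptions taken from the comment above.
fusion_output_mw = 300    # "hundreds of MW of fusion power"
copper_losses_mw = 2000   # "many GW" of ohmic loss in copper coils; assume 2 GW
cryoplant_mw = 50         # LTS cryoplant power estimate from the comment

net_copper_mw = fusion_output_mw - copper_losses_mw
net_sc_mw = fusion_output_mw - cryoplant_mw
print(f"copper coils: net {net_copper_mw} MW")  # -1700 MW: a net power sink
print(f"SC coils:     net {net_sc_mw} MW")      # +250 MW: an actual power plant
```

Whatever the precise numbers, the sign of the result is what matters: resistive coils consume more than the plant produces, while the superconducting cryoplant is a modest overhead.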
AussieWog93 685 days ago [-]
>But doesn't it take power to maintain such cold temperatures, even for "high temperature" liquid nitrogen temperatures? How does this compare to the energy we save from the lack of resistance in the superconductor?
Not a thermals guy, but bog-standard consumer air conditioners tend to have efficiencies of 300% or so (a coefficient of performance of about 3). I.e., for every joule of electrical energy consumed, they'll reduce the thermal energy in the room by 3 joules.
Regardless of this, having zero heat output in the wafer itself could be far more interesting than just increased power efficiency. For example, without the issue of thermals one could imagine silicon wafers with thousands of layers and 3-D interconnects.
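One caveat worth quantifying: a COP of ~3 only holds when cooling near room temperature. The thermodynamic ceiling (the Carnot COP) falls off sharply as the cold side gets colder, and real cryocoolers achieve only a fraction of that ceiling. A quick sketch with assumed temperatures:

```python
# Ideal (Carnot) coefficient of performance for a refrigerator:
#   COP_carnot = T_cold / (T_hot - T_cold)
# Real machines achieve only a fraction of this upper bound.
def carnot_cop(t_cold_k, t_hot_k=300.0):
    """Upper bound on joules of heat moved per joule of electricity spent."""
    return t_cold_k / (t_hot_k - t_cold_k)

print(f"Room AC (290 K room): COP <= {carnot_cop(290.0):.1f}")  # ~29.0
print(f"Liquid N2 (77 K):     COP <= {carnot_cop(77.0):.2f}")   # ~0.35
print(f"Liquid He (4.2 K):    COP <= {carnot_cop(4.2):.3f}")    # ~0.014
```

So at liquid-helium temperatures, even an ideal fridge spends roughly 70 J of electricity per joule of heat removed, which is why liquid-nitrogen-temperature superconductors are such a big practical step.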
XorNot 685 days ago [-]
It's a question of what you're spending power on. When you cool something there's the action of pumping heat away from it, and then the issue of how easily heat can migrate back in.
So you're only really fighting the inefficiency of your insulation, and insulation can be very, very good. At the extreme end you could magnetically suspend something in a vacuum chamber and then chill it; it would take a very long time to heat up via radiative emission from the walls. (Double-wall insulated cups use this property, though my take is they're just gas filled - same idea though.)
Essentially the ongoing cost of keeping something cool is purely a function of the heat load making it through the insulation, and in all cases that's going to be much lower than the heat generated in the system. The problem is that, engineering- and complexity-wise, it's non-trivial, and the failure scenarios can be bad - e.g. an SC magnet quench can be very bad since, among other things, it might immediately boil your coolant. This happened at the LHC during commissioning.
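To put a number on how small that heat load can be: across a vacuum gap the only path is radiation, which the Stefan-Boltzmann law bounds. A minimal sketch, with assumed geometry and emissivity:

```python
# Net radiative power absorbed by a cold surface facing warm walls
# (Stefan-Boltzmann law; area and emissivity below are illustrative assumptions).
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiative_load_w(area_m2, t_hot_k, t_cold_k, emissivity):
    """Heat leaking onto a cold stage by radiation alone."""
    return emissivity * SIGMA * area_m2 * (t_hot_k**4 - t_cold_k**4)

# 0.1 m^2 cold stage at 77 K inside 300 K walls, polished low-emissivity surfaces:
load = radiative_load_w(0.1, 300.0, 77.0, emissivity=0.05)
print(f"~{load:.1f} W")  # a couple of watts; multilayer insulation cuts it further
```

A few watts of leak-in is tiny compared to the kilowatts a resistive coil of the same scale would dissipate, which is the comparison the comment above is making.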
justinclift 685 days ago [-]
At some point, I wonder if it'd just be cheaper and easier to put our supercomputers beneath the earth's crust. :)
How would that help? Temperature goes up as you go down.
z2h-a6n 685 days ago [-]
The superconducting transition temperature (Tc) of many superconducting materials increases as the pressure applied to the material is increased, so I guess the idea is to make the ambient pressure high enough that Tc is below the ambient temperature.
My instinctive reaction (as a physicist studying superconductors, but with no expert knowledge of geophysics) is that this won't work for most (possibly all) materials, because (a) the maximum Tc under high pressure is still generally less than ~room temperature, and (b) the ambient temperature increases with depth below the earth's surface.
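Point (b) is easy to quantify with a typical continental geothermal gradient (the ~27.5 K/km figure below is an assumed average; real gradients vary a lot by location):

```python
# Ambient temperature vs. depth for an assumed average geothermal gradient.
def temp_at_depth_k(depth_km, surface_k=288.0, gradient_k_per_km=27.5):
    """Rough crustal temperature at a given depth (assumed linear gradient)."""
    return surface_k + gradient_k_per_km * depth_km

for d_km in (1, 5, 12):
    print(f"{d_km:>2} km: ~{temp_at_depth_k(d_km):.0f} K")
# Already hotter than room temperature a couple of km down, so any
# pressure-boosted Tc would be chasing a rising, not falling, ambient.
```

At the ~12 km depth of the deepest borehole ever drilled, this estimate gives roughly 600 K, so depth works against the idea on both the temperature and the engineering side.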
vanderZwan 685 days ago [-]
There's also the question of maintenance.
Enginerrrd 685 days ago [-]
Exactly.
Now a massive computing center in Antarctica however might have some merits.
Maursault 685 days ago [-]
Not if climate change is any concern. What we need in Antarctica is a factory that makes it colder there, not a place to dump heat. What you want is heatsinks that dump heat to the cold side of the vacuum of space.
Enginerrrd 685 days ago [-]
This is... kind of delusional.
Maursault 685 days ago [-]
Crazy or not, all that matters is whether it's profitable.
asah 685 days ago [-]
temp goes up with depth...
but maybe... bottom of the ocean? space/moon ?
pg_bot 685 days ago [-]
To be fair, 'only' requiring liquid nitrogen is a much easier proposition than liquid helium, as nitrogen can be sourced from air and is far less expensive.
elromulous 685 days ago [-]
This is exciting, don't get me wrong, but I'm always skeptical of these. "In the lab" in silicon is somewhat equivalent to "in mice" in biology. I was very excited about memristors back in '08 when they were first actually synthesized [1], but here we are, ~14 years later, and still no commercial viability. Producing something in a lab and mass-producing it in a foundry/factory are very different things.
> For instance, the use of superconductors instead of regular semiconductors might save up to 10% of all western energy reserves according to NWO.
>
> According to the Dutch Research Council (NWO), using superconductors instead of conventional semiconductors might save up to 10% of all Western energy reserves.
This publication needs an editor.
mike_hock 685 days ago [-]
At least it didn't contradict itself.
ThePhysicist 685 days ago [-]
Super interesting result, but some of the things mentioned in the article are just plain wrong or grossly misrepresented. Not sure, for example, what they mean by "IBM mentions that without non-reciprocal superconductivity, a computer running on superconductors is impossible". RSFQ (rapid single flux quantum) logic is based on Josephson junctions and works just fine up to many GHz; Prof. Likharev's group at SUNY Stony Brook developed it together with IBM, and RSFQ circuits are still being used in niche applications as well as in quantum computing. The reason they never replaced semiconductor-based computers is simply that they couldn't keep up with the rapid progress of those. In the 80s and 90s it was impressive that RSFQ devices could (theoretically) run at several 10s-100s of GHz, but fast and efficient MOSFETs have much better characteristics and don't need costly cooling. For high-frequency applications there are HEMTs (high electron mobility transistors), which are often also easier to operate and manufacture than superconducting circuits. Also, no one ever managed to scale RSFQ logic beyond a few hundred thousand junctions.
melony 685 days ago [-]
Are we going to get a magnetic monopole out of this too?
Maursault 685 days ago [-]
No, but we can expect a lot of angry Apple customers at some point bitching that Apple's QMJJs stop working after two years because the coolant leaks into their trendy graphene-and-gold logic gates[1], throwing them out of sync with the laser, catching fire, and turning their advertised 350-million-times-faster[2] computer into a superfund site. But having massive renders sitting there for your review before you've actually submitted them to render might be worth it.[3]
[2] hundreds of times faster bump from superconductors (arbitrarily chose 350) times a million times faster with the fancy new super fast laser driven graphene and gold logic gates.
[3] A benefit of using materials that light passes through faster than the speed of light in a vacuum.
d--b 685 days ago [-]
I didn't read the article, but I think that adding the question mark to the title to mitigate the claim is a great idea!
There's a billion $$$ startup idea. Let's get some people together to start serious superconductive computing below the Kola Superdeep Borehole: https://en.wikipedia.org/wiki/Kola_Superdeep_Borehole
Well, apart from the Russian-Ukrainian war going on presently. :(
So plan B, need to put a superdeep borehole somewhere else. Maybe at the KTB borehole in Bavaria?
https://en.wikipedia.org/wiki/German_Continental_Deep_Drilli...
[1] https://en.m.wikipedia.org/wiki/Memristor
https://www.nature.com/articles/s41586-022-04504-8
[1] https://news.ycombinator.com/item?id=31356109