One thing I'm wondering: superconductors are supposed to be more efficient because they waste no power as heat. But doesn't it take power to maintain such cold temperatures, even at "high-temperature" liquid-nitrogen temperatures? How does that compare to the energy saved from the lack of resistance in the superconductor?
Any real fusion reactor would need confinement fields of 5+ T and would be 2+ times the radius of this machine. With copper confinement coils you would dissipate many GW in ohmic losses to get hundreds of MW of fusion power: it simply could never work. A cryoplant for an LTS fusion reactor would use on the order of 50 MW, somewhat less for HTS.
So for the big magnet application superconductors are an incredible win.
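For a sense of scale, the copper-coil ohmic loss above can be sketched with back-of-the-envelope numbers. Everything here (major radius, field, copper cross-section and volume) is assumed for illustration, not taken from any real design:

```python
# Rough order-of-magnitude sketch: ohmic dissipation in copper
# toroidal-field coils vs. the ~50 MW cryoplant figure quoted above.
# All geometry/material numbers below are assumed, not a real design.
import math

mu0 = 4e-7 * math.pi   # vacuum permeability, H/m
rho_cu = 1.7e-8        # copper resistivity at ~300 K, ohm*m

R = 4.0                # assumed major radius, m
B = 5.0                # on-axis toroidal field, T
ampere_turns = 2 * math.pi * R * B / mu0   # ~1e8 A-turns needed

A_cu = 5.0             # assumed total copper cross-section, m^2
J = ampere_turns / A_cu                    # current density, A/m^2
V_cu = 100.0           # assumed total copper volume, m^3

P_ohmic = rho_cu * J**2 * V_cu             # dissipated power, W
print(f"ohmic loss ~ {P_ohmic/1e9:.1f} GW vs ~0.05 GW for a cryoplant")
```

Even with these fairly generous assumptions the copper coils land at GW scale, an order of magnitude or more above the cryoplant's draw.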
Not a thermals guy, but bog-standard consumer air conditioners tend to have efficiencies of 300% or so, i.e. a coefficient of performance of about 3: for every joule of electrical energy consumed, they remove roughly 3 joules of thermal energy from the room.
Regardless of this, having zero heat output in the wafer itself could be far more interesting than just increased power efficiency. For example, without the issue of thermals one could imagine silicon wafers with thousands of layers and 3-D interconnects.
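One caveat to the 300% figure above: that COP applies to a small temperature lift. At liquid-nitrogen temperature the thermodynamic ceiling is far lower. A quick sketch of the Carnot bound, with the real-cryocooler efficiency fraction assumed for illustration:

```python
# Ideal (Carnot) coefficient of performance for pumping heat from a
# 77 K (liquid-nitrogen) cold stage to a 300 K room -- the cryogenic
# analogue of the ~3x figure for a room-temperature air conditioner.
T_cold = 77.0    # K, liquid-nitrogen boiling point
T_hot = 300.0    # K, ambient

cop_carnot = T_cold / (T_hot - T_cold)   # ~0.35: < 1 J of heat moved per J of work
print(f"Carnot COP at 77 K: {cop_carnot:.2f}")

# Real cryocoolers reach only a fraction of Carnot (15% assumed here),
# so each watt leaking into the cold stage costs tens of watts at the wall.
watts_per_cold_watt = 1 / (0.15 * cop_carnot)
print(f"~{watts_per_cold_watt:.0f} W of electricity per W of heat lifted")
```

So the economics hinge on keeping the heat load on the cold stage tiny, which is exactly the insulation point made below.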
So you're really only fighting the inefficiency of your insulation, and insulation can be very, very good. At the extreme end you could magnetically suspend something in a vacuum chamber and then chill it; it'll take a very long time to heat up by radiative emission from the walls. (Double-wall insulated cups use this property, though my take is they're just gas-filled - same idea.)
Essentially the ongoing cost of keeping something cool is purely a function of the heat load making it through the insulation, and in all cases that's going to be much lower than the heat generated in the system. The problem is that, engineering- and complexity-wise, it's non-trivial and the failure scenarios can be bad: an SC magnet quench can be very bad since, among other things, it might immediately boil your coolant. This happened at the LHC during commissioning.
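The vacuum-chamber case above can be put in numbers with the Stefan-Boltzmann law. Surface area and emissivity here are assumed for illustration; real cryostats add multilayer insulation shields to cut this much further:

```python
# Radiative heat load on a cold object suspended in a vacuum chamber
# (the extreme-insulation case described above). Grey-body sketch with
# assumed geometry and emissivity.
sigma = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
T_wall = 300.0     # chamber-wall temperature, K
T_obj = 77.0       # cold-object temperature, K
area = 0.1         # assumed object surface area, m^2
emissivity = 0.05  # assumed polished-metal surface

# Net radiative power absorbed by the cold object
q = emissivity * sigma * area * (T_wall**4 - T_obj**4)
print(f"radiative heat leak ~ {q:.2f} W")
```

A couple of watts onto the cold stage is a very benign steady-state load compared with dissipating heat inside the system itself.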
There's a billion $$$ startup idea. Let's get some people together to start serious superconductive computing below the Kola Superdeep Borehole: https://en.wikipedia.org/wiki/Kola_Superdeep_Borehole
Well, apart from the Russian-Ukrainian war going on presently. :(
So plan B: put a superdeep borehole somewhere else. Maybe at the KTB borehole in Bavaria?
My instinctual reaction (as a physicist studying superconductors, but with no expert knowledge of geophysics) is that this won't work for most (possibly all) materials, because (a) the maximum Tc under high pressure is still generally less than ~ room temperature, (b) the ambient temperature increases with depth below the earth's surface.
Now a massive computing center in Antarctica however might have some merits.
But maybe... the bottom of the ocean? Or space/the moon?
> According to the Dutch Research Council (NWO), using superconductors instead of conventional semiconductors might save up to 10% of all Western energy reserves.
This publication needs an editor.
Imagine a hundreds-of-times-faster bump from superconductors (arbitrarily, say 350x) multiplied by a million-times-faster bump from the fancy new super-fast laser-driven graphene and gold logic gates.
A benefit of using materials in which the phase velocity of light exceeds the speed of light in a vacuum.
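Taking both (very speculative) claims above at face value, the compounded speedup multiplies out as:

```python
# Compounded speedup implied by the two speculative claims above.
sc_speedup = 350       # "hundreds of times faster" from superconducting logic
optical_speedup = 1e6  # "a million times faster" from laser-driven gates
combined = sc_speedup * optical_speedup
print(f"combined: {combined:.1e}x")   # a few-hundred-million-fold, if both held
```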