While this is amazing and all, it's always seemed to me that this approach of using hundreds of laser beams focused on a single point would never scale to be viable for power generation. Can any experts here confirm?
I've always assumed this approach was just useful as a research platform -- to learn things applicable to other approaches, such as tokamaks, or to weapons applications.
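To put a rough number on why people doubt the power-plant angle, here is a back-of-the-envelope sketch using the widely reported figures from NIF's December 2022 ignition shot (about 2.05 MJ of laser energy on target, about 3.15 MJ of fusion yield, and on the order of 300 MJ drawn from the grid to fire the lasers). Treat the values as approximations, not official engineering numbers:

```python
# Back-of-the-envelope gain figures for NIF-style inertial confinement fusion.
# Numbers are the widely reported ones from the December 2022 ignition shot;
# treat them as rough approximations, not official engineering figures.

laser_energy_on_target_mj = 2.05   # laser energy actually delivered to the target
fusion_yield_mj = 3.15             # fusion energy released by the capsule
wall_plug_energy_mj = 300.0        # rough grid energy used to charge the laser banks

target_gain = fusion_yield_mj / laser_energy_on_target_mj
wall_plug_gain = fusion_yield_mj / wall_plug_energy_mj

print(f"target gain (yield / laser on target): {target_gain:.2f}")    # ~1.5
print(f"wall-plug gain (yield / grid energy):  {wall_plug_gain:.3f}")  # ~0.01
```

So even the "net gain" headline shots release roughly 1% of the electricity that went into firing the lasers, before any attempt to turn that heat back into electricity, which is why many people treat NIF as a physics and stockpile-stewardship facility rather than a power-plant prototype.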
I mean, I assume you have to start somewhere to be able to improve, right? It's like the breakthroughs with TVs: no one would realistically use a vacuum tube when you can make an OLED display, but if we hadn't started with the vacuum tube we wouldn't know what to improve on.
Once we know that we can do it, we can start working on how to do it easier.
IMO the current best bet on who builds an actual fusion plant first is Proxima Fusion, a spin-out of the Max Planck Institute. They're planning to build a large stellarator by 2030 based on their experience with Wendelstein 7-X, which exceeded all expectations (as in: it behaved exactly as predicted), proving that the concept scales without issue. There are still some kinks to figure out, but those are about economic efficiency, not about achieving power output.
The NIF mostly does nuclear weapons research. I have a hard time believing their talk about civilian applications is anything but marketing.
Yes, and the reason they're a good bet is that they're using high-temperature superconductors for their magnets, which makes the machine as efficient as currently possible. The US tokamak designs are doing the opposite; they use even more energy for their magnetic field.
Tokamaks also use superconducting magnets; there's really no feasible way to get the necessary field strengths without superconductivity. What makes the two approaches different is that ions naturally want to follow magnetic field lines in a spiral, which Stellarators lean into and allow (hence the Lovecraftian geometry), while Tokamaks try to make them fly straight by inducing a current in the plasma, creating a secondary magnetic field, which creates turbulence that then has to be brought under control.
The net effect on plasma stability is that with a small Tokamak you're balancing a column of three tennis balls, and when you make it bigger you get additional balls to balance. With Stellarators you're balancing a tennis ball in a salad bowl.
The reason early research favoured Tokamaks is that people thought designing the coil and field geometries necessary for Stellarators would never work out, but then supercomputers came along (Wendelstein 7-X was computed on a Cray) and, as said, the real thing behaves exactly as computed. A thing Tokamaks can only dream of with all their tennis balls.
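To put a rough number on the "induced current" part above: in a Tokamak, the twist of the field lines (the safety factor q) comes from the poloidal field generated by the plasma current itself, whereas a Stellarator gets its twist purely from the coil shape. A minimal sketch using the textbook large-aspect-ratio formulas; the roughly ITER-like numbers and the crude cylindrical approximation are my own illustration, not figures from the comment above:

```python
import math

# Crude cylindrical-tokamak estimate of the field-line twist produced by the
# plasma current. Roughly ITER-like numbers, purely for illustration.
mu0 = 4e-7 * math.pi       # vacuum permeability [H/m]
R = 6.2                    # major radius [m]
a = 2.0                    # minor radius [m]
B_toroidal = 5.3           # toroidal field from the external coils [T]
I_plasma = 15e6            # current induced in the plasma [A]

# Ampere's law: poloidal field at the plasma edge from the plasma current.
B_poloidal = mu0 * I_plasma / (2 * math.pi * a)

# Edge safety factor q (how gently the field lines spiral). The cylindrical
# formula ignores plasma shaping, so it underestimates the real value, but it
# shows how the twist depends on the induced current.
q_edge = (a * B_toroidal) / (R * B_poloidal)

print(f"poloidal field at the edge: {B_poloidal:.2f} T")  # ~1.5 T
print(f"edge safety factor q:       {q_edge:.2f}")        # ~1.1 with this crude model
```

The takeaway is only that a Tokamak's confinement hinges on megaamperes of current flowing in the plasma itself, which is exactly the part a Stellarator replaces with coil geometry.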
That's a great point; manufacturing is much better than anticipated even 20 years ago.
I mean that the startup uses high-temperature superconductors and hence needs even less energy for cooling. Wendelstein 7-X uses "normal" superconductors, which require more cooling energy. And a tokamak uses an order of magnitude or so more energy for its magnetic field than a stellarator does.
But yeah, I have no idea how much more heat a higher-power tokamak magnet picks up from the reaction chamber compared to a lower-powered stellarator magnet. Surely, though, high-temperature superconductors running at warmer temperatures tolerate that better than the "normal" ones, since they have more temperature margin to work with. So for building a reactor that generates a gigawatt or so of heat, this approach really seems like the best we have now.
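For a rough feel for why the operating temperature matters so much for the cooling bill: the ideal (Carnot) work needed to pump one watt of heat out of a cryostat grows steeply as the cold side gets colder, and real cryoplants only reach a fraction of that ideal. A minimal sketch with illustrative temperatures; the 20%-of-Carnot efficiency is a ballpark assumption of mine, not a measured value:

```python
# Ideal (Carnot) work needed to remove 1 W of heat from a cryostat at T_cold,
# rejecting it at room temperature. Real cryoplants reach only a fraction of
# the Carnot limit; the 20% figure below is a rough assumption for illustration.
T_HOT = 300.0               # room temperature [K]
FRACTION_OF_CARNOT = 0.20   # assumed plant efficiency relative to the ideal

for name, t_cold in [("Nb-Ti at liquid-helium temperature", 4.2),
                     ("HTS magnet run at ~20 K", 20.0),
                     ("HTS at liquid-nitrogen temperature", 77.0)]:
    carnot_work = (T_HOT - t_cold) / t_cold      # ideal W of work per W of heat
    realistic_work = carnot_work / FRACTION_OF_CARNOT
    print(f"{name:35s}: ideal {carnot_work:5.1f} W/W, realistic ~{realistic_work:5.0f} W/W")
```

Moving the magnets from ~4 K to ~20 K already cuts the ideal cooling work per watt of heat leak by roughly a factor of five, which is a big part of the appeal of high-temperature superconductors even when they aren't run anywhere near 77 K.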
Must've missed that "high temperature" before superconductor when reading. Wendelstein 7-X uses niobium-titanium, very much not high temperature at around 10 K, but as I understand it that's the standard for applications because metals, much unlike ceramics, aren't a bugger to deal with. Even if there are suitable new materials (it's been 23 years since W7-X started getting built), I doubt they'll use them unless they know they won't be an issue; that's just not the right thing to bet the project on. Looking at Wikipedia, things above 77 K are still either ceramics or need more than 150 GPa, which is insane for an industrial application. Or, wait... yep, there are people who use "high temperature" to mean things other than "can be cooled with nitrogen". That might be it: 20 K can be done with hydrogen.
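For reference, here are rough critical temperatures of the usual magnet conductors and the boiling points of the common cryogens. These are approximate textbook values, and the little comparison script is my own illustration:

```python
# Approximate critical temperatures of common magnet superconductors [K]
# and boiling points of common cryogens [K]. Textbook ballpark values,
# listed only for orientation.
superconductors_tc = {
    "Nb-Ti (used in Wendelstein 7-X)": 9.2,
    "Nb3Sn (used in ITER's main coils)": 18.3,
    "MgB2": 39.0,
    "REBCO / YBCO tape (what 'HTS magnets' usually means)": 92.0,
}
cryogens_bp = {
    "liquid helium": 4.2,
    "liquid hydrogen": 20.3,
    "liquid neon": 27.1,
    "liquid nitrogen": 77.4,
}

# For each conductor, list the cryogens whose boiling point lies below its
# critical temperature (i.e. could in principle keep it superconducting at
# zero magnetic field; real magnets need a lot of margin below Tc).
for sc, tc in superconductors_tc.items():
    usable = [name for name, bp in cryogens_bp.items() if bp < tc]
    print(f"{sc}: Tc ~ {tc} K; candidate coolants: {', '.join(usable)}")
```

In practice, "HTS magnets" in fusion usually means REBCO tape run at around 20 K, where it can still carry a lot of current in a strong field; the "high temperature" label refers to the material class, not to nitrogen-cooled operation.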
Yeah, I have listened to an interview with one of the people working at Wendelstein, and they said that the startup uses "high-temperature superconductors". They didn't go into any detail though, so no idea what exactly they meant.
The first step is always the hardest. You have to start somewhere. You don't start with something fully scalable right away; you have to work towards it.
Yeah, pretty much my understanding as well. I don't think anyone has a real notion of what it would take to generate power from inertial fusion, or whether it would even be practical.
Same, I never understood the scope of this lasers-on-a-pellet approach, other than as a starting point. I like the Polywell concept more.