I'm curious. I think the whole thing (space-based compute) is infeasible and stupid for a bunch of reasons, but even a class-A amplifier has a theoretical limit of 50% efficiency, and I thought we used class-C amplifiers (with practical efficiencies above 50%) in FM/FSK/etc. applications, where amplitude distortion can be filtered away. What drives these systems down to 10%?
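For reference, those class limits fall out of the standard textbook formula for ideal drain efficiency as a function of conduction angle (assuming an ideal device and a tuned load that filters the harmonics). A quick sketch:

```python
import math

def ideal_efficiency(theta):
    """Ideal drain efficiency of a reduced-conduction-angle PA.

    theta: conduction angle in radians.
    2*pi = class A, pi = class B, < pi = class C.
    Standard result for an ideal device with a tuned (harmonic-filtering) load.
    """
    return (theta - math.sin(theta)) / (
        4 * (math.sin(theta / 2) - (theta / 2) * math.cos(theta / 2))
    )

print(ideal_efficiency(2 * math.pi))  # class A: 0.5
print(ideal_efficiency(math.pi))      # class B: pi/4, about 0.785
print(ideal_efficiency(math.pi / 2))  # class C example: about 0.94
```

So even an idealized class B tops out near 78.5%, and class C approaches 100% only as the conduction angle (and output power) shrinks toward zero; real amplifiers sit well below these curves once device and matching losses are counted.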
Nowadays such microwave power amplifiers should be built with gallium nitride (GaN) transistors, which allow better efficiencies than older amplifiers using LDMOS or travelling-wave tubes, and even those reached efficiencies over 50%.
For beamformers, research papers in recent years have claimed large reductions in losses, but presumably the Starlink satellites still use more mature technology with higher losses.