Smartphones aside, little Ryzen 6000 boxes would be OK.
Used DDR5 laptops with a little discrete GPU would be even better. I have one with a broken screen that may be dedicated to this very task.
You could maybe run something on the 4GB Jetson Nano?
But very slow: https://www.reddit.com/r/LocalLLaMA/comments/12c7w15/the_poi...
A 32GB+ DDR5 laptop with a dGPU and some VRAM will (IIRC, just barely) run Llama 70B for far less money and at a similar TDP.
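For anyone curious how that works in practice: a minimal sketch using llama-cpp-python, where a quantized 70B GGUF sits mostly in system RAM and only a handful of layers are offloaded to the small dGPU. The model filename, layer count, and quant level are placeholders, not a tested config.

  from llama_cpp import Llama

  # Hypothetical paths/values: pick a quant that fits your RAM and a layer
  # count that fits your dGPU's VRAM.
  llm = Llama(
      model_path="llama-2-70b.Q3_K_M.gguf",  # heavily quantized; lives mostly in system RAM
      n_gpu_layers=16,                        # offload only what the small dGPU can hold
      n_ctx=2048,
  )

  out = llm("Q: Why run a 70B model locally? A:", max_tokens=64)
  print(out["choices"][0]["text"])

Token throughput will still be bound by system memory bandwidth, which is why it's "just barely" usable rather than fast.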