Keras was great as a friendly layer over painful libraries like Theano, which was similar to early TF. Btw, many Theano users were already using a higher-level library called Lasagne, which was similar to Keras.
When I switched to TF in 2016, Keras was still in its infancy, so I wrote a lot of low-level TF code (e.g. my own batchnorm layer), but many new DL researchers struggled with TF because they didn't have Theano experience. That steep learning curve led to the rise of Keras as the friendly TF interface.
It's also a bit ahistorical, in that Keras was already very popular with the Theano backend before that project wound down.