Are you running maximum settings, and at what resolution? Keep in mind that actually hitting the power limit makes the card downbin its clocks and/or voltage, which hurts performance, so it's actually a good thing you're NOT hitting the power limit.
A perfect example of the card hitting a power limit while drawing nowhere near its rated power limit is in this screenshot:
Now I'm running at 5120 x 1440 (7.37M pixels, so almost 4K). My GPU is set to 2160MHz @ 1068mV - technically stock voltage, but it's commanded to stay at the lowest bin of the stock range, so it's an OC/undervolt: lower power draw, lower temps, faster clocks. The vRAM is set at +1000MHz, so the card is pretty much at full tilt, as high as I can go without losing stability or getting artifacts. As you can see, I'm getting a whoppin' 32fps (settings absolutely maxed, RT fully on with the last setting on Psycho, DLSS on Quality because it just plain looks better than native 95% of the time) - and POWER is showing as a limiting factor, despite the card only drawing 423W according to the HWiNFO readout, with my power limit slider maxed, which should mean a 520W power limit.
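For anyone checking the "almost 4K" claim, the pixel math works out like this (quick sanity-check script, no GPU required):

```python
# Pixel counts for 32:9 ultrawide vs. standard 4K UHD.
ultrawide = 5120 * 1440   # 7,372,800 pixels
uhd_4k = 3840 * 2160      # 8,294,400 pixels

print(f"5120x1440: {ultrawide:,} px")
print(f"3840x2160: {uhd_4k:,} px")
print(f"Ultrawide is {ultrawide / uhd_4k:.0%} of 4K's pixel load")
```

So the card is pushing roughly 89% of a true 4K workload every frame.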
Because I'm hitting a power limit somewhere in the card's brain, I'm only hitting 2130MHz in that scene. Just an example of why you don't want to see the power limit flag when you can avoid it.
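If you want to see exactly which limiter the driver thinks is active (rather than guessing from overlay readouts), NVIDIA's own tool reports it. A small sketch, assuming an NVIDIA card with nvidia-smi on the PATH - depending on driver version the section is labeled "Clocks Throttle Reasons" or "Clocks Event Reasons", with entries like "SW Power Cap":

```python
# Sketch: ask the NVIDIA driver for its performance/throttle report.
# Assumes nvidia-smi is installed; returns None on machines without it.
import shutil
import subprocess

def throttle_reasons():
    """Return nvidia-smi's PERFORMANCE query output, or None if unavailable."""
    if shutil.which("nvidia-smi") is None:
        return None
    result = subprocess.run(
        ["nvidia-smi", "-q", "-d", "PERFORMANCE"],
        capture_output=True, text=True,
    )
    return result.stdout

if __name__ == "__main__":
    report = throttle_reasons()
    print(report if report else "nvidia-smi not found on this machine")
```

On a card that's power-capped mid-scene, the power cap entry in that report flips to Active even when the overlay wattage looks like it has headroom.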
Doing an undervolt/overclock can actually net you some nice performance gains while simultaneously lowering your load temperatures and power draw.
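The intuition, for anyone curious: dynamic power scales roughly with frequency times voltage squared, so shaving even one voltage bin can offset a clock bump. A back-of-envelope sketch - the "stock" curve point below is an assumption for illustration, not measured from my card; only the 2160MHz @ 1068mV point comes from my settings above:

```python
# Rough dynamic-power model: P is proportional to f * V^2.
# Stock point (2100MHz @ 1.093V) is a hypothetical Ampere boost bin.
def relative_power(freq_mhz, volts):
    return freq_mhz * volts ** 2

stock = relative_power(2100, 1.093)   # assumed stock boost point
uv_oc = relative_power(2160, 1.068)   # the 2160MHz @ 1068mV point from above

print(f"UV/OC draws ~{uv_oc / stock:.0%} of stock power")
print(f"while clocking {2160 / 2100 - 1:.0%} higher")
```

Under those assumptions the undervolted profile runs about 3% faster for slightly less power, which is the whole appeal of flattening the voltage/frequency curve.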
I think the highest draw I've seen so far in Cyberpunk is around the 470W range with these settings. Oddly, that did not trigger the POWER limit flag. Ampere are fickle creatures.