I would have to disagree with you, Jan. Breaking up complex CISC instructions into simpler micro-ops was done both to enable superscalar execution and to improve pipelining. Intel's designers themselves did not see it as an attempt to be more RISC-like.
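To make the micro-op point concrete, here is a toy sketch (purely illustrative; all names are made up and real decoders are vastly more complex) of how a single x86-style read-modify-write instruction can be split into three simpler micro-ops that a pipeline can schedule more easily:

```python
# Toy illustration only: splitting a CISC-style read-modify-write
# instruction into RISC-like micro-ops. Hypothetical names throughout;
# this is not how any real decoder is implemented.

def decode(instruction):
    """Decode a pretend 'ADD [mem], reg' into simpler micro-ops."""
    op, dest, src = instruction          # e.g. ("ADD", "[rax]", "rbx")
    if dest.startswith("["):             # memory destination => read-modify-write
        return [
            ("LOAD",  "tmp", dest),      # read the memory operand into a temporary
            ("ADD",   "tmp", src),       # simple register-register add
            ("STORE", dest,  "tmp"),     # write the result back to memory
        ]
    return [(op, dest, src)]             # register-only forms stay as one micro-op

print(decode(("ADD", "[rax]", "rbx")))
# [('LOAD', 'tmp', '[rax]'), ('ADD', 'tmp', 'rbx'), ('STORE', '[rax]', 'tmp')]
```

Each micro-op is a small, fixed-format operation, which is exactly what makes it easier to pipeline and to issue several per cycle. But note this is a micro-architecture trick; the x86 ISA that software sees does not change.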
But there are different perspectives on this. It depends on what lens you look at RISC through: is it a philosophy about how you design an instruction set, or a philosophy about the micro-architecture? It is a bit of both. You ignore the ISA perspective here.
Either way, you can really just consider this a countermove against the obvious advantages of RISC. Making the x86 micro-architecture more RISC-like was never going to make x86 inherently superior to a RISC processor. It was just an attempt to level the playing field.
Without Intel's high-volume production and access to smaller fabrication nodes, it would not have been enough to beat the RISC workstations.
Had the RISC workstation vendors had the same volume, profits, and access to the same manufacturing capability as Intel, they would not have been outcompeted just because Intel copied a few RISC tricks inside its processors.
> x86 isn't going to "magically" disappear in the foreseeable future.
Well, we still have 6502 chips around, but they aren't relevant. Likewise, in 10 years x86 chips will no longer be very relevant in the market.
Neither AMD nor Intel has some magical fairy dust that will keep x86 competitive with Arm. x86 is already being outcompeted on price/performance by various Arm alternatives in most relevant markets.
The only things still keeping x86 around are momentum and legacy, and that is not enough to keep it around forever. Eventually that momentum will run out of steam.