styckx,
About the lack of GPU use, I'm wondering if there's some kind of silent failure going on. I remember reading that DirectX has a software fallback mode; it does have a name but I can't recall it. Microsoft introduced it in DirectX and Windows Presentation Foundation. WPF is hardware accelerated via DirectX (9, I think) and is used for the user interfaces of some Windows apps.
The idea is that there are lots of graphics cards with different performance and different feature sets, which is a headache for the programmer: they have to check what the card can and can't do and then branch their code accordingly. The more cards to support, the more complex the code gets. To ease the problem, MS introduced a fallback: if the programmer asks the card to do something it can't, DirectX does it in software instead. The result is that the program still runs, just a lot slower. Much better than just crashing.
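Roughly, the capability-checking dance looks something like this in Direct3D 9 (just an illustration from memory of the DX9 docs, nothing to do with RW3's actual code):

```cpp
#include <d3d9.h>

// Ask the default adapter's hardware (HAL) device what it can do and
// branch on the answer -- here, whether it supports Shader Model 2.0.
bool SupportsShaderModel2(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;

    // Older cards only report ps_1_x here, so the app would have to
    // take a different code path for them.
    return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
}
```

Multiply that by every feature you care about and every card you want to support and you can see why MS wanted a fallback.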
So, as a speculative hypothesis: what if RW3 is asking the card to do something it can't, or supplying data in the wrong format? The card rejects it, DirectX detects the problem and does its fallback. Wham! Unplayableville, as you say :)
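The kind of silent fallback I'm imagining would look a bit like this (again, only a sketch based on the standard DX9 device creation pattern, not anything I know about RW3):

```cpp
#include <d3d9.h>

// Try the hardware (HAL) device first; if that fails, quietly drop back to
// the software reference rasteriser (REF). The game still renders correctly,
// just painfully slowly and with almost no GPU activity.
IDirect3DDevice9* CreateDeviceWithFallback(IDirect3D9* d3d, HWND hwnd)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;

    IDirect3DDevice9* device = nullptr;

    // First choice: hardware rasterisation with hardware vertex processing.
    if (SUCCEEDED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                                    D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                    &pp, &device)))
        return device;

    // Fallback: everything done on the CPU. From the outside it just looks
    // like low frame rates and an idle GPU -- much like what you're seeing.
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_REF, hwnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                      &pp, &device);
    return device;
}
```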
Of course it is just speculation and I've never even seen a fallback, let alone created one. (sorry)
But it's clear that a scene is being rendered (polygons, textures, lights and so on), so there should be some GPU activity.
Best regards,
Shadders.