This is just plain nonsense. First of all, I play at 1920x1200 with my 2GB GTX 960 on full Ultra settings in several games, including Elite, The Witcher 3, Fallout 4, and GTA 5, just to name a few. I get a solid 60 FPS in all of those, with the exception of GTA 5 when there are too many cars on screen; then it drops to "only" 50 or so FPS. Second, there's no reason whatsoever why these cards would "not perform well in the future," assuming tasks reasonably similar to existing ones. Sure, if you try to play a game released two years from now, you may hit some limits, but that's the case with any video card over time. Even if you're talking about VR, Oculus specifically states that a 970 is sufficient.
As for your claim that AMD is inherently better but limited by drivers, that's bunk as well. Sure, you may not get every single feature on all cards, but what matters is whether any games are going to require those features anytime soon. AMD has a tendency to add extra bullet-point features and then gate them via drivers, because that's their strategy for pushing products after changing up their business model. NVidia, OTOH, didn't bother putting certain things in current cards because there would be no point whatsoever for end users. As for NVidia making "bad drivers," I think you need to find some actual evidence of that if you're going to throw it around. I haven't had any issues with any of my NVidia-based cards, while I've seen driver problems on the Radeon side as far back as when they were still ATI. Even then, that hasn't always been the case, and anyone can ship a bug that needs fixing at some stage, no matter who they are.
Note that I am not an NVidia fanboy; some of my rigs have AMD cards and others NVidia. Both are suitable for the task at hand, so long as you don't ask more of them than they're capable of. When you start off by spewing nonsense about NVidia cards not being capable of resolutions that I happen to use daily, though, you lose all credibility in my eyes.