I didn't think it mattered if the FPS matched the "interval", or perhaps it does now given that Direct Mode is cranky (doesn't work at all for me). According to Carmack the idea is that if a frame isn't finished by the time you need to present one, no problem: the Rift takes the current head tracking pose and performs an image warp on the previous frame.
Now I think this is a problem on Windows in extended mode - something to do with the actual interrupt interval being messed up if the refresh rates of your two displays are different (my primary display is 60Hz, Rift is 75Hz of course).
Or something...
The feature you're talking about is called "asynchronous timewarp" (ATW). As of today ATW is not yet released on the PC, only on the GearVR (Samsung Note 4). Regular timewarp (TW) is on PC, which leads to some confusion. TW applies last-moment corrections to the currently rendered frame before it is sent to the display. But TW can't create "new" frames to send in between. That is where ATW comes into the picture. Essentially ATW will be used to smooth out your framerate, while TW just makes sure the current frame matches your head position as well as possible. Both important, both with similar names and similar procedures, but different purposes.
Their problem is they have to make ATW work for not only one specific handset, but across all PC hardware configurations. Here's a quote from an article: "[...] there is the whole matrix that we’ll have to work through; NVIDIA, AMD, Intel and then PC, Mac and Linux. That’s a lot of work [...]". So until they get that done and release it (which might take many more months), you absolutely need 75fps coming out of the game.
BTW, there is nothing that stops a developer from implementing a form of ATW on their own. The basic idea is as simple as having separate render and output loops. If no new rendered image is available when the time for the next frame comes, apply the timewarp method to the previous frame again and send that. Of course it is not quite as simple in reality, but the problems Oculus is facing are at a low system and driver level, so that it just works with all hardware etc. - a single dev could patch their own game code more easily.
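To make the "separate render and output loops" idea concrete, here is a minimal sketch of the scheduling logic in Python. All names and rates are made up for illustration (this is not Oculus SDK code): a slow render thread produces frames against the pose it saw at render start, while a fixed-rate output loop presents every "vsync" - taking a fresh frame when one is ready, otherwise re-warping the previous frame to the newest head pose.

```python
import queue
import threading
import time

# Illustrative rates: the panel refresh vs. a deliberately slow renderer,
# similar to throttling the framerate with "J" in OculusWorldDemo.
VSYNC_HZ = 75
RENDER_HZ = 10

latest_pose = 0.0                         # stand-in for the head-tracking state
frame_queue = queue.Queue(maxsize=1)      # renderer hands frames to the output loop

def render_loop(stop):
    """Producer: renders slowly, each frame against the pose at render start."""
    frame_id = 0
    while not stop.is_set():
        pose_at_render = latest_pose
        time.sleep(1.0 / RENDER_HZ)       # pretend the GPU is busy this long
        frame_id += 1
        try:
            frame_queue.put_nowait((frame_id, pose_at_render))
        except queue.Full:
            pass                          # output loop hasn't consumed the last one

def output_loop(n_vsyncs):
    """Consumer: every vsync, present something - a fresh frame if available,
    otherwise the previous frame re-warped to the current pose."""
    global latest_pose
    last_frame = (0, 0.0)
    fresh = warped = 0
    for _ in range(n_vsyncs):
        latest_pose += 0.01               # the head keeps moving regardless
        try:
            last_frame = frame_queue.get_nowait()
            fresh += 1
        except queue.Empty:
            warped += 1                   # no new frame: reuse and re-warp the old one
        frame_id, _render_pose = last_frame
        presented = (frame_id, latest_pose)  # "timewarp": reproject to current pose
        time.sleep(1.0 / VSYNC_HZ)
    return fresh, warped

stop = threading.Event()
threading.Thread(target=render_loop, args=(stop,), daemon=True).start()
fresh, warped = output_loop(75)           # roughly one second of vsyncs
stop.set()
print(fresh, warped)                      # far more warped than fresh presentations
```

The point of the sketch is just the fallback branch: the output loop never blocks waiting on the renderer, so head tracking stays tied to the display rate even when rendering drops to 10 fps - which matches what you see in the OculusWorldDemo experiment below.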
EDIT: you can try a software implementation of this in the "OculusWorldDemo" sample of the SDK. You can lower the simulated framerate with the "J" key. Even if you go as low as 10 fps you will still get smooth head tracking in the 3D scene (although the overlays will stutter).