I used to work with Access databases, querying data tables stored on network shares, ODBC sources or SQL servers. I'm far from an expert on networking protocols, but as I understand it WiFi is geared up for burst activity (loading web pages, downloading files etc.) rather than continuous two-way data transfer like a long-running database query or P2P gaming.
Regardless of speed, a cable has stability & latency advantages over a wireless connection. The advantage of wireless is portability & convenience, at the expense of that stability.
Yes and no. If we're going to get into technical details, the primary reason why "wifi sucks for gaming" - or indeed any use that requires continuous two-way communication - became such an article of faith is that 802.11 specifies half-duplex transmission. A switched Ethernet hookup over UTP cable, by contrast, is full-duplex - on 10/100 links, transmit and receive even travel on physically separate pairs of conductors.
Half-duplex is nothing new to the lower layers of the network stack, though. Ethernet itself began life on a single shared physical connection, a coax cable segment, and the protocols for arbitrating a half-duplex data channel - CSMA/CD on the wire, CSMA/CA over the air - are so deeply embedded in networking that they are likely subject to continental drift.
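To make that concrete, here's a minimal toy sketch (in Python) of the carrier-sense-with-backoff idea those protocols share. All the timing numbers are made-up illustrations, not the real 802.11 parameters, and real CSMA/CA has collisions, retries and contention-window doubling that this deliberately leaves out:

```python
import random

# Toy model of carrier-sense multiple access (CSMA): every station wants
# the same shared half-duplex channel, so each one waits for the medium
# to go idle plus a random backoff before transmitting. Grossly
# simplified - just enough to show that the "half-duplex tax" is
# waiting time.

SLOT_US = 9       # one contention slot in microseconds (illustrative)
FRAME_US = 300    # time one frame occupies the channel (illustrative)

def send_one_frame_each(num_stations, seed=1):
    """Return the time each station spends waiting for the medium."""
    rng = random.Random(seed)
    channel_free_at = 0
    waits = []
    for _ in range(num_stations):
        backoff = rng.randint(0, 15) * SLOT_US
        start = channel_free_at + backoff   # must wait for an idle medium
        waits.append(start)
        channel_free_at = start + FRAME_US  # channel busy during the frame
    return waits

for n in (2, 5, 10):
    w = send_one_frame_each(n)
    print(f"{n:2d} stations: mean wait {sum(w)/n:.0f} us, worst {max(w)} us")
```

The point of the toy model: nothing here is broken, it's just queuing - which is why half-duplex only hurts when the channel is slow or crowded.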
Provided the total transmission bandwidth is fast enough, half-duplex comms do not suffer any inherent disadvantage over full-duplex. However, early 802.11 implementations were not really quick enough to avoid "wait states" while the physical channel became available - thus WiFi's reputation for latency and packet loss, which in those days was fully justified and showed up most with exactly the kind of usage we're talking about: sustained, moderate-to-high-bandwidth bidirectional traffic. As the maximum bandwidth of a WiFi connection improved, these issues were mitigated. If the wireless portion of the communication path imposes "duplexing delays" of the same or lower order of magnitude as the latency on the rest of the path, those delays will have no detectable effect at either end of the socket.
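To put rough numbers on that order-of-magnitude argument (both figures below are hypothetical assumptions for illustration, not measurements):

```python
# Hypothetical figures: how much a wireless hop's "duplexing delay" adds
# to a round trip, relative to the latency of the rest of the path.
internet_rtt_ms = 20.0    # assumed wired RTT to a nearby game server
duplex_wait_ms = 0.5      # assumed extra wait for the wireless channel

overhead = duplex_wait_ms / internet_rtt_ms
print(f"Total RTT {internet_rtt_ms + duplex_wait_ms:.1f} ms "
      f"({overhead:.1%} added by the wireless hop)")

# With a fast modern link the wireless hop adds ~2.5% here - lost in the
# noise. Swap in duplex_wait_ms = 20.0 for an early, congested 802.11b
# channel and the wireless hop doubles the round trip - very noticeable.
```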
That improvement is why WiFi's "defenders" pointed to the faster protocols as an argument that "WiFi no longer sucks for...", but bandwidth was only one of WiFi's problems, and boosting it did nothing to mitigate the second major one: WiFi used the 2.4GHz band. It shared that part of the spectrum with bluetooth, with cordless phones, with part of an amateur radio band and even with the output of the cavity magnetrons that are the heat source for consumer-grade microwave ovens (large commercial-grade ones use a different frequency band). Couple that with the fact that the 14 WiFi channels within the 2.4GHz band overlap considerably, and it is unsurprising that in such a crowded piece of spectrum, interference was responsible for some decidedly dodgy connectivity. All the innovations to increase bandwidth in this frequency band did nothing to ameliorate that.
The "solution" to that was to use the 5GHz band instead. At this higher frequency, channels can be "packed together" more tightly without crosstalk between them. Pretty much all WiFi kit is now capable of using this higher band in addition to the legacy 2.4GHz band. Unfortunately, here we run into the problem of the "consumer-grade potato"
Even today, a lot of the kit ISPs ship out prioritizes 2.4GHz. Even where it doesn't actively do so, it usually transmits the same SSID on both bands, which leads to many devices defaulting to the lower and poorer frequency. You CAN undo this particular bone-headed default, but you'll be delving into the "advanced config" of your router, and I'd wager the vast majority of home users have never even thought of going there - they see the "latest" set of letters appended to 802.11 and think it's all good, never realizing that their computer may be ignoring the fast lane even though it boasts the same set of magic letters in its own spec.
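If you want to check which band a machine actually joined, most systems will report the associated frequency somewhere. As one example, here's a small sketch for Linux that shells out to the standard `iw` utility (assuming it's installed, and assuming your interface is called wlan0 - adjust the name for your system):

```python
import re
import subprocess

def current_band(iface="wlan0"):
    """Report the frequency this interface is associated on, per `iw`."""
    out = subprocess.run(["iw", "dev", iface, "link"],
                         capture_output=True, text=True).stdout
    match = re.search(r"freq:\s*(\d+)", out)   # e.g. "freq: 5180"
    if not match:
        return "not associated"
    mhz = int(match.group(1))
    band = "5GHz" if mhz > 3000 else "2.4GHz"
    return f"{mhz} MHz ({band} band)"

print(current_band())
```

If that prints something in the 24xx MHz range while you're sitting next to a dual-band router, you've found your missing fast lane.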
So yeah, a WiFi connection using modern kit that is properly set up can be perfectly OK for gaming or any other demanding application for which WiFi traditionally "sucks" - but it has to BE properly set up, that is still relatively rare to find in an out-of-the-box default config, and changing away from those defaults is not an exercise for the ignorant.