Skipping over the discussion, is there a chance that the netcode will never work successfully? That it's too ambitious for the technology?
Well, CryEngine (CE) was never too popular as network-friendly in the past, and "we" concluded that its engineers could not solve this problem for the past 15 years... However, here is the thing: over the past 5 years we have seen multicore CPUs become the usual standard and hardware improve a lot overall, plus internet connections are probably 2x (in reality maybe 5x) faster and more secure, and these things could be crucial in the area of netcode. But there are problems behind the curtains that are far more important, like latency, which is an unavoidable fact of online games, caused not only by network latency, which is largely out of a game's control, but also by latency inherent in the way game simulations are run. There are several lag compensation methods used to disguise, reduce, or cope with latency, but their feasibility varies by application...
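Just to make the lag compensation part less abstract, here is a minimal sketch of one common method, server-side rewind. This is only the general idea, not how CryEngine or any specific game actually implements it; names like PlayerHistory and rewoundHit, and the 64-snapshot buffer, are made up for illustration:

```cpp
#include <cstdint>
#include <deque>
#include <iostream>

// Hypothetical snapshot of one player's position at a given server tick.
struct Snapshot {
    uint32_t tick;
    float x, y;
};

// The server keeps a short history of snapshots so it can "rewind" a player
// to where the shooter saw them when the shot was actually fired.
struct PlayerHistory {
    std::deque<Snapshot> history;  // oldest first
    void record(const Snapshot& s) {
        history.push_back(s);
        if (history.size() > 64) history.pop_front();  // ~1 s of history at 64 ticks/s
    }
    // Newest snapshot at or before the requested tick.
    Snapshot at(uint32_t tick) const {
        Snapshot best = history.front();
        for (const Snapshot& s : history)
            if (s.tick <= tick) best = s;
        return best;
    }
};

// Lag-compensated hit test: rewind the target by the shooter's latency (in
// ticks) and test the shot against that past position, not the current one.
bool rewoundHit(const PlayerHistory& target, uint32_t serverTick,
                uint32_t shooterLatencyTicks, float shotX, float shotY) {
    Snapshot past = target.at(serverTick - shooterLatencyTicks);
    float dx = shotX - past.x, dy = shotY - past.y;
    return dx * dx + dy * dy < 0.25f;  // hit if within half a unit
}

int main() {
    PlayerHistory target;
    for (uint32_t t = 0; t < 10; ++t)
        target.record({t, float(t), 0.0f});  // target moves 1 unit per tick along +x
    // A shooter 3 ticks behind the server aims at where the target was at tick 6.
    std::cout << std::boolalpha << rewoundHit(target, 9, 3, 6.0f, 0.0f) << "\n";  // true
}
```

The trade-off is exactly the "disguise" mentioned above: the shooter gets hits that feel fair, while the victim can get hit after they thought they were behind cover.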
So I will quote the Wikipedia article on netcode below:
"A single update of a game simulation is known as a tick. The rate at which the simulation is run on a server is referred often to as the server's tickrate; this is essentially the server equivalent of a client's frame rate, absent any rendering system. Tickrate is limited by the length of time it takes to run the simulation, and is often intentionally limited further to reduce instability introduced by a fluctuating tickrate, and to reduce CPU and data transmission costs. A lower tickrate increases latency in the synchronization of the game simulation between the server and clients. Tickrate for games like first-person shooters can vary from 60 ticks per seconds for games like Quake or Counter-Strike: Global Offensive in competitive mode to 30 ticks per seconds for games like Battlefield 4 and Titanfall.[citation needed] A lower tickrate also naturally reduces the precision of the simulation, which itself might cause problems if taken too far, or if the client and server simulations are running at significantly different rates.
Games may limit the number of times per second that updates are sent to a particular client, and/or are sent about particular objects in the game's world. Because of limitations in the amount of bandwidth available, and the CPU time that's taken by network communication, some games prioritize certain critical communication while limiting the frequency and priority of less important information. As with the tickrate, this effectively increases the synchronization latency. Game engines may also reduce the precision of some values sent over the network to help with bandwidth use; this lack of precision may in some instances be noticeable.
Various simulation synchronization errors between machines can also fall under the "netcode issues" blanket. These may include bugs which cause the simulation to proceed differently on one machine than on another, or which cause some things to not be communicated when the user perceives that they ought to be. Traditionally, real-time strategy games have used lock-step peer-to-peer networking models where it is assumed the simulation will run exactly the same on all clients; if, however, one client falls out of step for any reason, the desynchronization may compound and be unrecoverable."
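To make the tickrate part of the quote a bit more concrete, here is a minimal sketch of a fixed-timestep server loop. This is an assumption of how such a loop is commonly structured, not any particular engine's code:

```cpp
#include <chrono>
#include <cstdint>
#include <iostream>
#include <thread>

// Hypothetical fixed-timestep server loop: the simulation always advances in
// whole ticks (30 per second here), regardless of how fast the loop itself runs.
int main() {
    using clock = std::chrono::steady_clock;
    constexpr int tickrate = 30;  // ticks per second, like Battlefield 4 in the quote
    const auto tickDuration = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / tickrate));

    auto nextTick = clock::now();
    uint64_t tick = 0;

    while (tick < 30) {  // run ~1 second for the demo
        // 1. Read queued client input (omitted).
        // 2. Advance the game simulation by exactly one tick.
        ++tick;
        // 3. Send state updates to clients, possibly less often than every tick.
        if (tick % 3 == 0)
            std::cout << "snapshot sent at tick " << tick << "\n";

        // Sleep until the next tick boundary. A tick that takes too long eats
        // into the next one's budget, which is the instability the quote mentions.
        nextTick += tickDuration;
        std::this_thread::sleep_until(nextTick);
    }
}
```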
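The bit about reducing the precision of values sent over the network can also be shown with a toy example. The 16-bit encoding below is made up for illustration, but it shows why the value that arrives is close to, not exactly, the one that was sent:

```cpp
#include <cstdint>
#include <iostream>

// Hypothetical wire encoding: instead of a 4-byte float, send a 2-byte integer
// covering a 0..1000 m range, which gives roughly 1.5 cm of precision.
uint16_t encodePosition(float meters) {
    return static_cast<uint16_t>(meters / 1000.0f * 65535.0f + 0.5f);
}

float decodePosition(uint16_t wire) {
    return wire / 65535.0f * 1000.0f;
}

int main() {
    float original = 123.456f;
    float roundTripped = decodePosition(encodePosition(original));
    // Half the bandwidth, but the decoded value is only approximately the one
    // that was sent - the "lack of precision" the quoted passage refers to.
    std::cout << original << " -> " << roundTripped << "\n";
}
```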
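And for the last quoted paragraph, a toy lockstep example: both peers apply the same inputs to the same state and compare a checksum every tick; once one peer diverges, every later checksum disagrees too, which is the compounding desync the quote mentions. Again, hypothetical code, not taken from any real RTS:

```cpp
#include <cstdint>
#include <iostream>

// Toy lockstep simulation: two peers apply the same inputs to the same state
// and exchange a per-tick checksum. The moment the checksums differ, the peers
// have desynchronized, and in a pure lockstep model the error never goes away.
struct GameState {
    int32_t unitX = 0;
    void applyInput(int32_t move) { unitX += move; }
    uint32_t checksum() const { return static_cast<uint32_t>(unitX) * 2654435761u; }
};

int main() {
    GameState peerA, peerB;
    const int32_t inputs[] = {1, 2, 3, 4};

    for (int tick = 0; tick < 4; ++tick) {
        peerA.applyInput(inputs[tick]);
        // Pretend peer B has a bug at tick 2 (floating point drift, uninitialized
        // data, a missed input - anything that breaks determinism).
        peerB.applyInput(tick == 2 ? inputs[tick] + 1 : inputs[tick]);

        if (peerA.checksum() != peerB.checksum())
            std::cout << "desync at tick " << tick << " (and every tick after it)\n";
    }
}
```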
Not sure if this explanation clears up the bigger picture a bit, but at least it can give you some idea of how complex the netcode issue really is after all...