Patch Notes Update Horizons Beta 3.1 AMD Hotfix Incoming 6.30 GMT

Status
Thread Closed: Not open for further replies.
The hardware is a "now" thing, certainly not 5 years away. I don't mean to be disparaging of your system, which is quite low end.
Normally I don't advertise my system, as I think system sigs are a bit sad, but for example my i7 4790K (4-4.4 GHz), twin GTX 970s, 16 GB RAM, OCZ SSD based system, which while capable is far from a current "top end" system, gets 100-120 fps inside stations and 140-150 in SC with all graphics settings maxed @ 1920 x 1080 and SS (super sampling) set to 2. When I run 3 monitors @ 5890 x 1080 and drop SS back to 1, leaving all other settings maxed, I get similar to slightly better framerates; with SS at 1.5, slightly lower framerates. On all settings the gameplay is as smooth as glass.
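For a rough sense of the pixel workload those settings imply, here is a minimal back-of-the-envelope sketch, assuming the supersampling (SS) factor scales each axis of the base resolution:

```python
# Approximate pixels shaded per frame, assuming the SS factor multiplies each axis.
def render_pixels(width, height, ss=1.0):
    return int(width * ss) * int(height * ss)

single_ss2 = render_pixels(1920, 1080, ss=2.0)  # single monitor, SS 2.0
triple_ss1 = render_pixels(5890, 1080, ss=1.0)  # triple-monitor surround, SS 1.0

print(f"Single @ SS 2.0: {single_ss2:,} px")  # ~8.3 million
print(f"Triple @ SS 1.0: {triple_ss1:,} px")  # ~6.4 million
```

On those numbers the triple-screen setup at SS 1 actually shades slightly fewer pixels per frame than the single screen at SS 2, which is consistent with the similar framerates described above.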



There are i7's and there are i7's though, mate, and your graphics chip is getting a bit long in the tooth as well. Toss a 970 or a 980 in it and you will really notice the difference.

Well, if you are getting framerates of 30 or above it should be smooth. 30 frames per second is what TV shows and movies have been filmed at for years and they are smooth. You start getting jerkiness below 30 fps. So long as I can consistently get 30 fps or above, I'm happy. Sure, 60 or even 90 fps is good, but it's the same as listening to an MP3 encoded at 256 kbit/sec as opposed to 320 kbit/sec. Unless you have ears with the sensitivity of a dog, you won't hear any difference, but the 320 kbit/sec file takes substantially more space.
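As a rough illustration of that bitrate comparison, here is a quick calculation for a hypothetical 4-minute track (ignoring metadata):

```python
# Back-of-the-envelope MP3 sizes for a hypothetical 4-minute (240 s) track.
def mp3_size_mb(bitrate_kbps, seconds):
    # kilobits/s * seconds -> kilobits, /8 -> kilobytes, /1000 -> megabytes
    return bitrate_kbps * seconds / 8 / 1000

for rate in (256, 320):
    print(f"{rate} kbit/s: {mp3_size_mb(rate, 240):.2f} MB")
# 256 kbit/s: 7.68 MB; 320 kbit/s: 9.60 MB -- about 25% larger for the higher bitrate.
```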

And yeah, I know my machine is quite low end. I'm going to be getting a better one that will be almost recommended specs. Not quite: they recommend an AMD R9 280X. The machine I'm looking at has an R9 280, but with a six-core processor.
 

Robert Maynard

Volunteer Moderator
Well, if you are getting framerates of 30 or above it should be smooth. 30 frames per second is what TV shows and movies have been filmed at for years and they are smooth.

Movies are typically filmed at 24 FPS - except "The Hobbit" which was filmed at 48 FPS. NTSC uses a frame-rate of 30 FPS (so a lot of US TV shows are 30 FPS); PAL uses 25 FPS.
 
Well, if you are getting framerates of 30 or above it should be smooth. 30 frames per second is what TV shows and movies have been filmed at for years and they are smooth.

Except that movies and TV shows rely on certain camera shutter speeds, which aren't a factor in games. Take the 24 frames per second in movies, for example. Each frame is usually exposed for 1/48th of a second. That's pretty slow. The result: quite a lot of motion blur, which helps blend the individual frames together and suggests fluid motion. The individual frames in games, however, are tack sharp and as a result much easier for the eye to discern. The result: 24 fps in a game is pretty jerky, and unlike with movies/TV shows we need quite a bit higher frame rates for really fluid motion.
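To put a rough number on that, here is a minimal sketch assuming a 180-degree shutter (1/48 s exposure at 24 fps) and a hypothetical object crossing a 1920-pixel-wide frame in two seconds:

```python
# Motion smear within one frame: exposure time * on-screen speed.
speed_px_per_s = 1920 / 2          # object crosses the frame in 2 s -> 960 px/s

film_exposure_s = 1 / 48           # 24 fps with a 180-degree shutter
game_exposure_s = 0.0              # a rendered game frame is an instantaneous sample

print(f"Film blur per frame: {speed_px_per_s * film_exposure_s:.0f} px")  # ~20 px
print(f"Game blur per frame: {speed_px_per_s * game_exposure_s:.0f} px")  # 0 px, tack sharp
```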

The often-touted myth that our eyes can't differentiate framerates higher than 24 or 30 is based on a misconception of how movies create fluid motion (see above). We can easily discern the difference between 40, 60, 100 frames per second and even higher, provided the display's refresh rate supports framerates that high.
 
30 frames per second is what TV shows and movies have been filmed at for years and they are smooth.

30FPS 720p:
www.youtube.com/watch?v=YE7VzlLtp-4


60FPS 4K:
www.youtube.com/watch?v=aqz-KE-bpKQ


Above roughly 16 FPS the human eye more or less can't tell how many frames there are in a second, but below that it can. Above that amount, what we see is actual motion; we don't see in frames (I read that a lot over the internet...). You can tell whether the motion is more fluid or not, but that's about all (as in the videos above).

In gaming the whole story is different, because the motion you see is one thing, but how it is rendered and processed in real time by your GPU/CPU so you can actually play is a totally different thing. Please don't compare real-time 3D rendering with a movie.

If we want to talk about frames we should also talk about Hertz, but we could be here all night.

In games you always want "more frames" per second because of latency and the overall responsiveness of your controls; it is as simple as that. 30 FPS is "playable" because the motion is fluid enough for the human eye, and that limit is used on consoles because it lets the hardware do more by capping how many frames the GPU has to render in real time.

More frames means more processing power required to render. More graphical features means more processing power needed to run at higher frame rates. Why higher frame rates? To play with the most responsive input possible in real time.

Etc...
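As a minimal sketch of the latency side of that argument, frame time alone puts a floor on how quickly the picture can react to your input:

```python
# Frame time is a lower bound on the rendering portion of input lag.
for fps in (30, 60, 100, 144):
    print(f"{fps:>3} FPS -> {1000 / fps:5.1f} ms per frame")
# 30 FPS -> 33.3 ms, 60 FPS -> 16.7 ms, 100 FPS -> 10.0 ms, 144 FPS -> 6.9 ms
```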
 
I get terrible frame rates at 1920 x 1080. I've dropped down to mid graphics and 1280 x 768. Graphics look pretty much the same, a bit less detail on planetary surfaces, but not that much, and the game is much more responsive. Also, I get about 24 to 30 fps in SC. Of course, my GPU is minimum spec: a Radeon R7 240 2GB. I'll be upgrading in a few weeks to at least a Radeon R9 280, although I'd like an R9 280X. But I still don't think I'll be able to run 1920 x 1080 on high settings. This game seems to be future proofed; there isn't quite enough power on the market currently, but there will be. In 5 years we'll probably be able to run 1920 x 1080 on Ultra smoothly.
I was merely pointing out that you are wrong about the current state of technology, and that you can play ED smoothly now, easily, not in 5 years lol.
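For a rough sense of how much the resolution drop described in the quote above reduces the GPU's workload, a quick calculation:

```python
# Pixels per frame at the two resolutions mentioned above.
full_hd = 1920 * 1080   # 2,073,600 px
reduced = 1280 * 768    #   983,040 px
print(f"1920x1080: {full_hd:,} px")
print(f"1280x768:  {reduced:,} px ({full_hd / reduced:.1f}x fewer pixels to shade)")
```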

Well, if you are getting framerates of 30 or above it should be smooth. 30 frames per second is what TV shows and movies have been filmed at for years and they are smooth. You start getting jerkiness below 30 fps. So long as I can consistently get 30 fps or above, I'm happy. Sure, 60 or even 90 fps is good, but it's the same as listening to an MP3 encoded at 256 kbit/sec as opposed to 320 kbit/sec. Unless you have ears with the sensitivity of a dog, you won't hear any difference, but the 320 kbit/sec file takes substantially more space.

And yeah, I know my machine is quite low end. I'm going to be getting a better one that will be almost recommended specs. Not quite: they recommend an AMD R9 280X. The machine I'm looking at has an R9 280, but with a six-core processor.

In gaming, a high frame rate doesn't necessarily mean smooth gameplay. It depends on the quality of your OS install, the state of your registry, optimisations, system storage, seek times, bottlenecks, drivers and a myriad of other factors. Lots of players have complained about stuttering in ED even with high frame rates.
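One way to see why average FPS can hide stutter: a handful of long frames barely move the mean but are exactly what you notice. A minimal sketch with made-up frame times:

```python
# Two seconds of hypothetical gameplay: mostly 16.7 ms frames plus two long hitches.
frame_times_ms = [16.7] * 118 + [90.0, 120.0]

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
print(f"Average FPS: {1000 / avg_ms:.0f}")            # ~55 FPS on paper
print(f"Worst frame: {max(frame_times_ms):.0f} ms")   # 120 ms -- a clearly visible stutter
```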
Don't think that I am an Intel fanboi; I built several AMD-based systems in the late nineties and noughties, from a K6-2 to a K6-III and from the Athlon up to the Athlon XP, and would still have an AMD system if they were comparable to Intel, as I believe in supporting competition. Sadly, at the moment they are not comparable, although I am sure plenty of people will disagree.
You could get an i5 4690 system with a GTX 960 or even a 970 for around the same cost, which would run ED easily and Star Citizen quite well if you want to go that route. It seems that you are failing to future-proof yourself with the system you intend to buy and would most likely be looking to upgrade again within 2 years.
I do admit to being an Nvidia fanboi though, and wouldn't have an AMD card in my system if you paid me. :p
 
Hi everyone,

we're deploying a hotfix for the AMD issues seen in the Beta 3 release. This will be available at 6.30pm GMT. There will be no server downtime.

Michael

I assume this 'AMD' issue relates to AMD GPUs - or does it also refer to known problems with AMD CPUs?
 
30FPS 720p:
www.youtube.com/watch?v=YE7VzlLtp-4


60FPS 4K:
www.youtube.com/watch?v=aqz-KE-bpKQ


Above roughly 16 FPS the human eye more or less can't tell how many frames there are in a second, but below that it can. Above that amount, what we see is actual motion; we don't see in frames (I read that a lot over the internet...). You can tell whether the motion is more fluid or not, but that's about all (as in the videos above).

In gaming the whole story is different, because the motion you see is one thing, but how it is rendered and processed in real time by your GPU/CPU so you can actually play is a totally different thing. Please don't compare real-time 3D rendering with a movie.

If we want to talk about frames we should also talk about Hertz, but we could be here all night.

In games you always want "more frames" per second because of latency and the overall responsiveness of your controls; it is as simple as that. 30 FPS is "playable" because the motion is fluid enough for the human eye, and that limit is used on consoles because it lets the hardware do more by capping how many frames the GPU has to render in real time.

More frames means more processing power required to render. More graphical features means more processing power needed to run at higher frame rates. Why higher frame rates? To play with the most responsive input possible in real time.

Etc...

Yeah, with those 2 videos I can tell the difference in the quality, but not in the motion. Motion-wise they both look the same, although the 4K one is more visually pleasing with the color and textures.

As for playing with the best input, I do agree. It does make it more responsive with higher framerates.
 
Yeah, with those 2 videos I can tell the difference in the quality, but not in the motion. Motion-wise they both look the same, although the 4K one is more visually pleasing with the color and textures.

As for playing with the best input, I do agree. It does make it more responsive with higher framerates.

No they don't. There is a huge difference and I feel really sorry that you can't see it :]
 
For anyone still wondering what this fix was about:
Planetary texture errors like this on AMD cards are now solved:
Screenshot_0298.jpg
 
No they don't. There is a huge difference and I feel really sorry that you can't see it :]

I agree with you concerning the actual texture quality. In the 2nd one it is stunning compared to the 1st. Motion? I see no jerkiness in either one. Maybe there's something wrong with my eyes, but the motion looks fluid in both. Of course, you can see individual hairs move in the 2nd when you can't in the 1st; if that's what you are talking about, then sure. I was talking about big-picture motion. Of course the fine-detail motion will be better in the 2nd one, because in the 1st the resolution isn't there to even see the extremely fine details.
 
I think 30 frames is way too low once the picture starts to move. If I play racing sims, for example, I prefer 100 fps+ because it is necessary to see the track and the braking points as they come, not stutter in. Even 60 FPS is not enough. I think it's the same with all games that have fast motion. TBH, you will see a difference between a 25 Hz TV and a 100 Hz TV. If you don't believe it, try it and watch a football game, or something with fast camera moves; you will clearly see that the motion is much more fluid. The next thing I believe is that the human eye really can't differentiate more than 25 FPS, but it has a focus, and then it doesn't see much of the surroundings but sees the one object really clearly. That's the problem with movies or video games: you have a small screen on which the eye can't focus on a particular thing, and the dimensions are wrong (too small).

To come back to the example of speed: if you are doing 100 m/s and you only have 25 fps, then you see a frame every 4 m. If you focus on an object you'll see that it stutters, because it is only displayed every 4 m. Now if you turn the FPS up to 100, the object is shown every 1 m, which is much more fluid if you focus on it. I think there is no FPS limit in the real world - OK, maybe the frequency of light, but that's much higher than 100 Hz.
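The distance-per-frame example from that post, written out as a quick calculation:

```python
# Track distance covered between consecutive frames at 100 m/s.
speed_m_per_s = 100
for fps in (25, 60, 100):
    print(f"{fps:>3} FPS -> a new frame every {speed_m_per_s / fps:.1f} m")
# 25 FPS -> every 4.0 m, 60 FPS -> every 1.7 m, 100 FPS -> every 1.0 m
```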
 