Apex Legends: Update on the server problems

PCGH-Redaktion

Comment system
Team member
Your opinion is now wanted on Apex Legends: Update on the server problems

In a fairly long blog post, the developers of the free-to-play shooter Apex Legends have addressed the ongoing server and stability problems. The aim is to show the player base more clearly where the problems lie and how they intend to improve the situation in the future.

Please note: The comment section is moderated in accordance with the forum rules. General questions and criticism about online articles from PC Games Hardware should be posted in the feedback subforum, not in the comment thread for a news item. There, they will be removed without further notice.

Back to the article: Apex Legends: Update on the server problems
 
"
The question is: What exactly is happening during each tick? We want the world state to be as accurate as possible, which is why our servers save the full world state on each tick. If we didn’t do this, it would probably save some of the CPU costs on our servers, but we would lose accuracy in our simulations, and that isn’t worth the risk.
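To make that concrete, here is a minimal Python sketch of a fixed-rate tick loop that snapshots the complete world state on every tick. The class and numbers are purely illustrative assumptions, not Respawn's actual server code:

```python
import time
from dataclasses import dataclass, field

TICK_RATE_HZ = 20                   # the tick rate quoted for Apex Legends servers
TICK_INTERVAL = 1.0 / TICK_RATE_HZ  # 50 ms per tick

@dataclass
class World:
    """Stand-in for the full game state (players, projectiles, loot, ...)."""
    tick: int = 0
    state: dict = field(default_factory=dict)

    def step(self, dt: float) -> None:
        self.tick += 1              # advance movement, physics, hit detection, ...

    def snapshot(self) -> dict:
        return {"tick": self.tick, **self.state}   # complete copy, not a delta

def run(world: World, history: list, ticks: int) -> None:
    """Toy tick loop: simulate, then save the full world state on every tick."""
    for _ in range(ticks):
        start = time.monotonic()
        world.step(TICK_INTERVAL)
        history.append(world.snapshot())
        # sleep off whatever is left of the 50 ms budget
        time.sleep(max(0.0, TICK_INTERVAL - (time.monotonic() - start)))

history: list = []
run(World(), history, ticks=5)      # ~250 ms of simulated time, 5 snapshots
print(len(history), "snapshots saved")
```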

Put simply, the higher the tick rate, the higher the bandwidth sent to all players. If we were to move from a 20Hz server to a 60Hz server, it would mean multiplying the bandwidth the game uses by three. As of today, Apex Legends roughly consumes 60kB/s at the beginning of a game. A 60Hz server would consume 180kB/s. That may not sound like a lot, but it’s quite a bit, and we are always looking for ways to reduce the required bandwidth.
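As a back-of-the-envelope check of those figures, assuming the payload per tick stays roughly constant (an assumption; the post itself notes that the data contained in each tick varies):

```python
BYTES_PER_TICK = 60_000 / 20        # ~60 kB/s at 20 Hz  ->  ~3 kB per tick

def bandwidth_kb_per_s(tick_rate_hz: float) -> float:
    """Per-player bandwidth if every tick carries the same ~3 kB payload."""
    return tick_rate_hz * BYTES_PER_TICK / 1000

for hz in (20, 60):
    print(f"{hz} Hz -> {bandwidth_kb_per_s(hz):.0f} kB/s")
# 20 Hz -> 60 kB/s, 60 Hz -> 180 kB/s: triple the tick rate, triple the bandwidth
```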

But why would it matter if the bandwidth went a little higher? Keeping bandwidth costs low for games is much more critical than, say, for video streaming. For high-bandwidth applications (streaming, downloading, etc), jitter or hitches are easy to hide by buffering minutes of a stream, dropping stream quality, etc. You probably won't be shown jitter in a download, and you probably don't care that the speed is variable by a few or even hundreds of milliseconds.

Games do not have this luxury. Skipping even a couple of 50 ms intervals can start to feel bad. Skipping a few more can send you into a death spiral of bigger and bigger updates needed to catch you back up. There is no way around getting you those updates, because your client needs a perfect state of the world to be accurate.

The above example shows why comparing tickrate between games is complicated: the information contained in each tick varies. There's another complication as well, which is that the limits on the inputs that servers can receive and send out aren't always the same, even if they have the same tickrate. To be specific: in many games, if a server runs at 60Hz, the client can only send inputs at 60Hz. If you run at 60fps that's fine, but if your client runs at 120fps, you would lose half of your inputs. This is not the case in Apex Legends; we process a variable rate of inputs just fine. (As a side note, the higher your FPS is in Apex, the higher your bandwidth usage is as a result.)
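A minimal sketch of that difference (illustrative only, not the actual netcode): a server that drains its whole input queue every tick keeps every input from a high-FPS client, whereas one that caps inputs at its own tick rate throws the rest away:

```python
from collections import deque

def drain_all(queue: deque) -> list:
    """Apply every input that arrived since the last tick, whatever the client's FPS."""
    return [queue.popleft() for _ in range(len(queue))]

def drain_capped(queue: deque, cap: int = 1) -> list:
    """A server that only accepts one input per tick would silently lose the rest."""
    taken = [queue.popleft() for _ in range(min(cap, len(queue)))]
    queue.clear()
    return taken

# A 120 fps client generates ~6 inputs per 50 ms server tick (120 / 20 = 6).
inputs = [f"input_{i}" for i in range(6)]
print(len(drain_all(deque(inputs))))      # 6 -> nothing lost
print(len(drain_capped(deque(inputs))))   # 1 -> the rest are thrown away
```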

Okay, so we’ve discussed some possible downsides that come with increasing server tickrate. But what about the upside of going from, say, 20Hz to 60Hz? Come on, Respawn! Wouldn’t that make the servers three times faster and three times better? Just do it!

Based on our findings, it would not result in a meaningfully different experience, and we want to explain why.

For the sake of the argument, let’s assume that you’re averaging about 50ms ping, or latency. Remember that your ping measures the speed of a full round trip between your machine and the server. So assuming there are no other problems like fluctuating latency or hardware lag (eg. display devices introduce 20-50ms delay), the server will receive your input 25ms (half ping) after you press a button or flick your mouse.

Since our servers are 20Hz, they update the world state every 50ms (1,000ms in each second / 20 ticks per second = 50ms per tick). So in the worst-case scenario, your inputs will be processed by the server after 75ms (25ms + 50ms).

To figure out what that 75ms delay actually means in terms of your experience, you have to think about your frame rate. The math here can get tricky, but remember that in a 60fps game, each frame takes about 16.67ms (1,000ms in each second / 60 frames per second = 16.67ms per frame). If your inputs are being processed by the server after 75ms, as in our example above, and your game is running at 60fps, that means the lag between your input and its impact on the game is about five frames (75ms for each update / 16.67ms per frame = about 4.5 frames, and round it up to 5 frames since there’s no such thing as a half-frame).
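The same calculation in a few lines of Python (the helper names are assumptions of ours; the numbers are the ones from the example):

```python
import math

def worst_case_delay_ms(ping_ms: float, tick_rate_hz: float) -> float:
    """Half the ping (client -> server) plus one full tick interval in the worst case."""
    return ping_ms / 2 + 1000 / tick_rate_hz

def delay_in_frames(delay_ms: float, fps: float) -> int:
    """Round the delay up to whole frames at the client's frame rate."""
    return math.ceil(delay_ms / (1000 / fps))

delay = worst_case_delay_ms(ping_ms=50, tick_rate_hz=20)
print(delay, delay_in_frames(delay, fps=60))   # 75.0 ms -> 5 frames
```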

If you do all the same calculations above for a 60Hz server, you get 41.67ms for maximum delay between input and the server processing it (25ms ping + [1,000ms / 60 ticks per second = 16.67ms per tick] = 41.67ms).

41.67ms is definitely better than 75ms, but what does it result in as far as frame-rate goes? Let’s again assume we’re running at 60fps. Each frame takes 16.67ms, so now the lag between your inputs and the server recognizing them is about three frames (41.67ms for each update / 16.67ms per frame = about 2.5 frames, round it up to 3 frames since there’s still no such thing as a half-frame).

Put all this math together, and you realize that 20Hz servers result in about five frames of delay, and 60Hz servers result in three frames of delay. So for triple the bandwidth and CPU costs, you can save two frames worth of latency in the best-case scenario. The upside is there, but it isn’t massive, and it wouldn’t do anything for issues that are tied to plain old lag (like getting shot while in cover), ISP-level issues, or bugs (like with hit reg and slow-mo servers).

Our example examined the upside of going from 20Hz to 60Hz. You can follow the math for other jumps, like from 20Hz to 30Hz or even 40Hz, and you’ll find that the gains in frame rate would be similarly quite small. You’d need to increase tick rate very drastically before you could really start to feel it—even the jump from 20Hz to 60Hz would feel like the difference between 58 FPS and 60 FPS. This difference isn’t nothing, but we sincerely believe that it isn’t enough to prioritize tickrate changes over other more efficient improvements we could be making."
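Running the same worst-case arithmetic (50 ms ping, 60 fps client, as in the blog post's example) across a few tick rates shows how small the gains from intermediate jumps are; a short, self-contained sketch:

```python
import math

PING_MS, FPS = 50, 60
for hz in (20, 30, 40, 60):
    delay_ms = PING_MS / 2 + 1000 / hz              # half ping + one tick interval
    frames = math.ceil(delay_ms / (1000 / FPS))     # rounded up to whole frames
    print(f"{hz:>2} Hz: {delay_ms:6.2f} ms -> {frames} frames of input delay")
# 20 Hz: 75.00 ms -> 5 frames    30 Hz: 58.33 ms -> 4 frames
# 40 Hz: 50.00 ms -> 3 frames    60 Hz: 41.67 ms -> 3 frames
```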

Nice that they wrote such a long article, but unfortunately all it says is that a higher tick rate is better, costs more, and EA doesn't want to pay for it. 60 fps Apex on the PlayStation 4 doesn't fit at all with putting everyone under one roof, and the PC servers are just as weak. At 120 Hz that's already 2 frames, and I play at 190 fps/Hz and actually hold that frame rate. Simply embarrassing how far behind the Apex servers are falling. I'd guess PCGH gets paid to basically repost the Reddit thread.

Here's a better video on the problem. The netcode has been the same for over a year, and so have all the bugs from back then.
[Embedded external content: YouTube video]

Double the tick rate also means half as many no-regs. Roughly 1 in 10 hits in Apex doesn't register.

In Counter-Strike my ping is under 10, in Apex it's always above 50. Pretty weak of them to trot out this bogus defense.
The best players in Apex deliberately play on 200-ping servers to con their way to ranked points, because the game favors high-ping players.
 