
In an attempt to minimize latency, Google Stadia will predict your actions for you


legend


https://www.pcgamesn.com/stadia/negative-latency-prediction

 

Quote

Yes, you heard that correctly. Stadia might start predicting what action, button, or movement you’re likely to do next and render it ready for you – which sounds rather frightening.

 

This sounds absolutely awful to me.

 

The only reason to do this is if the latency is long enough that the difference is noticeable. And if the difference is noticeable, then mispredictions will be noticeable and I think far worse.

 

I say worse because with latency, I can at least account for it to some degree and adjust my control accordingly (those of us who used to play online games when latency wasn't so good probably are familiar with adjusting for a certain amount of lag). But if the game executes a command for me, and it's wrong, there's nothing I could have done to prevent that. I don't think I would have nearly as easy a time trying to predict how their system will behave versus a flat amount of latency in my control.

 

Since gameplay is inherently non-stationary (you learn to control the game differently and to respond to events more quickly), I'm even more doubtful that they can set up a prediction that *also* accounts for how you as a player are changing your behavior.

 

This sounds like a hot mess to me. Maybe cloud gaming is just a bad idea for a lot of games?


Haven't games been doing this since Quake in the '90s? Like, I remember John Carmack giving interviews about this kind of thing back when I was in high school. I guess the biggest difference here is that previous latency predictions were done client-side and now we're looking at server-side predictions.


43 minutes ago, Ghost_MH said:

Haven't games been doing this since Quake in the '90s? Like, I remember John Carmack giving interviews about this kind of thing back when I was in high school. I guess the biggest difference here is that previous latency predictions were done client-side and now we're looking at server-side predictions.

 

Not really the same. It's one thing to interpolate the coarse movements of a network player you're observing and to handle hit detection with some degree of tolerance between the client-side state and the host state. It's a whole other matter to fundamentally change what your controlled character is doing.


21 minutes ago, legend said:

 

Not really the same. It's one thing to interpolate the coarse movements of a network player you're observing and to handle hit detection with some degree of tolerance between the client-side state and the host state. It's a whole other matter to fundamentally change what your controlled character is doing.

 

I'm not sure I'm really seeing the practical difference here. I shot that guy and the bullet went straight through him. I tried to shoot the guy and the gun never fired. If the predictions are off, you get the same result. The part that sucks is that this can affect single-player games in a way that only online multiplayer games have had to suffer.

 

However, this was always going to be a problem. There are possibly only four companies in this country with servers close enough to people to make online gaming like this really possible. You have Microsoft, Amazon, Facebook, and Google. Even then, someone like Microsoft recommends a dedicated tunnel for lag-free Excel and Skype.

 

Either you go with predictions, the same way online gaming did in the dial-up days, or you just get random pauses, stutters, and reduced image quality.

 

I'm not entirely sure what the solution here is. Internet access in the States is shit. Speeds, price, competition, latency. I'm not sold on any of these game streaming services, so I guess we'll see how things work out.


9 hours ago, Ghost_MH said:

I'm not sure I'm really seeing the practical difference here. I shot that guy and the bullet went straight through him. I tried to shoot the guy and the gun never fired. If the predictions are off, you get the same result. The part that sucks is that this can affect single-player games in a way that only online multiplayer games have had to suffer.

 

It's now going to potentially affect the player's actions: it thinks I am going to rotate my FOV (so it does), but I didn't move the controller. Potential for all sorts of weird effects.

9 hours ago, Ghost_MH said:

However, this was always going to be a problem. There are possibly only four companies in this country with servers close enough to people to make online gaming like this really possible. You have Microsoft, Amazon, Facebook, and Google. Even then, someone like Microsoft recommends a dedicated tunnel for lag-free Excel and Skype.

Your list isn't consistent with what Google tells me are the largest cloud providers.

That said, both Netflix and Hulu use AWS, so why can't EA/Acti-Blizzard/Sony/etc.?

9 hours ago, Ghost_MH said:

Either you go with predictions, the same way online gaming did in the dial-up days, or you just get random pauses, stutters, and reduced image quality.

Why?

I don't believe the reduced image quality and pauses I had with Stadia would be fixed with input predictions. The low-bandwidth controller inputs are much less likely to get interrupted than the much higher-bandwidth video stream coming back.
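
As a rough back-of-the-envelope comparison (these are my own assumed numbers, not measured Stadia figures), the controller stream is on the order of kilobits per second while the return video is tens of megabits per second:

```python
# Assumed, illustrative numbers only; not measured Stadia figures.
INPUT_BYTES_PER_PACKET = 32   # a controller-state packet is tiny
INPUT_RATE_HZ = 60            # one packet per frame
VIDEO_MBPS = 25               # a plausible bitrate for a 1080p60 stream

input_kbps = INPUT_BYTES_PER_PACKET * 8 * INPUT_RATE_HZ / 1000   # ~15 kbps upstream
video_kbps = VIDEO_MBPS * 1000                                   # 25,000 kbps downstream
print(f"controller upstream: ~{input_kbps:.0f} kbps")
print(f"video downstream: ~{video_kbps:.0f} kbps ({video_kbps / input_kbps:.0f}x larger)")
```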


23 minutes ago, AbsolutSurgen said:

It's now going to potentially affect the player's actions: it thinks I am going to rotate my FOV (so it does), but I didn't move the controller. Potential for all sorts of weird effects.

Your list isn't consistent with what Google tells me are the largest cloud providers.

That said, both Netflix and Hulu use AWS, so why can't EA/Acti-Blizzard/Sony/etc.?

Why?

I don't believe the reduced image quality and pauses I had with Stadia would be fixed with input predictions. The low-bandwidth controller inputs are much less likely to get interrupted than the much higher-bandwidth video stream coming back.

 

It's not about the largest cloud providers, but who has data centers closest to the people, in footprints large enough to support these services. Those servers need to be near people. There's no way to get around the speed of the tubes running all of this. I guess a Comcast or Cox or any other ISP could totally leverage their control of their portions of the network across the States, but that's a whole other net neutrality issue. Also, that's already being taken advantage of by the largest cloud providers. Comcast and Verizon don't have an interconnect at major data centers across the country for nothing. That's pretty much what Netflix means when they stream from within Comcast's network to Comcast customers, for instance.

 

As for all those other companies, I'm pretty sure they're already using someone else's servers. Isn't Sony using Microsoft's Azure for PlayStation Now?

 

My whole point is that the latency issues people see aren't caused by things the providers have much control over. How they handle those issues will be their trick. Could a company like Apple spend gobs of money dropping a ton of servers at every street corner in the nation? Sure, but then they'd go broke. At that point, it would probably be cheaper to just loan out game consoles on a rental basis and call it a day.


1 hour ago, Ghost_MH said:

 

It's not about the largest cloud providers, but who has data centers closest to the people, in footprints large enough to support these services. Those servers need to be near people. There's no way to get around the speed of the tubes running all of this. I guess a Comcast or Cox or any other ISP could totally leverage their control of their portions of the network across the States, but that's a whole other net neutrality issue. Also, that's already being taken advantage of by the largest cloud providers. Comcast and Verizon don't have an interconnect at major data centers across the country for nothing. That's pretty much what Netflix means when they stream from within Comcast's network to Comcast customers, for instance.

 

As for all those other companies, I'm pretty sure they're already using someone else's servers. Isn't Sony using Microsoft's Azure for PlayStation Now?

 

My whole point is that the latency issues people see aren't caused by things the providers have much control over. How they handle those issues will be their trick. Could a company like Apple spend gobs of money dropping a ton of servers at every street corner in the nation? Sure, but then they'd go broke. At that point, it would probably be cheaper to just loan out game consoles on a rental basis and call it a day.

With the nature of game streaming, you still need to put custom servers all over the country. You can't just put a virtual server in an existing data centre.


16 minutes ago, AbsolutSurgen said:

With the nature of game streaming, you still need to put custom servers all over the country. You can't just put a virtual server in an existing data centre.

 

To some extent. However, there's a vast difference between dropping new servers in a space you own and manage and acquiring the footprint to handle all those new servers. It's not just servers and data centers. For companies that don't also have a large presence all over the place, there are employees that need to be hired to manage and support those servers. There's a lot that can't just be done remotely.


1 minute ago, Ghost_MH said:

 

To some extent. However, there's a vast difference between dropping new servers in a space you own and manage and acquiring the footprint to handle all those new servers. It's not just servers and data centers. For companies that don't also have a large presence all over the place, there are employees that need to be hired to manage and support those servers. There's a lot that can't just be done remotely.

That's why you pay AWS to do that for you.


11 hours ago, Ghost_MH said:

 

I'm not sure I'm really seeing the practical difference here. I shot that guy and the bullet went straight through him. I tried to shoot the guy and the gun never fired. If the predictions are off, you get the same result. The part that sucks is that this can affect single-player games in a way that only online multiplayer games have had to suffer.

 

I think you might be confused about what typically happens in multiplayer games. In no multiplayer game will it register you shooting when you didn't, nor vice versa.

 

What multiplayer games will often do is extrapolate slightly where other players are. Let's call this the client state. You then shoot your gun at them. If you hit them in your client state, then the message sent to the server is "I think I hit player x, whom I calculated to be at position z". The server then has more up-to-date information on where player x actually is. Let's call this z'. If ||z - z'|| is small, then the server awards you the point because it means you were in sync enough. Only if your system was significantly lagged (resulting in a big ||z - z'||) will things start to feel out of whack.

 

At no point do your controls get changed; the system just allows for a tolerance of mismatch between each player's world state when it comes to assigning significant discrete events, like successfully hitting another player, and it does its best to interpolate other players' actions in your world state.
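
To make that tolerance check concrete, here's a minimal sketch of the idea; the function names, the Euclidean-distance test, and the tolerance value are my own illustration, not any particular engine's code:

```python
import math

# Hypothetical tolerance (in world units) for how far the client's claimed
# target position z may drift from the server's authoritative position z'.
HIT_TOLERANCE = 0.5

def validate_hit(claimed_pos, server_pos, tolerance=HIT_TOLERANCE):
    """Client reports 'I hit player x at position z'; the server compares z
    against its own, more up-to-date z' and awards the hit only if ||z - z'||
    is small enough."""
    drift = math.dist(claimed_pos, server_pos)  # ||z - z'||
    return drift <= tolerance

# Small desync: the hit stands.
print(validate_hit((10.0, 4.0, 2.0), (10.2, 4.0, 2.0)))  # True
# Badly lagged client: the hit is rejected.
print(validate_hit((10.0, 4.0, 2.0), (13.0, 4.0, 2.0)))  # False
```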

 

 

 

 

 

All this said, I've seen some discussion suggesting that what Google is proposing is not as bad as this article made it sound, but it's not going to solve the problem either, and it might introduce a new kind of micro stutter.

 

That is, there are generally three sequential bottlenecks in doing cloud gaming (a rough budget with assumed numbers is sketched after the list):

1. Controller command -> server

2. Frame update from controller command

3. Server frame -> client
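
For a sense of the relative sizes, here's the rough budget mentioned above; every number is assumed purely for illustration and will vary wildly with ISP, distance to the data center, and codec:

```python
# Purely illustrative numbers in milliseconds; none of these come from Google.
CONTROLLER_TO_SERVER_MS = 25    # 1. controller command -> server (one way)
FRAME_UPDATE_MS = 1000 / 60     # 2. simulate and render one frame at 60 fps (~16.7 ms)
SERVER_TO_CLIENT_MS = 25 + 8    # 3. encode the frame and send it back

total = CONTROLLER_TO_SERVER_MS + FRAME_UPDATE_MS + SERVER_TO_CLIENT_MS
print(f"rough input-to-photon delay: {total:.1f} ms")                # ~74.7 ms
print(f"share from the frame update: {FRAME_UPDATE_MS / total:.0%}")  # ~22%
```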

 

What Google might be trying to do is eliminate the frame-time delay, which at 60 fps is about 16 ms. Here's how you can do that.

 

Suppose there are only two commands you can send: A and B. On every frame update, the Google server can then run two versions of the game: one where in the next frame you execute A and another where you execute B. It then queues up both results. Now when the server receives a command, if it was A, it immediately returns the queued result for A; otherwise the one for B. Consequently, it skips the frame rendering time.

 

If you have many more than two possible commands, then the server can queue up, let's say, the k most likely ones. Now when a command comes in, if it's one of these k, you again skip the frame update delay. If it's not, then it rolls back and executes the actual command you made.
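
Here is a toy sketch of that queue-ahead-and-roll-back loop. The state, the commands, and the predictor are stand-ins I made up to show the structure; this is in no way Google's actual design:

```python
# Toy model: a "state" is an integer, a "frame" is a string, commands are letters.
COMMANDS = ["A", "B", "C", "D"]

def step(state, command):
    """Pretend to simulate and render one frame for a given command."""
    new_state = state + COMMANDS.index(command) + 1
    return new_state, f"frame(state={new_state}, cmd={command})"

def predict_top_k(state, k):
    """Stand-in predictor: just guesses the first k commands; a real system
    would rank commands by likelihood given recent play."""
    return COMMANDS[:k]

def speculative_step(state, incoming_command, k=2):
    # Render k speculative futures before the real command arrives.
    speculated = {cmd: step(state, cmd) for cmd in predict_top_k(state, k)}
    if incoming_command in speculated:
        # Hit: the frame was rendered ahead of time, hiding the frame-update delay.
        return speculated[incoming_command], "hit"
    # Miss: discard the speculation and pay the full frame cost now.
    return step(state, incoming_command), "miss"

state = 0
for cmd in ["A", "D", "B"]:
    (state, frame), outcome = speculative_step(state, cmd)
    print(outcome, frame)
```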

 

 

So in the ideal case, Google can eliminate the frame update delay. However, we still have the following problems:
1. The round-trip internet delay is probably much worse than the frame delay

2. It requires a shit ton of compute to do this, especially if it has to buffer long possible future trajectories

3. For every miss, you now have a different amount of delay, incurring a micro stutter in your controls.

 

 

If this is what Google meant, it's not as bad as I feared, but still isn't a cure and has other problems.


This is funny. Some of the most in-the-know gamers out there, including people on this very forum, claim not to be bothered by input delay or abysmal resolution, and even claim they're literally unable to tell the difference between 20 and 60 FPS, yet we're supposed to believe Johnny Fortnite is gonna be intolerant of some input lag, on-screen artifacts, or the occasional incorrectly predicted input? With the way people vehemently defend this shit, I could see the game playing itself entirely and people screeching about how they barely notice and how everyone who brings it up is an unhappy elitist.

 

When you cultivate a culture of no standards, don't be surprised when companies expect you to lower yours.


5 hours ago, legend said:

 

Suppose there are only two commands you can send: A and B. On every frame update, the Google server can then run two versions of the game: one where in the next frame you execute A and another where you execute B. It then queues up both results. Now when the server receives a command, if it was A, it immediately returns the queued result for A; otherwise the one for B. Consequently, it skips the frame rendering time.

 

Wouldn't you need a REAL beast of a computer to do this?


4 minutes ago, AbsolutSurgen said:

Wouldn't you need a REAL beast of a computer to do this?

 

Absofuckinglutely. I dare say a single computer might not be able to do it at all (i.e., not even two branches) if we're talking about a modern game.

 

Google can maybe do it to some extent because Google's cloud is insane. But it's a shit ton of energy cost for something that won't really solve the core challenge and potentially (depending on how noticeable the control micro stutter from misses is) could feel worse.

