WeasteDevil
New Member
Launch title Forza 5 already makes extensive use of cloud based AI.
It can't help with anything on-screen. Forza is running at 60fps; that's 16ms per frame.
The cloud can't help with anything realtime that's on the screen, latency is far too high for that.
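To put numbers on that (illustrative arithmetic only; the 70ms round trip is an assumption, not a measurement):

```python
# Compares a 60fps frame budget against a hypothetical cloud round-trip
# time. The 70ms RTT is an assumed figure, not a measured one.

FPS = 60
frame_budget_ms = 1000 / FPS      # ~16.7ms to produce each frame

cloud_rtt_ms = 70                 # assumed round trip to a data centre

frames_missed = cloud_rtt_ms / frame_budget_ms
print(f"Frame budget: {frame_budget_ms:.1f}ms")
print(f"A {cloud_rtt_ms}ms round trip spans ~{frames_missed:.1f} frames")
# Any per-frame work routed through the cloud arrives several frames late.
```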
As for AI in general, it's not complicated; one of my specialities was ANNs and discrete programming languages. The problem with ANNs (or at least it used to be, I'm out of the loop) is that once you approach the number of neurons of a fly (which, if my memory serves me correctly, is around 80,000 for the entire nervous system, around 15,000 for the brain itself) it breaks down. Well, it breaks down in the sense that adding extra neurons doesn't provide much benefit.
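As a crude illustration of why naively scaling a network up gets expensive (a simplification, not a model of real nervous systems):

```python
# In a fully connected net, connection count grows roughly with the
# square of the neuron count. The neuron figures echo the post above;
# the scaling model is a deliberate simplification.

for neurons in (15_000, 80_000, 500_000):
    connections = neurons * (neurons - 1)   # every neuron to every other
    print(f"{neurons:>8} neurons -> {connections:,} potential connections")
# Each added neuron costs ever more connections to train and evaluate,
# which is one (simplified) reason returns diminish as networks grow.
```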
In a game like Forza, AI is never that complex anyway. Obviously physics is the big factor there, and you certainly couldn't use the cloud for that.
AI is shit in driving games (by this I mean not very realistic rather than easy to beat). This could be a thing of the past, though. I just don't see how latency affects AI really; the human brain has reaction times slower than the cloud's latency, so how does the latency affect realism?
It calculates how you drive around a certain circuit, just so your friends can race against an AI car that drives similarly. I would really not call that extensive, and I doubt anyone would really care if they canned that function tomorrow.
Well for a start you can't replicate the human brain, certainly not on these systems.
Plus, the problem isn't so much making a system capable of thinking as we'd expect; much like physics, it's making that system fun.
The impression I get is that it's more sophisticated than that.
The game learns your driving style allowing AI opponents to evolve over time. If AI opponents can react to the player in this way then surely they'd be able to react to one another also? The overall effect would be much more realistic, reactive racing rather than the emotionless blobs bumping each other along predetermined racelines that we're used to.
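For what it's worth, "learning your driving style" could be as simple as keeping a running profile of per-lap stats. A minimal, entirely hypothetical sketch (nothing here reflects how Forza actually implements it):

```python
# An exponential moving average over per-lap driving metrics.
# All names and numbers are hypothetical.

ALPHA = 0.2  # how quickly the profile adapts to recent laps

def update_profile(profile: dict, lap_stats: dict) -> dict:
    """Blend the latest lap's stats into the long-running profile."""
    return {
        key: (1 - ALPHA) * profile.get(key, value) + ALPHA * value
        for key, value in lap_stats.items()
    }

profile = {}
profile = update_profile(profile, {"avg_speed": 142.0, "braking_point": 0.81})
profile = update_profile(profile, {"avg_speed": 150.0, "braking_point": 0.78})
print(profile)  # drifts toward the player's recent behaviour
```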
Well, the AI characters will just be drivers modelled on how friends, or probably how other people, drive. It may see how you drive and pick opponents to go against you because of that. Which is fair enough, but as I said, I really doubt anyone would be that bothered if they gave you standard AI drivers instead.
Everyone wants real AI but that's science fiction. You make it sound like it's some trivial thing which will be unleashed by the cloud. Well, I hope you're right. Feck it then!
Personally I think such developments sound really innovative and potentially game changing, but if everyone else is just happy with the emotionless computer opponents of old then who am I to argue?
Everyone wants real AI but that's science fiction.
When you've got a learning neural network, more computing power is nothing but helpful.
Because what you're able to do is process a lot more information, and you don't have to do it in realtime on the box. And that frees up more of the box to be doing graphics or audio or other computational areas.
So we can now make our AI instead of just being 20 per cent, 10 per cent of the box's capability, we can make it 600 per cent of the box's capability. Put it in the cloud and free up that 10 per cent or 20 per cent to make the graphics better - on a box that's already more powerful than we worked on before.
I said you make it sound trivial and you assume for some reason that the cloud will bring you AI. Good luck with that.
the human brain has reaction times slower than the cloud's latency
Here I've found the quotes from the Forza developer: link
The article claims that it's likely an exaggeration, but the numbers the guy's talking about are quite astonishing.
For argument's sake let's assume that a console can crunch in-game data at 100Gb/s:
Without cloud technology, and assuming AI and other non-graphical components take 20% of that processing power, this gives us:
- 20Gb/s AI
- 80Gb/s Graphics
With cloud technology offering a further 600% of total processing power for non-graphical components, freeing up the console hardware for purely graphical processing, you get:
- 600Gb/s AI
- 100Gb/s Graphics
That's a 25% increase in graphical capability and a 30-fold increase in non-graphical capability with the cloud, then (worked through below). Even if somewhat exaggerated, I don't see how anyone can fail to be excited about this stuff.
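Sketching that arithmetic out, using the post's assumed figures:

```python
# All figures are the post's assumptions, not real hardware specs.

console = 100                    # Gb/s of total console throughput, assumed
local_ai = 0.20 * console        # 20 Gb/s AI without the cloud
local_gfx = console - local_ai   # 80 Gb/s graphics without the cloud

cloud_ai = 6 * console           # "600 per cent of the box" in the cloud
cloud_gfx = console              # the whole box freed for graphics

print(f"Graphics gain: {cloud_gfx / local_gfx - 1:.0%}")   # 25%
print(f"AI gain:       {cloud_ai / local_ai:.0f}x")        # 30x
```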
Are you being serious?
I need to ask before I laugh at you!
It's a load of bollocks!
Neural networks sadly don't scale linearly.
Cider, ping google.com and tell me what response times you get in ms.
Yes, I'm serious. I can understand why graphics need to be freshly rendered every 16ms, but I don't understand why AI needs to be updated as quickly.
Obviously a certain degree of non-graphical processing needs to happen in realtime in order to tell the console exactly what to render every 16ms for each frame, but for more complicated algorithms such as AI decision-making, I don't see why that can't be achieved even with 60ms of latency.
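Here's a rough sketch of that idea: the render loop never waits on the cloud, it just reuses the last AI decision until a fresh one lands (hypothetical structure, not any real engine's API):

```python
# The render loop runs at ~16ms per frame; a background worker refreshes
# the AI plan every ~60ms, simulating the cloud round trip.

import threading, time

latest_ai_plan = {"target_line": "default"}  # last-known AI decision

def cloud_ai_worker():
    """Pretend cloud round trip: updates the plan every ~60ms."""
    global latest_ai_plan
    while True:
        time.sleep(0.060)                     # simulated 60ms latency
        latest_ai_plan = {"target_line": f"plan@{time.monotonic():.2f}"}

threading.Thread(target=cloud_ai_worker, daemon=True).start()

for frame in range(5):
    start = time.monotonic()
    plan = latest_ai_plan                     # never blocks on the cloud
    # ... render the frame using whatever plan is current ...
    time.sleep(max(0, 1/60 - (time.monotonic() - start)))
    print(f"frame {frame}: using {plan['target_line']}")
```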
How so?
How do I do that on a phone?
I searched and it returned results in 0.25 seconds, if that's what you mean.
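That 0.25 seconds is server-side query time, not network latency; they measure different things. If you want the actual round trip, something like this approximates a ping (it times a TCP connect to port 443 rather than a true ICMP ping, so it slightly overstates the figure):

```python
# Roughly what "ping" measures: the round-trip time to a remote host.

import socket, time

def rtt_ms(host: str, port: int = 443) -> float:
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=2):
        pass
    return (time.monotonic() - start) * 1000

print(f"google.com: {rtt_ms('google.com'):.0f}ms")
```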
Your TV can introduce as much as 150ms of latency alone, and it's plugged into the console with a 1ft cable. Processing takes time; information passing back and forth to servers takes time. It's not usable in real-time.
Here are some numbers from a study on services such as OnLive:
http://www.iis.sinica.edu.tw/~swc/onlive/onlive.html
To summarize, OnLive's overall streaming delay (i.e., the processing delay at the server plus the playout delay at the client) for the three games is between 135 and 240 ms, which is acceptable if the network delay is not significant. On the other hand, real-time encoding of 720p game frames seems to be a burden to SMG on an Intel i7-920 server because the streaming delay can be as long as 400-500 ms. Investigating whether the extended delay is due to design/implementation issues of SMG or it is an intrinsic limit of software-based cloud gaming platforms will be part of our future work.
TVs have higher latencies than monitors, too. Toss in another 50ms or so (which I think is towards the better end).
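Adding up the delay chain from those figures (illustrative sums using the numbers quoted above, not measurements):

```python
# TV latency plus OnLive-style streaming delay, per the posts above.

tv_latency_ms = (50, 150)          # better and worse TVs
streaming_ms = (135, 240)          # OnLive processing + playout delay

best = tv_latency_ms[0] + streaming_ms[0]
worst = tv_latency_ms[1] + streaming_ms[1]
print(f"End-to-end delay: {best}-{worst}ms")            # 185-390ms
print(f"Frames at 60fps:  {best // 16}-{worst // 16}")  # ~11-24 frames
```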
Bandwidth is another consideration. You had developers complaining about the RAM on various consoles this generation. The bandwidth of RAM is in the tens to hundreds of gigabytes per second. The average US broadband connection has a download speed of about 1 MB/s.
The fluctuation in these numbers will also be large compared with two pieces of metal inside your console. That points to a design that is somehow "additive" in nature, in the sense that the console must be able to render a frame with varying levels of information.
For example, for simplicity, if there were 4 pixels to render, a naive design would be to hand off 1 pixel to the cloud, and 3 to the console itself. But that 1 pixel might not arrive in time. So what do you do? Render nothing for that pixel? You would actually need to render all 4 pixels on the console (possibly badly), enriching it with that extra information from the cloud if it arrives. This can be a pretty difficult operation because not all operations "add" together nicely (what does it mean to add two textures together?). The "addition" of information also takes processing power (processing power taken away from real-time processing).
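In code, that "additive" approach might look something like this: always produce a baseline result locally, and merge the cloud's contribution only if it showed up in time (all names here are hypothetical):

```python
# Always compute a baseline frame locally; enrich it with cloud data
# only if the data arrived within the frame budget.

import queue

cloud_results = queue.Queue()   # filled by a background network thread

def render_baseline(scene):
    return {"scene": scene, "detail": "low"}

def merge(frame, extra):
    # The costly "addition" step the post describes.
    return {**frame, **extra, "detail": "enriched"}

def render_frame(scene):
    frame = render_baseline(scene)          # guaranteed, console-only pass
    try:
        extra = cloud_results.get_nowait()  # may or may not have arrived
        frame = merge(frame, extra)
    except queue.Empty:
        pass                                # late data: baseline ships as-is
    return frame

print(render_frame("turn 3"))   # no cloud data queued: baseline only
cloud_results.put({"lighting": "cloud-computed"})
print(render_frame("turn 3"))   # cloud data present: merged in
```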
On a slow connection, you can see the effects on sites like YouTube, where you sometimes hit the limit of the buffering, and the audio/video stutters/blurs before it smooths itself out again - you definitely notice it. And that's not vector processing - it's just compressed video, which is pretty easy in comparison.
Cider, just give it up! Really, the spiel from Microsoft isn't what is going to happen. The reality is that it cannot happen, it's impossible. I'm a computer scientist telling you that it cannot happen!
If Sony was putting out this bullshit, I'd also say that they are talking bollocks.
Despite all that OnLive manages to work. It's an interesting piece but not entirely relevant to the Xbox.
Remove the graphics component as this would be processed by the console and where does that leave you?
You have to also acknowledge that such technology has been developed with a full 4G rollout in mind.
You don't seem able to say why it cannot happen though.
Many infinitely more prominent and accomplished computer scientists seem to think it rather important so forgive me if I don't take your word for it, man.
The problem I see with it is what happens to a game that heavily uses cloud compute when your net connection goes down. Does the game stop working? Does it just downgrade what it does?
What happens say if you lose your job and you have to cancel your net connection?