cross platform

PcWizKid

No, not any time soon anyway. The architecture is totally different, and even if the main engine were ported over to compile for the PS3, it would run like ****. The PS3 has a relatively weak general-purpose processor, so if you don't code to take advantage of the SPUs, it performs horribly. Switching the physics system over to Bullet would help a little, as it's optimized to work with the PS3 architecture, but that would be just the first step, and you'd need an LE source license to even think about attempting it, since the current DLLs are compiled for x86, not PowerPC.
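
To illustrate the binary-compatibility point: the target architecture of native code is fixed at compile time, which is why DLLs built for x86 are a dead end on the Cell's PowerPC core. A quick sketch using the compilers' standard predefined macros (generic C++, not Leadwerks code):

```cpp
// Generic C++, not Leadwerks code: the target architecture is baked in
// at compile time, so an x86 DLL can never load on the PS3's
// PowerPC-based Cell processor.
#include <cstdio>

int main() {
#if defined(__x86_64__) || defined(_M_X64) || defined(__i386__) || defined(_M_IX86)
    std::puts("Built for x86 -- what the current engine DLLs target.");
#elif defined(__powerpc__) || defined(__PPC__) || defined(_M_PPC)
    std::puts("Built for PowerPC -- what a PS3 build would need.");
#else
    std::puts("Built for some other architecture.");
#endif
    return 0;
}
```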

Windows 7 x64 - Q6700 @ 2.66GHz - 4GB RAM - 8800 GTX

ZBrush - Blender


There are some challenges to master before you can write games for the PS3:

  • You must release a commercial game on the PC and convince Sony that you are a professional game developer
  • The PS3 has its own graphics API (PSGL 2.0), so Leadwerks Engine would need to be substantially rewritten to use it; PSGL is similar to OpenGL, but not quite the same (see the renderer sketch after this list)
  • In addition, Leadwerks Engine would need to be rewritten in C++, since BlitzMax does not run on the PS3's PowerPC CPU or its OS
  • The PS3 GPU is quite low-end; even Crytek had problems with it when porting Crysis 2, and they said things would work better on the PS4
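
To make the PSGL point concrete, here is a purely hypothetical sketch (none of these names are real Leadwerks or Sony API) of how an engine could hide its graphics calls behind a backend interface, so a PSGL implementation could be added without touching game code:

```cpp
// Hypothetical renderer abstraction; none of these names are real
// Leadwerks or Sony API. Game code talks only to Renderer, so a PSGL
// backend could sit next to the OpenGL one.
#include <memory>

struct Renderer {
    virtual ~Renderer() = default;
    virtual void BeginFrame() = 0;
    virtual void DrawMesh(int meshId) = 0;
    virtual void EndFrame() = 0;
};

// Desktop OpenGL backend (the only one that exists today).
struct GLRenderer : Renderer {
    void BeginFrame() override { /* glClear(...) etc. */ }
    void DrawMesh(int) override { /* glDrawElements(...) etc. */ }
    void EndFrame() override { /* swap buffers */ }
};

// A PS3 port would add: struct PSGLRenderer : Renderer { ... };
// similar calls, but implemented against PSGL instead of desktop GL.

std::unique_ptr<Renderer> CreateRenderer() {
    return std::make_unique<GLRenderer>();
}
```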

I think before that happens, there will be some PC-based console on the market that can run PC games directly and also supports hardware upgrades.

There's already the OnLive console out, which is much cheaper than the PS3 and Xbox 360, fits in your pocket, and can run games made with Leadwerks Engine directly.

Ryzen 9 RX 6800M ■ 16GB XF8 Windows 11 ■
Ultra ■ LE 2.5 ■ 3DWS 5.6 ■ Reaper ■ C/C++ C# ■ Fortran 2008 ■ Story ■
■ Homepage: https://canardia.com ■


The engine will eventually be available on the PS3, but I don't think Sony will let you use it unless you are already a successful published development house.

 

I am cautiously interested to see what OnLive can deliver, and they are friendlier to small developers. One of the initial games they have available is 2D Boy's "World of Goo".

My job is to make tools you love, with the features you want, and performance you can't live without.


Yeah, I watched that video, and the guy gave a nice presentation. He seems honest, at least. I'll try to get some time with him at the next GDC and see what he says.

 

The most significant aspect of this is that it removes the barriers to entry that presently exist for both game developers and middleware devs like myself. Right now there are significant technical, financial, and political barriers to entering the console market, for everyone, which decreases competition and variety. With OnLive you only have to get your game approved for publishing, much like Steam, so it's friendlier to small game developers. And you don't have to write four different versions of your game for the various platforms, just one PC game that works.

My job is to make tools you love, with the features you want, and performance you can't live without.


Hopefully it'll be launched by then. One thing that applies directly to us now is the 80 ms perceptual threshold. Going by that, you could write the network code without any client-side prediction as long as the round trip stays under 80 ms, but you'd have to add client-side prediction at higher latencies or players will notice the lag between pressing a key and getting a reaction from the system.
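
As a minimal sketch of that rule of thumb (the function name and the fixed 80 ms constant are just illustrative, not from any engine):

```cpp
// Illustrative only: turn client-side prediction on once the measured
// round trip crosses the ~80 ms perceptual threshold from the post.
#include <chrono>

constexpr std::chrono::milliseconds kPerceptualThreshold{80};

bool NeedsPrediction(std::chrono::milliseconds roundTrip) {
    // Under ~80 ms the player won't notice the input-to-response delay,
    // so the client can simply wait for the authoritative server state.
    return roundTrip >= kPerceptualThreshold;
}
```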

Windows 7 x64 - Q6700 @ 2.66GHz - 4GB RAM - 8800 GTX

ZBrush - Blender


With OnLive you don't need any networking for the game itself.

You can just code the multiplayer features into the game using direct memory access between game clients and the game server.

Of course you can also use the local network on the same machine (same IP), so you don't have to rewrite your networking code, and the connection is then about as fast as SATA 3 (6 Gbit/s).
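
A minimal sketch of the loopback idea, assuming client and server really do run on the same OnLive host; the port number is made up and the calls are plain POSIX sockets, nothing OnLive-specific:

```cpp
// POSIX sockets, nothing OnLive-specific: if client and server run on
// the same host, existing networking code works unchanged by pointing
// it at the loopback address. The port number is hypothetical.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>

int main() {
    int sock = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port   = htons(7777);                 // hypothetical game port
    inet_pton(AF_INET, "127.0.0.1", &addr.sin_addr);

    if (connect(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) == 0)
        std::puts("Connected over loopback; no network code rewritten.");
    close(sock);
    return 0;
}
```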

Ryzen 9 RX 6800M ■ 16GB XF8 Windows 11 ■
Ultra ■ LE 2.5 ■ 3DWS 5.6 ■ Reaper ■ C/C++ C# ■ Fortran 2008 ■ Story ■
■ Homepage: https://canardia.com ■


Is that part of the OnLive API? I don't think direct memory access would work, because you can't guarantee that the clients and server will be in the same data center, much less on the same machine.

Windows 7 x64 - Q6700 @ 2.66GHz - 4GB RAM - 8800 GTX

ZBrush - Blender


> Hopefully it'll be launched by then. One thing that applies directly to us now is the 80 ms perceptual threshold. Going by that, you could write the network code without any client-side prediction as long as the round trip stays under 80 ms, but you'd have to add client-side prediction at higher latencies or players will notice the lag between pressing a key and getting a reaction from the system.

I don't think trying to hybrid that would even be worth the hassle. As we all know, a ping value isn't like a graphics card, where you can put "requires this model" on the box; a person's ping can fluctuate wildly within a single session. It would be video-game suicide to require a ping of 80 ms or less to play an online game. The best you can say is "requires broadband", and even then many people will have a ping over 80 ms. If you are making a PC game that requires any sort of fast twitch, client-side prediction is required. There isn't much getting around that yet.

 

Add on top of that a household network where other people are doing things at the same time, or a cable line shared with the neighborhood, and it opens up for disaster. Every online game of the last ten years with client-side prediction can smoothly handle a ping up to 250 ms. Requiring anything less than that, given the network infrastructure we have today and will have for the next 5-10 years, would be a step in the reverse direction and would only shrink your user base.
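
For reference, the predict-and-reconcile pattern those games use looks roughly like this; the structures are invented for illustration, not taken from any particular engine:

```cpp
// Invented structures for illustration: the classic predict-and-
// reconcile loop behind client-side prediction.
#include <deque>

struct Input { int seq; float move; };        // one frame of player input
struct State { int lastSeq; float pos; };     // authoritative server state

struct PredictingClient {
    State local{0, 0.0f};
    std::deque<Input> pending;                // inputs not yet confirmed

    void ApplyInput(const Input& in) {
        local.pos += in.move;                 // predict immediately
        pending.push_back(in);                // remember for reconciliation
    }

    void OnServerState(const State& server) {
        local = server;                       // snap to authoritative state
        while (!pending.empty() && pending.front().seq <= server.lastSeq)
            pending.pop_front();              // discard acknowledged inputs
        for (const Input& in : pending)
            local.pos += in.move;             // replay unacknowledged inputs
    }
};
```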


I doubt they significantly rewrote Crysis and all those other games just so they would work on OnLive. I think the way it's treated is just like you have a really long controller cord that goes to another city, and a really long monitor cable that comes back.

 

Since most games list available servers by ping, it would be obvious which servers were in the same data center, even with no changes to the game, because they would have a ping of like 3 milliseconds. Even when I connect to my own machine through the nearest hub, wherever it is, I get a latency of 5 milliseconds.

 

If this works, imagine how much development cost it will eliminate. No more dealing with the PS3's funny memory management, or some alternative API someone made for one console. DirectX or OpenGL, it wouldn't matter; it would just run every PC game. And your system specs would be known, and definitely high.

 

It might not work for twitch FPS games, but I don't think driving or other games will be a problem, especially when an analog controller is used.

My job is to make tools you love, with the features you want, and performance you can't live without.


You don't necessarily even have to have separate exes for clients and server. A powerful server could render all 64 clients in the same exe. But I guess that would be more work to rewrite the game than just using a local network, or a server-to-server network, which is anyway super fast.

Ryzen 9 RX 6800M ■ 16GB XF8 Windows 11 ■
Ultra ■ LE 2.5 ■ 3DWS 5.6 ■ Reaper ■ C/C++ C# ■ Fortran 2008 ■ Story ■
■ Homepage: https://canardia.com ■


> You don't necessarily even have to have separate exes for clients and server. A powerful server could render all 64 clients in the same exe. But I guess that would be more work to rewrite the game than just using a local network, or a server-to-server network, which is anyway super fast.

This is another good question to raise. One of the most popular genres on PC is the MMO. Are you now required to host your database servers and game servers on their network for this to work? Would MMO developers be willing to give up that freedom if it's required? Piracy protection is a big part of OnLive's appeal, yet MMOs don't worry so much about piracy, since a subscription is required to play.

 

I can see the benefits to this, but I don't think what he's claiming is possible on the scale they are planning. I have a hard time believing that the amount of hardware required for this to work, plus everything that has to go right on the network side, would allow them to make a profit. I'll eat my words if it works, but so far all we've seen is hype. Why isn't it out yet? My guess is that it most likely only works correctly in large cities, and they're trying to figure out what to do about the rest of the US.


If a game is playable on OnLive in the first place, then adding a multiplayer component would be just the same as if you were playing it locally. The same constraints would still exist, and the same solutions, like client-side prediction, would still be used. The worst-case scenario is that latency would be about the same as, or slightly worse than, it is now, with a conventional server located anywhere in the world. The best-case scenario is that a game server located in the same data center would deliver better overall results, because the multiplayer latency would be completely eliminated.

 

This all rests on the idea that OnLive has managed to get "as the crow flies" data pathways from the OnLive console to the data center, whereas conventional multiplayer networking may reroute packets all over the place. Therefore, OnLive multiplayer games could conceivably be much less laggy than conventional randomly routed games.
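
To put rough numbers on those two scenarios (all figures invented for illustration, not OnLive measurements):

```cpp
// All figures invented for illustration, not OnLive measurements:
// total perceived latency is the player<->OnLive round trip plus the
// OnLive<->game-server round trip; co-location zeroes out the latter.
#include <cstdio>

int main() {
    int playerToOnLive  = 40; // ms round trip, player <-> OnLive data center
    int onLiveToGameSrv = 50; // ms round trip, OnLive <-> remote game server
    int coLocated       = 0;  // ms, game server in the same data center

    std::printf("worst case (remote game server): %d ms\n",
                playerToOnLive + onLiveToGameSrv);
    std::printf("best case (co-located server):   %d ms\n",
                playerToOnLive + coLocated);
    return 0;
}
```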

My job is to make tools you love, with the features you want, and performance you can't live without.


From what I understood, games are primarily meant to be played on the same server, and although it's possible for some players to play on a different server, it's not completely optimized for that yet and would also need higher bandwidth between the servers. At the moment the limit is 1500 miles between client and server.

 

They could solve the problem by launching their own OnLive satellites into space, which allow 100GB/s connections :)

Ryzen 9 RX 6800M ■ 16GB XF8 Windows 11 ■
Ultra ■ LE 2.5 ■ 3DWS 5.6 ■ Reaper ■ C/C++ C# ■ Fortran 2008 ■ Story ■
■ Homepage: https://canardia.com ■


From what I understand, playing the game is the same as playing on your own system. So hosts don't have to be in the same data center, or even on an OnLive system; it works just like it would if you bought the game yourself. You could host a game on an OnLive server for games that don't have dedicated servers. Games that do have dedicated servers will work just the same (a dedicated server somewhere), with the clients being run on OnLive servers. The limit is about 1000 miles to keep latency under 80 ms.

 

Satellites wouldn't help at all and would make things MUCH worse. Geostationary orbit is 22,236 miles up, so latency is VERY high when you bounce traffic off satellites. Light and radio waves travel at roughly the same speed through underground fiber as through the air, so a satellite hop would make the path about 22 times longer than OnLive's 1000-mile limit allows.
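
A quick back-of-the-envelope check, assuming vacuum light speed for the radio link and roughly two thirds of that in fiber (the constants are textbook values; the 1000-mile figure is from the post above):

```cpp
// Back-of-the-envelope satellite vs. fiber latency check.
#include <cstdio>

int main() {
    const double cVacuumMiPerS = 186282.0;             // light/radio, vacuum
    const double cFiberMiPerS  = cVacuumMiPerS * 0.66; // rough figure, glass

    const double geoAltitudeMi = 22236.0;              // geostationary orbit
    // Each direction is ground -> satellite -> ground, so a full round
    // trip covers four times the orbital altitude.
    double satRoundTripMs = 4.0 * geoAltitudeMi / cVacuumMiPerS * 1000.0;

    const double fiberMiles = 1000.0;                  // OnLive distance limit
    double fiberRoundTripMs = 2.0 * fiberMiles / cFiberMiPerS * 1000.0;

    std::printf("satellite round trip: ~%.0f ms\n", satRoundTripMs);  // ~477
    std::printf("fiber round trip:     ~%.0f ms\n", fiberRoundTripMs); // ~16
    return 0;
}
```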

Windows 7 x64 - Q6700 @ 2.66GHz - 4GB RAM - 8800 GTX

ZBrush - Blender

