|
Cube & Cube 2 FORUM
|
Cheating & open source, revisited |
by Aardappel_
on 04/27/2005 07:54, 218 messages, last message: 06/16/2006 17:23, 188273 views, last view: 11/01/2024 13:19 |
|
As you all know, cheating is a problem for Cube being open source. No one likes the current solution of the incompatible binaries, and I am getting to the point where I see the usefulness of having other people continue to work on Cube whenever I don't have the time... currently that is problematic, and it would be much easier if the source to the official game could be truly open.
Multiplayer continues to be an important aspect of Cube, so we can't ignore cheating and simply hope that people won't change 1 line of code to enable god mode or permanent octa-damage, because they will (tell me something about human nature and the people on the interweb).
The solution can't come in the form of "cheat protection"; this simply isn't possible with the current Cube, and even if the entire gameplay code were moved serverside, it would still be fragile. Don't even suggest it... make sure you understand the nature of the client/server gameplay code before commenting.
The solution for Cube I feel has to be a social one. As you may remember, I designed a solution before:
http://wouter.fov120.com/rants/trusted_communities.html
The problem with this particular design is that it is too complex to set up, and too centralized. I would like to come up with a solution that is simpler, requires less implementation work, and can work with any group of people, centralized or not.
This is the idea I have come up with so far:
Every player that wants to play in a cheat-free environment can use a command in Cube to generate a set of key files. A key file is simply a file of, say, 10000 random bytes. The player then hands out these files to players he trusts... or rather, players he wants to trust him. (Why there are multiple files will become clear later.)
A server can either be in untrusted mode (default, works as before), or trusted mode. It can be set to trusted mode by the server admin, or voted by the players until the server empties. It will show up in the server browser as trusted.
If players A & B connect to a trusted server, A looks up B's nickname in his folder of key files. If he finds a corresponding key file, he chooses a few random file offsets and reads the bytes there. A then sends a packet to B asking for the bytes at those offsets. If B really is B, it can simply read its own key file and return the values. A compares the values, and if they match, it sends an "I trust B" packet to the server. The HUD shows which clients you trust, and for each client how many clients trust him in total. You are now sure that B really is who he says he is.
On a trusted server, people who have gained no trust after the exchange of trust packets can be booted from the server automatically. This allows you to play games with trusted people in your community, and keeps outsiders from joining the game.
Asking for random offsets guarantees that untrustworthy clients or even servers never get to sniff keys. "Trust" is evaluated locally and for you only, so it can't be spoofed.
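A minimal sketch of what the key-file side of this could look like, assuming nothing about the actual Cube code (file layout, sizes and function names are made up, and rand() stands in for a proper random source):

    // Sketch of the key-file scheme described above. Everything here is
    // illustrative; rand() stands in for a proper random source.
    #include <cstdio>
    #include <cstdlib>
    #include <vector>

    const int KEYSIZE = 10000;   // "say, 10000 random bytes"

    // generate one key file full of random bytes, to be handed to a group of friends
    void genkeyfile(const char *name)
    {
        std::vector<unsigned char> key(KEYSIZE);
        for(int i = 0; i < KEYSIZE; i++) key[i] = rand() & 0xFF;
        FILE *f = fopen(name, "wb");
        if(f) { fwrite(key.data(), 1, KEYSIZE, f); fclose(f); }
    }

    // A picks a few random offsets into the key file it holds for B...
    std::vector<int> pickoffsets(int n)
    {
        std::vector<int> offs(n);
        for(int i = 0; i < n; i++) offs[i] = rand() % KEYSIZE;
        return offs;
    }

    // ...B answers by reading its own copy of the key file at those offsets...
    std::vector<unsigned char> answerchallenge(const char *keyfile, const std::vector<int> &offs)
    {
        std::vector<unsigned char> reply;
        FILE *f = fopen(keyfile, "rb");
        if(!f) return reply;
        for(int off : offs)
        {
            fseek(f, off, SEEK_SET);
            reply.push_back((unsigned char)fgetc(f));
        }
        fclose(f);
        return reply;
    }

    // ...and A compares the reply with its own copy before sending "I trust B" to the server.
    bool verify(const char *keyfile, const std::vector<int> &offs, const std::vector<unsigned char> &reply)
    {
        std::vector<unsigned char> expected = answerchallenge(keyfile, offs);
        return !expected.empty() && expected == reply;
    }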
The one problem would be handing your key file to someone who later turns out to be untrustworthy. This person could now impersonate you and appear to be you to all your trusted friends. Hence the multiple key files, so you can give a different key file to different people (or groups of people). That way, if the person "goes bad", he can't impersonate you to your friends, as he doesn't have the key file your friends have.
The system is not perfect of course. You can still have 2 cheaters join together and trust each other. Luckily cheaters hardly ever come in groups, and there are more complicated ways to protect even against this.
The biggest issue is the inconvenience of having to exchange key files, and especially of requiring new players to find existing players on forums/irc before they can sensibly play. I think it is bearable though, as you only need to do it once, and Cube multiplayer is a fairly closed community. And if servers are untrusted by default, you can give newcomers the benefit of the doubt until they behave suspiciously.
What do you all think? Please think it through thoroughly before commenting (I am talking to you, Jean Pierre! :). I am especially interested in "holes" in this system, i.e. ways that cheaters could abuse it if they really wanted to.
|
|
|
|
#104: Re: .. |
by CC machine
on 06/12/2005 17:49, refers to #88
|
|
yes i know but it would work (if possible)
|
|
#105: Info about keys, key pair public/private and RSA |
by CrazyTB
on 06/13/2005 21:34
|
|
If keys are to be used, they must come in pairs: public/private, like in RSA. Maybe knowing how SSH and OpenSSL work could be useful. Maybe DSA could be used instead of RSA.
Short intro on RSA: when you encode a message with the private key, you need the public key to decode it. In the same way, if you encode using the public key, you need the private key to decode it.
Public and private keys are equivalent in function; the only difference between them is that the public key can be made truly public (posted on forums, e-mailed, and so on), while the private key must be kept secure (you shouldn't even copy it).
A handshake between two clients A and B could be: A uses B's public key to encode some message. Then A encodes the already encoded message using A's private key. A sends the double-encoded message to B. B uses A's public key to decode the message, then B uses B's private key to decode it again. Now the message has been fully decoded. If the message makes sense, we know that A and B really have the private keys corresponding to the public ones.
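Just to make that round trip concrete, here is a toy C++ example with textbook-sized RSA numbers. It only shows the order of the encode/decode steps; the key sizes are uselessly small, and a real implementation would use a proper library (OpenSSL or similar) with padding and all the rest:

    // Toy illustration of the A->B handshake described above, with tiny
    // textbook RSA keys. NOT secure - real code would use a crypto library.
    #include <cstdint>
    #include <cstdio>

    // modular exponentiation: base^exp mod m
    uint64_t modpow(uint64_t base, uint64_t exp, uint64_t m)
    {
        uint64_t result = 1;
        base %= m;
        while(exp > 0)
        {
            if(exp & 1) result = result * base % m;
            base = base * base % m;
            exp >>= 1;
        }
        return result;
    }

    int main()
    {
        // A's key pair (p=89, q=97): public (eA, nA), private (dA, nA)
        uint64_t nA = 8633, eA = 5, dA = 5069;
        // B's key pair (p=61, q=53): public (eB, nB), private (dB, nB)
        uint64_t nB = 3233, eB = 17, dB = 2753;

        uint64_t msg = 1234;                    // some random challenge value

        // A encodes with B's public key, then with A's own private key
        uint64_t step1 = modpow(msg, eB, nB);
        uint64_t step2 = modpow(step1, dA, nA);

        // B decodes with A's public key, then with B's own private key
        uint64_t step3 = modpow(step2, eA, nA);
        uint64_t plain = modpow(step3, dB, nB);

        // if the recovered message makes sense, A holds dA and B holds dB
        printf("sent %llu, got back %llu\n",
               (unsigned long long)msg, (unsigned long long)plain);
        return 0;
    }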
Remember: this only establishes that A is A and B is B. There is no need to encrypt normal game data, since we don't need to protect it from being sniffed or modified by a "man-in-the-middle". Encryption would make a "secure tunnel" where only the ends (the clients/servers) would be "insecure". Our main concern is to make the "ends" secure, not the "tunnel", so encryption alone won't solve our problem.
|
|
#106: Technical solution (?) |
by CrazyTB
on 06/13/2005 21:51
|
|
It isn't possible to verify the binaries, nor to verify that one function or another has not been altered, since a cheater can intercept these checks and use the "official" binary/code to answer them.
So, it isn't possible to check the client code. Solution? Make all clients and/or the server check if data sent by clients is consistent. There is no other way.
It is also possible to make the client report its ammo (for all 4 weapons), armour and health from time to time, or only under "special conditions" (conditions where these values change, like picking up items or being hurt). The server and/or clients then check whether these values are equal to the values the client should have. To make this work, the server and/or clients (I think server-only is easier in this solution) must keep all stats for all players, and must keep these stats updated. That shouldn't be difficult to do.
To avoid affecting network traffic too much, when a player gets hurt, only health and armour should be sent; when firing, only the ammo for the weapon used; and so on...
Maybe the client should only send this data once per second.
This will solve the problem of "god mode" and "unlimited ammo".
To solve the problem of "hyper damage", the server should also check whether the weapon's damage is within possible limits. This, however, may not solve all possible damage cheats. Solution? The server should choose the appropriate damage for each shot.
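A rough sketch of the server-side bookkeeping this would need (the struct layout, the armour formula and the damage limits below are all invented for illustration, not actual Cube code):

    // Sketch of server-side sanity checks on client-reported stats.
    // Names, limits and the armour model are made up for illustration.
    #include <algorithm>

    const int NUMWEAPONS = 4;

    struct serverplayer
    {
        int health, armour;
        int ammo[NUMWEAPONS];
    };

    // per-weapon maximum damage the server will accept
    const int maxdamage[NUMWEAPONS] = { 25, 50, 100, 200 };

    // the server applies hits itself, so it always knows the expected stats
    void applydamage(serverplayer &p, int damage)
    {
        int absorbed = std::min(p.armour, damage / 2);
        p.armour -= absorbed;
        p.health -= damage - absorbed;
    }

    // when a client reports its health/armour, compare with what we expect
    bool checkstats(const serverplayer &expected, int health, int armour)
    {
        return health <= expected.health && armour <= expected.armour;
    }

    // when a client fires, it must still have ammo for that weapon
    bool checkammo(serverplayer &p, int weapon)
    {
        if(p.ammo[weapon] <= 0) return false;
        p.ammo[weapon]--;
        return true;
    }

    // and the claimed damage is clamped to the weapon's limit
    int checkdamage(int weapon, int claimed)
    {
        return std::min(claimed, maxdamage[weapon]);
    }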
Only hyper-speed and walk-through-walls cheats would remain. These can be solved as people said: check whether the movement is allowed. (Maybe this could be done client-side, with clients sending an "ok" to the server?)
What do you think? Good idea/solution?
|
|
#107: Re: .. |
by Pxtl
on 06/14/2005 07:36, refers to #107
|
|
That approach bloats the netcode, and is effectively a voting system. Voting systems are really, really complicated and hard to get right.
|
|
#108: .. |
by CrazyTB
on 06/15/2005 02:03
|
|
I'm not sure, but I think some of the checks should be made on the server, just because doing them on the clients would require transferring this redundant data to all clients and collecting their responses. That adds a lot of network traffic.
I think doing the "redundant-data checks" (like ammo, health...) on the server and the "non-redundant-data checks" (like position, speed, moving/firing through walls...) on the clients would be the best solution.
Checking the ammo, health, etc. is very easy. Just keep a couple of variables for each player, do one or two additions and subtractions, and compare the resulting value with the value the client sent. I don't think this would slow down a server too much.
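For the "non-redundant" position/speed checks mentioned above, even something as simple as comparing the distance moved per update against the maximum possible speed would catch the crudest speed hacks. A sketch, with the constants invented here:

    // Sketch of a position/speed sanity check a client (or the server) could
    // run on another player's updates. MAXSPEED and MARGIN are invented.
    #include <cmath>

    struct vec3 { float x, y, z; };

    const float MAXSPEED = 100.0f;   // max units per second the game allows
    const float MARGIN   = 1.25f;    // slack for lag and interpolation

    // returns false if the player moved further than physically possible
    bool plausiblemove(const vec3 &oldpos, const vec3 &newpos, float dt)
    {
        float dx = newpos.x - oldpos.x;
        float dy = newpos.y - oldpos.y;
        float dz = newpos.z - oldpos.z;
        float dist = std::sqrt(dx*dx + dy*dy + dz*dz);
        return dist <= MAXSPEED * MARGIN * dt;
    }

A walk-through-walls check would additionally need to trace the segment from the old to the new position against the map geometry, which the clients already have loaded anyway.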
|
|
#109: .. |
by pushplay
on 06/15/2005 03:05
|
|
I think what hasn't been said here is that this whole verification system will beat some hacks, but it will never beat every hack. So you'll put yourself in a situation where you make the program uglier and uglier while the protocol gets thicker and thicker, in an arms race to try to keep up with the latest clever hack. You can't win that way, you can only spend tons of effort to minimize loss, and no one here is getting paid for that.
|
|
#110: Re: .. |
by Aardappel_
on 06/15/2005 03:58, refers to #109
|
|
amen.
|
|
#111: .. |
by CrazyTB
on 06/16/2005 03:18
|
|
So... what to do? No one wants cheaters, but there will be some.
Maybe you could take a look at the "fuhquake" game. It is open source and can be recompiled, but it uses a closed-source DLL (dynamic link library) just to authenticate with the server. I really don't know how this works, I have never used fuhquake; I'm just repeating what a friend (who uses Gentoo) told me.
|
|
#112: Re: .. |
by remouk_
on 06/16/2005 11:42, refers to #111
|
|
This sounds really good, and it may not require that much time to build an official release of this DLL.
|
|
#113: Re: Alternatives |
by pushplay
on 06/17/2005 02:15, refers to #112
|
|
I don't think you've quite thought this through. Now you don't even know whether players disappearing and multiple people picking up ammo boxes are cheating or just the effects of an ignored player, so you can't justifiably ignore someone.
|
|
#114: What might work... |
by enigma_0Z
on 06/17/2005 23:03
|
|
This should work...
Identify players with the masterserver by their MAC addresses ONCE. Create a unique key file for each player. Hold players accountable to these key files. You can develop a rating system for players of, say, 0-1000, starting them off at 500.
Servers decide how they want to run by that trust rating: servers can run "untrusted" by setting their minimum trust to 0, default to 500, or allow only the best players at 1000... You could limit it so that a player can only vote against another player once a game, or once every 5 minutes. Then have the server check whether one player is bashing another (voting against him or her every chance he or she gets, for instance), and kick the voting player, unless it's a lot of players voting against one.
Basically, you need three things IMO:
1) Somewhat of an identity
2) A democracy for kick/ban/trust/whatever and a community-created rating for that
3) A way of preventing people from voting down someone just because he/she is better.
Either we can make voting so inconvenient that only people who are legitimately voting will bother, or something else...
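A sketch of what the rate-limited voting could look like on the server side (the numbers, the cooldown and the names here are invented, just to show the bookkeeping involved):

    // Sketch of a per-player trust rating with rate-limited down-votes.
    // The 500 start value, 5 minute cooldown and step size are invented.
    #include <ctime>
    #include <map>
    #include <string>

    struct playerrating
    {
        int rating = 500;                         // 0..1000, start at 500
        std::map<std::string, time_t> lastvote;   // last time each voter voted against this player
    };

    const int COOLDOWN = 5 * 60;                  // one vote per voter per 5 minutes

    // returns true if the vote was accepted
    bool voteagainst(playerrating &target, const std::string &voter, time_t now)
    {
        auto it = target.lastvote.find(voter);
        if(it != target.lastvote.end() && now - it->second < COOLDOWN)
            return false;                         // looks like bashing: ignore it (or kick the voter)
        target.lastvote[voter] = now;
        if(target.rating > 0) target.rating -= 10;
        return true;
    }

A server would then simply refuse players whose rating is below its configured minimum trust.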
Would it be possible to keep the net code closed source and open-source everything else of Cube? We could have people work on things and keep the netcode a black box...
|
|
#115: Re: What might work... |
by tentus
on 06/17/2005 23:25, refers to #114
|
|
seems like a logical approach. also, something i don't think has been proposed: why not have the trust system for people who use "unofficial" binaries, but let people who have the "official" binary, complete with whatever is necessary to verify that it's the real deal, be outside the trust system until they make their own build.
also, why don't people use the "official" version for multiplayer and use their custom versions for mapping and such (i realize that there are tons of cases where this wouldn't work). i use quad to help me out in my mapping sometimes, but i've never used it for mapping. does that make sense to anyone else?
|
|
#116: Re: What might work... |
by tentus
on 06/17/2005 23:26, refers to #115
|
|
mehhhh, can't type today.
>i've never used quad for multiplayer.
sorry about that.
|
|
#117: .. |
by hungerburg
on 06/17/2005 23:31
|
|
hello,
your simple solution fails in contact with reality/me: I want to stuff the machinegun as full as I can, from all available ammo (there may be more than one box in a map ;) I too see that there is an arms race to avoid! the cube way as of now works quite well; on the servers I generally don't feel haunted by cheating lamers. there is some obscure algo in the network code and obviously nobody has yet taken the effort to crack it. map cheats seem to be rarer than engine bugs ;)
aard opened this thread by suggesting that only people who trust each other play each other. One problematic point in his implementation - that of impersonation by use of a "public secret" - can, I think, easily be fixed by using public key cryptography. Altered suggestion:
At first, players create a public and private key pair using standard tools (think openssl, available for all platforms under a BSD license - so it can be embedded in cube/sauer!). To not make this depend on the nick, the public key is the global identifier of a specific, single player; let's call any two of them Alice and Bob (A and B, like in the original example).
When joining a game, along with her nick, Alice also hands out her public key. Peers AND server use this key to encrypt some random string and send it to her; if she can then return the original, she has to have the private key, so it must be her! Vice versa, Alice also tests the other peers the same way. E.g. this way she might learn that the other player really is the person she used to know as "Bob" - because Alice, like any player, keeps a list of other keys (labelled by nicks, though that is just a convenience, as they might change). There has to be an interface in game to flag some of these as un/trusted (and to prune ones that have not been seen for a year or so, etc...). If she knows that she trusts Bob, she can tell the server so.
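The local list of known keys described here might look roughly like this - a sketch only, with invented names, and with the challenge itself assumed to work like the public-key round trip sketched earlier in the thread:

    // Sketch of Alice's local list of known players, keyed by public key;
    // the nick is only a convenience label. All names are invented.
    #include <ctime>
    #include <map>
    #include <string>

    struct knownplayer
    {
        std::string nick;       // convenience only, may change at any time
        bool trusted = false;   // flagged via some in-game interface
        time_t lastseen = 0;    // so stale entries can be pruned
    };

    // the public key is the global identifier of a single player
    std::map<std::string, knownplayer> knownplayers;

    // called once a peer has answered its challenge correctly
    void sawplayer(const std::string &publickey, const std::string &nick, time_t now)
    {
        knownplayer &p = knownplayers[publickey];
        p.nick = nick;
        p.lastseen = now;
    }

    // tell the server whether we trust the player behind this key
    bool trustplayer(const std::string &publickey)
    {
        auto it = knownplayers.find(publickey);
        return it != knownplayers.end() && it->second.trusted;
    }

    // prune keys not seen for a year or so
    void pruneknownplayers(time_t now)
    {
        const time_t YEAR = 365 * 24 * 60 * 60;
        for(auto it = knownplayers.begin(); it != knownplayers.end(); )
            if(now - it->second.lastseen > YEAR) it = knownplayers.erase(it);
            else ++it;
    }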
PROS:
- a web forum is not necessary at all, as public key exchange between peers can be handled in game
- if keys are exchanged for each game, the server does not have to keep a log of keys seen; only the client does
- no two public keys are equal
CONS:
- the server still has to be trusted
- bummer! clients behind NAT cannot be queried this way! (this also applies to aard's suggestion)
peter
|
|
#118: Re: Alternatives |
by Gilt
on 06/18/2005 01:41, refers to #112
|
|
re: ignore-world
goddamn, I really like this idea! I mean... it's just so... deadpan crazy! A true five-bagger!
the more I think about it, the more I can't help but applaud with awe. The complete lateral thinking used here is incredibly inspired. I mean, it's essentially taking what attempts to be an objective reality and totally turning it into a subjective experience. it'd be like playing in a completely dissociated-ostrich version of a world. I cannot even begin to fathom the inter-connected dark matter effects that would occur at the nexus of the storm of collapsing dimensionality that is this multi-verse. imagine a game where you can be intensely playing one-on-one with an opponent who is SIMULTANEOUSLY playing in a free-for-all with 3 other people who are each playing a game of CTF, 3v3 and last-man-standing respectively!!!!
I want to watch the movie based on this idea, written by Philip K. Dick.
this is honestly one of the only truly innovative ideas I've read here. this is what life is all about. sometimes you've just got to bust out with a giant mutant rainbow ostrich and see what happens. Of course, aard being the ruthless mother nature that he is ( :p ), would never let it live too long... but at least you get to see it strut its stuff for a little while...
|
|
#119: Re: Alternatives |
by Aardappel_
on 06/18/2005 10:56, refers to #118
|
|
I thought about it for a second, as it has many good properties. It is of course dead simple, with no central administration, for one. The fact that people are potentially playing different games is, to me, not problematic, and it would fit nicely with Cube's thick client model.
One problem with it is that it costs the players more effort to combat cheaters than it costs the cheater to keep on annoying people. The cheater cheats, people recognise it, and slowly everyone on the server puts him on "ignore". He sees this, quits, reconnects with his ISP, and rejoins the game with a different nick and IP. Players then have to give someone joining the game the benefit of the doubt before they can ignore him again. Repeat.
The problem is worse because of the kinds of cheats someone with full source code can make. First of all, you can make really annoying cheats, for example instantly killing everyone on a keypress. Second, you can make cheats that give you a huge advantage but are almost indistinguishable from just being a good player (such as radar, seeing through walls, stats showing enemy health, a countdown in seconds until the next quad/armour, minor ammo/health cheats, etc.).
One way to make it harder to rejoin after being detected is to move to a "round"-based system, where you have to spectate until the next round when you join a server. That way the penalty for being ignored may offset how cheap it is to rejoin.
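A minimal sketch of that join rule, assuming nothing more than a per-client spectator flag (the structure and names are invented):

    // Sketch of "spectate until the next round": anyone who connects while a
    // round is in progress is parked as a spectator and released at the round
    // change. Structure and names are invented for illustration.
    #include <vector>

    struct clientinfo
    {
        bool spectator = false;
    };

    std::vector<clientinfo *> clients;

    // called when a client connects
    void onconnect(clientinfo *ci, bool roundinprogress)
    {
        ci->spectator = roundinprogress;   // mid-round joiners have to wait
        clients.push_back(ci);
    }

    // called when the current round ends: everyone waiting gets to play
    void onroundend()
    {
        for(clientinfo *ci : clients) ci->spectator = false;
    }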
|
|
#120: .. |
by remouk=
on 06/18/2005 12:13
|
|
/ignore ? What a pretty good idea! Imagine, if you think a player is cheating, you just /ignore him! Then you won't see him, and he won't see you anymore. On the other hand, it could also be a way to cheat, so let's think about it. :)
|
|
#121: .. |
by hungerburg
on 06/18/2005 13:29
|
|
cheaters are just a nuisance - why are they so special? Because everyone agrees that they annoy! Now, how to kick all the cheaters?
If the goal was instead to only play with a select number of friends whom I trust, a password on joining the server would be just as useful as a trust scoring system! This of course depends on another channel to share the password - while the trust meter depends on a third party to tabulate the score. There is always a third party to any game of Cube - it's the server. In my post 132 I tried to improve upon aard's suggestion of how to reliably tell who's who.
The "ignore" idea is indeed totally different: it works like in a chat - it's one step before the "kick", i.e. no consensus/vote needed. In a chat there are no cheaters - though there may be nuisances. This is no longer about a closed group with external ties or a referee.
What remains: 1. even a very good player may annoy other players; 2. how to trust strangers (if not by default)?
|
|
#122: Re: General long-term strategy |
by jean pierre
on 06/18/2005 14:47, refers to #122
|
|
I am better skilled than almost anyone at playing Cube multiplayer, and most say I have a hack that aims automatically at the other person; that's not true, I'm just advanced in Cube/Sauer.
|
|
#123: Re: General long-term strategy |
by makkE
on 06/18/2005 14:57, refers to #122
|
|
Lol
|
|