Cube & Cube 2 FORUM


Cheating & open source, revisited

by Aardappel_ on 04/27/2005 07:54, 218 messages, last message: 06/16/2006 17:23, 188238 views, last view: 11/01/2024 11:26

As you all know, cheating is a problem for Cube being Open Source. No one likes the current solution of the incompatible binaries, and I am getting to the point where I see the usefulness of having other people continue to work on Cube whenever I don't have the time... currently that is problematic, and it would be much easier if the source to the official game could be truly open.

Multiplayer continues to be an important aspect of Cube, so we can't ignore cheating and simply hope that people won't change one line of code to enable god mode or permanent octa-damage, because they will (which tells you something about human nature and the people on the interweb).

The solution can't come in the form of "cheat protection"; this simply isn't possible with the current Cube, and even if the entire gameplay code were moved server-side, it would still be fragile. Don't even suggest it... make sure you understand the nature of the client/server gameplay code before commenting.

The solution for Cube I feel has to be a social one. As you may remember, I designed a solution before:
http://wouter.fov120.com/rants/trusted_communities.html
The problem with this particular design is that it is too complex to set up, and too centralized. I would like to come up with a solution that is simpler, less implementation work, and can work with any group of people, centralized or not.

This is the idea I came up with so far:

Every player that wants to play in a cheat free environment, can use a command in Cube to generate a set of key files. A key file is simply a file of, say, 10000 random bytes. The player then hands out these files to players he trusts... or rather, players he wants to trust him. (why there are multiple files will become clear later).
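(For illustration, a minimal sketch of what generating such a set of key files could look like, in plain C++. The function name, file naming, number of files and byte count are made up here, and a real implementation would want a cryptographically strong random source rather than a plain Mersenne Twister.)

#include <fstream>
#include <random>
#include <string>

// Sketch only: write several files of random bytes for one player.
// Names, counts and sizes are illustrative, not Cube's actual command.
void generatekeyfiles(const std::string &nick, int numfiles = 4, int numbytes = 10000)
{
    std::random_device rd;                        // seed; a real version needs a CSPRNG
    std::mt19937 gen(rd());
    std::uniform_int_distribution<int> byte(0, 255);
    for(int i = 0; i < numfiles; i++)
    {
        std::ofstream out(nick + "_key" + std::to_string(i) + ".bin", std::ios::binary);
        for(int j = 0; j < numbytes; j++) out.put((char)byte(gen));
    }
}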

A server can either be in untrusted mode (default, works as before) or trusted mode. It can be set to trusted mode by the server admin, or voted into trusted mode by the players (lasting until the server empties). It will show up in the server browser as trusted.

If players A and B connect to a trusted server, A looks up B's nickname in his folder of key files. If he finds a corresponding key file, he chooses a few random file offsets and reads the bytes there. He then sends a packet to B asking for the bytes at those offsets. If B really is B, he can simply read his own key file and return the values. A now compares the values, and if they match, he sends an "I trust B" packet to the server. The HUD shows which clients you trust, and for each client, how many clients trust him in total. You are now sure that B really is who he says he is.
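(A rough sketch of that challenge/response in C++, assuming A already has a copy of B's key file loaded into memory; networking, packet formats and error handling are left out, and all names are illustrative.)

#include <cstdlib>
#include <vector>

struct challenge
{
    std::vector<size_t> offsets;            // sent to B
    std::vector<unsigned char> expected;    // kept locally by A
};

// A's side: pick a few random offsets into B's key file and remember the bytes there.
challenge makechallenge(const std::vector<unsigned char> &bkeyfile, int n = 8)
{
    challenge c;
    for(int i = 0; i < n; i++)
    {
        size_t off = (size_t)rand() % bkeyfile.size();
        c.offsets.push_back(off);
        c.expected.push_back(bkeyfile[off]);
    }
    return c;
}

// B's side: answer with the bytes at the requested offsets of its own key file.
std::vector<unsigned char> answerchallenge(const std::vector<unsigned char> &mykeyfile,
                                           const std::vector<size_t> &offsets)
{
    std::vector<unsigned char> reply;
    for(size_t off : offsets) reply.push_back(mykeyfile[off]);
    return reply;
}

// A's side again: if the reply matches, send the "I trust B" packet to the server.
bool verifyreply(const challenge &c, const std::vector<unsigned char> &reply)
{
    return reply == c.expected;
}

Because only a handful of bytes at random offsets are revealed per challenge, neither the server nor an eavesdropper learns anything useful about the key file itself.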

On a trusted server, people who have gained no trust after the exchange of trust packets can be booted from the server automatically. This allows you to play games with trusted people in your community, while outsiders are unable to join the game.

Asking for random offsets guarantees that untrustworthy clients, or even servers, never get to sniff keys. "Trust" is evaluated locally and for you only, so it can't be spoofed.

The one problem would be handing your key file to someone who later turns out to be untrustworthy. This person could now impersonate you and appear to be you to all your trusted friends. Hence the multiple key files, so you can give a different key file to different people (or groups of people). That way, if the person "goes bad", he can't impersonate you towards your friends, as he doesn't have the key file your friends have.

The system is not perfect of course. You can still have 2 cheaters join together and trust each other. Luckily cheaters hardly ever come in groups, and there are more complicated ways to protect even against this.

The biggest issue is the inconvenience of having to exchange key files, and especially requiring new players to find existing players on forums/IRC before they can sensibly play. I think it is bearable though, as you only need to do it once, and Cube multiplayer is a fairly closed community. And if servers are untrusted by default, you can give newcomers the benefit of the doubt until they behave suspiciously.

What do you all think? Please think it through thoroughly before commenting (I am talking to you, Jean Pierre! :). I am especially interested in "holes" in this system, i.e. ways that cheaters could abuse it if they really wanted to.


#95: Re: Simple Idea

by jean pierre on 06/11/2005 16:06, refers to #94

Yeah yeah and Aard will make it in Sauer.


#96: Re: Lightweight server-side cheat detection

by quirk on 06/11/2005 17:17, refers to #91

"Defensive running-speed cheats are harder to handle --- I don't currently see a lightweight solution there."

Could you just have the server store the last position of every player and the time they were there, and when an update arrives from that player, check whether the distance moved is greater than the distance they could have covered in that time?
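(A sketch of that check in C++; MAXSPEED and the slack factor are made-up values, and a real server would have to allow for lag spikes, respawns, teleports and jumppads or it will flag honest players.)

#include <cmath>

struct vec3 { float x, y, z; };

struct lastseen
{
    vec3 pos;
    float time;   // seconds
};

const float MAXSPEED = 100.0f;   // illustrative units per second
const float SLACK    = 1.5f;     // tolerance for jitter and bunched packets

// Returns false if the player covered more distance than possible since the last update.
bool plausiblemove(lastseen &ls, const vec3 &newpos, float now)
{
    float dx = newpos.x - ls.pos.x, dy = newpos.y - ls.pos.y, dz = newpos.z - ls.pos.z;
    float dist = sqrtf(dx*dx + dy*dy + dz*dz);
    float dt = now - ls.time;
    bool ok = dt <= 0 || dist <= MAXSPEED * dt * SLACK;
    ls.pos = newpos;
    ls.time = now;
    return ok;
}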


#97: Re: Lightweight server-side cheat detection

by eihrul on 06/11/2005 17:31, refers to #96

That wouldn't work: think about a cheat that lets you run through walls.

But anyway, ANY technical solution is doomed to failure, short of a dumb client, or maybe a magical compiler that actually verified all your sources were unmodified when compiling them, or DRM, or voodoo.

That would be the reason why Aardappel wants a social solution to the problem.


#98: Re: Lightweight server-side cheat detection

by quirk on 06/11/2005 18:06, refers to #97

Did I say it would work on a cheat that would let you run through walls?

Any technical solution *that relies on the client doing any sort of checking* is doomed to failure. The only plausible technical cheat prevention has to be done by the server, and the only thing it can do is sanity-check the data that the clients send it. This won't catch all cheats (aimbots, wall hacks) but it will catch some.

As far as the social solution goes, I think it's a good idea. Someone suggested that morons might trust each other to increase their status. I would suggest that trust given by a highly trusted person should be worth more than trust from someone who is trusted by fewer people.

I think the only suggestion I would make to Aard's proposal (if being decentralised is really an issue) would be to have a public and private key setup, but in reverse: only you can send messages from you, but anyone can decrypt them. In effect, you keep the public key private and tell everyone your private key. Each player is known by the key they give out, and each player keeps a list of who they trust.

There may be a small (large) flaw in this: in a public-key system, can the public key be generated from the private key?
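(That "reversed" key setup is essentially a digital signature: everyone gets the verification half of the key, only you keep the signing half. A toy illustration with tiny textbook-RSA numbers follows; the numbers are far too small for real use and a real implementation would use a proper crypto library. As for the flaw question, common RSA private-key formats store the public exponent alongside the private one, so the public half can typically be recovered from a private key file.)

#include <cstdint>
#include <cstdio>

// modular exponentiation; values stay small enough for 64-bit arithmetic here
uint64_t modpow(uint64_t b, uint64_t e, uint64_t m)
{
    uint64_t r = 1; b %= m;
    while(e)
    {
        if(e & 1) r = r * b % m;
        b = b * b % m;
        e >>= 1;
    }
    return r;
}

int main()
{
    const uint64_t n = 3233, e = 17, d = 2753;   // toy key pair: n = 61 * 53

    uint64_t challenge = 1234;                   // random value a peer asks to be signed
    uint64_t sig = modpow(challenge, d, n);      // only the holder of d can compute this
    uint64_t check = modpow(sig, e, n);          // anyone holding (e, n) can verify it

    printf("signature %s\n", check == challenge ? "verifies" : "fails");
    return 0;
}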


#99: other projects

by Sparr on 06/11/2005 18:06

Has anyone put any research into what bzflag does to prevent cheating? 10+ years of open-source online competitive gaming, they must be doing something right.


#100: Re: ..

by quirk on 06/11/2005 18:08, refers to #98

That fits in with what I said: to spoof a location but then claim a hit from where they're really standing, at some point they have to 'jump'. The server sees this and points the shadowy finger of doom...


#101: Re: other projects

by quirk on 06/11/2005 18:11, refers to #99

From the FAQ...

"Yeah, unfortunately. Because of the open nature of BZFlag, its not too difficult for someone with programming experience to create a cheat client. At the moment the best solution is to find the server admin or just change servers when a cheater shows up."

I guess it's not such a fast-paced game; e.g. unless you have the laser you can't use an aimbot, and unless you are bouncing off three walls it's pretty easy to aim at a stationary target anyway.


#102: Re: ..

by quirk on 06/11/2005 19:39, refers to #102

Sorry, maybe I wasn't clear; that was part of my response to Aard's "social solution" as opposed to a technical one.


#103: Re: ..

by HopFrog on 06/11/2005 23:39, refers to #96

On the contrary Morgaine, I think you're missing the point of what I said. I fully recognize that many clients will be compiled for various reasons, such as platform ports, editing features, etc.

All I'm saying is that the information regarding whether they've done so should be available to the server admins (or possibly players, but I don't know if I like that idea because mob rule can be a nasty thing), because I'd imagine that most users use a stock binary while a minority uses custom compiles.

If that presumption is in fact the case, then a basic code-and-compile cheater (which, along with a couple of other things, probably accounts for the majority of Cube cheaters, another presumption) would at least be visible, because they'd report an "unofficial" binary if a server admin looked into that information, though they would still be able to play without barriers.

Maybe I didn't make my idea clear enough initially; it certainly wouldn't be the first time that's happened. It just seems like, at least as a patchwork measure in the short term while work is done on a larger solution, this would provide a quick and dirty fix for many problems while not excluding custom-compiled binaries from the gaming fun. :)


#104: Re: ..

by CC machine on 06/12/2005 17:49, refers to #88

Yes, I know, but it would work (if possible).


#105: Info about keys, key pair public/private and RSA

by CrazyTB on 06/13/2005 21:34

If keys are to be used, they should come in pairs: public/private, like in RSA. Maybe knowing how SSH and OpenSSL work could be useful. Maybe DSA could be used instead of RSA.

Short intro on RSA: when you encode a message with the private key, you need the public key to decode it. In the same way, if you encode using the public key, you need the private key to decode it.

The public and private keys are equivalent in function; the only difference between them is that the public key can be made genuinely public (posted on forums, e-mailed, and so on), while the private key must be kept secure (you shouldn't even copy it).

A handshake between two clients A and B could work like this: A uses B's public key to encode some message. Then A encodes the already-encoded message using A's private key and sends the double-encoded message to B. B uses A's public key to decode the message, then uses B's private key to decode it again. Now the message has been fully decoded. If the message makes sense, we know that A and B really hold the private keys corresponding to their public ones.
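(A toy sketch of that double-encoded handshake with two tiny textbook-RSA key pairs; the numbers are made up and far too small for real use, and A's modulus is chosen larger than B's so the intermediate value always fits.)

#include <cstdint>
#include <cstdio>

uint64_t modpow(uint64_t b, uint64_t e, uint64_t m)
{
    uint64_t r = 1; b %= m;
    while(e)
    {
        if(e & 1) r = r * b % m;
        b = b * b % m;
        e >>= 1;
    }
    return r;
}

int main()
{
    const uint64_t nB = 3233, eB = 17, dB = 2753;   // B's toy key pair (61 * 53)
    const uint64_t nA = 5561, eA = 17, dA = 4457;   // A's toy key pair (67 * 83)

    uint64_t msg = 1234;                 // agreed challenge value, msg < nB

    // A: encode with B's public key, then with A's private key, send c2
    uint64_t c1 = modpow(msg, eB, nB);
    uint64_t c2 = modpow(c1, dA, nA);

    // B: undo A's step with A's public key, then decode with B's private key
    uint64_t r1 = modpow(c2, eA, nA);
    uint64_t r2 = modpow(r1, dB, nB);

    printf("handshake %s\n", r2 == msg ? "checks out" : "fails");
    return 0;
}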

Remember: this only establishes that A is A and B is B. There is no need to encrypt normal game data, since we aren't trying to protect it from being sniffed or modified by a man-in-the-middle. Encryption would give us a "secure tunnel" where only the ends (the clients/servers) remain "insecure". Our main concern is to make the ends secure, not the tunnel, so encryption alone won't solve our problem.


#106: Technical solution (?)

by CrazyTB on 06/13/2005 21:51

It isn't possible to verify the binaries, nor to verify that one function or another has not been altered, since a cheater can intercept these checks and use the "official" binary/code to answer them.

So, it isn't possible to check the client code. Solution? Make all clients and/or the server check if data sent by clients is consistent. There is no other way.

It is also possible to make the client report its ammo (for all 4 weapons), armour and health from time to time, or only under "special conditions" (conditions where these values change, like picking up items or being hurt). The server and/or clients then check whether these values match the values the client should have. To make this work, the server and/or clients (I think server-only is easier here) must keep all stats for all players and keep them updated. That shouldn't be difficult to do.

To avoid affecting network traffic too much, when a player gets hurt, only health and armour should be sent; when firing, only the ammo for the weapon used needs to be sent; and so on...

Maybe the client should only send this data once per second.

This will solve the problem of "god mode" and "unlimited ammo".

To solve the problem of "hyper damage", the server should also check whether the weapon's damage is within possible limits. This, however, may not catch all possible damage cheats. Solution? The server should choose the appropriate damage for each shot.

Only hyper-speed and walk-through-walls cheats would remain. These can be handled as people have said: check whether a movement is allowed. (Maybe this could be done client-side, with clients sending an "ok" to the server?)
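(A rough sketch of the per-player bookkeeping this implies on the server; the field names, weapon count and limits are illustrative, not Cube's actual ones.)

const int NUMWEAPONS = 4;

// the server's own idea of a player's stats, updated from events it already sees
struct serverstats
{
    int health, armour;
    int ammo[NUMWEAPONS];
};

void onhurt(serverstats &s, int damage)          { s.health -= damage; }
void onpickuphealth(serverstats &s, int amount)  { s.health += amount; if(s.health > 100) s.health = 100; }
void onshot(serverstats &s, int weapon)          { s.ammo[weapon]--; }

// when the client reports its values, anything better than the server's copy is suspect
bool consistent(const serverstats &s, int health, int armour, const int ammo[NUMWEAPONS])
{
    if(health > s.health || armour > s.armour) return false;
    for(int i = 0; i < NUMWEAPONS; i++) if(ammo[i] > s.ammo[i]) return false;
    return true;   // false -> possible god mode or unlimited ammo
}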

What do you think? Good idea/solution?


#107: Re: ..

by Pxtl on 06/14/2005 07:36, refers to #107

That approach bloats the netcode, and is effectively a voting system. Voting systems are really, really complicated and hard to get right.


#108: ..

by CrazyTB on 06/15/2005 02:03

I'm not sure, but I think some of the checks should be done on the server, simply because doing them on the clients means transferring this redundant data to all clients and collecting responses from them. That adds a lot of network traffic.

I think doing the "redundant-data checks" (like ammo, health...) on the server and the "non-redundant-data checks" (like position, speed, moving/firing through walls...) on the clients would be the best solution.

Checking the ammo, health, etc. is very easy. Just keep a couple of variables for each player, do one or two additions and subtractions, and compare the resulting value with the value the client sent. I don't think this would slow a server down too much.


#109: ..

by pushplay on 06/15/2005 03:05

I think what hasn't been said here is that this whole verification system will beat some hacks, but it will never beat every hack. So you'll put yourself in a situation where you make the program uglier and uglier while the protocol gets thicker and thicker, in an arms race to try to keep up with the latest clever hack. You can't win that way, you can only spend tons of effort to minimize loss, and no one here is getting paid for that.


#110: Re: ..

by Aardappel_ on 06/15/2005 03:58, refers to #109

amen.


#111: ..

by CrazyTB on 06/16/2005 03:18

So... what to do? No one wants cheaters, but there will be some.

Maybe you could take a look at the "fuhquake" game. It is open source and can be recompiled, but it uses a closed-source DLL (dynamic link library) just to authenticate with the server. I really don't know how this works; I never used fuhquake, I'm just repeating what a friend (who uses Gentoo) told me.


#112: Re: ..

by remouk_ on 06/16/2005 11:42, refers to #111

This sounds really good, and it probably wouldn't take that much time to build an official release of this DLL.


#113: Re: Alternatives

by pushplay on 06/17/2005 02:15, refers to #112

I don't think you've quite thought this through. Now you don't even know whether players disappearing and multiple people picking up ammo boxes is cheating or just the effect of an ignored player, so you can't justifiably ignore someone.


#114: What might work...

by enigma_0Z on 06/17/2005 23:03

This should work...

Identify players with the masterserver by their MAC addresses ONCE. Create a unique key file for each player and hold players accountable to these key files. You could develop a rating system for players of, say, 0-1000, starting them off at 500.

Servers decide how they want to run based on that trust rating: a server can run "untrusted" by setting its minimum trust to 0, default to 500, or admit only the best players at 1000... You could limit it so that a player can only vote against another player once a game, or once every 5 minutes. Then have the server check whether one player is bashing another (voting against him or her at every opportunity, for instance) and kick the voting player, unless it's a lot of players voting against one.
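(A small sketch of that rate-limited voting, using the 0-1000 range, the 500 starting point and the 5-minute cooldown suggested above; identity lookup, persistence and the actual masterserver protocol are left out, and all names are made up.)

#include <ctime>
#include <map>
#include <string>

struct trustrecord
{
    int rating = 500;                               // everyone starts at 500
    std::map<std::string, time_t> lastvoteagainst;  // voter -> time of their last down-vote
};

const int COOLDOWN = 5 * 60;   // one vote against a given player per 5 minutes

// returns false if the vote is ignored because the voter is spamming
bool voteagainst(trustrecord &target, const std::string &voter, time_t now)
{
    time_t &last = target.lastvoteagainst[voter];
    if(last != 0 && now - last < COOLDOWN) return false;
    last = now;
    if(target.rating > 0) target.rating--;
    return true;
}

// each server picks its own threshold; a minimum of 0 behaves like an untrusted server
bool mayjoin(const trustrecord &player, int servermintrust)
{
    return player.rating >= servermintrust;
}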

Basically, you need three things IMO:

1) Somewhat of an identity
2) A democracy for kick/ban/trust/whatever and a community-created rating for that
3) A way of preventing people from voting down someone just because he/she is better.

Either we can make voting so inconvenient that only people who are voting legitimately will bother, or something else...

Would it be possible to keep the net code closed source and open-source everything else of Cube? We could have people work on things and keep the netcode a black box...

