I talk about DRM a lot, but I have never really found a perfect way to explain it to people who are clueless about technology. When I teach a class on digital media and go over DRM, I usually briefly cover the different methodologies and then put up the dates when they were cracked at the end of each slide. I found that it really drives the point home when I go through this buildup stage, explaining these extremely complex systems, and conclude each series of slides with “Oh, and this scheme was actually cracked 3 days after release”.
Then I usually reuse a few slides from my cryptology lecture and give them the Bob, Alice and Eve example, emphasizing how in the DRM world Alice and Eve are the same person. This example really speaks to people who understand and care about cryptography, but Fluency in Technology students usually find it a bit confusing. I’d love to have something that explains the absurdity of DRM in a way that is clearer, funnier and doesn’t involve the three most hated people in my classroom (sorry Bob, Alice and Eve).
I think I may have found an allegory that works on that level. This is how Shamus from Twenty Sided described DRM in his recent post:
In the original Monkey Island, at one point you are captured by natives who lock you in a simple bamboo hut. There is a trap door in the floor through which you may escape. If you’re dumb you can walk over to the natives once you’re out, and they will grab you and throw you back into the hut. The second time they throw you in, they add chains to the door. The next time the door is made of metal. This keeps going until eventually (if you keep going back) they have a bamboo shack with a massive steel vault door on the front, a timed lock with an alarm system on it. It looks like the front of Fort Knox.
“How he keeps getting out is almost as mysterious as why he keeps coming back.”
In a lot of ways these DRM schemes are a bamboo hut with a vault door on the front. They keep using a bigger and bigger lock and a more complex system of authentication, but it still has to run on a machine where you can edit the executable, and all the hacker has to do is go in and disable the part that says, “Do the security check.” It doesn’t matter how secure or complex or devious the security check is, if the machine’s not doing it, it’s not doing it.
I played that game, and I remember that part, but I never connected the two! It is a perfect fit! It’s vivid, funny and really gets the point across. I don’t really expect my students to actually know what The Secret of Monkey Island was. Most of them are just too young to have played it when it was still on the market, and too clueless to download it from one of the abandonware sites and play it via ScummVM. Well, maybe one or two would actually know about it. Still, the story is silly enough to work, so I’m totally stealing it.
Here is a YouTube video of that scene for your reference:
Shamus is right of course – it is very difficult to design copy protection software in a way that will be hard to crack for a 15-year-old kid armed with a debugger and a hex editor. Anything that is running on the client machine can and will be tampered with. The only way to make a game uncrackable is to have the copy protection run on a remote server and have the client simply forward user authentication. Still, that doesn’t prevent people from sharing accounts and hacking the client to do weird things, not to mention the costs of running an operation like that. Most video games that are not MMOs are really client-based applications and as such will always be vulnerable.
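To make the “disable the security check” point concrete, here is a minimal sketch in Python (all the names are made up, and a real crack would flip bytes in a binary rather than rebind a function, but the principle is identical): whatever check the client runs, the person who owns the machine can simply replace it.

```python
def security_check() -> bool:
    """Pretend this queries a disc, a serial number, or a license server."""
    return False  # our hypothetical "pirated" copy fails the legitimate check

def launch_game() -> str:
    if not security_check():
        raise SystemExit("Invalid copy -- game refuses to start.")
    return "game running"

# The cracker controls the machine, so nothing stops them from replacing
# the check itself -- the scripting equivalent of NOP-ing out a jump
# instruction in the executable:
security_check = lambda: True  # the one-line "crack": the check always passes

print(launch_game())  # → game running
```

However devious `security_check` is on the inside, the `if not security_check()` line is the single point of failure: patch the call site or the function, and the rest of the scheme never runs.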
So the second best thing you can do is to mislead and confuse the potential attacker and make his job difficult. Adamantyr posted a good example of this practice in the linked thread over at Twenty Sided:
Concerning executable cracking, Chris Crawford has a VERY good write-up of how he protected one of his games in his book “Chris Crawford On Game Design”.
In particular, he uses obfuscation techniques such as:
- Burying work code inside of recursive loops, so reading the active process stream gives the hacker a ton of noise to wade through to find the ONE interval that does something.
- Code over-writing; in other words, the program overwrites parts of itself while running in memory. This is actually really bad from a security standpoint nowadays, but it’s fiendishly clever and sadistic for the poor hacker whose worldview has just been demolished by code that changes when he’s NOT LOOKING.
- Dummy variables with obvious names that draw the hacker away from the actual important ones.
- Storing actual data in the stack garbage and fetching it in a clandestine way, like an “accidental” buffer over-run.
- Deliberately breaking the game so the legitimate version would “fix” one element of data. Otherwise the game can’t be finished.
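The last trick on the list is the sneakiest, so here is a toy sketch of it in Python (every name here is invented for illustration; Crawford’s actual implementation was of course in compiled code): the game ships with one deliberately wrong value, and the copy protection routine quietly repairs it as a side effect of a successful check.

```python
game_data = {
    "final_door_key": 0,   # shipped deliberately broken: 0 never opens the door
    "player_start": (3, 7),
}

def copy_protection_check(serial: str) -> bool:
    """Stand-in for the real check (disc lookup, code wheel, etc.)."""
    ok = serial == "VALID-SERIAL"
    if ok:
        # The hidden side effect a cracker who skips this function never sees:
        game_data["final_door_key"] = 42
    return ok

def can_finish_game() -> bool:
    # The last level only opens with the repaired value.
    return game_data["final_door_key"] == 42

# Legitimate copy: the check passes, the data gets fixed, the game is completable.
copy_protection_check("VALID-SERIAL")
print(can_finish_game())  # → True

# Cracked copy: the call to copy_protection_check() is patched out entirely,
# so the game starts fine -- but the broken value is never repaired and the
# final door never opens.
game_data["final_door_key"] = 0
print(can_finish_game())  # → False
```

This is why the cracked versions Crawford found online were not completable: removing the check also removed the fix, and the crackers had no way of knowing until someone played all the way to the end.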
He actually hired a professional hacker to try and break his program after he’d finished it, and the guy never got past the first level of defense he set up. He later found cracked versions online, but none of them were actually completable as his “flawed” data element wasn’t fixed.
I haven’t read Chris Crawford’s book, but the techniques mentioned above would indeed make the life of your average teenage cracker very difficult. However, they would make the life of your average game developer a nightmare as well. Some of these things are really bad practices. Storing data in stack garbage, controlled buffer overruns, cryptic spaghetti code – this stuff is just bad software development, plain and simple. If you are the single programmer on a project, you can probably get away with scattering stuff like that all over your code. When you are working as part of a team, this is the kind of stuff that will get you beaten up by an angry mob of coworkers who have to debug your cryptic code.
These methods do not really fit well into the modern software development model. The only way to make this sort of copy protection work is to have it tightly woven into the very fabric of your software. The copy protection checks should be tightly coupled with real processing code, overlapping and hiding behind real data in as many places as possible. But who the hell is going to test and maintain that kind of stuff? No one really does copy protection this way anymore.
These days most companies think of DRM as a security layer or a module you can buy or license and then slap onto a wide range of products. They view it as installing a lock on the bamboo hut, because that makes sense and is economical. Once you build an awesome lock, you can use it on any hut you want. Sadly, a hut is still a hut. It is made out of bamboo, which can be defeated with a hacksaw, and you can always tunnel under it since it has no floor. What Chris Crawford seems to be proposing is building a Cube-like environment instead of a hut. But that is a hard job which requires not only dedication but also experience. The problem is that most game developers are not really experts at obfuscating their code and building copy protection mechanisms into it. In fact, they are usually the exact opposite of that. They are trained to write clean and understandable code that is easy to test, easy to debug and conforms to best practices. Game development studios just want to make games.

Who insists on DRM then? The publishers, of course. They are the major driving force behind the copy protection industry because “piracy” cuts into their profits the most. And they are not experts on writing obfuscated software either. What they want is something simple like this:
- Get a nice black box containing precompiled binaries for the game from the developer studio
- Purchase another box with a complete, end-to-end DRM solution
- Pay some low-skilled employee to wrap the game proper in the DRM container, creating a master package to be burned onto CDs or DVDs
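The workflow above can be sketched in a few lines of Python (a deliberately naive toy model with invented names, not any real DRM product): the protected package is just the original game wrapped in a check, so the two touch at exactly one point – the handoff.

```python
def original_game() -> str:
    # The developer's "black box" -- it knows nothing about the DRM.
    return "game running"

def drm_container(game, license_key: str) -> str:
    # The purchased, off-the-shelf protection layer.
    if license_key != "LEGIT-KEY":
        raise SystemExit("License check failed.")
    return game()  # the single seam where DRM code and game code intersect

# What the publisher ships:
print(drm_container(original_game, "LEGIT-KEY"))  # → game running

# What the cracker ships: the inner game with the wrapper peeled off.
print(original_game())  # → game running
```

That single seam is the whole attack surface: because the game never depends on the container, stripping the container away leaves a fully working product.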
Neither game developers nor publishers are really interested in building these protection systems. They are interested in buying them, and a whole new industry grew up in response. Companies started specializing in building DRM systems as separate products. DRM is now a generic, modular piece of software designed to fit as many different products as possible in order to maximize profits. It can’t blend seamlessly into the game it is protecting or hide behind live data. By necessity, the number of places where the game code intersects with the DRM code is limited. The more you try to integrate the two, the more custom code and modifications you need – and of course, DRM makers charge for this kind of stuff at a premium rate. So the direction the game industry seems to be taking is building really complex and impressive locks to use on their bamboo huts, because it is really the only logical and economical way to do this. The other route is just plain nutty: exorbitantly expensive, and potentially creating huge maintenance problems in exchange for what? No one can guarantee success. If something runs on the client machine, it can and will be tampered with – any part of it can be overwritten or modified.
But as you can see, the whole system is deeply flawed. Sometimes I wonder how executives who make the decisions to use DRM systems such as SecuROM or StarForce react when they find out that a cracked version of their product hit the torrent sites 3 hours after the release. How do they justify the expenses they incurred to license the protection technology? Perhaps they don’t. Perhaps no one tells them these things. Perhaps they live out their lives oblivious to the truth, thinking that the millions of dollars spent on licensing some DRM product actually made their software invulnerable. More likely, though, they hide behind company policy so they can justify low sales to their stockholders, telling them stories about how evil pirates are still robbing them blind despite the strong countermeasures they took.
Anyway, if you don’t mind Shamus, I’m gonna use your Monkey Island allegory next semester when I’m teaching my class about digital media and DRM. :)
[tags]drm, copy protection, monkey island, twenty sided[/tags]