Game Developers – Stop Failing at Online Multiplayer

Independent video games are all the rage lately. Minecraft and Angry Birds are the premier examples of how games developed outside of the big commercial industry can become huge successes. This is largely thanks to platforms like Steam and other digital stores, which give these games vast exposure. After years of cookie-cutter genre entries, we finally have a new fountain of ludic innovation in the video game world.

For all the greatness in the indie video game scene, there is also a great deal of fail. Because indie developers lack the resources of giant corporations like Nintendo or Electronic Arts, there are certain flaws in their games that we have to accept. The controls won't be as polished and smooth. There might be graphical glitches on certain video cards. The game might crash under weird circumstances. Without the money or time for thorough testing, this is just a reality we must accept. I forgive indie games for these kinds of flaws.

That being said, there is one area in which failure is absolutely not acceptable: online multiplayer. It seems that just about every week a cool new multiplayer game comes out on Steam, but the networking is a complete disaster. This is absolutely unacceptable. If it's primarily a single-player game with a small online component, then it's no big deal if that part doesn't work. However, if it's primarily an online game, then the game may as well not exist if the networking is busted.



Why Not a Little Open?

If you look around at the world of hardware, there is one thing that is immediately obvious. The open hardware is shit compared to the closed hardware. In terms of industrial design, battery life, price, and just about every category other than openness, the closed devices are superior. The thing is, it doesn’t have to be this way, and the electronics manufacturers would actually stand to make much more money if they bucked the trend.


How to Play Doom

Just yesterday I read this fascinating piece by Stephen Totilo of Kotaku entitled The First Time I Played Doom Was Yesterday. It’s remarkable that even a video game journalist with an incredibly long tenure could have completely missed playing Doom, one of the most important video games in history.

What fascinated me even more was that he had trouble actually getting the game to work. Even in this day and age, when people can run Doom on everything from iPods and pocket watches to microwave ovens, it’s very strange that someone would have trouble running it on a desktop x86 PC. Because I think it is so important that every person, and especially every gamer, should at least have experience with this game, I present to you a tutorial on how to play Doom.

The first step is acquiring the game. This isn’t really as obvious as you might think. You see, the Doom engine has been open source for a very long time. The only part that is not free and open source is the game data itself. The maps, the art, the textures, all those things are still commercially owned by id Software, and you must pay money to acquire them legally.

The Doom engine stores all of those things in WAD files. An entire game for the Doom engine is stored in a single WAD file. You can, of course, acquire them illegally. If you don’t pay money, but want to stay legal, you can only acquire the Doom demo version WAD files. Personally I suggest you go on Steam and purchase the id super pack. It’s a great deal that will also give you Quake, Quake 2, Commander Keen, and many other games which are required playing. Make sure after you purchase the games that you install them in Steam to get the files downloaded to your machine. It should take a matter of seconds on a fast connection.

Now, if you are running Windows, you could attempt to play these games directly in Steam. This will launch the original Doom engine using DOSBox. This might work for you (it does for me), or it might not (it didn’t for Totilo). Even if it works, it is not a great experience. This engine was designed for DOS, and it shows its age on modern machines. It will run at a really crappy resolution, and it has limited options to make things any easier on you. Stick with it if you are a purist, but I’m betting you aren’t.

I do suggest that if it works for you, you should play classic controls mode for a few minutes. This way you can learn what Doom was like in the olden days. You had to hold a button down to enable strafing. You couldn’t aim up or down. You couldn’t jump. You had to hold a button to run. You couldn’t really mouse look. It’s important to know what it was like so you can fully appreciate the great luxuries we have today.

Now that you know what it was like all those years ago, you don’t need to suffer like that anymore. Whether you are running Windows, Mac, or Linux, grab yourself a free copy of the Doomsday Engine. Because the Doom engine is open source, many people have rewritten the entire thing to work well on modern computers. The Doomsday Engine is just one of these, but it has worked very well for me on all three platforms. I recommend it.

Installing the Doomsday Engine is pretty straightforward. The only tricky part is that you have to tell it which Doom engine games you own, and the locations of the WAD files. Check off the boxes for every game you have purchased on Steam. If you purchased the id complete pack, that’s every game except for the demo versions. The WAD files should be located in C:\Program Files (x86)\Steam\steamapps\common\GAMENAME\base\FILENAME.WAD

Just replace GAMENAME with whatever game you are looking for, such as Ultimate Doom, and replace FILENAME with the filename of the WAD file that Doomsday is looking for. The only thing that might be tricky is that Hexen: Deathkings of the Dark Citadel actually requires two WAD files. Don’t worry, they’re both in the correct folder. If you are running 32-bit Windows, you can leave off the (x86).
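If you want to sanity-check where a WAD should be before poking around in Doomsday's dialogs, the path template above is easy to script. This is just a sketch: the Steam root is the default 64-bit Windows location, and the folder and file names in the example are illustrative, not necessarily the exact names Steam uses.

```python
# Default Steam library root on 64-bit Windows; drop the " (x86)"
# suffix on 32-bit systems. This location is an assumption -- adjust
# it if Steam is installed somewhere else.
STEAM_ROOT = r"C:\Program Files (x86)\Steam\steamapps\common"

def wad_path(game_name, wad_filename):
    """Build the expected location of a game's WAD file."""
    return rf"{STEAM_ROOT}\{game_name}\base\{wad_filename}"

# Example folder/file names are illustrative.
print(wad_path("Ultimate Doom", "DOOM.WAD"))
```

You can then check whether the file actually exists at that path before pointing Doomsday at it.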

Once you’ve told Doomsday which games you have, and where the WAD files are, you’re in modern PC gaming territory. Doomsday’s configurations are almost exactly like those on every other PC game you are used to. You can select modern resolutions, completely configure your controls, go nuts. You can even do things like add in a mini map that wasn’t in the original game. The world is your oyster.

Just remember when you’re playing this old game with the new engine that it’s not the same as the original. Back then, you couldn’t jump, look up or down, strafe without an extra button, or change weapons with the scroll wheel. It’s an impure experience, but it’s good enough for you to learn about the Doom family of games. Most importantly it works, and it won’t have a person used to modern games quitting in frustration after a few minutes.

One more suggestion. In Totilo’s article he mentions not finding a shotgun for quite some time. The reason is that he was playing on a sissy difficulty level. He’s probably too young to die. I suggest you turn up the difficulty; otherwise you really aren’t going to learn anything. Nightmare is supposed to be nearly unbeatable, but you probably want at least Hurt Me Plenty. If you do that, you’ll probably be seeing a shotgun-toting bad guy right away.

One more hint. If you aren’t taking out one imp, or two or three regular guys, with one blast from the shotgun, then you’re doing it wrong.

Prepare to meet your Doom.


Don’t Change the Computer, Change Yourself

Many long years ago, I used to be a Gentoo user. Today I run Windows 7 on every computer I own, and I use default Ubuntu installations on servers and in virtual machines for development. I went from running one of the hardest-core Linux distros on the bare metal to not running Linux on bare metal at all. That’s a pretty big change as far as nerds are concerned. What happened?

My first experience with Linux was in the very late ’90s. The first time I saw it, some kids at the nerd summer camp I went to were running it. I didn’t use it, and I didn’t learn it. Yet, as with all things related to computers, I was curious. In high school a friend lent me a Corel Linux CD-ROM. I tried, to no avail, to make it work on the family computer (a 100 MHz 486). I didn’t realize I had to boot from the CD-ROM, and I don’t think that computer even had that capability.

It wasn’t until 1999 that I downloaded Red Hat ISOs over my 56k dial-up connection, onto the first computer I ever built and personally owned, a 450 MHz Pentium III. I was able to dual boot Red Hat 6.0 and Windows 98SE. The thing is, I soon deleted it. It didn’t work with my USB dial-up modem. Linux was apparently just a different desktop with some different-looking applications, and bad hardware support.

It wasn’t until I got to college that I actually learned something. All the computer science labs ran Solaris, and I started to learn UNIX for real. I soon discovered Mandrake Linux. It actually worked with all my hardware, even my weird PCI IDE controller. Most importantly, I learned to SSH into the Solaris systems to do my school work, and even to use X11 forwarding.

At this point I was hooked. I was running Linux as the primary OS in a dual boot system. I only loaded up Windows (2000) for PC gaming. The thing is, hardware support was still a problem. My hardware didn’t all work perfectly, and Mandrake didn’t update quickly enough to give me the updates that fixed those problems. That’s when I had the distro rodeo and picked Gentoo.

I tried every major Linux distro at the time, and BSD as well. None of them really impressed me, and Gentoo didn’t even seem to work. Yet, I kept going back to it. The splash screen was so good looking, the community forums were so helpful, and the documentation was so great, that I kept at it. Eventually I came to realize that yes, I actually had installed it properly multiple times over, but unlike other Linux distros X was not a default part of the system. Booting into a command line wasn’t a failure, it was success.

From then on I was Gentoo crazy. I would constantly be rebuilding packages, updating packages, even reinstalling the whole system. I would constantly be tweaking and twisting system settings to see what they did. Mostly I messed around with the user interface. I even fell into using fvwm, the most customizable X window manager. By default it sucked, but if you configured it, you could go beyond anything else in existence. Boy, did I ever configure it.

Late in my college years, and then after graduation, I was employed. My free time evaporated. I was spending a lot of time commuting to work. My iPod became a lot more important than my desktop. I needed Windows to run iTunes, and to play games. I had little desire to code outside of work. Yet, I still kept a dual-booted Linux because it was nicer to SSH into my blog/podcast server from that than from PuTTY.

By then, I was rocking Ubuntu. It just worked. It installed in 15 minutes, not three days. I didn’t need to jump through any hoops to get my NVIDIA card working. I no longer needed to edit the X configuration to get dual-monitor support and proper resolutions. I still had a natural inclination to customize the user interface, but it was kept to a minimum simply because I had no free time for it.

Eventually I was running Ubuntu and Windows. Then my desktop was dual booted, but my laptop was just Ubuntu. Then, when I discovered VirtualBox and it became good enough, I ran Ubuntu on the desktop in a virtual machine, and only had Windows installed on the bare metal. Just recently I replaced that laptop with a new one, powerful enough that it too can run Linux in a virtual machine. And thus, my days of installing Linux are over. I wish I could run Linux on the bare metal and Windows in a virtual machine. The thing is that I use Windows for gaming, and VMs are not so great for that. I use Linux for web development, and running it in a VM makes no difference for that at all.

You see, in college I had a ton of free time. I spent that free time working on my computer. I do not regret spending my time in that manner. Most of the Linux skills I use every single day I learned from rocking Gentoo. Even so, I can now look at my old self and laugh. I was not unlike a mechanic who always worked in the garage and never drove anywhere. I spent so much time customizing the computer to my preferences that I hardly ever computed. That’s why I don’t have my own software business. While I was compiling and configuring other people’s code, the other guys were writing that code.

Thankfully I learned my lesson, and I hope you can learn the same. It is often quite difficult to change computers. You can spend hours just customizing keyboard shortcuts, let alone other settings. Then, as soon as you use a different computer, all of your customizations are gone. Even if your customization can increase your efficiency, is it really worth it if it takes you an hour to set it up?

Take it from someone who has been there. Instead of changing the computer, change yourself. It’s easy to change yourself; you have full control. You don’t even need to spend time looking up how to do it. Get used to the default keyboard shortcuts instead of making your own. Learn to live with the default settings. Once you do, you can be just as efficient as you would have been with your customizations. Better still, you will be able to sit at any computer and get going immediately, without being frustrated at a different setup.

It used to be that my computer felt like home, and every other computer felt like a foreign country. Thanks to cloud computing, any computer with an Internet connection and a web browser can become your home as quickly as you can login. You no longer have to spend a day installing and configuring software after you have a fresh OS install. You just have to install your favorite browser, and maybe one or two other things like Steam or iTunes.

Learning default settings only makes this even easier. I truly feel at home on absolutely any computer I sit in front of. I am familiar with all three major OSes and all the major browsers. I still do not prefer OS X, which I am forced to use at work, but I know it. I’ve also made vim my text editor of choice, and I use it with a very default configuration. Because vim or vi is already installed on almost every system, I can easily use any server I come across. People who are used to graphical editors, like Eclipse, may have some difficulty if they have to SSH into a server and fix something.

If you get frustrated behind the wheel of any car other than your own, then do you really know how to drive? A real driver can sit in any driver’s seat and be off to the races after a quick seat and mirror adjustment. That’s not to say you should never spend time under the hood. It’s very important to know how things work, and to be able to fix them when they go wrong. Just don’t spend so much time under there that you never leave your garage.

You won’t get very far in this world by working on your computer. If you want to make it somewhere, you have to use the computer to compute. Start now.


Presenting Presentoh, King of Presentations

I do a lot of panels at various conventions. When I do these panels, I need some sort of visual aid. For this I usually turn to PowerPoint and its alternatives. The thing is, these just plain suck, at least for my purposes. They have a zillion features that allow people to make really bad presentations. Meanwhile, the feature that I really want, which is video, effectively doesn’t work.

It dawned on me that because of HTML5, it should actually be possible to create an HTML presentation with video that is seamless. Today I went ahead and prepared a video-centric panel for Connecticon, and I did it all in HTML5. I realized afterwards that I should build this into a tool, so that others can use it as well. Hence, Presentoh was born.

The idea is really simple. There are only four kinds of slides I think you should be using: title slides, bulleted lists, videos, and images. I built a simple HTML template which could handle all four types of slides. I made a brain-dead simple CSS style for it: white text on a black background. The black background looks more professional because photos and videos really stand out, and it looks better when using a projector in a dark room because you can’t see the borders of the projection.

The only catch was that I wanted to use my Logitech Cordless Presenter. I went out and found jquery.hotkeys, which allowed me to bind any key on the keyboard to move between slides by redirecting in JavaScript. Problem solved.

All you have to do to use Presentoh is create a file containing JSON which defines all the slides in your presentation. Then you just run the script, tell it which JSON file you want to use, and it spits out all the HTML files for your presentation. Find the first slide in your presentation, open it in your browser, go full screen, and rock and roll.
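The JSON-to-HTML pipeline can be sketched like this. To be clear, none of these names come from Presentoh itself: the slide types match the four kinds described above, but the field names, the page markup, and the generator function are all my own assumptions about how such a script might work.

```python
import json

# Hypothetical slide definitions, one object per slide. The field
# names here are invented for illustration.
SLIDES_JSON = """
[
  {"type": "title",   "text": "Presentoh Demo"},
  {"type": "bullets", "title": "Why", "items": ["Video that works", "Simple"]},
  {"type": "image",   "src": "logo.png"},
  {"type": "video",   "src": "clip.mp4"}
]
"""

def render_slide(slide):
    """Turn one slide definition into a minimal HTML body."""
    kind = slide["type"]
    if kind == "title":
        return f"<h1>{slide['text']}</h1>"
    if kind == "bullets":
        items = "".join(f"<li>{item}</li>" for item in slide["items"])
        return f"<h2>{slide['title']}</h2><ul>{items}</ul>"
    if kind == "image":
        return f"<img src='{slide['src']}'>"
    if kind == "video":
        return f"<video src='{slide['src']}' autoplay></video>"
    raise ValueError(f"unknown slide type: {kind}")

def render_all(slides):
    """One HTML page per slide: white text on black, as described."""
    page = ("<!DOCTYPE html><html><body style='background:black;"
            "color:white'>{}</body></html>")
    return [page.format(render_slide(s)) for s in slides]

pages = render_all(json.loads(SLIDES_JSON))
print(len(pages))  # one HTML file per slide
```

A real generator would also write each page to disk and link consecutive slides together so the keyboard (or presenter remote) can step through them.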

I think it’s pretty awesome that while this tool is really hacky, and doesn’t have any error checking, it still beats out Google Docs, Powerpoint, Open Office, and all the rest in terms of how it handles video. There are some slight catches, but nothing that can’t be worked around. For example, you have to make sure all of your videos are the right codec for the browser you are using. My H.264 videos obviously didn’t work in Firefox.

Another issue is that there is no way to automatically make an HTML5 video go full screen. If you’re wondering why, the answer is on Stack Overflow. Therefore you have to manually specify the height and/or width of each video depending on its aspect ratio and the resolution you will be presenting in. This prevents videos from going off the edges of the screen if they are too wide or too tall, and keeps aspect ratios from being broken.
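The sizing arithmetic behind that manual step is straightforward: scale the video to the largest size that fits the presentation resolution without distorting it. A small sketch (the function name is mine, not Presentoh's):

```python
def fit_video(video_w, video_h, screen_w, screen_h):
    """Scale (video_w, video_h) to the largest size that fits inside
    (screen_w, screen_h) while preserving the aspect ratio."""
    scale = min(screen_w / video_w, screen_h / video_h)
    return round(video_w * scale), round(video_h * scale)

# A 1280x720 (16:9) clip on a 1024x768 (4:3) projector is
# width-limited, so it letterboxes top and bottom.
print(fit_video(1280, 720, 1024, 768))  # (1024, 576)
```

Whichever dimension you get out, you can hand to the video tag's width/height attributes for that presentation resolution.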

All the rest of the information can be found in the README file. I hope Presentoh helps somebody out there with their presentation needs. Enjoy.


GitHub Suggested Workflow

The thing that’s great about Git is that it allows the user to pick any workflow they desire. The problem is that it allows the user to pick any workflow they desire. Thus, users who don’t know exactly what they want to do are confused as to what they should do. GitHub makes things even worse, as it confuses users with lots of extra information.


iPad: Why Must We Be Forced To Choose?

There is, of course, the expected hoo-hah about the iPad. The rage between those who love it and those who can’t stand its proprietary nature is in full swing. Yet, I am constantly in awe of the fact that nobody seems to notice the obvious and simple cause of this clash.
