Archive for April, 2006

I will never work for EA

Wednesday, April 26th, 2006

After reading about the EA spouse settlement I’m glad I never went to work for EA.

About a year and a half ago, EA contacted me after I sent in a resume. I was really excited because everything on the phone went very well. I spoke to a programming department head and we really hit it off. Over three days, the EA recruiter contacted me three times by phone and we worked out a (in hindsight, low) salary of $75,000 a year. I was all set to move down there. On a Thursday he said they’d buy me plane tickets to fly down on Monday and speak to them in person. I never heard back confirming the flight times, so on Friday I wrote three times and called both him and the person I had interviewed with on the phone. No response.

So Monday came and went, and I assumed they had lost interest and rudely just stopped replying to my emails. But about two months later the recruiter sent me an email out of the blue, apologizing with the excuse that he had had to fly out and had told his staff to handle the arrangements, which they didn’t do.

That sounds like a bunch of crap to me. First of all, if he flew out, does that mean he stopped checking his email? This is a connected world. It’s not like I sent one email, either. I sent multiple emails, ending with something to the effect of “Hey, it’s Monday and I didn’t hear from you. If you lost interest, please let me know.” And why would it take two months for him to follow up? Furthermore, only about an hour before I inquired about the plane ticket, I had written to him and gotten an answer. Did he fly out in that short window, with no mention of leaving or that I should expect to hear from his staff instead?

If you don’t want to hire me, fine. If there is a candidate you want to try first, go for it. This is business. I do the same thing when I have multiple job offers and go in order of preference. Don’t lie to me or jerk me around, though. If you say “You are going to fly here on Monday,” I will start packing, and if I don’t fly I will have wasted time and effort.

So a different recruiter contacted me about 4 months ago. I brought up EA spouse, and she said something to the effect that they made a lot of changes and things were different now. I didn’t answer though, because even if that were the case I will never work for EA after getting jerked around.

Rant: How the unwashed masses have ruined computing

Friday, April 21st, 2006

When I first started using the PC (mid-80s) there was only DOS. DOS wasn’t hard to use. My Dad, who bought the computer, installed some kind of shell where you could hit F7 or something to enter a directory, F6 to go back, and F5 to run the highlighted program. However, the shell actually ran the regular cd.. and cd [dir] commands. So I learned 80% of all the commands I would ever use in DOS in the first day, and 99% in the first month. I never used the shell again after about a week.

Then Windows came out, and it sucked. It was slower to get to my programs, took minutes to load as opposed to seconds in DOS, ate up memory, and basically contributed nothing to my ability to use the computer. Yet, over several revisions, it became popular. Eventually, application companies stopped supporting DOS and I was forced to upgrade my computer just to run Windows.

Most of what I did at the time on the computer was play games. Back in the DOS and early Windows days, there were no installers. On floppy disks there were unpackers. Early CD-ROM games just had the whole game on the CD-ROM and you could run it from there directly if you wanted to.

Things were good, because I had control over my hard drive. I knew where every file on my computer was and what it did, because I put it there. If I wanted to back up, copy, or move a game I just did it. There was no registry or shared files to contend with. There was no crap left over, and no “Add/Remove Programs” list where a quarter of the entries point to programs that no longer exist but can’t be removed from the list.

Yet companies started using installers, and they started using Add/Remove Programs, and soon my hard drive was cluttered with crap. My computer was now slower, with wasted hard drive space, and I had to reformat every six months or else things got out of control. Even now I still reformat about once a year.

Fast forward a few years. The next version of Windows, Vista, takes half a gigabyte of RAM just to run. I complain about my work computer regularly, which runs XP, yet it has a 3-gigahertz Intel processor. That ought to be fast enough to open a text editor, right? But for various reasons it’s often not. Essentially, unless you have a top-of-the-line computer you can’t even run the OS anymore without frustration and slowdowns. Why is this? Because people want cute 3D windows, animated icons, and other crap that has nothing to do with the point of the OS, which is to run your programs.

This is the same reason console gaming is more popular than PC gaming. Consoles are less powerful, harder and more expensive to develop for, and the games cost more. Yet they account for 3/4 of the market for two reasons: quality control, and an installation process that consists of inserting a cartridge rather than clicking through a few windows on your PC.

Half the population has an IQ under 100. Better applications, which don’t support “click and drool” as a fundamental design feature, fall behind those that may be worse in every regard except the installation. Windows vs. Linux is arguably in this camp.

It’s important to end a rant with a lesson learned, since complaining isn’t going to change the reality of the situation. The lesson is to make your software so easy to use that your Grandma can use it. If you can do that, you’ll sell to the 50% of the population with an IQ under 100 that your competitor can’t sell to.

Dynamic help is the Clippy of Visual Studio

Thursday, April 20th, 2006

Does anyone actually use Dynamic Help?

Dynamic Help = Snail Shit

It seems like the stupidest idea Microsoft has ever come up with. Well, maybe behind Clippy and Microsoft Bob, but not far behind. Within ass-grabbing distance.

The problem with Dynamic Help is that it takes about ten seconds to load on my snail-slow work computer, and F1 is right next to the Escape key, which I hit all the time. And the results are utterly useless. Do I really care what Microsoft thinks about my coding techniques and programming practices? Ten seconds is too short to go do something else, and too long to avoid breaking my concentration.

At least there’s no dog that comes up asking me what I want to search for.

Assert Boy

Tuesday, April 18th, 2006

During my brief stint at Hypnotix, the programming owner there, who was the world’s biggest jackass, once called me “Assert Boy.” This was due to my habit of putting asserts all over my code – on a few occasions too many, but mostly valid. So when other programmers ran my code they’d sometimes hit asserts, which felt like crashes, and to a simpleminded observer (such as the programming owner), my code crashed a lot.

In fact, the only difference was that my code stopped early, where the problem was easy to fix, as opposed to late, someplace random in the code, where the problem would have been very hard to fix.

In my opinion asserts are one of the most valuable debugging tools and should be used judiciously.

Improving compile and link speeds of C++ projects

Tuesday, April 11th, 2006

At work we have a very large codebase – millions of lines of code. Nobody understands all of it, and half the people who wrote it aren’t at the company anymore. So a lot of my time is spent working experimentally – tracing through code to figure out how it works, or making small changes to fix unforeseen bugs. The problem is that this requires a lot of recompiling of small changes. Unfortunately, the code is designed in such a way that a small change can trigger the recompilation of hundreds of files.

I spend half my day not programming, not thinking, but waiting for the compiler. This is, by an order of magnitude, the biggest hindrance to my accomplishing anything meaningful. A single-line code change can take 15 minutes once you count the time to build, the time to link, the time to regain my lost attention, and the time to do it all again for my second instance of Visual Studio (for network programming). Fortunately, I’ve learned a few things as the author of RakNet that can help with compile and link times. While they may be obvious to experienced programmers, even experienced programmers don’t consistently follow them, so they are worth reviewing.

1. It’s bad to include header files in your cpp files. It’s exponentially bad to include header files in other header files

Including a header in a .cpp file increases your compile time linearly: if my .cpp file includes “blah.h”, then every time “blah.h” changes, every .cpp file that includes it has to recompile. That’s bad – but still far better than including headers inside other headers, which makes your total compile time snowball. If C.h and D.h include B.h, and B.h includes A.h, then any change to A.h effectively dirties all four headers – A.h, B.h, C.h, and D.h – and, worse, forces a rebuild of every .cpp file that includes any of those headers.
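To make the cascade concrete, here’s a sketch of that hypothetical include chain (file names invented for illustration):

```cpp
// A.h -- the low-level header everything ends up depending on
// B.h:  #include "A.h"
// C.h:  #include "B.h"
// D.h:  #include "B.h"
//
// Touch A.h, and every .cpp that includes A.h, B.h, C.h, or D.h rebuilds.
// Replace the includes in B.h, C.h, and D.h with forward declarations where
// possible, and only the .cpp files that directly include A.h rebuild.
```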

2. Avoid, as much as possible, putting code in header files

Code is likely to change; class definitions are much less likely to. Furthermore, code in a .cpp file is compiled in exactly one place – that .cpp. Code in a header file is compiled everywhere that header is included. Put code in a header and you get the worst of both: you are more likely to change that header, and the same code is pulled into files all over your program, so it gets recompiled every time.
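As a sketch (file and class names are made up, and the two files are collapsed into one snippet here so it builds standalone), keep only the declaration in the header and move the bodies to the .cpp:

```cpp
// Clock.h -- declaration only; this file rarely needs to change
class Clock
{
public:
    Clock(int s);
    int GetSeconds() const;
private:
    int seconds;
};

// Clock.cpp -- the implementation; editing these bodies recompiles only
// Clock.cpp, not every file that includes Clock.h
Clock::Clock(int s) : seconds(s) {}
int Clock::GetSeconds() const { return seconds; }
```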

3. Split up your header into independent classes/structs/enums as much as possible.

It’s a hassle to add header files to a project. Oftentimes, when two related declarations, such as an enumeration and a class, are used together, it’s tempting to put both in the same header file.

Supermarket.h

class Fruit;
enum Fruits {APPLE, ORANGE, GRAPE};
class Supermarket
{
public:
Fruits GetType(Fruit *fruit);
};

There are two problems with this. First, enumerations are likely to change. If we later add POMEGRANATE, then every file that includes this header has to be recompiled, even files that never touch Supermarket. Second, every file that cares about Fruits now has to pull in Supermarket as well, and vice versa, resulting in unnecessary compiles.
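A possible split (file names are mine, and everything is collapsed into one snippet so it builds standalone): files that only care about fruit types include Fruits.h and stop there.

```cpp
// Fruits.h -- just the enum; most files never need more than this
enum Fruits {APPLE, ORANGE, GRAPE};

// Supermarket.h -- would #include "Fruits.h"; a forward declaration is
// enough for the Fruit pointer parameter
class Fruit;
class Supermarket
{
public:
    Fruits GetType(Fruit *fruit);
};

// Supermarket.cpp -- stub body for illustration
Fruits Supermarket::GetType(Fruit *)
{
    return APPLE;
}
```

Now adding POMEGRANATE to Fruits.h recompiles only the files that actually use the enum, not everything that happened to need Supermarket.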

4. If your class has complex types in the header, or is included by many other files, think about exposing an interface for that class.

Interfaces can be a hassle since you now have to maintain two copies of your class definition, rather than one. However, in some cases they can vastly cut down on compile times. Consider the following:

Database.h

#include "ComplexTemplatedList.h"
class Rec;
class Database
{
public:
Rec* GetRecord();
private:
ComplexTemplatedList recordsList;
};

DatabaseInterface.h

class Rec;
class DatabaseInterface
{
public:
virtual ~DatabaseInterface() {}
virtual Rec* GetRecord()=0;
};

In the second version we got rid of all the implementation details, yet every file that needs the database can program against the interface (with Database deriving from DatabaseInterface) and now compiles almost instantly.
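Here’s one way the interface version might be wired up (the factory function and the stub bodies are my own additions for illustration): clients include only the interface header, while the concrete Database and its heavy includes hide inside one .cpp file.

```cpp
// DatabaseInterface.h -- all that clients ever include
class Rec;
class DatabaseInterface
{
public:
    virtual ~DatabaseInterface() {}
    virtual Rec* GetRecord() = 0;
};
DatabaseInterface* CreateDatabase(); // factory; body lives in Database.cpp

// Database.cpp -- the only file that would pull in ComplexTemplatedList.h
class Database : public DatabaseInterface
{
public:
    Rec* GetRecord() { return 0; } // stub for illustration
    // ComplexTemplatedList recordsList; // hidden from every client
};
DatabaseInterface* CreateDatabase() { return new Database; }
```

Changing the record list, or anything else inside Database, now recompiles Database.cpp alone.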

Here’s another example:

class Logger
{
public:
void WriteLog(char *);
private:
char *logList;
int numLogs;
};

The logger will probably be included in a lot of places. If we were to add a field

int *logTimes;

then, even though the log times have nothing to do with the users of the Logger class, all those users would have to be recompiled. With an interface, none of the member variables are exposed, so adding or deleting member variables won’t cause recompiles of files that include this header.

5. Inside header files, hold pointers to other classes rather than the classes themselves

CompileSlow.h

#include "CompositeClass.h"
class MyClass
{
CompositeClass a;
};

CompileFast.h

class CompositeClass;
class MyClass
{
CompositeClass *a;
};
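One catch with the pointer version: some .cpp file still has to see the full CompositeClass definition in order to construct and delete it. A sketch, collapsed into one snippet so it builds standalone (the constructor, destructor, and accessor are my additions):

```cpp
// CompileFast.h -- forward declaration only; changes to CompositeClass.h
// no longer recompile everything that includes this header
class CompositeClass;
class MyClass
{
public:
    MyClass();
    ~MyClass();     // defined where CompositeClass is a complete type
    int GetValue();
private:
    CompositeClass *a;
};

// CompositeClass.h -- stand-in definition so this sketch builds as one file
class CompositeClass
{
public:
    int value;
    CompositeClass() : value(42) {}
};

// MyClass.cpp -- the one translation unit that includes CompositeClass.h,
// and therefore the only one that rebuilds when it changes
MyClass::MyClass() : a(new CompositeClass) {}
MyClass::~MyClass() { delete a; }
int MyClass::GetValue() { return a->value; }
```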

Sorry if my last point is obvious, but it’s the easiest to apply, so it’s worth pointing out.

In conclusion, a few minutes of trouble today can save hours over the life of the project – not just for you, but for every other programmer who has to do builds.