# AA

Fri, Feb 4, 2011

Hello, my name is Carl and I analyze algorithms.

When I first started grad school, I got roped into auditing a 700-level, once-a-week discussion in the computer science department on advanced topics in algorithm analysis (AA). Most of the material came from Donald Knuth’s work. He was relatively new to me at the time, and I came away with the impression that he held every record in the field.

The professor was skeptical. He seemed to focus directly on me for the first 30 minutes or so, as if trying to get me to crack and take off so that they could get down to business. I eventually won him over and discovered I had a knack for it. I lasted about half the semester, until other coursework left me without the time to attend. What I learned in that class blew my mind and has been a great help to me a few times in my career.

For those guys, algorithm complexity was everything. Constant factors meant nothing, and the thought of actually implementing one of the specific algorithms we discussed never seemed to cross their minds. I’m a little more practical: I know that constant factors are not always negligible, especially in a professional field that tends to push the limits of modern hardware. This complexity analysis stuff, however, can be very practical, as I learned through the following experience.

I wrote a tool in C++ to check a VLSI layout against the process antenna (or charge collector) design rules. At the time, this was a fairly new constraint, and there were no tools that would work on our custom layout format. The basic idea is to add up the surface area of a wire to find the amount of charge that will collect on it while the chip is being fabricated. If that charge is too much to safely dissipate through a transistor, it can damage the device, rendering it useless. I had a tool that would find connected shapes (connectivity), but it was not designed to calculate the surface area of wires. On corners, for example, it would leave out shapes that don’t matter when calculating resistance or capacitance. The problem was that these “orphaned” shapes were important for what I was doing.
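The check itself boils down to summing shape areas per connected wire and comparing against a limit. A minimal sketch, in the ratio form antenna rules are commonly stated in (the original tool’s exact rule and data structures aren’t given, so the names and the ratio formulation here are assumptions):

```cpp
#include <vector>

// Illustrative rectangle type; field names are assumptions, not from
// the original tool's layout format.
struct Rect {
    long x1, y1, x2, y2;  // corners, with x1 <= x2 and y1 <= y2
};

// Sum the area of every shape in one connected wire and compare the
// wire-to-gate area ratio against a process limit (hypothetical rule).
bool violatesAntennaRule(const std::vector<Rect>& wire,
                         double gateArea, double maxRatio) {
    double wireArea = 0.0;
    for (const Rect& r : wire)
        wireArea += double(r.x2 - r.x1) * double(r.y2 - r.y1);
    return wireArea / gateArea > maxRatio;
}
```

The hard part, as the rest of the post explains, is not this sum but getting every shape of the wire into `wire` in the first place.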

I figured that there weren’t very many of these orphans, so I wrote a nice simple doubly-nested loop to match them up with the shapes they touched. This was an O(N^2) algorithm, but it was quick to write and worked great for developing tests on very small designs. As I tested my tool on larger designs, however, the runtime went from almost instantaneous to minutes, and from minutes to hours. By then, I had enough data to tell that running it on our largest design would take months (at least).
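The doubly-nested loop looks something like this (a sketch; the `Rect` type and function names are illustrative, not from the original tool):

```cpp
#include <utility>
#include <vector>

// Illustrative rectangle type (names assumed, not from the original tool).
struct Rect {
    long x1, y1, x2, y2;  // corners, with x1 <= x2 and y1 <= y2
};

// Two axis-aligned rectangles touch if their projections overlap or
// abut on both axes.
bool touches(const Rect& a, const Rect& b) {
    return a.x1 <= b.x2 && b.x1 <= a.x2 &&
           a.y1 <= b.y2 && b.y1 <= a.y2;
}

// The doubly-nested loop: compare every orphan against every connected
// shape. Trivial to write, but quadratic in the number of shapes.
std::vector<std::pair<int, int>> matchOrphans(
        const std::vector<Rect>& orphans,
        const std::vector<Rect>& connected) {
    std::vector<std::pair<int, int>> matches;
    for (int i = 0; i < (int)orphans.size(); ++i)
        for (int j = 0; j < (int)connected.size(); ++j)
            if (touches(orphans[i], connected[j]))
                matches.push_back({i, j});
    return matches;
}
```

Fine for a few hundred shapes; ruinous for a full chip, where both counts run into the millions.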

In a hurry, I threw together a new algorithm based on one-dimensional range trees to match the orphan shapes (almost all rectangles) with other, connected shapes. I knew that better algorithms existed, but I already had a range tree implementation handy. Right now, you might be thinking that, in the worst case, this solution is still N^2. (And, in general, you’d be right.) My gut told me that the pathological case could never occur in a VLSI design. Thanks to the experience I gained in grad school, I was able to prove that it could not occur and to compute a much tighter bound on the algorithm’s running time.
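The author used a range tree he already had; a sorted array with binary search answers the same one-dimensional range query and is easier to sketch. The key assumption below, standing in for the property the author proved about real layouts, is that shape widths are bounded, so each query only inspects the few shapes whose x-interval can reach the orphan (all names here are illustrative):

```cpp
#include <algorithm>
#include <vector>

// Illustrative rectangle type (names assumed, not from the original tool).
struct Rect {
    long x1, y1, x2, y2;  // corners, with x1 <= x2 and y1 <= y2
};

bool touches(const Rect& a, const Rect& b) {
    return a.x1 <= b.x2 && b.x1 <= a.x2 &&
           a.y1 <= b.y2 && b.y1 <= a.y2;
}

// One-dimensional range query over connected shapes pre-sorted by x1.
// If every shape is at most maxWidth wide, any shape touching the
// orphan must have x1 in [orphan.x1 - maxWidth, orphan.x2], so we test
// only that slice instead of all N shapes.
std::vector<int> findTouching(const std::vector<Rect>& sortedByX1,
                              const Rect& orphan, long maxWidth) {
    auto lo = std::lower_bound(
        sortedByX1.begin(), sortedByX1.end(), orphan.x1 - maxWidth,
        [](const Rect& r, long v) { return r.x1 < v; });
    auto hi = std::upper_bound(
        sortedByX1.begin(), sortedByX1.end(), orphan.x2,
        [](long v, const Rect& r) { return v < r.x1; });
    std::vector<int> hits;
    for (auto it = lo; it != hi; ++it)
        if (touches(*it, orphan))
            hits.push_back(int(it - sortedByX1.begin()));
    return hits;
}
```

Sorting once costs O(N log N); each orphan query is then a binary search plus however many candidates land in the slice, which the bounded-width property keeps small in practice.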

I was a little concerned that the new approach would not make enough of a difference to save my tool; after all, I knew that its complexity was not optimal. However, the bound I had computed was significantly lower, so I set out to implement it. A day or two later, I ran the tool with the optimized code on our largest design: it finished in less than a minute. No amount of twiddling constant factors or enabling compiler optimizations could have given me that result. I was amazed, and forever sold on the merits of algorithm complexity analysis.