Tuesday, April 29, 2008

Why reinvent the wheel again and again?

Reading the post "Get that job at Google" on Stevey's blog made me wonder why we always reinvent the wheel in the software industry.

At one point he says interviewers usually ask candidates to implement a well-known basic algorithm (trees, sorting, ...). Why?

Do they expect that rote-learning existing algorithms makes a good programmer?

Put differently: do you really ask a chef how to build pans, knives or an oven? Wouldn't it be better to test whether he can select appropriate ingredients, or even turn mediocre ingredients into tasty dishes?

Like most of us, I studied how sorting, hash tables, trees and other standard algorithms are implemented. And I quickly realised that only a very few mathematical geniuses would be able to improve on them, so I forgot how to code them as soon as possible to make room for useful knowledge.

Such as: which of the available implementations (Boost::graph, Poco, std, ...) is best for cross-platform work, which one is thread-safe, which one is fastest, etc.

Anyway, a very informative post.
I wonder how much time he spends writing those rants :)

PS: If I conducted interviews, one of my questions would be to use a so-called "advanced" statistical method to prove a relationship in some data, then change the chosen method's parameters to disprove it!
