Some days ago I came across a piece of code by a colleague that caught me quite by surprise. He told me it was a performance improvement, and while in that case it was rather unnecessary and made the code less straightforward, it's good to keep in mind for other occasions. The idea is pretty basic, but I had never thought about it before.
The example was in C# and LINQ to Objects, so I'll stick to that, but it would be the same with JavaScript and Underscore.js or whatever.
So we have a collection of City objects like these:
public class City
{
    public string Name { get; set; }
    public int Population { get; set; }
}

var cities = new List<City>()
{
    new City(){ Name = "Xixon", Population = 275000 },
    new City(){ Name = "Berlin", Population = 4000000 },
    new City(){ Name = "Toulouse", Population = 1250000 },
    new City(){ Name = "Paris", Population = 12000000 }
};
and we want to get the most populated city. So I would just do this:
var biggestCity = cities.OrderByDescending(city => city.Population).First();
while my colleague had written this:
var maxPopulation = cities.Max(city => city.Population);
var biggestCity = cities.First(city => city.Population == maxPopulation);
In the first snippet, regardless of the sorting algorithm OrderBy uses, you are, roughly speaking, finding the maximum n times (each time over a smaller collection), while in the second one you find that maximum only once. Depending on the sorting algorithm, the first approach will usually be O(n log n), while the second one is O(n) (two linear passes). As explained here, apart from the speed difference, with the OrderBy approach you can also run into an Out of Memory exception if your collection is huge enough, since OrderBy has to buffer the whole sequence before it can yield anything.
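As a side note, the same O(n) result can also be obtained in a single pass, for example with Aggregate. This is just a sketch I'm adding for illustration (it reuses the cities list from above), not what my colleague wrote:

// Single pass over the collection: Aggregate keeps the most populated
// city seen so far and returns it at the end.
// Assumes the cities list declared earlier in this post.
var biggestCity = cities.Aggregate((best, city) =>
    city.Population > best.Population ? city : best);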