This week we were introduced to "big omega", along with the formal definitions of "big oh" and "big omega". We looked at some examples of growth rates, and at how functions need to be compared beyond their break-point (the point past which one stays above the other) to determine which grows faster. We also learned what it means for a function to be "in" O(n^2) or Omega(n^2) by looking at some graphs and examples. It was somewhat interesting and pretty straightforward, which was a nice change. One slide showed various functions ordered by how fast they grow, with f(n) = n^n as the fastest-growing function. I found it interesting to learn that slower-growing functions make for the most desirable algorithms in computer science, since they can handle huge data sets efficiently. It made sense after the instructor explained it, but I wouldn't have thought of that on my own.
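To keep the formal definition straight in my head, here is a small worked example (my own sketch, not from the slides; the constants c = 4 and B = 5 are just witnesses I picked for the standard definition):

```latex
% f is in O(g) iff there exist c > 0 and B >= 0 such that
% f(n) <= c * g(n) for every n >= B.
\[
3n^2 + 5n \in O(n^2), \text{ witnessed by } c = 4 \text{ and } B = 5:
\]
\[
\forall n \ge 5, \quad 3n^2 + 5n \;\le\; 3n^2 + n \cdot n \;=\; 4n^2 \;=\; c \cdot n^2 .
\]
```

The same function is also in Omega(n^2), since 3n^2 + 5n >= 3n^2 for every n >= 0 (take c = 3 and B = 0), which is why the two notations get introduced together.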
We also took a look at some insertion sort code, and at how to count its steps to produce an equation for the algorithm's running time. Those equations tied back to "big oh" notation and growth rates, which made for a well-rounded lecture. All in all it was fairly interesting, and I am starting to see the bigger picture of how "expensive" some programs can be, and why.
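Here is a minimal sketch of that step-counting idea in Python (my own reconstruction, not the code from lecture; counting only key comparisons is an assumption about what a "step" means):

```python
def insertion_sort(lst):
    """Sort lst in place; return how many comparisons were made."""
    steps = 0
    for i in range(1, len(lst)):
        key = lst[i]
        j = i - 1
        # Shift elements larger than key one slot to the right.
        while j >= 0:
            steps += 1          # one comparison of key with lst[j]
            if lst[j] <= key:
                break
            lst[j + 1] = lst[j]
            j -= 1
        lst[j + 1] = key
    return steps

# Worst case is reverse-sorted input: the inner loop runs i times for
# each i, so the total is 1 + 2 + ... + (n - 1) = n(n - 1) / 2 comparisons,
# which is exactly the kind of equation that puts the algorithm in O(n^2).
print(insertion_sort(list(range(10, 0, -1))))  # prints 45 = 10 * 9 / 2
```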