You are trapped in a prison, and the warden of the prison will only let you go after you play his game. He shows you two identical-looking boxes and tells you:

- One box has between 10 and 20 bugs in it (we'll call this Box A)
- One box has between 30 and 40 bugs in it (we'll call this Box B)

You have to pick one of the boxes and eat the bugs inside it. You don't know which box is Box A and which is Box B. The warden knows, but doesn't tell you, that Box A actually has 17 bugs in it, and Box B actually has 32 bugs in it.

Let's clarify what we mean by upper bound and lower bound by using Box A as an example.

A lower bound for X is a value that X cannot be below. So, knowing that Box A has between 10 and 20 bugs, we can say that:

- Box A cannot have fewer than 5 bugs (this should be obvious, since 5 is less than 10). This means that we can say that 5 is a lower bound on the number of bugs in Box A.
- Box A cannot have fewer than 10 bugs (this should be obvious, since 10 is the bottom of the range). This means that we can say that 10 is a lower bound on the number of bugs in Box A.

Both 5 and 10 are valid lower bounds for the number of bugs in Box A. However, the lower bound of 10 is much more useful than the lower bound of 5, because it is closer to the actual number of bugs in the box. We tend to mention only the lower bound that is closest to the actual value, because it is the most useful one.

Similarly, an upper bound for X is a value that X cannot be above:

- Box A cannot have more than 50 bugs (this should be obvious, since 50 is more than 20). This means that we can say that 50 is an upper bound on the number of bugs in Box A.
- Box A cannot have more than 20 bugs (this should be obvious, since 20 is the top of the range). This means that we can say that 20 is an upper bound on the number of bugs in Box A.

Both 20 and 50 are valid upper bounds for the number of bugs in Box A.

The same idea applies to growth rates. If f(n) is O(g(n)), this means that f(n) grows asymptotically no faster than g(n). If f(n) is Ω(g(n)), this means that f(n) grows asymptotically no slower than g(n). If f(n) is Θ(g(n)), this means that f(n) grows asymptotically at the same rate as g(n).

If f(n) is Θ(g(n)), it is growing asymptotically at the same rate as g(n). So we can say that f(n) is growing asymptotically neither slower nor faster than g(n). But from the above, we can see this means that f(n) is both Ω(g(n)) and O(g(n)).

Think of O as an upper bound, and Ω as a lower bound. These bounds can be tight or loose, but we prefer to make them as tight as possible. If we have tight bounds, where O and Ω have the same growth rate, then we precisely know the growth rate. And if we can precisely give the growth rate, then we know Θ.

Compare this with describing the height of a neighbour. We can immediately say that the height of our neighbour is upper bounded by 1 million kilometers. We can also say that the height of our neighbour is lower bounded by 1 nanometer. These statements aren't very useful, because the bounds are so loose. However, if we gave a lower bound for the height of our neighbour of 5' 5", and an upper bound of 5' 10", we would have a much better idea of how tall our neighbour was. And if we had a lower bound of 5' 8" and an upper bound of 5' 8", we could then say that our neighbour is exactly 5' 8".

The same tightening works for log(n):

- log(n) is O(n^n), since log(n) grows asymptotically no faster than n^n
- log(n) is O(n), since log(n) grows asymptotically no faster than n
- log(n) is O(log(n)), since log(n) grows asymptotically no faster than log(n)

We went from loose upper bounds to a tight upper bound. Likewise:

- log(n) is Ω(1), since log(n) grows asymptotically no slower than 1
- log(n) is Ω(log(log(n))), since log(n) grows asymptotically no slower than log(log(n))
- log(n) is Ω(log(n)), since log(n) grows asymptotically no slower than log(n)

We went from loose lower bounds to a tight lower bound. Since log(n) is both O(log(n)) and Ω(log(n)), we can say that log(n) is Θ(log(n)).

The worst-case running time for binary search is log(n).
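To get a feel for why n and n^n are loose upper bounds on log(n) while log(n) itself is tight, here is a small Python sketch (my own illustration, not from the original post) that tabulates how far a loose bound races ahead of log(n) as n grows:

```python
import math

# A loose upper bound like n pulls further and further away from log(n),
# while the tight bound log(n) matches it exactly by definition.
for n in [2**10, 2**20, 2**40]:
    log_n = math.log2(n)
    print(f"n = 2**{int(log_n):2d}: log(n) = {log_n:5.1f}, "
          f"log(log(n)) = {math.log2(log_n):5.2f}, n / log(n) = {n / log_n:,.0f}")
```

The ratio n / log(n) grows without bound, which is the practical meaning of "log(n) grows asymptotically no faster than n, but n is a loose upper bound."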
When we use asymptotic notation, unless stated otherwise, we are talking about worst-case running time.
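As a concrete illustration of that worst case, here is a short Python sketch (my own, not from the original post) of binary search instrumented to count comparisons:

```python
import math

def binary_search(items, target):
    """Return (index, comparisons); index is -1 if target is absent."""
    lo, hi, comparisons = 0, len(items) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1  # one probe of the list per iteration
        if items[mid] == target:
            return mid, comparisons
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

# Worst case: searching a sorted list of 1024 items for a value that
# isn't there halves the range until it is empty.
data = list(range(1024))
_, worst = binary_search(data, -1)
print(worst, math.ceil(math.log2(len(data))))  # comparisons tracks log2(n)
```

With n = 1024, the miss costs log2(1024) = 10 probes, matching the log(n) worst-case bound above.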