I was inspired by watching some of the Harvard CS50 and Stanford lectures, and I want to share two interesting things I studied recently.

1. Solving a search problem.
2. The logarithmic function with small and big problem sizes.

Solving a search problem.

Have you ever wondered how computers solve our tasks?
Let’s imagine that we want to find the word “mission” in a dictionary of about 1000 pages. What are we going to do? It’s easy: just type the word into the search box and voilà! But how does that happen? Presumably our application has some algorithm and applies it in some way.

Let’s review some of them and their computational time:
1. The first way is to turn the dictionary page by page until we find the word. In the worst case this takes O(n) (linear time); not much complexity, though. Below you can see a flowchart of what we are going to do.
[Figure: flowchart of the page-by-page search]
2. OK, let’s try to go faster: flip two pages at a time, taking care not to miss anything along the way. This takes about n/2 steps, which is still O(n), just with half the constant.
[Figure: flowchart of the two-pages-at-a-time search]
3. The third way: divide the book in half, then divide again the half that could contain the word, and repeat until we find it. This approach takes O(log n).
[Figure: flowchart of the divide-in-half search]
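The three approaches above can be sketched in Python. This is just an illustration, with a small sorted list of words standing in for the dictionary pages (all names here are hypothetical):

```python
def linear_search(pages, word):
    """Turn page by page: up to n steps, O(n)."""
    for i, page in enumerate(pages):
        if page == word:
            return i
    return -1

def search_by_twos(pages, word):
    """Flip two pages at a time, checking both: about n/2 flips."""
    for i in range(0, len(pages), 2):
        if pages[i] == word:
            return i
        if i + 1 < len(pages) and pages[i + 1] == word:
            return i + 1
    return -1

def binary_search(pages, word):
    """Halve the remaining pages each step: O(log n).
    Requires the pages to be sorted, as in a dictionary."""
    lo, hi = 0, len(pages) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if pages[mid] == word:
            return mid
        if pages[mid] < word:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

pages = sorted(["zebra", "mission", "apple", "search", "binary"])
print(linear_search(pages, "mission"))   # 2
print(search_by_twos(pages, "mission"))  # 2
print(binary_search(pages, "mission"))   # 2
```

For real code, Python’s standard bisect module already provides a ready-made binary search.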

Now we have 3 algorithms, so we need to understand which one is faster.

The best way to figure it out is to look at how fast each function grows. In the mathematical sense, growth means the derivative of the function. For example, the derivative of the position of a moving object with respect to time is the object’s velocity: it measures how quickly the position of the object changes as time advances.

OK, back to our dictionary: let’s increase the number of pages to 2000 and think… how many more steps do these functions need?

  1. With the linear method O(n), 1000 pages take at most 1000 iterations, so for 2000 pages the number of steps doubles.
  2. The second method also doubles, from about 500 to 1000 iterations; n/2 grows at the same rate as n, only with half the steps.
  3. The logarithmic method O(log n) needs only 1 additional step to handle 2000 pages: about 10 steps become 11.
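As a quick sanity check, here is a small Python sketch of the worst-case step counts (the formulas follow the discussion above; the exact numbers are illustrative):

```python
import math

for n in (1000, 2000):
    linear = n                          # page by page
    by_twos = n // 2                    # two pages at a time
    halving = math.ceil(math.log2(n))   # divide in half each step
    print(f"{n} pages: {linear} vs {by_twos} vs {halving}")
```

Doubling the number of pages doubles the first two counts, but adds just one halving step: 10 becomes 11.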

You can easily see the difference just by looking at these 3 functions on a graph. And it seems we have a winner: O(log n).
[Figure: graph of the three functions]
FYI: for a dictionary with 4 billion pages, the logarithmic method needs just 32 iterations to find the required word. Now you can imagine the power of this function 😺.
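You can verify that 32-iteration figure yourself; with repeated halving, the number of steps is roughly log₂ of the number of pages:

```python
import math

# How many times can 4 billion pages be halved until one page is left?
print(math.ceil(math.log2(4_000_000_000)))  # 32
```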

The logarithmic function with small and big problem sizes.

The interesting thing is that the logarithmic function is not always beneficial. Let’s compare the following functions: 1/2 × n² and 6n × log n.
The first graph shows that the 6n × log n function grows more slowly than the quadratic one.
[Figure: the two functions for large inputs]

However, if we look more closely, you will notice that this advantage shows up mostly for large inputs. If we decrease n, you will see that 6n × log n grows faster than the quadratic function when the input data is small.
[Figure: the two functions for small inputs]
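A small Python sketch makes the crossover visible (assuming log base 2, as is typical in algorithm analysis; the two functions are the ones compared above):

```python
import math

def quadratic(n):
    return 0.5 * n * n

def linearithmic(n):
    return 6 * n * math.log2(n)

# For a small input, the "slower-growing" function is actually bigger…
print(linearithmic(10) > quadratic(10))   # True: about 199.3 vs 50
# …and the quadratic only overtakes it later.
for n in range(2, 1000):
    if quadratic(n) > linearithmic(n):
        print("crossover at n =", n)
        break
```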

Therefore, when you are working with small problems, it can actually pay off not to use the algorithm with the n × log n computational time: for small inputs, the constant factors matter more than the asymptotic growth.

Wow, we know a little bit more! This topic is really interesting to me, and I would be glad to see your comments here. If you have suggestions or requests, please contact me.

  • George

    I would like to see more of your topics

    • svitlana_moiseyenko

      Sorry for the long response; my schedule has been very tight. I will try to provide more articles…