I’m currently going through the Algorithms for DS course (part of the Data Structures and Algorithms topic in the DS path), and I’ve stumbled upon what I think is an incorrect answer.
In part 10 (Some Other Algorithms), three functions are provided and we are to assess their time complexity. I would argue that the correct answers are:
length_time_complexity = “linear” (function with element enumeration)
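For context, a minimal sketch of what such an enumeration-based length function might look like (the function name and exact loop are my assumption, not necessarily the course’s code):

```python
def length(iterable):
    # Count the elements one by one: every element is visited exactly
    # once, so the time grows linearly with the input size -> O(n).
    count = 0
    for _ in iterable:
        count += 1
    return count

print(length([1, 2, 3]))       # 3
print(length(range(100)))      # 100
```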
Just for clarification: even if we were using Python’s len() function, it would be linear, because the time taken would still depend on the length of the iterable (list, tuple, dictionary, etc.).
From my understanding, Python’s len() function should essentially just read the length attribute of a native object, making it constant time (as claimed by several people here [https://stackoverflow.com/questions/1115313/cost-of-len-function]). I’ve checked in PyCharm and I’m getting results consistent with these claims.
Could you try declaring both lists as variables in a separate notebook cell and then timing an individual call to len() on each?
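Something along these lines should work; the list sizes and the timeit setup here are my own choices, not from the course. If len() were linear, the timing for the large list would be roughly 100,000× the timing for the small one; on CPython both come out about the same, because len() just reads the stored size field.

```python
import timeit

small = list(range(10))
large = list(range(1_000_000))

# Time many repeated calls of len() on each pre-built list.
# Building the lists happens once, outside the timed code.
t_small = timeit.timeit(lambda: len(small), number=100_000)
t_large = timeit.timeit(lambda: len(large), number=100_000)

print(f"len(small): {t_small:.4f}s")
print(f"len(large): {t_large:.4f}s")
```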