If I don't call the method self.calc_length() in the __init__ function, I get the same output. So what's the use of calling it there, and what is the difference?

Screen Link:
https://app.dataquest.io/m/352/object-oriented-python/10/creating-and-updating-an-attribute

My Code:

class NewList(DQ):
    """
    A Python list with some extras!
    """
    def __init__(self, initial_state):
        self.data = initial_state

    def append(self, new_item):
        """
        Append `new_item` to the NewList
        """
        self.data = self.data + [new_item]
        self.calc_length()

    def calc_length(self):
        length = 0
        for row in self.data:
            length += 1
        self.length = length

fibonacci = NewList([1,1,2,3,5])
print(fibonacci.data)
fibonacci.append(8)
print(fibonacci.length)

What I expected to happen:
The only difference in my code is that I removed self.calc_length() from the __init__ method. I expected an error or a different output.

What actually happened:
But I got the same output as with self.calc_length() included in __init__. So, what's the difference? Is self.calc_length() not required in this particular code but needed elsewhere, or is it included just as a best practice?

[1, 1, 2, 3, 5]
6
This is the output in both cases, with and without self.calc_length() in __init__().

I think I got it, but please correct me if I am wrong. It's because I am calling the append() method, which in turn calls self.calc_length(), so there is no difference in output and no error. If instead I had written only:
fibonacci = NewList([1,1,2,3,5])
print(fibonacci.data)
print(fibonacci.length)

it would have raised an AttributeError, because append() (which calls self.calc_length()) is never invoked, and __init__ never sets the length attribute.
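To check this reasoning, here is a minimal sketch comparing the two versions side by side. It drops the DQ base class from the lesson (so it runs standalone) and adds a hypothetical init_length flag of my own, purely so one class can demonstrate both behaviors:

    class NewList:
        """Simplified NewList; the DQ base class from the lesson is omitted."""
        def __init__(self, initial_state, init_length=False):
            self.data = initial_state
            if init_length:
                # Mirrors the lesson's version, which calls self.calc_length() here
                self.calc_length()

        def append(self, new_item):
            self.data = self.data + [new_item]
            self.calc_length()

        def calc_length(self):
            length = 0
            for row in self.data:
                length += 1
            self.length = length

    # Without the __init__ call, .length does not exist until append() runs
    fib = NewList([1, 1, 2, 3, 5])
    try:
        print(fib.length)
    except AttributeError as err:
        print("AttributeError:", err)

    # With the __init__ call, .length is available immediately
    fib2 = NewList([1, 1, 2, 3, 5], init_length=True)
    print(fib2.length)  # prints 5, before any append()

So the attribute self.length is only created when calc_length() actually runs; calling it in __init__ guarantees it exists from the moment the object is constructed, not just after the first append().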
