I will answer about Python.
The lack of encapsulation and function overloading is surprising.
Encapsulation in Python, although quite loose, does exist. If a method name starts with __, the interpreter automatically prefixes it with _<current class> (name mangling), so the method is not visible under its original name from another class. Private methods:
""" >>> obj = bar() >>> obj.__test() Traceback (most recent call last): ... AttributeError: 'bar' object has no attribute '__test' >>> obj.test2() Traceback (most recent call last): ... AttributeError: 'super' object has no attribute '_bar__test' >>> obj._foo__test() Hello >>> obj._foo__test <bound method bar.__test of <__main__.bar object at 0x...>> """ import doctest class foo(): def __test(self): print('Hello') class bar(foo): def test2(self): super().__test() doctest.testmod(optionflags=doctest.ELLIPSIS)
Thus, encapsulation of class attributes is achieved. Of course, you can still call such a method by its full (mangled) name. But then, in C# you can reach private methods through reflection, and that does not make those methods public either.
To create abstract classes, a special metaclass together with decorators is used.
""" >>> obj = bar() >>> obj.test() Hello >>> obj = baz() Traceback (most recent call last): ... TypeError: Can't instantiate abstract class baz with abstract methods test """ import abc import doctest class foo(metaclass=abc.ABCMeta): @abc.abstractmethod def test(self): print('Hello') class bar(foo): def test(self): super().test() class baz(foo): pass doctest.testmod()
Interfaces are implemented in the same way.
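To illustrate (the class names here are my own, not from the original): an "interface" is simply an ABC that declares only abstract methods. A class that implements all of them can be instantiated, while the ABC itself cannot:

```python
import abc
import json

class Serializer(abc.ABC):
    """Plays the role of an interface: no state, only abstract methods."""

    @abc.abstractmethod
    def dumps(self, obj):
        ...

class JsonSerializer(Serializer):
    """A concrete implementation of the 'interface'."""

    def dumps(self, obj):
        return json.dumps(obj)

print(JsonSerializer().dumps({'a': 1}))  # prints {"a": 1}
```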
As for function overloading: in a dynamic language, where a function can take a variable number of parameters and have default values, overloading is impossible, and it is also simply unnecessary. Example:
""" >>> foo(1, 2, 3, 4, 5, arg1=1, arg2=2, bar=False) (1, 2, (3, 4, 5), False, {'arg1': 1, 'arg2': 2}) """ import doctest def foo(first, second, *args, bar=True, **kwargs): print( (first, second, args, bar, kwargs) ) doctest.testmod()
It is not clear what value any function returns.
Yes, this drawback is probably inherent in all dynamic languages. In Python's defense, though, the problem is partially mitigated by annotations, a full-featured IDE, and well-documented functions.
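For example (a toy function of my own, not from the original), annotations are stored right on the function object, where an IDE or documentation tool can pick them up:

```python
def repeat(text: str, times: int = 2) -> str:
    """Return `text` repeated `times` times."""
    return text * times

# annotations do not affect execution, but they are introspectable:
print(repeat.__annotations__)  # {'text': <class 'str'>, 'times': <class 'int'>, 'return': <class 'str'>}
print(repeat('ab'))  # prints abab
```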
Moreover, with annotations, metaclasses and decorators it is not hard to implement static type checking as well :) That is, something like the following:
""" >>> foo().bar([1, 2, 3]) Traceback (most recent call last): ... TypeError: 'param' is 'list', but the expected type is 'int' >>> foo().bar(1) Traceback (most recent call last): ... TypeError: function return is 'str', but the expected type is 'bool' """ class foo(metaclass=StrongTyping): def bar(param: int) -> bool: return 'some text'
Of course, for this example to work, you would also need to implement the StrongTyping metaclass.
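A minimal sketch of such a metaclass (my own hypothetical implementation, not the author's: it checks annotated arguments and the return value with isinstance at call time, which is far cruder than real static analysis):

```python
import functools
import inspect

class StrongTyping(type):
    """Metaclass that wraps annotated methods with runtime type checks."""

    def __new__(mcls, name, bases, namespace):
        for attr, value in list(namespace.items()):
            # wrap only plain functions that actually carry annotations
            if callable(value) and getattr(value, '__annotations__', None):
                namespace[attr] = mcls._checked(value)
        return super().__new__(mcls, name, bases, namespace)

    @staticmethod
    def _checked(func):
        hints = func.__annotations__
        sig = inspect.signature(func)

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            bound = sig.bind(*args, **kwargs)
            # check every annotated argument against its declared type
            for pname, pvalue in bound.arguments.items():
                expected = hints.get(pname)
                if expected and not isinstance(pvalue, expected):
                    raise TypeError("%r is %r, but the expected type is %r"
                                    % (pname, type(pvalue).__name__, expected.__name__))
            ret = func(*args, **kwargs)
            # check the return value against the '->' annotation
            expected = hints.get('return')
            if expected and not isinstance(ret, expected):
                raise TypeError("function return is %r, but the expected type is %r"
                                % (type(ret).__name__, expected.__name__))
            return ret
        return wrapper

class foo(metaclass=StrongTyping):
    def bar(self, param: int) -> bool:
        return 'some text'
```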
The code seemed unreadable simply because of the lack of braces.
This is probably a matter of habit. The absence of braces forces the programmer to watch the indentation and stick to a single indentation character, which, in my opinion, is no small benefit. As for visually separating methods, my IDE helps me there by drawing horizontal lines between them.
Somewhere I came across the advice to call functions less often, especially in loops, since a function call is an expensive operation in Python. Huge 200+ line methods immediately sprang to mind.
In my opinion, Python's flexibility, and features such as generator expressions, on the contrary allow writing very compact and concise constructs. Where a statically typed language needs a pile of loops and conditionals, Python often solves the problem in a single line.
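A trivial illustration of my own: summing the squares of the even numbers in a list, which in many statically typed languages would take an explicit loop, a conditional and an accumulator, fits in one generator expression:

```python
numbers = [1, 2, 3, 4, 5, 6]

# filter, transform and reduce in a single lazy expression
total = sum(n * n for n in numbers if n % 2 == 0)
print(total)  # 4 + 16 + 36 = 56
```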
UPD. What are the advantages of dynamic typing? IMHO, the dynamism itself. Where else could you write something like this:
""" >>> op = ImportantClass() >>> op.foo(1) >>> ImportantClass.debug(True) >>> op.foo(2) Called <function foo at 0x...> (<__main__.ImportantClass object at 0x...>, 2) {} None """ import doctest import functools def logging(func): """ Декоратор функции, который логирует все вызовы """ @functools.wraps(func) def wrapper(*args, **kwargs): ret = func(*args, **kwargs) print('Called ', func, args, kwargs, ret) return ret # одна из особенностей питона - это то, что все является объектами # добавляем атрибут к функции wrapper.original = func return wrapper def debugging(cls): """ Декоратор класса, добавляет метод debug(enable) """ @classmethod def debug(cls, enable=True): attr = '__call__' if enable else 'original' call = lambda f: logging(f) if enable else f.original # в цикле подменяются все функции на logging(f), либо на оригинальные, # которые хранятся в декораторе for func in [call(f) for f in cls.__dict__.values() if hasattr(f, attr)]: setattr(cls, func.__name__, func) # добавляем новый метод для декорируемого класс # именно для класса, декоратор вызывается только один раз cls.debug = debug return cls @debugging class ImportantClass(): def foo(self, *args, **kwargs): pass # do something important def bar(self, *args, **kwargs): pass # also do something doctest.testmod(optionflags=doctest.ELLIPSIS)
The logging and debugging decorators can be moved into a separate module and used with any class. Moreover, with debugging turned off there is practically no overhead, since the methods are swapped back for the originals rather than wrapped.
On the other hand, this same dynamism can be a source of errors, so it has to be used with care, and testing must not be forgotten.
By the way, Python is actually considered to have both strong and dynamic typing. For example, the code "1" + 1 throws a TypeError exception.