Unable to specify a default value for a generic parameter #3737
I think the problem here is that, in general (e.g. if there are other parameters also using the same type variable), the default value may not match every possible solution for that variable. To make this work, you can use `@overload`:

```python
@overload
def foo() -> int: ...
@overload
def foo(a: _T) -> _T: ...
def foo(a=42):  # Implementation, assuming this isn't a stub
    return a
```

That's what we do in a few places in typeshed too (without the implementation though), e.g. check out |
I'm running into a similar issue when trying to iron out a bug in the configparser stubs. Reducing `RawConfigParser` to the relevant parts gives:

```python
_section = Mapping[str, str]

class RawConfigParser:
    def __init__(self, dict_type: Mapping[str, str] = ...) -> None: ...
    def defaults(self) -> _section: ...
```

This is incorrect at the moment, because the actual return type depends on the `dict_type` passed in, so I tried making the class generic:

```python
_T1 = TypeVar('_T1', bound=Mapping)

class RawConfigParser(Generic[_T1]):
    def __init__(self, dict_type: Type[_T1] = ...) -> None: ...
    def defaults(self) -> _T1: ...
```

but when you don't pass a `dict_type`, mypy can't infer what type it should be using and asks for a type annotation where it is being used. This is less than ideal for the default instantiation of the class (which will be what the overwhelming majority of people are using). So I tried adding the appropriate default to the definition:

```python
_T1 = TypeVar('_T1', bound=Mapping)

class RawConfigParser(_parser, Generic[_T1]):
    def __init__(self, dict_type: Type[_T1] = OrderedDict) -> None: ...
    def defaults(self) -> _T1: ...
```

but then I hit the error that OP was seeing:
This feels like something that isn't currently supported, but I'm not sure if I'm missing a way that I could do this... |
@OddBloke |
@ilevkivskyi I don't follow, I'm afraid; could you expand on that a little, please? |
Consider this code:

```python
class UserMapping(Mapping):
    ...

RawConfigParser[UserMapping]()
```

In this case |
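To make the unsoundness concrete, here is a minimal runnable sketch (simplified names; `UserMapping` subclasses `dict` just to keep it short, and the `type: ignore` silences the error mypy currently raises). If the `OrderedDict` default were accepted, `defaults()` on a `RawConfigParser[UserMapping]` would be typed `UserMapping` yet actually return an `OrderedDict`:

```python
from collections import OrderedDict
from typing import Generic, Mapping, Type, TypeVar

_T1 = TypeVar('_T1', bound=Mapping)

class RawConfigParser(Generic[_T1]):
    # mypy rejects this default precisely because _T1 may be solved
    # to a different Mapping subtype at the call site
    def __init__(self, dict_type: Type[_T1] = OrderedDict) -> None:  # type: ignore[assignment]
        self._defaults = dict_type()

    def defaults(self) -> _T1:
        return self._defaults  # statically _T1, really whatever dict_type built

class UserMapping(dict):
    """Simplified stand-in for a user-defined Mapping."""

parser: "RawConfigParser[UserMapping]" = RawConfigParser()  # default used
d = parser.defaults()       # statically UserMapping...
print(type(d).__name__)     # OrderedDict -- ...but not at runtime
```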
@ilevkivskyi Thanks, that makes sense. Do you think what I'm trying to do is unrepresentable as things stand? |
I didn't think enough about this, but my general rule is not to be "obsessed" with precise types, sometimes |
Allowing default values for generic parameters might also enable using type constraints to ensure that generics are only used in specified ways. This is a bit esoteric, but here is an example of how you might use the same implementation for a dict and a set while requiring that consumers of the dict type always provide a value to `insert`:

```python
class Never(Enum):
    """An uninhabited type"""

class Only(Enum):
    """A type with one member (kind of like NoneType)"""
    ONE = 1

NOT_PASSED = Only(1)

class MyGenericMap(Generic[K, V]):
    def insert(self, key, value=cast(Never, NOT_PASSED)):
        # type: (K, V) -> None
        ...
        if value is NOT_PASSED:
            ...

class MySet(MyGenericMap[K, Never]):
    pass

class MyDict(MyGenericMap[K, V]):
    pass

my_set = MySet[str]()
my_set.insert('hi')           # this is fine, default value matches concrete type
my_set.insert('hi', 'hello')  # type error, because 'hello' is not type Never
my_dict = MyDict[str, int]()
my_dict.insert('hi')          # type error, because the default's type (Never) is not int
my_dict.insert('hi', 22)      # this is fine, because the default value is not used
```
 |
Huh, this just came up in our code. |
This request already appeared five times, so I am raising priority to high. I think my preferred solution for this would be to support lower bounds for type variables. It is not too hard to implement (but still a large addition), and at the same time it may provide more expressiveness in other situations. |
A caveat about the overload workaround should be noted: the implementation header of the overloaded function must omit the generic annotation for that variable, but that is not acceptable for more complex functions whose bodies should be checked. Overloading only checks the binding between input and output types. The most precise solution for the body seems to be encapsulating it in a new function with the same types, but without a default:

```python
@overload
def foo() -> int: ...
@overload
def foo(a: _T) -> _T: ...
def foo(a=42):  # unchecked implementation
    return foo_internal(a)

def foo_internal(a: _T) -> _T:
    # a checked complicated body moved here
    return a
```

Almost everything could be checked even in more complicated cases, but the number of necessary overloaded declarations could rise exponentially by

EDIT |
Another solution is to use a broad static type for that parameter and immediately assign it in the body to the original exact generic type. It is a preferable solution for a class overload.

```python
_T = TypeVar('_T', list, dict)

class Cursor(Generic[_T]):
    @overload
    def __init__(self, connection: Connection) -> None: ...
    @overload
    def __init__(self, connection: Connection, row_type: Type[_T]) -> None: ...
    def __init__(self, connection: Connection, row_type: Type = list) -> None:
        self.row_type: Type[_T] = row_type  # this annotation is important
        ...
    def fetchone(self) -> Optional[_T]: ...
    def fetchall(self) -> List[_T]: ...  # more methods depend on _T type

cursor = Cursor(connection, dict)
# cursor.execute(...)
reveal_type(cursor.fetchone())  # dict
```
 |
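A self-contained runtime sketch of this pattern, with a minimal hypothetical `Connection` stand-in (the original's `Connection` is not shown) and toy `fetchone` logic:

```python
from typing import Generic, Optional, Type, TypeVar, overload

class Connection:
    """Hypothetical stand-in for the original's Connection."""
    def __init__(self) -> None:
        self.rows = [("a", 1), ("b", 2)]

_T = TypeVar('_T', list, dict)

class Cursor(Generic[_T]):
    @overload
    def __init__(self, connection: Connection) -> None: ...
    @overload
    def __init__(self, connection: Connection, row_type: Type[_T]) -> None: ...
    def __init__(self, connection: Connection, row_type: Type = list) -> None:
        # broad static type (Type) immediately narrowed to the generic type
        self.row_type: Type[_T] = row_type
        self._rows = iter(connection.rows)

    def fetchone(self) -> Optional[_T]:
        row = next(self._rows, None)
        if row is None:
            return None
        # dict wants an iterable of pairs; list wants a plain iterable
        return self.row_type([row]) if self.row_type is dict else self.row_type(row)

cursor = Cursor(Connection(), dict)
print(cursor.fetchone())  # {'a': 1}
```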
Wait. Why is that a problem? If the default value doesn't work for some calls, that should be a type error at the call site.

```python
_T = TypeVar('_T')

def foo(a: List[_T] = [42], b: List[_T] = [42]) -> List[_T]:
    return a + b

foo()                      # This should work.
foo(a=[17])                # This should work.
foo(b=[17])                # This should work.
foo(a=["foo"])             # This should be a type error (_T cannot be both str and int).
foo(b=["foo"])             # This should be a type error (_T cannot be both int and str).
foo(a=["foo"], b=["foo"])  # This should work.
```

Only if we retain the requirement that the default value always works, which is not what's being requested here. The desired result is:

```python
_T1 = TypeVar('_T1', bound=Mapping)

class RawConfigParser(_parser, Generic[_T1]):
    def __init__(self, dict_type: Type[_T1] = OrderedDict) -> None: ...
    def defaults(self) -> _T1: ...

class UserMapping(Mapping):
    ...

RawConfigParser[OrderedDict]()             # This should work.
RawConfigParser[UserMapping]()             # This should be a type error (_T1 cannot be both UserMapping and OrderedDict).
RawConfigParser[UserMapping](UserMapping)  # This should work.
```
 |
I ran into this trying to write `str2int`:

```python
_T = TypeVar('_T')

def str2int(s: str, default: _T = None) -> Union[int, _T]:
    # error: Incompatible default for argument "default"
    # (default has type "None", argument has type "_T")
    try:
        return int(s)
    except ValueError:
        return default
```

(after figuring out that I need `@overload`):

```python
@overload
def str2int(s: str) -> Optional[int]: ...
@overload
def str2int(s: str, default: _T) -> Union[int, _T]: ...
def str2int(s, default=None):
    the_body_is_no_longer_type_checked
```

I think this raises the bar for type-checking newbies. |
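Filling in the implementation body makes the workaround runnable; the trade-off is exactly as described, since the body is not checked against the overload signatures:

```python
from typing import Optional, TypeVar, Union, overload

_T = TypeVar('_T')

@overload
def str2int(s: str) -> Optional[int]: ...
@overload
def str2int(s: str, default: _T) -> Union[int, _T]: ...
def str2int(s, default=None):
    # unchecked implementation body
    try:
        return int(s)
    except ValueError:
        return default

print(str2int("5"))        # 5
print(str2int("nope"))     # None
print(str2int("nope", 0))  # 0
```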
May I ask if there's a little more detail on this design decision? Why are function implementations not checked for overloads? Searches on the topic are giving me blanks. I'm curious why mypy would not check a definition against each overload variant. It would only need to revisit one function per overload, so I can't see the cost being that high, nor the implementation that complex. I wonder what I'm missing? |
Generic parameters and defaults don't mix well (see python/mypy#3737) so we need to use overloads instead, or InstancesResult is always InstancesResult[Any, Any]. Overloads don't work on retrieve() due to defaults and ordering of params, so we create two new methods that can be safely typed.
Let's close this due to PEP 696? |
IIUC I don't think this should be closed. This shouldn't require specifying a default on the TypeVar explicitly, and mypy's behaviour doesn't change even if you do (a default type is a little different from a default value). |
@hauntsaninja In that case, sorry for causing the mess. |
It seems like there are some workarounds available for class methods and normal functions, but does someone have a workaround for class attributes? E.g.:

```python
config_parse_string: ConfigParseCallback[str]

class ConfigSetting(Generic[T]):
    parse: ConfigParseCallback[T] = config_parse_string
```
 |
I'm looking at this and it seems like one potential solution would be updating |
@mcmanustfj That is not a solution because we do not want the default’s type to constrain call sites where the default is not used. In your example, we need to accept |
@andersk My thought was to only change the function type checking and not the invocation type checking, but I could definitely be misunderstanding how check_default_args works. My test patch has the following change:
Right now,
has the opposite problem and doesn't give the default invocation a type at all,
Not sure where to go from here, unfortunately. |
@mcmanustfj Just conceptually, changing only the function type checking as you describe is still not a solution, because it would fail to reject this wrong definition:

```python
def f[T](var: T = "asdf") -> T:
    # we should not be able to assume var: str here
    return var.lower()

f(42)
```

and fail to reject this wrong invocation:

```python
def f[T](var: T = "asdf") -> T:
    return var

# when the default is used, we should not let T be anything but str
ret: int = f()
```

The correct solution needs to involve type-checking the default value at each call site if the default is used there. That is, we need to treat

```python
def f[T](var: T = "asdf") -> T:
    return var

f()
f(arg)
```

as the syntactic sugar it is for something like

```python
f_var_default = "asdf"

def f[T](var: T) -> T:
    return var

f(f_var_default)
f(arg)
```

(with perhaps an additional definition-site check that there exists some |
Simplified Example:
Real World example is something closer to: