Need hash_funcs for type PyCapsule #1244
Closed
Labels: feature:cache (related to `st.cache_data` and `st.cache_resource`), feature:cache-hash-func (related to cache hashing functions), type:enhancement (requests for feature enhancements or new features)
Description
Summary
I cannot use caching when instantiating a PyTorch model via torchvision.models.__dict__, although it does work with torchvision.models.resnet18.
Steps to reproduce
```python
import streamlit as st
import torchvision

@st.cache(allow_output_mutation=True)
def fun():
    model = torchvision.models.__dict__['resnet18'](pretrained=True)
    return model

model = fun()
```
Expected behavior:
It should work just fine as this does work:
```python
def fun():
    model = torchvision.models.resnet18(pretrained=True)
    return model
```
Actual behavior:
The app throws the following error:
UnhashableType: Cannot hash object of type PyCapsule
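Until a built-in hash func exists for PyCapsule, a workaround is to pass a `hash_funcs` mapping for that type yourself. The awkward part is that PyCapsule is not importable from any module; one way to get a handle on the type is via CPython's `datetime` module, which exposes its C API as a capsule object. A minimal sketch (the `st.cache` usage shown in the comment is the intended application, not executed here):

```python
import datetime

# CPython's datetime module exposes its C API as a PyCapsule instance,
# giving us the otherwise un-importable PyCapsule type.
PyCapsuleType = type(datetime.datetime_CAPI)

# Map the type to id() so capsules are hashed by object identity, which
# is stable for the lifetime of the process.
hash_funcs = {PyCapsuleType: id}

# Intended usage (not run here):
# @st.cache(allow_output_mutation=True, hash_funcs=hash_funcs)
# def fun(): ...
```

Hashing by `id` means the cache key only distinguishes capsule *objects*, not their contents, which is usually acceptable for module-level C API capsules like this one.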
Is this a regression?
NA
Debug info
- Streamlit version: 0.56.0
- Python version: 3.6.8 (Anaconda)
- Using Conda
- OS version: macOS 10.14.6
- Browser version: Safari 13.0.4
Additional information
A smaller reproduction without PyTorch:

```python
import streamlit as st
from types import MappingProxyType

class C():
    a = 1

@st.cache(allow_output_mutation=True)
def f():
    return C.__dict__['a']

f()
```
raises UnhashableType: Cannot hash object of type mappingproxy,
which I can resolve by adding hash_funcs={MappingProxyType: id} to the st.cache arguments.
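Since mappingproxy *is* importable (as types.MappingProxyType), that case is easy to express; the sketch below shows, without Streamlit installed, what the hash_funcs mapping resolves to (the st.cache decoration in the comment is the intended application):

```python
from types import MappingProxyType

class C:
    a = 1

# A class __dict__ is a mappingproxy, which the caching hasher rejects.
assert isinstance(C.__dict__, MappingProxyType)

# hash_funcs maps the offending type to a surrogate hash function;
# id() hashes by object identity.
hash_funcs = {MappingProxyType: id}
surrogate = hash_funcs[type(C.__dict__)](C.__dict__)

# Intended usage (not run here):
# @st.cache(allow_output_mutation=True, hash_funcs={MappingProxyType: id})
# def f(): ...
```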