File                    Size        Last commit
assets                  -           Upload folder using huggingface_hub
(name missing)          1.57 kB     Upload folder using huggingface_hub
(name missing)          20.8 kB     Upload folder using huggingface_hub
(name missing)          383 Bytes   Add pipeline tag (#1)
cifar10.pt              225 MB      Upload folder using huggingface_hub
Detected Pickle imports (19):
- typing.Any
- torch._utils._rebuild_parameter
- omegaconf.nodes.AnyNode
- collections.defaultdict
- torch._utils._rebuild_tensor_v2
- builtins.list
- omegaconf.listconfig.ListConfig
- omegaconf.dictconfig.DictConfig
- numpy.dtype
- omegaconf.base.ContainerMetadata
- torch.nn.modules.container.ModuleDict
- torch.storage._load_from_bytes
- numpy._core.multiarray.scalar
- builtins.dict
- torch_utils.persistence._reconstruct_persistent_obj
- omegaconf.base.Metadata
- builtins.int
- collections.OrderedDict
- torch.float32
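The import list above (torch.storage._load_from_bytes, omegaconf.*, torch_utils.persistence) indicates that cifar10.pt is a fully pickled object rather than a bare state_dict, so opening it runs the unpickler on arbitrary classes. A minimal loading sketch, assuming you trust the file and run it from a checkout of the training repo so that torch_utils and omegaconf are importable; the plain-pickle fallback is an assumption about how the file was written:

    import pickle
    import torch

    path = "cifar10.pt"
    try:
        # Works if the file is a torch.save archive; weights_only=False is needed
        # because the pickle references classes outside PyTorch's allowlist.
        ckpt = torch.load(path, map_location="cpu", weights_only=False)
    except Exception:
        # Fall back to a plain pickle.dump file (torch.storage._load_from_bytes
        # in the import list suggests it may have been written this way).
        with open(path, "rb") as f:
            ckpt = pickle.load(f)

    print(type(ckpt))
    if isinstance(ckpt, dict):
        print(list(ckpt.keys()))

Note that weights_only=False opts out of PyTorch's restricted unpickler, so it should only be used on files you trust.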
imagenet256_s_a1.pkl    2.71 GB     Upload folder using huggingface_hub
Detected Pickle imports (30):
- timm.layers.format.Format
- torch._utils._rebuild_tensor_v2
- torch.nn.modules.linear.Identity
- timm.layers.mlp.Mlp
- torch.nn.modules.conv.Conv2d
- omegaconf.dictconfig.DictConfig
- torch._utils._rebuild_parameter
- torch.nn.modules.activation.SiLU
- torch.float32
- torch.nn.modules.container.ModuleList
- collections.defaultdict
- torch.nn.modules.dropout.Dropout
- torch.nn.modules.linear.Linear
- builtins.int
- torch.nn.modules.activation.GELU
- torch_utils.persistence._reconstruct_persistent_obj
- torch.nn.modules.sparse.Embedding
- typing.Any
- timm.layers.patch_embed.PatchEmbed
- timm.models.vision_transformer.Attention
- torch.nn.modules.normalization.LayerNorm
- omegaconf.base.ContainerMetadata
- omegaconf.listconfig.ListConfig
- omegaconf.base.Metadata
- torch.nn.modules.container.Sequential
- omegaconf.nodes.AnyNode
- collections.OrderedDict
- builtins.dict
- torch.storage._load_from_bytes
- builtins.list
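The same caveat applies to imagenet256_s_a1.pkl: the imports show a pickled timm vision transformer wrapped with torch_utils.persistence, so the training repo and timm must be importable at load time. If you want to audit a file like this yourself, a list equivalent to "Detected Pickle imports" can be produced locally without executing anything by scanning the pickle opcodes with the standard library. A rough sketch; the STACK_GLOBAL handling is approximate (it assumes the module and attribute strings appear immediately before the opcode), which is usually the case for checkpoints like these:

    import pickletools

    def pickle_imports(path):
        """List module.attribute references in a plain .pkl without unpickling it."""
        found, strings = set(), []
        with open(path, "rb") as f:
            for opcode, arg, _ in pickletools.genops(f):
                if opcode.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE"):
                    strings.append(arg)
                elif opcode.name == "GLOBAL":
                    # older protocols encode this as one "module attribute" string
                    found.add(arg.replace(" ", ".", 1))
                elif opcode.name == "STACK_GLOBAL" and len(strings) >= 2:
                    found.add(strings[-2] + "." + strings[-1])
        return sorted(found)

    for name in pickle_imports("imagenet256_s_a1.pkl"):
        print(name)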
imagenet256_ts_a2.pkl   2.71 GB     Upload folder using huggingface_hub
Detected Pickle imports (30):
- timm.layers.patch_embed.PatchEmbed
- typing.Any
- torch._utils._rebuild_parameter
- timm.layers.mlp.Mlp
- omegaconf.nodes.AnyNode
- collections.defaultdict
- torch._utils._rebuild_tensor_v2
- torch.nn.modules.normalization.LayerNorm
- builtins.list
- torch.nn.modules.container.Sequential
- torch.nn.modules.dropout.Dropout
- omegaconf.listconfig.ListConfig
- torch.nn.modules.sparse.Embedding
- timm.layers.format.Format
- omegaconf.dictconfig.DictConfig
- torch.nn.modules.linear.Linear
- omegaconf.base.ContainerMetadata
- torch.nn.modules.linear.Identity
- torch.nn.modules.activation.SiLU
- torch.storage._load_from_bytes
- torch.nn.modules.container.ModuleList
- torch_utils.persistence._reconstruct_persistent_obj
- builtins.dict
- omegaconf.base.Metadata
- builtins.int
- timm.models.vision_transformer.Attention
- collections.OrderedDict
- torch.float32
- torch.nn.modules.activation.GELU
- torch.nn.modules.conv.Conv2d
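All three checkpoints depend on pickle, which imports the modules listed above and calls the referenced objects at load time. A common mitigation is to load the file once in a trusted environment and re-save only the tensors as safetensors, so downstream users never have to unpickle arbitrary code. A sketch under the assumption that the pickled object is, or contains, an nn.Module; the "net" key and the state_dict() call are guesses about its layout:

    import pickle
    import torch
    from safetensors.torch import save_file

    # Load in a trusted environment, with the training repo (torch_utils),
    # omegaconf and timm importable so the pickled classes can be reconstructed.
    with open("imagenet256_ts_a2.pkl", "rb") as f:
        obj = pickle.load(f)

    # "net" is a hypothetical key; adapt this to the actual structure of the pickle.
    net = obj.get("net", obj) if isinstance(obj, dict) else obj
    state = net.state_dict() if hasattr(net, "state_dict") else net

    # safetensors stores plain tensors only, so keep tensor entries and move them to CPU.
    state = {k: v.detach().cpu().contiguous() for k, v in state.items() if torch.is_tensor(v)}
    save_file(state, "imagenet256_ts_a2.safetensors")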