Instructions to use RGBD-SOD/dptdepth with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use RGBD-SOD/dptdepth with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="RGBD-SOD/dptdepth", trust_remote_code=True)
```

```python
# Load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("RGBD-SOD/dptdepth", trust_remote_code=True, dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
File size: 736 Bytes · commit dcacd5e

```python
from typing import List

from transformers import PretrainedConfig

"""
The configuration of a model is an object that
will contain all the necessary information to build the model.
The three important things to remember when writing your own configuration are the following:
- you have to inherit from PretrainedConfig,
- the __init__ of your PretrainedConfig must accept any kwargs,
- those kwargs need to be passed to the superclass __init__.
"""


class DPTDepthConfig(PretrainedConfig):
    """
    Defining a model_type for your configuration is not mandatory,
    unless you want to register your model with the auto classes.
    """

    model_type = "dptdepth"

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
```
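The three rules in the docstring can be checked directly: because `__init__` forwards `**kwargs` to `PretrainedConfig`, any extra field survives a serialization round trip, and the `model_type` makes the class registrable with the auto classes. A minimal sketch, assuming `transformers` is installed; `num_layers` is a hypothetical extra field used only for illustration, not something the real model defines:

```python
from transformers import AutoConfig, PretrainedConfig


class DPTDepthConfig(PretrainedConfig):
    # Repeated from the file above so this sketch is self-contained.
    model_type = "dptdepth"

    def __init__(self, **kwargs):
        super().__init__(**kwargs)


# Rules 2 and 3 in practice: unknown kwargs are accepted and forwarded to
# PretrainedConfig, which stores them as attributes, so custom fields
# survive a to_dict()/from_dict() round trip.
config = DPTDepthConfig(num_layers=4)  # num_layers is a hypothetical field
restored = DPTDepthConfig.from_dict(config.to_dict())
assert restored.num_layers == 4
assert restored.model_type == "dptdepth"

# The docstring's note about auto classes: registering the model_type
# lets AutoConfig resolve this class locally by name.
AutoConfig.register("dptdepth", DPTDepthConfig)
```

Note that `AutoConfig.register` requires the registered class's `model_type` to match the string key, which is why the class attribute is set to `"dptdepth"`.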