Instructions to use apple/DepthPro-hf with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use apple/DepthPro-hf with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("depth-estimation", model="apple/DepthPro-hf")
```

```python
# Load model directly
from transformers import AutoImageProcessor, AutoModelForDepthEstimation

processor = AutoImageProcessor.from_pretrained("apple/DepthPro-hf")
model = AutoModelForDepthEstimation.from_pretrained("apple/DepthPro-hf")
```
- Notebooks
- Google Colab
- Kaggle
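Building on the "load model directly" snippet above, a minimal end-to-end sketch of running inference might look like the following. It assumes network access to download the model weights and uses an illustrative example image URL (the COCO sample image, not something the model card prescribes):

```python
# End-to-end depth estimation sketch (assumes network access; the image URL
# below is an illustrative example, not part of the official instructions).
import requests
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForDepthEstimation

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

processor = AutoImageProcessor.from_pretrained("apple/DepthPro-hf")
model = AutoModelForDepthEstimation.from_pretrained("apple/DepthPro-hf")

# Preprocess the image and run a forward pass without tracking gradients
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Resize the raw prediction back to the input image's resolution
post = processor.post_process_depth_estimation(
    outputs, target_sizes=[(image.height, image.width)]
)
depth = post[0]["predicted_depth"]  # 2D tensor of metric depth values
```

The `post_process_depth_estimation` step is what maps the model's fixed-resolution output back onto the original image grid, so `depth` can be visualized or compared pixel-for-pixel with the input.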
Update Model Card: Use `DepthProImageProcessor` instead of `DepthProImageProcessorFast`
#7
by geetu040 - opened
With Transformers v5, fast image processor variants are no longer exposed under a separate `Fast` suffix. As a result, the fast implementation is now used by default when calling `DepthProImageProcessor`.
This PR updates the documentation to reflect the new behavior.