
Commit 2c1a1a8

Merge branch 'main' into fix/read_image_as_pil_chw_format

2 parents f8cf1f4 + 253f76f

File tree: 10 files changed (+1042, -42 lines)

.github/workflows/ci.yml

Lines changed: 2 additions & 2 deletions

@@ -39,7 +39,7 @@ jobs:
     steps:
       - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
       - name: Setup uv python package manager
-        uses: astral-sh/setup-uv@85856786d1ce8acfbcc2f13a5f3fbd6b938f9f41 # v7.1.2
+        uses: astral-sh/setup-uv@1e862dfacbd1d6d858c55d9b792c756523627244 # v7.1.4
         with:
           python-version: ${{ matrix.python-version }}
           enable-cache: true
@@ -96,7 +96,7 @@ jobs:
         python-version: ["3.8", "3.9", "3.10", "3.11", "3.12"]
     steps:
       - name: Setup uv python package manager
-        uses: astral-sh/setup-uv@85856786d1ce8acfbcc2f13a5f3fbd6b938f9f41 # v7.1.2
+        uses: astral-sh/setup-uv@1e862dfacbd1d6d858c55d9b792c756523627244 # v7.1.4
         with:
           python-version: ${{ matrix.python-version }}
           enable-cache: true

.github/workflows/mmdet.yml

Lines changed: 1 addition & 1 deletion

@@ -69,7 +69,7 @@ jobs:
     steps:
       - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
       - name: Setup uv python package manager
-        uses: astral-sh/setup-uv@85856786d1ce8acfbcc2f13a5f3fbd6b938f9f41 # v7.1.2
+        uses: astral-sh/setup-uv@1e862dfacbd1d6d858c55d9b792c756523627244 # v7.1.4
         with:
           python-version: ${{ matrix.python-version }}
           enable-cache: true

.github/workflows/publish_docs.yml

Lines changed: 1 addition & 1 deletion

@@ -34,7 +34,7 @@ jobs:
           fetch-depth: 0
 
       - name: 🐍 Install uv and set Python ${{ matrix.python-version }}
-        uses: astral-sh/setup-uv@85856786d1ce8acfbcc2f13a5f3fbd6b938f9f41 # v7.1.2
+        uses: astral-sh/setup-uv@1e862dfacbd1d6d858c55d9b792c756523627244 # v7.1.4
         with:
           python-version: ${{ matrix.python-version }}
           activate-environment: true

.github/workflows/publish_pypi.yml

Lines changed: 1 addition & 1 deletion

@@ -10,7 +10,7 @@ jobs:
     steps:
       - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
       - name: Setup uv python package manager
-        uses: astral-sh/setup-uv@85856786d1ce8acfbcc2f13a5f3fbd6b938f9f41 # v7.1.2
+        uses: astral-sh/setup-uv@1e862dfacbd1d6d858c55d9b792c756523627244 # v7.1.4
         with:
           enable-cache: true
           prune-cache: false

.github/workflows/ruff.yaml

Lines changed: 1 addition & 1 deletion

@@ -26,7 +26,7 @@ jobs:
         with:
           python-version: '3.10'
       - name: Install uv
-        uses: astral-sh/setup-uv@85856786d1ce8acfbcc2f13a5f3fbd6b938f9f41 # v7.1.2
+        uses: astral-sh/setup-uv@1e862dfacbd1d6d858c55d9b792c756523627244 # v7.1.4
       - name: Install dependencies
         run: |
           uv venv --python 3.10

demo/inference_for_ultralytics_yoloe.ipynb

Lines changed: 996 additions & 0 deletions
Large diffs are not rendered by default.

mkdocs.yml

Lines changed: 1 addition & 0 deletions

@@ -56,6 +56,7 @@ nav:
       - Inference RTDETR: notebooks/inference_for_rtdetr.ipynb
       - Inference Roboflow: notebooks/inference_for_roboflow.ipynb
       - Inference TorchVision: notebooks/inference_for_torchvision.ipynb
+      - Inference YOLOE: notebooks/inference_for_ultralytics_yoloe.ipynb
       - Slicing: notebooks/slicing.ipynb
 
   - Code Reference:

sahi/models/yoloe.py

Lines changed: 1 addition & 1 deletion

@@ -26,7 +26,7 @@ class YOLOEDetectionModel(UltralyticsDetectionModel):
     - yoloe-11s-seg-pf.pt, yoloe-11m-seg-pf.pt, yoloe-11l-seg-pf.pt
     - yoloe-v8s-seg-pf.pt, yoloe-v8m-seg-pf.pt, yoloe-v8l-seg-pf.pt
 
-    !!! example "Usage with text prompts"
+    !!! example "Usage Text Prompts"
         ```python
         from sahi import AutoDetectionModel
 
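
Since the docstring above points at AutoDetectionModel, a hedged usage sketch may help. The model_type string "yoloe" is an assumption based on the new sahi/models/yoloe.py module, and the example image path is hypothetical; the checkpoint name comes from the docstring and get_sliced_prediction follows sahi's documented API:

from sahi import AutoDetectionModel
from sahi.predict import get_sliced_prediction

# model_type="yoloe" is an assumption based on the new sahi/models/yoloe.py module;
# the prompt-free checkpoint name is taken from the docstring above.
detection_model = AutoDetectionModel.from_pretrained(
    model_type="yoloe",
    model_path="yoloe-11s-seg-pf.pt",
    confidence_threshold=0.35,
    device="cpu",
)

# Run sliced inference on a (hypothetical) example image.
result = get_sliced_prediction(
    "demo_data/small-vehicles1.jpeg",
    detection_model,
    slice_height=512,
    slice_width=512,
    overlap_height_ratio=0.2,
    overlap_width_ratio=0.2,
)
print(len(result.object_prediction_list))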

sahi/postprocess/combine.py

Lines changed: 32 additions & 33 deletions

@@ -1,8 +1,8 @@
 from __future__ import annotations
 
+import numpy as np
 import torch
-from shapely.geometry import box
-from shapely.strtree import STRtree
+from shapely import STRtree, box
 
 from sahi.logger import logger
 from sahi.postprocess.utils import ObjectPredictionList, has_match, merge_object_prediction_pair
@@ -53,31 +53,30 @@ def nms(
     Returns:
         A list of filtered indexes, Shape: [ ,]
     """
+    if len(predictions) == 0:
+        return []
 
-    # Extract coordinates and scores as tensors
-    x1 = predictions[:, 0]
-    y1 = predictions[:, 1]
-    x2 = predictions[:, 2]
-    y2 = predictions[:, 3]
-    scores = predictions[:, 4]
+    # Ensure predictions are on CPU and convert to numpy
+    if predictions.device.type != "cpu":
+        predictions = predictions.cpu()
 
-    # Calculate areas as tensor (vectorized operation)
+    predictions_np = predictions.numpy()
+
+    # Extract coordinates and scores
+    x1 = predictions_np[:, 0]
+    y1 = predictions_np[:, 1]
+    x2 = predictions_np[:, 2]
+    y2 = predictions_np[:, 3]
+    scores = predictions_np[:, 4]
+
+    # Calculate areas
     areas = (x2 - x1) * (y2 - y1)
 
-    # Create Shapely boxes only once
-    boxes = []
-    for i in range(len(predictions)):
-        boxes.append(
-            box(
-                x1[i].item(),  # Convert only individual values
-                y1[i].item(),
-                x2[i].item(),
-                y2[i].item(),
-            )
-        )
+    # Create Shapely boxes (vectorized)
+    boxes = box(x1, y1, x2, y2)
 
-    # Sort indices by score (descending) using torch
-    sorted_idxs = torch.argsort(scores, descending=True).tolist()
+    # Sort indices by score (descending)
+    sorted_idxs = np.argsort(scores)[::-1]
 
     # Build STRtree
     tree = STRtree(boxes)
@@ -91,7 +90,7 @@ def nms(
 
         keep.append(current_idx)
         current_box = boxes[current_idx]
-        current_area = areas[current_idx].item()  # Convert only when needed
+        current_area = areas[current_idx]
 
         # Query potential intersections using STRtree
         candidate_idxs = tree.query(current_box)
@@ -108,16 +107,16 @@ def nms(
             if scores[candidate_idx] == scores[current_idx]:
                 # Use box coordinates for stable ordering
                 current_coords = (
-                    x1[current_idx].item(),
-                    y1[current_idx].item(),
-                    x2[current_idx].item(),
-                    y2[current_idx].item(),
+                    x1[current_idx],
+                    y1[current_idx],
+                    x2[current_idx],
+                    y2[current_idx],
                 )
                 candidate_coords = (
-                    x1[candidate_idx].item(),
-                    y1[candidate_idx].item(),
-                    x2[candidate_idx].item(),
-                    y2[candidate_idx].item(),
+                    x1[candidate_idx],
+                    y1[candidate_idx],
+                    x2[candidate_idx],
+                    y2[candidate_idx],
                 )
 
                 # Compare coordinates lexicographically
@@ -130,10 +129,10 @@ def nms(
 
             # Calculate metric
             if match_metric == "IOU":
-                union = current_area + areas[candidate_idx].item() - intersection
+                union = current_area + areas[candidate_idx] - intersection
                 metric = intersection / union if union > 0 else 0
            elif match_metric == "IOS":
-                smaller = min(current_area, areas[candidate_idx].item())
+                smaller = min(current_area, areas[candidate_idx])
                 metric = intersection / smaller if smaller > 0 else 0
             else:
                 raise ValueError("Invalid match_metric")
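
The refactor above replaces the per-box Python loop and torch-based sorting with numpy arrays and Shapely 2.x calls. A minimal sketch with made-up coordinates (not part of the commit) showing the two library behaviours it relies on: shapely.box accepting array inputs and STRtree.query returning candidate indices:

import numpy as np
from shapely import STRtree, box

# Hypothetical toy boxes in (x1, y1, x2, y2) form.
x1 = np.array([0.0, 5.0, 50.0])
y1 = np.array([0.0, 5.0, 50.0])
x2 = np.array([10.0, 15.0, 60.0])
y2 = np.array([10.0, 15.0, 60.0])

# Shapely 2.x builds an array of box polygons from array inputs in one call,
# which is what lets the nms() refactor drop the per-element .item() loop.
boxes = box(x1, y1, x2, y2)

# STRtree.query returns integer indices of geometries whose extents intersect
# the query geometry, so only those candidates need an exact intersection test.
tree = STRtree(boxes)
candidate_idxs = tree.query(boxes[0])
print(candidate_idxs)  # e.g. [0 1]: box 0 overlaps box 1 but not box 2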

sahi/slicing.py

Lines changed: 6 additions & 2 deletions

@@ -63,8 +63,12 @@ def get_slice_bboxes(
     y_max = y_min = 0
 
     if slice_height and slice_width:
-        y_overlap = int(overlap_height_ratio * slice_height)
-        x_overlap = int(overlap_width_ratio * slice_width)
+        if overlap_height_ratio is not None and overlap_height_ratio >= 1.0:
+            raise ValueError("Overlap ratio must be less than 1.0")
+        if overlap_width_ratio is not None and overlap_width_ratio >= 1.0:
+            raise ValueError("Overlap ratio must be less than 1.0")
+        y_overlap = int((overlap_height_ratio if overlap_height_ratio is not None else 0.2) * slice_height)
+        x_overlap = int((overlap_width_ratio if overlap_width_ratio is not None else 0.2) * slice_width)
     elif auto_slice_resolution:
         x_overlap, y_overlap, slice_width, slice_height = get_auto_slice_params(height=image_height, width=image_width)
     else:
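
The new guard rejects overlap ratios of 1.0 or more (which would otherwise produce slice windows that never advance) and falls back to 0.2 when a ratio is None. A hedged sketch of the resulting behaviour, assuming get_slice_bboxes accepts the keyword arguments shown in the diff:

from sahi.slicing import get_slice_bboxes

# An overlap ratio >= 1.0 is now rejected up front.
try:
    get_slice_bboxes(
        image_height=1080,
        image_width=1920,
        slice_height=512,
        slice_width=512,
        overlap_height_ratio=1.0,  # invalid: must be < 1.0
        overlap_width_ratio=0.2,
    )
except ValueError as err:
    print(err)  # Overlap ratio must be less than 1.0

# A None ratio falls back to the 0.2 default used in the new code path.
slice_bboxes = get_slice_bboxes(
    image_height=1080,
    image_width=1920,
    slice_height=512,
    slice_width=512,
    overlap_height_ratio=None,
    overlap_width_ratio=None,
)
print(slice_bboxes[0])  # first slice starts at the origin, e.g. [0, 0, 512, 512]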
