Metadata-Version: 2.4
Name: mobilesam_lite
Version: 0.1.0
Summary: Unofficial MobileSAMv2 and MobileSAM software package for lightweight Segment Anything and everything inference.
Author: bill2239
License: Apache-2.0
Project-URL: Homepage, https://github.com/bill2239/mobilesam_lite
Project-URL: Repository, https://github.com/bill2239/mobilesam_lite
Project-URL: Issues, https://github.com/bill2239/mobilesam_lite/issues
Keywords: segmentation,sam,segment-anything,computer-vision,pytorch
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Science/Research
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Topic :: Scientific/Engineering :: Artificial Intelligence
Requires-Python: >=3.8
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: matplotlib>=3.3
Requires-Dist: numpy>=1.23
Requires-Dist: opencv-python>=4.6
Requires-Dist: Pillow>=7.1.2
Requires-Dist: psutil>=5.9
Requires-Dist: PyYAML>=5.3.1
Requires-Dist: requests>=2.23
Requires-Dist: scipy>=1.4.1
Requires-Dist: torch>=2.1
Requires-Dist: torchvision>=0.16
Requires-Dist: timm>=0.9.12
Requires-Dist: tqdm>=4.64
Provides-Extra: dev
Requires-Dist: build>=1.2.2; extra == "dev"
Requires-Dist: twine>=5.1.1; extra == "dev"
Dynamic: license-file

# MobileSAM_lite 

An unofficial Python package providing the MobileSAM and MobileSAMv2 inference runtimes, with added support for lighter encoder models that are not available in the original implementation.

This package vendors the runtime code needed for inference:

- `mobilesamv2`
- `tinyvit`
- `efficientvit`
- `ultralytics` (vendored under `mobilesam_lite/_vendor/ultralytics`)

It intentionally does not bundle model checkpoints. Download weights separately and pass the checkpoint path at runtime.

The optional `mobilesamv2.promt_mobilesamv2` module resolves its Ultralytics dependency from the vendored package in `mobilesam_lite._vendor.ultralytics`, so it does not need a separately installed `ultralytics`.
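Vendored dependencies of this kind are typically resolved by registering the vendored copy in `sys.modules` before the dependent module imports it. Whether `mobilesam_lite` uses exactly this mechanism is an assumption; the sketch below demonstrates the general technique, using the stdlib `json` module as a stand-in for the vendored package:

```python
import importlib
import sys


def alias_vendored(public_name: str, vendored_name: str) -> None:
    """Register a vendored package under the import name callers expect."""
    module = importlib.import_module(vendored_name)
    sys.modules[public_name] = module


# Stand-in demo: alias the stdlib `json` module under a made-up public name.
# In this package the vendored name would be mobilesam_lite._vendor.ultralytics.
alias_vendored("demo_ultralytics", "json")

import demo_ultralytics  # resolves to the vendored (here: json) module

assert demo_ultralytics is sys.modules["json"]
```

After the alias is registered, any later `import demo_ultralytics` in the process picks up the vendored copy without touching `sys.path`.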

## Install locally

```bash
pip install -e .
```

## Build distributions

```bash
python -m build
```

This will generate wheel and source distributions under `dist/`.

## Example

```python
import torch

from mobilesam_lite.mobile_sam import SamPredictor, sam_model_registry

# Weights are not bundled with the package; download mobile_sam.pt separately.
model = sam_model_registry["vit_t"](checkpoint="./weight/mobile_sam.pt")
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
model.eval()

predictor = SamPredictor(model)
```

## Verify an installed wheel

After installing the wheel into a clean environment, run:

```bash
python example_inference_mobilesam.py --checkpoint /path/to/mobile_sam.pt
```

You can also provide a real image:

```bash
python example_inference_mobilesam.py --checkpoint /path/to/mobile_sam.pt --image /path/to/image.jpg
```

The script prints the installed distribution version, the imported package path, and the output tensor shapes from one prediction call.

For the MobileSAMv2 decoder path, use:

```bash
python example_inference_mobilesamv2.py \
  --checkpoint /path/to/mobile_sam.pt \
  --prompt-decoder-checkpoint /path/to/Prompt_guided_Mask_Decoder.pt \
  --object-aware-model-checkpoint /path/to/ObjectAwareModel.pt
```

This script verifies the packaged MobileSAMv2 pipeline: `ObjectAwareModel` box proposals followed by the prompt-guided decoder. It writes `boxes.png`, `mask_union.png`, `mask_union_overlay.png`, and `mask_overlay.png` into the chosen output directory.
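The union output can be pictured as a logical OR over the per-object masks. The sketch below uses made-up masks on a tiny grid; the names and shapes are illustrative, not the script's internals:

```python
import numpy as np

# Three hypothetical per-object boolean masks on a 4x4 image.
masks = np.zeros((3, 4, 4), dtype=bool)
masks[0, :2, :2] = True   # object 1: top-left 2x2 block
masks[1, 2:, 2:] = True   # object 2: bottom-right 2x2 block
masks[2, 0, 3] = True     # object 3: a single pixel

# Logical OR across the object axis gives the combined "union" mask.
union = masks.any(axis=0)

print(union.shape, int(union.sum()))  # (4, 4) 9
```

An overlay image is then typically produced by alpha-blending the union mask over the input, but the exact rendering is up to the script.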

## Reference

Official MobileSAM repository: https://github.com/chaoningzhang/mobilesam
