Metadata-Version: 2.4
Name: aigco
Version: 0.0.4
Summary: Omni LLM, Agent and inference
Requires-Python: <3.13,>=3.12
Description-Content-Type: text/markdown
Requires-Dist: python-dotenv
Requires-Dist: datasets>=2.18.0
Requires-Dist: peft>=0.7.1
Requires-Dist: pip>=24.0
Requires-Dist: swanlab>=0.3.0
Requires-Dist: torch==2.10.0
Requires-Dist: torchvision
Requires-Dist: transformers>=5.1.0
Provides-Extra: flash-attn
Requires-Dist: flash-attn; extra == "flash-attn"

better aigc lib

## From Source

```sh
mkdir src && git -C src clone https://github.com/TorrentBrave/aigco.git

# or, to track it as a submodule instead:
git submodule add --force https://github.com/TorrentBrave/aigco.git src/aigco

uv add --editable ./src/aigco/  # updates aigco.egg-info

cd src/aigco

uv lock

uv sync
```

## From PyPI

```sh
uv pip install "aigco[flash-attn]"
```

or add the prebuilt wheel to your dependencies:

```
"flash-attn @ https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.16/flash_attn-2.8.3%2Bcu130torch2.10-cp312-cp312-linux_x86_64.whl"
```
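For reference, here is where that direct-URL requirement goes in a consuming project's `pyproject.toml`. This is a sketch; the wheel shown is the one above, and you would swap it for the build matching your CUDA, torch, and Python versions:

```toml
# pyproject.toml of a project depending on aigco with a prebuilt flash-attn wheel
[project]
name = "my-app"          # illustrative project name
version = "0.1.0"
requires-python = ">=3.12,<3.13"
dependencies = [
    "aigco",
    "flash-attn @ https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.16/flash_attn-2.8.3%2Bcu130torch2.10-cp312-cp312-linux_x86_64.whl",
]
```

Direct-URL requirements like this pin an exact artifact, so the environment stays reproducible but must be updated by hand when torch or CUDA versions change.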
