Metadata-Version: 2.1
Name: liberal_alpha
Version: 0.1.16
Summary: Liberal Alpha Python SDK for interacting with gRPC-based backend
Home-page: https://github.com/capybaralabs-xyz/Liberal_Alpha
Author: capybaralabs
Author-email: donny@capybaralabs.xyz
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Requires-Python: >=3.8
Description-Content-Type: text/markdown
Requires-Dist: grpcio>=1.30.0
Requires-Dist: protobuf>=3.13.0
Requires-Dist: requests>=2.20.0
Requires-Dist: coincurve>=13.0.0
Requires-Dist: pycryptodome>=3.9.0
Requires-Dist: eth-account>=0.5.0
Requires-Dist: eth-keys>=0.3.0
Requires-Dist: websockets>=8.0.0
Requires-Dist: msgpack>=1.0.0

# Liberal Alpha Python SDK (Historical HTTP APIs)

This SDK provides **historical upload** and **historical download** APIs over HTTP.

## Install

```bash
pip install liberal_alpha
```

## Environment Variables

```bash
# Optional: override default API base (default: https://api.liberalalpha.com)
export LIBALPHA_API_BASE="https://api.liberalalpha.com"

# Upload auth (X-API-Key)
export LIBALPHA_API_KEY="YOUR_API_KEY"

# Download auth (used to obtain a JWT via /api/users/auth)
export LIBALPHA_PRIVATE_KEY="0xYOUR_PRIVATE_KEY"

# Optional upload tuning
export LIBALPHA_UPLOAD_BATCH_ID="12345"      # optional batch id (int or numeric string)
export LIBALPHA_UPLOAD_CHUNK_SIZE="1048576"  # chunk size in bytes (default 1 MB)
export LIBALPHA_UPLOAD_RESUME="1"            # 1=enable resume, 0=disable
export LIBALPHA_UPLOAD_TIMEOUT="60"          # per-request timeout (seconds) for chunk upload/finalize
```

## Initialize Client

```python
from liberal_alpha.client import LiberalAlphaClient

client = LiberalAlphaClient(
    api_key="YOUR_API_KEY",            # optional if LIBALPHA_API_KEY is set
    private_key="0xYOUR_PRIVATE_KEY",  # optional if LIBALPHA_PRIVATE_KEY is set
    # api_base="https://api.liberalalpha.com",  # optional override
    # timeout=90,                               # optional default HTTP timeout (seconds)
)
```

## Historical Upload API

```python
def upload_data(record_id: int, df: "pandas.DataFrame") -> bool:
    ...
```

### DataFrame Format (Recommended)

**Format B (flat columns):** you provide `record_id`/`symbol`/`timestamp` plus any extra feature columns. On upload, the SDK automatically packs the extra columns into a `data` dict.

Required columns:

- `record_id` (int)
- `symbol` (str)
- `timestamp` (int): milliseconds since epoch (seconds are also accepted and auto-converted to milliseconds)

Every column other than `record_id`/`symbol`/`timestamp`/`batch_id` is packed into `data`.
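The packing rule can be sketched as follows. This is a minimal illustration of the behavior described above, not the SDK's actual implementation; the helper name `pack_row` is hypothetical:

```python
# Columns the SDK treats as reserved (per the rule above); everything else
# is moved into a nested "data" dict.
RESERVED = {"record_id", "symbol", "timestamp", "batch_id"}

def pack_row(row: dict) -> dict:
    """Illustrative sketch: move non-reserved columns into a nested data dict."""
    packed = {k: v for k, v in row.items() if k in RESERVED}
    packed["data"] = {k: v for k, v in row.items() if k not in RESERVED}
    return packed

row = {"record_id": 4, "symbol": "BTCUSDT", "timestamp": 1733299200000,
       "open": 50000.0, "close": 51000.0}
print(pack_row(row))
# {'record_id': 4, 'symbol': 'BTCUSDT', 'timestamp': 1733299200000,
#  'data': {'open': 50000.0, 'close': 51000.0}}
```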

### Also Supported

**Format A (backend-native):** if you already have a `data` dict column, supply:

- `record_id` (int)
- `symbol` (str)
- `data` (dict)
- `timestamp` (int)
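For comparison with the Format B example below, a Format A frame carries the payload as a ready-made dict per row; a minimal sketch:

```python
import pandas as pd

# Format A: the feature payload is already a dict in a `data` column.
df_a = pd.DataFrame([
    {"record_id": 4, "symbol": "BTCUSDT",
     "data": {"open": 50000.0, "close": 51000.0},
     "timestamp": 1733299200000},
])
print(df_a.columns.tolist())
# ['record_id', 'symbol', 'data', 'timestamp']
```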

### Upload Example (Format B)

```python
import pandas as pd
from liberal_alpha.client import LiberalAlphaClient

client = LiberalAlphaClient(api_key="YOUR_API_KEY")

rows = []
base_ts_ms = 1733299200000
for i in range(1000):
    rows.append({
        "record_id": 4,
        "symbol": "BNBBTC",
        "timestamp": base_ts_ms + i * 1000,
        # any extra columns become the data dict
        "open": 50000.0 + i,
        "close": 51000.0 + i,
        "high": 52000.0 + i,
        "low": 49000.0 + i,
        "volume": 123.45 + i,
    })

df = pd.DataFrame(rows)

# Optional: set a batch id via env (the public API takes only (record_id, df));
# the value must be an int or a numeric string:
# export LIBALPHA_UPLOAD_BATCH_ID="20251215"
ok = client.upload_data(record_id=4, df=df)
print("Upload ok:", ok)
```

### Notes

- Upload uses `X-API-Key` authentication.
- Upload is chunked and supports resume (enabled by default).
- Resume state is cached locally at `~/.libalpha_upload_sessions.json` (configurable via `LIBALPHA_UPLOAD_CACHE_PATH`).
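If an interrupted upload leaves a stale session behind, you can inspect or discard the cache file directly. A sketch, assuming the cache is a plain JSON file (the file's internal layout is an assumption; only its path is documented above):

```python
import json
import os
from pathlib import Path

# Resolve the cache path the same way the env vars above describe.
cache_path = Path(os.environ.get(
    "LIBALPHA_UPLOAD_CACHE_PATH",
    Path.home() / ".libalpha_upload_sessions.json",
))

if cache_path.exists():
    sessions = json.loads(cache_path.read_text())
    print("cached upload sessions:", list(sessions))
    # cache_path.unlink()  # uncomment to discard all resume state
else:
    print("no resume cache at", cache_path)
```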

## Historical Download API

```python
def download_data(
    record_id: int,
    symbols: list[str],
    dates: list[int],
    tz_info: "datetime.tzinfo | str | int | float" = "Asia/Singapore",
    *,
    size: int = 500,
    cursor: str | None = None,
    fetch_all: bool = True,
) -> "pandas.DataFrame":
    ...
```

### Parameters

- `record_id`: record id to download.
- `symbols`: list of symbols, e.g. `["BTCUSDT", "ETHUSDT"]`. Pass `[]` to auto-fetch all symbols (if supported by the backend).
- `dates`: list of local dates in `YYYYMMDD` form, e.g. `[20251216, 20251217]`. Pass `[]` to apply no date filter.
- `tz_info`: controls how the local date is computed. Accepts an IANA tz string such as `"Asia/Singapore"`, numeric offsets such as `8` or `-4`, or offset strings such as `"+8"` or `"-4"`.
- `size`: server page size per request.
- `cursor`: ISO cursor string; you usually do not need to set this manually.
- `fetch_all` (default `True`): fetch all pages for the requested time range(s).
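The accepted `tz_info` forms can be normalized to a `tzinfo` object roughly as follows. This is a sketch of the mapping, not the SDK's internal code, and it uses `zoneinfo` (Python 3.9+) for IANA names:

```python
from datetime import timedelta, timezone, tzinfo
from zoneinfo import ZoneInfo  # stdlib in Python 3.9+

def to_tzinfo(tz) -> tzinfo:
    """Sketch: map tzinfo | str | int | float to a concrete tzinfo."""
    if isinstance(tz, tzinfo):
        return tz                                   # already a tzinfo
    if isinstance(tz, (int, float)):
        return timezone(timedelta(hours=tz))        # numeric offset, e.g. 8, -4
    s = str(tz).strip()
    if s.lstrip("+-").replace(".", "", 1).isdigit():
        return timezone(timedelta(hours=float(s)))  # offset string, e.g. "+8"
    return ZoneInfo(s)                              # IANA name, e.g. "Asia/Singapore"

print(to_tzinfo(8))                 # UTC+08:00
print(to_tzinfo("Asia/Singapore"))  # Asia/Singapore
```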

### Cursor behavior when `dates` is provided

If `dates` is non-empty and you do not pass `cursor`, the SDK automatically starts the query from:

- `cursor = end_of_day(max(dates))` (23:59:59.999 in `tz_info`)

If you pass multiple discrete dates, the SDK fetches each date separately (each with its own end-of-day cursor) and then concatenates the results. This avoids the "max date cursor only" problem, where paging back from the latest date alone would miss earlier discrete dates.
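The end-of-day cursor described above can be computed like this (an illustrative sketch with a fixed-offset zone; the SDK's exact cursor string format is an assumption based on "ISO cursor string"):

```python
from datetime import datetime, timedelta, timezone

def end_of_day_cursor(yyyymmdd: int, tz: timezone) -> str:
    """Sketch: 23:59:59.999 local time on a YYYYMMDD date, as an ISO string."""
    day = datetime.strptime(str(yyyymmdd), "%Y%m%d")
    eod = day.replace(hour=23, minute=59, second=59, microsecond=999000, tzinfo=tz)
    return eod.isoformat(timespec="milliseconds")

sgt = timezone(timedelta(hours=8))  # Asia/Singapore is UTC+8
print(end_of_day_cursor(20251216, sgt))
# 2025-12-16T23:59:59.999+08:00
```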

### Download Example

```python
from liberal_alpha.client import LiberalAlphaClient

client = LiberalAlphaClient(private_key="0xYOUR_PRIVATE_KEY")

df = client.download_data(
    record_id=11,
    symbols=[],                 # empty => auto-fetch all symbols
    dates=[20251216, 20251217], # local dates (YYYYMMDD)
    tz_info="Asia/Singapore",   # or tz_info=8
    fetch_all=True,             # default True
)

print(df.head())
print("rows:", len(df))
```
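For analysis it is often handy to convert the millisecond timestamps back to timezone-aware datetimes. A sketch, assuming the returned frame keeps a millisecond `timestamp` column (an assumption about the result schema; the frame below is a stand-in):

```python
import pandas as pd

# Stand-in for a downloaded frame; the real schema comes from the backend.
df = pd.DataFrame({"symbol": ["BTCUSDT"], "timestamp": [1733299200000]})

# Epoch milliseconds -> timezone-aware datetimes in the local zone.
df["dt"] = (pd.to_datetime(df["timestamp"], unit="ms", utc=True)
              .dt.tz_convert("Asia/Singapore"))
print(df[["symbol", "dt"]])
```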

### Notes

- Download uses a Bearer JWT; the SDK obtains the JWT via `POST /api/users/auth` using your private key.
- If `fetch_all=False`, the SDK stops early (useful when you only want a quick preview).
