Metadata-Version: 2.4
Name: py4writers
Version: 1.0.0
Summary: Async Python library for the 4writers.net API
Author: soca1m
Author-email: socalmy2003@gmail.com
Requires-Python: >=3.12,<4.0
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Classifier: Programming Language :: Python :: 3.14
Requires-Dist: aiohttp (>=3.11.10,<4.0.0)
Requires-Dist: bs4 (>=0.0.2,<0.0.3)
Requires-Dist: build (>=1.2.2.post1,<2.0.0)
Requires-Dist: certifi (>=2024.8.30,<2025.0.0)
Requires-Dist: envparse (>=0.2.0,<0.3.0)
Requires-Dist: lxml (>=5.3.0,<6.0.0)
Requires-Dist: setuptools (>=75.7.0,<76.0.0)
Description-Content-Type: text/markdown

# py4writers

[![PyPI version](https://badge.fury.io/py/py4writers.svg)](https://badge.fury.io/py/py4writers)
[![Python 3.12+](https://img.shields.io/badge/python-3.12+-blue.svg)](https://www.python.org/downloads/)
[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](https://opensource.org/licenses/MIT)
[![Tests](https://img.shields.io/badge/tests-93%20passed-brightgreen.svg)]()

Async Python client for the [4writers.net](https://4writers.net) platform API. Fetch orders, download files, accept work — all from your own scripts.

## Features

- **Fully async** — built on `aiohttp`, designed for `async with` / `await`
- **Streaming iterators** — `iter_orders()`, `iter_completed_orders()`, etc. yield one order at a time without loading everything into memory
- **Automatic retry** — exponential backoff on network errors (configurable)
- **Rate limiting** — semaphore-based concurrency control (default 10 parallel requests)
- **Typed models** — `Order` and `File` dataclasses with full type hints
- **Clean error hierarchy** — `AuthenticationError`, `NetworkError`, `ParsingError`, etc.

## Installation

```bash
pip install py4writers
```

Or with Poetry:

```bash
poetry add py4writers
```

## Quick start

```python
import asyncio
from py4writers import API

async def main():
    async with API(login="your_login", password="your_password") as api:
        await api.login()

        orders = await api.get_orders()
        for order in orders:
            print(f"{order.order_id}: {order.title} — ${order.total:.2f}")

asyncio.run(main())
```

## Usage

### Fetching available orders

**Batch** — loads all orders from one page at once, with descriptions and files:

```python
orders = await api.get_orders(page=1, page_size=50, category="essay")

for order in orders:
    print(order.title, order.total, order.description)
```

**Streaming** — yields orders one-by-one, automatically paginates:

```python
async for order in api.iter_orders(max_pages=3):
    print(order.title, order.total)
    if order.total > 50:
        break  # stop early whenever you want
```

### Completed orders

```python
completed = await api.get_completed_orders()
for order in completed:
    print(f"{order.title} — paid ${order.your_payment:.2f}")

# or streaming:
async for order in api.iter_completed_orders():
    print(order.title, order.editor_work, order.your_payment)
```

### Active orders (in progress / revision / late)

```python
# Generic:
orders = await api.get_active_orders(order_type="processing")

# Convenience wrappers:
processing = await api.get_processing_orders()
revisions  = await api.get_revision_orders()
late       = await api.get_late_orders()

# Streaming:
async for order in api.iter_processing_orders():
    print(order.title, order.remaining)
```

### Files

```python
# List files attached to an order:
files = await api.get_order_files(order_index=2569038)

for f in files:
    print(f.name, f.author, f.date)

# Download a file:
content = await api.download_file(file_id=f.id)
with open(f.name, "wb") as fp:
    fp.write(content)

# Stream files one by one:
async for f in api.iter_order_files(order_index=2569038):
    data = await api.download_file(f.id)
    save(f.name, data)
```

### Order details

```python
description = await api.fetch_order_details(order_index=2569038)
print(description)

# For completed orders:
description = await api.fetch_order_details(order_index=2569038, is_completed=True)
```

### Taking an order

```python
success = await api.take_order(order_index=2569038)
if success:
    print("Order accepted!")
```
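In practice you will usually combine streaming with `take_order`: scan available orders and accept the first one matching your criteria. A minimal sketch, where `is_worth_taking` and `take_first_match` are hypothetical helpers (not part of py4writers) built on the documented `iter_orders` and `take_order` calls:

```python
# Hypothetical predicate — adjust the criteria to taste.
def is_worth_taking(order) -> bool:
    # Example criteria: short, well-paid orders.
    return order.total >= 30.0 and order.pages <= 5

async def take_first_match(api, max_pages=3):
    """Scan available orders and accept the first one matching the predicate."""
    async for order in api.iter_orders(max_pages=max_pages):
        if is_worth_taking(order):
            if await api.take_order(order_index=order.order_index):
                return order
    return None
```

Because `iter_orders` paginates lazily, the scan stops as soon as a match is accepted — later pages are never fetched.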

## Data models

### `Order`

| Field            | Type                | Description                    |
|------------------|---------------------|--------------------------------|
| `title`          | `str`               | Order title                    |
| `subject`        | `str`               | Subject area                   |
| `order_id`       | `str`               | Public order ID                |
| `order_index`    | `int`               | Internal index (used in API calls) |
| `deadline`       | `str`               | Deadline date/time             |
| `remaining`      | `str`               | Time remaining                 |
| `order_type`     | `str`               | Essay, Research Paper, etc.    |
| `academic_level` | `str`               | College, Bachelor, Master, etc.|
| `style`          | `str`               | APA, MLA, Chicago, etc.        |
| `language`       | `str`               | Language                       |
| `pages`          | `int`               | Page count                     |
| `sources`        | `int`               | Required sources               |
| `salary`         | `float`             | Base payment                   |
| `bonus`          | `float`             | Bonus                          |
| `total`          | `float`             | Total (salary + bonus)         |
| `description`    | `Optional[str]`     | Full description (loaded on demand) |
| `files`          | `Optional[List[File]]` | Attached files (loaded on demand) |
| `editor_work`    | `Optional[float]`   | Editor's payment (completed only) |
| `your_payment`   | `Optional[float]`   | Writer's payment (completed only) |
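Because the fields are plain typed attributes, orders compose naturally with Python's built-ins. A sketch using a local stand-in dataclass (the real `Order` lives in py4writers; field names match the table above, and `total = salary + bonus` per the table):

```python
from dataclasses import dataclass

# Local stand-in with a subset of Order's fields, for illustration only.
@dataclass
class Order:
    title: str
    pages: int
    salary: float
    bonus: float

    @property
    def total(self) -> float:
        return self.salary + self.bonus

orders = [
    Order("Essay on X", pages=3, salary=18.0, bonus=2.0),
    Order("Research paper", pages=10, salary=55.0, bonus=0.0),
]

# Sort best pay-per-page first:
best = sorted(orders, key=lambda o: o.total / o.pages, reverse=True)
print(best[0].title)  # the essay: $20 over 3 pages beats $55 over 10
```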

### `File`

| Field    | Type             | Description        |
|----------|------------------|--------------------|
| `id`     | `int`            | File ID            |
| `name`   | `str`            | Filename           |
| `author` | `str`            | Who uploaded it    |
| `date`   | `str`            | Upload date        |
| `data`   | `Optional[bytes]`| Cached file content|

```python
file.get_download_url()  # returns the full download URL
```

## Configuration

### API client

```python
from py4writers import API

# With credentials:
api = API(login="user", password="pass")

# With an existing session cookie:
api = API(session="your_session_cookie")

# Custom concurrency limit:
api = API(login="user", password="pass", max_concurrent_requests=5)
```

Always use the context manager to ensure resources are released:

```python
async with API(login="user", password="pass") as api:
    await api.login()
    # ...
# session is closed automatically
```

### Logging

```python
import logging

# Basic:
logging.basicConfig(level=logging.INFO)

# Verbose (see every HTTP call):
logging.basicConfig(level=logging.DEBUG)
```
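To raise verbosity for this library only, you can target its logger directly — assuming py4writers logs under its package name, as is the Python convention:

```python
import logging

# Keep third-party noise at WARNING, but show this library's HTTP calls:
logging.basicConfig(level=logging.WARNING)
logging.getLogger("py4writers").setLevel(logging.DEBUG)
```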

### Environment variables

```python
from envparse import env

env.read_envfile(".env")

async with API(login=env.str("LOGIN"), password=env.str("PASSWORD")) as api:
    await api.login()
```

`.env` file:

```
LOGIN=your_login
PASSWORD=your_password
```

## Error handling

```python
from py4writers.exceptions import (
    FourWritersAPIError,   # base class for all errors
    AuthenticationError,   # login failed
    SessionExpiredError,   # session expired, re-auth needed
    NetworkError,          # connection / HTTP errors (auto-retried)
    ParsingError,          # HTML parsing failed
    OrderNotFoundError,    # order not found
    FileNotFoundError,     # file not found
    RateLimitError,        # rate limit exceeded
)

try:
    await api.login()
    orders = await api.get_orders()
except AuthenticationError:
    print("Bad credentials")
except NetworkError:
    print("Network issue (already retried 3 times)")
except ParsingError:
    print("Website structure may have changed")
```
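`SessionExpiredError` signals that re-authentication is needed. One way to handle it is a small wrapper that retries the call once after logging in again — a sketch of the pattern, using a local stand-in exception so the snippet is self-contained (in real code, import `SessionExpiredError` from `py4writers.exceptions`):

```python
# Stand-in for py4writers.exceptions.SessionExpiredError:
class SessionExpiredError(Exception):
    pass

async def with_reauth(api, coro_factory, max_retries=1):
    """Run an API call; if the session expired, log in again and retry."""
    for attempt in range(max_retries + 1):
        try:
            return await coro_factory()
        except SessionExpiredError:
            if attempt == max_retries:
                raise
            await api.login()
```

Usage would look like `orders = await with_reauth(api, api.get_orders)`.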

## Advanced

### Retry decorator

All network-facing methods use `@async_retry` internally. You can use it in your own code:

```python
from py4writers.utils import async_retry
from py4writers.exceptions import NetworkError

@async_retry(max_attempts=3, delay=1.0, backoff=2.0, exceptions=(NetworkError,))
async def my_flaky_operation():
    ...
```

### Rate limiter

```python
from py4writers.utils import RateLimiter

limiter = RateLimiter(max_concurrent=5)

async with limiter:
    await do_something()

# or:
result = await limiter.execute(do_something())
```
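Under the hood this is semaphore-based concurrency control: a cap on how many coroutines run at once while the rest wait. A self-contained sketch of the same pattern using a plain `asyncio.Semaphore` (so it runs without py4writers), useful for fanning out many downloads:

```python
import asyncio

async def fetch_all(items, worker, max_concurrent=5):
    # Same idea as RateLimiter: the semaphore caps concurrency
    # while gather fans out one task per item.
    sem = asyncio.Semaphore(max_concurrent)

    async def bounded(item):
        async with sem:
            return await worker(item)

    return await asyncio.gather(*(bounded(i) for i in items))
```

With py4writers you would pass something like `lambda f: api.download_file(f.id)` as the worker.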

## Project structure

```
src/py4writers/
    __init__.py          # Public API exports
    const.py             # URLs, endpoints, constants
    exceptions.py        # Exception hierarchy
    api/
        api.py           # API facade class
        _auth.py         # Authentication mixin
        _orders.py       # Order fetching engine
        _files.py        # File operations mixin
        _actions.py      # Order actions mixin
        _enrichment.py   # Shared enrichment logic
    client/
        aiohttp.py       # HTTP client wrapper
    types/
        models.py        # Order and File dataclasses
    parsers/
        order_parser.py  # HTML parsing
    utils/
        retry.py         # Retry decorator
        rate_limiter.py  # Semaphore rate limiter
```

## Running tests

```bash
# Install dev dependencies:
pip install pytest pytest-asyncio

# Run:
pytest tests/ -v
```

## Requirements

- Python 3.12+
- aiohttp >= 3.11
- beautifulsoup4 >= 4.12
- lxml >= 5.3
- certifi

## License

MIT

## Author

**soca1m** — [socalmy2003@gmail.com](mailto:socalmy2003@gmail.com)

- PyPI: [pypi.org/project/py4writers](https://pypi.org/project/py4writers/)
- GitHub: [github.com/soca1m/py4writers](https://github.com/soca1m/py4writers)

