Metadata-Version: 2.4
Name: sibr-module
Version: 0.1.0
Summary: Help modules for different applications, especially for google cloud
Author-email: Sigvard Bratlie <sigvard@sibr.no>
License: LICENSE
Project-URL: Homepage, https://github.com/sigvardbratlie/sibr-module
Project-URL: Bug Tracker, https://github.com/sigvardbratlie/sibr-module/issues
Keywords: google cloud,bigquery,gcs,utilities
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Requires-Python: >=3.9
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: bigframes
Requires-Dist: db-dtypes
Requires-Dist: gcsfs
Requires-Dist: google-api-core
Requires-Dist: google-auth
Requires-Dist: google-auth-oauthlib
Requires-Dist: google-cloud
Requires-Dist: google-cloud-appengine-logging
Requires-Dist: google-cloud-audit-log
Requires-Dist: google-cloud-bigquery
Requires-Dist: google-cloud-bigquery-connection
Requires-Dist: google-cloud-bigquery-storage
Requires-Dist: google-cloud-core
Requires-Dist: google-cloud-functions
Requires-Dist: google-cloud-iam
Requires-Dist: google-cloud-logging
Requires-Dist: google-cloud-resource-manager
Requires-Dist: google-cloud-storage
Requires-Dist: google-cloud-secret-manager
Requires-Dist: google-crc32c
Requires-Dist: grpcio
Requires-Dist: numpy
Requires-Dist: pandas
Requires-Dist: pandas-gbq
Requires-Dist: pyarrow
Requires-Dist: joblib
Requires-Dist: PyYaml
Requires-Dist: openpyxl
Dynamic: license-file

# SIBR Module

[![PyPI version](https://badge.fury.io/py/sibr-module.svg)](https://badge.fury.io/py/sibr-module)

A collection of helper modules for interacting with Google Cloud Platform services, designed to simplify common workflows. This package provides easy-to-use classes for BigQuery, Google Cloud Storage, Secret Manager, and Cloud Logging.

## Features

* **`BigQuery`**: Easily upload DataFrames to BigQuery tables with automatic schema detection and support for `append`, `replace`, and `merge` operations.
* **`CStorage`**: Upload and download files to and from Google Cloud Storage buckets.
* **`SecretsManager`**: Securely access secrets from Google Secret Manager.
* **`Logger`**: A flexible logger that supports both local file logging and integration with Google Cloud Logging.

## Installation

You can install the package from PyPI:

```bash
pip install sibr-module
```

## Quickstart
Ensure you are authenticated with Google Cloud. You can do this by running:

```bash
gcloud auth application-default login
```

```python
import pandas as pd
from sibr_module import BigQuery, Logger

# --- 1. Set up a logger ---
# This will log to a local file and, if enabled, to Google Cloud Logging
logger = Logger(log_name="my_app_logger", enable_cloud_logging=True)
logger.info("Application starting up.")

# --- 2. Prepare your data ---
data = {'name': ['Alice', 'Bob'], 'score': [85, 92]}
my_dataframe = pd.DataFrame(data)

# --- 3. Use the BigQuery helper ---
try:
    # Initialize the client with your Google Cloud Project ID
    bq_client = BigQuery(project_id="your-gcp-project-id", logger=logger)

    # Upload the DataFrame to a BigQuery table
    bq_client.to_bq(
        df=my_dataframe,
        dataset_name="my_dataset",
        table_name="my_table",
        if_exists="append"  # Options: 'append', 'replace', or 'merge'
    )

    logger.info("Successfully uploaded data to BigQuery.")

except Exception as e:
    logger.error(f"An error occurred: {e}")
```

## Usage Details
### BigQuery
The `BigQuery` class handles interactions with Google BigQuery.

* `to_bq(df, dataset_name, table_name, if_exists='append', merge_on=None)`: Uploads a pandas DataFrame.
  * `if_exists='append'`: Adds data to an existing table.
  * `if_exists='replace'`: Deletes the existing table and creates a new one.
  * `if_exists='merge'`: Updates existing rows and inserts new ones. Requires `merge_on` to be set to a list of key columns.
* `read_bq(query)`: Executes a query and returns the result as a pandas DataFrame.
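To make the `merge` semantics concrete, the pandas snippet below illustrates the upsert behaviour described above (rows matching on the key are updated, the rest are inserted). This is an illustration only, not the library's actual implementation, which runs server-side in BigQuery:

```python
import pandas as pd

# Existing table contents (what is already in BigQuery).
existing = pd.DataFrame({"id": [1, 2], "score": [85, 92]})

# New batch: id=2 is an update of an existing row, id=3 is a new row.
incoming = pd.DataFrame({"id": [2, 3], "score": [95, 70]})

# Upsert on the key column "id": drop existing rows whose key appears
# in the incoming batch, then append the batch.
merged = (
    pd.concat([existing[~existing["id"].isin(incoming["id"])], incoming])
    .sort_values("id")
    .reset_index(drop=True)
)
print(merged)
#    id  score
# 0   1     85
# 1   2     95
# 2   3     70
```

With `if_exists='append'`, by contrast, both the old and new id=2 rows would end up in the table.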

### CStorage
The `CStorage` class simplifies file operations with Google Cloud Storage.

* `upload(local_file_path, destination_blob_name)`: Uploads a local file to the bucket.
* `download(source_blob_name, destination_file_path)`: Downloads a file from the bucket.
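A typical round trip using the two methods above might look like the sketch below. The constructor arguments (`bucket_name`, `logger`) and the file paths are assumptions for illustration; check the class signature in your installed version:

```python
from sibr_module import CStorage, Logger

logger = Logger(log_name="storage_demo")

# Constructor arguments are assumed here, not confirmed API.
storage = CStorage(bucket_name="my-bucket", logger=logger)

# Upload a local file to the bucket under a blob name...
storage.upload("reports/daily.xlsx", "exports/daily.xlsx")

# ...and fetch it back to a local path later.
storage.download("exports/daily.xlsx", "restored/daily.xlsx")
```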

### SecretsManager
Access your secrets easily.

* `get_secret(secret_id)`: Retrieves the latest version of a secret by its ID.
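A minimal usage sketch, assuming the class takes a project ID at construction (the constructor argument and secret name are illustrative, not confirmed API):

```python
from sibr_module import SecretsManager

# Constructor argument is an assumption; adjust to your project setup.
secrets = SecretsManager(project_id="your-gcp-project-id")

# Fetches the latest version of the secret with this ID.
api_key = secrets.get_secret("MY_API_KEY")
```

The caller needs the Secret Manager Secret Accessor role on the project for this to succeed.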

## Contributing
Contributions are welcome! Please open an issue or submit a pull request if you have any improvements or bug fixes.

## License
This project is licensed under the MIT License. See the LICENSE file for details.
