abacusai.api_class
Submodules
abacusai.api_class.abstract
abacusai.api_class.ai_agents
abacusai.api_class.batch_prediction
abacusai.api_class.blob_input
abacusai.api_class.dataset
abacusai.api_class.dataset_application_connector
abacusai.api_class.document_retriever
abacusai.api_class.enums
abacusai.api_class.feature_group
abacusai.api_class.model
abacusai.api_class.monitor
abacusai.api_class.monitor_alert
abacusai.api_class.project
abacusai.api_class.python_functions
abacusai.api_class.refresh
Package Contents
Classes
- class abacusai.api_class.ApiClass
Bases:
abc.ABC
Helper class that provides a standard way to create an ABC using inheritance.
- __post_init__()
- classmethod _get_builder()
- __str__()
Return str(self).
- _repr_html_()
- to_dict()
Standardizes converting an ApiClass to a dictionary. Keys of the response dictionary are converted to camel case. This also validates the fields (type, value, etc.) received in the dictionary.
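The camel-casing behavior of `to_dict()` can be sketched with a plain dataclass. `CamelCaseConfig` and `to_camel_case` below are hypothetical stand-ins for illustration, not part of the library:

```python
from dataclasses import dataclass, fields

def to_camel_case(snake: str) -> str:
    # 'chunk_overlap_fraction' -> 'chunkOverlapFraction'
    head, *rest = snake.split('_')
    return head + ''.join(part.title() for part in rest)

@dataclass
class CamelCaseConfig:
    chunk_size: int
    chunk_overlap_fraction: float

    def to_dict(self) -> dict:
        # Mirror the documented behavior: snake_case attribute names
        # become camelCase keys in the response dictionary.
        return {to_camel_case(f.name): getattr(self, f.name) for f in fields(self)}

config = CamelCaseConfig(chunk_size=512, chunk_overlap_fraction=0.1)
print(config.to_dict())  # {'chunkSize': 512, 'chunkOverlapFraction': 0.1}
```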
- class abacusai.api_class.FieldDescriptor
Bases:
abacusai.api_class.abstract.ApiClass
A descriptor for a field to be extracted from a document.
- Parameters:
field (str) – The field to be extracted. This will be used as the key in the response.
description (str) – The description of this field. If not included, the response_field will be used.
example_extraction (Union[str, int, bool, float]) – An example of this extracted field.
type (enums.FieldDescriptorType) – The type of this field. If not provided, the default type is STRING.
- class abacusai.api_class._ApiClassFactory
Bases:
abc.ABC
Helper class that provides a standard way to create an ABC using inheritance.
- config_abstract_class
- config_class_key
- config_class_map
- class abacusai.api_class.BatchPredictionArgs
Bases:
abacusai.api_class.abstract.ApiClass
Helper class that provides a standard way to create an ABC using inheritance.
- problem_type: abacusai.api_class.enums.ProblemType
- classmethod _get_builder()
- class abacusai.api_class.AnomalyDetectionBatchPredictionArgs
Bases:
BatchPredictionArgs
Batch Prediction Config for the ANOMALY_DETECTION problem type
- Parameters:
for_eval (bool) – If True, the test fold which was created during training and used for metrics calculation will be used as input data. These predictions are hence used for model evaluation.
prediction_time_endpoint (str) – The end point for predictions.
prediction_time_range (int) – Over what period of time should we make predictions (in seconds).
minimum_anomaly_score (int) – Exclude results with an anomaly score (1 in x event) below this threshold. Range: [1, 1_000_000_000_000].
summary_mode (bool) – Only show top anomalies per ID.
attach_raw_data (bool) – Return raw data along with anomalies.
small_batch (bool) – Size of batch data guaranteed to be small.
- __post_init__()
- class abacusai.api_class.AnomalyOutliersBatchPredictionArgs
Bases:
BatchPredictionArgs
Batch Prediction Config for the ANOMALY_OUTLIERS problem type
- Parameters:
- __post_init__()
- class abacusai.api_class.ForecastingBatchPredictionArgs
Bases:
BatchPredictionArgs
Batch Prediction Config for the FORECASTING problem type
- Parameters:
for_eval (bool) – If True, the test fold which was created during training and used for metrics calculation will be used as input data. These predictions are hence used for model evaluation.
predictions_start_date (str) – The start date for predictions.
use_prediction_offset (bool) – If True, use prediction offset.
start_date_offset (int) – Sets prediction start date as this offset relative to the prediction start date.
forecasting_horizon (int) – The number of timestamps to predict in the future. Range: [1, 1000].
item_attributes_to_include_in_the_result (list) – List of columns to include in the prediction output.
explain_predictions (bool) – If True, explain predictions for the forecast.
- __post_init__()
- class abacusai.api_class.NamedEntityExtractionBatchPredictionArgs
Bases:
BatchPredictionArgs
Batch Prediction Config for the NAMED_ENTITY_EXTRACTION problem type
- Parameters:
- __post_init__()
- class abacusai.api_class.PersonalizationBatchPredictionArgs
Bases:
BatchPredictionArgs
Batch Prediction Config for the PERSONALIZATION problem type
- Parameters:
for_eval (bool) – If True, the test fold which was created during training and used for metrics calculation will be used as input data. These predictions are hence used for model evaluation.
number_of_items (int) – Number of items to recommend.
result_columns (list) – List of columns to include in the prediction output.
score_field (str) – If specified, relative item scores will be returned using a field with this name.
- __post_init__()
- class abacusai.api_class.PredictiveModelingBatchPredictionArgs
Bases:
BatchPredictionArgs
Batch Prediction Config for the PREDICTIVE_MODELING problem type
- Parameters:
for_eval (bool) – If True, the test fold which was created during training and used for metrics calculation will be used as input data. These predictions are hence used for model evaluation.
explainer_type (enums.ExplainerType) – The type of explainer to use to generate explanations on the batch prediction.
number_of_samples_to_use_for_explainer (int) – The number of samples to use for the kernel explainer.
include_multi_class_explanations (bool) – If True, includes explanations for all classes in multi-class classification.
features_considered_constant_for_explanations (str) – Comma-separated list of fields to treat as constant in SHAP explanations.
importance_of_records_in_nested_columns (str) – Returns importance of each index in the specified nested column instead of SHAP column explanations.
explanation_filter_lower_bound (float) – If set, explanations will be limited to predictions above this value. Range: [0, 1].
explanation_filter_upper_bound (float) – If set, explanations will be limited to predictions below this value. Range: [0, 1].
bound_label (str) – For classification problems, specifies the label to which the explanation bounds are applied.
output_columns (list) – A list of column names to include in the prediction result.
- explainer_type: abacusai.api_class.enums.ExplainerType
- __post_init__()
- class abacusai.api_class.PretrainedModelsBatchPredictionArgs
Bases:
BatchPredictionArgs
Batch Prediction Config for the PRETRAINED_MODELS problem type
- Parameters:
for_eval (bool) – If True, the test fold which was created during training and used for metrics calculation will be used as input data. These predictions are hence used for model evaluation.
files_output_location_prefix (str) – The output location prefix for the files.
channel_id_to_label_map (str) – JSON string for the map from channel ids to their labels.
- __post_init__()
- class abacusai.api_class.SentenceBoundaryDetectionBatchPredictionArgs
Bases:
BatchPredictionArgs
Batch Prediction Config for the SENTENCE_BOUNDARY_DETECTION problem type
- Parameters:
- __post_init__()
- class abacusai.api_class.ThemeAnalysisBatchPredictionArgs
Bases:
BatchPredictionArgs
Batch Prediction Config for the THEME_ANALYSIS problem type
- Parameters:
for_eval (bool) – If True, the test fold which was created during training and used for metrics calculation will be used as input data. These predictions are hence used for model evaluation.
analysis_frequency (str) – The length of each analysis interval.
start_date (str) – The start date for predictions.
analysis_days (int) – How many days to analyze.
- __post_init__()
- class abacusai.api_class.ChatLLMBatchPredictionArgs
Bases:
BatchPredictionArgs
Batch Prediction Config for the ChatLLM problem type
- Parameters:
for_eval (bool) – If True, the test fold which was created during training and used for metrics calculation will be used as input data. These predictions are hence used for model evaluation.
- __post_init__()
- class abacusai.api_class._BatchPredictionArgsFactory
Bases:
abacusai.api_class.abstract._ApiClassFactory
Helper class that provides a standard way to create an ABC using inheritance.
- config_abstract_class
- config_class_key = 'problemType'
- config_class_map
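The factory attributes above (a `config_class_key` of `'problemType'` plus a `config_class_map`) suggest a discriminator-based dispatch. The sketch below illustrates that pattern in plain Python; the class names are invented stand-ins, not the library's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class BatchArgs:  # stand-in for BatchPredictionArgs
    for_eval: bool = False

@dataclass
class ForecastingArgs(BatchArgs):  # stand-in for ForecastingBatchPredictionArgs
    forecasting_horizon: int = 1

class BatchArgsFactory:  # stand-in for _BatchPredictionArgsFactory
    config_class_key = 'problemType'
    config_class_map = {'FORECASTING': ForecastingArgs}

    @classmethod
    def from_dict(cls, config: dict) -> BatchArgs:
        # Pop the discriminator key and instantiate the mapped concrete class.
        problem_type = config.pop(cls.config_class_key)
        return cls.config_class_map[problem_type](**config)

args = BatchArgsFactory.from_dict({'problemType': 'FORECASTING', 'forecasting_horizon': 7})
print(args)  # ForecastingArgs(for_eval=False, forecasting_horizon=7)
```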
- class abacusai.api_class.BlobInput
Bases:
abacusai.api_class.abstract.ApiClass
Binary large object input data.
- Parameters:
- class abacusai.api_class.ParsingConfig
Bases:
abacusai.api_class.abstract.ApiClass
Helper class that provides a standard way to create an ABC using inheritance.
- class abacusai.api_class.DocumentProcessingConfig
Bases:
abacusai.api_class.abstract.ApiClass
Helper class that provides a standard way to create an ABC using inheritance.
- class abacusai.api_class.DatasetConfig
Bases:
abacusai.api_class.abstract.ApiClass
Helper class that provides a standard way to create an ABC using inheritance.
- application_connector_type: abacusai.api_class.enums.ApplicationConnectorType
- classmethod _get_builder()
- class abacusai.api_class.ConfluenceDatasetConfig
Bases:
DatasetConfig
Dataset config for Confluence Application Connector
- __post_init__()
- class abacusai.api_class.GoogleAnalyticsDatasetConfig
Bases:
DatasetConfig
Dataset config for Google Analytics Application Connector
- Parameters:
- __post_init__()
- class abacusai.api_class.GoogleDriveDatasetConfig
Bases:
DatasetConfig
Dataset config for Google Drive Application Connector
- Parameters:
location (str) – The regex location of the files to fetch.
is_documentset (bool) – Whether the dataset is a document set.
csv_delimiter (str, optional) – If the file format is CSV, use a specific csv delimiter.
extract_bounding_boxes (bool, optional) – Signifies whether to extract bounding boxes out of the documents. Only valid if is_documentset is True.
merge_file_schemas (bool, optional) – Signifies if the merge file schema policy is enabled. Not applicable if is_documentset is True.
- __post_init__()
- class abacusai.api_class.JiraDatasetConfig
Bases:
DatasetConfig
Dataset config for Jira Application Connector
- Parameters:
- __post_init__()
- class abacusai.api_class.OneDriveDatasetConfig
Bases:
DatasetConfig
Dataset config for OneDrive Application Connector
- Parameters:
location (str) – The regex location of the files to fetch.
is_documentset (bool) – Whether the dataset is a document set.
csv_delimiter (str, optional) – If the file format is CSV, use a specific csv delimiter.
extract_bounding_boxes (bool, optional) – Signifies whether to extract bounding boxes out of the documents. Only valid if is_documentset is True.
merge_file_schemas (bool, optional) – Signifies if the merge file schema policy is enabled. Not applicable if is_documentset is True.
- __post_init__()
- class abacusai.api_class.SharepointDatasetConfig
Bases:
DatasetConfig
Dataset config for Sharepoint Application Connector
- Parameters:
location (str) – The regex location of the files to fetch.
is_documentset (bool) – Whether the dataset is a document set.
csv_delimiter (str, optional) – If the file format is CSV, use a specific csv delimiter.
extract_bounding_boxes (bool, optional) – Signifies whether to extract bounding boxes out of the documents. Only valid if is_documentset is True.
merge_file_schemas (bool, optional) – Signifies if the merge file schema policy is enabled. Not applicable if is_documentset is True.
- __post_init__()
- class abacusai.api_class.ZendeskDatasetConfig
Bases:
DatasetConfig
Dataset config for Zendesk Application Connector
- __post_init__()
- class abacusai.api_class._DatasetConfigFactory
Bases:
abacusai.api_class.abstract._ApiClassFactory
Helper class that provides a standard way to create an ABC using inheritance.
- config_abstract_class
- config_class_key = 'applicationConnectorType'
- config_class_map
- class abacusai.api_class.VectorStoreTextEncoder
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- E5 = 'E5'
- OPENAI = 'OPENAI'
- SENTENCE_BERT = 'SENTENCE_BERT'
- E5_SMALL = 'E5_SMALL'
- class abacusai.api_class.VectorStoreConfig
Bases:
abacusai.api_class.abstract.ApiClass
Configs for vector store indexing.
- Parameters:
chunk_size (int) – The size of text chunks in the vector store.
chunk_overlap_fraction (float) – The fraction of overlap between chunks.
text_encoder (VectorStoreTextEncoder) – Encoder used to index texts from the documents.
- text_encoder: abacusai.api_class.enums.VectorStoreTextEncoder
- class abacusai.api_class.DocumentRetrieverConfig
Bases:
VectorStoreConfig
Configs for document retriever.
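To make `chunk_size` and `chunk_overlap_fraction` concrete, here is a minimal sketch of overlapping text chunking. It only illustrates what the two parameters mean; it is not the service's actual indexing code, and it chunks by characters rather than tokens:

```python
def chunk_text(text: str, chunk_size: int, chunk_overlap_fraction: float) -> list[str]:
    # Each chunk starts chunk_size * (1 - overlap) characters after the
    # previous one, so consecutive chunks share roughly
    # chunk_overlap_fraction of their content.
    step = max(1, int(chunk_size * (1 - chunk_overlap_fraction)))
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

chunks = chunk_text('abcdefghij', chunk_size=4, chunk_overlap_fraction=0.25)
print(chunks)  # ['abcd', 'defg', 'ghij', 'j']
```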
- class abacusai.api_class.ApiEnum
Bases:
enum.Enum
Generic enumeration.
Derive from this class to define new enumerations.
- __eq__(other)
Return self==value.
- __hash__()
Return hash(self).
- class abacusai.api_class.ProblemType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- AI_AGENT = 'ai_agent'
- ANOMALY_DETECTION = 'anomaly_new'
- ANOMALY_OUTLIERS = 'anomaly'
- EVENT_ANOMALY = 'event_anomaly'
- CLUSTERING = 'clustering'
- CLUSTERING_TIMESERIES = 'clustering_timeseries'
- CUMULATIVE_FORECASTING = 'cumulative_forecasting'
- NAMED_ENTITY_EXTRACTION = 'nlp_ner'
- NATURAL_LANGUAGE_SEARCH = 'nlp_search'
- CHAT_LLM = 'chat_llm'
- SENTENCE_BOUNDARY_DETECTION = 'nlp_sentence_boundary_detection'
- SENTIMENT_DETECTION = 'nlp_sentiment'
- DOCUMENT_CLASSIFICATION = 'nlp_classification'
- DOCUMENT_SUMMARIZATION = 'nlp_summarization'
- DOCUMENT_VISUALIZATION = 'nlp_document_visualization'
- PERSONALIZATION = 'personalization'
- PREDICTIVE_MODELING = 'regression'
- FINETUNED_LLM = 'finetuned_llm'
- FORECASTING = 'forecasting'
- CUSTOM_TRAINED_MODEL = 'plug_and_play'
- CUSTOM_ALGORITHM = 'trainable_plug_and_play'
- FEATURE_STORE = 'feature_store'
- IMAGE_CLASSIFICATION = 'vision_classification'
- OBJECT_DETECTION = 'vision_object_detection'
- IMAGE_VALUE_PREDICTION = 'vision_regression'
- MODEL_MONITORING = 'model_monitoring'
- LANGUAGE_DETECTION = 'language_detection'
- OPTIMIZATION = 'optimization'
- PRETRAINED_MODELS = 'pretrained'
- THEME_ANALYSIS = 'theme_analysis'
- class abacusai.api_class.RegressionObjective
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- AUC = 'auc'
- ACCURACY = 'acc'
- LOG_LOSS = 'log_loss'
- PRECISION = 'precision'
- RECALL = 'recall'
- F1_SCORE = 'fscore'
- MAE = 'mae'
- MAPE = 'mape'
- WAPE = 'wape'
- RMSE = 'rmse'
- R_SQUARED_COEFFICIENT_OF_DETERMINATION = 'r^2'
- class abacusai.api_class.RegressionTreeHPOMode
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- RAPID = 'rapid'
- THOROUGH = 'thorough'
- class abacusai.api_class.RegressionAugmentationStrategy
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- SMOTE = 'smote'
- RESAMPLE = 'resample'
- class abacusai.api_class.RegressionTargetTransform
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- LOG = 'log'
- QUANTILE = 'quantile'
- YEO_JOHNSON = 'yeo-johnson'
- BOX_COX = 'box-cox'
- class abacusai.api_class.RegressionTypeOfSplit
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- RANDOM = 'Random Sampling'
- TIMESTAMP_BASED = 'Timestamp Based'
- ROW_INDICATOR_BASED = 'Row Indicator Based'
- class abacusai.api_class.RegressionTimeSplitMethod
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- TEST_SPLIT_PERCENTAGE_BASED = 'Test Split Percentage Based'
- TEST_START_TIMESTAMP_BASED = 'Test Start Timestamp Based'
- class abacusai.api_class.RegressionLossFunction
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- HUBER = 'Huber'
- MSE = 'Mean Squared Error'
- MAE = 'Mean Absolute Error'
- MAPE = 'Mean Absolute Percentage Error'
- MSLE = 'Mean Squared Logarithmic Error'
- TWEEDIE = 'Tweedie'
- CROSS_ENTROPY = 'Cross Entropy'
- FOCAL_CROSS_ENTROPY = 'Focal Cross Entropy'
- AUTOMATIC = 'Automatic'
- CUSTOM = 'Custom'
- class abacusai.api_class.ExplainerType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- KERNEL_EXPLAINER = 'KERNEL_EXPLAINER'
- LIME_EXPLAINER = 'LIME_EXPLAINER'
- TREE_EXPLAINER = 'TREE_EXPLAINER'
- EBM_EXPLAINER = 'EBM_EXPLAINER'
- class abacusai.api_class.SamplingMethodType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- N_SAMPLING = 'N_SAMPLING'
- PERCENT_SAMPLING = 'PERCENT_SAMPLING'
- class abacusai.api_class.MergeMode
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- LAST_N = 'LAST_N'
- TIME_WINDOW = 'TIME_WINDOW'
- class abacusai.api_class.FillLogic
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- AVERAGE = 'average'
- MAX = 'max'
- MEDIAN = 'median'
- MIN = 'min'
- CUSTOM = 'custom'
- BACKFILL = 'bfill'
- FORWARDFILL = 'ffill'
- LINEAR = 'linear'
- NEAREST = 'nearest'
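The `'bfill'`/`'ffill'` members above carry the usual backfill/forward-fill semantics for missing values. A minimal pure-Python sketch of forward fill (not the library's implementation):

```python
def forward_fill(values):
    # FillLogic.FORWARDFILL ('ffill'): propagate the last seen
    # non-missing value into gaps (represented here as None).
    result, last = [], None
    for v in values:
        last = v if v is not None else last
        result.append(last)
    return result

print(forward_fill([1, None, None, 4, None]))  # [1, 1, 1, 4, 4]
```

Backfill (`'bfill'`) is the mirror image: it propagates the next non-missing value backwards into gaps.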
- class abacusai.api_class.BatchSize
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- BATCH_8 = 8
- BATCH_16 = 16
- BATCH_32 = 32
- BATCH_64 = 64
- BATCH_128 = 128
- BATCH_256 = 256
- BATCH_384 = 384
- BATCH_512 = 512
- BATCH_740 = 740
- BATCH_1024 = 1024
- class abacusai.api_class.HolidayCalendars
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- AU = 'AU'
- UK = 'UK'
- US = 'US'
- class abacusai.api_class.FileFormat
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- AVRO = 'AVRO'
- PARQUET = 'PARQUET'
- TFRECORD = 'TFRECORD'
- TSV = 'TSV'
- CSV = 'CSV'
- ORC = 'ORC'
- JSON = 'JSON'
- ODS = 'ODS'
- XLS = 'XLS'
- GZ = 'GZ'
- ZIP = 'ZIP'
- TAR = 'TAR'
- DOCX = 'DOCX'
- PDF = 'PDF'
- RAR = 'RAR'
- JPEG = 'JPG'
- PNG = 'PNG'
- TIF = 'TIFF'
- NUMBERS = 'NUMBERS'
- PPTX = 'PPTX'
- PPT = 'PPT'
- class abacusai.api_class.ExperimentationMode
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- RAPID = 'rapid'
- THOROUGH = 'thorough'
- class abacusai.api_class.PersonalizationTrainingMode
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- EXPERIMENTAL = 'EXP'
- PRODUCTION = 'PROD'
- class abacusai.api_class.PersonalizationObjective
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- NDCG = 'ndcg'
- NDCG_5 = 'ndcg@5'
- NDCG_10 = 'ndcg@10'
- MAP = 'map'
- MAP_5 = 'map@5'
- MAP_10 = 'map@10'
- MRR = 'mrr'
- PERSONALIZATION = 'personalization@10'
- COVERAGE = 'coverage'
- class abacusai.api_class.ForecastingObjective
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- ACCURACY = 'w_c_accuracy'
- WAPE = 'wape'
- MAPE = 'mape'
- CMAPE = 'cmape'
- RMSE = 'rmse'
- CV = 'coefficient_of_variation'
- BIAS = 'bias'
- SRMSE = 'srmse'
- class abacusai.api_class.ForecastingFrequency
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- HOURLY = '1H'
- DAILY = '1D'
- WEEKLY_SUNDAY_START = '1W'
- WEEKLY_MONDAY_START = 'W-MON'
- WEEKLY_SATURDAY_START = 'W-SAT'
- MONTH_START = 'MS'
- MONTH_END = '1M'
- QUARTER_START = 'QS'
- QUARTER_END = '1Q'
- YEARLY = '1Y'
- class abacusai.api_class.ForecastingDataSplitType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- AUTO = 'Automatic Time Based'
- TIMESTAMP = 'Timestamp Based'
- ITEM = 'Item Based'
- PREDICTION_LENGTH = 'Force Prediction Length'
- class abacusai.api_class.ForecastingLossFunction
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- CUSTOM = 'Custom'
- MEAN_ABSOLUTE_ERROR = 'mae'
- NORMALIZED_MEAN_ABSOLUTE_ERROR = 'nmae'
- PEAKS_MEAN_ABSOLUTE_ERROR = 'peaks_mae'
- MEAN_ABSOLUTE_PERCENTAGE_ERROR = 'stable_mape'
- POINTWISE_ACCURACY = 'accuracy'
- ROOT_MEAN_SQUARE_ERROR = 'rmse'
- NORMALIZED_ROOT_MEAN_SQUARE_ERROR = 'nrmse'
- ASYMMETRIC_MEAN_ABSOLUTE_PERCENTAGE_ERROR = 'asymmetric_mape'
- STABLE_STANDARDIZED_MEAN_ABSOLUTE_PERCENTAGE_ERROR = 'stable_standardized_mape_with_cmape'
- GAUSSIAN = 'mle_gaussian_local'
- GAUSSIAN_FULL_COVARIANCE = 'mle_gaussfullcov'
- GUASSIAN_EXPONENTIAL = 'mle_gaussexp'
- MIX_GAUSSIANS = 'mle_gaussmix'
- WEIBULL = 'mle_weibull'
- NEGATIVE_BINOMIAL = 'mle_negbinom'
- LOG_ROOT_MEAN_SQUARE_ERROR = 'log_rmse'
- class abacusai.api_class.ForecastingLocalScaling
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- ZSCORE = 'zscore'
- SLIDING_ZSCORE = 'sliding_zscore'
- LAST_POINT = 'lastpoint'
- MIN_MAX = 'minmax'
- MIN_STD = 'minstd'
- ROBUST = 'robust'
- ITEM = 'item'
- class abacusai.api_class.ForecastingFillMethod
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- BACK = 'BACK'
- MIDDLE = 'MIDDLE'
- FUTURE = 'FUTURE'
- class abacusai.api_class.ForecastingQuanitlesExtensionMethod
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- DIRECT = 'direct'
- QUADRATIC = 'quadratic'
- ANCESTRAL_SIMULATION = 'simulation'
- class abacusai.api_class.NERObjective
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- LOG_LOSS = 'log_loss'
- AUC = 'auc'
- PRECISION = 'precision'
- RECALL = 'recall'
- ANNOTATIONS_PRECISION = 'annotations_precision'
- ANNOTATIONS_RECALL = 'annotations_recall'
- class abacusai.api_class.NERModelType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- PRETRAINED_BERT = 'pretrained_bert'
- PRETRAINED_ROBERTA_27 = 'pretrained_roberta_27'
- PRETRAINED_ROBERTA_43 = 'pretrained_roberta_43'
- PRETRAINED_MULTILINGUAL = 'pretrained_multilingual'
- LEARNED = 'learned'
- class abacusai.api_class.NLPDocumentFormat
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- AUTO = 'auto'
- TEXT = 'text'
- DOC = 'doc'
- TOKENS = 'tokens'
- class abacusai.api_class.SentimentType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- VALENCE = 'valence'
- EMOTION = 'emotion'
- class abacusai.api_class.ClusteringImputationMethod
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- AUTOMATIC = 'Automatic'
- ZEROS = 'Zeros'
- INTERPOLATE = 'Interpolate'
- class abacusai.api_class.ConnectorType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- FILE = 'FILE'
- DATABASE = 'DATABASE'
- STREAMING = 'STREAMING'
- APPLICATION = 'APPLICATION'
- class abacusai.api_class.ApplicationConnectorType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- GOOGLEANALYTICS = 'GOOGLEANALYTICS'
- GOOGLEDRIVE = 'GOOGLEDRIVE'
- GIT = 'GIT'
- CONFLUENCE = 'CONFLUENCE'
- JIRA = 'JIRA'
- ONEDRIVE = 'ONEDRIVE'
- ZENDESK = 'ZENDESK'
- SLACK = 'SLACK'
- SHAREPOINT = 'SHAREPOINT'
- TEAMS = 'TEAMS'
- class abacusai.api_class.PythonFunctionArgumentType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- FEATURE_GROUP = 'FEATURE_GROUP'
- INTEGER = 'INTEGER'
- STRING = 'STRING'
- BOOLEAN = 'BOOLEAN'
- FLOAT = 'FLOAT'
- JSON = 'JSON'
- LIST = 'LIST'
- DATASET_ID = 'DATASET_ID'
- MODEL_ID = 'MODEL_ID'
- FEATURE_GROUP_ID = 'FEATURE_GROUP_ID'
- MONITOR_ID = 'MONITOR_ID'
- BATCH_PREDICTION_ID = 'BATCH_PREDICTION_ID'
- DEPLOYMENT_ID = 'DEPLOYMENT_ID'
- class abacusai.api_class.PythonFunctionOutputArgumentType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- NTEGER = 'INTEGER'
- STRING = 'STRING'
- BOOLEAN = 'BOOLEAN'
- FLOAT = 'FLOAT'
- JSON = 'JSON'
- LIST = 'LIST'
- DATASET_ID = 'DATASET_ID'
- MODEL_ID = 'MODEL_ID'
- FEATURE_GROUP_ID = 'FEATURE_GROUP_ID'
- MONITOR_ID = 'MONITOR_ID'
- BATCH_PREDICTION_ID = 'BATCH_PREDICTION_ID'
- DEPLOYMENT_ID = 'DEPLOYMENT_ID'
- ANY = 'ANY'
- class abacusai.api_class.VectorStoreTextEncoder
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- E5 = 'E5'
- OPENAI = 'OPENAI'
- SENTENCE_BERT = 'SENTENCE_BERT'
- E5_SMALL = 'E5_SMALL'
- class abacusai.api_class.LLMName
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- OPENAI_GPT4 = 'OPENAI_GPT4'
- OPENAI_GPT3_5 = 'OPENAI_GPT3_5'
- OPENAI_GPT3_5_SHORT = 'OPENAI_GPT3_5_SHORT'
- CLAUDE_V2 = 'CLAUDE_V2'
- ABACUS_GIRAFFE = 'ABACUS_GIRAFFE'
- ABACUS_LLAMA2_QA = 'ABACUS_LLAMA2_QA'
- ABACUS_LLAMA2_CODE = 'ABACUS_LLAMA2_CODE'
- LLAMA2_CHAT = 'LLAMA2_CHAT'
- PALM = 'PALM'
- PALM_TEXT = 'PALM_TEXT'
- class abacusai.api_class.MonitorAlertType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- ACCURACY_BELOW_THRESHOLD = 'AccuracyBelowThreshold'
- FEATURE_DRIFT = 'FeatureDrift'
- DATA_INTEGRITY_VIOLATIONS = 'DataIntegrityViolations'
- BIAS_VIOLATIONS = 'BiasViolations'
- HISTORY_LENGTH_DRIFT = 'HistoryLengthDrift'
- TARGET_DRIFT = 'TargetDrift'
- class abacusai.api_class.FeatureDriftType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- KL = 'kl'
- KS = 'ks'
- WS = 'ws'
- JS = 'js'
- class abacusai.api_class.DataIntegrityViolationType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- NULL_VIOLATIONS = 'null_violations'
- TYPE_MISMATCH_VIOLATIONS = 'type_mismatch_violations'
- RANGE_VIOLATIONS = 'range_violations'
- CATEGORICAL_RANGE_VIOLATION = 'categorical_range_violations'
- TOTAL_VIOLATIONS = 'total_violations'
- class abacusai.api_class.BiasType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- DEMOGRAPHIC_PARITY = 'demographic_parity'
- EQUAL_OPPORTUNITY = 'equal_opportunity'
- GROUP_BENEFIT_EQUALITY = 'group_benefit'
- TOTAL = 'total'
- class abacusai.api_class.AlertActionType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- EMAIL = 'Email'
- class abacusai.api_class.PythonFunctionType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- FEATURE_GROUP = 'FEATURE_GROUP'
- PLOTLY_FIG = 'PLOTLY_FIG'
- STEP_FUNCTION = 'STEP_FUNCTION'
- class abacusai.api_class.EvalArtifactType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- FORECASTING_ACCURACY = 'bar_chart'
- FORECASTING_VOLUME = 'bar_chart_volume'
- FORECASTING_HISTORY_LENGTH_ACCURACY = 'bar_chart_accuracy_by_history'
- class abacusai.api_class.FieldDescriptorType
Bases:
ApiEnum
Generic enumeration.
Derive from this class to define new enumerations.
- STRING = 'STRING'
- INTEGER = 'INTEGER'
- FLOAT = 'FLOAT'
- BOOLEAN = 'BOOLEAN'
- DATETIME = 'DATETIME'
- DATE = 'DATE'
- class abacusai.api_class.ApiClass
Bases:
abc.ABC
Helper class that provides a standard way to create an ABC using inheritance.
- __post_init__()
- classmethod _get_builder()
- __str__()
Return str(self).
- _repr_html_()
- to_dict()
Standardizes converting an ApiClass to dictionary. Keys of response dictionary are converted to camel case. This also validates the fields ( type, value, etc ) received in the dictionary.
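The camel-case key conversion performed by `to_dict` can be sketched as follows; this is a standalone illustration of the naming convention, not the library's internal implementation:

```python
def to_camel_case(snake: str) -> str:
    """Convert a snake_case field name to the camelCase key style
    used in to_dict response dictionaries."""
    head, *rest = snake.split('_')
    return head + ''.join(word.capitalize() for word in rest)

# A field named test_split would appear as testSplit in the response dict.
camel = to_camel_case('data_split_feature_group_table_name')
```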
- class abacusai.api_class._ApiClassFactory
Bases:
abc.ABC
Helper class that provides a standard way to create an ABC using inheritance.
- config_abstract_class
- config_class_key
- config_class_map
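The factory attributes above describe a dispatch pattern: `config_class_key` names the discriminating field in a plain dict, and `config_class_map` maps its values to concrete config classes. A standalone sketch of that pattern (all names below are illustrative stand-ins, not the real abacusai classes):

```python
# Sketch of the _ApiClassFactory dispatch pattern.
class ConfigFactory:
    config_class_key = 'sampling_method'
    config_class_map = {}

    @classmethod
    def from_dict(cls, config):
        # Read the discriminator field, look up the concrete class,
        # and instantiate it with the remaining fields.
        key_value = config.pop(cls.config_class_key)
        concrete_class = cls.config_class_map[key_value]
        return concrete_class(**config)

class NSampling:
    def __init__(self, sample_count):
        self.sample_count = sample_count

ConfigFactory.config_class_map['N_SAMPLING'] = NSampling
config = ConfigFactory.from_dict({'sampling_method': 'N_SAMPLING', 'sample_count': 100})
```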
- class abacusai.api_class.SamplingConfig
Bases:
abacusai.api_class.abstract.ApiClass
An abstract class for the sampling config of a feature group
- classmethod _get_builder()
- __post_init__()
- class abacusai.api_class.NSamplingConfig
Bases:
SamplingConfig
The number of distinct values of the key columns to include in the sample, or the number of rows if key columns are not specified.
- Parameters:
sampling_method (SamplingMethodType) – N_SAMPLING
sample_count (int) – The number of rows to include in the sample
key_columns (List[str]) – The feature(s) to use as the key(s) when sampling
- sampling_method: abacusai.api_class.enums.SamplingMethodType
- class abacusai.api_class.PercentSamplingConfig
Bases:
SamplingConfig
The fraction of distinct values of the feature group to include in the sample.
- Parameters:
sampling_method (SamplingMethodType) – PERCENT_SAMPLING
sample_percent (float) – The percentage of the rows to sample
key_columns (List[str]) – The feature(s) to use as the key(s) when sampling
- sampling_method: abacusai.api_class.enums.SamplingMethodType
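The two sampling configs above are discriminated by their `sampling_method` field. A minimal dataclass sketch of that shape (stand-in names; the real classes carry additional validation and builder logic):

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical minimal stand-ins for NSamplingConfig / PercentSamplingConfig.
@dataclass
class NSamplingSketch:
    sample_count: int                        # rows (or distinct key values) to keep
    key_columns: Optional[List[str]] = None  # feature(s) used as sampling key(s)
    sampling_method: str = 'N_SAMPLING'

@dataclass
class PercentSamplingSketch:
    sample_percent: float                    # percentage of the rows to sample
    key_columns: Optional[List[str]] = None
    sampling_method: str = 'PERCENT_SAMPLING'

config = NSamplingSketch(sample_count=1000, key_columns=['user_id'])
```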
- class abacusai.api_class._SamplingConfigFactory
Bases:
abacusai.api_class.abstract._ApiClassFactory
Helper class that provides a standard way to create an ABC using inheritance.
- config_class_key = 'sampling_method'
- config_class_map
- class abacusai.api_class.MergeConfig
Bases:
abacusai.api_class.abstract.ApiClass
An abstract class for the merge config of a feature group
- classmethod _get_builder()
- __post_init__()
- class abacusai.api_class.LastNMergeConfig
Bases:
MergeConfig
Merge LAST N chunks/versions of an incremental dataset.
- Parameters:
- merge_mode: abacusai.api_class.enums.MergeMode
- class abacusai.api_class.TimeWindowMergeConfig
Bases:
MergeConfig
Merge rows within a given time window of the most recent timestamp.
- Parameters:
- merge_mode: abacusai.api_class.enums.MergeMode
- class abacusai.api_class._MergeConfigFactory
Bases:
abacusai.api_class.abstract._ApiClassFactory
Helper class that provides a standard way to create an ABC using inheritance.
- config_class_key = 'merge_mode'
- config_class_map
- class abacusai.api_class.TrainingConfig
Bases:
abacusai.api_class.abstract.ApiClass
Helper class that provides a standard way to create an ABC using inheritance.
- problem_type: abacusai.api_class.enums.ProblemType
- classmethod _get_builder()
- class abacusai.api_class.PersonalizationTrainingConfig
Bases:
TrainingConfig
Training config for the PERSONALIZATION problem type
- Parameters:
objective (PersonalizationObjective) – Ranking scheme used to select final best model.
sort_objective (PersonalizationObjective) – Ranking scheme used to sort models on the metrics page.
training_mode (PersonalizationTrainingMode) – Whether to train in production or experimental mode. Defaults to EXP.
target_action_types (List[str]) – List of action types to use as targets for training.
target_action_weights (Dict[str, float]) – Dictionary of action types to weights for training.
session_event_types (List[str]) – List of event types to treat as occurrences of sessions.
test_split (int) – Percent of dataset to use for test data. We support using between 6% and 20% of your dataset as test data.
recent_days_for_training (int) – Limit training data to a certain latest number of days.
training_start_date (str) – Only consider training interaction data after this date. Specified in the timezone of the dataset.
test_on_user_split (bool) – Use user splits instead of using time splits, when validating and testing the model.
test_split_on_last_k_items (bool) – Use last k items instead of global timestamp splits, when validating and testing the model.
test_last_items_length (int) – Number of items to leave out for each user when using leave k out folds.
test_window_length_hours (int) – Duration (in hours) of most recent time window to use when validating and testing the model.
explicit_time_split (bool) – Sets an explicit time-based test boundary.
test_row_indicator (str) – Column indicating which rows to use for training (TRAIN), validation (VAL) and testing (TEST).
full_data_retraining (bool) – Train models separately with all the data.
sequential_training (bool) – Train a model sequentially through time.
data_split_feature_group_table_name (str) – Specify the table name of the feature group to export training data with the fold column.
optimized_event_type (str) – The final event type to optimize for and compute metrics on.
dropout_rate (int) – Dropout rate for neural network.
batch_size (BatchSize) – Batch size for neural network.
disable_transformer (bool) – Disable training the transformer algorithm.
disable_gpu (bool) – Disable training on GPU.
filter_history (bool) – Do not recommend items the user has already interacted with.
action_types_exclusion_days (Dict[str, float]) – Mapping from action type to number of days for which we exclude previously interacted items from prediction
session_dedupe_mins (float) – Minimum number of minutes between two sessions for a user.
max_history_length (int) – Maximum length of user-item history to include user in training examples.
compute_rerank_metrics (bool) – Compute metrics based on rerank results.
add_time_features (bool) – Include interaction time as a feature.
disable_timestamp_scalar_features (bool) – Exclude timestamp scalar features.
compute_session_metrics (bool) – Evaluate models based on how well they are able to predict the next session of interactions.
max_user_history_len_percentile (int) – Filter out users with history length above this percentile.
downsample_item_popularity_percentile (float) – Downsample items more popular than this percentile.
use_user_id_feature (bool) – Use user id as a feature in CTR models.
- sort_objective: abacusai.api_class.enums.PersonalizationObjective
- training_mode: abacusai.api_class.enums.PersonalizationTrainingMode
- batch_size: abacusai.api_class.enums.BatchSize
- __post_init__()
- class abacusai.api_class.RegressionTrainingConfig
Bases:
TrainingConfig
Training config for the PREDICTIVE_MODELING problem type
- Parameters:
objective (RegressionObjective) – Ranking scheme used to select final best model.
sort_objective (RegressionObjective) – Ranking scheme used to sort models on the metrics page.
tree_hpo_mode (RegressionTreeHPOMode) – Turning off Rapid Experimentation will take longer to train.
type_of_split (RegressionTypeOfSplit) – Type of data splitting into train/test (validation also).
test_split (int) – Percent of dataset to use for test data. We support using between 5% and 20% of your dataset as test data.
disable_test_val_fold (bool) – Do not create a TEST_VAL set. All records which would be part of the TEST_VAL fold otherwise, remain in the TEST fold.
k_fold_cross_validation (bool) – Use this to force k-fold cross validation bagging on or off.
num_cv_folds (int) – Specify the value of k in k-fold cross validation.
timestamp_based_splitting_column (str) – Timestamp column selected for splitting into test and train.
timestamp_based_splitting_method (RegressionTimeSplitMethod) – Method of selecting TEST set, top percentile wise or after a given timestamp.
test_splitting_timestamp (str) – Rows with timestamp greater than this will be considered to be in the test set.
sampling_unit_keys (List[str]) – Constrain train/test separation to partition a column.
test_row_indicator (str) – Column indicating which rows to use for training (TRAIN) and testing (TEST). Validation (VAL) can also be specified.
full_data_retraining (bool) – Train models separately with all the data.
rebalance_classes (bool) – Class weights are computed as the inverse of the class frequency from the training dataset when this option is selected as “Yes”. It is useful when the classes in the dataset are unbalanced. Re-balancing classes generally boosts recall at the cost of precision on rare classes.
rare_class_augmentation_threshold (float) – Augments any rare class whose relative frequency with respect to the most frequent class is less than this threshold. Default = 0.1 for classification problems with rare classes.
augmentation_strategy (RegressionAugmentationStrategy) – Strategy to deal with class imbalance and data augmentation.
training_rows_downsample_ratio (float) – Uses this ratio to train on a sample of the dataset provided.
active_labels_column (str) – Specify a column to use as the active columns in a multi label setting.
min_categorical_count (int) – Minimum threshold to consider a value different from the unknown placeholder.
sample_weight (str) – Specify a column to use as the weight of a sample for training and eval.
numeric_clipping_percentile (float) – Uses this option to clip the top and bottom x percentile of numeric feature columns where x is the value of this option.
target_transform (RegressionTargetTransform) – Specify a transform (e.g. log, quantile) to apply to the target variable.
ignore_datetime_features (bool) – Remove all datetime features from the model. Useful while generalizing to different time periods.
max_text_words (int) – Maximum number of words to use from text fields.
perform_feature_selection (bool) – If enabled, additional algorithms which support feature selection as a pretraining step will be trained separately with the selected subset of features. The details about their selected features can be found in their respective logs.
feature_selection_intensity (int) – This determines the strictness with which features will be filtered out. 1 being very lenient (more features kept), 100 being very strict.
batch_size (BatchSize) – Batch size.
dropout_rate (int) – Dropout percentage rate.
pretrained_model_name (str) – Enable algorithms which process text using pretrained multilingual NLP models.
is_multilingual (bool) – Enable algorithms which process text using pretrained multilingual NLP models.
loss_function (RegressionLossFunction) – Loss function to be used as objective for model training.
loss_parameters (str) – Loss function params in format <key>=<value>;<key>=<value>;…..
target_encode_categoricals (bool) – Use this to turn target encoding on categorical features on or off.
drop_original_categoricals (bool) – This option helps us choose whether to also feed the original label-encoded categorical columns to the models along with their target-encoded versions.
monotonically_increasing_features (List[str]) – Constrain the model such that it behaves as if the target feature is monotonically increasing with the selected features
monotonically_decreasing_features (List[str]) – Constrain the model such that it behaves as if the target feature is monotonically decreasing with the selected features
data_split_feature_group_table_name (str) – Specify the table name of the feature group to export training data with the fold column.
custom_loss_functions (List[str]) – Registered custom losses available for selection.
custom_metrics (List[str]) – Registered custom metrics available for selection.
- sort_objective: abacusai.api_class.enums.RegressionObjective
- tree_hpo_mode: abacusai.api_class.enums.RegressionTreeHPOMode
- type_of_split: abacusai.api_class.enums.RegressionTypeOfSplit
- timestamp_based_splitting_method: abacusai.api_class.enums.RegressionTimeSplitMethod
- augmentation_strategy: abacusai.api_class.enums.RegressionAugmentationStrategy
- target_transform: abacusai.api_class.enums.RegressionTargetTransform
- batch_size: abacusai.api_class.enums.BatchSize
- loss_function: abacusai.api_class.enums.RegressionLossFunction
- __post_init__()
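The `loss_parameters` option above is documented as a `<key>=<value>;<key>=<value>;…` string. A minimal parser sketch under that assumption (the parameter names in the example are hypothetical; valid keys depend on the chosen loss function):

```python
def parse_loss_parameters(spec: str) -> dict:
    """Parse a '<key>=<value>;<key>=<value>' loss_parameters string into a dict."""
    result = {}
    for pair in spec.split(';'):
        pair = pair.strip()
        if not pair:
            continue  # tolerate a trailing ';'
        key, _, value = pair.partition('=')
        result[key.strip()] = value.strip()
    return result

params = parse_loss_parameters('huber_delta=1.5;quantile=0.9')
```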
- class abacusai.api_class.ForecastingTrainingConfig
Bases:
TrainingConfig
Training config for the FORECASTING problem type
- Parameters:
prediction_length (int) – How many timesteps in the future to predict.
objective (ForecastingObjective) – Ranking scheme used to select final best model.
sort_objective (ForecastingObjective) – Ranking scheme used to sort models on the metrics page.
forecast_frequency (ForecastingFrequency) – Forecast frequency.
probability_quantiles (List[float]) – Prediction quantiles.
force_prediction_length (int) – Force length of test window to be the same as prediction length.
filter_items (bool) – Filter items with small history and volume.
enable_feature_selection (bool) – Enable feature selection.
enable_padding (bool) – Pad series to the max_date of the dataset
enable_cold_start (bool) – Enable cold start forecasting by training/predicting for zero history items.
enable_multiple_backtests (bool) – Whether to enable multiple backtesting or not.
num_backtesting_windows (int) – Total backtesting windows to use for the training.
backtesting_window_step_size (int) – Use this step size to shift backtesting windows for model training.
full_data_retraining (bool) – Train models separately with all the data.
additional_forecast_keys (List[str]) – List of categoricals in the timeseries that can act as multi-identifiers.
experimentation_mode (ExperimentationMode) – Selecting Thorough Experimentation will take longer to train.
type_of_split (ForecastingDataSplitType) – Type of data splitting into train/test.
test_by_item (bool) – Partition train/test data by item rather than time if true.
test_start (str) – Limit training data to dates before the given test start.
test_split (int) – Percent of dataset to use for test data. We support using between 5% and 20% of your dataset as test data.
loss_function (ForecastingLossFunction) – Loss function for training neural network.
underprediction_weight (float) – Weight for underpredictions
disable_networks_without_analytic_quantiles (bool) – Disable neural networks whose quantile functions do not have analytic expressions (e.g., mixture models).
initial_learning_rate (float) – Initial learning rate.
l2_regularization_factor (float) – L2 regularization factor.
dropout_rate (int) – Dropout percentage rate.
recurrent_layers (int) – Number of recurrent layers to stack in network.
recurrent_units (int) – Number of units in each recurrent layer.
convolutional_layers (int) – Number of convolutional layers to stack on top of recurrent layers in network.
convolution_filters (int) – Number of filters in each convolution.
local_scaling_mode (ForecastingLocalScaling) – Options to make NN inputs stationary in high dynamic range datasets.
zero_predictor (bool) – Include subnetwork to classify points where target equals zero.
skip_missing (bool) – Make the RNN ignore missing entries rather than processing them.
batch_size (ForecastingBatchSize) – Batch size.
batch_renormalization (bool) – Enable batch renormalization between layers.
history_length (int) – While training, how much history to consider.
prediction_step_size (int) – Number of future periods to include in objective for each training sample.
training_point_overlap (float) – Amount of overlap to allow between training samples.
max_scale_context (int) – Maximum context to use for local scaling.
quantiles_extension_method (ForecastingQuanitlesExtensionMethod) – Quantile extension method
number_of_samples (int) – Number of samples for ancestral simulation
symmetrize_quantiles (bool) – Force symmetric quantiles (like in Gaussian distribution)
use_log_transforms (bool) – Apply logarithmic transformations to input data.
smooth_history (float) – Smooth (low pass filter) the timeseries.
local_scale_target (bool) – Using per training/prediction window target scaling.
use_clipping (bool) – Apply clipping to input data to stabilize the training.
timeseries_weight_column (str) – If set, we use the values in this column from timeseries data to assign time dependent item weights during training and evaluation.
item_attributes_weight_column (str) – If set, we use the values in this column from item attributes data to assign weights to items during training and evaluation.
use_timeseries_weights_in_objective (bool) – If True, we include weights from column set as “TIMESERIES WEIGHT COLUMN” in objective functions.
use_item_weights_in_objective (bool) – If True, we include weights from column set as “ITEM ATTRIBUTES WEIGHT COLUMN” in objective functions.
skip_timeseries_weight_scaling (bool) – If True, we will avoid normalizing the weights.
timeseries_loss_weight_column (str) – Use value in this column to weight the loss while training.
use_item_id (bool) – Include a feature to indicate the item being forecast.
use_all_item_totals (bool) – Include as input total target across items.
handle_zeros_as_missing_values (bool) – If True, handle zero values in demand as missing data.
datetime_holiday_calendars (List[HolidayCalendars]) – Holiday calendars to augment training with.
fill_missing_values (List[dict]) – Strategy for filling in missing values.
enable_clustering (bool) – Enable clustering in forecasting.
data_split_feature_group_table_name (str) – Specify the table name of the feature group to export training data with the fold column.
custom_loss_functions (List[str]) – Registered custom losses available for selection.
custom_metrics (List[str]) – Registered custom metrics available for selection.
return_fractional_forecasts – Use this to return fractional forecast values during prediction.
- sort_objective: abacusai.api_class.enums.ForecastingObjective
- forecast_frequency: abacusai.api_class.enums.ForecastingFrequency
- experimentation_mode: abacusai.api_class.enums.ExperimentationMode
- type_of_split: abacusai.api_class.enums.ForecastingDataSplitType
- loss_function: abacusai.api_class.enums.ForecastingLossFunction
- local_scaling_mode: abacusai.api_class.enums.ForecastingLocalScaling
- batch_size: abacusai.api_class.enums.BatchSize
- quantiles_extension_method: abacusai.api_class.enums.ForecastingQuanitlesExtensionMethod
- datetime_holiday_calendars: List[abacusai.api_class.enums.HolidayCalendars]
- __post_init__()
- class abacusai.api_class.NamedEntityExtractionTrainingConfig
Bases:
TrainingConfig
Training config for the NAMED_ENTITY_EXTRACTION problem type
- Parameters:
objective (NERObjective) – Ranking scheme used to select final best model.
sort_objective (NERObjective) – Ranking scheme used to sort models on the metrics page.
ner_model_type (NERModelType) – Type of NER model to use.
test_split (int) – Percent of dataset to use for test data. We support using between 5% and 20% of your dataset as test data.
test_row_indicator (str) – Column indicating which rows to use for training (TRAIN) and testing (TEST).
dropout_rate (float) – Dropout rate for neural network.
batch_size (BatchSize) – Batch size for neural network.
active_labels_column (str) – Entities that have been marked in a particular text
document_format (NLPDocumentFormat) – Format of the input documents.
include_longformer (bool) – Whether to include the longformer model.
save_predicted_pdf (bool) – Whether to save predicted PDF documents
enhanced_ocr (bool) – Enhanced text extraction from predicted digital documents
- objective: abacusai.api_class.enums.NERObjective
- sort_objective: abacusai.api_class.enums.NERObjective
- ner_model_type: abacusai.api_class.enums.NERModelType
- batch_size: abacusai.api_class.enums.BatchSize
- document_format: abacusai.api_class.enums.NLPDocumentFormat
- __post_init__()
- class abacusai.api_class.NaturalLanguageSearchTrainingConfig
Bases:
TrainingConfig
Training config for the NATURAL_LANGUAGE_SEARCH problem type
- Parameters:
abacus_internal_model (bool) – Use an Abacus.AI LLM to answer questions about your data without using any external APIs.
num_completion_tokens (int) – Default for maximum number of tokens for chat answers. Reducing this will yield faster, more succinct responses.
larger_embeddings (bool) – Use a higher dimension embedding model.
search_chunk_size (int) – Chunk size for indexing the documents.
chunk_overlap_fraction (float) – Overlap in chunks while indexing the documents.
test_split (int) – Percent of dataset to use for test data. We support using between 5% and 20% of your dataset as test data.
- __post_init__()
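The `search_chunk_size` and `chunk_overlap_fraction` options above control how documents are split for indexing. The actual indexing is performed server-side; this is only an illustrative sketch of how overlapping chunks of that shape could be produced:

```python
def chunk_tokens(tokens, chunk_size, overlap_fraction):
    """Split a token list into chunks of chunk_size, where consecutive
    chunks overlap by roughly overlap_fraction * chunk_size tokens."""
    step = max(1, int(chunk_size * (1 - overlap_fraction)))
    chunks = []
    for start in range(0, len(tokens), step):
        chunks.append(tokens[start:start + chunk_size])
        if start + chunk_size >= len(tokens):
            break  # last chunk already reaches the end of the document
    return chunks

chunks = chunk_tokens(list(range(10)), chunk_size=4, overlap_fraction=0.5)
```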
- class abacusai.api_class.ChatLLMTrainingConfig
Bases:
TrainingConfig
Training config for the CHAT_LLM problem type
- Parameters:
document_retrievers (List[str]) – List of document retriever names to use for the feature stores this model was trained with.
num_completion_tokens (int) – Default for maximum number of tokens for chat answers. Reducing this will yield faster, more succinct responses.
system_message (str) – The generative LLM system message
temperature (float) – The generative LLM temperature
metadata_columns (list) – Include the metadata column values in the retrieved search results.
- __post_init__()
- class abacusai.api_class.SentenceBoundaryDetectionTrainingConfig
Bases:
TrainingConfig
Training config for the SENTENCE_BOUNDARY_DETECTION problem type
- Parameters:
- batch_size: abacusai.api_class.enums.BatchSize
- __post_init__()
- class abacusai.api_class.SentimentDetectionTrainingConfig
Bases:
TrainingConfig
Training config for the SENTIMENT_DETECTION problem type
- Parameters:
sentiment_type (SentimentType) – Type of sentiment to detect.
test_split (int) – Percent of dataset to use for test data. We support using between 5% and 20% of your dataset as test data.
dropout_rate (float) – Dropout rate for neural network.
batch_size (BatchSize) – Batch size for neural network.
compute_metrics (bool) – Whether to compute metrics.
- sentiment_type: abacusai.api_class.enums.SentimentType
- batch_size: abacusai.api_class.enums.BatchSize
- __post_init__()
- class abacusai.api_class.DocumentClassificationTrainingConfig
Bases:
TrainingConfig
Training config for the DOCUMENT_CLASSIFICATION problem type
- Parameters:
zero_shot_hypotheses (List[str]) – Zero shot hypotheses. Example text: ‘This text is about pricing’.
test_split (int) – Percent of dataset to use for test data. We support using between 5% and 20% of your dataset as test data.
dropout_rate (float) – Dropout rate for neural network.
batch_size (BatchSize) – Batch size for neural network.
- batch_size: abacusai.api_class.enums.BatchSize
- __post_init__()
- class abacusai.api_class.DocumentSummarizationTrainingConfig
Bases:
TrainingConfig
Training config for the DOCUMENT_SUMMARIZATION problem type
- Parameters:
- batch_size: abacusai.api_class.enums.BatchSize
- __post_init__()
- class abacusai.api_class.DocumentVisualizationTrainingConfig
Bases:
TrainingConfig
Training config for the DOCUMENT_VISUALIZATION problem type
- Parameters:
- batch_size: abacusai.api_class.enums.BatchSize
- __post_init__()
- class abacusai.api_class.ClusteringTrainingConfig
Bases:
TrainingConfig
Training config for the CLUSTERING problem type
- Parameters:
num_clusters_selection (int) – Number of clusters. If None, will be selected automatically.
- __post_init__()
- class abacusai.api_class.ClusteringTimeseriesTrainingConfig
Bases:
TrainingConfig
Training config for the CLUSTERING_TIMESERIES problem type
- Parameters:
num_clusters_selection (int) – Number of clusters. If None, will be selected automatically.
imputation (ClusteringImputationMethod) – Imputation method for missing values.
- __post_init__()
- class abacusai.api_class.EventAnomalyTrainingConfig
Bases:
TrainingConfig
Training config for the EVENT_ANOMALY problem type
- Parameters:
anomaly_fraction (float) – The fraction of the dataset to classify as anomalous, between 0 and 0.5
- __post_init__()
- class abacusai.api_class.CumulativeForecastingTrainingConfig
Bases:
TrainingConfig
Training config for the CUMULATIVE_FORECASTING problem type
- Parameters:
test_split (int) – Percent of dataset to use for test data. We support using between 5% and 20% of your dataset as test data.
historical_frequency (str) – Forecast frequency
cumulative_prediction_lengths (List[int]) – List of Cumulative Prediction Frequencies. Each prediction length must be between 1 and 365.
skip_input_transform (bool) – Avoid doing numeric scaling transformations on the input.
skip_target_transform (bool) – Avoid doing numeric scaling transformations on the target.
predict_residuals (bool) – Predict residuals instead of totals at each prediction step.
- __post_init__()
- class abacusai.api_class.AnomalyDetectionTrainingConfig
Bases:
TrainingConfig
Training config for the ANOMALY_DETECTION problem type
- Parameters:
test_split (int) – Percent of dataset to use for test data. We support using between 5% and 20% of your dataset as test data.
value_high (bool) – Detect unusually high values.
mixture_of_gaussians (bool) – Detect unusual combinations of values using mixture of Gaussians.
variational_autoencoder (bool) – Use variational autoencoder for anomaly detection.
spike_up (bool) – Detect outliers with a high value.
spike_down (bool) – Detect outliers with a low value.
trend_change (bool) – Detect changes to the trend.
- __post_init__()
- class abacusai.api_class.ThemeAnalysisTrainingConfig
Bases:
TrainingConfig
Training config for the THEME_ANALYSIS problem type
- __post_init__()
- class abacusai.api_class.AIAgentTrainingConfig
Bases:
TrainingConfig
Training config for the AI_AGENT problem type
- Parameters:
- __post_init__()
- class abacusai.api_class.CustomTrainedModelTrainingConfig
Bases:
TrainingConfig
Training config for the CUSTOM_TRAINED_MODEL problem type
- Parameters:
max_catalog_size (int) – Maximum expected catalog size.
max_dimension (int) – Maximum expected dimension of the catalog.
index_output_path (str) – Fully qualified cloud location (GCS, S3, etc) to export snapshots of the embedding to.
docker_image_uri (str) – Docker image URI.
service_port (int) – Service port.
- __post_init__()
- class abacusai.api_class.CustomAlgorithmTrainingConfig
Bases:
TrainingConfig
Training config for the CUSTOM_ALGORITHM problem type
- Parameters:
train_function_name (str) – The name of the train function.
predict_many_function_name (str) – The name of the predict many function.
training_input_tables (List[str]) – List of tables to use for training.
predict_function_name (str) – Optional name of the predict function if the predict many function is not given.
train_module_name (str) – The name of the train module - only relevant if the model is being uploaded from a zip file or github repository.
predict_module_name (str) – The name of the predict module - only relevant if the model is being uploaded from a zip file or github repository.
test_split (int) – Percent of dataset to use for test data. We support using between 6% and 20% of your dataset as test data.
- __post_init__()
- class abacusai.api_class.OptimizationTrainingConfig
Bases:
TrainingConfig
Training config for the OPTIMIZATION problem type
- Parameters:
solve_time_limit (float) – The maximum time in seconds to spend solving the problem. Accepts values between 0 and 86400.
- __post_init__()
- class abacusai.api_class._TrainingConfigFactory
Bases:
abacusai.api_class.abstract._ApiClassFactory
Factory that selects the concrete TrainingConfig subclass based on the problem type.
- config_abstract_class
- config_class_key = 'problem_type'
- config_class_map
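The config_class_key / config_class_map pattern above can be pictured with a small standalone sketch; the class and key names below are simplified stand-ins for illustration, not the library's actual internals:

```python
# Stand-in concrete config classes.
class ForecastingConfig:
    pass

class PersonalizationConfig:
    pass

class TrainingConfigFactory:
    # The key in the input dict that selects the concrete class.
    config_class_key = 'problem_type'
    # Maps key values to concrete config classes.
    config_class_map = {
        'FORECASTING': ForecastingConfig,
        'PERSONALIZATION': PersonalizationConfig,
    }

    @classmethod
    def from_dict(cls, config: dict):
        # Look up the concrete class by the configured key and instantiate it.
        key_value = config[cls.config_class_key]
        return cls.config_class_map[key_value]()

config = TrainingConfigFactory.from_dict({'problem_type': 'FORECASTING'})
print(type(config).__name__)  # ForecastingConfig
```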
- class abacusai.api_class.DeployableAlgorithm
Bases:
abacusai.api_class.abstract.ApiClass
Algorithm that can be deployed to a model.
- Parameters:
- class abacusai.api_class.ApiClass
Bases:
abc.ABC
Helper class that provides a standard way to create an ABC using inheritance.
- __post_init__()
- classmethod _get_builder()
- __str__()
Return str(self).
- _repr_html_()
- to_dict()
Standardizes converting an ApiClass to a dictionary. Keys of the response dictionary are converted to camel case. This also validates the fields (type, value, etc.) received in the dictionary.
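The documented camel-case key conversion can be pictured with a small standalone sketch; the dataclass and helper below are illustrative stand-ins, not part of the abacusai library:

```python
import dataclasses

def snake_to_camel(name: str) -> str:
    """Convert a snake_case field name to camelCase."""
    first, *rest = name.split('_')
    return first + ''.join(part.title() for part in rest)

@dataclasses.dataclass
class ExampleConfig:
    # Hypothetical config fields, for illustration only.
    feature_drift_type: str = 'js'
    minimum_violations: int = 3

    def to_dict(self) -> dict:
        # Camel-case the keys, mirroring the documented to_dict behavior.
        return {snake_to_camel(k): v for k, v in dataclasses.asdict(self).items()}

print(ExampleConfig().to_dict())  # {'featureDriftType': 'js', 'minimumViolations': 3}
```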
- class abacusai.api_class.ForecastingMonitorConfig
Bases:
abacusai.api_class.abstract.ApiClass
Monitor config for forecasting problems.
- to_dict()
Standardizes converting an ApiClass to a dictionary. Keys of the response dictionary are converted to camel case. This also validates the fields (type, value, etc.) received in the dictionary.
- class abacusai.api_class._ApiClassFactory
Bases:
abc.ABC
Abstract factory that selects a concrete ApiClass subclass based on a config key.
- config_abstract_class
- config_class_key
- config_class_map
- class abacusai.api_class.AlertConditionConfig
Bases:
abacusai.api_class.abstract.ApiClass
Base config class for monitor alert conditions.
- alert_type: abacusai.api_class.enums.MonitorAlertType
- classmethod _get_builder()
- class abacusai.api_class.AccuracyBelowThresholdConditionConfig
Bases:
AlertConditionConfig
Accuracy Below Threshold Condition Config for Monitor Alerts
- Parameters:
threshold (float) – Threshold for when to consider the model to be in violation. The alert will only fire when the accuracy falls below the threshold.
- __post_init__()
- class abacusai.api_class.FeatureDriftConditionConfig
Bases:
AlertConditionConfig
Feature Drift Condition Config for Monitor Alerts
- Parameters:
feature_drift_type (FeatureDriftType) – Feature drift type to apply the threshold on to determine whether a column has drifted significantly enough to be a violation.
threshold (float) – Threshold for when to consider a column to be in violation. The alert will only fire when the drift value is strictly greater than the threshold.
minimum_violations (int) – Number of columns that must exceed the specified threshold to trigger an alert.
feature_names (List[str]) – List of feature names to monitor for this alert.
- feature_drift_type: abacusai.api_class.enums.FeatureDriftType
- __post_init__()
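The firing rule described above (drift strictly greater than the threshold, for at least minimum_violations columns) can be sketched in plain Python. This illustrates the documented semantics only; the function below is not library code:

```python
def drift_alert_fires(drift_by_feature: dict, threshold: float,
                      minimum_violations: int) -> bool:
    """Return True when enough columns drift strictly above the threshold."""
    violations = [name for name, drift in drift_by_feature.items()
                  if drift > threshold]
    return len(violations) >= minimum_violations

# Example drift values per monitored feature.
drifts = {'price': 0.31, 'volume': 0.12, 'region': 0.45}
print(drift_alert_fires(drifts, threshold=0.3, minimum_violations=2))  # True
print(drift_alert_fires(drifts, threshold=0.5, minimum_violations=1))  # False
```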
- class abacusai.api_class.TargetDriftConditionConfig
Bases:
AlertConditionConfig
Target Drift Condition Config for Monitor Alerts
- Parameters:
feature_drift_type (FeatureDriftType) – Target drift type to apply the threshold on to determine whether the target column has drifted significantly enough to be a violation.
threshold (float) – Threshold for when to consider the target column to be in violation. The alert will only fire when the drift value is strictly greater than the threshold.
- feature_drift_type: abacusai.api_class.enums.FeatureDriftType
- __post_init__()
- class abacusai.api_class.HistoryLengthDriftConditionConfig
Bases:
AlertConditionConfig
History Length Drift Condition Config for Monitor Alerts
- Parameters:
feature_drift_type (FeatureDriftType) – History length drift type to apply the threshold on to determine whether the history length has drifted significantly enough to be a violation.
threshold (float) – Threshold for when to consider the history length to be in violation. The alert will only fire when the drift value is strictly greater than the threshold.
- feature_drift_type: abacusai.api_class.enums.FeatureDriftType
- __post_init__()
- class abacusai.api_class.DataIntegrityViolationConditionConfig
Bases:
AlertConditionConfig
Data Integrity Violation Condition Config for Monitor Alerts
- Parameters:
data_integrity_type (DataIntegrityViolationType) – This option selects the data integrity violations to monitor for this alert.
minimum_violations (int) – Number of columns with data integrity violations required to trigger an alert.
- data_integrity_type: abacusai.api_class.enums.DataIntegrityViolationType
- __post_init__()
- class abacusai.api_class.BiasViolationConditionConfig
Bases:
AlertConditionConfig
Bias Violation Condition Config for Monitor Alerts
- Parameters:
bias_type (BiasType) – This option selects the bias metric to monitor for this alert.
threshold (float) – Threshold for when to consider a column to be in violation. The alert will only fire when the drift value is strictly greater than the threshold.
minimum_violations (int) – Number of columns that must exceed the specified threshold to trigger an alert.
- bias_type: abacusai.api_class.enums.BiasType
- __post_init__()
- class abacusai.api_class._AlertConditionConfigFactory
Bases:
abacusai.api_class.abstract._ApiClassFactory
Factory that selects the concrete AlertConditionConfig subclass based on the alert type.
- config_abstract_class
- config_class_key = 'alert_type'
- config_class_key_value_camel_case = True
- config_class_map
- class abacusai.api_class.AlertActionConfig
Bases:
abacusai.api_class.abstract.ApiClass
Base config class for monitor alert actions.
- action_type: abacusai.api_class.enums.AlertActionType
- classmethod _get_builder()
- class abacusai.api_class.EmailActionConfig
Bases:
AlertActionConfig
Email Action Config for Monitor Alerts
- Parameters:
- __post_init__()
- class abacusai.api_class._AlertActionConfigFactory
Bases:
abacusai.api_class.abstract._ApiClassFactory
Factory that selects the concrete AlertActionConfig subclass based on the action type.
- config_abstract_class
- config_class_key = 'action_type'
- config_class_map
- class abacusai.api_class.FeatureMappingConfig
Bases:
abacusai.api_class.abstract.ApiClass
Config class for a single feature mapping.
- class abacusai.api_class.ProjectFeatureGroupTypeMappingsConfig
Bases:
abacusai.api_class.abstract.ApiClass
Config class for a project feature group's type and feature mappings.
- feature_mappings: List[FeatureMappingConfig]
- class abacusai.api_class.PythonFunctionArgument
Bases:
abacusai.api_class.abstract.ApiClass
A config class for Python function arguments
- Parameters:
variable_type (PythonFunctionArgumentType) – The type of the python function argument
name (str) – The name of the python function variable
is_required (bool) – Whether the argument is required
value (Any) – The value of the argument
pipeline_variable (str) – The name of the pipeline variable to use as the value
- variable_type: abacusai.api_class.enums.PythonFunctionArgumentType
- value: Any
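One way to picture how an argument's value and pipeline_variable might interact, given the parameters documented above. The resolution logic below is a guess for illustration only, not the library's implementation:

```python
def resolve_argument(value, pipeline_variable, pipeline_context: dict):
    """Prefer a named pipeline variable; otherwise use the literal value."""
    if pipeline_variable is not None:
        # The pipeline variable's current value overrides the literal one.
        return pipeline_context[pipeline_variable]
    return value

# Hypothetical pipeline context mapping variable names to values.
context = {'run_date': '2024-01-01'}
print(resolve_argument(None, 'run_date', context))  # 2024-01-01
print(resolve_argument(42, None, context))          # 42
```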
- class abacusai.api_class.OutputVariableMapping
Bases:
abacusai.api_class.abstract.ApiClass
A config class for mapping Python function output variables
- Parameters:
variable_type (PythonFunctionOutputArgumentType) – The type of the python function output argument
name (str) – The name of the python function variable
- variable_type: abacusai.api_class.enums.PythonFunctionOutputArgumentType
- class abacusai.api_class.FeatureGroupExportConfig
Bases:
abacusai.api_class.abstract.ApiClass
Base config class for feature group export connectors.
- connector_type: abacusai.api_class.enums.ConnectorType
- classmethod _get_builder()
- class abacusai.api_class.FileConnectorExportConfig
Bases:
FeatureGroupExportConfig
Export config for exporting a feature group to a file connector location.
- connector_type: abacusai.api_class.enums.ConnectorType
- to_dict()
Standardizes converting an ApiClass to a dictionary. Keys of the response dictionary are converted to camel case. This also validates the fields (type, value, etc.) received in the dictionary.
- class abacusai.api_class.DatabaseConnectorExportConfig
Bases:
FeatureGroupExportConfig
Export config for exporting a feature group to a database connector.
- connector_type: abacusai.api_class.enums.ConnectorType
- to_dict()
Standardizes converting an ApiClass to a dictionary. Keys of the response dictionary are converted to camel case. This also validates the fields (type, value, etc.) received in the dictionary.
- class abacusai.api_class._FeatureGroupExportConfigFactory
Bases:
abacusai.api_class.abstract._ApiClassFactory
Factory that selects the concrete FeatureGroupExportConfig subclass based on the connector type.
- config_abstract_class
- config_class_key = 'connectorType'
- config_class_map