{% extends "base.html" %} {% block title %}AI Index — MOSAIC{% endblock %} {% block content %}

- Papers in cache
- {{ total_papers }}
- Papers indexed
- {{ indexed_count }}
{% if embedding_model %}
- Embedding model
- {{ embedding_model }}
{% else %}
- Embedding model
- Not configured — set it in Config
{% endif %}

The vector index lets MOSAIC answer questions about your cached papers using semantic search. Before using Ask or Chat, you need an embedding model configured.