LANCY.TECH
KNOWLEDGE SERVED TRANSPARENTLY
rag-pipeline --status
llm: served
backend: online
latency: 14ms
status: ingesting...