Aqueduct RunLLM - Short Review




Product Overview of Aqueduct and RunLLM



Introduction

Aqueduct, together with its companion product RunLLM, forms a platform that combines the capabilities of an MLOps framework with AI-powered technical support. Here's an overview of what each product does and its key features.



What Aqueduct Does

Aqueduct is an open-source MLOps (Machine Learning Operations) framework designed to streamline the deployment, management, and monitoring of machine learning (ML) and large language model (LLM) workloads. It allows data scientists to define and deploy ML and LLM tasks on any cloud infrastructure, including Kubernetes, Spark, Airflow, and AWS Lambda, using vanilla Python code.



Key Features of Aqueduct

  • Python-Native Pipeline API: Aqueduct’s API enables users to define workflows in regular Python code, eliminating the need for domain-specific languages (DSLs) or YAML configurations. This facilitates quick and effective deployment of code into production.
  • Integrated with Existing Infrastructure: Workflows defined in Aqueduct can run seamlessly on various cloud infrastructures, allowing users to leverage their existing tooling without the need for significant changes.
  • Centralized Visibility: Aqueduct provides comprehensive visibility into code, data, metrics, and metadata generated by each workflow run. This ensures users can monitor the performance and health of their pipelines effectively.
  • Secure Execution: Aqueduct runs entirely within the user’s cloud environment, ensuring the security and integrity of data and code.
  • Workflow Management: The core abstraction in Aqueduct is a Workflow, which is a sequence of Artifacts (data) transformed by Operators (compute). Workflows can be run on a fixed schedule or triggered on-demand.
  • Advanced Integrations: Aqueduct supports integrations with various systems such as Databricks Spark clusters, Snowflake, AWS S3, and more. It also allows for the execution of operators on different engines and the loading of data from local filesystems.
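The Workflow abstraction above (Artifacts transformed by Operators) can be illustrated with a short sketch. Note that this is plain Python written to mimic the decorator-based style the document describes, not the actual aqueduct SDK; the `op` decorator here is a stand-in.

```python
def op(fn):
    """Illustrative stand-in for a pipeline-operator decorator."""
    fn.is_operator = True
    return fn

@op
def clean(rows):
    # Operator: drop rows containing missing values.
    return [r for r in rows if None not in r]

@op
def score(rows):
    # Operator: stand-in for model inference over the cleaned data.
    return [sum(r) for r in rows]

# A workflow is operators chained over artifacts (the data they produce).
raw = [(1, 2), (3, None), (4, 5)]
artifact = score(clean(raw))
print(artifact)  # [3, 9]
```

In the real framework, the chained calls would be registered with a client and published as a workflow that can run on a schedule or on-demand against an engine such as Kubernetes or Airflow.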


What RunLLM Does

RunLLM is an AI-powered technical support tool built on top of the Aqueduct framework. Here are its key functionalities:



Key Features of RunLLM

  • AI-Powered Technical Support: RunLLM acts as a technical support engineer by reading and understanding product documentation, guides, and APIs to provide precise and concise answers to user queries. If it does not know the answer, it clearly states so.
  • Insight Generation: RunLLM automatically surfaces valuable insights from user interactions, eliminating the need for manual data analysis.
  • Citations and Data Sources: Every answer provided by RunLLM comes with citations and explanations of why the data source was relevant, enhancing transparency and trust.
  • Expertise Building: RunLLM builds expertise in the product by continuously learning from documentation and user interactions, which accelerates customer adoption and saves support teams time.
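The citations-and-sources behavior described above amounts to pairing every answer with the evidence behind it. A minimal sketch of such an answer payload is shown below; the class and field names are hypothetical assumptions for illustration, not RunLLM's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Citation:
    source: str      # e.g. a docs page or guide section
    relevance: str   # why this source supports the answer

@dataclass
class Answer:
    text: str
    citations: list[Citation] = field(default_factory=list)

    def is_grounded(self) -> bool:
        # An uncited answer should be treated as "I don't know".
        return bool(self.citations)

ans = Answer(
    text="Aqueduct workflows can run on Kubernetes.",
    citations=[Citation("docs/engines/kubernetes",
                        "lists Kubernetes as a supported engine")],
)
print(ans.is_grounded())  # True
```

Requiring every answer object to carry its citations makes the "clearly states when it does not know" behavior checkable in code rather than a matter of prompt wording.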


Conclusion

Aqueduct and RunLLM together form a platform that not only streamlines ML and LLM workflows but also enhances technical support with AI-driven insights. Aqueduct's MLOps capabilities and RunLLM's AI-powered support make the pair a valuable tool for data scientists, ML engineers, and technical support teams.
