Best AI Tools for Managing Data Pipelines
20 - AI Tool Sites
SingleStore
SingleStore is a real-time data platform designed for apps, analytics, and gen AI. It offers faster hybrid vector + full-text search, fast-scaling integrations, and a free tier. SingleStore can read, write, and reason on petabyte-scale data in milliseconds. It supports streaming ingestion, high concurrency, first-class vector support, record lookups, and more.
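A minimal sketch of querying SingleStore from Python with the official `singlestoredb` client is shown below; the `docs` table, its packed `embedding` column, and the connection string are illustrative assumptions, and newer SingleStore releases also offer a dedicated VECTOR type with infix similarity operators.

```python
# Hedged sketch: vector similarity lookup against a hypothetical `docs` table.
import json
import singlestoredb as s2

query_vector = [0.12, -0.03, 0.55, 0.91]  # e.g. output of an embedding model

conn = s2.connect("user:password@host:3306/analytics")  # placeholder DSN
cur = conn.cursor()
cur.execute(
    """
    SELECT id, title,
           DOT_PRODUCT(embedding, JSON_ARRAY_PACK(%s)) AS score
    FROM docs
    ORDER BY score DESC
    LIMIT 5
    """,
    (json.dumps(query_vector),),
)
for row in cur.fetchall():
    print(row)
conn.close()
```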
nuvo
nuvo is an AI-powered solution that offers fast, secure, and scalable data imports for software companies. It provides tools like the nuvo Data Importer SDK and nuvo Data Pipeline to streamline manual and recurring ETL data imports, enabling users to manage data imports independently. With AI-enhanced automation, nuvo helps prepare clean data for preferred systems quickly and efficiently, reducing manual effort and improving data quality. The platform allows users to upload unlimited data in various formats, match imported data to system schemas, clean and validate data, and import clean data into target systems with just a click.
Encord
Encord is a complete data development platform designed for AI applications, specifically tailored for computer vision and multimodal AI teams. It offers tools to intelligently manage, clean, and curate data, streamline labeling and workflow management, and evaluate model performance. Encord aims to unlock the potential of AI for organizations by simplifying data-centric AI pipelines, enabling the building of better models and deploying high-quality production AI faster.
DVC
DVC is an open-source version control system for machine learning projects. It allows users to track and manage their data, models, and code in a single place. DVC also provides a number of features that make it easy to collaborate on machine learning projects, such as experiment tracking, model registration, and pipeline management.
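The snippet below is a minimal sketch of pulling a DVC-tracked artifact from Python via `dvc.api`; the repository URL, file path, and revision are placeholders.

```python
# Hedged sketch: read a versioned dataset straight from a Git+DVC repo.
import dvc.api

with dvc.api.open(
    "data/train.csv",                           # hypothetical tracked file
    repo="https://github.com/example/project",  # hypothetical repository
    rev="v1.0",                                 # tag, branch, or commit
) as f:
    print(f.readline())  # peek at the header row

# Resolve the remote-storage URL without downloading the file.
url = dvc.api.get_url("data/train.csv", repo="https://github.com/example/project")
print(url)
```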
Baseten
Baseten is a machine learning infrastructure platform that gives data scientists and engineers a unified way to build, train, and deploy machine learning models. It offers a range of features to simplify the ML lifecycle, including data preparation, model training, and deployment. Baseten also provides a marketplace of pre-built models and components that can be used to accelerate the development of ML applications.
Hopsworks
Hopsworks is an AI platform that offers a comprehensive solution for building, deploying, and monitoring machine learning systems. It provides features such as a Feature Store, real-time ML capabilities, and generative AI solutions. Hopsworks enables users to develop and deploy reliable AI systems, orchestrate and monitor models, and personalize machine learning models with private data. The platform supports batch and real-time ML tasks, with the flexibility to deploy on-premises or in the cloud.
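As a rough sketch, the Hopsworks Python library exposes the Feature Store roughly as follows; the project credentials, feature group name, and DataFrame are placeholders.

```python
# Hedged sketch: write features to the Hopsworks Feature Store.
import pandas as pd
import hopsworks

project = hopsworks.login()          # reads or prompts for an API key
fs = project.get_feature_store()

df = pd.DataFrame({"customer_id": [1, 2], "avg_basket": [34.5, 12.0]})

fg = fs.get_or_create_feature_group(
    name="customer_features",        # hypothetical feature group
    version=1,
    primary_key=["customer_id"],
    online_enabled=True,             # make features servable in real time
)
fg.insert(df)
```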
Bionl
Bionl is a no-code bioinformatics platform designed to streamline biomedical research for researchers and scientists. It offers a full workspace with features such as bioinformatics pipelines customization, GenAI for data analysis, AI-powered literature search, PDF analysis, and access to public datasets. Bionl aims to automate cloud, file system, data, and workflow management for efficient and precise analyses. The platform caters to Pharma and Biotech companies, academic researchers, and bioinformatics CROs, providing powerful tools for genetic analysis and speeding up research processes.
Modjo
Modjo is an AI platform designed to boost sales team productivity, enhance commercial efficiency, and develop sales skills. It captures interactions with customers, creates an augmented conversational database, and deploys AI solutions within organizations. Modjo's AI capabilities help in productivity, sales execution, and coaching, enabling teams to focus on selling, manage pipelines efficiently, and develop sales skills. The platform offers features like AI call scoring, CRM filling, call reviews, email follow-up, and insights library.
Athina AI
Athina AI is a comprehensive platform designed to monitor, debug, analyze, and improve the performance of Large Language Models (LLMs) in production environments. It provides a suite of tools and features that enable users to detect and fix hallucinations, evaluate output quality, analyze usage patterns, and optimize prompt management. Athina AI supports integration with various LLMs and offers a range of evaluation metrics, including context relevancy, harmfulness, summarization accuracy, and custom evaluations. It also provides a self-hosted solution for complete privacy and control, a GraphQL API for programmatic access to logs and evaluations, and support for multiple users and teams. Athina AI's mission is to empower organizations to harness the full potential of LLMs by ensuring their reliability, accuracy, and alignment with business objectives.
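Because Athina exposes a GraphQL API for programmatic access, logs can be pulled with any GraphQL client; the sketch below uses `requests`, but the endpoint URL, auth header, and field names are hypothetical placeholders rather than Athina's documented schema.

```python
# Hedged sketch: generic GraphQL request; consult Athina's docs for the real
# endpoint and schema -- everything below is a placeholder.
import requests

GRAPHQL_URL = "https://your-athina-host/graphql"  # placeholder
API_KEY = "your-api-key"                          # placeholder

query = """
query RecentLogs($limit: Int!) {
  logs(limit: $limit) { id prompt response createdAt }
}
"""

resp = requests.post(
    GRAPHQL_URL,
    json={"query": query, "variables": {"limit": 10}},
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```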
Salesmate
Salesmate is a modern CRM software designed for teams to market, sell, and service from one platform. It offers advanced automation features to streamline processes, engage customers across all channels, generate leads, and make better decisions with rich data and insights. Salesmate is highly customizable, user-friendly, and suitable for various industries and roles. With time-saving tools, automations, and AI capabilities, Salesmate aims to improve user experience and boost business growth.
Frame AI
Frame is an Intelligent Company Workspace solution that serves as a control center for an AI workforce, offering a suite of AI-powered tools and AI employees to scale businesses. It provides features like building custom AI employees, cross-app search, cross-app linking, a Chrome extension, and real-time data awareness. Frame empowers users to manage product development, hire employees, manage sales pipelines, raise venture capital, collaborate remotely, centralize company knowledge, and augment teams with AI employees. The application is loved by users at forward-thinking companies and offers a delightful suite of native apps for teamwork, including note-taking, task management, a whiteboard, and more. Frame is designed for speed and efficiency, enabling users to switch between apps seamlessly and maintain clean data through auto-enrichment.
HubSpot
HubSpot is a customer relationship management (CRM) platform that provides software and tools for marketing, sales, customer service, content management, and operations. It is designed to help businesses grow by connecting their data, teams, and customers on one platform. HubSpot's AI tools are used to automate tasks, personalize marketing campaigns, and provide insights into customer behavior.
Manatal
Manatal is an AI Recruitment Software designed to streamline the hiring processes for HR teams, recruitment agencies, and headhunters. It offers features such as candidate sourcing from various channels, applicant tracking system, recruitment CRM, candidate enrichment, AI recommendations, collaboration tools, reports & analytics, branded career page creation, support & assistance, data privacy compliance, and more. The platform provides customizable pipelines, AI-powered recommendations, candidate profile enrichment, mobile application access, and onboarding & placement management. Manatal aims to transform recruitment by leveraging AI technology to source and hire candidates faster and more efficiently.
SeekOut
SeekOut is an AI-powered platform designed to help organizations find the right candidates for open roles, develop their teams, and improve company culture. It offers features such as external talent sourcing, applicant review, pipeline insights, internal talent development, career compass, and talent intelligence. SeekOut is trusted by over 1,000 leading brands to recruit hard-to-find, diverse talent and manage talent acquisition and management in one platform. The platform integrates external data with HR systems to automatically build comprehensive profiles and provides data-driven insights to understand talent needs and prepare for the future.
Byterat
Byterat is a cloud-based platform that provides battery data management, visualization, and analytics. It offers an end-to-end data pipeline that automatically synchronizes, processes, and visualizes materials, manufacturing, and test data from all labs. Byterat also provides 24/7 access to experiments from anywhere in the world and integrates seamlessly with current workflows. It is customizable to specific cell chemistries and allows users to build custom visualizations, dashboards, and analyses. Byterat's AI-powered battery research has been published in leading journals, and its team has pioneered a new class of models that extract tell-tale signals of battery health from electrical signals to forecast future performance.
DVC
DVC is an open-source platform for managing machine learning data and experiments. It provides a unified interface for working with data from various sources, including local files, cloud storage, and databases. DVC also includes tools for versioning data and experiments, tracking metrics, and automating compute resources. DVC is designed to make it easy for data scientists and machine learning engineers to collaborate on projects and share their work with others.
Crustdata
Crustdata is a platform that provides real-time LinkedIn headcount and people data for making informed investment and sales decisions. It offers curated, dynamic data refreshed weekly to help users stay updated on company performance, sales dynamics, investment intelligence, and competitive intelligence. The platform enables users to track companies of interest, enrich CRM systems, and access various datasets related to web traffic, Google search impressions, product reviews, CEO and company reviews, investment data, SEO rankings, company news, and Form D filings. Additionally, Crustdata offers services to identify and fix data gaps, modernize data pipelines, and leverage AI for market mapping and competitor identification.
SID
SID is a data ingestion, storage, and retrieval pipeline that provides real-time context for AI applications. It connects to various data sources, handles authentication and permission flows, and keeps information up-to-date. SID's API allows developers to retrieve the right piece of data for a given task, enabling them to build AI apps that are fast, accurate, and scalable. With SID, developers can focus on building their products and leave the data management to SID.
ITSoli
ITSoli is an AI consulting firm that specializes in AI adoption, transformation, and data intelligence services. They offer custom AI models, data services, and strategic partnerships to help organizations innovate, automate, and accelerate their AI journey. With expertise in fine-tuning AI models and training custom agents, ITSoli aims to unlock the power of AI for businesses across various industries.
ITVA
ITVA is an AI automation tool for network infrastructure products that revolutionizes network management by enabling users to configure, query, and document their network using natural language. It offers features such as rapid configuration deployment, network diagnostics acceleration, automated diagram generation, and modernized IP address management. ITVA's unique solution securely connects to networks, combining real-time data with a proprietary dataset curated by veteran engineers. The tool ensures unparalleled accuracy and insights through its real-time data pipeline and on-demand dynamic analysis capabilities.
20 - Open Source AI Tools
airbyte
Airbyte is an open-source data integration platform that makes it easy to move data from any source to any destination. With Airbyte, you can build and manage data pipelines without writing any code. Airbyte provides a library of pre-built connectors that make it easy to connect to popular data sources and destinations. You can also create your own connectors using Airbyte's no-code Connector Builder or low-code CDK. Airbyte is used by data engineers and analysts at companies of all sizes to build and manage their data pipelines.
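For scripted use, Airbyte connectors can also be driven from Python via PyAirbyte (`pip install airbyte`); the sketch below uses the demo `source-faker` connector, and a real pipeline would swap in production sources and destinations.

```python
# Hedged sketch: run an Airbyte connector locally with PyAirbyte.
import airbyte as ab

source = ab.get_source(
    "source-faker",                 # demo connector that generates fake data
    config={"count": 1_000},
    install_if_missing=True,
)
source.check()                      # validate the connector configuration
source.select_all_streams()         # sync every stream the source exposes

result = source.read()              # records are cached locally (DuckDB by default)
users = result["users"].to_pandas()
print(users.head())
```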
airbyte-platform
Airbyte is an open-source data integration platform that makes it easy to move data from any source to any destination. With Airbyte, you can build and manage data pipelines without writing any code. Airbyte provides a library of pre-built connectors that make it easy to connect to popular data sources and destinations. You can also create your own connectors using Airbyte's low-code Connector Development Kit (CDK). Airbyte is used by data engineers and analysts at companies of all sizes to move data for a variety of purposes, including data warehousing, data analysis, and machine learning.
qb
QANTA is a system and dataset for question answering tasks. It provides a script to download datasets, preprocess questions, and match them with Wikipedia pages. The system includes various datasets, with training, dev, and test data in JSON and SQLite formats. Dependencies include Python 3.6, `click`, and NLTK models; Elasticsearch 5.6 is needed for the Guesser component. Configuration is managed through environment variables and YAML files. QANTA supports multiple guesser implementations that can be enabled or disabled. Running QANTA involves using `cli.py` and Luigi pipelines. The system accesses raw Wikipedia dumps for data processing. The QANTA ID numbering scheme categorizes datasets based on events and competitions.
mage-ai
Mage is an open-source data pipeline tool for transforming and integrating data. It offers an easy developer experience, engineering best practices built-in, and data as a first-class citizen. Mage makes it easy to build, preview, and launch data pipelines, and provides observability and scaling capabilities. It supports data integrations, streaming pipelines, and dbt integration.
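In a Mage project, pipelines are assembled from decorated Python blocks; the sketch below follows Mage's generated data-loader scaffolding, with the CSV URL as a placeholder.

```python
# Hedged sketch of a Mage data loader block (one file per block in a project).
import io

import pandas as pd
import requests

if 'data_loader' not in globals():
    from mage_ai.data_preparation.decorators import data_loader


@data_loader
def load_data(*args, **kwargs) -> pd.DataFrame:
    # Pull a CSV over HTTP and pass the DataFrame to the next block.
    url = 'https://example.com/data.csv'  # placeholder source
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return pd.read_csv(io.StringIO(response.text))
```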
Geoweaver
Geoweaver is an in-browser software that enables users to easily compose and execute full-stack data processing workflows using online spatial data facilities, high-performance computation platforms, and open-source deep learning libraries. It provides server management, code repository, workflow orchestration software, and history recording capabilities. Users can run it from both local and remote machines. Geoweaver aims to make data processing workflows manageable for non-coder scientists and preserve model run history. It offers features like progress storage, organization, SSH connection to external servers, and a web UI with Python support.
bionic-gpt
BionicGPT is an on-premise replacement for ChatGPT, offering the advantages of Generative AI while maintaining strict data confidentiality. BionicGPT can run on your laptop or scale into the data center.
awesome-mlops
Awesome MLOps is a curated list of tools related to Machine Learning Operations, covering areas such as AutoML, CI/CD for Machine Learning, Data Cataloging, Data Enrichment, Data Exploration, Data Management, Data Processing, Data Validation, Data Visualization, Drift Detection, Feature Engineering, Feature Store, Hyperparameter Tuning, Knowledge Sharing, Machine Learning Platforms, Model Fairness and Privacy, Model Interpretability, Model Lifecycle, Model Serving, Model Testing & Validation, Optimization Tools, Simplification Tools, Visual Analysis and Debugging, and Workflow Tools. The repository provides a comprehensive collection of tools and resources for individuals and teams working in the field of MLOps.
airflow-chart
This Helm chart bootstraps an Airflow deployment on a Kubernetes cluster using the Helm package manager. The version of this chart does not correlate to any other component, and users should not expect feature parity between the OSS airflow chart and the Astronomer airflow-chart for identical version numbers.

To install this Helm chart remotely (using Helm 3):

```sh
kubectl create namespace airflow
helm repo add astronomer https://helm.astronomer.io
helm install airflow --namespace airflow astronomer/airflow
```

To install this repository from source:

```sh
kubectl create namespace airflow
helm install --namespace airflow .
```

Prerequisites: Kubernetes 1.12+, Helm 3.6+, and PV provisioner support in the underlying infrastructure.

Installing the Chart:

```sh
helm install --name my-release .
```

The command deploys Airflow on the Kubernetes cluster in the default configuration. The Parameters section lists the parameters that can be configured during installation.

Upgrading the Chart: First, look at the updating documentation to identify any backwards-incompatible changes. To upgrade the chart with the release name `my-release`:

```sh
helm upgrade --name my-release .
```

Uninstalling the Chart: To uninstall/delete the `my-release` deployment:

```sh
helm delete my-release
```

The command removes all the Kubernetes components associated with the chart and deletes the release.

Updating DAGs (bake DAGs into the Docker image): The recommended way to update your DAGs with this chart is to build a new Docker image with the latest code (`docker build -t my-company/airflow:8a0da78 .`), push it to an accessible registry (`docker push my-company/airflow:8a0da78`), then update the Airflow pods with that image:

```sh
helm upgrade my-release . --set images.airflow.repository=my-company/airflow --set images.airflow.tag=8a0da78
```

Docker Images: The Airflow image referenced as the default value in this chart is generated from https://github.com/astronomer/ap-airflow. Other non-Airflow images used in this chart are generated from https://github.com/astronomer/ap-vendor.

Parameters: The complete list of parameters supported by the community chart can be found on the Parameters Reference page and can be set under the `airflow` key in this chart. The following table lists the configurable parameters of the Astronomer chart and their default values.

| Parameter | Description | Default |
| :--- | :--- | :--- |
| `ingress.enabled` | Enable Kubernetes Ingress support | `false` |
| `ingress.acme` | Add acme annotations to Ingress object | `false` |
| `ingress.tlsSecretName` | Name of secret that contains a TLS secret | `~` |
| `ingress.webserverAnnotations` | Annotations added to Webserver Ingress object | `{}` |
| `ingress.flowerAnnotations` | Annotations added to Flower Ingress object | `{}` |
| `ingress.baseDomain` | Base domain for VHOSTs | `~` |
| `ingress.auth.enabled` | Enable auth with Astronomer Platform | `true` |
| `extraObjects` | Extra K8s objects to deploy (these are passed through `tpl`). More about Extra Objects below. | `[]` |
| `sccEnabled` | Enable security context constraints required for OpenShift | `false` |
| `authSidecar.enabled` | Enable authSidecar | `false` |
| `authSidecar.repository` | The image for the auth sidecar proxy | `nginxinc/nginx-unprivileged` |
| `authSidecar.tag` | The image tag for the auth sidecar proxy | `stable` |
| `authSidecar.pullPolicy` | The K8s pullPolicy for the auth sidecar proxy image | `IfNotPresent` |
| `authSidecar.port` | The port the auth sidecar exposes | `8084` |
| `gitSyncRelay.enabled` | Enables the git sync relay feature | `False` |
| `gitSyncRelay.repo.url` | Upstream URL to the git repo to clone | `~` |
| `gitSyncRelay.repo.branch` | Branch of the upstream git repo to check out | `main` |
| `gitSyncRelay.repo.depth` | How many revisions to check out. Leave as default `1` except in dev where history is needed | `1` |
| `gitSyncRelay.repo.wait` | Seconds to wait before pulling from the upstream remote | `60` |
| `gitSyncRelay.repo.subPath` | Path to the dags directory within the git repository | `~` |

Specify each parameter using the `--set key=value[,key=value]` argument to `helm install`. For example:

```sh
helm install --name my-release --set executor=CeleryExecutor --set enablePodLaunching=false .
```

Walkthrough using kind:

1. Install kind and create a cluster. We recommend testing with Kubernetes 1.25+, for example: `kind create cluster --image kindest/node:v1.25.11`. Confirm it's up with `kubectl cluster-info --context kind-kind`.
2. Add Astronomer's Helm repo: `helm repo add astronomer https://helm.astronomer.io && helm repo update`.
3. Create the namespace and install the chart: `kubectl create namespace airflow && helm install airflow -n airflow astronomer/airflow`. It may take a few minutes. Confirm the pods are up with `kubectl get pods --all-namespaces` and `helm list -n airflow`.
4. Run `kubectl port-forward svc/airflow-webserver 8080:8080 -n airflow` to port-forward the Airflow UI to http://localhost:8080/ and confirm Airflow is working. Log in as _admin_ with password _admin_.

Build a Docker image from your DAGs:

1. Start a project using astro-cli, which will generate a Dockerfile and load your DAGs in. You can test locally before pushing to kind with `astro airflow start`: `mkdir my-airflow-project && cd my-airflow-project && astro dev init`
2. Then build the image: `docker build -t my-dags:0.0.1 .`
3. Load the image into kind: `kind load docker-image my-dags:0.0.1`
4. Upgrade the Helm deployment: `helm upgrade airflow -n airflow --set images.airflow.repository=my-dags --set images.airflow.tag=0.0.1 astronomer/airflow`

Extra Objects: This chart can deploy extra Kubernetes objects (assuming the role used by Helm can manage them). For Astronomer Cloud and Enterprise, the role permissions can be found in the Commander role.

```yaml
extraObjects:
  - apiVersion: batch/v1beta1
    kind: CronJob
    metadata:
      name: "{{ .Release.Name }}-somejob"
    spec:
      schedule: "*/10 * * * *"
      concurrencyPolicy: Forbid
      jobTemplate:
        spec:
          template:
            spec:
              containers:
                - name: myjob
                  image: ubuntu
                  command:
                    - echo
                  args:
                    - hello
              restartPolicy: OnFailure
```

Contributing: Check out our contributing guide! License: Apache 2.0 with Commons Clause.
Awesome-AI-Data-GitHub-Repos
Awesome AI & Data GitHub-Repos is a curated list of essential GitHub repositories covering the AI & ML landscape. It includes resources for Natural Language Processing, Large Language Models, Computer Vision, Data Science, Machine Learning, MLOps, Data Engineering, SQL & Database, and Statistics. The repository aims to provide a comprehensive collection of projects and resources for individuals studying or working in the field of AI and data science.
foundationallm
FoundationaLLM is a platform designed for deploying, scaling, securing, and governing generative AI in enterprises. It allows users to create AI agents grounded in enterprise data, integrate REST APIs, experiment with large language models, centrally manage AI agents and assets, deploy scalable vectorization data pipelines, enable non-developer users to create their own AI agents, control access with role-based access controls, and harness capabilities from Azure AI and Azure OpenAI. The platform simplifies integration with enterprise data sources, provides fine-grain security controls, load balances across multiple endpoints, and is extensible to new data sources and orchestrators. FoundationaLLM addresses the need for customized copilots or AI agents that are secure, licensed, flexible, and suitable for enterprise-scale production.
metaflow
Metaflow is a user-friendly library designed to assist scientists and engineers in developing and managing real-world data science projects. Initially created at Netflix, Metaflow aimed to enhance the productivity of data scientists working on diverse projects ranging from traditional statistics to cutting-edge deep learning. For further information, refer to Metaflow's website and documentation.
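A minimal Metaflow flow looks like the sketch below: each `@step` becomes a node in the DAG, and attributes set on `self` are checkpointed between steps. Run it with `python hello_flow.py run`.

```python
# Hedged sketch of a three-step Metaflow flow.
from metaflow import FlowSpec, step


class HelloFlow(FlowSpec):

    @step
    def start(self):
        self.numbers = [1, 2, 3]
        self.next(self.compute)

    @step
    def compute(self):
        self.total = sum(self.numbers)
        self.next(self.end)

    @step
    def end(self):
        print(f"total = {self.total}")


if __name__ == "__main__":
    HelloFlow()
```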
taipy
Taipy is an open-source Python library for easy, end-to-end application development, featuring what-if analyses, smart pipeline execution, built-in scheduling, and deployment tools.
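The sketch below follows Taipy's Core (scenario) API as of the 3.x releases; the data node ids, task, and function are placeholders, and exact entry points may differ between versions.

```python
# Hedged sketch: configure and run a tiny Taipy scenario.
import taipy as tp
from taipy import Config


def double(nb: int) -> int:
    return nb * 2


input_cfg = Config.configure_data_node("input", default_data=21)
output_cfg = Config.configure_data_node("output")
task_cfg = Config.configure_task("double_task", double, input_cfg, output_cfg)
scenario_cfg = Config.configure_scenario("demo_scenario", task_configs=[task_cfg])

if __name__ == "__main__":
    tp.Core().run()                            # start the orchestration service
    scenario = tp.create_scenario(scenario_cfg)
    scenario.submit()                          # execute the task graph
    print(scenario.output.read())              # -> 42
```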
dvc
DVC, or Data Version Control, is a command-line tool and VS Code extension that helps you develop reproducible machine learning projects. With DVC, you can version your data and models, iterate fast with lightweight pipelines, track experiments in your local Git repo, compare any data, code, parameters, model, or performance plots, and share experiments and automatically reproduce anyone's experiment.
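For the experiment-tracking side, DVC's companion library DVCLive (`pip install dvclive`) logs parameters and metrics that show up as DVC experiments in the local Git repo; the metric names and values below are illustrative.

```python
# Hedged sketch: track a toy training run with DVCLive.
from dvclive import Live

with Live() as live:
    live.log_param("lr", 0.01)
    for epoch in range(3):
        # ... train one epoch here ...
        live.log_metric("train/accuracy", 0.80 + 0.05 * epoch)
        live.next_step()
```

Runs logged this way can then be compared with `dvc exp show` or the DVC VS Code extension.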
vertex-ai-samples
The Google Cloud Vertex AI sample repository contains notebooks and community content that demonstrate how to develop and manage ML workflows using Google Cloud Vertex AI.
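The notebooks typically use the Vertex AI Python SDK (`google-cloud-aiplatform`); as a hedged sketch, submitting a custom training job looks roughly like this, with the project, region, bucket, script, and container URI all placeholders.

```python
# Hedged sketch: launch a custom training job on Vertex AI.
from google.cloud import aiplatform

aiplatform.init(
    project="my-gcp-project",           # placeholder project ID
    location="us-central1",
    staging_bucket="gs://my-staging",   # placeholder bucket
)

job = aiplatform.CustomTrainingJob(
    display_name="example-train",
    script_path="train.py",             # hypothetical local training script
    container_uri="us-docker.pkg.dev/vertex-ai/training/sklearn-cpu.1-0:latest",  # placeholder image
)
job.run(replica_count=1, machine_type="n1-standard-4")
```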
llm-app-stack
LLM App Stack, also known as Emerging Architectures for LLM Applications, is a comprehensive list of available tools, projects, and vendors at each layer of the LLM app stack. It covers various categories such as Data Pipelines, Embedding Models, Vector Databases, Playgrounds, Orchestrators, APIs/Plugins, LLM Caches, Logging/Monitoring/Eval, Validators, LLM APIs (proprietary and open source), App Hosting Platforms, Cloud Providers, and Opinionated Clouds. The repository aims to provide a detailed overview of tools and projects for building, deploying, and maintaining enterprise data solutions, AI models, and applications.
llmops-promptflow-template
LLMOps with Prompt flow is a template and guidance for building LLM-infused apps using Prompt flow. It provides centralized code hosting, lifecycle management, variant and hyperparameter experimentation, A/B deployment, many-to-many dataset/flow relationships, multiple deployment targets, comprehensive reporting, BYOF capabilities, configuration-based development, local prompt experimentation and evaluation, endpoint testing, and optional Human-in-loop validation. The tool is customizable to suit various application needs.
awesome-generative-ai
A curated list of Generative AI projects, tools, artworks, and models
20 - OpenAI GPTs
DataKitchen DataOps and Data Observability GPT
A specialist in DataOps and Data Observability, aiding in data management and monitoring.
Data Engineer Consultant
Guides in data engineering tasks with a focus on practical solutions.
AI Workload Optimizer
You've heard that AI can save you time but don't know how? Tell me what you do in a typical workweek, and I'll show you how!
Triage Management and Pipeline Architecture
Strategic advisor for triage management and pipeline optimization in business operations.
👑 Data Privacy for PI & Security Firms 👑
Private Investigators and Security Firms, given the nature of their work, handle highly sensitive information and must maintain strict confidentiality and data privacy standards.
👑 Data Privacy for Watch & Jewelry Designers 👑
Watchmakers and Jewelry Designers are high-end businesses dealing with valuable items and the personal details of clients, making data privacy and security paramount.
👑 Data Privacy for Event Management 👑
Event Management and Ticketing Services handle personal data such as names, contact details, and payment information for event registrations and ticket purchases.
Data Governance Advisor
Ensures data accuracy, consistency, and security across the organization.
👑 Data Privacy for Home Inspection & Appraisal 👑
Home Inspection and Appraisal Services have access to personal property and related information, requiring them to be vigilant about data privacy.
👑 Data Privacy for Freelancers & Independents 👑
Freelancers and Independent Consultants often handle client data, project specifics, and personal contact information, requiring them to be vigilant about data privacy.
👑 Data Privacy for Architecture & Construction 👑
Architecture and Construction Firms handle sensitive project data, client information, and architectural plans, necessitating strict data privacy measures.
👑 Data Privacy for Real Estate Agencies 👑
Real Estate Agencies and Brokers deal with personal data of clients, including financial information and preferences, requiring careful handling and protection of such data.
👑 Data Privacy for Spa & Beauty Salons 👑
Spa and Beauty Salons collect customer information, including personal details and treatment records, necessitating a high level of confidentiality and data protection.
Data Architect
Database Developer assisting with SQL/NoSQL, architecture, and optimization.