
Welcome to Deepchecks!

Deepchecks is a holistic tool for testing, validating, and monitoring your machine learning models and data throughout the model's lifecycle. It enables you to identify problems with your data's quality and distributions, and with your model's performance, with minimal effort.

See the Deepchecks Components for Continuous Validation section for more info, along with direct links to the documentation of each component.

Get Started with Deepchecks Testing

Deepchecks Testing Suite of Checks


🏃‍♀️ Quickstarts 🏃‍♀️

Downloadable end-to-end guides, demonstrating how to start testing your data & model in just a few minutes.

💁‍♂️ Get Help & Give Us Feedback 💁

Links for how to interact with us via our Slack Community or by opening an issue on Github.

💻 Install 💻

Full installation guide (a quick version is also included in the quickstarts).

🤓 General: Concepts & Guides 🤓

A comprehensive view of Deepchecks concepts, customizations, and core use cases.

🔢 Tabular 🔢

Quickstarts, main concepts, a checks gallery, and end-to-end guides demonstrating how to start working with Deepchecks on tabular data and models.

🔤️ NLP 🔤️

Quickstarts, main concepts, a checks gallery, and end-to-end guides demonstrating how to start working with Deepchecks on textual data. More functionality coming in future releases!

🎦‍ Computer Vision (Note: in Beta Release) 🎦‍

Quickstarts, main concepts, a checks gallery, and end-to-end guides demonstrating how to start working with Deepchecks on CV data and models. Built-in support for PyTorch, TensorFlow, and custom frameworks.

🚀 Interactive Checks Demo 🚀

Play with some of the existing tabular checks and see how they behave on various datasets with custom corruptions injected.

🤖 API Reference 🤖

Reference and links to source code for Deepchecks Testing's components.

🏃‍♀️ Testing Quickstarts 🏃‍♀️


🔢 Tabular 🔢

🎦‍ Vision 🎦‍ (in Beta)

🔤️ NLP 🔤️ (in Alpha)

Deepchecks' Components


Testing Docs (Here)

Tests during research and model development

CI Docs

Tests before deploying the model to production

Monitoring Docs

Tests and continuous monitoring during production

Deepchecks accompanies you through various testing needs, such as verifying your data's integrity, inspecting its distributions, validating data splits, and evaluating and comparing models, throughout the model's lifecycle.

Phases for Continuous Validation of ML Models and Data

Deepchecks' continuous validation approach is based on testing the ML models and data throughout the different phases using the exact same checks, enabling a simple, elaborate, and seamless experience for configuring and consuming the results. Each phase has its relevant interfaces (e.g., visual outputs, JSON output results, alert configuration, push notifications, RCA, etc.) for interacting with the test results.

Get Help & Give Us Feedback

Join Our Community 👋

In addition to perusing the documentation, feel free to join our Slack Community or open an issue on Github.

To support us, please give us a star on ⭐️ Github ⭐️; it really means a lot to open source projects!