diff --git a/content/operate/oss_and_stack/stack-with-enterprise/search/scalable-search.md b/content/operate/oss_and_stack/stack-with-enterprise/search/query-performance-factor.md
similarity index 70%
rename from content/operate/oss_and_stack/stack-with-enterprise/search/scalable-search.md
rename to content/operate/oss_and_stack/stack-with-enterprise/search/query-performance-factor.md
index f48457cfdf..2dced5729a 100644
--- a/content/operate/oss_and_stack/stack-with-enterprise/search/scalable-search.md
+++ b/content/operate/oss_and_stack/stack-with-enterprise/search/query-performance-factor.md
@@ -1,17 +1,18 @@
 ---
-Title: Enable scalable Redis Query Engine in Redis Enterprise
+Title: Configure the query performance factor for Redis Query Engine in Redis Enterprise
 alwaysopen: false
 categories:
 - docs
 - operate
 - stack
-description: Enable the scalable Redis Query Engine in Redis Enterprise to increase the performance of queries.
-linkTitle: Enable scalable Redis Query Engine
+description: Configure the query performance factor for Redis Query Engine in Redis Enterprise to increase the performance of queries.
+linkTitle: Configure query performance factor
 weight: 20
 aliases: /operate/oss_and_stack/stack-with-enterprise/search/scalable-search/
+  /operate/oss_and_stack/stack-with-enterprise/search/query-performance-factor/
 ---

-Redis Query Engine is a capability intended to increase the performance of queries, including [vector search]({{}}). When enabled, it allows you to increase a database's compute capacity and throughput by allocating more virtual CPUs per shard in addition to horizontal scaling with more shards. This document describes how to configure the Redis Query Engine.
+The query performance factor is intended to increase the performance of queries, including [vector search]({{}}). When enabled, it allows you to increase a database's compute capacity and throughput by allocating more virtual CPUs per shard, in addition to horizontal scaling with more shards. This document describes how to configure the query performance factor.

 {{}}
 Some use cases might not scale effectively. Redis experts can help determine if vertical scaling with the Redis Query Engine will boost performance for your use case and guide you on whether to use vertical scaling, horizontal scaling, or both.
@@ -31,13 +32,13 @@ If you do not have a cluster that supports Redis Query Engine, [install Redis En

 1. Calculate the RAM requirements using the [Index Size Calculator](https://redis.io/redisearch-sizing-calculator/). The total RAM required is the sum of the dataset and index sizes.

-1. [Determine the scaling factor](#calculate-scaling-factor) you want and the required number of CPUs. Unused CPUs, above the 20% necessary for Redis, can be used for the scalable Redis Query Engine.
+1. [Determine the query performance factor](#calculate-performance-factor) you want and the required number of CPUs. Unused CPUs, above the 20% reserved for Redis, can be used by the Redis Query Engine.

-1. Create a new Redis database with the number of CPUs configured for the scalable Redis Query Engine.
+1. Create a new Redis database with the number of CPUs configured for the query performance factor.

-## Calculate scaling factor
+## Calculate performance factor

-### CPUs for Redis Query Engine
+### CPUs for query performance factor

 Vertical scaling of the Redis Query Engine is achieved by provisioning additional CPUs for the search module. At least 20% of the available CPUs must be reserved for Redis internal processing.
 Use the following formula to define the maximum number of CPUs that can be allocated to search.
@@ -47,9 +48,9 @@ Vertical scaling of the Redis Query Engine is achieved by provisioning additiona
 | Redis internals | 20% |
 | Available CPUs for Redis Query Engine | floor(0.8 * x) |

-### Scale factor versus CPUs
+### Query performance factor versus CPUs

-The following table shows the number of CPUs required for each scale factor. This calculation is sensitive to how the search index and queries are defined. Certain scenarios might yield less throughput than the ratios in the following table.
+The following table shows the number of CPUs required for each query performance factor. This calculation is sensitive to how the search index and queries are defined. Certain scenarios might yield less throughput than the ratios in the following table suggest.

 | Scale factor | Minimum CPUs required for Redis Query Engine |
 |----------------|-----------------------------------------|
@@ -63,7 +64,7 @@ The following table shows the number of CPUs required for each scale factor. Thi
 | 14 | 21 |
 | 16 | 24 |

-### Example scale factor calculation
+### Example performance factor calculation

 | Variable | Value |
 |----------|-------|
@@ -72,13 +73,13 @@ The following table shows the number of CPUs required for each scale factor. Thi
 | Scale factor | 4x |
 | Minimum CPUs required for scale factor | 6 |

-## Enable scalable Redis Query Engine
+## Configure query performance factor

-To enable the scalable Redis Query Engine in Redis Enterprise, use the [REST API]({{}}) to create a new database or update an existing database.
+To configure the query performance factor in Redis Enterprise, use the [REST API]({{}}) to create a new database or update an existing database.

 ### Create new database

-To create a database with the scalable Redis Query Engine enabled, use the [create database REST API endpoint]({{}}) with a [BDB object]({{}}) that includes the following parameters:
+To create a database and configure the query performance factor, use the [create database REST API endpoint]({{}}) with a [BDB object]({{}}) that includes the following parameters:

 ```json
 {
@@ -91,11 +92,11 @@ To create a database with the scalable Redis Query Engine enabled, use the [crea
 }
 ```

-See [Calculate scaling factor](#calculate-scaling-factor) to determine the value to use for ``.
+See [Calculate performance factor](#calculate-performance-factor) to determine the value to use for ``.

 #### Example REST API request for a new database

-The following JSON is an example request body used to create a new database with the scalable Redis Query Engine enabled:
+The following JSON is an example request body used to create a new database with a 4x query performance factor configured:

 ```json
 {
@@ -127,18 +128,18 @@ curl -k -u ":" https://:9443/v1/bdbs -H "Content-Type:appl

 ### Update existing database

-To enable the scalable Redis Query Engine for an existing database, use the following REST API requests:
+To configure the query performance factor for an existing database, use the following REST API requests:

 - [Update database configuration]({{}}) to modify the DMC proxy.

-- [Upgrade module]({{}}) to set the search module’s scaling factor.
+- [Upgrade module]({{}}) to set the search module’s query performance factor.

 {{}}
 - Because this procedure also restarts the database shards, you should perform it during a maintenance period.
 - This procedure overwrites any existing module configuration parameters.
 {{}}

-The following example script uses both endpoints to configure a 4x scale factor:
+The following example script uses both endpoints to configure a 4x query performance factor:

 ```sh
 #!/bin/bash
@@ -167,7 +168,7 @@ curl -o /dev/null -s -k -u ":" https://:9443/v1/bdbs/$DB_I

 ## Monitoring Redis Query Engine

-To monitor a database with the scalable Redis Query Engine enabled:
+To monitor a database with a query performance factor configured:

 1. Integrate your Redis Enterprise deployment with Prometheus. See [Prometheus and Grafana with Redis Enterprise]({{}}) for instructions.

@@ -175,10 +176,10 @@ To monitor a database with the scalable Redis Query Engine enabled:

     The following Prometheus UI screenshot shows `redis_process_cpu_usage_percent` spikes for a database with two shards:

-    - 1st 100% spike: [`memtier_benchmark`](https://github.com/RedisLabs/memtier_benchmark) search test with the default scale factor (1 CPU per shard for search).
+    - 1st 100% spike: [`memtier_benchmark`](https://github.com/RedisLabs/memtier_benchmark) search test with the default configuration (no additional CPUs for search).

-    - 2nd 100% spike: reconfiguration and shard restart for a 4x scale factor.
+    - 2nd 100% spike: reconfiguration and shard restart for a 4x query performance factor.

-    - 3rd 600% spike: `memtier_benchmark` search test with threading at a 4x scale factor (6 CPUs per shard).
+    - 3rd 600% spike: `memtier_benchmark` search test with threading at a 4x query performance factor (6 CPUs per shard).

    {{The Prometheus graph shows three spikes for redis_process_cpu_usage_percent: 100%, another 100%, then 600%.}}
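For reviewers who want to check the renamed page's REST API instructions against a test cluster, the sketch below reads a database object back from the same `/v1/bdbs` endpoint the page references and prints the search module's arguments. It is not part of this change set, and the credentials (`<username>`, `<password>`), cluster address (`<host>`), database ID, the module name `search`, and the use of `jq` are all assumptions for illustration.

```sh
#!/bin/bash
# Hypothetical verification sketch, not part of the diff above.
# Assumes <username>, <password>, and <host> are replaced with real values,
# that the target database has ID 1, and that jq is installed.
DB_ID=1

# Fetch the BDB object and print the search module's arguments.
# Assumption: a configured query performance factor is reflected in the
# module_args of the module registered under the name "search".
curl -s -k -u "<username>:<password>" "https://<host>:9443/v1/bdbs/$DB_ID" |
  jq '.module_list[] | select(.module_name == "search") | .module_args'
```

If the command prints nothing, the database has no module matching the assumed name, so adjust the `select` filter to match your deployment.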