
Set default backends path #924


Merged: alonre24 merged 9 commits into master from set_default_backends_path on Jun 9, 2022

Conversation

alonre24 (Collaborator) commented Jun 2, 2022

Set the backends' default path on module load, so that it can be retrieved with the AI.CONFIG GET command. Also, pass the module's path to the flow tests.
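For context, here is a minimal sketch of how a module can derive such a default path from its own shared-object location. The dladdr/dirname mechanism and the helper name are illustrative assumptions rather than this PR's exact code; only dyn_lib_dir_name mirrors the diff discussed below.

#define _GNU_SOURCE
#include <dlfcn.h>
#include <libgen.h>
#include <limits.h>
#include <stdio.h>
#include <string.h>
#include "redismodule.h"

// Sketch: resolve the directory of the loaded module and append "/backends".
static void GetDefaultBackendsPath(char **backends_path) {
    Dl_info info;
    // Resolve the file path of the shared object containing this function.
    if (dladdr((void *)GetDefaultBackendsPath, &info) == 0 || info.dli_fname == NULL) {
        *backends_path = NULL;
        return;
    }
    char module_path[PATH_MAX];
    strncpy(module_path, info.dli_fname, sizeof(module_path) - 1);
    module_path[sizeof(module_path) - 1] = '\0';
    char *dyn_lib_dir_name = dirname(module_path); // strip the .so file name
    size_t len = strlen(dyn_lib_dir_name) + strlen("/backends");
    *backends_path = RedisModule_Alloc(len + 1);
    sprintf(*backends_path, "%s/backends", dyn_lib_dir_name);
}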

alonre24 requested a review from DvirDukhan on June 2, 2022 11:42
alonre24 marked this pull request as draft on June 2, 2022 14:27
alonre24 marked this pull request as ready for review on June 2, 2022 17:31
codecov bot commented Jun 7, 2022

Codecov Report

Merging #924 (7549768) into master (2dd4ef1) will increase coverage by 0.46%.
The diff coverage is 96.57%.

❗ Current head 7549768 differs from pull request most recent head fcec66d. Consider uploading reports for the commit fcec66d to get more accurate results.

@@            Coverage Diff             @@
##           master     #924      +/-   ##
==========================================
+ Coverage   81.30%   81.77%   +0.46%     
==========================================
  Files          54       54              
  Lines        8201     8214      +13     
==========================================
+ Hits         6668     6717      +49     
+ Misses       1533     1497      -36     
Impacted Files | Coverage Δ
src/execution/DAG/dag_builder.c | 100.00% <ø> (ø)
src/execution/DAG/dag_op.c | 100.00% <ø> (ø)
src/execution/parsing/script_commands_parser.c | 78.57% <ø> (-0.12%) ⬇️
.../serialization/RDB/decoder/previous/v2/decode_v2.c | 46.37% <50.00%> (+0.45%) ⬆️
tests/module/LLAPI.c | 75.89% <80.00%> (-0.02%) ⬇️
src/execution/background_workers.c | 94.63% <95.23%> (+0.01%) ⬆️
src/redisai.c | 89.27% <97.14%> (+1.47%) ⬆️
src/backends/backends.c | 78.57% <100.00%> (+0.53%) ⬆️
...backends/libtorch_c/torch_extensions/torch_redis.h | 90.90% <100.00%> (ø)
src/backends/tensorflow.c | 72.43% <100.00%> (+0.06%) ⬆️
... and 18 more

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 5e4517d...fcec66d.

Comment on lines +103 to +105
size_t backends_default_path_len = strlen(dyn_lib_dir_name) + strlen("/backends");
*backends_path = RedisModule_Alloc(backends_default_path_len + 1);
RedisModule_Assert(sprintf(*backends_path, "%s/backends", dyn_lib_dir_name) > 0);
Collaborator:

RedisModule_CreateStringPrintf?

alonre24 (Author):

Is that better? We don't use HoldString or anything similar here, so I don't see what advantage a RedisModuleString object would give us.
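For comparison, the suggested alternative would look roughly like this (a sketch; it assumes a ctx is at hand), and the raw char pointer still has to be pulled back out of the RedisModuleString:

// Alternative sketch using RedisModule_CreateStringPrintf; note the extra
// object whose lifetime must now be managed.
RedisModuleString *path_str =
    RedisModule_CreateStringPrintf(ctx, "%s/backends", dyn_lib_dir_name);
size_t path_len;
const char *path = RedisModule_StringPtrLen(path_str, &path_len);
// ... use path, then release the string:
RedisModule_FreeString(ctx, path_str);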

@@ -150,7 +149,7 @@ int Config_SetIntraOperationParallelism(RedisModuleString *num_threads_string) {
int Config_SetModelChunkSize(RedisModuleString *chunk_size_string) {
long long val;
int result = RedisModule_StringToLongLong(chunk_size_string, &val);
-if (result != REDISMODULE_OK || val <= 0) {
+if (result != REDISMODULE_OK || val <= 0 || val > REDISAI_DEFAULT_MODEL_CHUNK_SIZE) {
Collaborator:

why the addition of this condition?

alonre24 (Author):

We need to limit the chunk_size, since Redis cannot reply with blobs that are larger than 512MB, right?

Collaborator:

redis/redis-doc#1653 (comment)
See here: you can configure the server to support bigger values.
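For reference, the server setting presumably being discussed there is proto-max-bulk-len, which caps a single bulk string and defaults to 512mb; the 1gb value below is only an example:

# redis.conf — raise the per-bulk-string cap above the 512mb default
proto-max-bulk-len 1gb

Or equivalently at runtime:

CONFIG SET proto-max-bulk-len 1gb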

@@ -28,6 +28,8 @@ typedef enum { RAI_DEVICE_CPU = 0, RAI_DEVICE_GPU = 1 } RAI_Device;
#define REDISAI_INFOMSG_MODEL_EXECUTION_TIMEOUT "Setting MODEL_EXECUTION_TIMEOUT parameter to"
#define REDISAI_INFOMSG_BACKEND_MEMORY_LIMIT "Setting BACKEND_MEMORY_LIMIT parameter to"

+#define REDISAI_DEFAULT_MODEL_CHUNK_SIZE (511 * 1024 * 1024)
Collaborator:

why the addition?

alonre24 (Author):

To use this constant in Config_SetModelChunkSize.

@@ -1424,6 +1429,8 @@ int RedisModule_OnLoad(RedisModuleCtx *ctx, RedisModuleString **argv, int argc)

RedisModule_SetModuleOptions(ctx, REDISMODULE_OPTIONS_HANDLE_IO_ERRORS);

RedisModule_SubscribeToServerEvent(ctx, RedisModuleEvent_Shutdown, RAI_CleanupModule);
Collaborator:

What is the minimal Redis version supporting this event? Does it align with the min_redis_version in our ramp files?

alonre24 (Author):

Yes, this API is supported from version 6.0.0, which is the minimal version for RedisAI anyway.
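For reference, RedisModule_SubscribeToServerEvent (available since Redis 6.0) expects a callback of the shape below; the body is an illustrative stub, not RedisAI's actual cleanup code:

// Callback shape expected by RedisModule_SubscribeToServerEvent; the body is a stub.
void RAI_CleanupModule(RedisModuleCtx *ctx, RedisModuleEvent eid,
                       uint64_t subevent, void *data) {
    REDISMODULE_NOT_USED(ctx);
    REDISMODULE_NOT_USED(eid);
    REDISMODULE_NOT_USED(subevent);
    REDISMODULE_NOT_USED(data);
    // Release backends, background workers, etc. before the server exits.
}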

@@ -567,7 +567,7 @@ def test_synchronization(self):

def launch_redis_and_run_onnx(con, proc_id, pipes):
my_pipe = pipes[proc_id]
-port = 6380 + proc_id # Let every subprocess run on a fresh port.
+port = 6380 + 30*proc_id # Let every subprocess run on a fresh port (safe distance for RLTEST parallelism).
Collaborator:

didn't you remove RLTest parallelism?

alonre24 (Author):

Yes... but I don't think this addition is bad, as we might want to use the parallelism in the future. WDYT?

alonre24 added ci-test and removed ci-test labels on Jun 9, 2022
alonre24 merged commit 1ea457a into master on Jun 9, 2022
alonre24 deleted the set_default_backends_path branch on June 9, 2022 11:20