From 47c4c2992cbdb0e152179b3a930cf4e35651e1c1 Mon Sep 17 00:00:00 2001 From: Jim Allen Wallace Date: Fri, 10 Jan 2025 16:37:27 -0700 Subject: [PATCH 1/6] Changed name from RedisAI to redis-inference-optimization, and updated README --- README.md | 77 ++++++++++++++++++++++++++----------------------------- 1 file changed, 37 insertions(+), 40 deletions(-) diff --git a/README.md b/README.md index ba9a953e..3d95a455 100644 --- a/README.md +++ b/README.md @@ -1,43 +1,44 @@ -[![GitHub issues](https://img.shields.io/github/release/RedisAI/RedisAI.svg?sort=semver)](https://github.com/RedisAI/RedisAI/releases/latest) -[![CircleCI](https://circleci.com/gh/RedisAI/RedisAI/tree/master.svg?style=svg)](https://circleci.com/gh/RedisAI/RedisAI/tree/master) -[![Dockerhub](https://img.shields.io/badge/dockerhub-redislabs%2Fredisai-blue)](https://hub.docker.com/r/redislabs/redisai/tags/) -[![codecov](https://codecov.io/gh/RedisAI/RedisAI/branch/master/graph/badge.svg)](https://codecov.io/gh/RedisAI/RedisAI) -[![Total alerts](https://img.shields.io/lgtm/alerts/g/RedisAI/RedisAI.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/RedisAI/RedisAI/alerts/) -[![Forum](https://img.shields.io/badge/Forum-RedisAI-blue)](https://forum.redislabs.com/c/modules/redisai) +[![GitHub issues](https://img.shields.io/github/release/redis-inference-optimization/redis-inference-optimization.svg?sort=semver)](https://github.com/redis-inference-optimization/redis-inference-optimization/releases/latest) +[![CircleCI](https://circleci.com/gh/redis-inference-optimization/redis-inference-optimization/tree/master.svg?style=svg)](https://circleci.com/gh/redis-inference-optimization/redis-inference-optimization/tree/master) +[![Dockerhub](https://img.shields.io/badge/dockerhub-redislabs%2Fredis-inference-optimization-blue)](https://hub.docker.com/r/redislabs/redis-inference-optimization/tags/) 
+[![codecov](https://codecov.io/gh/redis-inference-optimization/redis-inference-optimization/branch/master/graph/badge.svg)](https://codecov.io/gh/redis-inference-optimization/redis-inference-optimization) +[![Total alerts](https://img.shields.io/lgtm/alerts/g/redis-inference-optimization/redis-inference-optimization.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/redis-inference-optimization/redis-inference-optimization/alerts/) +[![Forum](https://img.shields.io/badge/Forum-redis-inference-optimization-blue)](https://forum.redislabs.com/c/modules/redis-inference-optimization) [![Discord](https://img.shields.io/discord/697882427875393627?style=flat-square)](https://discord.gg/rTQm7UZ) > [!CAUTION] -> **RedisAI is no longer actively maintained or supported.** +> **redis-inference-optimization is no longer actively maintained or supported.** > -> We are grateful to the RedisAI community for their interest and support. +> We are grateful to the redis-inference-optimization community for their interest and support. +> Previously, redis-inference-optimization was named RedisAI, but was renamed in Jan 2025 to reduce confusion around Redis' other AI offerings. -# RedisAI -RedisAI is a Redis module for executing Deep Learning/Machine Learning models and managing their data. Its purpose is being a "workhorse" for model serving, by providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. **RedisAI both maximizes computation throughput and reduces latency by adhering to the principle of data locality**, as well as simplifies the deployment and serving of graphs by leveraging on Redis' production-proven infrastructure. +# redis-inference-optimization +Redis-inference-optimization is a Redis module for executing Deep Learning/Machine Learning models and managing their data. Its purpose is being a "workhorse" for model serving, by providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. 
**Redis-inference-optimization both maximizes computation throughput and reduces latency by adhering to the principle of data locality**, as well as simplifies the deployment and serving of graphs by leveraging on Redis' production-proven infrastructure. -To read RedisAI docs, visit [redisai.io](https://oss.redis.com/redisai/). To see RedisAI in action, visit the [demos page](https://oss.redis.com/redisai/examples/). +To see redis-inference-optimization in action, visit the [demos page](https://oss.redis.com/redis-inference-optimization/examples/). # Quickstart -RedisAI is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies. +redis-inference-optimization is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies. -The following sections describe how to get started with RedisAI. +The following sections describe how to get started with redis-inference-optimization. ## Docker -The quickest way to try RedisAI is by launching its official Docker container images. +The quickest way to try redis-inference-optimization is by launching its official Docker container images. ### On a CPU only machine ``` -docker run -p 6379:6379 redislabs/redisai:1.2.7-cpu-bionic +docker run -p 6379:6379 redislabs/redis-inference-optimization:1.2.7-cpu-bionic ``` ### On a GPU machine For GPU support you will need a machine that has the Nvidia driver (CUDA 11.3 and cuDNN 8.1), nvidia-container-toolkit, and Docker 19.03+ installed. For detailed information, check out the [nvidia-docker documentation](https://github.com/NVIDIA/nvidia-docker) ``` -docker run -p 6379:6379 --gpus all -it --rm redislabs/redisai:1.2.7-gpu-bionic +docker run -p 6379:6379 --gpus all -it --rm redislabs/redis-inference-optimization:1.2.7-gpu-bionic ``` ## Building -You can compile and build the module from its source code. 
The [Developer](https://oss.redis.com/redisai/developer/) page has more information about the design and implementation of the RedisAI module and how to contribute. +You can compile and build the module from its source code. ### Prerequisites * Packages: git, python3, make, wget, g++/clang, & unzip @@ -49,17 +50,17 @@ You can compile and build the module from its source code. The [Developer](https You can obtain the module's source code by cloning the project's repository using git like so: ```sh -git clone --recursive https://github.com/RedisAI/RedisAI +git clone --recursive https://github.com/redis-inference-optimization/redis-inference-optimization ``` Switch to the project's directory with: ```sh -cd RedisAI +cd redis-inference-optimization ``` ### Building the Dependencies -Use the following script to download and build the libraries of the various RedisAI backends (TensorFlow, PyTorch, ONNXRuntime) for CPU only: +Use the following script to download and build the libraries of the various redis-inference-optimization backends (TensorFlow, PyTorch, ONNXRuntime) for CPU only: ```sh bash get_deps.sh ``` @@ -72,14 +73,14 @@ bash get_deps.sh gpu ``` ### Building the Module -Once the dependencies have been built, you can build the RedisAI module with: +Once the dependencies have been built, you can build the redis-inference-optimization module with: ```sh make -C opt clean ALL=1 make -C opt ``` -Alternatively, run the following to build RedisAI with GPU support: +Alternatively, run the following to build redis-inference-optimization with GPU support: ```sh make -C opt clean ALL=1 @@ -88,17 +89,16 @@ make -C opt GPU=1 ### Backend Dependency -RedisAI currently supports PyTorch (libtorch), Tensorflow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between RedisAI and supported backends. This extremely important since the serialization mechanism of one version might not match with another. 
For making sure your model will work with a given RedisAI version, check with the backend documentation about incompatible features between the version of your backend and the version RedisAI is built with. +redis-inference-optimization currently supports PyTorch (libtorch), TensorFlow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between redis-inference-optimization and supported backends. This is extremely important, since the serialization mechanism of one version might not match with another. To make sure your model will work with a given redis-inference-optimization version, check the backend documentation for incompatible features between the version of your backend and the version redis-inference-optimization is built with. -| RedisAI | PyTorch | TensorFlow | TFLite | ONNXRuntime | +| redis-inference-optimization | PyTorch | TensorFlow | TFLite | ONNXRuntime | |:--------|:--------:|:----------:|:------:|:-----------:| | 1.0.3 | 1.5.0 | 1.15.0 | 2.0.0 | 1.2.0 | | 1.2.7 | 1.11.0 | 2.8.0 | 2.0.0 | 1.11.1 | | master | 1.11.0 | 2.8.0 | 2.0.0 | 1.11.1 | -Note: Keras and TensorFlow 2.x are supported through graph freezing. See [this script](http://dev.cto.redis.s3.amazonaws.com/RedisAI/test_data/tf2-minimal.py -) to see how to export a frozen graph from Keras and TensorFlow 2.x. +Note: Keras and TensorFlow 2.x are supported through graph freezing. ## Loading the Module To load the module upon starting the Redis server, simply use the `--loadmodule` command line switch, the `loadmodule` configuration directive or the [Redis `MODULE LOAD` command](https://redis.io/commands/module-load) with the path to the module's library. 
@@ -106,22 +106,22 @@ To load the module upon starting the Redis server, simply use the `--loadmodule` For example, to load the module from the project's path with a server command line switch, use the following: ```sh -redis-server --loadmodule ./install-cpu/redisai.so +redis-server --loadmodule ./install-cpu/redis-inference-optimization.so ``` ### Give it a try -Once loaded, you can interact with RedisAI using redis-cli. Basic information and examples for using the module is described [here](https://oss.redis.com/redisai/intro/#getting-started). +Once loaded, you can interact with redis-inference-optimization using redis-cli. Basic information and examples for using the module are described [here](https://oss.redis.com/redis-inference-optimization/intro/#getting-started). ### Client libraries -Some languages already have client libraries that provide support for RedisAI's commands. The following table lists the known ones: +Some languages already have client libraries that provide support for redis-inference-optimization's commands. 
The following table lists the known ones: | Project | Language | License | Author | URL | | ------- | -------- | ------- | ------ | --- | -| JRedisAI | Java | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/RedisAI/JRedisAI) | -| redisai-py | Python | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/RedisAI/redisai-py) | -| redisai-go | Go | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/RedisAI/redisai-go) | -| redisai-js | Typescript/Javascript | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/RedisAI/redisai-js) | +| Jredis-inference-optimization | Java | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/Jredis-inference-optimization) | +| redis-inference-optimization-py | Python | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/redis-inference-optimization-py) | +| redis-inference-optimization-go | Go | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/redis-inference-optimization-go) | +| redis-inference-optimization-js | Typescript/Javascript | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/redis-inference-optimization-js) | | redis-modules-sdk | TypeScript | BSD-3-Clause | [Dani Tseitlin](https://github.com/danitseitlin) | [Github](https://github.com/danitseitlin/redis-modules-sdk) | | redis-modules-java | Java | Apache-2.0 | [dengliming](https://github.com/dengliming) | [Github](https://github.com/dengliming/redis-modules-java) | | smartredis | C++ | BSD-2-Clause | [Cray Labs](https://github.com/CrayLabs) | [Github](https://github.com/CrayLabs/SmartRedis) | @@ -131,16 +131,13 @@ Some languages already have client libraries that provide support for RedisAI's -The full documentation for RedisAI's API can be found at the [Commands 
page](commands.md). - -## Documentation -Read the docs at [redisai.io](https://oss.redis.com/redisai/). +The full documentation for redis-inference-optimization's API can be found at the [Commands page](commands.md). ## Contact Us If you have questions, want to provide feedback or perhaps report an issue or [contribute some code](contrib.md), here's where we're listening to you: -* [Forum](https://forum.redis.com/c/modules/redisai) -* [Repository](https://github.com/RedisAI/RedisAI/issues) +* [Forum](https://forum.redis.com/c/modules/redis-inference-optimization) +* [Repository](https://github.com/redis-inference-optimization/redis-inference-optimization/issues) ## License -RedisAI is licensed under your choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1). +redis-inference-optimization is licensed under your choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1). From 45993804a68f30c23530f257535883a2d494b122 Mon Sep 17 00:00:00 2001 From: Jim Allen Wallace Date: Fri, 10 Jan 2025 16:41:59 -0700 Subject: [PATCH 2/6] Changed name from RedisAI to redis-inference-optimization, and updated README --- README.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index 3d95a455..2f3f3c45 100644 --- a/README.md +++ b/README.md @@ -7,12 +7,12 @@ [![Discord](https://img.shields.io/discord/697882427875393627?style=flat-square)](https://discord.gg/rTQm7UZ) > [!CAUTION] -> **redis-inference-optimization is no longer actively maintained or supported.** +> **Redis-inference-optimization is no longer actively maintained or supported.** > > We are grateful to the redis-inference-optimization community for their interest and support. > Previously, redis-inference-optimization was named RedisAI, but was renamed in Jan 2025 to reduce confusion around Redis' other AI offerings. 
-# redis-inference-optimization +# Redis-inference-optimization Redis-inference-optimization is a Redis module for executing Deep Learning/Machine Learning models and managing their data. Its purpose is being a "workhorse" for model serving, by providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. **Redis-inference-optimization both maximizes computation throughput and reduces latency by adhering to the principle of data locality**, as well as simplifies the deployment and serving of graphs by leveraging on Redis' production-proven infrastructure. To see redis-inference-optimization in action, visit the [demos page](https://oss.redis.com/redis-inference-optimization/examples/). From 11d45f6d0f345fa6975431cfbb50683303bd2037 Mon Sep 17 00:00:00 2001 From: jcaw07 <121062614+jcaw07@users.noreply.github.com> Date: Tue, 14 Jan 2025 17:55:54 -0700 Subject: [PATCH 3/6] Update README.md Co-authored-by: Benoit Dion <573574+benoitdion@users.noreply.github.com> --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index 2f3f3c45..c2d61489 100644 --- a/README.md +++ b/README.md @@ -50,7 +50,7 @@ You can compile and build the module from its source code. 
You can obtain the module's source code by cloning the project's repository using git like so: ```sh -git clone --recursive https://github.com/redis-inference-optimization/redis-inference-optimization +git clone --recursive https://github.com/RedisAI/redis-inference-optimization ``` Switch to the project's directory with: From 6270f90aafd8f4a91c480e412c7ba9563cd5560f Mon Sep 17 00:00:00 2001 From: jcaw07 <121062614+jcaw07@users.noreply.github.com> Date: Tue, 14 Jan 2025 17:56:07 -0700 Subject: [PATCH 4/6] Update README.md Co-authored-by: Benoit Dion <573574+benoitdion@users.noreply.github.com> --- README.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/README.md b/README.md index c2d61489..b2738b87 100644 --- a/README.md +++ b/README.md @@ -137,7 +137,7 @@ The full documentation for redis-inference-optimization's API can be found at th If you have questions, want to provide feedback or perhaps report an issue or [contribute some code](contrib.md), here's where we're listening to you: * [Forum](https://forum.redis.com/c/modules/redis-inference-optimization) -* [Repository](https://github.com/redis-inference-optimization/redis-inference-optimization/issues) +* [Repository](https://github.com/RedisAI/redis-inference-optimization/issues) ## License redis-inference-optimization is licensed under your choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1). From 1e95348d91b7efeac5e302807f55d55641fa083e Mon Sep 17 00:00:00 2001 From: Jim Allen Wallace Date: Tue, 14 Jan 2025 18:01:18 -0700 Subject: [PATCH 5/6] Removed broken link to demo. --- README.md | 2 -- 1 file changed, 2 deletions(-) diff --git a/README.md b/README.md index b2738b87..bb55bd64 100644 --- a/README.md +++ b/README.md @@ -15,8 +15,6 @@ # Redis-inference-optimization Redis-inference-optimization is a Redis module for executing Deep Learning/Machine Learning models and managing their data. 
Its purpose is being a "workhorse" for model serving, by providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. **Redis-inference-optimization both maximizes computation throughput and reduces latency by adhering to the principle of data locality**, as well as simplifies the deployment and serving of graphs by leveraging on Redis' production-proven infrastructure. -To see redis-inference-optimization in action, visit the [demos page](https://oss.redis.com/redis-inference-optimization/examples/). - # Quickstart redis-inference-optimization is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies. From 75775e061cb68aeacea07064a70c3fca6fef4066 Mon Sep 17 00:00:00 2001 From: Jim Allen Wallace Date: Thu, 16 Jan 2025 10:34:32 -0700 Subject: [PATCH 6/6] Reverted docker links to RedisAI. Removed external broken links. --- README.md | 40 +++++++++++----------------------------- 1 file changed, 11 insertions(+), 29 deletions(-) diff --git a/README.md b/README.md index bb55bd64..d071810a 100644 --- a/README.md +++ b/README.md @@ -1,22 +1,14 @@ -[![GitHub issues](https://img.shields.io/github/release/redis-inference-optimization/redis-inference-optimization.svg?sort=semver)](https://github.com/redis-inference-optimization/redis-inference-optimization/releases/latest) -[![CircleCI](https://circleci.com/gh/redis-inference-optimization/redis-inference-optimization/tree/master.svg?style=svg)](https://circleci.com/gh/redis-inference-optimization/redis-inference-optimization/tree/master) -[![Dockerhub](https://img.shields.io/badge/dockerhub-redislabs%2Fredis-inference-optimization-blue)](https://hub.docker.com/r/redislabs/redis-inference-optimization/tags/) -[![codecov](https://codecov.io/gh/redis-inference-optimization/redis-inference-optimization/branch/master/graph/badge.svg)](https://codecov.io/gh/redis-inference-optimization/redis-inference-optimization) -[![Total 
alerts](https://img.shields.io/lgtm/alerts/g/redis-inference-optimization/redis-inference-optimization.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/redis-inference-optimization/redis-inference-optimization/alerts/) -[![Forum](https://img.shields.io/badge/Forum-redis-inference-optimization-blue)](https://forum.redislabs.com/c/modules/redis-inference-optimization) -[![Discord](https://img.shields.io/discord/697882427875393627?style=flat-square)](https://discord.gg/rTQm7UZ) - > [!CAUTION] > **Redis-inference-optimization is no longer actively maintained or supported.** > > We are grateful to the redis-inference-optimization community for their interest and support. -> Previously, redis-inference-optimization was named RedisAI, but was renamed in Jan 2025 to reduce confusion around Redis' other AI offerings. +> Previously, redis-inference-optimization was named RedisAI, but was renamed in Jan 2025 to reduce confusion around Redis' other AI offerings. To learn more about Redis' current AI offerings, visit [the Redis website](https://redis.io/redis-for-ai). # Redis-inference-optimization Redis-inference-optimization is a Redis module for executing Deep Learning/Machine Learning models and managing their data. Its purpose is being a "workhorse" for model serving, by providing out-of-the-box support for popular DL/ML frameworks and unparalleled performance. **Redis-inference-optimization both maximizes computation throughput and reduces latency by adhering to the principle of data locality**, as well as simplifies the deployment and serving of graphs by leveraging on Redis' production-proven infrastructure. # Quickstart -redis-inference-optimization is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies. +Redis-inference-optimization is a Redis module. To run it you'll need a Redis server (v6.0.0 or greater), the module's shared library, and its dependencies. 
The following sections describe how to get started with redis-inference-optimization. @@ -24,14 +16,14 @@ The quickest way to try redis-inference-optimization is by launching its official Docker container images. ### On a CPU only machine ``` -docker run -p 6379:6379 redislabs/redis-inference-optimization:1.2.7-cpu-bionic +docker run -p 6379:6379 redislabs/redisai:1.2.7-cpu-bionic ``` ### On a GPU machine For GPU support you will need a machine that has the Nvidia driver (CUDA 11.3 and cuDNN 8.1), nvidia-container-toolkit, and Docker 19.03+ installed. For detailed information, check out the [nvidia-docker documentation](https://github.com/NVIDIA/nvidia-docker) ``` -docker run -p 6379:6379 --gpus all -it --rm redislabs/redis-inference-optimization:1.2.7-gpu-bionic +docker run -p 6379:6379 --gpus all -it --rm redislabs/redisai:1.2.7-gpu-bionic ``` @@ -87,7 +79,7 @@ make -C opt GPU=1 ### Backend Dependency -redis-inference-optimization currently supports PyTorch (libtorch), Tensorflow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between redis-inference-optimization and supported backends. This extremely important since the serialization mechanism of one version might not match with another. For making sure your model will work with a given redis-inference-optimization version, check with the backend documentation about incompatible features between the version of your backend and the version redis-inference-optimization is built with. +Redis-inference-optimization currently supports PyTorch (libtorch), TensorFlow (libtensorflow), TensorFlow Lite, and ONNXRuntime as backends. This section shows the version map between redis-inference-optimization and supported backends. This is extremely important, since the serialization mechanism of one version might not match with another. 
To make sure your model will work with a given redis-inference-optimization version, check the backend documentation for incompatible features between the version of your backend and the version redis-inference-optimization is built with. | redis-inference-optimization | PyTorch | TensorFlow | TFLite | ONNXRuntime | @@ -109,17 +101,17 @@ redis-server --loadmodule ./install-cpu/redis-inference-optimization.so ### Give it a try -Once loaded, you can interact with redis-inference-optimization using redis-cli. Basic information and examples for using the module is described [here](https://oss.redis.com/redis-inference-optimization/intro/#getting-started). +Once loaded, you can interact with redis-inference-optimization using redis-cli. ### Client libraries Some languages already have client libraries that provide support for redis-inference-optimization's commands. The following table lists the known ones: | Project | Language | License | Author | URL | | ------- | -------- | ------- | ------ | --- | -| Jredis-inference-optimization | Java | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/Jredis-inference-optimization) | -| redis-inference-optimization-py | Python | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/redis-inference-optimization-py) | -| redis-inference-optimization-go | Go | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/redis-inference-optimization-go) | -| redis-inference-optimization-js | Typescript/Javascript | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/redis-inference-optimization/redis-inference-optimization-js) | +| JRedisAI | Java | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/RedisAI/JRedisAI) | +| redisai-py | Python | BSD-3 | [RedisLabs](https://redislabs.com/) | 
[Github](https://github.com/RedisAI/redisai-py) | +| redisai-go | Go | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/RedisAI/redisai-go) | +| redisai-js | Typescript/Javascript | BSD-3 | [RedisLabs](https://redislabs.com/) | [Github](https://github.com/RedisAI/redisai-js) | | redis-modules-sdk | TypeScript | BSD-3-Clause | [Dani Tseitlin](https://github.com/danitseitlin) | [Github](https://github.com/danitseitlin/redis-modules-sdk) | | redis-modules-java | Java | Apache-2.0 | [dengliming](https://github.com/dengliming) | [Github](https://github.com/dengliming/redis-modules-java) | | smartredis | C++ | BSD-2-Clause | [Cray Labs](https://github.com/CrayLabs) | [Github](https://github.com/CrayLabs/SmartRedis) | @@ -127,15 +119,5 @@ Some languages already have client libraries that provide support for redis-infe | smartredis | Fortran | BSD-2-Clause | [Cray Labs](https://github.com/CrayLabs) | [Github](https://github.com/CrayLabs/SmartRedis) | | smartredis | Python | BSD-2-Clause | [Cray Labs](https://github.com/CrayLabs) | [Github](https://github.com/CrayLabs/SmartRedis) | - - -The full documentation for redis-inference-optimization's API can be found at the [Commands page](commands.md). - -## Contact Us -If you have questions, want to provide feedback or perhaps report an issue or [contribute some code](contrib.md), here's where we're listening to you: - -* [Forum](https://forum.redis.com/c/modules/redis-inference-optimization) -* [Repository](https://github.com/RedisAI/redis-inference-optimization/issues) - ## License -redis-inference-optimization is licensed under your choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1). +Redis-inference-optimization is licensed under your choice of the Redis Source Available License 2.0 (RSALv2) or the Server Side Public License v1 (SSPLv1).
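Editor's note on the command surface these patches leave untouched: the module's `AI.TENSORSET` command accepts tensor data either as `VALUES` or as a packed `BLOB`. The sketch below shows the blob layout a client would produce — float32 values, little-endian, flattened in row-major order — as documented for RedisAI-era clients such as redisai-py. The helper names `tensor_to_blob` and `blob_to_tensor` are hypothetical, introduced only for this illustration.

```python
import struct

def tensor_to_blob(values):
    """Pack a flat list of floats into a little-endian float32 blob."""
    return struct.pack("<%df" % len(values), *values)

def blob_to_tensor(blob):
    """Unpack a little-endian float32 blob back into a list of floats."""
    return list(struct.unpack("<%df" % (len(blob) // 4), blob))

# A 2x2 tensor flattened row-major: 4 float32 values -> 16 bytes.
blob = tensor_to_blob([1.0, 2.0, 3.0, 4.0])

# A client would send this blob alongside a command such as:
#   AI.TENSORSET mytensor FLOAT 2 2 BLOB <blob>
```

The blob carries no shape or dtype information of its own; both are supplied as separate arguments to `AI.TENSORSET`, so the byte count must equal the element count implied by the shape times the element size.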