From 63c3fc1b61141ea6483c6b0bbdd998b6b7310b44 Mon Sep 17 00:00:00 2001 From: tdruez Date: Wed, 17 Jan 2024 10:40:16 -0700 Subject: [PATCH] Update the documentation with pipelines renaming #1044 Signed-off-by: tdruez --- docs/automation.rst | 2 +- docs/built-in-pipelines.rst | 10 +++---- docs/command-line-interface.rst | 2 +- docs/distros-os-images.rst | 19 ++++++------- docs/faq.rst | 24 +++++++++-------- docs/output-files.rst | 4 +-- docs/rest-api.rst | 12 ++++----- docs/tutorial_api_analyze_package_archive.rst | 8 +++--- docs/tutorial_cli_analyze_docker_image.rst | 27 ++++++++++--------- docs/tutorial_web_ui_analyze_docker_image.rst | 7 ++--- 10 files changed, 60 insertions(+), 55 deletions(-) diff --git a/docs/automation.rst b/docs/automation.rst index 612509840..86b8d5125 100644 --- a/docs/automation.rst +++ b/docs/automation.rst @@ -93,7 +93,7 @@ For instance, you can create a project and trigger it using the following comman crontab:: docker compose exec -it web scanpipe create-project scan-$(date +"%Y-%m-%dT%H:%M:%S") \ - --pipeline scan_package \ + --pipeline scan_single_package \ --input-url https://github.com/package-url/packageurl-python/archive/refs/heads/main.zip \ --execute diff --git a/docs/built-in-pipelines.rst b/docs/built-in-pipelines.rst index 704755adc..a738368e5 100644 --- a/docs/built-in-pipelines.rst +++ b/docs/built-in-pipelines.rst @@ -18,7 +18,7 @@ Pipeline Base Class :members: :member-order: bysource -.. _pipeline_docker: +.. _pipeline_analyze_docker_image: Analyse Docker Image -------------------- @@ -26,7 +26,7 @@ Analyse Docker Image :members: :member-order: bysource -.. _pipeline_root_filesystems: +.. _pipeline_analyze_root_filesystem: Analyze Root Filesystem or VM Image ----------------------------------- @@ -34,7 +34,7 @@ Analyze Root Filesystem or VM Image :members: :member-order: bysource -.. _pipeline_docker_windows: +.. 
_pipeline_analyze_windows_docker_image: Analyse Docker Windows Image ---------------------------- @@ -66,7 +66,7 @@ Load Inventory :members: :member-order: bysource -.. _pipeline_deploy_to_develop: +.. _pipeline_map_deploy_to_develop: Map Deploy To Develop --------------------- @@ -98,7 +98,7 @@ Scan Codebase Package :members: :member-order: bysource -.. _pipeline_scan_package: +.. _pipeline_scan_single_package: Scan Single Package ------------------- diff --git a/docs/command-line-interface.rst b/docs/command-line-interface.rst index cf2f88957..2a64f897a 100644 --- a/docs/command-line-interface.rst +++ b/docs/command-line-interface.rst @@ -172,7 +172,7 @@ You can use more than one ``PIPELINE_NAME`` to add multiple pipelines at once. For example, assuming you have created beforehand a project named "foo", this will add the docker pipeline to your project:: - $ scanpipe add-pipeline --project foo docker + $ scanpipe add-pipeline --project foo analyze_docker_image `$ scanpipe execute --project PROJECT` diff --git a/docs/distros-os-images.rst b/docs/distros-os-images.rst index 3e7ec63a0..44ff4ac3e 100644 --- a/docs/distros-os-images.rst +++ b/docs/distros-os-images.rst @@ -39,18 +39,19 @@ may be only used for certain pipelines: - **RPM-based** Linux distros: RHEL, Fedora, openSUSE/SUSE - **Alpine** Linux distros -For the above three flavors, the :ref:`docker <pipeline_docker>` and -:ref:`root_filesystems <pipeline_root_filesystems>` pipelines support comprehensive -detection of installed system packages, their provenance, their license metadata, -and their installed files. +For the above three flavors, the +:ref:`analyze_docker_image <pipeline_analyze_docker_image>` and +:ref:`analyze_root_filesystem_or_vm_image <pipeline_analyze_root_filesystem>` pipelines +support comprehensive detection of installed system packages, their provenance, +their license metadata, and their installed files. -- For **Windows**, the :ref:`docker_windows <pipeline_docker_windows>` pipeline supports - Windows Docker images with extensive detection of installed Windows packages, - programs, and the majority of installed files. 
+- For **Windows**, the :ref:`analyze_windows_docker_image <pipeline_analyze_windows_docker_image>` + pipeline supports Windows Docker images with extensive detection of installed Windows + packages, programs, and the majority of installed files. - **Distroless** Docker images system packages are detected with the - :ref:`docker <pipeline_docker>` pipeline; package and license metadata are also - detected. + :ref:`analyze_docker_image <pipeline_analyze_docker_image>` pipeline; package and + license metadata are also detected. However, some work needs to be done to achieve comprehensive support and fix the issue of system packages not tracking their installed files. Check `this open GitHub issue `_ diff --git a/docs/faq.rst b/docs/faq.rst index fcb6c7ac8..26b6e4974 100644 --- a/docs/faq.rst +++ b/docs/faq.rst @@ -23,22 +23,22 @@ Selecting the right pipeline for your needs depends primarily on the type of input data you have available. Here are some general guidelines based on different input scenarios: -- If you have a **Docker image** as input, use the :ref:`docker <pipeline_docker>` - pipeline. +- If you have a **Docker image** as input, use the + :ref:`analyze_docker_image <pipeline_analyze_docker_image>` pipeline. - For a full **codebase compressed as an archive**, choose the :ref:`scan_codebase <pipeline_scan_codebase>` pipeline. - If you have a **single package archive**, opt for the - :ref:`scan_package <pipeline_scan_package>` pipeline. + :ref:`scan_single_package <pipeline_scan_single_package>` pipeline. - When dealing with a **Linux root filesystem** (rootfs), the - :ref:`root_filesystems <pipeline_root_filesystems>` pipeline is the appropriate - choice. + :ref:`analyze_root_filesystem_or_vm_image <pipeline_analyze_root_filesystem>` pipeline + is the appropriate choice. - For processing the results of a **ScanCode-toolkit scan** or **ScanCode.io scan**, use the :ref:`load_inventory <pipeline_load_inventory>` pipeline. - When you have **manifest files**, such as a **CycloneDX BOM, SPDX document, lockfile**, etc., use the :ref:`inspect_packages <pipeline_inspect_packages>` pipeline. - For scenarios involving both a **development and deployment codebase**, consider using - the :ref:`deploy_to_develop <pipeline_deploy_to_develop>` pipeline. + the :ref:`map_deploy_to_develop <pipeline_map_deploy_to_develop>` pipeline. 
These pipelines will automatically execute the necessary steps to scan and create the packages, dependencies, and resources for your project based on the input data provided. @@ -56,10 +56,11 @@ by running some of the following additional pipelines: Please ensure that you have set up :ref:`PurlDB ` before running this pipeline. -What is the difference between scan_codebase and scan_single_package pipelines? ------------------------------------------------------------------------- +What is the difference between scan_codebase and scan_single_package pipelines? +------------------------------------------------------------------------------- -The key differences are that the :ref:`scan_package <pipeline_scan_package>` pipeline +The key differences are that the +:ref:`scan_single_package <pipeline_scan_single_package>` pipeline treats the input as if it were a single package, such as a package archive, and computes a **License clarity** and a **Scan summary** to aggregate the package scan data: @@ -116,8 +117,9 @@ The following tools and libraries are used during the docker images analysis pipeline: - Secondary libraries and plugins from `scancode-plugins `_. -The pipeline documentation is available at :ref:`pipeline_docker` and its source code -at `docker.py `_. +The pipeline documentation is available at :ref:`pipeline_analyze_docker_image` and +its source code at +`docker.py `_. It is hopefully designed to be simple and readable code. Am I able to run ScanCode.io on Windows? 
diff --git a/docs/output-files.rst b/docs/output-files.rst index ee3e7d1ea..2ed3ec051 100644 --- a/docs/output-files.rst +++ b/docs/output-files.rst @@ -69,7 +69,7 @@ as shown below ], "runs": [ { - "pipeline_name": "docker", + "pipeline_name": "analyze_docker_image", "description": "A pipeline to analyze a Docker image.", "uuid": "5f1ec0c5-91ed-45c8-ab3d-beae44018716", "created_date": "2021-06-13T00:50:18.367560Z", @@ -78,7 +78,7 @@ as shown below "task_end_date": "2021-06-13T01:20:56.486136Z", "task_exitcode": 0, "task_output": "", - "log": "2021-06-13 01:20:47.66 Pipeline [docker] starting\n2021-06-13 01:20:47.66 Step [extract_images] starting\n2021-06-13 01:20:47.72 Step [extract_images] completed in 0.05 seconds\n2021-06-13 01:20:47.72 Step [extract_layers] starting\n2021-06-13 01:20:47.84 Step [extract_layers] completed in 0.12 seconds\n2021-06-13 01:20:47.84 Step [find_images_linux_distro] starting\n2021-06-13 01:20:47.84 Step [find_images_linux_distro] completed in 0.00 seconds\n2021-06-13 01:20:47.85 Step [collect_images_information] starting\n2021-06-13 01:20:47.85 Step [collect_images_information] completed in 0.00 seconds\n2021-06-13 01:20:47.85 Step [collect_and_create_codebase_resources] starting\n2021-06-13 01:20:48.65 Step [collect_and_create_codebase_resources] completed in 0.79 seconds\n2021-06-13 01:20:48.65 Step [collect_and_create_system_packages] starting\n2021-06-13 01:20:50.89 Step [collect_and_create_system_packages] completed in 2.24 seconds\n2021-06-13 01:20:50.89 Step [flag_uninteresting_codebase_resources] starting\n2021-06-13 01:20:50.90 Step [tag_uninteresting_codebase_resources] completed in 0.00 seconds\n2021-06-13 01:20:50.90 Step [tag_empty_files] starting\n2021-06-13 01:20:50.91 Step [tag_empty_files] completed in 0.00 seconds\n2021-06-13 01:20:50.91 Step [scan_for_application_packages] starting\n2021-06-13 01:20:50.98 Step [scan_for_application_packages] completed in 0.07 seconds\n2021-06-13 01:20:50.98 Step [scan_for_files] 
starting\n2021-06-13 01:20:56.46 Step [scan_for_files] completed in 5.48 seconds\n2021-06-13 01:20:56.46 Step [analyze_scanned_files] starting\n2021-06-13 01:20:56.47 Step [analyze_scanned_files] completed in 0.00 seconds\n2021-06-13 01:20:56.47 Step [tag_not_analyzed_codebase_resources] starting\n2021-06-13 01:20:56.48 Step [tag_not_analyzed_codebase_resources] completed in 0.00 seconds\n2021-06-13 01:20:56.48 Pipeline completed\n", + "log": "2021-06-13 01:20:47.66 Pipeline [analyze_docker_image] starting\n2021-06-13 01:20:47.66 Step [extract_images] starting\n2021-06-13 01:20:47.72 Step [extract_images] completed in 0.05 seconds\n2021-06-13 01:20:47.72 Step [extract_layers] starting\n2021-06-13 01:20:47.84 Step [extract_layers] completed in 0.12 seconds\n2021-06-13 01:20:47.84 Step [find_images_linux_distro] starting\n2021-06-13 01:20:47.84 Step [find_images_linux_distro] completed in 0.00 seconds\n2021-06-13 01:20:47.85 Step [collect_images_information] starting\n2021-06-13 01:20:47.85 Step [collect_images_information] completed in 0.00 seconds\n2021-06-13 01:20:47.85 Step [collect_and_create_codebase_resources] starting\n2021-06-13 01:20:48.65 Step [collect_and_create_codebase_resources] completed in 0.79 seconds\n2021-06-13 01:20:48.65 Step [collect_and_create_system_packages] starting\n2021-06-13 01:20:50.89 Step [collect_and_create_system_packages] completed in 2.24 seconds\n2021-06-13 01:20:50.89 Step [flag_uninteresting_codebase_resources] starting\n2021-06-13 01:20:50.90 Step [tag_uninteresting_codebase_resources] completed in 0.00 seconds\n2021-06-13 01:20:50.90 Step [tag_empty_files] starting\n2021-06-13 01:20:50.91 Step [tag_empty_files] completed in 0.00 seconds\n2021-06-13 01:20:50.91 Step [scan_for_application_packages] starting\n2021-06-13 01:20:50.98 Step [scan_for_application_packages] completed in 0.07 seconds\n2021-06-13 01:20:50.98 Step [scan_for_files] starting\n2021-06-13 01:20:56.46 Step [scan_for_files] completed in 5.48 
seconds\n2021-06-13 01:20:56.46 Step [analyze_scanned_files] starting\n2021-06-13 01:20:56.47 Step [analyze_scanned_files] completed in 0.00 seconds\n2021-06-13 01:20:56.47 Step [tag_not_analyzed_codebase_resources] starting\n2021-06-13 01:20:56.48 Step [tag_not_analyzed_codebase_resources] completed in 0.00 seconds\n2021-06-13 01:20:56.48 Pipeline completed\n", "execution_time": 8 } ], diff --git a/docs/rest-api.rst b/docs/rest-api.rst index 776133e4a..a5f27c373 100644 --- a/docs/rest-api.rst +++ b/docs/rest-api.rst @@ -94,7 +94,7 @@ Using cURL: data='{ "name": "project_name", "input_urls": "https://download.url/package.archive", - "pipeline": "scan_package", + "pipeline": "scan_single_package", "execute_now": true }' @@ -111,7 +111,7 @@ Using cURL: upload_file="/path/to/the/archive.zip" curl -F "name=project_name" \ - -F "pipeline=scan_package" \ + -F "pipeline=scan_single_package" \ -F "execute_now=True" \ -F "upload_file=@$upload_file" \ "$api_url" @@ -131,7 +131,7 @@ Using Python and the **"requests"** library: data = { "name": "project_name", "input_urls": "https://download.url/package.archive", - "pipeline": "scan_package", + "pipeline": "scan_single_package", "execute_now": True, } response = requests.post(api_url, data=data) @@ -149,7 +149,7 @@ Using Python and the **"requests"** library: api_url = "http://localhost/api/projects/" data = { "name": "project_name", - "pipeline": "scan_package", + "pipeline": "scan_single_package", "execute_now": True, } files = {"upload_file": open("/path/to/the/archive.zip", "rb")} @@ -279,7 +279,7 @@ Using cURL: api_url="http://localhost/api/projects/6461408c-726c-4b70-aa7a-c9cc9d1c9685/add_pipeline/" content_type="Content-Type: application/json" data='{ - "pipeline": "docker", + "pipeline": "analyze_docker_image", "execute_now": true }' @@ -434,7 +434,7 @@ The run details view returns all information available about a pipeline run. 
{ "url": "http://localhost/api/runs/8d5c3962-5fca-47d7-b8c8-47a19247714e/", - "pipeline_name": "scan_package", + "pipeline_name": "scan_single_package", "status": "success", "description": "A pipeline to scan a single package archive with ScanCode-toolkit.", "project": "http://localhost/api/projects/cd5b0459-303f-4e92-99c4-ea6d0a70193e/", diff --git a/docs/tutorial_api_analyze_package_archive.rst b/docs/tutorial_api_analyze_package_archive.rst index dd1f9406d..010c89cda 100644 --- a/docs/tutorial_api_analyze_package_archive.rst +++ b/docs/tutorial_api_analyze_package_archive.rst @@ -15,7 +15,7 @@ Instructions: - First, let's create a new project called ``boolean.py-3.8``. - We'll be using this `package `_ as the project input. -- We can add and execute the scan_package pipeline on our new project. +- We can add and execute the scan_single_package pipeline on our new project. .. note:: Whether you follow this tutorial and previous instructions using cURL or @@ -33,7 +33,7 @@ Using cURL data='{ "name": "boolean.py-3.8", "input_urls": "https://github.com/bastikr/boolean.py/archive/refs/tags/v3.8.zip", - "pipeline": "scan_package", + "pipeline": "scan_single_package", "execute_now": true }' @@ -52,7 +52,7 @@ Using cURL { "name": "boolean.py-3.8", "input_urls": "https://github.com/bastikr/boolean.py/archive/refs/tags/v3.8.zip", - "pipeline": "scan_package", + "pipeline": "scan_single_package", "execute_now": true } @@ -100,7 +100,7 @@ Using Python script data = { "name": "boolean.py-3.8", "input_urls": "https://github.com/bastikr/boolean.py/archive/refs/tags/v3.8.zip", - "pipeline": "scan_package", + "pipeline": "scan_single_package", "execute_now": True, } response = requests.post(api_url, data=data) diff --git a/docs/tutorial_cli_analyze_docker_image.rst b/docs/tutorial_cli_analyze_docker_image.rst index 81fbc2e73..0d3f0ef3a 100644 --- a/docs/tutorial_cli_analyze_docker_image.rst +++ b/docs/tutorial_cli_analyze_docker_image.rst @@ -79,15 +79,15 @@ Instructions 
Alternatively, you can copy files manually to the :guilabel:`input/` directory to include entire directories. -- Add the docker pipeline to your project: +- Add the ``analyze_docker_image`` pipeline to your project: .. code-block:: console - $ scanpipe add-pipeline --project staticbox docker + $ scanpipe add-pipeline --project staticbox analyze_docker_image .. code-block:: console - >> Pipeline docker added to the project + >> Pipeline analyze_docker_image added to the project - Check the status of the pipeline added to your project: @@ -97,7 +97,7 @@ Instructions .. code-block:: console - >> [NOT_STARTED] docker + >> [NOT_STARTED] analyze_docker_image .. note:: The ``scanpipe show-pipeline`` command lists all the pipelines added to the @@ -106,8 +106,8 @@ Instructions already running, pipelines with **"SUCCESS"** or **"FAILURE"** status, and those will be running next, pipelines with **"NOT_STARTED"** status as shown below. -- Run the docker pipeline on this project. In the output, you will be shown - the pipeline's execution progress: +- Run the ``analyze_docker_image`` pipeline on this project. In the output, you will be + shown the pipeline's execution progress: .. code-block:: console @@ -115,17 +115,17 @@ Instructions .. code-block:: console - >> Pipeline docker run in progress... - Pipeline [docker] starting + >> Pipeline analyze_docker_image run in progress... + Pipeline [analyze_docker_image] starting Step [extract_images] starting Step [extract_images] completed in 0.18 seconds Step [extract_layers] starting [...] Pipeline completed - docker successfully executed on project staticbox + analyze_docker_image successfully executed on project staticbox - Executing the ``show-pipeline`` command again will also confirm the success - of the pipeline execution - **"[SUCCESS] docker"** status: + of the pipeline execution - **"[SUCCESS] analyze_docker_image"** status: .. code-block:: console @@ -133,7 +133,7 @@ Instructions .. 
code-block:: console - >> [SUCCESS] docker + >> [SUCCESS] analyze_docker_image - Get the results of the pipeline execution as a JSON file using the ``output`` command: @@ -155,11 +155,12 @@ Instructions after the project creation. For example, the following command will create a project named ``staticbox2``, download the test Docker image to the project's :guilabel:`input/` - directory, add the docker pipeline, and execute the pipeline in one operation: + directory, add the ``analyze_docker_image`` pipeline, and execute the pipeline in + one operation: .. code-block:: bash $ scanpipe create-project staticbox2 \ --input-url https://github.com/nexB/scancode.io-tutorial/releases/download/sample-images/30-alpine-nickolashkraus-staticbox-latest.tar \ - --pipeline analyze_docker_image \ --execute diff --git a/docs/tutorial_web_ui_analyze_docker_image.rst b/docs/tutorial_web_ui_analyze_docker_image.rst index 5374e3226..0f08fe37f 100644 --- a/docs/tutorial_web_ui_analyze_docker_image.rst +++ b/docs/tutorial_web_ui_analyze_docker_image.rst @@ -38,9 +38,10 @@ Instructions - Paste the input Docker image's URL, `docker://alpine/httpie `_, in the **"Download URL"** field, which fetches the image from the provided URL. -- Use the **"Pipeline"** dropdown list, add the **"docker"** pipeline to your project -- You can add and execute the docker pipeline in one operation by checking the - **"Execute pipeline now"** checkbox. +- Use the **"Pipeline"** dropdown list to add the ``analyze_docker_image`` pipeline to + your project. +- You can add and execute the ``analyze_docker_image`` pipeline in one operation by + checking the **"Execute pipeline now"** checkbox. .. image:: images/tutorial-web-ui-project-form.png
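For downstream scripts, crontabs, or API clients that still reference the pre-rename pipeline names, the renames applied throughout this patch can be captured in a small lookup table. The sketch below is illustrative only and not part of ScanCode.io itself: the ``PIPELINE_RENAMES`` mapping and the two helper functions are hypothetical names, but the old-to-new pairs are taken directly from the renames in this patch.

```python
# Old -> new ScanCode.io pipeline names, per this renaming patch.
PIPELINE_RENAMES = {
    "docker": "analyze_docker_image",
    "docker_windows": "analyze_windows_docker_image",
    "root_filesystems": "analyze_root_filesystem_or_vm_image",
    "deploy_to_develop": "map_deploy_to_develop",
    "scan_package": "scan_single_package",
}


def migrate_pipeline_name(name: str) -> str:
    """Return the renamed pipeline name, passing already-current names through."""
    return PIPELINE_RENAMES.get(name, name)


def migrate_payload(payload: dict) -> dict:
    """Rewrite the "pipeline" field of a create-project API payload, if present."""
    updated = dict(payload)
    if "pipeline" in updated:
        updated["pipeline"] = migrate_pipeline_name(updated["pipeline"])
    return updated
```

Running such a helper over stored payloads before POSTing them to the ``/api/projects/`` endpoint would keep older automation working against a ScanCode.io release that only accepts the new names.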