UpdateHistoricoAlerta is a tool designed to automate the process of copying SQL script files generated by AlertTools and sending them to a server for database updates. Additionally, it allows for the deployment of incidence map images on the AlertaDengue homepage.
This playbook is responsible for preparing the hosts by creating directories and including another playbook for further tasks. It targets the 'localhost' machine to perform initial setup.
- Create Directories: Checks whether the output directory (`/tmp/sql/`) exists on the local machine (`localhost`) and creates it if it doesn't, ensuring the required directory structure is in place.
- Include a Play for the Tasks: Includes another playbook, `historico-alert-update.yaml`, which contains the specific tasks for updating historical alerts data. This is done with the `ansible.builtin.import_playbook` module, effectively extending the execution with the tasks from the included playbook.
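The preparation play described above could be sketched roughly as follows. Task names, the `mode` value, and other details beyond the `/tmp/sql/` path and the imported playbook name are illustrative assumptions, not the actual file contents:

```yaml
# Illustrative sketch of the preparation playbook (not the actual file)
- name: Prepare hosts for the historical alerts update
  hosts: localhost
  tasks:
    - name: Create the output directory if it does not exist
      ansible.builtin.file:
        path: /tmp/sql/
        state: directory
        mode: "0755"

# Import the playbook that holds the actual update tasks
- name: Run the historical alerts update tasks
  ansible.builtin.import_playbook: historico-alert-update.yaml
```

Note that `import_playbook` sits at the top level of the file, alongside the plays, rather than inside a task list.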
- `username`: This variable is set to the username of the user running the playbook on the local machine.
Note: This playbook serves as a preparation step before executing the more complex tasks related to updating historical alerts data, which are defined in the included playbook (`historico-alert-update.yaml`).
This playbook is responsible for updating historical alerts data on a cluster of hosts. It is designed to run tasks in parallel on the hosts specified in the 'cluster' group.
- Check SSH Availability: Checks that SSH is running on the specified `desired_port` for each host in the cluster, verifying SSH availability before proceeding with the other tasks.
- Configure Ansible Port: Dynamically sets `ansible_port` based on `desired_port`, ensuring that Ansible communicates with the correct SSH port on each host.
- Create Directories: Ensures that the required directories (`SRC_DIR`) for SQL files exist on the local machine (`localhost`), creating them if necessary.
- Copy SQL Files: SQL files from the role's scripts directory (`../ansible/roles/hist-alertas-update/scripts/sql/*.sql`) are copied to the local `SRC_DIR`. These SQL files are later used to update historical alerts data in the database.
- Copy Script to Server: The SQL file (`SQL_FNAME`) is copied to the destination directory (`DEST_DIR`) on each cluster node, and ownership and permissions are set on the copied file.
- Add Data to Database: Runs the SQL script against the PostgreSQL database, using the `psql` command to connect to the database and execute the SQL file.
- Execute Script on Server: Executes a Bash script (`SCRIPT_NAME`) on each cluster node. The script is detached with `nohup` so it continues running even if the SSH session ends, and it runs with elevated privileges as the `administrador` user.
- Remove SQL Scripts: SQL files are removed from both the local `SRC_DIR` and the `/tmp/sql/` directory on each cluster node after execution.
- Register Upload in Log File: Registers the update operation in a log file (`system_update_epiweeks.log`), appending a timestamp and the SQL filename to provide a record of the update process.
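The SSH check and port configuration at the start of this playbook could be implemented along these lines. The use of `wait_for` and `set_fact` is an assumption about how the tasks are written; only `desired_port` and `ansible_port` come from the description above:

```yaml
# Illustrative sketch of the SSH-availability and port-configuration tasks
- name: Check SSH availability on the desired port
  ansible.builtin.wait_for:
    host: "{{ inventory_hostname }}"
    port: "{{ desired_port }}"
    timeout: 10
  delegate_to: localhost

- name: Point Ansible at the verified SSH port
  ansible.builtin.set_fact:
    ansible_port: "{{ desired_port }}"
```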
- `desired_port`: The desired SSH port for the cluster nodes.
- `yearweek`: The year and week for which the historical alerts data are being updated.
- `disease`: The specific disease for which historical alerts data are being updated.
- `db_user`, `db_name`, `db_password`: Database connection details.
- `SRC_DIR`, `DEST_DIR`: Directories for storing SQL files on the local and remote machines.
- `SQL_FNAME`: The name of the SQL file, generated from the year, week, and disease.
- `SCRIPT_DIR`: The directory of the Bash script to be executed on the remote servers.
- `SCRIPT_NAME`: The name of the script to be executed on the remote servers.
- `LOG_PATH`: The path to the log file where update records are maintained.
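To show how these variables fit together, a hypothetical vars entry might look like the following. Every value here, including the `SQL_FNAME` naming pattern, is illustrative and not taken from the actual repository:

```yaml
# Hypothetical variables file — all values are examples, not the real defaults
desired_port: 22
yearweek: 202406
disease: dengue
SRC_DIR: /tmp/sql/
DEST_DIR: /tmp/sql/
SQL_FNAME: "historico_alerta_{{ disease }}_{{ yearweek }}.sql"
LOG_PATH: /var/log/ansible/
```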
This playbook is responsible for synchronizing map images from a source directory to a destination directory on remote servers and executing a script. It targets the 'cluster' group of hosts.
- Synchronize Map Images Directory: Uses the Ansible `synchronize` module to copy map images from the source directory (`SRC_DIR`) to the destination directory (`DEST_DIR`). This task runs on the local machine (`localhost`) and is delegated to the same machine.
- Execute Script on Server: Executes a shell script named `sync_incidence_maps.sh`, located in `SCRIPT_DIR`, on the remote servers. The task runs with elevated privileges (`become: true`), and the user executing the script is 'administrador'.
- Register Sent Images in the Log File: Appends an entry to a log file (`system_update_epiweeks.log`) located in `LOG_PATH`. The entry includes the current timestamp and the name of the script (`SCRIPT_NAME`) that was executed.
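The synchronization and script-execution steps above could be sketched as the following tasks. The module and option names are assumptions about the implementation; the variables and the script name come from this description:

```yaml
# Illustrative sketch of the map-image sync tasks (not the actual playbook)
- name: Synchronize map images to the destination directory
  ansible.posix.synchronize:
    src: "{{ SRC_DIR }}"
    dest: "{{ DEST_DIR }}"

- name: Execute the sync script on the server
  ansible.builtin.shell: "bash {{ SCRIPT_DIR }}/sync_incidence_maps.sh"
  become: true
  become_user: administrador
```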
- `SCRIPT_NAME`: The name of the script to be executed on the remote servers.
- `SCRIPT_DIR`: The directory path where the script is located.
- `ROLE_MAPS_DIR`: The relative path to the role's map images directory.
- `SRC_DIR`: The full path to the source directory containing the map images to be synchronized.
- `DEST_DIR`: The full path to the destination directory where the map images will be copied.
- `LOG_PATH`: The path to the directory where log files are stored.
Note: The playbook assumes that the `sync_incidence_maps.sh` script exists in the specified directory and that the necessary permissions to execute it are available.
This playbook updates the database and deploys container services by executing scripts on the server and logging the operations. It is designed for operations requiring database updates and container deployment across a cluster of hosts.
- Execute Update Script: Executes a script to update historical alert data in the database.
- Execute Staging Script: Deploys updated data to a staging environment for testing.
- Log Updates: Records the completion of the database update process in the system log.
- Log Deployment: Records the deployment of container services in the system log.
- `SCRIPT_UPDATE_TABLES`: The script for updating tables in the database.
- `SCRIPT_UPDATE_STAGING`: The script for updating the staging environment.
- `SCRIPT_PROD_DIR`: The production directory for AlertaDengue services.
- `SCRIPT_STAGING_DIR`: The staging directory for AlertaDengue services.
- `LOG_PATH`: The path where log files are stored.
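One update-and-log step from this playbook could look like the sketch below. The `shell` and `lineinfile` tasks and the log-line format are assumptions; the variables and log filename come from this document:

```yaml
# Illustrative sketch of one update-and-log step (not the actual playbook)
- name: Execute the table-update script
  ansible.builtin.shell: "bash {{ SCRIPT_PROD_DIR }}/{{ SCRIPT_UPDATE_TABLES }}"
  become: true

- name: Log completion of the database update
  ansible.builtin.lineinfile:
    path: "{{ LOG_PATH }}/system_update_epiweeks.log"
    line: "{{ ansible_date_time.iso8601 }} - {{ SCRIPT_UPDATE_TABLES }} completed"
    create: true
```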
Note: This playbook ensures that database updates and container deployments are executed smoothly and logged for audit and tracking purposes.
For development, we encourage you to use `conda`. If you don't know what that is, check these links:
- In Spanish: https://opensciencelabs.org/blog/como-instalar-y-comenzar-utilizar-conda/
- In English: https://cloudsmith.com/blog/what-is-conda/
We recommend using mamba-forge, a combination of miniconda + conda-forge + mamba. You can download it from here: https://github.com/conda-forge/miniforge#mambaforge
Here’s how to set up UpdateHistoricoAlerta for a local environment.
- Clone this repository locally:
$ git clone git@github.com:AlertaDengue/UpdateHistoricoAlerta.git
- Create a conda environment and activate it:
$ cd UpdateHistoricoAlerta
$ mamba env create --file conda/base.yaml
and
$ conda activate update-alertas
To use Makim, you need to have `makim` installed on your system. You can execute the defined targets by running the following command in your project directory:
makim <target>
# to visualize all makim commands execute:
makim --help
# to install makim autocompletion execute:
makim --install-autocompletion
Replace `<target>` with one of the following available targets:
- Creates a new vault configuration file using `ansible-vault create`. This file is used for securely storing sensitive data.
- Edits the existing vault configuration file using `ansible-vault edit`, letting you modify sensitive data securely.
- Changes the password of the vault configuration file for added security.
- Executes an Ansible playbook to update alerts. It activates the virtual environment, prompts for a vault password, and runs the `historico-alert-prepare-hosts.yaml` playbook with additional variables, such as `yearweek` and `disease`, passed via the `-e` option.
- Allows you to view the execution history of Ansible tasks. It activates the virtual environment and uses the `ansible` command to run `cat /var/log/ansible/system_update_epiweeks.log` on your servers to display the log.
- Executes an Ansible playbook to synchronize map images. Like the `update-alertas` target, it activates the virtual environment, prompts for a vault password, and runs the `incidence-map-upload.yaml` playbook to synchronize map images.
makim ansible.update-alertas --disease dengue --yearweek 202406 # Execute the playbook to update alerts.
makim ansible.sync-maps # Execute the playbook to synchronize the incidence map images.
makim ansible.create-vault-config # See the vault-template in the "ansible/config" directory to configure your variables.
makim ansible.history --logfile incidence_maps_update.log # View the log of incidence map updates.
makim ansible.containers-system-update # Execute the playbook to update the database and deploy container services.