Code Refactor `ruff check --fix --extend-select I` #73

Merged 1 commit on Jun 16, 2024
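For context on the title: Ruff's `I` selector enables the isort import-sorting rules, so `ruff check --fix --extend-select I` reorders and groups imports across the repository. Below is a minimal sketch of the kind of change this makes; the module is hypothetical and shown only to illustrate the ordering, and the command in the comment assumes Ruff is installed locally.

```python
# Command from the PR title, run at the repository root:
#   ruff check --fix --extend-select I
#
# Before the fix, imports might appear in arbitrary order:
#   import requests
#   import os
#   import re
#
# After the fix, standard-library imports come first and third-party imports
# second, each group alphabetized and separated by a blank line:
import os
import re

import requests

# Imports are left unused here purely to illustrate the ordering Ruff enforces.
```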
90 changes: 52 additions & 38 deletions .github/workflows/check_domains.yml
@@ -6,7 +6,7 @@ name: Check Domains
on:
schedule:
# Runs every day at 05:00 UTC
- cron: '0 5 * * *'
- cron: "0 5 * * *"
workflow_dispatch:

jobs:
@@ -15,46 +15,60 @@ jobs:
strategy:
fail-fast: false
matrix:
domain: ["ultralytics.com", "ultralitics.com", "ultralytics.ai", "ultralytics.app", "ultralytics.eu", "ultralytics.es", "ultralytics.io", "ultralytics.net", "ultralytics.org", "yolov5.com", "pjreddie.org", "pjreddie.net"]
domain:
[
"ultralytics.com",
"ultralitics.com",
"ultralytics.ai",
"ultralytics.app",
"ultralytics.eu",
"ultralytics.es",
"ultralytics.io",
"ultralytics.net",
"ultralytics.org",
"yolov5.com",
"pjreddie.org",
"pjreddie.net",
]
prefix: ["www.", ""]
steps:
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.x"
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install requests
- name: Check domain redirections
shell: python
run: |
import requests
def check_domain_redirection(domain, prefix, max_attempts=3):
"""
Check if the given domain redirects to ultralytics.com correctly, with up to 3 retries.
"""
url = f"https://{prefix}{domain}"
print(f"\nChecking {url}")
for attempt in range(max_attempts):
try:
response = requests.get(url, allow_redirects=True)
response.raise_for_status()
final_url = response.url
if "ultralytics.com" in final_url and response.status_code == 200:
print("Success βœ…")
return True
except requests.RequestException as e:
print(f"Error: {e}")
if attempt == max_attempts - 1:
print(f"Failed after {max_attempts} attempts ❌.")
return False
return False
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.x"
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install requests
- name: Check domain redirections
shell: python
run: |
import requests

def check_domain_redirection(domain, prefix, max_attempts=3):
"""
Check if the given domain redirects to ultralytics.com correctly, with up to 3 retries.
"""
url = f"https://{prefix}{domain}"
print(f"\nChecking {url}")
for attempt in range(max_attempts):
try:
response = requests.get(url, allow_redirects=True)
response.raise_for_status()
final_url = response.url
if "ultralytics.com" in final_url and response.status_code == 200:
print("Success βœ…")
return True
except requests.RequestException as e:
print(f"Error: {e}")
if attempt == max_attempts - 1:
print(f"Failed after {max_attempts} attempts ❌.")
return False
return False

success = check_domain_redirection('${{ matrix.domain }}', '${{ matrix.prefix }}')
if not success:
raise Exception(f"Domain check failed for ${{ matrix.domain }} with prefix '${{ matrix.prefix }}'")
success = check_domain_redirection('${{ matrix.domain }}', '${{ matrix.prefix }}')
if not success:
raise Exception(f"Domain check failed for ${{ matrix.domain }} with prefix '${{ matrix.prefix }}'")

Summary:
runs-on: ubuntu-latest
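As an aside, the redirection check in this workflow can be exercised outside of GitHub Actions. Below is a minimal local sketch, assuming the `requests` package is installed; the domain list is shortened purely for illustration and the workflow script's retry logic is omitted.

```python
import requests

# Subset of the workflow's domain/prefix matrix, shortened for illustration.
domains = ["ultralytics.com", "ultralitics.com", "ultralytics.ai"]
prefixes = ["www.", ""]


def redirects_to_ultralytics(domain: str, prefix: str) -> bool:
    """Return True if https://{prefix}{domain} ends up on ultralytics.com with HTTP 200."""
    url = f"https://{prefix}{domain}"
    try:
        response = requests.get(url, allow_redirects=True, timeout=10)
        return "ultralytics.com" in response.url and response.status_code == 200
    except requests.RequestException as error:
        print(f"Error for {url}: {error}")
        return False


# Expand the pairs the same way the workflow's strategy.matrix does.
for domain in domains:
    for prefix in prefixes:
        status = "βœ…" if redirects_to_ultralytics(domain, prefix) else "❌"
        print(f"{prefix}{domain}: {status}")
```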
2 changes: 1 addition & 1 deletion .github/workflows/download_websites.yml
@@ -6,7 +6,7 @@ name: Download Websites
on:
workflow_dispatch:
schedule:
- cron: '0 0 * * *' # runs at 00:00 UTC every day
- cron: "0 0 * * *" # runs at 00:00 UTC every day
push:
branches:
- gh-pages
4 changes: 2 additions & 2 deletions .github/workflows/format.yml
@@ -6,9 +6,9 @@ name: Ultralytics Actions

on:
push:
branches: [main,master]
branches: [main, master]
pull_request:
branches: [main,master]
branches: [main, master]

jobs:
format:
6 changes: 3 additions & 3 deletions .github/workflows/links.yml
@@ -12,7 +12,7 @@ name: Check Broken links
on:
workflow_dispatch:
schedule:
- cron: '0 0 * * *' # runs at 00:00 UTC every day
- cron: "0 0 * * *" # runs at 00:00 UTC every day
push:
branches:
- main
@@ -27,12 +27,12 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
branch: [main, gh-pages] # Define the branches to test
branch: [main, gh-pages] # Define the branches to test
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
ref: ${{ matrix.branch }} # Use matrix to check out the correct branch
ref: ${{ matrix.branch }} # Use matrix to check out the correct branch

- name: Download and install lychee
run: |
14 changes: 7 additions & 7 deletions .github/workflows/sitemaps.yml
@@ -24,13 +24,13 @@ jobs:
- name: Checkout Repo
uses: actions/checkout@v4
with:
ref: gh-pages # checkout gh-pages branch
fetch-depth: 2 # fetch the current and previous commit
ref: gh-pages # checkout gh-pages branch
fetch-depth: 2 # fetch the current and previous commit

- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: '3.x'
python-version: "3.x"

- name: Install Dependencies
run: |
@@ -78,7 +78,7 @@ jobs:
import os
import re
import requests

def submit_urls_to_indexnow(host, urls):
key = os.environ['INDEXNOW_KEY']
endpoint = "https://api.indexnow.org/indexnow" # static API endpoint from https://www.indexnow.org/faq
@@ -92,15 +92,15 @@ jobs:
print(f"Failed to submit batch of URLs: Status code {response.status_code}, Response: {response.text}")
except Exception as e:
print(f"ERROR ❌: Failed to submit batch of URLs - {e}")

def extract_urls_from_sitemap(sitemap_url):
try:
response = requests.get(sitemap_url)
return re.findall(r"<loc>(.*?)</loc>", response.text)
except Exception as e:
print(f"ERROR ❌: Failed to extract URLs from {sitemap_url} - {e}")
return []

def filter_modified_urls(urls, modified_files):
# Filter URLs based on modified files
modified_urls = []
@@ -110,7 +110,7 @@ jobs:
if full_url in urls:
modified_urls.append(full_url)
return modified_urls

# Submit URLs from each sitemap to IndexNow
host = "docs.ultralytics.com"
all_urls = []
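The `<loc>` extraction in the script above can also be run on its own. Below is a minimal sketch, assuming `requests` is installed; the exact sitemap URL is an assumption for illustration.

```python
import re

import requests

# Assumed sitemap location, for illustration only.
sitemap_url = "https://docs.ultralytics.com/sitemap.xml"

# Same regular expression the workflow uses to pull <loc> entries.
response = requests.get(sitemap_url, timeout=10)
urls = re.findall(r"<loc>(.*?)</loc>", response.text)
print(f"Found {len(urls)} URLs; first few: {urls[:3]}")
```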
6 changes: 3 additions & 3 deletions .github/workflows/stale.yml
@@ -3,7 +3,7 @@
name: Close stale issues
on:
schedule:
- cron: '0 0 * * *' # Runs at 00:00 UTC every day
- cron: "0 0 * * *" # Runs at 00:00 UTC every day

jobs:
stale:
@@ -43,5 +43,5 @@ jobs:
days-before-issue-close: 10
days-before-pr-stale: 90
days-before-pr-close: 30
exempt-issue-labels: 'documentation,tutorial,TODO'
operations-per-run: 300 # The maximum number of operations per run, used to control rate limiting.
exempt-issue-labels: "documentation,tutorial,TODO"
operations-per-run: 300 # The maximum number of operations per run, used to control rate limiting.
8 changes: 4 additions & 4 deletions README.md
@@ -49,13 +49,13 @@ mkdocs serve

- #### Command Breakdown:

- `mkdocs` is the main MkDocs command-line interface.
- `serve` is the subcommand to build and locally serve your documentation.
- `mkdocs` is the main MkDocs command-line interface.
- `serve` is the subcommand to build and locally serve your documentation.

- 🧐 Note:

- Grasp changes to the docs in real-time as `mkdocs serve` supports live reloading.
- To stop the local server, press `CTRL+C`.
- Grasp changes to the docs in real-time as `mkdocs serve` supports live reloading.
- To stop the local server, press `CTRL+C`.

## 🌍 Building and Serving Multi-Language
