Compare commits
18a94b8e6b ... master
11 Commits

| Author | SHA1 | Date |
|---|---|---|
|  | 8be5b5f1c8 |  |
|  | 2dbd4a80a1 |  |
|  | 64327c73e9 |  |
|  | f847a0273f |  |
|  | f516e4f8e5 |  |
|  | c8c7680fac |  |
|  | 07fafd8fe7 |  |
|  | c16e970d3f |  |
|  | 5f46443d60 |  |
|  | 4399f61259 |  |
|  | d11360b2ed |  |
4 .gitignore vendored
@@ -154,3 +154,7 @@ out/

# nix
result

# for local dev might dump cred stuff here
cred/
local/
185 CLAUDE.md Normal file
@@ -0,0 +1,185 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

This is a Python CLI client for the TryGo webapp built with a hybrid development environment using both uv and Nix. The project provides a complete CLI interface for the TryGo Activity Files API, allowing users to upload, manage, and interact with activity files (.fit files). The project has dual entry points: `hello` (hello_world package) and `taiga` (taiga_pycli.cli package).

## Development Commands

### Core Commands (via justfile)
- `just test` - Run full test suite (ruff check + mypy + pytest)
- `just fmt` - Format code using nix fmt (includes ruff, nixfmt, etc.)
- `just update-snapshots` - Update test snapshots using syrupy
- `just build` - Build the Python module (currently placeholder)
- `just check` - Run nix flake check for Nix environment validation

### Direct Tool Commands
- `uv run pytest` - Run tests with coverage
- `uv run ruff check src tests` - Lint code
- `uv run mypy src` - Type checking
- `uv run pytest --snapshot-update` - Update test snapshots

## CLI Commands

The `taiga` CLI provides the following commands:

### Authentication Commands
- `taiga register --email <email> --password <password> --display-name <name>` - Register new user account
- `taiga login --email <email> --password <password>` - Authenticate and store JWT token

### Activity File Commands
- `taiga activities upload <file_path>` - Upload activity files (.fit files) to the server
  - Requires authentication (login first)
  - Validates file existence and readability
  - Uses current timestamp automatically
  - Returns activity file metadata on success
- `taiga activities ls` - List all activity files for the authenticated user
  - Shows ID, timestamp, and creation date in tabular format
  - Requires authentication (login first)
- `taiga activities download <id> [--output <path>]` - Download activity file by ID
  - Downloads activity file to specified path or defaults to activity_{id}.fit
  - Requires authentication (login first)

### Other Commands
- `taiga hat` - Hat management commands (development/testing feature)

### Examples
```bash
# Register and login
taiga register --email "user@example.com" --password "mypassword" --display-name "My Name"
taiga login --email "user@example.com" --password "mypassword"

# Activity file management
taiga activities upload activity.fit
taiga activities upload local/test-data/test.fit
taiga activities ls
taiga activities download 1
taiga activities download 1 --output my_activity.fit
```

## Architecture

### Package Structure
- `src/hello_world/` - Hello world demonstration package
- `src/taiga_pycli/` - Main CLI client package
  - `cli/` - Command-line interface and argument parsing
  - `config/` - Configuration management with TOML support
  - `service/` - Backend API service layer
  - `models.py` - Data models for API requests/responses

## Upload Functionality

The upload feature provides seamless integration with the TryGo Activity Files API:

### Implementation Details
- **ActivityFile Model**: Dataclass representing activity file metadata with fields for id, timestamp, file_repo_hash, created_at, updated_at, and user_id
- **BackendService.upload_activity_file()**: Handles multipart file uploads with proper JWT authentication
- **File Validation**: Checks file existence, readability, and path validity before upload
- **Error Handling**: Comprehensive error handling for authentication, validation, network, and server errors
- **Automatic Timestamping**: Uses current time for activity timestamp (no custom timestamp needed)

### Supported File Formats
- Primary: `.fit` files (Garmin activity files)
- The API accepts any file format, but the CLI is designed for activity files

### Authentication Requirements
- Must be logged in with valid JWT token before uploading
- Token is automatically stored in `cred/token` after login
- Token is automatically included in upload requests

### Error Scenarios Handled
- File not found or not readable
- Not authenticated (missing/invalid token)
- Network connectivity issues
- Server errors (400, 500, etc.)
- Invalid API responses

### Configuration System
- Uses dataclasses for type-safe configuration (`src/taiga_pycli/config/config.py`)
- TOML-based configuration files (`config.toml`)
- Supports logging configuration and general application settings

### TryGo Activity Files API Integration
- **BackendService Class**: Centralized API client with session management
  - JWT token authentication with automatic header injection
  - Token persistence in `cred/token` file
  - Comprehensive error handling with custom exception types
  - Support for authentication endpoints (`/auth/register`, `/auth/tokens`)
  - Support for activity file endpoints (`/activity_files`, `/activity_files/{id}/download`)
  - Methods: `upload_activity_file()`, `list_activity_files()`, `download_activity_file()`

- **Data Models**: Type-safe dataclasses for API interactions
  - `ActivityFile`: Response model for activity file metadata
  - `RegisterRequest`/`LoginRequest`: Authentication request models
  - `AuthResponse`: Authentication response model

- **Error Handling Strategy**: Custom exception hierarchy
  - `AuthenticationError`: 401 responses and missing tokens
  - `ValidationError`: 400 responses and file validation issues
  - `NetworkError`: Connection timeouts and network issues
  - `ServerError`: 5xx server responses
  - `TryGoAPIError`: General API errors

### Development Environment
The project supports both uv and Nix environments:
- **uv mode**: Standard Python toolchain with uv for dependency management
- **Nix mode**: Reproducible environment with integrated formatting tools
- Environment detection via `DO_NIX_CUSTOM` environment variable

### Testing
- pytest with syrupy for snapshot testing
- Coverage reporting enabled (50% minimum)
- Test configuration in pyproject.toml with XML and HTML output
- Custom testing scripts for upload functionality validation

### Code Quality
- ruff for linting and formatting (tab indentation style)
- mypy for type checking
- flake8 for additional linting
- treefmt.nix for comprehensive formatting in Nix environment

## Testing and Development Scripts

### Development Scripts
- `scripts/simple_create_user.sh` - Quick user registration script for testing
  - Creates test user with email "test@example.com" and password "test"
  - Used for development and testing workflows

- `scripts/test_upload.sh` - Comprehensive upload functionality testing
  - Tests authentication requirements (upload without login should fail)
  - Tests successful login workflow
  - Tests successful file upload
  - Tests error handling (non-existent files)
  - Provides complete end-to-end validation

### Test Data
- `local/test-data/` - Directory containing real .fit files for testing
  - `test.fit`, `test2.fit`, `test3.fit` - Sample activity files
  - Organized separately from development files
  - Used by test scripts for realistic upload testing

### Testing Workflow
```bash
# Run complete upload test suite
./scripts/test_upload.sh

# Manual testing steps
./scripts/simple_create_user.sh # Create test user (if needed)
taiga login --email "test@example.com" --password "test"
taiga activities upload local/test-data/test.fit
taiga activities ls
taiga activities download 1
```

## Entry Points
- `hello` command maps to `hello_world:main`
- `taiga` command maps to `taiga_pycli.cli:main`

## Configuration Files
- `pyproject.toml` - Python project configuration and dependencies
- `config.toml` - Application runtime configuration
- `flake.nix` - Nix development environment
- `treefmt.nix` - Code formatting configuration
- `justfile` - Development task automation
8 config.toml Normal file
@@ -0,0 +1,8 @@
[general_config]
backend_base_url = "http://localhost:8080"
backend_user = "test@example.com"
backend_pw = "test"

log_file = "local/logs/taiga_cli.log"
log_stream = false
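For orientation, here is a minimal sketch of how the `[general_config]` table above is expected to land in the typed config objects, mirroring the dacite-based reader added later in this diff (`src/taiga_pycli/config/config_reader.py`); the paths and values are the ones from this `config.toml`, and the snippet assumes the package is importable (e.g. via `uv run`).

```python
# Illustrative only: load config.toml into the frozen dataclasses from this diff.
import pathlib

import taiga_pycli.config

config = taiga_pycli.config.read_config(pathlib.Path("config.toml"))
print(config.general_config.backend_base_url)  # "http://localhost:8080"
print(config.general_config.log_file)          # "local/logs/taiga_cli.log"
print(config.general_config.log_stream)        # False
```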
pyproject.toml
@@ -5,12 +5,15 @@ description = "Add your description here"
readme = "README.md"
requires-python = ">=3.12"
dependencies = [
    #"urllib3>=2.2.3",
    "requests>=2.31.0",
    "dacite>=1.9.2",
    "tomli>=2.2.1",
    "types-requests>=2.32.4.20250913",
]

[project.scripts]
hello = "hello_world:main"
taiga = "taiga_pycli.cli:main"

[build-system]
requires = ["hatchling"]

@@ -39,11 +42,21 @@ sources = ["src"]
[tool.pytest.ini_options]
testpaths = ["tests"]
# Uncomment to care about coverage
addopts = "--junitxml pytest.xml --cov src --cov-report=xml:coverage.xml --cov-fail-under=50 --cov-report=html"
addopts = "--junitxml pytest.xml --cov src --cov-report=xml:coverage.xml --cov-fail-under=1 --cov-report=html"
junit_family = "xunit1"
log_format = "%(asctime)s | %(levelname)s | %(pathname)s:%(lineno)d | %(message)s"
log_level = "WARNING"

[tool.pyright]
executionEnvironments = [
    { root = "src" }
]
exclude = [
    ".venv"
]
venvPath = "."
venv = ".venv"

# [tool.mypy]
# If you need this
# plugins = "numpy.typing.mypy_plugin"
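The `[project.scripts]` table wires each console command to a `module:function` callable. `src/hello_world` is not part of this diff, so the following is only a hypothetical sketch of the shape that `hello = "hello_world:main"` expects; `taiga = "taiga_pycli.cli:main"` resolves to the real `main()` added in `src/taiga_pycli/cli/__init__.py` below.

```python
# Hypothetical src/hello_world/__init__.py - not in this diff, shown only to
# illustrate the "module:function" form used by [project.scripts].
def main() -> None:
    print("hello from taiga-pycli")
```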
28 scripts/simple_create_user.sh Executable file
@@ -0,0 +1,28 @@
#!/usr/bin/env bash
set -Eeuo pipefail

banner() {
    echo "========================================================"
    echo " $*"
    echo "========================================================"
}

# utility script for easy testing

banner "Checking if test user exists"
if uv run taiga login --email "test@example.com" --password "test" 2>/dev/null; then
    echo "Test user already exists, skipping registration"
else
    banner "Creating test user"
    uv run taiga register --display-name "Display Test" --email "test@example.com" --password "test"

    banner "Logging in test user"
    uv run taiga login --email "test@example.com" --password "test"
fi

banner "Uploading test files"
uv run taiga workouts upload local/test-data/test.fit
uv run taiga workouts upload local/test-data/test2.fit
uv run taiga workouts upload local/test-data/test3.fit

banner "Setup complete!"
46 scripts/test_upload.sh Normal file
@@ -0,0 +1,46 @@
#!/usr/bin/env bash
set -Eeuox pipefail

banner() {
    echo "========================================================"
    echo " $*"
    echo "========================================================"
}

# Test script for activities functionality

banner "Testing Activities Functionality"

# Check if we have test files
if [ ! -d "local/test-data" ] || [ -z "$(ls -A local/test-data/*.fit 2>/dev/null)" ]; then
    echo "Error: No test .fit files found in local/test-data/"
    echo "Please ensure test files are available."
    exit 1
fi

# Test 1: Try upload without authentication (should fail)
banner "Test 1: Upload without authentication (should fail)"
echo "This should show an authentication error:"
uv run taiga activities upload local/test-data/test.fit || echo "Expected failure - not authenticated"

# Test 2: Login first
banner "Test 2: Login with test user"
uv run taiga login --email "test@example.com" --password "test"

# Test 3: Upload a file (should succeed)
banner "Test 3: Upload test file (should succeed)"
uv run taiga activities upload local/test-data/test.fit

# Test 4: List activities (should show uploaded file)
banner "Test 4: List activities (should show uploaded file)"
uv run taiga activities ls

# Test 5: Try uploading non-existent file (should fail)
banner "Test 5: Upload non-existent file (should fail)"
uv run taiga activities upload nonexistent.fit || echo "Expected failure - file not found"

# Test 6: Try download (note: this may fail if server doesn't support download endpoint yet)
banner "Test 6: Download test (may fail if endpoint not implemented)"
uv run taiga activities download 1 --output downloaded_activity.fit || echo "Expected failure - download endpoint may not be implemented yet"

banner "Activities testing complete!"
48 src/taiga_pycli/cli/__init__.py Normal file
@@ -0,0 +1,48 @@
import argparse
import pathlib
import logging
import taiga_pycli.config
import taiga_pycli.cli.common
import taiga_pycli.cli.register
import taiga_pycli.cli.login
import taiga_pycli.cli.hats
import taiga_pycli.cli.activities
import taiga_pycli.cli.workouts

_logger = logging.getLogger(__name__)


def parse_args():
    parser = argparse.ArgumentParser(
        "taiga_pycli", formatter_class=argparse.ArgumentDefaultsHelpFormatter
    )

    parser.add_argument(
        "--config-file", type=str, help="config file location", default="config.toml"
    )

    subparsers = parser.add_subparsers(dest="cmd", required=True)
    taiga_pycli.cli.register.setup_parser(subparsers)
    taiga_pycli.cli.login.setup_parser(subparsers)
    taiga_pycli.cli.hats.setup_parser(subparsers)
    taiga_pycli.cli.activities.setup_parser(subparsers)
    taiga_pycli.cli.workouts.setup_parser(subparsers)

    args = parser.parse_args()
    return args


def main():
    args = parse_args()

    config = taiga_pycli.config.read_config(pathlib.Path(args.config_file))

    taiga_pycli.cli.common.set_up_logging(
        config,
    )
    _logger.info(f"Got args {args=}")
    _logger.info(f"Loaded config {config=}")

    # TODO is there a clean way to hang on to a session for a bit when the cli command has run?
    # i guess realistically we don't want that
    args.func(config, args)
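Every subcommand module follows the same convention: it exposes `setup_parser(subparsers)` that registers a parser and binds a `func` default, which `main()` then dispatches via `args.func(config, args)`. A minimal sketch of a hypothetical new module (the module name and command are invented for illustration only):

```python
# Hypothetical src/taiga_pycli/cli/ping.py - illustrates the setup_parser/func
# convention used by register, login, hats, activities and workouts above.
import argparse
import typing

import taiga_pycli.config

if typing.TYPE_CHECKING:
    _SubparserType = argparse._SubParsersAction[argparse.ArgumentParser]
else:
    _SubparserType = typing.Any


def setup_parser(subparsers: _SubparserType) -> None:
    parser = subparsers.add_parser("ping", help="Hypothetical example command")
    parser.set_defaults(func=run)


def run(config: taiga_pycli.config.Config, args) -> int:
    # main() calls args.func(config, args) after loading config and logging.
    print(f"pong ({config.general_config.backend_base_url})")
    return 0
```

Wiring it in would amount to importing the module in `cli/__init__.py` and calling its `setup_parser(subparsers)` alongside the existing ones.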
168 src/taiga_pycli/cli/activities.py Normal file
@@ -0,0 +1,168 @@
import argparse
import logging
import typing
import pathlib

import taiga_pycli.config
import taiga_pycli.service
import taiga_pycli.workout_utils
from taiga_pycli.exceptions import (
    AuthenticationError,
    ValidationError,
    NetworkError,
    TryGoAPIError,
)

_logger = logging.getLogger(__name__)


if typing.TYPE_CHECKING:
    _SubparserType = argparse._SubParsersAction[argparse.ArgumentParser]
else:
    _SubparserType = typing.Any


def setup_parser(subparsers: _SubparserType) -> None:
    """Setup the activities command group with its subcommands"""
    parser = subparsers.add_parser("activities", help="Manage activity files")

    activities_subparsers = parser.add_subparsers(dest="activities_cmd", required=True)

    # Upload subcommand
    upload_parser = activities_subparsers.add_parser(
        "upload", help="Upload an activity file to the server"
    )
    upload_parser.add_argument(
        "file_path", type=pathlib.Path, help="Path to the activity file to upload"
    )
    upload_parser.set_defaults(func=run_upload)

    # List subcommand
    list_parser = activities_subparsers.add_parser(
        "ls", help="List your activity files"
    )
    list_parser.set_defaults(func=run_list)

    # Download subcommand
    download_parser = activities_subparsers.add_parser(
        "download", help="Download an activity file by ID"
    )
    download_parser.add_argument("id", type=int, help="Activity file ID to download")
    download_parser.add_argument(
        "--output",
        "-o",
        type=pathlib.Path,
        help="Output file path (defaults to original filename)",
    )
    download_parser.set_defaults(func=run_download)


def run_upload(config: taiga_pycli.config.Config, args):
    """Run the upload command"""
    try:
        clnt = taiga_pycli.service.BackendService(config)

        _logger.info(f"Uploading file: {args.file_path}")

        activity_file = clnt.upload_activity_file(args.file_path)

        print("Upload successful!")
        print(f"File ID: {activity_file.id}")
        print(f"Timestamp: {activity_file.timestamp}")
        print(f"Created at: {activity_file.created_at}")

    except AuthenticationError as e:
        print(f"Authentication error: {e}")
        print("Please run 'taiga login' first.")
        return 1
    except ValidationError as e:
        print(f"Validation error: {e}")
        return 1
    except NetworkError as e:
        print(f"Network error: {e}")
        return 1
    except TryGoAPIError as e:
        print(f"API error: {e}")
        return 1
    except Exception as e:
        _logger.error(f"Unexpected error during upload: {e}")
        print(f"Unexpected error: {e}")
        return 1

    return 0


def run_list(config: taiga_pycli.config.Config, args):
    """Run the list command"""
    try:
        clnt = taiga_pycli.service.BackendService(config)

        _logger.info("Listing activity files")

        activity_files = clnt.list_activity_files()

        if not activity_files:
            print("No activity files found.")
            return 0

        # Print header
        print(f"{'ID':<5} {'Timestamp':<20} {'Created':<20}")
        print("-" * 45)

        # Print each activity file
        for activity_file in activity_files:
            timestamp_formatted = taiga_pycli.workout_utils.format_activity_timestamp(activity_file.timestamp)
            created_formatted = taiga_pycli.workout_utils.format_activity_timestamp(activity_file.created_at)
            print(
                f"{activity_file.id:<5} {timestamp_formatted:<20} {created_formatted:<20}"
            )

    except AuthenticationError as e:
        print(f"Authentication error: {e}")
        print("Please run 'taiga login' first.")
        return 1
    except NetworkError as e:
        print(f"Network error: {e}")
        return 1
    except TryGoAPIError as e:
        print(f"API error: {e}")
        return 1
    except Exception as e:
        _logger.error(f"Unexpected error during list: {e}")
        print(f"Unexpected error: {e}")
        return 1

    return 0


def run_download(config: taiga_pycli.config.Config, args):
    """Run the download command"""
    try:
        clnt = taiga_pycli.service.BackendService(config)

        _logger.info(f"Downloading activity file ID: {args.id}")

        output_path = clnt.download_activity_file(args.id, args.output)

        print("Download successful!")
        print(f"File saved to: {output_path}")

    except AuthenticationError as e:
        print(f"Authentication error: {e}")
        print("Please run 'taiga login' first.")
        return 1
    except ValidationError as e:
        print(f"Validation error: {e}")
        return 1
    except NetworkError as e:
        print(f"Network error: {e}")
        return 1
    except TryGoAPIError as e:
        print(f"API error: {e}")
        return 1
    except Exception as e:
        _logger.error(f"Unexpected error during download: {e}")
        print(f"Unexpected error: {e}")
        return 1

    return 0
29 src/taiga_pycli/cli/common/__init__.py Normal file
@@ -0,0 +1,29 @@
import taiga_pycli.config
import typing
import logging
import pathlib


def set_up_logging(
    config: taiga_pycli.config.Config,
    create_logfile_parents: bool = True,
):
    # for convenience
    conf = config.general_config
    handlers: typing.List[logging.Handler] = []

    if conf.log_stream:
        handlers.append(logging.StreamHandler())

    if conf.log_file is not None:
        if create_logfile_parents:
            # create any parent directories for the log file if needed.
            pathlib.Path(conf.log_file).parent.mkdir(parents=True, exist_ok=True)
        handlers.append(logging.FileHandler(conf.log_file))

    logging.basicConfig(
        level=logging.DEBUG,
        format=conf.log_pattern,
        handlers=handlers,
    )
    logging.captureWarnings(True)
66 src/taiga_pycli/cli/hats.py Normal file
@@ -0,0 +1,66 @@
import argparse
import logging
import typing
import taiga_pycli.config
import taiga_pycli.models

import taiga_pycli.service

_logger = logging.getLogger(__name__)


if typing.TYPE_CHECKING:
    _SubparserType = argparse._SubParsersAction[argparse.ArgumentParser]
else:
    _SubparserType = typing.Any


def setup_parser(subparsers: _SubparserType) -> None:
    parser = subparsers.add_parser("hat")

    parser.add_argument(
        "--name", type=str, help="The name of the hat to add", default=None
    )
    parser.add_argument(
        "--description",
        type=str,
        help="The description of the hat to add",
        default=None,
    )

    parser.set_defaults(func=run)


def run(cfg: taiga_pycli.config.Config, args):
    # clnt = taiga_pycli.client.TryGoAPIClient("http://localhost:8080/")

    backend = taiga_pycli.service.BackendService(cfg)

    if args.name is not None:
        if args.description is None:
            _logger.error("Got a null description, exiting")
            return
        # both not None
        backend.add_hat(args.name, args.description)
        return

    else:
        _logger.debug("Not creating, just list")
        if args.description is not None:
            _logger.error("Provided a description without name")

        response = backend.get_hats()
        if response is None:
            _logger.warning("none response here")
            return
        real_hats = []
        for hat in response:
            rh = taiga_pycli.models.Hat(
                name=hat["name"],
                description=hat["description"],
                user_id=hat["user_id"],
            )
            real_hats.append(rh)
            print(rh)
        # _logger.info(response)
        return
52 src/taiga_pycli/cli/login.py Normal file
@@ -0,0 +1,52 @@
import argparse
import logging
import typing
import getpass

import taiga_pycli.config
import taiga_pycli.service

_logger = logging.getLogger(__name__)


if typing.TYPE_CHECKING:
    _SubparserType = argparse._SubParsersAction[argparse.ArgumentParser]
else:
    _SubparserType = typing.Any


class Password:
    DEFAULT = "Prompt if not provided"

    def __init__(self, value):
        if value == self.DEFAULT:
            value = getpass.getpass("Password: ")
        self.value = value

    def __str__(self):
        return self.value


def setup_parser(subparsers: _SubparserType) -> None:
    parser = subparsers.add_parser("login")

    parser.add_argument(
        "--email", type=str, help="The email to log in with ", default=None
    )
    parser.add_argument(
        "--password", type=Password, help="Password", default=Password.DEFAULT
    )
    parser.set_defaults(func=run)


def run(config: taiga_pycli.config.Config, args):
    clnt = taiga_pycli.service.BackendService(config)

    _logger.info(f"using password {args.password}")
    email_to_use = args.email
    if args.email is None:
        email_to_use = config.general_config.backend_user

    clnt.login(email_to_use, str(args.password))
    # _logger.info(response)
    return
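The `Password` class above doubles as an argparse `type`: because the `DEFAULT` sentinel is a string, argparse runs it through the type converter when `--password` is omitted, which triggers an interactive `getpass` prompt at parse time. A stripped-down sketch of that pattern in isolation:

```python
# Minimal sketch of the prompt-if-not-provided pattern used by login.py above.
import argparse
import getpass


class Password:
    DEFAULT = "Prompt if not provided"

    def __init__(self, value: str):
        # argparse passes the command-line string, or the string DEFAULT when
        # the flag is omitted; only in that case do we prompt interactively.
        if value == self.DEFAULT:
            value = getpass.getpass("Password: ")
        self.value = value

    def __str__(self) -> str:
        return self.value


parser = argparse.ArgumentParser()
parser.add_argument("--password", type=Password, default=Password.DEFAULT)
args = parser.parse_args([])  # no flag given -> prompts on stdin
```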
0 src/taiga_pycli/cli/main.py Normal file
33 src/taiga_pycli/cli/register.py Normal file
@@ -0,0 +1,33 @@
import argparse
import logging
import typing
import taiga_pycli.config

import taiga_pycli.service

_logger = logging.getLogger(__name__)


if typing.TYPE_CHECKING:
    _SubparserType = argparse._SubParsersAction[argparse.ArgumentParser]
else:
    _SubparserType = typing.Any


def setup_parser(subparsers: _SubparserType) -> None:
    parser = subparsers.add_parser("register")

    parser.add_argument(
        "--display-name", type=str, help="config file location", default=""
    )
    parser.add_argument("--email", type=str, help="config file location", default="")
    parser.add_argument("--password", type=str, help="config file location", default="")
    parser.set_defaults(func=run)


def run(cfg: taiga_pycli.config.Config, args):
    # clnt = taiga_pycli.client.TryGoAPIClient("http://localhost:8080/")

    backend = taiga_pycli.service.BackendService(cfg)
    response = backend.register(args.email, args.password, args.display_name)
    _logger.info(response)
68 src/taiga_pycli/cli/upload.py Normal file
@@ -0,0 +1,68 @@
import argparse
import logging
import typing
import pathlib

import taiga_pycli.config
import taiga_pycli.service
from taiga_pycli.exceptions import (
    AuthenticationError,
    ValidationError,
    NetworkError,
    TryGoAPIError,
)

_logger = logging.getLogger(__name__)


if typing.TYPE_CHECKING:
    _SubparserType = argparse._SubParsersAction[argparse.ArgumentParser]
else:
    _SubparserType = typing.Any


def setup_parser(subparsers: _SubparserType) -> None:
    parser = subparsers.add_parser(
        "upload", help="Upload an activity file to the server"
    )

    parser.add_argument(
        "file_path", type=pathlib.Path, help="Path to the activity file to upload"
    )

    parser.set_defaults(func=run)


def run(config: taiga_pycli.config.Config, args):
    """Run the upload command"""
    try:
        clnt = taiga_pycli.service.BackendService(config)

        _logger.info(f"Uploading file: {args.file_path}")

        activity_file = clnt.upload_activity_file(args.file_path)

        print("Upload successful!")
        print(f"File ID: {activity_file.id}")
        print(f"Timestamp: {activity_file.timestamp}")
        print(f"Created at: {activity_file.created_at}")

    except AuthenticationError as e:
        print(f"Authentication error: {e}")
        print("Please run 'taiga login' first.")
        return 1
    except ValidationError as e:
        print(f"Validation error: {e}")
        return 1
    except NetworkError as e:
        print(f"Network error: {e}")
        return 1
    except TryGoAPIError as e:
        print(f"API error: {e}")
        return 1
    except Exception as e:
        _logger.error(f"Unexpected error during upload: {e}")
        print(f"Unexpected error: {e}")
        return 1

    return 0
299 src/taiga_pycli/cli/workouts.py Normal file
@@ -0,0 +1,299 @@
import argparse
import logging
import typing
import pathlib

import taiga_pycli.config
import taiga_pycli.service
import taiga_pycli.workout_utils
from taiga_pycli.models import CreateWorkoutRequest
from taiga_pycli.exceptions import (
    AuthenticationError,
    ValidationError,
    NetworkError,
    TryGoAPIError,
)

_logger = logging.getLogger(__name__)


if typing.TYPE_CHECKING:
    _SubparserType = argparse._SubParsersAction[argparse.ArgumentParser]
else:
    _SubparserType = typing.Any


def setup_parser(subparsers: _SubparserType) -> None:
    """Setup the workouts command group with its subcommands"""
    parser = subparsers.add_parser("workouts", help="Manage workouts")

    workouts_subparsers = parser.add_subparsers(dest="workouts_cmd", required=True)

    # Upload subcommand - primary workflow
    upload_parser = workouts_subparsers.add_parser(
        "upload", help="Upload a FIT file and create workout automatically"
    )
    upload_parser.add_argument(
        "file_path", type=pathlib.Path, help="Path to the FIT file to upload"
    )
    upload_parser.set_defaults(func=run_upload)

    # List subcommand
    list_parser = workouts_subparsers.add_parser(
        "ls", help="List your workouts"
    )
    list_parser.set_defaults(func=run_list)

    # Show subcommand
    show_parser = workouts_subparsers.add_parser(
        "show", help="Show detailed workout information"
    )
    show_parser.add_argument("id", type=int, help="Workout ID to show")
    show_parser.set_defaults(func=run_show)

    # Create subcommand for manual workout entry
    create_parser = workouts_subparsers.add_parser(
        "create", help="Create a workout manually"
    )
    create_parser.add_argument("--distance", type=float, help="Distance in miles")
    create_parser.add_argument("--time", type=int, help="Duration in seconds")
    create_parser.add_argument("--pace", type=float, help="Pace in minutes per mile")
    create_parser.add_argument("--speed", type=float, help="Speed in mph")
    create_parser.add_argument("--start-time", type=str, help="Start time (ISO format)")
    create_parser.add_argument("--end-time", type=str, help="End time (ISO format)")
    create_parser.set_defaults(func=run_create)

    # Delete subcommand
    delete_parser = workouts_subparsers.add_parser(
        "delete", help="Delete a workout"
    )
    delete_parser.add_argument("id", type=int, help="Workout ID to delete")
    delete_parser.add_argument("--yes", "-y", action="store_true", help="Skip confirmation")
    delete_parser.set_defaults(func=run_delete)


def run_upload(config: taiga_pycli.config.Config, args):
    """Run the upload command - primary workflow"""
    try:
        clnt = taiga_pycli.service.BackendService(config)

        _logger.info(f"Uploading FIT file and creating workout: {args.file_path}")

        # Check file extension
        if not args.file_path.name.lower().endswith('.fit'):
            print(f"Warning: File does not have .fit extension: {args.file_path}")

        # Upload and create workout in one step
        workout = clnt.upload_and_create_workout(args.file_path)

        print("Workout created successfully!")
        print(f"Workout ID: {workout.id}")

        # Print formatted workout summary
        summary = taiga_pycli.workout_utils.format_workout_summary(workout)
        if summary != "Workout data":
            print(f"Summary: {summary}")

        # Show additional details if available
        if workout.distance_miles or workout.time_seconds:
            print("\nWorkout Details:")
            taiga_pycli.workout_utils.print_workout_details(workout)

    except AuthenticationError as e:
        print(f"Authentication error: {e}")
        print("Please run 'taiga login' first.")
        return 1
    except ValidationError as e:
        print(f"Validation error: {e}")
        return 1
    except NetworkError as e:
        print(f"Network error: {e}")
        return 1
    except TryGoAPIError as e:
        print(f"API error: {e}")
        return 1
    except Exception as e:
        _logger.error(f"Unexpected error during workout upload: {e}")
        print(f"Unexpected error: {e}")
        return 1

    return 0


def run_list(config: taiga_pycli.config.Config, args):
    """Run the list command"""
    try:
        clnt = taiga_pycli.service.BackendService(config)

        _logger.info("Listing workouts")

        workouts = clnt.get_workouts()

        if not workouts:
            print("No workouts found.")
            print("Upload a FIT file with 'taiga workouts upload <file.fit>' to get started!")
            return 0

        # Print table header
        taiga_pycli.workout_utils.print_workout_table_header()

        # Print each workout
        for workout in workouts:
            taiga_pycli.workout_utils.print_workout_row(workout)

        print(f"\nTotal: {len(workouts)} workouts")

    except AuthenticationError as e:
        print(f"Authentication error: {e}")
        print("Please run 'taiga login' first.")
        return 1
    except NetworkError as e:
        print(f"Network error: {e}")
        return 1
    except TryGoAPIError as e:
        print(f"API error: {e}")
        return 1
    except Exception as e:
        _logger.error(f"Unexpected error during list: {e}")
        print(f"Unexpected error: {e}")
        return 1

    return 0


def run_show(config: taiga_pycli.config.Config, args):
    """Run the show command"""
    try:
        clnt = taiga_pycli.service.BackendService(config)

        _logger.info(f"Showing workout {args.id}")

        workout = clnt.get_workout(args.id)

        # Print detailed workout information
        taiga_pycli.workout_utils.print_workout_details(workout)

    except AuthenticationError as e:
        print(f"Authentication error: {e}")
        print("Please run 'taiga login' first.")
        return 1
    except ValidationError as e:
        print(f"Workout not found: {e}")
        return 1
    except NetworkError as e:
        print(f"Network error: {e}")
        return 1
    except TryGoAPIError as e:
        print(f"API error: {e}")
        return 1
    except Exception as e:
        _logger.error(f"Unexpected error during show: {e}")
        print(f"Unexpected error: {e}")
        return 1

    return 0


def run_create(config: taiga_pycli.config.Config, args):
    """Run the create command for manual workout entry"""
    try:
        clnt = taiga_pycli.service.BackendService(config)

        _logger.info("Creating manual workout")

        # Build request from args
        request = CreateWorkoutRequest(
            distance_miles=args.distance,
            time_seconds=args.time,
            pace_min_per_mile=args.pace,
            speed_mph=args.speed,
            start_time=args.start_time,
            end_time=args.end_time,
        )

        # Validate that at least some data is provided
        if not any([args.distance, args.time, args.pace, args.speed, args.start_time]):
            print("Error: At least one workout parameter must be provided")
            print("Use --distance, --time, --pace, --speed, or --start-time")
            return 1

        workout = clnt.create_workout(request)

        print("Workout created successfully!")
        print(f"Workout ID: {workout.id}")

        # Show workout details
        print("\nWorkout Details:")
        taiga_pycli.workout_utils.print_workout_details(workout)

    except AuthenticationError as e:
        print(f"Authentication error: {e}")
        print("Please run 'taiga login' first.")
        return 1
    except ValidationError as e:
        print(f"Validation error: {e}")
        return 1
    except NetworkError as e:
        print(f"Network error: {e}")
        return 1
    except TryGoAPIError as e:
        print(f"API error: {e}")
        return 1
    except Exception as e:
        _logger.error(f"Unexpected error during create: {e}")
        print(f"Unexpected error: {e}")
        return 1

    return 0


def run_delete(config: taiga_pycli.config.Config, args):
    """Run the delete command"""
    try:
        clnt = taiga_pycli.service.BackendService(config)

        # Get workout details first to show what we're deleting
        try:
            workout = clnt.get_workout(args.id)
        except (ValidationError, TryGoAPIError):
            print(f"Workout {args.id} not found.")
            return 1

        # Show what we're about to delete
        print(f"About to delete workout #{workout.id}:")
        summary = taiga_pycli.workout_utils.format_workout_summary(workout)
        date = taiga_pycli.workout_utils.format_workout_date(workout.start_time or workout.created_at)
        print(f"  {date}: {summary}")

        # Confirm deletion unless --yes flag is used
        if not args.yes:
            response = input("\nAre you sure you want to delete this workout? (y/N): ")
            if response.lower() not in ['y', 'yes']:
                print("Deletion cancelled.")
                return 0

        _logger.info(f"Deleting workout {args.id}")

        clnt.delete_workout(args.id)

        print(f"Workout {args.id} deleted successfully.")

    except AuthenticationError as e:
        print(f"Authentication error: {e}")
        print("Please run 'taiga login' first.")
        return 1
    except ValidationError as e:
        print(f"Workout not found: {e}")
        return 1
    except NetworkError as e:
        print(f"Network error: {e}")
        return 1
    except TryGoAPIError as e:
        print(f"API error: {e}")
        return 1
    except Exception as e:
        _logger.error(f"Unexpected error during delete: {e}")
        print(f"Unexpected error: {e}")
        return 1

    return 0
10 src/taiga_pycli/config/__init__.py Normal file
@@ -0,0 +1,10 @@
from taiga_pycli.config.config import (
    GeneralConfig,
    Config,
)

from taiga_pycli.config.config_reader import (
    read_config,
)

__all__ = ["GeneralConfig", "Config", "read_config"]
19 src/taiga_pycli/config/config.py Normal file
@@ -0,0 +1,19 @@
from dataclasses import dataclass

from typing import Optional


@dataclass(frozen=True)
class GeneralConfig:
    log_pattern: str = "%(asctime)s | %(process)d | %(levelname)-7s | %(name)s:%(lineno)d | %(message)s"
    log_file: Optional[str] = None
    log_stream: bool = True
    backend_base_url: str = ""
    # obviously bad, but for now just keep it
    backend_user: str = ""
    backend_pw: str = ""


@dataclass(frozen=True)
class Config:
    general_config: GeneralConfig = GeneralConfig()
58 src/taiga_pycli/config/config_reader.py Normal file
@@ -0,0 +1,58 @@
import logging
import dacite
import pathlib
import tomli

from taiga_pycli.config import GeneralConfig, Config

_logger = logging.getLogger(__name__)

_common_dacite_config = dacite.Config(
    strict=True,
)


def read_general_config_from_dict(general_config_dict: dict) -> GeneralConfig:
    """
    Converts a dictionary to a GeneralConfig object

    :param general_config_dict: dictionary containing general config values
    :return: GeneralConfig object
    """
    general_config = dacite.from_dict(
        data_class=GeneralConfig,
        data=general_config_dict,
        config=_common_dacite_config,
    )
    return general_config


def read_config_dict(file_path: pathlib.Path) -> dict:
    """
    Read a dict from file
    """
    _logger.debug(f"Reading config from {file_path=}")
    with open(file_path, "rb") as toml_file:
        config_dict = tomli.load(toml_file)
    return config_dict


def serialize_config(config_dict: dict) -> Config:
    """
    Converts a dictionary to a Config object

    Makes assumptions about structure of the config_dict, so validation should happen here too if needed.
    """
    config = dacite.from_dict(
        data_class=Config,
        data=config_dict,
        config=_common_dacite_config,
    )
    _logger.warning(config)

    return config


def read_config(file_path: pathlib.Path) -> Config:
    config_dict = read_config_dict(file_path)
    return serialize_config(config_dict)
31 src/taiga_pycli/exceptions.py Normal file
@@ -0,0 +1,31 @@
"""Custom exceptions for TryGo API client"""


class TryGoAPIError(Exception):
    """Base exception for TryGo API client errors"""

    pass


class AuthenticationError(TryGoAPIError):
    """Raised when authentication fails"""

    pass


class ValidationError(TryGoAPIError):
    """Raised when request validation fails"""

    pass


class NetworkError(TryGoAPIError):
    """Raised when network requests fail"""

    pass


class ServerError(TryGoAPIError):
    """Raised when server returns 5xx errors"""

    pass
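All of these derive from TryGoAPIError, which is what lets the CLI modules above catch the specific cases first and still keep a catch-all for anything the service layer raises. A small sketch of the intended catch order (the `call_backend` helper is invented for illustration):

```python
# Sketch of the catch order the CLI commands in this diff rely on:
# specific subclasses first, TryGoAPIError as the service-level catch-all.
from taiga_pycli.exceptions import (
    AuthenticationError,
    NetworkError,
    TryGoAPIError,
    ValidationError,
)


def call_backend(action) -> int:
    try:
        action()
    except AuthenticationError:
        return 1  # not logged in / token rejected (401)
    except ValidationError:
        return 1  # bad input or a 400 from the server
    except NetworkError:
        return 1  # timeouts and connection errors
    except TryGoAPIError:
        return 1  # anything else raised by the service layer
    return 0
```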
102 src/taiga_pycli/models.py Normal file
@@ -0,0 +1,102 @@
from dataclasses import dataclass
from typing import Optional


@dataclass
class User:
    """User model for API requests and responses"""

    email: str
    password: str
    display_name: str
    id: Optional[str] = None


@dataclass
class RegisterRequest:
    """Request payload for /auth/register endpoint"""

    email: str
    password: str
    display_name: str


@dataclass
class LoginRequest:
    """Request payload for /auth/login endpoint"""

    email: str
    password: str


@dataclass
class AuthResponse:
    """Response from authentication endpoints"""

    message: str
    email: Optional[str] = None
    id: Optional[str] = None


@dataclass
class Hat:
    """A wearable hat"""

    name: str
    description: str
    user_id: int


@dataclass
class ActivityFile:
    """Activity file model for API responses"""

    id: int
    timestamp: str
    file_repo_hash: Optional[str]
    created_at: str
    updated_at: str
    user_id: int


@dataclass
class Workout:
    """Workout model for API responses"""

    id: int
    distance_miles: Optional[float]
    time_seconds: Optional[float]
    speed_mph: Optional[float]
    pace_min_per_mile: Optional[float]
    start_time: Optional[str]
    end_time: Optional[str]
    activity_file_id: Optional[int]
    created_at: str
    updated_at: str
    user_id: int


@dataclass
class CreateWorkoutRequest:
    """Request payload for creating a new workout"""

    distance_miles: Optional[float] = None
    time_seconds: Optional[float] = None
    speed_mph: Optional[float] = None
    pace_min_per_mile: Optional[float] = None
    start_time: Optional[str] = None
    end_time: Optional[str] = None
    activity_file_id: Optional[int] = None


@dataclass
class UpdateWorkoutRequest:
    """Request payload for updating an existing workout"""

    distance_miles: Optional[float] = None
    time_seconds: Optional[float] = None
    speed_mph: Optional[float] = None
    pace_min_per_mile: Optional[float] = None
    start_time: Optional[str] = None
    end_time: Optional[str] = None
    activity_file_id: Optional[int] = None
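These dataclasses are plain containers: request models are turned into JSON payloads with `dataclasses.asdict()`, and response models are built field by field from the parsed JSON, which is how the BackendService below uses them. A compact sketch of that round trip (the sample response values are invented):

```python
# Sketch of how the models are used: asdict() for request payloads,
# keyword construction for responses. Sample values are made up.
from dataclasses import asdict

from taiga_pycli.models import ActivityFile, LoginRequest

payload = asdict(LoginRequest(email="test@example.com", password="test"))
# -> {"email": "test@example.com", "password": "test"}

response_json = {
    "id": 1,
    "timestamp": "2024-01-01T00:00:00Z",
    "file_repo_hash": None,
    "created_at": "2024-01-01T00:00:00Z",
    "updated_at": "2024-01-01T00:00:00Z",
    "user_id": 1,
}
activity = ActivityFile(
    id=response_json["id"],
    timestamp=response_json["timestamp"],
    file_repo_hash=response_json.get("file_repo_hash"),
    created_at=response_json["created_at"],
    updated_at=response_json["updated_at"],
    user_id=response_json["user_id"],
)
```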
718
src/taiga_pycli/service/__init__.py
Normal file
718
src/taiga_pycli/service/__init__.py
Normal file
@@ -0,0 +1,718 @@
|
||||
import logging
|
||||
from typing import Optional, Dict, Any, Sequence, List
|
||||
from dataclasses import asdict
|
||||
from pathlib import Path
|
||||
|
||||
import requests
|
||||
|
||||
|
||||
from taiga_pycli.models import (
|
||||
RegisterRequest,
|
||||
LoginRequest,
|
||||
AuthResponse,
|
||||
ActivityFile,
|
||||
Workout,
|
||||
CreateWorkoutRequest,
|
||||
UpdateWorkoutRequest,
|
||||
)
|
||||
from taiga_pycli.config import Config
|
||||
from taiga_pycli.exceptions import (
|
||||
TryGoAPIError,
|
||||
AuthenticationError,
|
||||
ValidationError,
|
||||
NetworkError,
|
||||
ServerError,
|
||||
)
|
||||
|
||||
_logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
class BackendService:
|
||||
def __init__(self, config: Config):
|
||||
self.base_url = config.general_config.backend_base_url.rstrip("/")
|
||||
self.timeout = 30
|
||||
self.session = requests.Session()
|
||||
self.session.headers.update(
|
||||
{"Content-Type": "application/json", "User-Agent": "trygo-py-client/0.1.0"}
|
||||
)
|
||||
|
||||
self._token_path = Path("cred") / "token"
|
||||
self.token: Optional[str] = None
|
||||
|
||||
def _make_request(
|
||||
self, method: str, endpoint: str, data: Optional[Dict[str, Any]] = None
|
||||
) -> requests.Response:
|
||||
"""
|
||||
Make HTTP request to API endpoint
|
||||
|
||||
Args:
|
||||
method: HTTP method (GET, POST, PUT, DELETE)
|
||||
endpoint: API endpoint path
|
||||
data: Request payload data
|
||||
|
||||
Returns:
|
||||
Response data as dictionary
|
||||
|
||||
Raises:
|
||||
NetworkError: For connection/timeout errors
|
||||
AuthenticationError: For 401 errors
|
||||
ValidationError: For 400 errors
|
||||
ServerError: For 5xx errors
|
||||
TryGoAPIError: For other API errors
|
||||
"""
|
||||
url = f"{self.base_url}{endpoint}"
|
||||
_logger.info(f"Making {method} request to {url}")
|
||||
|
||||
try:
|
||||
response = self.session.request(
|
||||
method=method, url=url, json=data, timeout=self.timeout
|
||||
)
|
||||
|
||||
# Log response status
|
||||
_logger.debug(f"Response status: {response.status_code}")
|
||||
|
||||
# Handle different HTTP status codes
|
||||
if response.status_code == 401:
|
||||
raise AuthenticationError("Authentication failed")
|
||||
elif response.status_code == 400:
|
||||
error_msg = "Validation error"
|
||||
try:
|
||||
error_data = response.json()
|
||||
if "message" in error_data:
|
||||
error_msg = error_data["message"]
|
||||
except Exception:
|
||||
pass
|
||||
raise ValidationError(error_msg)
|
||||
elif 500 <= response.status_code < 600:
|
||||
raise ServerError(f"Server error: {response.status_code}")
|
||||
elif not response.ok:
|
||||
raise TryGoAPIError(f"API error: {response.status_code}")
|
||||
|
||||
# Parse JSON response (skip for 204 No Content)
|
||||
if response.status_code == 204:
|
||||
return response
|
||||
try:
|
||||
_logger.debug(response.json())
|
||||
return response
|
||||
except ValueError as e:
|
||||
raise TryGoAPIError(f"Invalid JSON response: {e}")
|
||||
|
||||
# claude really fucked this
|
||||
except requests.exceptions.Timeout:
|
||||
raise NetworkError("Request timeout")
|
||||
except requests.exceptions.ConnectionError:
|
||||
raise NetworkError("Connection error")
|
||||
except requests.exceptions.RequestException as e:
|
||||
raise NetworkError(f"Network error: {e}")
|
||||
|
||||
def register(self, email: str, password: str, display_name: str) -> AuthResponse:
|
||||
"""
|
||||
Register a new user account
|
||||
"""
|
||||
request_data = RegisterRequest(
|
||||
email=email, password=password, display_name=display_name
|
||||
)
|
||||
|
||||
_logger.info(f"Registering user: {email}")
|
||||
|
||||
response = self._make_request("POST", "/auth/register", asdict(request_data))
|
||||
response_json = response.json()
|
||||
|
||||
# Parse response and create AuthResponse
|
||||
auth_response = AuthResponse(
|
||||
message=response_json.get("message", ""),
|
||||
email=response_json.get("email", ""),
|
||||
id=response_json.get("id", ""),
|
||||
)
|
||||
|
||||
return auth_response
|
||||
|
||||
def login(self, email: str, password: str) -> str:
|
||||
"""
|
||||
Authenticate user and get auth token
|
||||
"""
|
||||
request_data = LoginRequest(email=email, password=password)
|
||||
|
||||
_logger.info(f"Logging in user: {email}")
|
||||
|
||||
response = self._make_request("POST", "/auth/tokens", asdict(request_data))
|
||||
response_data = response.json()
|
||||
_logger.info(response_data)
|
||||
|
||||
# Parse response and create AuthResponse
|
||||
token = response_data.get("token", "")
|
||||
self._store_credential(token)
|
||||
|
||||
return token
|
||||
|
||||
def logout(self) -> None:
|
||||
"""
|
||||
Clear authentication token and session data
|
||||
|
||||
Optionally calls logout endpoint if server supports it
|
||||
"""
|
||||
_logger.info("Logging out user")
|
||||
|
||||
# Try to call logout endpoint if we have a token
|
||||
|
||||
_logger.debug("Local authentication cleared")
|
||||
|
||||
def get_hats(self) -> Optional[Sequence[Dict[str, Any]]]:
|
||||
# needs credential
|
||||
|
||||
cred = self._read_credential()
|
||||
if cred is None:
|
||||
return None
|
||||
|
||||
_logger.debug("Credential was read")
|
||||
response = self._make_request("GET", "/hats")
|
||||
response_data = response.json()
|
||||
# _logger.debug(response)
|
||||
return response_data
|
||||
|
||||
def add_hat(self, name: str, description: str) -> Optional[Dict[str, Any]]:
|
||||
cred = self._read_credential()
|
||||
if cred is None:
|
||||
return None
|
||||
|
||||
_logger.debug("Credential was read")
|
||||
response = self._make_request(
|
||||
"POST", "/hats", {"name": name, "description": description}
|
||||
)
|
||||
return response.json()
|
||||
|
||||
def _store_credential(self, token: str) -> None:
|
||||
self._token_path.parent.mkdir(parents=True, exist_ok=True)
|
||||
with self._token_path.open(mode="w") as tokenf:
|
||||
tokenf.write(token)
|
||||
self.token = token
|
||||
self.session.headers.update({"Authorization": f"Bearer {token}"})
|
||||
|
||||
def _read_credential(self) -> Optional[str]:
|
||||
try:
|
||||
with open(self._token_path) as token_file:
|
||||
token_in_file = token_file.read()
|
||||
self.token = token_in_file
|
||||
self.session.headers.update(
|
||||
{"Authorization": f"Bearer {token_in_file}"}
|
||||
)
|
||||
return token_in_file
|
||||
except FileNotFoundError:
|
||||
_logger.error("No token file found, try logging in")
|
||||
return None
|
||||
|
||||
def upload_activity_file(self, file_path: Path) -> ActivityFile:
|
||||
"""
|
||||
Upload an activity file to the server
|
||||
|
||||
Args:
|
||||
file_path: Path to the file to upload
|
||||
|
||||
Returns:
|
||||
ActivityFile object with server response data
|
||||
|
||||
Raises:
|
||||
AuthenticationError: If not authenticated
|
||||
ValidationError: If file is invalid
|
||||
NetworkError: For connection issues
|
||||
"""
|
||||
cred = self._read_credential()
|
||||
if cred is None:
|
||||
raise AuthenticationError("Not authenticated. Please login first.")
|
||||
|
||||
if not file_path.exists():
|
||||
raise ValidationError(f"File not found: {file_path}")
|
||||
|
||||
if not file_path.is_file():
|
||||
raise ValidationError(f"Path is not a file: {file_path}")
|
||||
|
||||
_logger.info(f"Uploading activity file: {file_path}")
|
||||
|
||||
url = f"{self.base_url}/activity_files"
|
||||
|
||||
try:
|
||||
with open(file_path, "rb") as file_obj:
|
||||
files = {"file": (file_path.name, file_obj, "application/octet-stream")}
|
||||
|
||||
# Temporarily remove Content-Type header to let requests set multipart boundary
|
||||
original_content_type = self.session.headers.pop("Content-Type", None)
|
||||
|
||||
try:
|
||||
response = self.session.post(
|
||||
url=url, files=files, timeout=self.timeout
|
||||
)
|
||||
finally:
|
||||
# Restore original Content-Type header
|
||||
if original_content_type:
|
||||
self.session.headers["Content-Type"] = original_content_type
|
||||
|
||||
# Handle response status codes
|
||||
if response.status_code == 401:
|
||||
raise AuthenticationError("Authentication failed")
|
||||
elif response.status_code == 400:
|
||||
error_msg = "File upload failed"
|
||||
try:
|
||||
error_data = response.json()
|
||||
if "error" in error_data:
|
||||
error_msg = error_data["error"]
|
||||
except Exception:
|
||||
pass
|
||||
raise ValidationError(error_msg)
|
||||
elif 500 <= response.status_code < 600:
|
||||
raise ServerError(f"Server error: {response.status_code}")
|
||||
elif not response.ok:
|
||||
raise TryGoAPIError(f"Upload failed: {response.status_code}")
|
||||
|
||||
# Parse response
|
||||
try:
|
||||
response_data = response.json()
|
||||
_logger.info(f"Upload successful: {response_data}")
|
||||
|
||||
return ActivityFile(
|
||||
id=response_data["id"],
|
||||
timestamp=response_data["timestamp"],
|
||||
file_repo_hash=response_data.get("file_repo_hash"),
|
||||
created_at=response_data["created_at"],
|
||||
updated_at=response_data["updated_at"],
|
||||
user_id=response_data["user_id"],
|
||||
)
|
||||
except ValueError as e:
|
||||
raise TryGoAPIError(f"Invalid JSON response: {e}")
|
||||
|
||||
except requests.exceptions.Timeout:
|
||||
raise NetworkError("Upload timeout")
|
||||
except requests.exceptions.ConnectionError:
|
||||
raise NetworkError("Connection error during upload")
|
||||
except requests.exceptions.RequestException as e:
|
||||
raise NetworkError(f"Network error during upload: {e}")
|
||||
except FileNotFoundError:
|
||||
raise ValidationError(f"File not found: {file_path}")
|
||||
except PermissionError:
|
||||
raise ValidationError(f"Permission denied reading file: {file_path}")

    def list_activity_files(self) -> List[ActivityFile]:
        """
        List all activity files for the authenticated user

        Returns:
            List of ActivityFile objects

        Raises:
            AuthenticationError: If not authenticated
            NetworkError: For connection issues
            TryGoAPIError: For API errors
        """
        cred = self._read_credential()
        if cred is None:
            raise AuthenticationError("Not authenticated. Please login first.")

        _logger.info("Fetching activity files list")

        try:
            response = self._make_request("GET", "/activity_files")
            response_data = response.json()

            # Parse response into ActivityFile objects
            _logger.debug(response_data)
            activity_files: List[ActivityFile] = []
            _logger.debug("checking if list")
            if isinstance(response_data, list):
                _logger.debug("yes in list")
                for item in response_data:
                    _logger.debug(item)
                    if isinstance(item, dict):
                        activity_file = ActivityFile(
                            id=item["id"],
                            timestamp=item["timestamp"],
                            file_repo_hash=item.get("file_repo_hash"),
                            created_at=item["created_at"],
                            updated_at=item["updated_at"],
                            user_id=item["user_id"],
                        )
                        activity_files.append(activity_file)

            _logger.info(f"Retrieved {len(activity_files)} activity files")
            return activity_files

        except requests.exceptions.Timeout:
            raise NetworkError("Request timeout")
        except requests.exceptions.ConnectionError:
            raise NetworkError("Connection error")
        except requests.exceptions.RequestException as e:
            raise NetworkError(f"Network error: {e}")

    def download_activity_file(
        self, file_id: int, output_path: Optional[Path] = None
    ) -> Path:
        """
        Download an activity file by ID

        Args:
            file_id: The ID of the activity file to download
            output_path: Optional path to save the file (defaults to activity_{id}.fit)

        Returns:
            Path where the file was saved

        Raises:
            AuthenticationError: If not authenticated
            ValidationError: If file not found or invalid
            NetworkError: For connection issues
            TryGoAPIError: For API errors
        """
        cred = self._read_credential()
        if cred is None:
            raise AuthenticationError("Not authenticated. Please login first.")

        if output_path is None:
            output_path = Path(f"activity_{file_id}.fit")

        _logger.info(f"Downloading activity file {file_id} to {output_path}")

        url = f"{self.base_url}/activity_files/{file_id}/download"

        try:
            response = self.session.get(url=url, timeout=self.timeout, stream=True)

            # Handle response status codes
            if response.status_code == 401:
                raise AuthenticationError("Authentication failed")
            elif response.status_code == 404:
                raise ValidationError(f"Activity file with ID {file_id} not found")
            elif response.status_code == 400:
                raise ValidationError("Invalid download request")
            elif 500 <= response.status_code < 600:
                raise ServerError(f"Server error: {response.status_code}")
            elif not response.ok:
                raise TryGoAPIError(f"Download failed: {response.status_code}")

            # Write file to disk
            with open(output_path, "wb") as f:
                for chunk in response.iter_content(chunk_size=8192):
                    if chunk:
                        f.write(chunk)

            _logger.info(f"Downloaded activity file {file_id} to {output_path}")
            return output_path

        except requests.exceptions.Timeout:
            raise NetworkError("Download timeout")
        except requests.exceptions.ConnectionError:
            raise NetworkError("Connection error during download")
        except requests.exceptions.RequestException as e:
            raise NetworkError(f"Network error during download: {e}")
        except OSError as e:
            raise ValidationError(f"Error writing file {output_path}: {e}")
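Together, `list_activity_files` and `download_activity_file` cover a basic sync-style workflow. A minimal usage sketch (not part of the commit; the client class name and constructor are assumptions, and login is expected to have happened first):

```python
from pathlib import Path

# Hypothetical client construction; the actual class name/constructor live earlier in this module.
client = TryGoClient(base_url="http://localhost:8000")

for activity in client.list_activity_files():
    target = Path("local") / f"activity_{activity.id}.fit"
    if not target.exists():
        client.download_activity_file(activity.id, output_path=target)
        print(f"Fetched activity {activity.id} -> {target}")
```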

    def upload_and_create_workout(self, file_path: Path) -> Workout:
        """
        Upload FIT file and automatically create workout in one operation

        Args:
            file_path: Path to the FIT file to upload

        Returns:
            Workout object created from the FIT file

        Raises:
            AuthenticationError: If not authenticated
            ValidationError: If file is invalid or FIT processing fails
            NetworkError: For connection issues
            TryGoAPIError: For API errors
        """
        _logger.info(f"Uploading FIT file and creating workout: {file_path}")

        activity_file = self.upload_activity_file(file_path)
        _logger.info(f"Activity file uploaded with ID: {activity_file.id}")

        try:
            workout = self.create_workout_from_activity_file(activity_file.id)
            _logger.info(f"Workout created with ID: {workout.id}")
            return workout
        except Exception as e:
            _logger.error(f"Failed to create workout from activity file {activity_file.id}: {e}")
            raise
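This wrapper is the one-call path a workout-upload command would likely use: the file is uploaded first, then the workout is derived from it, and a failure in the second step is logged with the activity-file ID so the upload is not silently lost. A usage sketch (not part of the commit; `client` is the assumed instance from the earlier sketch, and the exception classes are the ones used throughout this module):

```python
from pathlib import Path

try:
    workout = client.upload_and_create_workout(Path("local/test-data/test.fit"))
    print(f"Created workout {workout.id} from FIT upload")
except ValidationError as e:
    # Missing/unreadable file, or the server rejected the FIT contents
    print(f"Upload rejected: {e}")
except (AuthenticationError, NetworkError) as e:
    print(f"Could not complete the request: {e}")
```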

    def get_workouts(self) -> List[Workout]:
        """
        Get all workouts for the authenticated user

        Returns:
            List of Workout objects

        Raises:
            AuthenticationError: If not authenticated
            NetworkError: For connection issues
            TryGoAPIError: For API errors
        """
        cred = self._read_credential()
        if cred is None:
            raise AuthenticationError("Not authenticated. Please login first.")

        _logger.info("Fetching workouts list")

        try:
            response = self._make_request("GET", "/workouts")
            response_data = response.json()

            workouts: List[Workout] = []
            if isinstance(response_data, list):
                for item in response_data:
                    if isinstance(item, dict):
                        workout = Workout(
                            id=item["id"],
                            distance_miles=item.get("distance_miles"),
                            time_seconds=item.get("time_seconds"),
                            speed_mph=item.get("speed_mph"),
                            pace_min_per_mile=item.get("pace_min_per_mile"),
                            start_time=item.get("start_time"),
                            end_time=item.get("end_time"),
                            activity_file_id=item.get("activity_file_id"),
                            created_at=item["created_at"],
                            updated_at=item["updated_at"],
                            user_id=item["user_id"],
                        )
                        workouts.append(workout)

            _logger.info(f"Retrieved {len(workouts)} workouts")
            return workouts

        except requests.exceptions.Timeout:
            raise NetworkError("Request timeout")
        except requests.exceptions.ConnectionError:
            raise NetworkError("Connection error")
        except requests.exceptions.RequestException as e:
            raise NetworkError(f"Network error: {e}")

    def get_workout(self, workout_id: int) -> Workout:
        """
        Get a specific workout by ID

        Args:
            workout_id: The ID of the workout to retrieve

        Returns:
            Workout object

        Raises:
            AuthenticationError: If not authenticated
            ValidationError: If workout not found
            NetworkError: For connection issues
            TryGoAPIError: For API errors
        """
        cred = self._read_credential()
        if cred is None:
            raise AuthenticationError("Not authenticated. Please login first.")

        _logger.info(f"Fetching workout {workout_id}")

        try:
            response = self._make_request("GET", f"/workouts/{workout_id}")
            item = response.json()

            workout = Workout(
                id=item["id"],
                distance_miles=item.get("distance_miles"),
                time_seconds=item.get("time_seconds"),
                speed_mph=item.get("speed_mph"),
                pace_min_per_mile=item.get("pace_min_per_mile"),
                start_time=item.get("start_time"),
                end_time=item.get("end_time"),
                activity_file_id=item.get("activity_file_id"),
                created_at=item["created_at"],
                updated_at=item["updated_at"],
                user_id=item["user_id"],
            )

            return workout

        except requests.exceptions.Timeout:
            raise NetworkError("Request timeout")
        except requests.exceptions.ConnectionError:
            raise NetworkError("Connection error")
        except requests.exceptions.RequestException as e:
            raise NetworkError(f"Network error: {e}")

    def create_workout(self, request: CreateWorkoutRequest) -> Workout:
        """
        Create a new workout manually

        Args:
            request: CreateWorkoutRequest with workout data

        Returns:
            Created Workout object

        Raises:
            AuthenticationError: If not authenticated
            ValidationError: If request data is invalid
            NetworkError: For connection issues
            TryGoAPIError: For API errors
        """
        cred = self._read_credential()
        if cred is None:
            raise AuthenticationError("Not authenticated. Please login first.")

        _logger.info("Creating new workout")

        try:
            response = self._make_request("POST", "/workouts", asdict(request))
            item = response.json()

            workout = Workout(
                id=item["id"],
                distance_miles=item.get("distance_miles"),
                time_seconds=item.get("time_seconds"),
                speed_mph=item.get("speed_mph"),
                pace_min_per_mile=item.get("pace_min_per_mile"),
                start_time=item.get("start_time"),
                end_time=item.get("end_time"),
                activity_file_id=item.get("activity_file_id"),
                created_at=item["created_at"],
                updated_at=item["updated_at"],
                user_id=item["user_id"],
            )

            _logger.info(f"Created workout with ID: {workout.id}")
            return workout

        except requests.exceptions.Timeout:
            raise NetworkError("Request timeout")
        except requests.exceptions.ConnectionError:
            raise NetworkError("Connection error")
        except requests.exceptions.RequestException as e:
            raise NetworkError(f"Network error: {e}")
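`create_workout` serializes the request dataclass with `asdict` and posts it to `/workouts`. The fields of `CreateWorkoutRequest` are not visible in this hunk, so the sketch below assumes they mirror the optional metric fields of `Workout`; treat the field names as illustrative only:

```python
# Assumed field names -- CreateWorkoutRequest is defined elsewhere in this package.
request = CreateWorkoutRequest(
    distance_miles=3.1,
    time_seconds=1712,
    start_time="2025-10-01T07:30:00Z",
    end_time="2025-10-01T07:58:32Z",
)
workout = client.create_workout(request)
print(f"Manual workout {workout.id} recorded")
```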

    def update_workout(self, workout_id: int, request: UpdateWorkoutRequest) -> Workout:
        """
        Update an existing workout

        Args:
            workout_id: The ID of the workout to update
            request: UpdateWorkoutRequest with fields to update

        Returns:
            Updated Workout object

        Raises:
            AuthenticationError: If not authenticated
            ValidationError: If workout not found or request data is invalid
            NetworkError: For connection issues
            TryGoAPIError: For API errors
        """
        cred = self._read_credential()
        if cred is None:
            raise AuthenticationError("Not authenticated. Please login first.")

        _logger.info(f"Updating workout {workout_id}")

        try:
            response = self._make_request("PUT", f"/workouts/{workout_id}", asdict(request))
            item = response.json()

            workout = Workout(
                id=item["id"],
                distance_miles=item.get("distance_miles"),
                time_seconds=item.get("time_seconds"),
                speed_mph=item.get("speed_mph"),
                pace_min_per_mile=item.get("pace_min_per_mile"),
                start_time=item.get("start_time"),
                end_time=item.get("end_time"),
                activity_file_id=item.get("activity_file_id"),
                created_at=item["created_at"],
                updated_at=item["updated_at"],
                user_id=item["user_id"],
            )

            _logger.info(f"Updated workout {workout_id}")
            return workout

        except requests.exceptions.Timeout:
            raise NetworkError("Request timeout")
        except requests.exceptions.ConnectionError:
            raise NetworkError("Connection error")
        except requests.exceptions.RequestException as e:
            raise NetworkError(f"Network error: {e}")

    def delete_workout(self, workout_id: int) -> None:
        """
        Delete a workout

        Args:
            workout_id: The ID of the workout to delete

        Raises:
            AuthenticationError: If not authenticated
            ValidationError: If workout not found
            NetworkError: For connection issues
            TryGoAPIError: For API errors
        """
        cred = self._read_credential()
        if cred is None:
            raise AuthenticationError("Not authenticated. Please login first.")

        _logger.info(f"Deleting workout {workout_id}")

        try:
            self._make_request("DELETE", f"/workouts/{workout_id}")
            # DELETE returns 204 No Content with empty body, so don't try to parse JSON
            _logger.info(f"Deleted workout {workout_id}")

        except requests.exceptions.Timeout:
            raise NetworkError("Request timeout")
        except requests.exceptions.ConnectionError:
            raise NetworkError("Connection error")
        except requests.exceptions.RequestException as e:
            raise NetworkError(f"Network error: {e}")

    def create_workout_from_activity_file(self, activity_file_id: int) -> Workout:
        """
        Create workout from an existing activity file

        Args:
            activity_file_id: The ID of the activity file to process

        Returns:
            Created Workout object

        Raises:
            AuthenticationError: If not authenticated
            ValidationError: If activity file not found or FIT processing fails
            NetworkError: For connection issues
            TryGoAPIError: For API errors
        """
        cred = self._read_credential()
        if cred is None:
            raise AuthenticationError("Not authenticated. Please login first.")

        _logger.info(f"Creating workout from activity file {activity_file_id}")

        try:
            response = self._make_request(
                "POST", f"/workouts/from-activity-file/{activity_file_id}"
            )
            item = response.json()

            workout = Workout(
                id=item["id"],
                distance_miles=item.get("distance_miles"),
                time_seconds=item.get("time_seconds"),
                speed_mph=item.get("speed_mph"),
                pace_min_per_mile=item.get("pace_min_per_mile"),
                start_time=item.get("start_time"),
                end_time=item.get("end_time"),
                activity_file_id=item.get("activity_file_id"),
                created_at=item["created_at"],
                updated_at=item["updated_at"],
                user_id=item["user_id"],
            )

            _logger.info(f"Created workout {workout.id} from activity file {activity_file_id}")
            return workout

        except requests.exceptions.Timeout:
            raise NetworkError("Request timeout")
        except requests.exceptions.ConnectionError:
            raise NetworkError("Connection error")
        except requests.exceptions.RequestException as e:
            raise NetworkError(f"Network error: {e}")
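`create_workout_from_activity_file` also makes it possible to backfill workouts for files that were uploaded earlier (for example via `taiga activities upload`) without re-uploading them. A sketch, not part of the commit, reusing the assumed `client` instance from above:

```python
# Create workouts for any activity files that don't have one yet.
linked_ids = {w.activity_file_id for w in client.get_workouts() if w.activity_file_id}
for activity in client.list_activity_files():
    if activity.id not in linked_ids:
        workout = client.create_workout_from_activity_file(activity.id)
        print(f"Backfilled workout {workout.id} from activity file {activity.id}")
```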
199
src/taiga_pycli/workout_utils.py
Normal file
@@ -0,0 +1,199 @@
"""Utility functions for workout data formatting and display"""

from typing import Optional, Union
from datetime import datetime
from taiga_pycli.models import Workout


def format_time_duration(seconds: Optional[Union[int, float]]) -> str:
    """Format time duration in seconds to HH:MM:SS or MM:SS"""
    if seconds is None:
        return "--:--"

    # Convert to int to handle float values from API
    total_seconds = int(seconds)
    hours = total_seconds // 3600
    minutes = (total_seconds % 3600) // 60
    secs = total_seconds % 60

    if hours > 0:
        return f"{hours}:{minutes:02d}:{secs:02d}"
    else:
        return f"{minutes}:{secs:02d}"


def format_pace(pace_min_per_mile: Optional[float]) -> str:
    """Format pace in minutes per mile to MM:SS/mi"""
    if pace_min_per_mile is None:
        return "--:--/mi"

    minutes = int(pace_min_per_mile)
    seconds = int((pace_min_per_mile - minutes) * 60)

    return f"{minutes}:{seconds:02d}/mi"


def format_speed(speed_mph: Optional[float]) -> str:
    """Format speed to mph with one decimal place"""
    if speed_mph is None:
        return "-- mph"
    return f"{speed_mph:.1f} mph"


def format_distance(distance_miles: Optional[float]) -> str:
    """Format distance to miles with two decimal places"""
    if distance_miles is None:
        return "-- mi"
    return f"{distance_miles:.2f} mi"


def format_workout_date(timestamp: Optional[str]) -> str:
    """Format workout timestamp to human-readable date"""
    if timestamp is None:
        return "Unknown date"

    try:
        dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00'))
        return dt.strftime("%b %d, %Y %I:%M %p")
    except (ValueError, AttributeError):
        return "Invalid date"


def format_workout_date_short(timestamp: Optional[str]) -> str:
    """Format workout timestamp for tabular display (shorter format)"""
    if timestamp is None:
        return "Unknown"

    try:
        dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00'))
        return dt.strftime("%m/%d/%Y %H:%M")
    except (ValueError, AttributeError):
        return "Invalid"


def format_activity_timestamp(timestamp: Optional[str]) -> str:
    """Format activity file timestamp for tabular display"""
    if timestamp is None:
        return "Unknown"

    try:
        # Handle both ISO formats with and without timezone info
        if timestamp.endswith('Z'):
            dt = datetime.fromisoformat(timestamp.replace('Z', '+00:00'))
        elif '+' in timestamp or '-' in timestamp[-6:]:
            # Handle timezone offset like -05:00
            dt = datetime.fromisoformat(timestamp)
        else:
            dt = datetime.fromisoformat(timestamp)

        return dt.strftime("%Y-%m-%d %H:%M:%S")
    except (ValueError, AttributeError):
        return "Invalid date"


def format_workout_summary(workout: Workout) -> str:
    """Create a one-line summary of a workout"""
    distance = format_distance(workout.distance_miles)
    duration = format_time_duration(workout.time_seconds)
    pace = format_pace(workout.pace_min_per_mile)

    parts = []
    if workout.distance_miles:
        parts.append(distance)
    if workout.time_seconds:
        parts.append(f"in {duration}")
    if workout.pace_min_per_mile:
        parts.append(f"({pace} pace)")

    if parts:
        return " ".join(parts)
    else:
        return "Workout data"


def estimate_calories(distance_miles: Optional[float], time_seconds: Optional[Union[int, float]], user_weight: float = 150) -> Optional[int]:
    """Estimate calories burned based on distance and time"""
    if not distance_miles or not time_seconds:
        return None

    pace = (time_seconds / 60) / distance_miles
    met = 8.0

    if pace < 6:
        met = 15.0
    elif pace < 7:
        met = 12.0
    elif pace < 8:
        met = 10.0
    elif pace < 9:
        met = 8.5
    elif pace < 10:
        met = 8.0
    else:
        met = 7.0

    return int(met * (user_weight / 2.2) * (time_seconds / 3600))
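The estimate is a standard MET calculation: a MET value picked from the pace bracket, body weight converted from pounds to kilograms, and duration converted to hours. A worked example (not in the diff) at the default 150 lb weight:

```python
# 5.0 miles in 40 minutes: pace = (2400 / 60) / 5.0 = 8.0 min/mi,
# which lands in the `pace < 9` bracket, so met = 8.5.
#   kcal = 8.5 * (150 / 2.2) * (2400 / 3600) ≈ 386.4, truncated by int()
assert estimate_calories(5.0, 2400) == 386
```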


# Column widths for workout table
_COL_ID = 3
_COL_DATE = 16
_COL_DISTANCE = 10
_COL_DURATION = 9
_COL_PACE = 10
_COL_SPEED = 9
_COL_SOURCE = 7


def print_workout_table_header():
    """Print the header for a workout table"""
    print(f"{'ID':<{_COL_ID}} {'Date':<{_COL_DATE}} {'Distance':<{_COL_DISTANCE}} {'Duration':<{_COL_DURATION}} {'Pace':<{_COL_PACE}} {'Speed':<{_COL_SPEED}} {'Source':<{_COL_SOURCE}}")
    print("-" * (_COL_ID + _COL_DATE + _COL_DISTANCE + _COL_DURATION + _COL_PACE + _COL_SPEED + _COL_SOURCE + 6))  # +6 for spaces between columns


def print_workout_row(workout: Workout):
    """Print a single workout as a table row"""
    date = format_workout_date_short(workout.start_time or workout.created_at)
    distance = format_distance(workout.distance_miles)
    duration = format_time_duration(workout.time_seconds)
    pace = format_pace(workout.pace_min_per_mile)
    speed = format_speed(workout.speed_mph)
    source = "FIT" if workout.activity_file_id else "Manual"

    print(f"{workout.id:<{_COL_ID}} {date:<{_COL_DATE}} {distance:<{_COL_DISTANCE}} {duration:<{_COL_DURATION}} {pace:<{_COL_PACE}} {speed:<{_COL_SPEED}} {source:<{_COL_SOURCE}}")


def print_workout_details(workout: Workout):
    """Print detailed workout information"""
    print(f"Workout #{workout.id}")
    print(f" Date: {format_workout_date(workout.start_time or workout.created_at)}")

    if workout.distance_miles:
        print(f" Distance: {format_distance(workout.distance_miles)}")

    if workout.time_seconds:
        print(f" Duration: {format_time_duration(workout.time_seconds)}")

    if workout.pace_min_per_mile:
        print(f" Pace: {format_pace(workout.pace_min_per_mile)}")

    if workout.speed_mph:
        print(f" Speed: {format_speed(workout.speed_mph)}")

    if workout.start_time and workout.end_time:
        start = format_workout_date(workout.start_time)
        end = format_workout_date(workout.end_time)
        print(f" Start: {start}")
        print(f" End: {end}")

    estimated_calories = estimate_calories(workout.distance_miles, workout.time_seconds)
    if estimated_calories:
        print(f" Estimated Calories: {estimated_calories}")

    source = "FIT file" if workout.activity_file_id else "Manual entry"
    print(f" Source: {source}")
    if workout.activity_file_id:
        print(f" Activity File ID: {workout.activity_file_id}")

    print(f" Created: {format_workout_date(workout.created_at)}")
    print(f" Updated: {format_workout_date(workout.updated_at)}")

@@ -1,5 +0,0 @@
from trygo_py_cliclient.config.config import (
    GeneralConfig,
)

__all__ = ["GeneralConfig"]
@@ -1,6 +0,0 @@
from dataclasses import dataclass


@dataclass(frozen=True)
class GeneralConfig:
    log_pattern: str = "%(asctime)s | %(process)d | %(levelname)-7s | %(name)s:%(lineno)d | %(message)s"
@@ -1,25 +0,0 @@
import logging
import dacite

from trygo_py_cliclient.config import GeneralConfig

_logger = logging.getLogger(__name__)

_common_dacite_config = dacite.Config(
    strict=True,
)


def read_general_config_dict(general_config_dict: dict) -> GeneralConfig:
    """
    Converts a dictionary to a GeneralConfig object

    :param general_config_dict: dictionary containing general config values
    :return: GeneralConfig object
    """
    general_config = dacite.from_dict(
        data_class=GeneralConfig,
        data=general_config_dict,
        config=_common_dacite_config,
    )
    return general_config
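The deleted hunks above remove the old `trygo_py_cliclient` config package, including its dacite-based loader. For reference, the loader's `strict=True` config makes dacite reject unknown keys rather than silently drop them; a small, hedged illustration of that behavior (assuming dacite's `UnexpectedDataError`, which the removed code relied on indirectly):

```python
from dataclasses import dataclass

import dacite


@dataclass(frozen=True)
class GeneralConfig:
    log_pattern: str = "%(message)s"


# Known keys convert cleanly.
cfg = dacite.from_dict(
    data_class=GeneralConfig,
    data={"log_pattern": "%(levelname)s %(message)s"},
    config=dacite.Config(strict=True),
)

# An unknown key ("log_patern") raises instead of being ignored.
try:
    dacite.from_dict(
        data_class=GeneralConfig,
        data={"log_patern": "typo"},
        config=dacite.Config(strict=True),
    )
except dacite.UnexpectedDataError as exc:
    print(f"rejected: {exc}")
```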
135
uv.lock
generated
@@ -2,6 +2,57 @@ version = 1
revision = 3
requires-python = ">=3.12"
[[package]]
name = "certifi"
version = "2025.10.5"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/4c/5b/b6ce21586237c77ce67d01dc5507039d444b630dd76611bbca2d8e5dcd91/certifi-2025.10.5.tar.gz", hash = "sha256:47c09d31ccf2acf0be3f701ea53595ee7e0b8fa08801c6624be771df09ae7b43", size = 164519, upload-time = "2025-10-05T04:12:15.808Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e4/37/af0d2ef3967ac0d6113837b44a4f0bfe1328c2b9763bd5b1744520e5cfed/certifi-2025.10.5-py3-none-any.whl", hash = "sha256:0f212c2744a9bb6de0c56639a6f68afe01ecd92d91f14ae897c4fe7bbeeef0de", size = 163286, upload-time = "2025-10-05T04:12:14.03Z" },
]
[[package]]
name = "charset-normalizer"
version = "3.4.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/83/2d/5fd176ceb9b2fc619e63405525573493ca23441330fcdaee6bef9460e924/charset_normalizer-3.4.3.tar.gz", hash = "sha256:6fce4b8500244f6fcb71465d4a4930d132ba9ab8e71a7859e6a5d59851068d14", size = 122371, upload-time = "2025-08-09T07:57:28.46Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e9/5e/14c94999e418d9b87682734589404a25854d5f5d0408df68bc15b6ff54bb/charset_normalizer-3.4.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:e28e334d3ff134e88989d90ba04b47d84382a828c061d0d1027b1b12a62b39b1", size = 205655, upload-time = "2025-08-09T07:56:08.475Z" },
{ url = "https://files.pythonhosted.org/packages/7d/a8/c6ec5d389672521f644505a257f50544c074cf5fc292d5390331cd6fc9c3/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0cacf8f7297b0c4fcb74227692ca46b4a5852f8f4f24b3c766dd94a1075c4884", size = 146223, upload-time = "2025-08-09T07:56:09.708Z" },
{ url = "https://files.pythonhosted.org/packages/fc/eb/a2ffb08547f4e1e5415fb69eb7db25932c52a52bed371429648db4d84fb1/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c6fd51128a41297f5409deab284fecbe5305ebd7e5a1f959bee1c054622b7018", size = 159366, upload-time = "2025-08-09T07:56:11.326Z" },
{ url = "https://files.pythonhosted.org/packages/82/10/0fd19f20c624b278dddaf83b8464dcddc2456cb4b02bb902a6da126b87a1/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3cfb2aad70f2c6debfbcb717f23b7eb55febc0bb23dcffc0f076009da10c6392", size = 157104, upload-time = "2025-08-09T07:56:13.014Z" },
{ url = "https://files.pythonhosted.org/packages/16/ab/0233c3231af734f5dfcf0844aa9582d5a1466c985bbed6cedab85af9bfe3/charset_normalizer-3.4.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1606f4a55c0fd363d754049cdf400175ee96c992b1f8018b993941f221221c5f", size = 151830, upload-time = "2025-08-09T07:56:14.428Z" },
{ url = "https://files.pythonhosted.org/packages/ae/02/e29e22b4e02839a0e4a06557b1999d0a47db3567e82989b5bb21f3fbbd9f/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:027b776c26d38b7f15b26a5da1044f376455fb3766df8fc38563b4efbc515154", size = 148854, upload-time = "2025-08-09T07:56:16.051Z" },
{ url = "https://files.pythonhosted.org/packages/05/6b/e2539a0a4be302b481e8cafb5af8792da8093b486885a1ae4d15d452bcec/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:42e5088973e56e31e4fa58eb6bd709e42fc03799c11c42929592889a2e54c491", size = 160670, upload-time = "2025-08-09T07:56:17.314Z" },
{ url = "https://files.pythonhosted.org/packages/31/e7/883ee5676a2ef217a40ce0bffcc3d0dfbf9e64cbcfbdf822c52981c3304b/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:cc34f233c9e71701040d772aa7490318673aa7164a0efe3172b2981218c26d93", size = 158501, upload-time = "2025-08-09T07:56:18.641Z" },
{ url = "https://files.pythonhosted.org/packages/c1/35/6525b21aa0db614cf8b5792d232021dca3df7f90a1944db934efa5d20bb1/charset_normalizer-3.4.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:320e8e66157cc4e247d9ddca8e21f427efc7a04bbd0ac8a9faf56583fa543f9f", size = 153173, upload-time = "2025-08-09T07:56:20.289Z" },
{ url = "https://files.pythonhosted.org/packages/50/ee/f4704bad8201de513fdc8aac1cabc87e38c5818c93857140e06e772b5892/charset_normalizer-3.4.3-cp312-cp312-win32.whl", hash = "sha256:fb6fecfd65564f208cbf0fba07f107fb661bcd1a7c389edbced3f7a493f70e37", size = 99822, upload-time = "2025-08-09T07:56:21.551Z" },
{ url = "https://files.pythonhosted.org/packages/39/f5/3b3836ca6064d0992c58c7561c6b6eee1b3892e9665d650c803bd5614522/charset_normalizer-3.4.3-cp312-cp312-win_amd64.whl", hash = "sha256:86df271bf921c2ee3818f0522e9a5b8092ca2ad8b065ece5d7d9d0e9f4849bcc", size = 107543, upload-time = "2025-08-09T07:56:23.115Z" },
{ url = "https://files.pythonhosted.org/packages/65/ca/2135ac97709b400c7654b4b764daf5c5567c2da45a30cdd20f9eefe2d658/charset_normalizer-3.4.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:14c2a87c65b351109f6abfc424cab3927b3bdece6f706e4d12faaf3d52ee5efe", size = 205326, upload-time = "2025-08-09T07:56:24.721Z" },
{ url = "https://files.pythonhosted.org/packages/71/11/98a04c3c97dd34e49c7d247083af03645ca3730809a5509443f3c37f7c99/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:41d1fc408ff5fdfb910200ec0e74abc40387bccb3252f3f27c0676731df2b2c8", size = 146008, upload-time = "2025-08-09T07:56:26.004Z" },
{ url = "https://files.pythonhosted.org/packages/60/f5/4659a4cb3c4ec146bec80c32d8bb16033752574c20b1252ee842a95d1a1e/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:1bb60174149316da1c35fa5233681f7c0f9f514509b8e399ab70fea5f17e45c9", size = 159196, upload-time = "2025-08-09T07:56:27.25Z" },
{ url = "https://files.pythonhosted.org/packages/86/9e/f552f7a00611f168b9a5865a1414179b2c6de8235a4fa40189f6f79a1753/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:30d006f98569de3459c2fc1f2acde170b7b2bd265dc1943e87e1a4efe1b67c31", size = 156819, upload-time = "2025-08-09T07:56:28.515Z" },
{ url = "https://files.pythonhosted.org/packages/7e/95/42aa2156235cbc8fa61208aded06ef46111c4d3f0de233107b3f38631803/charset_normalizer-3.4.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:416175faf02e4b0810f1f38bcb54682878a4af94059a1cd63b8747244420801f", size = 151350, upload-time = "2025-08-09T07:56:29.716Z" },
{ url = "https://files.pythonhosted.org/packages/c2/a9/3865b02c56f300a6f94fc631ef54f0a8a29da74fb45a773dfd3dcd380af7/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:6aab0f181c486f973bc7262a97f5aca3ee7e1437011ef0c2ec04b5a11d16c927", size = 148644, upload-time = "2025-08-09T07:56:30.984Z" },
{ url = "https://files.pythonhosted.org/packages/77/d9/cbcf1a2a5c7d7856f11e7ac2d782aec12bdfea60d104e60e0aa1c97849dc/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:fdabf8315679312cfa71302f9bd509ded4f2f263fb5b765cf1433b39106c3cc9", size = 160468, upload-time = "2025-08-09T07:56:32.252Z" },
{ url = "https://files.pythonhosted.org/packages/f6/42/6f45efee8697b89fda4d50580f292b8f7f9306cb2971d4b53f8914e4d890/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:bd28b817ea8c70215401f657edef3a8aa83c29d447fb0b622c35403780ba11d5", size = 158187, upload-time = "2025-08-09T07:56:33.481Z" },
{ url = "https://files.pythonhosted.org/packages/70/99/f1c3bdcfaa9c45b3ce96f70b14f070411366fa19549c1d4832c935d8e2c3/charset_normalizer-3.4.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:18343b2d246dc6761a249ba1fb13f9ee9a2bcd95decc767319506056ea4ad4dc", size = 152699, upload-time = "2025-08-09T07:56:34.739Z" },
{ url = "https://files.pythonhosted.org/packages/a3/ad/b0081f2f99a4b194bcbb1934ef3b12aa4d9702ced80a37026b7607c72e58/charset_normalizer-3.4.3-cp313-cp313-win32.whl", hash = "sha256:6fb70de56f1859a3f71261cbe41005f56a7842cc348d3aeb26237560bfa5e0ce", size = 99580, upload-time = "2025-08-09T07:56:35.981Z" },
{ url = "https://files.pythonhosted.org/packages/9a/8f/ae790790c7b64f925e5c953b924aaa42a243fb778fed9e41f147b2a5715a/charset_normalizer-3.4.3-cp313-cp313-win_amd64.whl", hash = "sha256:cf1ebb7d78e1ad8ec2a8c4732c7be2e736f6e5123a4146c5b89c9d1f585f8cef", size = 107366, upload-time = "2025-08-09T07:56:37.339Z" },
{ url = "https://files.pythonhosted.org/packages/8e/91/b5a06ad970ddc7a0e513112d40113e834638f4ca1120eb727a249fb2715e/charset_normalizer-3.4.3-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3cd35b7e8aedeb9e34c41385fda4f73ba609e561faedfae0a9e75e44ac558a15", size = 204342, upload-time = "2025-08-09T07:56:38.687Z" },
{ url = "https://files.pythonhosted.org/packages/ce/ec/1edc30a377f0a02689342f214455c3f6c2fbedd896a1d2f856c002fc3062/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b89bc04de1d83006373429975f8ef9e7932534b8cc9ca582e4db7d20d91816db", size = 145995, upload-time = "2025-08-09T07:56:40.048Z" },
{ url = "https://files.pythonhosted.org/packages/17/e5/5e67ab85e6d22b04641acb5399c8684f4d37caf7558a53859f0283a650e9/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2001a39612b241dae17b4687898843f254f8748b796a2e16f1051a17078d991d", size = 158640, upload-time = "2025-08-09T07:56:41.311Z" },
{ url = "https://files.pythonhosted.org/packages/f1/e5/38421987f6c697ee3722981289d554957c4be652f963d71c5e46a262e135/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:8dcfc373f888e4fb39a7bc57e93e3b845e7f462dacc008d9749568b1c4ece096", size = 156636, upload-time = "2025-08-09T07:56:43.195Z" },
{ url = "https://files.pythonhosted.org/packages/a0/e4/5a075de8daa3ec0745a9a3b54467e0c2967daaaf2cec04c845f73493e9a1/charset_normalizer-3.4.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:18b97b8404387b96cdbd30ad660f6407799126d26a39ca65729162fd810a99aa", size = 150939, upload-time = "2025-08-09T07:56:44.819Z" },
{ url = "https://files.pythonhosted.org/packages/02/f7/3611b32318b30974131db62b4043f335861d4d9b49adc6d57c1149cc49d4/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ccf600859c183d70eb47e05a44cd80a4ce77394d1ac0f79dbd2dd90a69a3a049", size = 148580, upload-time = "2025-08-09T07:56:46.684Z" },
{ url = "https://files.pythonhosted.org/packages/7e/61/19b36f4bd67f2793ab6a99b979b4e4f3d8fc754cbdffb805335df4337126/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:53cd68b185d98dde4ad8990e56a58dea83a4162161b1ea9272e5c9182ce415e0", size = 159870, upload-time = "2025-08-09T07:56:47.941Z" },
{ url = "https://files.pythonhosted.org/packages/06/57/84722eefdd338c04cf3030ada66889298eaedf3e7a30a624201e0cbe424a/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:30a96e1e1f865f78b030d65241c1ee850cdf422d869e9028e2fc1d5e4db73b92", size = 157797, upload-time = "2025-08-09T07:56:49.756Z" },
{ url = "https://files.pythonhosted.org/packages/72/2a/aff5dd112b2f14bcc3462c312dce5445806bfc8ab3a7328555da95330e4b/charset_normalizer-3.4.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:d716a916938e03231e86e43782ca7878fb602a125a91e7acb8b5112e2e96ac16", size = 152224, upload-time = "2025-08-09T07:56:51.369Z" },
{ url = "https://files.pythonhosted.org/packages/b7/8c/9839225320046ed279c6e839d51f028342eb77c91c89b8ef2549f951f3ec/charset_normalizer-3.4.3-cp314-cp314-win32.whl", hash = "sha256:c6dbd0ccdda3a2ba7c2ecd9d77b37f3b5831687d8dc1b6ca5f56a4880cc7b7ce", size = 100086, upload-time = "2025-08-09T07:56:52.722Z" },
{ url = "https://files.pythonhosted.org/packages/ee/7a/36fbcf646e41f710ce0a563c1c9a343c6edf9be80786edeb15b6f62e17db/charset_normalizer-3.4.3-cp314-cp314-win_amd64.whl", hash = "sha256:73dc19b562516fc9bcf6e5d6e596df0b4eb98d87e4f79f3ae71840e6ed21361c", size = 107400, upload-time = "2025-08-09T07:56:55.172Z" },
{ url = "https://files.pythonhosted.org/packages/8a/1f/f041989e93b001bc4e44bb1669ccdcf54d3f00e628229a85b08d330615c5/charset_normalizer-3.4.3-py3-none-any.whl", hash = "sha256:ce571ab16d890d23b5c278547ba694193a45011ff86a9162a71307ed9f86759a", size = 53175, upload-time = "2025-08-09T07:57:26.864Z" },
]
[[package]]
name = "colorama"
version = "0.4.6"
@@ -79,6 +130,9 @@ version = "0.1.0"
source = { editable = "." }
dependencies = [
{ name = "dacite" },
{ name = "requests" },
{ name = "tomli" },
{ name = "types-requests" },
]
[package.dev-dependencies]
@@ -92,7 +146,12 @@ dev = [
]
[package.metadata]
requires-dist = [{ name = "dacite", specifier = ">=1.9.2" }]
requires-dist = [
{ name = "dacite", specifier = ">=1.9.2" },
{ name = "requests", specifier = ">=2.31.0" },
{ name = "tomli", specifier = ">=2.2.1" },
{ name = "types-requests", specifier = ">=2.32.4.20250913" },
]
[package.metadata.requires-dev]
dev = [
@@ -104,6 +163,15 @@ dev = [
{ name = "syrupy", specifier = ">=4.9.0" },
]
[[package]]
name = "idna"
version = "3.10"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490, upload-time = "2024-09-15T18:07:39.745Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442, upload-time = "2024-09-15T18:07:37.964Z" },
]
[[package]]
name = "iniconfig"
version = "2.0.0"
@@ -220,6 +288,21 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/36/3b/48e79f2cd6a61dbbd4807b4ed46cb564b4fd50a76166b1c4ea5c1d9e2371/pytest_cov-6.0.0-py3-none-any.whl", hash = "sha256:eee6f1b9e61008bd34975a4d5bab25801eb31898b032dd55addc93e96fcaaa35", size = 22949, upload-time = "2024-10-29T20:13:33.215Z" },
]
[[package]]
name = "requests"
version = "2.32.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
{ name = "charset-normalizer" },
{ name = "idna" },
{ name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" },
]
[[package]]
name = "ruff"
version = "0.9.10"
@@ -257,6 +340,47 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/14/78/1e7aee4498f79624f85a6480eb8bc04bacb729c30336bc61b384c6ce4998/syrupy-4.9.0-py3-none-any.whl", hash = "sha256:3028d60188df9b39079678501be7d72fe64d32e2bb53d78df87f5a84bde94d76", size = 52045, upload-time = "2025-03-08T19:08:29.96Z" },
]
[[package]]
name = "tomli"
version = "2.2.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/18/87/302344fed471e44a87289cf4967697d07e532f2421fdaf868a303cbae4ff/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff", size = 17175, upload-time = "2024-11-27T22:38:36.873Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/52/e1/f8af4c2fcde17500422858155aeb0d7e93477a0d59a98e56cbfe75070fd0/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea", size = 132762, upload-time = "2024-11-27T22:38:07.731Z" },
{ url = "https://files.pythonhosted.org/packages/03/b8/152c68bb84fc00396b83e7bbddd5ec0bd3dd409db4195e2a9b3e398ad2e3/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8", size = 123453, upload-time = "2024-11-27T22:38:09.384Z" },
{ url = "https://files.pythonhosted.org/packages/c8/d6/fc9267af9166f79ac528ff7e8c55c8181ded34eb4b0e93daa767b8841573/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192", size = 233486, upload-time = "2024-11-27T22:38:10.329Z" },
{ url = "https://files.pythonhosted.org/packages/5c/51/51c3f2884d7bab89af25f678447ea7d297b53b5a3b5730a7cb2ef6069f07/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222", size = 242349, upload-time = "2024-11-27T22:38:11.443Z" },
{ url = "https://files.pythonhosted.org/packages/ab/df/bfa89627d13a5cc22402e441e8a931ef2108403db390ff3345c05253935e/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77", size = 252159, upload-time = "2024-11-27T22:38:13.099Z" },
{ url = "https://files.pythonhosted.org/packages/9e/6e/fa2b916dced65763a5168c6ccb91066f7639bdc88b48adda990db10c8c0b/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6", size = 237243, upload-time = "2024-11-27T22:38:14.766Z" },
{ url = "https://files.pythonhosted.org/packages/b4/04/885d3b1f650e1153cbb93a6a9782c58a972b94ea4483ae4ac5cedd5e4a09/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd", size = 259645, upload-time = "2024-11-27T22:38:15.843Z" },
{ url = "https://files.pythonhosted.org/packages/9c/de/6b432d66e986e501586da298e28ebeefd3edc2c780f3ad73d22566034239/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e", size = 244584, upload-time = "2024-11-27T22:38:17.645Z" },
{ url = "https://files.pythonhosted.org/packages/1c/9a/47c0449b98e6e7d1be6cbac02f93dd79003234ddc4aaab6ba07a9a7482e2/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98", size = 98875, upload-time = "2024-11-27T22:38:19.159Z" },
{ url = "https://files.pythonhosted.org/packages/ef/60/9b9638f081c6f1261e2688bd487625cd1e660d0a85bd469e91d8db969734/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4", size = 109418, upload-time = "2024-11-27T22:38:20.064Z" },
{ url = "https://files.pythonhosted.org/packages/04/90/2ee5f2e0362cb8a0b6499dc44f4d7d48f8fff06d28ba46e6f1eaa61a1388/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7", size = 132708, upload-time = "2024-11-27T22:38:21.659Z" },
{ url = "https://files.pythonhosted.org/packages/c0/ec/46b4108816de6b385141f082ba99e315501ccd0a2ea23db4a100dd3990ea/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c", size = 123582, upload-time = "2024-11-27T22:38:22.693Z" },
{ url = "https://files.pythonhosted.org/packages/a0/bd/b470466d0137b37b68d24556c38a0cc819e8febe392d5b199dcd7f578365/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13", size = 232543, upload-time = "2024-11-27T22:38:24.367Z" },
{ url = "https://files.pythonhosted.org/packages/d9/e5/82e80ff3b751373f7cead2815bcbe2d51c895b3c990686741a8e56ec42ab/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281", size = 241691, upload-time = "2024-11-27T22:38:26.081Z" },
{ url = "https://files.pythonhosted.org/packages/05/7e/2a110bc2713557d6a1bfb06af23dd01e7dde52b6ee7dadc589868f9abfac/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272", size = 251170, upload-time = "2024-11-27T22:38:27.921Z" },
{ url = "https://files.pythonhosted.org/packages/64/7b/22d713946efe00e0adbcdfd6d1aa119ae03fd0b60ebed51ebb3fa9f5a2e5/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140", size = 236530, upload-time = "2024-11-27T22:38:29.591Z" },
{ url = "https://files.pythonhosted.org/packages/38/31/3a76f67da4b0cf37b742ca76beaf819dca0ebef26d78fc794a576e08accf/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2", size = 258666, upload-time = "2024-11-27T22:38:30.639Z" },
{ url = "https://files.pythonhosted.org/packages/07/10/5af1293da642aded87e8a988753945d0cf7e00a9452d3911dd3bb354c9e2/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744", size = 243954, upload-time = "2024-11-27T22:38:31.702Z" },
{ url = "https://files.pythonhosted.org/packages/5b/b9/1ed31d167be802da0fc95020d04cd27b7d7065cc6fbefdd2f9186f60d7bd/tomli-2.2.1-cp313-cp313-win32.whl", hash = "sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec", size = 98724, upload-time = "2024-11-27T22:38:32.837Z" },
{ url = "https://files.pythonhosted.org/packages/c7/32/b0963458706accd9afcfeb867c0f9175a741bf7b19cd424230714d722198/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69", size = 109383, upload-time = "2024-11-27T22:38:34.455Z" },
{ url = "https://files.pythonhosted.org/packages/6e/c2/61d3e0f47e2b74ef40a68b9e6ad5984f6241a942f7cd3bbfbdbd03861ea9/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc", size = 14257, upload-time = "2024-11-27T22:38:35.385Z" },
]
[[package]]
name = "types-requests"
version = "2.32.4.20250913"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/36/27/489922f4505975b11de2b5ad07b4fe1dca0bca9be81a703f26c5f3acfce5/types_requests-2.32.4.20250913.tar.gz", hash = "sha256:abd6d4f9ce3a9383f269775a9835a4c24e5cd6b9f647d64f88aa4613c33def5d", size = 23113, upload-time = "2025-09-13T02:40:02.309Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/2a/20/9a227ea57c1285986c4cf78400d0a91615d25b24e257fd9e2969606bdfae/types_requests-2.32.4.20250913-py3-none-any.whl", hash = "sha256:78c9c1fffebbe0fa487a418e0fa5252017e9c60d1a2da394077f1780f655d7e1", size = 20658, upload-time = "2025-09-13T02:40:01.115Z" },
]
[[package]]
name = "typing-extensions"
version = "4.12.2"
@@ -265,3 +389,12 @@ sdist = { url = "https://files.pythonhosted.org/packages/df/db/f35a00659bc03fec3
wheels = [
{ url = "https://files.pythonhosted.org/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d", size = 37438, upload-time = "2024-06-07T18:52:13.582Z" },
]
[[package]]
name = "urllib3"
version = "2.5.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/15/22/9ee70a2574a4f4599c47dd506532914ce044817c7752a79b6a51286319bc/urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760", size = 393185, upload-time = "2025-06-18T14:07:41.644Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a7/c2/fe1e52489ae3122415c51f387e221dd0773709bad6c6cdaa599e8a2c5185/urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc", size = 129795, upload-time = "2025-06-18T14:07:40.39Z" },
]