mirror of
https://github.com/ChuckBuilds/LEDMatrix.git
synced 2026-05-16 10:23:31 +00:00
* fix(deps): bump minimum versions to address CVEs
  - Pillow 10.4.0 → 12.2.0: CVE-2026-40192 (DoS via FITS decompression bomb), CVE-2026-25990 (OOB write via PSD image), CVE-2026-42311/42308/42310
  - requests 2.32.0 → 2.33.0: CVE-2026-25645 (temp file security bypass), CVE-2024-47081 (.netrc credentials leak)
  - werkzeug 3.0.0 → 3.1.6: CVE-2023-46136, CVE-2024-49766/49767, CVE-2025-66221, CVE-2026-21860/27199 (DoS, path traversal, safe_join bypass)
  - Flask 3.0.0 → 3.1.3: CVE-2026-27205 (session data caching info disclosure)
  - spotipy 2.24.0 → 2.25.2: CVE-2025-27154, CVE-2025-66040
  - python-socketio 5.11.0 → 5.14.0: CVE-2025-61765
  - pytest 7.4.0 → 9.0.3: CVE-2025-71176 (insecure temp dir handling)
  Updated in requirements.txt, web_interface/requirements.txt, plugin-repos/starlark-apps/requirements.txt, and plugin-repos/march-madness/requirements.txt.
  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: resolve Pylint errors in executor, data service, and odds call
  - Rename TimeoutError to PluginTimeoutError in plugin_executor.py to avoid shadowing the built-in; no external callers affected.
  - Remove the dead try/except in BackgroundDataService.shutdown: executor.shutdown() never accepted a timeout kwarg, so the try branch always raised TypeError. Simplified to a direct shutdown(wait=wait) call.
  - Remove the is_live kwarg from the odds_manager.get_odds() call in sports.py; BaseOddsManager.get_odds() has no such parameter. The live update interval is already encoded in the update_interval_seconds argument passed alongside.
  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: MD5→SHA-256, shellcheck warnings, and broken doc links
  - config_service.py: replace MD5 with SHA-256 for config change detection; same semantics (equality comparison), no stored hashes affected.
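The MD5-to-SHA-256 change-detection swap described above can be sketched as follows. This is a minimal illustration only; `config_digest` and `config_changed` are illustrative names, not the project's actual `config_service` API. Because the digest is only ever compared for equality and never stored, the hash algorithm can be swapped without migrating any state.

```python
import hashlib
import json


def config_digest(config: dict) -> str:
    """Return a SHA-256 hex digest of a canonical JSON rendering of config."""
    # sort_keys + fixed separators make the rendering order-independent,
    # so semantically identical configs always hash the same.
    canonical = json.dumps(config, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


def config_changed(old: dict, new: dict) -> bool:
    # Equality comparison only; digests are never persisted, which is why
    # swapping MD5 for SHA-256 affects no stored hashes.
    return config_digest(old) != config_digest(new)
```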
  - Shell scripts, shellcheck warnings:
    - diagnose_web_interface.sh: remove a useless cat (SC2002)
    - dev_plugin_setup.sh: restructure A && B || C into if/then (SC2015)
    - fix_assets_permissions.sh: remove the unused REAL_HOME block (SC2034)
    - install_web_service.sh: remove the unused USER_HOME assignment (SC2034)
    - diagnose_web_ui.sh: remove unused SUDO assignments (SC2034)
    - diagnose_plugin_permissions.sh: remove the unused BLUE color var (SC2034)
    - first_time_install.sh: remove the unused CLEAR var and PACKAGE_NAME assignment, and replace the loop variable with _ (SC2034)
  - docs/PLUGIN_ARCHITECTURE_SPEC.md: fix 10 broken TOC anchor links to include the section numbers matching the actual headings (MD051).
  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: remove unused imports and bare exception aliases (pyflakes F401/F841)
  - Remove unused imports across 86 files in src/, web_interface/, test/, and scripts/ using autoflake. No logic changes: only dead import statements and unused names in from-imports are removed.
  - Also remove bare exception aliases where the variable is never referenced in the handler body:
    - src/cache/disk_cache.py: except (IOError, OSError, PermissionError) as e
    - src/cache_manager.py: except (OSError, IOError, PermissionError) as perm_error
    - src/plugin_system/resource_monitor.py: except Exception as e
    - web_interface/app.py: except Exception as read_err
  - 86 files changed, 205 lines removed, 18 pre-existing test failures unchanged.
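The bare-exception-alias cleanup above follows a simple pattern: when the handler body never references the caught exception, the `as e` binding is dead and pyflakes flags it. A hedged sketch of the preferred form (`read_manifest` is an illustrative helper, not code from the repository):

```python
import json
from pathlib import Path


def read_manifest(path: Path) -> dict:
    """Best-effort manifest read; returns {} on any expected failure."""
    try:
        return json.loads(path.read_text(encoding="utf-8"))
    # The exception object is never used in the handler body, so there is
    # no `as e` binding; an unused binding is what the lint rule flags.
    except (OSError, json.JSONDecodeError):
        return {}
```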
  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: remove unused local variable assignments (pyflakes F841)
  Dead assignments removed across src/ and web_interface/:
  - background_data_service: drop future= on the fire-and-forget executor.submit
  - base_classes/baseball: drop font= (all rendering uses self.fonts['time'])
  - base_classes/hockey: drop status_short= (never referenced after assignment)
  - common/cli: drop the game_helper=/config_helper= bindings in the import-test block; the constructors are called for instantiation-only validation
  - common/display_helper: drop text_width= (x_position uses display_width directly); drop draw= in create_error_image (it uses _draw_centered_text)
  - config_manager: remove the dead secrets_content loading block in the migration path (a comment already noted that save_config_atomic handles secrets internally)
  - display_manager: drop setup_start= (the timing was never completed or read)
  - font_manager: drop target_path= (the catalog uses font_file_path directly); drop the face=/font= bindings in validate_font (validation by construction: a TypeError on failure is the signal, not the return value)
  - font_test_manager: drop width=/height= (draw_text uses display_manager directly)
  - plugin_system/state_reconciliation: drop manager= (only config/disk/state_mgr are used)
  - plugin_system/store_manager: drop result= on the pip install subprocess.run (check=True raises on failure; stdout is unused)
  - web_interface/blueprints/pages_v3: drop main_config_path=""/secrets_config_path="" (render_template uses config_manager.get_*_path() inline)
  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix(js): resolve ESLint no-undef warnings across 6 JS files
  Three distinct patterns:
  1. Vendor library globals: htmx is injected by a <script> tag before these extension files load, but ESLint lints files in isolation and doesn't know that. Fix: add /* global htmx */ to htmx-sse.js and htmx-json-enc.js.
  2. Cross-file globals: showNotification is defined as window.showNotification in app.js/notification.js but called bare in app.js and error_handler.js. ESLint doesn't connect window.X = Y with a bare call to X. Fix: add /* global showNotification */ to app.js and error_handler.js.
  3. Forward-referenced window.* functions: in array-table.js, checkbox-group.js, and custom-feeds.js, functions like removeArrayTableRow are called early inside event-handler closures but assigned to window.* later in the file. This works at runtime (the handler fires after the assignment), but ESLint sees the bare name at the call site. Fix: change the bare calls to window.removeArrayTableRow(this) etc. so the reference is explicit and ESLint-safe.
  Also guard the updateSystemStats call in app.js reconnectSSE: the function is called but defined nowhere in the codebase. Guarded with a typeof check so it won't throw a ReferenceError if the reconnect path is hit.
  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix(js): resolve Biome lint warnings across 9 JS files
  - noUnusedVariables (catch bindings → optional catch syntax): app.js, file-upload.js, timezone-selector.js: } catch (e) { → } catch { (ES2019 optional catch binding; e was unused in all three handlers)
  - noUnusedVariables (dead assignments): app.js: remove const data= in the display SSE stub (the handler does nothing yet); api_client.js: remove const timeoutId= (the setTimeout ID is never used to cancel); custom-feeds.js: remove const oldIndex= (the getAttribute result is never read); schedule-picker.js: remove const compactMode= (never used in the HTML build); select-dropdown.js: remove const icons= (icons are not yet rendered in options)
  - noPrototypeBuiltins: day-selector.js: DAY_LABELS.hasOwnProperty(x) → Object.prototype.hasOwnProperty.call(DAY_LABELS, x), the safe form that works even on null-prototype objects
  - useIterableCallbackReturn: file-upload.js, notification.js: forEach(x => expr) → forEach(x => { expr; }); forEach ignores return values, so the implicit return from the arrow body was misleading
  - htmx-sse.js is a vendor extension file with old-style var/== patterns that are correct for it; its 18 Biome issues were suppressed via the Codacy API rather than modifying the vendor source.
  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix(security): escape user input in raw HTML responses in pages_v3.py
  plugin_id comes directly from the URL path (/partials/plugin-config/<plugin_id>) and was interpolated into an HTML fragment without escaping. A crafted URL like /partials/plugin-config/<script>alert(1)</script> would inject that tag into the DOM via the HTMX partial response. Fix: wrap all user-controlled values in markupsafe.escape() before embedding them in raw HTML strings. Affects the plugin-not-found 404 response and both error 500 responses in the plugin config partial.
  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: address Bandit B108/B110 across production code
  B110 (try/except/pass):
  - display_controller.py: narrow 'except Exception' to 'except AttributeError' for get_offset_frame(); a plugin lacking this optional method is the expected case, not all exceptions.
  - config_manager.py: B110 already resolved by the earlier removal of the dead secrets-loading block (the except/pass was inside it).
  - All other except/pass blocks in src/ and web_interface/ are intentional (last-resort recovery, best-effort fallbacks, non-critical startup probes). Each is annotated with # nosec B110 and a brief inline reason so the decision is explicit for future reviewers.
  - B110 in test files and plugin-repos suppressed via the Codacy API (not production code).
  B108 (/tmp usage):
  - permission_utils.py: /tmp is listed to PREVENT permission changes on it; it is not used as a temp path. Annotated # nosec B108.
  - display_manager.py: the fixed snapshot path is intentional (the web UI reads the same path); the path-check guard is also annotated.
  - wifi_manager.py: the named /tmp files match the sudoers allowlist installed with the system (the paths are hard-coded in both places by design). Annotated all six open/cp references # nosec B108.
  - scripts/render_plugin.py: a dev-script default, overridable by the user. Annotated.
  - web_interface/app.py: reads the same fixed path written by display_manager. Annotated # nosec B108.
  - Test files suppressed via the Codacy API.
  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: address remaining Codacy security findings
  Flask debug=True (real fix):
  - web_interface/app.py: debug=True in the __main__ block exposes the Werkzeug interactive debugger (arbitrary code execution). Changed to os.environ.get('FLASK_DEBUG', '0') == '1': off by default, opt-in via environment variable for local development.
  nosec annotations (accepted risk with documented rationale):
  - disk_cache.py: os.chmod(0o660) is intentional; the web UI and the LED matrix service share a group, and 660 gives group write while denying world access (B103 + Semgrep insecure-file-permissions suppressed in Codacy)
  - wifi_manager.py: urlopen to the hardcoded connectivity-check.ubuntu.com URL (B310; no user input involved)
  - font_manager.py: the urlretrieve URL comes from the user's own config file on their local device (B310)
  - start_web_conditionally.py: os.execvp with both sys.executable and a fixed PROJECT_DIR-relative constant (B606)
  Confirmed false positives suppressed via the Codacy API (15 issues):
  - SSRF (3x): client-side JS fetch; SSRF is server-side, and browser fetch is CORS-restricted to the same origin
  - B105 (3x): test fixtures use dummy secrets by design; store_manager checks for the placeholder string, which is not itself a secret
  - PMD numeric literal (2x): 10000000 is within Number.MAX_SAFE_INTEGER
  - Prototype pollution (1x): read-only schema traversal, no writes
  - no-unsanitized_method (1x): dynamic import() is CORS-restricted
  - detect-unsafe-regex (1x): operates on server-controlled config values
  - plugin-repos B103 (1x): vendor code chmod on an executable
  - Semgrep insecure-file-permissions (3x): the same disk_cache 0o660 as above
  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: remove unnecessary f prefix from f-strings without placeholders (F541)
  Pyflakes F541 flags f-strings that contain no {} interpolation; they are identical to plain strings but trigger unnecessary string-formatting overhead. Fixed in production code:
  - src/base_classes/data_sources.py (2 debug log calls)
  - src/logo_downloader.py (1 error log)
  - src/plugin_system/store_manager.py (5 strings across 3 log calls)
  - src/web_interface/validators.py (1 return value)
  - src/wifi_manager.py (4 log/message strings)
  - web_interface/start.py (1 print)
  F541 issues in test/, scripts/, and plugin-repos/ suppressed via the Codacy API as non-production code.
  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* chore(dev): add Pillow compatibility smoke test script
  Covers all Pillow APIs used in LEDMatrix: image creation, drawing, font metrics, LANCZOS resampling, paste/alpha_composite, and PNG I/O. Run after any Pillow version bump to catch regressions before deploy:
  python3 scripts/dev/test_pillow_compat.py
  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: resolve 8 new Codacy issues introduced by PR changes
  - shellcheck SC2034: first_time_install.sh: the 'type' loop variable was also unused in the wifi status loop (we previously fixed 'device' → '_' but left 'type'). Changed to '_ _ state' since neither device nor type is referenced.
  - ESLint no-undef: app.js: typeof guards don't satisfy no-undef; added updateSystemStats to the /* global */ declaration alongside showNotification.
  - nosec annotation: web_interface/app.py: the app.run(host='0.0.0.0') line changed when we fixed debug=True, giving it a new issue ID. Re-added # nosec B104.
  - pyflakes F401: scripts/dev/test_pillow_compat.py: ImageFilter was imported but never used in the smoke test. Removed from the import.
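The raw-HTML escaping fix described earlier for pages_v3.py can be sketched with the stdlib `html.escape`; the commit itself uses `markupsafe.escape`, which behaves the same for this case. The function name and fragment markup here are illustrative, not the project's actual response:

```python
from html import escape


def plugin_not_found_fragment(plugin_id: str) -> str:
    # Escape before interpolation so a crafted path segment like
    # "<script>alert(1)</script>" renders as inert text, not live markup.
    return f"<div class='error'>Plugin not found: {escape(plugin_id)}</div>"
```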
  Codacy API suppressions (false positives on changed lines):
  - disk_cache.py 0o660 chmod (2x): the lines changed when # nosec B103 was added, producing new Semgrep issue IDs. Re-suppressed.
  - pages_v3.py raw-html-concat: Semgrep does not recognise escape() as a sanitizer; the escape() call IS the correct fix.
  - app.py Flask 0.0.0.0: the same line as B104 above; the Semgrep rule was also re-suppressed.
  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: address PR review findings
  Fixed (10 of 15 findings):
  - plugin-repos/march-madness/requirements.txt: add urllib3>=1.26.0; manager.py imports directly from urllib3, which was an undeclared transitive dependency via requests.
  - scripts/dev/dev_plugin_setup.sh: restore the subshell form (cd "$target_dir" && git pull --rebase) || true so the shell's working directory is not permanently changed after the if-cd block. The previous SC2015 fix leaked the cwd into the remainder of the script.
  - src/base_classes/sports.py: narrow 'except Exception' to 'except RuntimeError as e' and log via self.logger.debug; Path.home() raises only RuntimeError for service users, and other exceptions should not be silently swallowed.
  - src/config_service.py: fix the stale "MD5 checksum" wording in the ConfigVersion.__init__ docstring (line 40); the implementation has used SHA-256 since the Codacy fix.
  - src/wifi_manager.py: log the last-resort AP enable failure with exc_info=True instead of silently passing; failure here means the device may be unreachable.
  - web_interface/blueprints/pages_v3.py: log the outer metadata pre-load exception at debug level instead of swallowing it silently; the schema still loads fully below.
  - src/background_data_service.py: remove the unused 'timeout' parameter from shutdown(); executor.shutdown() does not accept a timeout. The __del__ caller was updated accordingly.
  - src/font_manager.py: validate the URL scheme before urlretrieve; reject non-http/https schemes (e.g. file://) to prevent reading local files via config-supplied URLs.
  - src/plugin_system/plugin_executor.py: simplify the redundant except tuple (PluginTimeoutError, PluginError, Exception) → Exception, which already covers the others.
  - test/test_display_controller.py: mark the empty test_plugin_discovery_and_loading as @pytest.mark.skip with a reason. Move the duplicate 'from datetime import datetime' to the module header and remove the stray mid-module copy.
  Skipped (5 of 15 findings, with reasons):
  - pytest 9.0.3 concerns: the full suite is already verified (467 pass, 18 pre-existing failures)
  - Pillow 12.2.0 API concerns: no deprecated APIs in the codebase; tests and the Pi smoke test pass
  - diagnose_web_ui.sh sudo validation: set -e already ensures fail-fast on any sudo failure
  - app.py request-logging except: must stay silent (recursive logging risk); annotated
  - app.py SSE file-read except: genuinely transient I/O; annotated
  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
---------
Co-authored-by: Chuck <chuck@example.com>
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
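The font_manager URL-scheme check described in the review fixes might look roughly like this. A minimal sketch, assuming the allowlist is just http/https; `validate_download_url` and `ALLOWED_SCHEMES` are illustrative names, not the project's actual helper:

```python
from urllib.parse import urlparse

ALLOWED_SCHEMES = {"http", "https"}


def validate_download_url(url: str) -> str:
    """Reject non-HTTP(S) URLs (e.g. file://) before fetching.

    urlretrieve happily reads local files via file:// URLs, so a
    config-supplied URL must be scheme-checked before it is fetched.
    """
    scheme = urlparse(url).scheme.lower()
    if scheme not in ALLOWED_SCHEMES:
        raise ValueError(f"Refusing to fetch URL with scheme {scheme!r}")
    return url
```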
869 lines
34 KiB
Python
"""
|
|
Plugin Manager
|
|
|
|
Manages plugin discovery, loading, and lifecycle for the LEDMatrix system.
|
|
Handles dynamic plugin loading from the plugins/ directory.
|
|
|
|
API Version: 1.0.0
|
|
"""
|
|
|
|
import json
import sys
import subprocess
import time
import threading
from pathlib import Path
from typing import Dict, List, Optional, Any
import logging

from src.exceptions import PluginError
from src.logging_config import get_logger
from src.plugin_system.plugin_loader import PluginLoader
from src.plugin_system.plugin_executor import PluginExecutor
from src.plugin_system.plugin_state import PluginStateManager, PluginState
from src.plugin_system.schema_manager import SchemaManager
from src.common.permission_utils import (
    ensure_directory_permissions,
    get_plugin_dir_mode
)


class PluginManager:
    """
    Manages plugin discovery, loading, and lifecycle.

    The PluginManager is responsible for:
    - Discovering plugins in the plugins/ directory
    - Loading plugin modules and instantiating plugin classes
    - Managing plugin lifecycle (load, unload, reload)
    - Providing access to loaded plugins
    - Maintaining plugin manifests

    Uses composition with specialized components:
    - PluginLoader: Handles module loading and dependency installation
    - PluginExecutor: Handles plugin execution with timeout and error isolation
    - PluginStateManager: Manages plugin state machine
    """

    def __init__(self, plugins_dir: str = "plugins",
                 config_manager: Optional[Any] = None,
                 display_manager: Optional[Any] = None,
                 cache_manager: Optional[Any] = None,
                 font_manager: Optional[Any] = None) -> None:
        """
        Initialize the Plugin Manager.

        Args:
            plugins_dir: Path to the plugins directory
            config_manager: Configuration manager instance
            display_manager: Display manager instance
            cache_manager: Cache manager instance
            font_manager: Font manager instance
        """
        self.plugins_dir: Path = Path(plugins_dir)
        self.config_manager: Optional[Any] = config_manager
        self.display_manager: Optional[Any] = display_manager
        self.cache_manager: Optional[Any] = cache_manager
        self.font_manager: Optional[Any] = font_manager
        self.logger: logging.Logger = get_logger(__name__)

        # Initialize plugin system components
        self.plugin_loader = PluginLoader(logger=self.logger)
        self.plugin_executor = PluginExecutor(default_timeout=30.0, logger=self.logger)
        self.state_manager = PluginStateManager(logger=self.logger)
        self.schema_manager = SchemaManager(plugins_dir=self.plugins_dir, logger=self.logger)

        # Lock protecting plugin_manifests and plugin_directories from
        # concurrent mutation (background reconciliation) and reads (requests).
        self._discovery_lock = threading.RLock()

        # Active plugins
        self.plugins: Dict[str, Any] = {}
        self.plugin_manifests: Dict[str, Dict[str, Any]] = {}
        self.plugin_modules: Dict[str, Any] = {}
        self.plugin_last_update: Dict[str, float] = {}

        # Health tracking (optional, set by display_controller if available)
        self.health_tracker = None
        self.resource_monitor = None

        # Ensure plugins directory exists with proper permissions
        try:
            ensure_directory_permissions(self.plugins_dir, get_plugin_dir_mode())
        except (OSError, PermissionError) as e:
            self.logger.error("Could not create plugins directory %s: %s", self.plugins_dir, e, exc_info=True)
            raise PluginError(f"Could not create plugins directory: {self.plugins_dir}", context={'error': str(e)}) from e

    def _scan_directory_for_plugins(self, directory: Path) -> List[str]:
        """
        Scan a directory for plugins.

        Args:
            directory: Directory to scan

        Returns:
            List of plugin IDs found
        """
        plugin_ids = []

        if not directory.exists():
            return plugin_ids

        # Build new state locally before acquiring lock
        new_manifests: Dict[str, Dict[str, Any]] = {}
        new_directories: Dict[str, Path] = {}

        try:
            for item in directory.iterdir():
                if not item.is_dir():
                    continue
                # Skip backup directories so they don't overwrite live entries
                if '.standalone-backup-' in item.name:
                    continue

                manifest_path = item / "manifest.json"
                if manifest_path.exists():
                    try:
                        with open(manifest_path, 'r', encoding='utf-8') as f:
                            manifest = json.load(f)
                        plugin_id = manifest.get('id')
                        if plugin_id:
                            plugin_ids.append(plugin_id)
                            new_manifests[plugin_id] = manifest
                            new_directories[plugin_id] = item
                    except (json.JSONDecodeError, PermissionError, OSError) as e:
                        self.logger.warning("Error reading manifest from %s: %s", manifest_path, e, exc_info=True)
                        continue
        except (OSError, PermissionError) as e:
            self.logger.error("Error scanning directory %s: %s", directory, e, exc_info=True)

        # Replace shared state under lock so uninstalled plugins don't linger
        with self._discovery_lock:
            self.plugin_manifests.clear()
            self.plugin_manifests.update(new_manifests)
            if not hasattr(self, 'plugin_directories'):
                self.plugin_directories = {}
            else:
                self.plugin_directories.clear()
            self.plugin_directories.update(new_directories)

        return plugin_ids

    def discover_plugins(self) -> List[str]:
        """
        Discover all plugins in the plugins directory.

        Also checks for potential config key collisions and logs warnings.

        Returns:
            List of plugin IDs
        """
        self.logger.info("Discovering plugins in %s", self.plugins_dir)
        plugin_ids = self._scan_directory_for_plugins(self.plugins_dir)
        self.logger.info("Discovered %d plugin(s)", len(plugin_ids))

        # Check for config key collisions
        collisions = self.schema_manager.detect_config_key_collisions(plugin_ids)
        for collision in collisions:
            self.logger.warning(
                "Config collision detected: %s",
                collision.get('message', str(collision))
            )

        return plugin_ids

    def _get_dependency_marker_path(self, plugin_id: str) -> Path:
        """Get path to dependency installation marker file."""
        plugin_dir = self.plugins_dir / plugin_id
        if not plugin_dir.exists():
            # Try with ledmatrix- prefix
            plugin_dir = self.plugins_dir / f"ledmatrix-{plugin_id}"
        return plugin_dir / ".dependencies_installed"

    def _check_dependencies_installed(self, plugin_id: str) -> bool:
        """Check if dependencies are already installed for a plugin."""
        marker_path = self._get_dependency_marker_path(plugin_id)
        return marker_path.exists()

    def _mark_dependencies_installed(self, plugin_id: str) -> None:
        """Mark dependencies as installed for a plugin."""
        marker_path = self._get_dependency_marker_path(plugin_id)
        try:
            marker_path.touch()
            # Set proper file permissions after creating marker
            from src.common.permission_utils import (
                ensure_file_permissions,
                get_plugin_file_mode
            )
            ensure_file_permissions(marker_path, get_plugin_file_mode())
        except (OSError, PermissionError) as e:
            self.logger.warning("Could not create dependency marker for %s: %s", plugin_id, e)

    def _remove_dependency_marker(self, plugin_id: str) -> None:
        """Remove dependency installation marker."""
        marker_path = self._get_dependency_marker_path(plugin_id)
        try:
            if marker_path.exists():
                marker_path.unlink()
        except (OSError, PermissionError) as e:
            self.logger.warning("Could not remove dependency marker for %s: %s", plugin_id, e)

    def _install_plugin_dependencies(self, requirements_file: Path) -> bool:
        """
        Install plugin dependencies from requirements.txt.

        Args:
            requirements_file: Path to requirements.txt

        Returns:
            True if installation succeeded or not needed, False on error
        """
        try:
            self.logger.info("Installing dependencies from %s", requirements_file)
            result = subprocess.run(
                [sys.executable, "-m", "pip", "install", "--break-system-packages", "--no-cache-dir", "-r", str(requirements_file)],
                capture_output=True,
                text=True,
                timeout=300,
                check=False
            )

            if result.returncode == 0:
                self.logger.info("Dependencies installed successfully")
                return True
            else:
                self.logger.warning("Dependency installation returned non-zero exit code: %s", result.stderr)
                return False
        except subprocess.TimeoutExpired:
            self.logger.error("Dependency installation timed out")
            return False
        except FileNotFoundError as e:
            self.logger.warning("Command not found: %s. Skipping dependency installation", e)
            return True
        except (BrokenPipeError, OSError) as e:
            # Handle broken pipe errors (errno 32) which can occur during pip downloads.
            # Often caused by network interruptions or output buffer issues.
            if isinstance(e, OSError) and e.errno == 32:
                self.logger.error(
                    "Broken pipe error during dependency installation. "
                    "This usually indicates a network interruption or pip output buffer issue. "
                    "Try installing again or check your network connection."
                )
            else:
                self.logger.error("OS error during dependency installation: %s", e)
            return False
        except Exception as e:
            self.logger.error("Unexpected error installing dependencies: %s", e, exc_info=True)
            # Best-effort: treat unexpected errors as non-fatal so the plugin can still attempt to load
            return True

    def load_plugin(self, plugin_id: str) -> bool:
        """
        Load a plugin by ID.

        This method:
        1. Checks if plugin is already loaded
        2. Validates the manifest exists
        3. Uses PluginLoader to import module and instantiate plugin
        4. Validates the plugin configuration
        5. Stores the plugin instance
        6. Updates plugin state

        Args:
            plugin_id: Plugin identifier

        Returns:
            True if loaded successfully, False otherwise
        """
        if plugin_id in self.plugins:
            self.logger.warning("Plugin %s already loaded", plugin_id)
            return True

        manifest = self.plugin_manifests.get(plugin_id)
        if not manifest:
            self.logger.error("No manifest found for plugin: %s", plugin_id)
            self.state_manager.set_state(plugin_id, PluginState.ERROR)
            return False

        try:
            # Update state to LOADED
            self.state_manager.set_state(plugin_id, PluginState.LOADED)

            # Find plugin directory using PluginLoader
            plugin_directories = getattr(self, 'plugin_directories', None)
            plugin_dir = self.plugin_loader.find_plugin_directory(
                plugin_id,
                self.plugins_dir,
                plugin_directories
            )

            if plugin_dir is None:
                self.logger.error("Plugin directory not found: %s", plugin_id)
                self.logger.error("Searched in: %s", self.plugins_dir)
                self.state_manager.set_state(plugin_id, PluginState.ERROR)
                return False

            # Update mapping if found via search
            if plugin_directories is None or plugin_id not in plugin_directories:
                if not hasattr(self, 'plugin_directories'):
                    self.plugin_directories = {}
                self.plugin_directories[plugin_id] = plugin_dir

            # Get plugin config
            if self.config_manager:
                full_config = self.config_manager.load_config()
                config = full_config.get(plugin_id, {})
            else:
                config = {}

            # Check if plugin has a config schema
            schema_path = self.schema_manager.get_schema_path(plugin_id)
            if schema_path is None:
                # Schema file doesn't exist
                self.logger.warning(
                    "Plugin '%s' has no config_schema.json - configuration will not be validated. "
                    "Consider adding a schema file for better error detection and user experience.",
                    plugin_id
                )
            else:
                # Schema file exists, try to load it
                schema = self.schema_manager.load_schema(plugin_id)
                if schema is None:
                    # Schema exists but couldn't be loaded (likely invalid JSON or schema)
                    self.logger.warning(
                        "Plugin '%s' has a config_schema.json but it could not be loaded. "
                        "The schema may be invalid. Please verify the schema file at: %s",
                        plugin_id, schema_path
                    )

            # Merge config with schema defaults to ensure all defaults are applied
            try:
                defaults = self.schema_manager.generate_default_config(plugin_id, use_cache=True)
                config = self.schema_manager.merge_with_defaults(config, defaults)
                self.logger.debug("Merged config with schema defaults for %s", plugin_id)
            except Exception as e:
                self.logger.warning("Could not apply schema defaults for %s: %s", plugin_id, e)
                # Continue with original config if defaults can't be applied

            # Use PluginLoader to load plugin
            plugin_instance, module = self.plugin_loader.load_plugin(
                plugin_id=plugin_id,
                manifest=manifest,
                plugin_dir=plugin_dir,
                config=config,
                display_manager=self.display_manager,
                cache_manager=self.cache_manager,
                plugin_manager=self,
                install_deps=True
            )

            # Store module
            self.plugin_modules[plugin_id] = module

            # Register plugin-shipped fonts with the FontManager (if any).
            # Plugin manifests can declare a "fonts" block that ships custom
            # fonts with the plugin; FontManager.register_plugin_fonts handles
            # the actual loading. Wired here so manifest declarations take
            # effect without requiring plugin code changes.
            font_manifest = manifest.get('fonts')
            if font_manifest and self.font_manager is not None and hasattr(
                self.font_manager, 'register_plugin_fonts'
            ):
                try:
                    self.font_manager.register_plugin_fonts(plugin_id, font_manifest)
                except Exception as e:
                    self.logger.warning(
                        "Failed to register fonts for plugin %s: %s", plugin_id, e
                    )

            # Validate configuration
            if hasattr(plugin_instance, 'validate_config'):
                try:
                    if not plugin_instance.validate_config():
                        self.logger.error("Plugin %s configuration validation failed", plugin_id)
                        self.state_manager.set_state(plugin_id, PluginState.ERROR)
                        return False
                except Exception as e:
                    self.logger.error("Error validating plugin %s config: %s", plugin_id, e, exc_info=True)
                    self.state_manager.set_state(plugin_id, PluginState.ERROR, error=e)
                    return False

            # Store plugin instance
            self.plugins[plugin_id] = plugin_instance
            self.plugin_last_update[plugin_id] = 0.0

            # Update state based on enabled status
            if config.get('enabled', True):
                self.state_manager.set_state(plugin_id, PluginState.ENABLED)
                # Call on_enable if plugin is enabled
                if hasattr(plugin_instance, 'on_enable'):
                    plugin_instance.on_enable()
            else:
                self.state_manager.set_state(plugin_id, PluginState.DISABLED)

            self.logger.info("Loaded plugin: %s", plugin_id)

            return True

        except PluginError as e:
            self.logger.error("Plugin error loading %s: %s", plugin_id, e, exc_info=True)
            self.state_manager.set_state(plugin_id, PluginState.ERROR, error=e)
            return False
        except Exception as e:
            self.logger.error("Unexpected error loading plugin %s: %s", plugin_id, e, exc_info=True)
            self.state_manager.set_state(plugin_id, PluginState.ERROR, error=e)
            return False

    def unload_plugin(self, plugin_id: str) -> bool:
        """
        Unload a plugin by ID.

        Args:
            plugin_id: Plugin identifier

        Returns:
            True if unloaded successfully, False otherwise
        """
        if plugin_id not in self.plugins:
            self.logger.warning("Plugin %s not loaded", plugin_id)
            return False

        try:
            plugin = self.plugins[plugin_id]

            # Call cleanup if available
            if hasattr(plugin, 'cleanup'):
                try:
                    plugin.cleanup()
                except Exception as e:
                    self.logger.warning("Error during plugin cleanup: %s", e)

            # Call on_disable if available
            if hasattr(plugin, 'on_disable'):
                try:
                    plugin.on_disable()
                except Exception as e:
                    self.logger.warning("Error during plugin on_disable: %s", e)

            # Remove from active plugins
            del self.plugins[plugin_id]
            if plugin_id in self.plugin_last_update:
                del self.plugin_last_update[plugin_id]

            # Remove main module from sys.modules if present
            module_name = f"plugin_{plugin_id.replace('-', '_')}"
            sys.modules.pop(module_name, None)

            # Delegate sub-module and cached-module cleanup to the loader
            self.plugin_loader.unregister_plugin_modules(plugin_id)

            # Remove from plugin_modules
            self.plugin_modules.pop(plugin_id, None)

            # Update state
            self.state_manager.set_state(plugin_id, PluginState.UNLOADED)
            self.state_manager.clear_state(plugin_id)

            self.logger.info("Unloaded plugin: %s", plugin_id)
            return True

        except Exception as e:
            self.logger.error("Error unloading plugin %s: %s", plugin_id, e, exc_info=True)
            self.state_manager.set_state(plugin_id, PluginState.ERROR, error=e)
            return False

    def reload_plugin(self, plugin_id: str) -> bool:
        """
        Reload a plugin (unload and load).

        Args:
            plugin_id: Plugin identifier

        Returns:
            True if reloaded successfully, False otherwise
        """
        self.logger.info("Reloading plugin: %s", plugin_id)

        # Unload first
        if plugin_id in self.plugins:
            if not self.unload_plugin(plugin_id):
                return False

        # Re-discover to get updated manifest
        manifest_path = self.plugins_dir / plugin_id / "manifest.json"
        if manifest_path.exists():
            try:
                with open(manifest_path, 'r', encoding='utf-8') as f:
                    manifest = json.load(f)
                with self._discovery_lock:
                    self.plugin_manifests[plugin_id] = manifest
            except Exception as e:
                self.logger.error("Error reading manifest: %s", e, exc_info=True)
                return False

        return self.load_plugin(plugin_id)
    def get_plugin(self, plugin_id: str) -> Optional[Any]:
        """
        Get a loaded plugin instance by ID.

        Args:
            plugin_id: Plugin identifier

        Returns:
            Plugin instance or None if not loaded
        """
        return self.plugins.get(plugin_id)
    def get_all_plugins(self) -> Dict[str, Any]:
        """
        Get all loaded plugins.

        Returns:
            Dict of plugin_id: plugin_instance
        """
        return self.plugins.copy()
    def get_enabled_plugins(self) -> List[str]:
        """
        Get list of enabled plugin IDs.

        Returns:
            List of plugin IDs that are currently enabled
        """
        # Use getattr with a default so plugins lacking an `enabled` attribute
        # are treated as enabled, matching run_scheduled_updates().
        return [pid for pid, plugin in self.plugins.items() if getattr(plugin, "enabled", True)]
    def get_plugin_info(self, plugin_id: str) -> Optional[Dict[str, Any]]:
        """
        Get information about a plugin (manifest + runtime info).

        Args:
            plugin_id: Plugin identifier

        Returns:
            Dict with plugin information or None if not found
        """
        with self._discovery_lock:
            manifest = self.plugin_manifests.get(plugin_id)
        if not manifest:
            return None

        info = manifest.copy()

        # Add runtime information if plugin is loaded
        plugin = self.plugins.get(plugin_id)
        if plugin:
            info['loaded'] = True
            if hasattr(plugin, 'get_info'):
                info['runtime_info'] = plugin.get_info()
        else:
            info['loaded'] = False

        # Add state information
        info['state'] = self.state_manager.get_state_info(plugin_id)

        return info
    def get_all_plugin_info(self) -> List[Dict[str, Any]]:
        """
        Get information about all plugins.

        Returns:
            List of plugin info dictionaries
        """
        with self._discovery_lock:
            pids = list(self.plugin_manifests.keys())
        return [info for info in [self.get_plugin_info(pid) for pid in pids] if info]
    def get_plugin_directory(self, plugin_id: str) -> Optional[str]:
        """
        Get the directory path for a plugin.

        Args:
            plugin_id: Plugin identifier

        Returns:
            Directory path as string or None if not found
        """
        with self._discovery_lock:
            if hasattr(self, 'plugin_directories') and plugin_id in self.plugin_directories:
                return str(self.plugin_directories[plugin_id])

        plugin_dir = self.plugins_dir / plugin_id
        if plugin_dir.exists():
            return str(plugin_dir)

        plugin_dir = self.plugins_dir / f"ledmatrix-{plugin_id}"
        if plugin_dir.exists():
            return str(plugin_dir)

        return None
    def get_plugin_display_modes(self, plugin_id: str) -> List[str]:
        """
        Get display modes provided by a plugin.

        Args:
            plugin_id: Plugin identifier

        Returns:
            List of display mode names
        """
        with self._discovery_lock:
            manifest = self.plugin_manifests.get(plugin_id)
        if not manifest:
            return []

        display_modes = manifest.get('display_modes', [])
        if isinstance(display_modes, list):
            return display_modes
        return []
    def find_plugin_for_mode(self, mode: str) -> Optional[str]:
        """
        Find which plugin provides a given display mode.

        Args:
            mode: Display mode identifier

        Returns:
            Plugin identifier or None if not found.
        """
        normalized_mode = mode.strip().lower()
        # Snapshot the manifests under the lock so iteration happens outside it.
        with self._discovery_lock:
            manifests_snapshot = dict(self.plugin_manifests)
        for plugin_id, manifest in manifests_snapshot.items():
            display_modes = manifest.get('display_modes')
            if isinstance(display_modes, list) and display_modes:
                if any(m.lower() == normalized_mode for m in display_modes):
                    return plugin_id

        return None
    def _get_plugin_update_interval(self, plugin_id: str, plugin_instance: Any) -> Optional[float]:
        """
        Get the update interval for a plugin.

        Checks the manifest first, then the plugin's config section, and
        falls back to a 60-second default.

        Args:
            plugin_id: Plugin identifier
            plugin_instance: Plugin instance

        Returns:
            Update interval in seconds (60.0 when not configured)
        """
        # Check manifest first
        manifest = self.plugin_manifests.get(plugin_id, {})
        update_interval = manifest.get('update_interval')

        if update_interval:
            try:
                return float(update_interval)
            except (ValueError, TypeError):
                pass

        # Check plugin config
        if self.config_manager:
            try:
                config = self.config_manager.get_config()
                plugin_config = config.get(plugin_id, {})
                update_interval = plugin_config.get('update_interval')
                if update_interval:
                    try:
                        return float(update_interval)
                    except (ValueError, TypeError):
                        pass
            except Exception as e:
                self.logger.debug("Could not get update interval from config: %s", e)

        # Default: 60 seconds
        return 60.0
    def _record_update_failure(
        self,
        plugin_id: str,
        exc: Optional[Exception] = None,
    ) -> None:
        """Apply the standard failure-recovery path for a plugin update.

        Stamps plugin_last_update with the actual failure time so the full
        configured interval elapses before the next retry, then transitions
        the plugin back to ENABLED (not ERROR) with structured error context
        so automatic recovery happens on the next scheduled cycle.

        Args:
            plugin_id: Plugin identifier
            exc: The exception that caused the failure, if any. When None, a
                synthetic ExecutionFailure exception is constructed for the
                timeout/executor-error path.
        """
        failure_time = time.time()
        if exc is not None:
            err: Exception = exc
            error_type = type(exc).__name__
        else:
            err = Exception(f"Plugin {plugin_id} execution failed (timeout or executor error)")
            error_type = 'ExecutionFailure'

        error_info = {
            'error': str(err),
            'error_type': error_type,
            'timestamp': failure_time,
            'recoverable': True,
        }
        self.logger.warning("Plugin %s update() failed; will retry after interval", plugin_id)
        self.plugin_last_update[plugin_id] = failure_time
        self.state_manager.set_state_with_error(plugin_id, PluginState.ENABLED, error_info, error=err)
        if self.health_tracker:
            self.health_tracker.record_failure(plugin_id, err)
    def run_scheduled_updates(self, current_time: Optional[float] = None) -> None:
        """
        Trigger plugin updates based on their defined update intervals.
        Includes health tracking and circuit breaker logic.
        Uses PluginExecutor for safe execution with timeout.
        """
        if current_time is None:
            current_time = time.time()

        for plugin_id, plugin_instance in list(self.plugins.items()):
            if not getattr(plugin_instance, "enabled", True):
                continue

            if not hasattr(plugin_instance, "update"):
                continue

            # Check circuit breaker before attempting update
            if self.health_tracker and self.health_tracker.should_skip_plugin(plugin_id):
                continue

            # Check if plugin can execute
            if not self.state_manager.can_execute(plugin_id):
                continue

            interval = self._get_plugin_update_interval(plugin_id, plugin_instance)
            if interval is None:
                continue

            last_update = self.plugin_last_update.get(plugin_id, 0.0)

            if last_update == 0.0 or (current_time - last_update) >= interval:
                # Update state to RUNNING
                self.state_manager.set_state(plugin_id, PluginState.RUNNING)

                try:
                    # Use PluginExecutor for safe execution
                    if self.resource_monitor:
                        # If resource monitor exists, wrap the call and forward
                        # the plugin's return value to the executor.
                        def monitored_update():
                            return self.resource_monitor.monitor_call(plugin_id, plugin_instance.update)

                        # staticmethod() keeps the throwaway wrapper object from
                        # binding monitored_update as an instance method, which
                        # would pass an unexpected `self` argument when the
                        # executor calls wrapper.update().
                        wrapper = type('obj', (object,), {'update': staticmethod(monitored_update)})()
                        success = self.plugin_executor.execute_update(wrapper, plugin_id)
                    else:
                        success = self.plugin_executor.execute_update(plugin_instance, plugin_id)

                    if success:
                        self.plugin_last_update[plugin_id] = current_time
                        self.state_manager.record_update(plugin_id)
                        # Update state back to ENABLED
                        self.state_manager.set_state(plugin_id, PluginState.ENABLED)
                        # Record success
                        if self.health_tracker:
                            self.health_tracker.record_success(plugin_id)
                    else:
                        self._record_update_failure(plugin_id)
                except Exception as exc:  # pylint: disable=broad-except
                    self.logger.exception("Error updating plugin %s: %s", plugin_id, exc)
                    self._record_update_failure(plugin_id, exc=exc)
    def update_all_plugins(self) -> None:
        """
        Update all enabled plugins.
        Calls update() on each enabled plugin using PluginExecutor.
        """
        for plugin_id, plugin_instance in list(self.plugins.items()):
            if not getattr(plugin_instance, "enabled", True):
                continue

            if not hasattr(plugin_instance, "update"):
                continue

            # Check if plugin can execute
            if not self.state_manager.can_execute(plugin_id):
                continue

            # Update state to RUNNING
            self.state_manager.set_state(plugin_id, PluginState.RUNNING)

            try:
                success = self.plugin_executor.execute_update(plugin_instance, plugin_id)
                if success:
                    self.plugin_last_update[plugin_id] = time.time()
                    self.state_manager.record_update(plugin_id)
                    self.state_manager.set_state(plugin_id, PluginState.ENABLED)
                else:
                    self._record_update_failure(plugin_id)
            except Exception as exc:  # pylint: disable=broad-except
                self.logger.exception("Error updating plugin %s: %s", plugin_id, exc)
                self._record_update_failure(plugin_id, exc=exc)
    def get_plugin_health_metrics(self) -> Dict[str, Any]:
        """
        Get health metrics for all plugins.

        Returns:
            Dictionary mapping plugin_id to health metrics
        """
        metrics = {}
        for plugin_id in self.plugins.keys():
            plugin_metrics = {}

            # Get state information
            state_info = self.state_manager.get_state_info(plugin_id)
            plugin_metrics.update(state_info)

            # Get health tracker metrics if available
            if self.health_tracker:
                health_info = self.health_tracker.get_plugin_health(plugin_id)
                plugin_metrics['health'] = health_info
            else:
                plugin_metrics['health'] = {'status': 'unknown'}

            metrics[plugin_id] = plugin_metrics
        return metrics
    def get_plugin_resource_metrics(self) -> Dict[str, Any]:
        """
        Get resource usage metrics for all plugins.

        Returns:
            Dictionary mapping plugin_id to resource metrics
        """
        metrics = {}
        for plugin_id in self.plugins.keys():
            plugin_metrics = {}

            # Get state information
            state_info = self.state_manager.get_state_info(plugin_id)
            plugin_metrics.update(state_info)

            # Get resource monitor metrics if available
            if self.resource_monitor:
                resource_info = self.resource_monitor.get_plugin_metrics(plugin_id)
                plugin_metrics['resources'] = resource_info
            else:
                plugin_metrics['resources'] = {'status': 'unknown'}

            metrics[plugin_id] = plugin_metrics
        return metrics
    def get_plugin_state(self, plugin_id: str) -> Dict[str, Any]:
        """
        Get comprehensive state information for a plugin.

        Args:
            plugin_id: Plugin identifier

        Returns:
            Dictionary with state information
        """
        return self.state_manager.get_state_info(plugin_id)