Files
LEDMatrix/scripts/analyze_plugin_schemas.py
Chuck 05b3fa56cb fix: Codacy security fixes, CVE dependency bumps, and code quality cleanup (#331)
* fix(deps): bump minimum versions to address CVEs

Pillow 10.4.0 → 12.2.0: CVE-2026-40192 (DoS via FITS decompression bomb),
CVE-2026-25990 (OOB write via PSD image), CVE-2026-42311/42308/42310

requests 2.32.0 → 2.33.0: CVE-2026-25645 (temp file security bypass),
CVE-2024-47081 (.netrc credentials leak)

werkzeug 3.0.0 → 3.1.6: CVE-2023-46136, CVE-2024-49766/49767,
CVE-2025-66221, CVE-2026-21860/27199 (DoS, path traversal, safe_join bypass)

Flask 3.0.0 → 3.1.3: CVE-2026-27205 (session data caching info disclosure)

spotipy 2.24.0 → 2.25.2: CVE-2025-27154, CVE-2025-66040

python-socketio 5.11.0 → 5.14.0: CVE-2025-61765

pytest 7.4.0 → 9.0.3: CVE-2025-71176 (insecure temp dir handling)

Updated in requirements.txt, web_interface/requirements.txt,
plugin-repos/starlark-apps/requirements.txt, and
plugin-repos/march-madness/requirements.txt.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: resolve Pylint errors in executor, data service, and odds call

Rename TimeoutError to PluginTimeoutError in plugin_executor.py to
avoid shadowing the built-in; no external callers affected.

Remove dead try/except in BackgroundDataService.shutdown: executor.shutdown()
never accepted a timeout kwarg so the try branch always raised TypeError.
Simplify to a direct shutdown(wait=wait) call.

Remove is_live kwarg from odds_manager.get_odds() call in sports.py;
BaseOddsManager.get_odds() has no such parameter. The live update interval
is already encoded in the update_interval_seconds argument passed alongside.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: MD5→SHA-256, shellcheck warnings, and broken doc links

config_service.py: replace MD5 with SHA-256 for config change detection;
same semantics (equality comparison), no stored hashes affected.

Shell scripts — shellcheck warnings:
- diagnose_web_interface.sh: remove useless cat (SC2002)
- dev_plugin_setup.sh: restructure A&&B||C into if/then (SC2015)
- fix_assets_permissions.sh: remove unused REAL_HOME block (SC2034)
- install_web_service.sh: remove unused USER_HOME assignment (SC2034)
- diagnose_web_ui.sh: remove unused SUDO assignments (SC2034)
- diagnose_plugin_permissions.sh: remove unused BLUE color var (SC2034)
- first_time_install.sh: remove unused CLEAR var, PACKAGE_NAME
  assignment, and replace loop variable with _ (SC2034)

docs/PLUGIN_ARCHITECTURE_SPEC.md: fix 10 broken TOC anchor links to
include section numbers matching the actual headings (MD051).

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: remove unused imports and bare exception aliases (pyflakes F401/F841)

Remove unused imports across 86 files in src/, web_interface/, test/,
and scripts/ using autoflake. No logic changes — only dead import
statements and unused names in from-imports are removed.

Also remove bare exception aliases where the variable is never
referenced in the handler body:
- src/cache/disk_cache.py: except (IOError, OSError, PermissionError) as e
- src/cache_manager.py: except (OSError, IOError, PermissionError) as perm_error
- src/plugin_system/resource_monitor.py: except Exception as e
- web_interface/app.py: except Exception as read_err

86 files changed, 205 lines removed, 18 pre-existing test failures unchanged.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: remove unused local variable assignments (pyflakes F841)

Dead assignments removed across src/ and web_interface/:

- background_data_service: drop future= on fire-and-forget executor.submit
- base_classes/baseball: drop font= (all rendering uses self.fonts['time'])
- base_classes/hockey: drop status_short= (never referenced after assignment)
- common/cli: drop game_helper=/config_helper= bindings in import-test block;
  constructors called for instantiation-only validation
- common/display_helper: drop text_width= (x_position uses display_width
  directly); drop draw= in create_error_image (uses _draw_centered_text)
- config_manager: remove dead secrets_content loading block in migration path
  (comment already noted save_config_atomic handles secrets internally)
- display_manager: drop setup_start= (timing was never completed or read)
- font_manager: drop target_path= (catalog uses font_file_path directly);
  drop face=/font= bindings in validate_font (validation by construction —
  TypeError on failure is the signal, not the return value)
- font_test_manager: drop width=/height= (draw_text uses display_manager directly)
- plugin_system/state_reconciliation: drop manager= (only config/disk/state_mgr used)
- plugin_system/store_manager: drop result= on pip install subprocess.run
  (check=True raises on failure; stdout unused)
- web_interface/blueprints/pages_v3: drop main_config_path=""/secrets_config_path=""
  (render_template uses config_manager.get_*_path() inline)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix(js): resolve ESLint no-undef warnings across 6 JS files

Three distinct patterns:

1. Vendor library globals — htmx is injected by <script> before these
   extension files load; ESLint lints files in isolation and doesn't know.
   Fix: add /* global htmx */ to htmx-sse.js and htmx-json-enc.js.

2. Cross-file globals — showNotification is defined as window.showNotification
   in app.js/notification.js but called bare in app.js and error_handler.js.
   ESLint doesn't connect window.X = Y with a bare call to X.
   Fix: add /* global showNotification */ to app.js and error_handler.js.

3. Forward-reference window.* functions — in array-table.js, checkbox-group.js,
   and custom-feeds.js, functions like removeArrayTableRow are called early
   inside event-handler closures but assigned to window.* later in the file.
   At runtime this works (the handler fires after the assignment), but ESLint
   sees the bare name at the call site.
   Fix: change bare calls to window.removeArrayTableRow(this) etc. so the
   reference is explicit and ESLint-safe.

Also guard the updateSystemStats call in app.js reconnectSSE: the function
is called but defined nowhere in the codebase. Guard with typeof check so
it won't throw ReferenceError if the reconnect path is hit.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix(js): resolve Biome lint warnings across 9 JS files

noUnusedVariables (catch bindings → optional catch syntax):
- app.js, file-upload.js, timezone-selector.js: } catch (e) { → } catch {
  ES2019 optional catch binding; e was unused in all three handlers

noUnusedVariables (dead assignments):
- app.js: remove const data= in display SSE stub (handler does nothing yet)
- api_client.js: remove const timeoutId= (setTimeout ID never used to cancel)
- custom-feeds.js: remove const oldIndex= (getAttribute result never read)
- schedule-picker.js: remove const compactMode= (never used in HTML build)
- select-dropdown.js: remove const icons= (icons not yet rendered in options)

noPrototypeBuiltins:
- day-selector.js: DAY_LABELS.hasOwnProperty(x) →
  Object.prototype.hasOwnProperty.call(DAY_LABELS, x)
  Safe form that works even on null-prototype objects

useIterableCallbackReturn:
- file-upload.js, notification.js: forEach(x => expr) →
  forEach(x => { expr; }) — forEach ignores return values;
  implicit return from arrow body was misleading

htmx-sse.js is a vendor extension file with old-style var/== patterns
that are correct for it; 18 Biome issues suppressed via Codacy API
rather than modifying the vendor source.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix(security): escape user input in raw HTML responses in pages_v3.py

plugin_id comes directly from the URL path
(/partials/plugin-config/<plugin_id>) and was interpolated into an HTML
fragment without escaping. A crafted URL like
/partials/plugin-config/<script>alert(1)</script> would inject that
tag into the DOM via the HTMX partial response.

Fix: wrap all user-controlled values in markupsafe.escape() before
embedding in raw HTML strings. Affects the plugin-not-found 404
response and both error 500 responses in the plugin config partial.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: address Bandit B108/B110 across production code

B110 (try/except/pass):
- display_controller.py: narrow 'except Exception' to 'except AttributeError'
  for get_offset_frame() — plugins not having this optional method is the
  expected case, not all exceptions
- config_manager.py: B110 already resolved by the earlier removal of the
  dead secrets-loading block (the except/pass was inside it)
- All other except/pass blocks in src/ and web_interface/ are intentional
  (last-resort recovery, best-effort fallbacks, non-critical startup probes).
  Annotated each with # nosec B110 and a brief inline reason so the decision
  is explicit for future reviewers.
- Test files and plugin-repos B110 suppressed via Codacy API (not prod code).

B108 (/tmp usage):
- permission_utils.py: /tmp listed to PREVENT permission changes on it — not
  used as a temp path. Annotated # nosec B108.
- display_manager.py: fixed snapshot path is intentional (web UI reads same
  path); path-check guard also annotated.
- wifi_manager.py: named /tmp files match the sudoers allowlist installed with
  the system (the paths are hard-coded in both places by design). Annotated
  all six open/cp references # nosec B108.
- scripts/render_plugin.py: dev script default overridable by user. Annotated.
- web_interface/app.py: reads the same fixed path written by display_manager.
  Annotated # nosec B108.
- Test files suppressed via Codacy API.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: address remaining Codacy security findings

Flask debug=True (real fix):
- web_interface/app.py: debug=True in __main__ block exposes the Werkzeug
  interactive debugger (arbitrary code execution). Changed to
  os.environ.get('FLASK_DEBUG', '0') == '1' — off by default, opt-in
  via environment variable for local development.

nosec annotations (accepted risk with documented rationale):
- disk_cache.py: os.chmod(0o660) is intentional — web UI and LED matrix
  service share a group, 660 gives group write while denying world access
  (B103 + Semgrep insecure-file-permissions suppressed in Codacy)
- wifi_manager.py: urlopen to hardcoded connectivity-check.ubuntu.com URL
  (B310 — no user input involved)
- font_manager.py: urlretrieve URL comes from user's own config file on
  their local device (B310)
- start_web_conditionally.py: os.execvp with both sys.executable and a
  fixed PROJECT_DIR-relative constant (B606)

Confirmed false positives suppressed via Codacy API (15 issues):
- SSRF (3x): client-side JS fetch — SSRF is server-side; browser fetch
  is CORS-restricted to same origin
- B105 (3x): test fixtures use dummy secrets by design; store_manager
  checks for the placeholder string, it is not itself a secret
- PMD numeric literal (2x): 10000000 is within Number.MAX_SAFE_INTEGER
- Prototype pollution (1x): read-only schema traversal, no writes
- no-unsanitized_method (1x): dynamic import() is CORS-restricted
- detect-unsafe-regex (1x): operates on server-controlled config values
- plugin-repos B103 (1x): vendor code chmod on executable
- Semgrep insecure-file-permissions (3x): same disk_cache 0o660 as above

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: remove unnecessary f prefix from f-strings without placeholders (F541)

Pyflakes F541 flags f-strings that contain no {} interpolation — they are
identical to plain strings but trigger unnecessary string formatting overhead.

Fixed in production code:
- src/base_classes/data_sources.py (2 debug log calls)
- src/logo_downloader.py (1 error log)
- src/plugin_system/store_manager.py (5 strings across 3 log calls)
- src/web_interface/validators.py (1 return value)
- src/wifi_manager.py (4 log/message strings)
- web_interface/start.py (1 print)

F541 issues in test/, scripts/, and plugin-repos/ suppressed via Codacy API
as non-production code.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* chore(dev): add Pillow compatibility smoke test script

Covers all Pillow APIs used in LEDMatrix — image creation, drawing,
font metrics, LANCZOS resampling, paste/alpha_composite, and PNG I/O.
Run after any Pillow version bump to catch regressions before deploy.

    python3 scripts/dev/test_pillow_compat.py

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: resolve 8 new Codacy issues introduced by PR changes

shellcheck SC2034:
- first_time_install.sh: 'type' loop variable also unused in the wifi
  status loop (we previously fixed 'device' → '_' but left 'type').
  Changed to '_ _ state' since neither device nor type is referenced.

ESLint no-undef:
- app.js: typeof guards don't satisfy no-undef; added updateSystemStats
  to the /* global */ declaration alongside showNotification.

nosec annotation:
- web_interface/app.py: app.run(host='0.0.0.0') line changed when we
  fixed debug=True, giving it a new issue ID. Re-added # nosec B104.

pyflakes F401:
- scripts/dev/test_pillow_compat.py: ImageFilter was imported but never
  used in the smoke test. Removed from the import.

Codacy API suppressions (false positives on changed lines):
- disk_cache.py 0o660 chmod (2x): lines changed when # nosec B103 was
  added, producing new Semgrep issue IDs. Re-suppressed.
- pages_v3.py raw-html-concat: Semgrep does not recognise escape() as
  a sanitizer; the escape() call IS the correct fix.
- app.py flask 0.0.0.0: same line as B104 above; Semgrep rule also
  re-suppressed.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix: address PR review findings

Fix (10 of 15 findings):

plugin-repos/march-madness/requirements.txt:
  Add urllib3>=1.26.0 — manager.py directly imports from urllib3; it was
  an undeclared transitive dependency via requests.

scripts/dev/dev_plugin_setup.sh:
  Restore subshell form (cd "$target_dir" && git pull --rebase) || true
  so the shell's working directory is not permanently changed after the
  if-cd block. Previous fix for SC2015 leaked cwd into the remainder of
  the script.

src/base_classes/sports.py:
  Narrow 'except Exception' to 'except RuntimeError as e' and log via
  self.logger.debug — Path.home() raises only RuntimeError for service
  users; other exceptions should not be silently swallowed.

src/config_service.py:
  Fix stale "MD5 checksum" in ConfigVersion.__init__ docstring (line 40);
  the implementation uses SHA-256 since the Codacy fix.

src/wifi_manager.py:
  Log the last-resort AP enable failure with exc_info=True instead of
  silently passing — failure here means the device may be unreachable.

web_interface/blueprints/pages_v3.py:
  Log the outer metadata pre-load exception at debug level instead of
  swallowing it silently; schema still loads fully below.

src/background_data_service.py:
  Remove unused 'timeout' parameter from shutdown() — executor.shutdown()
  does not accept timeout; update __del__ caller accordingly.

src/font_manager.py:
  Validate URL scheme before urlretrieve — reject non-http/https schemes
  (e.g. file://) to prevent reading local files from config-supplied URLs.

src/plugin_system/plugin_executor.py:
  Simplify redundant except tuple: (PluginTimeoutError, PluginError,
  Exception) → Exception, which already covers the others.

test/test_display_controller.py:
  Mark empty test_plugin_discovery_and_loading as @pytest.mark.skip with
  reason. Move duplicate 'from datetime import datetime' to module header
  and remove the stray mid-module copy.

Skip (5 of 15 findings, with reasons):
  - pytest 9.0.3 concerns: full suite already verified (467 pass, 18 pre-existing)
  - Pillow 12.2.0 API concerns: no deprecated APIs in codebase; tests + Pi smoke test pass
  - diagnose_web_ui.sh sudo validation: set -e already ensures fail-fast on any sudo failure
  - app.py request-logging except: must stay silent (recursive logging risk); annotated
  - app.py SSE file-read except: genuinely transient I/O; annotated

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Chuck <chuck@example.com>
Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-05-15 10:19:55 -04:00

281 lines
9.6 KiB
Python
Executable File

#!/usr/bin/env python3
"""
Analyze all plugin config schemas to identify issues:
- Duplicate fields
- Inconsistencies
- Missing common fields
- Naming variations
- Formatting issues
"""
import json
from pathlib import Path
from typing import Dict, List, Any
import jsonschema
from jsonschema import Draft7Validator
# Standard common fields that should be in all plugins
STANDARD_COMMON_FIELDS = {
"enabled": {
"type": "boolean",
"default": False,
"description": "Enable or disable this plugin",
"required": True,
"order": 1
},
"display_duration": {
"type": "number",
"default": 15,
"minimum": 1,
"maximum": 300,
"description": "How long to display this plugin in seconds",
"order": 2
},
"live_priority": {
"type": "boolean",
"default": False,
"description": "Enable live priority takeover when plugin has live content",
"order": 3
},
"high_performance_transitions": {
"type": "boolean",
"default": False,
"description": "Use high-performance transitions (120 FPS) instead of standard (30 FPS)",
"order": 4
},
"update_interval": {
"type": "integer",
"default": 60,
"minimum": 1,
"description": "How often to refresh data in seconds",
"order": 5
},
"transition": {
"type": "object",
"order": 6
}
}
def find_duplicate_fields(schema: Dict[str, Any], path: str = "") -> List[str]:
"""Find duplicate field definitions within a schema."""
duplicates = []
seen_fields = {}
def check_properties(props: Dict[str, Any], current_path: str):
if not isinstance(props, dict):
return
for key, value in props.items():
full_path = f"{current_path}.{key}" if current_path else key
if key in seen_fields:
duplicates.append(f"Duplicate field '{key}' at {full_path} (also at {seen_fields[key]})")
else:
seen_fields[key] = full_path
# Recursively check nested objects
if isinstance(value, dict):
if "properties" in value:
check_properties(value["properties"], full_path)
elif "items" in value and isinstance(value["items"], dict):
if "properties" in value["items"]:
check_properties(value["items"]["properties"], f"{full_path}[items]")
if "properties" in schema:
check_properties(schema["properties"], "")
return duplicates
def validate_schema_syntax(schema_path: Path) -> tuple[bool, List[str]]:
"""Validate JSON Schema syntax."""
errors = []
try:
with open(schema_path, 'r', encoding='utf-8') as f:
schema = json.load(f)
# Validate schema structure
Draft7Validator.check_schema(schema)
return True, []
except json.JSONDecodeError as e:
return False, [f"JSON syntax error: {str(e)}"]
except jsonschema.SchemaError as e:
return False, [f"Schema validation error: {str(e)}"]
except Exception as e:
return False, [f"Error: {str(e)}"]
def analyze_schema(schema_path: Path) -> Dict[str, Any]:
"""Analyze a single schema file."""
plugin_id = schema_path.parent.name
analysis = {
"plugin_id": plugin_id,
"path": str(schema_path),
"valid": False,
"errors": [],
"warnings": [],
"has_title": False,
"has_description": False,
"common_fields": {},
"missing_common_fields": [],
"naming_issues": [],
"duplicates": [],
"property_order": [],
"update_interval_variant": None
}
try:
with open(schema_path, 'r', encoding='utf-8') as f:
schema = json.load(f)
# Check for title and description
analysis["has_title"] = "title" in schema
analysis["has_description"] = "description" in schema
if not analysis["has_title"]:
analysis["warnings"].append("Missing 'title' field at root level")
if not analysis["has_description"]:
analysis["warnings"].append("Missing 'description' field at root level")
# Validate schema syntax
is_valid, errors = validate_schema_syntax(schema_path)
analysis["valid"] = is_valid
analysis["errors"].extend(errors)
if not is_valid:
return analysis
# Check for duplicate fields
duplicates = find_duplicate_fields(schema)
analysis["duplicates"] = duplicates
# Check properties
if "properties" not in schema:
analysis["errors"].append("Missing 'properties' field")
return analysis
properties = schema["properties"]
# Check common fields
for field_name, field_spec in STANDARD_COMMON_FIELDS.items():
if field_name in properties:
analysis["common_fields"][field_name] = properties[field_name]
else:
# Check for variants
if field_name == "update_interval":
# Check for update_interval_seconds variant
if "update_interval_seconds" in properties:
analysis["update_interval_variant"] = "update_interval_seconds"
analysis["naming_issues"].append(
f"Uses 'update_interval_seconds' instead of 'update_interval'"
)
else:
analysis["missing_common_fields"].append(field_name)
else:
analysis["missing_common_fields"].append(field_name)
# Check property order (enabled should be first)
prop_keys = list(properties.keys())
analysis["property_order"] = prop_keys
if prop_keys and prop_keys[0] != "enabled":
analysis["warnings"].append(
f"'enabled' is not first property. First property is '{prop_keys[0]}'"
)
# Check for required fields
required = schema.get("required", [])
if "enabled" not in required:
analysis["warnings"].append("'enabled' is not in required fields")
except Exception as e:
analysis["errors"].append(f"Failed to analyze schema: {str(e)}")
return analysis
def main():
"""Main analysis function."""
project_root = Path(__file__).parent.parent
plugins_dir = project_root / "plugins"
if not plugins_dir.exists():
print(f"Plugins directory not found: {plugins_dir}")
return
results = []
# Find all config_schema.json files
schema_files = list(plugins_dir.glob("*/config_schema.json"))
print(f"Found {len(schema_files)} plugin schemas to analyze\n")
for schema_path in sorted(schema_files):
print(f"Analyzing {schema_path.parent.name}...")
analysis = analyze_schema(schema_path)
results.append(analysis)
# Print summary
print("\n" + "="*80)
print("ANALYSIS SUMMARY")
print("="*80)
for result in results:
print(f"\n{result['plugin_id']}:")
print(f" Valid: {result['valid']}")
if result['errors']:
print(f" Errors ({len(result['errors'])}):")
for error in result['errors']:
print(f" - {error}")
if result['warnings']:
print(f" Warnings ({len(result['warnings'])}):")
for warning in result['warnings']:
print(f" - {warning}")
if result['duplicates']:
print(f" Duplicates ({len(result['duplicates'])}):")
for dup in result['duplicates']:
print(f" - {dup}")
if result['missing_common_fields']:
print(f" Missing common fields: {', '.join(result['missing_common_fields'])}")
if result['naming_issues']:
print(f" Naming issues:")
for issue in result['naming_issues']:
print(f" - {issue}")
if result['property_order'] and result['property_order'][0] != 'enabled':
print(f" Property order: First is '{result['property_order'][0]}' (should be 'enabled')")
# Overall statistics
print("\n" + "="*80)
print("OVERALL STATISTICS")
print("="*80)
valid_count = sum(1 for r in results if r['valid'])
has_title_count = sum(1 for r in results if r['has_title'])
has_description_count = sum(1 for r in results if r['has_description'])
enabled_first_count = sum(1 for r in results if r['property_order'] and r['property_order'][0] == 'enabled')
total_errors = sum(len(r['errors']) for r in results)
total_warnings = sum(len(r['warnings']) for r in results)
total_duplicates = sum(len(r['duplicates']) for r in results)
print(f"Total plugins: {len(results)}")
print(f"Valid schemas: {valid_count}/{len(results)}")
print(f"Has title: {has_title_count}/{len(results)}")
print(f"Has description: {has_description_count}/{len(results)}")
print(f"'enabled' first: {enabled_first_count}/{len(results)}")
print(f"Total errors: {total_errors}")
print(f"Total warnings: {total_warnings}")
print(f"Total duplicates: {total_duplicates}")
# Save detailed report
report_path = project_root / "plugin_schema_analysis.json"
with open(report_path, 'w', encoding='utf-8') as f:
json.dump(results, f, indent=2)
print(f"\nDetailed report saved to: {report_path}")
if __name__ == "__main__":
main()