mirror of
https://github.com/ChuckBuilds/LEDMatrix.git
synced 2026-04-10 21:03:01 +00:00
Chaotic mega-merge into main. THINGS WILL PROBABLY BE BROKEN
* chore: Update soccer-scoreboard submodule to merged commit
- Update submodule reference to include manifest.json v2 registry format
- Version updated to 1.0.1
* refactor: Remove test_mode and logo_dir config reading from base SportsCore
- Remove test_mode initialization and usage
- Remove logo_dir reading from mode_config
- Use LogoDownloader defaults directly for logo directories
* chore: Update plugin submodules after removing global properties
- Update basketball-scoreboard submodule (removed global test_mode, live_priority, dynamic_duration, logo_dir)
- Update soccer-scoreboard submodule (removed global test_mode, live_priority, dynamic_duration, logo_dir)
* feat(calendar): Add credentials.json file upload via web interface
- Add API endpoint /api/v3/plugins/calendar/upload-credentials for file upload
- Validate JSON format and Google OAuth structure
- Save file to plugin directory with secure permissions (0o600)
- Backup existing credentials.json before overwriting
- Add file upload widget support for string fields in config forms
- Add frontend handler handleCredentialsUpload() for single file uploads
- Update .gitignore to allow calendar submodule
- Update calendar submodule reference
* fix(web): Improve spacing for nested configuration sections
- Add dynamic margin based on nesting depth (mb-6 for deeply nested sections)
- Increase padding in nested content areas (py-3 to py-4)
- Add extra spacing after nested sections to prevent overlap
- Enhance CSS spacing for nested sections (1.5rem for nested, 2rem for deeply nested)
- Add padding-bottom to expanded nested content to prevent cutoff
- Fixes issue where game_limits and other nested settings were hidden under next section header
* chore(plugins): Update sports scoreboard plugins with live update interval fix
- Updated hockey-scoreboard, football-scoreboard, basketball-scoreboard, and soccer-scoreboard submodules
- All plugins now fix the interval selection bug that caused live games to update every 5 minutes instead of 30 seconds
- Ensures all live games update at the configured live_update_interval (30s) for timely score updates
* fix: Initialize test_mode in SportsLive and fix config migration
- Add test_mode initialization in SportsLive.__init__() to prevent AttributeError
- Remove invalid new_secrets parameter from save_config_atomic() call in config migration
- Fixes errors: 'NBALiveManager' object has no attribute 'test_mode'
- Fixes errors: ConfigManager.save_config_atomic() got unexpected keyword argument 'new_secrets'
* chore: Update submodules with test_mode initialization fixes
- Update basketball-scoreboard submodule
- Update soccer-scoreboard submodule
* fix(plugins): Auto-stash local changes before plugin updates
- Automatically stash uncommitted changes before git pull during plugin updates
- Prevents update failures when plugins have local modifications
- Improves error messages for git update failures
- Matches behavior of main LEDMatrix update process
* fix(basketball-scoreboard): Update submodule with timeout fix
- Updated basketball-scoreboard plugin to fix update() timeout issue
- Plugin now uses fire-and-forget odds fetching for upcoming games
- Prevents 30-second timeout when processing many upcoming games
Also fixed permission issue on devpi:
- Changed /var/cache/ledmatrix/display_on_demand_state.json permissions
from 600 to 660 to allow web service (devpi user) to read the file
* fix(cache): Ensure cache files use 660 permissions for group access
- Updated setup_cache.sh to set file permissions to 660 (not 775)
- Updated first_time_install.sh to properly set cache file permissions
- Modified DiskCache to set 660 permissions when creating cache files
- Ensures display_on_demand_state.json and other cache files are readable
by web service (devpi user) which is in ledmatrix group
This fixes permission issues where cache files were created with 600
permissions, preventing the web service from reading them. Now files
are created with 660 (rw-rw----) allowing group read access.
* fix(soccer-scoreboard): Update submodule with manifest fix
- Updated soccer-scoreboard plugin submodule
- Added missing entry_point and class_name to manifest.json
- Fixes plugin loading error: 'No class_name in manifest'
Also fixed cache file permissions on devpi server:
- Changed display_on_demand_state.json from 600 to 660 permissions
- Allows web service (devpi user) to read cache files
* fix(display): Remove update_display() calls from clear() to prevent black flash
Previously, display_manager.clear() was calling update_display() twice,
which immediately showed a black screen on the hardware before new
content could be drawn. This caused visible black flashes when switching
between modes, especially when plugins switch from general modes (e.g.,
football_upcoming) to specific sub-modes (e.g., nfl_upcoming).
Now clear() only prepares the buffer without updating the hardware.
Callers can decide when to update the display, allowing smooth transitions
from clear → draw → update_display() without intermediate black flashes.
Places that intentionally show a cleared screen (error cases) already
explicitly call update_display() after clear(), so backward compatibility
is maintained.
* fix(scroll): Prevent wrap-around before cycle completion in dynamic duration
- Check scroll completion BEFORE allowing wrap-around
- Clamp scroll_position when complete to prevent visual loop
- Only wrap-around if cycle is not complete yet
- Fixes issue where stocks plugin showed first stock again at end
- Completion logged only once to avoid spam
- Ensures smooth transition to next mode without visual repeat
* fix(on-demand): Ensure on-demand buttons work and display service runs correctly
- Add early stub functions for on-demand modal to ensure availability when Alpine.js initializes
- Increase on-demand request cache max_age from 5min to 1hr to prevent premature expiration
- Fixes issue where on-demand buttons were not functional due to timing issues
- Ensures display service properly picks up on-demand requests when started
* test: Add comprehensive test coverage (30%+)
- Add 100+ new tests across core components
- Add tests for LayoutManager (27 tests)
- Add tests for PluginLoader (14 tests)
- Add tests for SchemaManager (20 tests)
- Add tests for MemoryCache and DiskCache (24 tests)
- Add tests for TextHelper (9 tests)
- Expand error handling tests (7 new tests)
- Improve coverage from 25.63% to 30.26%
- All 237 tests passing
Test files added:
- test/test_layout_manager.py
- test/test_plugin_loader.py
- test/test_schema_manager.py
- test/test_text_helper.py
- test/test_config_service.py
- test/test_display_controller.py
- test/test_display_manager.py
- test/test_error_handling.py
- test/test_font_manager.py
- test/test_plugin_system.py
Updated:
- pytest.ini: Enable coverage reporting with 30% threshold
- test/conftest.py: Enhanced fixtures for better test isolation
- test/test_cache_manager.py: Expanded cache component tests
- test/test_config_manager.py: Additional config tests
Documentation:
- HOW_TO_RUN_TESTS.md: Guide for running and understanding tests
* test(web): Add comprehensive API endpoint tests
- Add 30 new tests for Flask API endpoints in test/test_web_api.py
- Cover config, system, display, plugins, fonts, and error handling APIs
- Increase test coverage from 30.26% to 30.87%
- All 267 tests passing
Tests cover:
- Config API: GET/POST main config, schedule, secrets
- System API: Status, version, system actions
- Display API: Current display, on-demand start/stop
- Plugins API: Installed plugins, health, config, operations, state
- Fonts API: Catalog, tokens, overrides
- Error handling: Invalid JSON, missing fields, 404s
* test(plugins): Add comprehensive integration tests for all plugins
- Add base test class for plugin integration tests
- Create integration tests for all 6 plugins:
- basketball-scoreboard (11 tests)
- calendar (10 tests)
- clock-simple (11 tests)
- odds-ticker (9 tests)
- soccer-scoreboard (11 tests)
- text-display (12 tests)
- Total: 64 new plugin integration tests
- Increase test coverage from 30.87% to 33.38%
- All 331 tests passing
Tests verify:
- Plugin loading and instantiation
- Required methods (update, display)
- Manifest validation
- Display modes
- Config schema validation
- Graceful handling of missing API credentials
Uses hybrid approach: integration tests in main repo,
plugin-specific unit tests remain in plugin submodules.
* Add mqtt-notifications plugin as submodule
* fix(sports): Respect games_to_show settings for favorite teams
- Fix upcoming games to show N games per team (not just 1)
- Fix recent games to show N games per team (not just 1)
- Add duplicate removal for games involving multiple favorite teams
- Match behavior of basketball-scoreboard plugin
- Affects NFL, NHL, and other sports using base_classes/sports.py
* chore: Remove debug instrumentation logs
- Remove temporary debug logging added during fix verification
- Fix confirmed working by user
* debug: Add instrumentation to debug configuration header visibility issue
* fix: Resolve nested section content sliding under next header
- Remove overflow-hidden from nested-section to allow proper document flow
- Add proper z-index and positioning to prevent overlap
- Add margin-top to nested sections for better spacing
- Remove debug instrumentation that was causing ERR_BLOCKED_BY_CLIENT errors
* fix: Prevent unnecessary plugin tab redraws
- Add check to only update tabs when plugin list actually changes
- Increase debounce timeout to batch rapid changes
- Compare plugin IDs before updating to avoid redundant redraws
- Fix setter to check for actual changes before triggering updates
* fix: Prevent form-groups from sliding out of view when nested sections expand
- Increase margin-bottom on nested-sections for better spacing
- Add clear: both to nested-sections to ensure proper document flow
- Change overflow to visible when expanded to allow natural flow
- Add margin-bottom to expanded content
- Add spacing rules for form-groups that follow nested sections
- Add clear spacer div after nested sections
* fix: Reduce excessive debug logging in generateConfigForm
- Only log once per plugin instead of on every function call
- Prevents log spam when Alpine.js re-renders the form multiple times
- Reduces console noise from 10+ logs per plugin to 1 log per plugin
* fix: Prevent nested section content from sliding out of view when expanded
- Remove overflow-hidden from nested-section in base.html (was causing clipping)
- Add scrollIntoView to scroll expanded sections into view within modal
- Set nested-section overflow to visible to prevent content clipping
- Add min-height to nested-content to ensure proper rendering
- Wait for animation to complete before scrolling into view
* fix: Prevent form-groups from overlapping and appearing outside view
- Change nested-section overflow to hidden by default, visible when expanded
- Add :has() selector to allow overflow when content is expanded
- Ensure form-groups after nested sections have proper spacing and positioning
- Add clear: both and width: 100% to prevent overlap
- Use !important for margin-top to ensure spacing is applied
- Ensure form-groups are in normal document flow with float: none
* fix: Use JavaScript to toggle overflow instead of :has() selector
- :has() selector may not be supported in all browsers
- Use JavaScript to set overflow: visible when expanded, hidden when collapsed
- This ensures better browser compatibility while maintaining functionality
* fix: Make parent sections expand when nested sections expand
- Add updateParentNestedContentHeight() helper to recursively update parent heights
- When a nested section expands, recalculate all parent nested-content max-heights
- Ensures parent sections (like NFL) expand to accommodate expanded child sections
- Updates parent heights both on expand and collapse for proper animation
* refactor: Simplify parent section expansion using CSS max-height: none
- Remove complex recursive parent height update function
- Use CSS max-height: none when expanded to allow natural expansion
- Parent sections automatically expand because nested-content has no height constraint
- Simpler and more maintainable solution
* refactor: Remove complex recursive parent height update function
- CSS max-height: none already handles parent expansion automatically
- No need for JavaScript to manually update parent heights
- Much simpler and cleaner solution
* debug: Add instrumentation to debug auto-collapse issue
- Add logging to track toggle calls and state changes
- Add guard to prevent multiple simultaneous toggles
- Pass event object to prevent bubbling
- Improve state detection logic
- Add return false to onclick handlers
* chore: Remove debug instrumentation from toggleNestedSection
- Remove all debug logging code
- Keep functional fixes: event handling, toggle guard, improved state detection
- Code is now clean and production-ready
* fix(web): Add browser refresh note to plugin fetch errors
* refactor(text-display): Update submodule to use ScrollHelper
* fix(text-display): Fix scrolling display issue - update position in display()
* feat(text-display): Add scroll_loop option and improve scroll speed control
* debug: Add instrumentation to track plugin enabled state changes
Added debug logging to investigate why plugins appear to disable themselves:
- Track enabled state during plugin load (before/after schema merge)
- Track enabled state during plugin reload
- Track enabled state preservation during config save
- Track state reconciliation fixes
- Track enabled state updates in on_config_change
This will help identify which code path is causing plugins to disable.
* debug: Fix debug log path to work on Pi
Changed hardcoded log path to use dynamic project root detection:
- Uses LEDMATRIX_ROOT env var if set
- Falls back to detecting project root by looking for config directory
- Creates .cursor directory if it doesn't exist
- Falls back to /tmp/ledmatrix_debug.log if all else fails
- Added better error handling with logger fallback
* Remove debug instrumentation for plugin enabled state tracking
Removed all debug logging that was added to track plugin enabled state changes.
The instrumentation has been removed as requested.
* Reorganize documentation and cleanup test files
- Move documentation files to docs/ directory
- Remove obsolete test files
- Update .gitignore and README
* feat(text-display): Switch to frame-based scrolling with high FPS support
* fix(text-display): Add backward compatibility for ScrollHelper sub-pixel scrolling
* feat(scroll_helper): Add sub-pixel scrolling support for smooth movement
- Add sub-pixel interpolation using scipy (if available) or numpy fallback
- Add set_sub_pixel_scrolling() method to enable/disable feature
- Implement _get_visible_portion_subpixel() for fractional pixel positioning
- Implement _interpolate_subpixel() for linear interpolation
- Prevents pixel skipping at slow scroll speeds
- Maintains backward compatibility with integer pixel path
* fix(scroll_helper): Reset last_update_time in reset_scroll() to prevent jump-ahead
- Reset last_update_time when resetting scroll position
- Prevents large delta_time on next update after reset
- Fixes issue where scroll would immediately complete again after reset
- Ensures smooth scrolling continuation after loop reset
* fix(scroll_helper): Fix numpy broadcasting error in sub-pixel interpolation
- Add output_width parameter to _interpolate_subpixel() for variable widths
- Fix wrap-around case to use correct widths for interpolation
- Handle edge cases where source array is smaller than expected
- Prevent 'could not broadcast input array' errors in sub-pixel scrolling
- Ensure proper width matching in all interpolation paths
* feat(scroll): Add frame-based scrolling mode for smooth LED matrix movement
- Add frame_based_scrolling flag to ScrollHelper
- When enabled, moves fixed pixels per step, throttled by scroll_delay
- Eliminates time-based jitter by ignoring frame timing variations
- Provides stock-ticker-like smooth, predictable scrolling
- Update text-display plugin to use frame-based mode
This addresses stuttering issues where time-based scrolling caused
visual jitter due to frame timing variations in the main display loop.
* fix(scroll): Fix NumPy broadcasting errors in sub-pixel wrap-around
- Ensure _interpolate_subpixel always returns exactly requested width
- Handle cases where scipy.ndimage.shift produces smaller arrays
- Add padding logic for wrap-around cases when arrays are smaller than expected
- Prevents 'could not broadcast input array' errors during scrolling
* refactor(scroll): Remove sub-pixel interpolation, use high FPS integer scrolling
- Disable sub-pixel scrolling by default in ScrollHelper
- Simplify get_visible_portion to always use integer pixel positioning
- Restore frame-based scrolling logic for smooth high FPS movement
- Use high frame rate (like stock ticker) for smoothness instead of interpolation
- Reduces complexity and eliminates broadcasting errors
* fix(scroll): Prevent large pixel jumps in frame-based scrolling
- Initialize last_step_time properly to prevent huge initial jumps
- Clamp scroll_speed to max 5 pixels/frame in frame-based mode
- Prevents 60-pixel jumps when scroll_speed is misconfigured
- Simplified step calculation to avoid lag catch-up jumps
* fix(text-display): Align config schema and add validation
- Update submodule reference
- Adds warning and logging for scroll_speed config issues
* fix(scroll): Simplify frame-based scrolling to match stock ticker behavior
- Remove throttling logic from frame-based scrolling
- Move pixels every call (DisplayController's loop timing controls rate)
- Add enable_scrolling attribute to text-display plugin for high-FPS treatment
- Matches stock ticker: simple, predictable movement every frame
- Eliminates jitter from timing mismatches between DisplayController and ScrollHelper
* fix(scroll): Restore scroll_delay throttling in frame-based mode
- Restore time-based throttling using scroll_delay
- Move pixels only when scroll_delay has passed
- Handle lag catch-up with reasonable caps to prevent huge jumps
- Preserve fractional timing for smooth operation
- Now scroll_delay actually controls the scroll speed as intended
* feat(text-display): Add FPS counter logging
- Update submodule reference
- Adds FPS tracking and logging every 5 seconds
* fix(text-display): Add display-width buffer so text scrolls completely off
- Update submodule reference
- Adds end buffer to ensure text exits viewport before looping
* fix: Prevent premature game switching in SportsLive
- Set last_game_switch when games load even if current_game already exists
- Set last_game_switch when same games update but it's still 0
- Add guard to prevent switching check when last_game_switch is 0
- Fixes issue where first game shows for only ~2 seconds before switching
- Also fixes random screen flickering when games change prematurely
* feat(plugins): Add branch selection support for plugin installation
- Add optional branch parameter to install_plugin() and install_from_url() in store_manager
- Update API endpoints to accept and pass branch parameter
- Update frontend JavaScript to support branch selection in install calls
- Maintain backward compatibility - branch parameter is optional everywhere
- Falls back to default branch logic if specified branch doesn't exist
* feat(plugins): Add UI for branch selection in plugin installation
- Add branch input field in 'Install Single Plugin' section
- Add global branch input for store installations
- Update JavaScript to read branch from input fields
- Branch input applies to all store installations when specified
* feat(plugins): Change branch selection to be per-plugin instead of global
- Remove global store branch input field
- Add individual branch input field to each plugin card in store
- Add branch input to custom registry plugin cards
- Each plugin can now have its own branch specified independently
* debug: Add logging to _should_exit_dynamic
* feat(display_controller): Add universal get_cycle_duration support for all plugins
UNIVERSAL FEATURE: Any plugin can now implement get_cycle_duration() to dynamically
calculate the total time needed to show all content for a mode.
New method:
- _plugin_cycle_duration(plugin, display_mode): Queries plugin for calculated duration
Integration:
- Display controller calls plugin.get_cycle_duration(display_mode)
- Uses returned duration as target (respecting max cap)
- Falls back to cap if not provided
Benefits:
- Football plugin: Show all games (3 games × 15s = 45s total)
- Basketball plugin: Could implement same logic
- Hockey/Baseball/any sport: Universal support
- Stock ticker: Could calculate based on number of stocks
- Weather: Could calculate based on forecast days
Example plugin implementation:
Result: Plugins control their own display duration based on actual content,
creating a smooth user experience where all content is shown before switching.
* debug: Add logging to cycle duration call
* debug: Change loop exit logs to INFO level
* fix: Change cycle duration logs to INFO level
* fix: Don't exit loop on False for dynamic duration plugins
For plugins with dynamic duration enabled, keep the display loop running
even when display() returns False. This allows games to continue rotating
within the calculated duration.
The loop will only exit when:
- Cycle is complete (plugin reports all content shown)
- Max duration is reached
- Mode is changed externally
* fix(schedule): Improve display scheduling functionality
- Add GET endpoint for schedule configuration retrieval
- Fix mode switching to clean up old config keys (days/start_time/end_time)
- Improve error handling with consistent error_response() usage
- Enhance display controller schedule checking with better edge case handling
- Add validation for time formats and ensure at least one day enabled in per-day mode
- Add debug logging for schedule state changes
Fixes issues where schedule mode switching left stale config causing incorrect behavior.
* fix(install): Add cmake and ninja-build to system dependencies
Resolves h3 package build failure during first-time installation.
The h3 package (dependency of timezonefinder) requires CMake and
Ninja to build from source. Adding these build tools ensures
successful installation of all Python dependencies.
* fix: Pass display_mode in ALL loop calls to maintain sticky manager
CRITICAL FIX: Display controller was only passing display_mode on first call,
causing plugins to fall back to internal mode cycling and bypass sticky
manager logic.
Now consistently passes display_mode=active_mode on every display() call in
both high-FPS and normal loops. This ensures plugins maintain mode context
and sticky manager state throughout the entire display duration.
* feat(install): Add OS check for Raspberry Pi OS Lite (Trixie)
- Verify OS is Raspberry Pi OS (raspbian/debian)
- Require Debian 13 (Trixie) specifically
- Check for Lite version (no desktop environment)
- Exit with clear error message if requirements not met
- Provide instructions for obtaining correct OS version
* fix(web-ui): Add missing notification handlers to quick action buttons
- Added hx-on:htmx:after-request handlers to all quick action buttons in overview.html
- Added hx-ext='json-enc' for proper JSON encoding
- Added missing notification handler for reboot button in index.html
- Users will now see toast notifications when actions complete or fail
* fix(display): Ensure consistent display mode handling in all plugin calls
- Updated display controller to consistently pass display_mode in all plugin display() calls.
- This change maintains the sticky manager state and ensures plugins retain their mode context throughout the display duration.
- Addresses issues with mode cycling and improves overall display reliability.
* fix(display): Enhance display mode persistence across plugin updates
- Updated display controller to ensure display_mode is consistently maintained during plugin updates.
- This change prevents unintended mode resets and improves the reliability of display transitions.
- Addresses issues with mode persistence, ensuring a smoother user experience across all plugins.
* feat: Add Olympics countdown plugin as submodule
- Add olympics-countdown plugin submodule
- Update .gitignore to allow olympics-countdown plugin
- Plugin automatically determines next Olympics and counts down to opening/closing ceremonies
* feat(web-ui): Add checkbox-group widget support for multi-select arrays
- Add checkbox-group widget rendering in plugins_manager.js
- Update form processing to handle checkbox groups with [] naming
- Support for friendly labels via x-options in config schemas
- Update odds-ticker submodule with checkbox-group implementation
* fix(plugins): Preserve enabled state when saving plugin config from main config endpoint
When saving plugin configuration through save_main_config endpoint, the enabled
field was not preserved if missing from the form data. This caused plugins to
be automatically disabled when users saved their configuration from the plugin
manager tab.
This fix adds the same enabled state preservation logic that exists in
save_plugin_config endpoint, ensuring consistent behavior across both endpoints.
The enabled state is preserved from current config, plugin instance, or defaults
to True to prevent unexpected disabling of plugins.
* fix(git): Resolve git status timeout and exclude plugins from base project updates
- Add --untracked-files=no flag to git status for faster execution
- Increase timeout from 5s to 30s for git status operations
- Add timeout exception handling for git status and stash operations
- Filter out plugins directory from git status checks (plugins are separate repos)
- Exclude plugins from stash operations using :!plugins pathspec
- Apply same fixes to plugin store manager update operations
* feat(plugins): Add granular scroll speed control to odds-ticker and leaderboard plugins
- Add display object to both plugins' config schemas with scroll_speed and scroll_delay
- Enable frame-based scrolling mode for precise FPS control (100 FPS for leaderboard)
- Add set_scroll_speed() and set_scroll_delay() methods to both plugins
- Maintain backward compatibility with scroll_pixels_per_second config
- Leaderboard plugin now explicitly sets target_fps to 100 for high-performance scrolling
* fix(scroll): Correct dynamic duration calculation for frame-based scrolling
- Fix calculate_dynamic_duration() to properly handle frame-based scrolling mode
- Convert scroll_speed from pixels/frame to pixels/second when in frame-based mode
- Prevents incorrect duration calculations (e.g., 2609s instead of 52s)
- Affects all plugins using ScrollHelper: odds-ticker, leaderboard, stocks, text-display
- Add debug logging to show scroll mode and effective speed
* Remove version logic from plugin system, use git commits instead
- Remove version parameter from install_plugin() method
- Rename fetch_latest_versions to fetch_commit_info throughout codebase
- Remove version fields from plugins.json registry (versions, latest_version, download_url_template)
- Remove version logging from plugin manager
- Update web UI to use fetch_commit_info parameter
- Update .gitignore to ignore all plugin folders (remove whitelist exceptions)
- Remove plugin directories from git index (plugins now installed via plugin store only)
Plugins now always install latest commit from default branch. Version fields
replaced with git commit SHA and commit dates. System uses git-based approach
for all plugin metadata.
* feat(plugins): Normalize all plugins as git submodules
- Convert all 18 plugins to git submodules for uniform management
- Add submodules for: baseball-scoreboard, christmas-countdown, football-scoreboard, hockey-scoreboard, ledmatrix-flights, ledmatrix-leaderboard, ledmatrix-music, ledmatrix-stocks, ledmatrix-weather, static-image
- Re-initialize mqtt-notifications as proper submodule
- Update .gitignore to allow all plugin submodules
- Add normalize_plugin_submodules.sh script for future plugin management
All plugins with GitHub repositories are now managed as git submodules,
ensuring consistent version control and easier updates.
* refactor(repository): Reorganize scripts and files into organized directory structure
- Move installation scripts to scripts/install/ (except first_time_install.sh)
- Move development scripts to scripts/dev/
- Move utility scripts to scripts/utils/
- Move systemd service files to systemd/
- Keep first_time_install.sh, start_display.sh, stop_display.sh in root
- Update all path references in scripts, documentation, and service files
- Add README.md files to new directories explaining their purpose
- Remove empty tools/ directory (contents moved to scripts/dev/)
- Add .gitkeep to data/ directory
* fix(scripts): Fix PROJECT_DIR path in start_web_conditionally.py after move to scripts/utils/
* fix(scripts): Fix PROJECT_DIR/PROJECT_ROOT path resolution in moved scripts
- Fix wifi_monitor_daemon.py to use project root instead of scripts/utils/
- Fix shell scripts in scripts/ to correctly resolve project root (go up one more level)
- Fix scripts in scripts/fix_perms/ to correctly resolve project root
- Update diagnose_web_interface.sh to reference moved start_web_conditionally.py path
All scripts now correctly determine project root after reorganization.
* fix(install): Update first_time_install.sh to detect and update service files with old paths
- Check for old paths in service files and reinstall if needed
- Always reinstall main service (install_service.sh is idempotent)
- This ensures existing installations get updated paths after reorganization
* fix(install): Update install_service.sh message to indicate it updates existing services
* fix(wifi): Enable WiFi scan to work when AP mode is active
- Temporarily disable AP mode during network scanning
- Automatically re-enable AP mode after scan completes
- Add proper error handling with try/finally to ensure AP mode restoration
- Add user notification when AP mode is temporarily disabled
- Improve error messages for common scanning failures
- Add timing delays for interface mode switching
* fix(wifi): Fix network parsing to handle frequency with 'MHz' suffix
- Strip 'MHz' suffix from frequency field before float conversion
- Add better error logging for parsing failures
- Fixes issue where all networks were silently skipped due to ValueError
* debug(wifi): Add console logging and Alpine.js reactivity fixes for network display
- Add console.log statements to debug network scanning
- Add x-effect to force Alpine.js reactivity updates
- Add unique keys to x-for template
- Add debug display showing network count
- Improve error handling and user feedback
* fix(wifi): Manually update select options instead of using Alpine.js x-for
- Replace Alpine.js x-for template with manual DOM manipulation
- Add updateSelectOptions() method to directly update select dropdown
- This fixes issue where networks weren't appearing in dropdown
- Alpine.js x-for inside select elements can be unreliable
* feat(web-ui): Add patternProperties support for dynamic key-value pairs
- Add UI support for patternProperties objects (custom_feeds, feed_logo_map)
- Implement key-value pair editor with add/remove functionality
- Add JavaScript functions for managing dynamic key-value pairs
- Update form submission to handle patternProperties JSON data
- Enable easy configuration of feed_logo_map in web UI
* chore: Update ledmatrix-news submodule to latest commit
* fix(plugins): Handle arrays of objects in config normalization
Fix configuration validation failure for static-image plugin by adding
recursive normalization support for arrays of objects. The normalize_config_values
function now properly handles arrays containing objects (like image_config.images)
by recursively normalizing each object in the array using the items schema properties.
This resolves the 'configuration validation failed' error when saving static
image plugin configuration with multiple images.
* fix(plugins): Handle union types in config normalization and form generation
Fix configuration validation for fields with union types like ['integer', 'null'].
The normalization function now properly handles:
- Union types in top-level fields (e.g., random_seed: ['integer', 'null'])
- Union types in array items
- Empty string to None conversion for nullable fields
- Form generation and submission for union types
This resolves validation errors when saving plugin configs with nullable
integer/number fields (e.g., rotation_settings.random_seed in static-image plugin).
Also improves UX by:
- Adding placeholder text for nullable fields explaining empty = use default
- Properly handling empty values in form submission for union types
* fix(plugins): Improve union type normalization with better edge case handling
Enhanced normalization for union types like ['integer', 'null']:
- Better handling of whitespace in string values
- More robust empty string to None conversion
- Fallback to None when conversion fails and null is allowed
- Added debug logging for troubleshooting normalization issues
- Improved handling of nested object fields with union types
This should resolve remaining validation errors for nullable integer/number
fields in nested objects (e.g., rotation_settings.random_seed).
* chore: Add ledmatrix-news plugin to .gitignore exceptions
* Fix web interface service script path in install_service.sh
- Updated ExecStart path from start_web_conditionally.py to scripts/utils/start_web_conditionally.py
- Updated diagnose_web_ui.sh to check for correct script path
- Fixes issue where web UI service fails to start due to incorrect script path
* Fix nested configuration section headers not expanding
Fixed toggleNestedSection function to properly calculate scrollHeight when
expanding nested configuration sections. The issue occurred when sections
started with display:none - the scrollHeight was being measured before the
browser had a chance to lay out the element, resulting in a value of 0.
Changes:
- Added setTimeout to delay scrollHeight measurement until after layout
- Added overflow handling during animations to prevent content jumping
- Added fallback for edge cases where scrollHeight might still be 0
- Set maxHeight to 'none' after expansion completes for natural growth
- Updated function in both base.html and plugins_manager.js
This fix applies to all plugins with nested configuration sections, including:
- Hockey/Football/Basketball/Baseball/Soccer scoreboards (customization, global sections)
- All plugins with transition, display, and other nested configuration objects
Fixes configuration header expansion issues across all plugins.
* Fix syntax error in first_time_install.sh step 8.5
Added missing 'fi' statement to close the if block in the WiFi monitor
service installation section. This resolves the 'unexpected end of file'
error that occurred at line 1385 during step 8.5.
* Fix WiFi UI: Display correct SSID and accurate signal strength
- Fix WiFi network selection dropdown not showing available networks
- Replace manual DOM manipulation with Alpine.js x-for directive
- Add fallback watcher to ensure select updates reactively
- Fix WiFi status display showing netplan connection name instead of SSID
- Query actual SSID from device properties (802-11-wireless.ssid)
- Add fallback methods to get SSID from active WiFi connection list
- Improve signal strength accuracy
- Get signal directly from device properties (WIFI.SIGNAL)
- Add multiple fallback methods for robust signal retrieval
- Ensure signal percentage is accurate and up-to-date
* Improve WiFi connection UI and error handling
- Fix connect button disabled condition to check both selectedSSID and manualSSID
- Improve error handling to display actual server error messages from 400 responses
- Add step-by-step labels (Step 1, Step 2, Step 3) to clarify connection workflow
- Add visual feedback showing selected network in blue highlight box
- Improve password field labeling with helpful instructions
- Add auto-clear logic between dropdown and manual SSID entry
- Enhance backend validation with better error messages and logging
- Trim SSID whitespace before processing to prevent validation errors
* Add WiFi disconnect functionality for AP mode testing
- Add disconnect_from_network() method to WiFiManager
- Disconnects from current WiFi network using nmcli
- Automatically triggers AP mode check if auto_enable_ap_mode is enabled
- Returns success/error status with descriptive messages
- Add /api/v3/wifi/disconnect API endpoint
- POST endpoint to disconnect from current WiFi network
- Includes proper error handling and logging
- Add disconnect button to WiFi status section
- Only visible when connected to a network
- Red styling to indicate disconnection action
- Shows 'Disconnecting...' state during operation
- Automatically refreshes status after disconnect
- Integrates with AP mode auto-enable functionality
- When disconnected, automatically enables AP mode if configured
- Perfect for testing captive portal and AP mode features
* Add explicit handling for broken pipe errors during plugin dependency installation
- Catch BrokenPipeError and OSError (errno 32) explicitly in all dependency installation methods
- Add clear error messages explaining network interruption or buffer overflow causes
- Improves error handling in store_manager, plugin_loader, and plugin_manager
- Helps diagnose 'Errno 32 Broken Pipe' errors during pip install operations
* Add WiFi permissions configuration script and integrate into first-time install
- Create configure_wifi_permissions.sh script
- Configures passwordless sudo for nmcli commands
- Configures PolicyKit rules for NetworkManager control
- Fixes 'Not Authorized to control Networking' error
- Allows web interface to connect/disconnect WiFi without password prompts
- Integrate WiFi permissions configuration into first_time_install.sh
- Added as Step 10.1 after passwordless sudo configuration
- Runs automatically during first-time installation
- Ensures WiFi management works out of the box
- Resolves authorization errors when connecting/disconnecting WiFi networks
- NetworkManager requires both sudo and PolicyKit permissions
- Script configures both automatically for seamless WiFi management
* Add WiFi status LED message display integration
- Integrate WiFi status messages from wifi_manager into display_controller
- WiFi status messages interrupt normal rotation (but respect on-demand)
- Priority: on-demand > wifi-status > live-priority > normal rotation
- Safe implementation with comprehensive error handling
- Automatic cleanup of expired/corrupted status files
- Word-wrapping for long messages (max 2 lines)
- Centered text display with small font
- Non-intrusive: all errors are caught and logged, never crash controller
* Fix display loop issues: reduce log spam and handle missing plugins
- Change _should_exit_dynamic logging from INFO to DEBUG to reduce log spam
in tight loops (every 8ms) that was causing high CPU usage
- Fix display loop not running when manager_to_display is None
- Add explicit check to set display_result=False when no plugin manager found
- Fix logic bug where manager_to_display was overwritten after circuit breaker skip
- Ensure proper mode rotation when plugins have no content or aren't found
* Add debug logging to diagnose display loop stuck issue
* Change debug logs to INFO level to diagnose display loop stuck
* Add schedule activation logging and ensure display is blanked when inactive
- Add clear INFO-level log message when schedule makes display inactive
- Track previous display state to detect schedule transitions
- Clear display when schedule makes it inactive to ensure blank screen
(prevents showing initialization screen when schedule kicks in)
- Initialize _was_display_active state tracking in __init__
* Fix indentation errors in schedule state tracking
* Add rotation between hostname and IP address every 10 seconds
- Added _get_local_ip() method to detect device IP address
- Implemented automatic rotation between hostname and IP every 10 seconds
- Enhanced logging to include both hostname and IP in initialization
- Updated get_info() to expose device_ip and current_display_mode
* Add WiFi connection failsafe system
- Save original connection before attempting new connection
- Automatically restore original connection if new connection fails
- Enable AP mode as last resort if restoration fails
- Enhanced connection verification with multiple attempts
- Verify correct SSID (not just 'connected' status)
- Better error handling and exception recovery
- Prevents Pi from becoming unresponsive on connection failure
- Always ensures device remains accessible via original WiFi or AP mode
* feat(web): Improve web UI startup speed and fix cache permissions
- Defer plugin discovery until first API request (removed from startup)
- Add lazy loading to operation queue, state manager, and operation history
- Defer health monitor initialization until first request
- Fix cache directory permission issue:
- Add systemd CacheDirectory feature for automatic cache dir creation
- Add manual cache directory creation in install script as fallback
- Improve cache manager logging (reduce alarming warnings)
- Fix syntax errors in wifi_manager.py (unclosed try blocks)
These changes significantly improve web UI startup time, especially with many
plugins installed, while maintaining full backward compatibility.
* feat(plugins): Improve GitHub token pop-up UX and combine warning/settings
- Fix visibility toggle to handle inline styles properly
- Remove redundant inline styles from HTML elements
- Combine warning banner and settings panel into unified component
- Add loading states to save/load token buttons
- Improve error handling with better user feedback
- Add token format validation (ghp_ or github_pat_ prefix)
- Auto-refresh GitHub auth status after saving token
- Hide warning banner when settings panel opens
- Clear input field after successful save for security
This creates a smoother UX flow where clicking 'Configure Token'
transitions from warning directly to configuration form.
* fix(wifi): Prevent WiFi radio disabling during AP mode disable
- Make NetworkManager restart conditional (only for hostapd mode)
- Add enhanced WiFi radio enable with retry and verification logic
- Add connectivity safety check before NetworkManager restart
- Ensure WiFi radio enabled after all AP mode disable operations
- Fix indentation bug in dnsmasq backup restoration logic
- Add pre-connection WiFi radio check for safety
Fixes issue where WiFi radio was being disabled when disabling AP mode,
especially when connected via Ethernet, making it impossible to enable
WiFi from the web UI.
* fix(plugin-templates): Fix unreachable fallback to expired cache in update() method
The exception handler in update() checked the cached variable, which would
always be None or falsy at that point. If fresh cached data existed, the
method returned early. If cached data was expired, it was filtered out by
max_age constraint. The fix retrieves cached data again in the exception
handler with a very large max_age (1 year) to effectively bypass expiration
check and allow fallback to expired data when fetch fails.
* fix(plugin-templates): Resolve plugin_id mismatch in test template setUp method
* feat(plugins): Standardize manifest version fields schema
- Consolidate version fields to use consistent naming:
- compatible_versions: array of semver ranges (required)
- min_ledmatrix_version: string (optional)
- max_ledmatrix_version: string (optional)
- versions[].ledmatrix_min_version: renamed from ledmatrix_min
- Add manifest schema validation (schema/manifest_schema.json)
- Update store_manager to validate version fields and schema
- Update template and all documentation examples to use standardized fields
- Add deprecation warnings for ledmatrix_version and ledmatrix_min fields
* fix(templates): Update plugin README template script path to correct location
* docs(plugin): Resolve conflicting version management guidance in .cursorrules
* chore(.gitignore): Consolidate plugin exclusion patterns
Remove unnecessary !plugins/*/.git pattern and consolidate duplicate
negations by keeping only trailing-slash directory exclusions.
* docs: Add language specifiers to code blocks in STATIC_IMAGE_MULTI_UPLOAD_PLAN.md
* fix(templates): Remove api_key from config.json example in plugin README template
Remove api_key field from config.json example to prevent credential leakage.
API keys should only be stored in config_secrets.json. Added clarifying note
about proper credential storage.
* docs(README): Add plugin installation and migration information
- Add plugin installation instructions via web interface and GitHub URL
- Add plugin migration guide for users upgrading from old managers
- Improve plugin documentation for new users
* docs(readme): Update donation links and add Discord acknowledgment
* docs: Add comprehensive API references and consolidate documentation
- Add API_REFERENCE.md with complete REST API documentation (50+ endpoints)
- Add PLUGIN_API_REFERENCE.md documenting Display Manager, Cache Manager, and Plugin Manager APIs
- Add ADVANCED_PLUGIN_DEVELOPMENT.md with advanced patterns and examples
- Add DEVELOPER_QUICK_REFERENCE.md for quick developer reference
- Consolidate plugin configuration docs into single PLUGIN_CONFIGURATION_GUIDE.md
- Archive completed implementation summaries to docs/archive/
- Enhance PLUGIN_DEVELOPMENT_GUIDE.md with API links and 3rd party submission guidelines
- Update docs/README.md with new API reference sections
- Update root README.md with documentation links
* fix(install): Fix IP detection and network diagnostics after fresh install
- Fix web-ui-info plugin IP detection to handle no internet, AP mode, and network state changes
- Replace socket-based detection with robust interface scanning using hostname -I and ip addr
- Add AP mode detection returning 192.168.4.1 when AP mode is active
- Add periodic IP refresh every 30 seconds to handle network state changes
- Improve network diagnostics in first_time_install.sh showing actual IPs, WiFi status, and AP mode
- Add WiFi connection check in WiFi monitor installation with warnings
- Enhance web service startup logging to show accessible IP addresses
- Update README with network troubleshooting section and fix port references (5001->5000)
Fixes issue where display showed incorrect IP (127.0.11:5000) and users couldn't access web UI after fresh install.
* chore: Add GitHub sponsor button configuration
* fix(wifi): Fix aggressive AP mode enabling and improve WiFi detection
Critical fixes:
- Change auto_enable_ap_mode default from True to False (manual enable only)
- Fixes issue where Pi would disconnect from network after code updates
- Matches documented behavior (was incorrectly defaulting to True in code)
Improvements:
- Add grace period: require 3 consecutive disconnected checks (90s) before enabling AP mode
- Prevents AP mode from enabling on transient network hiccups
- Improve WiFi status detection with retry logic and better nmcli parsing
- Enhanced logging for debugging WiFi connection issues
- Better handling of WiFi device detection (works with any wlan device)
This prevents the WiFi monitor from aggressively enabling AP mode and
disconnecting the Pi from the network when there are brief network issues
or during system initialization.
* fix(wifi): Revert auto_enable_ap_mode default to True with grace period protection
Change default back to True for auto_enable_ap_mode while keeping the grace
period protection that prevents interrupting valid WiFi connections.
- Default auto_enable_ap_mode back to True (useful for setup scenarios)
- Grace period (3 consecutive checks = 90s) prevents false positives
- Improved WiFi detection with retry logic ensures accurate status
- AP mode will auto-enable when truly disconnected, but won't interrupt
valid connections due to transient detection issues
* fix(news): Update submodule reference for manifest fix
Update ledmatrix-news submodule to include the fixed manifest.json with
required entry_point and class_name fields.
* fix(news): Update submodule reference with validate_config addition
Update ledmatrix-news submodule to include validate_config method for
proper configuration validation.
* feat: Add of-the-day plugin as git submodule
- Add ledmatrix-of-the-day plugin as git submodule
- Rename submodule path from plugins/of-the-day to plugins/ledmatrix-of-the-day to match repository naming convention
- Update .gitignore to allow ledmatrix-of-the-day submodule
- Plugin includes fixes for display rendering and web UI configuration support
* fix(wifi): Make AP mode open network and fix WiFi page loading in AP mode
AP Mode Changes:
- Remove password requirement from AP mode (open network for easier setup)
- Update hostapd config to create open network (no WPA/WPA2)
- Update nmcli hotspot to create open network (no password parameter)
WiFi Page Loading Fixes:
- Download local copies of HTMX and Alpine.js libraries
- Auto-detect AP mode (192.168.4.x) and use local JS files instead of CDN
- Auto-open WiFi tab when accessing via AP mode IP
- Add fallback loading if HTMX fails to load
- Ensures WiFi setup page works in AP mode without internet access
This fixes the issue where the WiFi page wouldn't load on iPhone when
accessing via AP mode (192.168.4.1:5000) because CDN resources couldn't
be fetched without internet connectivity.
* feat(wifi): Add explicit network switching support with clean disconnection
WiFi Manager Improvements:
- Explicitly disconnect from current network before connecting to a new one
- Add skip_ap_check parameter to disconnect_from_network() to prevent AP mode
from activating during network switches
- Check if already connected to target network to avoid unnecessary work
- Improved logging for network switching operations
Web UI Improvements:
- Detect and display network switching status in UI
- Show 'Switching from [old] to [new]...' message when switching networks
- Enhanced status reloading after connection (multiple checks at 2s, 5s, 10s)
- Better user feedback during network transitions
This ensures clean network switching without AP mode interruptions and
provides clear feedback to users when changing WiFi networks.
* fix(web-ui): Add fallback content loading when HTMX fails to load
Problem:
- After recent updates, web UI showed navigation and CPU status but main
content tabs never loaded
- Content tabs depend on HTMX's 'revealed' trigger to load
- If HTMX failed to load or initialize, content would never appear
Solutions:
- Enhanced HTMX loading verification with timeout checks
- Added fallback direct fetch for overview tab if HTMX fails
- Added automatic tab content loading when tabs change
- Added loadTabContent() method to manually trigger content loading
- Added global 'htmx-load-failed' event for error handling
- Automatic retry after 5 seconds if HTMX isn't available
- Better error messages and console logging for debugging
This ensures the web UI loads content even if HTMX has issues,
providing graceful degradation and better user experience.
* feat(web-ui): Add support for plugin custom HTML widgets and static file serving
- Add x-widget: custom-html support in config schema generation
- Add loadCustomHtmlWidget() function to load HTML from plugin directories
- Add /api/v3/plugins/<plugin_id>/static/<file_path> endpoint for serving plugin static files
- Enhance execute_plugin_action() to pass params via stdin as JSON for scripts
- Add JSON output parsing for script action responses
These changes enable plugins to provide custom UI components while keeping
all functionality plugin-scoped. Used by of-the-day plugin for file management.
* fix(web-ui): Resolve Alpine.js initialization errors
- Prevent Alpine.js from auto-initializing before app() function is defined
- Add deferLoadingAlpine to ensure proper initialization order
- Make app() function globally available via window.app
- Fix 'app is not defined' and 'activeTab is not defined' errors
- Remove duplicate Alpine.start() calls that caused double initialization warnings
* fix(web-ui): Fix IndentationError in api_v3.py OAuth flow
- Fix indentation in if action_def.get('oauth_flow') block
- Properly indent try/except block and all nested code
- Resolves IndentationError that prevented web interface from starting
* fix(web-ui): Fix SyntaxError in api_v3.py else block
- Fix indentation of OAuth flow code inside else block
- Properly indent else block for simple script execution
- Resolves SyntaxError at line 3458 that prevented web interface from starting
* fix(web-ui): Restructure OAuth flow check to fix SyntaxError
- Move OAuth flow check before script execution in else block
- Remove unreachable code that was causing syntax error
- OAuth check now happens first, then falls back to script execution
- Resolves SyntaxError at line 3458
* fix(web-ui): Define app() function in head for Alpine.js initialization
- Define minimal app() function in head before Alpine.js loads
- Ensures app() is available when Alpine initializes
- Full implementation in body enhances/replaces the stub
- Fixes 'app is not defined' and 'activeTab is not defined' errors
* fix(web-ui): Ensure plugin tabs load when full app() implementation is available
- Update stub init() to detect and use full implementation when available
- Ensure full implementation properly replaces stub methods
- Call init() after merging to load plugins and set up watchers
- Fixes issue where installed plugins weren't showing in navigation bar
* fix(web-ui): Prevent 'Cannot redefine property' error for installedPlugins
- Check if window.installedPlugins property already exists before defining
- Make property configurable to allow redefinition if needed
- Add _initialized flag to prevent multiple init() calls
- Fixes TypeError when stub tries to enhance with full implementation
* fix(web-ui): Fix variable redeclaration errors in logs tab
- Replace let/const declarations with window properties to avoid redeclaration
- Use window._logsEventSource, window._allLogs, etc. to persist across HTMX reloads
- Clean up existing event source before reinitializing
- Remove and re-add event listeners to prevent duplicates
- Fixes 'Identifier has already been declared' error when accessing logs tab multiple times
* feat(web-ui): Add support for additionalProperties object rendering
- Add handler for objects with additionalProperties containing object schemas
- Render dynamic category controls with enable/disable toggles
- Display category metadata (display name, data file path)
- Used by of-the-day plugin for category management
* fix(wifi): Ensure AP mode hotspot is always open (no password)
Problem:
- LEDMatrix-Setup WiFi AP was still asking for password despite code changes
- Existing hotspot connections with passwords weren't being fully cleaned up
- NetworkManager might reuse old connection profiles with passwords
Solutions:
- More thorough cleanup: Delete all hotspot-related connections, not just known names
- Verification: Check if hotspot has password after creation
- Automatic fix: Remove password and restart connection if security is detected
- Better logging: Log when password is detected and removed
This ensures the AP mode hotspot is always open for easy setup access,
even if there were previously saved connections with passwords.
* fix(wifi): Improve network switching reliability and device state handling
Problem:
- Pi failing to switch WiFi networks via web UI
- Connection attempts happening before device is ready
- Disconnect not fully completing before new connection attempt
- Connection name lookup issues when SSID doesn't match connection name
Solutions:
- Improved disconnect logic: Disconnect specific connection first, then device
- Device state verification: Wait for device to be ready (disconnected/unavailable) before connecting
- Better connection lookup: Search by SSID, not just connection name
- Increased wait times: 2 seconds for disconnect to complete
- State checking before activating existing connections
- Enhanced error handling and logging throughout
This ensures network switching works reliably by properly managing device
state transitions and using correct connection identifiers.
* debug(web-ui): Add debug logging for custom HTML widget loading
- Add console logging to track widget generation
- Improve error messages with missing configuration details
- Help diagnose why file manager widget may not be appearing
* fix(web-ui): Fix [object Object] display in categories field
- Add type checking to ensure category values are strings before rendering
- Safely extract data_file and display_name properties
- Prevent object coercion issues in category display
* perf(web-ui): Optimize plugin loading in navigation bar
- Reduce stub init timeout from 100ms to 10ms for faster enhancement
- Change full implementation merge from 50ms setTimeout to requestAnimationFrame
- Add direct plugin loading in stub while waiting for full implementation
- Skip plugin reload in full implementation if already loaded by stub
- Significantly improves plugin tab loading speed in navigation bar
* feat(web-ui): Adapt file-upload widget for JSON files in of-the-day plugin
- Add specialized JSON upload/delete endpoints for of-the-day plugin
- Modify file-upload widget to support JSON files (file_type: json)
- Render JSON files with file-code icon instead of image preview
- Show entry count for JSON files
- Store files in plugins/ledmatrix-of-the-day/of_the_day/ directory
- Automatically update categories config when files are uploaded/deleted
- Populate uploaded_files array from categories on form load
- Remove custom HTML widget, use standard file-upload widget instead
* fix(web-ui): Add working updatePluginTabs to stub for immediate plugin tab rendering
- Stub's updatePluginTabs was empty, preventing tabs from showing
- Add basic implementation that creates plugin tabs in navigation bar
- Ensures plugin tabs appear immediately when plugins load, even before full implementation merges
- Fixes issue where plugin navigation bar wasn't working
* feat(api): Populate uploaded_files and categories from disk for of-the-day plugin
- Scan of_the_day directory for existing JSON files when loading config
- Populate uploaded_files array from files on disk
- Populate categories from files on disk if not in config
- Categories default to disabled, user can enable them
- Ensures existing JSON files (word_of_the_day.json, slovenian_word_of_the_day.json) appear in UI
* fix(api): Improve category merging logic for of-the-day plugin
- Preserve existing category enabled state when merging with files from disk
- Ensure all JSON files from disk appear in categories section
- Categories from files default to disabled, preserving user choices
- Properly merge existing config with scanned files
* fix(wifi): More aggressive password removal for AP mode hotspot
Problem:
- LEDMatrix-Setup network still asking for password despite previous fixes
- NetworkManager may add default security settings to hotspots
- Existing connections with passwords may not be fully cleaned up
Solutions:
- Always remove ALL security settings after creating hotspot (not just when detected)
- Remove multiple security settings: key-mgmt, psk, wep-key, auth-alg
- Verify security was removed and recreate connection if verification fails
- Improved cleanup: Delete connections by SSID match, not just by name
- Disconnect connections before deleting them
- Always restart connection after removing security to apply changes
- Better logging for debugging
This ensures the AP mode hotspot is always open, even if NetworkManager
tries to add default security settings.
* perf(web): Optimize web interface performance and fix JavaScript errors
- Add resource hints (preconnect, dns-prefetch) for CDN resources to reduce DNS lookup delays
- Fix duplicate response parsing bug in loadPluginConfig that was parsing JSON twice
- Replace direct fetch() calls with PluginAPI.getInstalledPlugins() to leverage caching and throttling
- Fix Alpine.js function availability issues with defensive checks and $nextTick
- Enhance request deduplication with debug logging and statistics
- Add response caching headers for static assets and API responses
- Add performance monitoring utilities with detailed metrics
Fixes console errors for loadPluginConfig and generateConfigForm not being defined.
Reduces duplicate API calls to /api/v3/plugins/installed endpoint.
Improves initial page load time with resource hints and optimized JavaScript loading.
* perf(web-ui): optimize CSS for Raspberry Pi performance
- Remove backdrop-filter blur from modal-backdrop
- Remove box-shadow transitions (use transform/opacity only)
- Remove button ::before pseudo-element animation
- Simplify skeleton loader (gradient to opacity pulse)
- Optimize transition utility (specific properties, not 'all')
- Improve color contrast for WCAG AA compliance
- Add CSS containment to cards, plugin-cards, modals
- Remove unused CSS classes (duration-300, divider, divider-light)
- Remove duplicate spacing utility classes
All animations now GPU-accelerated (transform/opacity only).
Optimized for low-powered Raspberry Pi devices.
* fix(web): Resolve ReferenceError for getInstalledPluginsSafe in v3 stub initialization
Move getInstalledPluginsSafe() function definition before the app() stub code that uses it. The function was previously defined at line 3756 but was being called at line 849 during Alpine.js initialization, causing a ReferenceError when loadInstalledPluginsDirectly() attempted to load plugins before the full implementation was ready.
* fix(web): Resolve TypeError for installedPlugins.map in plugin loading
Fix PluginAPI.getInstalledPlugins() to properly extract plugins array from API response structure. The API returns {status: 'success', data: {plugins: [...]}}, but the method was returning response.data (the object) instead of response.data.plugins (the array).
Changes:
- api_client.js: Extract plugins array from response.data.plugins
- plugins_manager.js: Add defensive array checks and handle array return value correctly
- base.html: Add defensive check in getInstalledPluginsSafe() to ensure plugins is always an array
This prevents 'installedPlugins.map is not a function' errors when loading plugins.
* style(web-ui): Enhance navigation bar styling for better readability
- Improve contrast: Change inactive tab text from gray-500 to gray-700
- Add gradient background and thicker border for active tabs
- Enhance hover states with background highlights
- Add smooth transitions using GPU-accelerated properties
- Update all navigation buttons (system tabs and plugin tabs)
- Add updatePluginTabStates() method for dynamic tab state management
All changes are CSS-only with zero performance overhead.
* fix(web-ui): Optimize plugin loading and reduce initialization errors
- Make generateConfigForm accessible to inline Alpine components via parent scope
- Consolidate plugin initialization to prevent duplicate API calls
- Fix script execution from HTMX-loaded content by extracting scripts before DOM insertion
- Add request deduplication to loadInstalledPlugins() to prevent concurrent requests
- Improve Alpine component initialization with proper guards and fallbacks
This eliminates 'generateConfigForm is not defined' errors and reduces plugin
API calls from 3-4 duplicate calls to 1 per page load, significantly improving
page load performance.
* fix(web-ui): Add guard check for generateConfigForm to prevent Alpine errors
Add typeof check in x-show to prevent Alpine from evaluating generateConfigForm
before the component methods are fully initialized. This eliminates the
'generateConfigForm is not defined' error that was occurring during component
initialization.
* fix(web-ui): Fix try-catch block structure in script execution code
Correct the nesting of try-catch block inside the if statement for script execution.
The catch block was incorrectly placed after the else clause, causing a syntax error.
* fix(web-ui): Escape quotes in querySelector to avoid HTML attribute conflicts
Change double quotes to single quotes in the CSS selector to prevent conflicts
with HTML attribute parsing when the x-data expression is embedded.
* style(web): Improve button text readability in Quick Actions section
* fix(web): Resolve Alpine.js expression errors in plugin configuration component
- Capture plugin from parent scope into component data to fix parsing errors
- Update all plugin references to use this.plugin in component methods
- Fix x-init to properly call loadPluginConfig method
- Resolves 'Uncaught ReferenceError' for isOnDemandLoading, onDemandLastUpdated, and other component properties
* fix(web): Fix remaining Alpine.js scope issues in plugin configuration
- Use this.generateConfigForm in typeof checks and method calls
- Fix form submission to use this.plugin.id
- Use $root. prefix for parent scope function calls (refreshPlugin, updatePlugin, etc.)
- Fix confirm dialog string interpolation
- Ensures all component methods and properties are properly scoped
* fix(web): Add this. prefix to all Alpine.js component property references
- Fix all template expressions to use this. prefix for component properties
- Update isOnDemandLoading, onDemandLastUpdated, onDemandRefreshing references
- Update onDemandStatusClass, onDemandStatusText, onDemandServiceClass, onDemandServiceText
- Update disableRunButton, canStopOnDemand, showEnableHint, loading references
- Ensures Alpine.js can properly resolve all component getters and properties
* fix(web): Resolve Alpine.js expression errors in plugin configuration
- Move complex x-data object to pluginConfigData() function for better parsing
- Fix all template expressions to use this.plugin instead of plugin
- Add this. prefix to all method calls in event handlers
- Fix duplicate x-on:click attribute on uninstall button
- Add proper loading state management in loadPluginConfig method
This resolves the 'Invalid or unexpected token' and 'Uncaught ReferenceError'
errors in the browser console.
* fix(web): Fix plugin undefined errors in Alpine.js plugin configuration
- Change x-data initialization to capture plugin from loop scope first
- Use Object.assign in x-init to merge pluginConfigData properties
- Add safety check in pluginConfigData function for undefined plugins
- Ensure plugin is available before accessing properties in expressions
This resolves the 'Cannot read properties of undefined' errors by ensuring
the plugin object is properly captured from the x-for loop scope before
any template expressions try to access it.
* style(web): Make Quick Actions button text styling consistent
- Update Start Display, Stop Display, and Reboot System buttons
- Change from text-sm font-medium to text-base font-semibold
- All Quick Actions buttons now have consistent bold, larger text
- Matches the styling of Update Code, Restart Display Service, and Restart Web Service buttons
* fix(wifi): Properly handle AP mode disable during WiFi connection
- Check return value of disable_ap_mode() before proceeding with connection
- Add verification loop to ensure AP mode is actually disabled
- Increase wait time to 5 seconds for NetworkManager restart stabilization
- Return clear error messages if AP mode cannot be disabled
- Prevents connection failures when switching networks from web UI or AP mode
This fixes the issue where WiFi network switching would fail silently when
AP mode disable failed, leaving the system in an inconsistent state.
* fix(web): Handle API response errors in plugin configuration loading
- Add null/undefined checks before accessing API response status
- Set fallback defaults when API responses don't have status 'success'
- Add error handling for batch API requests with fallback to individual requests
- Add .catch() handlers to individual fetch calls to prevent unhandled rejections
- Add console warnings to help debug API response failures
- Fix applies to both main loadPluginConfig and PluginConfigHelpers.loadPluginConfig
This fixes the issue where plugin configuration sections would get stuck
showing the loading animation when API responses failed or returned error status.
* fix(web): Fix Alpine.js reactivity for plugin config by using direct x-data
Changed from Object.assign pattern to direct x-data assignment to ensure
Alpine.js properly tracks reactive properties. The previous approach used
Object.assign to merge properties into the component after initialization,
which caused Alpine to not detect changes to config/schema properties.
The fix uses pluginConfigData(plugin) directly as x-data, ensuring all
properties including config, schema, loading, etc. are reactive from
component initialization.
* fix(web): Ensure plugin variable is captured in x-data scope
Use spread operator to merge pluginConfigData properties while explicitly
capturing the plugin variable from outer x-for scope. This fixes undefined
plugin errors when Alpine evaluates the component data.
* fix(web): Use $data for Alpine.js reactivity when merging plugin config
Use Object.assign with Alpine's $data reactive proxy instead of this to
ensure added properties are properly reactive. This fixes the issue where
plugin variable scoping from x-for wasn't accessible in x-data expressions.
* fix(web): Remove incorrect 'this.' prefix in Alpine.js template expressions
Alpine.js template expressions (x-show, x-html, x-text, x-on) use the
component data as the implicit context, so 'this.' prefix is incorrect.
In template expressions, 'this' refers to the DOM element, not the
component data.
Changes:
- Replace 'this.plugin.' with 'plugin.' in all template expressions (19 instances)
- Replace 'this.loading' with 'loading' in x-show directives
- Replace 'this.generateConfigForm' with 'generateConfigForm' in x-show/x-html
- Replace 'this.savePluginConfig' with 'savePluginConfig' in x-on:submit
- Replace 'this.config/schema/webUiActions' with direct property access
- Use '$data.loadPluginConfig' in x-init for explicit method call
Note: 'this.' is still correct inside JavaScript method definitions within
pluginConfigData() function since those run with proper object context.
* fix(web): Prevent infinite recursion in plugin config methods
Add 'parent !== this' check to loadPluginConfig, generateConfigForm, and
savePluginConfig methods in pluginConfigData to prevent infinite recursion
when the component tries to delegate to a parent that resolves to itself.
This fixes the 'Maximum call stack size exceeded' error that occurred when
the nested Alpine component's $root reference resolved to a component that
had the same delegating methods via Object.assign.
* fix(web): Resolve infinite recursion in plugin config by calling $root directly
The previous implementation had delegating methods (generateConfigForm,
savePluginConfig) in pluginConfigData that tried to call parent.method(),
but the parent detection via getParentApp() was causing circular calls
because multiple components had the same methods.
Changes:
- Template now calls $root.generateConfigForm() and $root.savePluginConfig()
directly instead of going through nested component delegation
- Removed delegating generateConfigForm and savePluginConfig from pluginConfigData
- Removed getParentApp() helper that was enabling the circular calls
- Simplified loadPluginConfig to use PluginConfigHelpers directly
This fixes the 'Maximum call stack size exceeded' error when rendering
plugin configuration forms.
* fix(web): Use window.PluginConfigHelpers instead of $root for plugin config
The $root magic variable in Alpine.js doesn't correctly reference the
app() component's data scope from nested x-data contexts. This causes
generateConfigForm and savePluginConfig to be undefined.
Changed to use window.PluginConfigHelpers which has explicit logic to
find and use the app component's methods.
* fix(web): Use direct x-data initialization for plugin config reactivity
Changed from Object.assign($data, pluginConfigData(plugin)) to
x-data="pluginConfigData(plugin)" to ensure Alpine.js properly
tracks reactivity for all plugin config properties. This fixes
the issue where all plugin tabs were showing the same config.
* refactor(web): Implement server-side plugin config rendering with HTMX
Major architectural improvement to plugin configuration management:
- Add server-side Jinja2 template for plugin config forms
(web_interface/templates/v3/partials/plugin_config.html)
- Add Flask route to serve plugin config partials on-demand
- Replace complex client-side form generation with HTMX lazy loading
- Add Alpine.js store for centralized plugin state management
- Mark old pluginConfigData and PluginConfigHelpers as deprecated
Benefits:
- Lazy loading: configs only load when tab is accessed
- Server-side rendering: reduces client-side complexity
- Better performance: especially on Raspberry Pi
- Cleaner code: Jinja2 macros replace JS string templates
- More maintainable: form logic in one place (server)
The old client-side code is preserved for backwards compatibility
but is no longer used by the main plugin configuration UI.
* fix(web): Trigger HTMX manually after Alpine renders plugin tabs
HTMX processes attributes at page load time, before Alpine.js
renders dynamic content. Changed from :hx-get attribute to
x-init with htmx.ajax() to properly trigger the request after
the element is rendered.
* fix(web): Remove duplicate 'enabled' toggle from plugin config form
The 'enabled' field was appearing twice in plugin configuration:
1. Header toggle (quick action, uses HTMX)
2. Configuration form (from schema, requires save)
Now only the header toggle is shown, avoiding user confusion.
The 'enabled' key is explicitly skipped when rendering schema properties.
* perf(web): Optimize plugin manager with request caching and init guards
Major performance improvements to plugins_manager.js:
1. Request Deduplication & Caching
- Added pluginLoadCache with 3-second TTL
- Subsequent calls return cached data instead of making API requests
- In-flight request deduplication prevents parallel duplicate fetches
- Added refreshInstalledPlugins() for explicit force-refresh
2. Initialization Guards
- Added pluginsInitialized flag to prevent multiple initializePlugins() calls
- Added _eventDelegationSetup guard on container to prevent duplicate listeners
- Added _listenerSetup guards on search/category inputs
3. Debug Logging Control
- Added PLUGIN_DEBUG flag (localStorage.setItem('pluginDebug', 'true'))
- Most console.log calls now use pluginLog() which only logs when debug enabled
- Reduces console noise from ~150 logs to ~10 in production
Expected improvements:
- API calls reduced from 6+ to 2 on page load
- Event listeners no longer duplicated
- Cleaner console output
- Faster perceived performance
* fix(web): Handle missing search elements in searchPluginStore
The searchPluginStore function was failing silently when called before
the plugin-search and plugin-category elements existed in the DOM.
This caused the plugin store to never load.
Now safely checks if elements exist before accessing their values.
* fix(web): Ensure plugin store loads via pluginManager.searchPluginStore
- Exposed searchPluginStore on window.pluginManager for easier access
- Updated base.html to fallback to pluginManager.searchPluginStore
- Added logging when loading plugin store
* fix(web): Expose searchPluginStore from inside the IIFE
The function was defined inside the IIFE but only exposed after the IIFE
ended, where the function was out of scope. Now exposed immediately after
definition inside the IIFE.
* fix(web): Add cache-busting version to plugins_manager.js URL
Static JS files were being aggressively cached, preventing updates
from being loaded by browsers.
* fix(web): Fix pluginLog reference error outside IIFE
pluginLog is defined inside the IIFE, so use _PLUGIN_DEBUG_EARLY and
console.log directly for code outside the IIFE.
* chore(web): Update plugins_manager.js cache version
* fix(web): Defer plugin store render when grid not ready
Instead of showing an error when plugin-store-grid doesn't exist,
store plugins in window.__pendingStorePlugins for later rendering
when the tab loads (consistent with how installed plugins work).
* chore: Bump JS cache version
* fix(web): Restore enabledBool variable in plugin render
Variable was removed during debug logging optimization but was still
being used in the template string for toggle switch rendering.
* fix(ui): Add header and improve categories section rendering
- Add proper header (h4) to categories section with label
- Add debug logging to diagnose categories field rendering
- Improve additionalProperties condition check readability
* fix(ui): Improve additionalProperties condition check
- Explicitly exclude objects with properties to avoid conflicts
- Ensure categories section is properly detected and rendered
- Categories should show as header with toggles, not text box
* fix(web-ui): Fix JSON parsing errors and default value loading for plugin configs
- Fix JSON parsing errors when saving file upload fields by properly unescaping HTML entities
- Merge config with schema defaults when loading plugin config so form shows default values
- Improve default value handling in form generation for nested objects and arrays
- Add better error handling for malformed JSON in file upload fields
* fix(plugins): Return plugins array from getInstalledPlugins() instead of data object
Fixed PluginAPI.getInstalledPlugins() to return response.data.plugins (array)
instead of response.data (object). This was preventing window.installedPlugins
from being set correctly, which caused plugin configuration tabs to not appear
and prevented users from saving plugin configurations via the web UI.
The fix ensures that:
- window.installedPlugins is properly populated with plugin array
- Plugin tabs are created automatically on page load
- Configuration forms and save buttons are rendered correctly
- Save functionality works as expected
* fix(api): Support form data submission for plugin config saves
The HTMX form submissions use application/x-www-form-urlencoded format
instead of JSON. This update allows the /api/v3/plugins/config POST
endpoint to accept both formats:
- JSON: plugin_id and config in request body (existing behavior)
- Form data: plugin_id from query string, config fields from form
Added _parse_form_value helper to properly convert form strings to
appropriate Python types (bool, int, float, JSON arrays/objects).
* debug: Add form data logging to diagnose config save issue
* fix(web): Re-discover plugins before loading config partial
The plugin config partial was returning 'not found' for plugins
because the plugin manifests weren't loaded. The installed plugins
API was working because it calls discover_plugins() first.
Changes:
- Add discover_plugins() call in _load_plugin_config_partial when
plugin info is not found on first try
- Remove debug logging from form data handling
* fix(web): Comprehensive plugin config save improvements
SWEEPING FIX for plugin configuration saving issues:
1. Form data now MERGES with existing config instead of replacing
- Partial form submissions (missing fields) no longer wipe out
existing config values
- Fixes plugins with complex schemas (football, clock, etc.)
2. Improved nested value handling with _set_nested_value helper
- Correctly handles deeply nested structures like customization
- Properly merges when intermediate objects already exist
3. Better JSON parsing for arrays
- RGB color arrays like [255, 0, 0] now parse correctly
- Parse JSON before trying number conversion
4. Bump cache version to force JS reload
* fix(web): Add early stubs for updatePlugin and uninstallPlugin
Ensures these functions are available immediately when the page loads,
even before the full IIFE executes. Provides immediate user feedback
and makes API calls directly.
This fixes the 'Update button does not work' issue by ensuring the
function is always defined and callable.
* fix(web): Support form data in toggle endpoint
The toggle endpoint now accepts both JSON and HTMX form submissions.
Also updated the plugin config template to send the enabled state
via hx-vals when the checkbox changes.
Fixes: 415 Unsupported Media Type error when toggling plugins
* fix(web): Prevent config duplication when toggling plugins
Changed handleToggleResponse to update UI in place instead of
refreshing the entire config partial, which was causing duplication.
Also improved refreshPluginConfig with proper container targeting
and concurrent refresh prevention (though it's no longer needed
for toggles since we update in place).
* fix(api): Schema-aware form value parsing for plugin configs
Major fix for plugin config saving issues:
1. Load schema BEFORE processing form data to enable type-aware parsing
2. New _parse_form_value_with_schema() function that:
- Converts comma-separated strings to arrays when schema says 'array'
- Parses JSON strings for arrays/objects
- Handles empty strings for arrays (returns [] instead of None)
- Uses schema to determine correct number types
3. Post-processing to ensure None arrays get converted to empty arrays
4. Proper handling of nested object fields
Fixes validation errors:
- 'category_order': Expected type array, got str
- 'categories': Expected type object, got str
- 'uploaded_files': Expected type array, got NoneType
- RGB color arrays: Expected type array, got str
* fix(web): Make plugin config handlers idempotent and remove scripts from HTMX partials
CRITICAL FIX for script redeclaration errors:
1. Removed all <script> tags from plugin_config.html partial
- Scripts were being re-executed on every HTMX swap
- Caused 'Identifier already declared' errors
2. Moved all handler functions to base.html with idempotent initialization
- Added window.__pluginConfigHandlersInitialized guard
- Functions only initialized once, even if script runs multiple times
- All state stored on window object (e.g., window.pluginConfigRefreshInProgress)
3. Enhanced error logging:
- Client-side: Logs form payload, response status, and parsed error details
- Server-side: Logs raw form data and parsed config on validation failures
4. Functions moved to window scope:
- toggleSection
- handleConfigSave (with detailed error logging)
- handleToggleResponse (updates UI in place, no refresh)
- handlePluginUpdate
- refreshPluginConfig (with duplicate prevention)
- runPluginOnDemand
- stopOnDemand
- executePluginAction
This ensures HTMX-swapped fragments only contain HTML, and all
scripts run once in the base layout.
* fix(api): Filter config to only schema-defined fields before validation
When merging with existing_config, fields not in the plugin's schema
(like high_performance_transitions, transition, dynamic_duration)
were being preserved, causing validation failures when
additionalProperties is false.
Add _filter_config_by_schema() function to recursively filter config
to only include fields defined in the schema before validation.
This fixes validation errors like:
- 'Additional properties are not allowed (high_performance_transitions, transition were unexpected)'
* fix(web): Improve update plugin error handling and support form data
1. Enhanced updatePlugin JavaScript function:
- Validates pluginId before sending request
- Checks response.ok before parsing JSON
- Better error logging with request/response details
- Handles both successful and error responses properly
2. Update endpoint now supports both JSON and form data:
- Similar to config endpoint, accepts plugin_id from query string or form
- Better error messages and debug logging
3. Prevent duplicate function definitions:
- Second updatePlugin definition checks if improved version exists
- Both definitions now have consistent error handling
Fixes: 400 BAD REQUEST 'Request body must be valid JSON' error
* fix(web): Show correct 'update' message instead of 'save' for plugin updates
The handlePluginUpdate function now:
1. Checks actual HTTP status code (not just event.detail.successful)
2. Parses JSON response to get server's actual message
3. Replaces 'save' with 'update' if message incorrectly says 'save'
Fixes: Update button showing 'saved successfully' instead of
'updated successfully'
* fix(web): Execute plugin updates immediately instead of queuing
Plugin updates are now executed directly (synchronously) instead of
being queued for async processing. This provides immediate feedback
to users about whether the update succeeded or failed.
Updates are fast git pull operations, so they don't need async
processing. The operation queue is reserved for longer operations
like install/uninstall.
Fixes: Update button not actually updating plugins (operations were
queued but users didn't see results)
* fix(web): Ensure toggleSection function is always available for collapsible headers
Moved toggleSection outside the initialization guard block so it's
always defined, even if the plugin config handlers have already been
initialized. This ensures collapsible sections in plugin config forms
work correctly.
Added debug logging to help diagnose if sections/icons aren't found.
Fixes: Collapsible headers in plugin config schema not collapsing
* fix(web): Improve toggleSection to explicitly show/hide collapsible content
Changed from classList.toggle() to explicit add/remove of 'hidden' class
based on current state. This ensures the content visibility is properly
controlled when collapsing/expanding sections.
Added better error checking and state detection for more reliable
collapsible section behavior.
* fix(web): Load plugin tabs on page load instead of waiting for plugin manager tab click
The stub's loadInstalledPlugins was an empty function, so plugin tabs
weren't loading until the plugin manager tab was clicked. Now the stub
implementation:
1. Tries to use global window.loadInstalledPlugins if available
2. Falls back to window.pluginManager.loadInstalledPlugins
3. Finally falls back to direct loading via loadInstalledPluginsDirectly
4. Always updates tabs after loading plugins
This ensures plugin navigation tabs are available immediately on page load.
Fixes: Plugin tabs only loading after clicking plugin manager tab
* fix(web): Ensure plugin navigation tabs load on any page regardless of active tab
Multiple improvements to ensure plugin tabs are always visible:
1. Stub's loadInstalledPluginsDirectly now waits for DOM to be ready
before updating tabs, using requestAnimationFrame for proper timing
2. Stub's init() now has a retry mechanism that periodically checks
if plugins have been loaded by plugins_manager.js and updates tabs
accordingly (checks for 2 seconds)
3. Full implementation's init() now properly handles async plugin loading
and ensures tabs are updated after loading completes, checking
window.installedPlugins first before attempting to load
4. Both stub and full implementation ensure tabs update using $nextTick
to wait for Alpine.js rendering cycle
This ensures plugin navigation tabs are visible immediately when the
page loads, regardless of whether the user is on overview, plugin manager,
or any other tab.
Fixes: Plugin tabs only appearing after clicking plugin manager tab
* fix(web): Fix restart display button not working
The initPluginsPage function was returning early before event listeners
were set up, making all the event listener code unreachable. Moved the
return statement to after all event listeners are attached.
This fixes the restart display button and all other buttons in the
plugin manager (refresh plugins, update all, search, etc.) that depend
on event listeners being set up.
Fixes: Restart Display button not working in plugin manager
* fix(web-ui): Improve categories field rendering for of-the-day plugin
- Add more explicit condition checking for additionalProperties objects
- Add debug logging specifically for categories field
- Add fallback handler for objects that don't match special cases (render as JSON textarea)
- Ensure categories section displays correctly with toggle cards instead of plain text
* fix(install): Prevent following broken symlinks during file ownership setup
- Add -P flag to find commands to prevent following symlinks when traversing
- Add -h flag to chown to operate on symlinks themselves rather than targets
- Exclude scripts/dev/plugins directory which contains development symlinks
- Fixes error when chown tries to dereference broken symlinks with extra LEDMatrix in path
* fix(scroll): Ensure scroll completes fully before switching displays
- Add display_width to total scroll distance calculation
- Scroll now continues until content is completely off screen
- Update scroll completion check to use total_scroll_width + display_width
- Prevents scroll from being cut off mid-way when switching to next display
* fix(install): Remove unsupported -P flag from find commands
- Remove -P flag which is not supported on all find versions
- Keep -h flag on chown to operate on symlinks themselves
- Change to {} \; syntax for better error handling
- Add error suppression to continue on broken symlinks
- Exclude scripts/dev/plugins directory to prevent traversal into broken symlinks
* docs(wifi): Add trailing newline to WiFi AP failover setup guide
* fix(web): Suppress non-critical socket errors and fix WiFi permissions script
- Add error filtering in web interface to suppress harmless client disconnection errors
- Downgrade 'No route to host' and broken pipe errors from ERROR to DEBUG level
- Fix WiFi permissions script to use mktemp instead of manual temp file creation
- Add cleanup trap to ensure temp files are removed on script exit
- Resolves permission denied errors when creating temp files during installation
* fix(web): Ensure plugin navigation tabs load on any page by dispatching events
The issue was that when plugins_manager.js loaded and called
loadInstalledPlugins(), it would set window.installedPlugins but the
Alpine.js component wouldn't know to update its tabs unless the plugin
manager tab was clicked.
Changes:
1. loadInstalledPlugins() now always dispatches a 'pluginsUpdated' event
when it sets window.installedPlugins, not just when plugin IDs change
2. renderInstalledPlugins() also dispatches the event and always updates
window.installedPlugins for reactivity
3. Cached plugin data also dispatches the event when returned
The Alpine component already listens for the 'pluginsUpdated' event in
its init() method, so tabs will now update immediately when plugins are
loaded, regardless of which tab is active.
Fixes: Plugin navigation tabs only loading after clicking plugin manager tab
* fix(web): Improve input field contrast in plugin configuration forms
Changed input backgrounds from bg-gray-800 to bg-gray-900 (darker) to
ensure high contrast with white text. Added placeholder:text-gray-400
for better placeholder text visibility.
Updated in both server-side template (plugin_config.html) and client-side
form generation (plugins_manager.js):
- Number inputs
- Text inputs
- Array inputs (comma-separated)
- Select dropdowns
- Textareas (JSON objects)
- Fallback inputs without schema
This ensures all form inputs have high contrast white text on dark
background, making them clearly visible and readable.
Fixes: White text on white background in plugin config inputs
* fix(web): Change plugin config input text from white to black
Changed all input fields in plugin configuration forms to use black text
on white background instead of white text on dark background for better
readability and standard form appearance.
Updated:
- Input backgrounds: bg-gray-900 -> bg-white
- Text color: text-white -> text-black
- Placeholder color: text-gray-400 -> text-gray-500
Applied to both server-side template and client-side form generation
for all input types (number, text, select, textarea).
* fix(web): Ensure toggleSection function is available for plugin config collapsible sections
Moved toggleSection function definition to an early script block so it's
available immediately when HTMX loads plugin configuration content. The
function was previously defined later in the page which could cause it
to not be accessible when inline onclick handlers try to call it.
The function toggles the 'hidden' class on collapsible section content
divs and rotates the chevron icon between right (collapsed) and down
(expanded) states.
Fixes: Plugin configuration section headers not collapsing/expanding
* fix(web): Fix collapsible section toggle to properly hide/show content
Updated toggleSection function to explicitly set display style in addition
to toggling the hidden class. This ensures the content is properly hidden
even if CSS specificity or other styles might interfere with just the
hidden class.
The function now:
- Checks both the hidden class and computed display style
- Explicitly sets display: '' when showing and display: 'none' when hiding
- Rotates chevron icon between right (collapsed) and down (expanded)
This ensures collapsible sections in plugin configuration forms properly
hide and show their content when the header is clicked.
Fixes: Collapsible section headers rotate chevron but don't hide content
* fix(web): Fix collapsible section toggle to work on first click
Simplified the toggle logic to rely primarily on the 'hidden' class check
rather than mixing it with computed display styles. When hiding, we now
remove any inline display style to let Tailwind's 'hidden' class properly
control the display property.
This ensures sections respond correctly on the first click, whether they're
starting in a collapsed or expanded state.
Fixes: Sections requiring 2 clicks to collapse
* fix(web): Ensure collapsible sections start collapsed by default
Added explicit display: none style to nested content divs in plugin config
template to ensure they start collapsed. The hidden class should handle this,
but adding the inline style ensures sections are definitely collapsed on
initial page load.
Sections now:
- Start collapsed (hidden) with chevron pointing right
- Expand when clicked (chevron points down)
- Collapse when clicked again (chevron points right)
This ensures a consistent collapsed initial state across all plugin
configuration sections.
* fix(web): Fix collapsible section toggle to properly collapse on second click
Fixed the toggle logic to explicitly set display: block when showing and
display: none when hiding, rather than clearing the display style. This
ensures the section state is properly tracked and the toggle works correctly
on both expand and collapse clicks.
The function now:
- When hidden: removes hidden class, sets display: block, chevron down
- When visible: adds hidden class, sets display: none, chevron right
This fixes the issue where sections would expand but not collapse again.
Fixes: Sections not collapsing on second click
* feat(web): Ensure plugin navigation tabs load automatically on any page
Implemented comprehensive solution to ensure plugin navigation tabs load
automatically without requiring a visit to the plugin manager page:
1. Global event listener for 'pluginsUpdated' - works even if Alpine isn't
ready yet, updates tabs directly when plugins_manager.js loads plugins
2. Enhanced stub's loadInstalledPluginsDirectly():
- Sets window.installedPlugins after loading
- Dispatches 'pluginsUpdated' event for global listener
- Adds console logging for debugging
3. Event listener in stub's init() method:
- Listens for 'pluginsUpdated' events
- Updates component state and tabs when events fire
4. Fallback timer:
- If plugins_manager.js hasn't loaded after 2 seconds, fetches
plugins directly via API
- Ensures tabs appear even if plugins_manager.js fails
5. Improved checkAndUpdateTabs():
- Better logging
- Fallback to direct fetch after timeout
6. Enhanced logging throughout plugin loading flow for debugging
This ensures plugin tabs are visible immediately on page load, regardless
of which tab is active or when plugins_manager.js loads.
Fixes: Plugin navigation tabs only loading after visiting plugin manager
* fix(web): Improve plugin tabs update logging and ensure immediate execution
Enhanced logging in updatePluginTabs() and _doUpdatePluginTabs() to help
debug why tabs aren't appearing. Changed debounce behavior to execute
immediately on first call to ensure tabs appear quickly.
Added detailed console logging with [FULL] prefix to track:
- When updatePluginTabs() is called
- When _doUpdatePluginTabs() executes
- DOM element availability
- Tab creation process
- Final tab count
This will help identify if tabs are being created but not visible, or if
the update function isn't being called at all.
Fixes: Plugin tabs loading but not visible in navigation bar
* fix(web): Prevent duplicate plugin tab updates and clearing
Added debouncing and duplicate prevention to stub's updatePluginTabs() to
prevent tabs from being cleared and re-added multiple times. Also checks
if tabs already match before clearing them.
Changes:
1. Debounce stub's updatePluginTabs() with 100ms delay
2. Check if existing tabs match current plugin list before clearing
3. Global event listener only triggers full implementation's updatePluginTabs
4. Stub's event listener only works in stub mode (before enhancement)
This prevents the issue where tabs were being cleared and re-added
multiple times in rapid succession, which could leave tabs empty.
Fixes: Plugin tabs being cleared and not re-added properly
* fix(web): Fix plugin tabs not rendering when plugins are loaded
Fixed _doUpdatePluginTabs() to properly use component's installedPlugins
instead of checking window.installedPlugins first. Also fixed the 'unchanged'
check to not skip when both lists are empty (first load scenario).
Changes:
1. Check component's installedPlugins first (most up-to-date)
2. Only skip update if plugins exist AND match (don't skip empty lists)
3. Retry if no plugins found (in case they're still loading)
4. Ensure window.installedPlugins is set when loading directly
5. Better logging to show which plugin source is being used
This ensures tabs are rendered when plugins are loaded, even on first page load.
Fixes: Plugin tabs not being drawn despite plugins being loaded
* fix(config): Fix array field parsing and validation for plugin config forms
- Added logic to detect and combine indexed array fields (text_color.0, text_color.1, etc.)
- Fixed array fields incorrectly stored as dicts with numeric keys
- Improved handling of comma-separated array values from form submissions
- Ensures array fields meet minItems requirements before validation
- Resolves 400 BAD REQUEST errors when saving plugin config with RGB color arrays
* fix(config): Improve array field handling and secrets error handling
- Use schema defaults when array fields don't meet minItems requirement
- Add debug logging for array field parsing
- Improve error handling for secrets file writes
- Fix arrays stored as dicts with numeric keys conversion
- Better handling of incomplete array values from form submissions
* fix(config): Convert array elements to correct types (numbers not strings)
- Fix array element type conversion when converting dicts to arrays
- Ensure RGB color arrays have integer elements, not strings
- Apply type conversion for both nested and top-level array fields
- Fixes validation errors: 'Expected type number, got str'
* fix(config): Fix array fields showing 'none' when value is null
- Handle None/null values in array field templates properly
- Use schema defaults when array values are None/null
- Fix applies to both Jinja2 template and JavaScript form generation
- Resolves issue where stock ticker plugin shows 'none' instead of default values
* fix(config): Add novalidate to plugin config form to prevent HTML5 validation blocking saves
- Prevents browser HTML5 validation from blocking form submission
- Allows custom validation logic to handle form data properly
- Fixes issue where save button appears unclickable due to invalid form controls
- Resolves problems with plugins like clock-simple that have nested/array fields
* feat(config): Add helpful form validation with detailed error messages
- Keep HTML5 validation enabled (removed novalidate) to prevent broken configs
- Add validatePluginConfigForm function that shows which fields fail and why
- Automatically expands collapsed sections containing invalid fields
- Focuses first invalid field and scrolls to it
- Shows user-friendly error messages with field names and specific issues
- Prevents form submission until all fields are valid
* fix(schema): Remove core properties from required array during validation
- Core properties (enabled, display_duration, live_priority) are system-managed
- SchemaManager now removes them from required array after injection
- Added default values for core properties (enabled=True, display_duration=15, live_priority=False)
- Updated generate_default_config() to ensure live_priority has default
- Resolves 186 validation issues, reducing to 3 non-blocking warnings (98.4% reduction)
- All 19 of 20 plugins now pass validation without errors
Documentation:
- Created docs/PLUGIN_CONFIG_CORE_PROPERTIES.md explaining core property handling
- Updated existing docs to reflect core property behavior
- Removed temporary audit files and scripts
* fix(ui): Improve button text contrast on white backgrounds
- Changed Screenshot button text from text-gray-700 to text-gray-900
- Added global CSS rule to ensure all buttons with white backgrounds use dark text (text-gray-900) for better readability
- Fixes contrast issues where light text on light backgrounds was illegible
* fix(ui): Add explicit text color to form-control inputs
- Added color: #111827 to .form-control class to ensure dark text on white backgrounds
- Fixes issue where input fields had white text on white background after button contrast fix
- Ensures all form inputs are readable with proper contrast
* docs: Update impact explanation and plugin config documentation
* docs: Improve documentation and fix template inconsistencies
- Add migration guide for script path reorganization (scripts moved to scripts/install/ and scripts/fix_perms/)
- Add breaking changes section to README with migration guidance
- Fix config template: set plugins_directory to 'plugins' to match actual plugin locations
- Fix test template: replace Jinja2 placeholders with plain text to match other templates
- Fix markdown linting: add language identifiers to code blocks (python, text, javascript)
- Update permission guide: document setgid bit (0o2775) for directory modes
- Fix example JSON: pin dependency versions and fix compatible_versions range
- Improve readability: reduce repetition in IMPACT_EXPLANATION.md
* feat(web): Make v3 interface production-ready for local deployment
- Phase 2: Real Service Integration
- Replace sample data with real psutil system monitoring (CPU, memory, disk, temp, uptime)
- Integrate display controller to read from /tmp/led_matrix_preview.png snapshot
- Scan assets/fonts directory and extract font metadata with freetype
- Phase 1: Security & Input Validation
- Add input validation module with URL, file upload, and config sanitization
- Add optional CSRF protection (gracefully degrades if flask-wtf missing)
- Add rate limiting (lenient for local use, prevents accidental abuse)
- Add file upload validation to font upload endpoint
- Phase 3: Error Handling
- Add global error handlers for 404, 500, and unhandled exceptions
- All endpoints have comprehensive try/except blocks
- Phase 4: Monitoring & Observability
- Add structured logging with JSON format support
- Add request logging middleware (tracks method, path, status, duration, IP)
- Add /api/v3/health endpoint with service status checks
- Phase 5: Performance & Caching
- Add in-memory caching system (separate module to avoid circular imports)
- Cache font catalog (5 minute TTL)
- Cache system status (10 second TTL)
- Invalidate cache on config changes
- All changes are non-blocking with graceful error handling
- Optional dependencies (flask-wtf, flask-limiter) degrade gracefully
- All imports protected with try/except blocks
- Verified compilation and import tests pass
* docs: Fix caching pattern logic flaw and merge conflict resolution plan
- Fix Basic Caching Pattern: Replace broken stale cache fallback with correct pattern
- Re-fetch cache with large max_age (31536000) in except block instead of checking already-falsy cached variable
- Fixes both instances in ADVANCED_PLUGIN_DEVELOPMENT.md
- Matches correct pattern from manager.py.template
- Fix MERGE_CONFLICT_RESOLUTION_PLAN.md merge direction
- Correct Step 1 to checkout main and merge plugins into it (not vice versa)
- Update commit message to reflect 'Merge plugins into main' direction
- Fixes workflow to match documented plugins → main merge
---------
Co-authored-by: Chuck <chuck@example.com>
5628 lines
242 KiB
Python
5628 lines
242 KiB
Python
from flask import Blueprint, request, jsonify, Response
|
|
import json
|
|
import os
|
|
import sys
|
|
import subprocess
|
|
import time
|
|
import hashlib
|
|
import uuid
|
|
from datetime import datetime
|
|
from pathlib import Path
|
|
|
|
# Import new infrastructure
|
|
from src.web_interface.api_helpers import success_response, error_response, validate_request_json
|
|
from src.web_interface.errors import ErrorCode
|
|
from src.plugin_system.operation_types import OperationType
|
|
from src.web_interface.logging_config import log_plugin_operation, log_config_change
|
|
from src.web_interface.validators import (
|
|
validate_image_url, validate_file_upload, validate_mime_type,
|
|
validate_numeric_range, validate_string_length, sanitize_plugin_config
|
|
)
|
|
|
|
# Will be initialized when blueprint is registered
|
|
config_manager = None
|
|
plugin_manager = None
|
|
plugin_store_manager = None
|
|
saved_repositories_manager = None
|
|
cache_manager = None
|
|
schema_manager = None
|
|
operation_queue = None
|
|
plugin_state_manager = None
|
|
operation_history = None
|
|
|
|
# Get project root directory (web_interface/../..)
|
|
PROJECT_ROOT = Path(__file__).resolve().parent.parent.parent
|
|
|
|
api_v3 = Blueprint('api_v3', __name__)
|
|
|
|
def _ensure_cache_manager():
|
|
"""Ensure cache manager is initialized."""
|
|
global cache_manager
|
|
if cache_manager is None:
|
|
from src.cache_manager import CacheManager
|
|
cache_manager = CacheManager()
|
|
return cache_manager
|
|
|
|
def _save_config_atomic(config_manager, config_data, create_backup=True):
|
|
"""
|
|
Save configuration using atomic save if available, fallback to regular save.
|
|
|
|
Returns:
|
|
tuple: (success: bool, error_message: str or None)
|
|
"""
|
|
if hasattr(config_manager, 'save_config_atomic'):
|
|
result = config_manager.save_config_atomic(config_data, create_backup=create_backup)
|
|
if result.status.value != 'success':
|
|
return False, result.message
|
|
return True, None
|
|
else:
|
|
try:
|
|
config_manager.save_config(config_data)
|
|
return True, None
|
|
except Exception as e:
|
|
return False, str(e)
|
|
|
|
def _get_display_service_status():
|
|
"""Return status information about the ledmatrix service."""
|
|
try:
|
|
result = subprocess.run(
|
|
['systemctl', 'is-active', 'ledmatrix'],
|
|
capture_output=True,
|
|
text=True,
|
|
timeout=3
|
|
)
|
|
return {
|
|
'active': result.stdout.strip() == 'active',
|
|
'returncode': result.returncode,
|
|
'stdout': result.stdout.strip(),
|
|
'stderr': result.stderr.strip()
|
|
}
|
|
except subprocess.TimeoutExpired:
|
|
return {
|
|
'active': False,
|
|
'returncode': -1,
|
|
'stdout': '',
|
|
'stderr': 'timeout'
|
|
}
|
|
except Exception as err:
|
|
return {
|
|
'active': False,
|
|
'returncode': -1,
|
|
'stdout': '',
|
|
'stderr': str(err)
|
|
}
|
|
|
|
def _run_systemctl_command(args):
|
|
"""Run a systemctl command safely."""
|
|
try:
|
|
result = subprocess.run(
|
|
args,
|
|
capture_output=True,
|
|
text=True,
|
|
timeout=15
|
|
)
|
|
return {
|
|
'returncode': result.returncode,
|
|
'stdout': result.stdout,
|
|
'stderr': result.stderr
|
|
}
|
|
except subprocess.TimeoutExpired:
|
|
return {
|
|
'returncode': -1,
|
|
'stdout': '',
|
|
'stderr': 'timeout'
|
|
}
|
|
except Exception as err:
|
|
return {
|
|
'returncode': -1,
|
|
'stdout': '',
|
|
'stderr': str(err)
|
|
}
|
|
|
|
def _ensure_display_service_running():
|
|
"""Ensure the ledmatrix display service is running."""
|
|
status = _get_display_service_status()
|
|
if status.get('active'):
|
|
status['started'] = False
|
|
return status
|
|
result = _run_systemctl_command(['sudo', 'systemctl', 'start', 'ledmatrix'])
|
|
service_status = _get_display_service_status()
|
|
result['started'] = result.get('returncode') == 0
|
|
result['active'] = service_status.get('active')
|
|
result['status'] = service_status
|
|
return result
|
|
|
|
def _stop_display_service():
|
|
"""Stop the ledmatrix display service."""
|
|
result = _run_systemctl_command(['sudo', 'systemctl', 'stop', 'ledmatrix'])
|
|
status = _get_display_service_status()
|
|
result['active'] = status.get('active')
|
|
result['status'] = status
|
|
return result
|
|
|
|
@api_v3.route('/config/main', methods=['GET'])
|
|
def get_main_config():
|
|
"""Get main configuration"""
|
|
try:
|
|
if not api_v3.config_manager:
|
|
return jsonify({'status': 'error', 'message': 'Config manager not initialized'}), 500
|
|
|
|
config = api_v3.config_manager.load_config()
|
|
return jsonify({'status': 'success', 'data': config})
|
|
except Exception as e:
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/config/schedule', methods=['GET'])
|
|
def get_schedule_config():
|
|
"""Get current schedule configuration"""
|
|
try:
|
|
if not api_v3.config_manager:
|
|
return error_response(
|
|
ErrorCode.CONFIG_LOAD_FAILED,
|
|
'Config manager not initialized',
|
|
status_code=500
|
|
)
|
|
|
|
config = api_v3.config_manager.load_config()
|
|
schedule_config = config.get('schedule', {})
|
|
|
|
return success_response(data=schedule_config)
|
|
except Exception as e:
|
|
return error_response(
|
|
ErrorCode.CONFIG_LOAD_FAILED,
|
|
f"Error loading schedule configuration: {str(e)}",
|
|
status_code=500
|
|
)
|
|
|
|
def _validate_time_format(time_str):
|
|
"""Validate time format is HH:MM"""
|
|
try:
|
|
datetime.strptime(time_str, '%H:%M')
|
|
return True, None
|
|
except (ValueError, TypeError):
|
|
return False, f"Invalid time format: {time_str}. Expected HH:MM format."
|
|
|
|
def _validate_time_range(start_time_str, end_time_str, allow_overnight=True):
|
|
"""Validate time range. Returns (is_valid, error_message)"""
|
|
try:
|
|
start_time = datetime.strptime(start_time_str, '%H:%M').time()
|
|
end_time = datetime.strptime(end_time_str, '%H:%M').time()
|
|
|
|
# Allow overnight schedules (start > end) or same-day schedules
|
|
if not allow_overnight and start_time >= end_time:
|
|
return False, f"Start time ({start_time_str}) must be before end time ({end_time_str}) for same-day schedules"
|
|
|
|
return True, None
|
|
except (ValueError, TypeError) as e:
|
|
return False, f"Invalid time format: {str(e)}"
|
|
|
|
@api_v3.route('/config/schedule', methods=['POST'])
|
|
def save_schedule_config():
|
|
"""Save schedule configuration"""
|
|
try:
|
|
if not api_v3.config_manager:
|
|
return jsonify({'status': 'error', 'message': 'Config manager not initialized'}), 500
|
|
|
|
data = request.get_json()
|
|
if not data:
|
|
return jsonify({'status': 'error', 'message': 'No data provided'}), 400
|
|
|
|
# Load current config
|
|
current_config = api_v3.config_manager.load_config()
|
|
|
|
# Build schedule configuration
|
|
# Handle enabled checkbox - can be True, False, or 'on'
|
|
enabled_value = data.get('enabled', False)
|
|
if isinstance(enabled_value, str):
|
|
enabled_value = enabled_value.lower() in ('true', 'on', '1')
|
|
schedule_config = {
|
|
'enabled': enabled_value
|
|
}
|
|
|
|
mode = data.get('mode', 'global')
|
|
|
|
if mode == 'global':
|
|
# Simple global schedule
|
|
start_time = data.get('start_time', '07:00')
|
|
end_time = data.get('end_time', '23:00')
|
|
|
|
# Validate time formats
|
|
is_valid, error_msg = _validate_time_format(start_time)
|
|
if not is_valid:
|
|
return error_response(
|
|
ErrorCode.VALIDATION_ERROR,
|
|
error_msg,
|
|
status_code=400
|
|
)
|
|
|
|
is_valid, error_msg = _validate_time_format(end_time)
|
|
if not is_valid:
|
|
return error_response(
|
|
ErrorCode.VALIDATION_ERROR,
|
|
error_msg,
|
|
status_code=400
|
|
)
|
|
|
|
schedule_config['start_time'] = start_time
|
|
schedule_config['end_time'] = end_time
|
|
# Remove days config when switching to global mode
|
|
schedule_config.pop('days', None)
|
|
else:
|
|
# Per-day schedule
|
|
schedule_config['days'] = {}
|
|
# Remove global times when switching to per-day mode
|
|
schedule_config.pop('start_time', None)
|
|
schedule_config.pop('end_time', None)
|
|
days = ['monday', 'tuesday', 'wednesday', 'thursday', 'friday', 'saturday', 'sunday']
|
|
enabled_days_count = 0
|
|
|
|
for day in days:
|
|
day_config = {}
|
|
enabled_key = f'{day}_enabled'
|
|
start_key = f'{day}_start'
|
|
end_key = f'{day}_end'
|
|
|
|
# Check if day is enabled
|
|
if enabled_key in data:
|
|
enabled_val = data[enabled_key]
|
|
# Handle checkbox values that may come as 'on', True, or False
|
|
if isinstance(enabled_val, str):
|
|
day_config['enabled'] = enabled_val.lower() in ('true', 'on', '1')
|
|
else:
|
|
day_config['enabled'] = bool(enabled_val)
|
|
else:
|
|
# Default to enabled if not specified
|
|
day_config['enabled'] = True
|
|
|
|
# Only add times if day is enabled
|
|
if day_config.get('enabled', True):
|
|
enabled_days_count += 1
|
|
start_time = None
|
|
end_time = None
|
|
|
|
if start_key in data and data[start_key]:
|
|
start_time = data[start_key]
|
|
else:
|
|
start_time = '07:00'
|
|
|
|
if end_key in data and data[end_key]:
|
|
end_time = data[end_key]
|
|
else:
|
|
end_time = '23:00'
|
|
|
|
# Validate time formats
|
|
is_valid, error_msg = _validate_time_format(start_time)
|
|
if not is_valid:
|
|
return error_response(
|
|
ErrorCode.VALIDATION_ERROR,
|
|
f"Invalid start time for {day}: {error_msg}",
|
|
status_code=400
|
|
)
|
|
|
|
is_valid, error_msg = _validate_time_format(end_time)
|
|
if not is_valid:
|
|
return error_response(
|
|
ErrorCode.VALIDATION_ERROR,
|
|
f"Invalid end time for {day}: {error_msg}",
|
|
status_code=400
|
|
)
|
|
|
|
day_config['start_time'] = start_time
|
|
day_config['end_time'] = end_time
|
|
|
|
schedule_config['days'][day] = day_config
|
|
|
|
# Validate that at least one day is enabled in per-day mode
|
|
if enabled_days_count == 0:
|
|
return error_response(
|
|
ErrorCode.VALIDATION_ERROR,
|
|
"At least one day must be enabled in per-day schedule mode",
|
|
status_code=400
|
|
)
|
|
|
|
# Update and save config using atomic save
|
|
current_config['schedule'] = schedule_config
|
|
success, error_msg = _save_config_atomic(api_v3.config_manager, current_config, create_backup=True)
|
|
if not success:
|
|
return error_response(
|
|
ErrorCode.CONFIG_SAVE_FAILED,
|
|
f"Failed to save schedule configuration: {error_msg}",
|
|
status_code=500
|
|
)
|
|
|
|
# Invalidate cache on config change
|
|
try:
|
|
from web_interface.cache import invalidate_cache
|
|
invalidate_cache()
|
|
except ImportError:
|
|
pass
|
|
|
|
return success_response(message='Schedule configuration saved successfully')
|
|
except Exception as e:
|
|
import logging
|
|
import traceback
|
|
error_msg = f"Error saving schedule config: {str(e)}\n{traceback.format_exc()}"
|
|
logging.error(error_msg)
|
|
return error_response(
|
|
ErrorCode.CONFIG_SAVE_FAILED,
|
|
f"Error saving schedule configuration: {str(e)}",
|
|
details=traceback.format_exc(),
|
|
status_code=500
|
|
)
|
|
|
|
@api_v3.route('/config/main', methods=['POST'])
|
|
def save_main_config():
|
|
"""Save main configuration"""
|
|
try:
|
|
if not api_v3.config_manager:
|
|
return jsonify({'status': 'error', 'message': 'Config manager not initialized'}), 500
|
|
|
|
# Try to get JSON data first, fallback to form data
|
|
data = None
|
|
if request.content_type == 'application/json':
|
|
data = request.get_json()
|
|
else:
|
|
# Handle form data
|
|
data = request.form.to_dict()
|
|
# Convert checkbox values
|
|
for key in ['web_display_autostart']:
|
|
if key in data:
|
|
data[key] = data[key] == 'on'
|
|
|
|
if not data:
|
|
return jsonify({'status': 'error', 'message': 'No data provided'}), 400
|
|
|
|
import logging
|
|
logging.error(f"DEBUG: save_main_config received data: {data}")
|
|
logging.error(f"DEBUG: Content-Type header: {request.content_type}")
|
|
logging.error(f"DEBUG: Headers: {dict(request.headers)}")
|
|
|
|
# Merge with existing config (similar to original implementation)
|
|
current_config = api_v3.config_manager.load_config()
|
|
|
|
# Handle general settings
|
|
# Note: Checkboxes don't send data when unchecked, so we need to check if we're updating general settings
|
|
# If any general setting is present, we're updating the general tab
|
|
is_general_update = any(k in data for k in ['timezone', 'city', 'state', 'country', 'web_display_autostart',
|
|
'auto_discover', 'auto_load_enabled', 'development_mode', 'plugins_directory'])
|
|
|
|
if is_general_update:
|
|
# For checkbox: if not present in data during general update, it means unchecked
|
|
current_config['web_display_autostart'] = data.get('web_display_autostart', False)
|
|
|
|
if 'timezone' in data:
|
|
current_config['timezone'] = data['timezone']
|
|
|
|
# Handle location settings
|
|
if 'city' in data or 'state' in data or 'country' in data:
|
|
if 'location' not in current_config:
|
|
current_config['location'] = {}
|
|
if 'city' in data:
|
|
current_config['location']['city'] = data['city']
|
|
if 'state' in data:
|
|
current_config['location']['state'] = data['state']
|
|
if 'country' in data:
|
|
current_config['location']['country'] = data['country']
|
|
|
|
# Handle plugin system settings
|
|
if 'auto_discover' in data or 'auto_load_enabled' in data or 'development_mode' in data or 'plugins_directory' in data:
|
|
if 'plugin_system' not in current_config:
|
|
current_config['plugin_system'] = {}
|
|
|
|
# Handle plugin system checkboxes
|
|
for checkbox in ['auto_discover', 'auto_load_enabled', 'development_mode']:
|
|
if checkbox in data:
|
|
current_config['plugin_system'][checkbox] = data.get(checkbox, False)
|
|
|
|
# Handle plugins_directory
|
|
if 'plugins_directory' in data:
|
|
current_config['plugin_system']['plugins_directory'] = data['plugins_directory']
|
|
|
|
# Handle display settings
|
|
display_fields = ['rows', 'cols', 'chain_length', 'parallel', 'brightness', 'hardware_mapping',
|
|
'gpio_slowdown', 'scan_mode', 'disable_hardware_pulsing', 'inverse_colors', 'show_refresh_rate',
|
|
'pwm_bits', 'pwm_dither_bits', 'pwm_lsb_nanoseconds', 'limit_refresh_rate_hz', 'use_short_date_format',
|
|
'max_dynamic_duration_seconds']
|
|
|
|
if any(k in data for k in display_fields):
|
|
if 'display' not in current_config:
|
|
current_config['display'] = {}
|
|
if 'hardware' not in current_config['display']:
|
|
current_config['display']['hardware'] = {}
|
|
if 'runtime' not in current_config['display']:
|
|
current_config['display']['runtime'] = {}
|
|
|
|
# Handle hardware settings
|
|
for field in ['rows', 'cols', 'chain_length', 'parallel', 'brightness', 'hardware_mapping', 'scan_mode',
|
|
'pwm_bits', 'pwm_dither_bits', 'pwm_lsb_nanoseconds', 'limit_refresh_rate_hz']:
|
|
if field in data:
|
|
if field in ['rows', 'cols', 'chain_length', 'parallel', 'brightness', 'scan_mode',
|
|
'pwm_bits', 'pwm_dither_bits', 'pwm_lsb_nanoseconds', 'limit_refresh_rate_hz']:
|
|
current_config['display']['hardware'][field] = int(data[field])
|
|
else:
|
|
current_config['display']['hardware'][field] = data[field]
|
|
|
|
# Handle runtime settings
|
|
if 'gpio_slowdown' in data:
|
|
current_config['display']['runtime']['gpio_slowdown'] = int(data['gpio_slowdown'])
|
|
|
|
# Handle checkboxes
|
|
for checkbox in ['disable_hardware_pulsing', 'inverse_colors', 'show_refresh_rate']:
|
|
current_config['display']['hardware'][checkbox] = data.get(checkbox, False)
|
|
|
|
# Handle display-level checkboxes
|
|
if 'use_short_date_format' in data:
|
|
current_config['display']['use_short_date_format'] = data.get('use_short_date_format', False)
|
|
|
|
# Handle dynamic duration settings
|
|
if 'max_dynamic_duration_seconds' in data:
|
|
if 'dynamic_duration' not in current_config['display']:
|
|
current_config['display']['dynamic_duration'] = {}
|
|
current_config['display']['dynamic_duration']['max_duration_seconds'] = int(data['max_dynamic_duration_seconds'])
|
|
|
|
# Handle display durations
|
|
duration_fields = [k for k in data.keys() if k.endswith('_duration') or k in ['default_duration', 'transition_duration']]
|
|
if duration_fields:
|
|
if 'display' not in current_config:
|
|
current_config['display'] = {}
|
|
if 'display_durations' not in current_config['display']:
|
|
current_config['display']['display_durations'] = {}
|
|
|
|
for field in duration_fields:
|
|
if field in data:
|
|
current_config['display']['display_durations'][field] = int(data[field])
|
|
|
|
# Handle plugin configurations dynamically
|
|
# Any key that matches a plugin ID should be saved as plugin config
|
|
# This includes proper secret field handling from schema
|
|
plugin_keys_to_remove = []
|
|
for key in data:
|
|
# Check if this key is a plugin ID
|
|
if api_v3.plugin_manager and key in api_v3.plugin_manager.plugin_manifests:
|
|
plugin_id = key
|
|
plugin_config = data[key]
|
|
|
|
# Load plugin schema to identify secret fields (same logic as save_plugin_config)
|
|
secret_fields = set()
|
|
if api_v3.plugin_manager:
|
|
plugins_dir = api_v3.plugin_manager.plugins_dir
|
|
else:
|
|
plugin_system_config = current_config.get('plugin_system', {})
|
|
plugins_dir_name = plugin_system_config.get('plugins_directory', 'plugin-repos')
|
|
if os.path.isabs(plugins_dir_name):
|
|
plugins_dir = Path(plugins_dir_name)
|
|
else:
|
|
plugins_dir = PROJECT_ROOT / plugins_dir_name
|
|
schema_path = plugins_dir / plugin_id / 'config_schema.json'
|
|
|
|
def find_secret_fields(properties, prefix=''):
|
|
"""Recursively find fields marked with x-secret: true"""
|
|
fields = set()
|
|
for field_name, field_props in properties.items():
|
|
full_path = f"{prefix}.{field_name}" if prefix else field_name
|
|
if field_props.get('x-secret', False):
|
|
fields.add(full_path)
|
|
# Check nested objects
|
|
if field_props.get('type') == 'object' and 'properties' in field_props:
|
|
fields.update(find_secret_fields(field_props['properties'], full_path))
|
|
return fields
|
|
|
|
if schema_path.exists():
|
|
try:
|
|
with open(schema_path, 'r', encoding='utf-8') as f:
|
|
schema = json.load(f)
|
|
if 'properties' in schema:
|
|
secret_fields = find_secret_fields(schema['properties'])
|
|
except Exception as e:
|
|
print(f"Error reading schema for secret detection: {e}")
|
|
|
|
# Separate secrets from regular config (same logic as save_plugin_config)
|
|
def separate_secrets(config, secrets_set, prefix=''):
|
|
"""Recursively separate secret fields from regular config"""
|
|
regular = {}
|
|
secrets = {}
|
|
for key, value in config.items():
|
|
full_path = f"{prefix}.{key}" if prefix else key
|
|
if isinstance(value, dict):
|
|
nested_regular, nested_secrets = separate_secrets(value, secrets_set, full_path)
|
|
if nested_regular:
|
|
regular[key] = nested_regular
|
|
if nested_secrets:
|
|
secrets[key] = nested_secrets
|
|
elif full_path in secrets_set:
|
|
secrets[key] = value
|
|
else:
|
|
regular[key] = value
|
|
return regular, secrets
|
|
|
|
regular_config, secrets_config = separate_secrets(plugin_config, secret_fields)
|
|
|
|
# PRE-PROCESSING: Preserve 'enabled' state if not in regular_config
|
|
# This prevents overwriting the enabled state when saving config from a form that doesn't include the toggle
|
|
if 'enabled' not in regular_config:
|
|
try:
|
|
if plugin_id in current_config and 'enabled' in current_config[plugin_id]:
|
|
regular_config['enabled'] = current_config[plugin_id]['enabled']
|
|
elif api_v3.plugin_manager:
|
|
# Fallback to plugin instance if config doesn't have it
|
|
plugin_instance = api_v3.plugin_manager.get_plugin(plugin_id)
|
|
if plugin_instance:
|
|
regular_config['enabled'] = plugin_instance.enabled
|
|
# Final fallback: default to True if plugin is loaded (matches BasePlugin default)
|
|
if 'enabled' not in regular_config:
|
|
regular_config['enabled'] = True
|
|
except Exception as e:
|
|
print(f"Error preserving enabled state for {plugin_id}: {e}")
|
|
# Default to True on error to avoid disabling plugins
|
|
regular_config['enabled'] = True
|
|
|
|
# Get current secrets config
|
|
current_secrets = api_v3.config_manager.get_raw_file_content('secrets')
|
|
|
|
# Deep merge regular config into main config
|
|
if plugin_id not in current_config:
|
|
current_config[plugin_id] = {}
|
|
current_config[plugin_id] = deep_merge(current_config[plugin_id], regular_config)
|
|
|
|
# Deep merge secrets into secrets config
|
|
if secrets_config:
|
|
if plugin_id not in current_secrets:
|
|
current_secrets[plugin_id] = {}
|
|
current_secrets[plugin_id] = deep_merge(current_secrets[plugin_id], secrets_config)
|
|
# Save secrets file
|
|
api_v3.config_manager.save_raw_file_content('secrets', current_secrets)
|
|
|
|
# Mark for removal from data dict (already processed)
|
|
plugin_keys_to_remove.append(key)
|
|
|
|
# Notify plugin of config change if loaded (with merged config including secrets)
|
|
try:
|
|
if api_v3.plugin_manager:
|
|
plugin_instance = api_v3.plugin_manager.get_plugin(plugin_id)
|
|
if plugin_instance:
|
|
# Reload merged config (includes secrets) and pass the plugin-specific section
|
|
merged_config = api_v3.config_manager.load_config()
|
|
plugin_full_config = merged_config.get(plugin_id, {})
|
|
if hasattr(plugin_instance, 'on_config_change'):
|
|
plugin_instance.on_config_change(plugin_full_config)
|
|
except Exception as hook_err:
|
|
# Don't fail the save if hook fails
|
|
print(f"Warning: on_config_change failed for {plugin_id}: {hook_err}")
|
|
|
|
# Remove processed plugin keys from data (they're already in current_config)
|
|
for key in plugin_keys_to_remove:
|
|
del data[key]
|
|
|
|
# Handle any remaining config keys
|
|
# System settings (timezone, city, etc.) are already handled above
|
|
# Plugin configs should use /api/v3/plugins/config endpoint, but we'll handle them here too for flexibility
|
|
for key in data:
|
|
# Skip system settings that are already handled above
|
|
if key in ['timezone', 'city', 'state', 'country',
|
|
'web_display_autostart', 'auto_discover',
|
|
'auto_load_enabled', 'development_mode',
|
|
'plugins_directory']:
|
|
continue
|
|
# For any remaining keys (including plugin keys), use deep merge to preserve existing settings
|
|
if key in current_config and isinstance(current_config[key], dict) and isinstance(data[key], dict):
|
|
# Deep merge to preserve existing settings
|
|
current_config[key] = deep_merge(current_config[key], data[key])
|
|
else:
|
|
current_config[key] = data[key]
|
|
|
|
# Save the merged config using atomic save
|
|
success, error_msg = _save_config_atomic(api_v3.config_manager, current_config, create_backup=True)
|
|
if not success:
|
|
return error_response(
|
|
ErrorCode.CONFIG_SAVE_FAILED,
|
|
f"Failed to save configuration: {error_msg}",
|
|
status_code=500
|
|
)
|
|
|
|
# Invalidate cache on config change
|
|
try:
|
|
from web_interface.cache import invalidate_cache
|
|
invalidate_cache()
|
|
except ImportError:
|
|
pass
|
|
|
|
return success_response(message='Configuration saved successfully')
|
|
except Exception as e:
|
|
import logging
|
|
import traceback
|
|
error_msg = f"Error saving config: {str(e)}\n{traceback.format_exc()}"
|
|
logging.error(error_msg)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/config/secrets', methods=['GET'])
|
|
def get_secrets_config():
|
|
"""Get secrets configuration"""
|
|
try:
|
|
if not api_v3.config_manager:
|
|
return jsonify({'status': 'error', 'message': 'Config manager not initialized'}), 500
|
|
|
|
config = api_v3.config_manager.get_raw_file_content('secrets')
|
|
return jsonify({'status': 'success', 'data': config})
|
|
except Exception as e:
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/config/raw/main', methods=['POST'])
|
|
def save_raw_main_config():
|
|
"""Save raw main configuration JSON"""
|
|
try:
|
|
if not api_v3.config_manager:
|
|
return jsonify({'status': 'error', 'message': 'Config manager not initialized'}), 500
|
|
|
|
data = request.get_json()
|
|
if not data:
|
|
return jsonify({'status': 'error', 'message': 'No data provided'}), 400
|
|
|
|
# Validate that it's valid JSON (already parsed by request.get_json())
|
|
# Save the raw config file
|
|
api_v3.config_manager.save_raw_file_content('main', data)
|
|
|
|
return jsonify({'status': 'success', 'message': 'Main configuration saved successfully'})
|
|
except json.JSONDecodeError as e:
|
|
return jsonify({'status': 'error', 'message': f'Invalid JSON: {str(e)}'}), 400
|
|
except Exception as e:
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/config/raw/secrets', methods=['POST'])
|
|
def save_raw_secrets_config():
|
|
"""Save raw secrets configuration JSON"""
|
|
try:
|
|
if not api_v3.config_manager:
|
|
return jsonify({'status': 'error', 'message': 'Config manager not initialized'}), 500
|
|
|
|
data = request.get_json()
|
|
if not data:
|
|
return jsonify({'status': 'error', 'message': 'No data provided'}), 400
|
|
|
|
# Save the secrets config
|
|
api_v3.config_manager.save_raw_file_content('secrets', data)
|
|
|
|
# Reload GitHub token in plugin store manager if it exists
|
|
if api_v3.plugin_store_manager:
|
|
api_v3.plugin_store_manager.github_token = api_v3.plugin_store_manager._load_github_token()
|
|
|
|
return jsonify({'status': 'success', 'message': 'Secrets configuration saved successfully'})
|
|
except json.JSONDecodeError as e:
|
|
return jsonify({'status': 'error', 'message': f'Invalid JSON: {str(e)}'}), 400
|
|
except Exception as e:
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/system/status', methods=['GET'])
|
|
def get_system_status():
|
|
"""Get system status"""
|
|
try:
|
|
# Check cache first (10 second TTL for system status)
|
|
try:
|
|
from web_interface.cache import get_cached, set_cached
|
|
cached_result = get_cached('system_status', ttl_seconds=10)
|
|
if cached_result is not None:
|
|
return jsonify({'status': 'success', 'data': cached_result})
|
|
except ImportError:
|
|
# Cache not available, continue without caching
|
|
get_cached = None
|
|
set_cached = None
|
|
|
|
# Import psutil for system monitoring
|
|
try:
|
|
import psutil
|
|
except ImportError:
|
|
# Fallback if psutil not available
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'psutil not available for system monitoring'
|
|
}), 503
|
|
|
|
# Get system metrics using psutil
|
|
cpu_percent = psutil.cpu_percent(interval=0.1) # Short interval for responsiveness
|
|
memory = psutil.virtual_memory()
|
|
memory_percent = memory.percent
|
|
disk = psutil.disk_usage('/')
|
|
disk_percent = disk.percent
|
|
|
|
# Calculate uptime
|
|
boot_time = psutil.boot_time()
|
|
uptime_seconds = time.time() - boot_time
|
|
uptime_hours = uptime_seconds / 3600
|
|
uptime_days = uptime_hours / 24
|
|
|
|
# Format uptime string
|
|
if uptime_days >= 1:
|
|
uptime_str = f"{int(uptime_days)}d {int(uptime_hours % 24)}h"
|
|
elif uptime_hours >= 1:
|
|
uptime_str = f"{int(uptime_hours)}h {int((uptime_seconds % 3600) / 60)}m"
|
|
else:
|
|
uptime_str = f"{int(uptime_seconds / 60)}m"
|
|
|
|
# Get CPU temperature (Raspberry Pi)
|
|
cpu_temp = None
|
|
try:
|
|
temp_file = '/sys/class/thermal/thermal_zone0/temp'
|
|
if os.path.exists(temp_file):
|
|
with open(temp_file, 'r') as f:
|
|
temp_millidegrees = int(f.read().strip())
|
|
cpu_temp = temp_millidegrees / 1000.0 # Convert to Celsius
|
|
except (IOError, ValueError, OSError):
|
|
# Temperature sensor not available or error reading
|
|
cpu_temp = None
|
|
|
|
# Get display service status
|
|
service_status = _get_display_service_status()
|
|
|
|
status = {
|
|
'timestamp': time.time(),
|
|
'uptime': uptime_str,
|
|
'uptime_seconds': int(uptime_seconds),
|
|
'service_active': service_status.get('active', False),
|
|
'cpu_percent': round(cpu_percent, 1),
|
|
'memory_used_percent': round(memory_percent, 1),
|
|
'memory_total_mb': round(memory.total / (1024 * 1024), 1),
|
|
'memory_used_mb': round(memory.used / (1024 * 1024), 1),
|
|
'cpu_temp': round(cpu_temp, 1) if cpu_temp is not None else None,
|
|
'disk_used_percent': round(disk_percent, 1),
|
|
'disk_total_gb': round(disk.total / (1024 * 1024 * 1024), 1),
|
|
'disk_used_gb': round(disk.used / (1024 * 1024 * 1024), 1)
|
|
}
|
|
|
|
# Cache the result if available
|
|
if set_cached:
|
|
try:
|
|
set_cached('system_status', status, ttl_seconds=10)
|
|
except Exception:
|
|
pass # Cache write failed, but continue
|
|
|
|
return jsonify({'status': 'success', 'data': status})
|
|
except Exception as e:
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/health', methods=['GET'])
|
|
def get_health():
|
|
"""Get system health status"""
|
|
try:
|
|
health_status = {
|
|
'status': 'healthy',
|
|
'timestamp': time.time(),
|
|
'services': {},
|
|
'checks': {}
|
|
}
|
|
|
|
# Check web interface service
|
|
health_status['services']['web_interface'] = {
|
|
'status': 'running',
|
|
'uptime_seconds': time.time() - (getattr(get_health, '_start_time', time.time()))
|
|
}
|
|
get_health._start_time = getattr(get_health, '_start_time', time.time())
|
|
|
|
# Check display service
|
|
display_service_status = _get_display_service_status()
|
|
health_status['services']['display_service'] = {
|
|
'status': 'active' if display_service_status.get('active') else 'inactive',
|
|
'details': display_service_status
|
|
}
|
|
|
|
# Check config file accessibility
|
|
try:
|
|
if config_manager:
|
|
test_config = config_manager.load_config()
|
|
health_status['checks']['config_file'] = {
|
|
'status': 'accessible',
|
|
'readable': True
|
|
}
|
|
else:
|
|
health_status['checks']['config_file'] = {
|
|
'status': 'unknown',
|
|
'readable': False
|
|
}
|
|
except Exception as e:
|
|
health_status['checks']['config_file'] = {
|
|
'status': 'error',
|
|
'readable': False,
|
|
'error': str(e)
|
|
}
|
|
|
|
# Check plugin system
|
|
try:
|
|
if plugin_manager:
|
|
# Try to discover plugins (lightweight check)
|
|
plugin_count = len(plugin_manager.get_available_plugins()) if hasattr(plugin_manager, 'get_available_plugins') else 0
|
|
health_status['checks']['plugin_system'] = {
|
|
'status': 'operational',
|
|
'plugin_count': plugin_count
|
|
}
|
|
else:
|
|
health_status['checks']['plugin_system'] = {
|
|
'status': 'not_initialized'
|
|
}
|
|
except Exception as e:
|
|
health_status['checks']['plugin_system'] = {
|
|
'status': 'error',
|
|
'error': str(e)
|
|
}
|
|
|
|
# Check hardware connectivity (if display manager available)
|
|
try:
|
|
snapshot_path = "/tmp/led_matrix_preview.png"
|
|
if os.path.exists(snapshot_path):
|
|
# Check if snapshot is recent (updated in last 60 seconds)
|
|
mtime = os.path.getmtime(snapshot_path)
|
|
age_seconds = time.time() - mtime
|
|
health_status['checks']['hardware'] = {
|
|
'status': 'connected' if age_seconds < 60 else 'stale',
|
|
'snapshot_age_seconds': round(age_seconds, 1)
|
|
}
|
|
else:
|
|
health_status['checks']['hardware'] = {
|
|
'status': 'no_snapshot',
|
|
'note': 'Display service may not be running'
|
|
}
|
|
except Exception as e:
|
|
health_status['checks']['hardware'] = {
|
|
'status': 'unknown',
|
|
'error': str(e)
|
|
}
|
|
|
|
# Determine overall health
|
|
all_healthy = all(
|
|
check.get('status') in ['accessible', 'operational', 'connected', 'running', 'active']
|
|
for check in health_status['checks'].values()
|
|
)
|
|
|
|
if not all_healthy:
|
|
health_status['status'] = 'degraded'
|
|
|
|
return jsonify({'status': 'success', 'data': health_status})
|
|
except Exception as e:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': str(e),
|
|
'data': {'status': 'unhealthy'}
|
|
}), 500
|
|
|
|
def get_git_version(project_dir=None):
|
|
"""Get git version information from the repository"""
|
|
if project_dir is None:
|
|
project_dir = PROJECT_ROOT
|
|
|
|
try:
|
|
# Try to get tag description (e.g., v2.4-10-g123456)
|
|
result = subprocess.run(
|
|
['git', 'describe', '--tags', '--dirty'],
|
|
capture_output=True,
|
|
text=True,
|
|
timeout=5,
|
|
cwd=str(project_dir)
|
|
)
|
|
|
|
if result.returncode == 0:
|
|
return result.stdout.strip()
|
|
|
|
# Fallback to short commit hash
|
|
result = subprocess.run(
|
|
['git', 'rev-parse', '--short', 'HEAD'],
|
|
capture_output=True,
|
|
text=True,
|
|
timeout=5,
|
|
cwd=str(project_dir)
|
|
)
|
|
|
|
if result.returncode == 0:
|
|
return result.stdout.strip()
|
|
|
|
return 'Unknown'
|
|
except Exception:
|
|
return 'Unknown'
|
|
|
|
@api_v3.route('/system/version', methods=['GET'])
|
|
def get_system_version():
|
|
"""Get LEDMatrix repository version"""
|
|
try:
|
|
version = get_git_version()
|
|
return jsonify({'status': 'success', 'data': {'version': version}})
|
|
except Exception as e:
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/system/action', methods=['POST'])
|
|
def execute_system_action():
|
|
"""Execute system actions (start/stop/reboot/etc)"""
|
|
try:
|
|
# HTMX sends data as form data, not JSON
|
|
data = request.get_json(silent=True) or {}
|
|
if not data:
|
|
# Try to get from form data if JSON fails
|
|
data = {
|
|
'action': request.form.get('action'),
|
|
'mode': request.form.get('mode')
|
|
}
|
|
|
|
if not data or 'action' not in data:
|
|
return jsonify({'status': 'error', 'message': 'Action required'}), 400
|
|
|
|
action = data['action']
|
|
mode = data.get('mode') # For on-demand modes
|
|
|
|
# Map actions to subprocess calls (similar to original implementation)
|
|
if action == 'start_display':
|
|
if mode:
|
|
# For on-demand modes, we would need to integrate with the display controller
|
|
# For now, just start the display service
|
|
result = subprocess.run(['sudo', 'systemctl', 'start', 'ledmatrix'],
|
|
capture_output=True, text=True)
|
|
return jsonify({
|
|
'status': 'success' if result.returncode == 0 else 'error',
|
|
'message': f'Started display in {mode} mode',
|
|
'returncode': result.returncode,
|
|
'stdout': result.stdout,
|
|
'stderr': result.stderr
|
|
})
|
|
else:
|
|
result = subprocess.run(['sudo', 'systemctl', 'start', 'ledmatrix'],
|
|
capture_output=True, text=True)
|
|
elif action == 'stop_display':
|
|
result = subprocess.run(['sudo', 'systemctl', 'stop', 'ledmatrix'],
|
|
capture_output=True, text=True)
|
|
elif action == 'enable_autostart':
|
|
result = subprocess.run(['sudo', 'systemctl', 'enable', 'ledmatrix'],
|
|
capture_output=True, text=True)
|
|
elif action == 'disable_autostart':
|
|
result = subprocess.run(['sudo', 'systemctl', 'disable', 'ledmatrix'],
|
|
capture_output=True, text=True)
|
|
elif action == 'reboot_system':
|
|
result = subprocess.run(['sudo', 'reboot'],
|
|
capture_output=True, text=True)
|
|
elif action == 'git_pull':
|
|
# Use PROJECT_ROOT instead of hardcoded path
|
|
project_dir = str(PROJECT_ROOT)
|
|
|
|
# Check if there are local changes that need to be stashed
|
|
# Exclude plugins directory - plugins are separate repos and shouldn't be stashed with base project
|
|
# Use --untracked-files=no to skip untracked files check (much faster with symlinked plugins)
|
|
try:
|
|
status_result = subprocess.run(
|
|
['git', 'status', '--porcelain', '--untracked-files=no'],
|
|
capture_output=True,
|
|
text=True,
|
|
timeout=30,
|
|
cwd=project_dir
|
|
)
|
|
# Filter out any changes in plugins directory - plugins are separate repositories
|
|
# Git status format: XY filename (where X is status of index, Y is status of work tree)
|
|
status_lines = [line for line in status_result.stdout.strip().split('\n')
|
|
if line.strip() and 'plugins/' not in line]
|
|
has_changes = bool('\n'.join(status_lines).strip())
|
|
except subprocess.TimeoutExpired:
|
|
# If status check times out, assume there might be changes and proceed
|
|
# This is safer than failing the update
|
|
has_changes = True
|
|
status_result = type('obj', (object,), {'stdout': '', 'stderr': 'Status check timed out'})()
|
|
|
|
stash_info = ""
|
|
|
|
# Stash local changes if they exist (excluding plugins)
|
|
# Plugins are separate repositories and shouldn't be stashed with base project updates
|
|
if has_changes:
|
|
try:
|
|
# Use pathspec to exclude plugins directory from stash
|
|
stash_result = subprocess.run(
|
|
['git', 'stash', 'push', '-m', 'LEDMatrix auto-stash before update', '--', ':!plugins'],
|
|
capture_output=True,
|
|
text=True,
|
|
timeout=30,
|
|
cwd=project_dir
|
|
)
|
|
if stash_result.returncode == 0:
|
|
print(f"Stashed local changes: {stash_result.stdout}")
|
|
stash_info = " Local changes were stashed."
|
|
else:
|
|
# If stash fails, log but continue with pull
|
|
print(f"Stash failed: {stash_result.stderr}")
|
|
except subprocess.TimeoutExpired:
|
|
print("Stash operation timed out, proceeding with pull")
|
|
|
|
# Perform the git pull
|
|
result = subprocess.run(
|
|
['git', 'pull', '--rebase'],
|
|
capture_output=True,
|
|
text=True,
|
|
timeout=60,
|
|
cwd=project_dir
|
|
)
|
|
|
|
# Return custom response for git_pull
|
|
if result.returncode == 0:
|
|
pull_message = "Code updated successfully."
|
|
if has_changes:
|
|
pull_message = f"Code updated successfully. Local changes were automatically stashed.{stash_info}"
|
|
if result.stdout and "Already up to date" not in result.stdout:
|
|
pull_message = f"Code updated successfully.{stash_info}"
|
|
else:
|
|
pull_message = f"Update failed: {result.stderr or 'Unknown error'}"
|
|
|
|
return jsonify({
|
|
'status': 'success' if result.returncode == 0 else 'error',
|
|
'message': pull_message,
|
|
'returncode': result.returncode,
|
|
'stdout': result.stdout,
|
|
'stderr': result.stderr
|
|
})
|
|
elif action == 'restart_display_service':
|
|
result = subprocess.run(['sudo', 'systemctl', 'restart', 'ledmatrix'],
|
|
capture_output=True, text=True)
|
|
elif action == 'restart_web_service':
|
|
# Try to restart the web service (assuming it's ledmatrix-web.service)
|
|
result = subprocess.run(['sudo', 'systemctl', 'restart', 'ledmatrix-web'],
|
|
capture_output=True, text=True)
|
|
else:
|
|
return jsonify({'status': 'error', 'message': f'Unknown action: {action}'}), 400
|
|
|
|
return jsonify({
|
|
'status': 'success' if result.returncode == 0 else 'error',
|
|
'message': f'Action {action} completed',
|
|
'returncode': result.returncode,
|
|
'stdout': result.stdout,
|
|
'stderr': result.stderr
|
|
})
|
|
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in execute_system_action: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e), 'details': error_details}), 500
|
|
|
|
@api_v3.route('/display/current', methods=['GET'])
|
|
def get_display_current():
|
|
"""Get current display state"""
|
|
try:
|
|
import base64
|
|
from PIL import Image
|
|
import io
|
|
|
|
snapshot_path = "/tmp/led_matrix_preview.png"
|
|
|
|
# Get display dimensions from config
|
|
try:
|
|
if config_manager:
|
|
main_config = config_manager.load_config()
|
|
hardware_config = main_config.get('display', {}).get('hardware', {})
|
|
cols = hardware_config.get('cols', 64)
|
|
chain_length = hardware_config.get('chain_length', 2)
|
|
rows = hardware_config.get('rows', 32)
|
|
parallel = hardware_config.get('parallel', 1)
|
|
width = cols * chain_length
|
|
height = rows * parallel
|
|
else:
|
|
width = 128
|
|
height = 64
|
|
except Exception:
|
|
width = 128
|
|
height = 64
|
|
|
|
# Try to read snapshot file
|
|
image_data = None
|
|
if os.path.exists(snapshot_path):
|
|
try:
|
|
with Image.open(snapshot_path) as img:
|
|
# Convert to PNG and encode as base64
|
|
buffer = io.BytesIO()
|
|
img.save(buffer, format='PNG')
|
|
image_data = base64.b64encode(buffer.getvalue()).decode('utf-8')
|
|
except Exception as img_err:
|
|
# File might be being written or corrupted, return None
|
|
pass
|
|
|
|
display_data = {
|
|
'timestamp': time.time(),
|
|
'width': width,
|
|
'height': height,
|
|
'image': image_data # Base64 encoded image data or None if unavailable
|
|
}
|
|
return jsonify({'status': 'success', 'data': display_data})
|
|
except Exception as e:
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/display/on-demand/status', methods=['GET'])
|
|
def get_on_demand_status():
|
|
"""Return the current on-demand display state."""
|
|
try:
|
|
cache = _ensure_cache_manager()
|
|
state = cache.get('display_on_demand_state', max_age=120)
|
|
if state is None:
|
|
state = {
|
|
'active': False,
|
|
'status': 'idle',
|
|
'last_updated': None
|
|
}
|
|
service_status = _get_display_service_status()
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': {
|
|
'state': state,
|
|
'service': service_status
|
|
}
|
|
})
|
|
except Exception as exc:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in get_on_demand_status: {exc}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(exc)}), 500
|
|
|
|
@api_v3.route('/display/on-demand/start', methods=['POST'])
|
|
def start_on_demand_display():
|
|
"""Request the display controller to run a specific plugin on-demand."""
|
|
try:
|
|
data = request.get_json() or {}
|
|
plugin_id = data.get('plugin_id')
|
|
mode = data.get('mode')
|
|
duration = data.get('duration')
|
|
pinned = bool(data.get('pinned', False))
|
|
start_service = data.get('start_service', True)
|
|
|
|
if not plugin_id and not mode:
|
|
return jsonify({'status': 'error', 'message': 'plugin_id or mode is required'}), 400
|
|
|
|
resolved_plugin = plugin_id
|
|
resolved_mode = mode
|
|
|
|
if api_v3.plugin_manager:
|
|
if resolved_plugin and resolved_plugin not in api_v3.plugin_manager.plugin_manifests:
|
|
return jsonify({'status': 'error', 'message': f'Plugin {resolved_plugin} not found'}), 404
|
|
|
|
if resolved_plugin and not resolved_mode:
|
|
modes = api_v3.plugin_manager.get_plugin_display_modes(resolved_plugin)
|
|
resolved_mode = modes[0] if modes else resolved_plugin
|
|
elif resolved_mode and not resolved_plugin:
|
|
resolved_plugin = api_v3.plugin_manager.find_plugin_for_mode(resolved_mode)
|
|
if not resolved_plugin:
|
|
return jsonify({'status': 'error', 'message': f'Mode {resolved_mode} not found'}), 404
|
|
|
|
if api_v3.config_manager and resolved_plugin:
|
|
config = api_v3.config_manager.load_config()
|
|
plugin_config = config.get(resolved_plugin, {})
|
|
if 'enabled' in plugin_config and not plugin_config.get('enabled', False):
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Plugin {resolved_plugin} is disabled in configuration'
|
|
}), 400
|
|
|
|
# Check if display service is running (or will be started)
|
|
service_status = _get_display_service_status()
|
|
if not service_status.get('active') and not start_service:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Display service is not running. Please start the display service or enable "Start Service" option.',
|
|
'service_status': service_status
|
|
}), 400
|
|
|
|
cache = _ensure_cache_manager()
|
|
request_id = data.get('request_id') or str(uuid.uuid4())
|
|
request_payload = {
|
|
'request_id': request_id,
|
|
'action': 'start',
|
|
'plugin_id': resolved_plugin,
|
|
'mode': resolved_mode,
|
|
'duration': duration,
|
|
'pinned': pinned,
|
|
'timestamp': time.time()
|
|
}
|
|
cache.set('display_on_demand_request', request_payload)
|
|
|
|
service_result = None
|
|
if start_service:
|
|
service_result = _ensure_display_service_running()
|
|
# Check if service actually started
|
|
if service_result and not service_result.get('active'):
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Failed to start display service. Please check service logs or start it manually.',
|
|
'service_result': service_result
|
|
}), 500
|
|
|
|
response_data = {
|
|
'request_id': request_id,
|
|
'plugin_id': resolved_plugin,
|
|
'mode': resolved_mode,
|
|
'duration': duration,
|
|
'pinned': pinned,
|
|
'service': service_result
|
|
}
|
|
return jsonify({'status': 'success', 'data': response_data})
|
|
except Exception as exc:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in start_on_demand_display: {exc}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(exc)}), 500
|
|
|
|
@api_v3.route('/display/on-demand/stop', methods=['POST'])
|
|
def stop_on_demand_display():
|
|
"""Request the display controller to stop on-demand mode."""
|
|
try:
|
|
data = request.get_json(silent=True) or {}
|
|
stop_service = data.get('stop_service', False)
|
|
|
|
cache = _ensure_cache_manager()
|
|
request_id = data.get('request_id') or str(uuid.uuid4())
|
|
request_payload = {
|
|
'request_id': request_id,
|
|
'action': 'stop',
|
|
'timestamp': time.time()
|
|
}
|
|
cache.set('display_on_demand_request', request_payload)
|
|
|
|
service_result = None
|
|
if stop_service:
|
|
service_result = _stop_display_service()
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': {
|
|
'request_id': request_id,
|
|
'service': service_result
|
|
}
|
|
})
|
|
except Exception as exc:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in stop_on_demand_display: {exc}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(exc)}), 500
|
|
|
|
@api_v3.route('/plugins/installed', methods=['GET'])
|
|
def get_installed_plugins():
|
|
"""Get installed plugins"""
|
|
try:
|
|
if not api_v3.plugin_manager or not api_v3.plugin_store_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin managers not initialized'}), 500
|
|
|
|
import json
|
|
from pathlib import Path
|
|
|
|
# Re-discover plugins to ensure we have the latest list
|
|
# This handles cases where plugins are added/removed after app startup
|
|
api_v3.plugin_manager.discover_plugins()
|
|
|
|
# Get all installed plugin info from the plugin manager
|
|
all_plugin_info = api_v3.plugin_manager.get_all_plugin_info()
|
|
|
|
# Format for the web interface
|
|
plugins = []
|
|
for plugin_info in all_plugin_info:
|
|
plugin_id = plugin_info.get('id')
|
|
|
|
# Re-read manifest from disk to ensure we have the latest metadata
|
|
manifest_path = Path(api_v3.plugin_manager.plugins_dir) / plugin_id / "manifest.json"
|
|
if manifest_path.exists():
|
|
try:
|
|
with open(manifest_path, 'r', encoding='utf-8') as f:
|
|
fresh_manifest = json.load(f)
|
|
# Update plugin_info with fresh manifest data
|
|
plugin_info.update(fresh_manifest)
|
|
except Exception as e:
|
|
# If we can't read the fresh manifest, use the cached one
|
|
print(f"Warning: Could not read fresh manifest for {plugin_id}: {e}")
|
|
|
|
# Get enabled status from config (source of truth)
|
|
# Read from config file first, fall back to plugin instance if config doesn't have the key
|
|
enabled = None
|
|
if api_v3.config_manager:
|
|
full_config = api_v3.config_manager.load_config()
|
|
plugin_config = full_config.get(plugin_id, {})
|
|
# Check if 'enabled' key exists in config (even if False)
|
|
if 'enabled' in plugin_config:
|
|
enabled = bool(plugin_config['enabled'])
|
|
|
|
# Fallback to plugin instance if config doesn't have enabled key
|
|
if enabled is None:
|
|
plugin_instance = api_v3.plugin_manager.get_plugin(plugin_id)
|
|
if plugin_instance:
|
|
enabled = plugin_instance.enabled
|
|
else:
|
|
# Default to True if no config key and plugin not loaded (matches BasePlugin default)
|
|
enabled = True
|
|
|
|
# Get verified status from store registry (if available)
|
|
store_info = api_v3.plugin_store_manager.get_plugin_info(plugin_id)
|
|
verified = store_info.get('verified', False) if store_info else False
|
|
|
|
# Get local git info for installed plugin (actual installed commit)
|
|
plugin_path = Path(api_v3.plugin_manager.plugins_dir) / plugin_id
|
|
local_git_info = api_v3.plugin_store_manager._get_local_git_info(plugin_path) if plugin_path.exists() else None
|
|
|
|
# Use local git info if available (actual installed commit), otherwise fall back to manifest/store info
|
|
if local_git_info:
|
|
last_commit = local_git_info.get('short_sha') or local_git_info.get('sha', '')[:7] if local_git_info.get('sha') else None
|
|
branch = local_git_info.get('branch')
|
|
# Use commit date from git if available
|
|
last_updated = local_git_info.get('date_iso') or local_git_info.get('date')
|
|
else:
|
|
# Fall back to manifest/store info if no local git info
|
|
last_updated = plugin_info.get('last_updated')
|
|
last_commit = plugin_info.get('last_commit') or plugin_info.get('last_commit_sha')
|
|
branch = plugin_info.get('branch')
|
|
|
|
if store_info:
|
|
last_updated = last_updated or store_info.get('last_updated') or store_info.get('last_updated_iso')
|
|
last_commit = last_commit or store_info.get('last_commit') or store_info.get('last_commit_sha')
|
|
branch = branch or store_info.get('branch') or store_info.get('default_branch')
|
|
|
|
last_commit_message = plugin_info.get('last_commit_message')
|
|
if store_info and not last_commit_message:
|
|
last_commit_message = store_info.get('last_commit_message')
|
|
|
|
# Get web_ui_actions from manifest if available
|
|
web_ui_actions = plugin_info.get('web_ui_actions', [])
|
|
|
|
plugins.append({
|
|
'id': plugin_id,
|
|
'name': plugin_info.get('name', plugin_id),
|
|
'author': plugin_info.get('author', 'Unknown'),
|
|
'category': plugin_info.get('category', 'General'),
|
|
'description': plugin_info.get('description', 'No description available'),
|
|
'tags': plugin_info.get('tags', []),
|
|
'enabled': enabled,
|
|
'verified': verified,
|
|
'loaded': plugin_info.get('loaded', False),
|
|
'last_updated': last_updated,
|
|
'last_commit': last_commit,
|
|
'last_commit_message': last_commit_message,
|
|
'branch': branch,
|
|
'web_ui_actions': web_ui_actions
|
|
})
|
|
|
|
return jsonify({'status': 'success', 'data': {'plugins': plugins}})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in get_installed_plugins: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e), 'details': error_details}), 500
|
|
|
|
@api_v3.route('/plugins/health', methods=['GET'])
|
|
def get_plugin_health():
|
|
"""Get health metrics for all plugins"""
|
|
try:
|
|
if not api_v3.plugin_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin manager not initialized'}), 500
|
|
|
|
# Check if health tracker is available
|
|
if not hasattr(api_v3.plugin_manager, 'health_tracker') or not api_v3.plugin_manager.health_tracker:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': {},
|
|
'message': 'Health tracking not available'
|
|
})
|
|
|
|
# Get health summaries for all plugins
|
|
health_summaries = api_v3.plugin_manager.health_tracker.get_all_health_summaries()
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': health_summaries
|
|
})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in get_plugin_health: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/health/<plugin_id>', methods=['GET'])
|
|
def get_plugin_health_single(plugin_id):
|
|
"""Get health metrics for a specific plugin"""
|
|
try:
|
|
if not api_v3.plugin_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin manager not initialized'}), 500
|
|
|
|
# Check if health tracker is available
|
|
if not hasattr(api_v3.plugin_manager, 'health_tracker') or not api_v3.plugin_manager.health_tracker:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Health tracking not available'
|
|
}), 503
|
|
|
|
# Get health summary for specific plugin
|
|
health_summary = api_v3.plugin_manager.health_tracker.get_health_summary(plugin_id)
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': health_summary
|
|
})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in get_plugin_health_single: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/health/<plugin_id>/reset', methods=['POST'])
|
|
def reset_plugin_health(plugin_id):
|
|
"""Reset health state for a plugin (manual recovery)"""
|
|
try:
|
|
if not api_v3.plugin_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin manager not initialized'}), 500
|
|
|
|
# Check if health tracker is available
|
|
if not hasattr(api_v3.plugin_manager, 'health_tracker') or not api_v3.plugin_manager.health_tracker:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Health tracking not available'
|
|
}), 503
|
|
|
|
# Reset health state
|
|
api_v3.plugin_manager.health_tracker.reset_health(plugin_id)
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': f'Health state reset for plugin {plugin_id}'
|
|
})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in reset_plugin_health: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/metrics', methods=['GET'])
|
|
def get_plugin_metrics():
|
|
"""Get resource metrics for all plugins"""
|
|
try:
|
|
if not api_v3.plugin_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin manager not initialized'}), 500
|
|
|
|
# Check if resource monitor is available
|
|
if not hasattr(api_v3.plugin_manager, 'resource_monitor') or not api_v3.plugin_manager.resource_monitor:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': {},
|
|
'message': 'Resource monitoring not available'
|
|
})
|
|
|
|
# Get metrics summaries for all plugins
|
|
metrics_summaries = api_v3.plugin_manager.resource_monitor.get_all_metrics_summaries()
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': metrics_summaries
|
|
})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in get_plugin_metrics: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/metrics/<plugin_id>', methods=['GET'])
|
|
def get_plugin_metrics_single(plugin_id):
|
|
"""Get resource metrics for a specific plugin"""
|
|
try:
|
|
if not api_v3.plugin_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin manager not initialized'}), 500
|
|
|
|
# Check if resource monitor is available
|
|
if not hasattr(api_v3.plugin_manager, 'resource_monitor') or not api_v3.plugin_manager.resource_monitor:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Resource monitoring not available'
|
|
}), 503
|
|
|
|
# Get metrics summary for specific plugin
|
|
metrics_summary = api_v3.plugin_manager.resource_monitor.get_metrics_summary(plugin_id)
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': metrics_summary
|
|
})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in get_plugin_metrics_single: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/metrics/<plugin_id>/reset', methods=['POST'])
|
|
def reset_plugin_metrics(plugin_id):
|
|
"""Reset metrics for a plugin"""
|
|
try:
|
|
if not api_v3.plugin_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin manager not initialized'}), 500
|
|
|
|
# Check if resource monitor is available
|
|
if not hasattr(api_v3.plugin_manager, 'resource_monitor') or not api_v3.plugin_manager.resource_monitor:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Resource monitoring not available'
|
|
}), 503
|
|
|
|
# Reset metrics
|
|
api_v3.plugin_manager.resource_monitor.reset_metrics(plugin_id)
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': f'Metrics reset for plugin {plugin_id}'
|
|
})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in reset_plugin_metrics: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/limits/<plugin_id>', methods=['GET', 'POST'])
|
|
def manage_plugin_limits(plugin_id):
|
|
"""Get or set resource limits for a plugin"""
|
|
try:
|
|
if not api_v3.plugin_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin manager not initialized'}), 500
|
|
|
|
# Check if resource monitor is available
|
|
if not hasattr(api_v3.plugin_manager, 'resource_monitor') or not api_v3.plugin_manager.resource_monitor:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Resource monitoring not available'
|
|
}), 503
|
|
|
|
if request.method == 'GET':
|
|
# Get limits
|
|
limits = api_v3.plugin_manager.resource_monitor.get_limits(plugin_id)
|
|
if limits:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': {
|
|
'max_memory_mb': limits.max_memory_mb,
|
|
'max_cpu_percent': limits.max_cpu_percent,
|
|
'max_execution_time': limits.max_execution_time,
|
|
'warning_threshold': limits.warning_threshold
|
|
}
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': None,
|
|
'message': 'No limits configured for this plugin'
|
|
})
|
|
else:
|
|
# POST - Set limits
|
|
data = request.get_json() or {}
|
|
from src.plugin_system.resource_monitor import ResourceLimits
|
|
|
|
limits = ResourceLimits(
|
|
max_memory_mb=data.get('max_memory_mb'),
|
|
max_cpu_percent=data.get('max_cpu_percent'),
|
|
max_execution_time=data.get('max_execution_time'),
|
|
warning_threshold=data.get('warning_threshold', 0.8)
|
|
)
|
|
|
|
api_v3.plugin_manager.resource_monitor.set_limits(plugin_id, limits)
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': f'Resource limits updated for plugin {plugin_id}'
|
|
})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in manage_plugin_limits: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/toggle', methods=['POST'])
|
|
def toggle_plugin():
|
|
"""Toggle plugin enabled/disabled"""
|
|
try:
|
|
if not api_v3.plugin_manager or not api_v3.config_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin or config manager not initialized'}), 500
|
|
|
|
# Support both JSON and form data (for HTMX submissions)
|
|
content_type = request.content_type or ''
|
|
|
|
if 'application/json' in content_type:
|
|
data = request.get_json()
|
|
if not data or 'plugin_id' not in data or 'enabled' not in data:
|
|
return jsonify({'status': 'error', 'message': 'plugin_id and enabled required'}), 400
|
|
plugin_id = data['plugin_id']
|
|
enabled = data['enabled']
|
|
else:
|
|
# Form data or query string (HTMX submission)
|
|
plugin_id = request.args.get('plugin_id') or request.form.get('plugin_id')
|
|
if not plugin_id:
|
|
return jsonify({'status': 'error', 'message': 'plugin_id required'}), 400
|
|
|
|
# For checkbox toggle, if form was submitted, the checkbox was checked (enabled)
|
|
# If using HTMX with hx-trigger="change", we need to check if checkbox is checked
|
|
# The checkbox value or 'enabled' form field indicates the state
|
|
enabled_str = request.form.get('enabled', request.args.get('enabled', ''))
|
|
|
|
# Handle various truthy/falsy values
|
|
if enabled_str.lower() in ('true', '1', 'on', 'yes'):
|
|
enabled = True
|
|
elif enabled_str.lower() in ('false', '0', 'off', 'no', ''):
|
|
# Empty string means checkbox was unchecked (toggle off)
|
|
enabled = False
|
|
else:
|
|
# Default: toggle based on current state
|
|
config = api_v3.config_manager.load_config()
|
|
current_enabled = config.get(plugin_id, {}).get('enabled', False)
|
|
enabled = not current_enabled
|
|
|
|
# Check if plugin exists in manifests (discovered but may not be loaded)
|
|
if plugin_id not in api_v3.plugin_manager.plugin_manifests:
|
|
return jsonify({'status': 'error', 'message': f'Plugin {plugin_id} not found'}), 404
|
|
|
|
# Update config (this is what the display controller reads)
|
|
config = api_v3.config_manager.load_config()
|
|
if plugin_id not in config:
|
|
config[plugin_id] = {}
|
|
config[plugin_id]['enabled'] = enabled
|
|
|
|
# Use atomic save if available
|
|
if hasattr(api_v3.config_manager, 'save_config_atomic'):
|
|
result = api_v3.config_manager.save_config_atomic(config, create_backup=True)
|
|
if result.status.value != 'success':
|
|
return error_response(
|
|
ErrorCode.CONFIG_SAVE_FAILED,
|
|
f"Failed to save configuration: {result.message}",
|
|
status_code=500
|
|
)
|
|
else:
|
|
api_v3.config_manager.save_config(config)
|
|
|
|
# Update state manager if available
|
|
if api_v3.plugin_state_manager:
|
|
api_v3.plugin_state_manager.set_plugin_enabled(plugin_id, enabled)
|
|
|
|
# Log operation
|
|
if api_v3.operation_history:
|
|
api_v3.operation_history.record_operation(
|
|
"toggle",
|
|
plugin_id=plugin_id,
|
|
status="success" if enabled else "disabled",
|
|
details={"enabled": enabled}
|
|
)
|
|
|
|
# If plugin is loaded, also call its lifecycle methods
|
|
# Wrap in try/except to prevent lifecycle errors from failing the toggle
|
|
plugin = api_v3.plugin_manager.get_plugin(plugin_id)
|
|
if plugin:
|
|
try:
|
|
if enabled:
|
|
if hasattr(plugin, 'on_enable'):
|
|
plugin.on_enable()
|
|
else:
|
|
if hasattr(plugin, 'on_disable'):
|
|
plugin.on_disable()
|
|
except Exception as lifecycle_error:
|
|
# Log the error but don't fail the toggle - config is already saved
|
|
import logging
|
|
logging.warning(f"Lifecycle method error for {plugin_id}: {lifecycle_error}", exc_info=True)
|
|
|
|
return success_response(
|
|
message=f"Plugin {plugin_id} {'enabled' if enabled else 'disabled'} successfully"
|
|
)
|
|
except Exception as e:
|
|
from src.web_interface.errors import WebInterfaceError
|
|
error = WebInterfaceError.from_exception(e, ErrorCode.PLUGIN_OPERATION_CONFLICT)
|
|
if api_v3.operation_history:
|
|
api_v3.operation_history.record_operation(
|
|
"toggle",
|
|
plugin_id=data.get('plugin_id') if 'data' in locals() else None,
|
|
status="failed",
|
|
error=str(e)
|
|
)
|
|
return error_response(
|
|
error.error_code,
|
|
error.message,
|
|
details=error.details,
|
|
context=error.context,
|
|
status_code=500
|
|
)
|
|
|
|
@api_v3.route('/plugins/operation/<operation_id>', methods=['GET'])
|
|
def get_operation_status(operation_id):
|
|
"""Get status of a plugin operation"""
|
|
try:
|
|
if not api_v3.operation_queue:
|
|
return error_response(
|
|
ErrorCode.SYSTEM_ERROR,
|
|
'Operation queue not initialized',
|
|
status_code=500
|
|
)
|
|
|
|
operation = api_v3.operation_queue.get_operation_status(operation_id)
|
|
if not operation:
|
|
return error_response(
|
|
ErrorCode.PLUGIN_NOT_FOUND,
|
|
f'Operation {operation_id} not found',
|
|
status_code=404
|
|
)
|
|
|
|
return success_response(data=operation.to_dict())
|
|
except Exception as e:
|
|
from src.web_interface.errors import WebInterfaceError
|
|
error = WebInterfaceError.from_exception(e, ErrorCode.SYSTEM_ERROR)
|
|
return error_response(
|
|
error.error_code,
|
|
error.message,
|
|
details=error.details,
|
|
status_code=500
|
|
)
|
|
|
|
@api_v3.route('/plugins/operation/history', methods=['GET'])
|
|
def get_operation_history():
|
|
"""Get operation history"""
|
|
try:
|
|
if not api_v3.operation_queue:
|
|
return error_response(
|
|
ErrorCode.SYSTEM_ERROR,
|
|
'Operation queue not initialized',
|
|
status_code=500
|
|
)
|
|
|
|
limit = request.args.get('limit', 50, type=int)
|
|
plugin_id = request.args.get('plugin_id')
|
|
|
|
history = api_v3.operation_queue.get_operation_history(limit=limit)
|
|
|
|
# Filter by plugin_id if provided
|
|
if plugin_id:
|
|
history = [op for op in history if op.plugin_id == plugin_id]
|
|
|
|
return success_response(data=[op.to_dict() for op in history])
|
|
except Exception as e:
|
|
from src.web_interface.errors import WebInterfaceError
|
|
error = WebInterfaceError.from_exception(e, ErrorCode.SYSTEM_ERROR)
|
|
return error_response(
|
|
error.error_code,
|
|
error.message,
|
|
details=error.details,
|
|
status_code=500
|
|
)
|
|
|
|
@api_v3.route('/plugins/state', methods=['GET'])
|
|
def get_plugin_state():
|
|
"""Get plugin state from state manager"""
|
|
try:
|
|
if not api_v3.plugin_state_manager:
|
|
return error_response(
|
|
ErrorCode.SYSTEM_ERROR,
|
|
'State manager not initialized',
|
|
status_code=500
|
|
)
|
|
|
|
plugin_id = request.args.get('plugin_id')
|
|
|
|
if plugin_id:
|
|
# Get state for specific plugin
|
|
state = api_v3.plugin_state_manager.get_plugin_state(plugin_id)
|
|
if not state:
|
|
return error_response(
|
|
ErrorCode.PLUGIN_NOT_FOUND,
|
|
f'Plugin {plugin_id} not found in state manager',
|
|
context={'plugin_id': plugin_id},
|
|
status_code=404
|
|
)
|
|
return success_response(data=state.to_dict())
|
|
else:
|
|
# Get all plugin states
|
|
all_states = api_v3.plugin_state_manager.get_all_states()
|
|
return success_response(data={
|
|
plugin_id: state.to_dict()
|
|
for plugin_id, state in all_states.items()
|
|
})
|
|
except Exception as e:
|
|
from src.web_interface.errors import WebInterfaceError
|
|
error = WebInterfaceError.from_exception(e, ErrorCode.SYSTEM_ERROR)
|
|
return error_response(
|
|
error.error_code,
|
|
error.message,
|
|
details=error.details,
|
|
context=error.context,
|
|
status_code=500
|
|
)
|
|
|
|
@api_v3.route('/plugins/state/reconcile', methods=['POST'])
|
|
def reconcile_plugin_state():
|
|
"""Reconcile plugin state across all sources"""
|
|
try:
|
|
if not api_v3.plugin_state_manager or not api_v3.plugin_manager:
|
|
return error_response(
|
|
ErrorCode.SYSTEM_ERROR,
|
|
'State manager or plugin manager not initialized',
|
|
status_code=500
|
|
)
|
|
|
|
from src.plugin_system.state_reconciliation import StateReconciliation
|
|
|
|
reconciler = StateReconciliation(
|
|
state_manager=api_v3.plugin_state_manager,
|
|
config_manager=api_v3.config_manager,
|
|
plugin_manager=api_v3.plugin_manager,
|
|
plugins_dir=Path(api_v3.plugin_manager.plugins_dir)
|
|
)
|
|
|
|
result = reconciler.reconcile_state()
|
|
|
|
return success_response(
|
|
data={
|
|
'inconsistencies_found': len(result.inconsistencies_found),
|
|
'inconsistencies_fixed': len(result.inconsistencies_fixed),
|
|
'inconsistencies_manual': len(result.inconsistencies_manual),
|
|
'inconsistencies': [
|
|
{
|
|
'plugin_id': inc.plugin_id,
|
|
'type': inc.inconsistency_type.value,
|
|
'description': inc.description,
|
|
'fix_action': inc.fix_action.value
|
|
}
|
|
for inc in result.inconsistencies_found
|
|
],
|
|
'fixed': [
|
|
{
|
|
'plugin_id': inc.plugin_id,
|
|
'type': inc.inconsistency_type.value,
|
|
'description': inc.description
|
|
}
|
|
for inc in result.inconsistencies_fixed
|
|
],
|
|
'manual_fix_required': [
|
|
{
|
|
'plugin_id': inc.plugin_id,
|
|
'type': inc.inconsistency_type.value,
|
|
'description': inc.description
|
|
}
|
|
for inc in result.inconsistencies_manual
|
|
]
|
|
},
|
|
message=result.message
|
|
)
|
|
except Exception as e:
|
|
from src.web_interface.errors import WebInterfaceError
|
|
error = WebInterfaceError.from_exception(e, ErrorCode.SYSTEM_ERROR)
|
|
return error_response(
|
|
error.error_code,
|
|
error.message,
|
|
details=error.details,
|
|
context=error.context,
|
|
status_code=500
|
|
)
|
|
|
|
@api_v3.route('/plugins/config', methods=['GET'])
|
|
def get_plugin_config():
|
|
"""Get plugin configuration"""
|
|
try:
|
|
if not api_v3.config_manager:
|
|
return error_response(
|
|
ErrorCode.SYSTEM_ERROR,
|
|
'Config manager not initialized',
|
|
status_code=500
|
|
)
|
|
|
|
plugin_id = request.args.get('plugin_id')
|
|
if not plugin_id:
|
|
return error_response(
|
|
ErrorCode.INVALID_INPUT,
|
|
'plugin_id required',
|
|
context={'missing_params': ['plugin_id']},
|
|
status_code=400
|
|
)
|
|
|
|
# Get plugin configuration from config manager
|
|
main_config = api_v3.config_manager.load_config()
|
|
plugin_config = main_config.get(plugin_id, {})
|
|
|
|
# Merge with defaults from schema so form shows default values for missing fields
|
|
schema_mgr = api_v3.schema_manager
|
|
if schema_mgr:
|
|
try:
|
|
defaults = schema_mgr.generate_default_config(plugin_id, use_cache=True)
|
|
plugin_config = schema_mgr.merge_with_defaults(plugin_config, defaults)
|
|
except Exception as e:
|
|
# Log but don't fail - defaults merge is best effort
|
|
import logging
|
|
logging.warning(f"Could not merge defaults for {plugin_id}: {e}")
|
|
|
|
# Special handling for of-the-day plugin: populate uploaded_files and categories from disk
|
|
if plugin_id == 'of-the-day' or plugin_id == 'ledmatrix-of-the-day':
|
|
# Get plugin directory - plugin_id in manifest is 'of-the-day', but directory is 'ledmatrix-of-the-day'
|
|
plugin_dir_name = 'ledmatrix-of-the-day'
|
|
if api_v3.plugin_manager:
|
|
plugin_dir = api_v3.plugin_manager.get_plugin_directory(plugin_dir_name)
|
|
# If not found, try with the plugin_id
|
|
if not plugin_dir or not Path(plugin_dir).exists():
|
|
plugin_dir = api_v3.plugin_manager.get_plugin_directory(plugin_id)
|
|
else:
|
|
plugin_dir = PROJECT_ROOT / 'plugins' / plugin_dir_name
|
|
if not plugin_dir.exists():
|
|
plugin_dir = PROJECT_ROOT / 'plugins' / plugin_id
|
|
|
|
if plugin_dir and Path(plugin_dir).exists():
|
|
data_dir = Path(plugin_dir) / 'of_the_day'
|
|
if data_dir.exists():
|
|
# Scan for JSON files
|
|
uploaded_files = []
|
|
categories_from_files = {}
|
|
|
|
for json_file in data_dir.glob('*.json'):
|
|
try:
|
|
# Get file stats
|
|
stat = json_file.stat()
|
|
|
|
# Read JSON to count entries
|
|
with open(json_file, 'r', encoding='utf-8') as f:
|
|
json_data = json.load(f)
|
|
entry_count = len(json_data) if isinstance(json_data, dict) else 0
|
|
|
|
# Extract category name from filename
|
|
category_name = json_file.stem
|
|
filename = json_file.name
|
|
|
|
# Create file entry
|
|
file_entry = {
|
|
'id': category_name,
|
|
'category_name': category_name,
|
|
'filename': filename,
|
|
'original_filename': filename,
|
|
'path': f'of_the_day/{filename}',
|
|
'size': stat.st_size,
|
|
'uploaded_at': datetime.fromtimestamp(stat.st_mtime).isoformat() + 'Z',
|
|
'entry_count': entry_count
|
|
}
|
|
uploaded_files.append(file_entry)
|
|
|
|
# Create/update category entry if not in config
|
|
if category_name not in plugin_config.get('categories', {}):
|
|
display_name = category_name.replace('_', ' ').title()
|
|
categories_from_files[category_name] = {
|
|
'enabled': False, # Default to disabled, user can enable
|
|
'data_file': f'of_the_day/{filename}',
|
|
'display_name': display_name
|
|
}
|
|
else:
|
|
# Update with file info if needed
|
|
categories_from_files[category_name] = plugin_config['categories'][category_name]
|
|
# Ensure data_file is correct
|
|
categories_from_files[category_name]['data_file'] = f'of_the_day/{filename}'
|
|
|
|
except Exception as e:
|
|
print(f"Warning: Could not read {json_file}: {e}")
|
|
continue
|
|
|
|
# Update plugin_config with scanned files
|
|
if uploaded_files:
|
|
plugin_config['uploaded_files'] = uploaded_files
|
|
|
|
# Merge categories from files with existing config
|
|
# Start with existing categories (preserve user settings like enabled/disabled)
|
|
existing_categories = plugin_config.get('categories', {}).copy()
|
|
|
|
# Update existing categories with file info, add new ones from files
|
|
for cat_name, cat_data in categories_from_files.items():
|
|
if cat_name in existing_categories:
|
|
# Preserve existing enabled state and display_name, but update data_file path
|
|
existing_categories[cat_name]['data_file'] = cat_data['data_file']
|
|
if 'display_name' not in existing_categories[cat_name] or not existing_categories[cat_name]['display_name']:
|
|
existing_categories[cat_name]['display_name'] = cat_data['display_name']
|
|
else:
|
|
# Add new category from file (default to disabled)
|
|
existing_categories[cat_name] = cat_data
|
|
|
|
if existing_categories:
|
|
plugin_config['categories'] = existing_categories
|
|
|
|
# Update category_order to include all categories
|
|
category_order = plugin_config.get('category_order', []).copy()
|
|
all_category_names = set(existing_categories.keys())
|
|
for cat_name in all_category_names:
|
|
if cat_name not in category_order:
|
|
category_order.append(cat_name)
|
|
if category_order:
|
|
plugin_config['category_order'] = category_order
|
|
|
|
# If no config exists, return defaults
|
|
if not plugin_config:
|
|
plugin_config = {
|
|
'enabled': True,
|
|
'display_duration': 30
|
|
}
|
|
|
|
return success_response(data=plugin_config)
|
|
except Exception as e:
|
|
from src.web_interface.errors import WebInterfaceError
|
|
error = WebInterfaceError.from_exception(e, ErrorCode.CONFIG_LOAD_FAILED)
|
|
return error_response(
|
|
error.error_code,
|
|
error.message,
|
|
details=error.details,
|
|
context=error.context,
|
|
status_code=500
|
|
)
|
|
|
|
@api_v3.route('/plugins/update', methods=['POST'])
|
|
def update_plugin():
|
|
"""Update plugin"""
|
|
try:
|
|
# Support both JSON and form data
|
|
content_type = request.content_type or ''
|
|
|
|
if 'application/json' in content_type:
|
|
# JSON request
|
|
data, error = validate_request_json(['plugin_id'])
|
|
if error:
|
|
# Log what we received for debugging
|
|
print(f"[UPDATE] JSON validation failed. Content-Type: {content_type}")
|
|
print(f"[UPDATE] Request data: {request.data}")
|
|
print(f"[UPDATE] Request form: {request.form.to_dict()}")
|
|
return error
|
|
else:
|
|
# Form data or query string
|
|
plugin_id = request.args.get('plugin_id') or request.form.get('plugin_id')
|
|
if not plugin_id:
|
|
print(f"[UPDATE] Missing plugin_id. Content-Type: {content_type}")
|
|
print(f"[UPDATE] Query args: {request.args.to_dict()}")
|
|
print(f"[UPDATE] Form data: {request.form.to_dict()}")
|
|
return error_response(
|
|
ErrorCode.INVALID_INPUT,
|
|
'plugin_id required',
|
|
status_code=400
|
|
)
|
|
data = {'plugin_id': plugin_id}
|
|
|
|
if not api_v3.plugin_store_manager:
|
|
return error_response(
|
|
ErrorCode.SYSTEM_ERROR,
|
|
'Plugin store manager not initialized',
|
|
status_code=500
|
|
)
|
|
|
|
plugin_id = data['plugin_id']
|
|
|
|
# Always do direct updates (they're fast git pull operations)
|
|
# Operation queue is reserved for longer operations like install/uninstall
|
|
plugin_dir = Path(api_v3.plugin_store_manager.plugins_dir) / plugin_id
|
|
manifest_path = plugin_dir / "manifest.json"
|
|
|
|
current_last_updated = None
|
|
current_commit = None
|
|
current_branch = None
|
|
|
|
if manifest_path.exists():
|
|
try:
|
|
import json
|
|
with open(manifest_path, 'r', encoding='utf-8') as f:
|
|
manifest = json.load(f)
|
|
current_last_updated = manifest.get('last_updated')
|
|
except Exception as e:
|
|
print(f"Warning: Could not read local manifest for {plugin_id}: {e}")
|
|
|
|
if api_v3.plugin_store_manager:
|
|
git_info_before = api_v3.plugin_store_manager._get_local_git_info(plugin_dir)
|
|
if git_info_before:
|
|
current_commit = git_info_before.get('sha')
|
|
current_branch = git_info_before.get('branch')
|
|
|
|
remote_info = api_v3.plugin_store_manager.get_plugin_info(plugin_id, fetch_latest_from_github=True)
|
|
remote_commit = remote_info.get('last_commit_sha') if remote_info else None
|
|
remote_branch = remote_info.get('branch') if remote_info else None
|
|
|
|
# Update the plugin
|
|
success = api_v3.plugin_store_manager.update_plugin(plugin_id)
|
|
|
|
if success:
|
|
updated_last_updated = current_last_updated
|
|
try:
|
|
if manifest_path.exists():
|
|
import json
|
|
with open(manifest_path, 'r', encoding='utf-8') as f:
|
|
manifest = json.load(f)
|
|
updated_last_updated = manifest.get('last_updated', current_last_updated)
|
|
except Exception as e:
|
|
print(f"Warning: Could not read updated manifest for {plugin_id}: {e}")
|
|
|
|
updated_commit = None
|
|
updated_branch = remote_branch or current_branch
|
|
if api_v3.plugin_store_manager:
|
|
git_info_after = api_v3.plugin_store_manager._get_local_git_info(plugin_dir)
|
|
if git_info_after:
|
|
updated_commit = git_info_after.get('sha')
|
|
updated_branch = git_info_after.get('branch') or updated_branch
|
|
|
|
message = f'Plugin {plugin_id} updated successfully'
|
|
if current_commit and updated_commit and current_commit == updated_commit:
|
|
message = f'Plugin {plugin_id} already up to date (commit {updated_commit[:7]})'
|
|
elif updated_commit:
|
|
message = f'Plugin {plugin_id} updated to commit {updated_commit[:7]}'
|
|
if updated_branch:
|
|
message += f' on branch {updated_branch}'
|
|
elif updated_last_updated and updated_last_updated != current_last_updated:
|
|
message = f'Plugin {plugin_id} refreshed (Last Updated {updated_last_updated})'
|
|
|
|
remote_commit_short = remote_commit[:7] if remote_commit else None
|
|
if remote_commit_short and updated_commit and remote_commit_short != updated_commit[:7]:
|
|
message += f' (remote latest {remote_commit_short})'
|
|
|
|
# Invalidate schema cache
|
|
if api_v3.schema_manager:
|
|
api_v3.schema_manager.invalidate_cache(plugin_id)
|
|
|
|
# Rediscover plugins
|
|
if api_v3.plugin_manager:
|
|
api_v3.plugin_manager.discover_plugins()
|
|
if plugin_id in api_v3.plugin_manager.plugins:
|
|
api_v3.plugin_manager.reload_plugin(plugin_id)
|
|
|
|
# Update state and history
|
|
if api_v3.plugin_state_manager:
|
|
api_v3.plugin_state_manager.update_plugin_state(
|
|
plugin_id,
|
|
{'last_updated': datetime.now()}
|
|
)
|
|
if api_v3.operation_history:
|
|
api_v3.operation_history.record_operation(
|
|
"update",
|
|
plugin_id=plugin_id,
|
|
status="success",
|
|
details={
|
|
"last_updated": updated_last_updated,
|
|
"commit": updated_commit
|
|
}
|
|
)
|
|
|
|
return success_response(
|
|
data={
|
|
'last_updated': updated_last_updated,
|
|
'commit': updated_commit
|
|
},
|
|
message=message
|
|
)
|
|
else:
|
|
error_msg = f'Failed to update plugin {plugin_id}'
|
|
plugin_path_dir = Path(api_v3.plugin_store_manager.plugins_dir) / plugin_id
|
|
if not plugin_path_dir.exists():
|
|
error_msg += ': Plugin not found'
|
|
else:
|
|
plugin_info = api_v3.plugin_store_manager.get_plugin_info(plugin_id)
|
|
if not plugin_info:
|
|
error_msg += ': Plugin not found in registry'
|
|
|
|
if api_v3.operation_history:
|
|
api_v3.operation_history.record_operation(
|
|
"update",
|
|
plugin_id=plugin_id,
|
|
status="failed",
|
|
error=error_msg
|
|
)
|
|
|
|
return error_response(
|
|
ErrorCode.PLUGIN_UPDATE_FAILED,
|
|
error_msg,
|
|
status_code=500
|
|
)
|
|
|
|
except Exception as e:
|
|
from src.web_interface.errors import WebInterfaceError
|
|
error = WebInterfaceError.from_exception(e, ErrorCode.PLUGIN_UPDATE_FAILED)
|
|
if api_v3.operation_history:
|
|
api_v3.operation_history.record_operation(
|
|
"update",
|
|
plugin_id=data.get('plugin_id') if 'data' in locals() else None,
|
|
status="failed",
|
|
error=str(e)
|
|
)
|
|
return error_response(
|
|
error.error_code,
|
|
error.message,
|
|
details=error.details,
|
|
context=error.context,
|
|
status_code=500
|
|
)
|
|
|
|
@api_v3.route('/plugins/uninstall', methods=['POST'])
|
|
def uninstall_plugin():
|
|
"""Uninstall plugin"""
|
|
try:
|
|
# Validate request
|
|
data, error = validate_request_json(['plugin_id'])
|
|
if error:
|
|
return error
|
|
|
|
if not api_v3.plugin_store_manager:
|
|
return error_response(
|
|
ErrorCode.SYSTEM_ERROR,
|
|
'Plugin store manager not initialized',
|
|
status_code=500
|
|
)
|
|
|
|
plugin_id = data['plugin_id']
|
|
preserve_config = data.get('preserve_config', False)
|
|
|
|
# Use operation queue if available
|
|
if api_v3.operation_queue:
|
|
def uninstall_callback(operation):
|
|
"""Callback to execute plugin uninstallation."""
|
|
# Unload the plugin first if it's loaded
|
|
if api_v3.plugin_manager and plugin_id in api_v3.plugin_manager.plugins:
|
|
api_v3.plugin_manager.unload_plugin(plugin_id)
|
|
|
|
# Uninstall the plugin
|
|
success = api_v3.plugin_store_manager.uninstall_plugin(plugin_id)
|
|
|
|
if not success:
|
|
error_msg = f'Failed to uninstall plugin {plugin_id}'
|
|
if api_v3.operation_history:
|
|
api_v3.operation_history.record_operation(
|
|
"uninstall",
|
|
plugin_id=plugin_id,
|
|
status="failed",
|
|
error=error_msg
|
|
)
|
|
raise Exception(error_msg)
|
|
|
|
# Invalidate schema cache
|
|
if api_v3.schema_manager:
|
|
api_v3.schema_manager.invalidate_cache(plugin_id)
|
|
|
|
# Clean up plugin configuration if not preserving
|
|
if not preserve_config:
|
|
try:
|
|
api_v3.config_manager.cleanup_plugin_config(plugin_id, remove_secrets=True)
|
|
except Exception as cleanup_err:
|
|
print(f"Warning: Failed to cleanup config for {plugin_id}: {cleanup_err}")
|
|
|
|
# Remove from state manager
|
|
if api_v3.plugin_state_manager:
|
|
api_v3.plugin_state_manager.remove_plugin_state(plugin_id)
|
|
|
|
# Record in history
|
|
if api_v3.operation_history:
|
|
api_v3.operation_history.record_operation(
|
|
"uninstall",
|
|
plugin_id=plugin_id,
|
|
status="success",
|
|
details={"preserve_config": preserve_config}
|
|
)
|
|
|
|
return {'success': True, 'message': f'Plugin {plugin_id} uninstalled successfully'}
|
|
|
|
# Enqueue operation
|
|
operation_id = api_v3.operation_queue.enqueue_operation(
|
|
OperationType.UNINSTALL,
|
|
plugin_id,
|
|
operation_callback=uninstall_callback
|
|
)
|
|
|
|
return success_response(
|
|
data={'operation_id': operation_id},
|
|
message=f'Plugin {plugin_id} uninstallation queued'
|
|
)
|
|
else:
|
|
# Fallback to direct uninstall
|
|
# Unload the plugin first if it's loaded
|
|
if api_v3.plugin_manager and plugin_id in api_v3.plugin_manager.plugins:
|
|
api_v3.plugin_manager.unload_plugin(plugin_id)
|
|
|
|
# Uninstall the plugin
|
|
success = api_v3.plugin_store_manager.uninstall_plugin(plugin_id)
|
|
|
|
if success:
|
|
# Invalidate schema cache
|
|
if api_v3.schema_manager:
|
|
api_v3.schema_manager.invalidate_cache(plugin_id)
|
|
|
|
# Clean up plugin configuration if not preserving
|
|
if not preserve_config:
|
|
try:
|
|
api_v3.config_manager.cleanup_plugin_config(plugin_id, remove_secrets=True)
|
|
except Exception as cleanup_err:
|
|
print(f"Warning: Failed to cleanup config for {plugin_id}: {cleanup_err}")
|
|
|
|
# Remove from state manager
|
|
if api_v3.plugin_state_manager:
|
|
api_v3.plugin_state_manager.remove_plugin_state(plugin_id)
|
|
|
|
# Record in history
|
|
if api_v3.operation_history:
|
|
api_v3.operation_history.record_operation(
|
|
"uninstall",
|
|
plugin_id=plugin_id,
|
|
status="success",
|
|
details={"preserve_config": preserve_config}
|
|
)
|
|
|
|
return success_response(message=f'Plugin {plugin_id} uninstalled successfully')
|
|
else:
|
|
if api_v3.operation_history:
|
|
api_v3.operation_history.record_operation(
|
|
"uninstall",
|
|
plugin_id=plugin_id,
|
|
status="failed",
|
|
error=f'Failed to uninstall plugin {plugin_id}'
|
|
)
|
|
|
|
return error_response(
|
|
ErrorCode.PLUGIN_UNINSTALL_FAILED,
|
|
f'Failed to uninstall plugin {plugin_id}',
|
|
status_code=500
|
|
)
|
|
|
|
except Exception as e:
|
|
from src.web_interface.errors import WebInterfaceError
|
|
error = WebInterfaceError.from_exception(e, ErrorCode.PLUGIN_UNINSTALL_FAILED)
|
|
if api_v3.operation_history:
|
|
api_v3.operation_history.record_operation(
|
|
"uninstall",
|
|
plugin_id=data.get('plugin_id') if 'data' in locals() else None,
|
|
status="failed",
|
|
error=str(e)
|
|
)
|
|
return error_response(
|
|
error.error_code,
|
|
error.message,
|
|
details=error.details,
|
|
context=error.context,
|
|
status_code=500
|
|
)
|
|
|
|
@api_v3.route('/plugins/install', methods=['POST'])
|
|
def install_plugin():
|
|
"""Install plugin from store"""
|
|
try:
|
|
if not api_v3.plugin_store_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin store manager not initialized'}), 500
|
|
|
|
data = request.get_json()
|
|
if not data or 'plugin_id' not in data:
|
|
return jsonify({'status': 'error', 'message': 'plugin_id required'}), 400
|
|
|
|
plugin_id = data['plugin_id']
|
|
branch = data.get('branch') # Optional branch parameter
|
|
|
|
# Install the plugin
|
|
# Log the plugins directory being used for debugging
|
|
plugins_dir = api_v3.plugin_store_manager.plugins_dir
|
|
branch_info = f" (branch: {branch})" if branch else ""
|
|
print(f"Installing plugin {plugin_id}{branch_info} to directory: {plugins_dir}", flush=True)
|
|
|
|
# Use operation queue if available
|
|
if api_v3.operation_queue:
|
|
def install_callback(operation):
|
|
"""Callback to execute plugin installation."""
|
|
success = api_v3.plugin_store_manager.install_plugin(plugin_id, branch=branch)
|
|
|
|
if success:
|
|
# Invalidate schema cache
|
|
if api_v3.schema_manager:
|
|
api_v3.schema_manager.invalidate_cache(plugin_id)
|
|
|
|
# Discover and load the new plugin
|
|
if api_v3.plugin_manager:
|
|
api_v3.plugin_manager.discover_plugins()
|
|
api_v3.plugin_manager.load_plugin(plugin_id)
|
|
|
|
# Update state manager
|
|
if api_v3.plugin_state_manager:
|
|
api_v3.plugin_state_manager.set_plugin_installed(plugin_id)
|
|
|
|
# Record in history
|
|
if api_v3.operation_history:
|
|
api_v3.operation_history.record_operation(
|
|
"install",
|
|
plugin_id=plugin_id,
|
|
status="success"
|
|
)
|
|
|
|
branch_msg = f" (branch: {branch})" if branch else ""
|
|
return {'success': True, 'message': f'Plugin {plugin_id} installed successfully{branch_msg}'}
|
|
else:
|
|
error_msg = f'Failed to install plugin {plugin_id}'
|
|
if branch:
|
|
error_msg += f' (branch: {branch})'
|
|
plugin_info = api_v3.plugin_store_manager.get_plugin_info(plugin_id)
|
|
if not plugin_info:
|
|
error_msg += ' (plugin not found in registry)'
|
|
|
|
# Record failure in history
|
|
if api_v3.operation_history:
|
|
api_v3.operation_history.record_operation(
|
|
"install",
|
|
plugin_id=plugin_id,
|
|
status="failed",
|
|
error=error_msg
|
|
)
|
|
|
|
raise Exception(error_msg)
|
|
|
|
# Enqueue operation
|
|
operation_id = api_v3.operation_queue.enqueue_operation(
|
|
OperationType.INSTALL,
|
|
plugin_id,
|
|
operation_callback=install_callback
|
|
)
|
|
|
|
branch_msg = f" (branch: {branch})" if branch else ""
|
|
return success_response(
|
|
data={'operation_id': operation_id},
|
|
message=f'Plugin {plugin_id} installation queued{branch_msg}'
|
|
)
|
|
else:
|
|
# Fallback to direct installation
|
|
success = api_v3.plugin_store_manager.install_plugin(plugin_id, branch=branch)
|
|
|
|
if success:
|
|
if api_v3.schema_manager:
|
|
api_v3.schema_manager.invalidate_cache(plugin_id)
|
|
if api_v3.plugin_manager:
|
|
api_v3.plugin_manager.discover_plugins()
|
|
api_v3.plugin_manager.load_plugin(plugin_id)
|
|
if api_v3.plugin_state_manager:
|
|
api_v3.plugin_state_manager.set_plugin_installed(plugin_id)
|
|
if api_v3.operation_history:
|
|
api_v3.operation_history.record_operation("install", plugin_id=plugin_id, status="success")
|
|
|
|
branch_msg = f" (branch: {branch})" if branch else ""
|
|
return success_response(message=f'Plugin {plugin_id} installed successfully{branch_msg}')
|
|
else:
|
|
error_msg = f'Failed to install plugin {plugin_id}'
|
|
if branch:
|
|
error_msg += f' (branch: {branch})'
|
|
plugin_info = api_v3.plugin_store_manager.get_plugin_info(plugin_id)
|
|
if not plugin_info:
|
|
error_msg += ' (plugin not found in registry)'
|
|
|
|
return error_response(
|
|
ErrorCode.PLUGIN_INSTALL_FAILED,
|
|
error_msg,
|
|
status_code=500
|
|
)
|
|
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in install_plugin: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/install-from-url', methods=['POST'])
|
|
def install_plugin_from_url():
|
|
"""Install plugin from custom GitHub URL"""
|
|
try:
|
|
if not api_v3.plugin_store_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin store manager not initialized'}), 500
|
|
|
|
data = request.get_json()
|
|
if not data or 'repo_url' not in data:
|
|
return jsonify({'status': 'error', 'message': 'repo_url required'}), 400
|
|
|
|
repo_url = data['repo_url'].strip()
|
|
plugin_id = data.get('plugin_id') # Optional, for monorepo installations
|
|
plugin_path = data.get('plugin_path') # Optional, for monorepo subdirectory
|
|
branch = data.get('branch') # Optional branch parameter
|
|
|
|
# Install the plugin
|
|
result = api_v3.plugin_store_manager.install_from_url(
|
|
repo_url=repo_url,
|
|
plugin_id=plugin_id,
|
|
plugin_path=plugin_path,
|
|
branch=branch
|
|
)
|
|
|
|
if result.get('success'):
|
|
# Invalidate schema cache for the installed plugin
|
|
installed_plugin_id = result.get('plugin_id')
|
|
if api_v3.schema_manager and installed_plugin_id:
|
|
api_v3.schema_manager.invalidate_cache(installed_plugin_id)
|
|
|
|
# Discover and load the new plugin
|
|
if api_v3.plugin_manager and installed_plugin_id:
|
|
api_v3.plugin_manager.discover_plugins()
|
|
api_v3.plugin_manager.load_plugin(installed_plugin_id)
|
|
|
|
branch_msg = f" (branch: {result.get('branch', branch)})" if (result.get('branch') or branch) else ""
|
|
response_data = {
|
|
'status': 'success',
|
|
'message': f"Plugin {installed_plugin_id} installed successfully{branch_msg}",
|
|
'plugin_id': installed_plugin_id,
|
|
'name': result.get('name')
|
|
}
|
|
if result.get('branch'):
|
|
response_data['branch'] = result.get('branch')
|
|
return jsonify(response_data)
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': result.get('error', 'Failed to install plugin from URL')
|
|
}), 500
|
|
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in install_plugin_from_url: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/registry-from-url', methods=['POST'])
|
|
def get_registry_from_url():
|
|
"""Get plugin list from a registry-style monorepo URL"""
|
|
try:
|
|
if not api_v3.plugin_store_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin store manager not initialized'}), 500
|
|
|
|
data = request.get_json()
|
|
if not data or 'repo_url' not in data:
|
|
return jsonify({'status': 'error', 'message': 'repo_url required'}), 400
|
|
|
|
repo_url = data['repo_url'].strip()
|
|
|
|
# Get registry from the URL
|
|
registry = api_v3.plugin_store_manager.fetch_registry_from_url(repo_url)
|
|
|
|
if registry:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'plugins': registry.get('plugins', []),
|
|
'registry_url': repo_url
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Failed to fetch registry from URL or URL does not contain a valid registry'
|
|
}), 400
|
|
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in get_registry_from_url: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/saved-repositories', methods=['GET'])
|
|
def get_saved_repositories():
|
|
"""Get all saved repositories"""
|
|
try:
|
|
if not api_v3.saved_repositories_manager:
|
|
return jsonify({'status': 'error', 'message': 'Saved repositories manager not initialized'}), 500
|
|
|
|
repositories = api_v3.saved_repositories_manager.get_all()
|
|
return jsonify({'status': 'success', 'data': {'repositories': repositories}})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in get_saved_repositories: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/saved-repositories', methods=['POST'])
|
|
def add_saved_repository():
|
|
"""Add a repository to saved list"""
|
|
try:
|
|
if not api_v3.saved_repositories_manager:
|
|
return jsonify({'status': 'error', 'message': 'Saved repositories manager not initialized'}), 500
|
|
|
|
data = request.get_json()
|
|
if not data or 'repo_url' not in data:
|
|
return jsonify({'status': 'error', 'message': 'repo_url required'}), 400
|
|
|
|
repo_url = data['repo_url'].strip()
|
|
name = data.get('name')
|
|
|
|
success = api_v3.saved_repositories_manager.add(repo_url, name)
|
|
|
|
if success:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': 'Repository saved successfully',
|
|
'data': {'repositories': api_v3.saved_repositories_manager.get_all()}
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Repository already exists or failed to save'
|
|
}), 400
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in add_saved_repository: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/saved-repositories', methods=['DELETE'])
|
|
def remove_saved_repository():
|
|
"""Remove a repository from saved list"""
|
|
try:
|
|
if not api_v3.saved_repositories_manager:
|
|
return jsonify({'status': 'error', 'message': 'Saved repositories manager not initialized'}), 500
|
|
|
|
data = request.get_json()
|
|
if not data or 'repo_url' not in data:
|
|
return jsonify({'status': 'error', 'message': 'repo_url required'}), 400
|
|
|
|
repo_url = data['repo_url']
|
|
|
|
success = api_v3.saved_repositories_manager.remove(repo_url)
|
|
|
|
if success:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': 'Repository removed successfully',
|
|
'data': {'repositories': api_v3.saved_repositories_manager.get_all()}
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Repository not found'
|
|
}), 404
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in remove_saved_repository: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/store/list', methods=['GET'])
|
|
def list_plugin_store():
|
|
"""Search plugin store"""
|
|
try:
|
|
if not api_v3.plugin_store_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin store manager not initialized'}), 500
|
|
|
|
query = request.args.get('query', '')
|
|
category = request.args.get('category', '')
|
|
tags = request.args.getlist('tags')
|
|
# Default to fetching commit metadata to ensure accurate commit timestamps
|
|
fetch_commit_param = request.args.get('fetch_commit_info', request.args.get('fetch_latest_versions', '')).lower()
|
|
fetch_commit = fetch_commit_param != 'false'
|
|
|
|
# Search plugins from the registry (including saved repositories)
|
|
plugins = api_v3.plugin_store_manager.search_plugins(
|
|
query=query,
|
|
category=category,
|
|
tags=tags,
|
|
fetch_commit_info=fetch_commit,
|
|
include_saved_repos=True,
|
|
saved_repositories_manager=api_v3.saved_repositories_manager
|
|
)
|
|
|
|
# Format plugins for the web interface
|
|
formatted_plugins = []
|
|
for plugin in plugins:
|
|
formatted_plugins.append({
|
|
'id': plugin.get('id'),
|
|
'name': plugin.get('name'),
|
|
'author': plugin.get('author'),
|
|
'category': plugin.get('category'),
|
|
'description': plugin.get('description'),
|
|
'tags': plugin.get('tags', []),
|
|
'stars': plugin.get('stars', 0),
|
|
'verified': plugin.get('verified', False),
|
|
'repo': plugin.get('repo', ''),
|
|
'last_updated': plugin.get('last_updated') or plugin.get('last_updated_iso', ''),
|
|
'last_updated_iso': plugin.get('last_updated_iso', ''),
|
|
'last_commit': plugin.get('last_commit') or plugin.get('last_commit_sha'),
|
|
'last_commit_message': plugin.get('last_commit_message'),
|
|
'last_commit_author': plugin.get('last_commit_author'),
|
|
'branch': plugin.get('branch') or plugin.get('default_branch'),
|
|
'default_branch': plugin.get('default_branch')
|
|
})
|
|
|
|
return jsonify({'status': 'success', 'data': {'plugins': formatted_plugins}})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in list_plugin_store: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/store/github-status', methods=['GET'])
|
|
def get_github_auth_status():
|
|
"""Check if GitHub authentication is configured"""
|
|
try:
|
|
if not api_v3.plugin_store_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin store manager not initialized'}), 500
|
|
|
|
# Check if GitHub token is configured
|
|
has_token = api_v3.plugin_store_manager.github_token is not None and len(api_v3.plugin_store_manager.github_token) > 0
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': {
|
|
'authenticated': has_token,
|
|
'rate_limit': 5000 if has_token else 60,
|
|
'message': 'GitHub API authenticated' if has_token else 'No GitHub token configured'
|
|
}
|
|
})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in get_github_auth_status: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/store/refresh', methods=['POST'])
|
|
def refresh_plugin_store():
|
|
"""Refresh plugin store repository"""
|
|
try:
|
|
if not api_v3.plugin_store_manager:
|
|
return jsonify({'status': 'error', 'message': 'Plugin store manager not initialized'}), 500
|
|
|
|
data = request.get_json() or {}
|
|
fetch_commit_info = data.get('fetch_commit_info', data.get('fetch_latest_versions', False))
|
|
|
|
# Force refresh the registry
|
|
registry = api_v3.plugin_store_manager.fetch_registry(force_refresh=True)
|
|
plugin_count = len(registry.get('plugins', []))
|
|
|
|
message = 'Plugin store refreshed'
|
|
if fetch_commit_info:
|
|
message += ' (with refreshed commit metadata from GitHub)'
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': message,
|
|
'plugin_count': plugin_count
|
|
})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in refresh_plugin_store: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
def deep_merge(base_dict, update_dict):
|
|
"""
|
|
Deep merge update_dict into base_dict.
|
|
For nested dicts, recursively merge. For other types, update_dict takes precedence.
|
|
"""
|
|
result = base_dict.copy()
|
|
for key, value in update_dict.items():
|
|
if key in result and isinstance(result[key], dict) and isinstance(value, dict):
|
|
# Recursively merge nested dicts
|
|
result[key] = deep_merge(result[key], value)
|
|
else:
|
|
# For non-dict values or new keys, use the update value
|
|
result[key] = value
|
|
return result
|
|
|
|
|
|
def _parse_form_value(value):
|
|
"""
|
|
Parse a form value into the appropriate Python type.
|
|
Handles booleans, numbers, JSON arrays/objects, and strings.
|
|
"""
|
|
import json
|
|
|
|
if value is None:
|
|
return None
|
|
|
|
# Handle string values
|
|
if isinstance(value, str):
|
|
stripped = value.strip()
|
|
|
|
# Check for boolean strings
|
|
if stripped.lower() == 'true':
|
|
return True
|
|
if stripped.lower() == 'false':
|
|
return False
|
|
if stripped.lower() in ('null', 'none') or stripped == '':
|
|
return None
|
|
|
|
# Try parsing as JSON (for arrays and objects) - do this BEFORE number parsing
|
|
# This handles RGB arrays like "[255, 0, 0]" correctly
|
|
if stripped.startswith('[') or stripped.startswith('{'):
|
|
try:
|
|
return json.loads(stripped)
|
|
except json.JSONDecodeError:
|
|
pass
|
|
|
|
# Try parsing as number
|
|
try:
|
|
if '.' in stripped:
|
|
return float(stripped)
|
|
return int(stripped)
|
|
except ValueError:
|
|
pass
|
|
|
|
# Return as string (original value, not stripped)
|
|
return value
|
|
|
|
return value
|
|
|
|
|
|
def _get_schema_property(schema, key_path):
|
|
"""
|
|
Get the schema property for a given key path (supports dot notation).
|
|
|
|
Args:
|
|
schema: The JSON schema dict
|
|
key_path: Dot-separated path like "customization.time_text.font"
|
|
|
|
Returns:
|
|
The property schema dict or None if not found
|
|
"""
|
|
if not schema or 'properties' not in schema:
|
|
return None
|
|
|
|
parts = key_path.split('.')
|
|
current = schema['properties']
|
|
|
|
for i, part in enumerate(parts):
|
|
if part not in current:
|
|
return None
|
|
|
|
prop = current[part]
|
|
|
|
# If this is the last part, return the property
|
|
if i == len(parts) - 1:
|
|
return prop
|
|
|
|
# If this is an object with properties, navigate deeper
|
|
if isinstance(prop, dict) and 'properties' in prop:
|
|
current = prop['properties']
|
|
else:
|
|
return None
|
|
|
|
return None
|
|
|
|
|
|
def _parse_form_value_with_schema(value, key_path, schema):
|
|
"""
|
|
Parse a form value using schema information to determine correct type.
|
|
Handles arrays (comma-separated strings), objects, and other types.
|
|
|
|
Args:
|
|
value: The form value (usually a string)
|
|
key_path: Dot-separated path like "category_order" or "customization.time_text.font"
|
|
schema: The plugin's JSON schema
|
|
|
|
Returns:
|
|
Parsed value with correct type
|
|
"""
|
|
import json
|
|
|
|
# Get the schema property for this field
|
|
prop = _get_schema_property(schema, key_path)
|
|
|
|
# Handle None/empty values
|
|
if value is None or (isinstance(value, str) and value.strip() == ''):
|
|
# If schema says it's an array, return empty array instead of None
|
|
if prop and prop.get('type') == 'array':
|
|
return []
|
|
# If schema says it's an object, return empty dict instead of None
|
|
if prop and prop.get('type') == 'object':
|
|
return {}
|
|
return None
|
|
|
|
# Handle string values
|
|
if isinstance(value, str):
|
|
stripped = value.strip()
|
|
|
|
# Check for boolean strings
|
|
if stripped.lower() == 'true':
|
|
return True
|
|
if stripped.lower() == 'false':
|
|
return False
|
|
|
|
# Handle arrays based on schema
|
|
if prop and prop.get('type') == 'array':
|
|
# Try parsing as JSON first (handles "[1,2,3]" format)
|
|
if stripped.startswith('['):
|
|
try:
|
|
return json.loads(stripped)
|
|
except json.JSONDecodeError:
|
|
pass
|
|
|
|
# Otherwise, treat as comma-separated string
|
|
if stripped:
|
|
# Split by comma and strip each item
|
|
items = [item.strip() for item in stripped.split(',') if item.strip()]
|
|
# Try to convert items to numbers if schema items are numbers
|
|
items_schema = prop.get('items', {})
|
|
if items_schema.get('type') in ('number', 'integer'):
|
|
try:
|
|
return [int(item) if '.' not in item else float(item) for item in items]
|
|
except ValueError:
|
|
pass
|
|
return items
|
|
return []
|
|
|
|
# Handle objects based on schema
|
|
if prop and prop.get('type') == 'object':
|
|
# Try parsing as JSON
|
|
if stripped.startswith('{'):
|
|
try:
|
|
return json.loads(stripped)
|
|
except json.JSONDecodeError:
|
|
pass
|
|
# If it's not JSON, return empty dict (form shouldn't send objects as strings)
|
|
return {}
|
|
|
|
# Try parsing as JSON (for arrays and objects) - do this BEFORE number parsing
|
|
if stripped.startswith('[') or stripped.startswith('{'):
|
|
try:
|
|
return json.loads(stripped)
|
|
except json.JSONDecodeError:
|
|
pass
|
|
|
|
# Handle numbers based on schema
|
|
if prop:
|
|
prop_type = prop.get('type')
|
|
if prop_type == 'integer':
|
|
try:
|
|
return int(stripped)
|
|
except ValueError:
|
|
return prop.get('default', 0)
|
|
elif prop_type == 'number':
|
|
try:
|
|
return float(stripped)
|
|
except ValueError:
|
|
return prop.get('default', 0.0)
|
|
|
|
# Try parsing as number (fallback)
|
|
try:
|
|
if '.' in stripped:
|
|
return float(stripped)
|
|
return int(stripped)
|
|
except ValueError:
|
|
pass
|
|
|
|
# Return as string
|
|
return value
|
|
|
|
return value
|
|
|
|
|
|
def _set_nested_value(config, key_path, value):
|
|
"""
|
|
Set a value in a nested dict using dot notation path.
|
|
Handles existing nested dicts correctly by merging instead of replacing.
|
|
|
|
Args:
|
|
config: The config dict to modify
|
|
key_path: Dot-separated path (e.g., "customization.period_text.font")
|
|
value: The value to set
|
|
"""
|
|
parts = key_path.split('.')
|
|
current = config
|
|
|
|
# Navigate/create intermediate dicts
|
|
for i, part in enumerate(parts[:-1]):
|
|
if part not in current:
|
|
current[part] = {}
|
|
elif not isinstance(current[part], dict):
|
|
# If the existing value is not a dict, replace it with a dict
|
|
current[part] = {}
|
|
current = current[part]
|
|
|
|
# Set the final value (don't overwrite with empty dict if value is None and we want to preserve structure)
|
|
if value is not None or parts[-1] not in current:
|
|
current[parts[-1]] = value
|
|
|
|
|
|
def _filter_config_by_schema(config, schema, prefix=''):
|
|
"""
|
|
Filter config to only include fields defined in the schema.
|
|
Removes fields not in schema, especially important when additionalProperties is false.
|
|
|
|
Args:
|
|
config: The config dict to filter
|
|
schema: The JSON schema dict
|
|
prefix: Prefix for nested paths (used recursively)
|
|
|
|
Returns:
|
|
Filtered config dict containing only schema-defined fields
|
|
"""
|
|
if not schema or 'properties' not in schema:
|
|
return config
|
|
|
|
filtered = {}
|
|
schema_props = schema.get('properties', {})
|
|
|
|
for key, value in config.items():
|
|
if key not in schema_props:
|
|
# Field not in schema, skip it
|
|
continue
|
|
|
|
prop_schema = schema_props[key]
|
|
|
|
# Handle nested objects recursively
|
|
if isinstance(value, dict) and prop_schema.get('type') == 'object' and 'properties' in prop_schema:
|
|
filtered[key] = _filter_config_by_schema(value, prop_schema, f"{prefix}.{key}" if prefix else key)
|
|
else:
|
|
# Keep the value as-is for non-object types
|
|
filtered[key] = value
|
|
|
|
return filtered
|
|
|
|
|
|
@api_v3.route('/plugins/config', methods=['POST'])
|
|
def save_plugin_config():
|
|
"""Save plugin configuration, separating secrets from regular config"""
|
|
try:
|
|
if not api_v3.config_manager:
|
|
return error_response(
|
|
ErrorCode.SYSTEM_ERROR,
|
|
'Config manager not initialized',
|
|
status_code=500
|
|
)
|
|
|
|
# Support both JSON and form data (for HTMX submissions)
|
|
content_type = request.content_type or ''
|
|
|
|
if 'application/json' in content_type:
|
|
# JSON request
|
|
data, error = validate_request_json(['plugin_id'])
|
|
if error:
|
|
return error
|
|
plugin_id = data['plugin_id']
|
|
plugin_config = data.get('config', {})
|
|
else:
|
|
# Form data (HTMX submission)
|
|
# plugin_id comes from query string, config from form fields
|
|
plugin_id = request.args.get('plugin_id')
|
|
if not plugin_id:
|
|
return error_response(
|
|
ErrorCode.INVALID_INPUT,
|
|
'plugin_id required in query string',
|
|
status_code=400
|
|
)
|
|
|
|
# Load existing config as base (partial form updates should merge, not replace)
|
|
existing_config = {}
|
|
if api_v3.config_manager:
|
|
full_config = api_v3.config_manager.load_config()
|
|
existing_config = full_config.get(plugin_id, {}).copy()
|
|
|
|
# Get schema manager instance (needed for type conversion)
|
|
schema_mgr = api_v3.schema_manager
|
|
if not schema_mgr:
|
|
return error_response(
|
|
ErrorCode.SYSTEM_ERROR,
|
|
'Schema manager not initialized',
|
|
status_code=500
|
|
)
|
|
|
|
# Load plugin schema BEFORE processing form data (needed for type conversion)
|
|
schema = schema_mgr.load_schema(plugin_id, use_cache=False)
|
|
|
|
# Start with existing config and apply form updates
|
|
plugin_config = existing_config
|
|
|
|
# Convert form data to config dict
|
|
# Form fields can use dot notation for nested values (e.g., "transition.type")
|
|
form_data = request.form.to_dict()
|
|
|
|
# First pass: detect and combine array index fields (e.g., "text_color.0", "text_color.1" -> "text_color" as array)
|
|
# This handles cases where forms send array fields as indexed inputs
|
|
array_fields = {} # Maps base field path to list of (index, value) tuples
|
|
processed_keys = set()
|
|
indexed_base_paths = set() # Track which base paths have indexed fields
|
|
|
|
for key, value in form_data.items():
|
|
# Check if this looks like an array index field (ends with .0, .1, .2, etc.)
|
|
if '.' in key:
|
|
parts = key.rsplit('.', 1) # Split on last dot
|
|
if len(parts) == 2:
|
|
base_path, last_part = parts
|
|
# Check if last part is a numeric string (array index)
|
|
if last_part.isdigit():
|
|
# Get schema property for the base path to verify it's an array
|
|
base_prop = _get_schema_property(schema, base_path)
|
|
if base_prop and base_prop.get('type') == 'array':
|
|
# This is an array index field
|
|
index = int(last_part)
|
|
if base_path not in array_fields:
|
|
array_fields[base_path] = []
|
|
array_fields[base_path].append((index, value))
|
|
processed_keys.add(key)
|
|
indexed_base_paths.add(base_path)
|
|
continue
|
|
|
|
# Process combined array fields
|
|
for base_path, index_values in array_fields.items():
|
|
# Sort by index and extract values
|
|
index_values.sort(key=lambda x: x[0])
|
|
values = [v for _, v in index_values]
|
|
# Combine values into comma-separated string for parsing
|
|
combined_value = ', '.join(str(v) for v in values)
|
|
# Parse as array using schema
|
|
parsed_value = _parse_form_value_with_schema(combined_value, base_path, schema)
|
|
# Debug logging
|
|
import logging
|
|
logger = logging.getLogger(__name__)
|
|
logger.debug(f"Combined indexed array field {base_path}: {values} -> {combined_value} -> {parsed_value}")
|
|
_set_nested_value(plugin_config, base_path, parsed_value)
|
|
|
|
# Process remaining (non-indexed) fields
|
|
# Skip any base paths that were processed as indexed arrays
|
|
for key, value in form_data.items():
|
|
if key not in processed_keys:
|
|
# Skip if this key is a base path that was processed as indexed array
|
|
# (to avoid overwriting the combined array with a single value)
|
|
if key not in indexed_base_paths:
|
|
# Parse value using schema to determine correct type
|
|
parsed_value = _parse_form_value_with_schema(value, key, schema)
|
|
# Debug logging for array fields
|
|
if schema:
|
|
prop = _get_schema_property(schema, key)
|
|
if prop and prop.get('type') == 'array':
|
|
import logging
|
|
logger = logging.getLogger(__name__)
|
|
logger.debug(f"Array field {key}: form value='{value}' -> parsed={parsed_value}")
|
|
# Use helper to set nested values correctly
|
|
_set_nested_value(plugin_config, key, parsed_value)
|
|
|
|
# Post-process: Fix array fields that might have been incorrectly structured
|
|
# This handles cases where array fields are stored as dicts (e.g., from indexed form fields)
|
|
def fix_array_structures(config_dict, schema_props, prefix=''):
|
|
"""Recursively fix array structures (convert dicts with numeric keys to arrays, fix length issues)"""
|
|
for prop_key, prop_schema in schema_props.items():
|
|
prop_type = prop_schema.get('type')
|
|
|
|
if prop_type == 'array':
|
|
# Navigate to the field location
|
|
if prefix:
|
|
parent_parts = prefix.split('.')
|
|
parent = config_dict
|
|
for part in parent_parts:
|
|
if isinstance(parent, dict) and part in parent:
|
|
parent = parent[part]
|
|
else:
|
|
parent = None
|
|
break
|
|
|
|
if parent is not None and isinstance(parent, dict) and prop_key in parent:
|
|
current_value = parent[prop_key]
|
|
# If it's a dict with numeric string keys, convert to array
|
|
if isinstance(current_value, dict) and not isinstance(current_value, list):
|
|
try:
|
|
# Check if all keys are numeric strings (array indices)
|
|
keys = [k for k in current_value.keys()]
|
|
if all(k.isdigit() for k in keys):
|
|
# Convert to sorted array by index
|
|
sorted_keys = sorted(keys, key=int)
|
|
array_value = [current_value[k] for k in sorted_keys]
|
|
# Convert array elements to correct types based on schema
|
|
items_schema = prop_schema.get('items', {})
|
|
item_type = items_schema.get('type')
|
|
if item_type in ('number', 'integer'):
|
|
converted_array = []
|
|
for v in array_value:
|
|
if isinstance(v, str):
|
|
try:
|
|
if item_type == 'integer':
|
|
converted_array.append(int(v))
|
|
else:
|
|
converted_array.append(float(v))
|
|
except (ValueError, TypeError):
|
|
converted_array.append(v)
|
|
else:
|
|
converted_array.append(v)
|
|
array_value = converted_array
|
|
parent[prop_key] = array_value
|
|
current_value = array_value # Update for length check below
|
|
except (ValueError, KeyError, TypeError):
|
|
# Conversion failed, check if we should use default
|
|
pass
|
|
|
|
# If it's an array, ensure correct types and check minItems
|
|
if isinstance(current_value, list):
|
|
# First, ensure array elements are correct types
|
|
items_schema = prop_schema.get('items', {})
|
|
item_type = items_schema.get('type')
|
|
if item_type in ('number', 'integer'):
|
|
converted_array = []
|
|
for v in current_value:
|
|
if isinstance(v, str):
|
|
try:
|
|
if item_type == 'integer':
|
|
converted_array.append(int(v))
|
|
else:
|
|
converted_array.append(float(v))
|
|
except (ValueError, TypeError):
|
|
converted_array.append(v)
|
|
else:
|
|
converted_array.append(v)
|
|
parent[prop_key] = converted_array
|
|
current_value = converted_array
|
|
|
|
# Then check minItems
|
|
min_items = prop_schema.get('minItems')
|
|
if min_items is not None and len(current_value) < min_items:
|
|
# Use default if available, otherwise keep as-is (validation will catch it)
|
|
default = prop_schema.get('default')
|
|
if default and isinstance(default, list) and len(default) >= min_items:
|
|
parent[prop_key] = default
|
|
else:
|
|
# Top-level field
|
|
if prop_key in config_dict:
|
|
current_value = config_dict[prop_key]
|
|
# If it's a dict with numeric string keys, convert to array
|
|
if isinstance(current_value, dict) and not isinstance(current_value, list):
|
|
try:
|
|
keys = [k for k in current_value.keys()]
|
|
if all(k.isdigit() for k in keys):
|
|
sorted_keys = sorted(keys, key=int)
|
|
array_value = [current_value[k] for k in sorted_keys]
|
|
# Convert array elements to correct types based on schema
|
|
items_schema = prop_schema.get('items', {})
|
|
item_type = items_schema.get('type')
|
|
if item_type in ('number', 'integer'):
|
|
converted_array = []
|
|
for v in array_value:
|
|
if isinstance(v, str):
|
|
try:
|
|
if item_type == 'integer':
|
|
converted_array.append(int(v))
|
|
else:
|
|
converted_array.append(float(v))
|
|
except (ValueError, TypeError):
|
|
converted_array.append(v)
|
|
else:
|
|
converted_array.append(v)
|
|
array_value = converted_array
|
|
config_dict[prop_key] = array_value
|
|
current_value = array_value # Update for length check below
|
|
except (ValueError, KeyError, TypeError):
|
|
pass
|
|
|
|
# If it's an array, ensure correct types and check minItems
|
|
if isinstance(current_value, list):
|
|
# First, ensure array elements are correct types
|
|
items_schema = prop_schema.get('items', {})
|
|
item_type = items_schema.get('type')
|
|
if item_type in ('number', 'integer'):
|
|
converted_array = []
|
|
for v in current_value:
|
|
if isinstance(v, str):
|
|
try:
|
|
if item_type == 'integer':
|
|
converted_array.append(int(v))
|
|
else:
|
|
converted_array.append(float(v))
|
|
except (ValueError, TypeError):
|
|
converted_array.append(v)
|
|
else:
|
|
converted_array.append(v)
|
|
config_dict[prop_key] = converted_array
|
|
current_value = converted_array
|
|
|
|
# Then check minItems
|
|
min_items = prop_schema.get('minItems')
|
|
if min_items is not None and len(current_value) < min_items:
|
|
default = prop_schema.get('default')
|
|
if default and isinstance(default, list) and len(default) >= min_items:
|
|
config_dict[prop_key] = default
|
|
|
|
# Recurse into nested objects
|
|
elif prop_type == 'object' and 'properties' in prop_schema:
|
|
nested_prefix = f"{prefix}.{prop_key}" if prefix else prop_key
|
|
if prefix:
|
|
parent_parts = prefix.split('.')
|
|
parent = config_dict
|
|
for part in parent_parts:
|
|
if isinstance(parent, dict) and part in parent:
|
|
parent = parent[part]
|
|
else:
|
|
parent = None
|
|
break
|
|
nested_dict = parent.get(prop_key) if parent is not None and isinstance(parent, dict) else None
|
|
else:
|
|
nested_dict = config_dict.get(prop_key)
|
|
|
|
if isinstance(nested_dict, dict):
|
|
fix_array_structures(nested_dict, prop_schema['properties'], nested_prefix)
|
|
|
|
# Also ensure array fields that are None get converted to empty arrays
|
|
def ensure_array_defaults(config_dict, schema_props, prefix=''):
|
|
"""Recursively ensure array fields have defaults if None"""
|
|
for prop_key, prop_schema in schema_props.items():
|
|
prop_type = prop_schema.get('type')
|
|
|
|
if prop_type == 'array':
|
|
if prefix:
|
|
parent_parts = prefix.split('.')
|
|
parent = config_dict
|
|
for part in parent_parts:
|
|
if isinstance(parent, dict) and part in parent:
|
|
parent = parent[part]
|
|
else:
|
|
parent = None
|
|
break
|
|
|
|
if parent is not None and isinstance(parent, dict):
|
|
if prop_key not in parent or parent[prop_key] is None:
|
|
default = prop_schema.get('default', [])
|
|
parent[prop_key] = default if default else []
|
|
else:
|
|
if prop_key not in config_dict or config_dict[prop_key] is None:
|
|
default = prop_schema.get('default', [])
|
|
config_dict[prop_key] = default if default else []
|
|
|
|
elif prop_type == 'object' and 'properties' in prop_schema:
|
|
nested_prefix = f"{prefix}.{prop_key}" if prefix else prop_key
|
|
if prefix:
|
|
parent_parts = prefix.split('.')
|
|
parent = config_dict
|
|
for part in parent_parts:
|
|
if isinstance(parent, dict) and part in parent:
|
|
parent = parent[part]
|
|
else:
|
|
parent = None
|
|
break
|
|
nested_dict = parent.get(prop_key) if parent is not None and isinstance(parent, dict) else None
|
|
else:
|
|
nested_dict = config_dict.get(prop_key)
|
|
|
|
if nested_dict is None:
|
|
if prefix:
|
|
parent_parts = prefix.split('.')
|
|
parent = config_dict
|
|
for part in parent_parts:
|
|
if part not in parent:
|
|
parent[part] = {}
|
|
parent = parent[part]
|
|
if prop_key not in parent:
|
|
parent[prop_key] = {}
|
|
nested_dict = parent[prop_key]
|
|
else:
|
|
if prop_key not in config_dict:
|
|
config_dict[prop_key] = {}
|
|
nested_dict = config_dict[prop_key]
|
|
|
|
if isinstance(nested_dict, dict):
|
|
ensure_array_defaults(nested_dict, prop_schema['properties'], nested_prefix)
|
|
|
|
if schema and 'properties' in schema:
|
|
# First, fix any dict structures that should be arrays
|
|
fix_array_structures(plugin_config, schema['properties'])
|
|
# Then, ensure None arrays get defaults
|
|
ensure_array_defaults(plugin_config, schema['properties'])
|
|
|
|
# Get schema manager instance (for JSON requests)
|
|
schema_mgr = api_v3.schema_manager
|
|
if not schema_mgr:
|
|
return error_response(
|
|
ErrorCode.SYSTEM_ERROR,
|
|
'Schema manager not initialized',
|
|
status_code=500
|
|
)
|
|
|
|
# Load plugin schema using SchemaManager (force refresh to get latest schema)
|
|
# For JSON requests, schema wasn't loaded yet
|
|
if 'application/json' in content_type:
|
|
schema = schema_mgr.load_schema(plugin_id, use_cache=False)
|
|
|
|
# PRE-PROCESSING: Preserve 'enabled' state if not in request
|
|
# This prevents overwriting the enabled state when saving config from a form that doesn't include the toggle
|
|
if 'enabled' not in plugin_config:
|
|
try:
|
|
current_config = api_v3.config_manager.load_config()
|
|
if plugin_id in current_config and 'enabled' in current_config[plugin_id]:
|
|
plugin_config['enabled'] = current_config[plugin_id]['enabled']
|
|
# logger.debug(f"Preserving enabled state for {plugin_id}: {plugin_config['enabled']}")
|
|
elif api_v3.plugin_manager:
|
|
# Fallback to plugin instance if config doesn't have it
|
|
plugin_instance = api_v3.plugin_manager.get_plugin(plugin_id)
|
|
if plugin_instance:
|
|
plugin_config['enabled'] = plugin_instance.enabled
|
|
# Final fallback: default to True if plugin is loaded (matches BasePlugin default)
|
|
if 'enabled' not in plugin_config:
|
|
plugin_config['enabled'] = True
|
|
except Exception as e:
|
|
print(f"Error preserving enabled state: {e}")
|
|
# Default to True on error to avoid disabling plugins
|
|
plugin_config['enabled'] = True
|
|
|
|
# Find secret fields (supports nested schemas)
|
|
secret_fields = set()
|
|
|
|
def find_secret_fields(properties, prefix=''):
|
|
"""Recursively find fields marked with x-secret: true"""
|
|
fields = set()
|
|
if not isinstance(properties, dict):
|
|
return fields
|
|
for field_name, field_props in properties.items():
|
|
full_path = f"{prefix}.{field_name}" if prefix else field_name
|
|
if isinstance(field_props, dict) and field_props.get('x-secret', False):
|
|
fields.add(full_path)
|
|
# Check nested objects
|
|
if isinstance(field_props, dict) and field_props.get('type') == 'object' and 'properties' in field_props:
|
|
fields.update(find_secret_fields(field_props['properties'], full_path))
|
|
return fields
|
|
|
|
if schema and 'properties' in schema:
|
|
secret_fields = find_secret_fields(schema['properties'])
|
|
|
|
# Apply defaults from schema to config BEFORE validation
|
|
# This ensures required fields with defaults are present before validation
|
|
# Store preserved enabled value before merge to protect it from defaults
|
|
preserved_enabled = None
|
|
if 'enabled' in plugin_config:
|
|
preserved_enabled = plugin_config['enabled']
|
|
|
|
if schema:
|
|
defaults = schema_mgr.generate_default_config(plugin_id, use_cache=True)
|
|
plugin_config = schema_mgr.merge_with_defaults(plugin_config, defaults)
|
|
|
|
# Ensure enabled state is preserved after defaults merge
|
|
# Defaults should not overwrite an explicitly preserved enabled value
|
|
if preserved_enabled is not None:
|
|
# Restore preserved value if it was changed by defaults merge
|
|
if plugin_config.get('enabled') != preserved_enabled:
|
|
plugin_config['enabled'] = preserved_enabled
|
|
|
|
# Normalize config data: convert string numbers to integers/floats where schema expects numbers
|
|
# This handles form data which sends everything as strings
|
|
def normalize_config_values(config, schema_props, prefix=''):
|
|
"""Recursively normalize config values based on schema types"""
|
|
if not isinstance(config, dict) or not isinstance(schema_props, dict):
|
|
return config
|
|
|
|
normalized = {}
|
|
for key, value in config.items():
|
|
field_path = f"{prefix}.{key}" if prefix else key
|
|
|
|
if key not in schema_props:
|
|
# Field not in schema, keep as-is (will be caught by additionalProperties check if needed)
|
|
normalized[key] = value
|
|
continue
|
|
|
|
prop_schema = schema_props[key]
|
|
prop_type = prop_schema.get('type')
|
|
|
|
# Handle union types (e.g., ["integer", "null"])
|
|
if isinstance(prop_type, list):
|
|
# Check if null is allowed and value is empty/null
|
|
if 'null' in prop_type:
|
|
# Handle various representations of null/empty
|
|
if value is None:
|
|
normalized[key] = None
|
|
continue
|
|
elif isinstance(value, str):
|
|
# Strip whitespace and check for null representations
|
|
value_stripped = value.strip()
|
|
if value_stripped == '' or value_stripped.lower() in ('null', 'none', 'undefined'):
|
|
normalized[key] = None
|
|
continue
|
|
|
|
# Try to normalize based on non-null types in the union
|
|
# Check integer first (more specific than number)
|
|
if 'integer' in prop_type:
|
|
if isinstance(value, str):
|
|
value_stripped = value.strip()
|
|
if value_stripped == '':
|
|
# Empty string with null allowed - already handled above, but double-check
|
|
if 'null' in prop_type:
|
|
normalized[key] = None
|
|
continue
|
|
try:
|
|
normalized[key] = int(value_stripped)
|
|
continue
|
|
except (ValueError, TypeError):
|
|
pass
|
|
elif isinstance(value, (int, float)):
|
|
normalized[key] = int(value)
|
|
continue
|
|
|
|
# Check number (less specific, but handles floats)
|
|
if 'number' in prop_type:
|
|
if isinstance(value, str):
|
|
value_stripped = value.strip()
|
|
if value_stripped == '':
|
|
# Empty string with null allowed - already handled above, but double-check
|
|
if 'null' in prop_type:
|
|
normalized[key] = None
|
|
continue
|
|
try:
|
|
normalized[key] = float(value_stripped)
|
|
continue
|
|
except (ValueError, TypeError):
|
|
pass
|
|
elif isinstance(value, (int, float)):
|
|
normalized[key] = float(value)
|
|
continue
|
|
|
|
# Check boolean
|
|
if 'boolean' in prop_type:
|
|
if isinstance(value, str):
|
|
normalized[key] = value.strip().lower() in ('true', '1', 'on', 'yes')
|
|
continue
|
|
|
|
# If no conversion worked and null is allowed, try to set to None
|
|
# This handles cases where the value is an empty string or can't be converted
|
|
if 'null' in prop_type:
|
|
if isinstance(value, str):
|
|
value_stripped = value.strip()
|
|
if value_stripped == '' or value_stripped.lower() in ('null', 'none', 'undefined'):
|
|
normalized[key] = None
|
|
continue
|
|
# If it's already None, keep it
|
|
if value is None:
|
|
normalized[key] = None
|
|
continue
|
|
|
|
# If no conversion worked, keep original value (will fail validation, but that's expected)
|
|
# Log a warning for debugging
|
|
logger.warning(f"Could not normalize field {field_path}: value={repr(value)}, type={type(value)}, schema_type={prop_type}")
|
|
normalized[key] = value
|
|
continue
|
|
|
|
if isinstance(value, dict) and prop_type == 'object' and 'properties' in prop_schema:
|
|
# Recursively normalize nested objects
|
|
normalized[key] = normalize_config_values(value, prop_schema['properties'], field_path)
|
|
elif isinstance(value, list) and prop_type == 'array' and 'items' in prop_schema:
|
|
# Normalize array items
|
|
items_schema = prop_schema['items']
|
|
item_type = items_schema.get('type')
|
|
|
|
# Handle union types in array items
|
|
if isinstance(item_type, list):
|
|
normalized_array = []
|
|
for v in value:
|
|
# Check if null is allowed
|
|
if 'null' in item_type:
|
|
if v is None or v == '' or (isinstance(v, str) and v.lower() in ('null', 'none')):
|
|
normalized_array.append(None)
|
|
continue
|
|
|
|
# Try to normalize based on non-null types
|
|
if 'integer' in item_type:
|
|
if isinstance(v, str):
|
|
try:
|
|
normalized_array.append(int(v))
|
|
continue
|
|
except (ValueError, TypeError):
|
|
pass
|
|
elif isinstance(v, (int, float)):
|
|
normalized_array.append(int(v))
|
|
continue
|
|
elif 'number' in item_type:
|
|
if isinstance(v, str):
|
|
try:
|
|
normalized_array.append(float(v))
|
|
continue
|
|
except (ValueError, TypeError):
|
|
pass
|
|
elif isinstance(v, (int, float)):
|
|
normalized_array.append(float(v))
|
|
continue
|
|
|
|
# If no conversion worked, keep original value
|
|
normalized_array.append(v)
|
|
normalized[key] = normalized_array
|
|
elif item_type == 'integer':
|
|
# Convert string numbers to integers
|
|
normalized_array = []
|
|
for v in value:
|
|
if isinstance(v, str):
|
|
try:
|
|
normalized_array.append(int(v))
|
|
except (ValueError, TypeError):
|
|
normalized_array.append(v)
|
|
elif isinstance(v, (int, float)):
|
|
normalized_array.append(int(v))
|
|
else:
|
|
normalized_array.append(v)
|
|
normalized[key] = normalized_array
|
|
elif item_type == 'number':
|
|
# Convert string numbers to floats
|
|
normalized_array = []
|
|
for v in value:
|
|
if isinstance(v, str):
|
|
try:
|
|
normalized_array.append(float(v))
|
|
except (ValueError, TypeError):
|
|
normalized_array.append(v)
|
|
else:
|
|
normalized_array.append(v)
|
|
normalized[key] = normalized_array
|
|
elif item_type == 'object' and 'properties' in items_schema:
|
|
# Recursively normalize array of objects
|
|
normalized_array = []
|
|
for v in value:
|
|
if isinstance(v, dict):
|
|
normalized_array.append(
|
|
normalize_config_values(v, items_schema['properties'], f"{field_path}[]")
|
|
)
|
|
else:
|
|
normalized_array.append(v)
|
|
normalized[key] = normalized_array
|
|
else:
|
|
normalized[key] = value
|
|
elif prop_type == 'integer':
|
|
# Convert string to integer
|
|
if isinstance(value, str):
|
|
try:
|
|
normalized[key] = int(value)
|
|
except (ValueError, TypeError):
|
|
normalized[key] = value
|
|
else:
|
|
normalized[key] = value
|
|
elif prop_type == 'number':
|
|
# Convert string to float
|
|
if isinstance(value, str):
|
|
try:
|
|
normalized[key] = float(value)
|
|
except (ValueError, TypeError):
|
|
normalized[key] = value
|
|
else:
|
|
normalized[key] = value
|
|
elif prop_type == 'boolean':
|
|
# Convert string booleans
|
|
if isinstance(value, str):
|
|
normalized[key] = value.lower() in ('true', '1', 'on', 'yes')
|
|
else:
|
|
normalized[key] = value
|
|
else:
|
|
normalized[key] = value
|
|
|
|
return normalized
|
|
|
|
# Normalize config before validation
|
|
if schema and 'properties' in schema:
|
|
plugin_config = normalize_config_values(plugin_config, schema['properties'])
|
|
|
|
# Filter config to only include schema-defined fields (important when additionalProperties is false)
|
|
if schema and 'properties' in schema:
|
|
plugin_config = _filter_config_by_schema(plugin_config, schema)
|
|
|
|
# Debug logging for union type fields (temporary)
|
|
if 'rotation_settings' in plugin_config and 'random_seed' in plugin_config.get('rotation_settings', {}):
|
|
seed_value = plugin_config['rotation_settings']['random_seed']
|
|
logger.debug(f"After normalization, random_seed value: {repr(seed_value)}, type: {type(seed_value)}")
|
|
|
|
# Validate configuration against schema before saving
|
|
if schema:
|
|
# Log what we're validating for debugging
|
|
import logging
|
|
logger = logging.getLogger(__name__)
|
|
logger.info(f"Validating config for {plugin_id}")
|
|
logger.info(f"Config keys being validated: {list(plugin_config.keys())}")
|
|
logger.info(f"Full config: {plugin_config}")
|
|
|
|
# Get enhanced schema keys (including injected core properties)
|
|
# We need to create an enhanced schema to get the actual allowed keys
|
|
import copy
|
|
enhanced_schema = copy.deepcopy(schema)
|
|
if "properties" not in enhanced_schema:
|
|
enhanced_schema["properties"] = {}
|
|
|
|
# Core properties that are always injected during validation
|
|
core_properties = ["enabled", "display_duration", "live_priority"]
|
|
for prop_name in core_properties:
|
|
if prop_name not in enhanced_schema["properties"]:
|
|
# Add placeholder to get the full list of allowed keys
|
|
enhanced_schema["properties"][prop_name] = {"type": "any"}
|
|
|
|
is_valid, validation_errors = schema_mgr.validate_config_against_schema(
|
|
plugin_config, schema, plugin_id
|
|
)
|
|
if not is_valid:
|
|
# Log validation errors for debugging
|
|
logger.error(f"Config validation failed for {plugin_id}")
|
|
logger.error(f"Validation errors: {validation_errors}")
|
|
logger.error(f"Config that failed: {plugin_config}")
|
|
logger.error(f"Schema properties: {list(enhanced_schema.get('properties', {}).keys())}")
|
|
|
|
# Also print to console for immediate visibility
|
|
import json
|
|
print(f"[ERROR] Config validation failed for {plugin_id}")
|
|
print(f"[ERROR] Validation errors: {validation_errors}")
|
|
print(f"[ERROR] Config keys: {list(plugin_config.keys())}")
|
|
print(f"[ERROR] Schema property keys: {list(enhanced_schema.get('properties', {}).keys())}")
|
|
|
|
# Log raw form data if this was a form submission
|
|
if 'application/json' not in (request.content_type or ''):
|
|
form_data = request.form.to_dict()
|
|
print(f"[ERROR] Raw form data: {json.dumps({k: str(v)[:200] for k, v in form_data.items()}, indent=2)}")
|
|
print(f"[ERROR] Parsed config: {json.dumps(plugin_config, indent=2, default=str)}")
|
|
return error_response(
|
|
ErrorCode.CONFIG_VALIDATION_FAILED,
|
|
'Configuration validation failed',
|
|
details='; '.join(validation_errors) if validation_errors else 'Unknown validation error',
|
|
context={
|
|
'plugin_id': plugin_id,
|
|
'validation_errors': validation_errors,
|
|
'config_keys': list(plugin_config.keys()),
|
|
'schema_keys': list(enhanced_schema.get('properties', {}).keys())
|
|
},
|
|
suggested_fixes=[
|
|
'Review validation errors above',
|
|
'Check config against schema',
|
|
'Verify all required fields are present'
|
|
],
|
|
status_code=400
|
|
)
|
|
|
|
# Separate secrets from regular config (handles nested configs)
|
|
def separate_secrets(config, secrets_set, prefix=''):
|
|
"""Recursively separate secret fields from regular config"""
|
|
regular = {}
|
|
secrets = {}
|
|
|
|
for key, value in config.items():
|
|
full_path = f"{prefix}.{key}" if prefix else key
|
|
|
|
if isinstance(value, dict):
|
|
# Recursively handle nested dicts
|
|
nested_regular, nested_secrets = separate_secrets(value, secrets_set, full_path)
|
|
if nested_regular:
|
|
regular[key] = nested_regular
|
|
if nested_secrets:
|
|
secrets[key] = nested_secrets
|
|
elif full_path in secrets_set:
|
|
secrets[key] = value
|
|
else:
|
|
regular[key] = value
|
|
|
|
return regular, secrets
|
|
|
|
regular_config, secrets_config = separate_secrets(plugin_config, secret_fields)
|
|
|
|
# Get current configs
|
|
current_config = api_v3.config_manager.load_config()
|
|
current_secrets = api_v3.config_manager.get_raw_file_content('secrets')
|
|
|
|
# Deep merge plugin configuration in main config (preserves nested structures)
|
|
if plugin_id not in current_config:
|
|
current_config[plugin_id] = {}
|
|
|
|
# Debug logging for live_priority before merge
|
|
if plugin_id == 'football-scoreboard':
|
|
print(f"[DEBUG] Before merge - current NFL live_priority: {current_config[plugin_id].get('nfl', {}).get('live_priority')}")
|
|
print(f"[DEBUG] Before merge - regular_config NFL live_priority: {regular_config.get('nfl', {}).get('live_priority')}")
|
|
|
|
current_config[plugin_id] = deep_merge(current_config[plugin_id], regular_config)
|
|
|
|
# Debug logging for live_priority after merge
|
|
if plugin_id == 'football-scoreboard':
|
|
print(f"[DEBUG] After merge - NFL live_priority: {current_config[plugin_id].get('nfl', {}).get('live_priority')}")
|
|
print(f"[DEBUG] After merge - NCAA FB live_priority: {current_config[plugin_id].get('ncaa_fb', {}).get('live_priority')}")
|
|
|
|
# Deep merge plugin secrets in secrets config
|
|
if secrets_config:
|
|
if plugin_id not in current_secrets:
|
|
current_secrets[plugin_id] = {}
|
|
current_secrets[plugin_id] = deep_merge(current_secrets[plugin_id], secrets_config)
|
|
# Save secrets file
|
|
try:
|
|
api_v3.config_manager.save_raw_file_content('secrets', current_secrets)
|
|
except Exception as e:
|
|
# Log the error but don't fail the entire config save
|
|
import logging
|
|
logger = logging.getLogger(__name__)
|
|
logger.error(f"Error saving secrets config for {plugin_id}: {e}", exc_info=True)
|
|
# Return error response
|
|
return error_response(
|
|
ErrorCode.CONFIG_SAVE_FAILED,
|
|
f"Failed to save secrets configuration: {str(e)}",
|
|
status_code=500
|
|
)
|
|
|
|
# Save the updated main config using atomic save
|
|
success, error_msg = _save_config_atomic(api_v3.config_manager, current_config, create_backup=True)
|
|
if not success:
|
|
return error_response(
|
|
ErrorCode.CONFIG_SAVE_FAILED,
|
|
f"Failed to save configuration: {error_msg}",
|
|
status_code=500
|
|
)
|
|
|
|
# If the plugin is loaded, notify it of the config change with merged config
|
|
try:
|
|
if api_v3.plugin_manager:
|
|
plugin_instance = api_v3.plugin_manager.get_plugin(plugin_id)
|
|
if plugin_instance:
|
|
# Reload merged config (includes secrets) and pass the plugin-specific section
|
|
merged_config = api_v3.config_manager.load_config()
|
|
plugin_full_config = merged_config.get(plugin_id, {})
|
|
if hasattr(plugin_instance, 'on_config_change'):
|
|
plugin_instance.on_config_change(plugin_full_config)
|
|
|
|
# Update plugin state manager and call lifecycle methods based on enabled state
|
|
# This ensures the plugin state is synchronized with the config
|
|
enabled = plugin_full_config.get('enabled', plugin_instance.enabled)
|
|
|
|
# Update state manager if available
|
|
if api_v3.plugin_state_manager:
|
|
api_v3.plugin_state_manager.set_plugin_enabled(plugin_id, enabled)
|
|
|
|
# Call lifecycle methods to ensure plugin state matches config
|
|
try:
|
|
if enabled:
|
|
if hasattr(plugin_instance, 'on_enable'):
|
|
plugin_instance.on_enable()
|
|
else:
|
|
if hasattr(plugin_instance, 'on_disable'):
|
|
plugin_instance.on_disable()
|
|
except Exception as lifecycle_error:
|
|
# Log the error but don't fail the save - config is already saved
|
|
import logging
|
|
logging.warning(f"Lifecycle method error for {plugin_id}: {lifecycle_error}", exc_info=True)
|
|
except Exception as hook_err:
|
|
# Do not fail the save if hook fails; just log
|
|
print(f"Warning: on_config_change failed for {plugin_id}: {hook_err}")
|
|
|
|
secret_count = len(secrets_config)
|
|
message = f'Plugin {plugin_id} configuration saved successfully'
|
|
if secret_count > 0:
|
|
message += f' ({secret_count} secret field(s) saved to config_secrets.json)'
|
|
|
|
return success_response(message=message)
|
|
except Exception as e:
|
|
from src.web_interface.errors import WebInterfaceError
|
|
error = WebInterfaceError.from_exception(e, ErrorCode.CONFIG_SAVE_FAILED)
|
|
if api_v3.operation_history:
|
|
api_v3.operation_history.record_operation(
|
|
"configure",
|
|
plugin_id=data.get('plugin_id') if 'data' in locals() else None,
|
|
status="failed",
|
|
error=str(e)
|
|
)
|
|
return error_response(
|
|
error.error_code,
|
|
error.message,
|
|
details=error.details,
|
|
context=error.context,
|
|
status_code=500
|
|
)
|
|
|
|
@api_v3.route('/plugins/schema', methods=['GET'])
|
|
def get_plugin_schema():
|
|
"""Get plugin configuration schema"""
|
|
try:
|
|
plugin_id = request.args.get('plugin_id')
|
|
if not plugin_id:
|
|
return jsonify({'status': 'error', 'message': 'plugin_id required'}), 400
|
|
|
|
# Get schema manager instance
|
|
schema_mgr = api_v3.schema_manager
|
|
if not schema_mgr:
|
|
return jsonify({'status': 'error', 'message': 'Schema manager not initialized'}), 500
|
|
|
|
# Load schema using SchemaManager (uses caching)
|
|
schema = schema_mgr.load_schema(plugin_id, use_cache=True)
|
|
|
|
if schema:
|
|
return jsonify({'status': 'success', 'data': {'schema': schema}})
|
|
|
|
# Return a simple default schema if file not found
|
|
default_schema = {
|
|
'type': 'object',
|
|
'properties': {
|
|
'enabled': {
|
|
'type': 'boolean',
|
|
'title': 'Enable Plugin',
|
|
'description': 'Enable or disable this plugin',
|
|
'default': True
|
|
},
|
|
'display_duration': {
|
|
'type': 'integer',
|
|
'title': 'Display Duration',
|
|
'description': 'How long to show content (seconds)',
|
|
'minimum': 5,
|
|
'maximum': 300,
|
|
'default': 30
|
|
}
|
|
}
|
|
}
|
|
|
|
return jsonify({'status': 'success', 'data': {'schema': default_schema}})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in get_plugin_schema: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/config/reset', methods=['POST'])
|
|
def reset_plugin_config():
|
|
"""Reset plugin configuration to schema defaults"""
|
|
try:
|
|
if not api_v3.config_manager:
|
|
return jsonify({'status': 'error', 'message': 'Config manager not initialized'}), 500
|
|
|
|
data = request.get_json() or {}
|
|
plugin_id = data.get('plugin_id')
|
|
preserve_secrets = data.get('preserve_secrets', True)
|
|
|
|
if not plugin_id:
|
|
return jsonify({'status': 'error', 'message': 'plugin_id required'}), 400
|
|
|
|
# Get schema manager instance
|
|
schema_mgr = api_v3.schema_manager
|
|
if not schema_mgr:
|
|
return jsonify({'status': 'error', 'message': 'Schema manager not initialized'}), 500
|
|
|
|
# Generate defaults from schema
|
|
defaults = schema_mgr.generate_default_config(plugin_id, use_cache=True)
|
|
|
|
# Get current configs
|
|
current_config = api_v3.config_manager.load_config()
|
|
current_secrets = api_v3.config_manager.get_raw_file_content('secrets')
|
|
|
|
# Load schema to identify secret fields
|
|
schema = schema_mgr.load_schema(plugin_id, use_cache=True)
|
|
secret_fields = set()
|
|
|
|
def find_secret_fields(properties, prefix=''):
|
|
"""Recursively find fields marked with x-secret: true"""
|
|
fields = set()
|
|
if not isinstance(properties, dict):
|
|
return fields
|
|
for field_name, field_props in properties.items():
|
|
full_path = f"{prefix}.{field_name}" if prefix else field_name
|
|
if isinstance(field_props, dict) and field_props.get('x-secret', False):
|
|
fields.add(full_path)
|
|
if isinstance(field_props, dict) and field_props.get('type') == 'object' and 'properties' in field_props:
|
|
fields.update(find_secret_fields(field_props['properties'], full_path))
|
|
return fields
|
|
|
|
if schema and 'properties' in schema:
|
|
secret_fields = find_secret_fields(schema['properties'])
|
|
|
|
# Separate defaults into regular and secret configs
|
|
def separate_secrets(config, secrets_set, prefix=''):
|
|
"""Recursively separate secret fields from regular config"""
|
|
regular = {}
|
|
secrets = {}
|
|
for key, value in config.items():
|
|
full_path = f"{prefix}.{key}" if prefix else key
|
|
if isinstance(value, dict):
|
|
nested_regular, nested_secrets = separate_secrets(value, secrets_set, full_path)
|
|
if nested_regular:
|
|
regular[key] = nested_regular
|
|
if nested_secrets:
|
|
secrets[key] = nested_secrets
|
|
elif full_path in secrets_set:
|
|
secrets[key] = value
|
|
else:
|
|
regular[key] = value
|
|
return regular, secrets
|
|
|
|
default_regular, default_secrets = separate_secrets(defaults, secret_fields)
|
|
|
|
# Update main config with defaults
|
|
current_config[plugin_id] = default_regular
|
|
|
|
# Update secrets config (preserve existing secrets if preserve_secrets=True)
|
|
if preserve_secrets:
|
|
# Keep existing secrets for this plugin
|
|
if plugin_id in current_secrets:
|
|
# Merge defaults with existing secrets
|
|
existing_secrets = current_secrets[plugin_id]
|
|
for key, value in default_secrets.items():
|
|
if key not in existing_secrets or not existing_secrets[key]:
|
|
existing_secrets[key] = value
|
|
else:
|
|
current_secrets[plugin_id] = default_secrets
|
|
else:
|
|
# Replace all secrets with defaults
|
|
current_secrets[plugin_id] = default_secrets
|
|
|
|
# Save updated configs
|
|
api_v3.config_manager.save_config(current_config)
|
|
if default_secrets or not preserve_secrets:
|
|
api_v3.config_manager.save_raw_file_content('secrets', current_secrets)
|
|
|
|
# Notify plugin of config change if loaded
|
|
try:
|
|
if api_v3.plugin_manager:
|
|
plugin_instance = api_v3.plugin_manager.get_plugin(plugin_id)
|
|
if plugin_instance:
|
|
merged_config = api_v3.config_manager.load_config()
|
|
plugin_full_config = merged_config.get(plugin_id, {})
|
|
if hasattr(plugin_instance, 'on_config_change'):
|
|
plugin_instance.on_config_change(plugin_full_config)
|
|
except Exception as hook_err:
|
|
print(f"Warning: on_config_change failed for {plugin_id}: {hook_err}")
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': f'Plugin {plugin_id} configuration reset to defaults',
|
|
'data': {'config': defaults}
|
|
})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in reset_plugin_config: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/action', methods=['POST'])
|
|
def execute_plugin_action():
|
|
"""Execute a plugin-defined action (e.g., authentication)"""
|
|
try:
|
|
data = request.get_json() or {}
|
|
plugin_id = data.get('plugin_id')
|
|
action_id = data.get('action_id')
|
|
action_params = data.get('params', {})
|
|
|
|
if not plugin_id or not action_id:
|
|
return jsonify({'status': 'error', 'message': 'plugin_id and action_id required'}), 400
|
|
|
|
# Get plugin directory
|
|
if api_v3.plugin_manager:
|
|
plugin_dir = api_v3.plugin_manager.get_plugin_directory(plugin_id)
|
|
else:
|
|
plugin_dir = PROJECT_ROOT / 'plugins' / plugin_id
|
|
|
|
if not plugin_dir or not Path(plugin_dir).exists():
|
|
return jsonify({'status': 'error', 'message': f'Plugin {plugin_id} not found'}), 404
|
|
|
|
# Load manifest to get action definition
|
|
manifest_path = Path(plugin_dir) / 'manifest.json'
|
|
if not manifest_path.exists():
|
|
return jsonify({'status': 'error', 'message': 'Plugin manifest not found'}), 404
|
|
|
|
with open(manifest_path, 'r', encoding='utf-8') as f:
|
|
manifest = json.load(f)
|
|
|
|
web_ui_actions = manifest.get('web_ui_actions', [])
|
|
action_def = None
|
|
for action in web_ui_actions:
|
|
if action.get('id') == action_id:
|
|
action_def = action
|
|
break
|
|
|
|
if not action_def:
|
|
return jsonify({'status': 'error', 'message': f'Action {action_id} not found in plugin manifest'}), 404
|
|
|
|
# Set LEDMATRIX_ROOT environment variable
|
|
env = os.environ.copy()
|
|
env['LEDMATRIX_ROOT'] = str(PROJECT_ROOT)
|
|
|
|
# Execute action based on type
|
|
action_type = action_def.get('type', 'script')
|
|
|
|
if action_type == 'script':
|
|
# Execute a Python script
|
|
script_path = action_def.get('script')
|
|
if not script_path:
|
|
return jsonify({'status': 'error', 'message': 'Script path not defined for action'}), 400
|
|
|
|
script_file = Path(plugin_dir) / script_path
|
|
if not script_file.exists():
|
|
return jsonify({'status': 'error', 'message': f'Script not found: {script_path}'}), 404
|
|
|
|
# Handle multi-step actions (like Spotify OAuth)
|
|
step = action_params.get('step')
|
|
|
|
if step == '2' and action_params.get('redirect_url'):
|
|
# Step 2: Complete authentication with redirect URL
|
|
redirect_url = action_params.get('redirect_url')
|
|
import tempfile
|
|
import json as json_lib
|
|
|
|
redirect_url_escaped = json_lib.dumps(redirect_url)
|
|
with tempfile.NamedTemporaryFile(mode='w', suffix='.py', delete=False) as wrapper:
|
|
wrapper.write(f'''import sys
|
|
import subprocess
|
|
import os
|
|
|
|
# Set LEDMATRIX_ROOT
|
|
os.environ['LEDMATRIX_ROOT'] = r"{PROJECT_ROOT}"
|
|
|
|
# Run the script and provide redirect URL
|
|
proc = subprocess.Popen(
|
|
[sys.executable, r"{script_file}"],
|
|
stdin=subprocess.PIPE,
|
|
stdout=subprocess.PIPE,
|
|
stderr=subprocess.STDOUT,
|
|
text=True,
|
|
env=os.environ
|
|
)
|
|
|
|
# Send redirect URL to stdin
|
|
redirect_url = {redirect_url_escaped}
|
|
stdout, _ = proc.communicate(input=redirect_url + "\\n", timeout=120)
|
|
print(stdout)
|
|
sys.exit(proc.returncode)
|
|
''')
|
|
wrapper_path = wrapper.name
|
|
|
|
try:
|
|
result = subprocess.run(
|
|
['python3', wrapper_path],
|
|
capture_output=True,
|
|
text=True,
|
|
timeout=120,
|
|
env=env
|
|
)
|
|
os.unlink(wrapper_path)
|
|
|
|
if result.returncode == 0:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': action_def.get('success_message', 'Action completed successfully'),
|
|
'output': result.stdout
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': action_def.get('error_message', 'Action failed'),
|
|
'output': result.stdout + result.stderr
|
|
}), 400
|
|
except subprocess.TimeoutExpired:
|
|
if os.path.exists(wrapper_path):
|
|
os.unlink(wrapper_path)
|
|
return jsonify({'status': 'error', 'message': 'Action timed out'}), 408
|
|
else:
|
|
# Regular script execution - pass params via stdin if provided
|
|
if action_params:
|
|
# Pass params as JSON via stdin
|
|
import tempfile
|
|
import json as json_lib
|
|
|
|
params_json = json_lib.dumps(action_params)
|
|
with tempfile.NamedTemporaryFile(mode='w', suffix='.py', delete=False) as wrapper:
|
|
wrapper.write(f'''import sys
|
|
import subprocess
|
|
import os
|
|
import json
|
|
|
|
# Set LEDMATRIX_ROOT
|
|
os.environ['LEDMATRIX_ROOT'] = r"{PROJECT_ROOT}"
|
|
|
|
# Run the script and provide params as JSON via stdin
|
|
proc = subprocess.Popen(
|
|
[sys.executable, r"{script_file}"],
|
|
stdin=subprocess.PIPE,
|
|
stdout=subprocess.PIPE,
|
|
stderr=subprocess.STDOUT,
|
|
text=True,
|
|
env=os.environ
|
|
)
|
|
|
|
# Send params as JSON to stdin
|
|
params = {params_json}
|
|
stdout, _ = proc.communicate(input=json.dumps(params), timeout=120)
|
|
print(stdout)
|
|
sys.exit(proc.returncode)
|
|
''')
|
|
wrapper_path = wrapper.name
|
|
|
|
try:
|
|
result = subprocess.run(
|
|
['python3', wrapper_path],
|
|
capture_output=True,
|
|
text=True,
|
|
timeout=120,
|
|
env=env
|
|
)
|
|
os.unlink(wrapper_path)
|
|
|
|
# Try to parse output as JSON
|
|
try:
|
|
output_data = json.loads(result.stdout)
|
|
if result.returncode == 0:
|
|
return jsonify(output_data)
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': output_data.get('message', action_def.get('error_message', 'Action failed')),
|
|
'output': result.stdout + result.stderr
|
|
}), 400
|
|
except json.JSONDecodeError:
|
|
# Output is not JSON, return as text
|
|
if result.returncode == 0:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': action_def.get('success_message', 'Action completed successfully'),
|
|
'output': result.stdout
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': action_def.get('error_message', 'Action failed'),
|
|
'output': result.stdout + result.stderr
|
|
}), 400
|
|
except subprocess.TimeoutExpired:
|
|
if os.path.exists(wrapper_path):
|
|
os.unlink(wrapper_path)
|
|
return jsonify({'status': 'error', 'message': 'Action timed out'}), 408
|
|
else:
|
|
# No params - check for OAuth flow first, then run script normally
|
|
# Step 1: Get initial data (like auth URL)
|
|
# For OAuth flows, we might need to import the script as a module
|
|
if action_def.get('oauth_flow'):
|
|
# Import script as module to get auth URL
|
|
import sys
|
|
import importlib.util
|
|
|
|
spec = importlib.util.spec_from_file_location("plugin_action", script_file)
|
|
action_module = importlib.util.module_from_spec(spec)
|
|
sys.modules["plugin_action"] = action_module
|
|
|
|
try:
|
|
spec.loader.exec_module(action_module)
|
|
|
|
# Try to get auth URL using common patterns
|
|
auth_url = None
|
|
if hasattr(action_module, 'get_auth_url'):
|
|
auth_url = action_module.get_auth_url()
|
|
elif hasattr(action_module, 'load_spotify_credentials'):
|
|
# Spotify-specific pattern
|
|
client_id, client_secret, redirect_uri = action_module.load_spotify_credentials()
|
|
if all([client_id, client_secret, redirect_uri]):
|
|
from spotipy.oauth2 import SpotifyOAuth
|
|
sp_oauth = SpotifyOAuth(
|
|
client_id=client_id,
|
|
client_secret=client_secret,
|
|
redirect_uri=redirect_uri,
|
|
scope=getattr(action_module, 'SCOPE', ''),
|
|
cache_path=getattr(action_module, 'SPOTIFY_AUTH_CACHE_PATH', None),
|
|
open_browser=False
|
|
)
|
|
auth_url = sp_oauth.get_authorize_url()
|
|
|
|
if auth_url:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': action_def.get('step1_message', 'Authorization URL generated'),
|
|
'auth_url': auth_url,
|
|
'requires_step2': True
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Could not generate authorization URL'
|
|
}), 400
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error executing action step 1: {e}")
|
|
print(error_details)
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Error executing action: {str(e)}'
|
|
}), 500
|
|
else:
|
|
# Simple script execution
|
|
result = subprocess.run(
|
|
['python3', str(script_file)],
|
|
capture_output=True,
|
|
text=True,
|
|
timeout=60,
|
|
env=env
|
|
)
|
|
|
|
# Try to parse output as JSON
|
|
try:
|
|
import json as json_module
|
|
output_data = json_module.loads(result.stdout)
|
|
if result.returncode == 0:
|
|
return jsonify(output_data)
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': output_data.get('message', action_def.get('error_message', 'Action failed')),
|
|
'output': result.stdout + result.stderr
|
|
}), 400
|
|
except json.JSONDecodeError:
|
|
# Output is not JSON, return as text
|
|
if result.returncode == 0:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': action_def.get('success_message', 'Action completed successfully'),
|
|
'output': result.stdout
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': action_def.get('error_message', 'Action failed'),
|
|
'output': result.stdout + result.stderr
|
|
}), 400
|
|
|
|
elif action_type == 'endpoint':
|
|
# Call a plugin-defined HTTP endpoint (future feature)
|
|
return jsonify({'status': 'error', 'message': 'Endpoint actions not yet implemented'}), 501
|
|
|
|
else:
|
|
return jsonify({'status': 'error', 'message': f'Unknown action type: {action_type}'}), 400
|
|
|
|
except subprocess.TimeoutExpired:
|
|
return jsonify({'status': 'error', 'message': 'Action timed out'}), 408
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in execute_plugin_action: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/authenticate/spotify', methods=['POST'])
|
|
def authenticate_spotify():
|
|
"""Run Spotify authentication script"""
|
|
try:
|
|
data = request.get_json() or {}
|
|
redirect_url = data.get('redirect_url', '').strip()
|
|
|
|
# Get plugin directory
|
|
plugin_id = 'ledmatrix-music'
|
|
if api_v3.plugin_manager:
|
|
plugin_dir = api_v3.plugin_manager.get_plugin_directory(plugin_id)
|
|
else:
|
|
plugin_dir = PROJECT_ROOT / 'plugins' / plugin_id
|
|
|
|
if not plugin_dir or not Path(plugin_dir).exists():
|
|
return jsonify({'status': 'error', 'message': f'Plugin {plugin_id} not found'}), 404
|
|
|
|
auth_script = Path(plugin_dir) / 'authenticate_spotify.py'
|
|
if not auth_script.exists():
|
|
return jsonify({'status': 'error', 'message': 'Authentication script not found'}), 404
|
|
|
|
# Set LEDMATRIX_ROOT environment variable
|
|
env = os.environ.copy()
|
|
env['LEDMATRIX_ROOT'] = str(PROJECT_ROOT)
|
|
|
|
if redirect_url:
|
|
# Step 2: Complete authentication with redirect URL
|
|
# Create a wrapper script that provides the redirect URL as input
|
|
import tempfile
|
|
|
|
# Create a wrapper script that provides the redirect URL
|
|
import json
|
|
redirect_url_escaped = json.dumps(redirect_url) # Properly escape the URL
|
|
with tempfile.NamedTemporaryFile(mode='w', suffix='.py', delete=False) as wrapper:
|
|
wrapper.write(f'''import sys
|
|
import subprocess
|
|
import os
|
|
|
|
# Set LEDMATRIX_ROOT
|
|
os.environ['LEDMATRIX_ROOT'] = r"{PROJECT_ROOT}"
|
|
|
|
# Run the auth script and provide redirect URL
|
|
proc = subprocess.Popen(
|
|
[sys.executable, r"{auth_script}"],
|
|
stdin=subprocess.PIPE,
|
|
stdout=subprocess.PIPE,
|
|
stderr=subprocess.STDOUT,
|
|
text=True,
|
|
env=os.environ
|
|
)
|
|
|
|
# Send redirect URL to stdin
|
|
redirect_url = {redirect_url_escaped}
|
|
stdout, _ = proc.communicate(input=redirect_url + "\\n", timeout=120)
|
|
print(stdout)
|
|
sys.exit(proc.returncode)
|
|
''')
|
|
wrapper_path = wrapper.name
|
|
|
|
try:
|
|
result = subprocess.run(
|
|
['python3', wrapper_path],
|
|
capture_output=True,
|
|
text=True,
|
|
timeout=120,
|
|
env=env
|
|
)
|
|
os.unlink(wrapper_path)
|
|
|
|
if result.returncode == 0:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': 'Spotify authentication completed successfully',
|
|
'output': result.stdout
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Spotify authentication failed',
|
|
'output': result.stdout + result.stderr
|
|
}), 400
|
|
except subprocess.TimeoutExpired:
|
|
if os.path.exists(wrapper_path):
|
|
os.unlink(wrapper_path)
|
|
return jsonify({'status': 'error', 'message': 'Authentication timed out'}), 408
|
|
else:
|
|
# Step 1: Get authorization URL
|
|
# Import the script's functions directly to get the auth URL
|
|
import sys
|
|
import importlib.util
|
|
|
|
# Load the authentication script as a module
|
|
spec = importlib.util.spec_from_file_location("auth_spotify", auth_script)
|
|
auth_module = importlib.util.module_from_spec(spec)
|
|
sys.modules["auth_spotify"] = auth_module
|
|
|
|
# Set LEDMATRIX_ROOT before loading
|
|
os.environ['LEDMATRIX_ROOT'] = str(PROJECT_ROOT)
|
|
|
|
try:
|
|
spec.loader.exec_module(auth_module)
|
|
|
|
# Get credentials and create OAuth object
|
|
client_id, client_secret, redirect_uri = auth_module.load_spotify_credentials()
|
|
if not all([client_id, client_secret, redirect_uri]):
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Could not load Spotify credentials. Please check config/config_secrets.json.'
|
|
}), 400
|
|
|
|
from spotipy.oauth2 import SpotifyOAuth
|
|
sp_oauth = SpotifyOAuth(
|
|
client_id=client_id,
|
|
client_secret=client_secret,
|
|
redirect_uri=redirect_uri,
|
|
scope=auth_module.SCOPE,
|
|
cache_path=auth_module.SPOTIFY_AUTH_CACHE_PATH,
|
|
open_browser=False
|
|
)
|
|
|
|
auth_url = sp_oauth.get_authorize_url()
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': 'Authorization URL generated',
|
|
'auth_url': auth_url
|
|
})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error getting Spotify auth URL: {e}")
|
|
print(error_details)
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Error generating authorization URL: {str(e)}'
|
|
}), 500
|
|
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in authenticate_spotify: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/authenticate/ytm', methods=['POST'])
|
|
def authenticate_ytm():
|
|
"""Run YouTube Music authentication script"""
|
|
try:
|
|
# Get plugin directory
|
|
plugin_id = 'ledmatrix-music'
|
|
if api_v3.plugin_manager:
|
|
plugin_dir = api_v3.plugin_manager.get_plugin_directory(plugin_id)
|
|
else:
|
|
plugin_dir = PROJECT_ROOT / 'plugins' / plugin_id
|
|
|
|
if not plugin_dir or not Path(plugin_dir).exists():
|
|
return jsonify({'status': 'error', 'message': f'Plugin {plugin_id} not found'}), 404
|
|
|
|
auth_script = Path(plugin_dir) / 'authenticate_ytm.py'
|
|
if not auth_script.exists():
|
|
return jsonify({'status': 'error', 'message': 'Authentication script not found'}), 404
|
|
|
|
# Set LEDMATRIX_ROOT environment variable
|
|
env = os.environ.copy()
|
|
env['LEDMATRIX_ROOT'] = str(PROJECT_ROOT)
|
|
|
|
# Run the authentication script
|
|
result = subprocess.run(
|
|
['python3', str(auth_script)],
|
|
capture_output=True,
|
|
text=True,
|
|
timeout=60,
|
|
env=env
|
|
)
|
|
|
|
if result.returncode == 0:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': 'YouTube Music authentication completed successfully',
|
|
'output': result.stdout
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'YouTube Music authentication failed',
|
|
'output': result.stdout + result.stderr
|
|
}), 400
|
|
|
|
except subprocess.TimeoutExpired:
|
|
return jsonify({'status': 'error', 'message': 'Authentication timed out'}), 408
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in authenticate_ytm: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/fonts/catalog', methods=['GET'])
|
|
def get_fonts_catalog():
|
|
"""Get fonts catalog"""
|
|
try:
|
|
# Check cache first (5 minute TTL)
|
|
try:
|
|
from web_interface.cache import get_cached, set_cached
|
|
cached_result = get_cached('fonts_catalog', ttl_seconds=300)
|
|
if cached_result is not None:
|
|
return jsonify({'status': 'success', 'data': {'catalog': cached_result}})
|
|
except ImportError:
|
|
# Cache not available, continue without caching
|
|
get_cached = None
|
|
set_cached = None
|
|
|
|
# Try to import freetype, but continue without it if unavailable
|
|
try:
|
|
import freetype
|
|
freetype_available = True
|
|
except ImportError:
|
|
freetype_available = False
|
|
|
|
# Scan assets/fonts directory for actual font files
|
|
fonts_dir = PROJECT_ROOT / "assets" / "fonts"
|
|
catalog = {}
|
|
|
|
if fonts_dir.exists() and fonts_dir.is_dir():
|
|
for filename in os.listdir(fonts_dir):
|
|
if filename.endswith(('.ttf', '.otf', '.bdf')):
|
|
filepath = fonts_dir / filename
|
|
# Generate family name from filename (without extension)
|
|
family_name = os.path.splitext(filename)[0]
|
|
|
|
# Try to get font metadata using freetype (for TTF/OTF)
|
|
metadata = {}
|
|
if filename.endswith(('.ttf', '.otf')) and freetype_available:
|
|
try:
|
|
face = freetype.Face(str(filepath))
|
|
if face.valid:
|
|
# Get font family name from font file
|
|
family_name_from_font = face.family_name.decode('utf-8') if face.family_name else family_name
|
|
metadata = {
|
|
'family': family_name_from_font,
|
|
'style': face.style_name.decode('utf-8') if face.style_name else 'Regular',
|
|
'num_glyphs': face.num_glyphs,
|
|
'units_per_em': face.units_per_EM
|
|
}
|
|
# Use font's family name if available
|
|
if family_name_from_font:
|
|
family_name = family_name_from_font
|
|
except Exception:
|
|
# If freetype fails, use filename-based name
|
|
pass
|
|
|
|
# Store relative path from project root
|
|
relative_path = str(filepath.relative_to(PROJECT_ROOT))
|
|
catalog[family_name] = {
|
|
'path': relative_path,
|
|
'type': 'ttf' if filename.endswith('.ttf') else 'otf' if filename.endswith('.otf') else 'bdf',
|
|
'metadata': metadata if metadata else None
|
|
}
|
|
|
|
# Cache the result (5 minute TTL) if available
|
|
if set_cached:
|
|
try:
|
|
set_cached('fonts_catalog', catalog, ttl_seconds=300)
|
|
except Exception:
|
|
pass # Cache write failed, but continue
|
|
|
|
return jsonify({'status': 'success', 'data': {'catalog': catalog}})
|
|
except Exception as e:
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/fonts/tokens', methods=['GET'])
|
|
def get_font_tokens():
|
|
"""Get font size tokens"""
|
|
try:
|
|
# This would integrate with the actual font system
|
|
# For now, return sample tokens
|
|
tokens = {
|
|
'xs': 6,
|
|
'sm': 8,
|
|
'md': 10,
|
|
'lg': 12,
|
|
'xl': 14,
|
|
'xxl': 16
|
|
}
|
|
return jsonify({'status': 'success', 'data': {'tokens': tokens}})
|
|
except Exception as e:
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/fonts/overrides', methods=['GET'])
|
|
def get_fonts_overrides():
|
|
"""Get font overrides"""
|
|
try:
|
|
# This would integrate with the actual font system
|
|
# For now, return empty overrides
|
|
overrides = {}
|
|
return jsonify({'status': 'success', 'data': {'overrides': overrides}})
|
|
except Exception as e:
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/fonts/overrides', methods=['POST'])
|
|
def save_fonts_overrides():
|
|
"""Save font overrides"""
|
|
try:
|
|
data = request.get_json()
|
|
if not data:
|
|
return jsonify({'status': 'error', 'message': 'No data provided'}), 400
|
|
|
|
# This would integrate with the actual font system
|
|
return jsonify({'status': 'success', 'message': 'Font overrides saved'})
|
|
except Exception as e:
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/fonts/overrides/<element_key>', methods=['DELETE'])
|
|
def delete_font_override(element_key):
|
|
"""Delete font override"""
|
|
try:
|
|
# This would integrate with the actual font system
|
|
return jsonify({'status': 'success', 'message': f'Font override for {element_key} deleted'})
|
|
except Exception as e:
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/fonts/upload', methods=['POST'])
|
|
def upload_font():
|
|
"""Upload font file"""
|
|
try:
|
|
if 'font_file' not in request.files:
|
|
return jsonify({'status': 'error', 'message': 'No font file provided'}), 400
|
|
|
|
font_file = request.files['font_file']
|
|
if font_file.filename == '':
|
|
return jsonify({'status': 'error', 'message': 'No file selected'}), 400
|
|
|
|
# Validate filename
|
|
is_valid, error_msg = validate_file_upload(
|
|
font_file.filename,
|
|
max_size_mb=10,
|
|
allowed_extensions=['.ttf', '.otf', '.bdf']
|
|
)
|
|
if not is_valid:
|
|
return jsonify({'status': 'error', 'message': error_msg}), 400
|
|
|
|
font_file = request.files['font_file']
|
|
font_family = request.form.get('font_family', '')
|
|
|
|
if not font_file or not font_family:
|
|
return jsonify({'status': 'error', 'message': 'Font file and family name required'}), 400
|
|
|
|
# Validate file type
|
|
allowed_extensions = ['.ttf', '.bdf']
|
|
file_extension = font_file.filename.lower().split('.')[-1]
|
|
if f'.{file_extension}' not in allowed_extensions:
|
|
return jsonify({'status': 'error', 'message': 'Only .ttf and .bdf files are allowed'}), 400
|
|
|
|
# Validate font family name
|
|
if not font_family.replace('_', '').replace('-', '').isalnum():
|
|
return jsonify({'status': 'error', 'message': 'Font family name must contain only letters, numbers, underscores, and hyphens'}), 400
|
|
|
|
# This would integrate with the actual font system to save the file
|
|
# For now, just return success
|
|
return jsonify({'status': 'success', 'message': f'Font {font_family} uploaded successfully', 'font_family': font_family})
|
|
except Exception as e:
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/assets/upload', methods=['POST'])
|
|
def upload_plugin_asset():
|
|
"""Upload asset files for a plugin"""
|
|
try:
|
|
plugin_id = request.form.get('plugin_id')
|
|
if not plugin_id:
|
|
return jsonify({'status': 'error', 'message': 'plugin_id is required'}), 400
|
|
|
|
if 'files' not in request.files:
|
|
return jsonify({'status': 'error', 'message': 'No files provided'}), 400
|
|
|
|
files = request.files.getlist('files')
|
|
if not files or all(not f.filename for f in files):
|
|
return jsonify({'status': 'error', 'message': 'No files provided'}), 400
|
|
|
|
# Validate file count
|
|
if len(files) > 10:
|
|
return jsonify({'status': 'error', 'message': 'Maximum 10 files per upload'}), 400
|
|
|
|
# Setup plugin assets directory
|
|
assets_dir = PROJECT_ROOT / 'assets' / 'plugins' / plugin_id / 'uploads'
|
|
assets_dir.mkdir(parents=True, exist_ok=True)
|
|
|
|
# Load metadata file
|
|
metadata_file = assets_dir / '.metadata.json'
|
|
if metadata_file.exists():
|
|
with open(metadata_file, 'r') as f:
|
|
metadata = json.load(f)
|
|
else:
|
|
metadata = {}
|
|
|
|
uploaded_files = []
|
|
total_size = 0
|
|
max_size_per_file = 5 * 1024 * 1024 # 5MB
|
|
max_total_size = 50 * 1024 * 1024 # 50MB
|
|
|
|
# Calculate current total size
|
|
for entry in metadata.values():
|
|
if 'size' in entry:
|
|
total_size += entry.get('size', 0)
|
|
|
|
for file in files:
|
|
if not file.filename:
|
|
continue
|
|
|
|
# Validate file type
|
|
allowed_extensions = ['.png', '.jpg', '.jpeg', '.bmp', '.gif']
|
|
file_ext = '.' + file.filename.lower().split('.')[-1]
|
|
if file_ext not in allowed_extensions:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Invalid file type: {file_ext}. Allowed: {allowed_extensions}'
|
|
}), 400
|
|
|
|
# Read file to check size and validate
|
|
file.seek(0, os.SEEK_END)
|
|
file_size = file.tell()
|
|
file.seek(0)
|
|
|
|
if file_size > max_size_per_file:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'File {file.filename} exceeds 5MB limit'
|
|
}), 400
|
|
|
|
if total_size + file_size > max_total_size:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Upload would exceed 50MB total storage limit'
|
|
}), 400
|
|
|
|
# Validate file is actually an image (check magic bytes)
|
|
file_content = file.read(8)
|
|
file.seek(0)
|
|
is_valid_image = False
|
|
if file_content.startswith(b'\x89PNG\r\n\x1a\n'): # PNG
|
|
is_valid_image = True
|
|
elif file_content[:2] == b'\xff\xd8': # JPEG
|
|
is_valid_image = True
|
|
elif file_content[:2] == b'BM': # BMP
|
|
is_valid_image = True
|
|
elif file_content[:6] in [b'GIF87a', b'GIF89a']: # GIF
|
|
is_valid_image = True
|
|
|
|
if not is_valid_image:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'File {file.filename} is not a valid image file'
|
|
}), 400
|
|
|
|
# Generate unique filename
|
|
timestamp = int(time.time())
|
|
file_hash = hashlib.md5(file_content + file.filename.encode()).hexdigest()[:8]
|
|
safe_filename = f"image_{timestamp}_{file_hash}{file_ext}"
|
|
file_path = assets_dir / safe_filename
|
|
|
|
# Ensure filename is unique
|
|
counter = 1
|
|
while file_path.exists():
|
|
safe_filename = f"image_{timestamp}_{file_hash}_{counter}{file_ext}"
|
|
file_path = assets_dir / safe_filename
|
|
counter += 1
|
|
|
|
# Save file
|
|
file.save(str(file_path))
|
|
|
|
# Make file readable
|
|
os.chmod(file_path, 0o644)
|
|
|
|
# Generate unique ID
|
|
image_id = str(uuid.uuid4())
|
|
|
|
# Store metadata
|
|
relative_path = f"assets/plugins/{plugin_id}/uploads/{safe_filename}"
|
|
metadata[image_id] = {
|
|
'id': image_id,
|
|
'filename': safe_filename,
|
|
'path': relative_path,
|
|
'size': file_size,
|
|
'uploaded_at': datetime.utcnow().isoformat() + 'Z',
|
|
'original_filename': file.filename
|
|
}
|
|
|
|
uploaded_files.append({
|
|
'id': image_id,
|
|
'filename': safe_filename,
|
|
'path': relative_path,
|
|
'size': file_size,
|
|
'uploaded_at': metadata[image_id]['uploaded_at']
|
|
})
|
|
|
|
total_size += file_size
|
|
|
|
# Save metadata
|
|
with open(metadata_file, 'w') as f:
|
|
json.dump(metadata, f, indent=2)
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'uploaded_files': uploaded_files,
|
|
'total_files': len(metadata)
|
|
})
|
|
|
|
except Exception as e:
|
|
import traceback
|
|
return jsonify({'status': 'error', 'message': str(e), 'traceback': traceback.format_exc()}), 500
|
|
|
|
@api_v3.route('/plugins/of-the-day/json/upload', methods=['POST'])
|
|
def upload_of_the_day_json():
|
|
"""Upload JSON files for of-the-day plugin"""
|
|
try:
|
|
if 'files' not in request.files:
|
|
return jsonify({'status': 'error', 'message': 'No files provided'}), 400
|
|
|
|
files = request.files.getlist('files')
|
|
if not files or all(not f.filename for f in files):
|
|
return jsonify({'status': 'error', 'message': 'No files provided'}), 400
|
|
|
|
# Get plugin directory
|
|
plugin_id = 'ledmatrix-of-the-day'
|
|
if api_v3.plugin_manager:
|
|
plugin_dir = api_v3.plugin_manager.get_plugin_directory(plugin_id)
|
|
else:
|
|
plugin_dir = PROJECT_ROOT / 'plugins' / plugin_id
|
|
|
|
if not plugin_dir or not Path(plugin_dir).exists():
|
|
return jsonify({'status': 'error', 'message': f'Plugin {plugin_id} not found'}), 404
|
|
|
|
# Setup of_the_day directory
|
|
data_dir = Path(plugin_dir) / 'of_the_day'
|
|
data_dir.mkdir(parents=True, exist_ok=True)
|
|
|
|
uploaded_files = []
|
|
max_size_per_file = 5 * 1024 * 1024 # 5MB
|
|
|
|
for file in files:
|
|
if not file.filename:
|
|
continue
|
|
|
|
# Validate file extension
|
|
if not file.filename.lower().endswith('.json'):
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'File {file.filename} must be a JSON file (.json)'
|
|
}), 400
|
|
|
|
# Read and validate file size
|
|
file.seek(0, os.SEEK_END)
|
|
file_size = file.tell()
|
|
file.seek(0)
|
|
|
|
if file_size > max_size_per_file:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'File {file.filename} exceeds 5MB limit'
|
|
}), 400
|
|
|
|
# Read and validate JSON content
|
|
try:
|
|
file_content = file.read().decode('utf-8')
|
|
json_data = json.loads(file_content)
|
|
except json.JSONDecodeError as e:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Invalid JSON in {file.filename}: {str(e)}'
|
|
}), 400
|
|
except UnicodeDecodeError:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'File {file.filename} is not valid UTF-8 text'
|
|
}), 400
|
|
|
|
# Validate JSON structure (must be object with day number keys)
|
|
if not isinstance(json_data, dict):
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'JSON in {file.filename} must be an object with day numbers (1-365) as keys'
|
|
}), 400
|
|
|
|
# Check if keys are valid day numbers
|
|
for key in json_data.keys():
|
|
try:
|
|
day_num = int(key)
|
|
if day_num < 1 or day_num > 365:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Day number {day_num} in {file.filename} is out of range (must be 1-365)'
|
|
}), 400
|
|
except ValueError:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Invalid key "{key}" in {file.filename}: must be a day number (1-365)'
|
|
}), 400
|
|
|
|
# Generate safe filename from original (preserve user's filename)
|
|
original_filename = file.filename
|
|
safe_filename = original_filename.lower().replace(' ', '_')
|
|
# Ensure it's a valid filename
|
|
safe_filename = ''.join(c for c in safe_filename if c.isalnum() or c in '._-')
|
|
if not safe_filename.endswith('.json'):
|
|
safe_filename += '.json'
|
|
|
|
file_path = data_dir / safe_filename
|
|
|
|
# If file exists, add counter
|
|
counter = 1
|
|
base_name = safe_filename.replace('.json', '')
|
|
while file_path.exists():
|
|
safe_filename = f"{base_name}_{counter}.json"
|
|
file_path = data_dir / safe_filename
|
|
counter += 1
|
|
|
|
# Save file
|
|
with open(file_path, 'w', encoding='utf-8') as f:
|
|
json.dump(json_data, f, indent=2, ensure_ascii=False)
|
|
|
|
# Make file readable
|
|
os.chmod(file_path, 0o644)
|
|
|
|
# Extract category name from filename (remove .json extension)
|
|
category_name = safe_filename.replace('.json', '')
|
|
display_name = category_name.replace('_', ' ').title()
|
|
|
|
# Update plugin config to add category
|
|
try:
|
|
sys.path.insert(0, str(plugin_dir))
|
|
from scripts.update_config import add_category_to_config
|
|
add_category_to_config(category_name, f'of_the_day/{safe_filename}', display_name)
|
|
except Exception as e:
|
|
print(f"Warning: Could not update config: {e}")
|
|
# Continue anyway - file is uploaded
|
|
|
|
# Generate file ID (use category name as ID for simplicity)
|
|
file_id = category_name
|
|
|
|
uploaded_files.append({
|
|
'id': file_id,
|
|
'filename': safe_filename,
|
|
'original_filename': original_filename,
|
|
'path': f'of_the_day/{safe_filename}',
|
|
'size': file_size,
|
|
'uploaded_at': datetime.utcnow().isoformat() + 'Z',
|
|
'category_name': category_name,
|
|
'display_name': display_name,
|
|
'entry_count': len(json_data)
|
|
})
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'uploaded_files': uploaded_files,
|
|
'total_files': len(uploaded_files)
|
|
})
|
|
|
|
except Exception as e:
|
|
import traceback
|
|
return jsonify({'status': 'error', 'message': str(e), 'traceback': traceback.format_exc()}), 500
|
|
|
|
@api_v3.route('/plugins/of-the-day/json/delete', methods=['POST'])
|
|
def delete_of_the_day_json():
|
|
"""Delete a JSON file from of-the-day plugin"""
|
|
try:
|
|
data = request.get_json() or {}
|
|
file_id = data.get('file_id') # This is the category_name
|
|
|
|
if not file_id:
|
|
return jsonify({'status': 'error', 'message': 'file_id is required'}), 400
|
|
|
|
# Get plugin directory
|
|
plugin_id = 'ledmatrix-of-the-day'
|
|
if api_v3.plugin_manager:
|
|
plugin_dir = api_v3.plugin_manager.get_plugin_directory(plugin_id)
|
|
else:
|
|
plugin_dir = PROJECT_ROOT / 'plugins' / plugin_id
|
|
|
|
if not plugin_dir or not Path(plugin_dir).exists():
|
|
return jsonify({'status': 'error', 'message': f'Plugin {plugin_id} not found'}), 404
|
|
|
|
data_dir = Path(plugin_dir) / 'of_the_day'
|
|
filename = f"{file_id}.json"
|
|
file_path = data_dir / filename
|
|
|
|
if not file_path.exists():
|
|
return jsonify({'status': 'error', 'message': f'File {filename} not found'}), 404
|
|
|
|
# Delete file
|
|
file_path.unlink()
|
|
|
|
# Update config to remove category
|
|
try:
|
|
sys.path.insert(0, str(plugin_dir))
|
|
from scripts.update_config import remove_category_from_config
|
|
remove_category_from_config(file_id)
|
|
except Exception as e:
|
|
print(f"Warning: Could not update config: {e}")
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': f'File {filename} deleted successfully'
|
|
})
|
|
|
|
except Exception as e:
|
|
import traceback
|
|
return jsonify({'status': 'error', 'message': str(e), 'traceback': traceback.format_exc()}), 500
|
|
|
|
@api_v3.route('/plugins/<plugin_id>/static/<path:file_path>', methods=['GET'])
|
|
def serve_plugin_static(plugin_id, file_path):
|
|
"""Serve static files from plugin directory"""
|
|
try:
|
|
# Get plugin directory
|
|
if api_v3.plugin_manager:
|
|
plugin_dir = api_v3.plugin_manager.get_plugin_directory(plugin_id)
|
|
else:
|
|
plugin_dir = PROJECT_ROOT / 'plugins' / plugin_id
|
|
|
|
if not plugin_dir or not Path(plugin_dir).exists():
|
|
return jsonify({'status': 'error', 'message': f'Plugin {plugin_id} not found'}), 404
|
|
|
|
# Resolve file path (prevent directory traversal)
|
|
plugin_dir = Path(plugin_dir).resolve()
|
|
requested_file = (plugin_dir / file_path).resolve()
|
|
|
|
# Security check: ensure file is within plugin directory
|
|
if not str(requested_file).startswith(str(plugin_dir)):
|
|
return jsonify({'status': 'error', 'message': 'Invalid file path'}), 403
|
|
|
|
# Check if file exists
|
|
if not requested_file.exists() or not requested_file.is_file():
|
|
return jsonify({'status': 'error', 'message': 'File not found'}), 404
|
|
|
|
# Determine content type
|
|
content_type = 'text/plain'
|
|
if file_path.endswith('.html'):
|
|
content_type = 'text/html'
|
|
elif file_path.endswith('.js'):
|
|
content_type = 'application/javascript'
|
|
elif file_path.endswith('.css'):
|
|
content_type = 'text/css'
|
|
elif file_path.endswith('.json'):
|
|
content_type = 'application/json'
|
|
|
|
# Read and return file
|
|
with open(requested_file, 'r', encoding='utf-8') as f:
|
|
content = f.read()
|
|
|
|
return Response(content, mimetype=content_type)
|
|
|
|
except Exception as e:
|
|
import traceback
|
|
return jsonify({'status': 'error', 'message': str(e), 'traceback': traceback.format_exc()}), 500
|
|
|
|
@api_v3.route('/plugins/calendar/upload-credentials', methods=['POST'])
|
|
def upload_calendar_credentials():
|
|
"""Upload credentials.json file for calendar plugin"""
|
|
try:
|
|
if 'file' not in request.files:
|
|
return jsonify({'status': 'error', 'message': 'No file provided'}), 400
|
|
|
|
file = request.files['file']
|
|
if not file or not file.filename:
|
|
return jsonify({'status': 'error', 'message': 'No file provided'}), 400
|
|
|
|
# Validate file extension
|
|
if not file.filename.lower().endswith('.json'):
|
|
return jsonify({'status': 'error', 'message': 'File must be a JSON file (.json)'}), 400
|
|
|
|
# Validate file size (max 1MB for credentials)
|
|
file.seek(0, os.SEEK_END)
|
|
file_size = file.tell()
|
|
file.seek(0)
|
|
|
|
if file_size > 1024 * 1024: # 1MB
|
|
return jsonify({'status': 'error', 'message': 'File exceeds 1MB limit'}), 400
|
|
|
|
# Validate it's valid JSON
|
|
try:
|
|
file_content = file.read()
|
|
file.seek(0)
|
|
json.loads(file_content)
|
|
except json.JSONDecodeError:
|
|
return jsonify({'status': 'error', 'message': 'File is not valid JSON'}), 400
|
|
|
|
# Validate it looks like Google OAuth credentials
|
|
try:
|
|
file.seek(0)
|
|
creds_data = json.loads(file.read())
|
|
file.seek(0)
|
|
|
|
# Check for required Google OAuth fields
|
|
if 'installed' not in creds_data and 'web' not in creds_data:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'File does not appear to be a valid Google OAuth credentials file'
|
|
}), 400
|
|
except Exception:
|
|
pass # Continue even if validation fails
|
|
|
|
# Get plugin directory
|
|
plugin_id = 'calendar'
|
|
if api_v3.plugin_manager:
|
|
plugin_dir = api_v3.plugin_manager.get_plugin_directory(plugin_id)
|
|
else:
|
|
plugin_dir = PROJECT_ROOT / 'plugins' / plugin_id
|
|
|
|
if not plugin_dir or not Path(plugin_dir).exists():
|
|
return jsonify({'status': 'error', 'message': f'Plugin {plugin_id} not found'}), 404
|
|
|
|
# Save file to plugin directory
|
|
credentials_path = Path(plugin_dir) / 'credentials.json'
|
|
|
|
# Backup existing file if it exists
|
|
if credentials_path.exists():
|
|
backup_path = Path(plugin_dir) / f'credentials.json.backup.{int(time.time())}'
|
|
import shutil
|
|
shutil.copy2(credentials_path, backup_path)
|
|
|
|
# Save new file
|
|
file.save(str(credentials_path))
|
|
|
|
# Set proper permissions
|
|
os.chmod(credentials_path, 0o600) # Read/write for owner only
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': 'Credentials file uploaded successfully',
|
|
'path': str(credentials_path)
|
|
})
|
|
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in upload_calendar_credentials: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/plugins/assets/delete', methods=['POST'])
|
|
def delete_plugin_asset():
|
|
"""Delete an asset file for a plugin"""
|
|
try:
|
|
data = request.get_json()
|
|
plugin_id = data.get('plugin_id')
|
|
image_id = data.get('image_id')
|
|
|
|
if not plugin_id or not image_id:
|
|
return jsonify({'status': 'error', 'message': 'plugin_id and image_id are required'}), 400
|
|
|
|
# Get asset directory
|
|
assets_dir = PROJECT_ROOT / 'assets' / 'plugins' / plugin_id / 'uploads'
|
|
metadata_file = assets_dir / '.metadata.json'
|
|
|
|
if not metadata_file.exists():
|
|
return jsonify({'status': 'error', 'message': 'Metadata file not found'}), 404
|
|
|
|
# Load metadata
|
|
with open(metadata_file, 'r') as f:
|
|
metadata = json.load(f)
|
|
|
|
if image_id not in metadata:
|
|
return jsonify({'status': 'error', 'message': 'Image not found'}), 404
|
|
|
|
# Delete file
|
|
file_path = PROJECT_ROOT / metadata[image_id]['path']
|
|
if file_path.exists():
|
|
file_path.unlink()
|
|
|
|
# Remove from metadata
|
|
del metadata[image_id]
|
|
|
|
# Save metadata
|
|
with open(metadata_file, 'w') as f:
|
|
json.dump(metadata, f, indent=2)
|
|
|
|
return jsonify({'status': 'success', 'message': 'Image deleted successfully'})
|
|
|
|
except Exception as e:
|
|
import traceback
|
|
return jsonify({'status': 'error', 'message': str(e), 'traceback': traceback.format_exc()}), 500
|
|
|
|
@api_v3.route('/plugins/assets/list', methods=['GET'])
|
|
def list_plugin_assets():
|
|
"""List asset files for a plugin"""
|
|
try:
|
|
plugin_id = request.args.get('plugin_id')
|
|
if not plugin_id:
|
|
return jsonify({'status': 'error', 'message': 'plugin_id is required'}), 400
|
|
|
|
# Get asset directory
|
|
assets_dir = PROJECT_ROOT / 'assets' / 'plugins' / plugin_id / 'uploads'
|
|
metadata_file = assets_dir / '.metadata.json'
|
|
|
|
if not metadata_file.exists():
|
|
return jsonify({'status': 'success', 'data': {'assets': []}})
|
|
|
|
# Load metadata
|
|
with open(metadata_file, 'r') as f:
|
|
metadata = json.load(f)
|
|
|
|
# Convert to list
|
|
assets = list(metadata.values())
|
|
|
|
return jsonify({'status': 'success', 'data': {'assets': assets}})
|
|
|
|
except Exception as e:
|
|
import traceback
|
|
return jsonify({'status': 'error', 'message': str(e), 'traceback': traceback.format_exc()}), 500
|
|
|
|
@api_v3.route('/fonts/delete/<font_family>', methods=['DELETE'])
|
|
def delete_font(font_family):
|
|
"""Delete font"""
|
|
try:
|
|
# This would integrate with the actual font system
|
|
return jsonify({'status': 'success', 'message': f'Font {font_family} deleted'})
|
|
except Exception as e:
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/logs', methods=['GET'])
|
|
def get_logs():
|
|
"""Get system logs from journalctl"""
|
|
try:
|
|
# Get recent logs from journalctl
|
|
result = subprocess.run(
|
|
['sudo', 'journalctl', '-u', 'ledmatrix.service', '-n', '100', '--no-pager'],
|
|
capture_output=True,
|
|
text=True,
|
|
timeout=5
|
|
)
|
|
|
|
if result.returncode == 0:
|
|
logs_text = result.stdout.strip()
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': {
|
|
'logs': logs_text if logs_text else 'No logs available from ledmatrix service'
|
|
}
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Failed to get logs: {result.stderr}'
|
|
}), 500
|
|
|
|
except subprocess.TimeoutExpired:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Timeout while fetching logs'
|
|
}), 500
|
|
except Exception as e:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Error fetching logs: {str(e)}'
|
|
}), 500
|
|
|
|
# WiFi Management Endpoints
|
|
@api_v3.route('/wifi/status', methods=['GET'])
|
|
def get_wifi_status():
|
|
"""Get current WiFi connection status"""
|
|
try:
|
|
from src.wifi_manager import WiFiManager
|
|
|
|
wifi_manager = WiFiManager()
|
|
status = wifi_manager.get_wifi_status()
|
|
|
|
# Get auto-enable setting from config
|
|
auto_enable_ap = wifi_manager.config.get("auto_enable_ap_mode", True) # Default: True (safe due to grace period)
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': {
|
|
'connected': status.connected,
|
|
'ssid': status.ssid,
|
|
'ip_address': status.ip_address,
|
|
'signal': status.signal,
|
|
'ap_mode_active': status.ap_mode_active,
|
|
'auto_enable_ap_mode': auto_enable_ap
|
|
}
|
|
})
|
|
except Exception as e:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Error getting WiFi status: {str(e)}'
|
|
}), 500
|
|
|
|
@api_v3.route('/wifi/scan', methods=['GET'])
|
|
def scan_wifi_networks():
|
|
"""Scan for available WiFi networks
|
|
|
|
If AP mode is active, it will be temporarily disabled during scanning
|
|
and automatically re-enabled afterward. Users connected to the AP will
|
|
be briefly disconnected during this process.
|
|
"""
|
|
try:
|
|
from src.wifi_manager import WiFiManager
|
|
|
|
wifi_manager = WiFiManager()
|
|
|
|
# Check if AP mode is active before scanning (for user notification)
|
|
ap_was_active = wifi_manager._is_ap_mode_active()
|
|
|
|
# Perform the scan (this will handle AP mode disabling/enabling internally)
|
|
networks = wifi_manager.scan_networks()
|
|
|
|
# Convert to dict format
|
|
networks_data = [
|
|
{
|
|
'ssid': net.ssid,
|
|
'signal': net.signal,
|
|
'security': net.security,
|
|
'frequency': net.frequency
|
|
}
|
|
for net in networks
|
|
]
|
|
|
|
response_data = {
|
|
'status': 'success',
|
|
'data': networks_data
|
|
}
|
|
|
|
# Inform user if AP mode was temporarily disabled
|
|
if ap_was_active:
|
|
response_data['message'] = (
|
|
f'Found {len(networks_data)} networks. '
|
|
'Note: AP mode was temporarily disabled during scanning and has been re-enabled. '
|
|
'If you were connected to the setup network, you may need to reconnect.'
|
|
)
|
|
|
|
return jsonify(response_data)
|
|
except Exception as e:
|
|
error_message = f'Error scanning WiFi networks: {str(e)}'
|
|
|
|
# Provide more specific error messages for common issues
|
|
error_str = str(e).lower()
|
|
if 'permission' in error_str or 'sudo' in error_str:
|
|
error_message = (
|
|
'Permission error while scanning. '
|
|
'The WiFi scan requires appropriate permissions. '
|
|
'Please ensure the application has necessary privileges.'
|
|
)
|
|
elif 'timeout' in error_str:
|
|
error_message = (
|
|
'WiFi scan timed out. '
|
|
'The scan took too long to complete. '
|
|
'This may happen if the WiFi interface is busy or in use.'
|
|
)
|
|
elif 'no wifi' in error_str or 'not available' in error_str:
|
|
error_message = (
|
|
'WiFi scanning tools are not available. '
|
|
'Please ensure NetworkManager (nmcli) or iwlist is installed.'
|
|
)
|
|
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': error_message
|
|
}), 500
|
|
|
|
@api_v3.route('/wifi/connect', methods=['POST'])
|
|
def connect_wifi():
|
|
"""Connect to a WiFi network"""
|
|
try:
|
|
from src.wifi_manager import WiFiManager
|
|
|
|
data = request.get_json()
|
|
if not data:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'Request body is required'
|
|
}), 400
|
|
|
|
if 'ssid' not in data:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'SSID is required'
|
|
}), 400
|
|
|
|
ssid = data['ssid']
|
|
if not ssid or not ssid.strip():
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'SSID cannot be empty'
|
|
}), 400
|
|
|
|
ssid = ssid.strip()
|
|
password = data.get('password', '') or ''
|
|
|
|
wifi_manager = WiFiManager()
|
|
success, message = wifi_manager.connect_to_network(ssid, password)
|
|
|
|
if success:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': message
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': message or 'Failed to connect to network'
|
|
}), 400
|
|
except Exception as e:
|
|
import logging
|
|
import traceback
|
|
logger = logging.getLogger(__name__)
|
|
logger.error(f"Error connecting to WiFi: {e}\n{traceback.format_exc()}")
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Error connecting to WiFi: {str(e)}'
|
|
}), 500
|
|
|
|
@api_v3.route('/wifi/disconnect', methods=['POST'])
|
|
def disconnect_wifi():
|
|
"""Disconnect from the current WiFi network"""
|
|
try:
|
|
from src.wifi_manager import WiFiManager
|
|
|
|
wifi_manager = WiFiManager()
|
|
success, message = wifi_manager.disconnect_from_network()
|
|
|
|
if success:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': message
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': message or 'Failed to disconnect from network'
|
|
}), 400
|
|
except Exception as e:
|
|
import logging
|
|
import traceback
|
|
logger = logging.getLogger(__name__)
|
|
logger.error(f"Error disconnecting from WiFi: {e}\n{traceback.format_exc()}")
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Error disconnecting from WiFi: {str(e)}'
|
|
}), 500
|
|
|
|
@api_v3.route('/wifi/ap/enable', methods=['POST'])
|
|
def enable_ap_mode():
|
|
"""Enable access point mode"""
|
|
try:
|
|
from src.wifi_manager import WiFiManager
|
|
|
|
wifi_manager = WiFiManager()
|
|
success, message = wifi_manager.enable_ap_mode()
|
|
|
|
if success:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': message
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': message
|
|
}), 400
|
|
except Exception as e:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Error enabling AP mode: {str(e)}'
|
|
}), 500
|
|
|
|
@api_v3.route('/wifi/ap/disable', methods=['POST'])
|
|
def disable_ap_mode():
|
|
"""Disable access point mode"""
|
|
try:
|
|
from src.wifi_manager import WiFiManager
|
|
|
|
wifi_manager = WiFiManager()
|
|
success, message = wifi_manager.disable_ap_mode()
|
|
|
|
if success:
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': message
|
|
})
|
|
else:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': message
|
|
}), 400
|
|
except Exception as e:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Error disabling AP mode: {str(e)}'
|
|
}), 500
|
|
|
|
@api_v3.route('/wifi/ap/auto-enable', methods=['GET'])
|
|
def get_auto_enable_ap_mode():
|
|
"""Get auto-enable AP mode setting"""
|
|
try:
|
|
from src.wifi_manager import WiFiManager
|
|
|
|
wifi_manager = WiFiManager()
|
|
auto_enable = wifi_manager.config.get("auto_enable_ap_mode", True) # Default: True (safe due to grace period)
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': {
|
|
'auto_enable_ap_mode': auto_enable
|
|
}
|
|
})
|
|
except Exception as e:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Error getting auto-enable setting: {str(e)}'
|
|
}), 500
|
|
|
|
@api_v3.route('/wifi/ap/auto-enable', methods=['POST'])
|
|
def set_auto_enable_ap_mode():
|
|
"""Set auto-enable AP mode setting"""
|
|
try:
|
|
from src.wifi_manager import WiFiManager
|
|
|
|
data = request.get_json()
|
|
if data is None or 'auto_enable_ap_mode' not in data:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': 'auto_enable_ap_mode is required'
|
|
}), 400
|
|
|
|
auto_enable = bool(data['auto_enable_ap_mode'])
|
|
|
|
wifi_manager = WiFiManager()
|
|
wifi_manager.config["auto_enable_ap_mode"] = auto_enable
|
|
wifi_manager._save_config()
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': f'Auto-enable AP mode set to {auto_enable}',
|
|
'data': {
|
|
'auto_enable_ap_mode': auto_enable
|
|
}
|
|
})
|
|
except Exception as e:
|
|
return jsonify({
|
|
'status': 'error',
|
|
'message': f'Error setting auto-enable: {str(e)}'
|
|
}), 500
|
|
|
|
@api_v3.route('/cache/list', methods=['GET'])
|
|
def list_cache_files():
|
|
"""List all cache files with metadata"""
|
|
try:
|
|
if not api_v3.cache_manager:
|
|
# Initialize cache manager if not already initialized
|
|
from src.cache_manager import CacheManager
|
|
api_v3.cache_manager = CacheManager()
|
|
|
|
cache_files = api_v3.cache_manager.list_cache_files()
|
|
cache_dir = api_v3.cache_manager.get_cache_dir()
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'data': {
|
|
'cache_files': cache_files,
|
|
'cache_dir': cache_dir,
|
|
'total_files': len(cache_files)
|
|
}
|
|
})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in list_cache_files: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500
|
|
|
|
@api_v3.route('/cache/delete', methods=['POST'])
|
|
def delete_cache_file():
|
|
"""Delete a specific cache file by key"""
|
|
try:
|
|
if not api_v3.cache_manager:
|
|
# Initialize cache manager if not already initialized
|
|
from src.cache_manager import CacheManager
|
|
api_v3.cache_manager = CacheManager()
|
|
|
|
data = request.get_json()
|
|
if not data or 'key' not in data:
|
|
return jsonify({'status': 'error', 'message': 'cache key is required'}), 400
|
|
|
|
cache_key = data['key']
|
|
|
|
# Delete the cache file
|
|
api_v3.cache_manager.clear_cache(cache_key)
|
|
|
|
return jsonify({
|
|
'status': 'success',
|
|
'message': f'Cache file for key "{cache_key}" deleted successfully'
|
|
})
|
|
except Exception as e:
|
|
import traceback
|
|
error_details = traceback.format_exc()
|
|
print(f"Error in delete_cache_file: {str(e)}")
|
|
print(error_details)
|
|
return jsonify({'status': 'error', 'message': str(e)}), 500 |