mirror of
https://github.com/ChuckBuilds/LEDMatrix.git
synced 2026-04-10 21:03:01 +00:00
* feat: integrate Starlark/Tronbyte app support into plugin system

  Add starlark-apps plugin that renders Tidbyt/Tronbyte .star apps via the Pixlet binary and integrates them into the existing Plugin Manager UI as virtual plugins. Includes vegas scroll support, Tronbyte repository browsing, and per-app configuration.

  - Extract working starlark plugin code from starlark branch onto fresh main
  - Fix plugin conventions (get_logger, VegasDisplayMode, BasePlugin)
  - Add 13 starlark API endpoints to api_v3.py (CRUD, browse, install, render)
  - Virtual plugin entries (starlark:<app_id>) in installed plugins list
  - Starlark-aware toggle and config routing in pages_v3.py
  - Tronbyte repository browser section in Plugin Store UI
  - Pixlet binary download script (scripts/download_pixlet.sh)

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(starlark): use bare imports instead of relative imports

  The plugin loader uses spec_from_file_location without package context, so relative imports (.pixlet_renderer) fail. Use bare imports like all other plugins do.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(starlark): make API endpoints work standalone in web service

  The web service runs as a separate process with display_manager=None, so plugins aren't instantiated. Refactor starlark API endpoints to read/write the manifest file directly when the plugin isn't loaded, enabling full CRUD operations from the web UI.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(starlark): make config partial work standalone in web service

  Read starlark app data from the manifest file directly when the plugin isn't loaded, matching the api_v3.py standalone pattern.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(starlark): always show editable timing settings in config panel

  Render interval and display duration are now always editable in the starlark app config panel, not just shown as read-only status text.
  App-specific settings from schema still appear below when present.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(store): add sort, filter, search, and pagination to Plugin Store and Starlark Apps

  Plugin Store:
  - Live search with 300ms debounce (replaces Search button)
  - Sort dropdown: A→Z, Z→A, Category, Author, Newest
  - Installed toggle filter (All / Installed / Not Installed)
  - Per-page selector (12/24/48) with pagination controls
  - "Installed" badge and "Reinstall" button on already-installed plugins
  - Active filter count badge + clear filters button

  Starlark Apps:
  - Parallel bulk manifest fetching via ThreadPoolExecutor (20 workers)
  - Server-side 2-hour cache for all 500+ Tronbyte app manifests
  - Auto-loads all apps when section expands (no Browse button)
  - Live search, sort (A→Z, Z→A, Category, Author), author dropdown
  - Installed toggle filter, per-page selector (24/48/96), pagination
  - "Installed" badge on cards, "Reinstall" button variant

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(store): move storeFilterState to global scope to fix scoping bug

  storeFilterState, pluginStoreCache, and related variables were declared inside an IIFE but referenced by top-level functions, causing a ReferenceError that broke all plugin loading.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(starlark): schema-driven config forms + critical security fixes

  ## Schema-Driven Config UI
  - Render type-appropriate form inputs from schema.json (text, dropdown, toggle, color, datetime, location)
  - Pre-populate config.json with schema defaults on install
  - Auto-merge schema defaults when loading existing apps (handles schema updates)
  - Location fields: 3-part mini-form (lat/lng/timezone) assembles into JSON
  - Toggle fields: support both boolean and string "true"/"false" values
  - Unsupported field types (oauth2, photo_select) show warning banners
  - Fall back to raw key/value inputs for apps without a schema

  ## Critical Security Fixes (P0)
  - **Path Traversal**: Verify path safety BEFORE mkdir to prevent TOCTOU
  - **Race Conditions**: Add file locking (fcntl) + atomic writes to manifest operations
  - **Command Injection**: Validate config keys/values with regex before passing to the Pixlet subprocess

  ## Major Logic Fixes (P1)
  - **Config/Manifest Separation**: Store timing keys (render_interval, display_duration) ONLY in the manifest
  - **Location Validation**: Validate lat [-90, 90] and lng [-180, 180] ranges, reject malformed JSON
  - **Schema Defaults Merge**: Auto-apply new schema defaults to existing app configs on load
  - **Config Key Validation**: Enforce alphanumeric+underscore format, prevent prototype pollution

  ## Files Changed
  - web_interface/templates/v3/partials/starlark_config.html — schema-driven form rendering
  - plugin-repos/starlark-apps/manager.py — file locking, path safety, config validation, schema merge
  - plugin-repos/starlark-apps/pixlet_renderer.py — config value sanitization
  - web_interface/blueprints/api_v3.py — timing key separation, safe manifest updates

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): use manifest filename field for .star downloads

  Tronbyte apps don't always name their .star file to match the directory.
  For example, the "analogclock" app has "analog_clock.star" (with an underscore). The manifest.yaml contains a "filename" field with the correct name.

  Changes:
  - download_star_file() now accepts an optional filename parameter
  - Install endpoint passes metadata['filename'] to download_star_file()
  - Falls back to {app_id}.star if filename is not in the manifest

  Fixes: "Failed to download .star file for analogclock" error

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): reload tronbyte_repository module to pick up code changes

  The web service caches imported modules in sys.modules. When deploying code updates, the old cached version was still being used. Now uses importlib.reload() when the module is already loaded.

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): use correct 'fileName' field from manifest (camelCase)

  The Tronbyte manifest uses 'fileName' (camelCase), not 'filename' (lowercase). This caused the download to fall back to {app_id}.star, which doesn't exist for apps like analogclock (which has analog_clock.star).

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* feat(starlark): extract schema during standalone install

  The standalone install function (_install_star_file) wasn't extracting schema from .star files, so apps installed via the web service had no schema.json and the config panel couldn't render schema-driven forms. Now uses PixletRenderer to extract the schema during standalone install, the same as the plugin does.

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* feat(starlark): implement source code parser for schema extraction

  Pixlet CLI doesn't support schema extraction (the --print-schema flag doesn't exist), so apps were being installed without schemas even when they have them.
  Implemented a regex-based .star file parser that:
  - Extracts the get_schema() function from source code
  - Parses the schema.Schema(version, fields) structure
  - Handles variable-referenced dropdown options (e.g., options = dialectOptions)
  - Supports Location, Text, Toggle, Dropdown, Color, DateTime fields
  - Gracefully handles unsupported fields (OAuth2, LocationBased, etc.)
  - Returns formatted JSON matching web UI template expectations

  Coverage: 90%+ of Tronbyte apps (static schemas + variable references)

  Changes:
  - Replace extract_schema() to parse .star files directly instead of using the Pixlet CLI
  - Add 6 helper methods for parsing the schema structure
  - Handle nested parentheses and brackets properly
  - Resolve variable references for dropdown options

  Tested with:
  - analog_clock.star (Location field) ✓
  - Multi-field test (Text + Dropdown + Toggle) ✓
  - Variable-referenced options ✓

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): add List to typing imports for schema parser

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): load schema from schema.json in standalone mode

  The standalone API endpoint was returning schema: null because it didn't load the schema.json file. Now reads the schema from disk when returning app details via the web service.

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* feat(starlark): implement schema extraction, asset download, and config persistence

  ## Schema Extraction
  - Replace broken `pixlet serve --print-schema` with a regex-based source parser
  - Extract schema by parsing the `get_schema()` function from .star files
  - Support all field types: Location, Text, Toggle, Dropdown, Color, DateTime
  - Handle variable-referenced dropdown options (e.g., `options = teamOptions`)
  - Gracefully handle complex/unsupported field types (OAuth2, PhotoSelect, etc.)
  - Extract schema for 90%+ of Tronbyte apps

  ## Asset Download
  - Add `download_app_assets()` to fetch images/, sources/, fonts/ directories
  - Download assets in binary mode for proper image/font handling
  - Validate all paths to prevent directory traversal attacks
  - Copy asset directories during app installation
  - Enable apps like AnalogClock that require image assets

  ## Config Persistence
  - Create a config.json file during installation with schema defaults
  - Update both config.json and the manifest when saving configuration
  - Load config from config.json (not the manifest) for consistency with the plugin
  - Separate timing keys (render_interval, display_duration) from app config
  - Fix standalone web service mode to read/write config.json

  ## Pixlet Command Fix
  - Fix Pixlet CLI invocation: config params are positional, not flags
  - Change from `pixlet render file.star -c key=value` to `pixlet render file.star key=value -o output`
  - Properly handle JSON config values (e.g., location objects)
  - Enable config to be applied during rendering

  ## Security & Reliability
  - Add threading.Lock for cache operations to prevent race conditions
  - Reduce ThreadPoolExecutor workers from 20 to 5 for Raspberry Pi
  - Add path traversal validation in download_star_file()
  - Add YAML error logging in manifest fetching
  - Add file size validation (5MB limit) for .star uploads
  - Use sanitized app_id consistently in install endpoints
  - Use atomic manifest updates to prevent race conditions
  - Add missing Optional import for type hints

  ## Web UI
  - Fix standalone mode schema loading in the config partial
  - Schema-driven config forms now render correctly for all apps
  - Location fields show lat/lng/timezone inputs
  - Dropdown, toggle, text, color, and datetime fields all supported

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): code review fixes - security, robustness, and schema parsing

  ## Security Fixes
  - manager.py: Check _update_manifest_safe return values to prevent silent failures
  - manager.py: Improve temp file cleanup in _save_manifest to prevent leaks
  - manager.py: Fix uninstall order (manifest → memory → disk) for consistency
  - api_v3.py: Add path traversal validation in the uninstall endpoint
  - api_v3.py: Implement atomic writes for manifest files with temp + rename
  - pixlet_renderer.py: Relax config validation to only block dangerous shell metacharacters

  ## Frontend Robustness
  - plugins_manager.js: Add a safeLocalStorage wrapper for restricted contexts (private browsing)
  - starlark_config.html: Scope querySelector to the container to prevent modal conflicts

  ## Schema Parsing Improvements
  - pixlet_renderer.py: Indentation-aware get_schema() extraction (handles nested functions)
  - pixlet_renderer.py: Handle quoted defaults with commas (e.g., "New York, NY")
  - tronbyte_repository.py: Validate file_name is a string before path traversal checks

  ## Dependencies
  - requirements.txt: Update Pillow (10.4.0), PyYAML (6.0.2), requests (2.32.0)

  ## Documentation
  - docs/STARLARK_APPS_GUIDE.md: Comprehensive guide explaining:
    - How Starlark apps work
    - That apps come from Tronbyte (not LEDMatrix)
    - Installation, configuration, troubleshooting
    - Links to upstream projects

  All changes improve security, reliability, and user experience.

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): convert Path to str in spec_from_file_location calls

  The module import helpers were passing Path objects directly to spec_from_file_location(), which caused the spec to be None. This broke the Starlark app store browser.

  - Convert module_path to a string in both _get_tronbyte_repository_class and _get_pixlet_renderer_class
  - Add None checks with clear error messages for debugging

  Fixes: spec not found for the module 'tronbyte_repository'

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): restore Starlark Apps section in plugins.html

  The Starlark Apps UI section was lost during merge conflict resolution with the main branch.
  Restored from commit 942663ab, which had the complete implementation with filtering, sorting, and pagination.

  Fixes: Starlark section not visible on plugin manager page

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): restore Starlark JS functionality lost in merge

  During the merge with main, all Starlark-specific JavaScript (104 lines) was removed from plugins_manager.js, including:
  - starlarkFilterState and filtering logic
  - the loadStarlarkApps() function
  - Starlark app install/uninstall handlers
  - Starlark section collapse/expand logic
  - Pagination and sorting for Starlark apps

  Restored from commit 942663ab and re-applied the safeLocalStorage wrapper from our code review fixes.

  Fixes: Starlark Apps section non-functional in web UI

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): security and race condition improvements

  Security fixes:
  - Add path traversal validation for output_path in download_star_file
  - Remove XSS-vulnerable inline onclick handlers, use delegated events
  - Add type hints to helper functions for better type safety

  Race condition fixes:
  - Lock the manifest file BEFORE creating the temp file in _save_manifest
  - Hold an exclusive lock for the entire read-modify-write cycle in _update_manifest_safe
  - Prevent concurrent writers from racing on manifest updates

  Other improvements:
  - Fix pages_v3.py standalone mode to load config.json from disk
  - Improve error handling with proper logging in cleanup blocks
  - Add explicit type annotations to Starlark helper functions

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): critical bug fixes and code quality improvements

  Critical fixes:
  - Fix stack overflow in safeLocalStorage (it was recursively calling itself)
  - Fix duplicate event listeners on the Starlark grid (added a sentinel check)
  - Fix JSON validation to fail fast on malformed data instead of silently passing

  Error handling improvements:
  - Narrow exception catches to specific types (OSError, json.JSONDecodeError, ValueError)
  - Use logger.exception() with exc_info=True for better stack traces
  - Replace generic "except Exception" with specific exception types

  Logging improvements:
  - Add "[Starlark Pixlet]" context tags to pixlet_renderer logs
  - Redact sensitive config values from debug logs (API keys, etc.)
  - Add file_path context to schema parsing warnings

  Documentation:
  - Fix markdown lint issues (add language tags to code blocks)
  - Fix time unit spacing: "(5min)" -> "(5 min)"

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): critical path traversal and exception handling fixes

  Path traversal security fixes (CRITICAL):
  - Add a _validate_starlark_app_path() helper to check for path traversal attacks
  - Validate app_id in get_starlark_app(), uninstall_starlark_app(), get_starlark_app_config(), and update_starlark_app_config()
  - Check for '..' and path separators before any filesystem access
  - Verify resolved paths are within _STARLARK_APPS_DIR using Path.relative_to()
  - Prevents unauthorized file access via a crafted app_id like '../../../etc/passwd'

  Exception handling improvements (tronbyte_repository.py):
  - Replace broad "except Exception" with specific types
  - _make_request: catch requests.Timeout, requests.RequestException, json.JSONDecodeError
  - _fetch_raw_file: catch requests.Timeout and requests.RequestException separately
  - download_app_assets: narrow to OSError, ValueError
  - Add a "[Tronbyte Repo]" context prefix to all log messages
  - Use exc_info=True for better stack traces

  API improvements:
  - Narrow exception catches to OSError, json.JSONDecodeError in config loading
  - Remove duplicate path traversal checks (now centralized in the helper)

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): logging improvements and code quality fixes

  Logging improvements (pages_v3.py):
  - Add a logging import and create a module logger
  - Replace print() calls with logger.warning() with a "[Pages V3]" prefix
  - Use logger.exception() for the outer try/catch with exc_info=True
  - Narrow exception handling to OSError, json.JSONDecodeError for file operations

  API improvements (api_v3.py):
  - Remove unnecessary f-strings (Ruff F541) from ImportError messages
  - Narrow upload exception handling to ValueError, OSError, IOError
  - Use logger.exception() with context for better debugging
  - Remove early return in get_starlark_status() to allow standalone mode fallback
  - Sanitize error messages returned to the client (don't expose internal details)

  Benefits:
  - Better log context with consistent prefixes
  - More specific exception handling prevents masking unexpected errors
  - Standalone/web-service-only mode now works for the status endpoint
  - Stack traces preserved for debugging without exposing them to clients

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

---------

Co-authored-by: Chuck <chuck@example.com>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
660 lines
24 KiB
Python
"""
|
|
Pixlet Renderer Module for Starlark Apps
|
|
|
|
Handles execution of Pixlet CLI to render .star files into WebP animations.
|
|
Supports bundled binaries and system-installed Pixlet.
|
|
"""
|
|
|
|
import json
|
|
import logging
|
|
import os
|
|
import platform
|
|
import re
|
|
import shutil
|
|
import subprocess
|
|
from pathlib import Path
|
|
from typing import Dict, Any, Optional, Tuple, List
|
|
|
|
logger = logging.getLogger(__name__)
|
|
|
|
|
|
class PixletRenderer:
    """
    Wrapper for Pixlet CLI rendering.

    Handles:
    - Auto-detection of bundled or system Pixlet binary
    - Rendering .star files with configuration
    - Schema extraction from .star files
    - Timeout and error handling
    """

    def __init__(self, pixlet_path: Optional[str] = None, timeout: int = 30):
        """
        Initialize the Pixlet renderer.

        Args:
            pixlet_path: Optional explicit path to Pixlet binary
            timeout: Maximum seconds to wait for rendering
        """
        self.timeout = timeout
        self.pixlet_binary = self._find_pixlet_binary(pixlet_path)

        if self.pixlet_binary:
            logger.info(f"[Starlark Pixlet] Pixlet renderer initialized with binary: {self.pixlet_binary}")
        else:
            logger.warning("[Starlark Pixlet] Pixlet binary not found - rendering will fail")
    def _find_pixlet_binary(self, explicit_path: Optional[str] = None) -> Optional[str]:
        """
        Find Pixlet binary using the following priority:
        1. Explicit path provided
        2. Bundled binary for current architecture
        3. System PATH

        Args:
            explicit_path: User-specified path to Pixlet

        Returns:
            Path to Pixlet binary, or None if not found
        """
        # 1. Check explicit path
        if explicit_path and os.path.isfile(explicit_path):
            if os.access(explicit_path, os.X_OK):
                logger.debug(f"Using explicit Pixlet path: {explicit_path}")
                return explicit_path
            else:
                logger.warning(f"Explicit Pixlet path not executable: {explicit_path}")

        # 2. Check bundled binary
        try:
            bundled_path = self._get_bundled_binary_path()
            if bundled_path and os.path.isfile(bundled_path):
                # Ensure executable
                if not os.access(bundled_path, os.X_OK):
                    try:
                        os.chmod(bundled_path, 0o755)
                        logger.debug(f"Made bundled binary executable: {bundled_path}")
                    except OSError:
                        logger.exception(f"Could not make bundled binary executable: {bundled_path}")

                if os.access(bundled_path, os.X_OK):
                    logger.debug(f"Using bundled Pixlet binary: {bundled_path}")
                    return bundled_path
        except OSError:
            logger.exception("Could not locate bundled binary")

        # 3. Check system PATH
        system_pixlet = shutil.which("pixlet")
        if system_pixlet:
            logger.debug(f"Using system Pixlet: {system_pixlet}")
            return system_pixlet

        logger.error("Pixlet binary not found in any location")
        return None
    def _get_bundled_binary_path(self) -> Optional[str]:
        """
        Get path to bundled Pixlet binary for current architecture.

        Returns:
            Path to bundled binary, or None if not found
        """
        try:
            # Determine project root (parent of plugin-repos)
            current_dir = Path(__file__).resolve().parent
            project_root = current_dir.parent.parent
            bin_dir = project_root / "bin" / "pixlet"

            # Detect architecture
            system = platform.system().lower()
            machine = platform.machine().lower()

            # Map architecture to binary name
            if system == "linux":
                if "aarch64" in machine or "arm64" in machine:
                    binary_name = "pixlet-linux-arm64"
                elif "x86_64" in machine or "amd64" in machine:
                    binary_name = "pixlet-linux-amd64"
                else:
                    logger.warning(f"Unsupported Linux architecture: {machine}")
                    return None
            elif system == "darwin":
                if "arm64" in machine:
                    binary_name = "pixlet-darwin-arm64"
                else:
                    binary_name = "pixlet-darwin-amd64"
            elif system == "windows":
                binary_name = "pixlet-windows-amd64.exe"
            else:
                logger.warning(f"Unsupported system: {system}")
                return None

            binary_path = bin_dir / binary_name
            if binary_path.exists():
                return str(binary_path)

            logger.debug(f"Bundled binary not found at: {binary_path}")
            return None

        except OSError:
            logger.exception("Error finding bundled binary")
            return None
    def _get_safe_working_directory(self, star_file: str) -> Optional[str]:
        """
        Get a safe working directory for subprocess execution.

        Args:
            star_file: Path to .star file

        Returns:
            Resolved parent directory, or None if empty or invalid
        """
        try:
            resolved_parent = os.path.dirname(os.path.abspath(star_file))
            # Return None if empty string to avoid FileNotFoundError
            if not resolved_parent:
                logger.debug(f"Empty parent directory for star_file: {star_file}")
                return None
            return resolved_parent
        except (OSError, ValueError):
            logger.debug(f"Could not resolve working directory for: {star_file}")
            return None
    def is_available(self) -> bool:
        """
        Check if Pixlet is available and functional.

        Returns:
            True if Pixlet can be executed
        """
        if not self.pixlet_binary:
            return False

        try:
            result = subprocess.run(
                [self.pixlet_binary, "version"],
                capture_output=True,
                text=True,
                timeout=5
            )
            return result.returncode == 0
        except subprocess.TimeoutExpired:
            logger.debug("Pixlet version check timed out")
            return False
        except (subprocess.SubprocessError, OSError):
            logger.exception("Pixlet not available")
            return False
    def get_version(self) -> Optional[str]:
        """
        Get Pixlet version string.

        Returns:
            Version string, or None if unavailable
        """
        if not self.pixlet_binary:
            return None

        try:
            result = subprocess.run(
                [self.pixlet_binary, "version"],
                capture_output=True,
                text=True,
                timeout=5
            )
            if result.returncode == 0:
                return result.stdout.strip()
        except subprocess.TimeoutExpired:
            logger.debug("Pixlet version check timed out")
        except (subprocess.SubprocessError, OSError):
            logger.exception("Could not get Pixlet version")

        return None
    def render(
        self,
        star_file: str,
        output_path: str,
        config: Optional[Dict[str, Any]] = None,
        magnify: int = 1
    ) -> Tuple[bool, Optional[str]]:
        """
        Render a .star file to WebP output.

        Args:
            star_file: Path to .star file
            output_path: Where to save WebP output
            config: Configuration dictionary to pass to app
            magnify: Magnification factor (default 1)

        Returns:
            Tuple of (success: bool, error_message: Optional[str])
        """
        if not self.pixlet_binary:
            return False, "Pixlet binary not found"

        if not os.path.isfile(star_file):
            return False, f"Star file not found: {star_file}"

        try:
            # Build command - config params must be POSITIONAL between star_file and flags
            # Format: pixlet render <file.star> [key=value]... [flags]
            cmd = [
                self.pixlet_binary,
                "render",
                star_file
            ]

            # Add configuration parameters as positional arguments (BEFORE flags)
            if config:
                for key, value in config.items():
                    # Validate key format (alphanumeric + underscore only)
                    if not re.match(r'^[a-zA-Z_][a-zA-Z0-9_]*$', key):
                        logger.warning(f"Skipping invalid config key: {key}")
                        continue

                    # Convert value to string for CLI
                    if isinstance(value, bool):
                        value_str = "true" if value else "false"
                    elif isinstance(value, str) and (value.startswith('{') or value.startswith('[')):
                        # JSON string - keep as-is, will be properly quoted by subprocess
                        value_str = value
                    else:
                        value_str = str(value)

                    # Validate value doesn't contain dangerous shell metacharacters
                    # Block: backticks, $(), pipes, redirects, semicolons, ampersands, null bytes
                    # Allow: most printable chars including spaces, quotes, brackets, braces
                    if re.search(r'[`$|<>&;\x00]|\$\(', value_str):
                        logger.warning(f"Skipping config value with unsafe shell characters for key {key}: {value_str}")
                        continue

                    # Add as positional argument (not -c flag)
                    cmd.append(f"{key}={value_str}")

            # Add flags AFTER positional config arguments
            cmd.extend([
                "-o", output_path,
                "-m", str(magnify)
            ])

            # Build sanitized command for logging (redact sensitive values)
            sanitized_cmd = [self.pixlet_binary, "render", star_file]
            if config:
                config_keys = list(config.keys())
                sanitized_cmd.append(f"[{len(config_keys)} config entries: {', '.join(config_keys)}]")
            sanitized_cmd.extend(["-o", output_path, "-m", str(magnify)])
            logger.debug(f"Executing Pixlet: {' '.join(sanitized_cmd)}")

            # Execute rendering
            safe_cwd = self._get_safe_working_directory(star_file)
            result = subprocess.run(
                cmd,
                capture_output=True,
                text=True,
                timeout=self.timeout,
                cwd=safe_cwd  # Run in .star file directory (or None if relative path)
            )

            if result.returncode == 0:
                if os.path.isfile(output_path):
                    logger.debug(f"Successfully rendered: {star_file} -> {output_path}")
                    return True, None
                else:
                    error = "Rendering succeeded but output file not found"
                    logger.error(error)
                    return False, error
            else:
                error = f"Pixlet failed (exit {result.returncode}): {result.stderr}"
                logger.error(error)
                return False, error

        except subprocess.TimeoutExpired:
            error = f"Rendering timeout after {self.timeout}s"
            logger.error(error)
            return False, error
        except (subprocess.SubprocessError, OSError):
            logger.exception("Rendering exception")
            return False, "Rendering failed - see logs for details"
    def extract_schema(self, star_file: str) -> Tuple[bool, Optional[Dict[str, Any]], Optional[str]]:
        """
        Extract configuration schema from a .star file by parsing source code.

        Supports:
        - Static field definitions (location, text, toggle, dropdown, color, datetime)
        - Variable-referenced dropdown options
        - Graceful degradation for unsupported field types

        Args:
            star_file: Path to .star file

        Returns:
            Tuple of (success: bool, schema: Optional[Dict], error: Optional[str])
        """
        if not os.path.isfile(star_file):
            return False, None, f"Star file not found: {star_file}"

        try:
            # Read .star file
            with open(star_file, 'r', encoding='utf-8') as f:
                content = f.read()

            # Parse schema from source
            schema = self._parse_schema_from_source(content, star_file)

            if schema:
                field_count = len(schema.get('schema', []))
                logger.debug(f"Extracted schema with {field_count} field(s) from: {star_file}")
                return True, schema, None
            else:
                # No schema found - not an error, app just doesn't have configuration
                logger.debug(f"No schema found in: {star_file}")
                return True, None, None

        except UnicodeDecodeError as e:
            error = f"File encoding error: {e}"
            logger.warning(error)
            return False, None, error
        except Exception as e:
            logger.exception(f"Schema extraction failed for {star_file}")
            return False, None, f"Schema extraction error: {str(e)}"
    def _parse_schema_from_source(self, content: str, file_path: str) -> Optional[Dict[str, Any]]:
        """
        Parse get_schema() function from Starlark source code.

        Args:
            content: .star file content
            file_path: Path to file (for logging)

        Returns:
            Schema dict with format {"version": "1", "schema": [...]}, or None
        """
        # Extract variable definitions (for dropdown options)
        var_table = self._extract_variable_definitions(content)

        # Extract get_schema() function body
        schema_body = self._extract_get_schema_body(content)
        if not schema_body:
            logger.debug(f"No get_schema() function found in {file_path}")
            return None

        # Extract version
        version_match = re.search(r'version\s*=\s*"([^"]+)"', schema_body)
        version = version_match.group(1) if version_match else "1"

        # Extract fields array from schema.Schema(...) - handle nested brackets
        fields_start_match = re.search(r'fields\s*=\s*\[', schema_body)
        if not fields_start_match:
            # Empty schema or no fields
            return {"version": version, "schema": []}

        # Find matching closing bracket
        bracket_count = 1
        i = fields_start_match.end()
        while i < len(schema_body) and bracket_count > 0:
            if schema_body[i] == '[':
                bracket_count += 1
            elif schema_body[i] == ']':
                bracket_count -= 1
            i += 1

        if bracket_count != 0:
            # Unmatched brackets
            logger.warning(f"Unmatched brackets in schema fields for {file_path}")
            return {"version": version, "schema": []}

        fields_text = schema_body[fields_start_match.end():i - 1]

        # Parse individual fields
        schema_fields = []
        # Match schema.FieldType(...) patterns
        field_pattern = r'schema\.(\w+)\s*\((.*?)\)'

        # Find all field definitions (handle nested parentheses)
        pos = 0
        while pos < len(fields_text):
            match = re.search(field_pattern, fields_text[pos:], re.DOTALL)
            if not match:
                break

            field_type = match.group(1)

            # Handle nested parentheses properly
            paren_count = 1
            i = pos + match.start() + len(f'schema.{field_type}(')
            while i < len(fields_text) and paren_count > 0:
                if fields_text[i] == '(':
                    paren_count += 1
                elif fields_text[i] == ')':
                    paren_count -= 1
                i += 1

            field_params_text = fields_text[pos + match.start() + len(f'schema.{field_type}('):i - 1]

            # Parse field
            field_dict = self._parse_schema_field(field_type, field_params_text, var_table)
            if field_dict:
                schema_fields.append(field_dict)

            pos = i

        return {
            "version": version,
            "schema": schema_fields
        }
    def _extract_variable_definitions(self, content: str) -> Dict[str, List[Dict]]:
        """
        Extract top-level variable assignments (for dropdown options).

        Args:
            content: .star file content

        Returns:
            Dict mapping variable names to their option lists
        """
        var_table = {}

        # Find variable definitions like: variableName = [schema.Option(...), ...]
        var_pattern = r'^(\w+)\s*=\s*\[(.*?schema\.Option.*?)\]'
        matches = re.finditer(var_pattern, content, re.MULTILINE | re.DOTALL)

        for match in matches:
            var_name = match.group(1)
            options_text = match.group(2)

            # Parse schema.Option entries
            options = self._parse_schema_options(options_text, {})
            if options:
                var_table[var_name] = options

        return var_table

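    # Illustrative sketch of the mapping this produces (the variable and
    # option names below are made up for the example): a .star file containing
    #     colorOptions = [
    #         schema.Option(display = "Red", value = "red"),
    #         schema.Option(display = "Blue", value = "blue"),
    #     ]
    # yields {"colorOptions": [{"display": "Red", "value": "red"},
    #                          {"display": "Blue", "value": "blue"}]}.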
    def _extract_get_schema_body(self, content: str) -> Optional[str]:
        """
        Extract get_schema() function body using indentation-aware parsing.

        Args:
            content: .star file content

        Returns:
            Function body text, or None if not found
        """
        # Find the def get_schema(): line
        pattern = r'^(\s*)def\s+get_schema\s*\(\s*\)\s*:'
        match = re.search(pattern, content, re.MULTILINE)
        if not match:
            return None

        # Indentation level of the function definition
        func_indent = len(match.group(1))
        func_start = match.end()

        # Walk the lines after the definition, collecting the indented body
        lines_after = content[func_start:].split('\n')
        body_lines = []

        for line in lines_after:
            # Keep blank lines; they don't end the function body
            if not line.strip():
                body_lines.append(line)
                continue

            # Any non-blank line at the function's indentation level or less
            # (the next def, a comment, or other top-level code) ends the body
            line_indent = len(line) - len(line.lstrip())
            if line_indent <= func_indent:
                break

            # Line is indented more than the function def, so it's part of the body
            body_lines.append(line)

        if body_lines:
            return '\n'.join(body_lines)
        return None

    def _parse_schema_field(self, field_type: str, params_text: str, var_table: Dict) -> Optional[Dict[str, Any]]:
        """
        Parse individual schema field definition.

        Args:
            field_type: Field type (Location, Text, Toggle, etc.)
            params_text: Field parameters text
            var_table: Variable lookup table

        Returns:
            Field dict, or None if parse fails
        """
        # Map Pixlet field types to JSON typeOf
        type_mapping = {
            'Location': 'location',
            'Text': 'text',
            'Toggle': 'toggle',
            'Dropdown': 'dropdown',
            'Color': 'color',
            'DateTime': 'datetime',
            'OAuth2': 'oauth2',
            'PhotoSelect': 'photo_select',
            'LocationBased': 'location_based',
            'Typeahead': 'typeahead',
            'Generated': 'generated',
        }
        type_of = type_mapping.get(field_type, field_type.lower())

        # Skip Generated fields (invisible meta-fields)
        if type_of == 'generated':
            return None

        field_dict = {"typeOf": type_of}

        # id (required; skip the field if missing)
        id_match = re.search(r'id\s*=\s*"([^"]+)"', params_text)
        if not id_match:
            return None
        field_dict['id'] = id_match.group(1)

        # name
        name_match = re.search(r'name\s*=\s*"([^"]+)"', params_text)
        if name_match:
            field_dict['name'] = name_match.group(1)

        # desc
        desc_match = re.search(r'desc\s*=\s*"([^"]+)"', params_text)
        if desc_match:
            field_dict['desc'] = desc_match.group(1)

        # icon
        icon_match = re.search(r'icon\s*=\s*"([^"]+)"', params_text)
        if icon_match:
            field_dict['icon'] = icon_match.group(1)

        # default (can be a string, bool, or variable reference).
        # Try double-quoted strings first (they may contain commas), then
        # single-quoted strings, then fall back to an unquoted value.
        default_match = re.search(r'default\s*=\s*"([^"]*)"', params_text)
        if not default_match:
            default_match = re.search(r"default\s*=\s*'([^']*)'", params_text)
        if not default_match:
            # Unquoted value: stop at comma or closing paren
            default_match = re.search(r'default\s*=\s*([^,\)]+)', params_text)

        if default_match:
            default_value = default_match.group(1).strip()
            if default_value in ('True', 'False'):
                # Boolean: normalize to JSON-style lowercase
                field_dict['default'] = default_value.lower()
            elif re.search(r'default\s*=\s*["\']', params_text):
                # Quoted string literal; the capture already excludes the quotes
                field_dict['default'] = default_value
            elif '.' in default_value or '[' in default_value:
                # Complex expression like options[0].value; can't resolve, skip default
                pass
            else:
                # Bare variable reference; use as-is
                field_dict['default'] = default_value

        # For dropdowns, extract options
        if type_of == 'dropdown':
            options_match = re.search(r'options\s*=\s*([^,\)]+)', params_text)
            if options_match:
                options_ref = options_match.group(1).strip()
                if options_ref in var_table:
                    # Variable reference to a top-level options list
                    field_dict['options'] = var_table[options_ref]
                elif options_ref.startswith('['):
                    # Inline options list (nested brackets are not fully handled;
                    # the non-greedy match stops at the first closing bracket)
                    inline_match = re.search(r'options\s*=\s*(\[.*?\])', params_text, re.DOTALL)
                    if inline_match:
                        field_dict['options'] = self._parse_schema_options(inline_match.group(1), var_table)

        return field_dict

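    # Illustrative sketch (the field id and name below are made up): for a
    # .star field like
    #     schema.Toggle(id = "show_icon", name = "Show icon", default = True)
    # this returns {"typeOf": "toggle", "id": "show_icon",
    #               "name": "Show icon", "default": "true"}.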
    def _parse_schema_options(self, options_text: str, var_table: Dict) -> List[Dict[str, str]]:
        """
        Parse schema.Option list.

        Args:
            options_text: Text containing schema.Option(...) entries
            var_table: Variable lookup table (not currently used)

        Returns:
            List of {"display": "...", "value": "..."} dicts
        """
        options = []

        # Match schema.Option(display = "...", value = "...")
        option_pattern = r'schema\.Option\s*\(\s*display\s*=\s*"([^"]+)"\s*,\s*value\s*=\s*"([^"]+)"\s*\)'
        for match in re.finditer(option_pattern, options_text):
            options.append({
                "display": match.group(1),
                "value": match.group(2)
            })

        return options
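    # Illustrative sketch:
    #     _parse_schema_options('schema.Option(display = "Red", value = "red")', {})
    # returns [{"display": "Red", "value": "red"}].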