mirror of
https://github.com/ChuckBuilds/LEDMatrix.git
synced 2026-04-10 21:03:01 +00:00
* feat: integrate Starlark/Tronbyte app support into plugin system

  Add starlark-apps plugin that renders Tidbyt/Tronbyte .star apps via the Pixlet
  binary and integrates them into the existing Plugin Manager UI as virtual
  plugins. Includes vegas scroll support, Tronbyte repository browsing, and
  per-app configuration.

  - Extract working starlark plugin code from starlark branch onto fresh main
  - Fix plugin conventions (get_logger, VegasDisplayMode, BasePlugin)
  - Add 13 starlark API endpoints to api_v3.py (CRUD, browse, install, render)
  - Virtual plugin entries (starlark:<app_id>) in installed plugins list
  - Starlark-aware toggle and config routing in pages_v3.py
  - Tronbyte repository browser section in Plugin Store UI
  - Pixlet binary download script (scripts/download_pixlet.sh)

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(starlark): use bare imports instead of relative imports

  The plugin loader uses spec_from_file_location without package context, so
  relative imports (.pixlet_renderer) fail. Use bare imports like all other
  plugins do.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(starlark): make API endpoints work standalone in web service

  The web service runs as a separate process with display_manager=None, so
  plugins aren't instantiated. Refactor starlark API endpoints to read/write
  the manifest file directly when the plugin isn't loaded, enabling full CRUD
  operations from the web UI.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(starlark): make config partial work standalone in web service

  Read starlark app data from the manifest file directly when the plugin isn't
  loaded, matching the api_v3.py standalone pattern.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(starlark): always show editable timing settings in config panel

  Render interval and display duration are now always editable in the starlark
  app config panel, not just shown as read-only status text. App-specific
  settings from schema still appear below when present.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(store): add sort, filter, search, and pagination to Plugin Store and Starlark Apps

  Plugin Store:
  - Live search with 300ms debounce (replaces Search button)
  - Sort dropdown: A→Z, Z→A, Category, Author, Newest
  - Installed toggle filter (All / Installed / Not Installed)
  - Per-page selector (12/24/48) with pagination controls
  - "Installed" badge and "Reinstall" button on already-installed plugins
  - Active filter count badge + clear filters button

  Starlark Apps:
  - Parallel bulk manifest fetching via ThreadPoolExecutor (20 workers)
  - Server-side 2-hour cache for all 500+ Tronbyte app manifests
  - Auto-loads all apps when section expands (no Browse button)
  - Live search, sort (A→Z, Z→A, Category, Author), author dropdown
  - Installed toggle filter, per-page selector (24/48/96), pagination
  - "Installed" badge on cards, "Reinstall" button variant

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(store): move storeFilterState to global scope to fix scoping bug

  storeFilterState, pluginStoreCache, and related variables were declared
  inside an IIFE but referenced by top-level functions, causing a
  ReferenceError that broke all plugin loading.

  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(starlark): schema-driven config forms + critical security fixes

  ## Schema-Driven Config UI
  - Render type-appropriate form inputs from schema.json (text, dropdown, toggle, color, datetime, location)
  - Pre-populate config.json with schema defaults on install
  - Auto-merge schema defaults when loading existing apps (handles schema updates)
  - Location fields: 3-part mini-form (lat/lng/timezone) assembles into JSON
  - Toggle fields: support both boolean and string "true"/"false" values
  - Unsupported field types (oauth2, photo_select) show warning banners
  - Fallback to raw key/value inputs for apps without schema

  ## Critical Security Fixes (P0)
  - **Path Traversal**: Verify path safety BEFORE mkdir to prevent TOCTOU
  - **Race Conditions**: Add file locking (fcntl) + atomic writes to manifest operations
  - **Command Injection**: Validate config keys/values with regex before passing to Pixlet subprocess

  ## Major Logic Fixes (P1)
  - **Config/Manifest Separation**: Store timing keys (render_interval, display_duration) ONLY in manifest
  - **Location Validation**: Validate lat [-90,90] and lng [-180,180] ranges, reject malformed JSON
  - **Schema Defaults Merge**: Auto-apply new schema defaults to existing app configs on load
  - **Config Key Validation**: Enforce alphanumeric+underscore format, prevent prototype pollution

  ## Files Changed
  - web_interface/templates/v3/partials/starlark_config.html — schema-driven form rendering
  - plugin-repos/starlark-apps/manager.py — file locking, path safety, config validation, schema merge
  - plugin-repos/starlark-apps/pixlet_renderer.py — config value sanitization
  - web_interface/blueprints/api_v3.py — timing key separation, safe manifest updates

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): use manifest filename field for .star downloads

  Tronbyte apps don't always name their .star file to match the directory. For
  example, the "analogclock" app has "analog_clock.star" (with underscore). The
  manifest.yaml contains a "filename" field with the correct name.

  Changes:
  - download_star_file() now accepts an optional filename parameter
  - Install endpoint passes metadata['filename'] to download_star_file()
  - Falls back to {app_id}.star if filename not in manifest

  Fixes: "Failed to download .star file for analogclock" error

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): reload tronbyte_repository module to pick up code changes

  The web service caches imported modules in sys.modules. When deploying code
  updates, the old cached version was still being used. Now uses
  importlib.reload() when the module is already loaded.

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): use correct 'fileName' field from manifest (camelCase)

  The Tronbyte manifest uses 'fileName' (camelCase), not 'filename'
  (lowercase). This caused the download to fall back to {app_id}.star, which
  doesn't exist for apps like analogclock (which has analog_clock.star).

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* feat(starlark): extract schema during standalone install

  The standalone install function (_install_star_file) wasn't extracting schema
  from .star files, so apps installed via the web service had no schema.json
  and the config panel couldn't render schema-driven forms. Now uses
  PixletRenderer to extract schema during standalone install, same as the
  plugin does.

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* feat(starlark): implement source code parser for schema extraction

  Pixlet CLI doesn't support schema extraction (the --print-schema flag doesn't
  exist), so apps were being installed without schemas even when they have
  them.

  Implemented a regex-based .star file parser that:
  - Extracts the get_schema() function from source code
  - Parses the schema.Schema(version, fields) structure
  - Handles variable-referenced dropdown options (e.g., options = dialectOptions)
  - Supports Location, Text, Toggle, Dropdown, Color, DateTime fields
  - Gracefully handles unsupported fields (OAuth2, LocationBased, etc.)
  - Returns formatted JSON matching web UI template expectations

  Coverage: 90%+ of Tronbyte apps (static schemas + variable references)

  Changes:
  - Replace extract_schema() to parse .star files directly instead of using Pixlet CLI
  - Add 6 helper methods for parsing schema structure
  - Handle nested parentheses and brackets properly
  - Resolve variable references for dropdown options

  Tested with:
  - analog_clock.star (Location field) ✓
  - Multi-field test (Text + Dropdown + Toggle) ✓
  - Variable-referenced options ✓

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): add List to typing imports for schema parser

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): load schema from schema.json in standalone mode

  The standalone API endpoint was returning schema: null because it didn't load
  the schema.json file. Now reads schema from disk when returning app details
  via the web service.

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* feat(starlark): implement schema extraction, asset download, and config persistence

  ## Schema Extraction
  - Replace broken `pixlet serve --print-schema` with regex-based source parser
  - Extract schema by parsing `get_schema()` function from .star files
  - Support all field types: Location, Text, Toggle, Dropdown, Color, DateTime
  - Handle variable-referenced dropdown options (e.g., `options = teamOptions`)
  - Gracefully handle complex/unsupported field types (OAuth2, PhotoSelect, etc.)
  - Extract schema for 90%+ of Tronbyte apps

  ## Asset Download
  - Add `download_app_assets()` to fetch images/, sources/, fonts/ directories
  - Download assets in binary mode for proper image/font handling
  - Validate all paths to prevent directory traversal attacks
  - Copy asset directories during app installation
  - Enable apps like AnalogClock that require image assets

  ## Config Persistence
  - Create config.json file during installation with schema defaults
  - Update both config.json and manifest when saving configuration
  - Load config from config.json (not manifest) for consistency with plugin
  - Separate timing keys (render_interval, display_duration) from app config
  - Fix standalone web service mode to read/write config.json

  ## Pixlet Command Fix
  - Fix Pixlet CLI invocation: config params are positional, not flags
  - Change from `pixlet render file.star -c key=value` to `pixlet render file.star key=value -o output`
  - Properly handle JSON config values (e.g., location objects)
  - Enable config to be applied during rendering

  ## Security & Reliability
  - Add threading.Lock for cache operations to prevent race conditions
  - Reduce ThreadPoolExecutor workers from 20 to 5 for Raspberry Pi
  - Add path traversal validation in download_star_file()
  - Add YAML error logging in manifest fetching
  - Add file size validation (5MB limit) for .star uploads
  - Use sanitized app_id consistently in install endpoints
  - Use atomic manifest updates to prevent race conditions
  - Add missing Optional import for type hints

  ## Web UI
  - Fix standalone mode schema loading in config partial
  - Schema-driven config forms now render correctly for all apps
  - Location fields show lat/lng/timezone inputs
  - Dropdown, toggle, text, color, and datetime fields all supported

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): code review fixes - security, robustness, and schema parsing

  ## Security Fixes
  - manager.py: Check _update_manifest_safe return values to prevent silent failures
  - manager.py: Improve temp file cleanup in _save_manifest to prevent leaks
  - manager.py: Fix uninstall order (manifest → memory → disk) for consistency
  - api_v3.py: Add path traversal validation in uninstall endpoint
  - api_v3.py: Implement atomic writes for manifest files with temp + rename
  - pixlet_renderer.py: Relax config validation to only block dangerous shell metacharacters

  ## Frontend Robustness
  - plugins_manager.js: Add safeLocalStorage wrapper for restricted contexts (private browsing)
  - starlark_config.html: Scope querySelector to container to prevent modal conflicts

  ## Schema Parsing Improvements
  - pixlet_renderer.py: Indentation-aware get_schema() extraction (handles nested functions)
  - pixlet_renderer.py: Handle quoted defaults with commas (e.g., "New York, NY")
  - tronbyte_repository.py: Validate file_name is a string before path traversal checks

  ## Dependencies
  - requirements.txt: Update Pillow (10.4.0), PyYAML (6.0.2), requests (2.32.0)

  ## Documentation
  - docs/STARLARK_APPS_GUIDE.md: Comprehensive guide explaining:
    - How Starlark apps work
    - That apps come from Tronbyte (not LEDMatrix)
    - Installation, configuration, troubleshooting
    - Links to upstream projects

  All changes improve security, reliability, and user experience.

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): convert Path to str in spec_from_file_location calls

  The module import helpers were passing Path objects directly to
  spec_from_file_location(), which caused spec to be None. This broke the
  Starlark app store browser.

  - Convert module_path to string in both _get_tronbyte_repository_class and _get_pixlet_renderer_class
  - Add None checks with clear error messages for debugging

  Fixes: spec not found for the module 'tronbyte_repository'

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): restore Starlark Apps section in plugins.html

  The Starlark Apps UI section was lost during merge conflict resolution with
  the main branch. Restored from commit 942663ab, which had the complete
  implementation with filtering, sorting, and pagination.

  Fixes: Starlark section not visible on plugin manager page

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): restore Starlark JS functionality lost in merge

  During the merge with main, all Starlark-specific JavaScript (104 lines) was
  removed from plugins_manager.js, including:
  - starlarkFilterState and filtering logic
  - loadStarlarkApps() function
  - Starlark app install/uninstall handlers
  - Starlark section collapse/expand logic
  - Pagination and sorting for Starlark apps

  Restored from commit 942663ab and re-applied the safeLocalStorage wrapper
  from our code review fixes.

  Fixes: Starlark Apps section non-functional in web UI

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): security and race condition improvements

  Security fixes:
  - Add path traversal validation for output_path in download_star_file
  - Remove XSS-vulnerable inline onclick handlers, use delegated events
  - Add type hints to helper functions for better type safety

  Race condition fixes:
  - Lock manifest file BEFORE creating temp file in _save_manifest
  - Hold exclusive lock for entire read-modify-write cycle in _update_manifest_safe
  - Prevent concurrent writers from racing on manifest updates

  Other improvements:
  - Fix pages_v3.py standalone mode to load config.json from disk
  - Improve error handling with proper logging in cleanup blocks
  - Add explicit type annotations to Starlark helper functions

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): critical bug fixes and code quality improvements

  Critical fixes:
  - Fix stack overflow in safeLocalStorage (was recursively calling itself)
  - Fix duplicate event listeners on Starlark grid (added sentinel check)
  - Fix JSON validation to fail fast on malformed data instead of silently passing

  Error handling improvements:
  - Narrow exception catches to specific types (OSError, json.JSONDecodeError, ValueError)
  - Use logger.exception() with exc_info=True for better stack traces
  - Replace generic "except Exception" with specific exception types

  Logging improvements:
  - Add "[Starlark Pixlet]" context tags to pixlet_renderer logs
  - Redact sensitive config values from debug logs (API keys, etc.)
  - Add file_path context to schema parsing warnings

  Documentation:
  - Fix markdown lint issues (add language tags to code blocks)
  - Fix time unit spacing: "(5min)" -> "(5 min)"

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): critical path traversal and exception handling fixes

  Path traversal security fixes (CRITICAL):
  - Add _validate_starlark_app_path() helper to check for path traversal attacks
  - Validate app_id in get_starlark_app(), uninstall_starlark_app(), get_starlark_app_config(), and update_starlark_app_config()
  - Check for '..' and path separators before any filesystem access
  - Verify resolved paths are within _STARLARK_APPS_DIR using Path.relative_to()
  - Prevents unauthorized file access via a crafted app_id like '../../../etc/passwd'

  Exception handling improvements (tronbyte_repository.py):
  - Replace broad "except Exception" with specific types
  - _make_request: catch requests.Timeout, requests.RequestException, json.JSONDecodeError
  - _fetch_raw_file: catch requests.Timeout, requests.RequestException separately
  - download_app_assets: narrow to OSError, ValueError
  - Add "[Tronbyte Repo]" context prefix to all log messages
  - Use exc_info=True for better stack traces

  API improvements:
  - Narrow exception catches to OSError, json.JSONDecodeError in config loading
  - Remove duplicate path traversal checks (now centralized in helper)

  Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>

* fix(starlark): logging improvements and code quality fixes

  Logging improvements (pages_v3.py):
  - Add logging import and create module logger
  - Replace print() calls with logger.warning() with "[Pages V3]" prefix
  - Use logger.exception() for outer try/catch with exc_info=True
  - Narrow exception handling to OSError, json.JSONDecodeError for file operations

  API improvements (api_v3.py):
  - Remove unnecessary f-strings (Ruff F541) from ImportError messages
  - Narrow upload exception handling to ValueError, OSError, IOError
  - Use logger.exception() with context for better debugging
  - Remove early return in get_starlark_status() to allow standalone mode fallback
  - Sanitize error messages returned to client (don't expose internal details)

  Benefits:
  - Better log context with consistent prefixes
  - More specific exception handling prevents masking unexpected errors
  - Standalone/web-service-only mode now works for the status endpoint
  - Stack traces preserved for debugging without exposing them to clients

---------

Co-authored-by: Chuck <chuck@example.com>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
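The spec_from_file_location fixes described above (bare imports, converting Path to str, and None checks with clear error messages) can be sketched as follows. `load_plugin_module` is a hypothetical helper for illustration, not the project's actual loader code:

```python
import importlib.util
from pathlib import Path


def load_plugin_module(module_name: str, module_path: Path):
    """Load a module from an explicit file path, plugin-loader style.

    spec_from_file_location() is given str(module_path) rather than the Path
    object, and the result is checked for None so a missing or unloadable file
    produces a clear error. Modules loaded this way have no package context,
    so they must use bare imports (e.g. `import pixlet_renderer`), never
    relative ones like `from .pixlet_renderer import ...`.
    """
    spec = importlib.util.spec_from_file_location(module_name, str(module_path))
    if spec is None or spec.loader is None:
        raise ImportError(f"spec not found for the module '{module_name}' at {module_path}")
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module
```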
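The locking pattern named in the security fixes (lock BEFORE creating the temp file, hold it for the whole read-modify-write cycle, then temp + rename) can be sketched like this. A minimal illustration assuming a POSIX system; `update_manifest` is not the actual `_update_manifest_safe` from manager.py:

```python
import fcntl
import json
import os
import tempfile
from pathlib import Path
from typing import Any, Callable, Dict


def update_manifest(manifest_path: Path, mutate: Callable[[Dict[str, Any]], None]) -> None:
    """Read-modify-write a JSON manifest under an exclusive lock.

    The lock is acquired on a sidecar .lock file before any temp file is
    created and held for the entire cycle, so concurrent writers cannot race;
    os.replace() then makes the final write atomic on POSIX filesystems.
    """
    lock_path = manifest_path.with_suffix('.lock')
    with open(lock_path, 'w') as lock_file:
        fcntl.flock(lock_file, fcntl.LOCK_EX)  # blocks until the lock is held
        try:
            data = json.loads(manifest_path.read_text()) if manifest_path.exists() else {}
            mutate(data)
            # Write to a temp file in the same directory, then atomically replace
            fd, tmp = tempfile.mkstemp(dir=manifest_path.parent)
            try:
                with os.fdopen(fd, 'w') as f:
                    json.dump(data, f, indent=2)
                os.replace(tmp, manifest_path)
            except OSError:
                os.unlink(tmp)  # clean up the temp file on failure
                raise
        finally:
            fcntl.flock(lock_file, fcntl.LOCK_UN)
```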
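The Pixlet command fix above (config params are positional `key=value` arguments after the .star path, not `-c` flags) amounts to building an argv like the following. A sketch of the invocation shape only; `build_pixlet_render_command` is illustrative, not the project's renderer code:

```python
import json
from typing import Any, Dict, List


def build_pixlet_render_command(pixlet_bin: str, star_file: str,
                                config: Dict[str, Any], output: str) -> List[str]:
    """Build a `pixlet render` argv with positional key=value config params.

    Per the commit above, the working form is
    `pixlet render file.star key=value -o output`, not `-c key=value`.
    Non-string values (e.g. location objects) are JSON-serialized so each
    config param stays a single argument.
    """
    args = [pixlet_bin, "render", star_file]
    for key, value in config.items():
        if not isinstance(value, str):
            value = json.dumps(value)  # e.g. a {"lat": ..., "lng": ...} object
        args.append(f"{key}={value}")
    args += ["-o", output]
    return args
```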
602 lines
22 KiB
Python
"""
|
|
Tronbyte Repository Module
|
|
|
|
Handles interaction with the Tronbyte apps repository on GitHub.
|
|
Fetches app listings, metadata, and downloads .star files.
|
|
"""
|
|
|
|
import logging
|
|
import time
|
|
import requests
|
|
import yaml
|
|
import threading
|
|
from typing import Dict, Any, Optional, List, Tuple
|
|
from pathlib import Path
|
|
from concurrent.futures import ThreadPoolExecutor, as_completed
|
|
|
|
logger = logging.getLogger(__name__)

# Module-level cache for bulk app listing (survives across requests)
_apps_cache = {'data': None, 'timestamp': 0, 'categories': [], 'authors': []}
_CACHE_TTL = 7200  # 2 hours
_cache_lock = threading.Lock()


class TronbyteRepository:
    """
    Interface to the Tronbyte apps repository.

    Provides methods to:
    - List available apps
    - Fetch app metadata
    - Download .star files
    - Parse manifest.yaml files
    """

    REPO_OWNER = "tronbyt"
    REPO_NAME = "apps"
    DEFAULT_BRANCH = "main"
    APPS_PATH = "apps"

    def __init__(self, github_token: Optional[str] = None):
        """
        Initialize the repository interface.

        Args:
            github_token: Optional GitHub personal access token for higher rate limits
        """
        self.github_token = github_token
        self.base_url = "https://api.github.com"
        self.raw_url = "https://raw.githubusercontent.com"

        self.session = requests.Session()
        if github_token:
            self.session.headers.update({
                'Authorization': f'token {github_token}'
            })
        self.session.headers.update({
            'Accept': 'application/vnd.github.v3+json',
            'User-Agent': 'LEDMatrix-Starlark-Plugin'
        })

    def _make_request(self, url: str, timeout: int = 10) -> Optional[Dict[str, Any]]:
        """
        Make a request to the GitHub API with error handling.

        Args:
            url: API URL to request
            timeout: Request timeout in seconds

        Returns:
            JSON response or None on error
        """
        try:
            response = self.session.get(url, timeout=timeout)

            if response.status_code == 403:
                # Rate limit exceeded
                logger.warning("[Tronbyte Repo] GitHub API rate limit exceeded")
                return None
            elif response.status_code == 404:
                logger.warning(f"[Tronbyte Repo] Resource not found: {url}")
                return None
            elif response.status_code != 200:
                logger.error(f"[Tronbyte Repo] GitHub API error: {response.status_code}")
                return None

            return response.json()

        except requests.Timeout:
            logger.error(f"[Tronbyte Repo] Request timeout: {url}")
            return None
        except requests.RequestException as e:
            logger.error(f"[Tronbyte Repo] Request error: {e}", exc_info=True)
            return None
        except (json.JSONDecodeError, ValueError) as e:
            logger.error(f"[Tronbyte Repo] JSON parse error for {url}: {e}", exc_info=True)
            return None

    def _fetch_raw_file(self, file_path: str, branch: Optional[str] = None, binary: bool = False):
        """
        Fetch raw file content from the repository.

        Args:
            file_path: Path to file in repository
            branch: Branch name (default: DEFAULT_BRANCH)
            binary: If True, return bytes; if False, return text

        Returns:
            File content as string/bytes, or None on error
        """
        branch = branch or self.DEFAULT_BRANCH
        url = f"{self.raw_url}/{self.REPO_OWNER}/{self.REPO_NAME}/{branch}/{file_path}"

        try:
            response = self.session.get(url, timeout=10)
            if response.status_code == 200:
                return response.content if binary else response.text
            else:
                logger.warning(f"[Tronbyte Repo] Failed to fetch raw file: {file_path} ({response.status_code})")
                return None
        except requests.Timeout:
            logger.error(f"[Tronbyte Repo] Timeout fetching raw file: {file_path}")
            return None
        except requests.RequestException as e:
            logger.error(f"[Tronbyte Repo] Network error fetching raw file {file_path}: {e}", exc_info=True)
            return None

    def list_apps(self) -> Tuple[bool, Optional[List[Dict[str, Any]]], Optional[str]]:
        """
        List all available apps in the repository.

        Returns:
            Tuple of (success, apps_list, error_message)
        """
        url = f"{self.base_url}/repos/{self.REPO_OWNER}/{self.REPO_NAME}/contents/{self.APPS_PATH}"

        data = self._make_request(url)
        if data is None:
            return False, None, "Failed to fetch repository contents"

        if not isinstance(data, list):
            return False, None, "Invalid response format"

        # Filter directories (apps)
        apps = []
        for item in data:
            if item.get('type') == 'dir':
                app_id = item.get('name')
                if app_id and not app_id.startswith('.'):
                    apps.append({
                        'id': app_id,
                        'path': item.get('path'),
                        'url': item.get('url')
                    })

        logger.info(f"Found {len(apps)} apps in repository")
        return True, apps, None

    def get_app_metadata(self, app_id: str) -> Tuple[bool, Optional[Dict[str, Any]], Optional[str]]:
        """
        Fetch metadata for a specific app.

        Reads the app's manifest.yaml file and parses it.

        Args:
            app_id: App identifier

        Returns:
            Tuple of (success, metadata_dict, error_message)
        """
        manifest_path = f"{self.APPS_PATH}/{app_id}/manifest.yaml"

        content = self._fetch_raw_file(manifest_path)
        if not content:
            return False, None, f"Failed to fetch manifest for {app_id}"

        try:
            metadata = yaml.safe_load(content)

            # Validate that metadata is a dict before mutating
            if not isinstance(metadata, dict):
                if metadata is None:
                    logger.warning(f"Manifest for {app_id} is empty or None, initializing empty dict")
                    metadata = {}
                else:
                    logger.error(f"Manifest for {app_id} is not a dict (got {type(metadata).__name__}), skipping")
                    return False, None, f"Invalid manifest format: expected dict, got {type(metadata).__name__}"

            # Enhance with app_id; any 'schema' key is already parsed from the YAML
            metadata['id'] = app_id

            return True, metadata, None

        except (yaml.YAMLError, TypeError) as e:
            logger.error(f"Failed to parse manifest for {app_id}: {e}")
            return False, None, f"Invalid manifest format: {e}"

    def list_apps_with_metadata(self, max_apps: Optional[int] = None) -> List[Dict[str, Any]]:
        """
        List all apps with their metadata.

        This is slower, as it fetches manifest.yaml for each app.

        Args:
            max_apps: Optional limit on the number of apps to fetch

        Returns:
            List of app metadata dictionaries
        """
        success, apps, error = self.list_apps()

        if not success:
            logger.error(f"Failed to list apps: {error}")
            return []

        if max_apps is not None:
            apps = apps[:max_apps]

        apps_with_metadata = []
        for app_info in apps:
            app_id = app_info['id']
            success, metadata, error = self.get_app_metadata(app_id)

            if success and metadata:
                # Merge basic info with metadata
                metadata.update({
                    'repository_path': app_info['path']
                })
                apps_with_metadata.append(metadata)
            else:
                # Add basic info even if the metadata fetch failed
                apps_with_metadata.append({
                    'id': app_id,
                    'name': app_id.replace('_', ' ').title(),
                    'summary': 'No description available',
                    'repository_path': app_info['path'],
                    'metadata_error': error
                })

        return apps_with_metadata

    def list_all_apps_cached(self) -> Dict[str, Any]:
        """
        Fetch ALL apps with metadata, using a module-level cache.

        On the first call (or after the cache TTL expires), fetches the
        directory listing via the GitHub API (1 call), then fetches all
        manifests in parallel via raw.githubusercontent.com (not rate-limited).
        Results are cached for 2 hours.

        Returns:
            Dict with keys: apps, categories, authors, count, cached
        """
        now = time.time()

        # Check cache with lock (read-only check)
        with _cache_lock:
            if _apps_cache['data'] is not None and (now - _apps_cache['timestamp']) < _CACHE_TTL:
                return {
                    'apps': _apps_cache['data'],
                    'categories': _apps_cache['categories'],
                    'authors': _apps_cache['authors'],
                    'count': len(_apps_cache['data']),
                    'cached': True
                }

        # Fetch directory listing (1 GitHub API call)
        success, app_dirs, error = self.list_apps()
        if not success or not app_dirs:
            logger.error(f"Failed to list apps for bulk fetch: {error}")
            return {'apps': [], 'categories': [], 'authors': [], 'count': 0, 'cached': False}

        logger.info(f"Bulk-fetching manifests for {len(app_dirs)} apps...")

        def fetch_one(app_info):
            """Fetch a single app's manifest (runs in the thread pool)."""
            app_id = app_info['id']
            manifest_path = f"{self.APPS_PATH}/{app_id}/manifest.yaml"
            content = self._fetch_raw_file(manifest_path)
            if content:
                try:
                    metadata = yaml.safe_load(content)
                    if not isinstance(metadata, dict):
                        metadata = {}
                    metadata['id'] = app_id
                    metadata['repository_path'] = app_info.get('path', '')
                    return metadata
                except (yaml.YAMLError, TypeError) as e:
                    logger.warning(f"Failed to parse manifest for {app_id}: {e}")
            # Fallback: minimal entry
            return {
                'id': app_id,
                'name': app_id.replace('_', ' ').replace('-', ' ').title(),
                'summary': 'No description available',
                'repository_path': app_info.get('path', ''),
            }

        # Parallel manifest fetches via raw.githubusercontent.com (high rate limit)
        apps_with_metadata = []
        with ThreadPoolExecutor(max_workers=5) as executor:
            futures = {executor.submit(fetch_one, info): info for info in app_dirs}
            for future in as_completed(futures):
                try:
                    result = future.result(timeout=30)
                    if result:
                        apps_with_metadata.append(result)
                except Exception as e:
                    app_info = futures[future]
                    logger.warning(f"Failed to fetch manifest for {app_info['id']}: {e}")
                    apps_with_metadata.append({
                        'id': app_info['id'],
                        'name': app_info['id'].replace('_', ' ').replace('-', ' ').title(),
                        'summary': 'No description available',
                        'repository_path': app_info.get('path', ''),
                    })

        # Sort by name for consistent ordering
        apps_with_metadata.sort(key=lambda a: (a.get('name') or a.get('id', '')).lower())

        # Extract unique categories and authors
        categories = sorted({a.get('category', '') for a in apps_with_metadata if a.get('category')})
        authors = sorted({a.get('author', '') for a in apps_with_metadata if a.get('author')})

        # Update cache with lock
        with _cache_lock:
            _apps_cache['data'] = apps_with_metadata
            _apps_cache['timestamp'] = now
            _apps_cache['categories'] = categories
            _apps_cache['authors'] = authors

        logger.info(f"Cached {len(apps_with_metadata)} apps ({len(categories)} categories, {len(authors)} authors)")

        return {
            'apps': apps_with_metadata,
            'categories': categories,
            'authors': authors,
            'count': len(apps_with_metadata),
            'cached': False
        }

    def download_star_file(self, app_id: str, output_path: Path, filename: Optional[str] = None) -> Tuple[bool, Optional[str]]:
        """
        Download the .star file for an app.

        Args:
            app_id: App identifier (directory name)
            output_path: Where to save the .star file
            filename: Optional specific filename from the manifest
                (e.g., "analog_clock.star"). If not provided, assumes {app_id}.star.

        Returns:
            Tuple of (success, error_message)
        """
        # Validate inputs for path traversal
        if '..' in app_id or '/' in app_id or '\\' in app_id:
            return False, "Invalid app_id: contains path traversal characters"

        star_filename = filename or f"{app_id}.star"
        if '..' in star_filename or '/' in star_filename or '\\' in star_filename:
            return False, "Invalid filename: contains path traversal characters"

        # Validate output_path to prevent path traversal
        import tempfile
        try:
            resolved_output = output_path.resolve()
            temp_dir = Path(tempfile.gettempdir()).resolve()

            # Check whether output_path is within the system temp directory.
            # Use try/except for compatibility with Python < 3.9 (is_relative_to).
            try:
                is_safe = resolved_output.is_relative_to(temp_dir)
            except AttributeError:
                # Fallback for Python < 3.9: compare string paths
                is_safe = str(resolved_output).startswith(str(temp_dir) + '/')

            if not is_safe:
                logger.warning(f"Path traversal attempt in download_star_file: app_id={app_id}, output_path={output_path}")
                return False, f"Invalid output_path for {app_id}: must be within temp directory"
        except Exception as e:
            logger.error(f"Error validating output_path for {app_id}: {e}")
            return False, f"Invalid output_path for {app_id}"

        # Use the provided filename or fall back to {app_id}.star
        star_path = f"{self.APPS_PATH}/{app_id}/{star_filename}"

        content = self._fetch_raw_file(star_path)
        if not content:
            return False, f"Failed to download .star file for {app_id} (tried {star_filename})"

        try:
            output_path.parent.mkdir(parents=True, exist_ok=True)
            with open(output_path, 'w', encoding='utf-8') as f:
                f.write(content)

            logger.info(f"Downloaded {star_filename} to {output_path}")
            return True, None

        except OSError as e:
            logger.exception(f"Failed to save .star file: {e}")
            return False, f"Failed to save file: {e}"

    def get_app_files(self, app_id: str) -> Tuple[bool, Optional[List[str]], Optional[str]]:
        """
        List all files in an app directory.

        Args:
            app_id: App identifier

        Returns:
            Tuple of (success, file_list, error_message)
        """
        url = f"{self.base_url}/repos/{self.REPO_OWNER}/{self.REPO_NAME}/contents/{self.APPS_PATH}/{app_id}"

        data = self._make_request(url)
        if not data:
            return False, None, "Failed to fetch app files"

        if not isinstance(data, list):
            return False, None, "Invalid response format"

        files = [item['name'] for item in data if item.get('type') == 'file']
        return True, files, None
    def download_app_assets(self, app_id: str, output_dir: Path) -> Tuple[bool, Optional[str]]:
        """
        Download all asset files (images, sources, etc.) for an app.

        Args:
            app_id: App identifier
            output_dir: Directory to save assets to

        Returns:
            Tuple of (success, error_message)
        """
        # Validate app_id for path traversal
        if '..' in app_id or '/' in app_id or '\\' in app_id:
            return False, "Invalid app_id: contains path traversal characters"

        try:
            # Get directory listing for the app
            url = f"{self.base_url}/repos/{self.REPO_OWNER}/{self.REPO_NAME}/contents/{self.APPS_PATH}/{app_id}"
            data = self._make_request(url)
            if not data:
                return False, "Failed to fetch app directory listing"

            if not isinstance(data, list):
                return False, "Invalid directory listing format"

            # Find directories that contain assets (images, sources, etc.)
            asset_dirs = []
            for item in data:
                if item.get('type') == 'dir':
                    dir_name = item.get('name')
                    # Common asset directory names in Tronbyte apps
                    if dir_name in ('images', 'sources', 'fonts', 'assets'):
                        asset_dirs.append((dir_name, item.get('url')))

            if not asset_dirs:
                # No asset directories; nothing to download
                return True, None

            # Download each asset directory
            for dir_name, dir_url in asset_dirs:
                # Validate directory name for path traversal
                if '..' in dir_name or '/' in dir_name or '\\' in dir_name:
                    logger.warning(f"Skipping potentially unsafe directory: {dir_name}")
                    continue

                # Get files in this directory
                dir_data = self._make_request(dir_url)
                if not dir_data or not isinstance(dir_data, list):
                    logger.warning(f"Could not list files in {app_id}/{dir_name}")
                    continue

                # Create local directory
                local_dir = output_dir / dir_name
                local_dir.mkdir(parents=True, exist_ok=True)

                # Download each file
                for file_item in dir_data:
                    if file_item.get('type') == 'file':
                        file_name = file_item.get('name')

                        # Ensure file_name is a non-empty string before validation
                        if not file_name or not isinstance(file_name, str):
                            logger.warning(f"Skipping file with invalid name in {dir_name}: {file_item}")
                            continue

                        # Validate filename for path traversal
                        if '..' in file_name or '/' in file_name or '\\' in file_name:
                            logger.warning(f"Skipping potentially unsafe file: {file_name}")
                            continue

                        file_path = f"{self.APPS_PATH}/{app_id}/{dir_name}/{file_name}"
                        content = self._fetch_raw_file(file_path, binary=True)
                        if content:
                            # Write binary content to file
                            output_path = local_dir / file_name
                            try:
                                with open(output_path, 'wb') as f:
                                    f.write(content)
                                logger.debug(f"[Tronbyte Repo] Downloaded asset: {dir_name}/{file_name}")
                            except OSError as e:
                                logger.warning(f"[Tronbyte Repo] Failed to save {dir_name}/{file_name}: {e}", exc_info=True)
                        else:
                            logger.warning(f"Failed to download {dir_name}/{file_name}")

            logger.info(f"[Tronbyte Repo] Downloaded assets for {app_id} ({len(asset_dirs)} directories)")
            return True, None

        except (OSError, ValueError) as e:
            logger.exception(f"[Tronbyte Repo] Error downloading assets for {app_id}: {e}")
            return False, f"Error downloading assets: {e}"

    def search_apps(self, query: str, apps_with_metadata: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
        """
        Search apps by name, summary, or description.

        Args:
            query: Search query string
            apps_with_metadata: List of apps with metadata

        Returns:
            Filtered list of apps matching query
        """
        if not query:
            return apps_with_metadata

        query_lower = query.lower()
        results = []

        for app in apps_with_metadata:
            # Search in name, summary, description, author, and id
            searchable = ' '.join([
                app.get('name', ''),
                app.get('summary', ''),
                app.get('desc', ''),
                app.get('author', ''),
                app.get('id', '')
            ]).lower()

            if query_lower in searchable:
                results.append(app)

        return results

    def filter_by_category(self, category: str, apps_with_metadata: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
        """
        Filter apps by category.

        Args:
            category: Category name (or 'all' for no filtering)
            apps_with_metadata: List of apps with metadata

        Returns:
            Filtered list of apps
        """
        if not category or category.lower() == 'all':
            return apps_with_metadata

        category_lower = category.lower()
        results = []

        for app in apps_with_metadata:
            app_category = app.get('category', '').lower()
            if app_category == category_lower:
                results.append(app)

        return results

    def get_rate_limit_info(self) -> Dict[str, Any]:
        """
        Get current GitHub API rate limit information.

        Returns:
            Dictionary with rate limit info
        """
        url = f"{self.base_url}/rate_limit"
        data = self._make_request(url)

        if data:
            core = data.get('resources', {}).get('core', {})
            return {
                'limit': core.get('limit', 0),
                'remaining': core.get('remaining', 0),
                'reset': core.get('reset', 0),
                'used': core.get('used', 0)
            }

        return {
            'limit': 0,
            'remaining': 0,
            'reset': 0,
            'used': 0
        }