mirror of
https://github.com/ChuckBuilds/LEDMatrix.git
synced 2026-05-16 02:13:32 +00:00
perf(plugins): dramatically speed up plugin manager tab load time (#333)
* fix(cache): check odds keys before generic live check in get_data_type_from_key

  Cache keys like odds_espn_basketball_nba_<id>_live contain both 'odds' and 'live'. The previous ordering matched the generic 'live' check first, returning 'sports_live' (30 s TTL) instead of the correct 'odds_live' (120 s TTL). This caused the ESPN odds API to be hit every 30 s per live game, frequently triggering the 3-second per-request timeout and returning no odds data. Moving the 'odds' check above the generic 'live' block restores the correct 120-second cache TTL for in-progress game odds.

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix(display): use single-quoted HTML attributes for JSON hidden inputs

  Placing |tojson output (which contains double quotes) inside a double-quoted HTML attribute broke the attribute — browsers closed the attribute at the first inner quote, leaving JS with an empty or truncated value. JSON.parse then failed silently, leaving excluded=[] so all Vegas scroll plugins appeared checked (included) regardless of the actual excluded_plugins config. Switch to single-quoted HTML attributes so the JSON double quotes are valid inside the attribute value.

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* perf(plugins): dramatically speed up plugin manager tab load time

  ## Problem

  The Plugins tab loaded slowly and inconsistently (5–30s depending on cache state), with a blank spinner for the entire wait. Three root causes:

  1. **N+1 subprocess per installed plugin** — `_get_local_git_info` ran 4 separate git subprocesses per plugin (rev-parse HEAD, abbrev-ref, config --get remote.origin.url, log --format=%cI). With 15 plugins that's 60 blocking subprocess spawns before the endpoint returned.
  2. **Serial per-plugin loop** — the `/plugins/installed` endpoint processed each plugin sequentially: manifest read → git info → instance lookup → Vegas mode query, one plugin at a time.
  3. **Serial JS loading** — the store search only started after installed plugins fully completed, so users waited for both round-trips back to back. No UI feedback during the wait.

  ## Changes

  ### Backend — src/plugin_system/store_manager.py

  - Consolidate 4 git subprocesses → 1: branch read from `.git/HEAD` (file I/O, no subprocess), remote URL parsed from `.git/config` (file I/O, no subprocess), SHA + commit date fetched together in a single `git log -1 --format=%H%n%cI` call
  - Existing signature-based cache already eliminates all subprocesses on warm hits; this change cuts cold-cache cost from 4 → 1 per plugin

  ### Backend — web_interface/blueprints/api_v3.py

  - Wrap per-plugin work in a `_build_plugin_entry()` helper and execute it across a `ThreadPoolExecutor(max_workers=8)` so all plugins are processed in parallel instead of sequentially
  - Fix double `get_plugin()` call per plugin (was called once for the enabled fallback and again for Vegas mode — now one shared call)

  ### Frontend — web_interface/static/v3/plugins_manager.js

  - Fire `searchPluginStore()` and `loadInstalledPlugins()` simultaneously instead of waiting for installed to complete before starting the store
  - After installed data arrives, call `applyStoreFiltersAndSort(true)` to refresh install/update/reinstall badges from already-cached store data (instant, no extra network call)

  ### Frontend — web_interface/templates/v3/partials/plugins.html

  - Add responsive skeleton cards to the installed plugins section that match real card proportions (removed automatically when data renders)
  - Replace the 5 featureless gray boxes in the store skeleton with 10 structured skeleton cards matching the real card layout

  ## Measured improvement on Pi 4 (11 installed plugins, ledpi-ticker)

  | Scenario | Before | After |
  |---|---|---|
  | Cold cache (first open) | ~8–15s | **0.9s** |
  | Warm cache (git cache hit) | ~1–2s | **55ms** |
  | UI feedback during load | blank spinner | skeleton cards |
  | Store waits for installed | yes (serial) | no (parallel) |

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

* fix(plugins): harden git metadata parsing and plugin entry building

  store_manager.py:
  - Detect worktree/submodule .git files (gitdir: <path>) and resolve to the actual git directory before reading HEAD or config
  - Wrap HEAD read_text in try/except OSError/NotADirectoryError so atypical repos return None instead of propagating exceptions
  - Guard config url line split with '=' presence check to avoid IndexError on malformed lines

  api_v3.py:
  - Wrap _build_plugin_entry body in a try/except via a thin outer wrapper so a single plugin's failure doesn't 500 the whole endpoint; failed entries return None and are filtered by the existing [r for r in results if r is not None] step
  - Narrow manifest except clause to FileNotFoundError, PermissionError, json.JSONDecodeError instead of bare Exception
  - Validate manifest is a dict before calling plugin_info.update() and log a debug message when it isn't

  Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>

---------

Co-authored-by: Claude Sonnet 4.6 <noreply@anthropic.com>
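The key-ordering bug in the first fix can be illustrated with a minimal sketch. The function name matches the commit, but this body is a hypothetical stand-in, not the project's actual cache-layer implementation, which maps more key families than shown here:

```python
# Hypothetical sketch of the TTL lookup described in the fix.
ODDS_LIVE_TTL = 120   # seconds, per the commit message
SPORTS_LIVE_TTL = 30  # seconds, per the commit message

def get_data_type_from_key(key: str) -> str:
    # The specific 'odds' check must run before the generic 'live' check:
    # keys like odds_espn_basketball_nba_<id>_live contain both markers,
    # and matching 'live' first wrongly returned the 30 s 'sports_live' type.
    if key.startswith('odds_'):
        return 'odds_live' if key.endswith('_live') else 'odds'
    if key.endswith('_live'):
        return 'sports_live'
    return 'generic'
```

With this ordering, a live-odds key resolves to 'odds_live' (120 s TTL) while a plain live-score key still resolves to 'sports_live' (30 s TTL).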
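The quoting bug in the second fix is easy to reproduce outside a browser. In this sketch, a regex stands in for how a browser terminates an attribute at the first matching quote; the input id and plugin names are made up:

```python
import json
import re

excluded = ["plugin_a", "plugin_b"]
payload = json.dumps(excluded)  # contains double quotes: ["plugin_a", "plugin_b"]

# Double-quoted attribute: the value terminates at the first inner quote,
# so the page's JS receives only '[' and JSON.parse fails.
bad = f'<input type="hidden" id="excluded-plugins" value="{payload}">'
truncated = re.search(r'value="([^"]*)"', bad).group(1)

# Single-quoted attribute: the JSON's double quotes are legal inside it,
# so the full payload survives and parses back to the original list.
good = f"<input type='hidden' id='excluded-plugins' value='{payload}'>"
intact = re.search(r"value='([^']*)'", good).group(1)
```

Here `truncated` is just `'['`, while `intact` round-trips through `json.loads` back to the original excluded list.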
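The parallel endpoint change can be sketched like this. The `max_workers=8`, the None-on-failure wrapper, and the `[r for r in results if r is not None]` filter come from the commit; the toy `_build_plugin_entry` body and plugin names are assumptions:

```python
from concurrent.futures import ThreadPoolExecutor
from typing import Optional

def _build_plugin_entry(plugin_id: str) -> Optional[dict]:
    """Toy stand-in for the real per-plugin work (manifest read, git info,
    instance lookup, Vegas mode query). The thin try/except wrapper returns
    None on failure so one bad plugin cannot 500 the whole endpoint."""
    try:
        if plugin_id == 'broken_plugin':
            raise RuntimeError('simulated per-plugin failure')
        return {'id': plugin_id, 'enabled': True}
    except Exception:
        return None

plugin_ids = ['clock', 'weather', 'broken_plugin', 'stocks']

# Process every plugin in parallel instead of one at a time.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(_build_plugin_entry, plugin_ids))

# Failed entries come back as None and are filtered out, as in the commit.
entries = [r for r in results if r is not None]
```

Because `pool.map` preserves input order, the surviving entries stay in the same order the plugins were listed.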
src/plugin_system/store_manager.py

```diff
@@ -1850,58 +1850,72 @@ class PluginStoreManager:
             return cached[1]
 
         try:
-            sha_result = subprocess.run(
-                ['git', '-C', str(plugin_path), 'rev-parse', 'HEAD'],
+            # .git may be a file (worktree / submodule) containing "gitdir: <path>".
+            # Resolve it to the actual git directory before reading any files.
+            try:
+                if git_dir.is_file():
+                    pointer = git_dir.read_text(encoding='utf-8', errors='replace').strip()
+                    if pointer.startswith('gitdir:'):
+                        resolved = (plugin_path / pointer[len('gitdir:'):].strip()).resolve()
+                        if resolved.is_dir():
+                            git_dir = resolved
+                        else:
+                            return None
+                    else:
+                        return None
+            except (OSError, NotADirectoryError):
+                return None
+
+            # Read branch directly from .git/HEAD (no subprocess).
+            branch = ''
+            try:
+                head_text = (git_dir / 'HEAD').read_text(encoding='utf-8', errors='replace').strip()
+                if head_text.startswith('ref: refs/heads/'):
+                    branch = head_text[len('ref: refs/heads/'):]
+                elif head_text.startswith('ref: '):
+                    branch = head_text[len('ref: '):]
+                # else: detached HEAD — branch stays ''
+            except (OSError, NotADirectoryError):
+                pass
+
+            # Remote URL from .git/config — parse [remote "origin"] url line.
+            remote_url = None
+            try:
+                config_text = (git_dir / 'config').read_text(encoding='utf-8', errors='replace')
+                in_origin = False
+                for line in config_text.splitlines():
+                    stripped = line.strip()
+                    if stripped == '[remote "origin"]':
+                        in_origin = True
+                    elif stripped.startswith('['):
+                        in_origin = False
+                    elif in_origin and stripped.startswith('url') and '=' in stripped:
+                        remote_url = stripped.split('=', 1)[1].strip()
+                        break
+            except (OSError, NotADirectoryError):
+                pass
+
+            # Single subprocess: SHA + commit date in one call.
+            log_result = subprocess.run(
+                ['git', '-C', str(plugin_path), 'log', '-1', '--format=%H%n%cI', 'HEAD'],
                 capture_output=True,
                 text=True,
                 timeout=10,
                 check=True
             )
-            sha = sha_result.stdout.strip()
-
-            branch_result = subprocess.run(
-                ['git', '-C', str(plugin_path), 'rev-parse', '--abbrev-ref', 'HEAD'],
-                capture_output=True,
-                text=True,
-                timeout=10,
-                check=True
-            )
-            branch = branch_result.stdout.strip()
-
-            if branch == 'HEAD':
-                branch = ''
-
-            # Get remote URL
-            remote_url_result = subprocess.run(
-                ['git', '-C', str(plugin_path), 'config', '--get', 'remote.origin.url'],
-                capture_output=True,
-                text=True,
-                timeout=10,
-                check=False
-            )
-            remote_url = remote_url_result.stdout.strip() if remote_url_result.returncode == 0 else None
-
-            # Get commit date in ISO format
-            date_result = subprocess.run(
-                ['git', '-C', str(plugin_path), 'log', '-1', '--format=%cI', 'HEAD'],
-                capture_output=True,
-                text=True,
-                timeout=10,
-                check=True
-            )
-            commit_date_iso = date_result.stdout.strip()
+            lines = log_result.stdout.strip().splitlines()
+            sha = lines[0] if lines else ''
+            commit_date_iso = lines[1] if len(lines) > 1 else ''
 
             result = {
                 'sha': sha,
                 'short_sha': sha[:7] if sha else '',
-                'branch': branch
+                'branch': branch,
             }
 
             # Add remote URL if available
             if remote_url:
                 result['remote_url'] = remote_url
 
             # Add commit date if available
             if commit_date_iso:
                 result['date_iso'] = commit_date_iso
                 result['date'] = self._iso_to_date(commit_date_iso)
```
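The `[remote "origin"]` config parsing shown in the diff above can be exercised standalone. In this sketch the helper name `parse_origin_url` and the sample config text are made up; the loop body mirrors the diff:

```python
def parse_origin_url(config_text: str):
    """Extract the [remote "origin"] url from .git/config text using the
    section-scoped line scan from the diff above. Returns None if no
    origin url line is found."""
    in_origin = False
    for line in config_text.splitlines():
        stripped = line.strip()
        if stripped == '[remote "origin"]':
            in_origin = True
        elif stripped.startswith('['):
            # Any other section header ends the origin scope.
            in_origin = False
        elif in_origin and stripped.startswith('url') and '=' in stripped:
            # '=' presence is checked first to avoid IndexError on malformed lines.
            return stripped.split('=', 1)[1].strip()
    return None

sample = """\
[core]
\trepositoryformatversion = 0
[remote "origin"]
\turl = https://github.com/ChuckBuilds/LEDMatrix.git
\tfetch = +refs/heads/*:refs/remotes/origin/*
[branch "main"]
\tremote = origin
"""
```

A url line outside the `[remote "origin"]` section is ignored, which is the point of the `in_origin` flag.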
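The single `git log -1 --format=%H%n%cI HEAD` call from the diff above emits the SHA and the ISO commit date on two lines. A sketch of parsing that output (the SHA and date values here are fabricated for illustration):

```python
# Simulated stdout of `git log -1 --format=%H%n%cI HEAD`:
# full SHA on line 1, ISO-8601 committer date on line 2.
stdout = "3f9c2d1e8b7a6c5d4e3f2a1b0c9d8e7f6a5b4c3d\n2026-05-15T18:02:41+00:00\n"

lines = stdout.strip().splitlines()
sha = lines[0] if lines else ''                          # degrades to '' on empty output
commit_date_iso = lines[1] if len(lines) > 1 else ''     # degrades to '' if date missing
short_sha = sha[:7] if sha else ''
```

The guarded indexing matches the diff: empty or one-line output produces empty strings instead of an IndexError.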