Complete User Guide

Searchable, structured training for readers, crawlers, and AI assistants.


Analytics & Dashboard · Beginner-friendly walkthrough

Sections: 2 guided blocks · Read time: 5 min focused read · Coverage: 206 searchable doc sections

Tags: dashboard, analytics, tutorial, operations, governance, home, signals, pinning

Section 1 of 2

How to Use Analytics and Dashboard as an Operating Rhythm

Tags: dashboard, analytics, tutorial, operations, governance

The dashboard is not only for observing the platform. It is where you decide what needs attention next. Cost, adoption, drift, workspaces, and governed release signals should help you choose actions, not just admire charts.

Daily triage

Check for broken runs, unusual spend, blocked approvals, or obvious workflow friction.

Weekly review

Look for drift, low-adoption teams, stale workspaces, and prompt families that deserve benchmark refresh.

Monthly ROI view

Translate usage and automation into business value, savings, and investment decisions leaders can understand.

Governance layer

Use workspace evidence, approvals, and BOM views to decide whether important outputs are actually release-ready.

Daily: catch anomalies, failed launches, blocked reviews, and urgent cost spikes.

Weekly: review quality drift, adoption gaps, and the top prompts or workflows that may need re-benchmarking.

Monthly: review ROI, credit usage patterns, and whether the current mix of tools still matches team priorities.
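The three cadences above can be sketched as a simple checklist structure. This is an illustrative sketch only; the check names are hypothetical stand-ins, not actual platform fields or metrics.

```python
# Illustrative sketch: the daily/weekly/monthly review cadence as data.
# Check names are hypothetical examples, not actual platform metrics.
REVIEW_CADENCE = {
    "daily": ["failed_runs", "cost_spikes", "blocked_approvals"],
    "weekly": ["quality_drift", "adoption_gaps", "stale_benchmarks"],
    "monthly": ["roi_summary", "credit_usage", "tool_mix_fit"],
}

def checklist(cadence: str) -> list[str]:
    """Return the review actions for a given cadence; empty if unknown."""
    return [f"review {check}" for check in REVIEW_CADENCE.get(cadence, [])]
```

Keeping the cadence explicit like this makes it easy to see at a glance which signals belong to which review, instead of re-deciding each time.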

Step 1: Start with exceptions, not averages

Look first for red flags, spikes, blocked approvals, or weak records. Averages can hide the items that actually need attention.
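A tiny numeric sketch of why exception-first beats average-first; the error rates are made up for illustration.

```python
# Why averages hide problems: a single failing day can vanish in a mean.
# Numbers are invented for illustration only.
daily_error_rates = [0.01, 0.02, 0.01, 0.35, 0.02, 0.01, 0.02]  # one bad day

# The weekly average looks mild and would pass a casual glance.
average = sum(daily_error_rates) / len(daily_error_rates)

# Exception-first triage: flag days far above the typical level instead.
THRESHOLD = 0.10
flagged = [rate for rate in daily_error_rates if rate > THRESHOLD]
```

The average stays comfortably under the threshold even though one day clearly failed; only the exception scan surfaces it.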

Step 2: Review cost in business context

Ask whether increased spend came from valuable work, rework, or experimentation without discipline.

Step 3: Check drift and prompt health

If a high-use prompt starts degrading, fix or benchmark it before users quietly create workarounds.
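One common way to notice this degradation early is to compare a recent window of quality scores against a baseline. The function below is a hypothetical sketch of that idea, not the platform's actual drift detector; the scores, window size, and tolerance are assumptions.

```python
# Hypothetical drift check: flag a prompt when the mean of its most
# recent quality scores falls meaningfully below an agreed baseline.
from statistics import mean

def is_drifting(scores, baseline, window=5, tolerance=0.05):
    """True when the recent-window mean drops below baseline - tolerance."""
    recent = scores[-window:]
    return mean(recent) < baseline - tolerance

# Example: a prompt that was benchmarked at ~0.90 and is slipping.
scores = [0.92, 0.91, 0.93, 0.88, 0.84, 0.81, 0.79, 0.78]
```

Catching the slide while scores are still in the high 0.70s gives time to fix or re-benchmark before users route around the prompt.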

Step 4: Review adoption honestly

Low adoption might mean a training gap, a confusing workflow, or a service that does not fit the job as currently configured.

Step 5: Inspect workspace evidence quality

Look for weak provenance, missing reviewers, and decisions that are not properly linked to supporting artifacts.

Step 6: Turn findings into follow-up

Every dashboard review should end with actions: retrain, benchmark, archive, tighten billing, or open review loops.

Do This

Use the dashboard to choose the next intervention, not just to monitor historical data.

Link cost spikes back to prompts, teams, or workflows so the response is specific.

Review workspace governance signals on the same cadence as quality and spend.

Keep one explicit list of actions created by the dashboard review.

Avoid

Do not treat a good monthly average as proof there are no urgent problems.

Do not optimize spend without checking the effect on output quality and team throughput.

Do not let adoption charts become blame charts; use them to target enablement.

Do not approve releases from weak workspace records just because the output looks polished.

Section 2 of 2

Home Dashboard Signals, Personalization, and Admin Access

Tags: dashboard, home, signals, pinning, admin

The home dashboard is designed to surface one clear next move while keeping the rest of the platform available but quieter. Its job is not to show every metric at once. Its job is to combine status, shortcuts, recommendations, and recent evidence into a usable operating picture.

Status board

The hero cards summarize workspace, support runway, AI credit capacity, and recent trail so you can orient yourself quickly.

Primary actions

The home page proposes immediate actions such as resuming the latest workspace, opening the suggested service, browsing the service hub, or entering admin areas when permitted.

Recommendations and shortcuts

Recommended services can be pinned, recently opened services stay close, and shortcuts become a personal dock for repeated work.

Operating picture

Plan runway, team footprint, projects, expert credits, recent output, and the optional progress layer round out the wider context.

Workspace: the latest project room and collaborator count.

Support: upcoming consultation state plus consultation and mentoring credits.

AI capacity: current AI credits.

Recent trail: recent history items such as Prompt Architect output, Academy activity, or other service runs.

Step 1: Read the hero first

Start with workspace, support, AI capacity, and recent trail before opening deeper areas.

Step 2: Take the suggested next move

Use the recommended action if it matches the real job in front of you. If not, open the services hub intentionally.

Step 3: Pin only what earns the space

Reserve pins for truly repeated work so the dock stays useful instead of noisy.

Step 4: Use recent output as evidence

The recent trail is valuable because it points back to real generated work, not just navigation clicks.

Pro Tip: A good dashboard is selective

If the home view feels busy, hide sections before you ignore the whole page. The customization controls exist so the dashboard stays useful at different maturity levels.

Academy v4.0 · Interactive Documentation · Beginner Mode