Step Replays let teams inspect a recorded interaction timeline with full context: UI state, console logs, and network data. This is a core feature of Samelogic’s element intelligence platform.

Purpose

  • Reproduce bugs with a deterministic sequence of steps.
  • Pinpoint the exact moment a regression occurs.
  • Share replay evidence across teams.
  • Build institutional knowledge of how elements behave over time.

Primary users

  • QA engineers verifying fixes.
  • Developers diagnosing UI regressions.
  • Support teams attaching evidence to tickets.
  • Product managers reviewing user interaction patterns.

Plan limits

Step Replay quotas vary by subscription tier:

Plan                 Step Replays/month   Retention
Free                 15                   3 months
Starter ($99/mo)     2,000                1 year
Pro ($199/mo)        50,000               3 years
Enterprise           Unlimited            Custom

Entry points

  • Bug report detail view
  • Project activity feed
  • Element detail links to related replays
  • Chrome Extension recording preview

Core replay flow

  1. Open a replay session from a report or element.
  2. Review the timeline and step list.
  3. Scrub the timeline to inspect state changes.
  4. Inspect console, network, and DOM context.
  5. Share or export replay metadata.

CI workflow (GitHub Actions)

Step Replays now include a CI workflow so QA can move from “recorded issue” to “PR-visible validation” without manual handoff.

CI workspace

  • Open each replay in a dedicated CI workspace.
  • Choose a validation profile (smoke, regression, or auth).
  • Trigger Run in CI directly from the replay context.
  • Track run status and history from the same screen.

Project-level CI configuration

  • Configure GitHub repository mapping per project.
  • Select default workflow file and branch/ref target.
  • Set allowed branches to limit where replay validation can run.
  • Optionally require approval for protected branches.
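Under the hood, triggering "Run in CI" amounts to dispatching the project's mapped GitHub Actions workflow while respecting the allowed-branches policy. The sketch below shows one way that request could be composed against GitHub's real `workflow_dispatch` REST endpoint; all type and function names (`ProjectCiConfig`, `buildDispatchRequest`, the `replay_id` and `profile` inputs) are illustrative assumptions, not Samelogic's actual API.

```typescript
// Hypothetical sketch: compose a GitHub workflow_dispatch request for a
// replay validation run, enforcing the project's allowed-branches policy.
interface ProjectCiConfig {
  owner: string;             // GitHub org or user from the repository mapping
  repo: string;              // mapped repository name
  workflowFile: string;      // default workflow file, e.g. "replay-validation.yml"
  defaultRef: string;        // default branch/ref target
  allowedBranches: string[]; // branches where replay validation may run
}

interface DispatchRequest {
  url: string;
  body: { ref: string; inputs: Record<string, string> };
}

function buildDispatchRequest(
  cfg: ProjectCiConfig,
  replayId: string,
  profile: "smoke" | "regression" | "auth",
  ref: string = cfg.defaultRef,
): DispatchRequest {
  if (!cfg.allowedBranches.includes(ref)) {
    // This is the case that would surface as "Policy blocked" in run history.
    throw new Error(`Policy blocked: ref "${ref}" is not an allowed branch`);
  }
  return {
    // GitHub's documented workflow_dispatch endpoint shape
    url: `https://api.github.com/repos/${cfg.owner}/${cfg.repo}/actions/workflows/${cfg.workflowFile}/dispatches`,
    body: { ref, inputs: { replay_id: replayId, profile } },
  };
}
```

Validating the ref before dispatch (rather than letting GitHub reject it) is what lets a run fail fast with a policy status instead of a generic dispatch error.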

Replay artifact bundle

Each CI run can consume a generated replay artifact bundle that includes:
  • Playwright replay test scaffold
  • Replay and step metadata
  • Comments and step log data
  • Screenshot manifest and available screenshot files
  • Optional rrweb event log
  • PR comment draft markdown
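The bundle contents listed above can be pictured as a manifest a CI job reads before running. The field names in this sketch are assumptions for illustration, not the documented schema:

```typescript
// Hypothetical shape of the replay artifact bundle manifest.
interface ReplayArtifactBundle {
  replay: { id: string; url: string; recordedAt: string };
  steps: Array<{ index: number; type: string; selector?: string; timestampMs: number }>;
  comments: Array<{ author: string; timestampMs: number; body: string }>;
  screenshots: { manifest: string[]; available: string[] }; // listed vs actually present files
  rrwebEvents?: string;        // optional rrweb event log path
  playwrightScaffold: string;  // path to the generated Playwright test scaffold
  prCommentDraft: string;      // path to the PR comment draft markdown
}

// Minimal guard a CI job might run before consuming a bundle.
function isBundleUsable(b: ReplayArtifactBundle): boolean {
  return b.steps.length > 0 && b.playwrightScaffold.length > 0;
}
```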

Run lifecycle tracking

Replay validation runs are tracked from dispatch to completion with statuses such as:
  • Queued
  • Running
  • Passed / Failed
  • Dispatch failed / Policy blocked
Runs can publish links back to CI logs and pull request check context.
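The lifecycle above can be sketched as a small status state machine. The transition map here is an assumption inferred from the statuses listed, not Samelogic's documented model:

```typescript
// Illustrative run-status transitions: queued runs either start or fail to
// dispatch; running runs end in passed or failed; everything else is terminal.
type RunStatus =
  | "queued" | "running" | "passed" | "failed"
  | "dispatch_failed" | "policy_blocked";

const transitions: Record<RunStatus, RunStatus[]> = {
  queued: ["running", "dispatch_failed", "policy_blocked"],
  running: ["passed", "failed"],
  passed: [],          // terminal
  failed: [],          // terminal
  dispatch_failed: [], // terminal
  policy_blocked: [],  // terminal
};

function canTransition(from: RunStatus, to: RunStatus): boolean {
  return transitions[from].includes(to);
}
```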

Page structure (web app)

1. Replay header

  • Session metadata (URL, user, timestamp)
  • Status and environment badges
  • Share and export controls

2. Playback viewport

  • Recorded UI playback with controls (rrweb-based)
  • Speed selector (0.5x, 1x, 1.5x, 2x) and pause/resume
  • Fullscreen mode for detailed inspection
  • Screenshot fallbacks for missing frames

3. Timeline and steps

  • Interactive timeline with event chips (click, scroll, focus, mutation, navigation, error)
  • Playhead scrubbing with drag support
  • Step list panel with timestamped events
  • Auto-scroll to active event during playback
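Auto-scrolling the step list while scrubbing reduces to finding the last step whose timestamp is at or before the playhead. A minimal sketch, assuming a sorted step list (the `Step` shape and function name are illustrative):

```typescript
interface Step { timestampMs: number; label: string }

// Binary search for the last step starting at or before the playhead,
// so the step panel can highlight and scroll to the active event.
function activeStepIndex(steps: Step[], playheadMs: number): number {
  let lo = 0, hi = steps.length - 1, ans = -1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;
    if (steps[mid].timestampMs <= playheadMs) { ans = mid; lo = mid + 1; }
    else hi = mid - 1;
  }
  return ans; // -1 when the playhead is before the first step
}
```

Binary search keeps drag-scrubbing cheap even for long sessions, since each playhead move costs O(log n) rather than a full scan.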

4. Diagnostics panel

  • Console logs with severity filters
  • Network requests summary
  • DOM and selector context snapshot
  • Error highlighting with stack traces

Timeline event types

The timeline captures and visualizes these interaction types:
  • Click: Mouse clicks and double-clicks
  • Focus/Blur: Element focus transitions
  • Scroll: Page and element scrolling
  • Mutation: DOM changes and UI updates
  • Navigation: Page navigations and URL changes
  • Error: Console errors and exceptions
Each event type has distinct colors and icons for quick identification.

Replay triage taxonomy

Replay comments and timeline pills use the CRO issue taxonomy:
  • Broken Experience
  • UX Friction
  • Unclear Messaging
  • Weak CTA
  • Trust Gap
  • Form Friction
  • Performance/Speed
  • Mobile Usability
  • Tracking Gap
  • No Issue
Parser compatibility keeps legacy replay labels readable during the migration window.
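A sketch of what that parser compatibility could look like: labels from the CRO taxonomy above validate cleanly, and anything else is passed through flagged as legacy rather than rejected. The function name and return shape are assumptions:

```typescript
// Taxonomy values are taken from the list above; unknown labels are
// treated as legacy and kept readable instead of being dropped.
const CRO_TAXONOMY = new Set([
  "Broken Experience", "UX Friction", "Unclear Messaging", "Weak CTA",
  "Trust Gap", "Form Friction", "Performance/Speed", "Mobile Usability",
  "Tracking Gap", "No Issue",
]);

function parseLabel(raw: string): { label: string; legacy: boolean } {
  const trimmed = raw.trim();
  return { label: trimmed, legacy: !CRO_TAXONOMY.has(trimmed) };
}
```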

Data captured

  • Event timeline and interaction metadata
  • DOM snapshots and element references
  • Console logs and error stack traces
  • Network request summaries
  • Element selector stability data
  • Optional screenshots or frames

Key actions

  • Jump to a step or timestamp
  • Copy element selectors from a replay
  • Export logs and share links
  • Add comments/annotations at specific timestamps
  • Tag events for team review
  • Run replay validation in CI
  • Download CI artifact bundle

Integration with Element Library

Step replays are connected to the Element Library:
  • Element captures link to related replays showing the element in action
  • Selector stability is validated against replay data
  • Bug reports can reference specific replay timestamps

Known limits

  • Cross-origin frames may reduce detail
  • Large sessions may stream data progressively
  • Retention periods vary by plan (3 months to custom)
  • Maximum recording length: 30 minutes per session

Export capabilities

Step replays can be exported as reproduction steps for issue tracking and test automation:
  • Jira: Formatted reproduction steps with console logs and environment details
  • Linear: Issue-ready markdown with technical context
  • Playwright: Auto-generated TypeScript/JavaScript test code
For detailed export instructions and code examples, see the Complete Guide for Step Replays.
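To make the Playwright export concrete, here is one way a generator could turn replay steps into a test file. This is an illustrative sketch only; the step shape, supported actions, and output format are assumptions, not the actual export:

```typescript
interface ReplayStep { action: "goto" | "click" | "fill"; selector?: string; value?: string }

// Render replay steps as Playwright test source (returned as a string,
// so no Playwright dependency is needed to generate it).
function toPlaywrightTest(name: string, steps: ReplayStep[]): string {
  const body = steps.map((s) => {
    switch (s.action) {
      case "goto": return `  await page.goto(${JSON.stringify(s.value ?? "")});`;
      case "click": return `  await page.click(${JSON.stringify(s.selector ?? "")});`;
      case "fill": return `  await page.fill(${JSON.stringify(s.selector ?? "")}, ${JSON.stringify(s.value ?? "")});`;
    }
  });
  return [
    `import { test } from "@playwright/test";`,
    ``,
    `test(${JSON.stringify(name)}, async ({ page }) => {`,
    ...body,
    `});`,
  ].join("\n");
}
```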