Operate

Testing & Monitoring

What was tested before launch and what tells you the system is healthy now.

Testing summary

| Track | What was tested | Notes |
| --- | --- | --- |
| 1 — Membership | Happy path each plan · auto-promo · pro-rating · ACH vs card · existing-member signup · field validation · CORS | All six plans: Passport, Startup, All Access, Founder, Creator Pass, Innovator Pass |
| 2 — Booking | Single-day each room · 10% volume discount (eligible + ineligible) · multi-day default · multi-day custom · Gate 1 · Gate 2 · multi-day atomic cleanup · webhook idempotency · booking window | All seven rooms: Slinky, Day Office, Mackinaw, Shears, Board Room, Radiator Annex, Sonic Studio |
| 3 — Day Pass | Happy path · kiosk acceptance · plan validity config (including 0-interval reject) · receipt · existing member · conditional form rendering | Sub/One-Time toggle via Purchase Type confirmed |

Explicitly not tested: no automated suite, no CI, no load tests, and no PCI audit (Stripe-hosted elements keep us out of scope, but no formal audit was performed).

Expand — test detail per track

Track 1

  • Happy path, each plan → verified member creation, membership assignment, active Stripe subscription, welcome email, team notification
  • Auto-promo AUTO-{PLAN-SLUG} applies against initiation fee; silent failure when missing
  • Pro-rating applies where configured; full month charged where not
  • ACH + card both complete the flow (Stripe handles tokenization and 3DS)
  • Existing OfficeRnD email adds new membership to existing member (no duplicate)
  • Required fields enforced
  • CORS: allowed origins accepted, unlisted origins rejected with clear error
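The CORS behavior above (allowed origins accepted, unlisted origins rejected) boils down to an allowlist check before any headers are echoed back. A minimal sketch, assuming a hard-coded origin list for illustration; the real list and error handling live in the Worker config:

```javascript
// Hypothetical origin allowlist -- the production Worker owns the real list.
const ALLOWED_ORIGINS = new Set([
  "https://labourtemple.com",
  "https://docs.labourtemple.com",
]);

// Returns CORS response headers for an allowed origin,
// or null to signal that the request should be rejected.
function corsHeadersFor(origin) {
  if (!ALLOWED_ORIGINS.has(origin)) return null;
  return {
    "Access-Control-Allow-Origin": origin,
    "Access-Control-Allow-Methods": "POST, OPTIONS",
    "Access-Control-Allow-Headers": "Content-Type",
  };
}
```

Echoing the specific allowed origin (rather than `*`) is what makes the "unlisted origins rejected with clear error" behavior testable from the browser console.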

Track 2

  • Each bookable room: hourly rate, min hours, Sonic extras flow
  • 10% volume discount on 8+ hour bookings for eligible rooms only
  • Multi-day default: same duration/time across weekdays in range; weekends auto-skipped; per-day discount
  • Multi-day custom: independent duration/time per day; correct total
  • Gate 1 race simulation: booking rejected before PaymentIntent
  • Gate 2 race simulation: auto-refund fires, no event created
  • Multi-day atomic cleanup: day-3 fail in 5-day booking → all created events deleted + full refund
  • Stripe webhook idempotency: retried payment_intent.succeeded doesn't duplicate events
  • Booking window: dates outside 24h–60d disabled in calendar
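The webhook-idempotency test above hinges on Stripe re-sending the same `event.id` on retry, so the handler must record processed ids before acting. A minimal sketch of that dedupe step; the in-memory `Map` stands in for whatever durable store (KV, D1) the production Worker actually uses:

```javascript
// Processed-event ledger. In production this must be durable storage,
// since Worker isolates are ephemeral; a Map stands in here.
const processedEvents = new Map();

// Returns true if the event was handled now, false if it was a retry.
function handleStripeEvent(event, createCalendarEvent) {
  if (processedEvents.has(event.id)) return false; // retry: no duplicate event
  processedEvents.set(event.id, Date.now());
  if (event.type === "payment_intent.succeeded") {
    createCalendarEvent(event.data.object); // side effect runs exactly once
  }
  return true;
}
```

The key property verified in testing: calling this twice with the same event id produces exactly one calendar event.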

Track 3

  • Happy path: Stripe customer + charge, OfficeRnD member if new, pass with type: hotdesk, grantActiveStatus: true, UTC today validFrom/validTo
  • Kiosk accepts a freshly-created pass within OfficeRnD sync latency
  • passesValidityPeriod.intervalCount non-zero verified; kiosk correctly rejects when set to zero
  • Stripe receipt_url populates and is publicly accessible
  • Existing member purchase adds a new pass to existing record
  • Conditional rendering: Purchase Type = "Subscription" → Track 1 form; "One-Time" → Track 3 form
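The "UTC today validFrom/validTo" check above comes down to pinning the pass window to UTC day boundaries regardless of the server's local timezone. A sketch of that computation; the field names mirror the bullet, not a confirmed OfficeRnD schema:

```javascript
// Build a day-pass validity window covering "today" in UTC,
// independent of the runtime's local timezone.
function utcDayWindow(now = new Date()) {
  const y = now.getUTCFullYear();
  const m = now.getUTCMonth();
  const d = now.getUTCDate();
  return {
    validFrom: new Date(Date.UTC(y, m, d, 0, 0, 0)).toISOString(),
    validTo: new Date(Date.UTC(y, m, d, 23, 59, 59)).toISOString(),
  };
}
```

Using `Date.UTC` rather than local getters is what keeps a pass bought at 11 PM local from spilling into the wrong calendar day at the kiosk.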

What's monitored

Everything monitored is reactive — you learn about problems when you look, not via a page/alert.

| Surface | What you get |
| --- | --- |
| Cloudflare Workers | Execution logs (`wrangler tail` or Dashboard) · request + error-rate charts |
| Stripe | Event delivery retries (up to 3 days) · auto-emails for disputes + failed payouts · payment success rate KPI |
| Google Calendar / MailerSend / OfficeRnD | No proactive monitoring. You notice via complaint or reconciliation. |
| Docs homepage widgets | System Status, Membership Checkout, Room Bookings widgets at docs.labourtemple.com — daily at-a-glance |

No Cloudflare Worker error emails by default. Wiring Logpush → Slack or email is a recommended next step (30–60 days post-launch).
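Whatever transport carries the Logpush output, the alerting step reduces to filtering the pushed log lines for non-`ok` outcomes and formatting a message. A sketch of that filter, assuming the Workers Logpush newline-delimited JSON format with `Outcome` and `ScriptName` fields (verify against the fields you actually configure); the Slack/email delivery call itself is omitted:

```javascript
// Turn a batch of newline-delimited JSON Logpush lines into an
// alert-ready summary of errored requests, or null if all is well.
function summarizeErrors(ndjson) {
  const errors = ndjson
    .split("\n")
    .filter(Boolean)
    .map((line) => JSON.parse(line))
    .filter((entry) => entry.Outcome !== "ok");
  if (errors.length === 0) return null; // nothing to alert on
  return `${errors.length} Worker error(s): ` +
    errors.map((e) => `${e.ScriptName}: ${e.Outcome}`).join(", ");
}
```

Returning `null` on a clean batch keeps the alert channel quiet unless something actually failed.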

Baseline numbers

Directional, not historical — system is new. Treat as "watch for drift" rather than hard thresholds.

| Metric | Expected | Abnormal |
| --- | --- | --- |
| Track 1 checkout completion rate | 15–40% | <5% — broken form or payment |
| Track 1 card decline rate | 3–8% | >15% — Stripe config or fraud |
| Track 2 booking conversion | Highly variable by room | Track per-room trends |
| Track 2 auto-refund rate (Gate 2 fires) | <1 in 50 | >5% — contention issue |
| Track 3 pass creation success | >99% | Any consistent miss → OfficeRnD API check |
| MailerSend delivery | >99% | <95% — check sender reputation / SPF / DKIM |

Capture these monthly during reconciliation so drift becomes visible.
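The monthly capture can be a small script that compares each recorded rate against its abnormal threshold. The thresholds below are copied from the table; the metric keys and input shape are illustrative:

```javascript
// Abnormal-threshold checks mirroring the baseline table above.
// Inputs are rates expressed as fractions (0.15 == 15%).
const checks = {
  checkoutCompletion:  (v) => v < 0.05, // <5%: broken form or payment
  cardDeclineRate:     (v) => v > 0.15, // >15%: Stripe config or fraud
  autoRefundRate:      (v) => v > 0.05, // >5%: contention issue
  passCreationSuccess: (v) => v < 0.99, // any consistent miss
  mailersendDelivery:  (v) => v < 0.95, // sender reputation / SPF / DKIM
};

// Returns the names of metrics that drifted into abnormal territory.
function flagDrift(monthly) {
  return Object.keys(monthly).filter(
    (k) => checks[k] && checks[k](monthly[k])
  );
}
```

Running this against each month's numbers turns "watch for drift" into a concrete, repeatable check.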

What alerts you'll see in your inbox

Only Stripe emails you unprompted: dispute notifications and failed-payout alerts. Cloudflare Workers sends nothing by default, and Google Calendar, MailerSend, and OfficeRnD are silent. Until Logpush alerting is wired up, an empty inbox does not mean a healthy system.

The "first five purchases" ritual

For the first week post-handoff, screen-share for ~15 minutes on each new live purchase, up to five purchases across the three tracks.

Most issues that will ever happen show up in the first five real transactions. Watching together removes "is this supposed to look like that?" anxiety.