Instructor: Mike Chu, Staff Software Engineer, Experimentation
Audience: Developers, Data/Analytics Engineers, and Technical Program Managers involved in experiment analysis or reporting

Explore how Optimizely measures experimentation results and how its methodology compares to general-purpose analytics tools. This session covers common sources of data discrepancies with tools like GA4 and practical validation techniques including A/A tests. Walk away with frameworks for strengthening your team's data-driven decision-making and communicating results clearly to leadership.

Agenda:

  • Understanding Data Discrepancies (how Optimizely and GA4 measure differently — and what each is optimized for)
  • Aligning Tracking & Attribution (conversion counting, attribution windows, session vs. visitor models)
  • Validating with A/A Tests (how to run them, what to expect, how to interpret)
  • Communicating Experiment Results to Leadership (frameworks for clear, confident reporting)
  • Q&A
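As a preview of the A/A validation topic above: an A/A test splits traffic between two identical experiences, so any "significant" difference is a false positive, and at a 95% confidence level you should see roughly 5% of such tests flag one. The sketch below is a hypothetical simulation (the function name, binomial conversion model, and naive two-proportion z-test are assumptions for illustration, not Optimizely's methodology):

```python
import random

def simulate_aa_tests(n_tests=500, n_per_arm=2000, base_rate=0.05, seed=42):
    """Simulate repeated A/A tests with identical conversion rates in
    both arms, and count how often a naive two-proportion z-test calls
    the difference 'significant' at alpha = 0.05.

    Because the arms are truly identical, the long-run flag rate should
    hover near 5% -- a sanity check on the measurement pipeline."""
    rng = random.Random(seed)
    false_positives = 0
    for _ in range(n_tests):
        # Draw binomial conversions for two identical arms.
        a = sum(rng.random() < base_rate for _ in range(n_per_arm))
        b = sum(rng.random() < base_rate for _ in range(n_per_arm))
        # Pooled two-proportion z-test.
        p_pool = (a + b) / (2 * n_per_arm)
        se = (2 * p_pool * (1 - p_pool) / n_per_arm) ** 0.5
        if se > 0 and abs(a - b) / n_per_arm / se > 1.96:
            false_positives += 1
    return false_positives / n_tests
```

If the observed rate lands far from 5%, that suggests a problem in the tracking or analysis setup rather than a real effect, which is exactly what A/A tests are meant to surface.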

Choose an Upcoming Session

  • Thursday, Apr 9, 2026, 10:00 AM to 11:00 AM EDT (1 hour) · Instructor: Mike Chu
  • Tuesday, May 12, 2026, 9:00 AM to 10:00 AM EDT (1 hour) · Instructor: Mike Chu
  • Tuesday, Jun 16, 2026, 11:00 AM to 12:00 PM EDT (1 hour) · Instructor: Mike Chu