AMC Experiments Faster
on the Server-side
Jon Keilson
VP Product Management
AMC NETWORKS
Yoshitaka Ito
VP Information Platforms
AMC NETWORKS
Why we invested in
experimentation
● Too many product changes rolled out
that weren’t measured incrementally.
● Hard to distinguish feature
performance from content noise.
● We were solving business problems,
not user problems.
How we invested in
experimentation
Hypothesis Generation Process:
● Identify the user problem
● Test multiple solutions
● Clear metrics
Test solutions quickly and
inexpensively to get signal
To get started quickly, we
implemented Optimizely Full Stack
purely on the front end
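Front-end Full Stack-style SDKs decide variations in-process via deterministic hashing rather than a server round trip. A minimal, hypothetical sketch of that bucketing — the function, experiment key, and variation names are illustrative, not AMC's code or the actual Optimizely SDK:

```python
# Illustrative only: a sketch of deterministic client-side bucketing as a
# Full Stack-style SDK performs it. Names here are hypothetical.
import hashlib

def bucket_user(user_id: str, experiment_key: str,
                variations: list[str]) -> str:
    """Deterministically assign a user to a variation.

    Hashing (experiment_key, user_id) means the same user always lands
    in the same variation, with no server call required.
    """
    digest = hashlib.sha256(f"{experiment_key}:{user_id}".encode()).hexdigest()
    index = int(digest, 16) % len(variations)
    return variations[index]

# Same inputs always yield the same assignment.
v1 = bucket_user("user-123", "auto_advance", ["control", "auto_advance_on"])
v2 = bucket_user("user-123", "auto_advance", ["control", "auto_advance_on"])
assert v1 == v2
```

Because the hash is a pure function of user and experiment, no bucketing state needs to be stored anywhere, which is what makes a purely front-end rollout cheap to start with.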
Our first win
Auto-advance enhancement:
+2% increase in video views
Content differences meant it took
time to confirm the lift was real.
They can’t all be
winners
Guest home experiment:
● 20% decrease in video views
for new users (based on small
sample)
● Test your assumptions!
● We needed to test smaller
changes faster
The Problem:
We could not get enough velocity
building experiments at the
application level
The Challenges
5 Brands × 8 Platforms × ? Experiments
= { Massive Headache }
The Solution:
Server-side Driven Experimentation
Deciding on
Application Rebuild
Goal: Architect the stack to run 80%
of experiments efficiently
● Create templated apps
● Define roles and responsibilities
of the Backend, Backend for
Frontend (BFF), and Client Apps
● Define clear types of
experiments to run
Templated Apps
Apps are made of
components that can be
placed in many different
layouts
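One way to picture a templated app, as a hypothetical sketch: the server describes a screen as an ordered list of components, and a variation is simply a different arrangement of the same blocks. All names and shapes here are illustrative, not AMC's actual schema:

```python
# Hypothetical sketch of a templated layout: the server describes a screen
# as ordered components, so the same building blocks can be rearranged per
# brand, platform, or experiment variation without an app release.

def build_home_layout(variation: str) -> dict:
    hero = {"type": "hero", "content_id": "show-42"}
    rail = {"type": "rail", "title": "Continue Watching"}
    grid = {"type": "grid", "title": "All Shows"}
    # A variation is just a different ordering/selection of components.
    if variation == "rail_first":
        return {"screen": "home", "components": [rail, hero, grid]}
    return {"screen": "home", "components": [hero, rail, grid]}
```

Because the app only knows how to render each component type, the server can reshuffle layouts freely: the experiment lives entirely in the data it returns.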
Application Roles
Backend Services
● Act as data source
Backend for Frontend (BFF):
● Provide business logic
● Control layout and styles
Apps:
● Know the context
● Own user interactions
Experimentation
Responsibilities
Backend:
● Be the data set for variations

BFF:
● Understand the context passed
● Bucket users
● Return the appropriate layout
and dataset

Apps:
● Know who, what, where
● Present the experiment to the user
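The division of labor above can be sketched as a hypothetical BFF handler: it reads the context the app passes, buckets the user, pulls data from the backend, and returns the layout plus dataset for that variation. Function names and response shapes are assumptions for illustration, not AMC's API:

```python
# Hypothetical sketch of the BFF's experimentation responsibilities.
import hashlib

def compile_response(context: dict) -> dict:
    """Understand the context, bucket the user, return layout + dataset."""
    user_id = context["user_id"]
    # Bucket users deterministically, as a Full Stack-style SDK would.
    h = int(hashlib.sha256(f"home_layout:{user_id}".encode()).hexdigest(), 16)
    variation = "b" if h % 2 else "a"
    # Return the appropriate layout for the variation.
    layouts = {
        "a": {"components": ["hero", "rail", "grid"]},
        "b": {"components": ["rail", "hero", "grid"]},
    }
    # The backend acts purely as the data source.
    dataset = fetch_from_backend(context["brand"], variation)
    return {"variation": variation, "layout": layouts[variation], "data": dataset}

def fetch_from_backend(brand: str, variation: str) -> list[dict]:
    # Stand-in for the backend service that provides content metadata.
    return [{"brand": brand, "content_id": f"{variation}-001"}]
```

The app stays thin: it only passes context up and renders whatever instruction set comes back.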
When we use Client-Side vs Server-Side with Full Stack

Client-Side
● Painted-door tests
● New features impacting many areas
of the apps

Server-Side
● Fine-tuning an existing feature
● New features that can be
feature-flagged at the server level
● Many variations across platforms
● Content experiments
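A server-level feature flag, as the Server-Side list mentions, might look like this minimal sketch: one toggle in the BFF reaches all eight platforms with no app-store release. Flag names and the brand sets are hypothetical:

```python
# Hypothetical sketch of a server-level feature flag in the BFF.
# Flag names and brands are illustrative, not a real configuration.

FLAGS = {
    "new_cta": {"enabled_brands": {"amc", "bbca"}},
}

def feature_enabled(flag: str, brand: str) -> bool:
    """Gate a feature per brand from one server-side config."""
    cfg = FLAGS.get(flag)
    return cfg is not None and brand in cfg["enabled_brands"]

assert feature_enabled("new_cta", "amc")
assert not feature_enabled("new_cta", "ifc")
```

Flipping the flag changes behavior for every client platform at once, which is exactly why boolean, on/off features are a good fit for server-side experiments.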
Client-side Experiment
Architecture - Painted Door
Client-side Experiment
Architecture - New Feature
Server-side Experiment
Architecture
Outcome:
Operational Efficiency
● App Quality of Service metrics
○ App launch time
○ Time to interact
● Faster time to market
○ Release frequency
○ Experimentation frequency
○ Reducing dependencies on
3rd parties


Editor's Notes

  • #2: Yoshi and Jon to deliver intros with backgrounds.
  • #3: Yoshi and Jon to deliver intros with backgrounds. [ID: CON BO1] AMC Experiments Faster on the Server-side Speakers: Jon Keilson, VP Product Management, AMC Networks Yoshi Ito, VP Information Platforms, AMC Networks SESSION TYPE: Customer TRACK: Product, Scale, Engineering
  • #4: Jon
  • #5: Jon
  • #6: Jon
  • #7: Jon: This was an expensive test because we ultimately built it on both the front end and the back end. It taught us to test our assumptions - and that we should be doing more painted doors. (Explain what a painted door is!)
  • #8: Yoshi: As Jon mentioned, we learned experimentation has huge value for us, both for understanding our users and for avoiding unnecessary risk, and it keeps us focused on how our customers really use our product. Unfortunately, we also learned we had some key limitations. The biggest was that we could not experiment at the velocity we wanted while maintaining structure, process, and code well enough to capture learnings through our planning and delivery pipelines. Some of these challenges were inherent in the products we work on.
  • #9: So - what are our challenges? In short, it's permutation. We manage 5 brands on 8 streaming platforms, and we need to be where our customers are: iOS, tvOS, Android, Android TV, Fire TV, Roku, Samsung, and Web. At the same time, we have 5 distinctly unique brands in our portfolio: AMC, BBCA, IFC, WE tv, and Sundance. As you may imagine, our audiences act quite differently across these brands and platforms, and experimenting efficiently across the permutations was just very hard. We also couldn't easily control when our experiments were released: because apps go through a certification process, we weren't always able to ship an experiment when we wanted. And since not all learnings translate easily into actionable items, we had to find ways to do this efficiently and maximize our scale.
  • #10: So what did we do? Enter server-side driven experimentation. In a nutshell, we decided to move the decision-making process to the server side and power our experiments there whenever possible.
  • #11: Yoshi: How did we approach all of this? Well, after months and months of adding duct tape upon duct tape until you could no longer see the facade of the application, we came to our senses: we needed to replatform. A key learning from our lack of experimental velocity was that experimentation needs deliberate focus, just like anything else in the product stack, to do it well. We decided to create structure that would let us run 80% of our experiments efficiently instead of trying to do everything. To achieve this, we tackled it from three angles. First, we changed how our apps were written, making them more templated so we could control major aspects of the app from the server side. Second, we assigned roles within the stack so developers had a clear framework for where experiment logic should live. Third, we agreed on the types of experiments we want to focus on running efficiently.
  • #12: The first thing we focused on was the app. During our internal discussions, we quickly realized the apps needed to be better componentized so they could be templated. We needed units we could control from the backend in a flexible and reliable way. So, working closely with product and design, we vetted our application's UX components from the ground up - breaking down every aspect, from fonts, colors, styles, and positioning to the types of actions they can trigger - and ensured those components are reusable in templates across the entire app. Think of it this way: if every visual building block can be moved across screens and layouts reliably, it is easy to see the control you gain.
  • #13: Now that we had decided on a templating system, we needed to assign roles: what does each of our systems need to do to execute this strategy efficiently? Broadly speaking, our systems comprise three components: backend services, the backend for frontend, and the applications. The backend, as I'm sure everyone can guess, is responsible for providing key metadata - content info as well as user profile information. We introduced a backend for frontend into our architecture; a BFF is often used to avoid monolithic APIs, but we felt it was also a great way to extract complexity out of the frontend and give us a means to control templating more easily - and as a result it can make bucketing decisions and control the user experience. The app handles user interaction, as it is the only part of the stack that comes in contact with users, so we wanted it to be well aware of the who, what, where, and how of our users. In essence, we wanted to build the thinnest app that can be controlled via API.
  • #14: How do these roles manifest as experiment responsibilities? Let's start with the backend: not surprisingly, the backend provides the data we need for variations. If we need to show a different set of content in each variation, this is the source. The app remains the owner of the context and the place where the experiment is presented to the end user. The BFF is where the hard work is done: it takes in context from the app, buckets users based on that context, communicates with the backend to gather the appropriate information, and returns it as an instruction set for the app to consume. This gave us a clear picture of how the systems needed to work.
  • #15: Yoshi: As much as we'd love to claim we solved all of our woes - of course, you still need to run client-side experiments. We think in terms of general cases for how experiments should be run. When do we test client-side? If you want a painted door - as Jon mentioned, there is no way to do this via the backend. For example, adding a casting button to gauge customer interest in a new casting feature: this remains the cheap way for product teams to get an initial signal without defining a full-fledged component. We also stay client-side for new features that cannot easily be componentized - for example, if bucketing must persist across multiple interactions in the app, the complexity becomes too much for the server side to maintain. When do we run experiments server-side? When fine-tuning an existing feature's view - say, tweaking how content is displayed, poster image vs. 16x9 image - which plays really nicely into the templating concept. When introducing features that are essentially boolean, with a simple on/off switch as a block, e.g. exposing a new set of CTAs to a subset of users. An important case for us is when an experiment requires validation across many platforms, e.g. showing a new way to feature content on different device types. And, as a content company, content experiments that validate how content performs in different positions are important and much more scalable with a template system. Giving myself a little moment to nerd out as an engineer, I want to show a few diagrams of how these experiment cases work:
  • #16: First is the client-side architecture used for painted doors. In this model, three components talk to each other to make a decision: the context handler owns the context, the app core/UX instantiates against the context handler, and the SDK wrapper interacts directly with the context handler to create the bucketing. Based on that decision, the app core paints the UX. The key thing to notice is that the experiments handled here require no backend data; the flow is isolated, which works well for painted doors because it keeps the scope narrow.
  • #17: Next is the client-side experiment architecture for features that span multiple sections of the app. Here the app still buckets the experiment at the app level, but it requests data from the BFF, which then retrieves the required information from the backend. So the app and the client SDK retain control of how the experiment is carried out and what data is retrieved.
  • #18: And last but not least, this is the model we use most often. The app core talks to the context handler, gathers the information, and converses with the content compiler in the BFF. The content compiler makes the bucketing decision using the Optimizely SDK, then gathers the backend dataset based on that decision. You can see how, in this case, our application remains relatively unaware, and the BFF controls the logic that runs the experiment. With all of that said, I'll now pass it back to Jon.
  • #19: Jon: By decoupling front-end features from application changes, we can release experiments and features faster. This allows fast iteration, and it ultimately makes for more stable applications and better QoS, since we aren't making as many large changes to the apps.