Overview
We addressed user churn caused by creative fatigue and friction in publishing AI-generated content by introducing a smart Auto Posting feature. This improved engagement among paid users, increased activation among free users, and ultimately lifted retention.
Problem
Retention was taking a toll, and controlling churn became a top priority. We dug into Clarity and Mixpanel, conducted user interviews, and reviewed engagement metrics, leaving no stone unturned. What we found was that users were dropping off because of friction in creating content and publishing it to their social handles. Key issues:
- Users running out of relevant ideas and falling back on repetitive inputs
- Frequent errors while publishing posts
- Low engagement and activation
Hypothesis
Automating content creation and scheduling, and continuously personalizing the output through iterative improvements so it stays aligned with user content expectations, will reduce friction, improve activation, and increase retention.
Approach
- Analyzed user flows to understand the behavior of churned users after purchasing a plan.
- Segmented these users and evaluated their click-through rates (CTR) for post publishing.
- Conducted user interviews with low-engagement users—those publishing minimal content—to uncover key frustrations.
- Launched a beta program in which selected users were enrolled and content was automatically published to their social media handles (a simplified sketch of this flow follows the list).
- Tracked user participation, including opt-ins and opt-outs from the beta program.
- Identified gaps and challenges from the beta phase and used these insights to develop a comprehensive auto-posting feature.
- Rolled out the feature following thorough internal testing, including dogfooding and feedback from internal users.
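To make the publishing flow concrete, here is a minimal sketch of what a simplified beta auto-publish loop could look like. EnrolledUser, generate_post(), and publish_to_social() are hypothetical stand-ins rather than the actual services, and the retry/backoff logic is illustrative only.

```python
# Illustrative sketch only: a simplified auto-posting loop for beta users.
# generate_post(), publish_to_social(), and EnrolledUser are hypothetical
# stand-ins for the real content-generation and social-publishing services.
import logging
import time
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("auto_posting")


@dataclass
class EnrolledUser:
    user_id: str
    handle: str          # connected social handle
    topics: list[str]    # preferences used to personalize content


def generate_post(user: EnrolledUser) -> str:
    """Placeholder for the AI content-generation step."""
    return f"Fresh post for {user.handle} about {user.topics[0]}"


def publish_to_social(user: EnrolledUser, post: str) -> None:
    """Placeholder for the social-network publishing connector."""
    log.info("Published for %s: %s", user.user_id, post)


def run_auto_posting(users: list[EnrolledUser], max_retries: int = 2) -> None:
    """Generate and publish one post per enrolled user, retrying on failure."""
    for user in users:
        post = generate_post(user)
        for attempt in range(1, max_retries + 1):
            try:
                publish_to_social(user, post)
                break
            except Exception:
                # Publishing errors were a key churn driver, so failures are
                # logged and retried with a simple exponential backoff.
                log.warning("Publish failed for %s (attempt %d)", user.user_id, attempt)
                time.sleep(2 ** attempt)


if __name__ == "__main__":
    beta_users = [EnrolledUser("u_001", "@example_handle", ["weekend offers"])]
    run_auto_posting(beta_users)
```

The retry-with-backoff around publishing reflects the "frequent errors while publishing" pain point called out in the Problem section; the real rollout handled this through the production publishing pipeline rather than this toy loop.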
Outcomes
[Chart] Acquired vs. churned behaviour of users enrolled in auto posting.
[Chart] Acquired vs. churned behaviour of users not enrolled in auto posting.
Note: Actual numbers are intentionally hidden to protect sensitive business information and ensure confidentiality.
- Users who enrolled in auto posting (AP) experienced up to 3× higher acquisition growth compared to those who didn't.
- Churn remained 40–50% lower for AP users during the first 4–5 months post-enrollment.
- Retention improved significantly for users with AP enabled, especially in their first 60–90 days.
- Non-AP users showed poor retention, with churn reaching up to 85–90% of acquisitions in key months.
- Auto posting led to net positive user growth, while the non-AP group experienced stagnant or negative growth.
- The feature demonstrated a clear impact on user stickiness, engagement, and satisfaction in the early lifecycle.
- Insights from the beta helped shape a more robust full-feature rollout with better handling of churn factors.
Key Learnings
- Early Automation Drives Retention: Introducing auto posting early in the user journey significantly boosts retention. Users who enrolled in the feature were far less likely to churn in the first 2–3 months, indicating the value of reducing effort through automation.
- Engagement Is Not Just About Acquisition: While the non-AP group had higher acquisition initially, poor engagement and high churn offset the growth, highlighting that acquisition without activation is a leaky funnel.
- Time-to-Value Matters: Users who didn't start auto posting likely didn't experience value quickly enough, leading to higher dropout rates. This reinforced the importance of delivering quick wins in the product experience.
- Churn Increases After a Few Months: Even users with auto posting started to drop off after 2–3 months. This shows the need to keep users engaged over time with ongoing value and more personalized content.
- Beta Testing Is Critical: Running the beta program helped surface key usability issues and opt-out reasons early, allowing for a more stable and impactful full-feature rollout.
- Behavioral Data Tells the Real Story: Segmenting users by behavior (AP vs. non-AP) and comparing retention patterns gave far deeper insights than top-level metrics alone, underscoring the value of behavioral product analytics in decision-making. A minimal sketch of this kind of comparison follows.
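To make the last point concrete, here is a minimal sketch of the kind of AP vs. non-AP retention comparison described above, assuming a hypothetical per-user table with an auto_posting flag, a signup month, and a 90-day churn flag. The column names and the toy rows are illustrative placeholders, not the actual (confidential) data.

```python
# Illustrative sketch: comparing churn between AP and non-AP behavioral segments.
# The DataFrame below uses placeholder rows; a real analysis would load the
# equivalent user table from the product analytics warehouse.
import pandas as pd

users = pd.DataFrame(
    {
        "user_id": ["u1", "u2", "u3", "u4", "u5", "u6"],
        "auto_posting": [True, True, True, False, False, False],
        "signup_month": ["2024-01", "2024-01", "2024-02", "2024-01", "2024-01", "2024-02"],
        "churned_within_90d": [False, False, True, True, True, False],
    }
)

# Churn rate per behavioral segment (AP vs. non-AP).
segment_churn = (
    users.groupby("auto_posting")["churned_within_90d"]
    .mean()
    .rename("churn_rate_90d")
)
print(segment_churn)

# Monthly cohort view: retention = 1 - churn, per signup month and segment.
cohort_retention = (
    users.groupby(["signup_month", "auto_posting"])["churned_within_90d"]
    .agg(lambda s: 1 - s.mean())
    .rename("retention_90d")
    .unstack("auto_posting")
)
print(cohort_retention)
```

Grouping by the behavioral flag rather than by plan or acquisition channel is what surfaced the AP vs. non-AP retention gap described in the Outcomes section.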