How Chain Cube Went from Weak Metrics to the Top of the Charts
At a recent AppLovin Connects event in Japan, Azur Games shared how their 3D merge game Chain Cube (iOS | Android) rose to the top of the charts through focused, specific testing, iterations based on those results, and the use of AppLovin’s software and expertise.
Chain Cube, a hybrid hyper-casual game, did not perform as well as the team had hoped, so they mapped out a six-month plan to improve their metrics and profits.
Azur Games’ producer, Milania Gelmanova, gave a detailed webinar presentation, sharing insights behind how the team was able to successfully scale Chain Cube.
An overview of updates and testing
The team at Azur Games made over 50 updates to the game and focused on improving the following:
- The feel of the game, including visuals and game logic
- Iterating and testing their creatives through AppLovin
- Monetization: scaling marketing campaigns through MAX
- App store ratings and reviews
“AppLovin’s MAX provides good features for A/B testing of ad waterfalls. With Chain Cube, this allowed us to deliver each impression at the maximum eCPM.”
Making ads more native
Milania explained that in the first version of Chain Cube, the game showed four interstitial ads after every fifth cube. This format was both predictable and annoying for users, who made that clear in their app store reviews. The game’s rating was below two stars and, not surprisingly, its retention rate was also low.
The team decided to replace the interstitial ads with a monetization system that was more native to the game. They added achievement boosters, which helped improve engagement through encouragement, and showed interstitial ads only after the boosters appeared.
Difficulty experiments to enhance the game’s feel and overall logic
Sometimes changes are not well received. Milania recalled, “We tried to implement bigger cubes for higher numbers to leave less space for the players.”
After receiving that feedback, the team decided to add them only as temporary events—as achievement rewards. She added, “For those who wanted a challenging element, we also added a leaderboard. That encouraged users to play more and try to achieve a higher score.”
To add another layer of engagement, they introduced more encouragement screens at specific moments in the game, challenging users to keep playing and reach a higher number.
On the design side, they made the background softer with new shapes but received feedback from players that it was messy and distracting. Milania said, “Instead, we added cubes with higher numbers to increase the playtime duration, and it worked out well.”
These changes produced positive results:
- Day 1 retention went from 39.3 percent to 60 percent
- Their user rating improved from 2 stars to 4.3 stars
- Daily play time went from 16 minutes to 30 minutes
- Play time from day 30 onward went from 7.5 minutes to 15 minutes
Testing their UA creatives with AppLovin
To improve their UA, the team focused on further iterating on and testing their creatives with the help of AppLovin’s software.
They consistently checked the return on ad spend (ROAS) for each creative to understand its impact and at what cost, putting their full trust in a data-driven approach.
The team had best practices they followed for their A/B tests:
- To minimize the impact on marketing results and keep metrics as clear as possible, they tested one iteration, one feature, at a time
- They used AppLovin’s software to attract the best users
- They leveraged AppLovin’s algorithm that automatically selects the best creatives
Milania described working with SparkLabs as a smooth process: “AppLovin uses machine learning to attract the best users for your game. Our campaigns went well and advertiser revenue also grew. We used all of our available tools to increase the effectiveness of our campaigns.”
Learn more about how AppLovin’s expertise and software help developers.