March 5, 2020

Sometimes, a marketing campaign yields poor results. This could be due to bad data or poor methodology, but the end result is the same. Lackluster campaign metrics are not the end of the road; they can be an opportunity to learn from what happened and adjust what isn’t working.

There are plenty of tips, tricks and tools for App Store Optimization, but not all of them actually help an ASO strategy. In order to improve your app promotion, you’ll want to avoid the most common ASO mistakes. These include using inaccurate ASO tools, underutilizing metadata, testing too many changes at once and poorly timing updates.


1. How Inaccurate Tools Damage Your ASO Marketing

In order to properly optimize an app, you’ll need an ASO platform that uses real mobile data. Many ASO “tools” claim to offer data such as searches or installs per keyword and numerical keyword volumes, yet neither the App Store nor Google Play actually shares that data with any third party.

Many of these tools also rely on web-based data meant for SEO. That data does not translate to ASO, as web search behavior is very different from mobile search behavior. Using web data can lead developers and marketers to target keywords that have little to no real volume within the app stores. With only a 20% overlap between web and mobile search trends, web-based keyword data can point you toward keywords that will not benefit your App Store Optimization and will, as a result, damage your ASO marketing.

At Gummicube, we see these issues occur frequently and understand the difference that targeting the right keywords can make. For example, after deploying metadata identified through a keyword targeting strategy, the number of keywords IMVU ranked #1 for grew by 33%. With the visibility from that keyword growth, IMVU also saw a 45% increase in daily downloads. That is the difference a proper ASO platform and methodology can make.

2. Underutilized Metadata

Keywords help create a foundation for improving organic search impressions. Targeting the right keywords improves the number of terms an app indexes for and how high it ranks for them. That, in turn, puts the app in more search results and in front of more users.

However, it is common to see apps underutilize their metadata fields. For instance, an app may use only a fraction of the 50 characters allowed for its Google Play title, or skip the App Store subtitle entirely. These fields not only provide user-facing information about the app, they also supply important keywords for indexation.

Underutilizing metadata fields leaves an app with a significantly smaller keyword footprint and may also reduce installs. Not including a descriptive tag in the title can hurt conversions, as users may not understand what the app is from its brand name alone. These fields provide both keywords to target and additional value propositions to help influence user decisions.

For example, Minecraft Earth did not fully utilize its iOS metadata fields when it launched. The title used only 15 of the 30 characters allowed, while the subtitle was left empty. The app relied solely on the Minecraft IP, neglecting opportunities to target additional feature-based keywords in the title and subtitle. At launch, it had difficulty ranking highly for relevant terms like “augmented reality” (unranked), “building” (#22) and “mine blocks” (#40). Its rankings for those terms have improved over time, and the app now includes a subtitle.
By fully utilizing the available metadata fields, an app can rank for thousands of keywords. That increased reach, paired with strong conversion, can lead to long-lasting improvements in daily downloads.
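
As a rough sketch of how a team might sanity-check this before submission, the snippet below flags empty, underused or overlong metadata fields. The character limits are the ones cited above (30 for the App Store title and subtitle, 50 for the Google Play title), while the field names and the “underused” threshold are illustrative assumptions rather than anything prescribed by the stores.

# Hypothetical pre-submission audit of metadata field usage. The limits reflect
# the character counts cited in this article and may change over time.
FIELD_LIMITS = {
    "app_store_title": 30,
    "app_store_subtitle": 30,
    "google_play_title": 50,
}

def audit_metadata(metadata):
    """Return warnings for empty, underused, or overlong metadata fields."""
    warnings = []
    for field, limit in FIELD_LIMITS.items():
        value = metadata.get(field, "")
        if not value:
            warnings.append(f"{field}: empty, a missed keyword and conversion opportunity")
        elif len(value) > limit:
            warnings.append(f"{field}: {len(value)} characters exceeds the {limit}-character limit")
        elif len(value) < limit // 2:
            warnings.append(f"{field}: only {len(value)} of {limit} characters used")
    return warnings

# A brand-name-only listing raises warnings for the missing subtitle and the underused Play title
print(audit_metadata({"app_store_title": "Minecraft Earth", "google_play_title": "Minecraft Earth"}))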

3. A/B Testing Dramatic Changes

Proper A/B testing can help you determine which aspects of your product page perform well. Each test should be iterative, trying out specific elements to identify which variants work best. When you make a dramatic change, it becomes difficult to determine which elements are responsible for the results you see.

Testing multiple elements at once makes it hard to understand which ones are helping and which might actually be hurting. If you test a screenshot variant with updated callout text, color scheme and image placement, any one of those elements could be responsible for the improvement. Another could simultaneously be inhibiting growth, but it would not be possible to tell which one.

Instead, test each change one at a time. The callout text may be an improvement, while the new color scheme performs less well than the initial version. Without testing them individually, you’d never be able to tell which resulted in a positive or negative change.

The lack of insights also makes it difficult to create further variations, as it’s unclear what else needs improvement. Even if a sweeping change improves conversions, developers can’t capitalize on it if they don’t understand which changes drove the improvement and which can be tested further. A scientific, data-driven approach to A/B testing is critical for eliminating extraneous variables, pinpointing the elements that perform best and iterating from there. The process may be slower than making many changes at once, but it is far more actionable, providing a roadmap of what to do next.
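
For a concrete picture of what a data-driven read-out can look like, here is a minimal sketch that evaluates a single-element test, such as control screenshots versus a variant that changes only the callout text, with a standard two-proportion z-test. The conversion figures are placeholders rather than real campaign data, and this is just one reasonable way to judge whether a variant’s lift is meaningful.

from math import sqrt
from statistics import NormalDist

def ab_significance(conversions_a, views_a, conversions_b, views_b):
    """Two-sided p-value for the difference between two conversion rates."""
    rate_a = conversions_a / views_a
    rate_b = conversions_b / views_b
    pooled = (conversions_a + conversions_b) / (views_a + views_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z_score = (rate_b - rate_a) / std_err
    return 2 * (1 - NormalDist().cdf(abs(z_score)))

# Control screenshots vs. a variant changing only the callout text (placeholder numbers)
p_value = ab_significance(conversions_a=420, views_a=10000, conversions_b=480, views_b=10000)
print(f"p-value: {p_value:.3f}")  # below 0.05 would suggest the callout change made a real difference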


4. Poorly Timed Metadata Updates

When a developer updates the app metadata, the stores’ algorithms re-crawl it to index the app for targeted terms. As an app makes metadata updates over time, its keyword footprint will continue to grow. 

Timing is critical to this process. If updates are made too frequently, indexation trends cannot be assessed; if they are spaced too far apart, keyword rankings can stagnate or decline. It can take Apple’s ranking algorithm up to 30 days to fully crawl an app, and pushing another update during that window essentially undoes the progress from the previous one. Updates should be spaced far enough apart for the metadata to have time to properly index.

While spacing out updates is vital for assessing trends and iterating, it’s important to keep updating regularly once indexation occurs. Between competitor apps making their own updates, seasonal changes and new competitors emerging, staying current with the latest trending terms is essential to organic discoverability.

To make sure you’re maximizing growth potential, make updates after each crawl has ended. This gives the metadata enough time to index and be analyzed for performance, while prompting fresh crawls for new search terms and compounding the benefits of previous releases.
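
As a simple illustration of that cadence, the short sketch below spaces updates around the 30-day crawl window mentioned above. The window length and the dates are assumptions for illustration only; actual indexation times vary.

from datetime import date, timedelta

CRAWL_WINDOW = timedelta(days=30)  # the up-to-30-day window cited above for Apple's algorithm

def next_safe_update(last_metadata_update):
    """Earliest date a new metadata update can ship without cutting the current crawl short."""
    return last_metadata_update + CRAWL_WINDOW

last_update = date(2020, 2, 1)  # placeholder date of the previous metadata release
earliest = next_safe_update(last_update)
if date.today() >= earliest:
    print("Safe to ship the next metadata update and start a fresh crawl.")
else:
    print(f"Hold the next update until {earliest} so the current metadata can fully index.")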

Conclusion

It’s important to avoid common ASO mistakes if you want to properly optimize your app: using inaccurate ASO tools that draw data from the wrong source, leaving metadata fields underutilized, running tests whose results provide no actionable data and mistiming metadata updates. Any of these can lead to targeting fewer and less effective keywords or designing sub-optimal creatives, limiting impression and conversion potential.

If you want to get the most out of your App Store Optimization, you need to know which mistakes to watch for. There is an abundance of information about App Store Optimization and the strategies behind it, but understanding which mistakes can sabotage those efforts is equally valuable. Knowing what to avoid can be just as important as knowing what to do.