The goal was clear: to regain market position and exceptional yearly growth in Search by playing to the strengths of machine learning. Historically, the road to success has been to retain tight control of keywords and bids. We challenged that convention by instead feeding the algorithms more data.
The main idea behind this new strategy was to challenge many existing best practices and adopt a new account structure. This structure was aimed at one thing: helping every element of automation perform better, the most important being Smart Bidding, responsive search ads, and n-gram-based negative keywords generated through Precis' own in-house tool.
This was accomplished by grouping and segmenting campaigns, ad groups, keywords and landing pages based on factors such as search intent, value per click and seasonal likeness. The underlying theory: the more closely related data we gather into relevant groups, the clearer the decisions become for the machines.
This made all of the most impactful data points for optimizing Search highly actionable. The most important were:
- Ad group creative data for responsive search ads
- Audience data at campaign level
- Value-per-click data at keyword and ad group level
- Search query data at both ad group and campaign level
- Demographic & geographic performance
Ad group creative data for responsive search ads: under the historical best practices, single-keyword ad groups limited the amount of data available at the ad group level. With the new structure of grouped keywords, more data accumulates at the ad group level, where the ad-selection algorithm sits, allowing it to make more informed decisions about which ad to show a given user.
Smarter segmentation and grouping of campaigns opened up the ability to tailor audience lists toward segments specific to each campaign. This allowed us to feed the Smart Bidding algorithm extremely high-value audience signals.
By grouping keywords into ad groups based on their value per click, we eased Smart Bidding's job of clustering similar-performing queries, giving us a strategic advantage, especially in low-data situations such as the always-valuable long-tail keywords.
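To make the idea concrete, here is a minimal sketch of grouping keywords into value-per-click buckets. The keyword names, revenue figures and thresholds are purely illustrative assumptions, not data from the actual account.

```python
# Hypothetical sketch: bucket keywords into ad groups by value per click.
# All names, figures and thresholds below are illustrative assumptions.

def value_per_click(revenue, clicks):
    """Average revenue generated per click for a keyword."""
    return revenue / clicks if clicks else 0.0

def assign_bucket(vpc, thresholds=(5.0, 15.0)):
    """Place a keyword into a low/mid/high value bucket."""
    low, high = thresholds
    if vpc < low:
        return "low-value"
    if vpc < high:
        return "mid-value"
    return "high-value"

keywords = [
    {"keyword": "protein powder", "revenue": 4200.0, "clicks": 800},
    {"keyword": "vegan protein powder 1kg", "revenue": 900.0, "clicks": 45},
    {"keyword": "cheap supplements", "revenue": 150.0, "clicks": 120},
]

# Group keywords with similar economics so the bidding algorithm
# sees coherent clusters instead of mixed signals.
ad_groups = {}
for kw in keywords:
    bucket = assign_bucket(value_per_click(kw["revenue"], kw["clicks"]))
    ad_groups.setdefault(bucket, []).append(kw["keyword"])
```

The point of the bucketing is that sparse long-tail keywords inherit the bid signals of their higher-volume neighbours in the same bucket.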
Themed grouping of search query data allowed for intelligent use of n-gram logic, turning big data into actionable data. By applying the automated negative-keyword setup at different themed levels, we limited wasteful spend while retaining as much revenue production as possible, all in an automated manner.
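The n-gram approach can be sketched as follows: break search queries into word fragments, aggregate cost and conversions per fragment, and flag fragments that spend without converting as negative-keyword candidates. The queries, figures and thresholds below are illustrative assumptions; Precis' actual in-house tool is not public.

```python
# Hypothetical sketch of n-gram negative-keyword mining on a
# search query report. All data and thresholds are illustrative.
from collections import defaultdict

def ngrams(query, n):
    """Split a query into overlapping n-word fragments."""
    words = query.lower().split()
    return [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]

def wasteful_ngrams(report, n=1, min_cost=50.0):
    """Aggregate cost/conversions per n-gram and return fragments
    that accrued meaningful cost but zero conversions."""
    stats = defaultdict(lambda: {"cost": 0.0, "conversions": 0})
    for row in report:
        for gram in ngrams(row["query"], n):
            stats[gram]["cost"] += row["cost"]
            stats[gram]["conversions"] += row["conversions"]
    return sorted(
        gram for gram, s in stats.items()
        if s["conversions"] == 0 and s["cost"] >= min_cost
    )

report = [
    {"query": "buy protein powder", "cost": 120.0, "conversions": 6},
    {"query": "free protein powder samples", "cost": 80.0, "conversions": 0},
    {"query": "free workout plan", "cost": 40.0, "conversions": 0},
]

negatives = wasteful_ngrams(report)  # candidates to add as negative keywords
```

Running this at different themed levels (ad group, campaign) is what keeps the exclusions precise: a fragment that wastes spend in one theme may convert well in another.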
Lastly, as user targeting becomes ever more prevalent, knowing your ideal audience is paramount. Better-structured and aggregated demographic and geographic data allowed us, once again, to automate and optimize this area seamlessly.
Just two weeks after the launch of the new Search setup, the results made it apparent that the strategy was highly effective.
We established a performance baseline in the months leading up to the launch of the new strategy: Bodystore's generic search averaged 19% year-over-year revenue growth while hitting its ROAS target of 500%. Since the first week after launch, Bodystore.dk has sustained more than three months of +100% YoY revenue growth for its non-brand Search efforts while continuing to meet the same 500% ROAS target.
It became clear that the historical best practice of retaining control is flawed, and that the future of Search lies in working with the algorithms rather than against them. It is about finding creative ways to give the algorithms a better foundation for their decisions, and about tailoring each account's structure to the client's business goals.