
I was in a client’s market this week, and one of the things we worked on was a refresh of the database for their Hot AC station. Over the course of my career, I’ve done more of these than I can count: first as a program director, then as the guy in charge of programming for a group, and finally as a consultant. As a manager and consultant, many of those refreshes were contentious and incredibly time-consuming as we went song by song and the programmer fought for why certain songs should remain in an active category. That makes sense; in their mind, they had a reason for adding each song at some point and a reason to keep playing it. Although, often, when pushed, it became clear that the main reason was simply that they liked the tune. Those refreshes generally ended with a compromise where I was just trying to get the best version of the database they’d sign off on, so they’d still be passionate about programming it. This one, however, was a lot easier and more on-target. Here are the two reasons why.
We’re using better data
Now that we look at market-level streaming and sales data to inform the add and move suggestions we make to our clients each week, we’re in a much better position to suggest refresh changes to the recurrent and gold categories as well. For weekly adds and moves, we use data collected from their market over the past seven days, specific to their format. When it’s time to do a quarterly refresh, we use a year-to-date, past-12-month, or past-5-to-10-year report, depending on how far back the categories or format goes. This gives us a very accurate picture of exactly which songs people who like that station’s music within their market are choosing to listen to on their own, which makes it incredibly easy, and fast, to comb through recurrent and gold categories that have swelled and choose what to rest. Plus, with categories that have a good song count, it helps us identify a couple of songs that are either in rest/hold categories or missing from the database altogether and swap those in for a few weaker active songs to freshen things up.
A local programmer who embraces a data-driven approach
But even with great data, how effective a refresh is depends on whether the local programmer is willing to take a data-driven approach to programming. I have worked with this programmer for over a year and have been incredibly impressed by how well he’s adapted to using this data. Every week, when I send my suggestions, I also send the reports that led to those suggestions. Many radio programmers don’t take the time to open and look at those reports because they look a lot different from what they’re used to: the fifty songs on the Mediabase airplay chart for their format and maybe a couple of other charts. His extra effort allows him to find the occasional other song to add or move on his own and to trust that what we’re suggesting is accurate, so he feels comfortable following our advice. So, when it comes time to refresh the whole database, we’re both on the same page, and it’s a quick and effective process that fine-tunes the station to fit his local market and gives it an edge over competitors running the same songs in every market across the country.
Why is it so important to use good data when programming music? Because every music scene is constantly evolving, and staying on top of that evolution requires either good data or countless man-hours that could be spent on other parts of the job.
What do you think? What’s your process for database refreshes and how often do you do them? Comment below or email me at Andy@RadioStationConsultant.com. Also, if you’re interested in getting a sneak peek at market level data from your market, schedule a meeting with me using this link.