We Implemented a Bot: Here's How It's Going
I wrote an article a few weeks ago about how I thought we might need a bot. Check it out before you dig into this "middle of the experiment" update. If clicking through to other articles isn't your thing, here's the short version: we wanted a better way to connect interested website visitors to our sales team, so we signed up for a 90-day test with the Drift chatbot.
We're coming up on the end of our 90-day test, so we thought it'd be interesting to share how it's going. Spoiler alert: it's going well.
In our first 45 days, we worked hard to figure out what would produce the best results, not just for our sales team but for the end user. We created over 100 different iterations of chat playbooks on our website, and probably tweaked those iterations close to a thousand times, always with the actual humans chatting with our chatbot in mind. When a channel produces leads the way we expected our chatbot to, it becomes easy to think solely about conversion optimization. We consistently forced ourselves to step back and weigh the human's experience equally against the channel's ability to produce leads.

One early learning: our account-focused chatbot worked better when a face greeted the website visitor instead of our logo. The visitors seeing this chatbot were from companies we very much wanted to talk to. We fed this valuable account list into Drift and created a chatbot playbook that would immediately recognize them by name. After not seeing a single meeting set in the first two weeks, we changed our greeting message and included a photo of the salesperson they'd eventually be meeting with. Not long after those changes, we saw higher-quality conversations take place.
We learned early on that our chatbot performed better when paired with a face, compared to our company logo.
Another key learning came from digging into what people wanted to do on specific pages. Why does a demo request have to be the ultimate conversion if someone is willing to share their information in exchange for something else of value? We created a chatbot focused on turning Spredfast blog readers into Spredfast blog subscribers. We set the chatbot to greet a reader after 10 seconds, so they had time to gain value from the content, and then invited them to subscribe so they'd never miss a post. That conversion is focused on what they want at that moment, rather than asking for a demo on a blog article. In the last 30 days, we've also built chatbots that greet visitors differently depending on the page they're on.

All of this sounds like a simple exercise in human behavior. We all know that people visit different pages for different reasons, yet in many cases we still expect every visitor to fill out the same demo form instead of offering them something of value in exchange for their contact information.
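For readers curious what this page-by-page logic amounts to, here is a minimal sketch in Python. This is purely illustrative: Drift playbooks are configured in its visual builder, not in code, and every rule, path, and message below is a hypothetical stand-in, not our actual configuration.

```python
# Hypothetical sketch of page-dependent greeting rules, in the spirit of the
# playbooks described above. Each rule pairs a URL path prefix with a delay
# (seconds before the greeting appears) and a message matched to what a
# visitor likely wants on that page.
GREETING_RULES = [
    # (path prefix, delay in seconds, greeting message)
    ("/blog", 10, "Enjoying the post? Subscribe so you never miss one."),
    ("/pricing", 3, "Questions about pricing? Chat with us."),
    ("/", 5, "Hi! Want to see how Spredfast works?"),  # catch-all fallback
]

def pick_greeting(path: str):
    """Return (delay_seconds, message) for the first rule matching the path."""
    for prefix, delay, message in GREETING_RULES:
        if path.startswith(prefix):
            return delay, message
    return None

# Example: a blog reader gets the subscription prompt after a 10-second delay,
# while a pricing-page visitor gets a sales-oriented greeting almost at once.
blog_greeting = pick_greeting("/blog/chatbot-update")
pricing_greeting = pick_greeting("/pricing/enterprise")
```

The point isn't the code itself; it's that "greet visitors differently depending on the page they're on" is just a small rule table, and the real work is choosing delays and messages that match visitor intent.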
A major change we made several weeks into our trial was experimenting with a chatbot that spoke to everyone on our site, not just accounts we uploaded or accounts that fit certain criteria. We wanted to look at all traffic on our site to see if we were missing solid leads that fell outside our original customer profile. This was a risk, so we hedged our bets by not connecting this chatbot to our CRM until we had a better sense of the quality we could expect. We were pleasantly surprised to see highly qualified people who would have never been in our database chat with this bot. While we didn't have them schedule meetings directly with salespeople, we let them leave their email addresses to hear back from a human. We've produced multiple highly qualified sales opportunities through this chatbot with people we would never have reached before. It was a lesson in running a controlled experiment, and a solid reminder that anyone can stumble onto your website.
Overall, the results speak to our observations from the previous article. Marketers willing to experiment with ways to deliver a fast and simple customer experience on their website can be rewarded with more meetings and better conversions.
To break down the numbers, our chatbot experiment has produced (so far):
- 763 Conversations on our Website
- 169 Emails Captured
- 23 Meetings
- 10 Opportunities Generated
When we consider how small a qualified audience we target, these numbers are great. When we take into account how many steps between interest and a meeting we've removed, these numbers represent a positive customer experience.
Our next steps are incorporating other digital strategies into additional chatbots: retargeting, serving consideration content to prospects with open opportunities, and refining our keyword responses. We're excited to grow our chatbot program and continue exploring how our website performs as a marketing and brand experience tool.