10,000 Leads in 10 Minutes: Using Web Data to Generate High-Quality Sales Leads en Masse

*Editor's Note: Guest post by Andrew Fogg, co-founder and Chief Data Officer of Import.io. Import.io is a free tool that allows you to transform the web into a table of data or an API quickly and easily, without having to write any code. This blog post is based on a presentation he gave at HustleCon on August 1st, 2014. Slides are embedded below.*

What is a Lead?

OK, let’s start with the basics. Leads represent the first stage of the sales process. In its simplest form, a lead is any “person or entity that has an interest and authority to purchase your product or service”. In other words: someone you can sell to.

Sounds good. So, what information do you need about that person or entity for it to be an actionable lead? A good rule of thumb is to look for the information that you would find on a business card, i.e. a name, associated company and contact details.

Where Do Leads Come From?

The most traditional way to get leads is by buying databases of telephone numbers, email addresses or mailing addresses. As you can imagine, these lists are immense, which gives you a high quantity of leads, but the quality is notoriously very poor.

Alternatively, you can attend trade shows and other industry events and collect business cards the old-fashioned way. Or you can seek people out on social media and try to get in touch with them through channels like Facebook, LinkedIn and Twitter. Leads generated in this way are usually high quality, but they come with a high time cost, which limits the quantity you can generate.

But Wait, There’s Another Way!

I want to share with you a new approach to lead-generation that will deliver both quantity and quality – it is based on web data. This approach was developed by some of our earliest users, and it is both ingenious and simple:

  1. Find a website where your ideal user can be found
  2. Build an API to that website (using Import.io naturally) and extract as much data about each lead as you can
  3. Pull that data into a spreadsheet

That’s it. Three simple steps and it takes about 10 minutes, after which you will have thousands of quality leads to work with.
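Import.io itself is a no-code tool, but for readers who think in code, the three steps above can be sketched in plain Python. Everything here is illustrative: the HTML snippet stands in for a real listing page, and in practice the extraction is done by training an Import.io API rather than writing regexes.

```python
import csv
import io
import re

# Step 1 (assumed): a fragment of a hypothetical listing page where
# the ideal leads — estate brokers — can be found.
html = """
<div class="broker"><a href="mailto:jane@example.com">Jane Doe</a>
<span class="phone">(212) 555-0100</span></div>
<div class="broker"><a href="mailto:sam@example.com">Sam Roe</a>
<span class="phone">(212) 555-0101</span></div>
"""

# Step 2: extract a name, email and phone number from each listing block.
pattern = re.compile(
    r'href="mailto:(?P<email>[^"]+)">(?P<name>[^<]+)</a>\s*'
    r'<span class="phone">(?P<phone>[^<]+)</span>')
leads = [m.groupdict() for m in pattern.finditer(html)]

# Step 3: pull the rows into a spreadsheet-friendly CSV.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "email", "phone"])
writer.writeheader()
writer.writerows(leads)
print(buf.getvalue())
```

The output is exactly the "table of data" the tool produces: one row per lead, one column per business-card field.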

Show Me That Again!

Right, let’s look at each of those steps in a bit more detail. To help you visualize how this can work for your business, I’m going to step through an example. Let’s imagine that I’m in commercial real estate and want to talk to real estate brokers.

Step 1: Find Your Ideal User

The first step will require a little bit of imagination and thinking on your part. Where your ideal user can be found of course depends on who that person is. You’ll probably need to spend some time getting to know your user and looking around the web to see where they hang out. Is it a forum? A professional association? Are they on social media?

The key here is to be as specific as possible when defining your ideal user (lead). The more specific you are the more targeted your messaging can be. In our real estate example, I am going to use this real estate listing site in NYC. If I click through to one of those properties I can see the broker’s name, email (as a link from his name) and phone number – that’s the data I’m after!

Step 2 & 3: Extract the Data and Get it in a Spreadsheet

I’ve combined steps 2 and 3 here, because they are two halves of the same process.

To get this data I have a number of options. The simplest is to build a Crawler, which will work through the site and pull data from every page that matches the ones I train it on. This means I will end up with a big list of names and contact information, which I can export to Excel, CSV or Google Sheets.

That’s great, but Crawlers only create static data sets, which means that to get new data from this site I would have to re-crawl the whole site – and that would take a while. Instead, I can do something a tad more complicated by building an Extractor to one page. Then I use the URL pattern of that page to generate all the other URLs for that site and use this batch search Google Sheet to pass all of those URLs through the Extractor. This has the benefit of being able to quickly refresh whenever I need to.
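The URL-pattern trick can be sketched in a few lines of Python. The domain and path here are hypothetical stand-ins for the real listing site, and the batching mirrors what the batch search Google Sheet does when it passes URLs through the Extractor.

```python
# Hypothetical detail-page pattern; one Extractor trained on a single
# page can be fed every URL that matches this shape.
PATTERN = "https://example-listings.com/property/{listing_id}"

def build_urls(listing_ids):
    """Expand the page's URL pattern into one URL per listing."""
    return [PATTERN.format(listing_id=i) for i in listing_ids]

def batches(urls, size=50):
    """Split the URL list into chunks for a batch search, so a
    refresh re-runs quickly without recrawling the whole site."""
    for start in range(0, len(urls), size):
        yield urls[start:start + size]

urls = build_urls(range(1, 121))
```

Because the URL list is generated rather than discovered by crawling, refreshing the data is just a matter of running the same batch again.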

A quick note about getting the email addresses. You’ll notice, if you visit the page, that the email address is displayed as a link to the estate agent’s name. When I map this data, I need to make sure I map it as a link. It may look like I’ve only mapped his name, but when I export the data into Excel or Google Sheets, I will get one column with his name and another column with the text of the link – in this case his email address.
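To make the name-plus-link mapping concrete, here is a rough sketch using Python's standard-library HTML parser. The anchor element is a made-up example; the point is that one mapped link yields two columns, the link text and the address behind it.

```python
from html.parser import HTMLParser

class LinkMapper(HTMLParser):
    """Capture both the link text (the agent's name) and the
    href (here a mailto: address) — two columns from one element."""

    def __init__(self):
        super().__init__()
        self.rows = []
        self._href = None  # set while we are inside an <a> tag

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")

    def handle_data(self, data):
        if self._href is not None:
            email = self._href.removeprefix("mailto:")
            self.rows.append({"name": data.strip(), "email": email})
            self._href = None

parser = LinkMapper()
parser.feed('<a href="mailto:jane@example.com">Jane Doe</a>')
```

After feeding the page through, `parser.rows` holds one dict per agent with both the name and the email, just like the two exported spreadsheet columns.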

In this particular example, I would also need to do a bit of data cleansing, because many of the properties are being sold by the same estate agent so I am likely to end up with a lot of duplicates. This is easily done in either Excel/Google Sheets or most mass e-mailing software like MailChimp.
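If you'd rather deduplicate in code than in a spreadsheet, a small sketch like this does the job — keyed on a lowercased email address, since the same agent often appears under many property listings:

```python
def dedupe(leads):
    """Drop duplicate leads, keeping the first row seen for each
    email address (case-insensitive)."""
    seen, unique = set(), []
    for lead in leads:
        key = lead["email"].lower()
        if key not in seen:
            seen.add(key)
            unique.append(lead)
    return unique

# Illustrative rows: the same agent listed twice with different casing.
leads = [
    {"name": "Jane Doe", "email": "jane@example.com"},
    {"name": "Jane Doe", "email": "JANE@example.com"},
    {"name": "Sam Roe", "email": "sam@example.com"},
]
unique = dedupe(leads)
```

Keeping the first occurrence mirrors what "remove duplicates" does in Excel and Google Sheets.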

Contact Those Leads

Use the data you collected to create a conversational, extremely personalized message to send to each lead. Seriously. Make it freaking personal. I CANNOT emphasize this enough. Something like this should give you an idea of how to begin:

Hi [first name], I was browsing [website name] and I came across your profile. I noticed that you mentioned [profile keyword], and I thought you would be interested in what we do….
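Filling that template programmatically is a one-liner once your leads are in a spreadsheet. The field names and values below are hypothetical; in practice each comes from a column the Extractor produced.

```python
# Template mirrors the example message above; the {placeholders}
# correspond to spreadsheet columns.
TEMPLATE = ("Hi {first_name}, I was browsing {site} and I came across "
            "your profile. I noticed that you mentioned {keyword}, and "
            "I thought you would be interested in what we do...")

def personalize(lead):
    """Fill the message template from one spreadsheet row."""
    return TEMPLATE.format(**lead)

message = personalize({"first_name": "Jane",
                       "site": "example-listings.com",
                       "keyword": "commercial sales"})
```

The more columns you extracted in step 2, the more placeholders you have to work with, and the more personal the message feels.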

Then start sending out your messages. Be creative about the channels that you use – email isn’t the only way to contact people – try other channels like Twitter, text message, et cetera.

It’s fine to automate your messages, but don’t send out too many at once – you want to avoid appearing spammy. Remember that you are starting a conversation with a real person, and you need to be able to respond and engage as people start replying to you. If you send out 1,000 messages and get 1,000 responses, you will be swamped. Ten messages a day is a good starting point; go from there.
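If you do automate the sending, build the throttle in from the start. This is a minimal sketch of the pacing idea, not any particular mailing tool's API — the `send` callable is whatever actually delivers the message on your chosen channel:

```python
import time

def send_batch(messages, send, per_day=10, delay_seconds=60):
    """Send at most `per_day` messages, pausing between each, so the
    outreach reads as a conversation rather than a blast."""
    sent = 0
    for message in messages[:per_day]:
        send(message)
        sent += 1
        time.sleep(delay_seconds)
    return sent

# Illustrative run: collect "sent" messages in a list, no real delay.
outbox = []
count = send_batch([f"msg {i}" for i in range(25)], outbox.append,
                   per_day=10, delay_seconds=0)
```

Capping at ten a day matches the starting volume suggested above and leaves you enough slack to reply to everyone who responds.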

Eat Your Own Dog Food

At import.io, we actually followed these exact steps to bring in new users. We’ve created a platform that allows anyone to turn a website into data without the need to write any code. A key group who benefit from our platform are developers: they can save a lot of time and effort using Import.io instead of writing code to get web data.

There are lots of developers on oDesk, so we built an Extractor to all 13,000 who mention “scraping” as one of the services that they provide.  We pulled this list of users into a spreadsheet. Then, we created a personalized message template inviting them to apply for a real job on oDesk that requires the use of import.io. The message also quickly outlined the benefits of using our platform over traditional methods and included a link to our website.

Quick tip: use Google URL Builder to send out your link so that you can track how well each different campaign is doing.  You can also shorten the link using the Google URL Shortener.  If you are using Google Apps for Business you can use your own domain name, which makes it look less spammy.
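What Google URL Builder produces is just a link with UTM query parameters appended, so you can also generate tagged links yourself. Here is a sketch using Python's standard library; the source/medium/campaign values are illustrative:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def tag_link(url, source, medium, campaign):
    """Append the UTM parameters Google URL Builder would add,
    so each campaign shows up separately in analytics."""
    parts = urlsplit(url)
    params = urlencode({"utm_source": source,
                        "utm_medium": medium,
                        "utm_campaign": campaign})
    query = f"{parts.query}&{params}" if parts.query else params
    return urlunsplit(parts._replace(query=query))

link = tag_link("https://import.io/", "odesk", "message", "scraping-devs")
```

Generating the links in code means every personalized message can carry its own campaign tag before you shorten it.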

Next, we built a connector to oDesk, this time to the messaging system.  The API logs in to oDesk, navigates to an individual user’s profile and sends them the personalized message. Every time a user follows the link to our site, we can follow them all the way through to account creation on our platform in our own analytics.

Using this method we have begun experimenting with getting 1,000s of quality leads to our site.

Andrew Fogg is the co-founder and Chief Data Officer of Import.io. Import.io is a free tool that allows you to transform the web into a table of data or an API quickly and easily, without having to write any code. He also writes at andrewfogg.com.

