Scrape Web Content into Google Sheets

Sometimes the information you want from a website is not available through an API or an RSS feed. Without those, you would have to copy and paste the data into a spreadsheet before you could analyze it. Web scraping is a much better alternative for getting content from any website into a more usable format, such as Google Sheets. In effect, web scraping builds an API for websites that don't provide one, and a web scraping tool saves you all that time spent copying and pasting.

Google Sheets has a wonderful IMPORTDATA() function that fetches data from a URL. By pointing it at a ParseHub API URL containing the API key and project token created by ParseHub, a visual web scraping tool, you get a continuous refresh of data: every time your project runs against the website you are scraping, the sheet picks up the newly scraped content.
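Concretely, the finished formula, which we build step by step in section 3, will look like this (PROJECT_TOKEN and API_KEY are placeholders for your own values):

=IMPORTDATA("https://www.parsehub.com/api/v2/projects/PROJECT_TOKEN/last_ready_run/data?api_key=API_KEY&format=csv")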

In this tutorial we will show you how to:

  1. Scrape events from a website using ParseHub.
  2. Schedule the project to scrape new events continuously.
  3. Import the event name, date and location into Google Sheets, using the IMPORTDATA() function with the ParseHub API key and project token.

1. Scrape events from a website

Open ParseHub and the Toronto events website

  1. Open the ParseHub desktop application.
  2. Open the following link in the browser: http://www.toronto.com/events/
  3. Click on "Create New Project" and "Start project on this URL".

Select and scrape all of the events on the page

  1. Select the name of the first event on the webpage by clicking on it.
  2. Click on another event name and all of the events on the page will be selected.
  3. Rename the selection to "events". You have now extracted the name and the URL of each event. Because of the "Begin New Entry" command, each event will be given its own row in your data set.

[Image: scrape events from an events website]

Scrape the dates of each event

  1. Add a Relative Select command by clicking the "plus" button next to the "Begin new entry in events" command.
  2. Click on one event name.
  3. Hover over the "Dates" section of each event. You will notice that the highlight surrounds both the dates and the location. We need to zoom in to select only the dates.
  4. While hovering over the date, hold down the Control key (or Command key on Mac) and press the number 2 on your keyboard.
  5. As soon as you see the blue highlight which says "Text node" that surrounds only the dates - click on the dates. The dates for all of the events should now be selected for you.
  6. Rename the selection to "date". The dates will be automatically extracted for you.

[Image: scrape event dates from an events website]

Scrape the location for each event

  1. Add another Relative Select command from the plus button beside "Begin new entry in events".
  2. Click on the event name, then click on the location of the same event to connect the two with an arrow. All of the locations will be selected and extracted.
  3. Rename the selection to "locations". (A sketch of the resulting data follows below.)

[Image: scrape event locations from an events website]
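At this point, each event occupies one row of your data set. Assuming ParseHub's usual selection_attribute column-naming convention, the result should look roughly like this (the values shown are placeholders, not real scraped data):

events_name, events_url, events_date, events_locations
Some Event Name, http://www.toronto.com/events/..., Jan 1 - Jan 5, Some Venue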

Do you want to get events from all of the pages? Check out this pagination tutorial.

2. Schedule the project to scrape new events continuously

You can run this project once and populate your Google Sheet in step 3 with a single snapshot of data. However, we will show you how to schedule the project to run once a day instead.

  1. Click on the "Get Data" button in the ParseHub application.
  2. Click on the "Schedule" button.

[Image: schedule a web scraper to scrape results continuously]

  3. From the dropdown you can select "every week" or "every hour"; for this project, "every day" is already selected for you.
  4. You can also choose the time. By default, the project will run at midnight, but you can select any time you want from the dropdowns.
  5. Click on the "Save and Schedule" button.

    [Image: schedule a web scraper to scrape results continuously]

  6. Now your project will run automatically at your scheduled times. A new tab for this schedule will be created on the "Get Data" page. After the first scheduled run completes, you can click on this tab to open a page where your data can be downloaded. Your data will be available in Excel and JSON formats. (A programmatic alternative is sketched below.)
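If you would rather fetch a run's results in code than download them by hand, here is a minimal Python sketch against the ParseHub REST API. It assumes the requests library is installed, that your top-level selection is named "events" as above, and uses placeholder values for the project token and API key (we find the real ones in step 3):

import requests

# Placeholders - substitute the real values described in step 3.
PROJECT_TOKEN = "YOUR_PROJECT_TOKEN"  # from the project's "Settings" tab
API_KEY = "YOUR_API_KEY"              # from your ParseHub account page

# Request the data from the most recent completed run, as JSON.
url = "https://www.parsehub.com/api/v2/projects/%s/last_ready_run/data" % PROJECT_TOKEN
response = requests.get(url, params={"api_key": API_KEY, "format": "json"})
response.raise_for_status()

# The JSON is assumed to be keyed by the selection name used in step 1.
for event in response.json().get("events", []):
    print(event.get("name"), event.get("date"), event.get("locations"), sep=" | ")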

3. Import events into a Google Sheet

Instead of downloading the data from ParseHub as a one-time Excel file, we will import it into Google Sheets. In this example we will get the latest run of data from our events project.

Every time the project runs and scrapes the website, this Google Sheet will be updated with the new events; Google Sheets re-fetches IMPORTDATA results periodically (roughly every hour), so no manual refresh is needed.

Find your API key and project token

  1. Open the project we just worked on.
  2. Find your project token in the "Settings" tab of the project design.
  3. Find your API key by clicking on the "profile" icon in the top-right corner of the toolbar. Click "Account" and you will see your API key listed.

Open Google Sheets and create your IMPORTDATA function

  1. Open a new Google Sheet.
  2. Click on the A1 cell and type in =IMPORTDATA()

[Image: scrape data in Google Sheets]

  3. Inside the =IMPORTDATA() function, build your URL like the following:

=IMPORTDATA("https://www.parsehub.com/api/v2/projects/PROJECTTOKEN/lastreadyrun/data?apikey=API_KEY&format=csv")

  4. Replace PROJECT_TOKEN with the actual project token from the "Settings" tab of your project.
  5. Replace API_KEY with the API key from your account. (A filled-in example follows this list.)
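For illustration, with a hypothetical project token and API key (yours will be different), the finished formula would look something like this:

=IMPORTDATA("https://www.parsehub.com/api/v2/projects/tExAmPlEt0kEn/last_ready_run/data?api_key=ExAmPlEkEy123&format=csv")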

We created this URL based on the ParseHub API reference. Take a look at it for more information.

[Image: use the IMPORTDATA() function in Google Sheets to scrape]

If you did everything correctly, you should see the data from your project appear almost immediately.

[Image: use the IMPORTDATA() function for web scraping]

We are always open to hearing about your ideas and hacks. If you know a great way to use ParseHub with another tool, shoot us an email at angelina[at]parsehub[dot]com.