
How to Automatically Send Structured Data to Airtable

Scraped data should be stored in an easily accessible place so you can use it for other workflows. Here's how to automatically send your data to Airtable.
by Julianne Youngberg


    When you’re collecting data from an online source, you want to do so in a way that’s easy to organize, store, and format. This is especially true when it comes to automated scraping—information that’s easy to process requires fewer steps in your workflows, which in turn increases speed and accuracy.

    You can scrape data from webpages in a structured format using a tool like Browserbear. And because this cloud-based tool can be integrated with others, you can create entire no-code workflows that don’t require you to manually pass data from one program to the next.

    In this tutorial, you’ll learn not only how to use Browserbear to scrape structured data but also how to automatically send it to Airtable so it can be used for other processes.

    What is Browserbear?

    Browserbear is a cloud-based tool for automating web browsing tasks. You can use it to save structured data, test websites, and capture screenshots, among other things.

    Screenshot of Browserbear home page

    Using Browserbear, you can easily automate actions based on specific conditions by creating a task and triggering it using the REST API or no-code tools such as Zapier and Make. By integrating this task with other tools, you can save, manipulate, and optimize your data. This frees you from manual, repetitive work and allows you to focus on more important tasks.
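
    If you prefer working in code, a task can also be triggered directly over the REST API instead of through Zapier. Below is a minimal Python sketch; the endpoint path, header format, and response shape are assumptions based on common REST conventions, so check the Browserbear API docs for the exact details.

    ```python
    # Hypothetical sketch: trigger a Browserbear task run over the REST API.
    # The endpoint and response shape are assumptions; consult the API docs.
    import os

    import requests

    API_KEY = os.environ["BROWSERBEAR_API_KEY"]  # from your account settings
    TASK_ID = "your_task_id"  # placeholder for your task's ID

    response = requests.post(
        f"https://api.browserbear.com/v1/tasks/{TASK_ID}/runs",
        headers={"Authorization": f"Bearer {API_KEY}"},
    )
    response.raise_for_status()
    run = response.json()
    print(run)  # the queued run, including an ID you can poll later
    ```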

    What is Airtable?

    Airtable is a cloud-based platform that brings together the best of spreadsheets and relational data management tools. With over 1,000 native integrations, this app makes it easy to create custom workflows in a few clicks.

    Third-party integration services, scripting extensions, automation scripts, webhook triggers, and an enterprise API (Application Programming Interface) also make it possible to take advantage of other apps’ capabilities.

    What You Will Create

    The structured data you scrape needs to be put into a database before you can use it in other workflows. But it’s not always efficient to manually upload a CSV file whenever you have a new set of data to work with. We will be setting up an automation that scrapes the Browserbear blog page, then sends the data to Airtable, creating a record for each item.

    The tools you will be using are:

    • Airtable: To store blog card information
    • Browserbear: To create a browser automation that scrapes blog card information
    • Zapier: To trigger actions in different programs based on pre-specified commands

    By the end of the tutorial, you should be able to automatically populate an Airtable database with data:

    Screenshot of Airtable blog card base

    All you need is a Browserbear task, an Airtable base, and Zapier to tie it all together.

    How to Send Scraped Structured Data to Airtable

    Airtable's many integrations make it a convenient storage option for data scraped with Browserbear. Setting up a workflow with Zapier allows you to automatically create up to 10 new records of scraped data at a time, reducing manual effort on your part.

    Here's how to set up a workflow that sends scraped structured data to Airtable:

    Create a Browserbear Task

    Log into your Browserbear account (or create a free trial account if you don’t have one—no credit card required! 🎉). Go to the Tasks tab, then click Create a Task.

    Name the task, then click Save.

    Screenshot of Browserbear new task setup

    You’ll now be on a task page where you can add steps, run your automation, and view completed runs.

    Click Add Step and set up the following actions:

    Step 1: Go

    This step instructs Browserbear to go to a URL and wait until it has loaded.

    Choose go as the Action and insert your destination URL. Select networkidle as the wait instruction.

    Screenshot of Browserbear go action setup

    Click Save.

    Bear Tip 🐻: The networkidle load instruction waits until no new network requests are made for 500 ms and is safest in the majority of situations. Other options include domcontentloaded and load.
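
    As an aside, networkidle follows the same convention used by browser automation libraries such as Playwright, where navigation resolves once the network has been quiet for roughly 500 ms. This sketch is illustrative Playwright code, not Browserbear configuration:

    ```python
    # Illustrative only: the same "networkidle" wait semantics in Playwright.
    from playwright.sync_api import sync_playwright

    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        # Resolves once no network requests have fired for ~500 ms
        page.goto("https://www.browserbear.com/blog", wait_until="networkidle")
        print(page.title())
        browser.close()
    ```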

    Step 2: Save Structured Data

    This step will define a container and the elements you want to scrape within it. You’ll only need to set up the scraping process for one container—it will then apply to all others on the page that contain the same elements.

    The Browserbear blog page has six blog cards. Let’s set up a scraping action for the first one.

    Choose save_structured_data as the Action.

    You will need to use the Browserbear Helper Chrome extension for the helper configs. Activate the extension on the fully loaded destination website.

    Screenshot of Browserbear Helper Chrome extension

    Hover over the parent container. You should see a blue outline indicating the selection.

    Screenshot of Browserbear Helper parent container outline

    Click the element, then copy the config that appears in the pop-up window.

    Screenshot of Browserbear Helper parent container config

    Insert it into the Helper section of your save_structured_data step.

    Screenshot of Browserbear save_structured_data action parent container setup

    Now, let’s set up the individual HTML elements you want scraped. Return to your fully loaded destination, hover over the element you want to select, and click when you see a blue outline.

    Screenshot of Browserbear blog page with parent container and children HTML elements outlined

    Copy the resulting config into the Helper Config section of the Data Picker. Add a name and specify the type of data you want to pull (the default is Text).

    Screenshot of Browserbear save_structured_data action children HTML element setup

    Click Add Data when you’re done.

    Repeat this step for as many HTML elements as you want, then click Save once all elements appear in the log on the right side.

    Screenshot of Browserbear save_structured_data action setup with log outlined in red

    Now, from your task page, click Run Task to test it.

    Screenshot of Browserbear task page with red arrow pointing to run task

    A run will appear at the bottom of your task page, showing whether it completed successfully.

    Screenshot of Browserbear task run with red arrow pointing to log

    Click Log to see the output!

    Screenshot of Browserbear task log with scraped output outlined in red

    Make adjustments to your task until it yields the data set you want, then move on to the next step.
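
    If you ever need a run’s output outside of Zapier, it can also be fetched over the API. This sketch assumes a GET endpoint for individual runs and an "outputs" key in the response; both are guesses based on typical REST design, so verify them against the Browserbear docs.

    ```python
    # Hypothetical sketch: fetch a completed run and read its scraped output.
    import os

    import requests

    API_KEY = os.environ["BROWSERBEAR_API_KEY"]
    TASK_ID = "your_task_id"  # placeholder
    RUN_ID = "your_run_id"    # placeholder, returned when the run is created

    response = requests.get(
        f"https://api.browserbear.com/v1/tasks/{TASK_ID}/runs/{RUN_ID}",
        headers={"Authorization": f"Bearer {API_KEY}"},
    )
    response.raise_for_status()
    run = response.json()
    print(run.get("outputs"))  # assumed key holding the structured data
    ```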

    Build an Airtable Base to Store Scraped Data

    Airtable will store your scraped data, keeping it easy to access if you want to add more to your workflow.

    Log into Airtable and create a new base. Create fields for each element you’re scraping. We will use the following:

    • Title
    • Date
    • Full URL (as formula: CONCATENATE("https://browserbear.com",{Link}))
    • Link
    • Description

    Your full table will look something like this:

    Screenshot of Airtable blog card base

    Note: The Full URL field is included because the link scraped from Browserbear is only a file path and does not include the domain. The CONCATENATE formula combines the two.

    We will use Zapier to populate the table with scraped data in the next step.

    Set up a Zap to Save Scraped Data

    Now, we’ll set up a short, two-step zap that triggers when a run is completed in Browserbear and sends the output to Airtable.

    Log into your Zapier account, click + Create Zap, then set up the following events:

    Trigger: Run Finished in Browserbear

    Choose Browserbear as the app and Run Finished as the event. You’ll need to connect your Browserbear account using your API key.

    Set up the trigger by selecting the right task.

    Screenshot of Zapier Browserbear run finished trigger setup

    Test the trigger to make sure Zapier is able to find a completed run.

    Action: Create Records (With Line Item Support) in Airtable

    Choose Airtable as the app and Create Records (With Line Item Support) as the event. If you haven’t connected your account yet, do so using the API key found on your Account page.

    Set up the action by choosing the right base and table, then mapping all of the scraped elements to their corresponding Airtable fields.

    Note: A maximum of 10 records can be created per request. If you need more, you may need to split your data into batches and set up separate requests.
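
    For reference, here is what that batching looks like against Airtable’s REST API directly. The base ID, table name, and field names below are placeholders; the 10-records-per-request limit and the {"records": [...]} payload shape match Airtable’s documented Web API.

    ```python
    # Sketch: create scraped records in Airtable in batches of 10.
    import os
    from urllib.parse import quote

    import requests

    API_KEY = os.environ["AIRTABLE_API_KEY"]
    BASE_ID = "appXXXXXXXXXXXXXX"  # placeholder base ID
    TABLE_NAME = "Blog Cards"      # placeholder table name
    URL = f"https://api.airtable.com/v0/{BASE_ID}/{quote(TABLE_NAME)}"

    def create_in_batches(rows, batch_size=10):
        """POST records in chunks of 10, Airtable's per-request maximum."""
        for i in range(0, len(rows), batch_size):
            chunk = rows[i : i + batch_size]
            payload = {"records": [{"fields": row} for row in chunk]}
            response = requests.post(
                URL,
                headers={"Authorization": f"Bearer {API_KEY}"},
                json=payload,
            )
            response.raise_for_status()

    create_in_batches([
        {"Title": "Example Post", "Date": "Jan 1, 2023",
         "Link": "/blog/example", "Description": "An example blog card."},
    ])
    ```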

    Screenshot of Zapier Airtable create records action setup

    Test the action to make sure your data auto-populates the table. Your result should look something like this:

    Screenshot of Airtable blog cards populated base

    And there you have it—scraped data automatically sent to Airtable!

    Bear Tip 🐻: You will need a separate automation that triggers the task run as frequently as needed. Consider setting up a different zap to schedule it on a recurring basis or in response to a certain event.
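
    If you’d rather not use a second zap for scheduling, a small script on a scheduler (cron, GitHub Actions, or similar) can trigger the run instead. This sketch reuses the hypothetical trigger endpoint from earlier and simply fires it once a day:

    ```python
    # Hypothetical sketch: trigger the Browserbear task on a daily schedule.
    # A cron job or hosted scheduler is more robust than a long-lived loop;
    # the loop just keeps the example self-contained.
    import os
    import time

    import requests

    API_KEY = os.environ["BROWSERBEAR_API_KEY"]
    TASK_ID = "your_task_id"  # placeholder

    while True:
        requests.post(
            f"https://api.browserbear.com/v1/tasks/{TASK_ID}/runs",
            headers={"Authorization": f"Bearer {API_KEY}"},
        )
        time.sleep(24 * 60 * 60)  # wait one day between runs
    ```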

    Automate Scraping Workflows

    Automation relies on consistency and minimal manual input, so finding a way to automatically send properly formatted data across programs is crucial. Fortunately, with so many no-code tools available today, you’ll never be short of creative ways to solve problems.

    Want to learn more about saving scraped structured data? Check out these articles:

    👉🏽 How to Automatically Scrape Website Data and Save to Notion (No Code)

    👉🏽 How to Automatically Scrape Structured Data and Save to Google Sheets

    And learn more about scraping with Browserbear here:

    👉🏽 How to Scrape Data from a Website Using Browserbear (Part 1)

    👉🏽 How to Scrape Data from a Website Using Browserbear (Part 2)

    About the author: Julianne Youngberg (@paradoxicaljul)
    Julianne is a technical content specialist fascinated with digital tools and how they can optimize our lives. She enjoys bridging product-user gaps using the power of words.
