How to Scrape Website Data into Google Sheets

Extracting data from websites can be incredibly useful.

It can benefit your research, APIs, applications, listings, and more.

In many cases, you might be interested in scraping this data into a Google Sheets spreadsheet for easier access and sharing.

Today, we will use a free web scraper to extract data from a website into Google Sheets.

Web Scraping Data into Google Sheets
For this example, we will use ParseHub, a free and powerful web scraper that can extract data from any website.

We will be extracting data from Amazon’s results page for the term “computer monitor” and exporting it to a Google Sheets spreadsheet that will be automatically updated.

Make sure to download ParseHub for free before getting started.

Setting Up Your Project
Now, let’s get scraping.

Open ParseHub, click on “New Project”, and enter the URL you will be scraping. Once again, for this example, we will be scraping data from Amazon’s results page for the term “computer monitor”. The page will now render inside the app.
Web scraping Amazon on ParseHub
Once the page is rendered, a select command will be created by default. Begin by clicking on the first product name on the page. It will be highlighted in green to indicate that it has been selected.
Click on the second product name on the list to select them all. All product names will now be highlighted in green. On the left sidebar, rename your selection to “product”.
Extracting product names on Amazon
You will notice that ParseHub is now pulling each product’s name and URL.

For now, we will keep our scraping project fairly simple. However, if you want to scrape more data from Amazon, such as product pricing and details, check out our guide on how to scrape Amazon product data.
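For reference, when you later download your results in JSON format, a selection like this one produces one record per product. The sketch below is an illustrative assumption based on the selection name and the extracted name and URL, not literal output:

    {
      "product": [
        {"name": "Example 24-inch Computer Monitor", "url": "https://www.amazon.com/example-monitor-1"},
        {"name": "Example 27-inch Computer Monitor", "url": "https://www.amazon.com/example-monitor-2"}
      ]
    }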

Scheduling Future Scrapes
You can now export data from ParseHub to Google Sheets. However, you might want your scrape to run on a schedule so that it keeps pulling fresh data into Google Sheets.

If you prefer to run a one-time scrape, skip to the next section.

Note: Project Scheduling is a paid ParseHub feature.

To schedule your scraping project, follow these steps:

Click the green Get Data button on the left sidebar.
Click the “Schedule” button.
From the dropdown, you can select how often you’d like to run your scrape and at what time.
Once you’ve set your schedule, click on Save and Schedule.
Your project will now run automatically at the times you scheduled. A new tab for this schedule will be created on the “Get Data” page. After the first scheduled run completes, you can click on this tab to open a page where your data can be downloaded. Your data will be available in Excel and JSON format.
Next, let’s go over how to automatically export your results to Google Sheets.

Exporting Data Directly to Google Sheets
ParseHub allows you to export your scrape results directly to Google Sheets via its API.

Here’s how to set it up:

Go to the settings page of your project.
Find your API key by clicking on the “profile” image in the top-right corner of the toolbar. Click “Account” and you will see your API key listed.
Open a new Google Sheets spreadsheet.
In cell A2, type =IMPORTDATA()
The IMPORTDATA() function in Google Sheets
Between the parentheses, enter the following URL in quotation marks: https://www.parsehub.com/api/v2/projects/PROJECT_TOKEN/last_ready_run/data?api_key=API_KEY&format=csv
In the URL above, replace PROJECT_TOKEN with the actual project token from the “Settings” tab of your project.
Replace API_KEY with the API key from your account.
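With the placeholders filled in, the completed formula takes this shape (PROJECT_TOKEN and API_KEY shown here still need to be replaced with your own values):

    =IMPORTDATA("https://www.parsehub.com/api/v2/projects/PROJECT_TOKEN/last_ready_run/data?api_key=API_KEY&format=csv")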
Once you’ve entered your formula, your data will be populated automatically after you run your scrape at least once. To do this, click the green Get Data button on the left sidebar and then click on “Run”.

Exported data in Google Sheets
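If you’d rather pull the same data with a script instead of a spreadsheet formula, the endpoint above can be called directly. Here is a minimal Python sketch, assuming the requests library is installed; PROJECT_TOKEN and API_KEY are placeholders for your own values:

    import requests

    PROJECT_TOKEN = "YOUR_PROJECT_TOKEN"  # from the project's "Settings" tab
    API_KEY = "YOUR_API_KEY"              # from your ParseHub account page

    # Request the results of the most recent completed run.
    # This is the same endpoint used by the IMPORTDATA formula above;
    # format can be "csv" or "json".
    url = f"https://www.parsehub.com/api/v2/projects/{PROJECT_TOKEN}/last_ready_run/data"
    response = requests.get(url, params={"api_key": API_KEY, "format": "csv"})
    response.raise_for_status()

    # Save the CSV so it can be opened in any spreadsheet application.
    with open("results.csv", "w", encoding="utf-8") as f:
        f.write(response.text)

Running this saves the latest results as results.csv, ready to import into any spreadsheet.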
Closing Thoughts
You now know how to automatically export data from any website to Google Sheets.

If you run into any issues while setting up your project, contact us via the live chat on our site and we will be happy to help you.
