One of the great use cases for using the API is to build SEO reports like the one below. I'll show you how in the next video tutorial.

The reason I am using the Google Search Console API is that the GSC user interface is quite limited for advanced reporting. It shows only 1,000 rows at a time, and you can't extract keywords per page at scale. In Google Search Console, you can look at a page and check the keywords for that specific page, but checking each page manually is a painful process and provides limited insight. The Search Console API provides a way around that by letting you extract all of the data, helping you build advanced SEO reports using Python.

Get Started With the GSC API

Before we get to the video, you will need to do a few things to be able to access the API.
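As a rough idea of what that setup can look like, here is a minimal sketch of authenticating and building the service object with google-api-python-client. The service-account flow and the "service_account.json" file name are assumptions for illustration; the repository may use a different OAuth flow.

```python
# A minimal sketch, assuming a service account key file.
# "service_account.json" is a hypothetical file name; the actual
# repository may authenticate with an interactive OAuth flow instead.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

creds = service_account.Credentials.from_service_account_file(
    "service_account.json", scopes=SCOPES
)
# "webmasters" v3 is the Search Console (Search Analytics) API.
webmasters_service = build("webmasters", "v3", credentials=creds)
```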
Clone the Repository

First, you need to clone the code from my Github account.

```python
from gsc_to_csv_by_month import gsc_to_csv
args = webmasters_service, site, output, creds, start_date
```
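Presumably those args are then unpacked into gsc_to_csv; the call below is my assumption of how the pieces fit together, not a confirmed excerpt from the repository.

```python
# Hypothetical usage: pass the tuple of arguments to gsc_to_csv.
# The real signature lives in gsc_to_csv_by_month in the repository.
gsc_to_csv(*args)
```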
What the script does is loop one day at a time. For each of those days, it makes a first request to extract the first 25K rows of data. Then, for that same date, it makes another request to get the rows from 25K to 50K, then from 50K to 75K, until all the rows are extracted for that specific date. Once that day is completed, it writes the data to a new or existing CSV file, depending on the situation. Once this is done, it moves to the next date and repeats the process.
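To make that loop concrete, here is a sketch of the paging and CSV logic under my own naming. fetch_day, gsc_days_to_csv, and the CSV columns are illustrative assumptions, not the repository's actual implementation.

```python
import csv
import os
from datetime import timedelta

ROW_LIMIT = 25000  # the Search Analytics API returns at most 25K rows per request


def fetch_day(webmasters_service, site, day):
    """Page through all Search Analytics rows for a single date."""
    rows, start_row = [], 0
    while True:
        body = {
            "startDate": day,
            "endDate": day,
            "dimensions": ["page", "query"],
            "rowLimit": ROW_LIMIT,
            "startRow": start_row,  # 0, then 25K, then 50K, ...
        }
        response = (
            webmasters_service.searchanalytics()
            .query(siteUrl=site, body=body)
            .execute()
        )
        batch = response.get("rows", [])
        rows.extend(batch)
        if len(batch) < ROW_LIMIT:
            break  # a short batch means the date is exhausted
        start_row += ROW_LIMIT
    return rows


def gsc_days_to_csv(webmasters_service, site, output, start_date, end_date):
    """Loop one day at a time and append each day's rows to the CSV."""
    day = start_date
    while day <= end_date:
        rows = fetch_day(webmasters_service, site, day.isoformat())
        new_file = not os.path.exists(output)  # new vs. existing CSV file
        with open(output, "a", newline="") as f:
            writer = csv.writer(f)
            if new_file:
                writer.writerow(["date", "page", "query",
                                 "clicks", "impressions", "ctr", "position"])
            for row in rows:
                page, query = row["keys"]
                writer.writerow([day.isoformat(), page, query,
                                 row["clicks"], row["impressions"],
                                 row["ctr"], row["position"]])
        day += timedelta(days=1)  # move to the next date and repeat
```

Stopping when a batch comes back shorter than the row limit is the simplest end-of-data check; it avoids an extra empty request on dates that happen to have an exact multiple of 25K rows.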