This is an extract of my talk about this use case:
Anyway, I studied a bit how everything works, and it is not possible that way because Google Analytics uses a lot of JavaScript and iframes that are not easy to manage.
So after studying a bit I chose to try iMacros, a Firefox extension that replays actions on a page.
Anyway, I wrote a little macro. Let me explain its workflow:
- Get the page (which needs to be already open, with the various filters applied)
- Press the Export button at the top
- Wait 2 seconds for the menu to appear
- Click the Export as CSV button
- Wait 4 seconds
- Scroll to the bottom
- Press the button for the next page
- Scroll to the top
- Wait 6 seconds (the page updates automatically with an overlay)
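The steps above can be sketched as an iMacros macro. This is only a rough outline: the `ATTR` selectors below are hypothetical and need to be adapted to the actual markup of the Analytics page.

```
' Rough sketch of the workflow — selectors are placeholders, not real ones
TAG POS=1 TYPE=BUTTON ATTR=TXT:Export
WAIT SECONDS=2
TAG POS=1 TYPE=A ATTR=TXT:CSV
WAIT SECONDS=4
' Scroll to the bottom of the page
URL GOTO=javascript:window.scrollTo(0,document.body.scrollHeight);
TAG POS=1 TYPE=BUTTON ATTR=CLASS:next-page*
' Scroll back to the top
URL GOTO=javascript:window.scrollTo(0,0);
WAIT SECONDS=6
```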
The next step is to run the macro based on the number of pages. In my case it was simply 170000/5000, where 170000 is the number of URLs and 5000 is the maximum shown by Analytics. That way I knew the macro needed to run 34 times, and iMacros has a field for that.
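The same calculation, done as a ceiling division so it also works when the total is not an exact multiple of the page size:

```python
import math

total_urls = 170000    # number of URLs to export
rows_per_page = 5000   # maximum rows Analytics shows per page

iterations = math.ceil(total_urls / rows_per_page)
print(iterations)  # 34
```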
The next step is to configure the browser to download the files automatically, without opening the dialog asking where to save them.
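In Firefox this can be done from the download settings, or via `about:config` / a `user.js` file in the profile. A sketch of the relevant preferences, assuming the exports are served as `text/csv` (the MIME type and path may differ on your setup):

```
// user.js — assumed values, adjust path and MIME type to your setup
user_pref("browser.download.folderList", 2);                      // use a custom folder
user_pref("browser.download.dir", "/path/to/exports");
user_pref("browser.helperApps.neverAsk.saveToDisk", "text/csv");  // skip the save dialog
```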
For this script I used Firefox Developer Edition because it uses a different profile, so it doesn't conflict with my daily Firefox.
The last step is up to you: merge all the CSV files into a single one.
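The merge takes only a few lines of Python. A minimal sketch, assuming all the exported files sit in one folder, share the same header row, and match a pattern like `analytics-*.csv` (both the pattern and the output name here are just examples):

```python
import csv
import glob

def merge_csv(pattern, out_path):
    """Merge all CSV files matching `pattern` into `out_path`,
    keeping the header row from the first file only."""
    files = sorted(glob.glob(pattern))
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        for i, path in enumerate(files):
            with open(path, newline="") as f:
                reader = csv.reader(f)
                header = next(reader, None)  # skip per-file header
                if header is None:
                    continue  # empty file, nothing to copy
                if i == 0:
                    writer.writerow(header)  # write the header once
                writer.writerows(reader)

# The filename pattern is an assumption — adjust it to your exports.
merge_csv("analytics-*.csv", "merged.csv")
```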
Why CSV instead of XLSX? Because CSV is easier to manage and script against, it is easy to convert to XLSX, and at the same time you can open it without problems in Office or LibreOffice.