Automating exports from Google Analytics for scheduled reporting

Post 52 of 81

While the most important and valuable analysis is usually ad hoc and done by an analyst, there is very often a need for scheduled exports and reporting of Google Analytics data.

Google Analytics itself provides some automation for custom reports, but only for a limited number of dimensions and metrics, and the results are subject to sampling (fast access mode).

The Analytics Canvas command line functionality goes much further, making it possible to query up to 7 dimensions and 10 metrics easily, and even more by using the power of Analytics Canvas to join multiple data sets. By automatically loading Google Analytics data into a database, even more sophisticated analysis can be done, and sampling can be avoided by using query partitioning.

  • Run jobs at any frequency: monthly, weekly, daily, or hourly
  • Integrate Google Analytics into existing business intelligence and reporting systems
  • Automatically extract data from Google Analytics using the API, then insert it into databases or files
  • Generate Excel dashboards directly from templates automatically populated with Google Analytics data
  • Avoid fast access mode sampling by running multiple queries seamlessly, delivering accurate data
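The query partitioning mentioned above can be sketched in a few lines of Python. This is a generic illustration of the idea, not Analytics Canvas code: a large date range is split into small sub-ranges, each queried separately so it stays under the session threshold that triggers sampling, and the unsampled results are then stitched back together.

```python
from datetime import date, timedelta

def partition_date_range(start, end):
    """Split [start, end] into single-day ranges; each small range is
    queried on its own so the result comes back unsampled."""
    day = start
    while day <= end:
        yield day, day
        day += timedelta(days=1)

# One week becomes seven separate, unsampled queries whose rows
# are concatenated afterwards.
partitions = list(partition_date_range(date(2016, 8, 1), date(2016, 8, 7)))
print(len(partitions))  # 7
```

Finer or coarser partitions (hourly, weekly) follow the same pattern; the trade-off is more API calls in exchange for exact data.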

With the command line functionality, not only can Google Analytics data be exported automatically, but subsequent tasks can then be run seamlessly once your analytics data has been successfully extracted.

Get started now: Automate!

The free trial of Analytics Canvas now includes the command line capability.

Try it free

This capability is licensed as the Automation subscription, which is used in conjunction with the client subscriptions to provide a complete solution. Developers create and run canvases using Analytics Canvas desktop, then run them on the command-line-enabled server version as needed.

For a brief overview of how to run Analytics Canvas from the command line, check out the command line documentation. It has examples and everything you need to start extracting from Google Analytics on an automatic schedule, avoiding mindless data manipulation for reporting so you can spend more time doing analysis and driving your website's performance forward.

This article was written by James Standen

James is the founder of nModal Solutions, the creator of the Analytics Canvas tools. nModal's vision is to bring an entirely new class of visual, flexible tools to web analytics and social media analysis. You can find him on Google+.


Cezanne Huq, July 25, 2012 at 4:27 pm

Is it possible to derive raw data/clickstream files from Google Premium using your tool?

James Standen, July 26, 2012 at 7:26 am

Hi Cezanne,

Google does not (yet) provide API access to the raw data export that is available in Google Premium (Google, if you are reading, we’d love it!).

But we do have a number of clients who are Google Premium customers; they use our query partitioning feature to avoid sampling and get exact data in the majority of cases.

James Standen, August 7, 2016 at 12:06 pm

Update: with Analytics 360 and BigQuery, the full raw clickstream is available, and Canvas has a wide range of features and capabilities to take full advantage of it.

Anshuman Tripathy, January 28, 2015 at 2:53 pm

How can one automate the export of data instead of having to ‘Run with all data and write to exports’? This currently works, but I don’t see any documentation on how to do it automatically.

James Standen, February 4, 2015 at 2:16 pm

Hi Anshuman,

If you look in the help files or the online documentation under Automation, it describes how to run Analytics Canvas from the command line, or how to use the Automation wizard to generate a .bat file that will run one or more canvases. It is also possible to have emails sent on the success or failure of automated runs. All of this functionality comes with the Automation subscription, an add-on to the other desktop subscriptions that provides a machine license allowing Canvas to run in this automated mode, enabling scheduling and integration into enterprise systems.
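The pattern described above, where a scheduler invokes a canvas and a wrapper reacts to success or failure, can be sketched generically in Python. The executable path, canvas file, and command syntax below are placeholders, not the real Analytics Canvas CLI flags; those are in the automation documentation.

```python
import subprocess

# Hypothetical paths -- substitute your own install location and canvas file.
CANVAS_EXE = r"C:\Program Files\Analytics Canvas\AnalyticsCanvas.exe"
CANVAS_FILE = r"C:\jobs\daily_export.canvas"

def build_command(exe, canvas_file):
    """Assemble the command a scheduler (Windows Task Scheduler, cron)
    would run; the argument shown is a placeholder, not real syntax."""
    return [exe, canvas_file]

def run_canvas_job(exe=CANVAS_EXE, canvas_file=CANVAS_FILE):
    """Return True on success, so a caller can send a notification
    email on failure, mirroring the email-on-failure feature above."""
    result = subprocess.run(build_command(exe, canvas_file))
    return result.returncode == 0
```

In practice the generated .bat file plays the role of `build_command`, and the scheduler's own failure handling (or the built-in email notifications) replaces the wrapper.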

Bhanu Prasad Chintha, August 6, 2016 at 5:40 pm

Hello team, can we export all the content (all accounts and properties/tracking codes) from Google Analytics every day to a custom SQL database, and store the analytics data for n years to see the trend in a visual representation?


James Standen, August 7, 2016 at 12:05 pm

Hi Bhanu,

Analytics Canvas will let you export huge amounts of data from all your Google Analytics accounts. It can be automated to run daily, or at any frequency required, and can append the data to a wide range of databases, including SQL Server.

Whether you have Google Analytics Standard or Analytics 360, all the data-extract options are available. Even with GA Standard, Analytics Canvas can increase the number of dimensions and metrics available from the Core Reporting API by running multiple queries and stitching the results together.

To look at the trends, our users typically use a tool like Tableau or Excel (or any tool that connects to the database) to see everything across years of history and potentially hundreds or even thousands of Google Analytics views.

Bhanu Prasad Chintha, August 7, 2016 at 3:23 pm

Hello James,

Thanks for your quick reply.

We are planning to use GA Standard for our analytics. May I know if there are any limitations on the data that can be exported using Analytics Canvas?

We are expecting a minimum of 50,000 data points per day per tracking code/property, and there can be up to 5,000 tracking codes in use at a time to track 5,000 different workspaces (using the maximum number of tracking codes available per Google account: 1 Google account = 100 Analytics accounts, with 50 tracking codes per account = 5,000 tracking codes).

So if the Analytics Canvas job is started daily at some time, how much time might it take to complete execution across all tracking codes in the worst case? Can you please let me know the details of this use case, as it will be a key deciding factor.

We want to use MS Power BI as the visualization tool for the custom SQL database.


James Standen, August 7, 2016 at 3:35 pm

There are no limits in Analytics Canvas itself; the constraints are the response time of the Google Analytics API and the quota set by Google. For the Core Reporting API, the quota is up to 10,000 API calls per day per view, and each API call can load 10,000 data points. We have many customers who routinely load millions of rows of data.

In terms of the time required, we would have to look at the details of how your accounts break down, and of course it depends on the response time from Google. But loading hundreds of thousands or even millions of rows daily would not be a concern; it would typically take at most a few hours, usually run very early in the morning so the data is ready when users come in. We also have customers who load data multiple times per day.
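The quota arithmetic for the scenario above works out comfortably. A quick back-of-the-envelope check, using the figures already given (50,000 rows per property per day, 10,000 rows per API call, quota counted per view):

```python
import math

rows_per_day_per_property = 50_000   # from the scenario above
properties = 5_000
rows_per_api_call = 10_000           # Core Reporting API rows per call

# Minimum calls needed to page through one property's daily rows.
calls_per_property = math.ceil(rows_per_day_per_property / rows_per_api_call)
total_calls = calls_per_property * properties

print(calls_per_property)  # 5 -- far below the 10,000/day per-view quota
print(total_calls)         # 25000 calls across all properties combined
```

Since the 10,000-call quota applies per view, five calls per property per day leaves enormous headroom; the real bottleneck is API round-trip time, as noted above.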

Bhanu, September 28, 2016 at 9:31 am

Hello James,

We have a use case of 4,000 site collections. If we enable Google Analytics for all of those site collections with different tracking codes and want to export the data from Google Analytics to a custom SQL database, do we need to create 4,000 Analytics Canvas mappings manually, or do you provide a script or API to create these canvases automatically? That would save us the burden of creating the canvases by hand.

Thanks & Regards,

Ameet Wadhwani, September 28, 2016 at 10:06 am

Hi Bhanu,

This is an interesting data challenge and one that has been brought forward to us a number of times.

Managing 4,000 files would be far too cumbersome, so we’ve designed a feature that allows you to develop and maintain as few as one Canvas file, iterating through it for each of your sites and writing out to your SQL database.

We call it Parameterization. It’s a feature that lets you pass in variables such as a Google Analytics ViewID, start and end dates, and other parameters you define yourself. This way you can use one Canvas file as your template and set up an automation that loops through your list of sites, passing the appropriate parameters to each job in the process.
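The looping idea behind Parameterization can be sketched generically in Python. The ViewIDs below are made up, and the dictionary shape is illustrative, not the actual Canvas parameter format; the point is that one template plus a list of parameter sets replaces thousands of hand-built files.

```python
from datetime import date

# Hypothetical ViewIDs; in practice these would come from the GA
# Management API or a maintained list of your sites.
view_ids = ["12345678", "23456789", "34567890"]

def build_jobs(view_ids, start, end):
    """One parameter set per site; an automation would feed each set
    into the same template Canvas file on every run."""
    return [
        {"view_id": v,
         "start_date": start.isoformat(),
         "end_date": end.isoformat()}
        for v in view_ids
    ]

jobs = build_jobs(view_ids, date(2016, 9, 1), date(2016, 9, 30))
```

Scaling from three sites to 4,000 only changes the length of the list; the template and the loop stay the same.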

If you write to us, someone from our team would be happy to show you how this works and develop a proof of concept based on your specific needs.