analytics/lib/plausible_web/controllers/api/external_stats_controller.ex
Uku Taht e27734ed79
[Continued] Google Analytics import (#1753)
* Add has_imported_stats boolean to Site

* Add Google Analytics import panel to general settings

* Get GA profiles to display in import settings panel

* Add import_from_google method as entrypoint to import data

* Add imported_visitors table

* Remove conflicting code from migration

* Import visitors data into clickhouse database

* Pass another dataset to main graph for rendering in red

This adds another entry to the JSON data returned via the main graph API
called `imported_plot`, which is similar to `plot` in form but will be
populated with previously imported data. Currently it simply returns
the values from `plot` divided by two. The data is rendered in the main
graph in red, without fill and without an indicator for the present.
Rationale: imported data will not continue to grow, so there is no
projection forward, only backwards.
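
As a rough sketch (values made up; the `labels` and `present_index` keys
are assumptions about the existing payload), the main graph JSON now
looks like:

    %{
      "plot" => [120, 95, 143, 180],
      # same shape as "plot", populated from imported data, drawn in red
      "imported_plot" => [60, 47, 71, 90],
      "labels" => ["2021-01-01", "2021-01-02", "2021-01-03", "2021-01-04"],
      "present_index" => nil
    }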

* Hook imported GA data to dashboard timeseries plot

* Add settings option to forget imported data

* Import sources from google analytics

* Merge imported sources when queried

* Merge imported source data and native data when querying sources

* Start converting metrics to atoms so they can be subqueried

This changes "visitors" and in some places "sources" to atoms. This does
not change the behaviour of the functions - the tests all pass unchanged
following this commit. This is necessary as joining subqueries requires
that the keys in `select` statements be atoms and not strings.
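
To illustrate (a minimal Ecto sketch, not the app's actual queries;
table and column names are hypothetical): fields selected in a subquery
can only be referenced from the outer query by atom key.

    import Ecto.Query

    # Keys in the inner `select` map must be atoms, otherwise the
    # outer query could not reference them as n.site_id / n.visitors.
    native =
      from(e in "events",
        group_by: e.site_id,
        select: %{site_id: e.site_id, visitors: count(e.user_id, :distinct)}
      )

    # Outer query joining the native subquery with an imported table.
    merged =
      from(n in subquery(native),
        join: i in "imported_visitors",
        on: i.site_id == n.site_id,
        select: %{site_id: n.site_id, visitors: n.visitors + i.visitors}
      )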

* Convert GA (direct) source to empty string

* Import utm campaign and utm medium from GA

* format

* Import all data types from GA into new tables

* Handle larger amounts of data more safely

* Fix some mistakes in tables

* Make GA requests in chunks of 5 queries

* Only display imported timeseries when there is no filter

* Correctly show last 30 minutes timeseries when 'realtime'

* Add with_imported key to Query struct

* Account for injected :is_not filter on sources from dashboard

* Also add tentative imported_utm_sources table

This needs a bit more work on the Google import side, as GA does not
report sources and UTM sources as distinct things.

* Return imported data to dashboard for rest of Sources panel

This extends the merge_imported function definition for sources to
utm_sources, utm_mediums and utm_campaigns too. This appears to be
working on the DB side but something is incomplete on the client side.

* Clear imported stats from all tables when requested

* Merge entry pages and exit pages from imported data into unfiltered dashboard view

This requires converting the `"visits"` and `"visit_duration"` metrics
to atoms so that they can be used in ecto subqueries.

* Display imported devices, browsers and OSs on dashboard

* Display imported country data on dashboard

* Add more metrics to entries/exits for modals

* make sure data is returned via API with correct keys

* Import regions and cities from GA

* Capitalize device upon import to match native data

* Leave query limits/offsets until after possibly joining with imported data

* Also import timeOnPage and pageviews for pages from GA

* imported_countries -> imported_locations

* Get timeOnPage and pageviews for pages from GA

These are needed for the pages modal, and for calculating exit rates for
exit pages.
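
For reference, the exit rate these fields feed into follows the
standard definition (helper name hypothetical):

    # Exit rate for a page = exits from the page / pageviews of the page.
    defp exit_rate(_exits, 0), do: nil
    defp exit_rate(exits, pageviews), do: Float.round(exits / pageviews * 100, 1)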

* Add indicator to dashboard when imported data is being used

* Don't show imported data as a separate line on main graph

* "bounce_rate" -> :bounce_rate, so it works in subqueries

* Drop imported browser and OS versions

These are not needed.

* Toggle displaying imported data by clicking indicator

* Parse referrers with RefInspector

- Use 'ga:fullReferrer' instead of 'ga:source'. This provides the actual
  referrer host + path, whereas 'ga:source' includes utm_mediums and
  other values when relevant.
- 'ga:fullReferrer' does however include search engine names directly,
  so these are checked for manually, as RefInspector won't pick up on
  them (see the sketch below).
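
A minimal sketch of that parsing step (function name and the engine
names are hypothetical; assumes a scheme must be prepended before
RefInspector can parse the value):

    # Bare search engine names can't be parsed as URLs, so they are
    # special-cased before handing the value to RefInspector.
    defp parse_referrer("google"), do: "Google"
    defp parse_referrer("bing"), do: "Bing"

    defp parse_referrer(full_referrer) do
      case RefInspector.parse("https://" <> full_referrer) do
        %{source: :unknown} -> full_referrer
        %{source: source} -> source
      end
    end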

* Keep imported data indicator on dashboard and strikethrough when hidden

* Add unlink google button to import panel

* Rename some GA browsers and OSes to plausible versions

* Get main top pages and exit pages panels working correctly with imported data

* mix format

* Fetch time_on_pages for imported data when needed

* entry pages need to fetch bounces from GA

* "sample_percent" -> :sample_percent as only atoms can be used in subqueries

* Calculate bounce_rate for joined native and imported data for top pages modal

* Flip some query bindings around to be less misleading

* Fixup entry page modal visit durations

* mix format

* Fetch bounces and visit_duration for sources from GA

* add more source metrics used for data in modals

* Make sources modals display correct values

* imported_visitors: bounce_rate -> bounces, avg_visit_duration -> visit_duration

* Merge imported data into aggregate stats

* Reformat top graph side icons

* Ensure sample_percent is yielded from aggregate data

* filter event_props should be strings

* Hide imported data from frontend when using filter

* Fix existing tests

* fix tests

* Fix imported indicator appearing when filtering

* comma needed, lost when rebasing

* Import utm_terms and utm_content from GA

* Merge imported utm_term and utm_content

* Rename imported Countries data as Locations

* Set imported city schema field to int

* Remove utm_terms and utm_content when clearing imported

* Clean locations import from Google Analytics

- Country and region should be set to "" when GA provides "(not set)".
- City should be set to 0 for "unknown", as we cannot reliably import
  city data from GA (see the sketch below).
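
A sketch of that cleaning step (helper names hypothetical):

    # Map GA's placeholder values onto what native data uses.
    defp clean_location("(not set)"), do: ""
    defp clean_location(value), do: value

    defp clean_city_id("unknown"), do: 0
    defp clean_city_id(city_id), do: city_id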

* Display imported region and city in dashboard

* os -> operating_system in some parts of code

The inconsistency of using os in some places and operating_system in
others causes trouble with subqueries and joins for the native and
imported data, which would require additional logic to account for. The
simplest solution is to just use a consistent word for all uses. This
doesn't make any user-facing or database changes.

* to_atom -> to_existing_atom

* format

* "events" metric -> :events

* ignore imported data when "events" in metrics

* update "bounce_rate"

* atomise some more metrics from new city and region api

* atomise some more metrics for email handlers

* "conversion_rate" -> :conversion_rate during csv export

* Move imported data stats code to own module

* Move imported timeseries function to Stats.Imported

* Use Timex.parse to import dates from GA

* has_imported_stats -> imported_source

* "time_on_page" -> :time_on_page

* Convert imported GA data to UTC

* Clean up GA request code a bit

There was some weird logic here with two separate lists that really
ought to be together, so this merges those.

* Fail sooner if GA timezone can't be identified

* Link imported tables to site by id

* imported_utm_content -> imported_utm_contents

* Import GA data from all time

* Reorganise GA data fetch logic

- Fetch data from the start of time (2005)
- Check whether no data was fetched and, if so, inform the user and
  don't consider data to be imported (see the sketch below).
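
A minimal sketch of that guard (all names hypothetical):

    # Import the full GA history; bail out when GA returns nothing, so
    # the site is never marked as having imported data it doesn't have.
    case fetch_ga_report(auth, profile, ~D[2005-01-01], Date.utc_today()) do
      {:ok, []} -> {:error, :no_data}
      {:ok, rows} -> import_rows(site, rows)
      {:error, reason} -> {:error, reason}
    end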

* Clarify removal of "visits" data when it isn't in metrics

* Apply location filters from API

This makes it consistent with sources etc., which filter out 'Direct /
None' on the API side. These filters are used by both the native and
imported data handling code, which would otherwise both duplicate the
filters in their `where` clauses.

* Do not use changeset for setting site.imported_source

* Add all metrics to all dimensions

* Run GA import in the background

* Send email when GA import completes

* Add handler to insert imported data into tests and imported_browsers_factory

* Add remaining import data test factories

* Add imported location data to test

* Test main graph with imported data

* Add imported data to operating systems tests

* Add imported data to pages tests

* Add imported data to entry pages tests

* Add imported data to exit pages tests

* Add imported data to devices tests

* Add imported data to sources tests

* Add imported data to UTM tests

* Add new test module for the data import step

* Test import of sources GA data

* Test import of utm_mediums GA data

* Test import of utm_campaigns GA data

* Add tests for UTM terms

* Add tests for UTM contents

* Add test for importing pages and entry pages data from GA

* Add test for importing exit page data

* Fix module file name typo

* Add test for importing location data from GA

* Add test for importing devices data from GA

* Add test for importing browsers data from GA

* Add test for importing OS data from GA

* Paginate GA requests to download all data

* Bump clickhouse_ecto version

* Move RefInspector wrapper function into module

* Drop timezone transform on import

* Order imported data by site_id then date

* More strings -> atoms

Also changes a conditional to be a bit nicer

* Remove parallelisation of data import

* Split sources and UTM sources from fetched GA data

GA has only a "source" dimension and no "UTM source" dimension; instead
it returns these combined. The logic used here to tease them apart
(sketched below) is:

1. "(direct)" -> it's a direct source
2. if the source is a domain -> it's a source
3. "google" -> it's from adwords; let's make this a UTM source "adwords"
4. else -> just a UTM source
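
A sketch of that branching (function name and the domain heuristic are
assumptions):

    defp split_source("(direct)"), do: {:source, ""}
    # GA reports AdWords traffic under the source "google"
    defp split_source("google"), do: {:utm_source, "adwords"}

    defp split_source(source) do
      # Treat anything that looks like a domain as a referral source.
      if String.contains?(source, ".") do
        {:source, source}
      else
        {:utm_source, source}
      end
    end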

* Keep prop names in queries as strings

* fix typo

* Fix import

* Insert data to clickhouse in batches

* Fix link when removing imported data

* Merge source tables

* Import hostname as well as pathname

* Record start and end time of imported data

* Track import progress

* Fix month interval with imported data

* Do not JOIN when imported date range has no overlap

* Fix time on page using exits

Co-authored-by: mcol <mcol@posteo.net>
2022-03-10 15:04:59 -06:00


defmodule PlausibleWeb.Api.ExternalStatsController do
  use PlausibleWeb, :controller
  use Plausible.Repo
  use Plug.ErrorHandler
  alias Plausible.Stats.{Query, Props}

  def realtime_visitors(conn, _params) do
    site = conn.assigns[:site]
    query = Query.from(site, %{"period" => "realtime"})
    json(conn, Plausible.Stats.Clickhouse.current_visitors(site, query))
  end
  def aggregate(conn, params) do
    site = conn.assigns[:site]
    params = Map.put(params, "sample_threshold", "infinite")

    with :ok <- validate_period(params),
         :ok <- validate_date(params),
         query <- Query.from(site, params),
         {:ok, metrics} <- parse_metrics(params, nil, query) do
      results =
        if params["compare"] == "previous_period" do
          prev_query = Query.shift_back(query, site)

          [prev_result, curr_result] =
            Task.await_many(
              [
                Task.async(fn -> Plausible.Stats.aggregate(site, prev_query, metrics) end),
                Task.async(fn -> Plausible.Stats.aggregate(site, query, metrics) end)
              ],
              10_000
            )

          Enum.map(curr_result, fn {metric, %{value: current_val}} ->
            %{value: prev_val} = prev_result[metric]

            {metric,
             %{
               value: current_val,
               change: percent_change(prev_val, current_val)
             }}
          end)
          |> Enum.into(%{})
        else
          Plausible.Stats.aggregate(site, query, metrics)
        end

      json(conn, %{results: Map.take(results, metrics)})
    else
      {:error, msg} ->
        conn
        |> put_status(400)
        |> json(%{error: msg})
    end
  end
  def breakdown(conn, params) do
    site = conn.assigns[:site]
    params = Map.put(params, "sample_threshold", "infinite")

    with :ok <- validate_period(params),
         :ok <- validate_date(params),
         {:ok, property} <- validate_property(params),
         query <- Query.from(site, params),
         {:ok, metrics} <- parse_metrics(params, property, query) do
      limit = String.to_integer(Map.get(params, "limit", "100"))
      page = String.to_integer(Map.get(params, "page", "1"))
      results = Plausible.Stats.breakdown(site, query, property, metrics, {limit, page})

      results =
        if property == "event:goal" do
          prop_names = Props.props(site, query)

          Enum.map(results, fn row ->
            Map.put(row, "props", prop_names[row[:goal]] || [])
          end)
        else
          results
        end

      json(conn, %{results: results})
    else
      {:error, msg} ->
        conn
        |> put_status(400)
        |> json(%{error: msg})
    end
  end
  defp validate_property(%{"property" => property}) do
    {:ok, property}
  end

  defp validate_property(_) do
    {:error,
     "The `property` parameter is required. Please provide at least one property to show a breakdown by."}
  end

  defp event_only_property?("event:name"), do: true
  defp event_only_property?("event:props:" <> _), do: true
  defp event_only_property?(_), do: false
  @event_metrics ["visitors", "pageviews", "events"]
  @session_metrics ["visits", "bounce_rate", "visit_duration"]
  defp parse_metrics(params, property, query) do
    metrics =
      Map.get(params, "metrics", "visitors")
      |> String.split(",")

    event_only_filter = Map.keys(query.filters) |> Enum.find(&event_only_property?/1)

    valid_metrics =
      if event_only_property?(property) || event_only_filter do
        @event_metrics
      else
        @event_metrics ++ @session_metrics
      end

    invalid_metric = Enum.find(metrics, fn metric -> metric not in valid_metrics end)

    if invalid_metric do
      cond do
        event_only_property?(property) && invalid_metric in @session_metrics ->
          {:error,
           "Session metric `#{invalid_metric}` cannot be queried for breakdown by `#{property}`."}

        event_only_filter && invalid_metric in @session_metrics ->
          {:error,
           "Session metric `#{invalid_metric}` cannot be queried when using a filter on `#{event_only_filter}`."}

        true ->
          {:error,
           "The metric `#{invalid_metric}` is not recognized. Find valid metrics from the documentation: https://plausible.io/docs/stats-api#get-apiv1statsbreakdown"}
      end
    else
      {:ok, Enum.map(metrics, &String.to_atom/1)}
    end
  end
  def timeseries(conn, params) do
    site = conn.assigns[:site]
    params = Map.put(params, "sample_threshold", "infinite")

    with :ok <- validate_period(params),
         :ok <- validate_date(params),
         :ok <- validate_interval(params),
         query <- Query.from(site, params),
         {:ok, metrics} <- parse_metrics(params, nil, query) do
      graph = Plausible.Stats.timeseries(site, query, metrics)
      metrics = metrics ++ [:date]

      json(conn, %{results: Enum.map(graph, &Map.take(&1, metrics))})
    else
      {:error, msg} ->
        conn
        |> put_status(400)
        |> json(%{error: msg})
    end
  end
  def handle_errors(conn, %{kind: kind, reason: reason}) do
    json(conn, %{error: Exception.format_banner(kind, reason)})
  end

  defp percent_change(old_count, new_count) do
    cond do
      old_count == 0 and new_count > 0 ->
        100

      old_count == 0 and new_count == 0 ->
        0

      true ->
        round((new_count - old_count) / old_count * 100)
    end
  end
defp validate_date(%{"period" => "custom"} = params) do
with {:ok, date} <- Map.fetch(params, "date"),
[from, to] <- String.split(date, ","),
{:ok, _from} <- Date.from_iso8601(String.trim(from)),
{:ok, _to} <- Date.from_iso8601(String.trim(to)) do
:ok
else
:error ->
{:error,
"The `date` parameter is required when using a custom period. See https://plausible.io/docs/stats-api#time-periods"}
_ ->
{:error,
"Invalid format for `date` parameter. When using a custom period, please include two ISO-8601 formatted dates joined by a comma. See https://plausible.io/docs/stats-api#time-periods"}
end
end
defp validate_date(%{"date" => date}) do
case Date.from_iso8601(date) do
{:ok, _date} ->
:ok
{:error, msg} ->
{:error,
"Error parsing `date` parameter: #{msg}. Please specify a valid date in ISO-8601 format."}
end
end
defp validate_date(_), do: :ok
  defp validate_period(%{"period" => period}) do
    if period in ["day", "7d", "30d", "month", "6mo", "12mo", "custom"] do
      :ok
    else
      {:error,
       "Error parsing `period` parameter: invalid period `#{period}`. Please find accepted values in our docs: https://plausible.io/docs/stats-api#time-periods"}
    end
  end

  defp validate_period(_), do: :ok
  @valid_intervals ["date", "month"]
  @valid_intervals_str Enum.map(@valid_intervals, &("`" <> &1 <> "`")) |> Enum.join(", ")
  defp validate_interval(%{"interval" => interval}) do
    if interval in @valid_intervals do
      :ok
    else
      {:error,
       "Error parsing `interval` parameter: invalid interval `#{interval}`. Valid intervals are #{@valid_intervals_str}"}
    end
  end

  defp validate_interval(_), do: :ok
end