analytics/lib/plausible_web/controllers/api/external_controller.ex
Uku Taht e27734ed79
[Continued] Google Analytics import (#1753)
* Add has_imported_stats boolean to Site

* Add Google Analytics import panel to general settings

* Get GA profiles to display in import settings panel

* Add import_from_google method as entrypoint to import data

* Add imported_visitors table

* Remove conflicting code from migration

* Import visitors data into clickhouse database

* Pass another dataset to main graph for rendering in red

This adds another entry to the JSON data returned via the main graph API
called `imported_plot`, which is similar to `plot` in form but will be
completed with previously imported data.  Currently it simply returns
the values from `plot` / 2. The data is rendered in the main graph in
red without fill, and without an indicator for the present. Rationale:
imported data will not continue to grow so there is no projection
forward, only backwards.
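
For illustration, the response might then look roughly like this (field names
other than `plot` and `imported_plot` are just for the sketch, not taken from
this diff):

    {
      "labels": ["2022-01-01", "2022-01-02", "2022-01-03"],
      "plot": [120, 95, 132],
      "imported_plot": [64, 80, 71]
    }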

* Hook imported GA data to dashboard timeseries plot

* Add settings option to forget imported data

* Import sources from google analytics

* Merge imported sources when queried

* Merge imported source data and native data when querying sources

* Start converting metrics to atoms so they can be subqueried

This changes "visitors" and in some places "sources" to atoms. This does
not change the behaviour of the functions - the tests all pass unchanged
following this commit. This is necessary as joining subqueries requires
that the keys in `select` statements be atoms and not strings.
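
For illustration only (the table and field names here are hypothetical, not
taken from this commit), the constraint looks roughly like:

    import Ecto.Query

    # Atom keys in the subquery's select can be referenced after joining:
    native =
      from(e in "events",
        group_by: e.pathname,
        select: %{pathname: e.pathname, visitors: count(e.user_id, :distinct)}
      )

    joined =
      from(n in subquery(native),
        select: %{pathname: n.pathname, visitors: n.visitors}
      )

    # With string keys (%{"visitors" => ...}) the outer query could not
    # reference n.visitors, so joining the subquery would not compile.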

* Convert GA (direct) source to empty string

* Import utm campaign and utm medium from GA

* format

* Import all data types from GA into new tables

* Handle larger amounts of data more safely

* Fix some mistakes in tables

* Make GA requests in chunks of 5 queries

* Only display imported timeseries when there is no filter

* Correctly show last 30 minutes timeseries when 'realtime'

* Add with_imported key to Query struct

* Account for injected :is_not filter on sources from dashboard

* Also add tentative imported_utm_sources table

This needs a bit more work on the google import side, as GA does not
report sources and UTM sources as distinct things.

* Return imported data to dashboard for rest of Sources panel

This extends the merge_imported function definition for sources to
utm_sources, utm_mediums and utm_campaigns too. This appears to be
working on the DB side but something is incomplete on the client side.
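
As a hedged illustration of what merging means at the result level (the real
code does the equivalent with a ClickHouse join rather than in Elixir, and the
names below are made up):

    native = [%{source: "Google", visitors: 10}, %{source: "Bing", visitors: 3}]
    imported = [%{source: "Google", visitors: 7}]

    merged =
      (native ++ imported)
      |> Enum.group_by(& &1.source)
      |> Enum.map(fn {source, rows} ->
        %{source: source, visitors: rows |> Enum.map(& &1.visitors) |> Enum.sum()}
      end)
    # => rows for "Google" (17 visitors) and "Bing" (3 visitors)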

* Clear imported stats from all tables when requested

* Merge entry pages and exit pages from imported data into unfiltered dashboard view

This requires converting the `"visits"` and `"visit_duration"` metrics
to atoms so that they can be used in ecto subqueries.

* Display imported devices, browsers and OSs on dashboard

* Display imported country data on dashboard

* Add more metrics to entries/exits for modals

* make sure data is returned via API with correct keys

* Import regions and cities from GA

* Capitalize device upon import to match native data

* Leave query limits/offsets until after possibly joining with imported data

* Also import timeOnPage and pageviews for pages from GA

* imported_countries -> imported_locations

* Get timeOnPage and pageviews for pages from GA

These are needed for the pages modal, and for calculating exit rates for
exit pages.
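
For context, an exit rate is conventionally exits divided by total pageviews
for the page; a hedged sketch of the arithmetic only (names are illustrative):

    exits = 25
    pageviews = 200
    exit_rate = if pageviews > 0, do: Float.round(100 * exits / pageviews, 1), else: nil
    # => 12.5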

* Add indicator to dashboard when imported data is being used

* Don't show imported data as a separate line on main graph

* "bounce_rate" -> :bounce_rate, so it works in subqueries

* Drop imported browser and OS versions

These are not needed.

* Toggle displaying imported data by clicking indicator

* Parse referrers with RefInspector

- Use 'ga:fullReferrer' instead of 'ga:source'. This provides the actual
  referrer host + path, whereas 'ga:source' includes utm_mediums and
  other values when relevant.
- 'ga:fullReferrer' does, however, include search engine names directly,
  so they are manually checked for, as RefInspector won't pick up on
  these (a sketch follows below).
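
A minimal sketch of that flow, assuming a hypothetical helper and that a
scheme has to be prepended before the value is handed to RefInspector:

    defmodule GAReferrer do
      # Search engine names arrive verbatim from GA and are special-cased;
      # everything else goes through RefInspector. Illustrative only.
      def parse("google"), do: "Google"
      def parse("bing"), do: "Bing"
      def parse("duckduckgo"), do: "DuckDuckGo"

      def parse(full_referrer) do
        case RefInspector.parse("https://" <> full_referrer).source do
          :unknown -> full_referrer
          source -> source
        end
      end
    end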

* Keep imported data indicator on dashboard and strikethrough when hidden

* Add unlink google button to import panel

* Rename some GA browsers and OSes to plausible versions

* Get main top pages and exit pages panels working correctly with imported data

* mix format

* Fetch time_on_pages for imported data when needed

* entry pages need to fetch bounces from GA

* "sample_percent" -> :sample_percent as only atoms can be used in subqueries

* Calculate bounce_rate for joined native and imported data for top pages modal

* Flip some query bindings around to be less misleading

* Fixup entry page modal visit durations

* mix format

* Fetch bounces and visit_duration for sources from GA

* add more source metrics used for data in modals

* Make sources modals display correct values

* imported_visitors: bounce_rate -> bounces, avg_visit_duration -> visit_duration

* Merge imported data into aggregate stats

* Reformat top graph side icons

* Ensure sample_percent is yielded from aggregate data

* filter event_props should be strings

* Hide imported data from frontend when using filter

* Fix existing tests

* fix tests

* Fix imported indicator appearing when filtering

* comma needed, lost when rebasing

* Import utm_terms and utm_content from GA

* Merge imported utm_term and utm_content

* Rename imported Countries data as Locations

* Set imported city schema field to int

* Remove utm_terms and utm_content when clearing imported

* Clean locations import from Google Analytics

- Country and region should be set to "" when GA provides "(not set)"
- City should be set to 0 for "unknown", as we cannot reliably import
  city data from GA (sketched below).
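
Sketched with hypothetical helper names, mirroring the rules above:

    defmodule GALocation do
      def clean_country("(not set)"), do: ""
      def clean_country(country), do: country

      def clean_region("(not set)"), do: ""
      def clean_region(region), do: region

      # City geoname IDs can't be derived reliably from GA's city strings,
      # so imported cities are stored as 0 ("unknown").
      def clean_city(_city), do: 0
    end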

* Display imported region and city in dashboard

* os -> operating_system in some parts of code

The inconsistency of using os in some places and operating_system in
others causes trouble with subqueries and joins for the native and
imported data, which would require additional logic to account for. The
simplest solution is to just use a consistent word for all uses. This
doesn't make any user-facing or database changes.

* to_atom -> to_existing_atom

* format

* "events" metric -> :events

* ignore imported data when "events" in metrics

* update "bounce_rate"

* atomise some more metrics from new city and region api

* atomise some more metrics for email handlers

* "conversion_rate" -> :conversion_rate during csv export

* Move imported data stats code to own module

* Move imported timeseries function to Stats.Imported

* Use Timex.parse to import dates from GA

* has_imported_stats -> imported_source

* "time_on_page" -> :time_on_page

* Convert imported GA data to UTC

* Clean up GA request code a bit

There was some weird logic here with two separate lists that really
ought to be together, so this merges those.

* Fail sooner if GA timezone can't be identified

* Link imported tables to site by id

* imported_utm_content -> imported_utm_contents

* Import GA data from all of time

* Reorganise GA data fetch logic

- Fetch data from the start of time (2005)
- Check whether no data was fetched and, if so, inform the user and don't
  consider the data to be imported (see the sketch below).
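
Roughly, with hypothetical names standing in for the real fetch and import
functions (a sketch, not the actual code):

    defmodule GAImportGuard do
      def maybe_import(site, fetch_fun, import_fun) do
        # Fetch everything from the start of 2005 up to today.
        case fetch_fun.(~D[2005-01-01], Date.utc_today()) do
          {:ok, []} -> {:error, :no_data}
          {:ok, rows} -> import_fun.(site, rows)
        end
      end
    end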

* Clarify removal of "visits" data when it isn't in metrics

* Apply location filters from API

This makes it consistent with the sources etc which filter out 'Direct /
None' on the API side. These filters are used by both the native and
imported data handling code, which would otherwise both duplicate the
filters in their `where` clauses.

* Do not use changeset for setting site.imported_source

* Add all metrics to all dimensions

* Run GA import in the background

* Send email when GA import completes

* Add handler to insert imported data into tests and imported_browsers_factory

* Add remaining import data test factories

* Add imported location data to test

* Test main graph with imported data

* Add imported data to operating systems tests

* Add imported data to pages tests

* Add imported data to entry pages tests

* Add imported data to exit pages tests

* Add imported data to devices tests

* Add imported data to sources tests

* Add imported data to UTM tests

* Add new test module for the data import step

* Test import of sources GA data

* Test import of utm_mediums GA data

* Test import of utm_campaigns GA data

* Add tests for UTM terms

* Add tests for UTM contents

* Add test for importing pages and entry pages data from GA

* Add test for importing exit page data

* Fix module file name typo

* Add test for importing location data from GA

* Add test for importing devices data from GA

* Add test for importing browsers data from GA

* Add test for importing OS data from GA

* Paginate GA requests to download all data

* Bump clickhouse_ecto version

* Move RefInspector wrapper function into module

* Drop timezone transform on import

* Order imported data by site_id then date

* More strings -> atoms

Also changes a conditional to be a bit nicer

* Remove parallelisation of data import

* Split sources and UTM sources from fetched GA data

GA has only a "source" dimension and no "UTM source" dimension. Instead
it returns these combined. The logic used to tease these apart (sketched
after the list) is:

1. "(direct)" -> it's a direct source
2. if the source is a domain -> it's a source
3. "google" -> it's from adwords; let's make this a UTM source "adwords"
4. else -> just a UTM source
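
A hedged sketch of that decision, returning {source, utm_source} (the "is it
a domain" check is a simplification, and the names are made up):

    defmodule GASource do
      def split("(direct)"), do: {nil, nil}
      def split("google"), do: {nil, "adwords"}

      def split(source) do
        # A bare domain counts as a referral source; anything else as a UTM source.
        if String.contains?(source, "."), do: {source, nil}, else: {nil, source}
      end
    end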

* Keep prop names in queries as strings

* fix typo

* Fix import

* Insert data to clickhouse in batches

* Fix link when removing imported data

* Merge source tables

* Import hostname as well as pathname

* Record start and end time of imported data

* Track import progress

* Fix month interval with imported data

* Do not JOIN when imported date range has no overlap

* Fix time on page using exits

Co-authored-by: mcol <mcol@posteo.net>
2022-03-10 15:04:59 -06:00


defmodule PlausibleWeb.Api.ExternalController do
  use PlausibleWeb, :controller
  use OpenTelemetryDecorator
  require Logger

  def event(conn, _params) do
    with {:ok, params} <- parse_body(conn),
         _ <- Sentry.Context.set_extra_context(%{request: params}),
         :ok <- create_event(conn, params) do
      conn |> put_status(202) |> text("ok")
    else
      {:error, :invalid_json} ->
        conn
        |> put_status(400)
        |> json(%{errors: %{request: "Unable to parse request body as json"}})

      {:error, errors} ->
        conn |> put_status(400) |> json(%{errors: errors})
    end
  end

  def error(conn, _params) do
    Sentry.capture_message("JS snippet error")
    send_resp(conn, 200, "")
  end

  def health(conn, _params) do
    postgres_health =
      case Ecto.Adapters.SQL.query(Plausible.Repo, "SELECT 1", []) do
        {:ok, _} -> "ok"
        e -> "error: #{inspect(e)}"
      end

    clickhouse_health =
      case Ecto.Adapters.SQL.query(Plausible.ClickhouseRepo, "SELECT 1", []) do
        {:ok, _} -> "ok"
        e -> "error: #{inspect(e)}"
      end

    status =
      case {postgres_health, clickhouse_health} do
        {"ok", "ok"} -> 200
        _ -> 500
      end

    put_status(conn, status)
    |> json(%{
      postgres: postgres_health,
      clickhouse: clickhouse_health
    })
  end

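  # User-agent parsing is relatively expensive, so parsed results are cached in
  # the :user_agents Cachex cache, keyed by the raw header value.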
@decorate trace("ingest.parse_user_agent")
defp parse_user_agent(conn) do
user_agent = Plug.Conn.get_req_header(conn, "user-agent") |> List.first()
if user_agent do
res =
Cachex.fetch(:user_agents, user_agent, fn ua ->
UAInspector.parse(ua)
end)
case res do
{:ok, user_agent} -> user_agent
{:commit, user_agent} -> user_agent
_ -> nil
end
end
end
  @no_domain_error {:error, %{domain: ["can't be blank"]}}

  defp create_event(conn, params) do
    # Accept both the short keys sent by the tracker script and the long keys
    # accepted by the events API.
    params = %{
      "name" => params["n"] || params["name"],
      "url" => params["u"] || params["url"],
      "referrer" => params["r"] || params["referrer"],
      "domain" => params["d"] || params["domain"],
      "screen_width" => params["w"] || params["screen_width"],
      "hash_mode" => params["h"] || params["hashMode"],
      "meta" => parse_meta(params)
    }

    ua = parse_user_agent(conn)
    blacklist_domain = params["domain"] in Application.get_env(:plausible, :domain_blacklist)
    referrer_spam = is_spammer?(params["referrer"])

    if is_bot?(ua) || blacklist_domain || referrer_spam do
      :ok
    else
      uri = params["url"] && URI.parse(params["url"])
      host = if uri && uri.host == "", do: "(none)", else: uri && uri.host
      query = decode_query_params(uri)
      ref = parse_referrer(uri, params["referrer"])
      location_details = visitor_location_details(conn)
      salts = Plausible.Session.Salts.fetch()

      event_attrs = %{
        timestamp: NaiveDateTime.utc_now() |> NaiveDateTime.truncate(:second),
        name: params["name"],
        hostname: strip_www(host),
        pathname: get_pathname(uri, params["hash_mode"]),
        referrer_source: get_referrer_source(query, ref),
        referrer: clean_referrer(ref),
        utm_medium: query["utm_medium"],
        utm_source: query["utm_source"],
        utm_campaign: query["utm_campaign"],
        utm_content: query["utm_content"],
        utm_term: query["utm_term"],
        country_code: location_details[:country_code],
        country_geoname_id: location_details[:country_geoname_id],
        subdivision1_code: location_details[:subdivision1_code],
        subdivision2_code: location_details[:subdivision2_code],
        city_geoname_id: location_details[:city_geoname_id],
        operating_system: ua && os_name(ua),
        operating_system_version: ua && os_version(ua),
        browser: ua && browser_name(ua),
        browser_version: ua && browser_version(ua),
        screen_size: calculate_screen_size(params["screen_width"]),
        "meta.key": Map.keys(params["meta"]),
        "meta.value": Map.values(params["meta"]) |> Enum.map(&Kernel.to_string/1)
      }

      # "domain" may be a comma-separated list of sites; ingest one event per
      # domain and stop at the first invalid one.
      Enum.reduce_while(get_domains(params, uri), @no_domain_error, fn domain, _res ->
        user_id = generate_user_id(conn, domain, event_attrs[:hostname], salts[:current])

        previous_user_id =
          salts[:previous] &&
            generate_user_id(conn, domain, event_attrs[:hostname], salts[:previous])

        changeset =
          event_attrs
          |> Map.merge(%{domain: domain, user_id: user_id})
          |> Plausible.ClickhouseEvent.new()

        if changeset.valid? do
          event = Ecto.Changeset.apply_changes(changeset)
          session_id = Plausible.Session.Store.on_event(event, previous_user_id)

          event
          |> Map.put(:session_id, session_id)
          |> Plausible.Event.WriteBuffer.insert()

          {:cont, :ok}
        else
          errors = Ecto.Changeset.traverse_errors(changeset, &encode_error/1)
          {:halt, {:error, errors}}
        end
      end)
    end
  end

  # https://hexdocs.pm/ecto/Ecto.Changeset.html#traverse_errors/2-examples
  defp encode_error({msg, opts}) do
    Regex.replace(~r"%{(\w+)}", msg, fn _, key ->
      opts |> Keyword.get(String.to_existing_atom(key), key) |> to_string()
    end)
  end

  defp is_bot?(%UAInspector.Result.Bot{}), do: true

  defp is_bot?(%UAInspector.Result{client: %UAInspector.Result.Client{name: "Headless Chrome"}}),
    do: true

  defp is_bot?(_), do: false

  defp is_spammer?(nil), do: false

  defp is_spammer?(referrer_str) do
    uri = URI.parse(referrer_str)
    ReferrerBlocklist.is_spammer?(strip_www(uri.host))
  end

  defp parse_meta(params) do
    raw_meta = params["m"] || params["meta"] || params["p"] || params["props"]

    with {:ok, parsed_json} <- decode_raw_props(raw_meta),
         :ok <- validate_custom_props(parsed_json) do
      parsed_json
    else
      _ -> %{}
    end
  end

  defp validate_custom_props(props) do
    is_valid =
      Enum.all?(props, fn {_key, val} ->
        !is_list(val) && !is_map(val)
      end)

    if is_valid, do: :ok, else: :invalid_props
  end

  defp decode_raw_props(props) when is_map(props), do: {:ok, props}

  defp decode_raw_props(raw_json) when is_binary(raw_json) do
    case Jason.decode(raw_json) do
      {:ok, parsed_props} when is_map(parsed_props) ->
        {:ok, parsed_props}

      _ ->
        :not_a_map
    end
  end

  defp decode_raw_props(_), do: :bad_format

  defp get_domains(params, uri) do
    if params["domain"] do
      String.split(params["domain"], ",")
      |> Enum.map(&String.trim/1)
      |> Enum.map(&strip_www/1)
    else
      List.wrap(strip_www(uri && uri.host))
    end
  end

  defp get_pathname(nil, _), do: "/"

  defp get_pathname(uri, hash_mode) do
    pathname =
      (uri.path || "/")
      |> URI.decode()

    if hash_mode && uri.fragment do
      pathname <> "#" <> URI.decode(uri.fragment)
    else
      pathname
    end
  end

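  # Collapses fine-grained GeoNames IDs (city districts, boroughs and similar)
  # into the GeoNames ID of their parent city.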
@city_overrides %{
# Austria
# Gemeindebezirk Floridsdorf -> Vienna
2_779_467 => 2_761_369,
# Gemeindebezirk Leopoldstadt -> Vienna
2_772_614 => 2_761_369,
# Gemeindebezirk Landstrasse -> Vienna
2_773_040 => 2_761_369,
# Gemeindebezirk Donaustadt -> Vienna
2_780_851 => 2_761_369,
# Gemeindebezirk Favoriten -> Vienna
2_779_776 => 2_761_369,
# Gemeindebezirk Währing -> Vienna
2_762_091 => 2_761_369,
# Gemeindebezirk Wieden -> Vienna
2_761_393 => 2_761_369,
# Gemeindebezirk Innere Stadt -> Vienna
2_775_259 => 2_761_369,
# Gemeindebezirk Alsergrund -> Vienna
2_782_729 => 2_761_369,
# Gemeindebezirk Liesing -> Vienna
2_772_484 => 2_761_369,
# Urfahr -> Linz
2_762_518 => 2_772_400,
# Canada
# Old Toronto -> Toronto
8_436_019 => 6_167_865,
# Etobicoke -> Toronto
5_950_267 => 6_167_865,
# East York -> Toronto
5_946_235 => 6_167_865,
# Scarborough -> Toronto
6_948_711 => 6_167_865,
# North York -> Toronto
6_091_104 => 6_167_865,
# Czech republic
# Praha 5 -> Prague
11_951_220 => 3_067_696,
# Praha 4 -> Prague
11_951_218 => 3_067_696,
# Praha 11 -> Prague
11_951_232 => 3_067_696,
# Praha 10 -> Prague
11_951_210 => 3_067_696,
# Praha 4 -> Prague
8_378_772 => 3_067_696,
# Denmark
# København SV -> Copenhagen
11_747_123 => 2_618_425,
# København NV -> Copenhagen
11_746_894 => 2_618_425,
# Odense S -> Odense
11_746_825 => 2_615_876,
# Odense M -> Odense
11_746_974 => 2_615_876,
# Odense SØ -> Odense
11_746_888 => 2_615_876,
# Aarhus C -> Aarhus
11_746_746 => 2_624_652,
# Aarhus N -> Aarhus
11_746_890 => 2_624_652,
# Estonia
# Kristiine linnaosa -> Tallinn
11_050_530 => 588_409,
# Kesklinna linnaosa -> Tallinn
11_053_706 => 588_409,
# Lasnamäe linnaosa -> Tallinn
11_050_526 => 588_409,
# Põhja-Tallinna linnaosa -> Tallinn
11_049_594 => 588_409,
# Mustamäe linnaosa -> Tallinn
11_050_531 => 588_409,
# Haabersti linnaosa -> Tallinn
11_053_707 => 588_409,
# Viimsi -> Tallinn
587_629 => 588_409,
# Germany
# Bezirk Tempelhof-Schöneberg -> Berlin
3_336_297 => 2_950_159,
# Bezirk Mitte -> Berlin
2_870_912 => 2_950_159,
# Bezirk Charlottenburg-Wilmersdorf -> Berlin
3_336_294 => 2_950_159,
# Bezirk Friedrichshain-Kreuzberg -> Berlin
3_336_295 => 2_950_159,
# Moosach -> Munich
8_351_447 => 2_867_714,
# Schwabing-Freimann -> Munich
8_351_448 => 2_867_714,
# Stadtbezirk 06 -> Düsseldorf
6_947_276 => 2_934_246,
# Stadtbezirk 04 -> Düsseldorf
6_947_274 => 2_934_246,
# Köln-Ehrenfeld -> Köln
6_947_479 => 2_886_242,
# Köln-Lindenthal- -> Köln
6_947_481 => 2_886_242,
# Beuel -> Bonn
2_949_619 => 2_946_447,
# India
# Navi Mumbai -> Mumbai
6_619_347 => 1_275_339,
# Mexico
# Miguel Hidalgo Villa Olímpica -> Mexico city
11_561_026 => 3_530_597,
# Zedec Santa Fe -> Mexico city
3_517_471 => 3_530_597,
# Fuentes del Pedregal-> Mexico city
11_562_596 => 3_530_597,
# Centro -> Mexico city
9_179_691 => 3_530_597,
# Cuauhtémoc-> Mexico city
12_266_959 => 3_530_597,
# Netherlands
# Schiphol-Rijk -> Amsterdam
10_173_838 => 2_759_794,
# Westpoort -> Amsterdam
11_525_047 => 2_759_794,
# Amsterdam-Zuidoost -> Amsterdam
6_544_881 => 2_759_794,
# Loosduinen -> The Hague
11_525_037 => 2_747_373,
# Laak -> The Hague
11_525_042 => 2_747_373,
# Norway
# Nordre Aker District -> Oslo
6_940_981 => 3_143_244,
# Romania
# Sector 1 -> Bucharest,
11_055_041 => 683_506,
# Sector 2 -> Bucharest
11_055_040 => 683_506,
# Sector 3 -> Bucharest
11_055_044 => 683_506,
# Sector 4 -> Bucharest
11_055_042 => 683_506,
# Sector 5 -> Bucharest
11_055_043 => 683_506,
# Sector 6 -> Bucharest
11_055_039 => 683_506,
# Bucuresti -> Bucharest
6_691_781 => 683_506,
# Slovakia
# Bratislava -> Bratislava
3_343_955 => 3_060_972,
# Sweden
# Södermalm -> Stockholm
2_676_209 => 2_673_730,
# Switzerland
# Vorstädte -> Basel
11_789_440 => 2_661_604,
# Zürich (Kreis 11) / Oerlikon -> Zürich
2_659_310 => 2_657_896,
# Zürich (Kreis 3) / Alt-Wiedikon -> Zürich
2_658_007 => 2_657_896,
# Zürich (Kreis 5) -> Zürich
6_295_521 => 2_657_896,
# Zürich (Kreis 1) / Hochschulen -> Zürich
6_295_489 => 2_657_896,
# UK
# Shadwell -> London
6_690_595 => 2_643_743,
# City of London -> London
2_643_741 => 2_643_743,
# South Bank -> London
6_545_251 => 2_643_743,
# Soho -> London
6_545_173 => 2_643_743,
# Whitechapel -> London
2_634_112 => 2_643_743,
# King's Cross -> London
6_690_589 => 2_643_743,
# Poplar -> London
2_640_091 => 2_643_743,
# Hackney -> London
2_647_694 => 2_643_743
}
@decorate trace("ingest.geolocation")
defp visitor_location_details(conn) do
result =
PlausibleWeb.RemoteIp.get(conn)
|> Geolix.lookup()
country_code =
get_in(result, [:geolocation, :country, :iso_code])
|> ignore_unknown_country
city_geoname_id = get_in(result, [:geolocation, :city, :geoname_id])
subdivision1_code =
case result do
%{geolocation: %{subdivisions: [%{iso_code: iso_code} | _rest]}} ->
country_code <> "-" <> iso_code
_ ->
""
end
subdivision2_code =
case result do
%{geolocation: %{subdivisions: [_first, %{iso_code: iso_code} | _rest]}} ->
country_code <> "-" <> iso_code
_ ->
""
end
%{
country_code: country_code,
subdivision1_code: subdivision1_code,
subdivision2_code: subdivision2_code,
city_geoname_id: Map.get(@city_overrides, city_geoname_id, city_geoname_id)
}
end
defp ignore_unknown_country("ZZ"), do: nil
defp ignore_unknown_country(country), do: country
@decorate trace("ingest.parse_referrer")
defp parse_referrer(_, nil), do: nil
defp parse_referrer(uri, referrer_str) do
referrer_uri = URI.parse(referrer_str)
if strip_www(referrer_uri.host) !== strip_www(uri.host) && referrer_uri.host !== "localhost" do
RefInspector.parse(referrer_str)
end
end
defp generate_user_id(conn, domain, hostname, salt) do
user_agent = List.first(Plug.Conn.get_req_header(conn, "user-agent")) || ""
ip_address = PlausibleWeb.RemoteIp.get(conn)
root_domain = get_root_domain(hostname)
if domain && root_domain do
SipHash.hash!(salt, user_agent <> ip_address <> domain <> root_domain)
end
end
defp get_root_domain(nil), do: "(none)"
defp get_root_domain(hostname) do
case PublicSuffix.registrable_domain(hostname) do
domain when is_binary(domain) -> domain
_ -> hostname
end
end
  defp calculate_screen_size(nil), do: nil
  defp calculate_screen_size(width) when width < 576, do: "Mobile"
  defp calculate_screen_size(width) when width < 992, do: "Tablet"
  defp calculate_screen_size(width) when width < 1440, do: "Laptop"
  defp calculate_screen_size(width) when width >= 1440, do: "Desktop"

  defp clean_referrer(nil), do: nil

  defp clean_referrer(ref) do
    uri = URI.parse(ref.referer)

    if PlausibleWeb.RefInspector.right_uri?(uri) do
      host = String.replace_prefix(uri.host, "www.", "")
      path = uri.path || ""
      host <> String.trim_trailing(path, "/")
    end
  end

  defp parse_body(conn) do
    case conn.body_params do
      %Plug.Conn.Unfetched{} ->
        {:ok, body, _conn} = Plug.Conn.read_body(conn)

        case Jason.decode(body) do
          {:ok, params} -> {:ok, params}
          _ -> {:error, :invalid_json}
        end

      params ->
        {:ok, params}
    end
  end

  defp strip_www(nil), do: nil

  defp strip_www(hostname) do
    String.replace_prefix(hostname, "www.", "")
  end

  defp browser_name(ua) do
    case ua.client do
      :unknown -> ""
      %UAInspector.Result.Client{name: "Mobile Safari"} -> "Safari"
      %UAInspector.Result.Client{name: "Chrome Mobile"} -> "Chrome"
      %UAInspector.Result.Client{name: "Chrome Mobile iOS"} -> "Chrome"
      %UAInspector.Result.Client{name: "Firefox Mobile"} -> "Firefox"
      %UAInspector.Result.Client{name: "Firefox Mobile iOS"} -> "Firefox"
      %UAInspector.Result.Client{name: "Opera Mobile"} -> "Opera"
      %UAInspector.Result.Client{name: "Opera Mini"} -> "Opera"
      %UAInspector.Result.Client{name: "Opera Mini iOS"} -> "Opera"
      %UAInspector.Result.Client{name: "Yandex Browser Lite"} -> "Yandex Browser"
      %UAInspector.Result.Client{name: "Chrome Webview"} -> "Mobile App"
      %UAInspector.Result.Client{type: "mobile app"} -> "Mobile App"
      client -> client.name
    end
  end

  defp major_minor(:unknown), do: ""

  defp major_minor(version) do
    version
    |> String.split(".")
    |> Enum.take(2)
    |> Enum.join(".")
  end

  defp browser_version(ua) do
    case ua.client do
      :unknown -> ""
      %UAInspector.Result.Client{type: "mobile app"} -> ""
      client -> major_minor(client.version)
    end
  end

  defp os_name(ua) do
    case ua.os do
      :unknown -> ""
      os -> os.name
    end
  end

  defp os_version(ua) do
    case ua.os do
      :unknown -> ""
      os -> major_minor(os.version)
    end
  end

  defp get_referrer_source(query, ref) do
    source = query["utm_source"] || query["source"] || query["ref"]
    source || PlausibleWeb.RefInspector.parse(ref)
  end

  defp decode_query_params(nil), do: nil
  defp decode_query_params(%URI{query: nil}), do: nil

  defp decode_query_params(%URI{query: query_part}) do
    try do
      URI.decode_query(query_part)
    rescue
      _ -> nil
    end
  end
end