# analytics/test/plausible_web/controllers/site_controller_test.exs

defmodule PlausibleWeb.SiteControllerTest do
use PlausibleWeb.ConnCase, async: false
use Plausible.Repo
use Bamboo.Test
use Oban.Testing, repo: Plausible.Repo
import ExUnit.CaptureLog
import Mox
setup :verify_on_exit!
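  # Controller tests for site creation, listing, settings, visibility toggles,
  # Google integration and deletion. The :create_user, :log_in and :create_site
  # setup helpers are assumed to come from the shared test case helpers
  # (PlausibleWeb.ConnCase) rather than being defined in this file.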
describe "GET /sites/new" do
setup [:create_user, :log_in]
test "shows the site form", %{conn: conn} do
conn = get(conn, "/sites/new")
assert html_response(conn, 200) =~ "Your website details"
end
test "shows onboarding steps if it's the first site for the user", %{conn: conn} do
conn = get(conn, "/sites/new")
assert html_response(conn, 200) =~ "Add site info"
end
test "does not show onboarding steps if user has a site already", %{conn: conn, user: user} do
insert(:site, members: [user], domain: "test-site.com")
conn = get(conn, "/sites/new")
refute html_response(conn, 200) =~ "Add site info"
end
end
describe "GET /sites" do
setup [:create_user, :log_in]
test "shows empty screen if no sites", %{conn: conn} do
conn = get(conn, "/sites")
assert html_response(conn, 200) =~ "You don't have any sites yet"
end
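    # Note: the visitor-count assertion below presumably relies on pageview
    # stats seeded for "test-site.com" by shared test fixtures; this test only
    # creates the site record itself.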
test "lists all of your sites with last 24h visitors", %{conn: conn, user: user} do
insert(:site, members: [user], domain: "test-site.com")
conn = get(conn, "/sites")
assert html_response(conn, 200) =~ "test-site.com"
assert html_response(conn, 200) =~ "<b>3</b> visitors in last 24h"
end
test "shows invitations for user by email address", %{conn: conn, user: user} do
site = insert(:site)
insert(:invitation, email: user.email, site_id: site.id, inviter: build(:user))
conn = get(conn, "/sites")
assert html_response(conn, 200) =~ site.domain
end
test "invitations are case insensitive", %{conn: conn, user: user} do
site = insert(:site)
insert(:invitation,
email: String.upcase(user.email),
site_id: site.id,
inviter: build(:user)
)
conn = get(conn, "/sites")
assert html_response(conn, 200) =~ site.domain
end
test "paginates sites", %{conn: conn, user: user} do
insert(:site, members: [user], domain: "test-site1.com")
insert(:site, members: [user], domain: "test-site2.com")
insert(:site, members: [user], domain: "test-site3.com")
insert(:site, members: [user], domain: "test-site4.com")
conn = get(conn, "/sites?per_page=2")
assert html_response(conn, 200) =~ "test-site1.com"
assert html_response(conn, 200) =~ "test-site2.com"
refute html_response(conn, 200) =~ "test-site3.com"
refute html_response(conn, 200) =~ "test-site4.com"
conn = get(conn, "/sites?per_page=2&page=2")
refute html_response(conn, 200) =~ "test-site1.com"
refute html_response(conn, 200) =~ "test-site2.com"
assert html_response(conn, 200) =~ "test-site3.com"
assert html_response(conn, 200) =~ "test-site4.com"
end
end
describe "POST /sites" do
setup [:create_user, :log_in]
test "creates the site with valid params", %{conn: conn} do
conn =
post(conn, "/sites", %{
"site" => %{
"domain" => "example.com",
"timezone" => "Europe/London"
}
})
assert redirected_to(conn) == "/example.com/snippet"
assert Repo.exists?(Plausible.Site, domain: "example.com")
end
test "starts trial if user does not have trial yet", %{conn: conn, user: user} do
Plausible.Auth.User.remove_trial_expiry(user) |> Repo.update!()
post(conn, "/sites", %{
"site" => %{
"domain" => "example.com",
"timezone" => "Europe/London"
}
})
assert Repo.reload!(user).trial_expiry_date
end
test "sends welcome email if this is the user's first site", %{conn: conn} do
post(conn, "/sites", %{
"site" => %{
"domain" => "example.com",
"timezone" => "Europe/London"
}
})
assert_email_delivered_with(subject: "Welcome to Plausible")
end
test "does not send welcome email if user already has a previous site", %{
conn: conn,
user: user
} do
insert(:site, members: [user])
post(conn, "/sites", %{
"site" => %{
"domain" => "example.com",
"timezone" => "Europe/London"
}
})
assert_no_emails_delivered()
end
test "does not allow site creation when the user is at their site limit", %{
conn: conn,
user: user
} do
# default site limit defined in config/.test.env
insert(:site, members: [user])
insert(:site, members: [user])
insert(:site, members: [user])
conn =
post(conn, "/sites", %{
"site" => %{
"domain" => "example.com",
"timezone" => "Europe/London"
}
})
assert conn.status == 400
end
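    # Accounts created before the site limit was introduced (2021-05-05) are
    # grandfathered in: backdating the user lets a fifth site be created.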
test "allows accounts registered before 2021-05-05 to go over the limit", %{
conn: conn,
user: user
} do
Repo.update_all(from(u in "users", where: u.id == ^user.id),
set: [inserted_at: ~N[2020-01-01 00:00:00]]
)
insert(:site, members: [user])
insert(:site, members: [user])
insert(:site, members: [user])
insert(:site, members: [user])
conn =
post(conn, "/sites", %{
"site" => %{
"domain" => "example.com",
"timezone" => "Europe/London"
}
})
assert redirected_to(conn) == "/example.com/snippet"
assert Repo.exists?(Plausible.Site, domain: "example.com")
end
test "allows enterprise accounts to create unlimited sites", %{
conn: conn,
user: user
} do
ep = insert(:enterprise_plan, user: user)
insert(:subscription, user: user, paddle_plan_id: ep.paddle_plan_id)
insert(:site, members: [user])
insert(:site, members: [user])
insert(:site, members: [user])
conn =
post(conn, "/sites", %{
"site" => %{
"domain" => "example.com",
"timezone" => "Europe/London"
}
})
assert redirected_to(conn) == "/example.com/snippet"
assert Repo.exists?(Plausible.Site, domain: "example.com")
end
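    # The domain input below is normalised before persisting: the scheme,
    # "www." prefix, trailing slash and upper-case characters are stripped,
    # which is what the redirect and lookup assertions verify.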
test "cleans up the url", %{conn: conn} do
conn =
post(conn, "/sites", %{
"site" => %{
"domain" => "https://www.Example.com/",
"timezone" => "Europe/London"
}
})
assert redirected_to(conn) == "/example.com/snippet"
assert Repo.exists?(Plausible.Site, domain: "example.com")
end
test "renders form again when domain is missing", %{conn: conn} do
conn =
post(conn, "/sites", %{
"site" => %{
"timezone" => "Europe/London"
}
})
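      # Assertions run against the rendered HTML, where apostrophes are escaped
      # as &#39; — here and in the duplicate-domain test below.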
assert html_response(conn, 200) =~ "can&#39;t be blank"
end
test "only alphanumeric characters and slash allowed in domain", %{conn: conn} do
conn =
post(conn, "/sites", %{
"site" => %{
"timezone" => "Europe/London",
"domain" => "!@£.com"
}
})
assert html_response(conn, 200) =~ "only letters, numbers, slashes and period allowed"
end
test "renders form again when it is a duplicate domain", %{conn: conn} do
insert(:site, domain: "example.com")
conn =
post(conn, "/sites", %{
"site" => %{
"domain" => "example.com",
"timezone" => "Europe/London"
}
})
assert html_response(conn, 200) =~
"This domain has already been taken. Perhaps one of your team members registered it? If that&#39;s not the case, please contact support@plausible.io"
end
end
describe "GET /:website/snippet" do
setup [:create_user, :log_in, :create_site]
test "shows snippet", %{conn: conn, site: site} do
conn = get(conn, "/#{site.domain}/snippet")
assert html_response(conn, 200) =~ "Add javascript snippet"
end
end
describe "GET /:website/settings/general" do
setup [:create_user, :log_in, :create_site]
test "shows settings form", %{conn: conn, site: site} do
conn = get(conn, "/#{site.domain}/settings/general")
assert html_response(conn, 200) =~ "General information"
end
end
describe "GET /:website/settings/goals" do
setup [:create_user, :log_in, :create_site]
test "lists goals for the site", %{conn: conn, site: site} do
insert(:goal, domain: site.domain, event_name: "Custom event")
insert(:goal, domain: site.domain, page_path: "/register")
conn = get(conn, "/#{site.domain}/settings/goals")
assert html_response(conn, 200) =~ "Custom event"
assert html_response(conn, 200) =~ "Visit /register"
end
end
describe "PUT /:website/settings" do
setup [:create_user, :log_in, :create_site]
test "updates the timezone", %{conn: conn, site: site} do
conn =
put(conn, "/#{site.domain}/settings", %{
"site" => %{
"timezone" => "Europe/London"
}
})
updated = Repo.get(Plausible.Site, site.id)
assert updated.timezone == "Europe/London"
assert redirected_to(conn, 302) == "/#{site.domain}/settings/general"
end
end
describe "POST /sites/:website/make-public" do
setup [:create_user, :log_in, :create_site]
test "makes the site public", %{conn: conn, site: site} do
conn = post(conn, "/sites/#{site.domain}/make-public")
updated = Repo.get(Plausible.Site, site.id)
assert updated.public
assert redirected_to(conn, 302) == "/#{site.domain}/settings/visibility"
end
end
describe "POST /sites/:website/make-private" do
setup [:create_user, :log_in, :create_site]
test "makes the site private", %{conn: conn, site: site} do
conn = post(conn, "/sites/#{site.domain}/make-private")
updated = Repo.get(Plausible.Site, site.id)
refute updated.public
assert redirected_to(conn, 302) == "/#{site.domain}/settings/visibility"
end
end
describe "DELETE /:website" do
setup [:create_user, :log_in, :create_site]
test "deletes the site", %{conn: conn, user: user} do
site = insert(:site, members: [user])
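      # Associated records are created up front so deletion is exercised with
      # dependent data present; their cleanup is presumably handled by cascades
      # or the controller, since only the site's removal is asserted.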
insert(:google_auth, user: user, site: site)
insert(:custom_domain, site: site)
insert(:spike_notification, site: site)
delete(conn, "/#{site.domain}")
refute Repo.exists?(from s in Plausible.Site, where: s.id == ^site.id)
end
end
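  # The Search Console integration stores the selected property on a per-site
  # GoogleAuth record; the tests below update that record and delete it.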
describe "PUT /:website/settings/google" do
setup [:create_user, :log_in, :create_site]
test "updates google auth property", %{conn: conn, user: user, site: site} do
insert(:google_auth, user: user, site: site)
conn =
put(conn, "/#{site.domain}/settings/google", %{
"google_auth" => %{"property" => "some-new-property.com"}
})
updated_auth = Repo.one(Plausible.Site.GoogleAuth)
assert updated_auth.property == "some-new-property.com"
assert redirected_to(conn, 302) == "/#{site.domain}/settings/search-console"
end
end
describe "DELETE /:website/settings/google" do
setup [:create_user, :log_in, :create_site]
test "deletes associated google auth", %{conn: conn, user: user, site: site} do
insert(:google_auth, user: user, site: site)
conn = delete(conn, "/#{site.domain}/settings/google-search")
refute Repo.exists?(Plausible.Site.GoogleAuth)
assert redirected_to(conn, 302) == "/#{site.domain}/settings/search-console"
end
end
describe "GET /:website/settings/search-console for self-hosting" do
setup [:create_user, :log_in, :create_site]
test "display search console settings", %{conn: conn, site: site} do
conn = get(conn, "/#{site.domain}/settings/search-console")
resp = html_response(conn, 200)
assert resp =~ "An extra step is needed"
assert resp =~ "Google Search Console integration"
assert resp =~ "self-hosting-configuration"
end
end
describe "GET /:website/settings/search-console" do
setup [:create_user, :log_in, :create_site]
setup_patch_env(:google, client_id: "some", api_url: "https://www.googleapis.com")
setup %{site: site, user: user} = context do
insert(:google_auth, user: user, site: site, property: "sc-domain:#{site.domain}")
context
end
test "displays appropriate error in case of google account `google_auth_error`", %{
conn: conn,
site: site
} do
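# Stub the Search Console sites listing so Google responds with a 401/403 auth error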
expect(
Plausible.HTTPClient.Mock,
:get,
fn
"https://www.googleapis.com/webmasters/v3/sites",
[{"Content-Type", "application/json"}, {"Authorization", "Bearer 123"}] ->
{:error, %{reason: %Finch.Response{status: Enum.random([401, 403])}}}
end
)
conn = get(conn, "/#{site.domain}/settings/search-console")
resp = html_response(conn, 200)
assert resp =~ "Your Search Console account hasn't been connected successfully"
assert resp =~ "Please unlink your Google account and try linking it again"
end
test "displays docs link error in case of `invalid_grant`", %{
conn: conn,
site: site
} do
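# Stub the Search Console sites listing so Google responds with a 400 invalid_grant error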
expect(
Plausible.HTTPClient.Mock,
:get,
fn
"https://www.googleapis.com/webmasters/v3/sites",
[{"Content-Type", "application/json"}, {"Authorization", "Bearer 123"}] ->
{:error, %{reason: %Finch.Response{status: 400, body: %{"error" => "invalid_grant"}}}}
end
)
conn = get(conn, "/#{site.domain}/settings/search-console")
resp = html_response(conn, 200)
assert resp =~
"https://plausible.io/docs/google-search-console-integration#i-get-the-invalid-grant-error"
end
test "displays generic error in case of random error code returned by google", %{
conn: conn,
site: site
} do
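# Stub the Search Console sites listing so Google responds with an unexpected 503 error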
expect(
Plausible.HTTPClient.Mock,
:get,
fn
"https://www.googleapis.com/webmasters/v3/sites",
[{"Content-Type", "application/json"}, {"Authorization", "Bearer 123"}] ->
{:error, %{reason: %Finch.Response{status: 503, body: %{"error" => "some_error"}}}}
end
)
conn = get(conn, "/#{site.domain}/settings/search-console")
resp = html_response(conn, 200)
assert resp =~ "Something went wrong, but looks temporary"
assert resp =~ "try re-linking your Google account"
end
test "displays generic error and logs a message, in case of random HTTP failure calling google",
%{
conn: conn,
site: site
} do
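# Stub the Search Console sites listing so the HTTP call fails at the transport level (:nxdomain)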
expect(
Plausible.HTTPClient.Mock,
:get,
fn
"https://www.googleapis.com/webmasters/v3/sites",
[{"Content-Type", "application/json"}, {"Authorization", "Bearer 123"}] ->
{:error, :nxdomain}
end
)
log =
capture_log(fn ->
conn = get(conn, "/#{site.domain}/settings/search-console")
resp = html_response(conn, 200)
assert resp =~ "Something went wrong, but looks temporary"
assert resp =~ "try re-linking your Google account"
end)
assert log =~ "Google Analytics: failed to list sites: :nxdomain"
end
end
describe "GET /:website/goals/new" do
setup [:create_user, :log_in, :create_site]
test "shows form to create a new goal", %{conn: conn, site: site} do
conn = get(conn, "/#{site.domain}/goals/new")
assert html_response(conn, 200) =~ "Add goal"
end
end
describe "POST /:website/goals" do
setup [:create_user, :log_in, :create_site]
test "creates a pageview goal for the website", %{conn: conn, site: site} do
conn =
post(conn, "/#{site.domain}/goals", %{
goal: %{
page_path: "/success",
event_name: ""
}
})
goal = Repo.one(Plausible.Goal)
assert goal.page_path == "/success"
assert goal.event_name == nil
assert redirected_to(conn, 302) == "/#{site.domain}/settings/goals"
end
test "creates a custom event goal for the website", %{conn: conn, site: site} do
conn =
post(conn, "/#{site.domain}/goals", %{
goal: %{
page_path: "",
event_name: "Signup"
}
})
goal = Repo.one(Plausible.Goal)
assert goal.event_name == "Signup"
assert goal.page_path == nil
assert redirected_to(conn, 302) == "/#{site.domain}/settings/goals"
end
end
describe "DELETE /:website/goals/:id" do
setup [:create_user, :log_in, :create_site]
test "lists goals for the site", %{conn: conn, site: site} do
goal = insert(:goal, domain: site.domain, event_name: "Custom event")
conn = delete(conn, "/#{site.domain}/goals/#{goal.id}")
assert Repo.aggregate(Plausible.Goal, :count, :id) == 0
assert redirected_to(conn, 302) == "/#{site.domain}/settings/goals"
end
end
describe "POST /sites/:website/weekly-report/enable" do
setup [:create_user, :log_in, :create_site]
test "creates a weekly report record with the user email", %{
conn: conn,
site: site,
user: user
} do
post(conn, "/sites/#{site.domain}/weekly-report/enable")
report = Repo.get_by(Plausible.Site.WeeklyReport, site_id: site.id)
assert report.recipients == [user.email]
end
end
describe "POST /sites/:website/weekly-report/disable" do
setup [:create_user, :log_in, :create_site]
test "deletes the weekly report record", %{conn: conn, site: site} do
insert(:weekly_report, site: site)
post(conn, "/sites/#{site.domain}/weekly-report/disable")
refute Repo.get_by(Plausible.Site.WeeklyReport, site_id: site.id)
end
end
describe "POST /sites/:website/weekly-report/recipients" do
setup [:create_user, :log_in, :create_site]
test "adds a recipient to the weekly report", %{conn: conn, site: site} do
insert(:weekly_report, site: site)
post(conn, "/sites/#{site.domain}/weekly-report/recipients", recipient: "user@email.com")
report = Repo.get_by(Plausible.Site.WeeklyReport, site_id: site.id)
assert report.recipients == ["user@email.com"]
end
end
describe "DELETE /sites/:website/weekly-report/recipients/:recipient" do
setup [:create_user, :log_in, :create_site]
test "removes a recipient from the weekly report", %{conn: conn, site: site} do
insert(:weekly_report, site: site, recipients: ["recipient@email.com"])
delete(conn, "/sites/#{site.domain}/weekly-report/recipients/recipient@email.com")
report = Repo.get_by(Plausible.Site.WeeklyReport, site_id: site.id)
assert report.recipients == []
end
end
describe "POST /sites/:website/monthly-report/enable" do
setup [:create_user, :log_in, :create_site]
test "creates a monthly report record with the user email", %{
conn: conn,
site: site,
user: user
} do
post(conn, "/sites/#{site.domain}/monthly-report/enable")
report = Repo.get_by(Plausible.Site.MonthlyReport, site_id: site.id)
assert report.recipients == [user.email]
end
end
describe "POST /sites/:website/monthly-report/disable" do
setup [:create_user, :log_in, :create_site]
test "deletes the monthly report record", %{conn: conn, site: site} do
insert(:monthly_report, site: site)
post(conn, "/sites/#{site.domain}/monthly-report/disable")
refute Repo.get_by(Plausible.Site.MonthlyReport, site_id: site.id)
end
end
describe "POST /sites/:website/monthly-report/recipients" do
setup [:create_user, :log_in, :create_site]
test "adds a recipient to the monthly report", %{conn: conn, site: site} do
insert(:monthly_report, site: site)
post(conn, "/sites/#{site.domain}/monthly-report/recipients", recipient: "user@email.com")
report = Repo.get_by(Plausible.Site.MonthlyReport, site_id: site.id)
assert report.recipients == ["user@email.com"]
end
end
describe "DELETE /sites/:website/monthly-report/recipients/:recipient" do
setup [:create_user, :log_in, :create_site]
test "removes a recipient from the monthly report", %{conn: conn, site: site} do
insert(:monthly_report, site: site, recipients: ["recipient@email.com"])
delete(conn, "/sites/#{site.domain}/monthly-report/recipients/recipient@email.com")
report = Repo.get_by(Plausible.Site.MonthlyReport, site_id: site.id)
assert report.recipients == []
end
end
describe "POST /sites/:website/spike-notification/enable" do
setup [:create_user, :log_in, :create_site]
test "creates a spike notification record with the user email", %{
conn: conn,
site: site,
user: user
} do
post(conn, "/sites/#{site.domain}/spike-notification/enable")
notification = Repo.get_by(Plausible.Site.SpikeNotification, site_id: site.id)
assert notification.recipients == [user.email]
end
test "does not allow duplicate spike notification to be created", %{
conn: conn,
site: site
} do
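# Enabling the notification twice should still leave a single record for the site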
post(conn, "/sites/#{site.domain}/spike-notification/enable")
post(conn, "/sites/#{site.domain}/spike-notification/enable")
assert Repo.aggregate(
from(s in Plausible.Site.SpikeNotification, where: s.site_id == ^site.id),
:count
) == 1
end
end
describe "POST /sites/:website/spike-notification/disable" do
setup [:create_user, :log_in, :create_site]
test "deletes the spike notification record", %{conn: conn, site: site} do
insert(:spike_notification, site: site)
post(conn, "/sites/#{site.domain}/spike-notification/disable")
refute Repo.get_by(Plausible.Site.SpikeNotification, site_id: site.id)
end
end
describe "PUT /sites/:website/spike-notification" do
setup [:create_user, :log_in, :create_site]
test "updates spike notification threshold", %{conn: conn, site: site} do
insert(:spike_notification, site: site, threshold: 10)
put(conn, "/sites/#{site.domain}/spike-notification", %{
"spike_notification" => %{"threshold" => "15"}
})
notification = Repo.get_by(Plausible.Site.SpikeNotification, site_id: site.id)
assert notification.threshold == 15
end
end
describe "POST /sites/:website/spike-notification/recipients" do
setup [:create_user, :log_in, :create_site]
test "adds a recipient to the spike notification", %{conn: conn, site: site} do
insert(:spike_notification, site: site)
post(conn, "/sites/#{site.domain}/spike-notification/recipients",
recipient: "user@email.com"
)
report = Repo.get_by(Plausible.Site.SpikeNotification, site_id: site.id)
assert report.recipients == ["user@email.com"]
end
end
describe "DELETE /sites/:website/spike-notification/recipients/:recipient" do
setup [:create_user, :log_in, :create_site]
test "removes a recipient from the spike notification", %{conn: conn, site: site} do
insert(:spike_notification, site: site, recipients: ["recipient@email.com"])
delete(conn, "/sites/#{site.domain}/spike-notification/recipients/recipient@email.com")
report = Repo.get_by(Plausible.Site.SpikeNotification, site_id: site.id)
assert report.recipients == []
end
end
describe "GET /sites/:website/shared-links/new" do
setup [:create_user, :log_in, :create_site]
test "shows form for new shared link", %{conn: conn, site: site} do
conn = get(conn, "/sites/#{site.domain}/shared-links/new")
assert html_response(conn, 200) =~ "New shared link"
end
end
describe "POST /sites/:website/shared-links" do
setup [:create_user, :log_in, :create_site]
test "creates shared link without password", %{conn: conn, site: site} do
post(conn, "/sites/#{site.domain}/shared-links", %{
"shared_link" => %{"name" => "Link name"}
})
link = Repo.one(Plausible.Site.SharedLink)
refute is_nil(link.slug)
assert is_nil(link.password_hash)
assert link.name == "Link name"
end
test "creates shared link with password", %{conn: conn, site: site} do
post(conn, "/sites/#{site.domain}/shared-links", %{
"shared_link" => %{"password" => "password", "name" => "New name"}
})
link = Repo.one(Plausible.Site.SharedLink)
refute is_nil(link.slug)
refute is_nil(link.password_hash)
assert link.name == "New name"
end
end
describe "GET /sites/:website/shared-links/edit" do
setup [:create_user, :log_in, :create_site]
test "shows form to edit shared link", %{conn: conn, site: site} do
link = insert(:shared_link, site: site)
conn = get(conn, "/sites/#{site.domain}/shared-links/#{link.slug}/edit")
assert html_response(conn, 200) =~ "Edit shared link"
end
end
describe "PUT /sites/:website/shared-links/:slug" do
setup [:create_user, :log_in, :create_site]
test "can update link name", %{conn: conn, site: site} do
link = insert(:shared_link, site: site)
put(conn, "/sites/#{site.domain}/shared-links/#{link.slug}", %{
"shared_link" => %{"name" => "Updated link name"}
})
link = Repo.one(Plausible.Site.SharedLink)
assert link.name == "Updated link name"
end
end
describe "DELETE /sites/:website/shared-links/:slug" do
setup [:create_user, :log_in, :create_site]
test "shows form for new shared link", %{conn: conn, site: site} do
link = insert(:shared_link, site: site)
conn = delete(conn, "/sites/#{site.domain}/shared-links/#{link.slug}")
refute Repo.one(Plausible.Site.SharedLink)
assert redirected_to(conn, 302) =~ "/#{site.domain}/settings"
end
end
describe "DELETE sites/:website/custom-domains/:id" do
setup [:create_user, :log_in, :create_site]
test "lists goals for the site", %{conn: conn, site: site} do
domain = insert(:custom_domain, site: site)
delete(conn, "/sites/#{site.domain}/custom-domains/#{domain.id}")
assert Repo.aggregate(Plausible.Site.CustomDomain, :count, :id) == 0
end
end
describe "GET /:website/import/google-analytics/view-id" do
setup [:create_user, :log_in, :create_new_site]
test "lists Google Analytics views", %{conn: conn, site: site} do
bypass = Bypass.open()
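# Serve the GA views fixture from a local Bypass server standing in for the Google API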
Bypass.expect_once(
bypass,
"GET",
"/analytics/v3/management/accounts/~all/webproperties/~all/profiles",
fn conn ->
response_body = File.read!("fixture/ga_list_views.json")
conn
|> Plug.Conn.put_resp_header("content-type", "application/json")
|> Plug.Conn.resp(200, response_body)
end
)
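# Point the configured Google API URL at the Bypass port so the controller hits the local stub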
:plausible
|> Application.get_env(:google)
|> Keyword.put(:api_url, "http://0.0.0.0:#{bypass.port}")
|> then(&Application.put_env(:plausible, :google, &1))
response =
conn
|> get("/#{site.domain}/import/google-analytics/view-id", %{
"access_token" => "token",
"refresh_token" => "foo",
"expires_at" => "2022-09-22T20:01:37.112777"
})
|> html_response(200)
assert response =~ "57238190 - one.test"
assert response =~ "54460083 - two.test"
end
end
describe "POST /:website/settings/google-import" do
setup [:create_user, :log_in, :create_new_site]
test "adds in-progress imported tag to site", %{conn: conn, site: site} do
post(conn, "/#{site.domain}/settings/google-import", %{
"view_id" => "123",
"start_date" => "2018-03-01",
"end_date" => "2022-03-01",
"access_token" => "token",
"refresh_token" => "foo",
"expires_at" => "2022-09-22T20:01:37.112777"
})
imported_data = Repo.reload(site).imported_data
assert imported_data
assert imported_data.source == "Google Analytics"
assert imported_data.end_date == ~D[2022-03-01]
assert imported_data.status == "importing"
    end

    test "schedules an import job in Oban", %{conn: conn, site: site} do
post(conn, "/#{site.domain}/settings/google-import", %{
"view_id" => "123",
"start_date" => "2018-03-01",
"end_date" => "2022-03-01",
"access_token" => "token",
"refresh_token" => "foo",
"expires_at" => "2022-09-22T20:01:37.112777"
})
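      # The import should be queued as a background job carrying the request parameters,
      # with "expires_at" passed along as "token_expires_at"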
assert_enqueued(
worker: Plausible.Workers.ImportGoogleAnalytics,
args: %{
"site_id" => site.id,
"view_id" => "123",
"start_date" => "2018-03-01",
"end_date" => "2022-03-01",
"access_token" => "token",
"refresh_token" => "foo",
"token_expires_at" => "2022-09-22T20:01:37.112777"
}
)
end
  end

  describe "DELETE /:website/settings/forget-imported" do
    setup [:create_user, :log_in, :create_new_site]

    test "removes imported_data field from site", %{conn: conn, site: site} do
delete(conn, "/#{site.domain}/settings/forget-imported")
assert Repo.reload(site).imported_data == nil
end
test "removes actual imported data from Clickhouse", %{conn: conn, site: site} do
Plausible.Site.start_import(site, ~D[2022-01-01], Timex.today(), "Google Analytics")
|> Repo.update!()
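      # Seed imported stats so there is data in ClickHouse to forget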
populate_stats(site, [
build(:imported_visitors, pageviews: 10)
])
delete(conn, "/#{site.domain}/settings/forget-imported")
assert Plausible.Stats.Clickhouse.imported_pageview_count(site) == 0
    end

    test "cancels Oban job if it exists", %{conn: conn, site: site} do
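      # Simulate an import job that is still pending when the imported data is forgotten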
{:ok, job} =
Plausible.Workers.ImportGoogleAnalytics.new(%{
"site_id" => site.id,
"view_id" => "123",
"start_date" => "2022-01-01",
"end_date" => "2023-01-01",
"access_token" => "token"
})
|> Oban.insert()
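      # Mark the site as having an import in progress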
Plausible.Site.start_import(site, ~D[2022-01-01], Timex.today(), "Google Analytics")
|> Repo.update!()
populate_stats(site, [
build(:imported_visitors, pageviews: 10)
])
delete(conn, "/#{site.domain}/settings/forget-imported")
assert Repo.reload(job).state == "cancelled"
end
end
end