* Add `utm_content` and `utm_term`.
Support `utm_content` and `utm_term` as requested in #515. An example tagged link is shown below.
* Add dropdown for UTM options
* Remove utm_content and term from filter modal for now
Co-authored-by: Blender Defender <defenderblender@gmail.com>
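For reference, a campaign URL tagged with the two new parameters could look like this (domain and values are illustrative, not taken from the commits):

```
https://example.com/pricing?utm_source=newsletter&utm_medium=email&utm_content=footer-link&utm_term=web+analytics
```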
* adding api route PUT /api/v1/sites/goals with form fields "goal_type" and "goal_value", where the supported types are "event" and "page"
* arm64 docker images
* adding api route DELETE /api/v1/sites/goals/:goal_id with form param "site_id"
* revert Makefile + package.json
* return statement hotfix for the case where the site could not be found
* adding api route PUT /api/v1/sites/goals/:goal_id with form params "site_id", "goal_type", and "goal_value"
* update the goal api routes to accept event_name or page_path instead of goal_value (see the request sketch below)
* cleaning goals model
* mix format
Co-authored-by: Ahmed Abbas <a.abbas@ixdc.net>
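A minimal TypeScript sketch of calling the goal routes above; the host, the Bearer authorization header, and the example values are assumptions for illustration, not part of the commits:

```typescript
// Hypothetical sketch: create a custom-event goal via PUT /api/v1/sites/goals.
// Host, auth header, and values are assumptions; the form fields mirror the commits above.
const API_KEY = "<your-api-key>"; // assumed Bearer token auth

async function createGoal(): Promise<void> {
  const body = new URLSearchParams({
    site_id: "example.com",   // form param also used by the other goal routes
    goal_type: "event",       // supported types: "event" and "page"
    event_name: "Signup",     // use page_path instead when goal_type is "page"
  });

  const res = await fetch("https://plausible.example/api/v1/sites/goals", {
    method: "PUT",
    headers: { Authorization: `Bearer ${API_KEY}` },
    body, // sent as application/x-www-form-urlencoded
  });
  console.log(await res.json());
}

// Deleting works analogously via DELETE /api/v1/sites/goals/:goal_id
// with a "site_id" form param.
createGoal().catch(console.error);
```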
* Merge branch 'plausible_master'
* Add City level details
* Use ISO codes instead of geoname_id for subdivisions
* Add easier way to configure geolocation database
* Add workflow for dev branch
* Correct clickhouse migration
* Translate subdivision names
* Translate city names
* WIP
* Region and country filters
* Fix region filter
* Remove region_name when removing region filter
* Add modals for regions and cities
* Remove dead code
* WIP
* Revert "WIP"
This reverts commit 3202bf2fe9.
* Feature flag to hide cities when deployed
* Add changelog entry
* Remove unused code
* Remove unused variables
* Fix test
Co-authored-by: AymanTerra <aymanterra@yahoo.com>
* Only add percentages to dashboard data when not filtering by goal
* Correctly name CSV headers when exporting conversion data
* Remove percentages from tests when filtering for goal
* Fix custom property total conversions value not displayed
The custom property conversion metrics are not consistent with the other
metrics, resulting in the total conversions not being displayed in the
dashboard. This fixes that.
* Export custom props of current goal when filtering dashboard for goal
This makes the CSV export also output a `prop_breakdown.csv` file which,
for the currently filtered goal, contains the conversion data for each
of its configured properties.
* Add test for goal-filtered CSV export
* Round time_on_page metric returned by 'pages' API
This rounds the `time_on_page` metric returned as part of the `pages`
API to the nearest second. While this shows no apparent change in the
web UI, it removes the decimal from the exported data in `pages.csv`.
* Tidy up export tests
* Round time_on_page in db query
* Add goal to CSV export tests
* Display CSV export spinner until download is ready
The mechanism used to make the page aware that the download is ready is:
- Client code sets a cookie and requests download.
- Server code handles download and removes cookie when complete.
- Client code polls every 1s to check the cookies, removing the spinner when
the export cookie is removed (a client-side sketch follows).
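A rough client-side sketch of that mechanism; the cookie name, element id, and helper names are assumptions made for illustration:

```typescript
// Illustrative sketch of the spinner + cookie polling described above.
// The "exporting" cookie name and "export-spinner" element id are assumptions.
function startCsvExport(exportUrl: string): void {
  document.cookie = "exporting=true; path=/"; // mark the export as in progress
  toggleSpinner(true);
  window.location.href = exportUrl; // server removes the cookie once the download is ready

  const poll = window.setInterval(() => {
    if (!document.cookie.includes("exporting=")) {
      // Cookie was removed by the server: download is ready, hide the spinner.
      toggleSpinner(false);
      window.clearInterval(poll);
    }
  }, 1000); // poll every 1s, as described above
}

function toggleSpinner(visible: boolean): void {
  document.getElementById("export-spinner")?.classList.toggle("hidden", !visible);
}
```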
* Fix conversion metric names in CSV export
f576fa2 should have updated the conversion metric names so that
`unique_conversions` and `total_conversions` are the two metrics
returned by the conversions API. This updates those so that the CSV
export outputs the correct data.
* Add details=True to export API parameters
This makes the ZIP export add `%{"details" => "True"}` to the query's
`params` when fetching data internally for packaging in the ZIP.
This adds bounce_rate and time_on_page to the data in pages.csv, and
bounce_rate and visit_duration to sources.csv.
* Make API return data with consistent names
Some of the data types returned via the JSON or CSV API use inconsistent
naming, and some have redundant name changes (e.g. count -> visitors ->
count). This makes these all consistent and removes the redundancy.
This addresses #1426, fixes some of the CSV headers, and unifies the
JSON and CSV return data labels.
* Update changelog
* Test should use Timex.shift, not relative time
* Return full country names in CSV export
This also replaces the " character with ' in two country names, since ' is
the character actually used in those names, yielding a more predictable and
'correct' output.
* Fetch CSV exported data concurrently
* Use spinner to indicate when export has started
* Use 300 as the default number of breakdown entries for export
Higher numbers (e.g. 1000) seem to cause clickhouse errors when there
are many pages to request. It is unclear what is causing the error, as
clickhouse returns an "unknown" error code and an empty error message.
* Export all dashboard data in zip
This packages all data currently visible in the dashboard into
individual CSVs and downloads them together in a zip.
* Also export conversions in zip of CSVs
* Update export test with zip file response
* Add zip file download to changelog