no issue
- clicks inside the iframe never bubble out of it, so they weren't captured by the dropdown-closing event listener
- added an event listener directly on the iframe's body element when we render the iframe's content, which manually calls out to our generic dropdown-closing method
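Roughly what that wiring looks like - the function and argument names are illustrative, not the actual identifiers in the codebase:

```js
// Hypothetical sketch: `iframe` and `closeDropdowns` stand in for whatever the component passes around.
function attachDropdownCloser(iframe, closeDropdowns) {
    // Called after the iframe's content has been rendered, so contentDocument.body exists.
    iframe.contentDocument.body.addEventListener('click', () => {
        // Clicks inside the iframe never bubble into the parent document,
        // so forward them to the generic dropdown-closing method manually.
        closeDropdowns();
    });
}
```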
refs https://github.com/TryGhost/Toolbox/issues/503
- The listener was not covered by tests during the quick and dirty implementation. While in the area, did some cleanup of the sitemap manager tests
- One of the problems I stumbled upon when adding a test was having multiple instances of SiteManager in the test, which in turn created multiple "subscribe" events and repeated handler executions. Fixed it by having just one site manager instance (a singleton), as that's the pattern used in the main codebase
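A minimal sketch of that singleton pattern - the class name and event wiring are assumptions, not the real sitemap manager:

```js
const events = require('events');

class SiteMapManager extends events.EventEmitter {
    subscribe(eventName, handler) {
        // With one shared instance this registration happens exactly once,
        // instead of once per manager instantiated by each test.
        this.on(eventName, handler);
    }
}

// Export a single instance so the app and the tests share it.
module.exports = new SiteMapManager();
```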
refs https://github.com/TryGhost/Toolbox/issues/503
- The Dynamic URL service no longer generates a "url.added" event when only a partial resource update happened, i.e. only non-URL-forming properties were modified. The sitemaps service still needs to know when to update the lastmod ("Last Modified") field associated with a specific URL.
refs https://github.com/TryGhost/Toolbox/issues/503
- Tiers are sometimes dynamically generated and show up in the "_changed" properties, causing full URL regeneration. They have no effect on a post's URL, so they should not trigger URL regeneration.
refs https://github.com/TryGhost/Toolbox/issues/503
- The full URL regeneration process was happening even when only fields unrelated to URL generation were updated (e.g. a 'plaintext' change in a post does not affect the post's URL). Stopping the "resource updated" event processing early avoids full URL regeneration inside DynamicRouting, which can be quite heavy depending on the routing configuration
- The URLResourceUpdatedEvent is supposed to be emitted whenever there's an update to a resource already associated with a URL and no URL-affecting fields were touched.
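A sketch of the early-return idea - the field list, handler shape, and event wiring below are assumptions made purely for illustration:

```js
// Fields assumed to affect URL generation - illustrative only.
const URL_AFFECTING_FIELDS = ['slug', 'published_at', 'authors'];

function handleResourceUpdated(resource, {DomainEvents, URLResourceUpdatedEvent, regenerateUrls}) {
    const changed = Object.keys(resource._changed || {});
    const urlAffected = changed.some(field => URL_AFFECTING_FIELDS.includes(field));

    if (!urlAffected) {
        // Nothing URL-forming changed: skip the heavy regeneration and only
        // announce that the resource behind an existing URL was updated, so
        // consumers like the sitemap can refresh their lastmod timestamp.
        DomainEvents.dispatch(URLResourceUpdatedEvent.create({id: resource.id}));
        return;
    }

    regenerateUrls(resource);
}
```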
refs https://github.com/TryGhost/Toolbox/issues/503
- Reusing existing events inside dynamic routing would only add to the confusion that is already there. Having separate "DomainEvents" is the practice used throughout the codebase, gradually substituting the generic events.
- The URLResourceUpdatedEvent is supposed to be emitted whenever there's an update to a resource already associated with a URL, circumventing the full URL regeneration process inside DynamicRouting
no issue
Tests broke because the Mailgun mocker stopped working when we moved to the new email flow.
This also fixes a unit test that needed to get updated.
fixes https://github.com/TryGhost/Team/issues/2432
Adds an outbound_link_tagging setting (enabled by default and behind a
feature flag). If the feature flag is enabled and the setting is
disabled, we won't add ?ref to links in emails.
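The intended behaviour, roughly - the option names and helper below are illustrative, not the real implementation:

```js
function tagOutboundLink(href, {labsEnabled, outboundLinkTagging, siteRef}) {
    if (labsEnabled && !outboundLinkTagging) {
        // Feature flag on + setting off: leave outbound links untouched.
        return href;
    }

    const url = new URL(href);
    if (!url.searchParams.has('ref')) {
        url.searchParams.set('ref', siteRef);
    }
    return url.toString();
}

// tagOutboundLink('https://example.com/', {labsEnabled: true, outboundLinkTagging: true, siteRef: 'my-site.com'})
// => 'https://example.com/?ref=my-site.com'
```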
This includes new E2E tests for email click tracking, which were also
extended to check outbound link tagging (for both MEGA and the new email
stability flow).
Also fixes a test fixture for the comments_enabled setting.
fixes https://github.com/TryGhost/Team/issues/2461
- Ignores 'edited' links when there is only a one-second difference.
- Makes sure we don't set updatedAt when linking a post to a redirect
refs https://github.com/TryGhost/Toolbox/issues/501
- this reverts commit 48dda23554
- also includes a resolution for `@elastic/elasticsearch` so we don't
run a version that is potentially problematic - see referenced issue
for context
refs https://github.com/TryGhost/Team/issues/2466
This initial implementation just checks that we're on the right origin and
subdomain, but should be extended to check if the URL actually resolves to a
page hosted on the site!
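Roughly the kind of check this adds - `siteUrl` and the subdomain rule here are assumptions for illustration:

```js
function isOwnTarget(targetUrl, siteUrl) {
    const target = new URL(targetUrl);
    const site = new URL(siteUrl);

    // Accept the exact origin as well as subdomains of the site's host.
    const sameOrigin = target.origin === site.origin;
    const isSubdomain = target.hostname.endsWith(`.${site.hostname}`);

    return sameOrigin || isSubdomain;
    // Note: this still doesn't verify that the URL resolves to a real page on the site.
}
```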
refs https://github.com/TryGhost/Team/issues/2465
We've restricted this to Post resources for now until we update the Mention
entity to be able to handle multiple resource types.
refs https://github.com/TryGhost/Toolbox/issues/503
refs https://github.com/TryGhost/Toolbox/issues/406
- In Ghost 5.x we dropped the multi-versioned API, which means there's no need to track resource configs dynamically as there can only be one version
- Along with removing "initResourceConfig", refactored the "config" file itself to be injected into the Resource's constructor, which allows for easier testing.
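Illustration of the injection refactor - the config shape and names below are assumptions:

```js
class Resource {
    /**
     * @param {string} type - resource type, e.g. 'posts'
     * @param {Object} config - previously loaded internally via initResourceConfig, now injected
     */
    constructor(type, config) {
        this.type = type;
        this.config = config;
    }
}

// In tests the config can now be a plain stub object instead of the real config file:
const resource = new Resource('posts', {modelOptions: {filter: 'status:published'}});
```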
no issue
Using the slash menu it was possible to insert cards that shouldn't have been accessible based on their availability checks. This was happening because we were only hiding the cards in the template rather than completely removing them from the slash command matching logic.
- added `{{card-menu-items}}` helper that combines the availability matching and snippet section addition to return a complete array of sections+items that match the current system state and post type (see the sketch after this list)
- added `@menuItems` argument set to the output of `{{card-menu-items}}` to the two card menu components so they are working against a pre-filtered list of menu items
- lets us remove duplicated code that handled pushing the snippets section into the menus
- removed availability check conditionals from `<KoenigMenuContent>` as the menu items passed in are now pre-filtered
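A plain-JS sketch of the filtering idea behind `{{card-menu-items}}` - the shapes of `section`, `card.isAvailable`, and the snippet handling are assumptions:

```js
function cardMenuItems(sections, editorState, snippets) {
    const filtered = sections
        .map((section) => {
            // Keep only the cards whose availability check matches the current state.
            const items = section.items.filter(card => card.isAvailable(editorState));
            return {...section, items};
        })
        .filter(section => section.items.length > 0);

    // Snippets get appended once here, instead of each menu component pushing
    // its own snippets section.
    if (snippets.length > 0) {
        filtered.push({title: 'Snippets', items: snippets});
    }

    return filtered;
}
```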
https://github.com/TryGhost/Team/issues/2458
This is an initial pass at pulling metadata from webmention sources; we've also
updated the fake data to pull from some real-world sites which implement
webmentions. We've reused the oembed service here; long term it would be nice to
pull the metadata parsing/fetching part out, so that we can have more generic
error messages.
Based on a discussion in slack we want to make all metadata properties optional,
with the exception of the title, which will default to the host of the source
URL if it's missing.
This is so that we can accept as many webmentions as possible and convert them
into Mentions. If we had strict validation, we'd end up having to
drop webmentions that didn't match our criteria, and lose important data.
Giving the title a default allows us to provide a consistent UI experience too.
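Roughly how the title fallback could look - the metadata shape here is an assumption:

```js
function normalizeMetadata(sourceUrl, metadata = {}) {
    return {
        // Everything else stays optional...
        ...metadata,
        // ...but the title always has a value, so the UI can rely on it.
        title: metadata.title || new URL(sourceUrl).host
    };
}

// normalizeMetadata('https://example.com/some-post') => {title: 'example.com'}
```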
refs https://github.com/TryGhost/Team/issues/2419
This is the initial stab at having everything wired up; we're not
using a queue, but we are handling the processing of the Webmention
asynchronously so that the HTTP response can end immediately.
We've also laid the groundwork for extending and implementing the
correct processing of Webmentions, for example checking if the target
URL exists in the system, pulling out the metadata from the Webmention
source and fetching any internal resources.
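A sketch of the "respond now, process later" flow - the service and method names are placeholders:

```js
module.exports = function createWebmentionHandler(webmentionService) {
    return function handleWebmention(req, res) {
        const {source, target} = req.body;

        // Kick off processing without awaiting it, so the HTTP response can end immediately.
        webmentionService.processWebmention({source, target}).catch((err) => {
            // Nothing upstream awaits this promise, so errors must be handled here.
            console.error(err);
        });

        res.writeHead(202);
        res.end();
    };
};
```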
This allows us to share the implementation with other parts of the codebase, the
specific use case here being fetching the metadata from webmention sources for
display in the mentions UI, which will borrow a lot from the
bookmark card.
refs https://github.com/TryGhost/Team/issues/2435
We've made these fields optional, and we may need to extend this to other fields
too as we discover more about the data we're able to get access to.
- we don't end up using the inserted model from Bookshelf, so we
shouldn't be performing a SELECT on the entry
- this disables refreshing the model using Bookshelf's `autoRefresh: false` and allows the key through the sanitization for `add`
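Purely illustrative usage of the option described above - `SomeEventModel` is a placeholder, and this assumes the model layer forwards `autoRefresh` to Bookshelf:

```js
// (inside an async function)
// With autoRefresh disabled, Bookshelf skips the follow-up SELECT for the
// inserted row, which we never read anyway.
await models.SomeEventModel.add(data, {autoRefresh: false});
```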
refs https://github.com/TryGhost/Ghost/issues/15502
- the amazing `i18next-parser` dependency will extract our translated
strings from Portal and dump them into locale files, so we never have
to add them manually
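For reference, a minimal `i18next-parser` config might look something like this - the globs, output path, and locale list are assumptions, not Portal's actual setup:

```js
// i18next-parser.config.js
module.exports = {
    locales: ['en'],
    // Extracted strings get dumped here; $LOCALE is substituted by the parser.
    output: 'src/locales/$LOCALE.json',
    input: ['src/**/*.js'],
    sort: true,
    // Treat keys as literal strings rather than nested paths/namespaces.
    keySeparator: false,
    namespaceSeparator: false
};
```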
refs https://github.com/TryGhost/Ghost/issues/15502
- plain JSON files are cleaner and less overwhelming than boilerplate JS
files, and given they're going to be automatically generated, we
probably won't be able to support comments anyway
refs https://github.com/TryGhost/Ghost/issues/15502
- this is an early implementation of an i18n provider by
exporting an instance of `i18next`
- there's a lot more to be done here but baby steps :)
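Roughly the shape of that provider - the options and resource wiring here are assumptions:

```js
import i18next from 'i18next';

const i18n = i18next.createInstance();

i18n.init({
    lng: 'en',
    fallbackLng: 'en',
    // Treat keys as literal English strings rather than nested paths.
    keySeparator: false,
    nsSeparator: false,
    resources: {
        en: {translation: {}}
    }
});

export default i18n;
```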
fixes https://github.com/TryGhost/Team/issues/2200
When zipping a folder that contains files with UTF-8 characters in the filename using the macOS Archive Utility, the resulting zip is missing the UTF-8 configuration bit. This breaks the unzipper, causing it to decode the filenames using the wrong encoding.
When the file names are long and exceed the length allowed by the OS, an ENAMETOOLONG error is thrown. This error is not handled by the importer, and causes the import to fail.
This adds a specific check for this error so we can show a clear error message that helps the user resolve the issue. We are currently unable to fix the issue on our side, because of a lack of well-supported zip libraries for Node.
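The check is essentially this - `extractZip` stands in for the real unzip call and the message wording is illustrative:

```js
async function safeExtract(extractZip, zipPath, destination) {
    try {
        await extractZip(zipPath, destination);
    } catch (err) {
        if (err.code === 'ENAMETOOLONG') {
            // Map the raw filesystem error to something the user can act on.
            const friendly = new Error('The zip contains a file name that is too long for the operating system. Shorten the file name inside the zip and try the import again.');
            friendly.cause = err;
            throw friendly;
        }
        throw err;
    }
}
```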
refs acf0baa8c7
Due to the bump in express-test, we now handle string bodies 'properly', so they pass through all the Express middleware. In the past, this failing test never really passed through the bodyParser.raw middleware,
so the content-type check on the `bodyParser.raw({type: 'application/json'})` middleware was not executed. Now it is, and the test fails because the content-type header was not set to application/json.
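For context, this is the mechanism at play - a standard body-parser setup, not Ghost's actual middleware stack:

```js
const express = require('express');
const bodyParser = require('body-parser');

const app = express();
// raw() only runs for requests whose Content-Type matches the configured type;
// requests without `Content-Type: application/json` skip it entirely.
app.use(bodyParser.raw({type: 'application/json'}));

app.post('/webhook', (req, res) => {
    // req.body is a Buffer only when the raw parser actually ran.
    res.json({parsed: Buffer.isBuffer(req.body)});
});
```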
no issue
When fetching the suppression list data for emails with a plus sign, the
parsing of the NQL filter fails:
```
at Child.applyDefaultAndCustomFilters (/Ghost/node_modules/@tryghost/bookshelf-filter/lib/bookshelf-filter.js:66:23)
[ghost] email:[simon+test@ghos
[ghost] ------------^
[ghost] Expecting 'OR', 'RBRACKET', got 'AND'
```
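One likely fix (an assumption here, not something this commit confirms) is to quote each address so NQL treats the `+` as part of a literal string rather than as syntax:

```js
function buildEmailFilter(emails) {
    // Single quotes make NQL treat the value as a literal string.
    const quoted = emails.map(email => `'${email.replace(/'/g, "\\'")}'`);
    return `email:[${quoted.join(',')}]`;
}

// buildEmailFilter(['simon+test@ghost.org']) => "email:['simon+test@ghost.org']"
```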
refs https://github.com/TryGhost/Team/issues/2400
- we've deemed it useful to start to return `Content-Version` for all
API requests, because it becomes useful to know which version of Ghost
a response has come from in logs
- this should also help us detect Admin<->Ghost API mismatches, which
was the cause of a bug recently (ref'd issue)
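A minimal sketch of the middleware - where the version string comes from is an assumption:

```js
const ghostVersion = require('../package.json').version; // path is illustrative, e.g. '5.33.0'

module.exports = function contentVersion(req, res, next) {
    res.set('Content-Version', `v${ghostVersion}`);
    next();
};
```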
refs https://github.com/TryGhost/Toolbox/issues/499
- The mockManager's sentEmailCount is left here to avoid breaking many tests that already depend on this method. With future improvements to email snapshot tests this method should not be used. Instead, emailMockReceiver's own sentEmailCount method should be used directly.
ref https://github.com/TryGhost/Team/issues/2421
- added the Mentions API endpoint to Admin
- set up an initial mention model in the Ember Store to be able to develop against the endpoint
- added basic routing to access the `/mentions` page, which is currently behind feature flags
- set up basic testing with a Mirage mock endpoint
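The Mirage mock is roughly this shape - the response payload is a placeholder while the real API settles:

```js
export default function mockMentions(server) {
    server.get('/mentions/', function () {
        return {
            mentions: [{
                id: 1,
                source: 'https://example.com/some-post/',
                target: 'https://my-site.com/hello-world/'
            }],
            meta: {pagination: {page: 1, limit: 15, total: 1}}
        };
    });
}
```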
refs https://github.com/TryGhost/Team/issues/2416
This extends the mock API to use a more formal pattern of moving our
entity code into a separate package, and uses the service/repository
patterns we've been working toward.
The repository is currently in memory, this allows us to start using
the API without having to make commitments to the database structure.
We've also injected a single fake webmention for testing. I'd expect
the Mention object to change a lot from this initial definition as we
gain more information about the type of data we expect to see.
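Rough shape of the in-memory repository mentioned above - the method names are assumptions that will likely change as the entity settles:

```js
class InMemoryMentionRepository {
    constructor() {
        /** @type {Map<string, object>} */
        this.store = new Map();
    }

    async save(mention) {
        this.store.set(mention.id, mention);
    }

    async getAll() {
        return [...this.store.values()];
    }
}

module.exports = InMemoryMentionRepository;
```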
refs https://github.com/TryGhost/Team/issues/2416
This doesn't even return a Mention in the correct format at the moment,
but it's just to get an endpoint there, behind a flag and returning data,
so that we can start playing with the API and have it hooked up to
the Admin.
The next step will be fleshing this out further and defining the
services and repository to back it, as well as updating the Admin so that
we can fetch mentions to display in the UI
refs https://github.com/TryGhost/Toolbox/issues/499
refs 6bcc47a0ad
- Using the module directly caused issues with snapshot manager instance initialization (mocha hooks did not apply to the correct instance)
- See the ref'd commit for more
refs https://github.com/TryGhost/Toolbox/issues/499
- Outgoing emails have been a weak point of Ghost's stability recently. The "emailMockReceiver" concept, similar to "webhookMockReceiver", allows us to test side-effects like outgoing emails.
- This is a first iteration which should lay groundwork for testing all outgoing emails in the future
- The change adds a new concept of an "email mock receiver", which is very similar to how the "webhook mock receiver" works. The email mock receiver exposes two methods to record and verify snapshots (usage sketch below):
- matchHTMLSnapshot - records and verifies only the HTML content of the outgoing email
- matchMetadataSnapshot - records and verifies all the non-HTML properties sent along with the email content, e.g. the to address, plaintext, subject, etc.
- What's missing is matching content based on dynamic content like dates, links with JWT tokens, etc.
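How a test might use the receiver - the `mockMail()` helper and the require path are assumptions; the matcher names come from the description above:

```js
const {mockManager} = require('../../utils/e2e-framework');

describe('Signup emails', function () {
    let emailMockReceiver;

    beforeEach(function () {
        emailMockReceiver = mockManager.mockMail();
    });

    afterEach(function () {
        mockManager.restore();
    });

    it('matches the outgoing email against stored snapshots', async function () {
        // ...trigger the email-producing behaviour here...
        emailMockReceiver.matchHTMLSnapshot();
        emailMockReceiver.matchMetadataSnapshot();
    });
});
```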
We've wrapped both changes in a try/catch to make sure this has no
adverse effects. The endpoint currently doesn't exist - we're only
adding this to get an idea of how much traffic we can expect to see.
Long term we'll want to read the endpoint from the webmention service.
closes https://github.com/TryGhost/Team/issues/2276
Portal had died with an unintelligible error about portal plans/includes being undefined when there was missing site data in some extreme edge cases. This change catches any errors during site data transformation and logs them to the console instead of crashing Portal unexpectedly
- in the event the Mailgun config doesn't exist, we return `null` from
this function
- this updates the jsdoc to correct the return type of `getInstance`
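Illustrative version of the corrected annotation - the type name is a placeholder:

```js
/**
 * @returns {MailgunClient|null} the shared instance, or null when no Mailgun config exists
 */
function getInstance() {
    // ...
}
```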
no refs.
Post feedback popups were not optimised for mobile usage at all. All the content sat at the top of the window, which is really hard to reach with a single hand/thumb, and it just looked like a scaled-down version of a desktop modal.
closes https://github.com/TryGhost/Team/issues/2206
- removes `www.` from the URL shown in the links table in post analytics
- we had previously removed the http(s) protocol from it as well; both are only shown while editing the URL
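The display-only cleanup amounts to something like this - the stored URL keeps its protocol and `www.`:

```js
function formatUrlForDisplay(url) {
    return url
        .replace(/^https?:\/\//, '')
        .replace(/^www\./, '');
}

// formatUrlForDisplay('https://www.example.com/some-page/') => 'example.com/some-page/'
```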