Made robots.txt caching configurable

refs https://github.com/TryGhost/Toolbox/issues/411

- Having hardcoded cache control values in the codebase makes it impossible to experiment with new values without a version release.
- Making all values configurable by default will allow for easier caching experiments and customizations on self-hosted instances.
- Aligns caching for both the private and the public robots.txt file to the same consistent, configurable value.
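
As a sketch of how a self-hosted instance might override this value: Ghost resolves colon-separated keys like `caching:robotstxt:maxAge` from its nested JSON configuration, so the corresponding config file entry would plausibly look like the fragment below (the exact file layout and default value are assumptions, not confirmed by this commit; the value is in seconds, matching the previously hardcoded `constants.ONE_HOUR_S`):

```json
{
    "caching": {
        "robotstxt": {
            "maxAge": 3600
        }
    }
}
```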
Naz 2022-09-27 17:11:50 +08:00 committed by naz
parent f6c7df4018
commit e45eb4d5dd

@@ -118,7 +118,7 @@ module.exports = function setupSiteApp(routerConfig) {
debug('Themes done');
// Serve robots.txt if not found in theme
-    siteApp.use(mw.servePublicFile('static', 'robots.txt', 'text/plain', constants.ONE_HOUR_S));
+    siteApp.use(mw.servePublicFile('static', 'robots.txt', 'text/plain', config.get('caching:robotstxt:maxAge')));
// site map - this should probably be refactored to be an internal app
sitemapHandler(siteApp);