Ghost/core/server/services/mega/post-email-serializer.js

Refactor mega service to use stored email content and batch/recipient records (2020-09-24 11:35:29 +03:00)

no issue

- store raw content in the email record
  - keep any replacement strings in the html/plaintext content so that they can be used when sending the email, rather than needing to re-serialize the post content, which may have changed
- split the post email serializer into separate serialization and replacement-parsing functions
  - serialization now returns the email content that is derived from the post content (subject/html/plaintext) rather than post content plus replacements
  - `parseReplacements` has been split out so that it can be run against email content rather than a post; this allows mega and the email preview service to work with the stored email content
- move mailgun-specific functionality into the mailgun provider
  - previously mailgun-specific behaviour was spread across the post email serializer, mega, and the bulk-email service
  - the per-batch `send` functionality was moved from the `bulk-email` service to the mailgun provider and updated to take email content, recipient info, and replacement info so that all mailgun-specific headers and replacement formatting can be handled in one place
  - exposes the `BATCH_SIZE` constant because batch sizes are limited to what the provider allows
- split the `bulk-email` service into three methods
  - `send` takes email content and recipients, parses replacement info from the email content, uses it to collate a recipient data object, and coordinates the send via the mailgun provider; usable directly for use-cases such as test emails
  - `processEmail` takes an email ID, loads it, and coordinates sending the related batches concurrently
  - `processEmailBatch` takes an email_batch ID, loads it along with the associated email_recipient records, passes the data through to `send`, and updates the batch status as it's processed
  - `processEmail` and `processEmailBatch` take IDs rather than objects, ready for future use by job queues; it's best to keep job parameters as minimal as possible
- refactored the `mega` service
  - modified `getEmailData` to collate email content (from/reply-to/subject/html/plaintext) rather than being responsible for dealing with replacements and mailgun-specific replacement formats; used for generating email content before storing it in the email table, and when sending test emails
  - from/reply-to calculation moved out of the post-email-serializer into mega and extracted into separate functions used by `getEmailData`
  - `sendTestEmail` updated to generate `EmailRecipient`-like objects for each email address so that appropriate data can be supplied to the updated `bulk-email.send` method
  - `sendEmailJob` updated to create `email_batches` and associated `email_recipients` records, then hand processing over to the `bulk-email` service
  - member row fetching extracted into a separate function and used by `createEmailBatches`
  - moved updating of the email status from `mega` to the `bulk-email` service, keeping the concept of Successful/FailedBatch internal to the `bulk-email` service
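
// Replacement token format handled by this module (token values here are illustrative, not from the file):
//
//   %%{first_name}%%            -> kept for later per-recipient substitution, no fallback
//   %%{first_name, "there"}%%   -> kept for later per-recipient substitution, falls back to "there"
//   %%{anything_else}%%         -> not in ALLOWED_REPLACEMENTS, unwrapped to {anything_else}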
const _ = require('lodash');
const juice = require('juice');
const template = require('./template');
const labsTemplate = require('./template-labs');
const config = require('../../../shared/config');
const settingsCache = require('../../services/settings/cache');
const urlUtils = require('../../../shared/url-utils');
const moment = require('moment-timezone');
const cheerio = require('cheerio');
const api = require('../../api');
const {URL} = require('url');
const mobiledocLib = require('../../lib/mobiledoc');
const htmlToText = require('html-to-text');
const {isUnsplashImage, isLocalContentImage} = require('@tryghost/kg-default-cards/lib/utils');
const logging = require('../../../shared/logging');

const ALLOWED_REPLACEMENTS = ['first_name'];

const getSite = () => {
    const publicSettings = settingsCache.getPublic();
    return Object.assign({}, publicSettings, {
        url: urlUtils.urlFor('home', true),
        iconUrl: publicSettings.icon ? urlUtils.urlFor('image', {image: publicSettings.icon}, true) : null
    });
};

/**
 * createUnsubscribeUrl
 *
 * Takes a member uuid and returns the url that should be used to unsubscribe
 * In case of no member uuid, generates the preview unsubscribe url - `?preview=1`
 *
 * @param {string} uuid
 */
const createUnsubscribeUrl = (uuid) => {
    const siteUrl = urlUtils.getSiteUrl();
    const unsubscribeUrl = new URL(siteUrl);

    unsubscribeUrl.pathname = `${unsubscribeUrl.pathname}/unsubscribe/`.replace('//', '/');

    if (uuid) {
        unsubscribeUrl.searchParams.set('uuid', uuid);
    } else {
        unsubscribeUrl.searchParams.set('preview', '1');
    }

    return unsubscribeUrl.href;
};

// NOTE: serialization is needed to make sure we are using current API and do post transformations
// such as image URL transformation from relative to absolute
const serializePostModel = async (model, apiVersion = 'v4') => {
    // fetch mobiledoc rather than html and plaintext so we can render email-specific contents
    const frame = {options: {context: {user: true}, formats: 'mobiledoc'}};
    const docName = 'posts';

    await api.shared
        .serializers
        .handle
        .output(model, {docName: docName, method: 'read'}, api[apiVersion].serializers.output, frame);

    return frame.response[docName][0];
};

// removes %% wrappers from unknown replacement strings in email content
const normalizeReplacementStrings = (email) => {
    // we don't want to modify the email object in-place
    const emailContent = _.pick(email, ['html', 'plaintext']);

    const EMAIL_REPLACEMENT_REGEX = /%%(\{.*?\})%%/g;
    const REPLACEMENT_STRING_REGEX = /\{(?<recipientProperty>\w*?)(?:,? *(?:"|&quot;)(?<fallback>.*?)(?:"|&quot;))?\}/;

    ['html', 'plaintext'].forEach((format) => {
        emailContent[format] = emailContent[format].replace(EMAIL_REPLACEMENT_REGEX, (replacementMatch, replacementStr) => {
            const match = replacementStr.match(REPLACEMENT_STRING_REGEX);

            if (match) {
                const {recipientProperty} = match.groups;

                if (ALLOWED_REPLACEMENTS.includes(recipientProperty)) {
                    // keeps wrapping %% for later replacement with real data
                    return replacementMatch;
                }
            }

            // removes %% so output matches user supplied content
            return replacementStr;
        });
    });

    return emailContent;
};
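
// Example (illustrative input): given html containing
//   `Hey %%{first_name, "there"}%%, %%{last_name}%% is not an allowed property`
// the allowed `first_name` token keeps its %% wrappers for later per-recipient replacement,
// while the unknown `last_name` token is unwrapped so the output matches the author's text:
//   `Hey %%{first_name, "there"}%%, {last_name} is not an allowed property`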

// parses email content and extracts an array of replacements with desired fallbacks
const parseReplacements = (email) => {
    const EMAIL_REPLACEMENT_REGEX = /%%(\{.*?\})%%/g;
    const REPLACEMENT_STRING_REGEX = /\{(?<recipientProperty>\w*?)(?:,? *(?:"|&quot;)(?<fallback>.*?)(?:"|&quot;))?\}/;

    const replacements = [];

    ['html', 'plaintext'].forEach((format) => {
        let result;
        while ((result = EMAIL_REPLACEMENT_REGEX.exec(email[format])) !== null) {
            const [replacementMatch, replacementStr] = result;
            const match = replacementStr.match(REPLACEMENT_STRING_REGEX);

            if (match) {
                const {recipientProperty, fallback} = match.groups;

                if (ALLOWED_REPLACEMENTS.includes(recipientProperty)) {
                    const id = `replacement_${replacements.length + 1}`;

                    replacements.push({
                        format,
                        id,
                        match: replacementMatch,
                        recipientProperty: `member_${recipientProperty}`,
                        fallback
                    });
                }
            }
        }
    });

    return replacements;
};
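
// Example (illustrative): for html containing `Hello %%{first_name, "there"}%%` this returns
//   [{
//       format: 'html',
//       id: 'replacement_1',
//       match: '%%{first_name, "there"}%%',
//       recipientProperty: 'member_first_name',
//       fallback: 'there'
//   }]
// A consumer (e.g. the bulk-email service) can then swap each `match` for per-recipient data,
// falling back to `fallback` when the property is empty - a minimal sketch, assuming a
// `recipient` object that exposes the prefixed `member_*` properties:
//
//   parseReplacements(email).forEach((replacement) => {
//       const value = recipient[replacement.recipientProperty] || replacement.fallback || '';
//       email[replacement.format] = email[replacement.format].replace(replacement.match, value);
//   });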

const getTemplateSettings = async () => {
    return {
        showSiteHeader: settingsCache.get('newsletter_show_header'),
        bodyFontCategory: settingsCache.get('newsletter_body_font_category'),
        showBadge: settingsCache.get('newsletter_show_badge'),
        footerContent: settingsCache.get('newsletter_footer_content'),
        accentColor: settingsCache.get('accent_color')
    };
};

const getLabsTemplateSettings = async () => {
    const templateSettings = {
        headerImage: settingsCache.get('newsletter_header_image'),
        showHeaderIcon: settingsCache.get('newsletter_show_header_icon') && settingsCache.get('icon'),
        showHeaderTitle: settingsCache.get('newsletter_show_header_title'),
        showFeatureImage: settingsCache.get('newsletter_show_feature_image'),
        titleFontCategory: settingsCache.get('newsletter_title_font_category'),
        titleAlignment: settingsCache.get('newsletter_title_alignment'),
        bodyFontCategory: settingsCache.get('newsletter_body_font_category'),
        showBadge: settingsCache.get('newsletter_show_badge'),
        footerContent: settingsCache.get('newsletter_footer_content'),
        accentColor: settingsCache.get('accent_color')
    };

    if (templateSettings.headerImage) {
        if (isUnsplashImage(templateSettings.headerImage)) {
            // Unsplash images have a minimum size so assuming 1200px is safe
            const unsplashUrl = new URL(templateSettings.headerImage);
            unsplashUrl.searchParams.set('w', 1200);

            templateSettings.headerImage = unsplashUrl.href;
            templateSettings.headerImageWidth = 600;
        } else {
            const {imageSize} = require('../../lib/image');
            try {
                const size = await imageSize.getImageSizeFromUrl(templateSettings.headerImage);

                if (size.width >= 600) {
                    // keep original image, just set a fixed width
                    templateSettings.headerImageWidth = 600;
                }

                if (isLocalContentImage(templateSettings.headerImage, urlUtils.getSiteUrl())) {
                    // we can safely request a 1200px image - Ghost will serve the original if it's smaller
                    templateSettings.headerImage = templateSettings.headerImage.replace(/\/content\/images\//, '/content/images/size/w1200/');
                }
            } catch (err) {
                // log and proceed. Using original header image without fixed width isn't fatal.
                logging.error(err);
            }
        }
    }

    return templateSettings;
};

const serialize = async (postModel, options = {isBrowserPreview: false, apiVersion: 'v4'}) => {
    const post = await serializePostModel(postModel, options.apiVersion);

    const timezone = settingsCache.get('timezone');
    const momentDate = post.published_at ? moment(post.published_at) : moment();
    post.published_at = momentDate.tz(timezone).format('DD MMM YYYY');

    post.authors = post.authors && post.authors.map(author => author.name).join(',');

    if (post.posts_meta) {
        post.email_subject = post.posts_meta.email_subject;
    }

    // we use post.excerpt as a hidden piece of text that is picked up by some email
    // clients as a "preview" when listing emails. Our current plaintext/excerpt
    // generation outputs links as "Link [https://url/]" which isn't desired in the preview
    if (!post.custom_excerpt && post.excerpt) {
        post.excerpt = post.excerpt.replace(/\s\[http(.*?)\]/g, '');
    }

    post.html = mobiledocLib.mobiledocHtmlRenderer.render(JSON.parse(post.mobiledoc), {target: 'email'});

    // same options as used in Post model for generating plaintext but without `wordwrap: 80`
    // to avoid replacement strings being split across lines and for mail clients to handle
    // word wrapping based on user preferences
    post.plaintext = htmlToText.fromString(post.html, {
        wordwrap: false,
        ignoreImage: true,
        hideLinkHrefIfSameAsText: true,
        preserveNewlines: true,
        returnDomByDefault: true,
        uppercaseHeadings: false
    });

    // Outlook will render feature images at full-size breaking the layout.
    // Content images fix this by rendering max 600px images - do the same for feature image here
    if (post.feature_image) {
        if (isUnsplashImage(post.feature_image)) {
            // Unsplash images have a minimum size so assuming 1200px is safe
            const unsplashUrl = new URL(post.feature_image);
            unsplashUrl.searchParams.set('w', 1200);

            post.feature_image = unsplashUrl.href;
            post.feature_image_width = 600;
        } else {
            const {imageSize} = require('../../lib/image');
            try {
                const size = await imageSize.getImageSizeFromUrl(post.feature_image);

                if (size.width >= 600) {
                    // keep original image, just set a fixed width
                    post.feature_image_width = 600;
                }

                if (isLocalContentImage(post.feature_image, urlUtils.getSiteUrl())) {
                    // we can safely request a 1200px image - Ghost will serve the original if it's smaller
                    post.feature_image = post.feature_image.replace(/\/content\/images\//, '/content/images/size/w1200/');
                }
            } catch (err) {
                // log and proceed. Using original feature_image without fixed width isn't fatal.
                logging.error(err);
            }
        }
    }

    const useLabsTemplate = config.get('enableDeveloperExperiments');
    const templateSettings = await (useLabsTemplate ? getLabsTemplateSettings() : getTemplateSettings());
    const templateRenderer = useLabsTemplate ? labsTemplate : template;

    let htmlTemplate = templateRenderer({post, site: getSite(), templateSettings});

    if (options.isBrowserPreview) {
        const previewUnsubscribeUrl = createUnsubscribeUrl();
        htmlTemplate = htmlTemplate.replace('%recipient.unsubscribe_url%', previewUnsubscribeUrl);
    }

    // Inline css to style attributes, turn on support for pseudo classes.
    const juiceOptions = {inlinePseudoElements: true};
    let juicedHtml = juice(htmlTemplate, juiceOptions);

    // convert juiced HTML to a DOM-like interface for further manipulation
    // happens after inlining of CSS so we can change element types without worrying about styling
    let _cheerio = cheerio.load(juicedHtml);

    // force all links to open in new tab
    _cheerio('a').attr('target', '_blank');
    // convert figure and figcaption to div so that Outlook applies margins
    _cheerio('figure, figcaption').each((i, elem) => (elem.tagName = 'div'));

    juicedHtml = _cheerio.html();

    // Fix any unsupported chars in Outlook
    juicedHtml = juicedHtml.replace(/&apos;/g, '&#39;');

    // Clean up any unknown replacement strings to get our final content
    const {html, plaintext} = normalizeReplacementStrings({
        html: juicedHtml,
        plaintext: post.plaintext
    });

    return {
        subject: post.email_subject || post.title,
        html,
        plaintext
    };
};
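
// Minimal usage sketch (assumptions: `postModel` is a loaded Post model and this runs inside
// Ghost where the API serializers and settings cache are available):
//
//   const {serialize} = require('./post-email-serializer');
//   const email = await serialize(postModel, {isBrowserPreview: false, apiVersion: 'v4'});
//   // => {subject: '...', html: '<!doctype html>...', plaintext: '...'}
//   // html/plaintext may still contain %%{...}%% tokens for allowed member properties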

module.exports = {
    serialize,
    createUnsubscribeUrl,
    parseReplacements
};