Refactor mega service to use stored email content and batch/recipient records
no issue
- store raw content in email record
- keep any replacement strings in the html/plaintext content so that it can be used when sending email rather than needing to re-serialize the post content which may have changed
- split post email serializer into separate serialization and replacement parsing functions
- serialization now returns any email content that is derived from the post content (subject/html/plaintext) rather than post content plus replacements
- `parseReplacements` has been split out so that it can be run against email content rather than a post, this allows mega and the email preview service to work with the stored email content
- move mailgun-specific functionality into the mailgun provider
- previously mailgun-specific behaviour was spread across the post email serializer, mega, and bulk-email service
- the per-batch `send` functionality was moved from the `bulk-email` service to the mailgun provider and updated to take email content, recipient info, and replacement info so that all mailgun-specific headers and replacement formatting can be handled in one place
- exposes the `BATCH_SIZE` constant because batch sizes are limited to what the provider allows
- `bulk-email` service split into three methods
- `send` responsible for taking email content and recipients, parsing replacement info from the email content and using that to collate a recipient data object, and finally coordinating the send via the mailgun provider. Usable directly for use-cases such as test emails
- `processEmail` takes an email ID, loads it and coordinates sending related batches concurrently
- `processEmailBatch` takes an email_batch ID, loads it along with associated email_recipient records and passes the data through to the `send` function, updating the batch status as it's processed
- `processEmail` and `processEmailBatch` take IDs rather than objects ready for future use by job-queues, it's best to keep job parameters as minimal as possible
- refactored `mega` service
- modified `getEmailData` to collate email content (from/reply-to/subject/html/plaintext) rather than being responsible for dealing with replacements and mailgun-specific replacement formats
- used for generating email content before storing in the email table, and when sending test emails
- from/reply-to calculation moved out of the post-email-serializer into mega and extracted into separate functions used by `getEmailData`
- `sendTestEmail` updated to generate `EmailRecipient`-like objects for each email address so that appropriate data can be supplied to the updated `bulk-email.send` method
- `sendEmailJob` updated to create `email_batches` and associated `email_recipients` records then hand over processing to the `bulk-email` service
- member row fetching extracted into a separate function and used by `createEmailBatches`
- moved updating of email status from `mega` to the `bulk-email` service, keeps concept of Successful/FailedBatch internal to the `bulk-email` service
2020-09-24 11:35:29 +03:00
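The replacement-string format the bullets above describe can be sketched standalone. This uses the same two regexes that appear later in this file; the sample HTML string is hypothetical.

```javascript
// Parse a `%%{property, "fallback"}%%` replacement token out of email content.
const EMAIL_REPLACEMENT_REGEX = /%%(\{.*?\})%%/g;
const REPLACEMENT_STRING_REGEX = /\{(?<recipientProperty>\w*?)(?:,? *(?:"|&quot;)(?<fallback>.*?)(?:"|&quot;))?\}/;

const html = 'Hello %%{first_name, "there"}%%!';

// outer regex strips the %% wrappers, inner regex splits property and fallback
const [, replacementStr] = EMAIL_REPLACEMENT_REGEX.exec(html);
const {recipientProperty, fallback} = replacementStr.match(REPLACEMENT_STRING_REGEX).groups;

console.log(recipientProperty); // first_name
console.log(fallback); // there
```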
const _ = require('lodash');
const juice = require('juice');
const template = require('./template');
const settingsCache = require('../../../shared/settings-cache');
const labs = require('../../services/labs');
const urlUtils = require('../../../shared/url-utils');
const moment = require('moment-timezone');
const cheerio = require('cheerio');
const api = require('../../api');
const {URL} = require('url');
const mobiledocLib = require('../../lib/mobiledoc');
const htmlToText = require('html-to-text');
const {isUnsplashImage, isLocalContentImage} = require('@tryghost/kg-default-cards/lib/utils');
const logging = require('@tryghost/logging');
const ALLOWED_REPLACEMENTS = ['first_name'];

const getSite = () => {
    const publicSettings = settingsCache.getPublic();

    return Object.assign({}, publicSettings, {
        url: urlUtils.urlFor('home', true),
        iconUrl: publicSettings.icon ? urlUtils.urlFor('image', {image: publicSettings.icon}, true) : null
    });
};
/**
 * createUnsubscribeUrl
 *
 * Takes a member uuid and returns the url that should be used to unsubscribe
 * In case of no member uuid, generates the preview unsubscribe url - `?preview=1`
 *
 * @param {string} uuid
 */
const createUnsubscribeUrl = (uuid) => {
    const siteUrl = urlUtils.getSiteUrl();
    const unsubscribeUrl = new URL(siteUrl);
    unsubscribeUrl.pathname = `${unsubscribeUrl.pathname}/unsubscribe/`.replace('//', '/');
    if (uuid) {
        unsubscribeUrl.searchParams.set('uuid', uuid);
    } else {
        unsubscribeUrl.searchParams.set('preview', '1');
    }

    return unsubscribeUrl.href;
};
// NOTE: serialization is needed to make sure we are using current API and do post transformations
// such as image URL transformation from relative to absolute
const serializePostModel = async (model, apiVersion = 'v4') => {
    // fetch mobiledoc rather than html and plaintext so we can render email-specific contents
    const frame = {options: {context: {user: true}, formats: 'mobiledoc'}};
    const docName = 'posts';

    await api.shared
        .serializers
        .handle
        .output(model, {docName: docName, method: 'read'}, api[apiVersion].serializers.output, frame);

    return frame.response[docName][0];
};
// removes %% wrappers from unknown replacement strings in email content
const normalizeReplacementStrings = (email) => {
    // we don't want to modify the email object in-place
    const emailContent = _.pick(email, ['html', 'plaintext']);

    const EMAIL_REPLACEMENT_REGEX = /%%(\{.*?\})%%/g;
    const REPLACEMENT_STRING_REGEX = /\{(?<recipientProperty>\w*?)(?:,? *(?:"|&quot;)(?<fallback>.*?)(?:"|&quot;))?\}/;

    ['html', 'plaintext'].forEach((format) => {
        emailContent[format] = emailContent[format].replace(EMAIL_REPLACEMENT_REGEX, (replacementMatch, replacementStr) => {
            const match = replacementStr.match(REPLACEMENT_STRING_REGEX);

            if (match) {
                const {recipientProperty} = match.groups;

                if (ALLOWED_REPLACEMENTS.includes(recipientProperty)) {
                    // keeps wrapping %% for later replacement with real data
                    return replacementMatch;
                }
            }

            // removes %% so output matches user supplied content
            return replacementStr;
        });
    });

    return emailContent;
};
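The keep-vs-unwrap behaviour of `normalizeReplacementStrings` can be shown in isolation. This sketch inlines the regexes and skips the lodash `_.pick` step so it runs standalone; the sample content is hypothetical.

```javascript
const ALLOWED_REPLACEMENTS = ['first_name'];
const EMAIL_REPLACEMENT_REGEX = /%%(\{.*?\})%%/g;
const REPLACEMENT_STRING_REGEX = /\{(?<recipientProperty>\w*?)(?:,? *(?:"|&quot;)(?<fallback>.*?)(?:"|&quot;))?\}/;

// allowed replacements keep their %% wrappers for later substitution with
// real recipient data; unknown ones are unwrapped to the user-supplied text
const normalize = (content) => {
    return content.replace(EMAIL_REPLACEMENT_REGEX, (replacementMatch, replacementStr) => {
        const match = replacementStr.match(REPLACEMENT_STRING_REGEX);
        if (match && ALLOWED_REPLACEMENTS.includes(match.groups.recipientProperty)) {
            return replacementMatch; // keep %%…%%
        }
        return replacementStr; // strip %% wrappers
    });
};

console.log(normalize('Hi %%{first_name}%%, see %%{last_name}%%'));
// Hi %%{first_name}%%, see {last_name}
```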
/**
 * Parses email content and extracts an array of replacements with desired fallbacks
 *
 * @param {Object} email
 * @param {string} email.html
 * @param {string} email.plaintext
 *
 * @returns {Object[]} replacements
 */
const parseReplacements = (email) => {
    const EMAIL_REPLACEMENT_REGEX = /%%(\{.*?\})%%/g;
    const REPLACEMENT_STRING_REGEX = /\{(?<recipientProperty>\w*?)(?:,? *(?:"|&quot;)(?<fallback>.*?)(?:"|&quot;))?\}/;

    const replacements = [];
    ['html', 'plaintext'].forEach((format) => {
        let result;
        while ((result = EMAIL_REPLACEMENT_REGEX.exec(email[format])) !== null) {
            const [replacementMatch, replacementStr] = result;
            const match = replacementStr.match(REPLACEMENT_STRING_REGEX);

            if (match) {
                const {recipientProperty, fallback} = match.groups;
|
|
|
if (ALLOWED_REPLACEMENTS.includes(recipientProperty)) {
|
2020-04-20 14:24:05 +03:00
|
|
|
const id = `replacement_${replacements.length + 1}`;
|
|
|
|
|
|
|
|
replacements.push({
|
|
|
|
format,
|
|
|
|
id,
|
|
|
|
match: replacementMatch,
|
Refactor mega service to use stored email content and batch/recipient records
no issue
- store raw content in email record
- keep any replacement strings in the html/plaintext content so that it can be used when sending email rather than needing to re-serialize the post content which may have changed
- split post email serializer into separate serialization and replacement parsing functions
- serialization now returns any email content that is derived from the post content (subject/html/plaintext) rather than post content plus replacements
- `parseReplacements` has been split out so that it can be run against email content rather than a post, this allows mega and the email preview service to work with the stored email content
- move mailgun-specific functionality into the mailgun provider
- previously mailgun-specific behaviour was spread across the post email serializer, mega, and bulk-email service
- the per-batch `send` functionality was moved from the `bulk-email` service to the mailgun provider and updated to take email content, recipient info, and replacement info so that all mailgun-specific headers and replacement formatting can be handled in one place
- exposes the `BATCH_SIZE` constant because batch sizes are limited to what the provider allows
- `bulk-email` service split into three methods
- `send` responsible for taking email content and recipients, parsing replacement info from the email content and using that to collate a recipient data object, and finally coordinating the send via the mailgun provider. Usable directly for use-cases such as test emails
- `processEmail` takes an email ID, loads it and coordinates sending related batches concurrently
- `processEmailBatch` takes an email_batch ID, loads it along with associated email_recipient records and passes the data through to the `send` function, updating the batch status as it's processed
- `processEmail` and `processEmailBatch` take IDs rather than objects ready for future use by job-queues, it's best to keep job parameters as minimal as possible
- refactored `mega` service
- modified `getEmailData` to collate email content (from/reply-to/subject/html/plaintext) rather than being responsible for dealing with replacements and mailgun-specific replacement formats
- used for generating email content before storing in the email table, and when sending test emails
- from/reply-to calculation moved out of the post-email-serializer into mega and extracted into separate functions used by `getEmailData`
- `sendTestEmail` updated to generate `EmailRecipient`-like objects for each email address so that appropriate data can be supplied to the updated `bulk-email.send` method
- `sendEmailJob` updated to create `email_batches` and associated `email_recipients` records then hand over processing to the `bulk-email` service
- member row fetching extracted into a separate function and used by `createEmailBatches`
- moved updating of email status from `mega` to the `bulk-email` service, keeps concept of Successful/FailedBatch internal to the `bulk-email` service
2020-09-24 11:35:29 +03:00
|
|
|
recipientProperty: `member_${recipientProperty}`,
|
2020-04-20 14:24:05 +03:00
|
|
|
fallback
|
|
|
|
});
|
|
|
|
}
|
|
|
|
}
|
Refactor mega service to use stored email content and batch/recipient records
no issue
- store raw content in email record
- keep any replacement strings in the html/plaintext content so that it can be used when sending email rather than needing to re-serialize the post content which may have changed
- split post email serializer into separate serialization and replacement parsing functions
- serialization now returns any email content that is derived from the post content (subject/html/plaintext) rather than post content plus replacements
- `parseReplacements` has been split out so that it can be run against email content rather than a post, this allows mega and the email preview service to work with the stored email content
- move mailgun-specific functionality into the mailgun provider
- previously mailgun-specific behaviour was spread across the post email serializer, mega, and bulk-email service
- the per-batch `send` functionality was moved from the `bulk-email` service to the mailgun provider and updated to take email content, recipient info, and replacement info so that all mailgun-specific headers and replacement formatting can be handled in one place
- exposes the `BATCH_SIZE` constant because batch sizes are limited to what the provider allows
- `bulk-email` service split into three methods
- `send` responsible for taking email content and recipients, parsing replacement info from the email content and using that to collate a recipient data object, and finally coordinating the send via the mailgun provider. Usable directly for use-cases such as test emails
- `processEmail` takes an email ID, loads it and coordinates sending related batches concurrently
- `processEmailBatch` takes an email_batch ID, loads it along with associated email_recipient records and passes the data through to the `send` function, updating the batch status as it's processed
- `processEmail` and `processEmailBatch` take IDs rather than objects ready for future use by job-queues, it's best to keep job parameters as minimal as possible
- refactored `mega` service
- modified `getEmailData` to collate email content (from/reply-to/subject/html/plaintext) rather than being responsible for dealing with replacements and mailgun-specific replacement formats
- used for generating email content before storing in the email table, and when sending test emails
- from/reply-to calculation moved out of the post-email-serializer into mega and extracted into separate functions used by `getEmailData`
- `sendTestEmail` updated to generate `EmailRecipient`-like objects for each email address so that appropriate data can be supplied to the updated `bulk-email.send` method
- `sendEmailJob` updated to create `email_batches` and associated `email_recipients` records then hand over processing to the `bulk-email` service
- member row fetching extracted into a separate function and used by `createEmailBatches`
- moved updating of email status from `mega` to the `bulk-email` service, keeps concept of Successful/FailedBatch internal to the `bulk-email` service
2020-09-24 11:35:29 +03:00
|
|
|
}
|
2020-04-20 14:24:05 +03:00
|
|
|
});
|
|
|
|
|
|
|
|
return replacements;
|
|
|
|
};
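The parsing above can be exercised in isolation. The `%%{property, "fallback"}%%` token format and the regex below are simplified approximations, shown only to illustrate the shape of the replacement objects that get collected; the real regexes live in the serializer:

```javascript
// Simplified stand-in for the serializer's replacement-token matching.
// The token format and this regex are assumptions for illustration.
const TOKEN_REGEX = /%%\{(?<recipientProperty>\w+)(?:, *"(?<fallback>[^"]*)")?\}%%/g;
const ALLOWED = ['first_name', 'name'];

function findReplacements(content) {
    const replacements = [];
    let match;
    while ((match = TOKEN_REGEX.exec(content)) !== null) {
        const {recipientProperty, fallback} = match.groups;
        if (ALLOWED.includes(recipientProperty)) {
            replacements.push({
                id: `replacement_${replacements.length + 1}`,
                match: match[0],
                recipientProperty: `member_${recipientProperty}`,
                fallback
            });
        }
    }
    return replacements;
}
```

Because the tokens survive in the stored html/plaintext, this parsing can run against an email record later without re-serializing the post.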

const getTemplateSettings = async () => {
    const templateSettings = {
        headerImage: settingsCache.get('newsletter_header_image'),
        showHeaderIcon: settingsCache.get('newsletter_show_header_icon') && settingsCache.get('icon'),
        showHeaderTitle: settingsCache.get('newsletter_show_header_title'),
        showFeatureImage: settingsCache.get('newsletter_show_feature_image'),
        titleFontCategory: settingsCache.get('newsletter_title_font_category'),
        titleAlignment: settingsCache.get('newsletter_title_alignment'),
        bodyFontCategory: settingsCache.get('newsletter_body_font_category'),
        showBadge: settingsCache.get('newsletter_show_badge'),
        footerContent: settingsCache.get('newsletter_footer_content'),
        accentColor: settingsCache.get('accent_color'),
        labsFeatureImageMeta: labs.isSet('featureImageMeta')
    };

    if (templateSettings.headerImage) {
        if (isUnsplashImage(templateSettings.headerImage)) {
            // Unsplash images have a minimum size so assuming 1200px is safe
            const unsplashUrl = new URL(templateSettings.headerImage);
            unsplashUrl.searchParams.set('w', '1200');

            templateSettings.headerImage = unsplashUrl.href;
            templateSettings.headerImageWidth = 600;
        } else {
            const {imageSize} = require('../../lib/image');
            try {
                const size = await imageSize.getImageSizeFromUrl(templateSettings.headerImage);

                if (size.width >= 600) {
                    // keep original image, just set a fixed width
                    templateSettings.headerImageWidth = 600;
                }

                if (isLocalContentImage(templateSettings.headerImage, urlUtils.getSiteUrl())) {
                    // we can safely request a 1200px image - Ghost will serve the original if it's smaller
                    templateSettings.headerImage = templateSettings.headerImage.replace(/\/content\/images\//, '/content/images/size/w1200/');
                }
            } catch (err) {
                // log and proceed. Using original header image without fixed width isn't fatal.
                logging.error(err);
            }
        }
    }

    return templateSettings;
};
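Both `getTemplateSettings` above and `serialize` below resize Unsplash images the same way, by setting the `w` search param on the URL. Extracted as a standalone helper (a hypothetical refactor, not present in this file):

```javascript
// Resize an Unsplash image URL via its `w` search param, mirroring the
// inline logic above. Hypothetical helper for illustration only.
function resizeUnsplashUrl(imageUrl, width) {
    const unsplashUrl = new URL(imageUrl);
    unsplashUrl.searchParams.set('w', String(width));
    return unsplashUrl.href;
}
```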

const serialize = async (postModel, options = {isBrowserPreview: false, apiVersion: 'v4'}) => {
    const post = await serializePostModel(postModel, options.apiVersion);

    const timezone = settingsCache.get('timezone');
    const momentDate = post.published_at ? moment(post.published_at) : moment();
    post.published_at = momentDate.tz(timezone).format('DD MMM YYYY');

    post.authors = post.authors && post.authors.map(author => author.name).join(',');
    if (post.posts_meta) {
        post.email_subject = post.posts_meta.email_subject;
    }

    // we use post.excerpt as a hidden piece of text that is picked up by some email
    // clients as a "preview" when listing emails. Our current plaintext/excerpt
    // generation outputs links as "Link [https://url/]" which isn't desired in the preview
    if (!post.custom_excerpt && post.excerpt) {
        post.excerpt = post.excerpt.replace(/\s\[http(.*?)\]/g, '');
    }

    post.html = mobiledocLib.mobiledocHtmlRenderer.render(JSON.parse(post.mobiledoc), {target: 'email'});
    // same options as used in Post model for generating plaintext but without `wordwrap: 80`
    // to avoid replacement strings being split across lines and for mail clients to handle
    // word wrapping based on user preferences
    post.plaintext = htmlToText.fromString(post.html, {
        wordwrap: false,
        ignoreImage: true,
        hideLinkHrefIfSameAsText: true,
        preserveNewlines: true,
        returnDomByDefault: true,
        uppercaseHeadings: false
    });

    // Outlook will render feature images at full-size breaking the layout.
    // Content images fix this by rendering max 600px images - do the same for feature image here
    if (post.feature_image) {
        if (isUnsplashImage(post.feature_image)) {
            // Unsplash images have a minimum size so assuming 1200px is safe
            const unsplashUrl = new URL(post.feature_image);
            unsplashUrl.searchParams.set('w', '1200');

            post.feature_image = unsplashUrl.href;
            post.feature_image_width = 600;
        } else {
            const {imageSize} = require('../../lib/image');
            try {
                const size = await imageSize.getImageSizeFromUrl(post.feature_image);

                if (size.width >= 600) {
                    // keep original image, just set a fixed width
                    post.feature_image_width = 600;
                }

                if (isLocalContentImage(post.feature_image, urlUtils.getSiteUrl())) {
                    // we can safely request a 1200px image - Ghost will serve the original if it's smaller
                    post.feature_image = post.feature_image.replace(/\/content\/images\//, '/content/images/size/w1200/');
                }
            } catch (err) {
                // log and proceed. Using original feature_image without fixed width isn't fatal.
                logging.error(err);
            }
        }
    }

    const templateSettings = await getTemplateSettings();

    let htmlTemplate = template({post, site: getSite(), templateSettings});

    if (options.isBrowserPreview) {
        const previewUnsubscribeUrl = createUnsubscribeUrl(null);
        htmlTemplate = htmlTemplate.replace('%recipient.unsubscribe_url%', previewUnsubscribeUrl);
    }

    // Inline css to style attributes, turn on support for pseudo classes.
    const juiceOptions = {inlinePseudoElements: true};
    let juicedHtml = juice(htmlTemplate, juiceOptions);

    // convert juiced HTML to a DOM-like interface for further manipulation
    // happens after inlining of CSS so we can change element types without worrying about styling
    let _cheerio = cheerio.load(juicedHtml);
    // force all links to open in new tab
    _cheerio('a').attr('target','_blank');
    // convert figure and figcaption to div so that Outlook applies margins
    _cheerio('figure, figcaption').each((i, elem) => !!(elem.tagName = 'div'));
    juicedHtml = _cheerio.html();

    // Fix any unsupported chars in Outlook
    juicedHtml = juicedHtml.replace(/&apos;/g, '&#39;');

    // Clean up any unknown replacement strings to get our final content
    const {html, plaintext} = normalizeReplacementStrings({
        html: juicedHtml,
        plaintext: post.plaintext
    });

    return {
        subject: post.email_subject || post.title,
        html,
        plaintext
    };
};

module.exports = {
    serialize,
    createUnsubscribeUrl,
    parseReplacements
};
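`%recipient.unsubscribe_url%` in the browser-preview branch of `serialize` is Mailgun's recipient-variable syntax, which the provider expands per recipient at send time. A minimal sketch of that expansion (illustrative only, not Mailgun's implementation):

```javascript
// Expand Mailgun-style %recipient.x% variables for a single recipient.
// Sketch of what the provider does server-side; unknown variables are
// left untouched so they can be cleaned up or reported later.
function substituteRecipientVariables(content, variables) {
    return content.replace(/%recipient\.(\w+)%/g, (token, name) => {
        return name in variables ? String(variables[name]) : token;
    });
}
```

This is why the preview path has to substitute `unsubscribe_url` manually: in the browser there is no Mailgun send step to do it.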