Merge pull request #9 from appirio-tech/dev
Release 1.1.0 for handling failed messages as scheduled worker
vikasrohit authored Apr 24, 2017
2 parents 6be4f05 + f646e94 commit c854e25
Showing 14 changed files with 434 additions and 73 deletions.
6 changes: 6 additions & 0 deletions consumer/.ebextensions/01-environment-variables.config
@@ -26,9 +26,15 @@ option_settings:
- namespace: aws:elasticbeanstalk:application:environment
option_name: RABBITMQ_PROJECTS_EXCHANGE
value: dev.projects
- namespace: aws:elasticbeanstalk:application:environment
option_name: RABBITMQ_CONNECT2SF_EXCHANGE
value: dev.tc.connect2sf
- namespace: aws:elasticbeanstalk:application:environment
option_name: QUEUE_PROJECTS
value: dev.project.service
- namespace: aws:elasticbeanstalk:application:environment
option_name: QUEUE_CONNECT2SF
value: dev.tc.connect2sf.exclusive
- namespace: aws:elasticbeanstalk:application:environment
option_name: IDENTITY_SERVICE_URL
value: TBD
12 changes: 8 additions & 4 deletions consumer/Dockerfile
@@ -5,6 +5,8 @@ LABEL description="Topcoder Salesforce Integration"
RUN apt-get update && \
apt-get upgrade -y

#RUN apt-get install cron -y


# Create app directory
RUN mkdir -p /usr/src/app
@@ -15,10 +17,12 @@ COPY . /usr/src/app
# Install app dependencies
RUN npm install

RUN npm install -g forever
RUN npm install -g forever babel-cli

EXPOSE 80
#RUN crontab config/scheduler-cron

CMD forever -c "npm start" --uid "consumer" .
#RUN service cron start

EXPOSE 80

#CMD npm start
CMD forever -c "npm start" --uid "consumer" .
22 changes: 15 additions & 7 deletions consumer/README.md
@@ -20,7 +20,7 @@ Env variable: `LOG_LEVEL`
- **rabbitmqURL**
The rabbitmq URL.
Create a free account here https://www.cloudamqp.com/ and create a new instance in any region.
You can get URL by clicking on queue details button.
You can get the URL by clicking on the queue details button. For deployment in AWS, please make sure that this instance is launched in a VPC that the target AWS server can communicate with.
Env variable: `RABBITMQ_URL`

- **ownerId**
@@ -86,7 +86,7 @@ You can use the existing cert.pem from `config` directory.
Or generate a new certificate and key using a command:
`openssl req -newkey rsa:2048 -new -nodes -x509 -days 3650 -keyout key.pem -out cert.pem`

Private key of your certificate is read from environment variable, instead of reading from the config directory. So please make sure you replace all new line characters with `\n` before setting it in the environment variable. Application would add newline characters back to the key when using it to sign the requests.
**The private key of your certificate is read from an environment variable instead of from the config directory, so please make sure you replace all newline characters with `\n` before setting it in the environment variable. The application adds the newline characters back to the key when using it to sign requests.**
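
A rough sketch of that round trip (the variable name `SALESFORCE_PRIVATE_KEY` is only an illustrative assumption, not necessarily the key the app reads):

```js
// Illustrative only: flatten key.pem before exporting it
// (real newlines -> the two literal characters "\n").
const fs = require('fs');
const flattened = fs.readFileSync('key.pem', 'utf8').replace(/\n/g, '\\n');
console.log(flattened); // paste this value into the environment variable

// What the app conceptually does on the other side before signing requests:
const privateKey = (process.env.SALESFORCE_PRIVATE_KEY || '').replace(/\\n/g, '\n');
```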

![Alt text](https://monosnap.com/file/tT9ZZXUH1aa1j7cFzYxaV9RjmHWCum.png)
Click Save
@@ -224,11 +224,19 @@ Check the Lead details in Salesforce
![Alt text](https://monosnap.com/file/PdMF97k18cBGeZjR9qOkkBe1AjYw2n.png)
Lead is removed from the campaign

## Deployment Checklist
1. `AppXpressConfig` table exists in DynamoDB with `dripcampaignId`
2. Make sure the configured rabbitmq exchange and queue are created appropriately in CloudAMQP
3. There should be a proper mapping between the exchange and the queue specified in the configuration (see the sketch after this list)
4. Grant permission for the app once, with the configured user, using the URL https://login.salesforce.com/services/oauth2/authorize?client_id=[clientId]&redirect_uri=https://login.salesforce.com&response_type=code
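
A minimal sketch of the exchange-to-queue mapping from step 3, using `amqplib` much like the scheduled worker's `assertExchangeQueues` does; the names in the usage comment are the dev values from the configuration files and would differ per environment:

```js
const amqp = require('amqplib');

// Assert the exchange and the queue, then bind them with the given routing key.
async function assertBinding(rabbitmqUrl, exchange, queue, routingKey) {
  const connection = await amqp.connect(rabbitmqUrl);
  const channel = await connection.createChannel();
  await channel.assertExchange(exchange, 'topic', { durable: true });
  await channel.assertQueue(queue, { durable: true });
  await channel.bindQueue(queue, exchange, routingKey);
  await connection.close();
}

// e.g. with the dev values used elsewhere in this repo:
// assertBinding(process.env.RABBITMQ_URL, 'dev.tc.connect2sf', 'dev.tc.connect2sf.exclusive', 'project.draft-created');
```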

Notes on Error Handling.
UnprocessableError is thrown if operation cannot be completed.
## CI
* All changes to the `dev` branch will be built and deployed to the AWS Elastic Beanstalk environment `tc-connect2sf-dev`
* All changes to the `master` branch will be built and deployed to the AWS Elastic Beanstalk environment `tc-connect2sf-prod`

## Notes on Error Handling
`UnprocessableError` is thrown if an operation cannot be completed.
For example: a duplicate project id was added to the queue, the Lead cannot be found, etc.
In such a situation, the message from rabbitmq will be marked as ACK (removed).
If we did not remove it from the queue, the message would be stuck there forever.

For any other type of error the message from rabbitmq will be marked as ACK as well; however, it is re-queued into another queue for later inspection. Right now the message content is published to the same rabbitmq exchange (configured as mentioned in the Configuration section) with the routing key `connect2sf.failed`. So we have to map that exchange and routing key combination to a queue to which no consumer is listening, e.g. `tc-connect2sf.failed` is used in the dev environment. We can then inspect the messages in this queue via the rabbitmq manager UI to check whether any messages failed and what the id of the failed project was. We can either remove those messages from the queue, if we are going to add those leads manually in Salesforce, or move them back to the original queue after fixing the deployed environment.
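
A minimal sketch of that failure path, assuming an `amqplib` channel; the helper name `republishFailed` is purely illustrative and not an actual function in this codebase:

```js
// Illustrative only: ack the message so it leaves the original queue, then
// republish its content to the configured connect2sf exchange with the
// `connect2sf.failed` routing key, so it lands in the inspection queue
// (e.g. `tc-connect2sf.failed` in the dev environment).
function republishFailed(channel, exchangeName, msg) {
  channel.ack(msg);
  channel.publish(
    exchangeName,
    'connect2sf.failed',
    Buffer.from(msg.content.toString()),
    { persistent: true } // assumption: keep the failed copy durable
  );
}
```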
2 changes: 1 addition & 1 deletion consumer/config/constants.js
@@ -4,6 +4,6 @@ export const EVENT = {
PROJECT_DRAFT_CREATED: 'project.draft-created',
PROJECT_UPDATED: 'project.updated',
PROJECT_DELETED: 'project.deleted',
CONNECT_TO_SF_FAILED: 'connect2sf.failed'
FAILED_SUFFIX: '.failed'
},
};
4 changes: 3 additions & 1 deletion consumer/config/custom-environment-variables.json
@@ -21,8 +21,10 @@
},
"rabbitmq" : {
"projectsExchange" : "RABBITMQ_PROJECTS_EXCHANGE",
"connect2sfExchange" : "RABBITMQ_CONNECT2SF_EXCHANGE",
"queues": {
"project": "QUEUE_PROJECTS"
"project": "QUEUE_PROJECTS",
"connect2sf": "QUEUE_CONNECT2SF"
}
}
}
1 change: 1 addition & 0 deletions consumer/config/scheduler-cron
@@ -0,0 +1 @@
*/5 * * * * babel-node /usr/src/app/src/scheduled-worker.js
4 changes: 3 additions & 1 deletion consumer/config/test.json
@@ -20,8 +20,10 @@
},
"rabbitmq" : {
"projectsExchange" : "dev.projects",
"connect2sfExchange": "dev.tc.connect2sf",
"queues": {
"project": "dev.project.service"
"project": "dev.project.service",
"connect2sf": "dev.tc.connect2sf.exclusive"
}
}
}
1 change: 1 addition & 0 deletions consumer/package.json
@@ -46,6 +46,7 @@
"joi": "^9.0.4",
"jsonwebtoken": "^7.1.7",
"lodash": "^4.14.2",
"node-cron": "^1.1.3",
"superagent": "^2.1.0",
"superagent-promise": "^1.1.0",
"winston": "^2.2.0"
170 changes: 170 additions & 0 deletions consumer/src/scheduled-worker.js
@@ -0,0 +1,170 @@
/**
* The main app entry
*/

import config from 'config';
import amqp from 'amqplib';
import _ from 'lodash';
import logger from './common/logger';
import ConsumerService from './services/ConsumerService';
import { EVENT } from '../config/constants';

const debug = require('debug')('app:worker');

const FETCH_LIMIT = 10;

let connection;
process.once('SIGINT', () => {
debug('Received SIGINT...closing connection...')
try {
connection.close();
} catch (ignore) { // eslint-ignore-line
logger.logFullError(ignore)
}
process.exit();
});

let EVENT_HANDLERS = {
[EVENT.ROUTING_KEY.PROJECT_DRAFT_CREATED]: ConsumerService.processProjectCreated
// [EVENT.ROUTING_KEY.PROJECT_UPDATED]: ConsumerService.processProjectUpdated
}

function close() {
console.log('closing self after processing messages...')
try {
setTimeout(connection.close.bind(connection), 30000);
} catch (ignore) { // eslint-ignore-line
logger.logFullError(ignore)
}
}

export function initHandlers(handlers) {
EVENT_HANDLERS = handlers;
}

/**
* Processes the given message and acks/nacks the channel
* @param {Object} channel the target channel
* @param {Object} msg the message to be processed
*/
export function processMessage(channel, msg) {
return new Promise((resolve, reject) => {
if (!msg) {
reject(new Error('Empty message. Ignoring'));
return;
}
debug(`Consuming message in \n${msg.content}`);
const key = _.get(msg, 'fields.routingKey');
debug('Received Message', key, msg.fields);

let handler;
let data;
try {
handler = EVENT_HANDLERS[key];
if (!_.isFunction(handler)) {
logger.error(`Unknown message type: ${key}, NACKing... `);
reject(new Error(`Unknown message type: ${key}`));
return;
}
data = JSON.parse(msg.content.toString());
} catch (ignore) {
logger.info(ignore);
logger.error('Invalid message. Ignoring');
resolve('Invalid message. Ignoring');
return;
}
return handler(logger, data).then(() => {
resolve(msg);
return;
})
.catch((e) => {
// logger.logFullError(e, `Error processing message`);
if (e.shouldAck) {
debug("Resolving for Unprocessable Error in handler...");
resolve(msg);
} else {
debug("Rejecting promise for error in msg processing...")
reject(new Error('Error processing message'));
}
});
})
}

function assertExchangeQueues(channel, exchangeName, queue) {
channel.assertExchange(exchangeName, 'topic', { durable: true });
channel.assertQueue(queue, { durable: true });
const bindings = _.keys(EVENT_HANDLERS);
const bindingPromises = _.map(bindings, rk =>
channel.bindQueue(queue, exchangeName, rk));
debug('binding queue ' + queue + ' to exchange: ' + exchangeName);
return Promise.all(bindingPromises);
}

/**
* Start the worker
*/
export async function start() {
try {
console.log("Scheduled Worker Connecting to RabbitMQ: " + config.rabbitmqURL.substr(-5));
connection = await amqp.connect(config.rabbitmqURL);
connection.on('error', (e) => {
logger.logFullError(e, `ERROR IN CONNECTION`);
})
connection.on('close', () => {
debug('Before closing connection...')
})
debug('created connection successfully with URL: ' + config.rabbitmqURL);
const connect2sfChannel = await connection.createConfirmChannel();
debug('Channel created for consuming failed messages ...');
connect2sfChannel.prefetch(FETCH_LIMIT);
assertExchangeQueues(
connect2sfChannel,
config.rabbitmq.connect2sfExchange,
config.rabbitmq.queues.connect2sf
).then(() => {
debug('Asserted all required exchanges and queues');
let counter = 0;
_.range(1, 11).forEach(() => {
return connect2sfChannel.get(config.rabbitmq.queues.connect2sf).
then((msg) => {
if (msg) {
return processMessage(
connect2sfChannel,
msg
).then((responses) => {
counter++;
debug('Processed message');
connect2sfChannel.ack(msg);
if (counter >= FETCH_LIMIT) {
close();
}
}).catch((e) => {
counter++;
debug('Processed message with Error');
connect2sfChannel.nack(msg);
logger.logFullError(e, `Unable to process one of the messages`);
if (counter >= FETCH_LIMIT) {
close();
}
})
} else {
counter++;
debug('Processed Empty message');
if (counter >= FETCH_LIMIT) {
close();
}
}
}).catch(() => {
console.log('get failed to consume')
})
})
})

} catch (e) {
logger.logFullError(e, `Unable to connect to RabbitMQ`);
}
}

if (!module.parent) {
start();
}