add docker-compose + misc. irc bot changes (see commit msg) #1

Open: wants to merge 1 commit into master

2 changes: 2 additions & 0 deletions .env
@@ -0,0 +1,2 @@
irc:channel=#openmrstest
elasticsearch:host=http://elasticsearch:9200
3 changes: 2 additions & 1 deletion .gitignore
@@ -4,4 +4,5 @@
config.json
node_modules
logs
npm-debug.log
npm-debug.log
esdata
28 changes: 28 additions & 0 deletions .travis.yml
@@ -0,0 +1,28 @@
branches:
only:
- master
language: node_js
node_js:
- 4.3
- 5.6
- 5.7
compiler: clang-3.6

env:
- CXX=clang-3.6

addons:
apt:
sources:
- llvm-toolchain-precise-3.6
- ubuntu-toolchain-r-test
packages:
- clang-3.6
- g++-4.8
services:
- elasticsearch

before_install:
- npm install

script: npm test
6 changes: 4 additions & 2 deletions Dockerfile
@@ -1,3 +1,5 @@
FROM node:4-onbuild
FROM node:5-onbuild
RUN wget https://github.com/jwilder/dockerize/releases/download/v0.2.0/dockerize-linux-amd64-v0.2.0.tar.gz \
&& tar -C /usr/local/bin -xzvf dockerize-linux-amd64-v0.2.0.tar.gz

EXPOSE 3000
EXPOSE 3000
37 changes: 27 additions & 10 deletions README.md
@@ -1,15 +1,18 @@
# Record Scrum notes from the #openmrs IRC channel
[![Dependency Status](https://david-dm.org/djazayeri/openmrs-contrib-scrumbot.svg)](https://david-dm.org/djazayeri/openmrs-contrib-scrumbot) [![devDependency Status](https://david-dm.org/djazayeri/openmrs-contrib-scrumbot/dev-status.svg)](https://david-dm.org/djazayeri/openmrs-contrib-scrumbot#info=devDependencies)

You may configure settings such as the IRC channel and the Elasticsearch host in `.env` (in this working directory) or, optionally, in `config.json`. A sample `.env` file is provided; `config.js` holds the defaults.
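
As an illustration (not part of this change), here is how those layered settings resolve at runtime. The nested `config.get(...)` access assumes nconf's default `:` key separator, which is what the flat keys in the sample `.env` rely on:

``` js
// Minimal sketch: config.js (shown further down in this diff) wires up nconf,
// layering argv/env/.env/config.json over the built-in defaults.
var config = require("./config");

// Flat keys like "irc:channel" in .env come back as nested objects:
console.log(config.get("irc").channel);        // e.g. "#openmrstest"
console.log(config.get("elasticsearch").host); // e.g. "http://elasticsearch:9200"
```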

## Before building

npm install -g mocha

## Development

### ES (for a dev environment on OSX)
This requires [docker-compose][] for provisioning. For development, simply do:

$ docker run --name scrumbot-es -d -p 9200:9200 -p 9300:9300 elasticsearch -Des.network.bindHost=0.0.0.0

## Run just the listener (without the webapp)

$ node index --elasticsearch.host http://192.168.99.100:9200
@@ -30,28 +33,42 @@ If you do not have elasticsearch running on localhost:9200, then create a "confi
Then

$ npm start

And see the webapp running on http://localhost:3000

### Testing the docker build locally

$ docker run --name web -d -p 3000:3000 --link es:es -e "elasticsearch:host=http://es:9200" -e "irc:channel=#openmrstest" djazayeri/openmrs-scrumbot:1.0
``` shell
$ docker-compose run web
```

## Prod
## Production

Note: this approach uses Docker's legacy `--link` container networking, which will eventually be deprecated.

### ES for prod on a Digital Ocean one-click app Docker box (don't expose ElasticSearch to the outside world, since it isn't secured)
### ES for production on a Digital Ocean one-click app Docker box (don't expose Elasticsearch to the outside world, since it isn't secured)

**Order here matters.**

$ docker run -d --restart="unless-stopped" --name es -v "$PWD/esdata":/usr/share/elasticsearch/data -p 127.0.0.1:9200:9200 elasticsearch

``` shell
$ docker-compose -f docker-compose.override.yml -f docker-compose-prod.yml up -d
```
### Docker Hub builds automatically

Whenever you commit code to this repository it is automatically built as djazayeri/openmrs-contrib-scrumbot:latest
Whenever you commit code to this repository, it is automatically built as djazayeri/openmrs-contrib-scrumbot:latest.

### run webapp+bot on docker on Digital Ocean

// this is automated as ./update-web
``` shell
$ docker pull djazayeri/openmrs-contrib-scrumbot:latest
$ docker rm web
$ docker run -d -p 80:3000 --name web --link es:es -e "elasticsearch:host=http://es:9200" djazayeri/openmrs-contrib-scrumbot:latest
```

You may also use [docker-compose][] to handle this:

``` shell
$ docker-compose -f docker-compose.yml -f docker-compose-prod.yml up -d
```

[docker-compose]: https://docs.docker.com/compose/
3 changes: 2 additions & 1 deletion bin/www
@@ -7,6 +7,7 @@
var app = require('../server');
var debug = require('debug')('scrumbot:server');
var http = require('http');
var log = require('../log');

/**
* Get port from environment and store in Express.
@@ -86,5 +87,5 @@ function onListening() {
var bind = typeof addr === 'string'
? 'pipe ' + addr
: 'port ' + addr.port;
debug('Listening on ' + bind);
log.info('Listening on ' + bind);
}
4 changes: 3 additions & 1 deletion config.js
@@ -10,6 +10,8 @@ nconf
server: "chat.freenode.net",
channel: "#openmrs",
nick: "omrs-scrum-bot",
userName: "scrumbot",
realName: "OpenMRS Scrum Bot",
startListening: "!scrumon",
stopListening: "!scrumoff",
sayBuildFailures: "!scrumon"
@@ -22,4 +24,4 @@
}
});

module.exports = nconf;
module.exports = nconf;
8 changes: 4 additions & 4 deletions db.js
@@ -82,7 +82,7 @@ module.exports.thisWeekScrums = function () {
body: ejs.Request()
.query(ejs.RangeQuery("startTime", {gte: "now/w"}))
}).then(function (response) {
return _.pluck(response.hits.hits, "_source").reverse();
return _.map(response.hits.hits, "_source").reverse();
});
};

@@ -92,7 +92,7 @@ module.exports.scrumsWithIssue = function (key) {
body: ejs.Request()
.query(ejs.TermQuery("issues", key))
}).then(function (response) {
return _.pluck(response.hits.hits, "_source");
return _.map(response.hits.hits, "_source");
});
};

@@ -116,6 +116,6 @@ module.exports.scrumsBetween = function (startTime, endTime) {
)
.sort(ejs.Sort("startTime").desc())
}).then(function (response) {
return _.pluck(response.hits.hits, "_source");
return _.map(response.hits.hits, "_source");
});
}
}
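
The `_.pluck` → `_.map` replacements above go with the lodash upgrade to ^4.5.1 in package.json: lodash 4 removed `_.pluck`, and `_.map` with a property-name iteratee is the drop-in replacement. A minimal standalone sketch (the sample data here is hypothetical, not from this repo):

``` js
var _ = require("lodash"); // ^4.x, as in package.json below

var hits = [
    {_source: {startTime: 1, participants: ["darius"]}},
    {_source: {startTime: 2, participants: ["burke"]}}
];

// lodash 3: _.pluck(hits, "_source")
// lodash 4: _.map with a property name does the same thing
var sources = _.map(hits, "_source");
console.log(sources.length); // 2
```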
19 changes: 19 additions & 0 deletions docker-compose-prod.yml
@@ -0,0 +1,19 @@
version: "2"
services:
elasticsearch:
image: elasticsearch:latest
command: elasticsearch -Des.network.bindHost=0.0.0.0
restart: unless-stopped
ports:
- "127.0.0.1:9200:9200"
- "127.0.0.1:9300:9300"
volumes:
- "./esdata:/usr/share/elasticsearch/data"
web:
build:
context: .
ports:
- "80:3000"
env_file: "./.env"
restart: unless-stopped
entrypoint: dockerize -timeout 60s -wait tcp://elasticsearch:9200 -wait tcp://elasticsearch:9300 npm start
6 changes: 6 additions & 0 deletions docker-compose.override.yml
@@ -0,0 +1,6 @@
version: "2"
services:
elasticsearch:
ports:
– "9200:9200"
- "9300:9300"
17 changes: 17 additions & 0 deletions docker-compose.yml
@@ -0,0 +1,17 @@
version: "2"
services:
elasticsearch:
image: elasticsearch:latest
command: elasticsearch -Des.network.bindHost=0.0.0.0
restart: unless-stopped
volumes:
- "./esdata:/usr/share/elasticsearch/data"
web:
build: .
ports:
- "3000:3000"
links:
- elasticsearch
env_file: ./.env
entrypoint: dockerize -timeout 60s -wait tcp://elasticsearch:9200 -wait tcp://elasticsearch:9300 npm start
restart: unless-stopped
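
The `dockerize -timeout 60s -wait tcp://...` entrypoint in both compose files blocks `npm start` until Elasticsearch is accepting TCP connections, so the web container doesn't race it on startup. Roughly the same idea, sketched in plain Node for illustration only (the project itself uses the dockerize binary installed in the Dockerfile; the host/port here are the compose service defaults):

``` js
// Hypothetical sketch of what "dockerize -wait tcp://elasticsearch:9200" does:
// retry a TCP connect until it succeeds or the timeout elapses.
var net = require("net");

function waitForPort(host, port, timeoutMs, callback) {
    var deadline = Date.now() + timeoutMs;
    (function attempt() {
        var socket = net.connect(port, host);
        socket.once("connect", function () {
            socket.end();
            callback(null);
        });
        socket.once("error", function () {
            socket.destroy();
            if (Date.now() > deadline) {
                return callback(new Error("timed out waiting for " + host + ":" + port));
            }
            setTimeout(attempt, 1000); // retry once a second
        });
    })();
}

waitForPort("elasticsearch", 9200, 60000, function (err) {
    if (err) throw err;
    // safe to start the app here
});
```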
13 changes: 9 additions & 4 deletions ircbot.js
@@ -10,6 +10,8 @@ var bamboo = require("./bamboo");
var SERVER = config.get("irc").server;
var CHANNEL = config.get("irc").channel;
var NICK = config.get("irc").nick;
var USERNAME = config.get("irc").userName;
var REALNAME = config.get("irc").realName;
var START_LISTENING = config.get("irc").startListening;
var STOP_LISTENING = config.get("irc").stopListening;
var SAY_BUILD_FAILURES = config.get("irc").sayBuildFailures;
@@ -22,8 +24,11 @@ var postMessage = function (text) {
client.say(CHANNEL, text);
};

var client = new irc.Client(SERVER, NICK, {channels: [CHANNEL]});
log.info("Connected to " + CHANNEL);
var client = new irc.Client(SERVER, NICK, {userName: USERNAME, realName: REALNAME, channels: [CHANNEL]});

client.addListener('names', function(channel, nicks) {
log.info("Connected to " + CHANNEL);
});

client.addListener('error', function (message) {
log.error(message);
@@ -74,12 +79,12 @@ client.addListener('message', function (from, to, message) {
bamboo.summarizeBrokenBuilds().then(function (summary) {
_.each(summary, function (line) {
postMessage(line);
})
});
});
}
});

module.exports.shouldStartListening = shouldStartListening;
module.exports.shouldStopListening = shouldStopListening;
module.exports.shouldListBuildFailures = shouldListBuildFailures;
module.exports.postMessage = postMessage;
module.exports.postMessage = postMessage;
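
For context on the `names` listener above: node-irc emits `names` once the channel's member list arrives after a successful join, so the "Connected" log now fires only after the bot is actually in the channel, rather than immediately after constructing the client. An equivalent variant, sketched under the assumption that node-irc's `join` event passes `(channel, nick)`:

``` js
// Hypothetical alternative: log once the bot itself has joined the channel.
client.addListener('join', function (channel, nick) {
    if (nick === NICK && channel === CHANNEL) {
        log.info("Connected to " + CHANNEL);
    }
});
```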
4 changes: 2 additions & 2 deletions package.json
@@ -25,9 +25,9 @@
"fs": "0.0.2",
"irc": "^0.4.0",
"jade": "^1.11.0",
"lodash": "^3.10.1",
"lodash": "^4.5.1",
"log4js": "^0.6.29",
"moment": "^2.11.1",
"moment": "^2.11.2",
"morgan": "^1.6.1",
"nconf": "^0.8.2",
"request": "^2.67.0",
6 changes: 3 additions & 3 deletions processor.js
@@ -25,13 +25,13 @@ module.exports = {
raw: conversation,
startTime: _.first(conversation).timestamp,
endTime: _.last(conversation).timestamp,
participants: _.uniq(_.pluck(conversation, "from")),
issues: _.chain(processed).pluck("issues").flatten().remove(null).uniq().value()
participants: _.uniq(_.map(conversation, "from")),
issues: _.chain(processed).map("issues").flatten().remove(null).uniq().value()
});
}
else {
log.warn("Didn't capture any conversation between START and STOP message");
}
conversation = null;
}
};
};