Commit

Merge branch 'linkedin:master' into master
shirshanka authored Dec 8, 2021
2 parents bbd44ac + 1a5121a commit 192e7eb
Showing 66 changed files with 1,085 additions and 562 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -120,6 +120,7 @@ Here are the companies that have officially adopted DataHub. Please feel free to
- [LinkedIn](http://linkedin.com)
- [Peloton](https://www.onepeloton.com)
- [Saxo Bank](https://www.home.saxo)
- [Stash](https://www.stash.com)
- [Shanghai HuaRui Bank](https://www.shrbank.com)
- [ThoughtWorks](https://www.thoughtworks.com)
- [TypeForm](http://typeform.com)
@@ -154,7 +154,7 @@ private boolean isAuthorized(@Nonnull TagUpdateInput update, @Nonnull QueryConte
return AuthorizationUtils.isAuthorized(
context.getAuthorizer(),
context.getAuthentication().getActor().toUrnStr(),
PoliciesConfig.DATASET_PRIVILEGES.getResourceType(),
PoliciesConfig.TAG_PRIVILEGES.getResourceType(),
update.getUrn(),
orPrivilegeGroups);
}
9 changes: 5 additions & 4 deletions docker/airflow/local_airflow.md
@@ -1,7 +1,7 @@
# Running Airflow locally with DataHub

## Introduction
This document describes how you can run Airflow side-by-side with DataHub's docker images to test out Airflow lineage with DataHub.
This document describes how you can run Airflow side-by-side with DataHub's quickstart docker images to test out Airflow lineage with DataHub.
This offers a much easier way to try out Airflow with DataHub than configuring the containers by hand and setting up the configuration and network connectivity between the two systems yourself.

## Pre-requisites
@@ -11,6 +11,7 @@ docker info | grep Memory
> Total Memory: 7.775GiB
```
- Quickstart: Ensure that you followed [quickstart](../../docs/quickstart.md) to get DataHub up and running.

## Step 1: Set up your Airflow area
- Create an area to host your airflow installation
@@ -20,7 +21,7 @@
```
mkdir -p airflow_install
cd airflow_install
# Download docker-compose
# Download docker-compose file
curl -L 'https://raw.githubusercontent.com/linkedin/datahub/master/docker/airflow/docker-compose.yaml' -o docker-compose.yaml
# Create dags directory
mkdir -p dags
@@ -94,7 +95,7 @@ Default username and password is:
airflow:airflow
```

## Step 4: Register DataHub connection (hook) to Airflow
## Step 3: Register DataHub connection (hook) to Airflow

```
docker exec -it `docker ps | grep webserver | cut -d " " -f 1` airflow connections add --conn-type 'datahub_rest' 'datahub_rest_default' --conn-host 'http://datahub-gms:8080'
@@ -111,7 +112,7 @@ Successfully added `conn_id`=datahub_rest_default : datahub_rest://:@http://data
- Note: This is why Airflow needs to be able to reach the host `datahub-gms` (the container running the datahub-gms image), and it is why we needed to connect the Airflow containers to the `datahub_network` using our custom docker-compose file. A couple of sanity checks follow below.
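
A minimal sketch of those checks, assuming the quickstart network is named `datahub_network` as above (the `/config` endpoint is an assumption; any GMS endpoint that returns a 200 would do):

```
# List the containers attached to the quickstart network
docker network inspect datahub_network --format '{{range .Containers}}{{.Name}} {{end}}'

# From inside the Airflow webserver, verify that datahub-gms resolves and responds
docker exec -it `docker ps | grep webserver | cut -d " " -f 1` curl -s http://datahub-gms:8080/config
```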


## Step 3: Find the DAGs and run it
## Step 4: Find the DAGs and run it
Navigate to the Airflow UI to find the sample Airflow DAG we just brought in

![Find the DAG](../../docs/imgs/airflow/find_the_dag.png)
2 changes: 1 addition & 1 deletion docker/datahub-gms/Dockerfile
@@ -14,7 +14,7 @@ RUN apk --no-cache --update-cache --available upgrade \
else \
echo >&2 "Unsupported architecture $(arch)" ; exit 1; \
fi \
&& apk --no-cache add tar curl openjdk8-jre \
&& apk --no-cache add tar curl openjdk8-jre bash \
&& curl https://repo1.maven.org/maven2/org/eclipse/jetty/jetty-runner/9.4.20.v20190813/jetty-runner-9.4.20.v20190813.jar --output jetty-runner.jar \
&& curl https://repo1.maven.org/maven2/org/eclipse/jetty/jetty-jmx/9.4.20.v20190813/jetty-jmx-9.4.20.v20190813.jar --output jetty-jmx.jar \
&& curl https://repo1.maven.org/maven2/org/eclipse/jetty/jetty-util/9.4.20.v20190813/jetty-util-9.4.20.v20190813.jar --output jetty-util.jar \
61 changes: 36 additions & 25 deletions docker/datahub-gms/start.sh
@@ -1,18 +1,13 @@
#!/bin/sh

#!/bin/bash
set -x
# Add default URI (http) scheme if needed
if ! echo $NEO4J_HOST | grep -q "://" ; then
NEO4J_HOST="http://$NEO4J_HOST"
fi

if [[ -z $ELASTICSEARCH_USERNAME ]]; then
ELASTICSEARCH_HOST_URL=$ELASTICSEARCH_HOST
else
if [[ -z $ELASTICSEARCH_AUTH_HEADER ]]; then
ELASTICSEARCH_HOST_URL=$ELASTICSEARCH_USERNAME:$ELASTICSEARCH_PASSWORD@$ELASTICSEARCH_HOST
else
ELASTICSEARCH_HOST_URL=$ELASTICSEARCH_HOST
fi
if [[ ! -z $ELASTICSEARCH_USERNAME ]] && [[ -z $ELASTICSEARCH_AUTH_HEADER ]]; then
AUTH_TOKEN=$(echo -ne "$ELASTICSEARCH_USERNAME:$ELASTICSEARCH_PASSWORD" | base64 --wrap 0)
ELASTICSEARCH_AUTH_HEADER="Authorization:Basic $AUTH_TOKEN"
fi

# Add default header if needed
@@ -26,9 +21,18 @@ else
ELASTICSEARCH_PROTOCOL=http
fi

WAIT_FOR_NEO4J=""
WAIT_FOR_EBEAN=""
if [[ $SKIP_EBEAN_CHECK != true ]]; then
WAIT_FOR_EBEAN=" -wait tcp://$EBEAN_DATASOURCE_HOST "
fi

WAIT_FOR_KAFKA=""
if [[ $SKIP_KAFKA_CHECK != true ]]; then
WAIT_FOR_KAFKA=" -wait tcp://$(echo $KAFKA_BOOTSTRAP_SERVER | sed 's/,/ -wait tcp:\/\//g') "
fi

if [[ $GRAPH_SERVICE_IMPL != elasticsearch ]]; then
WAIT_FOR_NEO4J=""
if [[ $GRAPH_SERVICE_IMPL != elasticsearch ]] && [[ $SKIP_NEO4J_CHECK != true ]]; then
WAIT_FOR_NEO4J=" -wait $NEO4J_HOST "
fi

@@ -42,16 +46,23 @@ if [[ $ENABLE_PROMETHEUS == true ]]; then
PROMETHEUS_AGENT="-javaagent:jmx_prometheus_javaagent.jar=4318:/datahub/datahub-gms/scripts/prometheus-config.yaml "
fi

dockerize \
-wait tcp://$EBEAN_DATASOURCE_HOST \
-wait tcp://$(echo $KAFKA_BOOTSTRAP_SERVER | sed 's/,/ -wait tcp:\/\//g') \
-wait $ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST_URL:$ELASTICSEARCH_PORT -wait-http-header "$ELASTICSEARCH_AUTH_HEADER" \
$WAIT_FOR_NEO4J \
-timeout 240s \
java $JAVA_OPTS $JMX_OPTS \
$OTEL_AGENT \
$PROMETHEUS_AGENT \
-jar /jetty-runner.jar \
--jar jetty-util.jar \
--jar jetty-jmx.jar \
/datahub/datahub-gms/bin/war.war
COMMON="
$WAIT_FOR_EBEAN \
$WAIT_FOR_KAFKA \
$WAIT_FOR_NEO4J \
-timeout 240s \
java $JAVA_OPTS $JMX_OPTS \
$OTEL_AGENT \
$PROMETHEUS_AGENT \
-jar /jetty-runner.jar \
--jar jetty-util.jar \
--jar jetty-jmx.jar \
/datahub/datahub-gms/bin/war.war"

if [[ $SKIP_ELASTICSEARCH_CHECK != true ]]; then
dockerize \
-wait $ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT -wait-http-header "$ELASTICSEARCH_AUTH_HEADER" \
$COMMON
else
dockerize $COMMON
fi
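
With this refactor, each dependency wait can be toggled independently via the `SKIP_*_CHECK` variables. A minimal sketch of running GMS with some checks disabled — the variable names come from the script above, while the image name and endpoints are illustrative assumptions:

```
# Hypothetical run: skip the Ebean (relational DB) and Neo4j waits,
# but still wait for Kafka and Elasticsearch before starting the war
docker run \
  -e SKIP_EBEAN_CHECK=true \
  -e SKIP_NEO4J_CHECK=true \
  -e KAFKA_BOOTSTRAP_SERVER=broker-1:9092,broker-2:9092 \
  -e ELASTICSEARCH_HOST=elasticsearch \
  -e ELASTICSEARCH_PORT=9200 \
  linkedin/datahub-gms
```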
2 changes: 1 addition & 1 deletion docker/datahub-mae-consumer/Dockerfile
@@ -3,7 +3,7 @@ ARG APP_ENV=prod

FROM adoptopenjdk/openjdk8:alpine-jre as base
ENV DOCKERIZE_VERSION v0.6.1
RUN apk --no-cache add curl tar wget \
RUN apk --no-cache add curl tar wget bash \
&& wget https://github.com/open-telemetry/opentelemetry-java-instrumentation/releases/download/v1.4.1/opentelemetry-javaagent-all.jar \
&& wget https://repo1.maven.org/maven2/io/prometheus/jmx/jmx_prometheus_javaagent/0.16.1/jmx_prometheus_javaagent-0.16.1.jar -O jmx_prometheus_javaagent.jar \
&& curl -L https://github.com/jwilder/dockerize/releases/download/$DOCKERIZE_VERSION/dockerize-linux-amd64-$DOCKERIZE_VERSION.tar.gz | tar -C /usr/local/bin -xzv
45 changes: 28 additions & 17 deletions docker/datahub-mae-consumer/start.sh
@@ -1,18 +1,13 @@
#!/bin/sh
#!/bin/bash

# Add default URI (http) scheme if needed
if ! echo $NEO4J_HOST | grep -q "://" ; then
NEO4J_HOST="http://$NEO4J_HOST"
fi

if [[ -z $ELASTICSEARCH_USERNAME ]]; then
ELASTICSEARCH_HOST_URL=$ELASTICSEARCH_HOST
else
if [[ -z $ELASTICSEARCH_AUTH_HEADER ]]; then
ELASTICSEARCH_HOST_URL=$ELASTICSEARCH_USERNAME:$ELASTICSEARCH_PASSWORD@$ELASTICSEARCH_HOST
else
ELASTICSEARCH_HOST_URL=$ELASTICSEARCH_HOST
fi
if [[ ! -z $ELASTICSEARCH_USERNAME ]] && [[ -z $ELASTICSEARCH_AUTH_HEADER ]]; then
AUTH_TOKEN=$(echo -ne "$ELASTICSEARCH_USERNAME:$ELASTICSEARCH_PASSWORD" | base64 --wrap 0)
ELASTICSEARCH_AUTH_HEADER="Authorization:Basic $AUTH_TOKEN"
fi

# Add default header if needed
@@ -26,9 +21,18 @@ else
ELASTICSEARCH_PROTOCOL=http
fi

WAIT_FOR_NEO4J=""
WAIT_FOR_KAFKA=""
if [[ $SKIP_KAFKA_CHECK != true ]]; then
WAIT_FOR_KAFKA=" -wait tcp://$(echo $KAFKA_BOOTSTRAP_SERVER | sed 's/,/ -wait tcp:\/\//g') "
fi

WAIT_FOR_ELASTICSEARCH=""
if [[ $SKIP_ELASTICSEARCH_CHECK != true ]]; then
WAIT_FOR_ELASTICSEARCH=" -wait $ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT -wait-http-header \"$ELASTICSEARCH_AUTH_HEADER\""
fi

if [[ $GRAPH_SERVICE_IMPL != elasticsearch ]]; then
WAIT_FOR_NEO4J=""
if [[ $GRAPH_SERVICE_IMPL != elasticsearch ]] && [[ $SKIP_NEO4J_CHECK != true ]]; then
WAIT_FOR_NEO4J=" -wait $NEO4J_HOST "
fi

@@ -42,9 +46,16 @@ if [[ $ENABLE_PROMETHEUS == true ]]; then
PROMETHEUS_AGENT="-javaagent:jmx_prometheus_javaagent.jar=4318:/datahub/datahub-mae-consumer/scripts/prometheus-config.yaml "
fi

dockerize \
-wait tcp://$(echo $KAFKA_BOOTSTRAP_SERVER | sed 's/,/ -wait tcp:\/\//g') \
-wait $ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST_URL:$ELASTICSEARCH_PORT -wait-http-header "$ELASTICSEARCH_AUTH_HEADER" \
$WAIT_FOR_NEO4J \
-timeout 240s \
java $JAVA_OPTS $JMX_OPTS $OTEL_AGENT $PROMETHEUS_AGENT -jar /datahub/datahub-mae-consumer/bin/mae-consumer-job.jar
COMMON="
$WAIT_FOR_KAFKA \
$WAIT_FOR_NEO4J \
-timeout 240s \
java $JAVA_OPTS $JMX_OPTS $OTEL_AGENT $PROMETHEUS_AGENT -jar /datahub/datahub-mae-consumer/bin/mae-consumer-job.jar
"
if [[ $SKIP_ELASTICSEARCH_CHECK != true ]]; then
dockerize $COMMON
else
dockerize \
-wait $ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT -wait-http-header "$ELASTICSEARCH_AUTH_HEADER" \
$COMMON
fi
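
The Basic auth header that both start scripts now derive can be reproduced by hand to see exactly what gets sent to Elasticsearch; a sketch with placeholder credentials (`--wrap 0` stops GNU base64 from inserting line breaks):

```
ELASTICSEARCH_USERNAME=elastic    # placeholder
ELASTICSEARCH_PASSWORD=changeme   # placeholder
AUTH_TOKEN=$(echo -ne "$ELASTICSEARCH_USERNAME:$ELASTICSEARCH_PASSWORD" | base64 --wrap 0)
echo "Authorization:Basic $AUTH_TOKEN"
# prints: Authorization:Basic ZWxhc3RpYzpjaGFuZ2VtZQ==
```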
2 changes: 1 addition & 1 deletion docker/datahub-mce-consumer/Dockerfile
@@ -3,7 +3,7 @@ ARG APP_ENV=prod

FROM adoptopenjdk/openjdk8:alpine-jre as base
ENV DOCKERIZE_VERSION v0.6.1
RUN apk --no-cache add curl tar wget \
RUN apk --no-cache add curl tar wget bash \
&& wget https://github.com/open-telemetry/opentelemetry-java-instrumentation/releases/download/v1.4.1/opentelemetry-javaagent-all.jar \
&& wget https://repo1.maven.org/maven2/io/prometheus/jmx/jmx_prometheus_javaagent/0.16.1/jmx_prometheus_javaagent-0.16.1.jar -O jmx_prometheus_javaagent.jar \
&& curl -L https://github.com/jwilder/dockerize/releases/download/$DOCKERIZE_VERSION/dockerize-linux-amd64-$DOCKERIZE_VERSION.tar.gz | tar -C /usr/local/bin -xzv
Expand Down
9 changes: 7 additions & 2 deletions docker/datahub-mce-consumer/start.sh
@@ -1,4 +1,9 @@
#!/bin/sh
#!/bin/bash

WAIT_FOR_KAFKA=""
if [[ $SKIP_KAFKA_CHECK != true ]]; then
WAIT_FOR_KAFKA=" -wait tcp://$(echo $KAFKA_BOOTSTRAP_SERVER | sed 's/,/ -wait tcp:\/\//g') "
fi

OTEL_AGENT=""
if [[ $ENABLE_OTEL == true ]]; then
@@ -11,6 +16,6 @@ if [[ $ENABLE_PROMETHEUS == true ]]; then
fi

dockerize \
-wait tcp://$(echo $KAFKA_BOOTSTRAP_SERVER | sed 's/,/ -wait tcp:\/\//g') \
$WAIT_FOR_KAFKA \
-timeout 240s \
java $JAVA_OPTS $JMX_OPTS $OTEL_AGENT $PROMETHEUS_AGENT -jar /datahub/datahub-mce-consumer/bin/mce-consumer-job.jar
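
The `sed` inside `WAIT_FOR_KAFKA` expands a comma-separated broker list into repeated dockerize `-wait` flags; for example:

```
echo "broker-1:9092,broker-2:9092" | sed 's/,/ -wait tcp:\/\//g'
# prints: broker-1:9092 -wait tcp://broker-2:9092
# so "-wait tcp://$(...)" expands to: -wait tcp://broker-1:9092 -wait tcp://broker-2:9092
```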
2 changes: 2 additions & 0 deletions docker/docker-compose.dev.yml
@@ -37,6 +37,8 @@ services:
dockerfile: Dockerfile
args:
APP_ENV: dev
environment:
- SKIP_ELASTICSEARCH_CHECK=false
volumes:
- ./datahub-gms/start.sh:/datahub/datahub-gms/scripts/start.sh
- ./monitoring/client-prometheus-config.yaml:/datahub/datahub-gms/scripts/prometheus-config.yaml
6 changes: 4 additions & 2 deletions docker/elasticsearch-setup/Dockerfile
@@ -5,7 +5,7 @@ ARG APP_ENV=prod

FROM alpine:3 as base
ENV DOCKERIZE_VERSION v0.6.1
RUN apk add --no-cache curl jq tar \
RUN apk add --no-cache curl jq tar bash \
&& curl -L https://github.com/jwilder/dockerize/releases/download/$DOCKERIZE_VERSION/dockerize-linux-amd64-$DOCKERIZE_VERSION.tar.gz | tar -C /usr/local/bin -xzv

FROM base AS prod-install
@@ -21,4 +21,6 @@ FROM base AS dev-install
FROM ${APP_ENV}-install AS final
CMD if [ "$ELASTICSEARCH_USE_SSL" == "true" ]; then ELASTICSEARCH_PROTOCOL=https; else ELASTICSEARCH_PROTOCOL=http; fi \
&& if [[ -n "$ELASTICSEARCH_USERNAME" ]]; then ELASTICSEARCH_HTTP_HEADERS="Authorization: Basic $(echo -ne "$ELASTICSEARCH_USERNAME:$ELASTICSEARCH_PASSWORD" | base64)"; else ELASTICSEARCH_HTTP_HEADERS="Accept: */*"; fi \
&& dockerize -wait $ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT -wait-http-header "${ELASTICSEARCH_HTTP_HEADERS}" -timeout 120s /create-indices.sh
&& if [[ "$SKIP_ELASTICSEARCH_CHECK" != "true" ]]; then \
dockerize -wait $ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT -wait-http-header "${ELASTICSEARCH_HTTP_HEADERS}" -timeout 120s /create-indices.sh; \
else /create-indices.sh; fi
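
With the new branch, the setup container can skip the wait and go straight to index creation. A hypothetical invocation (the image name is assumed from the repo's naming convention; endpoints are illustrative):

```
# Skip the dockerize wait and run /create-indices.sh immediately
docker run --rm \
  -e SKIP_ELASTICSEARCH_CHECK=true \
  -e ELASTICSEARCH_HOST=elasticsearch \
  -e ELASTICSEARCH_PORT=9200 \
  linkedin/datahub-elasticsearch-setup
```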
33 changes: 18 additions & 15 deletions docker/elasticsearch-setup/create-indices.sh
@@ -1,4 +1,4 @@
#!/bin/sh
#!/bin/bash

set -e

@@ -11,10 +11,14 @@ else
ELASTICSEARCH_PROTOCOL=http
fi

if [[ -z $ELASTICSEARCH_USERNAME ]]; then
ELASTICSEARCH_HOST_URL=$ELASTICSEARCH_HOST
else
ELASTICSEARCH_HOST_URL=$ELASTICSEARCH_USERNAME:$ELASTICSEARCH_PASSWORD@$ELASTICSEARCH_HOST
if [[ ! -z $ELASTICSEARCH_USERNAME ]] && [[ -z $ELASTICSEARCH_AUTH_HEADER ]]; then
AUTH_TOKEN=$(echo -ne "$ELASTICSEARCH_USERNAME:$ELASTICSEARCH_PASSWORD" | base64 --wrap 0)
ELASTICSEARCH_AUTH_HEADER="Authorization:Basic $AUTH_TOKEN"
fi

# Add default header if needed
if [[ -z $ELASTICSEARCH_AUTH_HEADER ]]; then
ELASTICSEARCH_AUTH_HEADER="Accept: */*"
fi

function create_datahub_usage_event_datastream() {
@@ -24,19 +28,19 @@
PREFIX="${INDEX_PREFIX}_"
fi

if [ $(curl -o /dev/null -s -w "%{http_code}" "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST_URL:$ELASTICSEARCH_PORT/_ilm/policy/${PREFIX}datahub_usage_event_policy") -eq 404 ]
if [ $(curl -o /dev/null -s -w "%{http_code}" --header "$ELASTICSEARCH_AUTH_HEADER" "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT/_ilm/policy/${PREFIX}datahub_usage_event_policy") -eq 404 ]
then
echo -e "\ncreating datahub_usage_event_policy"
sed -e "s/PREFIX/${PREFIX}/g" /index/usage-event/policy.json | tee -a /tmp/policy.json
curl -XPUT "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST_URL:$ELASTICSEARCH_PORT/_ilm/policy/${PREFIX}datahub_usage_event_policy" -H 'Content-Type: application/json' --data @/tmp/policy.json
curl -XPUT --header "$ELASTICSEARCH_AUTH_HEADER" "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT/_ilm/policy/${PREFIX}datahub_usage_event_policy" -H 'Content-Type: application/json' --data @/tmp/policy.json
else
echo -e "\ndatahub_usage_event_policy exists"
fi
if [ $(curl -o /dev/null -s -w "%{http_code}" "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST_URL:$ELASTICSEARCH_PORT/_index_template/${PREFIX}datahub_usage_event_index_template") -eq 404 ]
if [ $(curl -o /dev/null -s -w "%{http_code}" --header "$ELASTICSEARCH_AUTH_HEADER" "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT/_index_template/${PREFIX}datahub_usage_event_index_template") -eq 404 ]
then
echo -e "\ncreating datahub_usage_event_index_template"
sed -e "s/PREFIX/${PREFIX}/g" /index/usage-event/index_template.json | tee -a /tmp/index_template.json
curl -XPUT "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST_URL:$ELASTICSEARCH_PORT/_index_template/${PREFIX}datahub_usage_event_index_template" -H 'Content-Type: application/json' --data @/tmp/index_template.json
curl -XPUT --header "$ELASTICSEARCH_AUTH_HEADER" "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT/_index_template/${PREFIX}datahub_usage_event_index_template" -H 'Content-Type: application/json' --data @/tmp/index_template.json
else
echo -e "\ndatahub_usage_event_index_template exists"
fi
@@ -49,20 +53,20 @@ function create_datahub_usage_event_aws_elasticsearch() {
PREFIX="${INDEX_PREFIX}_"
fi

if [ $(curl -o /dev/null -s -w "%{http_code}" "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST_URL:$ELASTICSEARCH_PORT/_opendistro/_ism/policies/${PREFIX}datahub_usage_event_policy") -eq 404 ]
if [ $(curl -o /dev/null -s -w "%{http_code}" --header "$ELASTICSEARCH_AUTH_HEADER" "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT/_opendistro/_ism/policies/${PREFIX}datahub_usage_event_policy") -eq 404 ]
then
echo -e "\ncreating datahub_usage_event_policy"
sed -e "s/PREFIX/${PREFIX}/g" /index/usage-event/aws_es_ism_policy.json | tee -a /tmp/aws_es_ism_policy.json
curl -XPUT "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST_URL:$ELASTICSEARCH_PORT/_opendistro/_ism/policies/${PREFIX}datahub_usage_event_policy" -H 'Content-Type: application/json' --data @/tmp/aws_es_ism_policy.json
curl -XPUT --header "$ELASTICSEARCH_AUTH_HEADER" "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT/_opendistro/_ism/policies/${PREFIX}datahub_usage_event_policy" -H 'Content-Type: application/json' --data @/tmp/aws_es_ism_policy.json
else
echo -e "\ndatahub_usage_event_policy exists"
fi
if [ $(curl -o /dev/null -s -w "%{http_code}" "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST_URL:$ELASTICSEARCH_PORT/_template/${PREFIX}datahub_usage_event_index_template") -eq 404 ]
if [ $(curl -o /dev/null -s -w "%{http_code}" --header "$ELASTICSEARCH_AUTH_HEADER" "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT/_template/${PREFIX}datahub_usage_event_index_template") -eq 404 ]
then
echo -e "\ncreating datahub_usage_event_index_template"
sed -e "s/PREFIX/${PREFIX}/g" /index/usage-event/aws_es_index_template.json | tee -a /tmp/aws_es_index_template.json
curl -XPUT "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST_URL:$ELASTICSEARCH_PORT/_template/${PREFIX}datahub_usage_event_index_template" -H 'Content-Type: application/json' --data @/tmp/aws_es_index_template.json
curl -XPUT "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST_URL:$ELASTICSEARCH_PORT/${PREFIX}datahub_usage_event-000001" -H 'Content-Type: application/json' --data "{\"aliases\":{\"${PREFIX}datahub_usage_event\":{\"is_write_index\":true}}}"
curl -XPUT --header "$ELASTICSEARCH_AUTH_HEADER" "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT/_template/${PREFIX}datahub_usage_event_index_template" -H 'Content-Type: application/json' --data @/tmp/aws_es_index_template.json
curl -XPUT --header "$ELASTICSEARCH_AUTH_HEADER" "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT/${PREFIX}datahub_usage_event-000001" -H 'Content-Type: application/json' --data "{\"aliases\":{\"${PREFIX}datahub_usage_event\":{\"is_write_index\":true}}}"
else
echo -e "\ndatahub_usage_event_index_template exists"
fi
@@ -75,4 +79,3 @@ if [[ $DATAHUB_ANALYTICS_ENABLED == true ]]; then
create_datahub_usage_event_aws_elasticsearch || exit 1
fi
fi
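
The existence checks above follow a standard curl idiom: fetch only the HTTP status code, then create the resource on a 404. The pattern in isolation, reusing the script's own variables:

```
STATUS=$(curl -o /dev/null -s -w "%{http_code}" --header "$ELASTICSEARCH_AUTH_HEADER" \
  "$ELASTICSEARCH_PROTOCOL://$ELASTICSEARCH_HOST:$ELASTICSEARCH_PORT/_ilm/policy/${PREFIX}datahub_usage_event_policy")
if [ "$STATUS" -eq 404 ]; then
  echo "policy missing, creating it"   # the real script PUTs the policy JSON here
fi
```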

4 changes: 2 additions & 2 deletions docs-website/docusaurus.config.js
@@ -68,7 +68,7 @@ module.exports = {
position: "right",
},
{
href: "https://feature-requests.datahubproject.io/roadmap",
to: "docs/roadmap",
label: "Roadmap",
position: "right",
},
@@ -151,7 +151,7 @@ },
},
{
label: "Roadmap",
href: "https://feature-requests.datahubproject.io/roadmap",
to: "docs/roadmap",
},
{
label: "Contributing",