This repository has been archived by the owner on Apr 14, 2023. It is now read-only.

Commit

Merge pull request #1648 from finos/fix/rename-release-file
fix(#1639): Rename release artefact to datahelix.zip
Simon Laing authored Jan 22, 2020
2 parents da9257e + d3e119a commit 1b96875
Showing 7 changed files with 20 additions and 18 deletions.
2 changes: 1 addition & 1 deletion .gitignore
@@ -5,7 +5,7 @@ node_modules/
classes/
profiler/
target/
- bin/generator.jar
+ bin/datahelix.jar
.idea/sonarlint

# Eclipse files
4 changes: 2 additions & 2 deletions Dockerfile
@@ -18,6 +18,6 @@ RUN gradle fatJar
FROM openjdk:8u212-jre-alpine

WORKDIR /root
- COPY --from=build /root/orchestrator/build/libs/generator.jar .
+ COPY --from=build /root/orchestrator/build/libs/datahelix.jar .

- ENTRYPOINT ["java", "-jar", "generator.jar"]
+ ENTRYPOINT ["java", "-jar", "datahelix.jar"]
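For context, the renamed jar only changes what the entrypoint points at; running the image is otherwise unchanged. A minimal sketch (the image tag, mount path and profile name are illustrative, not part of this change):

```shell script
# Build the image from the repository root, then generate data from a local profile.
# Arguments after the image name are appended to the "java -jar datahelix.jar" entrypoint.
docker build -t datahelix .
docker run --rm -v "$PWD:/data" datahelix --max-rows=100 --profile-file=/data/profile.json --output-path=/data/output.csv
```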
2 changes: 1 addition & 1 deletion docs/DeveloperGuide.md
@@ -68,7 +68,7 @@ Checklist before raising an issue:

DataHelix uses Java 1.8 which can be downloaded from this [link](http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html)

- DataHelix uses [gradle](https://gradle.org/) to automate the build and test process. To build the project run `gradle build` from the root folder of the project. If it was successful then the created jar file can be found in the path _orchestrator/build/libs/generator.jar_ .
+ DataHelix uses [gradle](https://gradle.org/) to automate the build and test process. To build the project run `gradle build` from the root folder of the project. If it was successful then the created jar file can be found in the path _orchestrator/build/libs/datahelix.jar_ .

[Guice](https://github.com/google/guice) is used in DataHelix for Dependency Injection (DI). It is configured in the 'module' classes, which all extend `AbstractModule`, and injected with the `@inject` annotation.

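In line with the renamed build output above, a build-and-run sketch from the project root (the `--help` call is just an example invocation):

```shell script
# Build the project, then run the fat jar from its new location.
gradle build
java -jar orchestrator/build/libs/datahelix.jar --help
```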
18 changes: 9 additions & 9 deletions docs/GettingStarted.md
@@ -21,9 +21,9 @@ For more comprehensive documentation please refer to the [user guide](UserGuide.
- [Generating large datasets](#Generating-large-datasets)
- [Next steps](#Next-steps)

- ## Downloading the JAR file
+ ## Downloading the release

- The datahelix is distributed as a JAR file, with the latest release always available from the [GitHub releases page](https://github.com/finos/datahelix/releases/). You will need Java v1.8 installed to run the datahelix (you can run `java -version` to check whether you meet this requirement), it can be [downloaded here](https://www.java.com/en/download/manual.jsp).
+ The datahelix is distributed as a zip file, with the latest release always available from the [GitHub releases page](https://github.com/finos/datahelix/releases/). You will need Java v1.8 installed to run the datahelix (you can run `java -version` to check whether you meet this requirement), it can be [downloaded here](https://www.java.com/en/download/manual.jsp).

You are also welcome to download the source code and build the generator yourself. To do so, follow the instructions in the [Developer Guide](DeveloperGuide.md#Building).

@@ -51,18 +51,18 @@ When manually writing profiles, we recommend using a text editor which can valid

## Running the generator

- Now place the `generator.jar` file (downloaded from the [GitHub releases page](https://github.com/finos/datahelix/releases/)) in the same folder as the profile, open up a terminal, and execute the following:
+ Now extract the `datahelix.zip` file (downloaded from the [GitHub releases page](https://github.com/finos/datahelix/releases/)) into the same folder as the profile, open up a terminal, and execute the following:

- ```
- $ java -jar generator.jar --max-rows=100 --replace --profile-file=profile.json --output-path=output.csv
+ ```shell script
+ $ generator/bin/datahelix --max-rows=100 --replace --profile-file=profile.json --output-path=output.csv
```

The generator is a command line tool which reads a profile, and outputs data in CSV or JSON format. The `--max-rows=100` option tells the generator to create 100 rows of data, and the `--replace` option tells it to overwrite previously generated files. The compulsory `--profile-file` option specifies the name of the input profile, and the `--output-path` option specifies the location to write the output to. In `generate` mode `--output-path` is optional; the generator will default to standard output if it is not supplied. By default the generator outputs progress, in rows per second, to the standard error output. This can be useful when generating large volumes of data.

Use

- ```
- $ java -jar generator.jar --help
+ ```shell script
+ $ generator/bin/datahelix --help
```

or see [the User Guide](UserGuide.md#command-line-arguments) to find the full range of command line arguments.
@@ -247,8 +247,8 @@ The mode is specified via the `--generation-type` option.

The generator has been designed to be fast and efficient, allowing you to generate large quantities of test and simulation data. If you supply a large number for the `--max-rows` option, the data will be streamed to the output file, with the progress / velocity reported during generation.

- ```
- $ java -jar generator.jar --max-rows=10000 --replace --profile-file=profile.json --output-path=output.csv
+ ```shell script
+ $ generator/bin/datahelix --max-rows=10000 --replace --profile-file=profile.json --output-path=output.csv
Generation started at: 16:41:44

Number of rows | Velocity (rows/sec) | Velocity trend
6 changes: 4 additions & 2 deletions docs/UserGuide.md
@@ -765,10 +765,10 @@ Profiles can be run against a jar using the command line.
## Command Line Arguments
<div id="Command-Line-Arguments"></div>

- Currently the only mode fully supported by the data helix is generate mode. An example command would be something like
+ An example command would be something like

```shell script
- java -jar generator.jar --max-rows=100 --replace --profile-file=profile.json --output-path=output.csv
+ java -jar datahelix.jar --max-rows=100 --replace --profile-file=profile.json --output-path=output.csv
```

it is also possible to execute the generator using a wrapper script:
@@ -784,6 +784,8 @@ and on linux
datahelix.sh --max-rows=100 --replace --profile-file=profile.json --output-path=output.csv
```

+ These presume that the scripts (datahelix.zip\bin) are in the path, or you're currently working in the bin directory.

### Command Line Arguments
<div id="Command-Line-Arguments-for-Generate-Mode"></div>
Option switches are case-sensitive, arguments are case-insensitive
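The note added above presumes the wrapper scripts are on the PATH. A sketch of that setup on Linux, assuming the release zip was unpacked in the current directory with the `generator/bin` layout shown in the Getting Started guide:

```shell script
# Put the wrapper scripts on the PATH for this shell session, then run the generator.
export PATH="$PWD/generator/bin:$PATH"
datahelix.sh --max-rows=100 --replace --profile-file=profile.json --output-path=output.csv
```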
4 changes: 2 additions & 2 deletions orchestrator/build.gradle
@@ -91,7 +91,7 @@ task fatJar(type: Jar) {
manifest {
attributes 'Main-Class': 'com.scottlogic.datahelix.generator.orchestrator.App'
}
- baseName = 'generator'
+ baseName = 'datahelix'
from { configurations.compile.collect { it.isDirectory() ? it : zipTree(it) } }
with jar
archiveName = "${baseName}.${extension}"
@@ -110,7 +110,7 @@ application {
project.ext.ghToken = project.hasProperty('ghToken') ? project.getProperty('ghToken') : System.getenv('GH_TOKEN') ?: null

distZip {
archiveName "generator.zip"
archiveName "datahelix.zip"
}

startScripts {
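With these changes the renamed artefacts come out of the existing Gradle tasks; a sketch (output locations follow Gradle defaults and may vary):

```shell script
# Fat jar, as used by the Dockerfile: orchestrator/build/libs/datahelix.jar
gradle fatJar
# Release distribution: datahelix.zip, typically under orchestrator/build/distributions
gradle distZip
```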
@@ -61,7 +61,7 @@ private Process setupProcess(final String profile) throws IOException {
ProcessBuilder pb = new ProcessBuilder(
"java",
"-jar",
"build/libs/generator.jar",
"build/libs/datahelix.jar",
profile,
"--max-rows=1",
"--quiet");
