update dependencies
gklijs committed Feb 20, 2021
1 parent 14e2247 commit e79d595
Showing 5 changed files with 27 additions and 22 deletions.
26 changes: 13 additions & 13 deletions Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "schema_registry_converter"
version = "2.0.1"
version = "2.0.2"
authors = ["Gerard Klijs <[email protected]>"]
include = ["src/**/*", "Cargo.toml"]
description = "Encode/decode data from/to kafka using the Confluent Schema Registry"
@@ -22,13 +22,13 @@ kafka_test = []
default = ["futures"]

[dependencies.byteorder]
version = "^1.3"
version = "^1.4"

[dependencies.failure]
version = "^0.1"

[dependencies.reqwest]
version = "^0.10"
version = "^0.11"
features = ["json"]

[dependencies.serde]
@@ -39,43 +39,43 @@ features = ["derive"]
version = "^1.0"

[dependencies.avro-rs]
version = "^0.11"
version = "^0.13"
optional = true

[dependencies.bytes]
version = "^0.5"
version = "^1.0"
optional = true

[dependencies.futures]
version = "^0.3"
optional = true

[dependencies.integer-encoding]
version = "^2.1"
version = "^3.0"
optional = true

[dependencies.logos]
version = "^0.11"
version = "^0.12"
optional = true

[dependencies.protofish]
version = "^0.2"
version = "^0.3"
optional = true

[dependencies.url]
version = "^2"
optional = true

[dependencies.valico]
version = "^3.4"
version = "^3.5"
optional = true

[dev-dependencies]
mockito = "^0.27.0"
rdkafka = { version = "^0.23.1", features = ["cmake-build"] }
rand = "^0.7.3"
mockito = "^0.29.0"
rdkafka = { version = "^0.25.0", features = ["cmake-build"] }
rand = "^0.8.3"
test_utils = {path = "test_utils"}
tokio = { version = "^0.2.22", features = ["macros"] }
tokio = { version = "^1.2.0", features = ["macros"] }

[package.metadata.docs.rs]
all-features = true
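Most of the bumps above only move the minimum version; whether a bump is semver-breaking follows from Cargo's caret (`^`) rule: an update may not change the left-most non-zero version component. A minimal std-only sketch of that rule (modeling major.minor only; real Cargo also handles patch versions and `^0.0.z` requirements):

```rust
// Cargo's caret rule: `^1.4` admits any 1.x with x >= 4, while `^0.5`
// admits only 0.5.z, because the left-most non-zero component must not
// change. Patch versions are omitted here for brevity.
fn caret_compatible(req: (u64, u64), ver: (u64, u64)) -> bool {
    let ((rmaj, rmin), (vmaj, vmin)) = (req, ver);
    if rmaj > 0 {
        vmaj == rmaj && vmin >= rmin // e.g. `^1.3` already matches 1.4
    } else {
        vmaj == 0 && vmin == rmin // e.g. `^0.10` does not match 0.11
    }
}

fn main() {
    // byteorder ^1.3 -> ^1.4 only raises the floor; 1.4 was already allowed.
    assert!(caret_compatible((1, 3), (1, 4)));
    // reqwest ^0.10 -> ^0.11 crosses a breaking boundary for 0.x crates.
    assert!(!caret_compatible((0, 10), (0, 11)));
    println!("ok"); // prints "ok"
}
```

Under this rule bumps like `bytes ^0.5 -> ^1.0` and `tokio ^0.2 -> ^1.2` cross a breaking boundary, which is why the dev-dependencies and code below were updated together.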
6 changes: 3 additions & 3 deletions README.md
@@ -35,7 +35,7 @@ To use it to convert using avro async use:

```toml
[dependencies]
schema_registry_converter = { version = "2.0.1", features = ["avro"] }
schema_registry_converter = { version = "2.0.2", features = ["avro"] }
```

...and see the [docs](https://docs.rs/schema_registry_converter) for how to use it.
@@ -44,14 +44,14 @@ All the converters also have a blocking (non async) version, in that case use so

```toml
[dependencies]
schema_registry_converter = { version = "2.0.1", default-features = false, features = ["avro", "blocking"]}
schema_registry_converter = { version = "2.0.2", default-features = false, features = ["avro", "blocking"]}
```

If you need to use both in a project you can use something like the following, but you have to be wary to import the correct paths depending on your use.

```toml
[dependencies]
schema_registry_converter = { version = "2.0.1", features = ["avro", "blocking"]}
schema_registry_converter = { version = "2.0.2", features = ["avro", "blocking"]}
```

# Example with consumer and producer using Avro
9 changes: 7 additions & 2 deletions RELEASE_NOTES.md
@@ -1,5 +1,9 @@
## Release notes

### 2.0.2

Updated dependencies

### 2.0.1

Maintenance release with mainly updated dependencies, making the blocking sr settings cloneable, and removing the need for the `kafka_test` feature to use both blocking and async in the same project.
@@ -14,7 +18,7 @@ Another major change is by default support for async.

To use the new version of the library and continue to use it in a blocking way like before, you need to declare the dependency like:
```toml
schema_registry_converter = { version = "2.0.1", default-features = false, features = ["avro", "blocking"]}
schema_registry_converter = { version = "2.0.2", default-features = false, features = ["avro", "blocking"]}
```
The converters have also moved to the blocking module, and to create them you need a SrSettings object, which can be created with just the schema registry url.
@@ -62,5 +66,6 @@ instead of the `encode` function on the encoder.
#### Contributors

- [@cbzehner](https://github.com/cbzehner)
- [@j-halbert](https://github.com/j-halbert)
- [@kitsuneninetails](https://github.com/kitsuneninetails)
- [@naamancurtis](https://github.com/naamancurtis)
6 changes: 3 additions & 3 deletions docker-compose.yaml
@@ -2,7 +2,7 @@
version: '2'
services:
zookeeper:
image: confluentinc/cp-zookeeper:5.5.1
image: confluentinc/cp-zookeeper:6.1.0
hostname: zookeeper
container_name: zookeeper
ports:
@@ -12,7 +12,7 @@ services:
ZOOKEEPER_TICK_TIME: 2000

broker:
image: confluentinc/cp-server:5.5.1
image: confluentinc/cp-server:6.1.0
hostname: broker
container_name: broker
depends_on:
@@ -37,7 +37,7 @@ services:
CONFLUENT_SUPPORT_CUSTOMER_ID: 'anonymous'

schema-registry:
image: confluentinc/cp-schema-registry:5.5.1
image: confluentinc/cp-schema-registry:6.1.0
hostname: schema-registry
container_name: schema-registry
depends_on:
2 changes: 1 addition & 1 deletion tests/blocking/kafka_producer.rs
@@ -35,7 +35,7 @@ impl<'a> RecordProducer {
timestamp: None,
headers: None,
};
self.producer.send(fr, 0);
self.producer.send_result(fr).unwrap();
}
}
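The hunk above tracks an rdkafka API change: in 0.25 the old `send(record, timeout)` signature is gone (`send` became async), so this blocking test helper switches to `send_result`, which enqueues without awaiting and returns a `Result` that hands the record back on failure (hence the `.unwrap()`). The shape of that contract can be sketched std-only with a hypothetical `BoundedQueue` (an illustration, not rdkafka's actual type):

```rust
use std::collections::VecDeque;

#[derive(Debug, PartialEq)]
struct QueueFull;

// Hypothetical stand-in for a producer's enqueue call: like rdkafka 0.25's
// `send_result`, a failed enqueue returns the error together with the
// rejected item, so the caller can retry, log, or unwrap.
struct BoundedQueue<T> {
    buf: VecDeque<T>,
    cap: usize,
}

impl<T> BoundedQueue<T> {
    fn new(cap: usize) -> Self {
        Self { buf: VecDeque::new(), cap }
    }

    fn send_result(&mut self, item: T) -> Result<(), (QueueFull, T)> {
        if self.buf.len() >= self.cap {
            Err((QueueFull, item)) // item handed back, nothing is lost
        } else {
            self.buf.push_back(item);
            Ok(())
        }
    }
}

fn main() {
    let mut q = BoundedQueue::new(1);
    q.send_result("payload").unwrap(); // mirrors the test helper's unwrap
    assert!(q.send_result("overflow").is_err()); // full queue fails fast
    println!("queued: {}", q.buf.len()); // prints "queued: 1"
}
```

Returning the rejected record alongside the error, rather than dropping it, is what lets callers of `send_result` implement retry loops without cloning every payload up front.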

