s/candle/candle_core/g
Narsil committed Aug 2, 2023
1 parent ae68635 commit 166f4d1
Showing 1 changed file with 6 additions and 6 deletions.
12 changes: 6 additions & 6 deletions candle-book/src/inference/hub.md
@@ -10,17 +10,17 @@ Then let's start by downloading the [model file](https://huggingface.co/bert-bas


```rust
-# extern crate candle;
+# extern crate candle_core;
# extern crate hf_hub;
use hf_hub::api::sync::Api;
-use candle::Device;
+use candle_core::Device;

let api = Api::new().unwrap();
let repo = api.model("bert-base-uncased".to_string());

let weights = repo.get("model.safetensors").unwrap();

-let weights = candle::safetensors::load(weights, &Device::Cpu);
+let weights = candle_core::safetensors::load(weights, &Device::Cpu);
```

We now have access to all the [tensors](https://huggingface.co/bert-base-uncased?show_tensors=true) within the file.
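
For context around the rename: `candle_core::safetensors::load` returns a map from tensor names to tensors, so the loaded checkpoint can be enumerated. A minimal sketch, assuming the post-rename API from the snippet above:

```rust
# extern crate candle_core;
# extern crate hf_hub;
use hf_hub::api::sync::Api;
use candle_core::Device;

let api = Api::new().unwrap();
let repo = api.model("bert-base-uncased".to_string());
let weights = repo.get("model.safetensors").unwrap();

// `load` yields a HashMap<String, Tensor>; list each tensor's name,
// dtype, and shape to see what the file contains.
let weights = candle_core::safetensors::load(weights, &Device::Cpu).unwrap();
for (name, tensor) in weights.iter() {
    println!("{name}: {:?} {:?}", tensor.dtype(), tensor.shape());
}
```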
@@ -48,7 +48,7 @@ cargo add hf-hub --features tokio
Now that we have our weights, we can use them in our bert architecture:

```rust
-# extern crate candle;
+# extern crate candle_core;
# extern crate candle_nn;
# extern crate hf_hub;
# use hf_hub::api::sync::Api;
@@ -57,10 +57,10 @@ Now that we have our weights, we can use them in our bert architecture:
# let repo = api.model("bert-base-uncased".to_string());
#
# let weights = repo.get("model.safetensors").unwrap();
-use candle::{Device, Tensor, DType};
+use candle_core::{Device, Tensor, DType};
use candle_nn::Linear;

-let weights = candle::safetensors::load(weights, &Device::Cpu).unwrap();
+let weights = candle_core::safetensors::load(weights, &Device::Cpu).unwrap();

let weight = weights.get("bert.encoder.layer.0.attention.self.query.weight").unwrap();
let bias = weights.get("bert.encoder.layer.0.attention.self.query.bias").unwrap();
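
The hunk is truncated here. For context, a hedged sketch of how such a weight/bias pair can back a `candle_nn::Linear` under the new crate names: the 768 sizes match bert-base's hidden dimension, and `candle_nn::Module` is assumed to provide `forward`.

```rust
# extern crate candle_core;
# extern crate candle_nn;
# extern crate hf_hub;
# use hf_hub::api::sync::Api;
# use candle_core::Device;
# let api = Api::new().unwrap();
# let repo = api.model("bert-base-uncased".to_string());
# let weights = repo.get("model.safetensors").unwrap();
use candle_core::{DType, Tensor};
use candle_nn::{Linear, Module};

let weights = candle_core::safetensors::load(weights, &Device::Cpu).unwrap();
let weight = weights.get("bert.encoder.layer.0.attention.self.query.weight").unwrap();
let bias = weights.get("bert.encoder.layer.0.attention.self.query.bias").unwrap();

// The query projection in bert-base is 768x768; Linear takes the weight
// tensor plus an optional bias.
let linear = Linear::new(weight.clone(), Some(bias.clone()));

// Push a dummy (batch, hidden) input through to check the plumbing.
let input = Tensor::zeros((3, 768), DType::F32, &Device::Cpu).unwrap();
let output = linear.forward(&input).unwrap();
println!("output shape: {:?}", output.shape());
```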
