Bug or feature?
Bug

Description
I followed the README to load a model successfully. Since I couldn't find any documentation about inference, I tried to implement a version myself. However, the inference output is weird.
from cellmap_models import cosem
import numpy as np
import tifffile as tif
import torch
from einops import rearrange
# Load a pretrained COSEM model (setup04, checkpoint 1820500).
model = cosem.load_model("setup04/1820500")
print("min input shape: ", model.min_input_shape)
print("input size step: ", model.input_size_step)
# Load a raw FIB-SEM volume.
img = tif.imread("/data/FIB-SEM/raw_images/control/ControlNeuron4_Raw-003.tif")
print("Original image size: ", img.shape)
# Crop a 216^3 cube and rescale the intensities from [0, 255] to [-1, 1].
input = img[:216, :216, :216] / np.float32(255.0)
input = 2.0 * input - 1.0
print(input.shape, input.dtype, input.min(), input.max())
# print(model)
model = model.to("cuda")
model.eval()

# The loaded model exposes a backbone and a prediction head.
backbone = model.backbone
head = model.prediction_head
with torch.no_grad():
    input = torch.tensor(input, device="cuda")
    input = rearrange(input, "h w d -> 1 1 h w d")  # add batch and channel dims
    output = head(backbone(input))
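# I expect `output` to have a channel axis at dim 1, which is why I argmax over
# axis 1 below; since I could not find inference documentation, I am not sure
# this is the right way to decode the prediction.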
print(output.size(), output.max(), output.min())
output = output.cpu().numpy()
pred = np.argmax(output, axis=1)
print(pred.shape, pred.dtype, pred.max(), pred.min())
The std output is:
Could you please provide documentation about inference, or point out the bug in my code? Thank you so much!
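For reference, here is the assumption I am making about valid input sizes, written out as a small check. The meaning of min_input_shape and input_size_step is my guess, and is_valid_input_shape is just a helper name I made up for this issue:

import numpy as np
from cellmap_models import cosem

def is_valid_input_shape(shape, min_shape, step):
    # Guessed rule: shape = min_shape + k * step (k >= 0) along each axis.
    shape, min_shape, step = (np.asarray(x) for x in (shape, min_shape, step))
    diff = shape - min_shape
    return bool(np.all(diff >= 0) and np.all(diff % step == 0))

model = cosem.load_model("setup04/1820500")
print(is_valid_input_shape((216, 216, 216), model.min_input_shape, model.input_size_step))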