Output Range Analysis for Deep Neural Networks with Simulated Annealing

✨ Introduction

Our approach addresses the lack of local geometric information and the high non-linearity of DNNs, making it versatile across various architectures, especially Residual Neural Networks (ResNets). We present a straightforward, implementation-friendly algorithm that avoids restrictive assumptions about the network architecture. Through theoretical analysis and experimental evaluations, including tests on the Ackley function, we demonstrate our algorithm's effectiveness in navigating complex, non-convex surfaces and accurately estimating DNN output ranges.
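
For context, the Ackley function used in the experiments is a standard non-convex benchmark. A minimal PyTorch sketch, assuming the conventional parameters a = 20, b = 0.2, c = 2π (the repository may use a different variant), could serve as the regression target for the model trained below:

```python
import math
import torch

def ackley(x, a=20.0, b=0.2, c=2 * math.pi):
    """Ackley benchmark; x has shape (batch, d), returns shape (batch,)."""
    sq_term = -a * torch.exp(-b * torch.sqrt((x ** 2).mean(dim=-1)))
    cos_term = -torch.exp(torch.cos(c * x).mean(dim=-1))
    return sq_term + cos_term + a + math.e
```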

📈 Reproduction results

## Model

```python
import torch
import torch.nn as nn
import torch.optim as optim

# x, y: training inputs and targets, e.g. points sampled from the target function
model = DeeperResidualNN()
criterion = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=0.001)

num_epochs = 1000
for epoch in range(num_epochs):
    model.train()
    optimizer.zero_grad()
    outputs = model(x)
    loss = criterion(outputs, y)
    loss.backward()
    optimizer.step()
    if epoch % 100 == 0:
        print(f'Epoch [{epoch}/{num_epochs}], Loss: {loss.item():.4f}')
```
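
`DeeperResidualNN` is defined in the repository code. Purely for illustration, a residual fully connected network of the kind the experiments target might look like the sketch below; the width, depth, and activations here are assumptions, not the repository's definition.

```python
import torch.nn as nn

class ResidualBlock(nn.Module):
    """A small fully connected residual block: y = x + F(x)."""
    def __init__(self, width):
        super().__init__()
        self.fc1 = nn.Linear(width, width)
        self.fc2 = nn.Linear(width, width)
        self.act = nn.ReLU()

    def forward(self, x):
        return x + self.fc2(self.act(self.fc1(x)))

class DeeperResidualNN(nn.Module):
    """Maps a 2-D input (e.g., the Ackley domain) to a scalar output."""
    def __init__(self, width=64, num_blocks=4):
        super().__init__()
        self.inp = nn.Linear(2, width)
        self.act = nn.ReLU()
        self.blocks = nn.Sequential(*[ResidualBlock(width) for _ in range(num_blocks)])
        self.out = nn.Linear(width, 1)

    def forward(self, x):
        return self.out(self.blocks(self.act(self.inp(x))))
```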



## Using the function

```python
# Simulated annealing hyperparameters
initial_solution = torch.tensor([-3.8, -3.8], dtype=torch.float32).unsqueeze(0)  # starting point
max_temperature = 1000   # initial temperature
min_temperature = 1      # stopping temperature
cooling_rate = 0.99      # geometric cooling rate
num_iterations = 1000    # number of iterations

# Search box [l, u] for the input domain and proposal scale sigma
l = torch.tensor([-4.0, -4.0])
u = torch.tensor([4.0, 4.0])
interval = (l, u)
sigma = 0.1

best_solution, best_value = simulated_annealing(
    model, initial_solution, max_temperature, min_temperature,
    cooling_rate, num_iterations, interval, sigma
)
print(f"Optimal solution: {best_solution}, value = {best_value:.4f}")
```

📝 Citation

[WIP]

✒️ Authors:

📃 License

No license yet.
