

avoid changing the original candles array #419

Closed
wants to merge 1 commit

Conversation

movy
Contributor

@movy movy commented Jan 14, 2024

Occasionally, especially when using warm_up_candles with distributed backtests (e.g. via Ray.tune), backtests were crashing with an `assignment destination is read-only` error. Copying the array before mutating it avoids this error.
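For context, NumPy raises this error whenever code assigns into an array whose writeable flag is off, which is how Ray's object store exposes shared arrays to workers (zero-copy, read-only). A minimal reproduction that simulates the read-only view with `setflags`, plus the copy-first fix this PR applies:

```python
import numpy as np

# Simulate a candles array served zero-copy from a shared object store:
# Ray marks such arrays read-only, just like setflags(write=False) does.
candles = np.arange(12, dtype=np.float64).reshape(4, 3)
candles.setflags(write=False)

try:
    candles[:, 0] = 0.0  # in-place mutation of the shared array
except ValueError as err:
    print(err)  # assignment destination is read-only

# Copying first yields a private, writable array, so the same mutation succeeds.
writable = candles.copy()
writable[:, 0] = 0.0
```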


stale bot commented Mar 14, 2024

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale No recent activity. label Mar 14, 2024
@saleh-mir saleh-mir added will-fix-soon and removed stale No recent activity. labels Mar 14, 2024
@saleh-mir
Member

Sorry about the delay in my response. I looked into this. I'm assuming you're trying to use Ray or a similar framework with the backtest function from Jesse's research module. If so, you're the one passing the candles, so you can pass in copies of the candles to make sure mutations don't happen. Am I missing something?

@movy
Contributor Author

movy commented Apr 11, 2024

Correct, the error is thrown when I run backtests from within Ray. Ray itself uses some kind of storage to distribute candles between many workers, and candles are passed around by reference, so I assume Ray is not happy when the original array is changed. I don't think there is a way to create a separate copy of the candles array for each Ray worker, so the original candles array must stay immutable:

backtest(config=ray.get(config_ref),  # starting balance, fees, etc.
         routes=ray.get(routes_refs[params["tf"]]),  # exchange, strategy, symbol, timeframe or 5m (if extra_timeframes)
         extra_routes=ray.get(extra_routes_refs[anchor_timeframe(params["tf"])]),
         candles=ray.get(candles_ref),  # candles within backtest date range
         generate_csv=False,
         generate_json=True,
         hyperparameters=params)

The error was traced back to this call https://github.com/jesse-ai/jesse/pull/419/files#diff-678d1837e93b9e24db7260c519fdf374b46673e1e5855b936e1e8527a9cb0bc2L399, and copying the array instead of modifying it in place fixed the problem. I patched it in my copy of Jesse a long time ago (when this PR was created), and it has been working flawlessly since then.
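The patch follows the standard copy-before-mutate pattern. A hedged sketch of the idea (not the actual Jesse code; `trim_warmup` is a hypothetical stand-in for the mutating call the diff touches):

```python
import numpy as np

def trim_warmup(candles: np.ndarray, warmup: int) -> np.ndarray:
    # Hypothetical stand-in for the mutating call in Jesse's backtest path.
    # Copying first means callers may safely pass read-only arrays,
    # e.g. ones shared through Ray's object store.
    candles = candles.copy()
    candles[:warmup, 0] = np.nan  # safe: operates on the private copy
    return candles
```

The caller's array is left untouched, so many workers can share one immutable candles array.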

@saleh-mir
Member

Is the example code you provided being used inside a loop, or is it a one-time call with Ray creating some kind of loop of its own? My point is that if it's inside a loop, you can pass a copy of the candles into it, so that the copy is what goes into each call to the backtest() function.

@movy
Contributor Author

movy commented Apr 12, 2024

It's not a loop per se; rather, a testing function is distributed among many workers, and each worker gets access to the candles via an immutable central store.

def backtest_rungs(params):
    for candles_ref in candles_refs[overlap]:
        result = backtest(config=ray.get(config_ref),
                          routes=ray.get(routes_refs[params["tf"]]),
                          extra_routes=ray.get(extra_routes_refs[anchor_timeframe(params["tf"])]),
                          candles=ray.get(candles_ref),  # candles within backtest date range
                          generate_csv=False,
                          generate_json=True,
                          hyperparameters=params)

tune.Tuner(tune.with_resources(backtest_rungs, tune.PlacementGroupFactory([{"CPU": 0.5}])),
           param_space={**prefix_hyperparams[strategy_prefix]},
           tune_config=tune_config,
           run_config=run_config
           ).fit()

I cannot remember whether I tried copying the candles received by reference and running the backtest() function with that copy. I guess it would work, but right now I don't have time to test that approach. I think this PR can be closed for now, and if someone else faces this problem, they can refer to this discussion for the solution.
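For anyone hitting this on an unpatched Jesse, the caller-side workaround discussed above would be to copy right after fetching from the object store, e.g. `candles=ray.get(candles_ref).copy()`. A sketch of why that works, simulating the store's read-only view with NumPy alone (no Ray dependency; `get_from_store` is a stand-in for `ray.get`):

```python
import numpy as np

def get_from_store(ref: np.ndarray) -> np.ndarray:
    # Stand-in for ray.get(): returns a read-only view, the way Ray's
    # zero-copy object store exposes shared numpy arrays to workers.
    view = ref.view()
    view.setflags(write=False)
    return view

store = np.ones((100, 6))               # shared candles living in the "store"
candles = get_from_store(store).copy()  # per-worker writable copy
candles[:10] = 0.0                      # mutation no longer touches the store
```

Each worker pays for one extra copy of the candles, but the shared array stays intact for the other workers.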

@saleh-mir
Member

I agree.

@saleh-mir saleh-mir closed this Apr 13, 2024