
Commit b7f24c2

Added optional DataChunkSize value to config.json.
DataChunkSize specifies that, when building data, files larger than this size (in MB) are compressed in chunks of this size, so the entire file never has to be loaded into memory at once. This fixes some Python memory errors reported when building data on low-memory computers.
joelgomes1994 committed Jan 24, 2021
1 parent 7a18686 · commit b7f24c2
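
The compression routine itself (compressDataFile) is not shown in this diff, so the following is only a minimal sketch of the chunked approach the commit message describes, assuming zlib streaming compression; compressToStream and its parameters are hypothetical names, with the default values standing in for COMPRESSION_LEVEL and FILE_MAX_SIZE from build_data.py.

import os
import zlib

def compressToStream(path, outFile, level=1, maxSize=1024 * 1024 * 32):
    # Hypothetical sketch: small files are compressed in one call; files larger
    # than maxSize are read and compressed in maxSize chunks, so the whole file
    # never has to be held in memory at once.
    compressor = zlib.compressobj(level)
    with open(path, "rb") as f:
        if os.path.getsize(path) <= maxSize:
            outFile.write(compressor.compress(f.read()))
        else:
            while True:
                chunk = f.read(maxSize)
                if not chunk:
                    break
                outFile.write(compressor.compress(chunk))
    outFile.write(compressor.flush())

With maxSize derived from DataChunkSize, peak memory per file stays near the chunk size rather than the full file size.
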
Showing 2 changed files with 7 additions and 2 deletions.
launcher/config.json (1 change: 1 addition, 0 deletions)
@@ -4,6 +4,7 @@
 "MainFile" : "Example Game.blend",
 "DataFile" : "./data.dat",
 "DataSource" : "./data",
+"DataChunkSize" : 32,
 "Persistent" : [
 "*.bgeconf",
 "*.sav",
source/tools/Common/helper_scripts/build_data.py (8 changes: 6 additions, 2 deletions)
@@ -12,6 +12,8 @@
 from math import ceil
 import common
 
+data = common.getData()
+
 COMPRESSION_LEVEL = 1
 FILE_MAX_SIZE = 1024 * 1024 * 32 # bytes > KB > MB
 ITEM_SEPARATOR = "\t"
@@ -50,9 +52,11 @@ def compressDataFile(sourcePath, dataFile):
 
     print("> Done! Time taken:", round(time() - startTime, 3), "seconds")
 
-data = common.getData()
-
 if data is not None:
+
+    if "DataChunkSize" in data.keys():
+        FILE_MAX_SIZE = 1024 * 1024 * data["DataChunkSize"]
+
     compressDataFile(
         data["CurPath"] / data["DataSource"],
         data["CurPath"] / data["DataFile"]
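
The override in build_data.py is just a unit conversion: DataChunkSize is given in megabytes in config.json, while FILE_MAX_SIZE is kept in bytes. A small standalone illustration, with a hypothetical dict standing in for the result of common.getData():

data = {"DataChunkSize": 32}  # hypothetical stand-in for common.getData()

FILE_MAX_SIZE = 1024 * 1024 * 32  # default threshold: 32 MB in bytes
if data is not None and "DataChunkSize" in data:
    FILE_MAX_SIZE = 1024 * 1024 * data["DataChunkSize"]

print(FILE_MAX_SIZE)  # 33554432 bytes for the 32 MB value shipped in config.json

The membership test "DataChunkSize" in data is equivalent to the data.keys() check used in the diff.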
