dataset: cleanup datasets that hit the memcap while loading
Datasets that hit the memcap limit during loading need to be discarded;
otherwise the dataset remains loaded with partial data while the
signature that references it fails to load due to the memcap error.

Ticket: OISF#6678
norg committed Apr 16, 2024
1 parent ce1556c commit f175a78
Showing 2 changed files with 5 additions and 4 deletions.
5 changes: 5 additions & 0 deletions src/datasets.c
@@ -746,6 +746,11 @@ Dataset *DatasetGet(const char *name, enum DatasetTypes type, const char *save,
             break;
     }
 
+    if (set->hash && SC_ATOMIC_GET(set->hash->memcap_reached)) {
+        SCLogError("dataset too large for set memcap");
+        goto out_err;
+    }
+
     SCLogDebug("set %p/%s type %u save %s load %s",
             set, set->name, set->type, set->save, set->load);
4 changes: 0 additions & 4 deletions src/detect-dataset.c
@@ -406,10 +406,6 @@ int DetectDatasetSetup (DetectEngineCtx *de_ctx, Signature *s, const char *rawst
         SCLogError("failed to set up dataset '%s'.", name);
         return -1;
     }
-    if (set->hash && SC_ATOMIC_GET(set->hash->memcap_reached)) {
-        SCLogError("dataset too large for set memcap");
-        return -1;
-    }
 
     cd = SCCalloc(1, sizeof(DetectDatasetData));
     if (unlikely(cd == NULL))
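The commit moves the memcap check from signature setup (detect-dataset.c) into the dataset lookup/load path itself (datasets.c), so a set that exceeded its memory cap is cleaned up instead of lingering with partial data. A minimal standalone sketch of that pattern follows; all names (`HashTable`, `Dataset`, `hash_add`, `dataset_get`) are illustrative and are not Suricata's actual structures or APIs:

```c
#include <stdlib.h>
#include <stdbool.h>
#include <stdatomic.h>

/* Hypothetical stand-ins for a dataset backed by a hash table whose
 * loader flags when the configured memory cap was reached. */
typedef struct {
    atomic_bool memcap_reached;
    size_t memcap;
    size_t memuse;
} HashTable;

typedef struct {
    HashTable *hash;
} Dataset;

/* Account for an insertion; set the flag once the cap would be exceeded. */
static bool hash_add(HashTable *h, size_t entry_size) {
    if (h->memuse + entry_size > h->memcap) {
        atomic_store(&h->memcap_reached, true);
        return false;
    }
    h->memuse += entry_size;
    return true;
}

/* Analogue of the commit's change: after loading, check the flag inside
 * the "get" path and discard the partially loaded set rather than
 * returning it. */
Dataset *dataset_get(size_t memcap, const size_t *entries, size_t n) {
    Dataset *set = calloc(1, sizeof(*set));
    if (set == NULL)
        return NULL;
    set->hash = calloc(1, sizeof(*set->hash));
    if (set->hash == NULL) {
        free(set);
        return NULL;
    }
    set->hash->memcap = memcap;

    for (size_t i = 0; i < n; i++)
        hash_add(set->hash, entries[i]);

    if (atomic_load(&set->hash->memcap_reached)) {
        /* cleanup: do not leave a set with partial data behind */
        free(set->hash);
        free(set);
        return NULL;
    }
    return set;
}
```

Doing the check here rather than only in the detect setup means every caller gets a consistent result: either a fully loaded set or NULL, never a half-populated one.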
