Avoid crash in lexer by increasing size of token_stream early
The previous code assumed that any parsing step would add only one token at a time, which isn't true: a single step can emit several tokens. When that happened, `index` could jump past the exact size-doubling threshold (`index == token_stream.size()`) without ever matching it, so the stream was never resized and the next token was written out of bounds, crashing the lexer.
usiems authored and mrbean-bremen committed Nov 17, 2023
1 parent 9172e0f commit 4ae718a
Showing 1 changed file with 1 addition and 1 deletion.

generator/parser/lexer.cpp
@@ -142,7 +142,7 @@ void Lexer::tokenize(const char *contents, std::size_t size)
     line_table.current_line = 1;

     do {
-        if (index == token_stream.size())
+        if (index >= token_stream.size()-4) // increase size of token_stream early, in case more than one token is written in one go
             token_stream.resize(token_stream.size() * 2);

         Token *current_token = &token_stream[(int) index];
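
To make the failure mode concrete, here is a minimal, self-contained sketch; it is assumed for illustration and is not the project's actual scanning code. Only the names `token_stream` and `index` and the resize check mirror the diff; `scan_step` and the alternating token counts are hypothetical. With the old check, steps that alternate between one and two tokens drive `index` through ... 6, 7, 9 ..., so `index` never equals the size 8 and the write at slot 8 lands out of bounds; the fixed check grows the buffer while there are still four slots to spare.

    #include <cstddef>
    #include <vector>

    struct Token { int kind = 0; };

    // Hypothetical scan step (not from the real lexer): writes `count`
    // tokens starting at `index` and returns the new index. The real
    // lexer likewise emits more than one token in some iterations,
    // which is what broke the old resize check.
    static std::size_t scan_step(std::vector<Token> &stream,
                                 std::size_t index, std::size_t count) {
        for (std::size_t i = 0; i < count; ++i)
            stream[index + i].kind = 1; // out of bounds unless resized early
        return index + count;
    }

    int main() {
        std::vector<Token> token_stream(4);
        std::size_t index = 0;
        std::size_t step = 0;

        do {
            // Old check:  if (index == token_stream.size())
            // A two-token step can carry `index` from size()-1 past
            // size() without ever matching it, so no resize happens
            // and the second write runs past the end of the buffer.
            //
            // Fixed check from the commit: grow while still 4 slots
            // short of the end, so one step may safely write several
            // tokens in one go.
            if (index >= token_stream.size() - 4)
                token_stream.resize(token_stream.size() * 2);

            // Alternate between emitting 1 and 2 tokens per step.
            index = scan_step(token_stream, index, 1 + (step++ % 2));
        } while (index < 64);
    }

Note that the margin of 4 in the fixed check implicitly assumes no single step writes more than four tokens at once; a larger multi-token step would need a correspondingly larger margin.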
