Denial of service when parsing a big JSON number as Instant/ZonedDateTime/OffsetDateTime #2141
Comments
Possible solution (proposed by @h8) would be to use a bounding check for the … As an example, for …
@cowtowncoder it is a low-bandwidth DoS vulnerability and affects all products that use Jackson's …
Fix for issue where `100000000e1000000` is too large for a `BigDecimal` to be converted to a long, causing a CPU spike. Durations and Instants should only be as large as `Instant.MAX` in `BigDecimal` form.
I made a best guess: I am just throwing a parse exception if the value is out of bounds. I can clean up the error message, or open a pull request if it looks good. Not sure if this needs to be backported to other versions as well.
Jackson uses `BigDecimal` for deserialization of `java.time` instants and durations. The problem is that if the user sets a very big number in scientific notation (like `1e1000000000`), it takes forever to convert the `BigDecimal` to a `BigInteger` on the way to a long value. A test that reproduces the problem, and the resulting stack trace:

```java
@Test(timeout = 2000)
public void parseBigDecimal() {
    new BigDecimal("1e1000000000").longValue();
}
```

```
at java.math.BigInteger.squareToomCook3(BigInteger.java:2074)
at java.math.BigInteger.square(BigInteger.java:1899)
at java.math.BigInteger.squareToomCook3(BigInteger.java:2053)
at java.math.BigInteger.square(BigInteger.java:1899)
at java.math.BigInteger.squareToomCook3(BigInteger.java:2051)
at java.math.BigInteger.square(BigInteger.java:1899)
at java.math.BigInteger.squareToomCook3(BigInteger.java:2049)
at java.math.BigInteger.square(BigInteger.java:1899)
at java.math.BigInteger.squareToomCook3(BigInteger.java:2049)
at java.math.BigInteger.square(BigInteger.java:1899)
at java.math.BigInteger.squareToomCook3(BigInteger.java:2055)
at java.math.BigInteger.square(BigInteger.java:1899)
at java.math.BigInteger.squareToomCook3(BigInteger.java:2049)
at java.math.BigInteger.square(BigInteger.java:1899)
at java.math.BigInteger.pow(BigInteger.java:2306)
at java.math.BigDecimal.bigTenToThe(BigDecimal.java:3543)
at java.math.BigDecimal.bigMultiplyPowerTen(BigDecimal.java:3676)
at java.math.BigDecimal.setScale(BigDecimal.java:2445)
at java.math.BigDecimal.toBigInteger(BigDecimal.java:3025)
```

A fix would be to reject `BigDecimal` values outside of the `Instant` and `Duration` ranges.

See:
[1] FasterXML/jackson-databind#2141
[2] https://reddit.com/r/java/comments/9jyv58/lowbandwidth_dos_vulnerability_in_jacksons/
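For context, a minimal end-to-end reproduction through Jackson itself might look like the sketch below. It assumes `jackson-databind` plus the `jackson-datatype-jsr310` module on the classpath; the class name and output handling are illustrative, not part of the report.

```java
import java.time.Instant;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;

public class InstantDosRepro {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper()
                .registerModule(new JavaTimeModule());

        // A tiny payload with a huge exponent: parsing the token is cheap, but
        // converting the resulting BigDecimal to epoch seconds/nanos is not.
        String json = "1e1000000000";

        // On versions affected by this issue, this call spins inside
        // BigDecimal/BigInteger arithmetic (see the stack trace above).
        Instant value = mapper.readValue(json, Instant.class);
        System.out.println(value);
    }
}
```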
@abracadv8 would …
@plokhotnyuk Yes, I would appreciate help here. At first I was thinking a text-length limit might help, but then realized this is not likely to work well across formats: although it'd be possible to access numeric values as strings with JSON (and textual formats in general), it would cause issues for binary formats.
@GotoFinal - No, the check for whether the `BigDecimal` is too big to fit into an `Instant` or `Duration` likely needs to be done before `longValue()` is even performed. The seconds portion of `Instant.MAX` is a long, but its max value is slightly smaller than `Long.MAX_VALUE`. So if you know the value is not bigger than `Instant.MAX`, then you know it is definitely not larger than `Long.MAX_VALUE` and can safely be converted via `longValue()`. Also, for the `DecimalUtils` portion you described here, that extraction happens after `longValue()` is performed (and after I check to make sure it is no larger than `Instant.MAX`). The check needs to look something like this: https://github.com/abracadv8/jackson-modules-java8/blob/master/datetime/src/main/java/com/fasterxml/jackson/datatype/jsr310/deser/InstantDeserializer.java#L307-#L312 I can work on a pull request for one of those versions later tonight if it looks good. What about really long strings? Should we be wary of those as well?
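A minimal sketch of the kind of bound check described above, assuming it runs on the already-parsed `BigDecimal` before any `longValue()`/`toBigInteger()` call. Class and method names are illustrative, not the actual patch linked above.

```java
import java.math.BigDecimal;
import java.time.Instant;

// Illustrative guard: reject the BigDecimal before any longValue()/toBigInteger()
// call, using Instant's own bounds expressed as BigDecimals.
final class InstantBounds {
    // Instant.MAX.getEpochSecond() is slightly smaller than Long.MAX_VALUE, so
    // anything that passes this check is also safe to convert with longValue().
    private static final BigDecimal MAX_SECONDS =
            new BigDecimal(Instant.MAX.getEpochSecond());
    private static final BigDecimal MIN_SECONDS =
            new BigDecimal(Instant.MIN.getEpochSecond());

    static void checkInRange(BigDecimal seconds) {
        // compareTo() compares adjusted exponents before anything else, so it
        // stays cheap even for values like 1e1000000000.
        if (seconds.compareTo(MAX_SECONDS) > 0 || seconds.compareTo(MIN_SECONDS) < 0) {
            // BigDecimal.toString() keeps scientific notation, so the message
            // stays short even for huge exponents.
            throw new IllegalArgumentException("Value out of range for Instant: " + seconds);
        }
    }
}
```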
Guards against numbers causing CPU or OOM issues when deserializing large numbers into `Instant` or `Duration`, in either of two forms:
- scientific notation that is too large (e.g. `10000e100000`)
- a raw string representing a number that is too long
Pull request for 2.9: FasterXML/jackson-modules-java8#84. If it looks good, I'll work on 2.8.
This failure mode isn't tied to time; it's tied to conversion of … The root cause appears to be …
Are there other places in Jackson where decimals are converted to integers? I suspect those components are going to be equally susceptible to this problem.
Agreed -- anywhere that can accept a string to be converted to a `BigDecimal` where: …
@abracadv8 The DoS described here isn't caused by converting a string to …
Quick question: does this only affect …? @toddjonker Good point on the real root cause; this was my assumption. But given this, instead of guarding the textual notation length (which is a bit suboptimal in other ways), would it be possible to use the magnitude checking methods of …?
@cowtowncoder yes, parsing of …
Guards against numbers causing CPU or OOM issues when deserializing large exponential numbers into `Instant` or `Duration`:
- scientific notation that is too large (e.g. `10000e100000`)
Calling … After bound checking you can use …
I wasn't able to reproduce that issue. I don't think the parser is set up to handle negative exponents via the … When I try `1e1000000000`, it properly throws a parse exception from my check. When I try `1e-1000000000`, it throws an exception from … I added a test case to check, and a parse exception is being thrown, but not by me: FasterXML/jackson-modules-java8@6ac19f6
@abracadv8 sounds good, the test proves we don't need to worry about negative exponents. FYI, I didn't test through Jackson, only through instantiating the …
@plokhotnyuk Hmmh. This is odd since parsing (simple decoding) from textual base-10 into base-10 numbers like …
From my limited testing: … For example, … The tests and check in the pull request I made cover `100000e1000000`, but do not cover a really long string like `10000......0000`. I'm not sure if that case is reachable from the parser, but I did have a failing test case for it. I attempted to fix those as well, but folks said I was conflating issues, so I removed them. For `InstantDeserializer` and `DurationDeserializer`: …
@cowtowncoder Current implementations of … Below are results of benchmarks for different JSON parsers for Scala (including Jackson-module-scala), parametrized by the … To run them on your JDK: …
@plokhotnyuk First of all, thank you for the extensive research here. Second: crap. I wish translation only occurred at a later point, not during parsing, since that would have made our work here easier. So. It seems to me like we need to consider a two-part (at least) approach, if I understand the situation correctly. For scientific notation the problem is coercion from a very large magnitude … The trickier part would then be decoding large (long) decimal numbers. If we can limit this to non-scientific numbers that is probably good, as heuristics may be easier. Another good thing is that we will typically use … And finally... should some of this be configurable? I don't think we can do much for 2.9, but for Jackson 2.10 we may be able to actually allow a configurable handler for …
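One possible shape for the cheap magnitude check discussed above, using only `BigDecimal`'s bookkeeping methods (`precision()`/`scale()`) rather than any conversion. This is a sketch; the thresholds are illustrative and not what was ultimately merged.

```java
import java.math.BigDecimal;

final class MagnitudeCheck {
    // precision() - scale() is one more than the decimal exponent of the value,
    // and neither method expands the unscaled value. Instant.MAX's epoch second
    // has 17 digits, so anything with an exponent above ~20 can be rejected
    // without ever calling toBigInteger()/longValue(). A very large positive
    // scale (a huge negative exponent such as 1e-1000000000) is rejected too,
    // since rescaling it away is just as expensive.
    static boolean obviouslyOutOfRange(BigDecimal value) {
        long adjustedExponent = (long) value.precision() - value.scale();
        return adjustedExponent > 20 || value.scale() > 100;
    }
}
```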
@cowtowncoder Moreover, it seems that during parsing of any JSON object it is possible to DoS the Jackson parser by just adding a field with a big number. Here is a PR which initially reproduced it for the Play-JSON parser: https://github.com/plokhotnyuk/jsoniter-scala/pull/168/files Below are results of parametrized benchmarks where the … Steps to reproduce are the same as before, except for the names of the branch and benchmark: …
It gets better: there's also #2157. So I think the problem does need to be resolved at the streaming parser level. My main concern here is simply that: …
What does "big number" mean? Does it a value with a large magnitude, or JSON text with many characters? Those are two independent, and very different, concerns. This ticket is expressly about the former, and the latter should be handled separately so as not to confuse the situation further. IMO we should focus on the large-exponent case, and patch that problem ASAP. We can do so without changing any semantics or error cases. I submitted a PR last week with test coverage for the relevant edges, but it's not garnered any response. |
I posted a test case about "big number" on this page:
@wujimin That's a different failure mode than is described in this issue. Very long input text presents a well-known DoS vector for any library or application, and is generally mitigated by limiting the overall length of input documents. That's an entirely different problem than a very short input text that results in unbounded processing time, as is the case with this issue and the short text …
@toddjonker yes, the "big number" problem is tracked in #2157. I agree with you that it is better to limit the overall length.
Moved to: FasterXML/jackson-modules-java8#90
Release notes: https://www.dropwizard.io/1.3.8/docs/about/release-notes.html
- 1.3.6 fixes a DoS issue in Jackson: FasterXML/jackson-databind#2141
- 1.3.7 fixes incorrect reading of somaxconn on Linux: dropwizard/dropwizard#2430
- 1.3.8 upgrades Guava to fix a DoS (CVE-2018-10237)
It looks the same as: playframework/play-json#180
Reproduced by the following commit: plokhotnyuk/jsoniter-scala@0d53faf
The security bug is in `InstantDeserializer` and `DurationDeserializer` of the `jackson-datatype-jsr310` artifact.

W/A is to use custom serializers for all types that are parsed with `InstantDeserializer` and `DurationDeserializer`, by registering them after (or instead of) registration of the `JavaTimeModule` module.
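A hedged sketch of that workaround, using only public Jackson APIs. The bounds, cutoffs, and error handling are illustrative, and on versions of `jackson-datatype-jsr310` that include the fix this wrapper is unnecessary.

```java
import java.io.IOException;
import java.math.BigDecimal;
import java.time.Instant;

import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;

public class SafeJavaTime {

    // Minimal stand-in for the stock InstantDeserializer: accepts an ISO-8601
    // string or a (possibly fractional) epoch-seconds number, but rejects
    // numbers that cannot fit an Instant before any expensive conversion runs.
    static class BoundedInstantDeserializer extends JsonDeserializer<Instant> {
        // Upper bound is exclusive (Instant.MAX allows fractional seconds up to .999999999).
        private static final BigDecimal MAX_SECONDS =
                new BigDecimal(Instant.MAX.getEpochSecond() + 1L);
        private static final BigDecimal MIN_SECONDS =
                new BigDecimal(Instant.MIN.getEpochSecond());

        @Override
        public Instant deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
            JsonToken t = p.currentToken();
            if (t == JsonToken.VALUE_NUMBER_INT || t == JsonToken.VALUE_NUMBER_FLOAT) {
                BigDecimal seconds = p.getDecimalValue();
                // compareTo() compares adjusted exponents first, so this stays
                // cheap even for inputs like 1e1000000000.
                if (seconds.compareTo(MAX_SECONDS) >= 0 || seconds.compareTo(MIN_SECONDS) < 0) {
                    throw ctxt.weirdNumberException(seconds, Instant.class,
                            "timestamp out of range for Instant");
                }
                // Huge negative exponents (e.g. 1e-1000000000) pass the range check
                // but are just as expensive to rescale; the cutoff here is arbitrary.
                if (seconds.scale() > 18) {
                    throw ctxt.weirdNumberException(seconds, Instant.class,
                            "timestamp has too many decimal places");
                }
                long secs = seconds.longValue(); // safe: magnitude and scale are now bounded
                long nanos = seconds.subtract(new BigDecimal(secs)).movePointRight(9).longValue();
                return Instant.ofEpochSecond(secs, nanos);
            }
            // Defer everything else to the textual representation.
            return Instant.parse(p.getValueAsString());
        }
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper()
                .registerModule(new JavaTimeModule())
                // Registered after JavaTimeModule so this deserializer takes precedence.
                .registerModule(new SimpleModule()
                        .addDeserializer(Instant.class, new BoundedInstantDeserializer()));

        System.out.println(mapper.readValue("1538352000.5", Instant.class)); // normal timestamp
        mapper.readValue("1e1000000000", Instant.class); // now fails fast instead of spinning
    }
}
```

Registering the `SimpleModule` after `JavaTimeModule` makes the bounded deserializer take precedence for `Instant`; the same pattern applies to `Duration`, `OffsetDateTime`, and `ZonedDateTime`.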