sha256sum hashes #8
Please check if this is solved.
Thanks, this seems to work for me, i.e. the hashes for the complete JAR files of the susceptible 2.x versions are in hashes-pre-cve.txt. mubex has been referenced by lunasec.io's thread on how to detect the original exploit; the scstanton list also includes sha256 hashes for the 1.x versions. Apart from that, I do not know whether it would be better to check for the actually relevant JndiManager.class, which in turn imports the JndiLookup.class. I have seen e.g. log4j-finder check only for the class inside the JAR file. Regarding @whenselm's point 2: I do not know if this is really necessary. The script itself is executable together with hashes-pre-cve.txt or whatever other hashes file you supply. Though I do like log4j-finder's approach of including the hashes within the script itself, either via a here-doc or something similar, as a safe default that can be overridden by a hashes file argument.
Checked the source code: you can call the script with any of the above URLs as a parameter and it will download the hashes via wget/curl.
You could add a pull request that checks whether a local file exists and, if so, uses that instead.
Thanks for providing hashes for the class files themselves; I saw the mentioned JAR archive hashes before, but they did not work for me. 2. It's fine for me to download the JAR file from a source or a local file. As a next step I would like to check the files locally rather than on the production machine, by downloading them first via ssh. That way I avoid A) the CPU load for hash calculation on a virtual machine and B) the https/sftp pull requests from the machine under test, which will be blocked by the firewall. So I do not have to upload anything to remote machines under test.
…On Mon, Dec 20, 2021 at 8:01 PM Ruben Barkow-Kuder wrote:
Please check if this is solved
I do not understand how you would reduce the impact on the VMs you are about to analyse, since A) downloading the JAR files via a secure/encrypted protocol like ssh means ssh computes its own hashes for encryption and checksums anyway, and B) it is neither more performant nor less resource-intensive than SFTP and the like. So either you do it remotely with the hashes copied or included, or you may need to write a nice rsync script to pull all the relevant JAR files for local analysis.
I added a check whether the parameter is a URL; if not, it now uses a local file. See README.md.
@rubo77 thanks for the latest addition, you even added some of the JNDIContextSelector.class hashes to the prepackaged hash file. Thanks a lot. @whenselm Regarding 2.: I don't know if this is really feasible or useful; I doubt that calculating the hashes locally saves more than the overhead of transferring the classes and calculating the hashes remotely.
New issue: #21. The best would be if you could provide a pull request for the issue, because I won't have time for this.
I have one question: Why are the provided hashes for "JNDIContextSelector.class" and not for "JndiLookup.class"? |
@whenselm There are only two hashes for the JNDIContextSelector.class.
@stefan123t did you notice the commit where I added two more hashes?
A lot (roughly half) of the provided hashes are duplicates. They should be merged to improve scan speed (list the corresponding versions on the right side rather than keeping one line per version). I do not know how to commit a change ...
Just click the edit button at the top right here on GitHub when you view the hash file. It leads to a pull request.
Dear @whenselm, regarding the duplicate hashes: I agree that they should be merged into one unique line each. Is the second column used, e.g., for output? If yes, it should contain a list of the affected version numbers to display. Kind regards,
It will have no effect on speed, since the hash file is deduplicated anyway. I will merge the lines anyway ...
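The merging described above could be done in one pass, assuming the hash file holds one `<sha256>  <version>` pair per line (the exact column layout is an assumption):

```shell
#!/bin/sh
# Minimal sketch: keep each hash once and collect all matching versions,
# comma-separated, in the second column.
awk '{ if ($1 in seen) seen[$1] = seen[$1] "," $2; else seen[$1] = $2 }
     END { for (h in seen) print h, seen[h] }' hashes-pre-cve.txt
```

Note that awk's `for (h in seen)` iterates in unspecified order, so pipe the output through `sort` if a stable file is wanted.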
Nice work, but I am missing sha256sum hashes for the vulnerable JndiLookup.class. Can anybody check them in to this project? I did not find anything on the net (there are hashes for the log4j-core-xx.jar files available from the Apache repository, but those are pointless here when testing at the .class level).
2. You might not want to execute the hashing on the production machine/VM (CPU load), so I suggest adding some possibility to check on a local machine, accessing the webserver only via ssh and pulling the .jar files to test. This might also help where the webserver is blocked by a firewall for outgoing requests (either wget or curl).