Vetting of plugins #53
@dail8859 I do a short local test to check that each plugin starts up and behaves "normally". Obviously I can't test the complete functionality of each plugin, so to some extent I'm hoping the plugin authors are honest and don't do evil things. But it is not the case that I ran the downloads through something like https://www.virustotal.com. Do you have any ideas for this? And maybe also some automated tests which could be added to the AppVeyor run?
No concrete ideas. I was just curious whether anybody is checking the plugins that get put into the list, especially the ones that do not have source code available. I know open source is no guarantee that a plugin is safe, but at least one can inspect the source for any blatant security issues.
See e.g. https://www.virustotal.com/de/documentation/public-api/, which includes Python examples showing how to use it on the plugin DLL/ZIP binaries.
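The public v2 API mentioned above identifies files by hash and returns a JSON report. Here is a minimal Python sketch of the relevant pieces, assuming the v2 endpoint and field names from that documentation (`apikey`, `resource`, `positives`, `total`); the API key and the actual network call are left to the caller:

```python
import hashlib
import urllib.parse

# v2 file-report endpoint from the public API documentation linked above
VT_FILE_REPORT = "https://www.virustotal.com/vtapi/v2/file/report"

def sha256_of_file(path):
    """SHA-256 of a plugin zip; VirusTotal identifies files by this digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_report_url(resource, api_key):
    """GET URL for a file-report lookup; `resource` is a hash or scan id."""
    query = urllib.parse.urlencode({"apikey": api_key, "resource": resource})
    return VT_FILE_REPORT + "?" + query

def summarize(report):
    """Reduce a v2 JSON report to (positives, total); None if not scanned yet."""
    if report.get("response_code") != 1:
        return None  # unknown to VirusTotal, or scan still queued
    return report["positives"], report["total"]
```

Fetching `build_report_url(...)` with any HTTP client and passing the decoded JSON to `summarize` yields the detection ratio discussed later in the thread; note that the public API is heavily rate-limited, so lookups must be throttled.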
@donho @chcg Any thoughts on this? This virustotal.com looks interesting, but there is a Public and a Premium API, and the Public one is limited:
I don't know what the Premium API looks like or how to get it (how much it costs, or whether it is available some other way). Basically, what should this test look like? Running it every time, for all plugins? When adding a new plugin or a new version, wouldn't it be worth adding the report URL to the plugin list? It would lighten the process a bit and shift some of the work to the authors. Plugins are not too big, so it shouldn't be too much trouble for them. With a report ready, it is easier to analyze, and it gives at least a bit of security. I experimented with analyzing all the reports for the x86 plugins right in the browser (using a script). Unfortunately it can't be integrated with AppVeyor because of the re-captcha (I only got one, but it was there). For now it can be run locally if someone wants, at least until something better is invented. Result:
More details are in the npp_plugin_scan.txt file. The Safe-BAD result is not so bad, because only 1 scanner out of 84 (or 85) flags it.
Edit: There is also a link using the SHA-256 of the .zip file. Example for the 3P plugin: This package hash is unique for each version, even for the smallest change? Can a report for the .zip file hash be treated the same as a report for the .zip URL (assuming both have the same hash)? If so, it would simplify my analysis, because I could just use the URL with the .zip file hash (instead of trying to find the report for the .zip URL from the plugin list), assuming that it has always been generated (after adding a new plugin or changing the version). Currently I simply search for a report for a given URL from the plugin list (but these are extra steps).

If a report exists but is old, the scan can be repeated. I will write a script that repeats the scan for reports older than a specified time (e.g. a month). I will run it from time to time so the reports stay current with the latest signatures. Later I will experiment with this public API to see what the request limits look like in practice. Directly in the browser it is not that restrictive, but that is because of the re-captcha (it will probably pop up after too many requests too fast).
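The "repeat scans older than a month" idea above can be sketched as a small helper. Assumptions here are mine, not from the thread: the v2 report carries its scan time in a `scan_date` field formatted `%Y-%m-%d %H:%M:%S`, and 30 days is the staleness cutoff:

```python
from datetime import datetime, timedelta

def needs_rescan(report, now=None, max_age_days=30):
    """True if a report is missing or older than the cutoff (default: a month)."""
    if not report or "scan_date" not in report:
        return True  # never scanned: submit it for a first scan
    if now is None:
        now = datetime.utcnow()
    scanned = datetime.strptime(report["scan_date"], "%Y-%m-%d %H:%M:%S")
    return now - scanned > timedelta(days=max_age_days)
```

A periodic job could walk the stored reports, collect those for which `needs_rescan` is true, and resubmit only that subset, keeping within the public API's request limits.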
See gh action example: |
Yes, but there are these limits with the public API... A few things about this tool: it allows scanning URLs and files. Different scanners are used for these two actions, but both analyses are recommended. Regarding files, it can scan a .zip or the individual files inside it. Scanning the .zip itself requires less effort, but checking all of its files is more accurate (to reduce the time you can skip text, images, or other "usually safe" file types). I mention this because you can get different results from these 3 scanning variants: sometimes it detects something in 1 but not in 2 and 3, sometimes in 2 but not in 1 and 3, etc. Another question is how to interpret a potential detection. What would 1/80 or 2/80 mean? That the plugin is not safe? Here is a small piece of data which I have collected. I checked the 32- and 64-bit lists (test URL,
You can analyze the above files yourself. Just search for something by the label, e.g.
Checking everything every time (for each PR) is pointless (it takes a long time, and a rescan takes much, much longer). This should be done once, before generating a new list. It would also be good to publish such reports somewhere for a given version of the plugin list (when a new version is released). But even these results don't make it clear whether something is actually dangerous or not; one or two detections may simply be false positives.
Are the plugins added to this list vetted at all?
Notepad++ makes sure the list of plugins is signed, so it knows it is downloading the expected bytes, but are any of the DLL files etc. actually run through a virus checker or something?