File-based keystores, such as JKS and PKCS#12, are not yet supported in FIPS mode, we need to support this. #37

Open
sshuklao opened this issue Oct 20, 2022 · 10 comments
Labels
enhancement New feature or request

Comments

@sshuklao

File-based keystores, such as JKS and PKCS#12, are not yet supported in FIPS mode. Can this support be added to the Semeru runtimes?

@sshuklao sshuklao changed the title File-based keystores, such as JKS and PKCS#12, are not yet supported in FIPS mode File-based keystores, such as JKS and PKCS#12, are not yet supported in FIPS mode, we need to support this. Oct 20, 2022
@patilca

patilca commented Oct 20, 2022

@sshuklao This seems like a duplicate of https://github.ibm.com/runtimes/semeru-requests/issues/12

@patilca

patilca commented Oct 20, 2022

Please close if you agree

@mstoodle
Member

That's a link to an internal IBM request, @patilca. I don't think we should close this public issue as a duplicate of an IBM-internal request.

@sshuklao we are looking into it. JKS is not compatible with FIPS, so there's no hope there. But PKCS#12 is something we're investigating now. It won't be in the October update release, but we're hoping to know more for the January update (it may even be available at that point, who knows?).
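
In the meantime, for anyone who only has a JKS keystore today, one possible path is to convert it to PKCS#12 ahead of time on a non-FIPS JVM. This is only a rough sketch, not something we ship or test; the file names, alias handling, and "changeit" password below are purely illustrative:

    import java.io.FileInputStream;
    import java.io.FileOutputStream;
    import java.security.KeyStore;
    import java.util.Enumeration;

    public class JksToPkcs12 {
        public static void main(String[] args) throws Exception {
            char[] password = "changeit".toCharArray(); // illustrative; use your real store/key password

            // Load the legacy JKS keystore. Do this on a non-FIPS JVM, since the
            // JKS KeyStore implementation is not usable once FIPS mode is enabled.
            KeyStore jks = KeyStore.getInstance("JKS");
            try (FileInputStream in = new FileInputStream("keystore.jks")) {
                jks.load(in, password);
            }

            // Create an empty PKCS#12 keystore and copy every entry across.
            KeyStore p12 = KeyStore.getInstance("PKCS12");
            p12.load(null, null);
            for (Enumeration<String> aliases = jks.aliases(); aliases.hasMoreElements(); ) {
                String alias = aliases.nextElement();
                if (jks.isKeyEntry(alias)) {
                    p12.setKeyEntry(alias, jks.getKey(alias, password), password,
                            jks.getCertificateChain(alias));
                } else {
                    p12.setCertificateEntry(alias, jks.getCertificate(alias));
                }
            }

            // Write out the converted PKCS#12 keystore.
            try (FileOutputStream out = new FileOutputStream("keystore.p12")) {
                p12.store(out, password);
            }
        }
    }

The same conversion can also be done from the command line with keytool -importkeystore -srckeystore keystore.jks -srcstoretype JKS -destkeystore keystore.p12 -deststoretype PKCS12.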

@sshuklao
Author

sshuklao commented Nov 7, 2022

@mstoodle when can we expect this change to be available? We are looking for this fix before January 2023; is that possible?

@mstoodle
Member

mstoodle commented Nov 7, 2022

The earliest possible target would be our January update release, which I would expect to ship in perhaps the first week or so of February. It could be earlier, but I can't promise anything.

I also want to stress that this feature is still under development: while we're hopeful it will make it into the January release, we don't yet know for certain that it will. If it is targeted for that release, there should be milestone builds in January that you can test with.

@aprenaud aprenaud added the enhancement New feature or request label Feb 7, 2023
@sshuklao
Author

@mstoodle is there any update on this? We need this support because, according to this documentation, Spark SSL encryption for intra-cluster communication between the Spark master and workers supports only JKS or PKCS#12 keystores.

@sshuklao
Author

When we use a PKCS#12-based keystore in the Spark configuration, after adding this keystore to the NSS database, we get the error below.

semerufips: FIPS mode properties loaded
semerufips: {jdk.jar.disabledAlgorithms=MD2, MD5, RSA keySize < 1024, DSA keySize < 1024, SHA1 denyAfter 2019-01-01, include jdk.disabled.namedCurves, policy.provider=sun.security.provider.PolicyFile, policy.url.1=file:${java.home}/conf/security/java.policy, jdk.security.legacyAlgorithms=SHA1, RSA keySize < 2048, DSA keySize < 2048, securerandom.source=file:/dev/random, policy.url.2=file:${user.home}/.java.policy, jdk.disabled.namedCurves=secp112r1, secp112r2, secp128r1, secp128r2, secp160k1, secp160r1, secp160r2, secp192k1, secp192r1, secp224k1, secp224r1, secp256k1, sect113r1, sect113r2, sect131r1, sect131r2, sect163k1, sect163r1, sect163r2, sect193r1, sect193r2, sect233k1, sect233r1, sect239k1, sect283k1, sect283r1, sect409k1, sect409r1, sect571k1, sect571r1, X9.62 c2tnb191v1, X9.62 c2tnb191v2, X9.62 c2tnb191v3, X9.62 c2tnb239v1, X9.62 c2tnb239v2, X9.62 c2tnb239v3, X9.62 c2tnb359v1, X9.62 c2tnb431r1, X9.62 prime192v2, X9.62 prime192v3, X9.62 prime239v1, X9.62 prime239v2, X9.62 prime239v3, brainpoolP256r1, brainpoolP320r1, brainpoolP384r1, brainpoolP512r1, crypto.policy=unlimited, jdk.certpath.disabledAlgorithms=MD2, MD5, SHA1 jdkCA & usage TLSServer, RSA keySize < 1024, DSA keySize < 1024, EC keySize < 224, SHA1 usage SignedJAR & denyAfter 2019-01-01, include jdk.disabled.namedCurves, jceks.key.serialFilter=java.base/java.lang.Enum;java.base/java.security.KeyRep;java.base/java.security.KeyRep$Type;java.base/javax.crypto.spec.SecretKeySpec;!*, jdk.tls.disabledAlgorithms=SSLv3, RC4, DES, MD5withRSA, DH keySize < 1024, EC keySize < 224, 3DES_EDE_CBC, anon, NULL, include jdk.disabled.namedCurves, X25519, X448, SSLv3, TLSv1, TLSv1.1, TLS_CHACHA20_POLY1305_SHA256, TLS_DHE_RSA_WITH_AES_256_GCM_SHA384, TLS_DHE_DSS_WITH_AES_256_GCM_SHA384, TLS_DHE_RSA_WITH_AES_128_GCM_SHA256, TLS_DHE_DSS_WITH_AES_128_GCM_SHA256, TLS_DHE_RSA_WITH_AES_256_CBC_SHA256, TLS_DHE_DSS_WITH_AES_256_CBC_SHA256, TLS_DHE_RSA_WITH_AES_128_CBC_SHA256, TLS_DHE_DSS_WITH_AES_128_CBC_SHA256, TLS_DHE_RSA_WITH_AES_256_CBC_SHA, TLS_DHE_DSS_WITH_AES_256_CBC_SHA, TLS_DHE_RSA_WITH_AES_128_CBC_SHA, TLS_DHE_DSS_WITH_AES_128_CBC_SHA, TLS_RSA_WITH_AES_256_GCM_SHA384, TLS_RSA_WITH_AES_128_GCM_SHA256, TLS_RSA_WITH_AES_256_CBC_SHA256, TLS_RSA_WITH_AES_128_CBC_SHA256, TLS_RSA_WITH_AES_256_CBC_SHA, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_AES_256_GCM_SHA384, TLS_AES_128_GCM_SHA256, TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDH_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDH_RSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256, TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256, TLS_DHE_RSA_WITH_CHACHA20_POLY1305_SHA256, TLS_EMPTY_RENEGOTIATION_INFO_SCSV, policy.ignoreIdentityScope=false, login.configuration.provider=sun.security.provider.ConfigFile, keystore.type.compat=true, security.overridePropertiesFile=true, jdk.tls.legacyAlgorithms=K_NULL, C_NULL, M_NULL, DH_anon, ECDH_anon, RC4_128, RC4_40, DES_CBC, DES40_CBC, 3DES_EDE_CBC, jdk.sasl.disabledMechanisms=, jdk.security.caDistrustPolicies=SYMANTEC_TLS, sun.security.krb5.maxReferrals=5, jdk.tls.keyLimits=AES/GCM/NoPadding KeyUpdate 2^37, security.provider.1=SunPKCS11 ${java.home}/conf/security/nss.fips.cfg, security.provider.2=SUN, security.provider.3=SunEC, networkaddress.cache.negative.ttl=10, jdk.tls.alpnCharset=ISO_8859_1, security.provider.4=SunJSSE, ssl.KeyManagerFactory.algorithm=SunX509, 
jdk.xml.dsig.secureValidationPolicy=disallowAlg http://www.w3.org/TR/1999/REC-xslt-19991116,disallowAlg http://www.w3.org/2001/04/xmldsig-more#rsa-md5,disallowAlg http://www.w3.org/2001/04/xmldsig-more#hmac-md5,disallowAlg http://www.w3.org/2001/04/xmldsig-more#md5,maxTransforms 5,maxReferences 30,disallowReferenceUriSchemes file http https,minKeySize RSA 1024,minKeySize DSA 1024,minKeySize EC 224,noDuplicateIds,noRetrievalMethodLoops, securerandom.drbg.config=, sun.security.krb5.disableReferrals=false, ssl.TrustManagerFactory.algorithm=PKIX, keystore.type=PKCS11, policy.allowSystemProperty=true, jdk.io.permissionsUseCanonicalPath=false, securerandom.strongAlgorithms=NativePRNGBlocking:SUN,DRBG:SUN, policy.expandProperties=true, package.access=sun.misc.,sun.reflect., package.definition=sun.misc.,sun.reflect., krb5.kdc.bad.policy=tryLast}
23/02/20 03:54:58 INFO SparkContext: Running Spark version 3.3.1
23/02/20 03:54:58 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
23/02/20 03:54:58 INFO ResourceUtils: ==============================================================
23/02/20 03:54:58 INFO ResourceUtils: No custom resources configured for spark.driver.
23/02/20 03:54:58 INFO ResourceUtils: ==============================================================
23/02/20 03:54:58 INFO SparkContext: Submitted application: python3.10
23/02/20 03:54:58 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 4096, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
23/02/20 03:54:58 INFO ResourceProfile: Limiting resource is cpus at 1 tasks per executor
23/02/20 03:54:58 INFO ResourceProfileManager: Added ResourceProfile id: 0
23/02/20 03:54:58 INFO SecurityManager: Changing view acls to: 1000670000
23/02/20 03:54:58 INFO SecurityManager: Changing modify acls to: 1000670000
23/02/20 03:54:58 INFO SecurityManager: Changing view acls groups to: 
23/02/20 03:54:58 INFO SecurityManager: Changing modify acls groups to: 
23/02/20 03:54:58 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(1000670000); groups with view permissions: Set(); users  with modify permissions: Set(1000670000); groups with modify permissions: Set()
23/02/20 03:54:59 INFO Utils: Successfully started service 'sparkDriver' on port 34645.
23/02/20 03:54:59 INFO SparkEnv: Registering MapOutputTracker
23/02/20 03:54:59 INFO SparkEnv: Registering BlockManagerMaster
23/02/20 03:54:59 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
23/02/20 03:54:59 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
23/02/20 03:54:59 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
23/02/20 03:54:59 INFO DiskBlockManager: Created local directory at /tmp/spark/scratch/blockmgr-0cfe20af-fdf4-4dd9-9102-bfeff446de79
23/02/20 03:54:59 INFO MemoryStore: MemoryStore started with capacity 2.2 GiB
23/02/20 03:54:59 INFO SparkEnv: Registering OutputCommitCoordinator
23/02/20 03:54:59 INFO log: Logging initialized @5767ms to org.sparkproject.jetty.util.log.Slf4jLog
23/02/20 03:54:59 INFO Server: jetty-9.4.48.v20220622; built: 2022-06-21T20:42:25.880Z; git: 6b67c5719d1f4371b33655ff2d047d24e171e49a; jvm 11.0.17+8
23/02/20 03:54:59 INFO Server: Started @5937ms
23/02/20 03:54:59 ERROR SparkUI: Failed to bind SparkUI
java.io.IOException: parseAlgParameters failed: PBE AlgorithmParameters not available
	at sun.security.pkcs12.PKCS12KeyStore.parseAlgParameters(PKCS12KeyStore.java:839) ~[?:?]
	at sun.security.pkcs12.PKCS12KeyStore.engineLoad(PKCS12KeyStore.java:2074) ~[?:?]
	at java.security.KeyStore.load(KeyStore.java:1479) ~[?:?]
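
For reference, the failure also reproduces outside of Spark with a few lines of plain Java on the same FIPS-enabled runtime. This is only a hypothetical standalone repro sketch (the class name is made up; the keystore type, path, and password are the ones from our Spark configuration below):

    import java.io.FileInputStream;
    import java.security.KeyStore;

    public class Pkcs12FipsRepro {
        public static void main(String[] args) throws Exception {
            // Same keystore type, path, and password that Spark is configured with.
            KeyStore ks = KeyStore.getInstance("pkcs12");
            try (FileInputStream in = new FileInputStream("/home/spark/key.p12")) {
                // Expected to fail here in FIPS mode with the same exception as in the Spark log above:
                //   java.io.IOException: parseAlgParameters failed: PBE AlgorithmParameters not available
                ks.load(in, "changeit".toCharArray());
            }
            System.out.println("Loaded " + ks.size() + " entries");
        }
    }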

This is the configuration we set for Spark:

        "spark.ssl.enabled":"true",
        "spark.ssl.keyStore":"/home/spark/key.p12",
        "spark.ssl.keyStorePassword":"changeit",
        "spark.ssl.keyStoreType":"pkcs12"

@sshuklao
Author

The same issue is reported here as well: https://access.redhat.com/solutions/6954451

@mstoodle
Member

My apologies, I should have provided an update here a few weeks ago but missed doing it. :(

The work needed to support PKCS#12 key stores in FIPS 140-2 mode did not make it into the January update release, but it will be in the April update.

@sshuklao
Author

@mstoodle any update on this?
