Posts

Open SSL AES-256 Encryption / Decryption Command Line and Java Code using bouncycastle

  OpenSSL-encrypted files begin with an 8-byte signature, the ASCII characters "Salted__", followed by an 8-byte salt. After the salt comes the encrypted data. The salt and password are combined in a particular way to derive the encryption key and initialization vector. No information about which encryption cipher was used is stored in the file; to decrypt the file, the cipher must be known by external means, or guessed. (Obviously, the same goes for the password.) The above is the old, deprecated key-derivation mechanism of OpenSSL, which is why one often sees the following warning when running OpenSSL commands:

*** WARNING : deprecated key derivation used. Using -iter or -pbkdf2 would be better.

Also note that the default message digest for OpenSSL has been changed from md5 to sha-256, so one may fail to decrypt a file that was encrypted with a different version of OpenSSL. Also, refer - https://stackoverflow.com/qu
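A minimal sketch of the modern key derivation described above, assuming OpenSSL 1.1.1+ (which supports -pbkdf2/-iter); the file names, iteration count, and password here are placeholders:

```shell
# Encrypt with AES-256-CBC using PBKDF2 key derivation, which avoids the
# deprecation warning. File names and password are illustrative only.
echo 'hello world' > plain.txt
openssl enc -aes-256-cbc -salt -pbkdf2 -iter 100000 \
    -in plain.txt -out cipher.bin -pass pass:MySecret

# The output begins with the 8-byte "Salted__" magic, then the 8-byte salt.
head -c 8 cipher.bin    # prints: Salted__

# Decrypt: the cipher, KDF, and iteration count must match the encryption side.
openssl enc -d -aes-256-cbc -pbkdf2 -iter 100000 \
    -in cipher.bin -out plain.out -pass pass:MySecret
```

Because the file records neither the cipher nor the KDF parameters, both sides of this exchange must agree on them out of band.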

Microsoft Azure - Get Bearer token and access KeyVault secret using CURL command

  First, you must have the following to call https://login.microsoftonline.com and get a Bearer token:

tenant_id / subscription_id
client_id
client_secret

Then run the following command to get a Bearer token authorizing access to the resource https://vault.azure.net:

curl -X POST \
  https://login.microsoftonline.com/{tenant_id}/oauth2/token \
  -H 'cache-control: no-cache' \
  -H 'content-type: application/x-www-form-urlencoded' \
  -d 'grant_type=client_credentials&client_id={client_id}&client_secret={client_secret}&resource=https://vault.azure.net'

Note: replace {*} with the appropriate values. This will return a JSON response like the one below:

{"token_type":"Bearer","expires_in":"3599","ext_expires_in":"3599","expires_on":"1677278006","not_before":"1677274106","resource":"https://vault.azure.net", "access_token":"eyJ0
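Once the token is in hand, the Key Vault REST API can be called with it. A sketch, assuming a hypothetical vault named "myvault" and secret named "mysecret" (replace with your own), using Key Vault REST api-version 7.3:

```shell
# Paste the access_token value from the JSON response above.
ACCESS_TOKEN="<paste access_token here>"
VAULT="myvault"       # hypothetical vault name
SECRET="mysecret"     # hypothetical secret name
URL="https://${VAULT}.vault.azure.net/secrets/${SECRET}?api-version=7.3"

# The actual call; requires network access and a real token.
# The JSON response contains the secret in its "value" field.
curl -s "$URL" -H "Authorization: Bearer ${ACCESS_TOKEN}" || true
```

Note that the resource used when requesting the token (https://vault.azure.net) must match the service being called, or Key Vault will reject the token.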

Azure HDInsights - Sudden Spark Job Failure & Exit - ERROR sender.RawSocketSender: org.fluentd.logger.sender.RawSocketSender

  We observed that a Spark job running long on Azure HDInsights suddenly exited without any error. However, we found the following errors in the logs:

22/07/13 05:38:32 ERROR RawSocketSender [MdsLoggerSenderThread]: Log data 53245216 larger than remaining buffer size 10485760
22/07/13 05:59:54 ERROR sender.RawSocketSender: org.fluentd.logger.sender.RawSocketSender
java.net.ConnectException: Connection refused (Connection refused)
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
        at java.net.Socket.connect(Socket.java:607)
        at org.fluentd.logger.sender.RawSocketSender.connect(RawSocketSender.java:85)
        at org.flu

Spark - java.util.NoSuchElementException: next on empty iterator [SPARK-27514]

  Recently, we upgraded from HDP 3 to CDP 7, which involved upgrading Spark from 2.3 to 2.4. We recompiled and built our JAR with the new dependencies, but the code started failing with the following error:

23/02/09 16:47:44 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.util.NoSuchElementException: next on empty iterator
        at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)
        at scala.collection.Iterator$$anon$2.next(Iterator.scala:37)
        at scala.collection.IndexedSeqLike$Elements.next(IndexedSeqLike.scala:63)
        at scala.collection.IterableLike$class.head(IterableLike.scala:107)
        at scala.collection.mutable.ArrayBuffer.scala$collection$IndexedSeqOptimized$$super$head(ArrayBuffer.scala:48)
        at scala.collection.IndexedSeqOptimized$class.head(IndexedSeqOptimized.scala:126)
        at scala.collection.mutable.ArrayBuffer.head(ArrayBuffer.scala:48)
        at org.apache.spark.sql.catalyst

SSH “timed out waiting for input: auto-logout”

It means that if your SSH session has had no activity for the configured time, the session is disconnected. The timeout value can be checked by echoing $TMOUT:

~]$ echo $TMOUT
900

For Linux Bash, the environment variable TMOUT is usually set either at the user level (.bashrc or .bash_profile) or at the system level (/etc/profile) to implement this security measure.
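The check above can be reproduced directly; the 900 here (15 minutes) matches the value shown and is only an example:

```shell
# TMOUT holds the idle timeout in seconds; Bash terminates an
# interactive session that stays idle longer than this.
export TMOUT=900
echo "$TMOUT"

# To find out where it is being set, search the usual startup files
# (-s suppresses errors for files that do not exist on this machine).
grep -s TMOUT /etc/profile ~/.bashrc ~/.bash_profile || true
```

Note that administrators often declare TMOUT readonly in /etc/profile, in which case an ordinary user cannot unset or lower it in their own shell.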