@Ehsan Ullah:
The error message you received indicates that the Spark JDBC driver failed to deserialize Arrow-encoded result data. This typically happens because the driver (via Apache Arrow) cannot use the `sun.misc.Unsafe` or `java.nio.DirectByteBuffer` internals, which JDK 9 and later restrict by default through the module system.
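If you are running on JDK 9 or later, one common workaround is to open the `java.nio` package to the Arrow code at JVM startup. This is a sketch, not a definitive fix: the jar name is a placeholder, and you should verify the exact flags against the Apache Arrow Java documentation for the Arrow version your driver bundles.

```shell
# Sketch, assuming a JDK 9+ client JVM; replace the jar with your actual client.
# --add-opens exposes java.nio.DirectByteBuffer internals to Arrow;
# the netty property permits the reflective access Arrow's allocator uses.
java --add-opens=java.base/java.nio=ALL-UNNAMED \
     -Dio.netty.tryReflectionSetAccessible=true \
     -jar your-jdbc-client.jar
```

If your client is a GUI tool rather than a plain `java` invocation, these flags usually go into its JVM options or `JAVA_OPTS`-style configuration instead.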
To resolve this issue, you may try the following steps:
- Check that you are using the correct version of the JDBC driver. You can download the latest version from the Databricks website.
- Verify that the required dependencies, such as a compatible Java Development Kit (JDK), are installed, and that their versions match what the driver expects.
- Check that your connection string is configured correctly, including the endpoint and credentials.
- If you are using a proxy server, make sure it is correctly configured and that the necessary ports are open.
- Try a different JDBC driver version or client tool to connect to your Databricks endpoint.
- If none of the above steps work, contact Databricks support for further assistance.
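As a further workaround, the Databricks JDBC driver exposes an `EnableArrow` connection property that, when set to `0`, falls back to non-Arrow result deserialization and sidesteps the error entirely (at some performance cost; check your driver version's documentation to confirm the property is supported). A minimal sketch of building such a connection URL, with placeholder host and path values you must replace:

```java
import java.util.Properties;

// Hypothetical sketch: assembling a Databricks JDBC URL with Arrow
// result serialization disabled via the EnableArrow property.
public class JdbcConfigSketch {
    static String buildUrl(String host, String httpPath) {
        // Placeholder values; substitute your workspace host and HTTP path.
        return "jdbc:databricks://" + host + ":443/default"
             + ";transportMode=http;ssl=1"
             + ";httpPath=" + httpPath
             + ";EnableArrow=0"; // fall back to non-Arrow deserialization
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("PWD", "<personal-access-token>"); // placeholder credential
        String url = buildUrl("<workspace-host>", "<http-path>");
        // Pass url and props to java.sql.DriverManager.getConnection(url, props)
        // once real values are filled in.
        System.out.println(url);
    }
}
```

Disabling Arrow is a diagnostic step as much as a fix: if the error disappears with `EnableArrow=0`, the JDK/Arrow access restriction above is almost certainly the root cause.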
I hope this helps! Let me know if you have any further questions.