- 2628 Views
- 3 replies
- 0 kudos
Visualization from Python dataframe?
I notice it is very easy to get visualizations from SQL inside Databricks. Say you run a SQL query that gives you a table; you can then easily use that table to produce plots. How about in Python, when we ...
Yes! You can easily visualize a Python DataFrame in Databricks using display(df). This works like SQL visualizations, offering built-in charts. For more customization, Matplotlib, Seaborn, or Plotly can be used. Would love to see even more native sup...
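A minimal notebook cell illustrating the answer. Note this is Databricks-specific: `display` and the `spark` session are provided by the Databricks runtime, and the DataFrame contents here are made up for illustration.

```python
# Databricks notebook cell -- `display` and `spark` come from the
# Databricks runtime, so this will not run in a plain Python interpreter.
df = spark.createDataFrame(
    [("2024-01", 120), ("2024-02", 150), ("2024-03", 90)],
    ["month", "orders"],
)

# Renders the same interactive table/chart widget that SQL cells get;
# pick a chart type from the chart menu under the result.
display(df)
```

For anything `display` can't do, convert with `df.toPandas()` and hand the result to Matplotlib, Seaborn, or Plotly.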
- 13616 Views
- 5 replies
- 3 kudos
Resolved! Unable to use CX_Oracle library in notebook
While using the cx_oracle Python library, it returns the error below: error message: Cannot locate a 64-bit Oracle Client library: "libclntsh.so: cannot open shared object file: No such file or directory". The cx_oracle library is dependent on native...
Hi @AshvinManoj I used your script and still get the same error.

```shell
sudo echo 'LD_LIBRARY_PATH="/dbfs/databricks/instantclient_23_6"' >> /databricks/spark/conf/spark-env.sh
sudo echo 'ORACLE_HOME="/dbfs/databricks/instantclient_23_6"' >> /databricks/spark...
```
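As an alternative to exporting `LD_LIBRARY_PATH` in spark-env.sh, cx_Oracle 8+ can be pointed at the Instant Client directory directly from Python. This is a sketch, not a verified fix for the thread: the Instant Client path matches the one used above, while the credentials and DSN are placeholders.

```python
import cx_Oracle

# Point the driver at the unpacked Instant Client libraries explicitly
# (cx_Oracle 8+); the path below is the one from the thread -- adjust it
# to wherever you unzipped the client.
cx_Oracle.init_oracle_client(lib_dir="/dbfs/databricks/instantclient_23_6")

# Placeholder credentials and DSN -- replace with your own.
conn = cx_Oracle.connect(user="myuser", password="mypassword",
                         dsn="dbhost.example.com/orclpdb1")
print(conn.version)
conn.close()
```

This only works if the Instant Client files are readable from the driver node; if `libclntsh.so` still cannot be found, check that the unzipped directory actually contains the shared libraries rather than a nested folder.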
- 6177 Views
- 5 replies
- 5 kudos
Resolved! Recommended ETL workflow for weekly ingestion of .sql.tz "database dumps" from Blob Storage into Unity Catalogue-enabled Metastore
The client receives data from a third party as weekly "data dumps" of a MySQL database copied into an Azure Blob Storage account container (I suspect this is done manually; I also suspect the changes between the approx. 7 GB files are very small). I nee...
Hi @Sylvia VB, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers you...
- 1113 Views
- 1 replies
- 0 kudos
From an expert: tl;dr is that we want to support the critical mass of common charts out of the box -- a user shouldn't have to import a new library to use something super common, and should be able to benefit from Databricks-provided capabilities. That being said, we know...
- 2066 Views
- 1 replies
- 0 kudos
Unable to install kneed library in cluster with DBR version 5.5 LTS
I have an issue installing and using the kneed Python library: https://pypi.org/project/kneed/ I can install it and verify it from the log.

[Install command] %sh pip install kneed

[Log] Installing collected packages: kneed Successfully installed kneed-0...
The kneed library has dependencies, and we need to install them as well in order for it to work: numpy==1.18, scipy==1.1.0, scikit-learn==0.21.3. Once we install the above libraries using the GUI, we can run the below command to check the installed library with the cor...
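kneed itself does the heavy lifting, but the core idea is small enough to sketch in pure Python: the knee/elbow is the point farthest from the straight line joining the curve's endpoints. This is an illustration of the concept, not kneed's exact algorithm, and `find_knee` is a hypothetical helper written for this sketch.

```python
import math

def find_knee(xs, ys):
    """Return the x whose point is farthest from the chord joining the
    first and last points of the curve (the basic idea behind kneed)."""
    x0, y0 = xs[0], ys[0]
    x1, y1 = xs[-1], ys[-1]
    denom = math.hypot(x1 - x0, y1 - y0)

    def dist(x, y):
        # Perpendicular distance from (x, y) to the chord.
        return abs((y1 - y0) * x - (x1 - x0) * y + x1 * y0 - y1 * x0) / denom

    return max(zip(xs, ys), key=lambda p: dist(*p))[0]

# A curve that climbs gently until x = 5, then steeply: the elbow is at 5.
xs = list(range(1, 11))
ys = [1, 2, 3, 4, 5, 10, 15, 20, 25, 30]
print(find_knee(xs, ys))  # prints 5
```

In practice you would use kneed's `KneeLocator` instead, which adds normalization, smoothing, and curve-direction handling on top of this idea.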
- 3557 Views
- 1 replies
- 0 kudos
Unable to construct the SQL URL because the password contains special characters.
While using SQLAlchemy, unable to connect to the SQL server from Databricks:

```python
user = 'user@host.mysql.database.azure.com'
password = 'P@test'
host = "host.mysql.database.azure.com"
database = "db"
connect_args = {'ssl': {'fake_flag_to_enable_tls': True}}
c...
```
We can use urllib.parse to handle special characters. Here is an example:

```python
import urllib.parse

user = 'user@host.mysql.database.azure.com'
password = urllib.parse.quote_plus("P@test")
host = "host.mysql.database.azure.com"
database = "db"
connect_args = {'...
```
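To see what `quote_plus` actually does to these values, here is a self-contained check using the dummy credentials from the thread. Note the username contains an `@` too, so it needs the same treatment; the `mysql+pymysql` dialect in the URL is an assumption, not something the thread confirms.

```python
from urllib.parse import quote_plus

# Dummy credentials from the thread; the '@' in either value would be
# misread as the user/host separator unless percent-encoded.
user = quote_plus("user@host.mysql.database.azure.com")
password = quote_plus("P@test")

print(password)  # prints P%40test

# Assembling a SQLAlchemy-style URL with the escaped values (the
# mysql+pymysql dialect is an assumption -- use whichever driver your
# connect_args target).
url = f"mysql+pymysql://{user}:{password}@host.mysql.database.azure.com/db"
print(url)
```

Escaping only the password is the common fix, but any URL component taken from user input (username, database name) is safer passed through `quote_plus` as well.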