Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.
Data + AI Summit 2024 - Data Warehousing, Analytics, and BI

Forum Posts

MadelynM
by Contributor II
  • 393 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 - Warehousing & Analytics recap as you use intelligent data warehousing to improve performance and increase your organization’s productivity with analytics, dashboards and insights.  Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
User16826992666
by Valued Contributor
  • 1056 Views
  • 1 reply
  • 0 kudos
Latest Reply
sean_owen
Honored Contributor II
  • 0 kudos

No. If you use %pip or %conda to attach a library, then it will only affect the execution of the notebook. A separate virtualenv is created for each notebook and its dependencies, even on a shared cluster. If you create a Library in the workspace and ...

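For illustration, a minimal sketch of a notebook-scoped install (the package and version are arbitrary examples); per the reply, it only affects the Python environment of the notebook that runs it:

%pip install kneed==0.7.0

# In a later cell: the package is importable in this notebook only; other
# notebooks attached to the same shared cluster are unaffected.
import kneed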
User16753724663
by Valued Contributor
  • 4639 Views
  • 1 reply
  • 0 kudos

Unable to use JDBC/ODBC URL with SQL Workbench

SQL Workbench is not able to connect to the cluster using a JDBC/ODBC connection and fails with the following error. I used the configuration provided by the cluster (jdbc:spark://<host>.cloud.databricks.com:443/default;transportMode=http;ssl=1;httpPath=sql/prot...

Latest Reply
User16753724663
Valued Contributor
  • 0 kudos

Since we are getting a 401 error, this is an authentication issue. We should use a personal access token (PAT) as the password: the username should be "token" and the password should be the PAT.

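As a sketch, the JDBC URL from the question with token authentication appended; the parameter names follow the Simba Spark driver convention, and the host, HTTP path, and token are placeholders:

jdbc:spark://<host>.cloud.databricks.com:443/default;transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>

In SQL Workbench the same values can instead be entered in the connection profile, with Username set to "token" and Password set to the personal access token.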
User16753724663
by Valued Contributor
  • 1515 Views
  • 1 reply
  • 0 kudos

Unable to install the kneed library on a cluster with DBR version 5.5 LTS

I have an issue installing and using the kneed Python library (https://pypi.org/project/kneed/). I can install it and check it from the log.
[Install command]
%sh
pip install kneed
[log]
Installing collected packages: kneed
Successfully installed kneed-0.7.0
but when I c...

Latest Reply
User16753724663
Valued Contributor
  • 0 kudos

The kneed library has dependencies, and we need to install them as well in order for it to work:
numpy==1.18
scipy==1.1.0
scikit-learn==0.21.3
Once we install the above libraries using the GUI, we can run the below command to check the installed library with the cor...

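The verification command in the reply is cut off above; a plausible sketch of checking the installed versions from a notebook cell, assuming kneed and the pinned dependencies were attached through the cluster Libraries UI as described:

# Check that the cluster-attached libraries resolved to the expected versions
# (numpy==1.18, scipy==1.1.0, scikit-learn==0.21.3, kneed==0.7.0).
import pkg_resources

for pkg in ["numpy", "scipy", "scikit-learn", "kneed"]:
    print(pkg, pkg_resources.get_distribution(pkg).version)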
User16753724663
by Valued Contributor
  • 2509 Views
  • 1 reply
  • 0 kudos

Unable to construct the SQL URL as the password contains special characters.

While using SQLAlchemy, unable to connect to the SQL server from Databricks:
user = 'user@host.mysql.database.azure.com'
password = 'P@test'
host = "host.mysql.database.azure.com"
database = "db"
connect_args = {'ssl': {'fake_flag_to_enable_tls': True}}
conn...

Latest Reply
User16753724663
Valued Contributor
  • 0 kudos

We can use urllib.parse to handle special characters. Here is an example:
import urllib.parse

user = 'user@host.mysql.database.azure.com'
password = urllib.parse.quote_plus("P@test")
host = "host.mysql.database.azure.com"
database = "db"
connect_args = {'...

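The example above is truncated; a self-contained sketch of the same approach, assuming a MySQL database reached through the pymysql driver. Quoting the '@'-style username as well is an addition beyond the reply, since an unencoded '@' in the username would also break the URL:

import urllib.parse
from sqlalchemy import create_engine, text

# Values mirror the hypothetical credentials from the thread.
user = urllib.parse.quote_plus('user@host.mysql.database.azure.com')
password = urllib.parse.quote_plus('P@test')  # '@' is encoded as '%40'
host = 'host.mysql.database.azure.com'
database = 'db'

# connect_args carried over from the question; pymysql accepts an 'ssl' dict.
connect_args = {'ssl': {'fake_flag_to_enable_tls': True}}

engine = create_engine(
    f'mysql+pymysql://{user}:{password}@{host}:3306/{database}',
    connect_args=connect_args,
)

with engine.connect() as conn:
    print(conn.execute(text('SELECT 1')).scalar())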
User16826992666
by Valued Contributor
  • 2035 Views
  • 1 reply
  • 0 kudos
Latest Reply
User16826992666
Valued Contributor
  • 0 kudos

SQL Analytics actually uses several layers of caching. Some documentation about the different layers can be found here in the documentation. There are two primary layers that users will experience. 1) The first is that the actual data results of spec...

User16826992666
by Valued Contributor
  • 1061 Views
  • 1 reply
  • 0 kudos
Latest Reply
User16826992666
Valued Contributor
  • 0 kudos

At this time user credential passthrough is not supported on SQL Endpoints. Instead, Databricks recommends using Table ACLs for data security.

User16826992666
by Valued Contributor
  • 1001 Views
  • 1 reply
  • 0 kudos

Can you run data manipulation queries in Databricks SQL?

I am wondering if you can run queries that manipulate the actual data from within the Databricks SQL environment, or is it the case that you can only query tables?

Latest Reply
User16826992666
Valued Contributor
  • 0 kudos

Yes, as long as proper permissions are in place, you can run data manipulation through the SQL interface.

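To make this concrete, here is a minimal sketch of issuing DML against a SQL warehouse using the databricks-sql-connector Python package (an alternative to typing the statements in the SQL editor); the hostname, HTTP path, token, and the demo.sales table are placeholders, not values from the thread:

from databricks import sql

# Placeholder connection details; in practice they come from the warehouse's
# Connection Details tab and a personal access token.
connection = sql.connect(
    server_hostname="<workspace-host>.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
)
cursor = connection.cursor()

# Any DML the caller's permissions allow can be run, e.g. against a hypothetical Delta table:
cursor.execute("UPDATE demo.sales SET status = 'shipped' WHERE order_id = 42")
cursor.execute("DELETE FROM demo.sales WHERE status = 'cancelled'")

cursor.close()
connection.close()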
Anonymous
by Not applicable
  • 1303 Views
  • 1 reply
  • 0 kudos
Latest Reply
User16783855117
Contributor II
  • 0 kudos

Hi! There are a few different types of caching that are supported in Databricks SQL, and you can see the cache retention policy for each of these different types of cache by starting here - https://docs.databricks.com/sql/admin/query-caching.html Que...

