Warehousing & Analytics

Forum Posts

dwfchu
by New Contributor
  • 82 Views
  • 1 reply
  • 0 kudos

SQL Warehouse external (to Databricks) access patterns and suggestions

Hi All! Has anyone encountered a situation where you need to set up read access to Unity Catalog tables for external consumers such as external data marts, dashboard tools, etc.? We are currently using Databricks to serve data to people in our organisation t...

Warehousing & Analytics
access
JDBC
permissions
Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @dwfchu, Setting up data access for Unity Catalog tables in Databricks involves several considerations. Let's explore your options and weigh their pros and cons. Personal Access Tokens (PATs): Pros: Quick and easy to set up. Developers can gen...

Carsten03
by New Contributor III
  • 58 Views
  • 1 reply
  • 0 kudos

Permission Error When Running DELETE FROM

Hi, I want to remove duplicate rows from a managed Delta table in my Unity Catalog. I use a query on a SQL warehouse similar to this: WITH cte AS ( SELECT id, ROW_NUMBER() OVER (PARTITION BY id,##,##,## ORDER BY ts) AS row_num FROM catalog.sch...
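The ROW_NUMBER() dedupe pattern in the post can be sketched end-to-end outside Databricks. Below is a minimal, self-contained stand-in using Python's sqlite3 (window functions need SQLite >= 3.25, which modern Python bundles); the table and column names are made up for illustration, and Delta tables have no rowid, so this shows the window-function logic rather than the exact Databricks statement.

```python
import sqlite3

# Sketch of the ROW_NUMBER() dedupe pattern, with sqlite3 as a stand-in
# engine. Table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, payload TEXT, ts INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(1, "a", 10), (1, "a", 20), (2, "b", 30)],  # (1, "a") is duplicated
)

# Number rows within each duplicate group by ts, then delete everything
# past the first row of each group.
conn.execute("""
    DELETE FROM events WHERE rowid IN (
        SELECT rowid FROM (
            SELECT rowid,
                   ROW_NUMBER() OVER (PARTITION BY id, payload ORDER BY ts) AS row_num
            FROM events
        ) WHERE row_num > 1
    )
""")
remaining = conn.execute("SELECT id, payload, ts FROM events ORDER BY id").fetchall()
print(remaining)  # [(1, 'a', 10), (2, 'b', 30)]
```

Since Delta has no rowid to target, the usual Databricks workaround is to overwrite the table with the deduplicated SELECT (as the reply below this post ends up doing) rather than DELETE individual duplicates.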

Latest Reply
Carsten03
New Contributor III
  • 0 kudos

I first tried using _metadata.row_index to delete the correct rows, but this also resulted in an error. My solution in the end was to use Spark and overwrite the table: table_name = "catalog.schema.table" df = spark.read.table(table_name) count_df = df....

Jennifer
by New Contributor III
  • 103 Views
  • 3 replies
  • 1 kudos

How do I write a DataFrame to S3 without the partition column name in the path

I am currently trying to write a DataFrame to S3 like df.write.partitionBy("col1","col2").mode("overwrite").format("json").save("s3a://my_bucket/"). The path becomes `s3a://my_bucket/col1=abc/col2=opq/`, but I want the path to be `s3a://my_bucket/abc/opq/`...

Latest Reply
Jennifer
New Contributor III
  • 1 kudos

The way I did it in the end was to write the files to DBFS first and then move them to S3, in order to have a customized path and file name. I could also avoid writing commit files to S3.
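The rename step in that write-then-move approach can be sketched in plain Python: after Spark writes Hive-style `col=value` directories to a staging location, strip the column-name prefix before moving the files on. The paths below are temporary stand-ins, not real DBFS or S3 locations.

```python
import os
import re
import tempfile

# Sketch: rewrite Hive-style partition paths (col1=abc/col2=opq) to bare
# values (abc/opq) before moving files to their final destination.
# A temporary directory stands in for the staging location (e.g. DBFS).
staging = tempfile.mkdtemp()
os.makedirs(os.path.join(staging, "col1=abc", "col2=opq"))
open(os.path.join(staging, "col1=abc", "col2=opq", "part-0000.json"), "w").close()

PARTITION = re.compile(r"^[^=/]+=")  # matches a leading "column_name=" segment

def strip_partition_names(root: str) -> None:
    """Rename each `col=value` directory under root to just `value`."""
    # Walk bottom-up so children are renamed before their parents move.
    for dirpath, dirnames, _ in os.walk(root, topdown=False):
        for d in dirnames:
            stripped = PARTITION.sub("", d)
            if stripped != d:
                os.rename(os.path.join(dirpath, d), os.path.join(dirpath, stripped))

strip_partition_names(staging)
print(sorted(os.listdir(staging)))  # ['abc']
```

On a real workspace the final hop to S3 would use dbutils.fs or an S3 client; this sketch only covers the path rewriting.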

2 More Replies
EWhitley
by New Contributor II
  • 126 Views
  • 1 reply
  • 1 kudos

Documentation on the "test" capabilities in a DAB?

I see there’s a “test” capability within a DAB, but I’d like to know more about how this should/could be used. Does anyone know of any documentation or examples which might provide insights into its intended use?

Latest Reply
AlliaKhosla
New Contributor III
  • 1 kudos

Hi @EWhitley, You can check whether an Asset Bundle configuration is valid by using the command: databricks bundle validate. If a JSON representation of the bundle configuration is returned, the validation succeed...

manish05485
by New Contributor II
  • 124 Views
  • 3 replies
  • 0 kudos

Issue setting up the metastore in GCP Databricks

While setting up the metastore in GCP Databricks, I added the bucket name and the service account permissions as well. Still, my catalog doesn't have a base root location, which prevents me from creating tables in my catalog. Root storage credential for metastore...

Latest Reply
Ayushi_Suthar
Valued Contributor II
  • 0 kudos

Hi @manish05485, good day! Error: "Root storage credential for metastore XXXXXX does not exist. Please contact your Databricks representative or consider updating the metastore with a valid storage". The error states that the data access configuration for ...

2 More Replies
Octavian1
by New Contributor III
  • 370 Views
  • 6 replies
  • 0 kudos

Resolved! api/2.0/sql/history/queries endpoint does not return query execution time

Hi, I cannot see the query execution time in the response to the "api/2.0/sql/history/queries" request. Basically, I get only the following fields: {"next_page_token":...,"has_next_page":...,"res":[  {     "query_id":...,     "status":..,     "query_tex...
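For what it's worth, when the response does carry per-query timestamps, the execution time can be derived client-side. The field names below follow the Query History API docs, but treat them as assumptions to verify against your own response; the payload is a hand-made stand-in.

```python
# Sketch: deriving execution time from a Query History API response.
# The payload is hypothetical; field names should be checked against the
# actual api/2.0/sql/history/queries output in your workspace.
sample_response = {
    "res": [
        {"query_id": "q-1",
         "query_start_time_ms": 1700000000000,
         "query_end_time_ms": 1700000004500},
    ]
}

for q in sample_response["res"]:
    duration_ms = q["query_end_time_ms"] - q["query_start_time_ms"]
    print(q["query_id"], duration_ms)  # q-1 4500
```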

Latest Reply
Yeshwanth
Contributor III
  • 0 kudos

Spot on, @feiyun0112! So this confirms that the API is working as expected, right?

5 More Replies
youssefmrini
by Honored Contributor III
  • 985 Views
  • 3 replies
  • 0 kudos

Getting started with Databricks Lakeview Dashboards

Watch the YouTube video: https://www.youtube.com/watch?v=MO7Dk035654

Latest Reply
sheilaL
New Contributor II
  • 0 kudos

I am attempting to recreate a legacy dashboard in Lakeview. The bar graph in no way resembles what I created in the SQL visualization editor; Lakeview has far fewer formatting options, for one thing. How do I recreate the graph so that it resembles the...

2 More Replies
tawarity
by New Contributor
  • 1239 Views
  • 2 replies
  • 0 kudos

Python Requests Library Error ImportHookFinder.find_spec()

Hi All, I've been using notebooks to run PATCH requests to an external API using the Python requests library. Often, certain notebooks will randomly start to fail throughout the day and will raise an ImportHookFinder.find_spec() error when attempt...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hey there! Thanks a bunch for being part of our awesome community!  We love having you around and appreciate all your questions. Take a moment to check out the responses – you'll find some great info. Your input is valuable, so pick the best solution...

1 More Replies
sjmb
by New Contributor
  • 479 Views
  • 4 replies
  • 0 kudos

Which object to use in which layer

I completed the Data Engineering Lakehouse course and I am familiar with the different objects and concepts of Databricks and the lakehouse, but I can't tie them together in my mind. Where do you typically use managed and non-managed tables? The Bronze layer? Or no...

Warehousing & Analytics
Databricks
datalake
Lakehouse
Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hey there! Thanks a bunch for being part of our awesome community!  We love having you around and appreciate all your questions. Take a moment to check out the responses – you'll find some great info. Your input is valuable, so pick the best solution...

3 More Replies
JustinM
by New Contributor II
  • 343 Views
  • 2 replies
  • 0 kudos

Cannot connect to SQL Warehouse using JDBC connector in Spark

When trying to connect to a SQL warehouse using the JDBC connector with Spark, the error below is thrown. Note that connecting directly to a cluster with similar connection parameters works without issue; the error only occurs with SQL warehouses. py4j...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @JustinM, Check your configuration settings: ensure that the dbtable option is correctly set in your Spark code. The dbtable option should specify the table you want to load from your SQL warehouse. Update the JDBC driver: make sure you're us...
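As a sketch of that first suggestion, the option set passed to Spark's JDBC reader for a SQL warehouse typically looks like the following. The host, HTTP path, token, and table name are hypothetical placeholders, and a live SparkSession is needed for the actual load, so this only assembles the options.

```python
# Sketch: JDBC options for reading a SQL Warehouse table from Spark.
# All values below are hypothetical placeholders -- substitute your
# workspace's host, HTTP path, PAT, and table name.
host = "adb-1234567890123456.7.azuredatabricks.net"  # hypothetical host
http_path = "/sql/1.0/warehouses/abc123def456"       # hypothetical warehouse HTTP path

jdbc_options = {
    # AuthMech=3 authenticates with UID "token" and a PAT as the password.
    "url": (
        f"jdbc:databricks://{host}:443;httpPath={http_path};"
        "AuthMech=3;UID=token;PWD=<personal-access-token>"
    ),
    "driver": "com.databricks.client.jdbc.Driver",
    "dbtable": "catalog.schema.my_table",            # hypothetical table to load
}

# With a live session, the read would be:
# df = spark.read.format("jdbc").options(**jdbc_options).load()
print(jdbc_options["dbtable"])
```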

1 More Replies
nikhilkumawat
by New Contributor III
  • 211 Views
  • 1 reply
  • 0 kudos

Decrypt an encrypted column of a Delta Lake table in Power BI

Hi Team, I have a Delta table in Databricks which contains an encrypted column. For encrypting I am using the Databricks "aes_encrypt" function. For reference: https://docs.databricks.com/en/sql/language-manual/functions/aes_encrypt.html#aes_encrypt-functi...

Latest Reply
feiyun0112
New Contributor III
  • 0 kudos

You can create two columns and display one based on the user: Display different columns in Power BI based on logged-in user | Paige Liu's Posts (liupeirong.github.io)

Eric_Kieft
by New Contributor III
  • 517 Views
  • 2 replies
  • 0 kudos

Unity Catalog "this table is deprecated" Functionality

We found a post on LinkedIn that revealed if "this table is deprecated" is added to a table comment, the table will appear with a strikethrough in notebooks and SQL editor windows.  Is this functionality GA?  Is there any documentation on the use of ...

Latest Reply
Eric_Kieft
New Contributor III
  • 0 kudos

Thanks, @arpit! Is there any documentation on this feature?

1 More Replies
EWhitley
by New Contributor II
  • 305 Views
  • 1 reply
  • 0 kudos

Retrieve task name within workflow task (notebook, python)?

Using workflows, is there a way to obtain the task name from within a task? Ex: I have a workflow with a notebook task. From within that notebook task I would like to retrieve the task name so I can use it for a variety of purposes. Currently, we're re...

Latest Reply
shan_chandra
Honored Contributor III
  • 0 kudos

@EWhitley - could you please try using the Jobs API - /api/2.1/jobs/get - and look for the task_key fields to obtain all the task names within a given job. Reference - https://docs.databricks.com/api/workspace/jobs/get
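A small sketch of that suggestion: fetch the job with /api/2.1/jobs/get and collect the task_key values. The payload below is a hand-made stand-in shaped like the documented response, not real output; verify the field names against your workspace.

```python
# Sketch: pulling task names out of a hypothetical /api/2.1/jobs/get
# response. In practice this dict would come from an authenticated GET
# request; here it is hard-coded for illustration.
jobs_get_response = {
    "job_id": 123,
    "settings": {
        "name": "nightly-etl",  # hypothetical job name
        "tasks": [
            {"task_key": "ingest"},
            {"task_key": "transform", "depends_on": [{"task_key": "ingest"}]},
        ],
    },
}

task_names = [t["task_key"] for t in jobs_get_response["settings"]["tasks"]]
print(task_names)  # ['ingest', 'transform']
```

From inside a running task, passing the dynamic value reference {{task.name}} as a job parameter is, as far as I know, another way to get the current task's own name without calling the API.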

Shailu1
by New Contributor II
  • 638 Views
  • 3 replies
  • 2 kudos

Resolved! Snowflake vs Databricks SQL endpoint for data warehousing: which is more persistent?

Snowflake vs Databricks SQL endpoint for data warehousing: which is more persistent?

Latest Reply
Pritesh2
New Contributor II
  • 2 kudos

Databricks and Snowflake are both powerful platforms designed to address different aspects of data processing and analytics. Databricks shines in big data processing, machine learning, and AI workloads, while Snowflake excels in data warehousing, sto...

2 More Replies
primaj
by New Contributor III
  • 1538 Views
  • 13 replies
  • 9 kudos

Introspecting catalogs and schemas over JDBC in PyCharm

Hey, I've managed to add my SQL warehouse as a data source in PyCharm using the JDBC driver and can query the warehouse from an SQL console within PyCharm. This is great; however, what I'm struggling with is getting the catalogs and schemas to show in...

Latest Reply
primaj
New Contributor III
  • 9 kudos

Hey Toby, thanks for your response, it's very helpful! I tried doing this in PyCharm to no avail; I then tried the same in DataGrip and it worked. Very strange!

12 More Replies