Warehousing & Analytics
Engage in discussions on data warehousing, analytics, and BI solutions within the Databricks Community. Share insights, tips, and best practices for leveraging data for informed decision-making.

Forum Posts

MadelynM
by Databricks Employee
  • 2916 Views
  • 0 replies
  • 0 kudos

[Recap] Data + AI Summit 2024 - Warehousing & Analytics | Improve performance and increase insights

Here's your Data + AI Summit 2024 Warehousing & Analytics recap: use intelligent data warehousing to improve performance and increase your organization's productivity with analytics, dashboards, and insights. Keynote: Data Warehouse presente...

Warehousing & Analytics
AI BI Dashboards
AI BI Genie
Databricks SQL
MJT07
by New Contributor
  • 1092 Views
  • 0 replies
  • 0 kudos

Summit

Today was the first day of Summit, and I learned how different models can be deployed for different decision points. I'm curious what model-control issues users have run into with this. Thanks!

Eric_Kieft
by New Contributor III
  • 10764 Views
  • 4 replies
  • 1 kudos

Is there a way to determine location/folder in the Recents view?

With multiple locations for items to exist, duplicate names make it hard to tell which one is which without opening.

Latest Reply
Anonymous
Not applicable
  • 1 kudos

@Eric Kieft: In Databricks, the Recents view shows the recently accessed notebooks, dashboards, and folders. However, it does not show the exact location of each item. To determine the location of an item in the Recents view, you can try the followin...

3 More Replies
pauloquantile
by New Contributor III
  • 2746 Views
  • 1 replies
  • 2 kudos

How to prevent users from scheduling SQL queries?

We have noticed that users can schedule SQL queries, but we haven't found a way to list these scheduled queries (they do not show up in the Jobs pane), so we don't know when people have scheduled them. The only way is to look at t...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

@Paulo Rijnberg: In Databricks, you can use the following approaches to prevent users from scheduling SQL queries and to receive notifications when such queries are scheduled: cluster-level permissions, the Jobs API, notification hooks, audit logs and monitori...

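Since scheduled queries don't appear in the Jobs UI, one way to audit them is to enumerate saved queries over the REST API and filter on their schedule field. A minimal sketch assuming a personal access token and the preview Queries endpoint (`/api/2.0/preview/sql/queries`); verify the path and response shape against your workspace's API version before relying on it:

```python
import urllib.request

def build_list_queries_request(host: str, token: str, page: int = 1,
                               page_size: int = 25) -> urllib.request.Request:
    """Build (but do not send) the request for one page of saved queries."""
    url = (f"https://{host}/api/2.0/preview/sql/queries"
           f"?page={page}&page_size={page_size}")
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"})

def scheduled_queries(results):
    """Filter a page of query objects down to those with a schedule set."""
    return [q for q in results if q.get("schedule")]
```

Send the request with `urllib.request.urlopen` and `json.load` the body; queries whose `schedule` field is non-null are the scheduled ones, which you can then surface to an admin or delete.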
DataArchitect
by New Contributor III
  • 6966 Views
  • 6 replies
  • 5 kudos

Migration of Databricks Jobs, SQL dashboards and Alerts from lower environment to higher environment?

I want to move Databricks Jobs, SQL dashboards, Queries, and Alerts from a lower environment to a higher environment. How can we move them?

Latest Reply
Anonymous
Not applicable
  • 5 kudos

Hi @Shubham Agagwral, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answ...

5 More Replies
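For the Jobs portion specifically, a common pattern is to export each job's settings from the lower workspace with the Jobs API and recreate it in the higher one. A rough sketch that builds (but does not send) the two REST calls with `urllib`; the hostnames and token handling are placeholders, and dashboards, queries, and alerts need their own APIs or a tool such as Terraform or the Databricks CLI:

```python
import json
import urllib.request

def export_job_request(host: str, token: str, job_id: int) -> urllib.request.Request:
    """GET /api/2.1/jobs/get returns the job, including its `settings` dict."""
    url = f"https://{host}/api/2.1/jobs/get?job_id={job_id}"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"})

def import_job_request(host: str, token: str, settings: dict) -> urllib.request.Request:
    """POST the exported `settings` to /api/2.1/jobs/create in the target workspace."""
    return urllib.request.Request(
        f"https://{host}/api/2.1/jobs/create",
        data=json.dumps(settings).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST")
```

Note that environment-specific references inside `settings` (cluster IDs, notebook paths, warehouse IDs) usually need rewriting before the create call.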
Chengcheng
by New Contributor III
  • 5795 Views
  • 1 replies
  • 3 kudos

Best practice of handling Data Duplication in a Delta Live Table Pipeline with daily batch data source.

I am building a data pipeline using Delta Live Tables in Azure Databricks to move data from a raw data table to a feature table and a model inference results table. However, I am concerned about potential duplication issues in future operations,...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Chengcheng Guo, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer. Thanks.

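For reference, the usual DLT answers here are APPLY CHANGES INTO for CDC-style upserts or dropDuplicates with a watermark; both boil down to keeping the latest record per key, so a rerun of the same daily batch stays idempotent. A library-free sketch of that semantics (the `id` and `event_ts` field names are made up for illustration):

```python
def dedupe_latest(records, key="id", ts="event_ts"):
    """Keep only the newest record per key: the semantics that
    APPLY CHANGES INTO / dropDuplicates-with-watermark provide in DLT."""
    latest = {}
    for rec in records:
        k = rec[key]
        if k not in latest or rec[ts] > latest[k][ts]:
            latest[k] = rec
    return list(latest.values())
```

Running each incoming batch through this reduction before merging into the feature table means reprocessing yesterday's file cannot introduce duplicates.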
alejandrofm
by Valued Contributor
  • 2696 Views
  • 1 replies
  • 3 kudos

How to change compression codec of sql warehouse written files?

Hi, I'm currently starting to use SQL Warehouse, and most of our lake uses a compression codec other than snappy. How can I set the SQL warehouse to use a codec like gzip or zstd on CREATE, INSERT, etc.? Tried this: set spark.sql.parquet.comp...

Latest Reply
Anonymous
Not applicable
  • 3 kudos

Hi @Alejandro Martinez, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer. Thanks.

Nikhil3107
by New Contributor III
  • 24506 Views
  • 1 replies
  • 2 kudos

RuntimeError: It appears that you are attempting to reference SparkContext from a broadcast variable, action, or transformation. SparkContext can only be used on the driver, not in code that it run on workers. For more information, see SPARK-5063.

I created the following model (screenshots attached), which calls get_identifier_information(). This is how I log the model, and this is the error I am running into: RuntimeError: It appears that you are attempting to reference SparkContext...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @Nikhil Gajghate, great to meet you, and thanks for your question! Let's see if your peers in the community have an answer. Thanks.

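On the SPARK-5063 error above: it usually means a function shipped to executors captured the SparkContext, directly or via `self` on a model object, and the fix is to copy the plain values you need out of the context-holding object before defining the function. A library-free sketch of the underlying serialization issue, using pickle the way Spark serializes task closures (`FakeContext` is a stand-in for illustration, not Spark):

```python
import pickle
import threading

class FakeContext:
    """Stand-in for SparkContext: holds an unpicklable driver-only resource."""
    def __init__(self):
        self._socket_lock = threading.Lock()  # locks cannot be pickled
        self.app_name = "demo"

ctx = FakeContext()

# BAD: anything containing the context fails to serialize for shipping
# to workers, which is the root cause behind the SPARK-5063 message.
try:
    pickle.dumps(ctx)
    ctx_serializable = True
except TypeError:
    ctx_serializable = False

# GOOD: extract the plain value on the driver first; only the string
# travels to the workers.
app_name = ctx.app_name
payload = pickle.dumps(app_name)  # serializes fine
```

In the model-logging case, the same idea applies: compute whatever get_identifier_information() needs on the driver, store it as plain attributes, and avoid referencing the SparkContext inside the model's predict path.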
dannylee
by New Contributor III
  • 13286 Views
  • 7 replies
  • 3 kudos

Resolved! AWS Glue and Databricks

Hello, we're receiving an error when running Glue jobs that try to connect to and read from a Databricks SQL endpoint. An error occ...

Latest Reply
dannylee
New Contributor III
  • 3 kudos

Hello @Vidula Khanna, @Debayan Mukherjee, I wanted to give you an update that might be helpful for your future customers. We worked with @Pavan Kumar Chalamcharla, and through lots of trial and error we figured out a combination that works for SQL e...

6 More Replies
gilo12
by New Contributor III
  • 2219 Views
  • 1 replies
  • 1 kudos

request error: invalid operation state. This should not have happened

I am trying to identify errors coming from Databricks so I can handle them in my code. Sometimes I get a descriptive error that points me to the exact problem, but then if I run the exact same test, I sometimes get "request error: invalid operation ...

Latest Reply
mathan_pillai
Databricks Employee
  • 1 kudos

Can you share what command you were executing? Also, were you doing exception handling within your code with try/catch? Can you check the driver logs around the time the error happened? The driver logs should have more detail...

RohitSingh
by New Contributor
  • 1895 Views
  • 1 replies
  • 1 kudos

Resolved! Video Submission

For the purpose of posting the video, can we post it on Google Drive and share the relevant link when submitting? The video will be made accessible to anyone with the link. Tagging @Karen Bajza-Terlouw and @Michelle Brain.

Latest Reply
Michelle_-_Devp
New Contributor III
  • 1 kudos

Hello! The video should be uploaded to and made publicly visible on YouTube, Vimeo, Facebook Video, or Youku so that it can play back on Devpost. This makes it easier for the judges.

115412
by New Contributor
  • 2773 Views
  • 2 replies
  • 2 kudos

Resolved! can you help me with connection between databricks and sftp file using paramiko??

I read in this post (https://community.databricks.com/s/question/0D58Y00009fClizSAC/ssh-connection-with-paramiko) that you can help me with that problem. Please, can you give me more advice on what I have to do to connect to an SFTP server through SSH?

Latest Reply
ZhengHuang
New Contributor III
  • 2 kudos

Hi! Could you please let me know what your current blocker is? Are you looking for a code snippet that can help you get a file through SFTP, or for the Spark config that whitelists the port, or are you blocked by some other error?

1 More Replies
Joseph_B
by Databricks Employee
  • 9126 Views
  • 4 replies
  • 3 kudos

Connect Delta Lake to OData API?

I'd like to expose Delta Lake data to external customers via OData v4 APIs. What's the best way to do that?

Latest Reply
john1
New Contributor II
  • 3 kudos

Is the best answer to this still to implement the OData intermediate service yourself? Or is there a better way now?

3 More Replies
Ajay-Pandey
by Esteemed Contributor III
  • 2259 Views
  • 0 replies
  • 2 kudos

Availability of SQL Warehouse to Data Science and Engineering persona

Hi All, now we can use SQL Warehouse in our notebook execution. It's in preview now and will soon be GA.

Chhaya
by New Contributor III
  • 5800 Views
  • 4 replies
  • 4 kudos

Resolved! DLT pipeline run cost

Hi team, I am looking for a way to find the DBU cost for DLT clusters. Does it get stored anywhere? I have been looking into event_logs but did not find information related to cost; it does have cluster resource utilization details. Here is what I found...

Latest Reply
Anonymous
Not applicable
  • 4 kudos

Hi @Chhaya Vishwakarma, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best an...

3 More Replies