Warehousing & Analytics

Forum Posts

Kaz
by New Contributor II
  • 2186 Views
  • 1 replies
  • 0 kudos

Show full logs on job log

Is it possible to show the full logs of a Databricks job? Currently, the logs are skipped with: *** WARNING: max output size exceeded, skipping output. *** However, I don't believe our log files are more than 20 MB. I know you can press the logs button...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Kaz, it's not possible to display the complete logs of a Databricks job in the job overview if the log output size has been exceeded. Databricks has a limit on the size of the output logs that can be displayed in the job overview. If the output...
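A common workaround when driver output is truncated in the UI is to deliver the full cluster logs to storage via the cluster's log configuration. A minimal sketch of that JSON fragment (the DBFS destination path is a placeholder; adapt it to your workspace):

```json
{
  "cluster_log_conf": {
    "dbfs": {
      "destination": "dbfs:/cluster-logs"
    }
  }
}
```

With this set on the job cluster, the complete driver and executor logs are written under the destination path and can be read there even when the job-overview output is cut off.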

SimonMcCor
by New Contributor
  • 869 Views
  • 2 replies
  • 1 kudos

Calculated Field in Dashboards

Is there a way to create a calculated field in a dashboard from the data that has been put into it? I have an aggregated dataset that goes into a dashboard, but using an average in the calculation will only work if I display the average by the grouped...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @SimonMcCor , Yes, you can create a calculated field in a dashboard from the data that has been put into it. In Databricks, you can perform this operation within the notebook that feeds data into your dashboard. If you want to request a new featur...
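One way to precompute such a "calculated field" in the query that feeds the dashboard is a window function, which attaches an average to every row without collapsing the grouping. A sketch, where the table and column names are hypothetical placeholders:

```sql
-- Window functions add aggregates as extra columns,
-- so the dashboard can show both row-level and averaged values.
SELECT
  category,
  order_value,
  AVG(order_value) OVER ()                      AS overall_avg,   -- average across all rows
  AVG(order_value) OVER (PARTITION BY category) AS category_avg   -- average per group
FROM sales_orders;
```

The dashboard then simply displays `overall_avg` or `category_avg` as a regular column, independent of how the visualization groups the data.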

1 More Replies
Erik
by Valued Contributor II
  • 2437 Views
  • 1 replies
  • 0 kudos

Hot path event processing and serving in databricks

We have a setup where we process sensor data in Databricks using PySpark Structured Streaming from Kafka streams, and continuously write these to Delta tables. These Delta tables are served through a SQL warehouse endpoint to the users. We also store ...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Erik,
  • Use Delta Lake to optimize tables by coalescing small files into larger ones
  • Use the OPTIMIZE command on Delta Lake tables to improve write speed and reduce the number of small files
  • Utilize ZORDER for multi-dimensional clustering to ...
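The OPTIMIZE and ZORDER advice above can be sketched in Databricks SQL as follows; the table and column names are placeholders for your own streaming sink table and its most frequently filtered columns:

```sql
-- Compact the small files produced by streaming writes,
-- and co-locate rows by the columns queries filter on most.
OPTIMIZE sensor_readings
ZORDER BY (sensor_id, event_time);
```

Running this on a schedule (rather than on every micro-batch) keeps the serving tables fast for the SQL warehouse without competing with the streaming writes.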

youssefmrini
by Honored Contributor III
  • 3085 Views
  • 0 replies
  • 1 kudos

Databricks Asset Bundles are now in Public Preview

Databricks asset bundles, now in Public Preview, enable end-to-end data, analytics, and ML projects to be expressed as a collection of source files. This makes it simpler to apply data engineering best practices such as source control, code review, t...
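A bundle is declared in a `databricks.yml` file at the project root. A minimal sketch, where the bundle name and workspace host are placeholders:

```yaml
# Minimal databricks.yml sketch; names and host are placeholders.
bundle:
  name: my_project

targets:
  dev:
    mode: development
    workspace:
      host: https://<your-workspace>.cloud.databricks.com
```

From there, `databricks bundle validate` and `databricks bundle deploy` apply the bundle to the selected target, which is what enables the source-control and code-review workflow described above.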

Warehousing & Analytics
CICD
MachineLearning
colinsorensen
by New Contributor III
  • 994 Views
  • 1 replies
  • 1 kudos

Resolved! Unhandled error while executing ['DatabricksSQLCursorWrapper' object has no attribute 'fetchmany'

Getting this error in dbt when trying to run a query. Not happening in the actual SQL warehouse in Databricks. Is this a bug? Can only find source code when I search 'DatabricksSQLCursorWrapper' but no documentation or information otherwise.

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @colinsorensen, Based on the given information, it appears that the error you are encountering is related to accessing cloud storage in Databricks. The error message suggests that the cluster does not have the necessary permissions to access the s...

hehuan-yu-zen
by New Contributor II
  • 827 Views
  • 2 replies
  • 0 kudos

customise the dates showing in the calendar selection in sql editor/dashboard

Does anybody know whether we could customise the dates showing in the calendar selection in the SQL editor/dashboard? My query has a time frame in a particular period, however when I use the DateRange parameter in the SQL editor, it could allow users to choose th...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @hehuan-yu-zen , Yes, you can customize the dates showing in the calendar selection in the SQL editor/dashboard. The Date Range, Date and Time Range, and Date and Time Range (with seconds) parameters now support the option to designate the startin...
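For reference, a Date Range parameter is referenced in the query through its `.start` and `.end` members. A sketch, assuming a parameter named `date_range` and placeholder table/column names:

```sql
-- Assumes a Date Range parameter named "date_range" defined in the SQL editor;
-- its two endpoints are substituted into the query at run time.
SELECT *
FROM events
WHERE event_date BETWEEN '{{ date_range.start }}' AND '{{ date_range.end }}';
```

Constraining the selectable window itself (e.g. greying out dates outside your period) is done in the parameter's settings rather than in the query text.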

1 More Replies
pauloquantile
by New Contributor III
  • 5728 Views
  • 6 replies
  • 0 kudos

PowerBI "Token expired while fetching results: TEAuthTokenExpired."

Hi everyone, We are at the moment stumbling upon a big challenge with loading data into Power BI. I need some advice! To give a bit of context: we introduced Databricks instead of Azure Synapse for a client of ours. We are currently busy with moving all...

Latest Reply
pauloquantile
New Contributor III
  • 0 kudos

Currently our solution to this problem is using a Personal Access Token as authentication method. I stumbled upon the problem that when the dataset is scheduled via PowerBI it went back to OAuth authentication. Still checking if the problem is stayin...

5 More Replies
mortenhaga
by Contributor
  • 2530 Views
  • 5 replies
  • 4 kudos

Resolved! Databricks SQL and Engineer Notebooks yields different outputs from same script

Hi all, We are having some alarming issues regarding a script that yields different output when running on SQL vs Notebook. The correct output should be 8625 rows, which it is in the notebook, but the output in Databricks SQL is 156 rows. The script use...

Warehousing & Analytics
Databricks SQL
Notebook
Wrong output
Latest Reply
mortenhaga
Contributor
  • 4 kudos

UPDATE: I think we have identified and solved the issue. It seems like using LAST with Databricks SQL requires you to explicitly be careful about setting the "ignoreNull" argument and also be careful about the correct datatype. I guess this is because of...
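The `ignoreNull` behaviour described above can be made explicit in Spark SQL, where `last(expr[, isIgnoreNull])` keeps nulls by default. A sketch with placeholder table/column names:

```sql
-- last() returns the last value in the group; whether a trailing NULL
-- "wins" depends on the second argument, so set it explicitly.
SELECT
  last(status)       AS last_including_nulls,  -- may return NULL
  last(status, true) AS last_ignoring_nulls    -- last non-NULL value
FROM orders;
```

Being explicit here avoids the notebook-vs-SQL-warehouse discrepancy the thread describes, since the result no longer depends on defaults or row ordering around NULLs.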

4 More Replies
Hubert-Dudek
by Esteemed Contributor III
  • 1309 Views
  • 1 replies
  • 0 kudos

Optimizing SQL Databricks Warehouse Timeout Settings

Did you know the default timeout setting for SQL #databricks Warehouse is two days? The default timeout can be too long for most use cases. You can easily change this for your session or in the general SQL warehouse configuration.
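The session-level change mentioned above can be sketched with the `STATEMENT_TIMEOUT` configuration parameter (value in seconds):

```sql
-- Override the warehouse default for the current session only.
SET STATEMENT_TIMEOUT = 3600; -- abort statements running longer than 1 hour
```

Setting the same parameter in the SQL warehouse's configuration applies it to all sessions on that warehouse instead of just your own.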

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Hubert-Dudek, I want to extend my heartfelt appreciation to our esteemed Community Member for consistently delivering outstanding company updates that are both informative and insightful. Your dedication to keeping the community well-informed and...

uberweiss
by New Contributor II
  • 1511 Views
  • 2 replies
  • 0 kudos

Unable to access Databricks cluster through ODBC in R

We have previously been able to access our Databricks cluster in R using ODBC, but it stopped working a couple of months ago and now I can't get it to connect. I've downloaded the latest drivers and added the right information in the odbc/odbcinst files bu...

Warehousing & Analytics
cluster
Databricks
ODBC
R
Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @uberweiss, Based on the error message you are receiving, the issue might be related to the server host or port specified for the connection. Here are a few suggestions to troubleshoot this issue.
1. Verify Server Details: The error message sugges...
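For comparison, a typical `odbc.ini` entry for the Simba Spark ODBC driver that Databricks distributes looks roughly like this; the driver path, hostname, HTTP path, and token are placeholders to fill in from your warehouse's connection details:

```ini
; Sketch of an odbc.ini DSN for a Databricks SQL warehouse.
[Databricks]
Driver          = /opt/simba/spark/lib/64/libsparkodbc_sb64.so
Host            = <workspace-hostname>
Port            = 443
HTTPPath        = /sql/1.0/warehouses/<warehouse-id>
SSL             = 1
ThriftTransport = 2
AuthMech        = 3
UID             = token
PWD             = <personal-access-token>
```

A mismatch in `Host`, `Port`, or `HTTPPath` is the most common cause of the connection error described, so checking these against the warehouse's "Connection details" tab is a good first step.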

1 More Replies
mbhakta
by New Contributor II
  • 1623 Views
  • 2 replies
  • 1 kudos

Change Databricks Connection on Power BI (service)

We're creating a report with Power BI using data from our AWS Databricks workspace. Currently, I can view the report on Power BI (service) after publishing. Is there a way to change the data source connection, e.g. if I want to change the data source...

Latest Reply
mbhakta
New Contributor II
  • 1 kudos

Hi @Kaniz, Thanks for your reply! It seems possible, but I'm stuck on step 3, since `extensionDataSourcePath` is greyed out. Everything else is editable. What permissions do I need in order to update that field?

1 More Replies
data_guy
by New Contributor
  • 404 Views
  • 0 replies
  • 0 kudos

Issue while extracting value From Decimal Key is Json

Hi Guys, I have a JSON with the below structure, where the key is a decimal:
{ "5.0": { "a": "15.92", "b": 0.0, "c": "15.92", "d": "637.14" }, "0.0": { "a": "15.92", "b": 0.0, "c": "15.92", "d": "637.14" } }
schema_of_json returns the following: STRUCT<`0....
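One way to sidestep decimal keys in a STRUCT schema is to parse the object as a MAP, so the keys become ordinary map lookups. A sketch with placeholder table/column names:

```sql
-- Parsing as MAP<STRING, STRUCT<...>> lets any key, including "5.0",
-- be addressed with ordinary ['key'] lookup syntax.
SELECT
  parsed['5.0'].a AS a_for_5_0,
  parsed['0.0'].d AS d_for_0_0
FROM (
  SELECT from_json(
    raw_json,
    'MAP<STRING, STRUCT<a: STRING, b: DOUBLE, c: STRING, d: STRING>>'
  ) AS parsed
  FROM events
);
```

If you keep the STRUCT schema that `schema_of_json` inferred, the decimal field name can instead be addressed by backtick-quoting it, e.g. parsed.`5.0`.a.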
