by esi • New Contributor II
- 3375 Views
- 0 replies
- 0 kudos
Hi Community, I am looking for a way to access Power BI tables from Databricks and import them as a Spark DataFrame into my Databricks notebook. As far as I have seen, there is a Power BI connector to load data from Databricks into Power BI but not...
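There is no built-in reverse connector, but one hedged workaround is the Power BI REST API's executeQueries endpoint, which runs a DAX query against a dataset and returns rows as JSON that can be turned into a Spark DataFrame. A minimal sketch, assuming you already have an Azure AD access token with read access to the dataset; the token, dataset ID, and table name below are all placeholders:

```python
import requests

access_token = "<aad-access-token>"  # placeholder: Azure AD token for the Power BI API
dataset_id = "<dataset-id>"          # placeholder: the dataset holding the table

url = f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/executeQueries"
body = {
    "queries": [{"query": "EVALUATE 'MyTable'"}],  # 'MyTable' is a placeholder table name
    "serializerSettings": {"includeNulls": True},
}

resp = requests.post(url, headers={"Authorization": f"Bearer {access_token}"}, json=body)
resp.raise_for_status()

# Each row comes back as a dict keyed by 'TableName[ColumnName]'.
rows = resp.json()["results"][0]["tables"][0]["rows"]
df = spark.createDataFrame(rows)  # `spark` is the notebook's SparkSession
display(df)                       # Databricks notebook display helper
```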
- 2637 Views
- 0 replies
- 0 kudos
I'm using the StarRocks Connector[2] to ingest data into StarRocks on Databricks 13.1 (powered by Spark 3.4.0). The connector runs on community Spark 3.4 but fails on the DBR. The reason is (the full stack trace is attached): java.lang.IncompatibleClass...
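An IncompatibleClassChangeError that appears on DBR but not on OSS Spark usually means the connector was compiled against an OSS Spark class whose shape differs in the Databricks runtime. A hedged diagnostic sketch to confirm which JAR the conflicting class is actually loaded from on the driver JVM; the class name is a placeholder, substitute the one from your stack trace:

```python
# Resolve where a class is loaded from on the driver JVM via py4j.
# "com.example.SomeClass" is a placeholder; use the class named in your stack trace.
jvm = spark.sparkContext._jvm
cls = jvm.java.lang.Class.forName("com.example.SomeClass")
source = cls.getProtectionDomain().getCodeSource()
print(source.getLocation().toString() if source else "loaded from the boot classpath")
```

If the class resolves to a DBR-internal JAR rather than the connector's, that points to a binary incompatibility between the connector build and DBR's forked Spark classes.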
- 1348 Views
- 0 replies
- 0 kudos
[UNKNOWN_FIELD_EXCEPTION.NEW_FIELDS_IN_FILE] Encountered unknown fields during parsing: [<field_name>], which can be fixed by an automatic retry: true. I am using Azure Databricks and writing Python code. I want to catch the error and re-raise it. Tried wi...
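One hedged way to catch this in Python: PySpark 3.4+ raises subclasses of pyspark.errors.PySparkException that expose the structured error class via getErrorClass(), and a substring fallback covers errors that surface as other exception types. The read call below is a placeholder for the actual operation that hits the schema change:

```python
from pyspark.errors import PySparkException

try:
    # Placeholder for the actual read/write that encounters the new fields.
    df = spark.read.format("delta").load("/path/to/table")
except PySparkException as e:
    # PySpark 3.4+ exposes the structured error class directly.
    if e.getErrorClass() and e.getErrorClass().startswith("UNKNOWN_FIELD_EXCEPTION"):
        raise RuntimeError(f"New fields appeared in the source: {e}") from e
    raise
except Exception as e:
    # Some errors surface as plain JVM exceptions; fall back to string matching.
    if "UNKNOWN_FIELD_EXCEPTION" in str(e):
        raise RuntimeError(f"New fields appeared in the source: {e}") from e
    raise
```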
- 1077 Views
- 0 replies
- 0 kudos
Is there a way to automate data categorisation with the OpenAI API?
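Yes, in principle. A hedged sketch using the OpenAI Python SDK (v1 style), assuming OPENAI_API_KEY is set in the environment; the model name and category list are placeholders. On Databricks you would typically wrap this in a UDF or apply it to a column in batches:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

CATEGORIES = ["billing", "technical", "account", "other"]  # placeholder taxonomy

def categorize(text: str) -> str:
    """Ask the model to pick exactly one category for a piece of text."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system",
             "content": f"Classify the user text into exactly one of: {', '.join(CATEGORIES)}. "
                        "Reply with the category name only."},
            {"role": "user", "content": text},
        ],
        temperature=0,
    )
    return resp.choices[0].message.content.strip()

print(categorize("My invoice shows a duplicate charge"))  # e.g. "billing"
```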
- 9783 Views
- 6 replies
- 1 kudos
Hi Team, I am getting the below error while creating a table with a primary key: "Table constraints are only supported in Unity Catalog." Table script: CREATE TABLE persons(first_name STRING NOT NULL, last_name STRING NOT NULL, nickname STRING, CONSTRAINT persons_...
Latest Reply
Hi, this needs further investigation, could you please raise a support case with Databricks?
5 More Replies
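As the error message says, PRIMARY KEY constraints require the table to live in a Unity Catalog catalog rather than hive_metastore. A hedged sketch of the same DDL under a three-level UC name (main.default is an assumed catalog and schema; note the key columns must be NOT NULL, and the constraint is informational, not enforced):

```python
# main.default is an assumed Unity Catalog catalog/schema; adjust to yours.
spark.sql("""
    CREATE TABLE main.default.persons (
        first_name STRING NOT NULL,
        last_name  STRING NOT NULL,
        nickname   STRING,
        CONSTRAINT persons_pk PRIMARY KEY (first_name, last_name)
    )
""")
```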
- 5373 Views
- 0 replies
- 0 kudos
I have a large collection of DLT pipelines, growing daily, and I need to grant access to non-admin users. Do I need to assign permissions on each individual DLT pipeline, or is there a better approach?
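A better approach than clicking through each pipeline is to grant a group access via the Permissions REST API, looped over the pipeline list, so new users only need to be added to the group. A hedged sketch, assuming a PAT, a workspace URL, and a "data-engineers" group (all placeholders); CAN_VIEW can be swapped for CAN_RUN or CAN_MANAGE:

```python
import requests

host = "https://<workspace-url>"               # placeholder
headers = {"Authorization": "Bearer <token>"}  # placeholder PAT

# List all DLT pipelines (paging omitted for brevity).
statuses = requests.get(f"{host}/api/2.0/pipelines", headers=headers).json().get("statuses", [])

for p in statuses:
    # Grant the group view access on each pipeline.
    requests.patch(
        f"{host}/api/2.0/permissions/pipelines/{p['pipeline_id']}",
        headers=headers,
        json={"access_control_list": [
            {"group_name": "data-engineers", "permission_level": "CAN_VIEW"}
        ]},
    )
```

Newly created pipelines still need a grant, so this is usually run on a schedule or folded into whatever automation creates the pipelines.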
- 1493 Views
- 1 replies
- 0 kudos
I am experiencing an over-caching issue in a Databricks notebook. If I display different DataFrames, one of them gets cached and its output then appears after the results of the others. How can I avoid the cache memory while using the notebook?
Latest Reply
I don't exactly understand what your issue is. Can you elaborate more?
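If the issue is that earlier results stay cached and bleed into later displays, a hedged workaround is to drop cached data explicitly between displays:

```python
# Drop one DataFrame's cached blocks...
df.unpersist()

# ...or clear everything cached in the current Spark session.
spark.catalog.clearCache()
```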
- 2015 Views
- 1 replies
- 0 kudos
I tried to start a cluster that I had started 7 times before, and it gave me this error: Cloud provider is undergoing a transient resource throttling. This is retryable. 1 out of 2 pods scheduled. Failed to launch cluster in kubernetes in 1800 seconds...
Latest Reply
Hi, this error "GCE out of resources" typically means that Google Compute Engine is out of resources, as in out of nodes (it can be a quota issue or node issues in that particular region in GCP). Could you please raise a Google support case on thi...
- 7223 Views
- 3 replies
- 1 kudos
Hello, in reference to https://www.databricks.com/blog/2022/11/18/introducing-ingestion-time-clustering-dbr-112.html I have a silly question about how to use it. So let's assume that I have a few TB of non-partitioned data. So, if I would like to query on dat...
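With ingestion-time clustering there is nothing extra to turn on at query time: because files are written in ingestion order, a filter on a column correlated with ingestion time (an event or load timestamp) lets Delta skip files. A hedged sketch with a placeholder table and column:

```python
# my_catalog.my_schema.events and ingest_ts are placeholders; the point is that
# the WHERE clause on an ingestion-time-correlated column enables file skipping.
df = spark.sql("""
    SELECT *
    FROM my_catalog.my_schema.events
    WHERE ingest_ts >= '2023-06-01' AND ingest_ts < '2023-07-01'
""")
```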
- 5793 Views
- 0 replies
- 0 kudos
Error: default auth: cannot configure default credentials. Config: token=***. Env: DATABRICKS_TOKEN
on cluster.tf line 27, in data "databricks_spark_version" "latest_lts":
27: data "databricks_spark_version" "latest_lts" {
- 3240 Views
- 1 replies
- 1 kudos
Does the new 'Run If' feature, which allows you to run tasks conditionally, lack an 'ALWAYS' option to execute the task both when the dependencies succeed and when they fail?
Latest Reply
You can choose the All Done option to run the task in both scenarios.
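For reference, the same condition is expressible in the Jobs API 2.1 task definition via the run_if field, where ALL_DONE runs the task after its dependencies finish regardless of outcome. A hedged sketch of the relevant task fragment (task names and notebook path are placeholders):

```python
# Fragment of a Jobs API 2.1 job spec; names are placeholders.
task = {
    "task_key": "cleanup",
    "depends_on": [{"task_key": "load"}],
    "run_if": "ALL_DONE",  # run whether "load" succeeds or fails
    "notebook_task": {"notebook_path": "/Repos/me/cleanup"},
}
```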