Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

bricksdata
by New Contributor
  • 14090 Views
  • 4 replies
  • 0 kudos

Unable to authenticate against https://accounts.cloud.databricks.com as an account admin.

Problem: I'm unable to authenticate against the https://accounts.cloud.databricks.com endpoint even though I'm an account admin. I need it to assign account-level groups to workspaces via the workspace assignment API (https://api-docs.databricks.com/re...

Latest Reply
137292
New Contributor II
  • 0 kudos

From this doc: To automate Databricks account-level functionality, you cannot use Databricks personal access tokens. Instead, you must use either OAuth tokens for Databricks account admin users or service principals. For more information, see: Use a s...

  • 0 kudos
3 More Replies
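As a sketch of the OAuth machine-to-machine flow the reply points to: the accounts API takes a bearer token obtained with a client-credentials grant. The endpoint path and `all-apis` scope below follow the Databricks OAuth docs, but the account ID and service principal credentials are placeholders, and no network call is made here.

```python
import base64
import urllib.parse

def build_token_request(account_id: str, client_id: str, client_secret: str):
    """Assemble the pieces of an OAuth client-credentials token request
    for the Databricks accounts API (construction only, no HTTP call)."""
    url = (
        "https://accounts.cloud.databricks.com"
        f"/oidc/accounts/{account_id}/v1/token"
    )
    body = urllib.parse.urlencode(
        {"grant_type": "client_credentials", "scope": "all-apis"}
    )
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    headers = {
        "Authorization": f"Basic {creds}",
        "Content-Type": "application/x-www-form-urlencoded",
    }
    return url, headers, body

url, headers, body = build_token_request("1234-abcd", "sp-client-id", "sp-secret")
# POST these with any HTTP client; the JSON response carries "access_token",
# which then goes in an "Authorization: Bearer ..." header on accounts API calls.
```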
thiagoawstest
by Contributor
  • 1394 Views
  • 0 replies
  • 0 kudos

change network/vpc workspace

Hello, I have two workspaces, each pointing to a VPC in AWS. In one of the accounts we need to remove a subnet; after removing it, we get the AWS error InvalidSubnetID.NotFound when starting the cluster. I checked in Manager Account, and the network is poin...

[attached screenshot: thiagoawstest_0-1720808852626.png]
Avinash_Narala
by Databricks Partner
  • 893 Views
  • 0 replies
  • 0 kudos

Tracking Serverless cluster cost

Hi, I just explored the serverless feature in Databricks and am wondering how I can track the cost associated with it. Is it stored in system tables? If yes, where can I find it? And also, how can I show that its cost is relatively low compared to classic ...

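Per the Databricks system-tables docs, serverless usage does show up in `system.billing.usage` (DBU quantities per SKU), which can be joined to `system.billing.list_prices` for dollar amounts; the exact column names should be verified in your account. A minimal sketch of the aggregation, done over sample rows in plain Python so it runs without Spark (the SKU names and prices below are illustrative, not real rates):

```python
# Sample rows shaped like system.billing.usage joined to list_prices;
# the SKUs and per-DBU prices are invented for the example.
usage = [
    {"sku_name": "SERVERLESS_SQL", "usage_quantity": 12.0, "price_per_dbu": 0.70},
    {"sku_name": "SERVERLESS_SQL", "usage_quantity": 3.5,  "price_per_dbu": 0.70},
    {"sku_name": "JOBS_COMPUTE",   "usage_quantity": 20.0, "price_per_dbu": 0.15},
]

def cost_by_sku(rows):
    """Sum usage_quantity * price per SKU (what a SQL GROUP BY would do)."""
    totals = {}
    for r in rows:
        totals[r["sku_name"]] = totals.get(r["sku_name"], 0.0) + (
            r["usage_quantity"] * r["price_per_dbu"]
        )
    return totals

# Equivalent SQL idea (verify table/column names in your workspace):
#   SELECT sku_name, SUM(usage_quantity * price) FROM ... GROUP BY sku_name
print(cost_by_sku(usage))
```

Comparing the serverless SKU totals against the classic-compute SKU totals over the same workload window is one way to argue the cost difference the poster asks about.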
Avinash_Narala
by Databricks Partner
  • 1434 Views
  • 0 replies
  • 0 kudos

File Trigger VS Autoloader

Hi, I recently came across File Trigger in Databricks and find it mostly similar to Autoloader. My first question is: why use a file trigger when we have Autoloader? In which scenarios should I go with file triggers versus Autoloader? Can you please differentiate?

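For context on the comparison: a file arrival trigger decides *when a job runs* (it fires a workflow when new files land), while Autoloader decides *how a stream ingests files* (incremental discovery with checkpointed state); the two are often combined. A sketch of the Autoloader side, with the options assembled as a plain dict so it runs standalone; the option names follow the cloudFiles docs, and the paths are placeholders:

```python
def autoloader_options(fmt: str, schema_location: str) -> dict:
    """Reader options for a cloudFiles (Autoloader) stream."""
    return {
        "cloudFiles.format": fmt,
        # Where Autoloader keeps its inferred schema and discovery state:
        "cloudFiles.schemaLocation": schema_location,
        "cloudFiles.inferColumnTypes": "true",
    }

opts = autoloader_options("json", "s3://my-bucket/_schemas/events")
# Illustrative Spark usage (requires a cluster):
# df = (spark.readStream.format("cloudFiles")
#         .options(**opts)
#         .load("s3://my-bucket/landing/events/"))
print(opts)
```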
FennVerm_60454
by New Contributor II
  • 8685 Views
  • 4 replies
  • 1 kudos

Resolved! How to clean up extremely large delta log checkpoints and many small files?

AWS by the way, if that matters. We have an old production table that has been running in the background for a couple of years, always with auto-optimize and auto-compaction turned off. Since then, it has written many small files (like 10,000 an hour...

Latest Reply
siddhathPanchal
Databricks Employee
  • 1 kudos

Sometimes, if a delta table has few commit versions, it won't create checkpoint files. The checkpoint file is responsible for triggering the log cleanup activities. If you observe that there are no checkpoint files available for th...

  • 1 kudos
3 More Replies
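To make the reply's mechanism concrete: Delta only removes old `_delta_log` JSON entries once a checkpoint supersedes them and they are older than the table's `delta.logRetentionDuration` (30 days by default). Below is a rough model of that eligibility rule in plain Python; real cleanup is performed by Delta itself and also depends on checkpoint cadence, so treat this as an illustration, not the actual algorithm:

```python
from datetime import datetime, timedelta

def cleanable_log_versions(log_entries, checkpoint_version, retention_days=30,
                           now=None):
    """Delta-log versions eligible for cleanup: strictly below the latest
    checkpoint AND older than the retention window."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=retention_days)
    return [v for v, ts in log_entries
            if v < checkpoint_version and ts < cutoff]

now = datetime(2024, 7, 1)
entries = [
    (0, datetime(2024, 1, 1)),   # old and checkpointed -> cleanable
    (1, datetime(2024, 5, 1)),   # old and checkpointed -> cleanable
    (2, datetime(2024, 6, 25)),  # inside retention -> kept
]
print(cleanable_log_versions(entries, checkpoint_version=2, now=now))  # → [0, 1]
```

If no checkpoint exists at all (`checkpoint_version` 0), nothing qualifies, which is the symptom the reply describes.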
Kayla
by Valued Contributor II
  • 3474 Views
  • 3 replies
  • 1 kudos

Resolved! Datadog Installation

Is anyone familiar with installing the Datadog agent on clusters? We're not having much luck. Honestly, the init script might not be running at all, since we're not seeing it in the log, but we can get a generic "hello world" init script to run a...

Latest Reply
Kayla
Valued Contributor II
  • 1 kudos

Responding here with the solution I found; hopefully it'll help anyone with similar issues. First, the Datadog install script is practically a matryoshka doll: the script creates another script, which creates a YAML file. One of the consequences of that...

  • 1 kudos
2 More Replies
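The "matryoshka" structure the reply mentions is a classic init-script pitfall: the outer shell script writes the inner script through a heredoc, and any `$VAR` in the inner text is expanded by the *outer* shell at write time unless escaped. A small Python illustration of just that escaping point; the file paths and script contents are invented for the example, not Datadog's actual installer:

```python
def render_init_script() -> str:
    """Outer init script writing an inner script through a heredoc.
    The inner script must read $DD_API_KEY when *it* runs, so the dollar
    sign is escaped (\\$) to stop the outer shell expanding it (likely to
    an empty string) while writing the file."""
    lines = [
        "#!/bin/bash",
        "cat > /tmp/start-datadog.sh <<EOF",
        "#!/bin/bash",
        'echo "api_key: \\$DD_API_KEY" >> /etc/datadog-agent/datadog.yaml',
        "EOF",
        "chmod +x /tmp/start-datadog.sh",
    ]
    return "\n".join(lines) + "\n"

print(render_init_script())
```

A missing escape at this layer produces exactly the symptom in the thread: the agent starts with an empty config value and nothing obvious in the logs.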
erigaud
by Honored Contributor
  • 4017 Views
  • 4 replies
  • 0 kudos

Pass Dataframe to child job in "Run Job" task

Hello, I have a Job A that runs a Job B. Job A defines a globalTempView, and I would like to somehow access it in the child job. Is that in any way possible? Can the same cluster be used for both jobs? If it is not possible, does someone know of a...

Latest Reply
rahuja
Contributor
  • 0 kudos

Hi @ranged_coop, yes, we are using the same job compute across different workflows. But I think different tasks are like different Docker containers, so that is why it becomes an issue. It would be nice if you could explain a bit about the approach yo...

  • 0 kudos
3 More Replies
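A globalTempView lives inside the Spark application of the cluster that created it, so a child job on separate compute cannot see it. A common workaround is to persist the data (for example, to a Delta table) and hand the table name to the child job as a job parameter. A sketch of building that parameter payload for the Jobs `run-now` call; the job ID and table name are placeholders, and the field names follow the Jobs API 2.1 docs:

```python
import json

def build_run_now_payload(job_id: int, table_name: str) -> str:
    """JSON body for POST /api/2.1/jobs/run-now, passing the staging
    table's name to the child job's notebook as a widget parameter."""
    return json.dumps({
        "job_id": job_id,
        "notebook_params": {"staging_table": table_name},
    })

payload = build_run_now_payload(123, "main.tmp.job_a_handoff")
# Parent job (illustrative Spark, needs a cluster):
#   df.write.mode("overwrite").saveAsTable("main.tmp.job_a_handoff")
# Child notebook then reads it back:
#   spark.table(dbutils.widgets.get("staging_table"))
print(payload)
```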
tf32
by Databricks Partner
  • 4744 Views
  • 2 replies
  • 1 kudos

Resolved! ERROR com.databricks.common.client.DatabricksServiceHttpClientException: DEADLINE_EXCEEDED

Hi, I got the error "com.databricks.WorkflowException: com.databricks.common.client.DatabricksServiceHttpClientException: DEADLINE_EXCEEDED" during the run of a job workflow with an interactive cluster, right at its start. It's a job that has been ...

Latest Reply
tf32
Databricks Partner
  • 1 kudos

Yes, subsequent runs have been successful. Thank you for the explanation.

  • 1 kudos
1 More Replies
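Since the error was transient (the rerun succeeded), wrapping the trigger in a retry with exponential backoff is a reasonable guard. A generic sketch, not Databricks-specific; the exception type, retry counts, and delays are illustrative:

```python
import time

def run_with_retry(fn, retries=3, base_delay=1.0, retryable=(TimeoutError,)):
    """Call fn(); on a retryable error, back off exponentially and retry.
    Re-raises the last error once the retry budget is spent."""
    for attempt in range(retries + 1):
        try:
            return fn()
        except retryable:
            if attempt == retries:
                raise
            time.sleep(base_delay * (2 ** attempt))

# Demo: fails twice with a transient error, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TimeoutError("DEADLINE_EXCEEDED")
    return "ok"

print(run_with_retry(flaky, base_delay=0.01))  # → ok
```

In a real pipeline, `fn` would be the jobs-API trigger call, and only the deadline-style error classes would go in `retryable` so genuine failures still surface immediately.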
Avinash_Narala
by Databricks Partner
  • 5470 Views
  • 2 replies
  • 2 kudos

Resolved! Databricks AI Assistant Cost Implications

I'm worried about how much the Databricks AI Assistant will cost me. I need to understand what I'll be charged for, especially when I give a prompt to the AI Assistant pane, and how it will operate in the background.

Latest Reply
Avinash_Narala
Databricks Partner
  • 2 kudos

Is there any token limit, either on the response or on the prompt we send?

  • 2 kudos
1 More Replies
Jreco
by Contributor
  • 7555 Views
  • 2 replies
  • 1 kudos

Resolved! SQLServer Incorrect syntax near the keyword 'WITH'

Hi mates! I'm trying to get some data from a SQL Server using a query; the query has a WITH statement, but I'm getting the following error: raise convert_exception(pyspark.errors.exceptions.connect.SparkConnectGrpcException: (com.microsoft.sqlserver.jdb...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @Jreco,
You need to use the prepareQuery option together with query, like below:

    url = "jdbc:sqlserver://server_name:1433;database=db_name"
    df = spark.read \
        .format("jdbc") \
        .option("url", url) \
        .option("prepareQuery", "with cte as ( SELECT ...

  • 1 kudos
1 More Replies
gaurav_khanna
by New Contributor II
  • 11170 Views
  • 4 replies
  • 3 kudos
Latest Reply
BartRJD
New Contributor II
  • 3 kudos

I am having the same issue (Azure Databricks). I have a compute cluster analytics-compute-cluster running in Single User access mode. The Event Log for the cluster says the cluster is running and the "Driver is healthy". I have Manage permissi...

  • 3 kudos
3 More Replies
duttong
by Databricks Partner
  • 8061 Views
  • 8 replies
  • 7 kudos

[Errno 11] resource temporarily unavailable

Hi Databricks Community, we faced a strange error today: the error below was returned when a notebook was run. It only happens on git-connected notebooks, and on rerun it succeeds. What is the issue?

[attached screenshot: duttong_0-1719413522488.png]
Latest Reply
Witold
Databricks Partner
  • 7 kudos

Just follow https://status.azuredatabricks.net; there you'll see an active incident in West Europe.

  • 7 kudos
7 More Replies