Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

by BananaHotSauce (New Contributor III)
  • 1338 Views
  • 1 reply
  • 3 kudos

Can I use PrivateLink and Customer Managed Policy for Cross Account Role

Hello, I'm trying to enable PrivateLink on my AWS Databricks quickstart. I use the customer-managed VPC policy for the cross-account role and supply it in the template. I'm getting an error that it cannot create a VPC Endpoint. Do I need to change the...

Latest Reply
Debayan
Databricks Employee

Hi @Chris Joshua Manuel, thanks for reaching out to Community.databricks.com. Cross-account VPC access works. Please refer to: https://tomgregory.com/cross-account-vpc-access-in-aws/. Also, please let us know in case of any further clarification n...

by NimaiAhl (New Contributor II)
  • 3206 Views
  • 3 replies
  • 0 kudos

Databricks SQL

I am not able to see the SQL option, and I am an admin myself. How can I use the SQL feature?

[Attachment: Screenshot 2022-08-24 at 9.57.13 PM]
Latest Reply
Debayan
Databricks Employee

Hi, I believe you are using a single-tenant shard, where the SQL option is not available.

2 More Replies
by 368545 (New Contributor III)
  • 4189 Views
  • 2 replies
  • 2 kudos

Resolved! Errors on Redash when queries are in cache

We got the following error when running queries on Redash connected to Databricks early today (2022-08-24):

```
Error running query: [HY000] [Simba][Hardy] (35) Error from server: error code: '0' error message: 'org.apache.spark.sql.catalyst.expressions.U...
```

Latest Reply
Debayan
Databricks Employee

This can be related to user permissions, particularly the permission needed to access the table in the database instance. I understand it works fine in the SQL editor, but can we still check the permissions?

1 More Replies
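
If anyone hits the same Redash error, a minimal way to follow up on the permissions suggestion above is to inspect the grants on the table the query reads. A sketch, assuming legacy table access control and a placeholder table name; the exact SHOW GRANT syntax varies by runtime and Unity Catalog status:

```
// List the grants on the table the Redash query touches.
// my_db.my_table is a placeholder; adjust to the actual table.
spark.sql("SHOW GRANT ON TABLE my_db.my_table").show(truncate = false)
```
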
by LearningDatabri (Contributor II)
  • 3869 Views
  • 2 replies
  • 1 kudos

Resolved! Why this change in the UI?

The change in the UI is really confusing about what to use where. Earlier I had High Concurrency (HC) clusters, and now I can't find them in the new UI. It says HC clusters are not available. I want to use the HC cluster functionality. Where can I get that?

Latest Reply
Prabakar
Databricks Employee

Hi @Databricks learner, see if this post helps you.

1 More Replies
by Nidhi1 (New Contributor II)
  • 2521 Views
  • 2 replies
  • 3 kudos

Resolved! Azure Databricks Photon

Does the Photon engine work against Azure SQL? Is there any documentation for this?

Latest Reply
AmanSehgal
Honored Contributor III

Photon is an engine available within the Databricks workspace. In Data Engineering, you can opt to have Photon enabled on your cluster, whereas in Databricks SQL, Photon is enabled by default.

1 More Replies
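
A quick, hedged way to confirm whether Photon is active on a given cluster, following up on the reply above: Photon runtimes report a version string containing "photon". The conf key below is an internal one and may change between releases:

```
// Check the runtime string of the current cluster for the photon marker.
val runtime = spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")
val isPhoton = runtime.contains("photon")
println(s"Photon enabled: $isPhoton")
```
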
by colt (New Contributor III)
  • 3533 Views
  • 2 replies
  • 1 kudos

Using built-in SQL functions in Delta Live tables

Do Delta Live Tables have different built-in SQL functions than the corresponding Databricks runtime? I created a cluster with Databricks runtime 10.3 (the current DLT runtime) so I could test my Delta Live Tables code before running it as a pipeline...

Latest Reply
Vidula
Honored Contributor

Hi @Colt Kesselring, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Th...

1 More Replies
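
Since the thread above shows no resolution, one empirical way to answer the DLT question is to list the built-in SQL functions on each runtime and diff the sets. A sketch; run it both on the interactive DBR 10.3 cluster and from inside the pipeline (e.g. logged from a DLT notebook):

```
// Collect the names of the built-in SQL functions on the current runtime.
import spark.implicits._

val functions = spark.sql("SHOW FUNCTIONS").as[String].collect().toSet
println(s"${functions.size} built-in functions on this runtime")
```
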
by Bency (New Contributor III)
  • 2401 Views
  • 2 replies
  • 1 kudos

How to get the list of parameters passed from widget

Hi, could someone help me understand how I would be able to get all the parameters in the task (from the widget)? I.e., I want to get input as parameter 'Start_Date', but this will not always be passed. It could be 'Run_Date' as well ...

Latest Reply
Vidula
Honored Contributor

Hi @Bency Mathew, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thank...

1 More Replies
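
For anyone landing on the widget question above, a minimal sketch of one common approach: try each candidate parameter name and keep the first that is defined. The names 'Start_Date' and 'Run_Date' come from the question; the helper itself is hypothetical:

```
// dbutils.widgets.get throws if the widget/parameter was not passed,
// so wrap each lookup in Try and take the first name that succeeds.
def firstDefinedWidget(names: Seq[String]): Option[(String, String)] =
  names.view.flatMap { name =>
    scala.util.Try(name -> dbutils.widgets.get(name)).toOption
  }.headOption

firstDefinedWidget(Seq("Start_Date", "Run_Date")) match {
  case Some((name, value)) => println(s"Got $name = $value")
  case None                => println("No date parameter was supplied")
}
```
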
by ejloh (New Contributor II)
  • 5234 Views
  • 3 replies
  • 1 kudos

SQL query with leads and lags

I'm trying to create a new column that fills in the nulls below. I tried using leads and lags, but it isn't turning out right. Basically, I'm trying to figure out who is in "possession" of the record, given the TransferFrom and TransferTo columns and sequence...

Latest Reply
Vidula
Honored Contributor

Hi there @Eric Lohbeck, does the response from @Hubert Dudek answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

2 More Replies
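
For the fill-forward problem above, the usual window trick is last(..., ignoreNulls = true) over an ordered frame. A sketch with toy data; the column names are inferred from the question, and a real query would likely also need a partitionBy on a record key:

```
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.{col, last}
import spark.implicits._

// Toy stand-in for the data in the screenshots.
val df = Seq(
  (1, Option.empty[String], Option("Alice")),
  (2, Option("Alice"), Option("Bob")),
  (3, Option.empty[String], Option.empty[String]),
  (4, Option("Bob"), Option("Carol"))
).toDF("Sequence", "TransferFrom", "TransferTo")

// Carry the last non-null TransferTo forward to fill the null rows.
val w = Window.orderBy("Sequence").rowsBetween(Window.unboundedPreceding, Window.currentRow)
val filled = df.withColumn("Possession", last(col("TransferTo"), ignoreNulls = true).over(w))
filled.show()
```
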
by dmayi (New Contributor)
  • 5737 Views
  • 1 reply
  • 0 kudos

Setting up custom tags (JobName, JobID, UserId) on an all-purpose cluster

Hi, I want to set up custom tags on an all-purpose cluster for purposes of cost breakdown and chargebacks. Specifically, I want to capture the JobName, JobID, and UserId of whoever ran the job. I can set other custom tags such as Business Unit, Owner... However,...

Latest Reply
Vidula
Honored Contributor

Hey there @DIEUDONNE MAYI, does the response from @Kaniz Fatma answer your question? If yes, would you be happy to mark it as best so that other members can find the solution more quickly? We'd love to hear from you. Thanks!

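
As a starting point for the tagging question above, a hedged sketch: the clusterUsageTags Spark confs expose a cluster's tags on the driver. These are internal keys and may change between runtimes; job clusters get job-related tags automatically, while an all-purpose cluster only carries the custom tags set in its spec:

```
// Print every tag Databricks attached to the current cluster (a JSON string).
val allTags = spark.conf.get("spark.databricks.clusterUsageTags.clusterAllTags")
println(allTags)
```
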
by ao1 (New Contributor III)
  • 2889 Views
  • 2 replies
  • 1 kudos

About privileges needed to clone a Git repository on Databricks

Hi all, do I need admin privileges to clone a Git repository on Databricks? Cloning was not possible with an account that did not have administrator privileges. Regards.

Latest Reply
AmanSehgal
Honored Contributor III

Navigate to Settings > Admin Console > Workspace settings > Repos and check the value for "Repos Git URL Allow List permissions". When set to 'Disabled (no restrictions)', users can clone or commit and push to any Git repository. When set to 'Restrict ...

1 More Replies
by tanin (Contributor)
  • 7045 Views
  • 8 replies
  • 8 kudos

Using .repartition(100000) causes the unit test to be extremely slow (>20 mins). Is there a way to speed it up?

Here's the code:

```
val result = spark
  .createDataset(List("test"))
  .rdd
  .repartition(100000)
  .map { _ => "test" }
  .collect()
  .toList

println(result)
```

I write tests to test for correctness, so I wonde...

Latest Reply
Vidula
Honored Contributor

Hey there @tanin, hope all is well! Just wanted to check in if you were able to resolve your issue, and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help. We'd love to hear from you. Thank...

7 More Replies
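
Since the visible reply above doesn't address the slowness, one common workaround is to make the partition count configurable so tests can shrink it. A sketch; the "test.partitions" property name is hypothetical, and any config mechanism would do:

```
import spark.implicits._

// Use a tiny partition count in tests and the full 100000 in production.
val numPartitions = sys.props.getOrElse("test.partitions", "100000").toInt

val result = spark
  .createDataset(List("test"))
  .rdd
  .repartition(numPartitions)
  .map(_ => "test")
  .collect()
  .toList

println(result)
```
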
by thushar (Contributor)
  • 38495 Views
  • 2 replies
  • 2 kudos

Resolved! Connect to an on-prem SQL server database

We need to connect to an on-prem SQL database to extract data, and we are using the Apache Spark SQL connector. The problem is we can't connect; it fails with SQLServerException: The TCP/IP connection to the host ***.***.X.XX, port 1433 has fail...

Latest Reply
Mohit_m
Databricks Employee

Maybe you can check the docs below and see if something is missing in the setup: https://docs.microsoft.com/en-us/azure/databricks/administration-guide/cloud-configurations/azure/on-prem-network

1 More Replies
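
Once the network path described in the linked doc is in place, the JDBC read itself looks roughly like this. A sketch with placeholder host, database, table, and credentials; the SQL Server JDBC driver ships with the Databricks runtime:

```
// Read an on-prem SQL Server table over JDBC (port 1433 must be reachable).
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:sqlserver://<host>:1433;databaseName=<database>")
  .option("dbtable", "dbo.<table>")
  .option("user", "<user>")
  .option("password", "<password>")
  .load()

df.show(5)
```
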
by 572509 (New Contributor)
  • 2641 Views
  • 2 replies
  • 1 kudos

Resolved! Notebook-scoped env variables?

Is it possible to set environment variables at the notebook level instead of the cluster level? Will they be available in the workers in addition to the driver? Can they override the env variables set at the cluster level?

Latest Reply
Prabakar
Databricks Employee

It is not possible to set them at the notebook level.

1 More Replies
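
To illustrate the cluster-level behavior behind the answer above, a small sketch, assuming a variable named MY_VAR was set in the cluster's Spark environment-variables configuration:

```
// Cluster-level environment variables are visible to the JVM process;
// MY_VAR is a hypothetical name set in the cluster configuration.
val myVar = sys.env.get("MY_VAR")
println(myVar.getOrElse("MY_VAR is not set"))
```
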
by data_testing1 (New Contributor III)
  • 5423 Views
  • 5 replies
  • 5 kudos

Resolved! How much of this tutorial or blog post can I run before starting a cloud instance of Databricks?

I'm new to Python and Databricks, so I'm still running tests on features, and I'm not sure how much of this can be run without Databricks, which I guess requires an AWS or Google Cloud account? Can I do all three stages without AWS Databricks, or how fa...

Latest Reply
Hubert-Dudek
Databricks MVP

Hi, to run it you need Databricks. You can try to open a free community account. Here is how: https://community.databricks.com/s/feed/0D53f00001ebEasCAE

4 More Replies
by RicksDB (Contributor III)
  • 3779 Views
  • 2 replies
  • 3 kudos

Resolved! Maximum job execution per hour

Hi, what is the maximum number of jobs we can execute in an hour for a given workspace? This page mentions 5000: https://docs.microsoft.com/en-us/azure/databricks/data-engineering/jobs/jobs. The number of jobs a workspace can create in an hour is limited ...

Latest Reply
Sivaprasad1
Databricks Employee

Up to 5000 jobs (both normal and ephemeral) may be created per hour in a single workspace.

1 More Replies
