Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

liu
by New Contributor III
  • 143 Views
  • 7 replies
  • 4 kudos

configure AWS authentication for serverless Spark

I only have an AWS Access Key ID and Secret Access Key, and I want to use this information to access S3. However, the official documentation states that I need to set the AWS_SECRET_ACCESS_KEY and AWS_ACCESS_KEY_ID environment variables, but I cannot ...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 4 kudos

Hi @liu, the proper way is to go to your cluster and set them up in the advanced section; that way they will be scoped at the cluster level. It's recommended to store the values themselves in a secret scope and expose them as environment variables: Use a secret in a Spa...
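As an illustration of that advice, a minimal sketch of the cluster-level setup, assuming a secret scope named `aws-scope` with keys `access-key-id` and `secret-access-key` (all three names are hypothetical). In the cluster's Advanced options, under Environment variables, you could set:

```
AWS_ACCESS_KEY_ID={{secrets/aws-scope/access-key-id}}
AWS_SECRET_ACCESS_KEY={{secrets/aws-scope/secret-access-key}}
```

The `{{secrets/<scope>/<key>}}` references are resolved from the secret scope when the cluster starts, so the actual key values never appear in plain text in the cluster configuration.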

6 More Replies
touchyvivace
by New Contributor
  • 54 Views
  • 1 replies
  • 1 kudos

Is there another way to authenticate to Azure Databricks using MSI in Java?

Hi, I am trying to connect to Azure Databricks using MSI in Java, but in the document https://learn.microsoft.com/en-us/azure/databricks/dev-tools/auth/azure-mi it said: The Databricks SDK for Java has not yet implemented Azure managed identities authentication...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @touchyvivace, unfortunately not, the documentation is up to date. In the Java SDK, MSI has not been implemented yet. And here's an open issue on GitHub: [FEATURE] Add support for Azure Managed Identity authentication (system and user-assigned) · Iss...

IONA
by New Contributor III
  • 78 Views
  • 1 replies
  • 2 kudos

Dev/Pie/Prd and the same workspace

Hi all! I'm appealing to all you folk who are cleverer than I for some advice on Databricks DevOps. I was asked by my team leader to expand our singular environment to a DevOps-style dev/pie/prd system, potentially using DABs to promote code to higher e...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 2 kudos

Hi @IONA, I guess you can still use DABs to simulate different environments on a single workspace. In targets, define 3 different environments but with the same workspace for all of them (something similar to the picture below). Then your intuition is good - it's ...
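To make that concrete, a hedged sketch of such a databricks.yml; the workspace host URL and root paths are hypothetical, and all three targets deliberately point at the same workspace:

```yaml
targets:
  dev:
    mode: development
    workspace:
      host: https://adb-1111111111111111.11.azuredatabricks.net
      root_path: /Workspace/bundles/dev/${bundle.name}
  pie:
    workspace:
      host: https://adb-1111111111111111.11.azuredatabricks.net
      root_path: /Workspace/bundles/pie/${bundle.name}
  prd:
    mode: production
    workspace:
      host: https://adb-1111111111111111.11.azuredatabricks.net
      root_path: /Workspace/bundles/prd/${bundle.name}
```

Promotion then becomes a target switch, e.g. `databricks bundle deploy -t pie`, with each environment isolated by its own deployment root path.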

ckanzabedian
by New Contributor
  • 98 Views
  • 1 replies
  • 1 kudos

ServiceNow LakeFlow Connector - Using TABLE API only for tables and NOT views

The current Databricks ServiceNow Lakeflow connector relies on ServiceNow REST TABLE API to capture data. And for some reason, it is unable to list a user defined view as a data source to be configured, even though ServiceNow user defined views are a...

Latest Reply
BS_THE_ANALYST
Esteemed Contributor II
  • 1 kudos

Hi @ckanzabedian, have you checked out the documentation yet for the ServiceNow connector? https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/servicenow-limits The link above is about the limits. I can't see a mention about ...

excavator-matt
by New Contributor III
  • 235 Views
  • 4 replies
  • 2 kudos

How do I use Databricks Lakeflow Declarative Pipelines on AWS DMS data?

Hi! I am trying to replicate an AWS RDS PostgreSQL database in Databricks. I have successfully managed to enable CDC using AWS DMS, which writes an initial load file and continuous CDC files in Parquet. I have been trying to follow the official guide Repl...

Data Engineering
AUTO CDC
AWS DMS
declarative pipelines
LakeFlow
Latest Reply
mmayorga
Databricks Employee
  • 2 kudos

Hi @excavator-matt , Yes, you are correct. CloudFiles/Autoloader handles idempotency on the file level.  From the guide's perspective, the View is created from the source files in the specified location. This view captures all files and their corresp...

3 More Replies
smoortema
by New Contributor III
  • 106 Views
  • 1 replies
  • 1 kudos

How to make FOR cycle and dynamic SQL and variables work together

I am working on a testing notebook where the table that is tested can be given as a widget. I wanted to write it in SQL. The notebook does the following steps in a cycle that should run 10 times: 1. Store the starting version of a delta table in a var...

Latest Reply
mmayorga
Databricks Employee
  • 1 kudos

Hi @smoortema, thank you for reaching out! You are very close to getting the “start_version”; you just need to include “INTO start_version” after the “EXECUTE IMMEDIATE” statement. Here is the updated code: BEGIN DECLARE sum INT DEFAULT 0; DECLARE start_ve...
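A hedged sketch of the overall pattern being described, assuming Databricks SQL scripting support and a `:table_name` widget parameter (whether DESCRIBE HISTORY can be queried as a subquery may depend on your runtime; the key point is the INTO clause after EXECUTE IMMEDIATE):

```sql
BEGIN
  DECLARE start_version BIGINT DEFAULT 0;
  DECLARE i INT DEFAULT 0;
  WHILE i < 10 DO
    -- capture the table's current Delta version into the variable via INTO
    EXECUTE IMMEDIATE
      'SELECT max(version) FROM (DESCRIBE HISTORY ' || :table_name || ')'
      INTO start_version;
    -- ... run the test steps against the table here ...
    SET i = i + 1;
  END WHILE;
END
```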

LonguiVic1
by New Contributor III
  • 123 Views
  • 1 replies
  • 1 kudos

Resolved! How to Find DBU Consumption and Cost for a Serverless Job?

Hello community, I'm new to using Serverless compute for my Jobs and I need some help understanding how to monitor the costs. I have configured and run a job that executes a notebook using the "Serverless" compute option. The job completed successfully...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @LonguiVic1, you can use system tables to track consumption in serverless. In the article below they even provide sample queries you can use. Also, notice that there's a list_prices system table that includes list prices over time for each available SKU....
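For illustration, a hedged query sketch joining the two system tables mentioned; the job_id filter value is hypothetical, and column names follow the billing schema as commonly documented, so verify them against your workspace:

```sql
SELECT
  u.usage_date,
  u.sku_name,
  SUM(u.usage_quantity)                     AS dbus,
  SUM(u.usage_quantity * p.pricing.default) AS est_list_cost
FROM system.billing.usage AS u
JOIN system.billing.list_prices AS p
  ON  u.sku_name = p.sku_name
  AND u.usage_start_time >= p.price_start_time
  AND (p.price_end_time IS NULL OR u.usage_start_time < p.price_end_time)
WHERE u.usage_metadata.job_id = '123456789'  -- hypothetical job id
GROUP BY u.usage_date, u.sku_name
ORDER BY u.usage_date;
```

Note this estimates cost from list prices; your account's actual rates may differ if you have negotiated discounts.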

daan_dw
by New Contributor III
  • 71 Views
  • 1 replies
  • 0 kudos

Databricks asset bundles in Python: referencing variables

Hey, I am using DABs, and in my .yml files I can reference variables set in my databricks.yml like this: git_branch: ${var.branch}. I would like to do the same thing in my DABs written in Python, but I cannot find any documentation on how to do this....

Latest Reply
SP_6721
Honored Contributor
  • 0 kudos

Hi @daan_dw, to reference variables defined in your databricks.yml in Python DAB code, define your variables class and use bundle.resolve_variable: https://docs.databricks.com/aws/en/dev-tools/bundles/python/#access-bundle-variables

seefoods
by Valued Contributor
  • 51 Views
  • 0 replies
  • 0 kudos

DQX - datacontract cli

Hello guys, can someone tell me whether I can combine DQX Databricks rule checks with the datacontract CLI? If yes, can you share your ideas? https://gpt.datacontract.com/sources/cli.datacontract.com/ Cordially,

nulltype
by New Contributor
  • 105 Views
  • 1 replies
  • 1 kudos

Online Table Migration

I am currently trying to migrate our Online Tables to synced tables with the Online Feature Store, since Online Tables is deprecated. When creating a new table, it worked just fine and behaved as the docs said it would (https://docs.databricks.com/aws/en/machine...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

Migrating from deprecated Online Tables to synced tables with the Databricks Online Feature Store can be tricky due to several points of integration and timing between Unity Catalog (UC), Feature Store metadata, and the underlying online store. The m...

KristiLogos
by Contributor
  • 2247 Views
  • 1 replies
  • 0 kudos

Spark JDBC Netsuite error - SQLSyntaxErrorException: [NetSuite][OpenAccess SDK JDBC Driver][OpenAcc

I'm trying to query the Customer NetSuite tables with Spark JDBC. I've added the .jar file to the cluster and am trying to run the below: jdbc_url = "jdbc:ns://xxxx.connect.api.netsuite.com:1708;ServerDataSource=NetSuite2.com;Encrypted=1;NegotiateS...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

If you are on a Spark version that supports .option("query", ...), you can do:

```python
df = spark.read \
    .format("jdbc") \
    .option("url", jdbc_url) \
    .option("query", "SELECT TOP 10 * FROM Customer") \
    .option("user", "xxxx") \
    ...
```

iskidet
by New Contributor
  • 158 Views
  • 2 replies
  • 1 kudos

Declarative Pipeline Failure for Autoloader

Hello folks, after moving my working serverless Auto Loader notebook to a declarative (DLT) pipeline, I'm getting an AccessDenied error. What could be causing this? Here are the DLT JSON and the error message from the DLT. I googled around and saw some hint...

(screenshots attached)
Latest Reply
Advika_
Databricks Employee
  • 1 kudos

Hello @iskidet! Were you able to resolve the AccessDenied issue? If the above suggestion helped, or if you found another solution, it would be great if you could mark it as the accepted solution or share your approach with the community.

1 More Replies
timo82
by New Contributor
  • 296 Views
  • 7 replies
  • 4 kudos

Resolved! [CANNOT_OPEN_SOCKET] Can not open socket: ["tried to connect to ('127.0.0.1', 45287)

Hello, after Databricks updated the Runtime from release 15.4.24 to release 15.4.25, we are getting the following error in all jobs: [CANNOT_OPEN_SOCKET] Can not open socket: ["tried to connect to ('127.0.0.1', 45287). What can we do here? Greetings

Latest Reply
HariSankar
Contributor III
  • 4 kudos

Hi @Hansjoerg, apologies for the confusion earlier. You are right, Bundles doesn't allow pinning to specific patch versions like 15.4.24. Your best option is to skip Bundles for now and use the regular Databricks Jobs setup (via UI or Jobs API), where yo...

6 More Replies
AkhileshVB
by New Contributor
  • 2039 Views
  • 1 replies
  • 0 kudos

Syncing lakebase table to delta table

I have been exploring Lakebase and I wanted to know if there is a way to sync CDC data from Lakebase tables to Delta tables in the Lakehouse. I know the other way is possible, and that's what was shown in the demo. Can you tell me how I can sync both the ta...

Latest Reply
sarahbhord
Databricks Employee
  • 0 kudos

Hey AkhileshVB! Lakebase-to-Delta CDC sync is in Private Preview; GA/Preview dates are not firm yet. Do you have a Databricks contact or account manager? They are the right place to go if you want early involvement. Workarounds & DIY approaches: Fo...

