- 714 Views
- 2 replies
- 0 kudos
Resolved! Can I use different git branches in the same repo for different tasks in a Databricks workflow
I have 2 tasks (T1 & T2) that run in branch B1 of Repo1. I have created a new task (which depends on T2) that points to a different branch B2 of the same Repo1. Is it possible to run them in the same workflow pipeline? When I tried to set this up, Databricks c...
- 0 kudos
I was able to find a workaround: I created separate jobs for the tasks that need to run in a different branch (the testing tasks) and then ran all of them from a new job.
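For anyone who wants to script that workaround, below is a rough sketch (not the poster's exact setup) using the Databricks Python SDK to build a parent job whose tasks simply trigger the existing branch-specific jobs. The job name, task keys, and job IDs are placeholders, and the exact SDK class names are worth checking against the SDK docs.

```python
# Hypothetical sketch: a parent job that chains two existing jobs,
# each already configured against a different git branch (B1 / B2).
# Requires the databricks-sdk package and workspace authentication.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

parent = w.jobs.create(
    name="orchestrate-branch-jobs",  # placeholder name
    tasks=[
        jobs.Task(
            task_key="run_branch_b1_job",
            run_job_task=jobs.RunJobTask(job_id=111),  # job pinned to branch B1 (placeholder ID)
        ),
        jobs.Task(
            task_key="run_branch_b2_job",
            depends_on=[jobs.TaskDependency(task_key="run_branch_b1_job")],
            run_job_task=jobs.RunJobTask(job_id=222),  # job pinned to branch B2 (placeholder ID)
        ),
    ],
)
print(f"Created parent job {parent.job_id}")
```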
- 558 Views
- 1 replies
- 0 kudos
Resolved! Setting a preset list of values in a task parameter in a Databricks job
I want to be able to have a user select from a preset list of values for a task parameter when they kick off a job with the "Run now with different parameters" option. In a notebook I am able to use dbutils.widgets.dropdown() to set the list of value...
- 0 kudos
Unfortunately, a dropdown list for job parameters is not currently available. You can always do a "Run now with different parameters", but the user will have to change the values manually rather than pick from a predefined list.
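As a partial workaround, the dropdown can still live inside the notebook the task runs, using the same widgets API the question mentions. A minimal sketch (parameter name and values are placeholders):

```python
# Inside the notebook that the task runs: define a dropdown with a preset
# list of allowed values and read the selection at run time.
dbutils.widgets.dropdown("environment", "dev", ["dev", "staging", "prod"], "Target environment")
environment = dbutils.widgets.get("environment")
print(f"Running against: {environment}")
```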
- 305 Views
- 1 replies
- 0 kudos
New icon for SQL Editor looks like a broken image
Hey - I may be showing my age here, but I felt compelled to point out that at a glance, the new icon for a SQL Editor tab in the Databricks UI looks an awful lot like a broken image link icon, from the days of Internet Explorer. This, subconsciously,...
- 0 kudos
Is the icon still showing as a broken image? Is this only happening in Internet Explorer, or does it work if you try Chrome, for example? Can you share a screenshot of your workspace so we can better understand how it renders?
- 584 Views
- 2 replies
- 1 kudos
Resolved! Internal Error with MERGE Command in Spark SQL
I'm trying to perform a MERGE between two tables (customers and customers_update) using Spark SQL, but I'm encountering an internal error during the planning phase. The error message suggests it might be a bug in Spark or one of the plugins in use. He...
- 1 kudos
The issue you encountered with the MERGE statement in Spark SQL, which was resolved by specifying the database and metastore, is likely related to how Spark handles table references during the planning phase. The internal error you faced suggests a b...
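For reference, here is a minimal sketch of the qualification fix described above, run through spark.sql so the table resolution is explicit. The catalog/schema/table names are placeholders, not the poster's actual objects.

```python
# Hypothetical example: fully qualify both tables so the planner resolves
# them against the intended catalog and schema instead of the session default.
spark.sql("""
    MERGE INTO main.sales.customers AS t          -- placeholder catalog.schema.table
    USING main.sales.customers_update AS s
    ON t.customer_id = s.customer_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```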
- 347 Views
- 1 replies
- 0 kudos
Ingress/Egress private endpoint
Hello, We have configured our Databricks environment with private endpoint connections injected into our VNET, which includes two subnets (public and private). We have disabled public IPs and are using Network Security Groups (NSGs) on the subnet, a...
- 0 kudos
Hi @Fkebbati, there will always be some costs related to data transfer between those accounts. Have a look at the Private Link pricing page. So it's expected, but MS tends to hide this kind of information.
- 313 Views
- 1 replies
- 1 kudos
Best way to find Databricks Certified Professionals?
Hi all! We have a few Databricks certified folks on our team, but we are looking for more! What is the best way to find a list of or know who is certified? We are looking for North America / Europe / LATAM / South America. I am part of the Microsoft ...
- 1 kudos
Hey @ladyleet, there is a directory of certified Databricks professionals: the Databricks Certified Credential Holder Directory.
- 349 Views
- 1 replies
- 0 kudos
Amazon MSK integration with Databricks
Hello everyone, I am a beginner in the world of Databricks. I am trying to achieve a use case which involves consuming messages from Amazon MSK and creating a Delta table in Databricks. I need some insights on what accesses are supposed ...
- 0 kudos
Hey @Amrit23, Databricks Assistant can help you write the code. You need to use spark.readStream to access the stream: https://docs.databricks.com/en/connect/streaming/kafka.html. And for the access permissions: Amazon MSK: Ensure you have the nec...
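A rough sketch of that pattern, to be run in a Databricks notebook. The broker address, topic, checkpoint path, and target table are placeholders, and the MSK authentication options (IAM vs. SCRAM/TLS) depend on your cluster setup and are only hinted at here.

```python
# Hypothetical example: stream from an Amazon MSK (Kafka) topic into a Delta table.
raw = (
    spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "b-1.mymsk.amazonaws.com:9098")  # placeholder broker
        .option("subscribe", "orders")                                      # placeholder topic
        .option("startingOffsets", "earliest")
        # Depending on the MSK auth mode you may also need kafka.security.protocol
        # and kafka.sasl.* options here.
        .load()
)

# Kafka delivers binary key/value columns; cast them to strings for downstream use.
parsed = raw.selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value", "timestamp")

(
    parsed.writeStream
        .option("checkpointLocation", "/Volumes/main/default/checkpoints/orders")  # placeholder path
        .toTable("main.default.orders_bronze")                                     # placeholder table
)
```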
- 414 Views
- 1 replies
- 0 kudos
Resolved! Databricks SQL as alternative to Spark thrift server
We are currently using Spark as our SQL engine with Thrift Server but are evaluating Databricks Serverless SQL as a potential alternative. We have a few specific questions: Does Databricks Serverless SQL support custom Spark extensions? Can we configur...
- 0 kudos
Hi @SachinJanani, 1 - Databricks Serverless SQL does not support custom Spark extensions (advanced Spark configs, libraries, etc.). This is because the serverless environment is designed to be highly optimized and managed by Databricks, which limits th...
- 638 Views
- 1 replies
- 0 kudos
Unable to access Azure blob storage with SAS token
I am following the Microsoft documentation to connect from a Databricks workspace to Azure blob storage, but it is not working. Any help is greatly appreciated. Below is the code: spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows....
- 0 kudos
Hi @bvraravind, the error you are encountering is due to an incorrect configuration setting in your code. The error message indicates that the configuration fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net is not recognized. Verify th...
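For comparison, here is the SAS configuration pattern from the Databricks ADLS Gen2 documentation as I recall it. The <storage-account> placeholder and the secret scope/key are left as placeholders, and the exact config key names should be double-checked against the current docs.

```python
# SAS token access to Blob / ADLS Gen2 via abfss:// (placeholders left in place).
storage_account = "<storage-account>"
sas_token = dbutils.secrets.get(scope="my-scope", key="my-sas-token")  # placeholder secret scope/key

spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "SAS")
spark.conf.set(
    f"fs.azure.sas.token.provider.type.{storage_account}.dfs.core.windows.net",
    "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider",
)
spark.conf.set(f"fs.azure.sas.fixed.token.{storage_account}.dfs.core.windows.net", sas_token)
```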
- 983 Views
- 1 replies
- 0 kudos
Troubleshooting the Error "Credential was not sent or was of an unsupported type for this API"
I previously worked on Databricks Asset Bundles (DAB) using a Service Principal token, and it was successful. However, when I attempted it again now, I encountered an error. Error: failed to compute file content for {{.project_name}}/databricks.yml.tmp...
- 0 kudos
Which type of token are you currently using: an OAuth token or an OBO token? Have you generated a new token for testing?
- 445 Views
- 1 replies
- 0 kudos
Update on CTE
So I am porting business logic from on-prem to Azure Databricks. What the on-prem process did is create the table and then update it. I have to construct that as a single query. Example: Create or replace table table1 with CTE1 as (), CTE2 as (selec...
- 0 kudos
An actual "Update", it may not be possible, but have you consider and will something like this work for you? This is simulating updates within the query without actual UPDATE statements: CREATE OR REPLACE TABLE table1 AS WITH CTE1 AS ( -- Your in...
- 2172 Views
- 5 replies
- 4 kudos
Impersonating a user
How do I impersonate a user? I can't find any documentation that explains how to do this or even hints that it's possible. Use case: I perform administrative tasks like assigning grants and roles to catalogs, schemas, and tables for the benefit of busines...
- 4 kudos
DB-I-8117 is the idea that is mentioned as being considered for the future, so adding votes will certainly help.
- 1224 Views
- 2 replies
- 0 kudos
Databricks grants update catalog catalog_name --json @privileges.json not updating privileges
Hi Team, I am trying to update the catalog privileges using the Databricks CLI grants command with a JSON file, but it is not updating the privileges. Please help with the grants update command usage. Command used: databricks grants update c...
- 0 kudos
Hello @Prasad_Koneru, if the command is not updating the privileges as expected, there could be a few reasons for this. Firstly, ensure that the JSON file is correctly formatted and contains the correct privilege assignments. The privileges.json fi...
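If the JSON shape is the problem, the grants endpoint expects a "changes" array. The sketch below writes such a file from Python; the principal, privilege names, and catalog are placeholders, and the exact privilege identifiers should be verified against the Unity Catalog documentation.

```python
# Hypothetical example: build privileges.json in the "changes" shape the
# grants API expects, then pass it to the CLI, e.g.:
#   databricks grants update catalog my_catalog --json @privileges.json
import json

payload = {
    "changes": [
        {
            "principal": "data_engineers",    # placeholder group
            "add": ["USE_CATALOG", "SELECT"],  # verify privilege names in the docs
            "remove": [],
        }
    ]
}

with open("privileges.json", "w") as f:
    json.dump(payload, f, indent=2)
```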
- 2930 Views
- 3 replies
- 3 kudos
Resolved! Find and replace
Hi, Is there a "Find and replace" option to edit SQL code? I am not referring to the "replace" function but something similar to Control + Shift + F in Snowflake or Control + F in MS Excel.
- 3 kudos
Is there an option to find and replace just within a cell instead of the entire notebook?
- 3469 Views
- 5 replies
- 1 kudos
How to get data from Splunk on a daily basis?
I am looking for ways to get data into Databricks from Splunk (similar to other data sources like S3, Kafka, etc.). I have received a suggestion to use the Databricks add-on to get/put data from/to Splunk. To pull the data from Databricks to S...
- 1 kudos
@Arch_dbxlearner - could you please follow this post for more details: https://community.databricks.com/t5/data-engineering/does-databricks-integrate-with-splunk-what-are-some-ways-to-send/td-p/22048
Labels: AI Summit (4), Azure (2), Azure databricks (2), Bi (1), Certification (1), Certification Voucher (2), Community (7), Community Edition (3), Community Members (1), Community Social (1), Contest (1), Data + AI Summit (1), Data Engineering (1), Databricks Certification (1), Databricks Cluster (1), Databricks Community (8), Databricks community edition (3), Databricks Community Rewards Store (3), Databricks Lakehouse Platform (5), Databricks notebook (1), Databricks Office Hours (1), Databricks Runtime (1), Databricks SQL (4), Databricks-connect (1), DBFS (1), Dear Community (1), Delta (9), Delta Live Tables (1), Documentation (1), Exam (1), Featured Member Interview (1), HIPAA (1), Integration (1), LLM (1), Machine Learning (1), Notebook (1), Onboarding Trainings (1), Python (2), Rest API (10), Rewards Store (2), Serverless (1), Social Group (1), Spark (1), SQL (8), Summit22 (1), Summit23 (5), Training (1), Unity Catalog (3), Version (1), VOUCHER (1), WAVICLE (1), Weekly Release Notes (2), weeklyreleasenotesrecap (2), Workspace (1)