- 1070 Views
- 4 replies
- 0 kudos
Deployment of tables and views in Unity Catalog, and repo structuring for catalog objects
We want to create a CI/CD pipeline for deploying Unity Catalog objects in order to improve our deployment capability. But how do we maintain a repository of tables, views, or any other objects created in the catalogs and schemas? Is this possible to do just l...
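One common pattern (a hedged sketch, not an official Databricks layout): keep one idempotent DDL script per catalog object in the repo and replay the scripts from a deployment job, so the pipeline can recreate or update every table and view on each run. All paths and object names below are hypothetical.

```python
# A minimal sketch, assuming a repo layout like:
#   uc_objects/
#     01_schemas/main_sales.sql        -- CREATE SCHEMA IF NOT EXISTS main.sales
#     02_tables/main_sales_orders.sql
#     03_views/main_sales_orders_v.sql
# Each .sql file should be idempotent (CREATE ... IF NOT EXISTS /
# CREATE OR REPLACE VIEW) so reruns are safe.
from pathlib import Path

for ddl_file in sorted(Path("uc_objects").rglob("*.sql")):
    print(f"applying {ddl_file}")
    spark.sql(ddl_file.read_text())
```

Sorting the files gives a deterministic order (schemas before tables before views), and the same script can be wired into a Databricks job or asset bundle as the deployment step.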
- 792 Views
- 3 replies
- 0 kudos
hive_metastore schema access control
We are trying to control access to schemas under hive_metastore, only allowing certain users to access the tables under a schema (via SQL, PySpark, and Python...). We have followed these steps in a testing schema: 1. Enable workspace table access control 2. Ru...
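For reference, legacy table ACLs on hive_metastore are granted with SQL along the lines of the sketch below; the schema, table, and group names are hypothetical, and the grants are only enforced on a cluster with table access control enabled.

```python
# A hedged sketch of legacy table ACLs on hive_metastore; "finance" and
# "data_analysts" are hypothetical placeholders. Run from a cluster that
# has table access control enabled.
spark.sql("GRANT USAGE ON SCHEMA finance TO `data_analysts`")
spark.sql("GRANT SELECT ON TABLE finance.transactions TO `data_analysts`")
spark.sql("SHOW GRANTS ON SCHEMA finance").show(truncate=False)
```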
- 929 Views
- 3 replies
- 1 kudos
cluster termination
Hello all, when I create an all-purpose cluster I get an idle timeout of 3 days, i.e. my cluster only terminates after 3 days. I want my cluster to terminate after 60 minutes of idle time, and I want to set this globally so that in future any cluster created by...
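A common way to enforce this globally is a cluster policy that fixes autotermination_minutes, with users only allowed to create clusters through that policy. A minimal sketch using the Databricks SDK (the policy name is hypothetical):

```python
# A hedged sketch: fix the idle timeout at 60 minutes for every cluster
# created under this policy. The policy name is a made-up placeholder.
import json
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
w.cluster_policies.create(
    name="enforce-60min-autotermination",
    definition=json.dumps({
        "autotermination_minutes": {"type": "fixed", "value": 60}
    }),
)
```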
- 2241 Views
- 0 replies
- 0 kudos
DBRX: Serving Endpoint Not Ready: Failing to Load the Model within 300 seconds
Hi Team, I created a serving endpoint using mlflow.deployments -> get_deploy_client -> create_endpoint. The endpoint was created successfully, but it did not reach the ready state, as the update failed with the error message "Exceeded maxim...
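For context, a create_endpoint call of the kind described looks roughly like the sketch below (the endpoint name and served entity are hypothetical placeholders). A readiness timeout like this usually means the model is too large to load within the window, so a bigger workload size, GPU serving, or provisioned throughput may be needed for a model the size of DBRX.

```python
# A hedged sketch of creating a serving endpoint via MLflow deployments;
# the endpoint name and entity_name are made-up placeholders.
from mlflow.deployments import get_deploy_client

client = get_deploy_client("databricks")
client.create_endpoint(
    name="dbrx-serving",
    config={
        "served_entities": [{
            "entity_name": "main.models.dbrx_finetuned",  # hypothetical UC model
            "entity_version": "1",
            "workload_size": "Small",
            "scale_to_zero_enabled": True,
        }]
    },
)
```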
- 617 Views
- 1 reply
- 0 kudos
My Databricks Professional Data Engineer exam was suspended, need help urgently (17/07/2024)
Hello Team, I had a terrible experience while attempting my Databricks Professional Data Engineer certification. The proctor abruptly asked me to show my desk; 30 minutes into the exam he/she asked multiple times, wasted my time, and then sus...
- 769 Views
- 2 replies
- 2 kudos
Resolved! unit testing
Currently I am creating unit tests for our ETL scripts, although the test is not able to recognize sc (SparkContext). Is there a way to mock SparkContext for a unit test? Code being tested: df = spark.read.json(sc.parallelize([data])) Error message rec...
- 2 kudos
Was able to get this to work. What I had to do was instantiate the "sc" variable in the PySpark notebook. PySpark code: "sc = spark.sparkContext". Then in the pytest script we add a "@patch()" statement with the "sc" variable and create a "mock_sc" variab...
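A minimal pytest sketch of that approach, with a hypothetical etl_job module standing in for the real ETL script (assumed to define sc = spark.sparkContext at module level):

```python
# etl_job.py is hypothetical and assumed to contain:
#   sc = spark.sparkContext
#   def load_json(data): return spark.read.json(sc.parallelize([data]))
from unittest.mock import MagicMock, patch

import etl_job  # hypothetical module under test


@patch.object(etl_job, "spark")
@patch.object(etl_job, "sc")
def test_load_json(mock_sc, mock_spark):
    # Stub the RDD that the code under test expects back from parallelize.
    mock_sc.parallelize.return_value = MagicMock(name="rdd")
    etl_job.load_json('{"id": 1}')
    mock_sc.parallelize.assert_called_once_with(['{"id": 1}'])
    mock_spark.read.json.assert_called_once_with(mock_sc.parallelize.return_value)
```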
- 1173 Views
- 0 replies
- 0 kudos
Issue with Service Principal and Grants in Databricks
Hi, we created a service principal in Databricks as per the documentation here. However, when we execute the following SQL query, we are unable to see the service principal: SHOW GRANTS testservice ON METASTORE. Error: [RequestId=564cbcf9-e8b7-476d-a4db-...
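One thing worth checking (a hedged suggestion, not a confirmed diagnosis): in SQL, a service principal is normally addressed by its application ID (a UUID) in backticks rather than by its display name, so SHOW GRANTS testservice ON METASTORE can fail even when the principal exists. A sketch with a made-up UUID:

```python
# The UUID below is a made-up placeholder for the service principal's
# application ID, which can be copied from the admin console.
app_id = "5b74fc39-0000-0000-0000-000000000000"
spark.sql(f"SHOW GRANTS `{app_id}` ON METASTORE").show(truncate=False)
```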
- 386 Views
- 0 replies
- 0 kudos
Databricks exam got suspended
My Databricks Certified Data Engineer Associate exam got suspended on 18 July 2024. I was continuously in front of the camera when an alert appeared, and then my exam resumed. Later a support person told me that my exam had been suspended. I don't kn...
- 438 Views
- 1 reply
- 1 kudos
Issue while launching the data engineer exam
I started my exam, but it said it had been suspended due to a technical issue, even though I had checked all prerequisites and passed the system checks. I have already raised a ticket; please resolve this issue as early as possible. Ticket no.: #00504757 #reschedule #issue...
- 1 kudos
@Retired_mod, please resolve this issue as early as possible. I have already raised the ticket.
- 1816 Views
- 2 replies
- 1 kudos
Resolved! Databricks Bundle Error
I am running this command: databricks bundle deploy --profile DAVE2_Dev --debug, and I am getting this error: 10:13:28 DEBUG open dir C:\Users\teffs.THEAA\OneDrive - AA Ltd\Databricks\my_project\dist: open C:\Users\teffs.THEAA\OneDrive - AA Ltd\Databr...
- 1 kudos
So I found a link to a page which said that the databricks bundle command expects python3.exe instead of python.exe. So I took a copy of python.exe, renamed it to python3.exe, and that seems to work. Thanks for investigating though.
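A hedged sketch of that workaround in Python, so it can be scripted instead of done by hand (the interpreter location is taken from sys.executable):

```python
# Copy the current interpreter to python3.exe next to it, so tools that
# look for "python3" on Windows can find one. This mirrors the manual
# rename described above; adjust if your Python lives elsewhere.
import shutil
import sys
from pathlib import Path

dest = Path(sys.executable).with_name("python3.exe")
shutil.copy2(sys.executable, dest)
print(f"copied {sys.executable} -> {dest}")
```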
- 841 Views
- 3 replies
- 1 kudos
How to implement a delta load when a table has only primary key columns
I have a table with two columns, both of which form the primary key. I want to do a delta load when taking data from source to target. Any idea how to implement this?
- 1 kudos
But that shouldn't be a problem. In the merge condition you check both keys, as in the example above. If the combination of the two keys already exists in the table, then do nothing. If there is a new combination of key1 and key2, just insert it into the target table. It's t...
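A minimal sketch of that merge; the table and column names (target_table, source_updates, key1, key2) are hypothetical:

```python
# Insert-only MERGE on a composite key: existing key pairs are left
# untouched, new key pairs are inserted into the target.
spark.sql("""
    MERGE INTO target_table AS t
    USING source_updates AS s
    ON t.key1 = s.key1 AND t.key2 = s.key2
    WHEN NOT MATCHED THEN
      INSERT (key1, key2) VALUES (s.key1, s.key2)
""")
```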
- 844 Views
- 3 replies
- 0 kudos
How can I store my cell output as a text file on my local drive?
I want to store the output of my cell as a text file on my local hard drive. I'm getting JSON output, and I need that JSON on my local drive as a text file.
- 0 kudos
Hi @InquisitiveGeek, you can do this following the approach described here: https://learn.microsoft.com/en-us/azure/databricks/notebooks/notebook-outputs#download-results
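If the UI download route doesn't fit, one hedged alternative is to write the JSON to DBFS from the notebook and pull it down with the Databricks CLI on the local machine (paths below are hypothetical):

```python
# In the notebook: persist the cell's JSON output to DBFS.
import json

payload = json.dumps({"status": "ok"})  # stand-in for the real cell output
dbutils.fs.put("dbfs:/tmp/output.json", payload, overwrite=True)

# Then, on the local machine:
#   databricks fs cp dbfs:/tmp/output.json ./output.json
```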
- 656 Views
- 2 replies
- 1 kudos
Can I move a single file larger than 100GB using dbutils fs?
Hello. I have a file over 100GB. Sometimes it is on the cluster's local path, and sometimes it's on the volume. I want to send it to another path on the volume, or to an S3 bucket. dbutils.fs.cp('file:///tmp/test.txt', '/Volumes/catalog/schem...
- 1 kudos
Hi @himanmon, this is caused by S3's limit on segment count: the part files can only be numbered from 1 to 10000. After setting spark.hadoop.fs.s3a.multipart.size to 104857600, did you RESTART the cluster? Because it'll only work when the clust...
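A quick arithmetic check of that limit: with at most 10,000 parts per multipart upload, the maximum object size scales with the configured part size, so a 100 MiB part size comfortably covers a 100GB file. The setting itself belongs in the cluster's Spark config, since it is read at startup (hence the restart).

```python
# fs.s3a.multipart.size caps each part; S3 allows at most 10,000 parts.
part_size = 104857600              # 100 MiB
max_parts = 10000
print(f"max upload: {part_size * max_parts / 2**30:.0f} GiB")  # ~977 GiB
```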
- 1177 Views
- 1 reply
- 0 kudos
Resolved! error creating token when creating databricks_mws_workspace resource on GCP
resource "databricks_mws_workspaces" "this" { depends_on = [ databricks_mws_networks.this ] provider = databricks.account account_id = var.databricks_account_id workspace_name = "${local.prefix}-dbx-ws" location = var.google_region clou...
- 0 kudos
My issue was caused by credentials in `~/.databrickscfg` (generated by the Databricks CLI) taking precedence over the creds set by `gcloud auth application-default login`. Google's application default creds should be used when using the Databricks Google...
- 1881 Views
- 2 replies
- 1 kudos
Resolved! Error - Data Masking
Hi, I was testing the masking functionality of Databricks and got the below error: java.util.concurrent.ExecutionException: com.databricks.sql.managedcatalog.acl.UnauthorizedAccessException: PERMISSION_DENIED: Query on table dev_retransform.uc_lineage.test_...
- 1 kudos
Hi @FaizH, are you using single-user compute by any chance? If you do, there is the following limitation. Single-user compute limitation: do not add row filters or column masks to any table that you are accessing from a single-user cluster. During t...
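For anyone hitting the same error, a hedged sketch of a Unity Catalog column mask, created and queried from shared (not single-user) compute; the catalog, schema, table, column, and group names are hypothetical:

```python
# Mask a column for everyone outside a privileged group.
spark.sql("""
    CREATE OR REPLACE FUNCTION main.demo.mask_ssn(ssn STRING)
    RETURN CASE WHEN is_account_group_member('hr_admins') THEN ssn
                ELSE '***-**-****' END
""")
spark.sql("ALTER TABLE main.demo.employees ALTER COLUMN ssn SET MASK main.demo.mask_ssn")
```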