- 345 Views
- 1 reply
- 0 kudos
Assistance Required for Enabling Unity Catalog in Databricks Workspace
Hi, I hope this message finds you well. I am reaching out regarding a concern with Databricks administrator privileges. I have an Azure subscription and I use Azure Databricks for my tutorials, but I currently do not have Global Administrator access, w...
Hi @Meghana_Vasavad, during the initial setup of Unity Catalog you need to find a person with the Global Administrator role in the Entra ID tenant. It's a one-time action, because they can then grant your account the necessary permission to manage the catalog, or even bet...
- 5031 Views
- 7 replies
- 0 kudos
Asset Bundles git branch per target
Hi, I am migrating a deployment setup with specific parameters per environment from dbx to Databricks Asset Bundles (DAB). This was working well with dbx, and I am now trying to define those parameters by defining targets (3 targets: dev, uat, p...
Something must have changed in the meantime on the Databricks side. I have only updated the Databricks CLI to 016, and now, using a git / branch entry under each target to deploy this setup, where feature-dab is the branch I want the job to pull sources from, I see t...
- 677 Views
- 5 replies
- 0 kudos
Resolved! maxFilesPerTrigger not working while loading data from Unity Catalog table
Hi, I am using streaming on Unity Catalog tables and trying to limit the number of records read in each batch. Here is my code, but it's not respecting maxFilesPerTrigger and instead reads all available data. (spark.readStream.option("skipChangeCommits",...
I believe you misunderstand the fundamentals of Delta tables. `maxFilesPerTrigger` has nothing to do with how many rows you will process at the same time; it limits the number of files per micro-batch. And if you really want to control the number of records per file, then you need to adapt the wr...
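For reference, a minimal sketch (assumed table names and checkpoint path, not the original poster's code) of the rate-limiting options the Delta streaming source exposes: `maxFilesPerTrigger` caps files per micro-batch and `maxBytesPerTrigger` caps the approximate data volume, while the number of rows per batch still depends on how many rows each underlying file contains. Note that `Trigger.Once` ignores these limits, whereas `Trigger.AvailableNow` honours them.

```python
# A minimal sketch of rate-limiting a streaming read from a Delta / Unity Catalog
# table; the table names and checkpoint path are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

stream = (
    spark.readStream
    .option("skipChangeCommits", "true")
    .option("maxFilesPerTrigger", 10)       # caps the number of *files* per micro-batch, not rows
    .option("maxBytesPerTrigger", "100m")   # soft cap on data volume per micro-batch
    .table("catalog.schema.source_table")   # hypothetical source table
)

query = (
    stream.writeStream
    .option("checkpointLocation", "/Volumes/catalog/schema/checkpoints/demo")  # hypothetical path
    .trigger(availableNow=True)             # AvailableNow honours the rate limits above
    .toTable("catalog.schema.target_table") # hypothetical target table
)
```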
- 536 Views
- 1 reply
- 1 kudos
Resolved! Serving pay-per-token Chat LLM Model
We have built a chat solution on an LLM RAG chat model, but we face an issue when we spin up a serving endpoint to host the model. According to the documentation, there should be several LLM models available as pay-per-token endpoints, for instance the DB...
@Henrik The documentation clearly states that it should be available in West Europe, but I'm also unable to see the DBRX pay-per-token endpoint. I think it would be best to raise an Azure support ticket - they should either somehow enable it on your workspace...
- 253 Views
- 1 reply
- 0 kudos
Databricks repo not working with installed python libraries
Hello, I'm trying to use some installed libraries on my cluster. I created a single-node cluster with Databricks Runtime 14.3 LTS. I also installed libraries like oracledb==2.2.1. Then when I try to use Python to load these libraries in the worksp...
Hello Nelson, how are you doing today? Try checking the permissions on your repo folder to ensure your cluster can access it without issues. Use absolute paths when running from your GitHub repo to avoid directory confusion. Reinstall the oracledb libr...
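For reference, a minimal sketch of the first two checks (the repo path below is a hypothetical placeholder, not taken from the original post): confirm the cluster actually sees the installed library, then reference the repo by its absolute workspace path so imports do not depend on the current working directory.

```python
# Minimal sketch: verify the installed library and make repo code importable.
import importlib.metadata
import sys

# 1) Confirm the cluster sees the installed library and its version.
print(importlib.metadata.version("oracledb"))   # expect "2.2.1"
import oracledb                                  # should import without an error

# 2) Add the repo's absolute workspace path so imports resolve regardless of
#    the notebook's working directory (hypothetical placeholder path).
repo_root = "/Workspace/Repos/<user>/<repo-name>"
if repo_root not in sys.path:
    sys.path.append(repo_root)
```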
- 380 Views
- 1 reply
- 0 kudos
Concurrent installation of UCX in multiple workspaces
I am trying to install UCX in multiple workspaces concurrently through a bash script but keep facing an issue. I have created separate directories for each workspace. I'm facing the error below every time. Installing UCX in Workspace1 Error: lib: cle...
Hi @unity_Catalog, how are you doing today? Try running the UCX installations sequentially to avoid file access conflicts, adding a small delay between each. Ensure each workspace uses a separate installation directory to prevent overlap. You could als...
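A rough sketch of that sequential approach, assuming the standard `databricks labs install ucx` command and one pre-configured Databricks CLI profile per workspace (the profile names below are hypothetical):

```python
# Sequential UCX installation sketch: one workspace at a time, with a short delay.
import os
import subprocess
import time

profiles = ["workspace1", "workspace2", "workspace3"]   # hypothetical CLI profile names

for profile in profiles:
    print(f"Installing UCX using profile {profile} ...")
    env = {**os.environ, "DATABRICKS_CONFIG_PROFILE": profile}   # select the target workspace
    subprocess.run(["databricks", "labs", "install", "ucx"], env=env, check=True)
    time.sleep(30)   # short pause between installs to avoid file/lock contention
```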
- 198 Views
- 1 reply
- 0 kudos
Informatica API data retrieval through Databricks Workflows or ADF: which is better?
The above set of activities took some 4 hours in ADF to explore and design, with greater ease of use, connections, and monitoring, and it could probably have taken 4 days or more using Databrick...
Hi @rohit_kumar, how are you doing today? To answer your subject-line question: if you're looking for flexibility and integration, Databricks Workflows might be better since it offers native support for complex data transformations and seamless inte...
- 457 Views
- 1 reply
- 0 kudos
Exam suspended
Hello Databricks Team, I had a terrible experience during my certification exam and have also raised a ticket with the Databricks team but haven't received any response to the mail so far. I appeared for the Databricks Certified Associate Developer for Apa...
Hi @Cert-Team, could you please look into this issue and assist me in rescheduling my exam, since it's very important for me to provide my certification to my employer at the earliest. Thanks and regards, Haneen Heeba
- 449 Views
- 1 reply
- 1 kudos
Resolved! Remove duplicate records using pyspark
Hi, I am trying to remove duplicate records from a PySpark DataFrame and keep the latest one. But somehow df.dropDuplicates(["id"]) keeps the first one instead of the latest. One option is to use pandas drop_duplicates. Is there any solution in pyspark...
Hi @sanjay, you can write a window function that will rank your rows and then filter rows based on that rank. Take a look at the Stack Overflow thread below: https://stackoverflow.com/questions/63343958/how-to-drop-duplicates-but-keep-first-in-pyspark-datafr...
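For completeness, a minimal sketch of that approach (the column names "id" and "updated_at" are hypothetical): partition by the key, order by the timestamp descending, and keep the first row of each partition.

```python
# Keep only the latest record per id using a window function.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.createDataFrame(
    [(1, "2024-01-01", "old"), (1, "2024-02-01", "new"), (2, "2024-01-15", "only")],
    ["id", "updated_at", "value"],
)

w = Window.partitionBy("id").orderBy(F.col("updated_at").desc())

latest = (
    df.withColumn("rn", F.row_number().over(w))   # rank rows within each id, newest first
      .filter(F.col("rn") == 1)                   # keep only the latest row per id
      .drop("rn")
)
latest.show()   # one row per id: the "new" record for id 1 and the only record for id 2
```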
- 2445 Views
- 15 replies
- 4 kudos
Can't activate users
A while back a user apparently became inactive on our Databricks platform for an unknown reason. So far nothing we tried has worked: delete and manually re-create the user; delete and let SSO create the user on login; use the Databricks CLI - shows no ...
I had the same issue with my admin account seemingly becoming inactive at random. The problem occurred after helping our platform team test out a new setup of another Databricks workspace. I was testing the setup by logging in as a standard user and an ...
- 485 Views
- 0 replies
- 0 kudos
Installation of cluster requirements.txt does not appear to run as google service account
I am running on 15.4 LTS Beta, which supports cluster-level requirements.txt files. The particular requirements.txt I have uploaded to my workspace specifies an extra index URL using a first line that looks like --extra-index-url https://us-central1-p...
- 576 Views
- 1 reply
- 0 kudos
Unlocking the Power of Databricks: A Comprehensive Guide for Beginners
In the rapidly evolving world of big data, Databricks has emerged as a leading platform for data engineering, data science, and machine learning. Whether you're a data professional or someone looking to expand your knowledge, understanding Databricks...
- 921 Views
- 4 replies
- 1 kudos
Databricks Asset Bundles - terraform.tfstate not matching when using databricks bundle deploy
Hello, I have noticed something strange with the asset bundle deployments via the CLI tool. I am trying to run databricks bundle deploy and I'm getting an error saying the job ID doesn't exist or I don't have access to it. Error: cannot read job: User h...
Hello, I experienced the same problem today. However, I was able to fix it by deleting the tf state file containing the deleted job. (It is located in the .bundle folder in my workspace under the user / service principal who deployed the bundle.)
- 111 Views
- 0 replies
- 0 kudos
Discover the Essence of Elegance: Jawaad Perfume by Lattafa Asad
Unveil a new level of sophistication with Jawaad Perfume by Lattafa Asad. This exquisite fragrance captures the essence of elegance through its meticulously crafted notes, blending timeless aromas into a harmonious symphony.
- 782 Views
- 5 replies
- 1 kudos
How to change the OAuth token lifetime and the maximum number of OAuth tokens
Hi team, I'm working on generating an OAuth token using a service principal, following the instructions here: https://docs.databricks.com/en/dev-tools/auth/oauth-m2m.html#language-CLI, specifically the section on manually generating a workspace-level ...
Thank you for your reply @szymon_dybczak. "Service principal added to admins group": this is the problem, @szymon_dybczak - we don't want that service principal to get the admin privilege; it should have access to some tables/schemas in our workspace but ...
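On that last point, a minimal sketch (hypothetical catalog, schema, table, and application ID) of granting the service principal only the Unity Catalog privileges it needs instead of workspace admin:

```python
# Grant narrowly scoped Unity Catalog privileges to a service principal,
# identified by its application ID (all names here are hypothetical).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

sp = "`00000000-0000-0000-0000-000000000000`"   # hypothetical application ID

spark.sql(f"GRANT USE CATALOG ON CATALOG main TO {sp}")            # see the catalog
spark.sql(f"GRANT USE SCHEMA ON SCHEMA main.analytics TO {sp}")    # see the schema
spark.sql(f"GRANT SELECT ON TABLE main.analytics.sales TO {sp}")   # read a specific table
```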
Labels: AI Summit (4), Azure (3), Azure databricks (3), Bi (1), Certification (1), Certification Voucher (2), Chatgpt (1), Community (7), Community Edition (3), Community Members (2), Community Social (1), Contest (1), Data + AI Summit (1), Data Engineering (1), Data Processing (1), Databricks Certification (1), Databricks Cluster (1), Databricks Community (11), Databricks community edition (3), Databricks Community Rewards Store (3), Databricks Lakehouse Platform (5), Databricks notebook (1), Databricks Office Hours (1), Databricks Runtime (1), Databricks SQL (4), Databricks-connect (1), DBFS (1), Dear Community (3), Delta (10), Delta Live Tables (1), Documentation (1), Exam (1), Featured Member Interview (1), HIPAA (1), Integration (1), LLM (1), Machine Learning (1), Notebook (1), Onboarding Trainings (1), Python (2), Rest API (11), Rewards Store (2), Serverless (1), Social Group (1), Spark (1), SQL (8), Summit22 (1), Summit23 (5), Training (1), Unity Catalog (4), Version (1), VOUCHER (1), WAVICLE (1), Weekly Release Notes (2), weeklyreleasenotesrecap (2), Workspace (1)