- 1299 Views
- 2 replies
- 3 kudos
Resolved! Databricks Asset Bundles + Artifacts + Poetry
Hello, I've configured DABs on our project successfully. Moreover, I was able to switch from setuptools to Poetry almost successfully. In the project's databricks.yml I configured it as the documentation suggested; I've just changed the name of the arti...
- 3 kudos
Hi @Fiabane, could you first check: do you see your .whl file in your artifacts folder? Could you try to install the package by running this in your notebook: %pip install <path to your wheel>? As far as I understand, you want to have a job ...
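For reference, a minimal sketch of what the artifacts block in databricks.yml can look like when building with Poetry (the artifact key "default" and the project layout are illustrative, not taken from the thread):

```yaml
# Hypothetical databricks.yml fragment: build the wheel with Poetry
# instead of setuptools.
artifacts:
  default:
    type: whl
    build: poetry build   # Poetry writes the wheel to ./dist by default
    path: .
```

On `databricks bundle deploy`, the built wheel is uploaded to the bundle's artifacts folder in the workspace, which is why checking for the .whl file there is a sensible first diagnostic step.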
- 358 Views
- 0 replies
- 1 kudos
Change AWS S3 storage class for subset of schema
I have a schema that has grown very large. There are mainly two types of tables in it. One of those types accounts for roughly 80% of the storage. Is there a way to somehow set a policy for those tables only to transfer them to a different storage cl...
- 666 Views
- 2 replies
- 0 kudos
Databricks Kryo setup
I would like to consolidate all our Spark jobs in Databricks. One of the jobs currently running in Azure HDInsight does not work properly as a Databricks JAR job. It uses Spark 3.3 RDDs and requires configuring Kryo serialisation. There...
- 0 kudos
Integrating Spark jobs with Databricks can greatly improve your workflow. For jobs that require Kryo serialization, make sure you configure your Spark session correctly; you may need to adjust the serialization settings in your Spark configuration...
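As a rough sketch of that advice (shown in Python for brevity; the settings and class names are illustrative): on Databricks, `spark.serializer` must be set in the cluster's Spark config rather than at runtime, because the serializer cannot change once the SparkContext exists. In a job that builds its own session, the equivalent looks like:

```python
# Sketch only: on an interactive Databricks cluster, put these keys in the
# cluster's Spark config (Advanced options) instead, since the session
# already exists when your code runs.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .config("spark.kryoserializer.buffer.max", "512m")   # illustrative size
    .config("spark.kryo.registrationRequired", "false")
    .getOrCreate()
)
```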
- 388 Views
- 1 replies
- 2 kudos
Azure Databricks Classic Compute Plane Firewall
I’m designing a compute plane configuration that will align our data platform with internal policies from a security perspective. As part of this exercise I'm documenting how permissible inbound and outbound traffic is controlled using NSG rules,...
- 2 kudos
@Jim-Shady wrote: I’m designing a compute plane configuration that will align our data platform with internal policies from a security perspective. As part of this exercise I'm documenting how permissible inbound and outbound traffic is controlled...
- 2353 Views
- 1 replies
- 0 kudos
Resolved! How to deploy to Databricks Assets Bundle from Azure DevOps using Service Principal?
I have a CI/CD process that, after a Pull Request (PR) to main, deploys to staging. It works using a Personal Access Token with Azure Pipelines. From local, deploying using a Service Principal works (https://community.databricks.com/t5/administration-a...
- 0 kudos
I needed to deploy a job using CI/CD Azure Pipelines without using OAuth; this is the way: first you need to have the Service Principal configured. For that you need to generate it in your workspace; with this you will have: a host, which is your wo...
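A minimal pipeline sketch of that pattern, assuming the variable names below (they are illustrative, not the poster's): the Databricks CLI picks up service-principal credentials from environment variables, so the deploy step can look like:

```yaml
# azure-pipelines.yml fragment (sketch; in practice the secrets would come
# from a variable group or Azure Key Vault).
steps:
  - script: databricks bundle deploy -t staging
    displayName: Deploy bundle as service principal
    env:
      DATABRICKS_HOST: $(DATABRICKS_HOST)            # workspace URL
      DATABRICKS_CLIENT_ID: $(SP_CLIENT_ID)          # service principal application ID
      DATABRICKS_CLIENT_SECRET: $(SP_CLIENT_SECRET)  # secret generated in the workspace
```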
- 721 Views
- 5 replies
- 1 kudos
Resolved! Creating Groups with API and Python
I am working on a notebook to help me create Azure Databricks groups. When I create a group in a workspace using the UI, it automatically creates the group at the account level and links them. When I create a group using the API, and I create the w...
- 1 kudos
That was it, thank you. I was looking at the wrong details. I really appreciate it.
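For anyone hitting the same issue, the key detail is creating the group at the account level rather than inside a single workspace. A minimal sketch with the Databricks Python SDK (the account ID, workspace ID, and group name are placeholders):

```python
# Sketch: create an account-level group, then assign it to a workspace,
# mirroring what the UI does automatically. Assumes credentials are picked
# up from the environment or .databrickscfg.
from databricks.sdk import AccountClient
from databricks.sdk.service.iam import WorkspacePermission

account = AccountClient(
    host="https://accounts.azuredatabricks.net",  # Azure accounts console
    account_id="<account-id>",                    # placeholder
)

# Create the group at the account level, not inside one workspace.
group = account.groups.create(display_name="data-engineers")

# Link the account-level group to a workspace so it appears there.
account.workspace_assignment.update(
    workspace_id=1234567890,        # placeholder workspace ID
    principal_id=int(group.id),
    permissions=[WorkspacePermission.USER],
)
```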
- 1017 Views
- 5 replies
- 0 kudos
Resolved! Unable to Create Job Task Using Git Provider Invalid Path
I am attempting to create a task in a job using the Git Provider as a source, with GitHub as the provider. The repo is a private repo. Regardless of how I enter the path to the notebook, I receive the same error that the notebook path is invalid and o...
- 0 kudos
Like I said in a previous response, this started working automatically a few days ago with no changes on our end. The developer who was working on this decided to try it one more time and it just worked, with no error this time. I don't know if Databri...
- 667 Views
- 1 replies
- 0 kudos
Bring data from Databricks to a SharePoint list using Power Automate
Good afternoon to all; I am new to this community. We are trying to bring data from Databricks to a SharePoint list using the Power Automate app (create a workflow and trigger it when there is a new record or an existing record is modified in the source table in...
- 0 kudos
Hi all, can anyone assist me with this request? Thanks in advance.
- 378 Views
- 1 replies
- 0 kudos
Tabs for notebooks
Browsing this page of the documentation, the displayed GIF shows a notebook that is opened in its own tab. I've been looking for how to enable this feature in my own workspace, but cannot find it. Does anyone know how to enable it?
- 0 kudos
Nope. It seems that it is some kind of new version of the UI. In the SQL editor one can open multiple tabs, but for Python notebooks I have no idea.
- 430 Views
- 1 replies
- 0 kudos
How to generate an Azure Subscription from a Databricks Generated Service Principal?
Hello, I currently have a Service Principal (SP) Client_Id and its associated secret, which I generated directly from my workspace in Databricks. I was following this post: https://github.com/databricks/cli/issues/1722, but I don't know how to generate ...
- 672 Views
- 1 replies
- 0 kudos
Resolved! Restrictions on setting environment variables in Compute Policies
As recommended by Databricks, we are trying to use compute policies to set environment variables, which are used by our notebooks, across clusters. However, when specifying a JSON string as an env var, we are getting this error upon applying the policy t...
- 0 kudos
This is because you use Shared access mode, which enables multiple users to use the cluster simultaneously. However, there are features that do not work on Shared access mode clusters: https://docs.databricks.com/en/compute/access-mode-limitations....
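For context, a compute policy pins an environment variable through a `spark_env_vars.*` attribute, and a JSON payload has to be escaped into a single string value. A minimal sketch (the variable name and payload are illustrative):

```json
{
  "spark_env_vars.MY_APP_CONFIG": {
    "type": "fixed",
    "value": "{\"feature_flag\": true, \"region\": \"westeurope\"}"
  }
}
```

Whether the variable actually reaches the notebook then depends on the cluster's access mode, per the limitations page linked above.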
- 940 Views
- 1 replies
- 0 kudos
One Azure Tenant with Multiple Azure Databricks Accounts
Hi there, we have one Azure tenant with multiple subscriptions; each subscription is a project in itself. At the moment we have only one Azure Databricks account, and all workspaces (created under different subscriptions) are associated with it. Can ...
- 0 kudos
Hello @stevanovic, as far as I understand, in Azure you can create one Databricks account per tenant, meaning that Unity Catalog, for example, is also a tenant-level resource. There is a fantastic blog post available here: https://community.databricks.com/t5/t...
- 900 Views
- 3 replies
- 2 kudos
Resolved! Silly question-Easy way to show full notebook path or owner in UI?
We have a few people working in Databricks right now in different clones of the same repository. Occasionally we'll have multiple people with the same branch open: one working, another just has it open to see what it looks like, that sort of deal. This ha...
- 2 kudos
Hi @Kayla, I think the easiest way to check the current notebook's location when it is open is to hover the mouse cursor over the name of the notebook (top left, "ADE 3.1 - Streaming Deduplication" in this case) and wait for about 1-2 seconds; after that...
- 677 Views
- 1 replies
- 1 kudos
Delta Lake: Running Delete and writes concurrently
Is it safe to run a delete query when there are active writes to a Delta Lake table? Next question: is it safe to run a vacuum while writes are actively being made?
- 1 kudos
Hello @sharat_n, yes, it is generally safe to run a DELETE query on a Delta Lake table while active writes are happening. Delta Lake is designed with ACID transactions, meaning operations like DELETE, UPDATE, and MERGE are atomic and isolated. In other...
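To make the concurrency guarantees concrete, a short sketch (the table name and predicate are illustrative):

```python
# DELETE and VACUUM are ordinary Delta transactions: a concurrent writer
# either commits before/after them or fails with a retryable conflict,
# rather than corrupting the table. `spark` is the notebook's session.
from delta.tables import DeltaTable

dt = DeltaTable.forName(spark, "main.sales.events")  # hypothetical table

# Transactional delete; safe alongside concurrent appends.
dt.delete("event_date < '2023-01-01'")

# VACUUM only removes files older than the retention window (7 days by
# default), so files that active readers may still need are retained.
dt.vacuum(168)  # retention in hours
```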
- 434 Views
- 0 replies
- 0 kudos
Databricks report error: unexpected end of stream, read 0 bytes from 4 (socket was closed by server)
Has anyone encountered this error and knows how to resolve it? "Unexpected end of stream, read 0 bytes from 4 (socket was closed by server)." This occurs in Databricks while generating reports. I've already adjusted the wait_timeout to 28,800, and both ...