- 6739 Views
- 5 replies
- 2 kudos
Resolved! How to enable Genie?
Hi All, based on the article below, to enable Genie one needs to: 1. Enable Azure AI services-powered features (that is done). 2. Enable Genie from the Previews page. However, I do not see Genie among the Previews. I am using Azure Databricks. Any idea how ...
I can access previews at the account level but can't see Genie in Previews.
- 3954 Views
- 2 replies
- 3 kudos
Databricks shared workspace
We have a self-service portal through which users can launch Databricks clusters of different configurations. This portal is set up to work in Dev, Sandbox, and Prod environments. We have configured Databricks workspaces only for Sandbox and Prod por...
@Alberto_Umana Thanks for sharing the doc links. We have the exact same setup to support a shared Databricks workspace, but I'm still facing an issue while adding an instance profile. I am trying to add an AWS instance profile created in the source AWS account (no Databricks w...
- 512 Views
- 1 replies
- 2 kudos
displayHTML <a href="#id"> not working
Many packages output an HTML report, e.g. ydata-profiler. The report contains links to other parts of the report, but when the user clicks the links a new window is opened instead of scrolling to the correct section of the displayed HTML. Could this be...
Hello @invalidargument, currently there is no direct support from the Databricks end to modify this behavior without using such a workaround. The displayHTML function in Databricks renders HTML content within an iframe, and the injected JavaScript h...
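A workaround along the lines hinted at above is to inject a small script into the report before passing it to displayHTML, so that clicks on in-page anchors scroll within the iframe instead of opening a new window. This is a minimal sketch; the helper name is illustrative, not a Databricks API.

```python
# Hypothetical helper: append a script that intercepts clicks on
# href="#id" anchors and scrolls to the target element instead of
# letting the iframe open a new window.
ANCHOR_FIX = """
<script>
document.addEventListener('click', function (e) {
  var a = e.target.closest('a[href^="#"]');
  if (!a) return;
  e.preventDefault();
  var target = document.getElementById(a.getAttribute('href').slice(1));
  if (target) target.scrollIntoView({behavior: 'smooth'});
});
</script>
"""

def patch_anchor_links(html: str) -> str:
    """Return the report HTML with the click-interception script appended."""
    return html + ANCHOR_FIX

report = '<h1 id="top">Report</h1><a href="#top">Back to top</a>'
patched = patch_anchor_links(report)
# In a notebook you would then call: displayHTML(patched)
```

In a notebook you would generate the report string first (e.g. from ydata-profiler's `to_html()`), patch it, and only then hand it to displayHTML.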
- 1059 Views
- 1 replies
- 2 kudos
Resolved! Databricks Docker CLI image - how to debug Terraform when deploying Asset Bundle
We're having issues when deploying an asset bundle using the Docker Databricks CLI image. The validation part passes OK:
14:21:15 Name: test
14:21:15 Target: prototype-dev
14:21:15 Workspace:
14:21:15 Host: https://adb-xxx.azuredatabricks.net/
14:21:15 ...
OK, as it turns out, in order to bypass the proxy we needed to set the no_proxy env variable in both upper and lower case (!), like this:
NO_PROXY="adb-xxx.azuredatabricks.net" docker run \
  -v %teamcity.build.checkoutDir%:/my-bundle \
  -v %teamcity.build...
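The both-cases trick matters because different tools and libraries read different spellings of the variable. A small sketch of building such an environment before launching the container (host name is the placeholder from the thread):

```python
import os

def proxy_bypass_env(host: str) -> dict:
    """Return a process environment with the proxy-bypass host set
    under both spellings, since some tools read NO_PROXY and others
    read no_proxy."""
    return {**os.environ, "NO_PROXY": host, "no_proxy": host}

env = proxy_bypass_env("adb-xxx.azuredatabricks.net")
# Then e.g.:
# subprocess.run(["docker", "run", ...], env=env, check=True)
```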
- 991 Views
- 2 replies
- 1 kudos
Databricks Apps: AWS Secret Manager Access
We are exploring Databricks Apps. We want a Databricks App to interact with AWS Secrets Manager. How can we configure this, and how do we configure IAM on the AWS side for this to take place? @app
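One common pattern, assuming the app's compute can assume an IAM role granted secretsmanager:GetSecretValue, is to read the secret with boto3. The secret name and region below are placeholders; the parsing helper is shown runnable, while the AWS call itself is sketched in comments.

```python
import json

def parse_secret(response: dict) -> dict:
    """Extract the JSON payload from a Secrets Manager
    GetSecretValue response."""
    return json.loads(response["SecretString"])

# In the app, assuming an IAM role with secretsmanager:GetSecretValue
# (region and secret name are placeholders):
#
#   import boto3
#   client = boto3.client("secretsmanager", region_name="us-east-1")
#   response = client.get_secret_value(SecretId="my-app/credentials")
#   creds = parse_secret(response)

sample = {"SecretString": '{"user": "app", "password": "s3cret"}'}
print(parse_secret(sample)["user"])  # → app
```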
Thanks @Alberto_Umana. Yes, we will try to use Databricks secrets; that can be helpful. A couple of other questions on Databricks Apps: 1) Can we use a framework other than those mentioned in the documentation (Streamlit, Flask, Dash, Gradio, Shiny)? 2) If required, can w...
- 2293 Views
- 4 replies
- 0 kudos
Different NCC having same subnets
Hello, we are following this Microsoft tutorial to secure our storage access: https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-firewall We see a weird behavior when we create several NCCs in th...
OK, so no, I correctly set the subnets of my NCC in the Virtual Networks setting as documented: https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/serverless-firewall This setting is working fine, without th...
- 668 Views
- 1 replies
- 2 kudos
How can you enable {{"data_security_mode": "USER_ISOLATION" }} on SQL warehouses
Hi, I want to create a SQL warehouse with {{"data_security_mode": "USER_ISOLATION" }}; however, I don't find the section to get the JSON file of my cluster. Thanks
Hello @carlos_tasayco, access modes are not configurable on SQL warehouses. Please see: https://docs.databricks.com/api/workspace/warehouses/create
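For contrast, here is a sketch of what a Warehouses API create payload looks like: it has no data_security_mode field, because that setting belongs to clusters, not warehouses. The host, token, and field values below are illustrative.

```python
# Sketch of a SQL Warehouses create payload. There is no
# data_security_mode key here; that is a cluster-level setting.
payload = {
    "name": "my-warehouse",
    "cluster_size": "Small",
    "max_num_clusters": 1,
    "enable_serverless_compute": True,
}

# Sending it (host and token are placeholders):
# import requests
# requests.post(
#     "https://adb-xxx.azuredatabricks.net/api/2.0/sql/warehouses",
#     headers={"Authorization": f"Bearer {token}"},
#     json=payload,
# )
```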
- 2830 Views
- 0 replies
- 1 kudos
Get managedResourceGroup from serverless
Hello, in my job I have a task where I should modify a notebook to get the environment dynamically. For example, this is how we get it: dic = {"D":"dev", "Q":"qa", "P":"prod"}; managedResourceGroup = spark.conf.get("spark.databricks.xxxxx"); xxxxx_Index = m...
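The truncated snippet appears to look up an environment letter embedded in the managed resource group name and map it through the dict. A hedged reconstruction of that idea; the group-name format and helper name are assumptions, and the actual Spark conf key is elided in the post:

```python
# Hedged reconstruction: map a single environment letter embedded in
# the managed resource group name to an environment name. The name
# format used here is an assumption.
env_map = {"D": "dev", "Q": "qa", "P": "prod"}

def env_from_resource_group(rg_name: str) -> str:
    """Return the environment for the first mapped letter found in
    the resource group name (e.g. '...-D-...' -> 'dev')."""
    for token in rg_name.split("-"):
        if token in env_map:
            return env_map[token]
    raise ValueError(f"no environment letter in {rg_name!r}")

print(env_from_resource_group("databricks-rg-myproject-D-eastus"))  # → dev
```

In the notebook, `rg_name` would come from the Spark conf lookup shown in the post.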
- 1188 Views
- 3 replies
- 3 kudos
Subscription and billing: Directly or via AWS?
Is there any difference in price and features if I subscribe to Databricks directly or via an AWS account? Are any startup discount programs available?
No problem. There is no difference in pricing between subscribing to Databricks directly or via the AWS Marketplace; it's just a different billing mechanism.
- 1323 Views
- 5 replies
- 1 kudos
Resolved! Payment receipts of Databricks payments
Hello experts, I am trying to get receipts for the monthly payments made to Databricks. I need them for the finance department of the organization I am working for. The only billing information I have access to is the usage dashboards and the tables ...
- 2088 Views
- 3 replies
- 4 kudos
Resolved! DLT-Asset bundle : Pipelines do not support a setting a run_as user that is different from the owner
Hello! We're using Databricks Asset Bundles to deploy to several environments using a DevOps pipeline. The service principal running the CI/CD pipeline and creating the job (the owner) is not the same as the SPN that will be running the jobs (run_as). This...
By default, whoever creates the pipeline is the owner/run_as. However, yes, this is a feature in progress. I would still encourage you to share your request here: https://docs.databricks.com/en/resources/ideas.html#ideas ...
- 866 Views
- 3 replies
- 0 kudos
Resolved! Issue in building databricks source code on ARM platform
Hi,I am attempting to build the Databricks Koalas source code on an ARM platform but am encountering an error. Could anyone provide specific steps for building it on ARM?
Hi @mohakgoel, you're on the right track with the steps you've mentioned for building Koalas. It's important to note that building Koalas is not the same as building the entire Databricks source code. Koalas is an open-source project that provides a ...
- 2726 Views
- 5 replies
- 0 kudos
Resolved! Using Databricks Docker CLI image with Asset Bundles - Azure
I'm trying to deploy an asset bundle from a CI/CD pipeline. I'd like to use the Docker Databricks CLI image for that, but I can't get it to authenticate. I'm using an Entra service principal for my deployments and we are using TeamCity as our CI/CD tool. T...
OK, tested, and now everything is working. According to the docs, bundle settings have the highest priority: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/authentication#auth-eval No wonder none of my env vars or .databrickscfg setting...
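The linked page describes an ordered evaluation: bundle settings beat environment variables, which beat .databrickscfg profiles. The first-match idea can be sketched as follows; this is illustrative, not the CLI's actual implementation.

```python
# Illustrative first-match precedence: bundle settings win over
# environment variables, which win over .databrickscfg profiles.
def resolve_host(bundle: dict, env: dict, profile: dict):
    """Return the host from the highest-priority source that sets one."""
    for source in (bundle, env, profile):
        host = source.get("host")
        if host:
            return host
    return None

print(resolve_host(
    {"host": "https://bundle-host"},
    {"host": "https://env-host"},
    {},
))  # → https://bundle-host
```

This is why env vars and config-file settings appear to be "ignored" once the bundle itself specifies a host.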
- 4626 Views
- 1 replies
- 0 kudos
We are getting AttributeError: module 'numpy.typing' has no attribute 'NDArray' with Tidal jobs
File "/databricks/python/lib/python3.8/site-packages/PIL/_typing.py", line 10, in <module>
    NumpyArray = npt.NDArray[Any]
AttributeError: module 'numpy.typing' has no attribute 'NDArray'
We are using the 10.4 runtime version. Please suggest a fix.
The file path in the error suggests it comes from the Pillow library dependency: "/databricks/python/lib/python3.8/site-packages/PIL/_typing.py". Pillow's _typing.py checks for NumPy >= 1.21, while the 10.4 runtime has these versions installed ...
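Since numpy.typing.NDArray only exists from NumPy 1.21 onward, the mismatch can be checked with a plain version comparison, sketched below with no third-party imports:

```python
# numpy.typing.NDArray only exists from NumPy 1.21 onward, so a newer
# Pillow release fails on the older NumPy bundled with runtime 10.4.
def version_tuple(v: str) -> tuple:
    """Convert a dotted version string into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def supports_ndarray(numpy_version: str) -> bool:
    """True if numpy.typing.NDArray is available (NumPy >= 1.21)."""
    return version_tuple(numpy_version) >= version_tuple("1.21")

print(supports_ndarray("1.20.3"))  # → False
print(supports_ndarray("1.21.0"))  # → True
```

The practical fix, under the assumption above, is to pin an older Pillow release compatible with the runtime's NumPy rather than upgrade NumPy on a fixed runtime.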
- 3963 Views
- 1 replies
- 1 kudos
Troubleshooting Cluster
We had a failure on a previously running fact table load (our biggest one), and it looked like an executor was failing due to a timeout error. As a test we upped the cluster size and changed spark.executor.heartbeatInterval to 300s and the spark....
The XXKDA error code is a general indicator for task scheduler issues or SPARK_JOB_CANCELLED.