- 3335 Views
- 1 replies
- 0 kudos
Enable or disable Databricks Assistant in the Community Edition.
Hello, good afternoon, great people. I was following the step-by-step instructions to enable or disable Databricks Assistant in my Databricks Community Edition so I could turn on AI assistance. However, I couldn't find the option and was unable to enable it...
- 4656 Views
- 0 replies
- 0 kudos
DAB template dbt-sql not working
Hi, we are trying to use the dbt-sql template provided for Databricks Asset Bundles but are getting an error as follows: It looks like it's related to the default catalog configuration. Has anyone faced this previously, or can anyone help with the same?
- 3457 Views
- 1 replies
- 2 kudos
How to collect a thread dump from Databricks Spark UI.
If you observe a hung job, thread dumps are crucial to determine the root cause. Hence, it would be a good idea to collect the thread dumps before cancelling the hung job. Here are the instructions to collect the Spark driver/executor thread dump: ...
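The post's step-by-step UI instructions are truncated above, so they are not reproduced here. As a rough command-line alternative (assuming jps and jstack are available on the driver node, which is typically the case since they ship with the JDK), a dump can also be captured from a notebook cell:

```python
# Rough alternative sketch: capture a driver thread dump with jstack (JDK tools assumed on PATH).
import subprocess

# List JVM processes on the driver to find the Spark driver's PID.
print(subprocess.run(["jps", "-l"], capture_output=True, text=True).stdout)

driver_pid = "<PID from the jps output above>"  # fill in manually before running the next lines
dump = subprocess.run(["jstack", driver_pid], capture_output=True, text=True).stdout

# Persist the dump so it survives after the hung job is cancelled.
with open("/dbfs/tmp/driver_thread_dump.txt", "w") as f:
    f.write(dump)
```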
- 1196 Views
- 0 replies
- 0 kudos
Response code 400 received when using VSCode on Windows 10 but no issue while using Ubuntu
I use VSCode on Windows 10 for building and deploying a workflow from my system and always encounter response code 400 when trying to deploy it. I am able to deploy the workflows via Ubuntu, but not via Windows. Has anyone encountered this issue befo...
- 653 Views
- 0 replies
- 0 kudos
jobs.python_wheel_task.entry_point can't find entry points defined in dependency packages
Let's say I have packageA with no entry points; packageA depends on the dependencyA package, which has entry points. In order to be able to use those entry points, i.e.:
```yaml
python_wheel_task:
  package_name: packageA
  entry_point: dependencyA_entry
```
I ...
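Not confirmed as the intended fix, but one common workaround sketch: declare a console_scripts entry point on packageA itself that simply points at dependencyA's callable, so it resolves under the same package_name the task names (dependencyA.cli:main is a hypothetical callable path).

```python
# setup.py for packageA -- workaround sketch (dependencyA.cli:main is a hypothetical target).
from setuptools import setup, find_packages

setup(
    name="packageA",
    version="0.1.0",
    packages=find_packages(),
    install_requires=["dependencyA"],
    entry_points={
        "console_scripts": [
            # Re-declare the entry point on packageA, pointing at dependencyA's function,
            # so jobs.python_wheel_task.entry_point can find it under package_name: packageA.
            "dependencyA_entry = dependencyA.cli:main",
        ]
    },
)
```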
- 1462 Views
- 2 replies
- 0 kudos
Child notebook is not displaying output.
I have built a pipeline to execute a Databricks notebook containing SQL scripts. It executes the notebook, but I am not able to see the output for each cell. I am executing the child notebook through a driver notebook.
- 0 kudos
Hi Verr, in short it depends on how your child notebook is configured, but I would start with the following points. Output logging settings: check the logging settings for your notebook cells. Ensure that the cells are configured to display output. In...
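One frequent cause, shown as a minimal sketch (assuming the child is launched with dbutils.notebook.run; /Shared/child_sql is a placeholder path): the child's per-cell output is not rendered in the driver notebook, only the value it passes to dbutils.notebook.exit comes back, along with a link to the child run.

```python
# Minimal sketch (placeholder path /Shared/child_sql; 600 is the timeout in seconds).
# dbutils.notebook.run does not replay the child's cell-by-cell output in the driver notebook;
# it returns only what the child passes to dbutils.notebook.exit, plus a link to the child run.
result = dbutils.notebook.run("/Shared/child_sql", 600)
print(result)  # value the child returned via dbutils.notebook.exit("...")
```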
- 4075 Views
- 6 replies
- 2 kudos
Resolved! column "storage_sub_directory" is now always NULL in system.information_schema.tables
Hello, I am running a job that depends on the information provided in the column storage_sub_directory in system.information_schema.tables, and it worked until 1-2 weeks ago. Now I discovered in the docs that this column is deprecated and always null, ...
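Not a drop-in replacement for the deprecated column, but a possible workaround sketch if what is ultimately needed is the table's physical storage path (main.default.my_table is a placeholder): DESCRIBE DETAIL on a Delta table returns a location column.

```python
# Workaround sketch (placeholder table name; requires access to the table).
row = spark.sql("DESCRIBE DETAIL main.default.my_table").select("location").first()
print(row["location"])  # physical storage path of the table
```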
- 1093 Views
- 1 replies
- 0 kudos
Displaying Unity Catalog metadata and other information in SharePoint
Are there any connectors or APIs we can use to display metadata stored in Unity Catalog to business users in SharePoint?
- 0 kudos
Hi @Lucifer, since the metadata is stored as tables in the system schema, you can use Databricks to extract the data and then display it in SharePoint. Docs: Statement Execution API: Run SQL on warehouses | Databricks on AWS
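A rough sketch of that approach (the host, token, and warehouse ID are placeholders; whatever renders the rows in SharePoint, e.g. a Power Automate flow or a custom web part, is out of scope here):

```python
# Sketch: pull Unity Catalog metadata via the SQL Statement Execution API (placeholders throughout).
import requests

host = "https://<workspace-url>"
headers = {"Authorization": "Bearer <personal-access-token>"}

resp = requests.post(
    f"{host}/api/2.0/sql/statements",
    headers=headers,
    json={
        "warehouse_id": "<sql-warehouse-id>",
        "statement": (
            "SELECT table_catalog, table_schema, table_name, comment "
            "FROM system.information_schema.tables LIMIT 100"
        ),
        "wait_timeout": "30s",
    },
)
resp.raise_for_status()
payload = resp.json()

# Small results are returned inline; these rows can then be pushed to a SharePoint list or page.
if payload["status"]["state"] == "SUCCEEDED":
    for row in payload["result"]["data_array"]:
        print(row)
```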
- 986 Views
- 0 replies
- 0 kudos
DLT to push data instead of a pull
I am relatively new to Databricks, and from my recent experience it appears that at every step in a DLT pipeline, we define each live table (be it streaming or not) to pull data from upstream. I have yet to see an implementation where data from upstream woul...
- 3510 Views
- 1 replies
- 1 kudos
Resolved! Unable to provide access in Unity Catalog using SQL commands
I am trying to provide access in Unity Catalog using SQL commands. I am following the documentation below: https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/manage-privileges/ It suggests creating SQL commands in belo...
- 1 kudos
Try this. For some reason the quotes are crazy when using these commands.
GRANT USAGE ON CATALOG `uda_dev` TO `your-group`
GRANT SELECT ON SCHEMA uda_dev.default TO `your-group` (without quotes for the schema)
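For reference, on current Unity Catalog releases the catalog- and schema-level privileges are named USE CATALOG and USE SCHEMA; a sketch run from a notebook (catalog and group names taken from the reply above):

```python
# Sketch with the current Unity Catalog privilege names; backticks are only needed around
# identifiers with special characters, such as the hyphenated group name.
spark.sql("GRANT USE CATALOG ON CATALOG uda_dev TO `your-group`")
spark.sql("GRANT USE SCHEMA ON SCHEMA uda_dev.default TO `your-group`")
spark.sql("GRANT SELECT ON SCHEMA uda_dev.default TO `your-group`")
```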
- 1319 Views
- 0 replies
- 0 kudos
Issues with Runtime 15.1/15.2Beta in shared access mode
We have been using runtime 14.2 in shared access mode for our compute cluster in Databricks for quite some time. We are now trying to upgrade to Python 3.11 for some dependency management, thereby requiring us to use runtime 15.1/15.2, as runtime 14.2 only ...
- 2557 Views
- 3 replies
- 1 kudos
Cluster policy type
Hi guys, I am creating a cluster policy through JSON: "runtime_engine": {"type": "fixed", "value": "PHOTON"}. When I run the above code, the PHOTON option is getting enabled but grayed out... What would I specify in the type field so that the Photon option sho...
- 1 kudos
@thrinadhReddy This section includes a reference for each of the available policy types. There are two categories of policy types: fixed policies and limiting policies. Fixed policies prevent user configuration on an attribute. The two types of fixed...
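Since a fixed policy locks the attribute (hence the grayed-out Photon toggle), here is a sketch of the alternative using a limiting "allowlist" type, created via the Databricks Python SDK (the policy name and values are illustrative):

```python
# Sketch: an "allowlist" limiting policy lets users pick a runtime engine instead of locking it.
import json
from databricks.sdk import WorkspaceClient

definition = {
    "runtime_engine": {
        "type": "allowlist",
        "values": ["PHOTON", "STANDARD"],
        "defaultValue": "PHOTON",
    }
}

w = WorkspaceClient()
w.cluster_policies.create(name="photon-optional-policy", definition=json.dumps(definition))
```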
- 787 Views
- 1 replies
- 0 kudos
Databricks workflow job run is taking double the time in the EU region
We have a scheduled job in Databricks Workflows. This job run is taking around 5 hours; a month ago it was taking about 2.5 hours. Can anyone tell me what may be the reason behind this? Note: no change has been made in this period of ...
- 0 kudos
You can check if you are using spot instances on your job cluster. By the way, if you are using Azure, West Europe is in very high demand and sometimes it takes time to provision compute, but that should be a matter of minutes, not hours. Maybe check if your data v...
- 8442 Views
- 3 replies
- 0 kudos
Resolved! How to pass variables to a python file job
Hi everyone, it's relatively straightforward to pass a value to a key-value pair in a notebook job. For the Python file job, however, I couldn't figure out how to do it. Does anyone have any idea? I have tried out different variations for a job wi...
- 0 kudos
Thanks so much for this! By the way, is there a way to do it with the JSON interface? I am struggling to get the parameters if entered in this way
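A sketch of one way this works for a Python script task, with the JSON side shown in the comments (the task path and parameter names are placeholders):

```python
# my_job_script.py -- sketch for a spark_python_task ("Python script" task).
# In the job's JSON definition, parameters arrive as plain command-line arguments, e.g.:
#   "spark_python_task": {
#       "python_file": "/Workspace/Users/me/my_job_script.py",
#       "parameters": ["--env", "dev", "--run-date", "2024-05-01"]
#   }
import argparse
import sys

parser = argparse.ArgumentParser()
parser.add_argument("--env")
parser.add_argument("--run-date")
args = parser.parse_args()

print(f"raw argv: {sys.argv[1:]}")          # ['--env', 'dev', '--run-date', '2024-05-01']
print(f"env={args.env}, run_date={args.run_date}")
```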
- 1030 Views
- 1 replies
- 0 kudos
Chat bot with Azure Blob and Databricks
Hi team, I am thinking of starting a chat bot application for Teams to query data from Azure Blob and Databricks tables using the Python programming language. Please help me out on how I can start and which tools I can use for this requirement. Thanks in advanc...
- 0 kudos
@Nagrjuna, that's a great idea! Although we do not know about your use case completely, I am sure you would definitely fall in love with our AI/ML products. To create a Python chat bot application that can pull data from Azure Blob Storage and Datab...
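As a starting point only, a sketch that touches both data sources with commonly used client libraries (all hostnames, paths, table names, and tokens are placeholders; the Teams bot layer itself is out of scope here):

```python
# Sketch: read a blob from Azure Storage and query a Databricks table over a SQL warehouse.
# pip install azure-storage-blob databricks-sql-connector
from azure.storage.blob import BlobServiceClient
from databricks import sql

# 1) Read a document from Azure Blob Storage (placeholder connection string / container / blob).
blob_service = BlobServiceClient.from_connection_string("<azure-storage-connection-string>")
blob = blob_service.get_blob_client(container="docs", blob="faq.txt")
faq_text = blob.download_blob().readall().decode("utf-8")

# 2) Query a Databricks table (placeholder workspace host, warehouse HTTP path, and token).
with sql.connect(
    server_hostname="<workspace-host>",
    http_path="<warehouse-http-path>",
    access_token="<personal-access-token>",
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT * FROM main.default.customer_faq LIMIT 10")
        rows = cur.fetchall()

# A chat layer (for example, an LLM behind Databricks Model Serving) would then answer
# Teams users' questions using faq_text and rows as context.
```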