- 1075 Views
- 2 replies
- 1 kudos
You haven't configured the CLI yet
I have written pyfunc model code in a Databricks notebook and deployed and served the model through an endpoint. I tried to query the endpoint from the Databricks notebook itself using the code below, but I am getting a CLI error. Not sure why I am getting this error since I ...
- 1 kudos
Try this article, which seems to have worked for most such use cases: https://medium.com/featurepreneur/solving-the-cli-configuration-error-in-databricks-d0462a96449f
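The "You haven't configured the CLI yet" error usually means the client library cannot find workspace credentials. One workaround is to call the serving endpoint's REST API directly with a personal access token. A minimal sketch, assuming a hypothetical endpoint name `my-pyfunc-endpoint` and that the workspace URL and token are supplied via environment variables:

```python
import os
import requests

# Assumed environment variables; replace with your workspace URL and a personal access token.
host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]

endpoint_name = "my-pyfunc-endpoint"    # hypothetical endpoint name

# Model Serving endpoints accept JSON payloads such as dataframe_records.
payload = {"dataframe_records": [{"feature_1": 1.0, "feature_2": "a"}]}

response = requests.post(
    f"{host}/serving-endpoints/{endpoint_name}/invocations",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json())
```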
- 72 Views
- 2 replies
- 0 kudos
Is there a way to clone an AI/BI Dashboard?
We moved some of our processes from notebooks to dashboards, but it looks like I cannot clone a dashboard like I can clone a notebook.
- 0 kudos
Hi, please follow these steps to clone an AI/BI dashboard: 1. Open the dashboard and switch to draft mode. 2. Click the kebab menu (three vertical dots) and select the Clone option.
- 5107 Views
- 5 replies
- 1 kudos
Insufficient Permissions Issue on Databricks
I have encountered a technical issue on Databricks. While executing commands in both Spark and SQL within the Databricks environment, I’ve run into permission-related errors when selecting files from DBFS. "org.apache.spark.SparkSecurityException: [IN...
- 1 kudos
Please refer to some of the other community articles that address the "no module" error: https://community.databricks.com/t5/data-engineering/udf-importing-from-other-modules/td-p/58988
- 944 Views
- 7 replies
- 2 kudos
Alter table to add/update multiple column comments
I was wondering if there's a way to alter a table and add/update comments for multiple columns at once using SQL or API calls. For instance: ALTER TABLE <table_name> CHANGE COLUMN <col1> COMMENT '<comment1>', CHANGE COLUMN <col2> COMMENT '<comment2>'; ...
- 2 kudos
Flock printing is a specialized technique that adds a velvety texture to fabric or other surfaces, making designs stand out with a luxurious, raised finish. This process involves applying tiny fiber particles, called "flock," onto a surface coated wi...
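Databricks SQL generally accepts only one column change per ALTER TABLE statement, so one workaround is to loop over the columns. A minimal sketch using PySpark in a notebook (where `spark` is predefined), assuming a hypothetical table and made-up column comments:

```python
# Hypothetical table and comments; adjust to your catalog/schema/table.
table_name = "main.demo.orders"
column_comments = {
    "order_id": "Primary key for the order",
    "order_ts": "Timestamp the order was placed (UTC)",
}

for column, comment in column_comments.items():
    # One ALTER TABLE statement per column; single quotes in the comment are escaped.
    escaped = comment.replace("'", "\\'")
    spark.sql(f"ALTER TABLE {table_name} ALTER COLUMN {column} COMMENT '{escaped}'")
```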
- 84 Views
- 1 reply
- 0 kudos
forums and groups - Databricks Certified Data Engineer Professional
Hi, I’m exploring more about the Databricks Certified Data Engineer Professional certification and was wondering if there’s a specific community or forum where people with this certification gather to share knowledge and experiences. Could anyone poi...
- 0 kudos
Hi @1diazdev, this community has a lounge specifically for certified members: https://community.databricks.com/t5/databricks-certified-lounge/gh-p/Databricks-Certified-Lounge I haven’t come across a community specifically focused on Databricks Certifi...
- 178 Views
- 7 replies
- 0 kudos
Delta Live Tables Permissions
Hi all, I'm the owner of Delta Live Tables pipelines, but I don't see the option described in the documentation to grant permissions to different users. The only options available are "Settings" and "Delete". In the sidebar, click Delta Live Tables. Select the nam...
- 0 kudos
Got it, so the issue might be different. Can you share a screenshot of what it looks like right now in the pipeline when you click the vertical dots? Is this happening with every pipeline?
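If the Permissions option does not appear in the UI, another route is the workspace Permissions REST API, which also covers pipelines. A minimal sketch, assuming a hypothetical pipeline ID and user, and that credentials are provided via environment variables:

```python
import os
import requests

host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]

pipeline_id = "1234-abcd-5678"          # hypothetical pipeline ID
grantee = "analyst@example.com"         # hypothetical user

# PATCH adds/updates ACL entries without replacing the whole access control list.
response = requests.patch(
    f"{host}/api/2.0/permissions/pipelines/{pipeline_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "access_control_list": [
            {"user_name": grantee, "permission_level": "CAN_MANAGE"}
        ]
    },
    timeout=60,
)
response.raise_for_status()
print(response.json())
```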
- 5000 Views
- 5 replies
- 3 kudos
Databricks Asset Bundles and Dashboards
Hi Databricks Team! I saw in the documentation that Databricks Asset Bundles will support Dashboards as well in the future. Could you please share, when we can expect that feature to be available? Is it coming only for the new Lakeview dashboards or ...
- 3 kudos
It is now supported with Asset Bundles; an example on GitHub can be found here: https://github.com/databricks/bundle-examples
- 82 Views
- 1 reply
- 0 kudos
Converting managed tables to external tables
I have some managed tables in a catalog which I plan to convert to external tables, but I want to preserve the version history of the tables as well. I have tried deep cloning, but it builds the external table as version 0. Is there any way I can achieve t...
- 0 kudos
Hello @Roger667, When you use the DEEP CLONE command, you can specify the version of the table you want to clone. This allows you to create a clone of the table at a specific version, thus preserving the version history up to that point. Here is an e...
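The clone itself still starts its own history at version 0, but you can choose which source version it is based on. A minimal sketch of a deep clone at a specific version to an external location, run from a notebook with `spark` predefined and assuming hypothetical table names, a version number, and a storage path:

```python
# Hypothetical source/target names, version, and external path; adjust to your environment.
source_table = "main.sales.orders_managed"
target_table = "main.sales.orders_external"
source_version = 42
external_path = "abfss://data@mystorageaccount.dfs.core.windows.net/orders_external"

spark.sql(f"""
    CREATE OR REPLACE TABLE {target_table}
    DEEP CLONE {source_table} VERSION AS OF {source_version}
    LOCATION '{external_path}'
""")
```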
- 7982 Views
- 8 replies
- 1 kudos
Resolved! My Databricks exam got suspended just for coming closer to the laptop screen to read the question and options
Hi team, my Databricks Certified Data Engineer Associate exam got suspended within 10 minutes. I had also shown my exam room to the proctor. My exam got suspended due to eye movement, but I was not moving my eyes away from the laptop screen. It's hard to focus...
- 1 kudos
Hi @Cert-Team, my Databricks Certified Data Engineer Associate exam got suspended when only the last 2 questions of my exam were remaining. I had also shown my exam room to the proctor. My exam got suspended due to eye movement, but I was not moving my eyes away from l...
- 2543 Views
- 1 reply
- 0 kudos
Delta Live Tables and Git
Notebooks that run in a Delta Live Tables pipeline are Git enabled, but what about the Delta Live Tables pipeline itself? I'm looking for a good way to deploy pipelines from DEV to TEST and from TEST to PROD that deploys not just the notebooks but also the pipeline. What pos...
- 0 kudos
Hello @Henrik, Databricks Asset Bundles would help you do this: https://docs.databricks.com/en/dev-tools/bundles/pipelines-tutorial.html Also, this is a wonderful post addressing your query: https://www.databricks.com/blog/applying-software-develop...
- 124 Views
- 2 replies
- 0 kudos
Enable Delta Sharing OSS using PySpark
I have started the Delta Sharing server using the guide at https://github.com/delta-io/delta-sharing, but I am not able to create a profile. How can I correctly create a profile and share a table through Delta Sharing with the data stored in MinIO?
- 0 kudos
@gupta_tanmay wrote:I have started the delta sharing server.Using guide on https://github.com/delta-io/delta-sharing.But I am not able to create a profile.How can I correctly create profile and share the table through delta sharing with the data stor...
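For the open-source server, the profile is just a small JSON file that points a client at the server endpoint and bearer token; the storage backend (MinIO, S3, etc.) is configured on the server side, not in the profile. A minimal sketch of writing a profile and reading a shared table with the delta-sharing Python connector, assuming a hypothetical local server URL, token, and share/schema/table names:

```python
import json
import delta_sharing  # pip install delta-sharing

# Hypothetical endpoint, token, and share/schema/table names; adjust to your server config.
profile = {
    "shareCredentialsVersion": 1,
    "endpoint": "http://localhost:8080/delta-sharing",
    "bearerToken": "my-secret-token",
}

profile_path = "/tmp/minio.share"
with open(profile_path, "w") as f:
    json.dump(profile, f)

# Table URL format: <profile>#<share>.<schema>.<table>
table_url = f"{profile_path}#my_share.my_schema.my_table"
df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```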
- 146 Views
- 0 replies
- 0 kudos
Databricks cleanroom functionality and billing
I'm new to Databricks and have been tasked with exploring Databricks Clean Rooms. I'm a bit confused about how billing works for Clean Rooms and their overall functionality. Specifically, I'm curious about the following: Environment Hosting: Are Clean...
- 816 Views
- 1 reply
- 0 kudos
Use Databricks migration tool to export queries
Dear all, I tried to use the Databricks migration tool (https://github.com/databrickslabs/migrate) to migrate objects from one Databricks instance to another. I realized that notebooks, clusters, and jobs can be migrated, but queries cannot be migrated by this to...
- 0 kudos
@yjiao If you're planning to migrate from your current technology to Databricks, Travinto Technologies' Code Converter Tool is here to make the process seamless. This powerful tool enables you to migrate data, ETL workflows, and reports across platf...
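Since queries are not covered by the migrate tool, one common fallback is to pull their definitions over the workspace REST API and re-create them on the target workspace. A minimal sketch that just dumps the raw JSON, assuming (this is an assumption) the older preview SQL Queries endpoint is still available in the source workspace and that credentials are provided via environment variables:

```python
import json
import os
import requests

# Assumed environment variables for the *source* workspace.
host = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
token = os.environ["DATABRICKS_TOKEN"]

# Assumption: the legacy preview endpoint for Databricks SQL queries is available.
response = requests.get(
    f"{host}/api/2.0/preview/sql/queries",
    headers={"Authorization": f"Bearer {token}"},
    params={"page_size": 100, "page": 1},
    timeout=60,
)
response.raise_for_status()

# Persist the raw payload so the query definitions can be re-created on the target workspace.
with open("exported_queries.json", "w") as f:
    json.dump(response.json(), f, indent=2)
```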
- 1087 Views
- 1 reply
- 0 kudos
URGENT - Databricks workspace deletion & recovery
Hi Team, I accidentally deleted our Databricks workspace, which had all our artifacts and the control plane and was the primary resource for our team's working environment. Could anyone please help on priority regarding the recovery/restoration mechanis...
- 0 kudos
Hello data warriors. Note: users cannot recover a deleted Databricks instance directly from the Azure Portal. A deleted Databricks instance can only be recovered by opening a support ticket, where our core engineering team will help you to recover the Da...
- 113 Views
- 2 replies
- 0 kudos
Disable mlflow autologging in a helper notebook
We have a helper function that uses a sklearn estimator. We don't want it to be logged to MLflow. I can do: def myfunc(): import mlflow; with mlflow.autolog.ignore: # train model # use model; return predictions. But I get info prints: ...
- 0 kudos
mlflow.autolog(disable=True, silent=True) fixes the printing, but my other problem, restoring autologging to its previous state afterwards, is still unsolved. I can't find any information about that problem in the docs.
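One way to scope this is a small context manager that turns autologging off around the helper and turns it back on afterwards. A minimal sketch, assuming autologging was enabled beforehand (MLflow does not appear to expose an obvious public API for reading back the previous autolog configuration, so the re-enable step is an assumption):

```python
from contextlib import contextmanager

import mlflow


@contextmanager
def autolog_disabled(reenable: bool = True):
    """Temporarily disable MLflow autologging inside a `with` block."""
    mlflow.autolog(disable=True, silent=True)
    try:
        yield
    finally:
        if reenable:
            # Assumes autologging should simply be on afterwards; there is no obvious
            # public API to read back the exact previous configuration.
            mlflow.autolog(disable=False, silent=True)


def myfunc(model, X):
    with autolog_disabled():
        # Train/use the sklearn estimator without creating MLflow autolog runs.
        return model.predict(X)
```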