- 1352 Views
- 1 replies
- 0 kudos
How to save a catalog table as a spark or pandas dataframe?
Hello, I have a table in my catalog, and I want to load it as a pandas or Spark DataFrame. I was using this code to do that before, but I don't know what happened recently that made the code stop working. from pyspark.sql import SparkSession spark = S...
Hi @Pbr, To work around this, you can create a temporary view using SQL in a separate cell (e.g., a %%sql cell) and then reference that view from your Python or Scala code. Here’s how you can achieve this: First, create a temporary view for your tab...
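The workaround described above can be sketched as follows. This is a minimal illustration, not the original poster's code: the table name `main.default.my_table` and the view name are placeholders, and the Spark calls are left as comments because they require a live Databricks (or local Spark) session.

```python
# Hedged sketch of the workaround: register a temporary view over the
# catalog table, then read it back as a Spark (and optionally pandas)
# DataFrame. "main.default.my_table" and "my_table_view" are placeholders.

def temp_view_sql(view: str, table: str) -> str:
    # The statement you would run in a %sql cell or via spark.sql(...).
    return f"CREATE OR REPLACE TEMP VIEW {view} AS SELECT * FROM {table}"

stmt = temp_view_sql("my_table_view", "main.default.my_table")

# On a Databricks cluster you would then run:
#   spark.sql(stmt)
#   sdf = spark.table("my_table_view")  # Spark DataFrame
#   pdf = sdf.toPandas()                # pandas DataFrame
```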
- 1639 Views
- 2 replies
- 0 kudos
Associating a Git Credential with a Service Principal using Terraform Provider (AWS)
I am attempting to create a Databricks Repo in a workspace via Terraform. I would like the Repo and the associated Git Credential to be associated with a Service Principal. In my initial run, the Terraform provider is associated with the user defined ...
Hi @Kinger, The Databricks Terraform provider does not support creating a git credential associated with a Service Principal (SP) and associating the SP with the Repo creation. However, you can create a Service Principal and associate it with a git ...
- 2479 Views
- 3 replies
- 1 kudos
Resolved! Databricks Private Preview Features
Hi All, I just wanted to try the new Databricks Workflow private preview features (For Each Task). Can someone please guide me on how we can enable it in our workspace, as I have the same use case in my current project where this feature can help ...
If you are using Azure Databricks, just raise a support request regarding the Private Preview and they will enable it for you!
- 8256 Views
- 6 replies
- 0 kudos
My notebooks running in parallel no longer include jobids
Hey Databricks - what happened to the jobids that used to be returned from parallel runs? We used them to identify which link matched the output. See attached. How are we supposed to match up the links?
Hey @Databricks, @Kaniz_Fatma - waccha doing? Yesterday's "newer version of the app" that got rolled out seems to have broken the parallel runs. The ephemeral notebook is missing. The job ids are missing. What's up? Benedetta
- 1978 Views
- 4 replies
- 0 kudos
How to get databricks performance metrics programmatically?
How can I retrieve all Databricks performance metrics on an hourly basis? Is there a recommended method or API available for retrieving performance metrics?
Hi @Nandhini_Kumar, there are many performance metrics available - it depends on what you're looking to do with this data, and how you plan to take action in real time. I would strongly recommend mapping out a user journey so you get only the metrics y...
- 1263 Views
- 2 replies
- 1 kudos
Looking for a promo code for Databricks certification
I went through the suggested webinar and the courses mentioned, but I have not received any voucher code for the certification. Can anyone please help? Thank you so much.
Thank you. The link to the ticketing portal in the response is broken. I have added a ticket with the help center. Kindly let me know if anything else is needed. Appreciate the help.
- 1217 Views
- 1 replies
- 0 kudos
LLM Chatbot With Retrieval Augmented Generation (RAG)
When executing the second block under 01-Data-Preparation-and-Index, I get the following error. Please help. AnalysisException: [RequestId=c9625879-339d-45c6-abb5-f70d724ddb47 ErrorClass=INVALID_STATE] Metastore storage root URL does not exist. Please p...
I fixed this issue by providing a MANAGED LOCATION for the catalog, which meant updating the _resources/00-init file as follows: spark.sql(f"CREATE CATALOG IF NOT EXISTS {catalog} MANAGED LOCATION '<location path>'")
- 2746 Views
- 4 replies
- 0 kudos
Resolved! My Global Init Script doesn't run with the new(er) LTS version 13.3. It runs great on 12.2LTS
Hey Databricks, Seems like you changed the way Global Init Scripts work. How come you changed it? My Global Init Script runs great on 12.2LTS but not on the new(er) LTS version 13.3. We don't have Unity Catalog turned on. What's up with that? Ar...
Thank you ChloeBors - I tried upgrading the Ubuntu version per your suggestion but got a new error: "Can't open lib 'ODBC Driver 17 for SQL Server': file not found (0) (SQLDriverConnect)". I tried modifying this line to 18: ACCEPT_EULA=Y apt-get install mso...
- 717 Views
- 1 replies
- 0 kudos
Why are data companies like Databricks investing or looping in k-12 education institutions?
Hello there, I work in the K-12 education sector, where the use of data is becoming increasingly important. With some schools having to manage over 13 grades and hundreds, if not thousands, of students, the ability to use data effectively is crucial. ...
Hi @Fawcee, In the K-12 education sector, harnessing data-driven insights is indeed crucial for improving student outcomes and enhancing educational processes. While many schools still rely on basic tools like Excel, Google Sheets, and Microsoft BI, ...
- 833 Views
- 2 replies
- 0 kudos
A problem has occurred
I am using the Community version of Databricks, and no matter how much I clear and delete my items and notebooks, the message below appears when I want to insert a new cell in the notebook. Where can I see my usage and quota? A problem has occurred. Serve...
Sorry, I do not see any Trash button on my side. Can you share a screenshot?
- 770 Views
- 2 replies
- 1 kudos
Model logging issues - Azure Databricks
I am running a job with a service principal (principal e1077c1b-1dcf-4975-8164-d14f748d4774). The job has a task that logs (mlflow.pyfunc.log_model) a newly trained model to Unity Catalog, but I get a PERMISSION DENIED error. I do not get why prin...
Hi @Ivan_, Check Artifact Permissions: Verify that the service principal has write permissions to the artifact store location where you're trying to log the model. Experiment Permissions: With the extension of MLflow experiment permissions to artifac...
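As a rough illustration of the registry setup this answer touches on, the sketch below shows the Unity Catalog naming convention involved. All catalog, schema, and model names are placeholders (not from the thread), and the MLflow calls are left as comments because they need a live workspace where the service principal holds the relevant privileges.

```python
# Hedged sketch: models registered in Unity Catalog use three-level names
# (catalog.schema.model), and MLflow must be pointed at the UC registry.
# All names here are placeholders; the service principal needs USE CATALOG,
# USE SCHEMA, and CREATE MODEL privileges on the target catalog and schema.

def uc_model_name(catalog: str, schema: str, model: str) -> str:
    # Unity Catalog registered-model names are always three-level.
    return f"{catalog}.{schema}.{model}"

name = uc_model_name("main", "default", "my_model")

# On a Databricks cluster you would then run:
#   import mlflow
#   mlflow.set_registry_uri("databricks-uc")
#   with mlflow.start_run():
#       mlflow.pyfunc.log_model(
#           artifact_path="model",
#           python_model=my_pyfunc_model,   # hypothetical model object
#           registered_model_name=name,
#       )
```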
- 1396 Views
- 3 replies
- 3 kudos
Resolved! Can we get the all notebooks attached to an all purpose compute at a particular instant ?
For an all purpose compute, we can see the notebooks attached to a cluster in the UI. Is there a way to get the same details using some api or other method in program.
Hi @RahulChaubey, thank you for reaching out to the Databricks community for assistance. I'm here to help you with your question. First, obtain the details of the cluster to which the notebooks are attached. You can use the Databricks REST...
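A minimal sketch of the first step the answer suggests: building a request to the Databricks Clusters API (GET /api/2.1/clusters/get) to fetch a cluster's details. The host, token, and cluster ID are placeholders, and the request is constructed but not sent, since sending it needs a real workspace. Note this endpoint returns cluster metadata; whether it exposes the attached notebooks is not guaranteed by this sketch.

```python
# Hedged sketch: construct an authenticated GET request to the Databricks
# Clusters API. Host, token, and cluster_id below are placeholder values.
import urllib.request


def clusters_get_request(host: str, token: str, cluster_id: str) -> urllib.request.Request:
    # GET /api/2.1/clusters/get?cluster_id=... with a bearer token header.
    url = f"{host}/api/2.1/clusters/get?cluster_id={cluster_id}"
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})


req = clusters_get_request(
    "https://example.cloud.databricks.com",
    "<personal-access-token>",
    "1234-567890-abcde123",
)
# resp = urllib.request.urlopen(req)  # uncomment against a real workspace
```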
- 647 Views
- 1 replies
- 0 kudos
Urgent: Issue with Databricks Certified Machine Learning Engineer Associate Exam
Hello Team, I encountered an issue during my Databricks Certified Machine Learning Engineer Associate exam, which took place on 30-03-2024 at 11:00 AM IST. Despite being present at my seat and facing the camera, the proctor unexpectedly suspended...
@himashu-singh thank you for filing a ticket with the support team; it looks like they have already responded to you.
- 979 Views
- 1 replies
- 0 kudos
How to enable custom metrics for databricks EC2 instance?
Case 1: I have an AWS Databricks instance and followed some steps to enable CloudWatch for the Databricks EC2 instance. However, memory metrics are not available in CloudWatch. I then followed the steps below to enable custom metrics for the EC2...
Hi @Nandhini_Kumar, It’s unfortunate that the Databricks-related EC2 instance is not showing up in the list when you try to install the CloudWatch Agent manually.To resolve this, let’s ensure that the instance meets the following criteria: The insta...
- 1587 Views
- 2 replies
- 0 kudos
How to install and use tkinter in databricks notebooks?
I have access to the Databricks cloud UI. I use the notebooks for my experiments (not MLflow). I install libraries from the notebooks with this command: !pip install LIBRARY_NAME. I am unable to install tkinter and use it in a notebook. Is there a wa...
Hi @Kaniz_Fatma Thanks for your response. I believe "Create Library" is deprecated. Although there is a way to install libraries in the cluster configuration, which I found and can try. After using that, still ...