- 4696 Views
- 1 reply
- 0 kudos
Databricks, Streamlit, and FastAPI combination
Hello friends! I have a project where I need Databricks to train and evaluate a model and then put it into production. I trained and evaluated the model in Databricks using MLflow and everything is good. Now I have another two steps that I have zero clue how they should be done: usag...
- 0 kudos
This repo has examples that you can use in your Databricks workspace for FastAPI and Streamlit. I recommend only using these for development or lightweight use cases.
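As a rough, minimal sketch of the lightweight pattern those examples cover (the model URI, registry stage, and input fields below are hypothetical placeholders, not taken from the repo):

```python
# Minimal development sketch: load a registered MLflow model and expose a
# single prediction endpoint with FastAPI. Model URI and feature names are
# hypothetical placeholders.
import mlflow.pyfunc
import pandas as pd
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = mlflow.pyfunc.load_model("models:/my_model/Production")  # placeholder URI

class Features(BaseModel):
    feature_a: float
    feature_b: float

@app.post("/predict")
def predict(features: Features) -> dict:
    row = pd.DataFrame([features.dict()])
    prediction = model.predict(row)
    return {"prediction": prediction.tolist()}
```

Run it with uvicorn during development; as noted above, treat this as a development or lightweight pattern only.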
- 2681 Views
- 0 replies
- 0 kudos
Cannot set permissions on a table
In a Databricks database table I was able to set permissions for groups, but now I get this error when using a cluster: Error getting permissions summary: SparkException: Trying to perform permission action on Hive Metastore /CATALOG/`hive_metastore`/DATAB...
- 1743 Views
- 2 replies
- 1 kudos
Databricks Repos
Hi everyone! I've set up an Azure cloud environment for the analytics team that I am part of, and everything is working wonderfully except Databricks Repos. Whenever we open Databricks, we find ourselves in the branch that the most recent person work...
- 1 kudos
Use a separate Databricks Git folder mapped to the remote Git repo for each user who works in their own development branch. See Run Git operations on Databricks Repos | Databricks on AWS.
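If you want to automate that one-Git-folder-per-user setup, here is a sketch using the Databricks SDK; the repo URL, provider, user list, and paths are placeholders, and it assumes the SDK is installed and authentication is already configured:

```python
# Sketch: create one Git folder per user, each mapped to the same remote repo,
# so each person can work in their own branch. All names below are placeholders.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()
repo_url = "https://github.com/my-org/analytics.git"  # placeholder remote repo

for user in ["alice@example.com", "bob@example.com"]:  # placeholder users
    w.repos.create(
        url=repo_url,
        provider="gitHub",
        path=f"/Repos/{user}/analytics",
    )
```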
- 1638 Views
- 4 replies
- 1 kudos
Not loading CSV files with ".c000.csv" in the name
Yesterday I created a ton of CSV files via joined_df.write.partitionBy("PartitionColumn").mode("overwrite").csv(output_path, header=True). Today, when working with them, I realized that they were not loaded. Upon investigation I saw...
- 1 kudos
Then removing the "_committed_" file stops Spark from reading in the other files.
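If a reader is tripping over those metadata files, one possible workaround is to inspect the directory and restrict the read to the data files. This is only a sketch: it assumes you read the output back with Spark, output_path is the same placeholder path used in the write above, and whether it helps depends on why the files were skipped in your case.

```python
# Sketch: see what the write actually produced, then read only the CSV part
# files and skip the commit-protocol metadata (_SUCCESS, _started_*, _committed_*).
for f in dbutils.fs.ls(output_path):
    print(f.name)  # expect metadata files plus PartitionColumn=... directories

df = (
    spark.read
    .option("header", True)
    .option("pathGlobFilter", "*.csv")  # keep only the part-*.c000.csv data files
    .csv(output_path)
)
df.show(5)
```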
- 1784 Views
- 1 reply
- 1 kudos
Resolved! Is it possible to get Azure Databricks cluster metrics using the REST API through PySpark code?
I am trying to get Azure Databricks cluster metrics such as memory utilization, CPU utilization, memory swap utilization, and free file system space using the REST API by writing PySpark code. It always shows CPU utilization & memory usage as N/A whereas data...
- 1 kudos
Hi @databricksdev, you can use system tables for Azure Databricks cluster metrics. Please refer to the documentation: Compute system tables reference | Databricks on AWS.
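For example, here is a sketch of querying the node timeline system table from a notebook. It assumes system tables are enabled for the workspace; the column names follow the documented system.compute.node_timeline schema, so verify them against your workspace.

```python
# Sketch: recent per-node CPU and memory utilization from the compute system
# tables. Column names are from the documented schema and may need adjusting.
metrics = spark.sql("""
    SELECT cluster_id,
           instance_id,
           start_time,
           cpu_user_percent + cpu_system_percent AS cpu_used_percent,
           mem_used_percent
    FROM system.compute.node_timeline
    WHERE start_time >= current_timestamp() - INTERVAL 1 HOUR
    ORDER BY start_time DESC
""")
display(metrics)
```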
- 15004 Views
- 5 replies
- 0 kudos
BigQuery in notebook failing with Unity Catalog-enabled cluster
BigQuery (reading data from Google Cloud) is failing with a Unity Catalog-enabled cluster. The same works fine without a Unity Catalog cluster. Any help is appreciated! Thanks, Sai
- 0 kudos
Hi @385653, it works from single-user clusters using a DBFS path. On shared clusters, please set the Spark conf at the notebook level, where you would convert the JSON content into a base64 string. This is a workaround, as shared clusters do not support d...
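A sketch of that workaround: the secret scope/key, project, and table names are placeholders, and it assumes the service account JSON is stored in a Databricks secret and that the BigQuery connector accepts the base64-encoded key via its credentials option.

```python
# Sketch: base64-encode the GCP service account JSON at the notebook level and
# hand it to the BigQuery connector. Scope, key, project, and table are placeholders.
import base64

sa_json = dbutils.secrets.get(scope="gcp", key="bq-service-account")
sa_b64 = base64.b64encode(sa_json.encode("utf-8")).decode("utf-8")

spark.conf.set("credentials", sa_b64)  # notebook-level conf, per the workaround above

df = (
    spark.read.format("bigquery")
    .option("credentials", sa_b64)              # can also be passed per read
    .option("parentProject", "my-gcp-project")  # placeholder
    .option("table", "my_dataset.my_table")     # placeholder
    .load()
)
display(df)
```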
- 2755 Views
- 0 replies
- 0 kudos
TASK_WRITE_FAILED when trying to write on the table, Databricks (Scala)
Hello, I have code on Databricks (Scala) that constructs a DataFrame and then writes it to a database table. It is working fine for almost all of the tables, but there is one table with a problem. It says No module named 'delta.connect' - TASK_WRITE_FAILED. In...
- 2430 Views
- 3 replies
- 2 kudos
Resolved! Reading workflow items
Hello, in Databricks I have created workflows. In the command prompt I can get a list of the workflows, which look like the ones in the dev environment. How can I get the list of workflows in the test Databricks environment? This is the command I use: databricks jobs lis...
- 2 kudos
You need to configure which host the Databricks CLI connects to: https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/#set-up-authentication-using-a-databricks-personal-access-token
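Equivalently, once the test workspace's host and token are configured as a named profile, you can point at it explicitly. Here is a sketch using the Databricks SDK rather than the raw CLI; the profile name is a placeholder defined in ~/.databrickscfg.

```python
# Sketch: list the jobs/workflows of the test workspace by selecting the
# profile that holds that workspace's host and token. "test" is a placeholder.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient(profile="test")
for job in w.jobs.list():
    print(job.job_id, job.settings.name)
```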
- 1701 Views
- 2 replies
- 0 kudos
Unable to create cluster in community edition
Hello, since yesterday it has been impossible to start a cluster in the Community version of Databricks. I have tried deleting it and creating a new one... Also, from what I see, it is an error that is happening to many people. Bootstrap Timeout: Node daemon pi...
- 0 kudos
I have the same issue in the Community Edition; has there been any response?
- 1066 Views
- 1 reply
- 0 kudos
Access AWS Resource In Another Account without STS
The EC2 instance profile I set up in the master AWS account can assume an S3/Dynamo access role in another AWS account. How do I set things up in Databricks/AWS so that I can use Python Boto3 to access S3 and Dynamo without using STS to assume the role?
- 0 kudos
Hey Kaniz, I am sorry about the confusion. I should have made my question clearer. I mean accessing without using an IAM assume role or access key, as if I were accessing a resource within the same AWS account.
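For S3 specifically, one pattern that avoids sts:AssumeRole is a bucket policy in the other account that grants the instance profile's role access directly; boto3 on the cluster then just uses the instance profile credentials. Here is a sketch under that assumption only (bucket and key are placeholders; DynamoDB generally still needs an assumed role or a resource-based policy that permits the cross-account access).

```python
# Sketch: relies on the cluster's instance profile credentials and assumes the
# bucket in the other account has a bucket policy allowing this role.
# Bucket and key names are placeholders.
import boto3

s3 = boto3.client("s3")  # picks up the instance profile credentials automatically
response = s3.get_object(Bucket="other-account-bucket", Key="path/to/object.json")
print(len(response["Body"].read()))
```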
- 844 Views
- 0 replies
- 0 kudos
Query Upgrades from HMS to UC
I am currently upgrading queries from HMS to Unity Catalog. I would like to know and understand a few best practices for updating the queries and also using a 3-level namespace for the existing query structure. Please guide me!
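As a small illustration of the 3-level namespace change (the catalog, schema, and table names below are placeholders):

```python
# Sketch: the typical query rewrite when moving from HMS (2-level) to
# Unity Catalog (3-level). All names are placeholders.
hms_df = spark.sql("SELECT * FROM sales_db.orders")      # before: HMS, 2-level

uc_df = spark.sql("SELECT * FROM main.sales_db.orders")  # after: UC, 3-level

spark.sql("USE CATALOG main")                             # or set the catalog once,
same_df = spark.sql("SELECT * FROM sales_db.orders")      # then 2-level names resolve against it
```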
- 1415 Views
- 0 replies
- 0 kudos
Editor bug when escaping strings
When working in a notebook using %sql, when you escape a quote the editor colors get messed up. (Screenshots of the current and expected highlighting were attached.) I won't open a ticket or send an email to support.
- 1844 Views
- 5 replies
- 3 kudos
Compute cluster not working
I have been facing this issue for 4 hours on the Databricks Community version. Bootstrap Timeout: Node daemon ping timeout in 780000 ms for instance i-0a711924387d1bc88 @ 10.172.241.200. Please check network connectivity between the data plane and the c...
- 589 Views
- 0 replies
- 0 kudos
Security Analysis Tool - PAT token extension or changes
Hi team, I have set up the SAT for my workspace. Do I need to change the PAT token every time it expires, or is there another workaround? How do I change the PAT token for an already established SAT?
- 1314 Views
- 0 replies
- 0 kudos
Adding a new field to a Delta Live Table
Hi experts, I have all the Delta merge files (Parquet format) on the Bronze layer. I am converting these files into Delta Live Tables in the Silver layer. While doing so, I am unable to add a current timestamp column. The following is the script: from pyspark.sql.functio...
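Here is a minimal sketch of adding the timestamp inside the DLT table function; the Bronze path and table name are placeholders, and it assumes the Bronze files are readable as Delta.

```python
# Sketch: a DLT Silver table that adds an ingestion timestamp column.
# Source path and table name are placeholders.
import dlt
from pyspark.sql.functions import current_timestamp

@dlt.table(name="silver_orders")
def silver_orders():
    return (
        spark.read.format("delta")
        .load("/mnt/bronze/orders")              # placeholder Bronze path
        .withColumn("load_ts", current_timestamp())
    )
```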