Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

Faiçal_1979
by New Contributor
  • 4696 Views
  • 1 replies
  • 0 kudos

Databricks, Streamlit, and FastAPI combination

Hello friends! I have a project where I need Databricks to train and evaluate a model and then put it into production. I trained and evaluated the model in Databricks using MLflow, and everything is good. Now I have two more steps that I have zero clue how they should be done: usag...

Latest Reply
RafiKurlansik
Databricks Employee
  • 0 kudos

This repo has examples that you can use in your Databricks workspace for FastAPI and Streamlit.  I recommend only using these for development or lightweight use cases. 
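To make the shape of that concrete, a development-only FastAPI endpoint wrapping an MLflow model might look roughly like the sketch below; the model URI and input field names are hypothetical placeholders, not part of the linked repo.

```python
import mlflow
import pandas as pd
from fastapi import FastAPI
from pydantic import BaseModel

# A minimal, development-only sketch: load a registered MLflow model and
# expose a single scoring endpoint. The model URI and the feature names
# are hypothetical placeholders.
app = FastAPI()
model = mlflow.pyfunc.load_model("models:/my_model/Production")

class Features(BaseModel):
    feature_a: float
    feature_b: float

@app.post("/predict")
def predict(features: Features):
    df = pd.DataFrame([{"feature_a": features.feature_a,
                        "feature_b": features.feature_b}])
    return {"prediction": model.predict(df).tolist()}
```

As the reply notes, this pattern is best kept to development or lightweight use cases; for production serving, a managed serving option is the safer path.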

arkiboys
by Contributor
  • 2681 Views
  • 0 replies
  • 0 kudos

Cannot set permissions on a table

In a Databricks database table I was able to set permissions for groups, but now I get this error when using a cluster: Error getting permissions. summary: SparkException: Trying to perform permission action on Hive Metastore /CATALOG/`hive_metastore`/DATAB...
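For reference, a group-level grant on a Hive metastore table usually looks like the sketch below; it assumes legacy table access control is enabled on the cluster, and the database, table, and group names are hypothetical placeholders.

```python
# A minimal sketch, assuming table access control is enabled; the database,
# table, and group names are hypothetical.
spark.sql("GRANT SELECT ON TABLE hive_metastore.sales_db.orders TO `data-analysts`")
spark.sql("SHOW GRANTS ON TABLE hive_metastore.sales_db.orders").show(truncate=False)
```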

johann_blake
by New Contributor
  • 1743 Views
  • 2 replies
  • 1 kudos

Databricks Repos

Hi everyone! I've set up an Azure cloud environment for the analytical team that I am part of, and everything is working wonderfully except Databricks Repos. Whenever we open Databricks, we find ourselves in the branch that the most recent person work...

Latest Reply
feiyun0112
Honored Contributor
  • 1 kudos

Use a separate Databricks Git folder mapped to a remote Git repo for each user who works in their own development branch. See Run Git operations on Databricks Repos | Databricks on AWS.
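As a rough sketch of that setup with the Databricks SDK for Python (the repo URL, provider value, and user names are hypothetical placeholders, and the environment is assumed to be already authenticated):

```python
from databricks.sdk import WorkspaceClient

# A minimal sketch: give each user their own Git folder mapped to the same
# remote repo, so everyone works on their own development branch.
# The repo URL, provider, and user names are hypothetical.
w = WorkspaceClient()

for user in ["alice@example.com", "bob@example.com"]:
    w.repos.create(
        url="https://github.com/example-org/analytics.git",
        provider="gitHub",
        path=f"/Repos/{user}/analytics",
    )
```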

1 More Replies
jenshumrich
by Contributor
  • 1638 Views
  • 4 replies
  • 1 kudos

Not loading csv files with ".c000.csv" in the name

Yesterday I created a ton of CSV files via joined_df.write.partitionBy("PartitionColumn").mode("overwrite").csv(output_path, header=True). Today, when working with them, I realized that they were not loaded. Upon investigation I saw...

Latest Reply
jenshumrich
Contributor
  • 1 kudos

Then removing the "_commited_" file stops Spark from reading in the other files.
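For what it's worth, a plain round trip of the partitioned CSV write and read looks roughly like the sketch below (the data, path, and column names are placeholders); Spark's file listing normally skips names that start with "_" or ".", such as the _committed_* markers and _SUCCESS.

```python
# A minimal sketch with hypothetical data, path, and column names: write
# partitioned CSVs, then read them back. The pathGlobFilter option limits
# the read to *.csv data files.
output_path = "/mnt/output/joined"

joined_df = spark.createDataFrame(
    [("A", 1), ("B", 2)], ["PartitionColumn", "value"]
)
joined_df.write.partitionBy("PartitionColumn").mode("overwrite").csv(
    output_path, header=True
)

reloaded = (
    spark.read.option("header", True)
         .option("pathGlobFilter", "*.csv")
         .csv(output_path)
)
print(reloaded.count())
```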

3 More Replies
databricksdev
by New Contributor II
  • 1784 Views
  • 1 replies
  • 1 kudos

Resolved! Is it possible to get Azure Databricks cluster metrics using the REST API through PySpark code?

I am trying to get Azure Databricks cluster metrics such as memory utilization, CPU utilization, memory swap utilization, and free file system using the REST API by writing PySpark code. It always shows CPU utilization and memory usage as N/A, whereas data...

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 1 kudos

Hi @databricksdev, you can use system tables for Azure Databricks cluster metrics. Please refer to Compute system tables reference | Databricks on AWS.
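As a rough illustration, a query against the compute system tables might look like the sketch below; the cluster ID is a hypothetical placeholder, and the table and column names should be verified against the system tables reference.

```python
# A minimal sketch, assuming the system.compute schema is enabled in the
# workspace; the cluster ID below is a hypothetical placeholder.
metrics = (
    spark.table("system.compute.node_timeline")
         .where("cluster_id = '0123-456789-abcdefgh'")
)
display(metrics.limit(100))
```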

385653
by New Contributor II
  • 15004 Views
  • 5 replies
  • 0 kudos

BigQuery in notebook failing with Unity Catalog-enabled cluster

BigQuery (reading data from Google Cloud) is failing with a Unity Catalog-enabled cluster. The same works fine without a Unity Catalog cluster. Any help is appreciated! Thanks, Sai

Latest Reply
Srihasa_Akepati
Databricks Employee
  • 0 kudos

Hi @385653, it works from single-user clusters using a DBFS path. On shared clusters, please set the Spark conf at the notebook level, where you would convert the JSON content into a Base64 string. This is a workaround, as shared clusters do not support d...
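For reference, the notebook-level workaround described above might look roughly like this sketch; the secret scope/key, GCP project, and table names are hypothetical placeholders, and the option names follow the BigQuery Spark connector.

```python
import base64

# A minimal sketch of the workaround described above: read the service-account
# JSON (here from a hypothetical secret), encode it as Base64, and pass it to
# the BigQuery connector at the notebook level.
sa_json = dbutils.secrets.get(scope="gcp", key="bigquery-service-account")
creds_b64 = base64.b64encode(sa_json.encode("utf-8")).decode("utf-8")

df = (
    spark.read.format("bigquery")
         .option("credentials", creds_b64)
         .option("parentProject", "my-gcp-project")
         .option("table", "my_dataset.my_table")
         .load()
)
display(df)
```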

4 More Replies
chemajar
by New Contributor III
  • 2755 Views
  • 0 replies
  • 0 kudos

TASK_WRITE_FAILED when trying to write on the table, Databricks (Scala)

Hello, I have code on Databricks (Scala) that constructs a DataFrame and then writes it to a database table. It works fine for almost all of the tables, but there is one table with a problem. It says No module named 'delta.connect' - TASK_WRITE_FAILED. In...

[Attached screenshots: chemajar_2-1710422710695.png, chemajar_5-1710422843604.png]
arkiboys
by Contributor
  • 2430 Views
  • 3 replies
  • 2 kudos

Resolved! reading workflow items

Hello, in Databricks I have created workflows. At a command prompt I can get a list of the workflows, which looks like the ones in the dev environment. How can I get the list of workflows in the test Databricks environment? This is the command I use: databricks jobs lis...

Latest Reply
feiyun0112
Honored Contributor
  • 2 kudos

You need to configure which host the Databricks CLI connects to: https://learn.microsoft.com/en-us/azure/databricks/archive/dev-tools/cli/#set-up-authentication-using-a-databricks-personal-access-token
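Equivalently, the Databricks SDK for Python can point at the test workspace explicitly and list its jobs; a rough sketch, where the host URL and token are hypothetical placeholders:

```python
from databricks.sdk import WorkspaceClient

# A minimal sketch: connect to the test workspace explicitly instead of the
# default profile. The host URL and token are placeholders.
w = WorkspaceClient(
    host="https://adb-1234567890123456.7.azuredatabricks.net",
    token="<personal-access-token>",
)

for job in w.jobs.list():
    print(job.job_id, job.settings.name)
```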

2 More Replies
M1NE
by New Contributor
  • 1701 Views
  • 2 replies
  • 0 kudos

Unable to create cluster in community edition

Hello, since yesterday it has been impossible to start a cluster in the Community Edition of Databricks. I have tried deleting it and creating a new one... Also, from what I see, it is an error that is happening to many people. Bootstrap Timeout: Node daemon pi...

Latest Reply
zach
New Contributor III
  • 0 kudos

I have the same issue in the Community Edition. Has there been any response?

1 More Replies
Blasti
by New Contributor II
  • 1066 Views
  • 1 replies
  • 0 kudos

Access AWS Resource In Another Account without STS

The EC2 instance profile I set up in the master AWS account can assume an S3/DynamoDB access role in another AWS account. How do I set things up in Databricks/AWS so that I can use Python Boto3 to access S3 and DynamoDB without using STS to assume the role?

Latest Reply
Blasti
New Contributor II
  • 0 kudos

Hey Kaniz, I am sorry about the confusion; I should have made my question clearer. I mean accessing the resources without using an IAM assume role or an access key, as if I were accessing resources within the same AWS account.
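One common way to get that behavior is to have the other account's bucket policy grant access directly to the instance-profile role, so Boto3 calls work without any AssumeRole step; a rough sketch with a hypothetical bucket name (DynamoDB may still require a role assumption or a resource-based policy where supported):

```python
import boto3

# A minimal sketch, assuming the bucket policy in the other AWS account grants
# access to this cluster's instance-profile role, so no sts:AssumeRole call is
# needed. The bucket name is hypothetical.
s3 = boto3.client("s3")  # picks up the instance profile credentials
response = s3.list_objects_v2(Bucket="other-account-bucket", MaxKeys=10)
for obj in response.get("Contents", []):
    print(obj["Key"])
```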

DjtheDE
by New Contributor
  • 844 Views
  • 0 replies
  • 0 kudos

Upgrading Queries from HMS to UC

I am currently upgrading queries from HMS to Unity Catalog. I would like to know and understand a few best practices for updating the queries and for using a 3-level namespace with the existing query structure. Please guide me!
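As a rough illustration of the 3-level namespace change, a query that referenced Hive metastore objects with two levels gains an explicit catalog; the catalog, schema, and table names below are hypothetical placeholders.

```python
# A minimal sketch of the 3-level namespace change; the catalog, schema, and
# table names are hypothetical.

# Before (Hive metastore, 2-level reference):
legacy_df = spark.sql("SELECT * FROM sales_db.orders")

# After (Unity Catalog, 3-level reference):
uc_df = spark.sql("SELECT * FROM main.sales_db.orders")

# Or set a default catalog once and keep existing 2-level references:
spark.sql("USE CATALOG main")
same_df = spark.sql("SELECT * FROM sales_db.orders")
```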

Nirdosh
by New Contributor II
  • 1844 Views
  • 5 replies
  • 3 kudos

Compute cluster not working

I have been facing this issue for 4 hours on the Databricks Community Edition. Bootstrap Timeout: Node daemon ping timeout in 780000 ms for instance i-0a711924387d1bc88 @ 10.172.241.200. Please check network connectivity between the data plane and the c...

Latest Reply
Yara
New Contributor II
  • 3 kudos

I have the same issue

4 More Replies
Madalian
by New Contributor III
  • 1314 Views
  • 0 replies
  • 0 kudos

Adding a new field to a Delta Live Table

Hi Experts, on the Bronze layer I have all the Delta merge files in (Parquet) format. I am converting these files into Delta Live Tables in the Silver layer. While doing so, I am unable to add a current timestamp column. Following is the script: from pyspark.sql.functio...
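For context, adding an ingestion timestamp inside a Delta Live Tables definition usually amounts to a withColumn call in the table function; a rough sketch with hypothetical source path, table name, and column name:

```python
import dlt
from pyspark.sql import functions as F

# A minimal sketch, assuming a Bronze Delta source at a hypothetical path;
# the current timestamp is added as a column inside the table definition.
@dlt.table(name="silver_orders", comment="Bronze orders with load timestamp")
def silver_orders():
    bronze = spark.read.format("delta").load("/mnt/bronze/orders")
    return bronze.withColumn("load_ts", F.current_timestamp())
```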


Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group