by Surajv • New Contributor III
- 16 Views
- 1 replies
- 0 kudos
Hi Community, I was going through this doc: https://docs.databricks.com/api/workspace/tokens/create and learned that there is a quota limit on how many tokens one can generate using the API POST /api/2.0/token/create. Having breached the thre...
Latest Reply
Hi @Surajv, Let’s dive into the details of token quotas in Databricks.
Quota Limit for Token Creation:
The quota limit for creating user tokens via the API (specifically, POST /api/2.0/token/create) is essential to manage token usage. Each u...
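As a rough illustration of the call being discussed, here is a sketch of building the request body for POST /api/2.0/token/create. The endpoint and payload fields follow the linked API doc; HOST and API_TOKEN are placeholders, and the live call is left commented out since it needs a real workspace.

```python
# Sketch: creating a PAT via the Databricks Tokens API.
# HOST and API_TOKEN below are placeholders, not real values.
import json

def build_token_create_payload(comment: str, lifetime_seconds: int) -> str:
    """Build the JSON body for POST /api/2.0/token/create."""
    return json.dumps({"comment": comment, "lifetime_seconds": lifetime_seconds})

payload = build_token_create_payload("ci-pipeline", 3600)

# Live call (requires a real workspace URL and auth token):
# import requests
# resp = requests.post(f"https://{HOST}/api/2.0/token/create",
#                      headers={"Authorization": f"Bearer {API_TOKEN}"},
#                      data=payload)
# A quota error here means the per-user token limit was reached; revoking
# unused tokens (POST /api/2.0/token/delete) frees quota.
```

Revoking stale tokens is usually the quickest way to get back under the limit.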
by Surajv • New Contributor III
- 24 Views
- 1 replies
- 0 kudos
Hi community, Is there any API or option to view all PAT tokens generated by a Databricks service principal?
Latest Reply
Hi @Surajv, When working with Databricks service principals, you can manage and view personal access tokens (PATs) associated with them.
Here’s how you can achieve this:
What is a Service Principal?
A service principal is an identity created in ...
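One way to view a service principal's PATs is the Token Management API (GET /api/2.0/token-management/tokens), which admins can filter by creator. A hedged sketch, with the workspace call commented out and a placeholder response used for the filtering step; HOST, ADMIN_TOKEN, and the sample data are assumptions:

```python
# Sketch: listing PATs owned by a service principal via the Token
# Management API. The sample response below is a placeholder.

def tokens_for_principal(token_infos, principal_name):
    """Filter a token-management response down to one owner's tokens."""
    return [t for t in token_infos if t.get("created_by_username") == principal_name]

# Live call (admin credentials required):
# import requests
# resp = requests.get(f"https://{HOST}/api/2.0/token-management/tokens",
#                     headers={"Authorization": f"Bearer {ADMIN_TOKEN}"})
# token_infos = resp.json().get("token_infos", [])

sample = [
    {"token_id": "a1", "created_by_username": "sp-app-id-123", "comment": "etl"},
    {"token_id": "b2", "created_by_username": "alice@example.com", "comment": "dev"},
]
mine = tokens_for_principal(sample, "sp-app-id-123")
```

For service principals, the application ID is what typically appears as the creator name.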
- 72 Views
- 0 replies
- 0 kudos
We're looking for feedback on the Databricks free trial experience, and we need your help!
Whether you've used it for data engineering, data science, or analytics, Sujit Nair, our Product Manager on the free trial experience, and our journey archite...
by Еmil • New Contributor III
- 680 Views
- 3 replies
- 1 kudos
My post was marked as Spam after I tried to post the description of my issue, so I have now posted the question on Stack Overflow.
Latest Reply
Hi @Еmil, I've read through your question and believe I have a solution for you.
Here's a response to your question:
Since your job runs as a service principal, consider using OAuth M2M authentication for accessing your Azure DevOps Git repository. En...
2 More Replies
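The OAuth M2M flow mentioned in the reply is a standard client-credentials exchange against the workspace's token endpoint. A minimal sketch, assuming placeholder client ID/secret and host; the live request is commented out:

```python
# Sketch: OAuth M2M (client-credentials) auth for a service principal.
# CLIENT_ID, CLIENT_SECRET, and HOST values are placeholders.

def client_credentials_form(client_id: str, client_secret: str) -> dict:
    """Form body for the client-credentials token request."""
    return {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "all-apis",
    }

form = client_credentials_form("my-sp-client-id", "my-sp-secret")

# Live exchange (requires real credentials):
# import requests
# resp = requests.post(f"https://{HOST}/oidc/v1/token", data=form)
# access_token = resp.json()["access_token"]
```

The resulting access token is short-lived, which is the main advantage over long-lived PATs for automated jobs.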
- 158 Views
- 1 replies
- 0 kudos
Hi, First foray into DLT, following code excerpts from the sample DLT notebook. I'm creating a notebook with the SQL below: CREATE STREAMING LIVE TABLE sales_orders_raw COMMENT "The raw sales orders, ingested from /databricks-datasets." TBLPROPERTIES ...
Latest Reply
The fix is to change the notebook's default language rather than using a magic command. I normally have the default set to Python and had wrongly assumed DLT would handle the magic command, but DLT notebooks can't use magic commands; you have to change the default language for it to work.
- 55 Views
- 0 replies
- 0 kudos
How can I download the run and event logs? The Spark UI is loading them from somewhere, but I couldn't find them in DBFS or on S3.
- 121 Views
- 3 replies
- 0 kudos
I'm trying to use the billable usage API. I do get a report, but I have not been able to get the USD cost report, only the dbuHours. I guess I have to change the meter_name, but I cannot find the key for that parameter anywhere.
Latest Reply
Hi @hpicatto,
AWS Usage Reports:
AWS provides detailed usage and cost reports through the AWS Cost and Usage Report. You can access this report via the AWS Management Console. Here are the steps:
Log in to the AWS Management Console. Navigate to the B...
2 More Replies
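Since the usage report exposes DBU hours rather than dollars, the USD figure has to be derived by multiplying by your contracted $/DBU rate per SKU. A sketch under that assumption; the rates below are made-up placeholders, as real rates depend on workload type and contract:

```python
# Sketch: estimating USD cost from the DBU hours in the billable-usage
# report. These rates are placeholders, not real Databricks pricing.

ASSUMED_RATES_USD_PER_DBU = {
    "JOBS_COMPUTE": 0.15,         # placeholder rate
    "ALL_PURPOSE_COMPUTE": 0.40,  # placeholder rate
}

def usd_cost(dbu_hours: float, sku: str, rates=ASSUMED_RATES_USD_PER_DBU) -> float:
    """Convert DBU hours for a SKU into an estimated USD cost."""
    return round(dbu_hours * rates[sku], 2)

estimate = usd_cost(100.0, "JOBS_COMPUTE")
```

For authoritative dollar figures, the AWS Cost and Usage Report mentioned in the reply is the better source, since it reflects actual billing.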
by Floody • New Contributor II
- 263 Views
- 1 replies
- 0 kudos
When I visit my profile page, under the drafts section I see an entry for every post I visit in the discussions. Is this normal?
Latest Reply
Hi @Floody, Yes, it is normal to see an entry for every post you visit in the discussions under the drafts section of your profile page. This feature allows you to easily access and continue working on drafts of posts that you have started or viewed ...
- 81 Views
- 1 replies
- 1 kudos
Hey all, I am searching for a non-political answer to my database questions. Please know that I am a data newbie and literally do not know anything about this topic, but I want to learn, so please be gentle. Some context: I am working for an OEM that...
Latest Reply
Hi @NoviKamayana, Let’s dive into the world of data storage and explore the differences between Delta Lake and PostgreSQL. As a data newbie, you’re on the right track to learn more about these concepts.
Data Lake vs. Delta Lake
Benefits and Limit...
- 1583 Views
- 5 replies
- 1 kudos
I'm trying to do fuzzy matching on two dataframes by cross joining them and then using a UDF for my fuzzy matching. But using both a Python UDF and a pandas UDF, it's either very slow or I get an error. @pandas_udf("int") def core_match_processor(s1: pd.Ser...
Latest Reply
Hi @mohaimen_syed, could you please help me with these details:
- Cluster details, and
- Whether Apache Arrow optimization is enabled on your cluster.
4 More Replies
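To illustrate the shape of the pandas UDF being discussed, here is a sketch of a fuzzy-match core using only the standard library (difflib); the Spark wrapper is left in comments because it needs a cluster, and the function name mirrors the one in the question. Arrow optimization, which the reply asks about, is what makes pandas UDFs faster than row-at-a-time Python UDFs.

```python
# Sketch: stdlib fuzzy-match scoring, with the pandas_udf wrapper shown
# in comments. The match threshold and names are illustrative only.
from difflib import SequenceMatcher

def match_score(a: str, b: str) -> int:
    """Similarity score 0-100, analogous to a fuzzywuzzy/rapidfuzz ratio."""
    return int(round(SequenceMatcher(None, a, b).ratio() * 100))

# On a cluster, vectorize it as a pandas UDF over the cross-joined columns:
# import pandas as pd
# from pyspark.sql.functions import pandas_udf
# @pandas_udf("int")
# def core_match_processor(s1: pd.Series, s2: pd.Series) -> pd.Series:
#     return pd.Series(match_score(a, b) for a, b in zip(s1, s2))

score = match_score("Databricks Inc", "Databricks Inc.")
```

If the cross join itself is the bottleneck, blocking the join on a cheap key (e.g. first letter) before scoring usually helps more than the UDF choice.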
by kp12 • New Contributor II
- 2151 Views
- 3 replies
- 0 kudos
Hello, I'm trying to write to an Azure PostgreSQL flexible database from Azure Databricks, using the PostgreSQL connector in Databricks Runtime 12.2 LTS. I'm using df.write.format("postgresql").save() to write to the PostgreSQL database, but am getting the follow...
Latest Reply
Had the same problem. You need to add the option "stringtype": "unspecified". Example: df.write.format("postgresql").mode("overwrite").option("truncate", "true").option("stringtype", "unspecified").option("dbtable", table).option("host", host).option("database...
2 More Replies
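Expanding the truncated example from the reply into a readable sketch; the connection values are placeholders, and the write itself is commented out since it needs a cluster. The key fix is stringtype=unspecified, which lets PostgreSQL cast string literals to columns of other types.

```python
# Sketch: options for the PostgreSQL connector write from the reply.
# Host/database/table/user values below are placeholders.

def postgres_write_options(host, database, table, user, password):
    """Options for df.write.format('postgresql') against Azure PostgreSQL."""
    return {
        "host": host,
        "database": database,
        "dbtable": table,
        "user": user,
        "password": password,
        "truncate": "true",
        "stringtype": "unspecified",  # the fix for the type-mismatch error
    }

opts = postgres_write_options("myserver.postgres.database.azure.com",
                              "mydb", "public.sales", "dbuser", "secret")

# On a cluster:
# df.write.format("postgresql").mode("overwrite").options(**opts).save()
```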
- 170 Views
- 2 replies
- 1 kudos
Hey Folks, has anyone put Databricks behind Okta and enabled Unified Login with some workspaces that have a Unity Catalog metastore applied and some that don't? There are some workspaces we can't move over yet, and it isn't clear in the documentation if Unity Catalo...
Latest Reply
Yes, users should be able to use a single Okta application for all workspaces, regardless of whether the Unity Catalog metastore has been applied or not. The Unity Catalog is a feature that allows you to manage and secure access to your data across a...
1 More Replies
- 1741 Views
- 5 replies
- 7 kudos
Hello, I have some data lying in Snowflake, so I want to apply CDC on it using Delta Live Tables, but I am having some issues. Here is what I am trying to do: @dlt.view()
def table1():
    return spark.read.format("snowflake").options(**opt...
Latest Reply
The CDC for Delta Live Tables works fine for Delta tables, as you have noticed. However, it is not a full-blown CDC implementation/software. If you want to capture changes in Snowflake, you will have to implement some CDC method on Snowflake itself, and read...
4 More Replies
- 445 Views
- 1 replies
- 0 kudos
Hi, I am having a hard time configuring my Databricks workspace when working in VSCode via WSL. When following the steps to set up Databricks authentication, I am receiving the following error on Step 5 of "Step 4: Set up Databricks authentication"....
Latest Reply
Scratch that, I found the alternative means of authenticating via this link: Authentication setup for the Databricks extension for Visual Studio Code
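For reference, the VSCode extension can also pick up a configuration profile from `~/.databrickscfg`. A minimal sketch with placeholder values (host and token are not real):

```ini
; ~/.databrickscfg -- placeholder values, replace with your own
[DEFAULT]
host  = https://adb-1234567890123456.7.azuredatabricks.net
token = dapi-placeholder
```

Under WSL, the file must live in the WSL home directory, not the Windows one, for tools running inside WSL to find it.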
- 719 Views
- 3 replies
- 0 kudos
Hi, Databricks JDBC version 2.6.34. I am facing the below issue when connecting to Databricks SQL from Apache Solr: Caused by: java.sql.SQLFeatureNotSupportedException: [Databricks][JDBC](10220) Driver does not support this optional feature. at com.databri...
Latest Reply
The Databricks team recommended setting IgnoreTransactions=1 and autocommit=false in the connection string, but that didn't resolve the issue. Ultimately I had to use the Solr update API for uploading documents.
2 More Replies
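For anyone trying the suggested workaround, here is a sketch of assembling a Databricks JDBC URL with the transaction setting from the reply. The host, HTTP path, and token are placeholders, and note that in this thread the setting did not fix the Solr error; the Solr update API was used instead.

```python
# Sketch: Databricks JDBC URL with IgnoreTransactions=1, as suggested
# in the reply. All connection values below are placeholders.

def databricks_jdbc_url(host, http_path, token):
    """Build a JDBC URL for the Databricks driver with PAT auth."""
    return (
        f"jdbc:databricks://{host}:443/default;transportMode=http;ssl=1;"
        f"httpPath={http_path};AuthMech=3;UID=token;PWD={token};"
        "IgnoreTransactions=1"
    )

url = databricks_jdbc_url("adb-123.azuredatabricks.net",
                          "/sql/1.0/warehouses/abc123", "dapi-placeholder")
```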