- 729 Views
- 1 reply
- 0 kudos
Unity catalog internal error - quality monitoring
I'm trying to get my head around the quality monitoring functionality in Unity Catalog. I configured it for one of the tables in our Unity Catalog. My assumption is that the profile and drift metrics tables are created automatically, but then I get an internal e...
- 0 kudos
Hi, were you able to resolve this? I'm having a similar issue. Thanks!
- 6555 Views
- 0 replies
- 3 kudos
Unity Catalog Governance Value Levers
What makes Unity Catalog a game-changer? The blog intricately dissects five main value levers: mitigating data and architectural risks, ensuring compliance, accelerating innovation, reducing platform complexity and costs while improving operational e...
- 756 Views
- 1 reply
- 0 kudos
Workspace Assignment Issue via REST API
I'm relying on workspace assignment via the REST API to have the account user created in the workspace. This is like the workspace assignment screen at the account level, or the "add existing user" screen at the workspace level. The reference URL is below. Workspace ...
- 0 kudos
It turns out the problem is the documentation. It says that the permissions parameter (supplied in the request body) is an array of strings. It really just expects a string: either UNKNOWN, USER, or ADMIN. It would be great if the team could fix the documentat...
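Based on that reply, a minimal sketch of how the call might be built. All identifiers (host, account ID, workspace ID, principal ID) are placeholders, and the key assumption is the reply's claim that `permissions` is sent as a plain string rather than the array the docs describe:

```python
import json

def build_assignment_request(host, account_id, workspace_id, principal_id, permission):
    """Build the URL and JSON body for a workspace permission assignment.

    Per the reply above, `permissions` is a plain string (UNKNOWN, USER,
    or ADMIN), despite the documentation describing an array of strings.
    Every identifier here is a placeholder.
    """
    if permission not in ("UNKNOWN", "USER", "ADMIN"):
        raise ValueError(f"unexpected permission: {permission}")
    url = (f"{host}/api/2.0/accounts/{account_id}"
           f"/workspaces/{workspace_id}/permissionassignments"
           f"/principals/{principal_id}")
    body = json.dumps({"permissions": permission})
    return url, body

url, body = build_assignment_request(
    "https://accounts.cloud.databricks.com", "acc-123", "ws-456", "user-789", "USER")
# The result would then be sent as an HTTP PUT with an appropriate auth header.
```

Verify the exact endpoint path and field shape against the current API reference before relying on this.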
- 1699 Views
- 1 reply
- 0 kudos
Unable to create Iceberg tables pointing to data in S3 and run queries against the tables.
I need to set up Iceberg tables in a Databricks environment where the data resides in an S3 bucket, and then read these tables by running SQL queries. The Databricks environment has access to S3; this is done by setting up access by mapping the Instance Pr...
- 0 kudos
@JohnsonBDSouza - could you please let me know if you have had a chance to review the UniForm feature, which allows you to create Iceberg tables from the Delta format? Based on what I could understand from the above, you can create a Delta table and use the b...
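To make the UniForm suggestion concrete, here is a hedged sketch of the DDL for a Delta table that Iceberg clients can read. The table name and columns are made up, and the property names should be checked against the current Databricks UniForm documentation:

```python
def uniform_create_table_sql(table_name):
    """Return DDL for a Delta table with UniForm (Iceberg reads) enabled.

    The TBLPROPERTIES keys below follow the UniForm docs at the time of
    writing; verify them against the current documentation before use.
    The table name and columns are placeholders.
    """
    return (
        f"CREATE TABLE {table_name} (id BIGINT, payload STRING) "
        "TBLPROPERTIES ("
        "'delta.enableIcebergCompatV2' = 'true', "
        "'delta.universalFormat.enabledFormats' = 'iceberg')"
    )

ddl = uniform_create_table_sql("main.default.events")
# In a notebook you would run this via spark.sql(ddl).
```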
- 684 Views
- 2 replies
- 0 kudos
IsBlindAppend config change
Hello All, can someone please suggest how I can change the config IsBlindAppend to true from false? I need to do this not for a data table but for a custom log table. Also, is there any concern if I toggle the value, as far as standard practices go? Please suggest.
- 0 kudos
IsBlindAppend is not a config but an operation metric that is recorded in the Delta Lake history. Its value changes based on the type of operation performed on the Delta table. https://docs.databricks.com/en/delta/history.html
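Since isBlindAppend is a column of the DESCRIBE HISTORY output rather than a setting, you can only observe it, not flip it. A small sketch of inspecting it, using made-up rows shaped like the DESCRIBE HISTORY result:

```python
def blind_appends(history_rows):
    """Return the versions whose writes were blind appends.

    `history_rows` mimics rows from `DESCRIBE HISTORY <table>`:
    dicts with at least `version` and `isBlindAppend` keys. In a real
    notebook you would collect these from spark.sql("DESCRIBE HISTORY ...").
    """
    return [row["version"] for row in history_rows if row.get("isBlindAppend")]

# Made-up sample rows for illustration only.
rows = [
    {"version": 2, "operation": "WRITE", "isBlindAppend": True},
    {"version": 1, "operation": "MERGE", "isBlindAppend": False},
    {"version": 0, "operation": "CREATE TABLE", "isBlindAppend": False},
]
```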
- 1020 Views
- 1 reply
- 0 kudos
Foreign table to delta streaming table
I want to copy a table from a foreign catalog as my streaming table. This is the code I used, but I am getting an error: Table table_name does not support either micro-batch or continuous scan.; spark.readStream .table(table_name) ...
- 0 kudos
Hi @ksenija, In Spark, tables need to explicitly declare their support for batch or streaming scans. If you’re working with a Delta table, you might need to ensure that the correct configurations are set. If you’re working with a non-Delta table, y...
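One common workaround (an assumption, not a confirmed fix for this thread) is to first materialize the foreign-catalog table as a managed Delta table and stream from that copy, since Delta tables support micro-batch scans. A sketch of the staging step, with placeholder table names:

```python
def stage_for_streaming_sql(foreign_table, delta_table):
    """DDL that copies a foreign-catalog table into a Delta table that
    spark.readStream.table() can consume. Both names are placeholders."""
    return f"CREATE OR REPLACE TABLE {delta_table} AS SELECT * FROM {foreign_table}"

sql = stage_for_streaming_sql("foreign_cat.db.src", "main.bronze.src_copy")
# In a notebook: spark.sql(sql), then spark.readStream.table("main.bronze.src_copy")
```

Note this snapshots the data once; keeping the copy fresh would need a scheduled refresh or an ingestion pipeline.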
- 1274 Views
- 3 replies
- 1 kudos
Enable system schemas
Hello All, I'm new to Databricks and have an issue with enabling system schemas. When I run an API call to check the system schema status in metastores, I see that all schemas are in the "Unavailable" state (except "information_schema", which is "ENABLE_COMPLETED"). Is ...
- 1 kudos
Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...
- 1647 Views
- 3 replies
- 0 kudos
Connection Error Python databricks-sql-connector
Hello Databricks Community, I'm trying to connect to Databricks via the Python library (databricks-sql-connector-3.0.1). The code below was working a few months ago; now it is failing to connect. hostname, http_path and access_token are valid values a...
- 0 kudos
Hi @JohnsonBDSouza, In your case, it seems like the DatabricksRetryPolicy class is being initialized with an unexpected keyword argument allowed_methods. This could be due to a variety of reasons such as changes in the library version, deprecated fea...
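If the `allowed_methods` failure comes from a library-version mismatch (as the reply suggests), one low-tech way to catch it early is to assert on the installed connector version before connecting. A sketch; the "3.0.1" baseline is simply the version the poster mentions, not a recommendation:

```python
def version_tuple(version):
    """Parse a dotted version string like '3.0.1' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def check_connector(installed, minimum="3.0.1"):
    """Fail fast if the installed databricks-sql-connector is older than
    a version known to work in your environment (placeholder baseline)."""
    if version_tuple(installed) < version_tuple(minimum):
        raise RuntimeError(
            f"databricks-sql-connector {installed} < required {minimum}")
    return True
```

In practice you would read the installed version from `databricks.sql.__version__` (or pin it in requirements.txt) rather than hard-coding it.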
- 1196 Views
- 3 replies
- 0 kudos
scalar function in databricks
Hi Expert, here is a SQL Server scalar function; how do I convert it into a Databricks function? SQL: CREATE FUNCTION [dbo].[gettrans](@PickupCompany nvarchar(2), @SupplyCountry int, @TxnSource nvarchar(10), @locId nvarchar(50), @ExternalSiteId nvarchar(50)) RETURNS INT...
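For the scalar-function question above: Databricks supports SQL UDFs via CREATE FUNCTION, so the conversion is mostly a signature translation (STRING instead of NVARCHAR, no @ prefixes). A hedged skeleton follows; the parameter names come from the question, but the body is a placeholder because the original T-SQL body is truncated in the post:

```python
def gettrans_ddl():
    """Skeleton Databricks SQL UDF mirroring the SQL Server signature
    from the question. The RETURN expression is a placeholder; the
    original function body was truncated and must be translated by hand."""
    return """CREATE OR REPLACE FUNCTION gettrans(
  PickupCompany STRING,
  SupplyCountry INT,
  TxnSource STRING,
  locId STRING,
  ExternalSiteId STRING)
RETURNS INT
RETURN 0 -- placeholder: translate the original T-SQL logic here
"""

ddl = gettrans_ddl()
```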
- 2419 Views
- 6 replies
- 0 kudos
Cluster Access mode set to Shared on Databricks, results in connection refused on Exasol
I am trying to run a TRUNCATE command on my Exasol DWH from Databricks using pyexasol. This works perfectly fine when I have the cluster access mode set to "No Isolation Shared", which does not have access to our Unity Catalog. When I change the clust...
- 1323 Views
- 2 replies
- 0 kudos
UserAgentEntry added to JDBC URL but not visible in Audit logs
Hi, as part of Databricks best practices, I have added 'UserAgentEntry' to the JDBC URL that is created when we execute SQL statements through the JDBC driver. Sample URL: jdbc:databricks://<host>:443;httpPath=<httpPath>;AuthMech=3;UID=token;...
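A small sketch of appending a UserAgentEntry property to a Databricks JDBC URL like the one in the question. The host, path, and application name are placeholders; whether the entry then surfaces in the audit logs is the open question above:

```python
def with_user_agent(jdbc_url, app_name):
    """Append a UserAgentEntry property to a Databricks JDBC URL,
    avoiding a doubled semicolon if the URL already ends with one."""
    sep = "" if jdbc_url.endswith(";") else ";"
    return f"{jdbc_url}{sep}UserAgentEntry={app_name}"

url = with_user_agent(
    "jdbc:databricks://example.cloud.databricks.com:443;"
    "httpPath=/sql/1.0/warehouses/abc;AuthMech=3;UID=token",
    "my-etl-app")
```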
- 1512 Views
- 2 replies
- 0 kudos
local filesystem access is forbidden
Hi, when I run this command on my private cluster (Single User) it works well: dbutils.fs.cp(ituff_file, protocol_local_file_path). When I try to run it on a shared cluster, I get: java.lang.SecurityException: Cannot use com.databricks....
- 1098 Views
- 1 reply
- 0 kudos
Py4JSecurityException for file access in azure data storage - seeking help
I am trying to access a file in Azure data storage using Databricks in Python. When I access it, I get a Py4JSecurityException (py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks...
- 0 kudos
Hi @dev_4, the py4j.security.Py4JSecurityException error you're encountering is thrown when a method that Azure Databricks has not explicitly marked as safe for Azure Data Lake Storage credential passthrough clusters is accessed. This securit...
- 2791 Views
- 2 replies
- 0 kudos
How to update a value in a column in a delta table with Map of Struct datatype?
I have a delta table in Databricks named "prod.silver.control_table". It has a few columns including "table_name" with string data type and "transform_options" with the below structure: |-- transform_options: map (nullable = true) | |-- key: str...
- 0 kudos
Hi @Mado, Yes, you can update the values in the “order_duplicates_by” field of your Delta table using the withColumn function in PySpark. Also, please be aware that overwriting a Delta table will replace all existing data in the table. If you want ...
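To make the reply concrete, here is what the update amounts to if you picture transform_options as a dict of structs. The sample data is made up; in PySpark you would achieve the equivalent with withColumn plus transform_values, or by rebuilding the map, rather than with plain dicts:

```python
def update_order_duplicates_by(transform_options, table_key, new_value):
    """Return a copy of the map-of-struct column value with one struct
    field updated.

    `transform_options` mimics the Delta column: a dict mapping a key to
    a struct (dict) containing an `order_duplicates_by` field. The copy
    is shallow per struct so the original value is left untouched.
    """
    updated = {k: dict(v) for k, v in transform_options.items()}
    updated[table_key]["order_duplicates_by"] = new_value
    return updated

# Made-up sample value for illustration only.
opts = {"load": {"order_duplicates_by": ["updated_at"], "keys": ["id"]}}
new_opts = update_order_duplicates_by(opts, "load", ["ingested_at", "id"])
```

As the reply notes, writing the result back with overwrite mode replaces the whole table, so a targeted MERGE or UPDATE is usually preferable for a control table.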
- 1400 Views
- 3 replies
- 0 kudos
Unable to login to community edition
Hello there, I have successfully created a Databricks account and went to log in to the Community Edition with the exact same login credentials as my account, but it tells me that the email/password are invalid. I can log in with these same exact creden...
- 0 kudos
Databricks Community Edition and Databricks are separate services. You have to create an account specific to community.
- AI Summit (4)
- Azure (3)
- Azure databricks (3)
- Bi (1)
- Certification (1)
- Certification Voucher (2)
- Chatgpt (1)
- Community (7)
- Community Edition (3)
- Community Members (2)
- Community Social (1)
- Contest (1)
- Data + AI Summit (1)
- Data Engineering (1)
- Data Processing (1)
- Databricks Certification (1)
- Databricks Cluster (1)
- Databricks Community (11)
- Databricks community edition (3)
- Databricks Community Rewards Store (3)
- Databricks Lakehouse Platform (5)
- Databricks notebook (1)
- Databricks Office Hours (1)
- Databricks Runtime (1)
- Databricks SQL (4)
- Databricks-connect (1)
- DBFS (1)
- Dear Community (3)
- Delta (10)
- Delta Live Tables (1)
- Documentation (1)
- Exam (1)
- Featured Member Interview (1)
- HIPAA (1)
- Integration (1)
- LLM (1)
- Machine Learning (1)
- Notebook (1)
- Onboarding Trainings (1)
- Python (2)
- Rest API (11)
- Rewards Store (2)
- Serverless (1)
- Social Group (1)
- Spark (1)
- SQL (8)
- Summit22 (1)
- Summit23 (5)
- Training (1)
- Unity Catalog (4)
- Version (1)
- VOUCHER (1)
- WAVICLE (1)
- Weekly Release Notes (2)
- weeklyreleasenotesrecap (2)
- Workspace (1)