Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

JohnsonBDSouza
by New Contributor II
  • 1965 Views
  • 1 reply
  • 0 kudos

Unable to create Iceberg tables pointing to data in S3 and run queries against the tables.

I need to set up Iceberg tables in the Databricks environment, but the data resides in an S3 bucket, and then read these tables by running SQL queries. The Databricks environment has access to S3. This is done by setting up the access by mapping the Instance Pr...

Latest Reply
shan_chandra
Esteemed Contributor
  • 0 kudos

@JohnsonBDSouza - could you please let me know if you had a chance to review the UniForm feature, which allows you to create Iceberg tables from the Delta format? Based on what I could understand from the above, you can create a Delta table and use the b...

  • 0 kudos
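
A minimal sketch of the UniForm route suggested in the reply, assuming a Unity Catalog table; the catalog, schema, table, and column names below are placeholders rather than anything from the original post:

```python
# Hedged sketch: create a Delta table with UniForm enabled so Iceberg clients
# can read it. All names here are placeholders.
spark.sql("""
    CREATE TABLE main.demo.events (
        event_id BIGINT,
        event_ts TIMESTAMP
    )
    USING DELTA
    TBLPROPERTIES (
        'delta.enableIcebergCompatV2' = 'true',
        'delta.universalFormat.enabledFormats' = 'iceberg'
    )
""")

# The table is queryable with ordinary SQL from Databricks, and UniForm keeps
# Iceberg metadata alongside the Delta log for external Iceberg readers.
spark.sql("SELECT COUNT(*) FROM main.demo.events").show()
```

Getting the existing S3 data into the table (via an external location or a copy) depends on the poster's setup and is not shown here.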
kiko_roy
by Contributor
  • 739 Views
  • 2 replies
  • 0 kudos

IsBlindAppend config change

Hello All, can someone please suggest how I can change the config IsBlindAppend from false to true? I need to do this not for a data table but for a custom log table. Also, is there any concern if I toggle the value, as a standard practice? Please suggest.

Latest Reply
Lakshay
Esteemed Contributor
  • 0 kudos

IsBlindAppend is not a config but an operation metric that is used in the Delta Lake history. Its value changes based on the type of operation performed on the Delta table. https://docs.databricks.com/en/delta/history.html

  • 0 kudos
1 More Replies
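
For reference, a small sketch of where isBlindAppend actually shows up; the table name is a placeholder:

```python
# isBlindAppend is a per-commit column in the Delta history output, not a
# table property you set. Appends that do not read the table (e.g. a plain
# df.write.mode("append")) typically record isBlindAppend = true.
history = spark.sql("DESCRIBE HISTORY my_catalog.my_schema.custom_log_table")
history.select("version", "operation", "isBlindAppend", "operationMetrics") \
       .show(truncate=False)
```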
ksenija
by Contributor
  • 1090 Views
  • 1 reply
  • 0 kudos

Foreign table to delta streaming table

I want to copy a table from a foreign catalog as my streaming table. This is the code I used, but I am getting the error: Table table_name does not support either micro-batch or continuous scan.; spark.readStream.table(table_name) ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @ksenija, In Spark, tables need to explicitly declare their support for batch or streaming scans.   If you’re working with a Delta table, you might need to ensure that the correct configurations are set. If you’re working with a non-Delta table, y...

  • 0 kudos
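
One hedged workaround sketch, assuming the intent is to stream from a Delta copy rather than directly from the foreign (federated) table, which does not support micro-batch or continuous scans; the table names are placeholders:

```python
# Snapshot the foreign table into a managed Delta table on a schedule
# (or MERGE incrementally), then stream from the Delta copy.
foreign_table = "foreign_catalog.some_schema.source_table"   # placeholder
delta_copy = "main.bronze.source_table_copy"                  # placeholder

spark.table(foreign_table).write.mode("overwrite").saveAsTable(delta_copy)

# Delta tables support readStream, so this no longer raises the scan error.
stream_df = spark.readStream.table(delta_copy)
```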
OlekNV
by New Contributor
  • 1401 Views
  • 3 replies
  • 1 kudos

Enable system schemas

Hello All, I'm new to Databricks and have an issue with enabling system schemas. When I run an API call to check the system schemas status in metastores, I see that all schemas are in the "Unavailable" state (except "information_schema", which is "ENABLE_COMPLETED"). Is ...

Latest Reply
Kaniz_Fatma
Community Manager
  • 1 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

  • 1 kudos
2 More Replies
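
A hedged sketch of the system-schemas REST calls involved; the workspace URL, token, and metastore ID are placeholders:

```python
import requests

host = "https://<workspace-host>"      # placeholder
token = "<personal-access-token>"      # placeholder
metastore_id = "<metastore-id>"        # placeholder
headers = {"Authorization": f"Bearer {token}"}

# List system schemas and their current states (the post reports most of
# them as "Unavailable" except information_schema).
resp = requests.get(
    f"{host}/api/2.0/unity-catalog/metastores/{metastore_id}/systemschemas",
    headers=headers,
)
print(resp.json())

# Enabling a schema (here "access", as an example) is a PUT to the same path.
requests.put(
    f"{host}/api/2.0/unity-catalog/metastores/{metastore_id}/systemschemas/access",
    headers=headers,
).raise_for_status()
```

A schema reported as unavailable for the metastore generally cannot be enabled until Databricks makes it available, which may be what the poster is seeing.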
JohnsonBDSouza
by New Contributor II
  • 1845 Views
  • 3 replies
  • 0 kudos

Connection Error Python databricks-sql-connector

Hello Databricks Community, I'm trying to connect to Databricks via the Python library (databricks-sql-connector 3.0.1). The code below was working a few months ago. Now, it is failing to connect. hostname, http_path and access_token are valid values a...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @JohnsonBDSouza, In your case, it seems like the DatabricksRetryPolicy class is being initialized with an unexpected keyword argument allowed_methods. This could be due to a variety of reasons such as changes in the library version, deprecated fea...

  • 0 kudos
2 More Replies
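
For comparison, a minimal connection sketch with databricks-sql-connector; the hostname, HTTP path, and token are placeholders. The allowed_methods error mentioned above often points at a urllib3 version mismatch, but that is an assumption worth verifying against the installed package versions:

```python
from databricks import sql

connection = sql.connect(
    server_hostname="<workspace-host>",      # placeholder
    http_path="<warehouse-http-path>",       # placeholder
    access_token="<personal-access-token>",  # placeholder
)

cursor = connection.cursor()
cursor.execute("SELECT 1")
print(cursor.fetchall())

cursor.close()
connection.close()
```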
Shree23
by New Contributor III
  • 1287 Views
  • 3 replies
  • 0 kudos

scalar function in databricks

Hi Expert, here is a SQL Server scalar function; how do I convert it to a Databricks function? SQL: CREATE function [dbo].[gettrans](@PickupCompany nvarchar(2), @SupplyCountry int, @TxnSource nvarchar(10), @locId nvarchar(50), @ExternalSiteId nvarchar(50)) RETURNS INT...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

  • 0 kudos
2 More Replies
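
A hedged sketch of the Databricks SQL UDF syntax that usually replaces a SQL Server scalar function. The signature mirrors the post; the body is a placeholder, since the original function body is truncated:

```python
# The parameter list follows the post's T-SQL signature; the SELECT inside
# RETURN is purely illustrative and must be replaced with the real logic.
spark.sql("""
    CREATE OR REPLACE FUNCTION gettrans(
        PickupCompany STRING,
        SupplyCountry INT,
        TxnSource STRING,
        locId STRING,
        ExternalSiteId STRING
    )
    RETURNS INT
    RETURN (
        SELECT CAST(COUNT(*) AS INT)
        FROM some_schema.transactions t      -- placeholder table
        WHERE t.pickup_company = PickupCompany
          AND t.supply_country = SupplyCountry
    )
""")
```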
Gembo
by New Contributor II
  • 2691 Views
  • 6 replies
  • 0 kudos

Cluster Access mode set to Shared on Databricks, results in connection refused on Exasol

I am trying to run a TRUNCATE command on my Exasol DWH from Databricks using pyexasol. This works perfectly fine when I have the cluster access mode set to "No Isolation Shared", which does not have access to our Unity Catalog. When I change the clust...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

  • 0 kudos
5 More Replies
Databricks24
by New Contributor
  • 1426 Views
  • 2 replies
  • 0 kudos

UserAgentEntry added to JDBC URL but not visible in Audit logs

Hi, as part of Databricks best practices, I have added 'UserAgentEntry' to the JDBC URL that is created when we execute SQL statements through the JDBC driver. Sample url - jdbc:databricks://<host>:443;httpPath=<httpPath>; AuthMech=3;UID=token;...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

  • 0 kudos
1 More Replies
udi_azulay
by New Contributor II
  • 1728 Views
  • 2 replies
  • 0 kudos

local filesystem access is forbidden

Hi, when I run this command on my private cluster (Single User) it works well: dbutils.fs.cp(ituff_file, protocol_local_file_path). When I try to run it on a shared cluster, I am getting: java.lang.SecurityException: Cannot use com.databricks....

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your question? This...

  • 0 kudos
1 More Replies
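
A hedged sketch of the Unity Catalog volumes pattern that usually replaces driver-local paths on shared access-mode clusters; the volume and file paths are placeholders:

```python
src = "dbfs:/tmp/ituff_file.txt"                      # placeholder source
dst = "/Volumes/main/default/landing/ituff_file.txt"  # placeholder UC volume path

# dbutils.fs can copy into volume paths; driver-local destinations are what
# typically trigger the SecurityException on shared clusters.
dbutils.fs.cp(src, dst)

# Files in a volume can also be read with plain Python file APIs.
with open(dst, "r") as f:
    print(f.readline())
```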
dev_4
by New Contributor
  • 1213 Views
  • 1 reply
  • 0 kudos

Py4JSecurityException for file access in azure data storage - seeking help

I am trying to access a file in Azure Data Storage using Databricks in Python. When I access it, I am getting a Py4JSecurityException (py4j.security.Py4JSecurityException: Method public com.databricks.backend.daemon.dbutils.DBUtilsCore$Result com.databricks...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @dev_4, the py4j.security.Py4JSecurityException error you’re encountering is thrown when a method that Azure Databricks has not explicitly marked as safe for Azure Data Lake Storage credential passthrough clusters is accessed. This securit...

  • 0 kudos
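
A hedged sketch of reading the file through the Spark APIs instead of the blocked dbutils call, which is often permitted on credential-passthrough clusters; the storage account, container, and path are placeholders:

```python
path = "abfss://<container>@<storage-account>.dfs.core.windows.net/path/to/file.csv"

df = (
    spark.read.format("csv")
    .option("header", "true")
    .load(path)
)
df.show(5)
```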
Mado
by Valued Contributor II
  • 3048 Views
  • 2 replies
  • 0 kudos

How to update a value in a column in a delta table with Map of Struct datatype?

I have a Delta table in Databricks named "prod.silver.control_table". It has a few columns including "table_name" with string data type and "transform_options" with the below structure:
|-- transform_options: map (nullable = true)
|    |-- key: str...

Labels: Community Platform Discussions, MAP, Struct, Update_table
Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Mado, Yes, you can update the values in the “order_duplicates_by” field of your Delta table using the withColumn function in PySpark.   Also, please be aware that overwriting a Delta table will replace all existing data in the table. If you want ...

  • 0 kudos
1 More Replies
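
A hedged sketch of the withColumn approach the reply describes, assuming the struct values inside transform_options carry an order_duplicates_by field; the filter value and the new field value are placeholders:

```python
from pyspark.sql import functions as F

df = spark.table("prod.silver.control_table")

updated = df.withColumn(
    "transform_options",
    F.when(
        F.col("table_name") == "my_target_table",   # placeholder filter
        F.transform_values(
            F.col("transform_options"),
            lambda k, v: v.withField("order_duplicates_by", F.lit("ingest_ts DESC")),
        ),
    ).otherwise(F.col("transform_options")),
)

# Writing back overwrites the table contents, as the reply cautions.
updated.write.mode("overwrite").saveAsTable("prod.silver.control_table")
```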
arun949290
by New Contributor II
  • 1590 Views
  • 3 replies
  • 0 kudos

Unable to login to community edition

Hello there, I have successfully created a Databricks account and went to log in to the Community Edition with the exact same login credentials as my account, but it tells me that the email/password are invalid. I can log in with these same exact creden...

Latest Reply
Erik_L
Contributor II
  • 0 kudos

Databricks Community Edition and Databricks are separate services. You have to create an account specific to community.

  • 0 kudos
2 More Replies
Omri
by New Contributor
  • 587 Views
  • 1 reply
  • 0 kudos

Is it possible to create a scratchpad ui?

https://jupyter-contrib-nbextensions.readthedocs.io/en/latest/nbextensions/scratchpad/README.html Is something like this available in Databricks' notebooks UI?

Latest Reply
Debayan
Esteemed Contributor III
  • 0 kudos

Hi, it would be great if you could submit an idea on this; the progress can then be tracked via the ideas portal. https://docs.databricks.com/en/resources/ideas.html

  • 0 kudos
SivaPK
by New Contributor II
  • 1424 Views
  • 1 reply
  • 1 kudos

Generated access token is deleted/expired after the 90-day lifetime? How to use the old token now?

Hello Team, I have generated a new token via Admin Settings --> Developer --> Access Token --> Manage. Now my token is deleted/expired after 90 days. I know what my token is (the generated alphanumeric one). Now how can I set or reuse the same token in d...

Labels: Community Platform Discussions, access_token, generate_token, restore_token, settings
Latest Reply
Debayan
Esteemed Contributor III
  • 1 kudos

Hi, to change the default lifetime of 90 days, you can leave the Lifetime (days) box empty (blank). Refer: https://docs.databricks.com/en/dev-tools/auth/pat.html#databricks-personal-access-tokens-for-workspace-users

  • 1 kudos
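
An expired personal access token cannot be recovered or re-enabled; the usual path is to mint a new one, for example via the token API. A hedged sketch, with the host and the authenticating token as placeholders:

```python
import requests

host = "https://<workspace-host>"          # placeholder
auth_token = "<a-currently-valid-token>"   # placeholder

resp = requests.post(
    f"{host}/api/2.0/token/create",
    headers={"Authorization": f"Bearer {auth_token}"},
    # 90 days; omitting lifetime_seconds follows the workspace default lifetime.
    json={"comment": "replacement token", "lifetime_seconds": 7776000},
)
resp.raise_for_status()
new_token = resp.json()["token_value"]  # shown only once; store it securely
```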
Sujitha
by Community Manager
  • 6253 Views
  • 1 reply
  • 0 kudos

Exciting Announcement: Launch of New Course - Data Analysis with Databricks!

Welcome to the world of Data Analysis with Databricks! We are thrilled to introduce our latest course, providing a comprehensive journey into data analysis on the Databricks platform. Whether you're a beginner or looking to enhance your skills, this...

Latest Reply
hthiru
New Contributor II
  • 0 kudos

Is this course made available in Home - Databricks Learning?

  • 0 kudos

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.
