Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

himanmon
by New Contributor III
  • 337 Views
  • 2 replies
  • 1 kudos

Can I move a single file larger than 100GB using dbutils fs?

Hello. I have a file over 100GB. Sometimes this is on the cluster's local path, and sometimes it's on the volume. And I want to send this to another path on the volume, or to the S3 bucket. dbutils.fs.cp('file:///tmp/test.txt', '/Volumes/catalog/schem...

Latest Reply
Slash
Contributor
  • 1 kudos

Hi @himanmon, this is caused by the S3 limit on multipart segment count: the part files can only be numbered from 1 to 10000. After setting spark.hadoop.fs.s3a.multipart.size to 104857600, did you RESTART the cluster? Because it'll only work when the clust...

1 More Replies
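
For readers hitting the same limit, a minimal sketch of the approach discussed above, to run in a Databricks notebook. The destination path is a placeholder, and ~100 MB parts keep an object well over 100 GB under S3's 10,000-part cap.

    # Set in the cluster's Spark config (Advanced options) and restart the
    # cluster, as the reply notes; changing it at runtime may not take effect:
    #
    #   spark.hadoop.fs.s3a.multipart.size 104857600
    #
    # Then copy from the driver's local disk to a volume (or an S3 path).
    dbutils.fs.cp(
        "file:///tmp/test.txt",                     # local file on the cluster
        "/Volumes/catalog/schema/volume/test.txt",  # hypothetical destination
    )
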
yurib
by New Contributor III
  • 563 Views
  • 2 replies
  • 0 kudos

Resolved! Error creating token when creating databricks_mws_workspace resource on GCP

 resource "databricks_mws_workspaces" "this" { depends_on = [ databricks_mws_networks.this ] provider = databricks.account account_id = var.databricks_account_id workspace_name = "${local.prefix}-dbx-ws" location = var.google_region clou...

Latest Reply
yurib
New Contributor III
  • 0 kudos

My issue was caused by credentials in `~/.databrickscfg` (generated by the Databricks CLI) taking precedence over the creds set by `gcloud auth application-default login`. Google's application default creds should be used when using the Databricks Google...

1 More Replies
FaizH
by New Contributor III
  • 558 Views
  • 2 replies
  • 1 kudos

Resolved! Error - Data Masking

Hi, I was testing the masking functionality of Databricks and got the below error: java.util.concurrent.ExecutionException: com.databricks.sql.managedcatalog.acl.UnauthorizedAccessException: PERMISSION_DENIED: Query on table dev_retransform.uc_lineage.test_...

Latest Reply
Slash
Contributor
  • 1 kudos

Hi @FaizH, are you using single-user compute by any chance? Because if you do, there is the following limitation: Single-user compute limitation - Do not add row filters or column masks to any table that you are accessing from a single-user cluster. During t...

1 More Replies
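
For context, a hedged sketch of how a column mask is attached in Unity Catalog from a notebook; the function, table, and column names are hypothetical, and per the reply above such tables should be queried from shared compute rather than a single-user cluster.

    # Define a masking function, then bind it to a column (hypothetical names).
    spark.sql("""
        CREATE OR REPLACE FUNCTION main.demo.mask_value(v STRING)
        RETURN CASE WHEN is_account_group_member('admins') THEN v ELSE '****' END
    """)
    spark.sql("""
        ALTER TABLE main.demo.customers
        ALTER COLUMN email SET MASK main.demo.mask_value
    """)
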
thilanka02
by New Contributor II
  • 1199 Views
  • 3 replies
  • 1 kudos

Resolved! Spark read CSV does not throw Exception if the file path is not available in Databricks 14.3

We were using this method and this was working as expected in Databricks 13.3.

    def read_file():
        try:
            df_temp_dlr_kpi = spark.read.load(raw_path, format="csv", schema=kpi_schema)
            return df_temp_dlr_kpi
        except Exce...

Latest Reply
databricks100
New Contributor II
  • 1 kudos

Hi, has this been resolved? I am still seeing this issue with Runtime 14.3 LTS. Thanks in advance.

2 More Replies
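
For anyone else seeing this on 14.3, a hedged workaround sketch: check the path explicitly instead of relying on spark.read to raise at definition time. raw_path and kpi_schema are the variables from the original post.

    def read_file():
        try:
            dbutils.fs.ls(raw_path)  # raises if the path does not exist
        except Exception as e:
            print(f"Path not found: {raw_path} ({e})")
            return None
        return spark.read.load(raw_path, format="csv", schema=kpi_schema)
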
Rash
by New Contributor
  • 227 Views
  • 1 reply
  • 0 kudos

Databricks feature

Would like to know if Databricks supports a write-back feature via Alteryx to Databricks.

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @Rash, Alteryx does support write-back to Databricks. To achieve this, you can use the Data Stream In tool in Alteryx, which allows you to write data to Databricks. The write support is facilitated via the Databricks Bulk Loader.

Mrinal16
by New Contributor
  • 284 Views
  • 1 reply
  • 0 kudos

Connection with virtual machine

I have to upload files from an Azure container to a virtual machine using Databricks. I have mounted my files to Databricks. Please help me do it if you have any idea about this.

Latest Reply
imsabarinath
New Contributor III
  • 0 kudos

I am not sure if you need to further curate the data before you upload it to the virtual machine; if not, you can just mount the storage on the VM. See: Create an SMB Azure file share and connect it to a Windows VM | Microsoft Learn; Azure Storage - Create File Storag...

johnp
by New Contributor III
  • 447 Views
  • 2 replies
  • 1 kudos

Resolved! Access to "Admin Console" and "System Tables"

I am the contributor and owner of my Databricks workspace. After a recent spike in expenses, I want to check the billing details of my Azure Databricks usage (i.e., per cluster, per VM, etc.). Databricks provides this information through "Admin Conso...

Latest Reply
Rishabh_Tiwari
Community Manager
  • 1 kudos

Hi @johnp, thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the responses and choose the one that best answers your question? Your feedback...

1 More Replies
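
As a starting point, a hedged sketch of checking per-cluster usage through the system tables mentioned in the post. It assumes the system.billing schema is enabled and that an account admin has granted SELECT on it; column names reflect recent releases and may differ in your workspace.

    usage = spark.sql("""
        SELECT usage_date,
               usage_metadata.cluster_id AS cluster_id,
               sku_name,
               SUM(usage_quantity) AS dbus
        FROM system.billing.usage
        GROUP BY usage_date, usage_metadata.cluster_id, sku_name
        ORDER BY usage_date DESC
    """)
    display(usage)
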
lbdatauser
by New Contributor II
  • 232 Views
  • 1 reply
  • 0 kudos

Liquid clustering with incremental ingestion

We ingest data incrementally from a database into delta tables using a column updatedUtc. This column is a datetime and is updated when the row in the database table changes. What about using this non-mutable column in "cluster by"? Would it require ...

Latest Reply
greyamber
New Contributor II
  • 0 kudos

It is recommended to run the OPTIMIZE query on a schedule: https://docs.databricks.com/en/delta/clustering.html#how-to-trigger-clustering

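For illustration, a minimal sketch of the pattern from the linked docs: declare the clustering column when creating the table, then trigger clustering with a scheduled OPTIMIZE run. Apart from updatedUtc, the table and column names are hypothetical.

    spark.sql("""
        CREATE TABLE IF NOT EXISTS main.demo.events (
            id BIGINT,
            payload STRING,
            updatedUtc TIMESTAMP
        )
        CLUSTER BY (updatedUtc)
    """)

    # Run from a scheduled job so newly ingested rows get clustered incrementally.
    spark.sql("OPTIMIZE main.demo.events")
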
haritashva31
by New Contributor
  • 892 Views
  • 2 replies
  • 0 kudos

50%-off Databricks certification voucher

Hello Databricks Community Team, I am reaching out to inquire about the Databricks certification voucher promotion for completing the Databricks Learning Festival (Virtual) courses. I completed one of the Databricks Learning Festival courses in July 2024...

Latest Reply
J_Anderson
New Contributor III
  • 0 kudos

Thanks for participating in the learning festival! We will be distributing the coupons via email after the event has concluded. You can expect to receive the email by early August.

1 More Replies
pjv
by New Contributor III
  • 356 Views
  • 2 replies
  • 0 kudos

Setting up a proxy for python notebook

Hi all, I am running a Python notebook with a web scraper. I want to set up a proxy server that I can use to avoid any IP bans when scraping. Can someone recommend a way to set up a proxy server that can be used for HTTP requests sent from a Databricks...

Latest Reply
pjv
New Contributor III
  • 0 kudos

Hi Kaniz, thanks for the reply. I know how to include HTTP proxies in my Python code and redirect the requests. However, I wondered if Databricks has any functionality to set up the proxies? Thank you.

1 More Replies
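
In the meantime, a hedged sketch of routing notebook HTTP traffic through a proxy you host yourself; the proxy URL is a placeholder, and this relies on standard Python mechanisms rather than any Databricks-specific feature.

    import os
    import requests

    PROXY_URL = "http://my-proxy.example.com:8080"  # hypothetical proxy endpoint

    # Option 1: pass proxies per request.
    resp = requests.get(
        "https://example.com",
        proxies={"http": PROXY_URL, "https": PROXY_URL},
        timeout=30,
    )
    print(resp.status_code)

    # Option 2: set environment variables that most HTTP clients honor.
    os.environ["HTTP_PROXY"] = PROXY_URL
    os.environ["HTTPS_PROXY"] = PROXY_URL
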
CharlesDLW
by New Contributor
  • 230 Views
  • 1 reply
  • 0 kudos

Unity Catalog cannot display(), but can show() table

Hello all, I'm facing the following issue in a newly set up Azure Databricks - Unity Catalog environment: Failed to store the result. Try rerunning the command. Failed to upload command result to DBFS. Error message: PUT request to create file error Http...

Latest Reply
Slash
Contributor
  • 0 kudos

Hi @CharlesDLW, you have a similar use case to the one below. Follow my reply in that thread: https://community.databricks.com/t5/community-discussions/file-found-with-fs-ls-but-not-with-spark-read/m-p/78618/highlight/true#M5972

JVesely
by New Contributor III
  • 623 Views
  • 3 replies
  • 2 kudos

Resolved! jdbc errors when parameter is a boolean

I'm trying to query a table from Java code. The query works when I use a Databricks notebook / query editor directly in Databricks. However, when using JDBC with Spring, I get the following stacktrace: org.springframework.jdbc.UncategorizedSQLException:...

Latest Reply
JVesely
New Contributor III
  • 2 kudos

As I see it, there are two things: jdbcTemplate converts boolean to bit. This is according to the JDBC specs (this is a "spring-jdbc" thing and, according to the documentation, jdbcTemplate.queryForList makes the best possible guess of the desired type). Data...

2 More Replies
valefar
by New Contributor
  • 299 Views
  • 1 reply
  • 0 kudos

Unexpected response from server during a HTTP connection: authorize: cannot authorize peer.

Hi all, when attempting to connect to Databricks with Spark ODBC using the regular host IP and port, everything is successful. However, we need to send the connection through an internal proxy service that re-maps the server's endpoint to a local port...

Latest Reply
Kaniz_Fatma
Community Manager
  • 0 kudos

Hi @valefar, Firstly, ensure your connection settings correctly map the server's endpoint to 'localhost' and the appropriate port number through the proxy service. Double-check your connection string or configuration to align with Databricks workspac...

Shrinivas
by New Contributor
  • 288 Views
  • 1 reply
  • 0 kudos

Databricks/Terraform - Error while creating workspace

Hi - I have the below code to create the credentials, storage, and workspace through a Terraform script, but only the credentials and storage are created; it fails to create the workspace with an error. Can someone please guide/suggest what's wrong with the code/l...

Latest Reply
Slash
Contributor
  • 0 kudos

Hi @Shrinivas, could you share with us how you configured the Databricks provider?

