- 620 Views
- 2 replies
- 0 kudos
Resolved! Error creating token when creating databricks_mws_workspace resource on GCP
resource "databricks_mws_workspaces" "this" { depends_on = [ databricks_mws_networks.this ] provider = databricks.account account_id = var.databricks_account_id workspace_name = "${local.prefix}-dbx-ws" location = var.google_region clou...
My issue was caused by credentials in `~/.databrickscfg` (generated by the Databricks CLI) taking precedence over the creds set by `gcloud auth application-default login`. Google's application default creds should be used when using the Databricks Google...
- 604 Views
- 2 replies
- 1 kudos
Resolved! Error - Data Masking
Hi, I was testing the masking functionality of Databricks and got the below error: java.util.concurrent.ExecutionException: com.databricks.sql.managedcatalog.acl.UnauthorizedAccessException: PERMISSION_DENIED: Query on table dev_retransform.uc_lineage.test_...
Hi @FaizH, are you using single-user compute by any chance? Because if you do, there is the following limitation: Single-user compute limitation: Do not add row filters or column masks to any table that you are accessing from a single-user cluster. During t...
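For context, a minimal sketch of how a column mask is normally attached; all object names below are placeholders, and per the limitation above the masked table would then need to be queried from shared access mode compute or a SQL warehouse rather than a single-user cluster:

```python
# Hedged sketch: create a masking function and attach it as a column mask.
# All catalog/schema/table/column names are placeholders for illustration.
spark.sql("""
    CREATE OR REPLACE FUNCTION dev_retransform.uc_lineage.mask_email(email STRING)
    RETURN CASE WHEN is_account_group_member('admins') THEN email ELSE '***' END
""")
spark.sql("""
    ALTER TABLE dev_retransform.uc_lineage.customers
    ALTER COLUMN email SET MASK dev_retransform.uc_lineage.mask_email
""")
```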
- 1235 Views
- 3 replies
- 1 kudos
Resolved! Spark read CSV does not throw Exception if the file path is not available in Databricks 14.3
We were using this method and this was working as expected in Databricks 13.3. def read_file(): try: df_temp_dlr_kpi = spark.read.load(raw_path,format="csv", schema=kpi_schema) return df_temp_dlr_kpi except Exce...
Hi, has this been resolved? I am still seeing this issue with Runtime 14.3 LTS. Thanks in advance.
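A possible workaround sketch, assuming the goal is to surface the missing-path error eagerly on 14.3; `raw_path` and `kpi_schema` are the names from the original question, and `spark`/`dbutils` are the objects a Databricks notebook provides:

```python
# Hedged sketch (not the confirmed fix): check the path up front and force an
# action so a missing path fails inside the try block again.
def read_file(raw_path, kpi_schema):
    try:
        dbutils.fs.ls(raw_path)  # raises if the path does not exist
        df_temp_dlr_kpi = spark.read.load(raw_path, format="csv", schema=kpi_schema)
        df_temp_dlr_kpi.head(1)  # trigger an action so lazy read errors surface here
        return df_temp_dlr_kpi
    except Exception as e:
        print(f"Failed to read {raw_path}: {e}")
        return None
```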
- 238 Views
- 1 reply
- 0 kudos
Databricks feature
Would like to know if Databricks supports a write-back feature from Alteryx to Databricks.
Hi @Rash, Alteryx does support write-back to Databricks. To achieve this, you can use the Data Stream In tool in Alteryx, which allows you to write data to Databricks. The write support is facilitated via the Databricks Bulk Loader.
- 324 Views
- 1 reply
- 0 kudos
Connection with virtual machine
I have to upload files from an Azure container to a virtual machine using Databricks. I have mounted my files to Databricks. Please help me do it if you have any idea about this.
I am not sure if you need to further curate the data before you upload it to the virtual machine; if not, you can just mount the storage on the VM: Create an SMB Azure file share and connect it to a Windows VM | Microsoft Learn Azure Storage - Create File Storag...
- 468 Views
- 2 replies
- 1 kudos
Resolved! Access to "Admin Console" and "System Tables"
I am the contributor and owner of my Databricks workspace. After a recent spike in expenses, I want to check the billing details of my Azure Databricks usage (i.e. per cluster, per VM, etc.). Databricks provides this information through the "Admin Conso...
Hi @johnp , Thank you for reaching out to our community! We're here to help you. To ensure we provide you with the best support, could you please take a moment to review the response and choose the one that best answers your question? Your feedback...
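If system tables are enabled for the account, a minimal sketch of a per-cluster usage query could look like the following (assumes access to `system.billing.usage`; column names follow the documented schema and may change):

```python
# Hedged sketch: aggregate DBU usage per cluster from the billing system table.
usage = spark.sql("""
    SELECT usage_date,
           usage_metadata.cluster_id AS cluster_id,
           sku_name,
           SUM(usage_quantity)       AS dbus
    FROM system.billing.usage
    GROUP BY usage_date, usage_metadata.cluster_id, sku_name
    ORDER BY usage_date DESC
""")
display(usage)
```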
- 247 Views
- 1 reply
- 0 kudos
Liquid clustering with incremental ingestion
We ingest data incrementally from a database into delta tables using a column updatedUtc. This column is a datetime and is updated when the row in the database table changes. What about using this non-mutable column in "cluster by"? Would it require ...
It is recommended to run the OPTIMIZE query in a scheduled manner: https://docs.databricks.com/en/delta/clustering.html#how-to-trigger-clustering
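A minimal sketch of what that scheduled step could look like from a notebook or job task, with `main.sales.orders` as a placeholder table that already uses CLUSTER BY:

```python
# Hedged sketch: trigger incremental clustering of newly ingested data on a
# schedule (e.g. a nightly Databricks job).
spark.sql("OPTIMIZE main.sales.orders")
```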
- 1004 Views
- 2 replies
- 0 kudos
50%-off Databricks certification voucher
Hello Databricks Community Team, I am reaching out to inquire about the Databricks certification voucher promotion for completing the Databricks Learning Festival (Virtual) courses. I completed one of the Databricks Learning Festival courses in July 2024...
Thanks for participating in the learning festival! We will be distributing the coupons via email after the event has concluded. You can expect to receive the email by early August.
- 397 Views
- 2 replies
- 0 kudos
Setting up a proxy for python notebook
Hi all, I am running a Python notebook with a web scraper. I want to set up a proxy server that I can use to avoid any IP bans when scraping. Can someone recommend a way to set up a proxy server that can be used for HTTP requests sent from a Databricks...
Hi Kaniz, thanks for the reply. I know how to include HTTP proxies in my Python code and redirect the requests. However, I wondered if Databricks has any functionality to set up the proxies? Thank you.
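One option, sketched below under the assumption that outbound traffic from the cluster can reach the proxy, is to export the standard proxy environment variables (these can also be set as cluster environment variables) so HTTP clients pick them up; `http://my-proxy.example.com:3128` is a placeholder endpoint:

```python
import os
import requests

# Hedged sketch: route outbound HTTP(S) traffic through a proxy, either via the
# standard environment variables (honoured by requests and most clients) or per
# request. The proxy URL is a placeholder, not a real endpoint.
PROXY = "http://my-proxy.example.com:3128"
os.environ["HTTP_PROXY"] = PROXY
os.environ["HTTPS_PROXY"] = PROXY

resp = requests.get(
    "https://example.com",
    proxies={"http": PROXY, "https": PROXY},
    timeout=30,
)
print(resp.status_code)
```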
- 245 Views
- 1 reply
- 0 kudos
Unity Catalog cannot display(), but can show() table
Hello all, I'm facing the following issue in a newly set up Azure Databricks - Unity Catalog environment: Failed to store the result. Try rerunning the command. Failed to upload command result to DBFS. Error message: PUT request to create file error Http...
Hi @CharlesDLW , You have similar use case to the one below. Follow my reply in that thread: https://community.databricks.com/t5/community-discussions/file-found-with-fs-ls-but-not-with-spark-read/m-p/78618/highlight/true#M5972
- 650 Views
- 3 replies
- 2 kudos
Resolved! jdbc errors when parameter is a boolean
I'm trying to query a table from Java code. The query works when I use a Databricks notebook / query editor directly in Databricks. However, when using JDBC with Spring, I get the following stacktrace: org.springframework.jdbc.UncategorizedSQLException:...
As I see it, there are two things: jdbcTemplate converts boolean to bit. This is according to the JDBC specs (this is a "spring-jdbc" thing and according to documentation; the jdbcTemplate.queryForList makes the best possible guess of the desired type). Data...
- 328 Views
- 1 reply
- 0 kudos
Unexpected response from server during a HTTP connection: authorize: cannot authorize peer.
Hi all, when attempting to connect to Databricks with Spark ODBC using the regular host IP and port, everything is successful. However, we need to send the connection through an internal proxy service that re-maps the server's endpoint to a local port...
Hi @valefar, Firstly, ensure your connection settings correctly map the server's endpoint to 'localhost' and the appropriate port number through the proxy service. Double-check your connection string or configuration to align with Databricks workspac...
- 317 Views
- 1 reply
- 0 kudos
Databricks/Terraform - Error while creating workspace
Hi - I have the below code to create the credentials, storage, and workspace through a Terraform script, but only the credentials and storage are created; it failed to create the workspace with an error. Can someone please guide/suggest what's wrong with the code/l...
Hi @Shrinivas, could you share with us how you configured the Databricks provider?
- 458 Views
- 2 replies
- 0 kudos
Variables in databricks.yml "include:" - Asset Bundles
Hi, we've got an app that we deploy to multiple customers' workspaces. We're looking to transition to asset bundles. We would like to structure our resources like: -src/ -resources/ |-- customer_1/ |-- job_1 |-- job_2 |-- customer_2/ |-- job_...
Interesting use case! Ideally, having a separate bundle for each customer seems like a clean solution. But if you don't want that, then you can just include all the YAML files in databricks.yml with include: - resources/*/*.yml Inside the YAML files...