Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

amit119
by New Contributor II
  • 363 Views
  • 0 replies
  • 0 kudos

Not able to access partner-academy

Hi, I used my company email to register an account on customer-academy.databricks.com a while back. Now I need to create an account on partner-academy.databricks.com using my company email too. However, when I register at partner-acade...

Ravikumashi
by Contributor
  • 1975 Views
  • 3 replies
  • 0 kudos

Extract cluster usage tags from databricks cluster init script

Is it possible to extract cluster usage tags from a Databricks cluster init script? I am specifically interested in spark.databricks.clusterUsageTags.clusterAllTags. I tried to extract it from /databricks/spark/conf/spark.conf and /databricks/spark/conf/sp...

Data Engineering
Azure Databricks
Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, for reference: https://community.databricks.com/t5/data-engineering/pull-cluster-tags/td-p/19216. Could you please confirm the exact expectation here? Is it extraction along those lines?

2 More Replies
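For the thread above, a minimal sketch of how an init script might pull the tags out of a Spark conf file. This is an assumption-heavy illustration: the conf path and the line format ("key value" vs "key = value") vary by runtime version, and the sample text below is hypothetical, so both separators are tolerated.

```python
import json

# Hypothetical conf content; in an init script you would read the real file,
# e.g. /databricks/spark/conf/spark.conf (whether the key appears there
# depends on the runtime).
SAMPLE_CONF = (
    'spark.master local[*]\n'
    'spark.databricks.clusterUsageTags.clusterAllTags '
    '[{"key":"Vendor","value":"Databricks"},{"key":"Creator","value":"user@example.com"}]\n'
)

def extract_all_tags(conf_text: str) -> dict:
    """Return clusterAllTags as a plain dict, or {} if the key is absent."""
    key = "spark.databricks.clusterUsageTags.clusterAllTags"
    for line in conf_text.splitlines():
        line = line.strip()
        if line.startswith(key):
            payload = line[len(key):].strip()
            if payload.startswith("="):          # tolerate "key = value" form
                payload = payload[1:].strip()
            pairs = json.loads(payload)          # JSON array of {key, value} objects
            return {p["key"]: p["value"] for p in pairs}
    return {}

print(extract_all_tags(SAMPLE_CONF))
```

The JSON-array-of-pairs shape shown here matches how clusterAllTags is commonly serialized, but verify against the actual file on your cluster before relying on it.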
hemanthtirumala
by New Contributor II
  • 358 Views
  • 0 replies
  • 0 kudos

Free voucher worth $200 at upcoming events? Please send me a note about it

I need info about any upcoming events where Databricks will provide a free voucher for the Azure platform architect exam. If anyone knows the timing, or has a hunch about it, please ping me the details. I will stay tuned.

mo_moattar
by New Contributor III
  • 3293 Views
  • 2 replies
  • 1 kudos

Does anyone know how to use the Python logger in a Databricks Python job on serverless?

I'm trying to use the standard Python logging framework in Databricks jobs instead of print. I'm doing this by using spark._jvm.org.apache.log4j.LogManager.getLogger(__name__). However, as I'm running this on serverless, I get the following error: [J...

Latest Reply
mo_moattar
New Contributor III
  • 1 kudos

I did the same thing; however, the logs don't show up in the task execution output, so I took a different approach: import logging class LoggerBuilder: def __init__(self, log_level: int = logging.INFO) -> None: self.logger = logging.getLogger() self.l...

1 More Replies
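A sketch completing the truncated LoggerBuilder snippet in the reply above. Everything beyond `__init__` is a guess at the poster's intent: configure the root logger with a stdout stream handler so records surface in job output without the JVM log4j bridge, which is unavailable on serverless.

```python
import logging
import sys

class LoggerBuilder:
    """Hypothetical completion of the truncated snippet: build a root logger
    that writes formatted records to stdout."""

    def __init__(self, log_level: int = logging.INFO) -> None:
        self.logger = logging.getLogger()
        self.log_level = log_level

    def build(self) -> logging.Logger:
        self.logger.setLevel(self.log_level)
        if not self.logger.handlers:  # avoid stacking handlers on re-runs
            handler = logging.StreamHandler(sys.stdout)
            handler.setFormatter(
                logging.Formatter("%(asctime)s %(levelname)s %(name)s: %(message)s")
            )
            self.logger.addHandler(handler)
        return self.logger

log = LoggerBuilder().build()
log.info("logging initialised")
```

The handler guard matters in notebooks and repeated task runs, where re-executing setup code would otherwise duplicate every log line.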
ksenija
by Contributor
  • 1763 Views
  • 5 replies
  • 5 kudos

DLT pipeline - SCD type 2

I created my table using SCD Type 2 in SQL. I need to do a full refresh to load all of the data. Whenever I update data in my source table, in my new table scd_target I see only the latest record; history is not being saved. CREATE OR REFRESH STREAMING...

Latest Reply
Rishabh-Pandey
Esteemed Contributor
  • 5 kudos

Hi @ksenija, I got your use case, but could you please tell me what you mean by "sources_test.sources_test.source"?

4 More Replies
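For reference on the thread above, a minimal SQL sketch of the SCD Type 2 shape in a DLT pipeline. The table and column names (`scd_target`, `source_db.source_tbl`, `id`, `updated_at`) are placeholders, not the poster's actual objects; note also that a full refresh rebuilds the target and discards previously tracked history.

```sql
-- Minimal sketch: SCD Type 2 via APPLY CHANGES in a DLT pipeline.
-- Assumes a streaming source with key column `id` and ordering
-- column `updated_at` (both hypothetical).
CREATE OR REFRESH STREAMING TABLE scd_target;

APPLY CHANGES INTO live.scd_target
FROM STREAM(source_db.source_tbl)
KEYS (id)
SEQUENCE BY updated_at
STORED AS SCD TYPE 2;
```

If only the latest record ever appears, it is worth checking that updates actually arrive as new change rows in the stream rather than as in-place rewrites the pipeline never sees.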
Jimolofsson
by New Contributor II
  • 677 Views
  • 2 replies
  • 0 kudos

API key deletions

On Thursday the 15th of August I noticed that several API keys across all of the accounts I run on Databricks had been removed. They had no expiry date, and some accounts were on an Azure-hosted workspace and some on AWS. I would like to hear if you experi...

Latest Reply
Jimolofsson
New Contributor II
  • 0 kudos

@Retired_mod Yes, this is one of the accounts that has had the token removed. I sent an e-mail about the other account. @ChendaZhangDB Thank you for the idea; this could be the case for 2 of the tokens that were removed. However, I also had 2 tokens removed t...

1 More Replies
hrushi512
by New Contributor II
  • 1129 Views
  • 1 reply
  • 1 kudos

Resolved! External Table on Databricks using DBT(Data Build Tool) Models

How can we create external tables in Databricks using DBT Models?

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 1 kudos

Hi @hrushi512, you can try the location_root config parameter, as they did here: https://discourse.getdbt.com/t/add-location-to-create-database-schema-statement-in-databricks-to-enable-creation-of-managed-tables-on-external-storage-accounts/6894

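To illustrate the location_root suggestion from the reply above, a hypothetical dbt model file. The model name, source, and storage URI are all placeholders; with a LOCATION emitted in the DDL, the resulting table lands on external storage rather than in managed storage.

```sql
-- models/my_external_table.sql (hypothetical model)
-- location_root makes dbt add a LOCATION clause to the CREATE TABLE,
-- so the table's data is written under the given external path.
{{ config(
    materialized = 'table',
    location_root = 'abfss://container@account.dfs.core.windows.net/external/'
) }}

select * from {{ source('raw', 'events') }}
```

Whether the result counts as an external or a managed-on-external-location table depends on your catalog setup, so verify the table type after the first run.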
Balazs
by New Contributor III
  • 9052 Views
  • 1 reply
  • 2 kudos

Unity Catalog Volume as spark checkpoint location

Hi, I tried to set the Spark checkpoint location in a notebook to a folder in a Unity Catalog Volume with the following command: sc.setCheckpointDir("/Volumes/catalog_name/schema_name/volume_name/folder_name"). Unfortunately I receive the following err...

Latest Reply
Erp12
New Contributor II
  • 2 kudos

I am facing the same issue on DBR 14.3 and the beta of 15.4. My cluster is using the "Unrestricted" policy and "Single user" access mode set to a user which has permission to read and write to the volume. I tested the permissions by writing a small dataf...

ash_pal
by New Contributor II
  • 561 Views
  • 2 replies
  • 0 kudos

Issue with DLT Pipelines

Hi Team, we are trying to create a DLT pipeline. The scenario is this: we already have a catalog in Unity Catalog, and under it a schema called test which contains 17 tables. Now we are trying to create a DLT pipeline and copy the data from those ...

Latest Reply
ash_pal
New Contributor II
  • 0 kudos

Hi Jessy, thanks for the reply. Please find the error message below.

1 More Replies
ThisNoName
by New Contributor III
  • 761 Views
  • 2 replies
  • 5 kudos

Resolved! How to query existing storage and network configuration (AWS)?

Trying to provision a simple workspace. All the code I can find looks something like the following, where credential/storage/network resources are created and then referenced. In my case it's a separate repo, and I'm trying to reuse existing configurations. I...

Latest Reply
raphaelblg
Databricks Employee
  • 5 kudos

Hi @ThisNoName, based on your description it looks like you're trying to get Databricks account-level information for networks and storage configurations. You can achieve that through the Databricks Account API. Here are the docs: - Get all ...

1 More Replies
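Following the Account API pointer in the reply above, a small sketch of the endpoint shapes involved. The account ID is a placeholder; in practice you would GET each URL with account-admin credentials (e.g. a bearer token) and reference the returned IDs from your provisioning code instead of recreating the objects.

```python
# Sketch of listing existing account-level objects (AWS accounts host the
# Account API at accounts.cloud.databricks.com).
BASE = "https://accounts.cloud.databricks.com"

def account_url(account_id: str, resource: str) -> str:
    """Build the Account API endpoint for a resource such as
    'networks', 'storage-configurations', or 'credentials'."""
    return f"{BASE}/api/2.0/accounts/{account_id}/{resource}"

# GET these with an account-admin token to inventory what already exists:
for resource in ("networks", "storage-configurations", "credentials"):
    print(account_url("my-account-id", resource))
```

Only the URL construction is shown; the actual HTTP call and auth flow are left out since they depend on how your tooling stores account credentials.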

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.
