Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.
Data + AI Summit 2024 - Data Engineering & Streaming

Forum Posts

hemanthtirumala
by New Contributor II
  • 306 Views
  • 0 replies
  • 0 kudos

Is there a free voucher worth $200 at the upcoming events? Please send me a note about it

I need info about any upcoming events at which Databricks will provide a free voucher for the Azure platform architect exam. If anyone knows the timing, or has a hunch about it, please ping me the details. I will stay tuned.

mo_moattar
by New Contributor III
  • 1805 Views
  • 2 replies
  • 1 kudos

Does anyone know how to use the Python logger in a Databricks Python job on serverless?

I'm trying to use the standard Python logging framework in Databricks jobs instead of print. I'm doing this by using spark._jvm.org.apache.log4j.LogManager.getLogger(__name__); however, as I'm running this on serverless, I get the following error [J...

Latest Reply
mo_moattar
New Contributor III
  • 1 kudos

I did the same thing, however the logs don't show up in the execution of the tasks, so I took a different approach:

import logging

class LoggerBuilder:
    def __init__(self, log_level: int = logging.INFO) -> None:
        self.logger = logging.getLogger()
        self.l...

1 More Replies
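For reference, a minimal sketch of a pure-stdlib logger that writes to stdout, assuming the [J... error above means the JVM bridge (spark._jvm) is unavailable on serverless compute; the logger name and format are illustrative:

import logging
import sys

def get_logger(name: str = "my_job", level: int = logging.INFO) -> logging.Logger:
    # Standard-library logger writing to stdout, so messages appear in the
    # job run output without touching the JVM (spark._jvm) at all.
    logger = logging.getLogger(name)
    logger.setLevel(level)
    if not logger.handlers:  # avoid stacking handlers on notebook re-runs
        handler = logging.StreamHandler(sys.stdout)
        handler.setFormatter(logging.Formatter(
            "%(asctime)s %(levelname)s %(name)s: %(message)s"))
        logger.addHandler(handler)
    return logger

logger = get_logger()
logger.info("task started")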
ksenija
by Contributor
  • 1260 Views
  • 5 replies
  • 5 kudos

DLT pipeline - SCD type 2

I created my table using SCD type 2 in SQL. I need to do a full refresh to load all of the data. Whenever I update data in my source table, in my new table scd_target I see only the latest record; history is not being saved. CREATE OR REFRESH STREAMING...

Latest Reply
Rishabh-Pandey
Esteemed Contributor
  • 5 kudos

Hi @ksenija, I got your use case, but could you please tell me what you mean by "sources_test.sources_test.source"?

4 More Replies
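A minimal Python sketch of the equivalent DLT change feed; the key and sequencing columns (id, updated_at) are assumptions, and the important part is stored_as_scd_type=2, since the default of 1 keeps only the latest row, which matches the symptom described above:

import dlt
from pyspark.sql.functions import col

# Target streaming table that will hold the SCD type 2 history.
dlt.create_streaming_table("scd_target")

dlt.apply_changes(
    target="scd_target",
    source="sources_test.sources_test.source",  # source name as quoted in the thread
    keys=["id"],                                # assumed primary key column
    sequence_by=col("updated_at"),              # assumed ordering column
    stored_as_scd_type=2,                       # 2 keeps full history; 1 keeps only the latest row
)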
ksenija
by Contributor
  • 208 Views
  • 0 replies
  • 0 kudos

Log data from reports in PowerBI

Where can I find log data from Power BI? I need to find which tables are being used by my Power BI reports that point to Databricks. I tried system.access.audit, but I'm not finding new data when I refresh my report.

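A hedged sketch of one way to look for table reads in the audit log; the service/action names and the full_name_arg request parameter are assumptions to verify against the system.access.audit schema in your workspace:

# Recent Unity Catalog table reads, newest first.
df = spark.sql("""
    SELECT event_time,
           user_identity.email AS user,
           request_params['full_name_arg'] AS table_name
    FROM system.access.audit
    WHERE service_name = 'unityCatalog'
      AND action_name = 'getTable'
      AND event_date >= current_date() - INTERVAL 7 DAYS
    ORDER BY event_time DESC
""")
display(df)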
Jimolofsson
by New Contributor II
  • 585 Views
  • 2 replies
  • 0 kudos

API key deletions

On Thursday the 15th of August I noticed that several API keys across all of the accounts I run on Databricks had been removed. They had no expiry date, and some accounts were on an Azure-hosted workspace and some on AWS. I would like to hear if you experi...

Latest Reply
Jimolofsson
New Contributor II
  • 0 kudos

@Retired_mod Yes, this is one of the accounts that has had the token removed. I sent an e-mail about the other account. @ChendaZhangDB Thank you for the idea; this could be the case for 2 tokens that were removed. However, I also had 2 tokens removed t...

1 More Replies
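For auditing what remains, a small sketch using the Databricks SDK to list the workspace's personal access tokens; field names follow the SDK's TokenInfo dataclass, and an expiry_time of -1 conventionally means no expiry:

from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # auth from the environment or ~/.databrickscfg
for t in w.token_management.list():
    print(t.token_id, t.created_by_username, t.creation_time, t.expiry_time)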
hrushi512
by New Contributor II
  • 719 Views
  • 1 replies
  • 1 kudos

Resolved! External Table on Databricks using DBT (Data Build Tool) Models

How can we create external tables in Databricks using DBT Models?

Latest Reply
szymon_dybczak
Contributor III
  • 1 kudos

Hi @hrushi512, you can try the location_root config parameter, as they did here: https://discourse.getdbt.com/t/add-location-to-create-database-schema-statement-in-databricks-to-enable-creation-of-managed-tables-on-external-storage-accounts/6894

Balazs
by New Contributor III
  • 8582 Views
  • 1 replies
  • 2 kudos

Unity Catalog Volume as spark checkpoint location

Hi, I tried to set the Spark checkpoint location in a notebook to a folder in a Unity Catalog Volume, with the following command: sc.setCheckpointDir("/Volumes/catalog_name/schema_name/volume_name/folder_name"). Unfortunately I receive the following err...

Latest Reply
Erp12
New Contributor II
  • 2 kudos

I am facing the same issue on DBR 14.3 and the beta of 15.4. My cluster is using the "Unrestricted" policy and "Single user" access mode, set to a user which has permission to read and write to the volume. I tested the permissions by writing a small dataf...

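While RDD checkpointing via sc.setCheckpointDir is the call that fails here, Structured Streaming checkpoints do accept Unity Catalog Volume paths on recent runtimes; a sketch with illustrative table and path names:

(spark.readStream
    .table("catalog_name.schema_name.source_table")     # illustrative source
    .writeStream
    .option("checkpointLocation",
            "/Volumes/catalog_name/schema_name/volume_name/checkpoints/run1")
    .trigger(availableNow=True)
    .toTable("catalog_name.schema_name.target_table"))  # illustrative target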
ash_pal
by New Contributor II
  • 468 Views
  • 2 replies
  • 0 kudos

Issue with DLT Pipelines

Hi team, we are trying to create a DLT pipeline. The scenario is this: we already have a catalog in Unity Catalog, and under it a schema called test containing 17 tables. Now we are trying to create a DLT pipeline and copy the data from those ...

Latest Reply
ash_pal
New Contributor II
  • 0 kudos

Hi Jessy, thanks for the reply. Please find the error message below.

1 More Replies
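For copying many existing tables into a pipeline, a common pattern is to generate one DLT table per source table in a loop; a sketch with illustrative catalog and schema names:

import dlt

# One streaming DLT table per table found in the source schema.
tables = [r.tableName for r in spark.sql("SHOW TABLES IN my_catalog.test").collect()]

def define_table(name: str):
    @dlt.table(name=name)
    def _copy():
        return spark.readStream.table(f"my_catalog.test.{name}")

for t in tables:
    define_table(t)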
ThisNoName
by New Contributor III
  • 623 Views
  • 2 replies
  • 5 kudos

Resolved! How to query existing storage and network configuration (AWS)?

I'm trying to provision a simple workspace. All the code I can find looks something like the following, where credential/storage/network resources are created and then referenced. In my case, it's a separate repo and I'm trying to reuse existing configurations. I...

Latest Reply
raphaelblg
Databricks Employee
  • 5 kudos

Hi @ThisNoName, based on your description it looks like you're trying to get Databricks account-level information for networks and storage configurations. You can easily achieve that through the Databricks account API. Here are the docs: - Get all ...

1 More Replies
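The same account-level lookups through the Databricks SDK, as a sketch; it assumes the account host, account ID, and credentials are configured in the environment, and the field names follow the SDK's dataclasses:

from databricks.sdk import AccountClient

acct = AccountClient()

# List existing credential, storage, and network configurations (AWS accounts).
for c in acct.credentials.list():
    print("credential:", c.credentials_id, c.credentials_name)
for s in acct.storage.list():
    print("storage:", s.storage_configuration_id, s.storage_configuration_name)
for n in acct.networks.list():
    print("network:", n.network_id, n.network_name)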
data-enthu
by New Contributor II
  • 340 Views
  • 0 replies
  • 0 kudos

Accessing dbt artifacts, runs, and tests from a Databricks workflow using an automated script

I am running dbt on a Databricks job. It saves all documentation (manifest.json, run_results.json, etc.) under "Download Artifacts" in the job. I am not able to find a way to read those in code, transform them, and save them on Databricks. I tried the Jobs API. The arti...

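One workaround sketch, under the assumption that the dbt task writes its artifacts to the usual target/ directory: copy them to a Unity Catalog Volume at the end of the task, then parse them from any notebook or job (paths are illustrative):

import json
import shutil

# Run inside the dbt task after `dbt run`, so target/ exists locally.
shutil.copy("target/run_results.json",
            "/Volumes/main/dbt_artifacts/runs/run_results.json")

with open("/Volumes/main/dbt_artifacts/runs/run_results.json") as f:
    run_results = json.load(f)

# run_results.json holds one entry per executed node (model, test, ...).
for result in run_results["results"]:
    print(result["unique_id"], result["status"], result.get("execution_time"))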
Jay_Kay
by New Contributor
  • 336 Views
  • 1 replies
  • 0 kudos

Databricks Workflow Error

Hi Community, my workflow has been running smoothly since it was created, but for the past week I have been getting this error. I have tried different methods and documentation, but nothing seems to work. All the different jobs in my workflow get this ...

Latest Reply
jessysantos
Databricks Employee
  • 0 kudos

Hello @Jay_Kay, could you please attempt to create a table from this view and re-run your job to verify whether it works? Additionally, please ensure that you persist the view dependencies as tables as well. Best regards, Jéssica Santos

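A sketch of the suggestion above, materializing the view as a table so the job reads the table instead (names are illustrative):

# Persist the view's current result as a managed table.
spark.sql("""
    CREATE OR REPLACE TABLE my_catalog.my_schema.my_table AS
    SELECT * FROM my_catalog.my_schema.my_view
""")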
seeker
by New Contributor II
  • 697 Views
  • 2 replies
  • 1 kudos

Get metadata of files present in a zip

I have a .zip file on an ADLS path which contains multiple files of different formats. I want to get metadata of the files in it, like file name and modification time, without unzipping it. I have code which works for smaller zips but run...

Latest Reply
seeker
New Contributor II
  • 1 kudos

Here is the code which I am using:

import io
import zipfile

def register_udf():
    def extract_file_metadata_from_zip(binary_content):
        metadata_list = []
        with io.BytesIO(binary_content) as bio:
            with zipfile.ZipFile(bio, "r") as zip_ref:
                ...

1 More Replies
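Since zipfile only needs the central directory at the end of the archive, opening the zip by a seekable path avoids pulling the whole binary into memory the way the BytesIO approach above does; a sketch with an illustrative Volume path:

import zipfile
from datetime import datetime

def zip_member_metadata(path):
    # Listing members reads only the central directory; nothing is
    # extracted or decompressed.
    with zipfile.ZipFile(path, "r") as zf:
        for info in zf.infolist():
            yield info.filename, datetime(*info.date_time), info.file_size

for name, modified, size in zip_member_metadata("/Volumes/cat/sch/vol/archive.zip"):
    print(name, modified, size)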

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.
