Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

mikep
by New Contributor II
  • 6446 Views
  • 4 replies
  • 0 kudos

Resolved! Kubernetes or ZooKeeper for HA?

Hello. I am trying to understand High Availability in Databricks. I understand that Databricks uses Kubernetes as the cluster manager and to manage Docker containers. And while Databricks runs on top of AWS, Azure, or GCP, is HA automatically provisioned when I st...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

3 More Replies
george2020
by New Contributor II
  • 1422 Views
  • 0 replies
  • 2 kudos

Using the Databricks Repos API to bring Repo in top-level production folder to latest version

I am having an issue with a GitHub Actions workflow using the Databricks Repos API. We want the API call in the GitHub Action to bring the Repo in our Databricks Repos top-level folder to the latest version on a merge into the main branch. The GitHub Actio...

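The call this workflow needs is the Repos API update endpoint (`PATCH /api/2.0/repos/{repo_id}`), which checks the repo out to the head of a branch. A minimal sketch, assuming placeholder workspace host, token, and repo ID supplied from the workflow's secrets:

```python
# Sketch of updating a Databricks Repo to the head of main via the Repos API,
# e.g. from a GitHub Actions step after a merge. The host, token, and repo ID
# below are placeholders you would supply from your own workspace and secrets.
import json
import urllib.request

def build_repo_update_request(host: str, token: str, repo_id: str, branch: str):
    """Build a PATCH request for /api/2.0/repos/{repo_id} checking out `branch`."""
    url = f"{host}/api/2.0/repos/{repo_id}"
    body = json.dumps({"branch": branch}).encode("utf-8")
    req = urllib.request.Request(url, data=body, method="PATCH")
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Content-Type", "application/json")
    return req

# In the workflow step you would then send it:
# resp = urllib.request.urlopen(build_repo_update_request(
#     "https://<your-workspace>.cloud.databricks.com", token, repo_id, "main"))
```

The repo ID for a given path can be looked up first with `GET /api/2.0/repos`.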
RicksDB
by Contributor III
  • 5627 Views
  • 3 replies
  • 6 kudos

Resolved! Restricting file upload to DBFS

Hi, Is it possible to restrict file uploads to DBFS root (since everyone has access)? The idea is to force users to use an ADLS2 mount with credential passthrough for security reasons. Also, right now users use Azure blob explorer to interact with ADLS2...

Latest Reply
User16764241763
Honored Contributor
  • 6 kudos

Hello @E H, you can disable the DBFS file browser in the workspace if users upload directly from there. This will prevent uploads to DBFS. https://docs.databricks.com/administration-guide/workspace/dbfs-browser.html Please let us know if this solution wo...

2 More Replies
wyzer
by Contributor II
  • 4598 Views
  • 2 replies
  • 2 kudos

Resolved! Insert data into an on-premise SQL Server

Hello, Is it possible to insert data from Databricks into an on-premise SQL Server? Thanks.

Latest Reply
wyzer
Contributor II
  • 2 kudos

Hello, Yes, we found out how to do it by installing a JDBC connector. It works fine. Thanks.

1 More Replies
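The accepted fix was a JDBC connector; a minimal PySpark sketch of such a write, with hypothetical host, database, and table names (the cluster also needs network connectivity to the on-premise server, and the SQL Server JDBC driver on its classpath):

```python
# Sketch of writing a DataFrame to an on-premise SQL Server over JDBC, along
# the lines of the accepted answer. All connection values are placeholders.

def sqlserver_jdbc_url(host: str, port: int, database: str) -> str:
    """Build a SQL Server JDBC URL."""
    return f"jdbc:sqlserver://{host}:{port};databaseName={database}"

def write_to_sqlserver(df, host, port, database, table, user, password):
    """Append a Spark DataFrame to a table on the on-premise server."""
    (df.write
       .format("jdbc")
       .option("url", sqlserver_jdbc_url(host, port, database))
       .option("dbtable", table)
       .option("user", user)
       .option("password", password)
       .mode("append")
       .save())
```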
Soma
by Valued Contributor
  • 4435 Views
  • 3 replies
  • 5 kudos

Resolved! Enable custom IPython extension

How to enable a custom IPython extension on Databricks notebook start

Latest Reply
Soma
Valued Contributor
  • 5 kudos

I want to load custom extensions which I create, like custom callback events on cell run: https://ipython.readthedocs.io/en/stable/config/callbacks.html

2 More Replies
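The IPython events API the poster links to supports exactly this. A minimal sketch of registering a `post_run_cell` callback (safe to run outside IPython, where registration is skipped):

```python
# Sketch: registering a custom IPython callback that fires after each cell run,
# as described in the linked IPython events documentation. The callback body
# here is just an illustration.

def log_cell_done(result):
    # `result` is IPython's ExecutionResult for the cell that just ran
    print(f"cell finished, success={result.success}")

try:
    ip = get_ipython()  # defined inside IPython / notebook sessions
    ip.events.register("post_run_cell", log_cell_done)
except NameError:
    pass  # not running under IPython; nothing to register
```

To run this at notebook start, the same registration can live in an init script-installed IPython profile or an early notebook cell.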
emanuele_maffeo
by New Contributor III
  • 4880 Views
  • 5 replies
  • 8 kudos

Resolved! Trigger.AvailableNow on scala - compile issue

Hi everybody, Trigger.AvailableNow was released with the Databricks 10.1 runtime and we would like to use this new feature with Auto Loader. We write all our data pipelines in Scala and our projects import Spark as a provided dependency. If we try to sw...

Latest Reply
Anonymous
Not applicable
  • 8 kudos

You can switch to Python. Depending on what you're doing, and if you're using UDFs, there shouldn't be any difference at all in terms of performance.

4 More Replies
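For reference, the Python route the reply suggests looks roughly like this with Auto Loader. Paths and the table name are placeholders, and `trigger(availableNow=True)` assumes a runtime where that Python option is available:

```python
# Sketch of an Auto Loader stream using the availableNow trigger from Python,
# per the suggestion above. source_path, checkpoint_path, and target_table are
# placeholders for your own locations.

def start_available_now_stream(spark, source_path, checkpoint_path, target_table):
    df = (spark.readStream
              .format("cloudFiles")
              .option("cloudFiles.format", "json")
              .option("cloudFiles.schemaLocation", checkpoint_path)
              .load(source_path))
    return (df.writeStream
              .option("checkpointLocation", checkpoint_path)
              .trigger(availableNow=True)   # process all available data, then stop
              .toTable(target_table))
```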
alonisser
by Contributor II
  • 3547 Views
  • 3 replies
  • 4 kudos

Resolved! How to migrate an existing workspace for an external metastore

Currently we're on an Azure Databricks workspace we set up during the POC, a long time ago. In the meanwhile we have built quite a production workload on top of Databricks. Now we want to split workspaces: one for analysts and one for data engineeri...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 4 kudos

From a Databricks notebook, just run mysqldump. The server address and details you can take from the logs or configuration. I am also including a link to an example notebook: https://docs.microsoft.com/en-us/azure/databricks/kb/_static/notebooks/2016-election-tweets.h...

2 More Replies
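A sketch of the mysqldump step driven from a notebook through Python's subprocess module; the host, user, and database names are placeholders to be taken from your metastore configuration, and mysqldump must be available on the driver:

```python
# Sketch: dump the metastore database from a Databricks notebook, as the reply
# suggests. Connection values are placeholders from your metastore config.
import subprocess

def build_mysqldump_cmd(host, user, database, out_file):
    """Assemble a mysqldump command line writing the dump to out_file."""
    return ["mysqldump", f"--host={host}", f"--user={user}", "-p",
            database, f"--result-file={out_file}"]

# In a notebook cell you would then run, for example:
# subprocess.run(build_mysqldump_cmd("metastore-host", "admin",
#                                    "metastore_db", "/dbfs/tmp/metastore.sql"),
#                check=True)
```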
USHAK
by New Contributor II
  • 1329 Views
  • 1 reply
  • 0 kudos

Hi, I am trying to schedule - Exam: Databricks Certified Associate Developer for Apache Spark 3.0 - Python. In the cart --> I couldn't proceed ...

Hi, I am trying to schedule - Exam: Databricks Certified Associate Developer for Apache Spark 3.0 - Python. In the cart I couldn't proceed without entering a voucher. I do not have a voucher. Please help.

Latest Reply
USHAK
New Contributor II
  • 0 kudos

Can someone please respond to my question above? Can I take the certification exam without a voucher?

Jeff1
by Contributor II
  • 15020 Views
  • 3 replies
  • 4 kudos

Resolved! How to convert lat/long to a geohash in Databricks using the geohashTools R library

I continue to receive a parsing error when attempting to convert lat/long data to a geohash in Databricks. I've tried two coding methods in R and get the same error. library(geohashTools) Method #1: my_tbl$geo_hash <- gh_encode(my_tbl$Latitude, my_tbl...

Latest Reply
Jeff1
Contributor II
  • 4 kudos

The problem was I was trying to run the gh_encode function on a Spark dataframe. I needed to collect the data into an R dataframe and then run the function.

2 More Replies
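The fix above is a general pattern: pull the rows to the driver before applying a function that doesn't understand Spark columns. A Python sketch of the same idea (the thread itself is in R; `apply_locally` and its arguments are hypothetical names):

```python
# Sketch of the collect-then-apply pattern from the accepted answer: bring the
# Spark DataFrame's rows to the driver, then run a plain local function on them.
# Only safe when the data fits in driver memory.

def apply_locally(spark_df, fn, column, out_column):
    """Collect spark_df's rows, apply fn to `column`, store result in out_column."""
    rows = [r.asDict() for r in spark_df.collect()]  # driver-side plain dicts
    for row in rows:
        row[out_column] = fn(row[column])
    return rows
```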
manasa
by Contributor
  • 18655 Views
  • 3 replies
  • 7 kudos

Resolved! How to set retention period for a delta table lower than the default period? Is it even possible?

I am trying to set the retention period for a Delta table using the following commands: deltaTable = DeltaTable.forPath(spark, delta_path) spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false") deltaTable.logRetentionDuration = "interval 1...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 7 kudos

There are two ways: 1) Set it on the cluster (Clusters -> edit -> Spark -> Spark config): spark.databricks.delta.retentionDurationCheck.enabled false 2) Or, just before DeltaTable.forPath, set (I think you need to change the order in your code): spark.conf.se...

2 More Replies
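A sketch of the second approach from the reply, using the Delta table properties for retention; the table name and one-day intervals are placeholders, and note that lowering retention below the default shrinks the window for time travel and concurrent readers:

```python
# Sketch: set shorter Delta retention via table properties, after disabling the
# retention-duration safety check as the reply describes. Table name and
# intervals are placeholders.

def retention_properties_sql(table_name, log_days, deleted_days):
    """ALTER TABLE statement setting the Delta retention table properties."""
    return (
        f"ALTER TABLE {table_name} SET TBLPROPERTIES ("
        f"delta.logRetentionDuration = 'interval {log_days} days', "
        f"delta.deletedFileRetentionDuration = 'interval {deleted_days} days')"
    )

# In a notebook (the conf must be set before the check fires):
# spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")
# spark.sql(retention_properties_sql("my_db.my_table", 1, 1))
```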
AmanSehgal
by Honored Contributor III
  • 5934 Views
  • 5 replies
  • 12 kudos

Resolved! Query delta tables using databricks cluster in near real time.

I'm trying to query Delta tables using a JDBC connector in a Ruby app. I've noticed that it takes around 8 seconds just to connect to the Databricks cluster, and then additional time to run the query. The app is connected to a web portal where users genera...

Latest Reply
User16763506477
Contributor III
  • 12 kudos

Hi @Aman Sehgal, could you please check SQL endpoints? A SQL endpoint uses the Photon engine, which can reduce query processing time, and a Serverless SQL endpoint can accelerate the launch time. More info: https://docs.databricks.com/sql/admin/sql-endpoin...

4 More Replies
zayeem
by New Contributor
  • 3473 Views
  • 1 reply
  • 3 kudos

Resolved! Databricks - Jobs Last run date

Is there a way to get the last run date of a job (or jobs)? I am trying to compile a report and want to see if this output exists either in the Databricks jobs CLI output or via the API.

Latest Reply
AmanSehgal
Honored Contributor III
  • 3 kudos

Sure. Using the Databricks Jobs API you can get this information. Use the following API endpoint to get a list of all the jobs and their executions to date, in descending order. You can pass job_id as a parameter to get the runs of a specific job. https://<databri...

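A sketch of the lookup the reply describes, against `GET /api/2.1/jobs/runs/list`; the host and token are placeholders, and since runs come back newest-first, the first run's `start_time` (epoch milliseconds) is the job's last run date:

```python
# Sketch: fetch the most recent run of a job via the Jobs API runs/list
# endpoint mentioned above. Workspace host and token are placeholders.
import json
import urllib.request

def build_runs_list_request(host, token, job_id, limit=1):
    """Build a GET request for the newest `limit` runs of `job_id`."""
    url = f"{host}/api/2.1/jobs/runs/list?job_id={job_id}&limit={limit}"
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {token}")
    return req

# resp = json.load(urllib.request.urlopen(
#     build_runs_list_request("https://<your-workspace>", token, 42)))
# last_run_ms = resp["runs"][0]["start_time"]   # epoch milliseconds
```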
Anonymous
by Not applicable
  • 1177 Views
  • 0 replies
  • 3 kudos

March Madness + Data  Here at Databricks we like to use (you guessed it) data in our daily lives. Today kicks off a series called Databrags 🎉 ...

March Madness + Data Here at Databricks we like to use (you guessed it) data in our daily lives. Today kicks off a series called Databrags. Databrags are glimpses into how Bricksters and community folks like you use data to solve everyday problems, e...

Abel_Martinez
by Contributor
  • 2474 Views
  • 1 reply
  • 1 kudos

Resolved! Create Databricks service account

Hi all, I need to create service account users who can only query some Delta tables. I guess I do that by creating the user and granting SELECT rights on the desired tables. But Databricks requests a mail account for these users. Is there a way to cr...

Latest Reply
Abel_Martinez
Contributor
  • 1 kudos

Hi @Kaniz Fatma, I've checked the link, but the standard method requires a mailbox, and user creation via the SCIM API looks too complicated. I solved the issue: I created a mailbox for the service account and created the user using that mailbox....

lecardozo
by New Contributor II
  • 6561 Views
  • 5 replies
  • 1 kudos

Resolved! Problems with HiveMetastoreClient and internal Databricks Metastore.

I've been trying to use the HiveMetastoreClient class in Scala to extract some metadata from the Databricks internal metastore, without success. I'm currently using the 7.3 LTS runtime. The error seems to be related to some kind of inconsistency between...

Latest Reply
lecardozo
New Contributor II
  • 1 kudos

Thanks for the reference, @Atanu Sarkar. Seems a little odd to me that I'd need to change the internal Databricks Metastore table to add a column expected by the default Scala client. I'm afraid this could cause issues with other users/jobs ...

4 More Replies
