Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

tomsyouruncle
by New Contributor III
  • 28420 Views
  • 14 replies
  • 3 kudos

How do I enable support for arbitrary files in Databricks Repos? Public Preview feature doesn't appear in admin console.

"Arbitrary files in Databricks Repos", which allows files other than notebooks to be added to repos, is in Public Preview. I've tried to activate it following the instructions in the above link but the option doesn't appear in the Admin Console. Minimum requirements...

Latest Reply
kahing_cheung
Databricks Employee
  • 3 kudos

What environment is your deployment in?

13 More Replies
Sudeshna
by New Contributor III
  • 16081 Views
  • 6 replies
  • 7 kudos

Resolved! I am new to Databricks SQL and want to create a variable which can hold calculations either from static values or from select queries similar to SQL Server. Is there a way to do so?

I was trying to create a variable and I got the following error. Command: SET a = 5; Error: "Error running query: Configuration a is not available."

Latest Reply
BilalAslamDbrx
Databricks Employee
  • 7 kudos

@Sudeshna Bhakat​ what @Joseph Kambourakis​ described works on clusters but is restricted on Databricks SQL endpoints, i.e. only a limited number of SET commands are allowed. I suggest you explore the curly-brace syntax (e.g. {{ my_variable }}) in Databrick...

5 More Replies
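As an illustration of the curly-brace syntax the reply refers to, a Databricks SQL editor query might look like the sketch below. The table and parameter names are made up; the SQL editor renders an input widget for each parameter rather than using SET:

```sql
-- {{ threshold }} is a query parameter; the SQL editor prompts for its value
SELECT order_id, amount
FROM sales
WHERE amount > {{ threshold }}
```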
shelms
by New Contributor II
  • 35219 Views
  • 2 replies
  • 7 kudos

Resolved! SQL CONCAT returning null

Has anyone else experienced this problem? I'm attempting to SQL concat two fields, and if the second field is null, the entire string appears as null. The documentation is unclear on the expected outcome, and it is contrary to how concat_ws operates. SELECT ...

Latest Reply
BilalAslamDbrx
Databricks Employee
  • 7 kudos

CONCAT is a function defined in the SQL standard and available across a wide variety of DBMSs. With the exception of Oracle, which uses VARCHAR2 semantics across the board, the function returns NULL on NULL input. CONCAT_WS() is not standard and is mostl...

1 More Replies
cmotla
by New Contributor III
  • 3103 Views
  • 1 replies
  • 7 kudos

Issue with complex json based data frame select

We are getting the below error when trying to select the nested columns (string type in a struct) even though we don't have more than 1,000 records in the data frame. The schema is very complex and has a few columns of struct type and a few of array typ...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 7 kudos

Please share your code and some example data.

mikep
by New Contributor II
  • 7283 Views
  • 4 replies
  • 0 kudos

Resolved! Kubernetes or ZooKeeper for HA?

Hello. I am trying to understand High Availability in Databricks. I understand that DB uses Kubernetes as the cluster manager and to manage Docker containers. And while DB runs on top of AWS, Azure, or GCP, is HA automatically provisioned when I st...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

3 More Replies
george2020
by New Contributor II
  • 1655 Views
  • 0 replies
  • 2 kudos

Using the Databricks Repos API to bring Repo in top-level production folder to latest version

I am having an issue with a GitHub Actions workflow using the Databricks Repos API. We want the API call in the GitHub Action to bring the Repo in our Databricks Repos top-level folder to the latest version on a merge into the main branch. The GitHub Actio...

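For reference, a minimal sketch of the Repos API call such a workflow step makes: the documented `PATCH /api/2.0/repos/{repo_id}` update, which moves a Repo to the head of a branch. The host, repo id, and branch below are placeholders, and a real call also needs an `Authorization: Bearer <token>` header:

```python
import json
import urllib.request

def build_update_request(host: str, repo_id: int, branch: str) -> urllib.request.Request:
    """Build the PATCH request that pulls a Databricks Repo to a branch's latest commit."""
    url = f"{host}/api/2.0/repos/{repo_id}"
    body = json.dumps({"branch": branch}).encode()
    req = urllib.request.Request(url, data=body, method="PATCH")
    req.add_header("Content-Type", "application/json")
    return req

req = build_update_request("https://example.cloud.databricks.com", 123, "main")
# urllib.request.urlopen(req)  # uncomment (and add the auth header) to actually call
```

In a GitHub Action the same call is usually made with curl or the Databricks CLI; the payload is identical.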
RicksDB
by Contributor III
  • 6559 Views
  • 3 replies
  • 6 kudos

Resolved! Restricting file upload to DBFS

Hi, Is it possible to restrict uploading files to the DBFS root (since everyone has access)? The idea is to force users to use an ADLS2 mount with credential passthrough for security reasons. Also, right now users use Azure Blob Explorer to interact with ADLS2...

Latest Reply
User16764241763
Databricks Employee
  • 6 kudos

Hello @E H​ You can disable the DBFS file browser in the workspace if users upload directly from there. This will prevent uploads to DBFS: https://docs.databricks.com/administration-guide/workspace/dbfs-browser.html Please let us know if this solution wo...

2 More Replies
wyzer
by Contributor II
  • 5437 Views
  • 2 replies
  • 3 kudos

Resolved! Insert data into an on-premise SQL Server

Hello, Is it possible to insert data from Databricks into an on-premise SQL Server? Thanks.

Latest Reply
wyzer
Contributor II
  • 3 kudos

Hello, Yes, we found out how to do it by installing a JDBC connector. It works fine. Thanks.

1 More Replies
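A hedged sketch of what the JDBC write mentioned in the reply can look like from PySpark, assuming the Microsoft SQL Server JDBC driver is on the cluster and there is network connectivity to the on-premise host. Server, database, table, and credentials are placeholders:

```python
# Connection options for a JDBC write to an on-premise SQL Server (all placeholders)
options = {
    "url": "jdbc:sqlserver://onprem-host:1433;databaseName=mydb",
    "dbtable": "dbo.target_table",
    "user": "etl_user",
    "password": "***",  # prefer a Databricks secret scope over a literal
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
}

# On a cluster, with `df` an existing DataFrame:
# df.write.format("jdbc").options(**options).mode("append").save()
```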
Soma
by Valued Contributor
  • 4963 Views
  • 3 replies
  • 5 kudos

Resolved! Enable custom IPython extension

How to enable a custom IPython extension on Databricks notebook start?

Latest Reply
Soma
Valued Contributor
  • 5 kudos

I want to load custom extensions that I create, like custom callback events on cell run: https://ipython.readthedocs.io/en/stable/config/callbacks.html

2 More Replies
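A minimal sketch of such an extension, using IPython's documented events API. The module name `my_extension` is made up; it would be loaded with `%load_ext my_extension` or listed under `c.InteractiveShellApp.extensions` in an `ipython_config.py`:

```python
# my_extension.py -- registers a callback that fires before every cell run
def pre_run_cell(info):
    # `info` is IPython's ExecutionInfo; very old IPython versions call this with no args
    print("about to run:", info.raw_cell)

def load_ipython_extension(ipython):
    ipython.events.register("pre_run_cell", pre_run_cell)

def unload_ipython_extension(ipython):
    ipython.events.unregister("pre_run_cell", pre_run_cell)
```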
emanuele_maffeo
by New Contributor III
  • 6271 Views
  • 5 replies
  • 8 kudos

Resolved! Trigger.AvailableNow on scala - compile issue

Hi everybody, Trigger.AvailableNow was released with the Databricks 10.1 runtime and we would like to use this new feature with Auto Loader. We write all our data pipelines in Scala and our projects import Spark as a provided dependency. If we try to sw...

Latest Reply
Anonymous
Not applicable
  • 8 kudos

You can switch to Python. Depending on what you're doing, and whether you're using UDFs, there shouldn't be any difference at all in terms of performance.

4 More Replies
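In Python the trigger is a keyword argument rather than a Trigger object, so there is no compile-time dependency to manage. A configuration sketch only (paths and format are placeholders, and this runs only on a cluster where the Auto Loader source is available):

```python
# Auto Loader stream drained with the available-now trigger
(spark.readStream.format("cloudFiles")
      .option("cloudFiles.format", "json")
      .load("/mnt/landing/events")
  .writeStream
      .trigger(availableNow=True)
      .option("checkpointLocation", "/mnt/chk/events")
      .start("/mnt/bronze/events"))
```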
alonisser
by Contributor II
  • 4003 Views
  • 3 replies
  • 4 kudos

Resolved! How to migrate an existing workspace for an external metastore

Currently we're on an Azure Databricks workspace we set up during the POC, a long time ago. In the meantime we have built quite a production workload on top of Databricks. Now we want to split workspaces: one for analysts and one for data engineeri...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 4 kudos

From a Databricks notebook just run mysqldump. You can take the server address and details from logs or configuration. I am also including a link to an example notebook: https://docs.microsoft.com/en-us/azure/databricks/kb/_static/notebooks/2016-election-tweets.h...

2 More Replies
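A hedged sketch of the suggested dump from a notebook `%sh` cell. Host, database, and user are placeholders that would come from the cluster's metastore connection config (e.g. the `javax.jdo.option.ConnectionURL` value):

```shell
# Dump the external Hive metastore database so it can be restored for the new workspace
mysqldump --host=metastore-host.mysql.database.azure.com --port=3306 \
          --user=metastore_user --password \
          --single-transaction metastore_db > metastore_dump.sql
```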
USHAK
by New Contributor II
  • 1546 Views
  • 1 replies
  • 0 kudos

Hi, I am trying to schedule - Exam: Databricks Certified Associate Developer for Apache Spark 3.0 - Python. In the cart I couldn't proceed ...

Hi, I am trying to schedule - Exam: Databricks Certified Associate Developer for Apache Spark 3.0 - Python. In the cart I couldn't proceed without entering a voucher. I do not have a voucher. Please help.

Latest Reply
USHAK
New Contributor II
  • 0 kudos

Can someone please respond to my above question? Can I take the certification test without a voucher?

Jeff1
by Contributor II
  • 15860 Views
  • 3 replies
  • 4 kudos

Resolved! How to convert lat/long to geohash in databricks using geohashTools R library

I continue to receive a parsing error when attempting to convert lat/long data to a geohash in Databricks. I've tried two coding methods in R and get the same error. library(geohashTools) Method #1: my_tbl$geo_hash <- gh_encode(my_tbl$Latitude, my_tbl...

Latest Reply
Jeff1
Contributor II
  • 4 kudos

The problem was I was trying to run the gh_encode function on a Spark dataframe. I needed to collect the data into an R dataframe and then run the function.

2 More Replies
manasa
by Contributor
  • 21089 Views
  • 3 replies
  • 7 kudos

Resolved! How to set retention period for a delta table lower than the default period? Is it even possible?

I am trying to set the retention period for a Delta table using the following commands: deltaTable = DeltaTable.forPath(spark, delta_path); spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false"); deltaTable.logRetentionDuration = "interval 1...

Latest Reply
Hubert-Dudek
Databricks MVP
  • 7 kudos

There are two ways: 1) set it on the cluster (Clusters -> Edit -> Spark -> Spark config): spark.databricks.delta.retentionDurationCheck.enabled false, or 2) just before DeltaTable.forPath (I think you need to change the order in your code): spark.conf.se...

2 More Replies
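Putting the reply's second option in order, as a sketch (`delta_path` is a placeholder and on a cluster `spark` already exists). The key point is that the safety check is relaxed before the table is loaded, and the documented way to lower retention persistently is via table properties:

```python
from delta.tables import DeltaTable

# 1) relax the retention safety check first ...
spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")

# 2) ... then load the table and lower its retention via table properties
delta_table = DeltaTable.forPath(spark, delta_path)
spark.sql(f"""
    ALTER TABLE delta.`{delta_path}`
    SET TBLPROPERTIES (
        'delta.logRetentionDuration' = 'interval 1 days',
        'delta.deletedFileRetentionDuration' = 'interval 1 days'
    )
""")
```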
AmanSehgal
by Honored Contributor III
  • 6599 Views
  • 5 replies
  • 12 kudos

Resolved! Query delta tables using databricks cluster in near real time.

I'm trying to query Delta tables using a JDBC connector in a Ruby app. I've noticed that it takes around 8 seconds just to connect to the Databricks cluster, and then additional time to run the query. The app is connected to a web portal where users genera...

Latest Reply
User16763506477
Databricks Employee
  • 12 kudos

Hi @Aman Sehgal​ Could you please check SQL endpoints? A SQL endpoint uses the Photon engine, which can reduce query processing time, and a Serverless SQL endpoint can accelerate the launch time. More info: https://docs.databricks.com/sql/admin/sql-endpoin...

4 More Replies