Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

NC
by New Contributor III
  • 4485 Views
  • 4 replies
  • 0 kudos

GDAL on Databricks Cluster Runtime 12.2 LTS

I need GDAL for my coursework. After reading this post, I used an init script as follows to install GDAL into runtime 12.2 LTS: dbutils.fs.put("/databricks/scripts/gdal_install.sh",""" #!/bin/bash sudo add-apt-repository ppa:ubuntugis/ppa sudo apt-get up...

Latest Reply
Matt_C
New Contributor II
  • 0 kudos

Hi, in case anyone is still struggling here. I found I could not get the init script approach to work, but if I just run a shell command to install gdal at the start of my notebook it works fine. You might note, however, that this installs gdal versi...
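For anyone following along, a minimal sketch of that notebook-level install in a %sh cell, using the ubuntugis PPA from the original post. The final pip step, which pins the Python bindings to the installed native version, is an assumption and may need adjusting per runtime:

    %sh
    # Install GDAL from the ubuntugis PPA at the start of the notebook.
    # Note: %sh runs on the driver only; if workers need GDAL too, an
    # init script is still required.
    sudo add-apt-repository -y ppa:ubuntugis/ppa
    sudo apt-get update
    sudo apt-get install -y gdal-bin libgdal-dev
    # Assumption: match the Python bindings to the native library version
    pip install GDAL==$(gdal-config --version)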

3 More Replies
juliemoore
by New Contributor
  • 1768 Views
  • 1 reply
  • 0 kudos

Problems with Big Data Solutions and Databricks- Any advice?

Hello everyone, I am currently facing several challenges related to big data solutions, particularly with Databricks. As many of you may know, Databricks is a powerful platform for data engineering and analytics, but I have encountered some signif...

Latest Reply
gchandra
Databricks Employee
  • 0 kudos

Your problem statement is too generic. If your company is facing this, you can reach out to your Solutions Architect (SA); they will help you. If it's a personal project, then describe in detail what you are trying: cluster size, what you are trying to integrate with, ...

bvraravind
by New Contributor II
  • 2032 Views
  • 1 reply
  • 0 kudos

Resolved! Unable to access Azure blob storage with SAS token

I am following the Microsoft documentation to connect from a Databricks workspace to Azure Blob Storage, but it is not working. Any help is greatly appreciated. Below is the code: spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows....

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @bvraravind, the error you are encountering is due to an incorrect configuration setting in your code. The error message indicates that the configuration fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net is not recognized. Verify th...
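For reference, SAS access to ABFS needs three related settings together; a minimal sketch following the documented pattern (the storage account, secret scope, and key names are placeholders):

    # Placeholders: <storage-account>, <scope>, <sas-token-key>
    spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "SAS")
    spark.conf.set("fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net",
                   "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
    spark.conf.set("fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net",
                   dbutils.secrets.get(scope="<scope>", key="<sas-token-key>"))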

Tejsharma
by New Contributor
  • 3733 Views
  • 1 reply
  • 0 kudos

Troubleshooting the Error "Credential was not sent or was of an unsupported type for this API"

I previously worked on Databricks Asset Bundles (DAB) using a Service Principal token, and it was successful. However, when I attempted it again now, I encountered an error. Error: failed to compute file content for {{.project_name}}/databricks.yml.tmp...

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

Which type of token are you currently using: an OAuth token or an OBO token? Have you generated a new token for testing?

MC97
by New Contributor
  • 995 Views
  • 1 reply
  • 0 kudos

Update on CTE

So I am porting business logic from on-prem to Azure Databricks. What the on-prem process did is create the table and then update it. I have to construct that as a single query. Example: Create or replace table table1 with CTE1 as (), CTE2 as (selec...

Latest Reply
VZLA
Databricks Employee
  • 0 kudos

An actual "update" may not be possible, but have you considered whether something like this would work for you? It simulates updates within the query without actual UPDATE statements: CREATE OR REPLACE TABLE table1 AS WITH CTE1 AS ( -- Your in...
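A self-contained version of that idea, with hypothetical table and column names:

    # Simulate the on-prem "create then update" as one statement: apply the
    # update logic inside a CTE, then rewrite the table. Names are made up.
    spark.sql("""
        CREATE OR REPLACE TABLE table1 AS
        WITH cte1 AS (
            SELECT id, amount FROM source_table
        ),
        cte2 AS (
            -- the 'update': transform values instead of mutating rows later
            SELECT id, CASE WHEN amount < 0 THEN 0 ELSE amount END AS amount
            FROM cte1
        )
        SELECT * FROM cte2
    """)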

dbx_687_3__1b3Q
by New Contributor III
  • 6030 Views
  • 5 replies
  • 5 kudos

Impersonating a user

How do I impersonate a user? I can't find any documentation that explains how to do this or even hints that it's possible. Use case: I perform administrative tasks like assigning grants and roles to catalogs, schemas, and tables for the benefit of busines...

Latest Reply
NandiniN
Databricks Employee
  • 5 kudos

DB-I-8117 is mentioned as being considered for the future, so adding votes will certainly help.

4 More Replies
JolM
by New Contributor II
  • 1278 Views
  • 1 reply
  • 1 kudos

Resolved! Is there a way for us to see billing usage per catalog?

Is there a way for us to see billing usage per catalog? I'm using the 14-day trial period for now. Would it be available in Premium?

Latest Reply
gchandra
Databricks Employee
  • 1 kudos

The billing system table provides cost by notebook, job, and cluster. If a catalog-to-job/cluster/notebook mapping is maintained, then catalog-based usage can be determined. https://docs.databricks.com/en/admin/system-tables/billing.html
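As a hedged starting point, a query against system.billing.usage grouped by a custom tag; the 'catalog' tag is an assumption you would need to apply to your jobs and clusters yourself:

    # DBUs over the last 30 days, grouped by a hypothetical 'catalog' tag.
    display(spark.sql("""
        SELECT custom_tags['catalog'] AS catalog_tag,
               sku_name,
               SUM(usage_quantity) AS dbus
        FROM system.billing.usage
        WHERE usage_date >= date_sub(current_date(), 30)
        GROUP BY catalog_tag, sku_name
        ORDER BY dbus DESC
    """))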

abueno
by Contributor
  • 9207 Views
  • 3 replies
  • 8 kudos

Resolved! Find and replace

Hi, is there a "Find and replace" option to edit SQL code? I am not referring to the "replace" function, but something similar to Ctrl+Shift+F in Snowflake or Ctrl+F in MS Excel.

Latest Reply
DBKENGR
New Contributor III
  • 8 kudos

Is there an option to find and replace just within a cell instead of the entire notebook?

2 More Replies
hetrasol
by New Contributor III
  • 4555 Views
  • 7 replies
  • 0 kudos

Resolved! Unable to start browser for databricks certification

Hello, I have registered for the Databricks Certified Data Engineer Associate exam. One of the requirements to take the exam is a secure browser. The exam is set for Sunday, 6th October 2024, but the browser installation (PSI Secure Bridge browser) does not work. Reac...

[screenshot attachment: hetrasol_0-1728079492073.png]
Latest Reply
TaiNguyen
New Contributor II
  • 0 kudos

Hi @hetrasol, I'm a Windows user. After installation, I just got the LockDown Browser OEM instead of the PSI browser, as you mentioned above. Can you explain again how to install these browsers?

6 More Replies
ff-paulo-barbos
by New Contributor
  • 2668 Views
  • 2 replies
  • 0 kudos

Spark Remote error when connecting to cluster

Hi, I am using the latest version of PySpark and I am trying to connect to a remote cluster with runtime 13.3. My doubts are: Do I need Databricks Unity Catalog enabled? My cluster is already on a Shared access mode policy, so what other configur...

Latest Reply
Debayan
Databricks Employee
  • 0 kudos

Hi, is your workspace already Unity Catalog enabled? Also, did you go through the considerations for enabling a workspace for Unity Catalog? https://docs.databricks.com/en/data-governance/unity-catalog/enable-workspaces.html#considerations-before-yo...
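For what it's worth, with runtime 13.3 the supported client is Databricks Connect rather than plain PySpark; a minimal connectivity sketch (host, token, and cluster ID are placeholders):

    # Requires: pip install databricks-connect==13.3.*
    from databricks.connect import DatabricksSession

    spark = (DatabricksSession.builder
             .remote(host="https://<workspace-url>",
                     token="<personal-access-token>",
                     cluster_id="<cluster-id>")
             .getOrCreate())

    spark.range(5).show()  # quick end-to-end check against the remote cluster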

1 More Replies
Thms317
by New Contributor III
  • 4153 Views
  • 2 replies
  • 2 kudos

Resolved! Cannot install wheel from Workspace in DLT

Hi all. I am no longer able to install my custom wheel in my DLT pipeline. No matter what configuration I try, I cannot get it to work: parameterized or just hard-coding the path to the wheel. If I run the hard-coded cell with an all-purpose cluster t...

Latest Reply
Thms317
New Contributor III
  • 2 kudos

I managed to fix the issue. The problem was that my wheel was built for Databricks Runtime 14.3 LTS and I was using the PREVIEW channel rather than the CURRENT channel. At the time of writing: CURRENT (default): Databricks Runtime 14.1 --> Python 3.10.12; P...
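For anyone hitting the same mismatch: build the wheel against the Python version of your channel's runtime, then install it at the top of the pipeline notebook; a sketch (the workspace path is a placeholder):

    # %pip at the top of a DLT pipeline notebook installs the wheel on the
    # pipeline's cluster; the path below is a placeholder.
    %pip install /Workspace/Users/<you>/dist/my_package-0.1.0-py3-none-any.whl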

1 More Replies
Arch_dbxlearner
by New Contributor III
  • 6804 Views
  • 5 replies
  • 1 kudos

How to get data from Splunk on daily basis?

I am exploring ways to get data into Databricks from Splunk (similar to other data sources like S3, Kafka, etc.). I have received a suggestion to use the Databricks add-on to get/put data from/to Splunk. To pull the data from Databricks to S...

Get Started Discussions
Databricks add-on
Splunk
Latest Reply
shan_chandra
Databricks Employee
  • 1 kudos

@Arch_dbxlearner - could you please follow this post for more details: https://community.databricks.com/t5/data-engineering/does-databricks-integrate-with-splunk-what-are-some-ways-to-send/td-p/22048

4 More Replies
Phani1
by Valued Contributor II
  • 1089 Views
  • 1 reply
  • 0 kudos

Late file arrivals - Autoloader

Hi all, I have a situation where I'm receiving various CSV files in a storage location. The issue I'm facing is that I'm using Databricks Autoloader, but some files might arrive later than expected. In this case, we need to notify the relevant team ab...

Latest Reply
HaggMan
New Contributor III
  • 0 kudos

Well, Autoloader could work nicely with the notification event for arriving files. You could specify a window duration for your "on-time" arrivals, and that could be your base check for on-time delivery. As files arrive they go to their window, and whe...
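A hedged sketch of the ingestion side of that idea, using the built-in _metadata column to capture each file's modification time so late arrivals can be flagged downstream (paths, options, and the SLA threshold are assumptions):

    from pyspark.sql.functions import col, current_timestamp

    # Auto Loader stream over the CSV landing path, recording when each
    # file was last modified and when the stream saw it.
    files = (spark.readStream
             .format("cloudFiles")
             .option("cloudFiles.format", "csv")
             .option("cloudFiles.schemaLocation", "/mnt/checkpoints/landing_schema")
             .load("/mnt/landing/")
             .withColumn("file_time", col("_metadata.file_modification_time"))
             .withColumn("seen_at", current_timestamp()))

    # Downstream, rows where seen_at - file_time exceeds the agreed SLA
    # window (e.g. 1 hour) can be routed to an alert table for the team.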

dipali_globant
by New Contributor II
  • 880 Views
  • 1 reply
  • 0 kudos

duplicate data published in kafka offset

We have 25k records which are published in batches of 5k. We are numbering the records based on the row_number window function and creating batches using this. We have observed that some records (around 10-20) are getting published duplicated across 2 offsets. Ca...

Latest Reply
agallard
Contributor
  • 0 kudos

Hi @dipali_globant, duplicate data in Kafka can arise in a batch-processing scenario for a few reasons. Here's an example of ensuring unique and consistent row numbering: from pyspark.sql import Window from pyspark.sql.functions import row_number wind...
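A self-contained sketch of that fix (the source table, the key column, and the 5k batch size are assumptions): order by a unique, stable key so row numbers, and therefore batch membership, stay deterministic across retries.

    from pyspark.sql import Window
    from pyspark.sql.functions import row_number, floor, col

    df = spark.table("source_table")  # placeholder source

    # Order by a unique, stable key; a non-deterministic ordering can shuffle
    # rows between batches on retry and publish some records twice.
    w = Window.orderBy(col("id"))
    numbered = (df
                .withColumn("rn", row_number().over(w))
                .withColumn("batch_id", floor((col("rn") - 1) / 5000)))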

prabbalagilead
by New Contributor II
  • 3744 Views
  • 1 reply
  • 0 kudos

How do i find total number of input tokens to genie ?

I am calculating usage analytics for my work, where they use Genie. I have given the following as my Genie definition: (1) instructions, (2) example SQL queries, (3) within the catalog, I went to the relevant table schemas and added comments, descriptio...

Latest Reply
prabbalagilead
New Contributor II
  • 0 kudos

Or is there any set of tables and functions to determine the number of input and output tokens per query?

