Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

abueno
by Contributor
  • 4823 Views
  • 2 replies
  • 0 kudos

Resolved! exclude (not like) filter using pyspark

I am trying to exclude rows containing a specific value when querying with PySpark, but the filter is not working. It's similar to the "NOT LIKE" function in SQL, e.g. not like '%var4%'. The part of the code that is not working is: (col('col4').rlike('var...

Get Started Discussions
Databricks
python
Latest Reply
abueno
Contributor
  • 0 kudos

Worked perfectly. Thank you.

1 More Replies
MaximeGendre
by New Contributor III
  • 2025 Views
  • 3 replies
  • 2 kudos

Resolved! Curl command working in 12.2 but not in 13.3

Hello, one of my teammates is trying to add some observability to our Databricks flows. When he tries to contact our OpenTelemetry server, he gets a timeout. I had a look, and the same command (on the same Databricks workspace) works well with runtime 12....

Latest Reply
MaximeGendre
New Contributor III
  • 2 kudos

Hello, thank you for your answer. I tried updating the library version, and that was almost the solution. I realized it doesn't work in 12.2 either, but the output was quite different and misled me. Probably a network config to set up on the target server.

2 More Replies
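As the resolution suggests, a network config on the target server side is a likely culprit when the same command behaves differently across runtimes. One quick way to separate network egress from curl or library differences is a raw TCP connect test; a sketch, with a hypothetical collector hostname and assuming the usual OTLP/HTTP port 4318:

```python
import socket

def can_reach(host: str, port: int, timeout_s: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout_s.

    True here but a curl timeout points at the HTTP layer (proxy, TLS, headers);
    False points at DNS, firewall, or egress rules from the cluster.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout_s):
            return True
    except OSError:
        return False

# Hypothetical endpoint; run this from a notebook on each runtime to compare
print(can_reach("otel-collector.example.internal", 4318))
```

Running the same check from clusters on both runtimes narrows down whether the difference is network configuration or something in the newer runtime image.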
lux13
by New Contributor
  • 698 Views
  • 0 replies
  • 0 kudos

show document name in AI agent

Hi everyone! I have successfully deployed the Review App of my AI agent following these instructions: Create and log AI agents | Databricks on AWS. However, one question came up regarding the sources (names). To be precise, is there a possibility to sho...

teaholic
by New Contributor III
  • 9517 Views
  • 10 replies
  • 10 kudos

Resolved! how to include checkboxes in markdown cells in databricks notebook

Hi, has anyone else tried to include checkboxes in markdown cells in a Databricks notebook? I believe I followed the correct checkbox syntax, - [ ] and - [x], but the result I got still has no checkboxes. Please help! Thanks! %md #to do: - [x] static vari...

Latest Reply
renancy
New Contributor III
  • 10 kudos

Hi @teaholic, I faced the same problem and found that the ✓ and ✗ notation did work for me. Hope that helps.

9 More Replies
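The workaround from the accepted reply can be sketched as a markdown cell like the following, assuming (as the question reports) that GitHub-style task-list syntax (`- [ ]` / `- [x]`) is not rendered by the notebook's markdown; the list items are placeholders:

```markdown
%md
#### To do:
- ✓ static variables defined
- ✗ tests written
```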
prad18
by New Contributor III
  • 7676 Views
  • 11 replies
  • 6 kudos

Resolved! Unity catalog implementation

Hello Databricks Community, we are in the process of planning a Unity Catalog implementation for our organization, and I'd like to seek input on some architectural decisions. We're considering various approaches to workspace separation, storage accoun...

Latest Reply
filipniziol
Esteemed Contributor
  • 6 kudos

Hi @prad18, I'm glad the previous response was helpful! Let's address your remaining questions: Cost Differences Between Single vs. Multiple Azure Storage Accounts: The cost difference between using a single Azure storage account for both Unity Catalo...

10 More Replies
rafal_walisko
by New Contributor II
  • 603 Views
  • 0 replies
  • 0 kudos

API result filenames have tomorrow's date

Hi, I'm downloading data in chunks using the API api/2.0/preview/sql/queries. Today I realized that the chunks have a different date. For example: results_2024-09-10T10_30_32Z_acf798b5-2a5c-474f-a3b0-b83a2a8eb35a.csv. I'm in the UTC timezone, I know that...

tomos_phillips1
by New Contributor II
  • 9502 Views
  • 9 replies
  • 1 kudos

GeoPandas Install

Hi, I cannot install geopandas in my notebook. I've tried all the generic fixes, pip installs, etc., but always get this error: CalledProcessError: Command 'pip --disable-pip-version-check install geopandas' returned non-zero exit status 1.---...

Latest Reply
shan_chandra
Databricks Employee
  • 1 kudos

@brian999 - Conda is subject to commercial licensing. Referenced here: https://docs.databricks.com/en/archive/legacy/conda.html

8 More Replies
s100rab
by New Contributor
  • 519 Views
  • 0 replies
  • 0 kudos

Free voucher awaited

Hi team, I cleared the datalake fundamentals exam, attended the webinar, and completed the survey (my ID is s100rab@gmail.com). When can I expect the voucher?

philHarasz
by New Contributor III
  • 516 Views
  • 0 replies
  • 0 kudos

Google finds Databricks Community topics but Databricks Community search does not

Every time I ask Google a Databricks question and a link to this community comes up, clicking the link goes right to the topic and I get to read the question and the answer. But then a pop-up appears asking me to log in to the community. Since...

jlagos
by New Contributor II
  • 1322 Views
  • 0 replies
  • 2 kudos

Survey about Databricks

Hi everyone, I hope you are very well. For research at my university, we are conducting a survey of Databricks users, aimed at gathering information about how cluster configuration and optimization are approached in the industry. If you like, you can answer i...

Artman23
by New Contributor
  • 1713 Views
  • 1 reply
  • 0 kudos

DLT Pipeline Issue with New Version of Advanced Data Engineering with Databricks training

I am currently taking the new Advanced Data Engineering with Databricks training. I am stuck in the Demo: Auto Load to Bronze section of Streaming ETL Patterns with DLT. It looks to be an AWS access issue, as the pipeline errors out with the following...

PKD28
by New Contributor II
  • 4194 Views
  • 3 replies
  • 0 kudos

Resolved! Databricks Cluster job failure issue

Jobs within the all-purpose DB cluster are failing with "The Spark driver has stopped unexpectedly and is restarting. Your notebook will be automatically reattached." In the event log it says "Event_type=DRIVER_NOT_RESPONDING & MESSAGE=Driver is up b...

Latest Reply
PKD28
New Contributor II
  • 0 kudos

Just now there is one cluster issue. Cluster error: "Driver is unresponsive, likely due to GC". Cluster conf: worker: Standard_D8ads_v5, driver: Standard_E16d_v4. What do you suggest here?

2 More Replies
Zahid-CSA
by New Contributor
  • 1459 Views
  • 1 reply
  • 0 kudos

Power BI to Databricks SQL Warehouse inactivity error on refreshing

Hello team, we are trying to refresh a dataset which has about 1 billion rows, and we have partitioned it to run periodically and in a parallel, distributed mechanism, but the refresh is failing after hours with an inactivity timeout error...

Latest Reply
szymon_dybczak
Esteemed Contributor III
  • 0 kudos

Hi @Zahid-CSA, you can take a look at the thread below; it's a similar problem. @SethParker suggested one possible solution, maybe worth a try: Re: Power BI Import Model Refresh from Databricks ... - Databricks Community - 51661

AK031
by New Contributor II
  • 647 Views
  • 0 replies
  • 1 kudos

How to display RunName as the source in job compute grid?

I'm using Databricks on GCP and currently facing an issue where I want the RunName of a job to be displayed as the source in the Job Compute Grid. Here's what I've already done: I've included the RunName in the custom_tags section of my cluster JSON: Wh...

phanisaisrir
by New Contributor
  • 1234 Views
  • 1 reply
  • 0 kudos

Accessing table in Unity Catalog

What is the preferred way of accessing a UC-enabled SQL warehouse table from a Databricks Spark cluster? My requirement is to fetch the data from a SQL warehouse table using complex queries, transform it in a PySpark notebook, and save the results. But t...

Latest Reply
filipniziol
Esteemed Contributor
  • 0 kudos

Hi @phanisaisrir, use Spark SQL. This is the native and most integrated way to interact with data within Databricks.


Join Us as a Local Community Builder!

Passionate about hosting events and connecting people? Help us grow a vibrant local community—sign up today to get started!

Sign Up Now