Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

TechMG
by New Contributor II
  • 8084 Views
  • 0 replies
  • 0 kudos

Power BI paginated report

Hello, I am facing a similar kind of issue. I am working on a Power BI paginated report, and Databricks is my source for the report. I was trying to pass the parameter by passing the query in the expression builder as mentioned below. https://community.databri...

Lazloo
by New Contributor III
  • 1197 Views
  • 1 reply
  • 0 kudos

Using nested dataframes with databricks-connect>13.x

 We needed to move to databricks-connect>13.x. Now I am facing the issue that when I work with a nested dataframe of the structure```root|-- a: string (nullable = true)|-- b: array (nullable = true)| |-- element: struct (containsNull = true)| | |-- c: s...

Latest Reply
Lazloo
New Contributor III
  • 0 kudos

In addition, here is the full stack trace: 23/12/07 14:51:56 ERROR SerializingExecutor: Exception while executing runnable grpc_shaded.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable@33dfd6ec grpc_shaded.io.grpc...

mvmiller
by New Contributor III
  • 2226 Views
  • 1 reply
  • 0 kudos

How to facilitate incremental updates to an SCD Type 1 table that uses SCD Type 2 source tables

I have an SCD Type 1 delta table (target) for which I am trying to figure out how to facilitate inserts, updates, and deletes. This table is sourced from multiple delta tables with an SCD Type 2 structure, which are joined together to create the targe...

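A minimal plain-Python sketch of the usual pattern for this: filter each SCD Type 2 source to its current rows, join them on the key, and rebuild the Type 1 target from the result. The column names (`id`, `is_current`) are hypothetical, and in Delta the final step would be a `MERGE INTO` rather than a dict rebuild:

```python
def current_rows(scd2_rows):
    """Keep only the active row per key from an SCD Type 2 source."""
    return {r["id"]: r for r in scd2_rows if r["is_current"]}

def build_type1_target(source_a, source_b):
    """Join the current rows of two SCD2 sources into a Type 1 target.
    Keys missing from either source fall out (an inner join); a Type 1
    table keeps no history, so the flag column is dropped."""
    a, b = current_rows(source_a), current_rows(source_b)
    target = {}
    for key in a.keys() & b.keys():
        row = {**a[key], **b[key]}
        row.pop("is_current", None)   # Type 1 keeps no history metadata
        target[key] = row
    return target

source_a = [
    {"id": 1, "name": "old", "is_current": False},  # superseded history row
    {"id": 1, "name": "new", "is_current": True},
    {"id": 2, "name": "solo", "is_current": True},  # no match in source_b
]
source_b = [{"id": 1, "dept": "eng", "is_current": True}]
target = build_type1_target(source_a, source_b)
print(target)  # {1: {'id': 1, 'name': 'new', 'dept': 'eng'}}
```

Deletes fall out naturally here because the target is rebuilt from the join; an incremental variant would instead delete target keys that no longer appear among the sources' current rows.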
Latest Reply
mvmiller
New Contributor III
  • 0 kudos

Correction (I can't seem to edit or remove the original post): "... trying to think through an process" --> *a* process; "Thoughts and advice or much appreciated" --> Thoughts and/or advice are much appreciated.

Gembo
by New Contributor III
  • 3303 Views
  • 4 replies
  • 0 kudos

Cluster Access mode set to Shared on Databricks, results in connection refused on Exasol

I am trying to run a TRUNCATE command on my Exasol DWH from Databricks using pyexasol. This works perfectly fine when I have the cluster access mode as "No Isolation Shared", which does not have access to our Unity Catalog. When I change the clust...

Latest Reply
SSundaram
Contributor
  • 0 kudos

Interesting. Did you try with "Single User" mode, which also has UC support? 

3 More Replies
Sujitha
by Databricks Employee
  • 30292 Views
  • 3 replies
  • 7 kudos

Introducing the Data Intelligence Platform

Introducing the Data Intelligence Platform, our latest AI-driven data platform constructed on a lakehouse architecture. It’s not just an incremental improvement over current data platforms, but a fundamental shift in product strategy and roadmap.   E...

Latest Reply
Hubert-Dudek
Esteemed Contributor III
  • 7 kudos

Hmm, I preferred the naming related to water, like data lake, delta lake, and lakehouse.

2 More Replies
VJ3
by Contributor
  • 3361 Views
  • 1 reply
  • 0 kudos

Azure Databricks Notebook Sharing, Notebook Exporting and Notebook Clipboard copy download

Hello, I would like to know in which scenario an Azure Databricks user would be able to download notebook command output if Notebook Result Download is disabled. Do we know if a privileged user would be able to share sensitive information with non-privilege...

Latest Reply
VJ3
Contributor
  • 0 kudos

Thank you, Kaniz. Can we disable the exporting of notebooks except as a source file? If yes, how do we achieve this? Also, we do not want to share any notebook that has notebook results; can we use spark.databricks.query.displayMaxRows and se...

alexiswl
by Contributor
  • 8431 Views
  • 1 reply
  • 0 kudos

Resolved! 'Unity Catalog Volumes is not enabled on this instance' error

Hi all, tl;dr I ran the following on a docker-backed personal compute instance (running 13.3-LTS):
```
%sql
USE CATALOG hail;
USE SCHEMA volumes_testing;
CREATE VOLUME 1kg
    COMMENT 'Testing 1000 Genomes volume';
```
But this gives:
```
ParseException: [UC_VOLU...
```

Latest Reply
alexiswl
Contributor
  • 0 kudos

Resolved with the setting "spark.databricks.unityCatalog.volumes.enabled" = "true"

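A sketch of the resolution above as a cluster-level Spark configuration entry (assuming the flag is set on the cluster or pipeline before the `CREATE VOLUME` statement is run):

```
spark.databricks.unityCatalog.volumes.enabled true
```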
nyck33
by New Contributor II
  • 1822 Views
  • 1 reply
  • 0 kudos

Databricks learning festival, but my trial is over

I just emailed the onboarding-help email account to ask for an extension for 2 weeks as I want to complete the Data Engineer course to prepare for my new position. I have 2 accounts where the trial expired, one community account which cannot be used ...

Latest Reply
nyck33
New Contributor II
  • 0 kudos

is what happened when trying to sign up with another email.

DBEnthusiast
by New Contributor III
  • 1806 Views
  • 1 reply
  • 1 kudos

More than expected number of Jobs created in Databricks

Hi Databricks Gurus! I am trying to run a very simple snippet:
data_emp=[["1","sarvan","1"],["2","John","2"],["3","Jose","1"]]
emp_columns=["EmpId","Name","Dept"]
df=spark.createDataFrame(data=data_emp, schema=emp_columns)
df.show()
Based on a g...

hukel
by Contributor
  • 3137 Views
  • 1 reply
  • 0 kudos

Resolved! Convert multiple string fields to int or long during streaming

Source data looks like: { "IntegrityLevel": "16384", "ParentProcessId": "10972929104936", "SourceProcessId": "10972929104936", "SHA256Hash": "a26a1ffb81a61281ffa55cb7778cc3fb0ff981704de49f75f51f18b283fba7a2", "ImageFileName": "\\Device\\Harddisk...

Latest Reply
hukel
Contributor
  • 0 kudos

Thanks for confirming that the readStream.withColumn() approach is the best available option.  Unfortunately, this will force me to maintain a separate notebook for each of the event types,  but it does work.   I was hoping to create just one paramet...

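The parameterization the reply wishes for can at least be sketched generically: keep a per-event-type list of fields to cast and loop over it. A plain-Python illustration (the event-type key and field list are hypothetical; in PySpark the loop body would be `df = df.withColumn(name, col(name).cast("long"))`):

```python
# Hypothetical registry: which string fields to cast to long, per event type.
LONG_FIELDS = {
    "process_event": ["IntegrityLevel", "ParentProcessId", "SourceProcessId"],
}

def cast_long_fields(record, event_type):
    """Cast the configured string fields of one record to int
    (Python's arbitrary-precision int stands in for Spark's long)."""
    out = dict(record)
    for name in LONG_FIELDS.get(event_type, []):
        if out.get(name) is not None:
            out[name] = int(out[name])
    return out

# Field names and values taken from the source-data snippet above
rec = {
    "IntegrityLevel": "16384",
    "ParentProcessId": "10972929104936",
    "SourceProcessId": "10972929104936",
    "SHA256Hash": "a26a1ffb81a61281ffa55cb7778cc3fb0ff981704de49f75f51f18b283fba7a2",
}
casted = cast_long_fields(rec, "process_event")
print(casted["ParentProcessId"])  # 10972929104936 (as int, not str)
```

With a registry like this, one parameterized notebook can serve every event type: the event-type name selects the field list, and everything else stays shared.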
rpl
by Contributor
  • 2679 Views
  • 2 replies
  • 1 kudos

Bug report: the delimiter option does not work when run on DLT

I have a semicolon-separated file in an ADLS container that's been added to Unity Catalog as an external location. When I run the following code on an all-purpose cluster, it runs OK and displays the schema.
import dlt
@dlt.table
def test_data_csv():
 ...

Latest Reply
rpl
Contributor
  • 1 kudos

@Retired_mod, can you confirm that .option("delimiter", ";") is ignored when run in a DLT pipeline? (Please see the post above.) My colleague confirmed the behavior.

1 More Replies
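For reference, this is what a semicolon delimiter is supposed to produce. A plain-Python sketch using the csv module (sample data and column names invented), including the kind of manual text-split fallback one might reach for if the DLT reader ignores the option:

```python
import csv
import io

# Invented sample matching a semicolon-separated file with a header row
data = "id;name;amount\n1;alpha;10\n2;beta;20\n"

# What .option("delimiter", ";") should do: split columns on ';'
rows = list(csv.DictReader(io.StringIO(data), delimiter=";"))
print(rows[0])  # {'id': '1', 'name': 'alpha', 'amount': '10'}

# Fallback when the option is ignored: read each line as text, split manually
header, *body = data.splitlines()
cols = header.split(";")
manual = [dict(zip(cols, line.split(";"))) for line in body]
assert manual == rows  # both paths yield the same records
```

The csv module handles quoting and embedded delimiters that a naive `str.split` does not, so the manual fallback is only safe for files without quoted fields.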

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.
