Data Engineering
Join discussions on data engineering best practices, architectures, and optimization strategies within the Databricks Community. Exchange insights and solutions with fellow data engineers.

Forum Posts

by Rita (New Contributor III)
  • 8091 Views
  • 7 replies
  • 6 kudos

How to connect Cognos 11.1.7 to Azure Databricks

We are trying to connect Cognos 11.1.7 to Azure Databricks, but with no success. Can you please help or guide us on how to connect Cognos 11.1.7 to Azure Databricks? This is very critical to our user community. Can you please help or guide us how to connect Co...

Latest Reply
Hans2
New Contributor
  • 6 kudos

Has anyone got the Simba JDBC driver going with CA 11.1.7? The ODBC driver works fine, but I can't get the JDBC driver running. Regards

6 More Replies
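For others who land on this thread: the connection URL format is often the sticking point with the Simba JDBC driver. A minimal, hypothetical sketch of the URL — hostname, HTTP path, and token are placeholders taken from a cluster's JDBC/ODBC tab, not from this thread:

```python
# Hypothetical Azure Databricks JDBC URL for the Simba driver. Legacy Simba
# drivers use the jdbc:spark:// prefix; newer Databricks JDBC drivers use
# jdbc:databricks:// instead. All identifiers below are placeholders.
jdbc_url = (
    "jdbc:spark://adb-1234567890123456.7.azuredatabricks.net:443/default;"
    "transportMode=http;ssl=1;"
    "httpPath=sql/protocolv1/o/1234567890123456/0123-456789-abcdefgh;"
    "AuthMech=3;UID=token;PWD=<personal-access-token>"
)
print(jdbc_url)  # value to paste into the Cognos JDBC data source definition
```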
by prathameshJoshi (New Contributor III)
  • 3065 Views
  • 8 replies
  • 6 kudos

Resolved! How to obtain the server url for using spark's REST API

Hi, I want to access the stage and job information (usually available through the Spark UI) through the REST API provided by Spark: http://<server-url>:18080/api/v1/applications/[app-id]/stages. More information can be found at the following link: https://spa...

Latest Reply
prathameshJoshi
New Contributor III
  • 6 kudos

Hi @Retired_mod and @menotron, thanks a lot; your solutions are working. I apologise for the delay, as I had some issues logging in.

7 More Replies
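As context for the accepted answer: Databricks clusters don't run a standalone history server on port 18080, but the driver's live Spark UI exposes the same REST endpoints. A minimal sketch, assuming it runs in a notebook attached to the cluster being inspected:

```python
import requests

# The driver advertises its live Spark UI, which also serves the REST API.
ui_url = spark.sparkContext.uiWebUrl  # e.g. "http://10.x.x.x:40001"

# List applications, then fetch stage information for the first one.
apps = requests.get(f"{ui_url}/api/v1/applications").json()
app_id = apps[0]["id"]
stages = requests.get(f"{ui_url}/api/v1/applications/{app_id}/stages").json()
for stage in stages[:5]:
    print(stage["stageId"], stage["status"], stage["name"])
```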
by jeremy98 (Contributor)
  • 103 Views
  • 1 reply
  • 0 kudos

Resolved! How to read a particular data type from Postgres into Databricks through JDBC

Hi Community, I need to load data from PostgreSQL into Databricks through JDBC without changing the data type of a VARCHAR[] column in PostgreSQL, which should remain as an array of strings in Databricks. Previously, I used psycopg2, and it worked, but ...

Latest Reply
jeremy98
Contributor
  • 0 kudos

Hi community, yesterday I found a solution: query through JDBC from Postgres, creating two columns that are manageable in Databricks. Here is the code: query = f"""(SELECT *, array_to_string(columns_to_export, ',') AS columns_to_export_strin...

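A sketch of the fix described above, with hypothetical table, column, and connection names: serialize the VARCHAR[] column to a string inside the pushed-down query, then split it back into an array on the Databricks side.

```python
from pyspark.sql import functions as F

# Pushed-down subquery: Postgres turns the VARCHAR[] column into a plain
# string, which the JDBC reader maps cleanly. All names are placeholders.
query = """(SELECT *,
                   array_to_string(columns_to_export, ',') AS columns_to_export_string
            FROM my_schema.my_table) AS src"""

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://<host>:5432/<database>")
    .option("dbtable", query)
    .option("user", "<user>")
    .option("password", "<password>")
    .load()
    # Rebuild the array of strings in Databricks.
    .withColumn("columns_to_export", F.split("columns_to_export_string", ","))
    .drop("columns_to_export_string")
)
```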
by asurendran (New Contributor III)
  • 165 Views
  • 7 replies
  • 2 kudos

Some records are missing after window function

While loading data from one layer to another using a PySpark window function, I noticed that some data is missing. This happens when the data is huge, but not for small volumes. Has anyone come across this issue before?

Latest Reply
asurendran
New Contributor III
  • 2 kudos

Is there a way caching the DataFrame could help fix this issue?

6 More Replies
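One cause worth ruling out in threads like this: if the window's ORDER BY key has ties, ranking functions such as row_number() are non-deterministic, so a later rn = 1 filter can appear to drop different rows on large, multi-partition data while looking fine on small samples. A hedged sketch with a unique tie-breaker (all column names hypothetical):

```python
from pyspark.sql import Window, functions as F

# A unique tie-breaker column (id) makes row_number() deterministic, so the
# rn = 1 filter keeps exactly one stable row per partition key on every run.
w = Window.partitionBy("customer_id").orderBy(
    F.col("updated_at").desc(), F.col("id")
)
deduped = (
    df.withColumn("rn", F.row_number().over(w))
    .filter(F.col("rn") == 1)
    .drop("rn")
)
```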
by busuu (New Contributor II)
  • 265 Views
  • 3 replies
  • 1 kudos

Failed to checkout Git repository: RESOURCE_DOES_NOT_EXIST: Attempted to move non-existing node

I'm having issues checking out a Git repo in Workflows. Databricks can access files from commit `a` but fails to check out the branch when attempting to access commit `b`. The error occurs specifically when trying to check out commit `b`, and Databr...

Latest Reply
Augustus
New Contributor II
  • 1 kudos

I didn't do anything to fix it. Databricks support did something to my workspace to fix the issue. 

2 More Replies
by ohnomydata (New Contributor)
  • 95 Views
  • 1 reply
  • 0 kudos

Accidentally deleted files via API

Hello, I’m hoping you might be able to help me. I have accidentally deleted some Workspace files via the API (an Azure DevOps code deployment pipeline). I can’t see the files in my Trash folder – are they gone forever, or is it possible to recover them on ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hello @ohnomydata, Unfortunately files deleted via APIs or the Databricks CLI are permanently deleted and do not move to the Trash folder. The Trash folder is a UI-only feature, and items deleted through the UI can be recovered from the Trash within ...

by Somia (New Contributor II)
  • 212 Views
  • 6 replies
  • 2 kudos

Resolved! SQL query is not returning _sqldf

Notebooks in my workspace are not returning _sqldf when a SQL query is run. If I run this code, it gives an error in the second cell that _sqldf is not defined. First cell: `%sql select * from some_table limit 10`. Second cell: `%sql select * from _sqldf`. Howev...

Latest Reply
Somia
New Contributor II
  • 2 kudos

Changing the notebook default language to Python and using all-purpose compute fixed the issue. I am able to access _sqldf in subsequent SQL or Python cells.

5 More Replies
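For context, _sqldf is the PySpark DataFrame that Databricks notebooks bind to the result of the previous %sql cell, so once the notebook is set up as described it behaves like any other DataFrame (table name hypothetical):

```python
# Cell 1 (%sql):
#   select * from some_table limit 10

# Cell 2 (Python): _sqldf now holds the result of the previous %sql cell.
print(_sqldf.count())
display(_sqldf)
```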
by pradeepvatsvk (New Contributor II)
  • 211 Views
  • 2 replies
  • 0 kudos

Polars natively reading and writing through ADLS

Hi everyone, is there a way Polars can directly read files from ADLS through the abfss protocol?

Latest Reply
jennifer986bloc
New Contributor II
  • 0 kudos

@pradeepvatsvk wrote: "Hi everyone, is there a way Polars can directly read files from ADLS through the abfss protocol?" Hello @pradeepvatsvk, yes, Polars can directly read files from Azure Data Lake Storage (ADLS) using the ABFS (Azure Blob Filesystem) prot...

1 More Reply
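A minimal sketch of the approach in the reply, assuming a current Polars version with cloud-storage support; account, container, path, and key are placeholders:

```python
import polars as pl

# All storage details below are hypothetical placeholders.
df = pl.read_parquet(
    "abfss://mycontainer@myaccount.dfs.core.windows.net/data/sample.parquet",
    storage_options={
        "account_name": "myaccount",
        "account_key": "<storage-account-key>",
    },
)
print(df.shape)
```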
by Rafael-Sousa (Contributor II)
  • 125 Views
  • 3 replies
  • 0 kudos

Managed Delta Table corrupted

Hey guys, recently we added some properties to our Delta table, and after that the table shows an error and we cannot do anything. The error is: (java.util.NoSuchElementException) key not found: spark.sql.statistics.totalSize. I think maybe this i...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @Rafael-Sousa, could you please raise a support case so we can investigate this further? help@databricks.com

2 More Replies
by samtech (New Contributor)
  • 83 Views
  • 1 reply
  • 1 kudos

DAB multiple workspaces

Hi, we have 3 regional workspaces. Assume that we keep separate folders for notebooks, say amer/xx, apac/xx, emea/xx, and separate job/pipeline configurations for each region in Git. How do we make sure that during deploy the appropriate jobs/pipelines are deployed in r...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hi @samtech, Define separate bundle configuration files for each region. These configuration files will specify the resources (notebooks, jobs, pipelines) and their respective paths. For example, you can have amer_bundle.yml, apac_bundle.yml, and eme...

by BriGuy (New Contributor II)
  • 98 Views
  • 2 replies
  • 0 kudos

Create a one-off job run using the Databricks SDK

I'm trying to build the job spec using objects. When I try to execute the job I get the following error. I'm somewhat new to Python and not sure what I'm doing wrong here. Is anyone able to help? Traceback (most recent call last): File "y:\My ...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @BriGuy, Can you try importing this module first? from databricks.sdk.service.jobs import PermissionLevel

1 More Reply
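For reference, the SDK's jobs.submit call is the one-off path: it triggers a one-time run without saving a job definition. A minimal sketch, with a placeholder cluster ID and notebook path:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()

# submit() starts a one-time run; nothing persists in the Jobs UI afterwards.
run = w.jobs.submit(
    run_name="one-off-run",
    tasks=[
        jobs.SubmitTask(
            task_key="main",
            existing_cluster_id="<cluster-id>",  # placeholder
            notebook_task=jobs.NotebookTask(notebook_path="/Users/me/my_notebook"),
        )
    ],
).result()  # block until the run finishes

print(run.state.result_state)
```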
by melikaabedi (New Contributor)
  • 71 Views
  • 1 reply
  • 0 kudos

databricks apps

Imagine I develop an app in Databricks with #databricks-apps. Is it possible for someone outside the organization to use it just by accessing a URL, without having a Databricks account? Thank you in advance for your hel...

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @melikaabedi, no, only users in the account can access a Databricks app, the same way you would with AI/BI dashboards.

by Dnirmania (Contributor)
  • 95 Views
  • 2 replies
  • 0 kudos

Foreign Catalog refresh

Hi everyone, I recently created a foreign catalog from AWS Redshift in Databricks, and I could see some tables too, but when I ran the REFRESH FOREIGN SCHEMA command it failed with the following error. I tried to search about it online but didn't get any...

Latest Reply
Dnirmania
Contributor
  • 0 kudos

REFRESH FOREIGN SCHEMA is a Databricks command to refresh a foreign catalog, and I don't have visibility into the queries it runs internally.

1 More Reply
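For readers unfamiliar with the command: it refreshes Unity Catalog's cached metadata for a schema inside a foreign catalog. A one-line sketch with placeholder catalog and schema names:

```python
# Refresh the cached table list for one schema of a Redshift foreign catalog.
spark.sql("REFRESH FOREIGN SCHEMA redshift_catalog.public")
```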
by allinux (New Contributor II)
  • 155 Views
  • 2 replies
  • 0 kudos

When Try Returns Success for Invalid S3 Path in Spark: Is This a Bug?

Try(spark.read.format("parquet").load("s3://abcd/abcd/")) should result in Failure, but when executed in the notebook it returns Success, as shown below. Isn't this a bug? Try[DataFrame] = Success(...)

Latest Reply
MuthuLakshmi
Databricks Employee
  • 0 kudos

@allinux The read is a valid way to load data. Why are you expecting a failure? Can you please explain?

1 More Reply
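The behaviour in the question is consistent with Spark's lazy evaluation: load() may only plan the read, and for some sources the path isn't fully validated until an action runs. A hedged Python equivalent of the Scala Try (same placeholder bucket as the post):

```python
# Wrapping only load() can succeed even for a bad path, because no data has
# been read yet; the failure typically surfaces at the first action instead.
try:
    df = spark.read.format("parquet").load("s3://abcd/abcd/")
    df.count()  # the action forces file listing, so a bad path usually fails here
except Exception as e:
    print(type(e).__name__, str(e)[:200])
```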
by Dominos (New Contributor II)
  • 111 Views
  • 3 replies
  • 0 kudos

Does DBR 14.3 not support the DESCRIBE HISTORY command?

Hello, we recently updated our DBR version from 9.1 LTS to 14.3 LTS and observed that DESCRIBE HISTORY is not supported in 14.3 LTS. Could you please suggest an alternative for retrieving table history?

Latest Reply
Dominos
New Contributor II
  • 0 kudos

I think it is a problem with the worker type. It works fine on Standard_DS4_v2 but not on Standard_DS3_v2.

2 More Replies
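DESCRIBE HISTORY remains part of Delta Lake on recent DBR releases, so failures like this usually point at the environment (as the worker-type observation above suggests) rather than the command. Two equivalent ways to fetch history, with a hypothetical table name:

```python
from delta.tables import DeltaTable

# SQL form
spark.sql("DESCRIBE HISTORY my_catalog.my_schema.my_table").show()

# Python API form (delta-spark)
DeltaTable.forName(spark, "my_catalog.my_schema.my_table").history(10).show()
```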

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group