Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Arindam19
by New Contributor II
  • 676 Views
  • 3 replies
  • 0 kudos

Are row filters and column masks supported on foreign catalogs in Azure Databricks Unity Catalog?

In my solution I am planning to bring an Azure SQL Database into Azure Databricks Unity Catalog as a foreign catalog. Are table row filters and column masks supported in my scenario?

Latest Reply
Alberto_Umana
Databricks Employee
  • 0 kudos

Hi @Arindam19, Yes. Certain operations, including filtering, can be pushed down from Databricks to SQL Server. This is managed by querying the SQL Server directly via a federated connection, allowing SQL Server to handle the filter criteria and retur...

  • 0 kudos
2 More Replies
KaustubhShah
by New Contributor
  • 503 Views
  • 1 reply
  • 0 kudos

GCP Databricks Spark Connector for Cassandra - Error: com.typesafe.config.impl.ConfigImpl.newSimple

Hello, I am using Databricks Runtime 12.2 with the Spark connector com.datastax.spark:spark-cassandra-connector_2.12:3.3.0, as runtime 12.2 comes with Spark 3.3.2 and Scala 2.12. I encounter an issue with connecting to the Cassandra DB using the below co...

Latest Reply
cgrant
Databricks Employee
  • 0 kudos

Try using the assembly version of the jar with 12.2 (https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector-assembly). If this doesn't work, please paste the full, original stack trace.

  • 0 kudos
mrstevegross
by Contributor III
  • 2045 Views
  • 6 replies
  • 0 kudos

Resolved! Is it possible to obtain a job's event log via the REST API?

Currently, to investigate job performance, I can look at a job's information (via the UI) to see the "Event Log" (pictured below). I'd like to obtain this information programmatically, so I can analyze it across jobs. However, the docs for the `get` c...

[Attachment: mrstevegross_0-1736967992555.png (screenshot of the job's Event Log)]
Latest Reply
mrstevegross
Contributor III
  • 0 kudos

I also see there is a "list cluster events" API (https://docs.databricks.com/api/workspace/clusters/events); can I get the event log this way?

  • 0 kudos
5 More Replies
crowley
by New Contributor III
  • 4371 Views
  • 2 replies
  • 1 kudos

Resolved! How are Struct type columns stored/accessed (interested in efficiency)?

Hello, I've searched around for a while and didn't find a similar question here or elsewhere, so thought I'd ask... I'm assessing the storage/access efficiency of Struct type columns in Delta tables. I want to know more about how Databricks is storing...

Latest Reply
crowley
New Contributor III
  • 1 kudos

Thank you very much for the thoughtful response. Please excuse my belated feedback and thanks!

  • 1 kudos
1 More Replies
pardeep7
by New Contributor II
  • 842 Views
  • 3 replies
  • 0 kudos

Databricks Clean Rooms with 3 or more collaborators

Let's say I create a clean room with 2 other collaborators, call them collaborator A and collaborator B (so 3 in total, including me), and then share some tables to the clean room. If collaborator A writes code that does a "SELECT * FROM creator.<tab...

Latest Reply
KaranamS
Contributor III
  • 0 kudos

Hi @pardeep7, as per my understanding, all participants of a clean room can only see metadata. The raw data in your tables is not directly accessed by other collaborators. Any output tables created by collaborators based on the queries/notebooks will b...

  • 0 kudos
2 More Replies
harsh_Dev
by New Contributor III
  • 998 Views
  • 2 replies
  • 1 kudos

Resolved! Connect Databricks Community Edition to data lake S3/ADLS2

Does anybody know how I can connect to AWS S3 object storage with Databricks Community Edition? Is that possible with a Community Databricks account or not?

Latest Reply
KaranamS
Contributor III
  • 1 kudos

Hi @harsh_Dev, you can read from/write to AWS S3 with Databricks Community Edition. As you will not be able to use instance profiles, you will need to configure the AWS credentials manually and access S3 using an S3 URI. Try the code below: spark._jsc.hadoop...

  • 1 kudos
1 More Replies
AGnewbie
by New Contributor
  • 533 Views
  • 1 reply
  • 1 kudos

Required versus current compute setup

To run the demo and lab notebooks, I am required to have the following Databricks runtime(s): 15.4.x-cpu-ml-scala2.12, but the compute in my setup is on the following runtime version; will that be an issue? 11.3 LTS (includes Apache Spark 3.3.0, Scala 2.1...

Latest Reply
Alberto_Umana
Databricks Employee
  • 1 kudos

Hello @AGnewbie, Firstly, regarding the Databricks runtime: your compute setup is currently running version 11.3 LTS, which will indeed be an issue as the specified version is not present in your current runtime. Hence, you need to update your runtim...

  • 1 kudos
Boyeenas
by New Contributor
  • 2792 Views
  • 1 reply
  • 0 kudos

Decimal(32,6) datatype in Databricks - precision roundoff

Hello All, I need your assistance. I recently started a migration project from Synapse Analytics to Databricks. While dealing with the data types, I came across a situation where in a Dedicated SQL Pool the value is 0.033882, but in Databricks the value ...

Latest Reply
KaranamS
Contributor III
  • 0 kudos

Hi @Boyeenas, I believe your assumption is correct. Databricks is built on Apache Spark, and the system applies rounding automatically based on the value of the subsequent digit. In your case, if the original value had a 7th decimal digit of 5 or high...

  • 0 kudos
mishrarit
by New Contributor
  • 539 Views
  • 1 reply
  • 0 kudos

job "run name" in "system" "lake flow" "job run timeline" table

For a few jobs in Unity Catalog the "run name" is coming out as "null", whereas for a few we see the complete name with a system-generated batch ID. I am not sure how this field is populated and why for some jobs the "run name" is present whereas for some i...

Latest Reply
Advika_
Databricks Employee
  • 0 kudos

Hello @mishrarit! Run name in Unity Catalog job runs is determined by how the job is triggered. For manual runs, Databricks automatically generates a name, and for scheduled or API-triggered runs, the run name remains null unless explicitly defined.

  • 0 kudos
arne_c
by New Contributor II
  • 1462 Views
  • 2 replies
  • 0 kudos

Set up compute policy to allow installing python libraries from a private package index

In our organization, we maintain a number of libraries through which we share code. They're hosted on a private Python package index, which requires a token to allow downloads. My idea was to store the token as a secret, which would then be loaded into a cluste...

Latest Reply
arne_c
New Contributor II
  • 0 kudos

I figured it out; it seems secrets can only be loaded into environment variables if the content is the secret and nothing else:
"value": "{{secrets/global/arneCorpPyPI_token}}"  # this will work
"value": "foo {{secrets/global/arneCorpPyPI_toke...

  • 0 kudos
1 More Replies
GerardAlexander
by New Contributor III
  • 706 Views
  • 1 reply
  • 0 kudos

Creating Unity Catalog in Personal AZURE Portal Account

Seeking advice on the following:
1. Given that I have a Personal, and not an Organization-based, Azure Portal Account,
2. that I can see I am Global Admin and have the Admin Role in Databricks,
3. then why can I not get "Manage Account" for a...

Latest Reply
Takuya-Omi
Valued Contributor III
  • 0 kudos

@GerardAlexander Try signing in to the Account Console (https://accounts.azuredatabricks.net/login) using a user account with the appropriate permissions, rather than accessing it from the workspace. If you are unable to sign in, the following resourc...

  • 0 kudos
jay-cunningham
by New Contributor
  • 2577 Views
  • 0 replies
  • 0 kudos

Is there a way to prevent databricks-connect from installing a global IPython Spark startup script?

I'm currently using databricks-connect through VS Code on macOS. However, this seems to install (and re-install upon deletion) an IPython startup script which initializes a SparkSession. This is fine as far as it goes, except that this script is *glo...

laeforceable
by New Contributor II
  • 3147 Views
  • 3 replies
  • 1 kudos

Power BI - Azure Databricks Connector shows Error AAD is not setup for domain

Hi Team, what I would like to do is understand what is required for the Power BI gateway to use single sign-on (AAD) to Databricks. Is that something you have encountered before and know the fix for? I currently get a message from Power BI that AAD is not ...

Latest Reply
kkitsara
New Contributor II
  • 1 kudos

Hello, did you find a solution for this? I am facing the same issue.

  • 1 kudos
2 More Replies
