Community Platform Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

chari
by Contributor
  • 1660 Views
  • 0 replies
  • 0 kudos

How to run a group of cells in Databricks?

Hello, I was experimenting with an ML model, trying different parameters and checking the results. However, the important part of this code is contained in a couple of cells (say cells #12, 13 & 14). I'd like to proceed to the next cell only when the results a...

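One possible workaround for the question above (a sketch only, not from the thread; the function name, parameters and metric are hypothetical): wrap the logic of the important cells in a single function and loop over the parameter combinations so the whole group re-runs as one unit. Alternatively, those cells could live in a child notebook invoked per parameter set with dbutils.notebook.run().

# Hypothetical sketch: wrap the work of cells 12-14 in one function
# so the whole group can be re-run for each parameter set.
def run_experiment(learning_rate: float, max_depth: int) -> float:
    # ... the model training / evaluation code from cells 12-14 goes here ...
    return 0.0  # placeholder metric

# Iterate over parameter combinations and inspect results before moving on.
for lr in (0.01, 0.1):
    for depth in (3, 5):
        print(lr, depth, run_experiment(lr, depth))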
AravindNani
by New Contributor
  • 2682 Views
  • 1 reply
  • 0 kudos

Unable to read data from API due to Private IP Restriction

I have data behind an API endpoint but am unable to read it using Databricks. Access to the data is restricted to my private IP address and it can only be reached over a VPN connection, so I can't read the data into Databricks. I can obtain the data in VS C...

Latest Reply
Wojciech_BUK
Valued Contributor III
  • 0 kudos

Hi AravindNani, this is more of an infrastructure question. You have to make sure that:
1) Your Databricks workspace is provisioned in VNet injection mode.
2) Your VNet is either peered to the "HUB" network where you have a S2S VPN connection to the API, or you have t...

arkiboys
by Contributor
  • 4174 Views
  • 1 reply
  • 0 kudos

Databricks email notification

In Databricks, if a job fails, an email is sent as a notification. The recipient receives the email with a link to the Databricks workspace. Question: how is it possible to send the email without any link, so that just the plain text in the email is w...

RozaZaharieva
by New Contributor
  • 1662 Views
  • 0 replies
  • 0 kudos

Set up Azure Databricks workspace and Unity Catalog - how to automate without using Terraform

Hi everyone, I am looking for a way to automate the initial setup of an Azure Databricks workspace and Unity Catalog, but can't find anything on this topic other than Terraform. Can you share whether this is possible with PowerShell, for example? Thank you in adv...

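On the question above: the workspace itself is an Azure resource, so it can be created without Terraform via the Azure CLI, an ARM/Bicep template, or the Az PowerShell module, while Unity Catalog objects can be scripted against the Databricks REST API (or the databricks-sdk Python package). Below is a minimal, hedged Python sketch against the workspace-level UC API; the host, token and catalog name are placeholders.

import requests

# Placeholder values: substitute your own workspace URL and token.
HOST = "https://adb-1234567890123456.7.azuredatabricks.net"
TOKEN = "dapi..."  # personal access token or Azure AD token

# Create a Unity Catalog catalog through the workspace-level UC REST API.
resp = requests.post(
    f"{HOST}/api/2.1/unity-catalog/catalogs",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"name": "demo_catalog", "comment": "created from a setup script"},
)
resp.raise_for_status()
print(resp.json())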
Shravanshibu
by New Contributor III
  • 1037 Views
  • 0 replies
  • 0 kudos

Public preview API not working - artifact-allowlists

I am trying to hit /api/2.1/unity-catalog/artifact-allowlists/ as part of an INIT migration script. It is in public preview; do we need to enable anything else to use an API that is in public preview? I am getting a 404 error. But using the same token for ...

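A hedged note on the 404 above: this endpoint expects an artifact type segment in the path (for example INIT_SCRIPT), so hitting the bare .../artifact-allowlists/ path, or a workspace that is not attached to a Unity Catalog metastore, can return 404 even with a valid token. A minimal sketch, with placeholder host and token:

import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "dapi..."                                             # placeholder

# Read the current allowlist for one artifact type; the type segment is required.
resp = requests.get(
    f"{HOST}/api/2.1/unity-catalog/artifact-allowlists/INIT_SCRIPT",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
print(resp.status_code, resp.text)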
SaiNeelakantam
by New Contributor
  • 2187 Views
  • 1 reply
  • 0 kudos

How to enable "Create Vector Search Index" button in DB workspace?

How to enable "Create Vector Search Index" button in DB workspace?Following is the screenshot from the Microsoft Ignite 2023 Databricks presentation:

Latest Reply
PL_db
Databricks Employee
  • 0 kudos

The feature is in public preview only in some regions; you can check the available regions in the documentation here. In addition, there are certain requirements, such as a UC-enabled workspace and Serverless Compute enabled; you can check all requir...

SamGreene
by Contributor II
  • 3450 Views
  • 5 replies
  • 0 kudos

CONVERT_TIMEZONE issue in DLT

I can run a query that uses the CONVERT_TIMEZONE function in a SQL notebook. When I move the code to my DLT notebook, the pipeline produces this error: Cannot resolve function `CONVERT_TIMEZONE`. Here is the line: CONVERT_TIMEZONE('UTC', 'America/Phoen...

Latest Reply
annn
New Contributor II
  • 0 kudos

Yes, the notebook is set to SQL and the convert_timezone function is within a select statement.

4 More Replies
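A hedged workaround sketch for the CONVERT_TIMEZONE thread above: convert_timezone only resolves on newer runtimes (roughly Spark 3.4 / DBR 13 and later), so a DLT pipeline on an older channel may not recognize it, while from_utc_timestamp has been available for much longer and expresses the same UTC-to-local conversion. The column name and sample row below are made up.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Tiny stand-in for the real source table (hypothetical column name).
df = (
    spark.createDataFrame([("2024-03-01 17:30:00",)], ["event_ts_utc"])
    .withColumn("event_ts_utc", F.to_timestamp("event_ts_utc"))
)

# Equivalent of CONVERT_TIMEZONE('UTC', 'America/Phoenix', event_ts_utc).
df.withColumn(
    "event_ts_phx", F.from_utc_timestamp("event_ts_utc", "America/Phoenix")
).show(truncate=False)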
Ak_0926
by New Contributor
  • 4479 Views
  • 2 replies
  • 1 kudos

Can we get the actual query execution plan programmatically after a query is executed, apart from the UI?

Let's say I have run a query and it showed me results. We can find the respective query execution plan in the UI. Is there any way we can get that execution plan programmatically or through an API?

Latest Reply
Walter_C
Databricks Employee
  • 1 kudos

You can obtain the query execution plan programmatically using the EXPLAIN statement in SQL. The EXPLAIN statement displays the execution plan that the database planner generates for the supplied statement. The execution plan shows how the table(s) r...

1 More Replies
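A short sketch of the approach in the reply above, showing the two usual routes to a plan; the table and query are placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Placeholder data standing in for a real table.
spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"]).createOrReplaceTempView("t")

# 1) SQL route: EXPLAIN returns the plan as a result set.
spark.sql("EXPLAIN FORMATTED SELECT val, count(*) FROM t GROUP BY val").show(truncate=False)

# 2) DataFrame route: print the formatted plan for any DataFrame.
spark.table("t").groupBy("val").count().explain(mode="formatted")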
Danny_Lee
by Valued Contributor
  • 2562 Views
  • 2 replies
  • 4 kudos

Top Kudoed Author 🌟🤩🧑‍🎤

I recently saw a link to the Kudos Leaderboard for the Community Discussions. It has always been my hope and fantasy, ever since I was a little child, that I would someday be the #1 Kudoed Author on Community Discussions on community.Databricks.com....

Latest Reply
Danny_Lee
Valued Contributor
  • 4 kudos

Thanks @DB_Paul - I'm on my way!   

1 More Replies
Anku_
by New Contributor II
  • 1768 Views
  • 2 replies
  • 0 kudos

New to PySpark

Hi all, I am trying to get the domain from an email field using the expression below, but I'm getting an error. Kindly help.

df.select(df.email, substring(df.email,instr(df.email,'@'),length(df.email).alias('domain')))

Latest Reply
Walter_C
Databricks Employee
  • 0 kudos

In your case, you want to extract the domain from the email, which starts from the position just after '@'. So, you should add 1 to the position of '@'. Also, the length of the substring should be the difference between the total length of the email ...

1 More Replies
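A runnable sketch of the fix described in the reply above (the sample rows are made up): start one character after '@' and take length(email) - instr(email, '@') characters.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([("alice@example.com",), ("bob@test.org",)], ["email"])

# Start just after '@' and take the remaining characters.
domain = F.col("email").substr(
    F.instr(F.col("email"), "@") + 1,
    F.length(F.col("email")) - F.instr(F.col("email"), "@"),
)

df.select("email", domain.alias("domain")).show(truncate=False)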
kickbuttowski
by New Contributor II
  • 1217 Views
  • 1 reply
  • 0 kudos

Issue in inferring schema for streaming dataframe using json files

Below is the pipeline design in Databricks and it's not working out; kindly look at this and let me know whether it will work or not. I'm getting JSON files of different schemas from directories under the root directory, and it reads all the files using...

Latest Reply
AmanSehgal
Honored Contributor III
  • 0 kudos

Could you please share a sample of your dataset and a code snippet of what you're trying to implement?

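The post above is truncated, so only a generic, hedged sketch is possible: Auto Loader with a persisted schema location is the usual way to infer and evolve schemas for JSON files landing under a directory tree. All paths are placeholders, and spark is the ambient session in a Databricks notebook.

# Generic Auto Loader sketch; every path below is a placeholder.
stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/tmp/demo/_schemas")  # persist the inferred schema
    .option("cloudFiles.inferColumnTypes", "true")
    .load("/tmp/demo/landing/")
)

query = (
    stream.writeStream.format("delta")
    .option("checkpointLocation", "/tmp/demo/_checkpoints")
    .option("mergeSchema", "true")  # tolerate new columns as schemas differ
    .start("/tmp/demo/bronze")
)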
pernilak
by New Contributor III
  • 3419 Views
  • 2 replies
  • 3 kudos

Resolved! Pros and cons of physically separating data in different storage accounts and containers

When setting up Unity Catalog, Databricks recommends figuring out your data isolation model when it comes to physically separating your data into different storage accounts and/or containers. There are so many options that it can be hard to be ...

Latest Reply
raphaelblg
Databricks Employee
  • 3 kudos

Hello @pernilak, thanks for reaching out to the Databricks Community! My name is Raphael, and I'll be helping out. "Should all catalogs and the metastore reside in the same storage account (but different containers)?" Yes, Databricks recommends having o...

1 More Replies
swapnilmd
by New Contributor
  • 986 Views
  • 1 reply
  • 0 kudos

Databricks web editor's cell-like UI in a local IDE

I want to do Databricks-related development locally. There is an extension that allows running a local Python file on a remote Databricks cluster, but I want to have the cell-like structure that is present in the Databricks UI for Python files in my local IDE as well....

Latest Reply
daniel_sahal
Esteemed Contributor
  • 0 kudos

@swapnilmd You can use the VS Code extension for Databricks: https://docs.databricks.com/en/dev-tools/vscode-ext/index.html

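To complement the link above, a hedged illustration of notebook-style cells in a plain .py file: the VS Code Python extension treats "# %%" comments as interactive cell boundaries, and .py files exported from Databricks notebooks use a "# Databricks notebook source" header with "# COMMAND ----------" separators, so one file can carry both.

# Databricks notebook source
# The header above is what Databricks writes when exporting a notebook as .py;
# "# COMMAND ----------" lines become cell breaks when the file is imported back.
# The "# %%" markers are what VS Code's Python extension treats as runnable cells.

# COMMAND ----------
# %% Load some data

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.range(10)

# COMMAND ----------
# %% Transform and inspect

df.selectExpr("id", "id * 2 AS doubled").show()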
NhanNguyen
by Contributor III
  • 1591 Views
  • 3 replies
  • 1 kudos

[Memory utilization in Metrics tab still displayed after terminating a cluster]

Hi all, could you guys help me check this? I run a cluster and then terminate it, but when I navigate to the cluster's Metrics tab, I still see memory utilization metrics displayed. Thanks.

Latest Reply
NhanNguyen
Contributor III
  • 1 kudos

Here are my cluster display and my simple notebook:

2 More Replies

Connect with Databricks Users in Your Area

Join a Regional User Group to connect with local Databricks users. Events will be happening in your city, and you won’t want to miss the chance to attend and share knowledge.

If there isn’t a group near you, start one and help create a community that brings people together.

Request a New Group