Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

saqib_rasool
by New Contributor II
  • 3198 Views
  • 2 replies
  • 3 kudos

Databricks Lakehouse Fundamentals exam

Hi, I completed the Databricks Lakehouse Fundamentals exam but didn't get my badge.

Latest Reply
yogu
Honored Contributor III
  • 3 kudos

@saqib_rasool Please submit a ticket to the support team: https://help.databricks.com/s/contact-us?ReqType=training

1 More Replies
Ajay-Pandey
by Esteemed Contributor III
  • 3469 Views
  • 3 replies
  • 0 kudos

Databricks Pub-Sub Data Recon

I am trying to set up a recon activity between GCP Pub/Sub and Databricks. Is there any way to fetch the record count for the last 24 hours from Pub/Sub? I tried but didn't find any direct solution for it. It would be great if anyone can suggest me the way t#pubsub, ...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @Ajay-Pandey  Hope you are well. Just wanted to see if you were able to find an answer to your question and would you like to mark an answer as best? It would be really helpful for the other members too. Cheers!

2 More Replies
kll
by New Contributor III
  • 3310 Views
  • 4 replies
  • 1 kudos

SparkException: Job aborted due to stage failure when attempting to run grid_pointascellid

I am attempting to apply Mosaic's `grid_pointascellid` method on a Spark dataframe with `lat`, `lon` columns.

```
import pyspark.sql.functions as F

# Create a Spark DataFrame with a lat and lon column
df = spark.createDataFrame([("point1", 10.0, 20.0),("...
```

Latest Reply
Tharun-Kumar
Databricks Employee
  • 1 kudos

@kll This error appears because the function `grid_pointascellid` expects a Double type column, and a Decimal column type was provided as input. To overcome this, before you apply `grid_pointascellid`, I would recommend casting the columns `lat` and `lon`...

3 More Replies
whleeman
by New Contributor III
  • 2033 Views
  • 1 replies
  • 3 kudos

Resolved! No "create catalog" option in workspace with metastore linked

I created a workspace and a metastore (following the long, tedious instructions) and assigned the workspace to the metastore. Under workspace -> Data, I can see the metastore link at the top left of the page. Through it I configured the permissions (giving me...

Latest Reply
whleeman
New Contributor III
  • 3 kudos

Answering my own question - all that is needed is to refresh the Data web page! 

vkuznetsov
by New Contributor III
  • 1439 Views
  • 1 replies
  • 0 kudos

Problem sharing a streaming table created in Delta Live Table via Delta Sharing

Hi all, I hope you can help me figure out what I am missing. I'm trying to do a simple thing: read the data from the data ingestion zone (CSV files saved to an Azure Storage Account) using a Delta Live Tables pipeline and share the resulting tab...

Latest Reply
vkuznetsov
New Contributor III
  • 0 kudos

Sorry, I think I've created the post in the wrong thread. Created the same post in the Community Cove.

nikhil018
by New Contributor II
  • 3331 Views
  • 2 replies
  • 0 kudos

Databricks Exam got suspended

Hi, I attended the Databricks Certified Associate Developer for Apache Spark 3.0 - Scala exam on 09 July 2023 (today). At 7.35pm I suddenly got a notice stating that, due to eye movement away from the exam, my exam was suspended. I had completed the exam and was at the...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @nikhil018  Thank you for reaching out!  Please submit a ticket to our Training Team here: https://help.databricks.com/s/contact-us?ReqType=training  and our team will get back to you shortly. 

1 More Replies
Jvonmolt
by New Contributor II
  • 3958 Views
  • 3 replies
  • 2 kudos

Can't use Partner Connect to FiveTran

Brand new Databricks account and workspace (Premium tier); new SQL Warehouse (Small, Pro); brand new FiveTran trial. I can't seem to use Partner Connect to connect to FiveTran in my trial. It appeared to work at first, and after attempting to sync with Stripe...

Latest Reply
Jvonmolt
New Contributor II
  • 2 kudos

Hi @Anonymous ,So far, I haven't been able to get it to work. However, I really appreciate @Prabakar's answer and your follow-up. I'm still interpreting @Prabakar's wise advice; however, I think it mostly doesn't apply to this situation, as I'm using...

2 More Replies
giladba
by New Contributor III
  • 9913 Views
  • 4 replies
  • 1 kudos

Service Principal to run Jobs that contain notebooks in Repos (GitHub)

Hi, Would appreciate your help in understanding how to set up Git credentials for a Service Principal running jobs that contain notebooks in Repos (GitHub), so that it will have access to these notebooks. These credentials should not have any depende...

Labels: GitHub, job, Repos, Service Principal
Latest Reply
Anonymous
Not applicable
  • 1 kudos

Hi @giladba  Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your...

3 More Replies
shivank25
by New Contributor II
  • 7336 Views
  • 3 replies
  • 0 kudos

Snowflake connection to databricks error

I am trying to connect to Snowflake using Databricks but am getting the below error: net.snowflake.client.jdbc.SnowflakeSQLException: JDBC driver encountered a communication error. Message: Exception encountered for HTTP request: Connect to xxx.region.sn...

Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @shivank25  Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers yo...

2 More Replies
gilo12
by New Contributor III
  • 3075 Views
  • 2 replies
  • 2 kudos

Bug - Cannot create table The associated location is not empty and also not a Delta table.

I am getting the error: Cannot create table ('`hive_metastore`.`MY_SCHEMA`.`MY_TABLE`'). The associated location ('dbfs:/user/hive/warehouse/my_schema.db/my_table') is not empty and also not a Delta table. When running drop table `hive_metastore`.`MY_S...

Latest Reply
Anonymous
Not applicable
  • 2 kudos

Hi @gilo12  Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best answers your ...

1 More Replies
joao_vnb
by New Contributor III
  • 10434 Views
  • 3 replies
  • 0 kudos

Give permissions to add comments on data explorer

Hi everyone! I'm facing this issue in the Databricks workspace. I do have permissions to add comments in Data Explorer, but some members of my team don't have those permissions. Where can I configure this and give them permission?

Latest Reply
Prabakar
Databricks Employee
  • 0 kudos

To grant the permissions on the same page that you have shared, click the Permissions tab and select Grant. Then, from the dropdown list, select the user or group and give them the respective permissions.

2 More Replies
pcbzmani
by New Contributor II
  • 11941 Views
  • 7 replies
  • 4 kudos

How to schedule job in workflow for every 30 days

Dear All, I want to schedule a job in Workflows to run every 30 days, but when I try the below cron expression it reports an invalid cron expression. Has anyone already implemented this? `0 0 */30 * *`

Latest Reply
Tharun-Kumar
Databricks Employee
  • 4 kudos

@pcbzmani Could you try `0 0 0 1/30 * ? *`? Databricks uses Quartz cron syntax, and we have to provide `1/30` for the day-of-month field to achieve a schedule of every 30 days.

6 More Replies
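One caveat worth knowing about the suggested expression: Quartz evaluates `start/step` increments within each calendar month, so `1/30` in the day-of-month field matches day 1 and (when the month has one) day 31, rather than a rolling 30-day interval. A small plain-Python sketch of how such a field expands, illustrative only (real Quartz supports many more forms such as `*`, `?`, `L`, `W`, and ranges):

```python
def quartz_dom_matches(field: str, days_in_month: int) -> list:
    """Expand a Quartz day-of-month 'start/step' field into matching days.

    Illustrative helper only; the name and signature are not part of any
    real Quartz or Databricks API.
    """
    start, step = (int(part) for part in field.split("/"))
    return list(range(start, days_in_month + 1, step))

# '1/30' fires on day 1, and again on day 31 in months that have one:
print(quartz_dom_matches("1/30", 31))  # [1, 31]
print(quartz_dom_matches("1/30", 30))  # [1]
```

So the schedule fires on the 1st of every month, plus the 31st in longer months, which is usually close enough to "every 30 days" for recurring jobs.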
qasimhassan
by Contributor
  • 3097 Views
  • 2 replies
  • 0 kudos

ADF & Databricks Pipeline Fails "Library installation failed for library due to user error"

I'm working on a small POC to create a data pipeline that gets triggered from ADF with some parameters from ADF, but my pipeline fails with the attached error: Operation on target Compute Daily Product Revenue failed: Databricks execution fai...

Labels: Azure Data Factory, Data Engineering, Databricks Pipelines
Latest Reply
Anonymous
Not applicable
  • 0 kudos

Hi @qasimhassan  Hope all is well! Just wanted to check in if you were able to resolve your issue and would you be happy to share the solution or mark an answer as best? Else please let us know if you need more help.  We'd love to hear from you. Than...

1 More Replies
databricks2k6
by New Contributor II
  • 2028 Views
  • 2 replies
  • 0 kudos

community edition

Is Community Edition available? I tried to log in but it keeps redirecting me to different pages.

Latest Reply
databricks2k6
New Contributor II
  • 0 kudos

Hi, please note that the Community discussion forum and Community Edition have completely different objectives, and you are referring to the community discussion forum here. Unlike the Databricks Free Trial, Community Edition doesn't require that you have your own cloud account or supply cloud compu...

1 More Replies
