Databricks Platform Discussions

Browse the Community

Activity in Databricks Platform Discussions

Iblouse
by Visitor
  • 18 Views
  • 0 replies
  • 0 kudos

Machine Learning Practitioner Learning Plan notebook demos

I am enrolled in the free version of the Machine Learning Practitioner Learning Plan, but I can't get the notebook demos to run on Databricks Community Edition. How can I work through the demo exercises for these courses? Is there an alternative?

AmnBrt
by Visitor
  • 27 Views
  • 0 replies
  • 0 kudos

"Databricks Accredited Lakehouse Fundamentals" Badge not received.

Hello, today I watched the tutorial videos and passed the knowledge test required to earn the "Databricks Accredited Lakehouse Fundamentals" badge. Instead, I received the "Certificate of Completion of Fundamentals of the Databricks Lakehouse P...

shadowinc
by Visitor
  • 46 Views
  • 0 replies
  • 0 kudos

Spark/Databricks temporary views and uuid()

Hi all, we have a table with an id column generated by uuid(). For ETL we use Databricks/Spark SQL temporary views. We observed strange behavior between a Databricks SQL temp view (CREATE OR REPLACE TEMPORARY VIEW) and a Spark SQL temp view (df.creat...

Data Engineering
Databricks SQL
spark sql
temporary views
uuid
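A minimal sketch of the behavior in question, assuming a notebook-style SparkSession named spark and illustrative view names: a temporary view stores a query plan, so a non-deterministic expression such as uuid() can be re-evaluated on every reference, while materializing the DataFrame (for example by caching or writing it out) tends to pin the generated ids.

from pyspark.sql import functions as F

# DataFrame whose id_col comes from the non-deterministic uuid() expression.
df = spark.range(3).withColumn("id_col", F.expr("uuid()"))
df.createOrReplaceTempView("tmp_ids")  # view name is illustrative

spark.sql("SELECT id_col FROM tmp_ids").show()  # one set of uuids
spark.sql("SELECT id_col FROM tmp_ids").show()  # may show a different set

# Materializing the data is one way to keep the ids stable between reads.
pinned = df.cache()
pinned.count()  # force evaluation
pinned.createOrReplaceTempView("tmp_ids_pinned")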
as999
by New Contributor III
  • 7392 Views
  • 8 replies
  • 4 kudos

Databricks Hive metastore location?

In Databricks, where is the Hive metastore located: in the control plane or the data plane? For prod systems, what security precautions should be taken to secure the Hive metastore?

Latest Reply
Prabakar
Esteemed Contributor III
  • 4 kudos

@as999 The default metastore is managed by Databricks. If you are concerned about security and would like to have your own metastore, you can go for the external metastore setup. You have the detailed steps in the doc below for setting up the external...
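For reference, a hedged sketch of the cluster Spark configuration keys the external Hive metastore docs describe; the metastore host, database, driver and credentials below are placeholders, not a definitive setup.

# Placeholder values only; in practice pull the credentials from a secret scope.
external_metastore_conf = {
    "spark.sql.hive.metastore.version": "3.1.0",
    "spark.sql.hive.metastore.jars": "maven",
    "spark.hadoop.javax.jdo.option.ConnectionURL": "jdbc:mysql://<metastore-host>:3306/<metastore-db>",
    "spark.hadoop.javax.jdo.option.ConnectionDriverName": "org.mariadb.jdbc.Driver",
    "spark.hadoop.javax.jdo.option.ConnectionUserName": "<metastore-user>",
    "spark.hadoop.javax.jdo.option.ConnectionPassword": "<metastore-password>",
}
# These keys would go into the cluster's Spark config (or a cluster policy).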

7 More Replies
MarkusFra
by New Contributor II
  • 998 Views
  • 3 replies
  • 0 kudos

Re-establish SparkSession using Databricks Connect after cluster restart

Hello, when developing locally with Databricks Connect, how do I re-establish the SparkSession after the cluster restarts? getOrCreate() seems to return the old, invalid SparkSession even after the cluster restart instead of creating a new one, or am I missing...

Data Engineering
databricks-connect
Latest Reply
Michael_Chein
  • 0 kudos

If anyone encounters this problem, the solution that worked for me was to restart the Jupyter kernel. 
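A minimal sketch of another workaround, assuming the classic pyspark-based databricks-connect client: probe the cached session and, if it is stale, stop it explicitly so that getOrCreate() builds a fresh one instead of handing back the old handle.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
try:
    spark.sql("SELECT 1").collect()  # cheap probe of the existing connection
except Exception:
    spark.stop()  # discard the session bound to the restarted cluster
    spark = SparkSession.builder.getOrCreate()  # create a new session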

2 More Replies
dbengineer516
by New Contributor
  • 115 Views
  • 1 reply
  • 0 kudos

/api/2.0/preview/sql/queries API only returning certain queries

Hello, when using /api/2.0/preview/sql/queries to list all available queries, I noticed that some queries were being shown while others were not. I did a small test in my home workspace, and it was able to recognize certain queries when I defin...

Latest Reply
brockb
New Contributor III
  • 0 kudos

Hi, how many queries were returned by the API call in question? The List Queries documentation describes this endpoint as supporting pagination with a default page size of 25; is that how many you saw returned? Query parameters: page_size integer <= 10...
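A hedged sketch of paging through this endpoint, following the page/page_size parameters and results/count response fields the reply points at; the workspace URL and token are placeholders, and the exact field names should be checked against the List Queries documentation.

import requests

HOST = "https://<workspace-url>"   # placeholder
TOKEN = "<personal-access-token>"  # placeholder
headers = {"Authorization": f"Bearer {TOKEN}"}

queries, page, page_size = [], 1, 100
while True:
    resp = requests.get(
        f"{HOST}/api/2.0/preview/sql/queries",
        headers=headers,
        params={"page": page, "page_size": page_size},
    )
    resp.raise_for_status()
    body = resp.json()
    queries.extend(body.get("results", []))
    if page * page_size >= body.get("count", 0):
        break
    page += 1

print(f"fetched {len(queries)} queries")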

prabhu26
by New Contributor
  • 89 Views
  • 1 reply
  • 0 kudos

Unable to enforce schema on data read from a JSONL file in Azure Databricks using PySpark

I'm trying to build an ETL pipeline in which I read JSONL files from Azure Blob Storage, then transform and load them into Delta tables in Databricks. I have created the schema below for loading my data: schema = StructType([ S...

Latest Reply
DataEngineer
New Contributor II
  • 0 kudos

Try this: add option("multiline", "true") to the read.
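A minimal sketch of that suggestion, with an illustrative schema and storage path: enforce the schema at read time and add the multiline option when individual JSON records span several physical lines (for strict one-record-per-line JSONL it can stay false).

from pyspark.sql.types import StructType, StructField, StringType, IntegerType

# Hypothetical schema and path; replace with the real ones.
schema = StructType([
    StructField("id", IntegerType(), True),
    StructField("name", StringType(), True),
])

df = (
    spark.read
    .schema(schema)               # enforce the schema instead of inferring it
    .option("multiline", "true")  # per the reply above
    .option("mode", "FAILFAST")   # fail loudly on records that do not match
    .json("abfss://<container>@<account>.dfs.core.windows.net/<path>/")
)
df.printSchema()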

mh_db
by New Contributor II
  • 74 Views
  • 0 replies
  • 0 kudos

Unable to connect to Oracle server from Databricks notebook in AWS

I'm trying to connect to an Oracle server hosted in Azure from an AWS Databricks notebook, but the connection keeps timing out. I tested the connection IP using the telnet <hostIP> 1521 command from another EC2 instance, and that seems to reach the Oracle ...

Data Engineering
AWS
oracle
TCP
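For context, a hedged sketch of the JDBC read once the network path is open; a timeout like this usually points at security groups, NSGs or firewalls rather than Spark itself. Host, service name, credentials and table name are placeholders, and the Oracle JDBC driver jar must be installed on the cluster.

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//<host>:1521/<service_name>")
    .option("dbtable", "<schema>.<table>")
    .option("user", "<user>")
    .option("password", "<password>")  # prefer a secret scope over a literal
    .option("driver", "oracle.jdbc.driver.OracleDriver")
    .load()
)
df.show(5)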
DataEngineer
by New Contributor II
  • 59 Views
  • 0 replies
  • 0 kudos

AWS email sending challenge from Databricks with Unity Catalog and multi-node cluster

Hi, I have implemented Unity Catalog with a multi-node cluster in Databricks. The workspace instance profile with EC2 access is also created in IAM, but I am still having trouble sending emails from Databricks using the SES service. The same is working ...
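A hedged sketch of the SES call itself, assuming boto3 picks up the instance profile credentials on the cluster (worth verifying on Unity Catalog clusters, where the access mode can affect which credentials user code sees); the region and addresses are placeholders, and the sender must be verified in SES.

import boto3

ses = boto3.client("ses", region_name="<aws-region>")  # placeholder region
response = ses.send_email(
    Source="<verified-sender@example.com>",
    Destination={"ToAddresses": ["<recipient@example.com>"]},
    Message={
        "Subject": {"Data": "Databricks job notification"},
        "Body": {"Text": {"Data": "Hello from a Databricks notebook"}},
    },
)
print(response["MessageId"])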

jv_v
by New Contributor
  • 80 Views
  • 0 replies
  • 0 kudos

Issue with "databricks metastores list" Command - Only One Metastore Listed

I have encountered an issue with the Databricks CLI command databricks metastores list. As per our account setup, we have three metastores configured. However, when I run the command, it only returns information for one metastore instead of listing all three....

MarkD
by New Contributor II
  • 430 Views
  • 8 replies
  • 0 kudos

SET configuration in SQL DLT pipeline does not work

Hi, I'm trying to set a dynamic value to use in a DLT query, and the code from the example documentation does not work: SET startDate='2020-01-01'; CREATE OR REFRESH LIVE TABLE filtered AS SELECT * FROM my_table WHERE created_at > ${startDate}; It is g...

Data Engineering
Delta Live Tables
dlt
sql
Latest Reply
Hkesharwani
Contributor
  • 0 kudos

Hi @MarkD, you may use set variable_name.var = '1900-01-01' to set the value of a variable, and reference ${variable_name.var} to use it. Example: set automated_date.var = '1800-01-01'; select * from my_table where date = CAST(${autom...
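The reply's pattern in a standalone form, with a hypothetical table and column: the quotes included in the SET value carry through the ${...} substitution, which is what lets CAST see a string literal. In a SQL-only DLT pipeline the same two statements would appear directly rather than through spark.sql.

# Set the value once; the dotted name becomes a SQL conf key.
spark.sql("SET automated_date.var = '1900-01-01'")

# ${automated_date.var} is substituted before parsing (table/column are hypothetical).
df = spark.sql("""
    SELECT *
    FROM my_table
    WHERE created_at > CAST(${automated_date.var} AS DATE)
""")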

7 More Replies
pshuk
by New Contributor III
  • 150 Views
  • 2 replies
  • 1 kudos

Upload a file/table to a Delta table using the CLI

Hi, I am using the CLI to transfer local files to a Databricks Volume. At the end of my upload, I want to create a meta table (storing file name, location, and some other information) and have it as a table on the Databricks Volume. I am not sure how to create ...

Latest Reply
Ayushi_Suthar
Honored Contributor
  • 1 kudos

Hi @pshuk, greetings! We understand that you are looking for a CLI command to create a table, but at this moment Databricks doesn't support a CLI command to create tables. However, you can use the SQL Execution API - https://docs.databricks.com/api/workspace/...
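A hedged sketch of that workaround: after the CLI upload, issue the CREATE TABLE through the SQL Statement Execution API. The workspace URL, token, warehouse id and the table definition are placeholders.

import requests

HOST = "https://<workspace-url>"   # placeholder
TOKEN = "<personal-access-token>"  # placeholder

statement = """
CREATE TABLE IF NOT EXISTS <catalog>.<schema>.uploaded_files_meta (
  file_name STRING,
  volume_path STRING,
  uploaded_at TIMESTAMP
)
"""

resp = requests.post(
    f"{HOST}/api/2.0/sql/statements/",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"warehouse_id": "<warehouse-id>", "statement": statement},
)
resp.raise_for_status()
print(resp.json().get("status"))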

1 More Replies
jv_v
by New Contributor
  • 355 Views
  • 1 reply
  • 0 kudos

ERROR: cannot create Metastore - has reached the limit for Metastores in region - need assistance

Hi Databricks Community, I encountered an error while attempting to create a metastore in Databricks. The error message is as follows: Error: cannot create metastore: This account with id ******************** has reached the limit for Metastor...

Latest Reply
Hkesharwani
Contributor
  • 0 kudos

Hi, as per the documentation, we can create only one metastore per region per account: https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/create-metastore. Although Databricks can make an exception and allow you to create ...

marvin1
by New Contributor III
  • 76 Views
  • 2 replies
  • 0 kudos

Hostname redaction in Delta table

I am ingesting job-cluster failure notifications that we send to OpsGenie into a Delta table to automate the creation and tracking of Jira tickets. The alert notification includes the job run URL, which we use to quickly respond to job failures. Ho...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @marvin1, the redaction of the hostname in the job run URL during ingestion into a Delta table might be due to security or privacy settings in your data ingestion pipeline. You might refer to the Delta Lake documentation for best practices and...

1 More Replies
JOFinancial
by New Contributor
  • 57 Views
  • 1 reply
  • 0 kudos

No Data for External Table from Blob Storage

Hi all, I am trying to create an external table from an Azure Blob Storage container. I receive no errors, but there is no data in the table. The Blob Storage container holds 4 CSV files with the same columns and about 10k rows of data. Am I missing someth...

Latest Reply
Hkesharwani
Contributor
  • 0 kudos

Hi, the code looks completely fine. Please check whether the files use a delimiter other than ','. If your CSV files use a different delimiter, you can specify it in the table definition using the OPTIONS clause. Just to confirm, I created a sample table a...
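An illustrative version of that suggestion, with placeholder names and path: declare the header and delimiter explicitly in the OPTIONS clause when the files are not comma-separated.

spark.sql("""
    CREATE TABLE IF NOT EXISTS <catalog>.<schema>.blob_csv_table
    USING CSV
    OPTIONS (header "true", delimiter ";")
    LOCATION 'abfss://<container>@<account>.dfs.core.windows.net/<folder>/'
""")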
