Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Kaviana
by New Contributor III
  • 2590 Views
  • 1 replies
  • 0 kudos

Connecting an AWS VPC in Databricks to extract data from on-premises Oracle

 Hi, please help me. How can I use a VPC that is already attached to the Databricks network to extract data from an on-premises server that responds to ping and is reachable?

Latest Reply
Kaviana
New Contributor III
  • 0 kudos

Hello @Retired_mod, I have already configured the private endpoint links and the linked VPC in Databricks under "Cloud resources" > "Network". How can I connect to Oracle using the EC2 virtual machine? Thank you.
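For reference, once the private endpoints and VPC give the cluster a network path to the Oracle host, the data itself is usually pulled with Spark's JDBC reader from a notebook. Below is a minimal sketch, assuming a hypothetical host, service name, table, and secret scope, and that the Oracle JDBC driver is installed on the cluster:

```python
# Minimal sketch (Databricks notebook): read an on-premises Oracle table over JDBC.
# The host, port, service name, table, and secret scope below are hypothetical placeholders.
oracle_url = "jdbc:oracle:thin:@//10.20.30.40:1521/ORCLPDB1"  # on-prem host reachable through the VPC

df = (
    spark.read.format("jdbc")
    .option("url", oracle_url)
    .option("dbtable", "SALES.ORDERS")                                 # hypothetical schema.table
    .option("user", dbutils.secrets.get("oracle-scope", "user"))       # credentials kept in a secret scope
    .option("password", dbutils.secrets.get("oracle-scope", "password"))
    .option("driver", "oracle.jdbc.OracleDriver")
    .load()
)

display(df.limit(10))
```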

simran_27
by New Contributor
  • 1211 Views
  • 1 replies
  • 0 kudos

Error while attempting to take the Lakehouse Fundamentals exam

I would like to know why I am getting this error when I try to earn badges for Lakehouse Fundamentals. I can't access the quiz page; it returns a 403: Forbidden error. Can you please help with this?

Latest Reply
APadmanabhan
Databricks Employee
  • 0 kudos

Hi @simran_27, could you please try using the link 

ushnish_98
by New Contributor III
  • 2046 Views
  • 2 replies
  • 0 kudos

Resolved! Databricks Certificate

I cleared the Databricks Associate-level certification on 25th September, but I am yet to receive my certificate from Databricks. I raised a ticket for this but got no response from the support team.

Latest Reply
APadmanabhan
Databricks Employee
  • 0 kudos

Hello there, can you share the ticket number? Note that we have a high influx of cases; the team is actively working on clearing the backlog, and you should receive a reply within the next 48 hours. I appreciate your patience.

1 More Replies
rajib_bahar_ptg
by New Contributor III
  • 5114 Views
  • 5 replies
  • 5 kudos

Databricks workspace unstable

Our company's Databricks workspace has been unstable lately; it can't launch any compute clusters. I have never seen this issue before. In addition, I have seen a storage credential error on the main Unity Catalog. Why would this happen on AWS Databricks ins...

Latest Reply
rajib_bahar_ptg
New Contributor III
  • 5 kudos

Hello @Retired_mod and @jose_gonzalez, I couldn't locate the support ticket we opened. How can we track that ticket down? It came from the peopletech.com domain. If it is more efficient to create another ticket, please let me know. Let us know the UR...

4 More Replies
Sujitha
by Databricks Employee
  • 2562 Views
  • 1 replies
  • 6 kudos

Webinar: Accelerate Data and AI Projects With Databricks Notebooks

Register now: October 24, 2023 | 8:00 AM PT. Use new capabilities in Databricks Notebooks to speed up innovation. This webinar will walk you through the features that are designed to take the manual effort and delays out of building and deploying dat...

Latest Reply
Ajay-Pandey
Esteemed Contributor III
  • 6 kudos

This will help Databricks users speed up development.

gideont
by New Contributor III
  • 5509 Views
  • 3 replies
  • 3 kudos

Application extracting Data from Unity Catalogue

Dear Databricks community, I'm seeking advice on the best method for applications to extract data from Unity Catalog. One suggested approach is to use JDBC, but there seems to be a dilemma. Although using a job cluster has been recommended due t...

Latest Reply
-werners-
Esteemed Contributor III
  • 3 kudos

What exactly do you mean by 'extracting'? If you want to load tables defined in Unity Catalog into a database, I would indeed do this using job clusters and a notebook. If you want to extract some data once in a while into a CSV, for example, you could perfectly do...
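To make the occasional-extract option concrete, a Unity Catalog table can be written out to CSV from any notebook. A minimal sketch, with a hypothetical catalog, schema, table, and volume path:

```python
# Minimal sketch (Databricks notebook): export a Unity Catalog table to CSV once in a while.
# The three-level table name and the output volume path are hypothetical placeholders.
df = spark.table("main.sales.orders")

(
    df.coalesce(1)                      # single output file, fine for an occasional extract
      .write.mode("overwrite")
      .option("header", True)
      .csv("/Volumes/main/sales/exports/orders_csv")
)
```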

2 More Replies
JohnSmith2
by New Contributor II
  • 4776 Views
  • 4 replies
  • 2 kudos

Resolved! Error on Workflow

Hi, I have a mysterious situation here. My workflow (job) ran and got an error -> [INVALID_IDENTIFIER] The identifier transactions-catalog is invalid. Please, consider quoting it with back-quotes as `transactions-catalog`.(line 1, pos 12) == SQL ==...
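For context, the error message already points at the fix: Spark SQL accepts hyphenated identifiers only when they are back-quoted. A minimal sketch, with a hypothetical schema and table name:

```python
# Minimal sketch: a catalog name containing a hyphen must be wrapped in backticks in Spark SQL.
# The schema and table names are hypothetical placeholders.

# Fails with [INVALID_IDENTIFIER] because the parser stops at the hyphen:
# spark.sql("SELECT * FROM transactions-catalog.finance.payments")

# Works once the hyphenated catalog name is back-quoted:
df = spark.sql("SELECT * FROM `transactions-catalog`.finance.payments")
```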

Latest Reply
-werners-
Esteemed Contributor III
  • 2 kudos

Jobs are just notebooks executed in the background, so if the notebook is the same between an interactive (manual) run and a job run, there should be no difference. So I don't see what is wrong. Is the job using DLT perhaps?

3 More Replies
DBEnthusiast
by New Contributor III
  • 5650 Views
  • 1 replies
  • 1 kudos

DataBricks Cluster

Hi all, I am curious to know the difference between a Spark cluster and a Databricks one. From what I have read, a Spark cluster creates the driver and workers when the application is submitted, whereas in Databricks we can create a cluster in advance in c...

Mohan2
by New Contributor
  • 4554 Views
  • 0 replies
  • 0 kudos

SQL Warehouse - several issues

Hi there, I am facing several issues while trying to run the starter SQL warehouse on Azure Databricks. Please note I am new to this data world, Azure, and Databricks. While starting the starter SQL warehouse in the Databricks trial version, I am getting these ...

eimis_pacheco
by Contributor
  • 5145 Views
  • 2 replies
  • 2 kudos

Resolved! Is it not needed to preserve the data in its original format anymore with the usage of medallion?

Hi Community, I have a doubt; the bronze layer always causes confusion for me. Someone mentioned, "File Format: Store data in Delta Lake format to leverage its performance, ACID transactions, and schema evolution capabilities" for bronze layers. Then, ...
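For what it's worth, a common compromise is to keep the landed source files untouched and load the same records, unchanged, into a bronze Delta table. A minimal sketch, with a hypothetical landing path and table name:

```python
# Minimal sketch (Databricks notebook): raw JSON stays as-is in its landing location, while a
# bronze Delta table holds the same records plus light ingestion metadata.
# The landing path and table name are hypothetical placeholders.
from pyspark.sql import functions as F

raw = spark.read.json("/Volumes/main/landing/events/")

(
    raw.withColumn("_ingested_at", F.current_timestamp())
       .withColumn("_source_file", F.input_file_name())
       .write.format("delta")
       .mode("append")
       .saveAsTable("main.bronze.events")
)
```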

vivek2612
by New Contributor II
  • 10331 Views
  • 5 replies
  • 0 kudos

Exam got suspended Databricks Certified Data Engineer Associate exam

Hi team, my Databricks Certified Data Engineer Associate exam got suspended within 20 minutes. It was suspended due to eye movement, without any warning, even though I was not moving my eyes away from the laptop screen. Some questions in the exam are so long that I...

Latest Reply
Cert-Team
Databricks Employee
  • 0 kudos

@rajib_bahar_ptg funny, not funny, right?! I just posted that tip today in this post: https://community.databricks.com/t5/certifications/minimize-the-chances-of-your-exam-getting-suspended-tip-3/td-p/45712

4 More Replies
Mbinyala
by New Contributor II
  • 7848 Views
  • 6 replies
  • 3 kudos

Cluster policy not showing while creating delta live table pipeline

Hi all! I have created a cluster policy, but when I want to use it while creating a DLT pipeline, it shows none. I have checked that I have all the necessary permissions to create cluster policies. Still, in the DLT UI it is showing none.
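If it helps, one thing worth checking is whether the policy definition allows the dlt cluster type; as far as I recall, the pipeline UI only offers policies that do. A minimal sketch using the Cluster Policies API, with a hypothetical workspace URL, token, policy name, and node types:

```python
# Minimal sketch: create a cluster policy that pins cluster_type to "dlt" so it can be selected
# for Delta Live Tables pipelines. Workspace URL, token, and node types are hypothetical placeholders.
import json
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"

definition = {
    "cluster_type": {"type": "fixed", "value": "dlt"},                            # scope the policy to DLT clusters
    "node_type_id": {"type": "allowlist", "values": ["m5.xlarge", "m5.2xlarge"]},
}

resp = requests.post(
    f"{HOST}/api/2.0/policies/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"name": "dlt-pipeline-policy", "definition": json.dumps(definition)},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # returns the new policy_id
```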

Latest Reply
Rishitha
New Contributor III
  • 3 kudos

@btafur  Can we also set the auto_terminate minutes with the policy?  (for the dlt cluster type)

5 More Replies
RithwikMR
by New Contributor
  • 1688 Views
  • 1 replies
  • 0 kudos

How to automate creating notebooks when I have multiple .html or .py files

Hi all, I have 50+ .html and .py files, and I have to create a separate notebook for each and every one of them. Manually creating a notebook through the UI and importing each .html/.py file is a bit tedious and time-consuming. Is t...

Latest Reply
btafur
Databricks Employee
  • 0 kudos

Depending on your use case and requirements, one alternative would be to create a script that loops through your files and uploads them using the API. You can find more information about the API here: https://docs.databricks.com/api/workspace/workspa...
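As a rough sketch of that loop, assuming a hypothetical local folder, target workspace folder (created beforehand), host, and personal access token:

```python
# Minimal sketch: bulk-import local .py and .html files as notebooks via the Workspace API
# (POST /api/2.0/workspace/import). Host, token, and folder paths are hypothetical placeholders.
import base64
import pathlib
import requests

HOST = "https://<your-workspace>.cloud.databricks.com"
TOKEN = "<personal-access-token>"
LOCAL_DIR = pathlib.Path("./notebooks_to_import")
TARGET_DIR = "/Users/someone@example.com/imported"  # create it first, e.g. via /api/2.0/workspace/mkdirs

for path in sorted(LOCAL_DIR.glob("*.py")) + sorted(LOCAL_DIR.glob("*.html")):
    payload = {
        "path": f"{TARGET_DIR}/{path.stem}",
        "format": "SOURCE" if path.suffix == ".py" else "HTML",
        "content": base64.b64encode(path.read_bytes()).decode(),
        "overwrite": True,
    }
    if path.suffix == ".py":
        payload["language"] = "PYTHON"
    resp = requests.post(
        f"{HOST}/api/2.0/workspace/import",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json=payload,
        timeout=60,
    )
    resp.raise_for_status()
    print(f"Imported {path.name} -> {payload['path']}")
```

The same loop could also be written with the Databricks Python SDK's workspace import call instead of raw REST requests, if the SDK is already available in your environment.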

