Community Discussions
Connect with fellow community members to discuss general topics related to the Databricks platform, industry trends, and best practices. Share experiences, ask questions, and foster collaboration within the community.

Forum Posts

elgeo
by Valued Contributor II
  • 2879 Views
  • 0 replies
  • 0 kudos

Databricks job runs - metadata tables

Hello experts, do you know if there are any metadata/log tables available with information on Databricks job runs, as an alternative to the Workflows UI? Thank you!
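No reply is recorded here, but for reference, run metadata can also be pulled programmatically from the Jobs API. Below is a minimal sketch, assuming a workspace URL and personal access token (both placeholders, not values from this thread):

# Hedged sketch: list recent job runs via the Jobs API 2.1 instead of the Workflows UI.
import requests

host = "https://<your-workspace>.cloud.databricks.com"   # placeholder
token = "<personal-access-token>"                        # placeholder

resp = requests.get(
    f"{host}/api/2.1/jobs/runs/list",
    headers={"Authorization": f"Bearer {token}"},
    params={"limit": 25},
)
resp.raise_for_status()
for run in resp.json().get("runs", []):
    print(run["run_id"], run["state"]["life_cycle_state"], run["state"].get("result_state"))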

Kaviana
by New Contributor III
  • 1416 Views
  • 2 replies
  • 0 kudos

How to connect to an on-premises Oracle database from Databricks to extract data

Hello, how do I connect Databricks Enterprise to an on-premises Oracle database, and what permissions are necessary? Thank you.

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Kaviana, you can connect Databricks Enterprise to an on-premises Oracle database using the cx_Oracle Python module. To install the Oracle Client libraries, follow these steps: download the Oracle Instant Client Basic Light Package, then unzip the c...
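For illustration, a minimal sketch of such a connection, assuming the Instant Client has been unzipped to a cluster-visible path and that network connectivity (VPN or peering) to the on-premises host exists. The library path, host, service name, query, and credentials below are all placeholders:

# Hedged sketch: query an on-premises Oracle database from a Databricks notebook with cx_Oracle.
import cx_Oracle

# Point the driver at the unzipped Instant Client libraries (path is an assumption).
cx_Oracle.init_oracle_client(lib_dir="/dbfs/databricks/instantclient")

dsn = cx_Oracle.makedsn("oracle-host.example.com", 1521, service_name="ORCLPDB1")
with cx_Oracle.connect(user="app_user", password="app_password", dsn=dsn) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT COUNT(*) FROM employees")   # placeholder query
        print(cur.fetchone())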

1 More Replies
rendorHaevyn
by New Contributor III
  • 250 Views
  • 0 replies
  • 0 kudos

Using map_filter / filter to return only map elements for a given predicate

I want to apply a filter to a map structure (on a column called "ActivityMap") for elements only where a given predicate holds. Representative data is below. Applying both "map_filter" and "array_contains" will return rows where a predicate holds, ho...
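The excerpt is cut off, but for reference, map_filter keeps only the entries of a map column for which a predicate holds, as in this minimal sketch (the column name comes from the post; the sample data is invented):

# Hedged sketch: keep only map entries whose value satisfies the predicate.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame(
    [({"click": 2, "purchase": 9},), ({"view": 1},)],
    ["ActivityMap"],
)
# map_filter returns a new map containing only entries for which the lambda is true.
df.select(F.map_filter("ActivityMap", lambda k, v: v > 5).alias("ActivityMap")).show(truncate=False)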

Community Discussions
filter
map_filter
MapType
pyspark
Algograp_admin
by New Contributor
  • 841 Views
  • 1 replies
  • 0 kudos

How can I resolve a problem between AWS and Databricks platform?

This is Algograp Co., Ltd., a partner of Databricks. There was a problem in the process of subscribing to Databricks and linking our AWS account. I had a problem using my existing platform and canceled my subscription. Account activation is not possible du...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Algograp_admin, it seems like there are a few potential issues that could be causing the problem with your Databricks subscription and AWS account linkage. Here are some steps you can take to troubleshoot and potentially resolve the issue: 1. ...

Brant_Seibert_V
by New Contributor II
  • 5704 Views
  • 2 replies
  • 1 kudos

Resolved! DLT use case

Is Delta Live Tables (DLT) appropriate for data in the millions of rows and GBs in size, or is DLT only optimal for larger data with billions of rows and TBs in size? Please consider the Total Cost of Ownership. Development costs (engineering time). O...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @Brant_Seibert_V, thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best ans...

1 More Replies
rajib_bahar_ptg
by New Contributor III
  • 771 Views
  • 1 replies
  • 1 kudos

Questions about the new Serverless SQL

I have a list of questions about the Serverless SQL option. As per the docs, it is hosted by Databricks. Is there any exception to that? Will it ever create EC2 instances on AWS for the serverless option? Are any serverless assets stored on custo...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @rajib_bahar_ptg, Serverless SQL is hosted by Databricks: the compute runs in Databricks' own cloud account, so it does not create EC2 instances in your AWS account. The serverless SQL warehouses are managed by Databricks, not by the cus...

chenqian0562
by New Contributor II
  • 2187 Views
  • 4 replies
  • 5 kudos

Resolved! Can we apply multiple exam discount voucher to one exam?

I have a 75%-off certification exam voucher tied to my work email, and I also noticed that you can redeem a 25%-off voucher from the rewards store. I am wondering if you can apply these 2 vouchers together at the same time so that you can get 1 exam 1...

Latest Reply
xxjordan
New Contributor II
  • 5 kudos

Hi chenqian0562, could you please share the solution with the community? I have the same issue.

3 More Replies
adrianhernandez
by New Contributor III
  • 3246 Views
  • 2 replies
  • 2 kudos

Set default database through Cluster Spark Configuration

Set the default catalog (AKA default SQL database) in a cluster's Spark configuration. I've tried the following: spark.catalog.setCurrentDatabase("cbp_reporting_gold_preprod") - this works in a notebook but doesn't do anything in the cluster. spark.sq...

Latest Reply
adrianhernandez
New Contributor III
  • 2 kudos

I've tried different commands in the cluster's Spark config and none work: they execute at cluster startup without any errors in the logs, but once you run a notebook attached to the cluster, the default catalog is still set to 'default'.
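For anyone comparing notes, these are the session-scoped forms that do work from a notebook, shown as a minimal sketch (the database name is taken from the thread); whether an equivalent cluster-level Spark config key exists is exactly what this thread leaves unresolved:

# Hedged sketch: set the current database for the active Spark session.
# Runs in a Databricks notebook, where `spark` is predefined; session scope only.
spark.sql("USE cbp_reporting_gold_preprod")
# Equivalent API form:
spark.catalog.setCurrentDatabase("cbp_reporting_gold_preprod")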

1 More Replies
Wayne
by New Contributor III
  • 2044 Views
  • 4 replies
  • 1 kudos

Resolved! My exam was suspended by an unprofessional proctor - need help to reschedule

Case #00378268. I passed the Spark Developer Associate exam about 6 months ago with a great experience. However, this time the proctor did not even bother to show up to start the exam - checking ID, the room, and the surroundings. Somehow, I was able to st...

Community Discussions
Certification
Databricks Certified Data Engineer Associate
Latest Reply
Wayne
New Contributor III
  • 1 kudos

That will be great.

3 More Replies
jl1
by New Contributor
  • 876 Views
  • 1 replies
  • 0 kudos

"command complete" but not executed

I have a notebook that calls other notebooks with `dbutils.notebook.run` and executes them as a 'Notebook job'. But sometimes, when a notebook is taking a long time and the cluster is just waiting for, for instance, an API response, the subsequent comm...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @jl1, based on the provided information, it seems like your issue might be related to the timeout settings of the dbutils.notebook.run command or the long waiting time for API responses. The dbutils.notebook.run command has a timeoutSeconds param...
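For reference, a minimal sketch of the call with that timeout made explicit (the notebook path, timeout value, and arguments are placeholders):

# Hedged sketch: run a child notebook with an explicit timeout.
# Runs in a Databricks notebook, where `dbutils` is predefined.
result = dbutils.notebook.run(
    "/Shared/child_notebook",      # placeholder path to the child notebook
    3600,                          # timeoutSeconds: fail if the child exceeds 1 hour
    {"run_date": "2023-10-01"},    # placeholder arguments passed as widgets
)
print(result)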

Grace121
by New Contributor
  • 452 Views
  • 1 replies
  • 0 kudos

Deployment of Databricks Artifact Allowlist in Terraform

I would like to inquire about the deployment schedule for the Databricks artifact allowlist within Terraform.

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @Grace121, please raise a support ticket with us for this query.

shihs
by New Contributor
  • 1085 Views
  • 1 replies
  • 1 kudos

Cannot read data from GCS

I am trying to use Databricks to read data on Google Cloud Storage (GCS) with Databricks on Google Cloud. I followed the steps from https://docs.gcp.databricks.com/storage/gcs.html. I have tried Access GCS buckets using Google Cloud service accounts o...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @shihs, check your service account permissions: ensure the service account has "storage.objects.get" permission for the GCS bucket, and add the "Storage Object Viewer" role to your service account via the GCS console. Also consider external locations: Da...
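Once those permissions are in place and the service account is attached to the cluster, the read itself is a plain gs:// load, as in this minimal sketch (bucket and object path are placeholders):

# Hedged sketch: read a CSV from GCS on a cluster whose service account has
# the Storage Object Viewer role on the bucket. Runs in a Databricks notebook,
# where `spark` is predefined.
df = spark.read.format("csv").option("header", "true").load("gs://my-bucket/path/data.csv")
df.show(5)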

Ashlaiyna
by New Contributor
  • 486 Views
  • 1 replies
  • 1 kudos

Init script upload issue as file format in workspace using databricks cli

I am trying to import an init script from local to a workspace location using the Databricks CLI via a YAML pipeline, but it is getting uploaded as a notebook. I need to upload it in file format using a CLI command, since a workspace init script must be a file. Does anyo...

Latest Reply
Kaniz
Community Manager
  • 1 kudos

Hi @Ashlaiyna, to upload a file to the Databricks workspace using the Databricks CLI, you can use the databricks workspace import command. However, since you want to upload it as a file and not as a notebook, you need to specify the --format option as SOUR...
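The reply is cut off; as a cross-check, the REST endpoint that the CLI wraps can perform the same import, and (an assumption worth verifying against your CLI version) format AUTO imports non-notebook content such as a .sh script as a workspace file. A minimal sketch with placeholder host, token, and paths:

# Hedged sketch: upload an init script as a workspace *file* (not a notebook)
# via the Workspace API. Host, token, and paths are placeholders.
import base64
import requests

host = "https://<your-workspace>.cloud.databricks.com"
token = "<personal-access-token>"

with open("init.sh", "rb") as f:
    content = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"{host}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "path": "/Shared/init-scripts/init.sh",
        "format": "AUTO",   # assumption: .sh content is stored as a file, not a notebook
        "content": content,
        "overwrite": True,
    },
)
resp.raise_for_status()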

eva_mcmf
by New Contributor II
  • 1989 Views
  • 1 replies
  • 2 kudos

Resolved! Very large binary files ingestion error when using binaryFile reader

Hello, I am facing an error while trying to read a large binary file (rosbag format) using the binaryFile reader. The file I am trying to read is approximately 7 GB. Here's the error message I am getting: FileReadException: Error while reading file dbfs:/mn...

Latest Reply
Kaniz
Community Manager
  • 2 kudos

Hi @eva_mcmf, The error you're encountering is because the size of your binary file exceeds the maximum allowable length in Spark, which is 2147483647 bytes or approximately 2GB. The file you're trying to read is about 7GB, well beyond this limit.  U...
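One common workaround under that limit is to stream the file in chunks with plain Python through the /dbfs fuse mount instead of materializing it as a single binaryFile row; a minimal sketch (the path and chunk size are placeholders):

# Hedged sketch: process a ~7 GB binary file in fixed-size chunks instead of
# loading it as one >2 GB Spark value. Path and chunk size are placeholders.
CHUNK_SIZE = 64 * 1024 * 1024  # 64 MB per read

total = 0
with open("/dbfs/mnt/data/recording.bag", "rb") as f:
    while True:
        chunk = f.read(CHUNK_SIZE)
        if not chunk:
            break
        total += len(chunk)  # replace with real per-chunk processing
print(f"Read {total} bytes")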

pranav2
by New Contributor II
  • 1800 Views
  • 4 replies
  • 0 kudos

How to Create a DataBricks Notebook using API

import requests
import json

# Databricks workspace API URL
databricks_url = "https://dbc-ab846cbe-f48b.cloud.databricks.com/api/2.0/workspace/import"

# Databricks API token (generate one from your Databricks account)
databricks_token = "xxxxxxxxxxxxxxxxxx...

Latest Reply
Kaniz
Community Manager
  • 0 kudos

Hi @pranav2, the error message Failed to create notebook: 401 Unauthorized indicates that the server is not recognizing your credentials. This could be due to various reasons, including an incorrect or expired token or a lack of necessary permissions...
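As a quick sanity check before retrying the import, one can hit any authenticated read endpoint with the same header; a 200 means the token itself is fine, while a 401 means it is not. A minimal sketch (the token is a placeholder; the host is the one from the post):

# Hedged sketch: verify the personal access token before the workspace import call.
import requests

host = "https://dbc-ab846cbe-f48b.cloud.databricks.com"
token = "<personal-access-token>"  # placeholder; generate in User Settings

# Any authenticated GET works as a probe; clusters/list is a read-only example.
check = requests.get(
    f"{host}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
)
print(check.status_code)  # 200 = token accepted, 401 = bad/expired token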

3 More Replies