Databricks job runs - metadata tables
Hello experts, do you know if there are available metadata/log tables with information on Databricks job runs, instead of using the Workflows UI? Thank you!
Hello, how do I connect Databricks Enterprise to an on-premises Oracle database, and what permissions are necessary? Thank you
Hi @Kaviana, you can connect Databricks Enterprise to an on-premises Oracle database using the cx_Oracle Python module.
- To install the Oracle Client libraries, follow these steps:
- Download the Oracle Instant Client Basic Light Package.
- Unzip the c...
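Alongside cx_Oracle, a common pattern on Databricks is reading Oracle through Spark's JDBC source. Below is a minimal sketch, not a definitive setup: the hostname, port, service name, credentials, and table are all placeholders, and the Oracle JDBC driver would need to be installed on the cluster.

```python
# Sketch: reading an on-premises Oracle table from Databricks over JDBC.
# Host, port, service name, credentials, and table are placeholders.
def oracle_jdbc_options(host, port, service, user, password, table):
    """Build the options dict for spark.read.format("jdbc")."""
    return {
        "url": f"jdbc:oracle:thin:@//{host}:{port}/{service}",
        "dbtable": table,
        "user": user,
        "password": password,
        "driver": "oracle.jdbc.driver.OracleDriver",
    }

opts = oracle_jdbc_options("oracle.example.com", 1521, "ORCLPDB1",
                           "scott", "tiger", "HR.EMPLOYEES")
print(opts["url"])  # jdbc:oracle:thin:@//oracle.example.com:1521/ORCLPDB1

# On a Databricks cluster (not runnable here) you would then do:
# df = spark.read.format("jdbc").options(**opts).load()
```

On the permissions side, the Oracle user needs at least SELECT on the target tables, and the network path (VPN or private link) must allow the cluster to reach the Oracle listener port.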
I want to apply a filter to a map structure (on a column called "ActivityMap") for elements only where a given predicate holds. Representative data is below. Applying both "map_filter" and "array_contains" will return rows where a predicate holds, ho...
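Spark's map_filter keeps only the map entries whose (key, value) predicate holds, without touching the other rows. The predicate logic can be checked in plain Python before writing the Spark expression; in this sketch the column and key names are illustrative, not taken from the poster's data.

```python
# Plain-Python model of Spark SQL's map_filter: keep only the entries
# of a map where the (key, value) predicate is true.
def map_filter(m, predicate):
    return {k: v for k, v in m.items() if predicate(k, v)}

# Illustrative stand-in for one row's "ActivityMap" column:
activity_map = {"login": 3, "click": 0, "purchase": 1}
active_only = map_filter(activity_map, lambda k, v: v > 0)
print(active_only)  # {'login': 3, 'purchase': 1}

# The equivalent Spark SQL expression on a map column "ActivityMap"
# (runnable only on a Spark cluster) would be:
#   SELECT map_filter(ActivityMap, (k, v) -> v > 0) AS filtered FROM t
```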
This is Algograp Co., Ltd., a partner of Databricks. There was a problem in the process of subscribing to Databricks and linking the AWS account. I had a problem using my existing platform and canceled my subscription. Account activation is not possible du...
Hi @Algograp_admin, it seems like there are a few potential issues that could be causing the problem with your Databricks subscription and AWS account linkage. Here are some steps you can take to troubleshoot and potentially resolve the issue: 1. **...
Is Delta Live Tables (DLT) appropriate for data that is in the millions of rows and GB sized? Or is DLT only optimal for larger data with billions of rows and TB sized? Please consider the Total Cost of Ownership. Development costs (engineering time) O...
Hi @Brant_Seibert_V , Thank you for posting your question in our community! We are happy to assist you. To help us provide you with the most accurate information, could you please take a moment to review the responses and select the one that best ans...
I have a list of questions about the Serverless SQL option: As per the docs, it says it's hosted by Databricks. Is there any exception to that? Will it ever create EC2 instances on AWS for the serverless option? Are any serverless assets stored on custo...
Hi @rajib_bahar_ptg , • Serverless SQL is hosted by Databricks, and the compute runs in Databricks' account rather than yours, so it does not create EC2 instances in your AWS account for the serverless option. The serverless SQL warehouses are managed by Databricks, not by the cus...
I have a 75%-off certification exam voucher tied to my work email, and I also noticed that you can redeem a 25%-off voucher from the rewards store. I am wondering if you can combine these 2 vouchers at the same time so that you can get 1 exam 1...
Hi chenqian0562, could you please share the solution with the community? I have the same issue.
Set the default catalog (AKA default SQL database) in a Cluster's Spark configuration. I've tried the following:
- spark.catalog.setCurrentDatabase("cbp_reporting_gold_preprod") - this works in a Notebook but doesn't do anything in the Cluster.
- spark.sq...
I've tried different commands in the Cluster's Spark Config and none work: they execute at Cluster startup without any errors shown in the logs, but once you run a notebook attached to the cluster, the default catalog is still set to 'default'.
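For Unity Catalog clusters, one documented option is pinning the initial catalog via a cluster-level Spark config rather than calling setCurrentDatabase at startup. A sketch, assuming the key below (taken from Databricks' default-catalog documentation) applies to your workspace; the catalog name is the one from the question:

```text
# Cluster > Advanced options > Spark config
# Sets the default catalog for notebooks attached to this cluster
# (key per Databricks Unity Catalog docs; verify it matches your runtime)
spark.databricks.sql.initial.catalog.name cbp_reporting_gold_preprod
```

Note that spark.catalog.setCurrentDatabase runs in a notebook's session, which is why it has no effect when placed in cluster-level config.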
Case #00378268: I passed the Spark Developer Associate exam about 6 months ago and had a great experience. However, this time the proctor did not even bother to show up to start the exam (checking ID, the room, and the surroundings). Somehow, I was able to st...
I have a notebook that calls other notebooks with `dbutils.notebook.run` and executes them as a 'Notebook job'. But sometimes, when a notebook is taking a long time and the cluster is just waiting for, for instance, an API response, the subsequent comm...
Hi @jl1 , Based on the provided information, it seems like your issue might be related to the timeout settings of the dbutils.notebook.run command or the long waiting time for API responses. The dbutils.notebook.run command has a timeoutSeconds param...
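The timeout-plus-retry idea above can be sketched as a small wrapper around the notebook call. This is a generic retry helper, not Databricks-specific: the notebook path is a placeholder, and note that in Python the timeout parameter of dbutils.notebook.run is the positional timeout_seconds argument (0 means no timeout).

```python
import time

def run_with_retry(fn, retries=3, delay_s=5):
    """Call fn(); on failure, wait and retry up to `retries` attempts total."""
    for attempt in range(1, retries + 1):
        try:
            return fn()
        except Exception:
            if attempt == retries:
                raise
            time.sleep(delay_s)

# On Databricks (not runnable locally) this would wrap the child notebook:
#   result = run_with_retry(
#       lambda: dbutils.notebook.run("/path/to/child", 0))  # 0 = no timeout

# Local demonstration with a function that fails twice, then succeeds:
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

result = run_with_retry(flaky, retries=5, delay_s=0)
print(result)  # ok
```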
I would like to inquire about the deployment schedule for the Databricks artifact allowlist within Terraform.
Hi @Grace121 , Please raise a support ticket with us for this query.
I am trying to use Databricks to read data on Google Cloud Storage (GCS) with Databricks on Google Cloud. I followed the steps from https://docs.gcp.databricks.com/storage/gcs.html. I have tried 'Access GCS buckets using Google Cloud service accounts o...
Hi @shihs , - Check your Service Account Permissions: Ensure the service account has "storage.objects.get" permission for the GCS bucket. Add the "Storage Object Viewer" role to your service account via the GCS console. - Use External Locations: Da...
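The service-account route boils down to a handful of cluster Spark configs. A sketch based on the GCS documentation page linked in the question; all values are placeholders, and the key material should come from a Databricks secret scope rather than being pasted in plain text:

```text
# Cluster Spark config for GCS access with a Google service account
# (keys per docs.gcp.databricks.com/storage/gcs.html; values are placeholders)
spark.hadoop.google.cloud.auth.service.account.enable true
spark.hadoop.fs.gs.auth.service.account.email <client-email>
spark.hadoop.fs.gs.project.id <project-id>
spark.hadoop.fs.gs.auth.service.account.private.key {{secrets/scope/gsa_private_key}}
spark.hadoop.fs.gs.auth.service.account.private.key.id {{secrets/scope/gsa_private_key_id}}
```

If reads still fail with these in place, the usual culprit is the service account missing "storage.objects.get" on the bucket, as noted above.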
Trying to import an init script from local to a workspace location using the Databricks CLI via a YAML pipeline, but it is getting uploaded as a notebook. I need to upload it in file format using a CLI command, since a workspace init script should be in file format. Does anyo...
Hi @Ashlaiyna , To upload a file to Databricks workspace using Databricks CLI, you can use the databricks workspace import command. However, since you want to upload it as a file and not as a notebook, you need to specify the --format option as SOUR...
Hello, I am facing an error while trying to read a large binary file (rosbag format) using the binaryFile reader. The file I am trying to read is approx 7GB. Here's the error message I am getting: FileReadException: Error while reading file dbfs:/mn...
Hi @eva_mcmf, The error you're encountering is because the size of your binary file exceeds the maximum allowable length in Spark, which is 2147483647 bytes or approximately 2GB. The file you're trying to read is about 7GB, well beyond this limit. U...
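Since Spark's binaryFile source rejects single files above 2147483647 bytes, one workaround is to process the file with local file APIs in sub-2GB chunks (on Databricks, a mount is reachable via a /dbfs/... path). A minimal, self-contained sketch using a tiny stand-in file; the chunk size and paths are illustrative:

```python
import os
import tempfile

# Spark's binaryFile source caps a single file at 2147483647 bytes (~2 GB),
# so a 7 GB rosbag cannot be loaded as one row. Stream it in chunks instead.
def read_in_chunks(path, chunk_size=64 * 1024 * 1024):
    """Yield successive byte chunks of a large binary file."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                return
            yield chunk

# Small local demonstration (on Databricks the path would be /dbfs/mnt/...):
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"x" * 1000)
chunks = list(read_in_chunks(tmp.name, chunk_size=300))
print([len(c) for c in chunks])  # [300, 300, 300, 100]
os.unlink(tmp.name)
```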
import requests
import json
# Databricks workspace API URL
databricks_url = "https://dbc-ab846cbe-f48b.cloud.databricks.com/api/2.0/workspace/import"
# Databricks API token (generate one from your Databricks account)
databricks_token = "xxxxxxxxxxxxxxxxxx...
Hi @pranav2, the error message "Failed to create notebook: 401 Unauthorized" indicates that the server is not recognizing your credentials. This could be due to various reasons, including an incorrect or expired token or a lack of necessary permissions...
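A quick way to rule out a malformed header is to inspect exactly what Authorization value is being sent: the token must go out as a Bearer token. A stdlib sketch of the same workspace-import call, where the URL, token, notebook path, and content are placeholders:

```python
import json
import urllib.request

# A 401 from /api/2.0/workspace/import usually means the Authorization
# header is missing or malformed. URL and token below are placeholders.
databricks_url = "https://dbc-ab846cbe-f48b.cloud.databricks.com/api/2.0/workspace/import"
databricks_token = "dapiXXXXXXXXXXXXXXXX"  # placeholder personal access token

payload = json.dumps({
    "path": "/Users/someone@example.com/my_notebook",  # hypothetical path
    "format": "SOURCE",
    "language": "PYTHON",
    "content": "",  # base64-encoded notebook source goes here
}).encode()

req = urllib.request.Request(
    databricks_url,
    data=payload,
    headers={
        "Authorization": f"Bearer {databricks_token}",
        "Content-Type": "application/json",
    },
)
print(req.get_header("Authorization"))  # prints "Bearer <token>"
# urllib.request.urlopen(req)  # only runnable with a real workspace + token
```

If the header is correct, the next things to check are token expiry and whether the token's user has CAN MANAGE/CAN EDIT permission on the target workspace folder.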