- 5649 Views
- 1 replies
- 1 kudos
Set default database through Cluster Spark Configuration
Set the default catalog (AKA default SQL database) in a cluster's Spark configuration. I've tried the following: spark.catalog.setCurrentDatabase("cbp_reporting_gold_preprod") - this works in a notebook but doesn't do anything in the cluster. spark.sq...
I've tried different commands in the cluster's Spark config; none work. They execute at cluster startup without any errors shown in the logs, but once you run a notebook attached to the cluster, the default catalog is still set to 'default'.
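As a reference, here is a minimal sketch of the two levels the thread compares. The cluster-level key spark.databricks.sql.initial.catalog.name applies to the Unity Catalog default catalog and is an assumption to verify against the Databricks docs for your runtime; the catalog/schema name is taken from the post.

```python
# Sketch only; `spark` is the SparkSession provided by the Databricks notebook environment.

# Option 1 - cluster Spark config (Unity Catalog): add a line like the following to the
# cluster's Spark config so new sessions start in a given default catalog. The key below
# is an assumption to double-check against the documentation:
#   spark.databricks.sql.initial.catalog.name cbp_reporting_gold_preprod

# Option 2 - per notebook/session (confirmed working in the original post):
spark.sql("USE cbp_reporting_gold_preprod")
# or, equivalently:
spark.catalog.setCurrentDatabase("cbp_reporting_gold_preprod")
```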
- 2948 Views
- 4 replies
- 1 kudos
Resolved! My exam was suspended by an unprofessional proctor - need help to reschedule
Case #00378268. I passed the Spark Developer Associate exam about 6 months ago and had a great experience. However, this time the proctor did not even bother to show up to start the exam - checking ID, the room, and the surroundings. Somehow, I was able to st...
- 3213 Views
- 2 replies
- 0 kudos
How to Create a Databricks Notebook using the API
import requests
import json
# Databricks workspace API URL
databricks_url = "https://dbc-ab846cbe-f48b.cloud.databricks.com/api/2.0/workspace/import"
# Databricks API token (generate one from your Databricks account)
databricks_token = "xxxxxxxxxxxxxxxxxx...
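The post's snippet is truncated; below is a minimal self-contained sketch of the same approach against the /api/2.0/workspace/import endpoint. The workspace URL, token, and notebook path are placeholders, not values from the post.

```python
import base64
import json
import requests

# Placeholders - substitute your own workspace URL, token, and target path.
databricks_url = "https://<workspace-host>/api/2.0/workspace/import"
databricks_token = "<personal-access-token>"
notebook_path = "/Users/someone@example.com/example_notebook"

# Notebook source to create; the SOURCE format expects base64-encoded content.
source = "print('hello from an API-created notebook')"
payload = {
    "path": notebook_path,
    "format": "SOURCE",
    "language": "PYTHON",
    "content": base64.b64encode(source.encode("utf-8")).decode("utf-8"),
    "overwrite": True,
}

response = requests.post(
    databricks_url,
    headers={"Authorization": f"Bearer {databricks_token}"},
    data=json.dumps(payload),
)
response.raise_for_status()
print(response.json())
```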
- 2987 Views
- 5 replies
- 2 kudos
Resolved! URGENT - Databricks certification exam suspended
Hi Team, I scheduled my exam today and showed the room to the proctor. They said the light was dull, but I moved to a better-lit place. They asked to see the room again and then suspended the exam. Please help me ASAP. Webassessor Id: npt.senthil@gmail.com
Thanks, I got the rescheduled invite. Thanks much!
- 1240 Views
- 0 replies
- 1 kudos
Connect to a Delta table from an MLflow pyfunc serving endpoint
Hi, I'm creating an MLflow pyfunc serving endpoint and I would like to connect to a Delta table to retrieve some information within the pyfunc. Is this possible? I ask because I don't think the serving endpoint environment has access to Spark, and we...
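One possible sketch, without a SparkSession: query the table through a SQL warehouse using the databricks-sql-connector package. This assumes the endpoint's environment includes that package and can reach the warehouse; the hostname, HTTP path, token, and table name are placeholders.

```python
# Hypothetical lookup of Delta table rows from inside a pyfunc predict(),
# by querying a SQL warehouse via databricks-sql-connector instead of Spark.
import mlflow.pyfunc
from databricks import sql


class LookupModel(mlflow.pyfunc.PythonModel):
    def predict(self, context, model_input):
        with sql.connect(
            server_hostname="<workspace-host>",            # placeholder
            http_path="/sql/1.0/warehouses/<warehouse-id>", # placeholder
            access_token="<token, e.g. from an env var>",   # placeholder
        ) as connection:
            with connection.cursor() as cursor:
                cursor.execute("SELECT * FROM my_catalog.my_schema.my_table LIMIT 10")
                rows = cursor.fetchall()
        return rows
```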
- 2023 Views
- 0 replies
- 0 kudos
Databricks cluster restarted inconsistently, causing job failure (all-purpose cluster)
"run failed with error message Driver of the cluster (0307-***-gpbwt) was restarted during the run.", "effectiveIntegrationRuntime": "vnet-ir-*-**** (East US)" while performing a merge operation. This error is not consistent. DB runtime: 11.3 LTS (inclu...
- 2520 Views
- 0 replies
- 0 kudos
Databricks on Virtualization
Hi Team, can you please direct me to any content on Databricks on Virtualization? Regards, Phanindra
- 4649 Views
- 0 replies
- 3 kudos
What's new in Databricks for September 2023
Platform: You can now use Structured Streaming to stream data from Apache Pulsar on Databricks. For more information: https://docs.databricks.com/en/structured-streaming/pulsar.html (DBR 14.1 required). Databricks Runtime 14.1 and 14.1 ML are now avail...
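A minimal sketch of what that Pulsar read looks like on DBR 14.1+, following the linked docs; the broker URL, topic, checkpoint path, and output table are placeholders, and the exact option names should be double-checked against that page.

```python
# Hypothetical Pulsar stream read; service URL and topic are placeholders.
df = (
    spark.readStream
    .format("pulsar")
    .option("service.url", "pulsar://broker.example.com:6650")
    .option("topics", "my-topic")
    .load()
)

# Write the raw payload to a Delta table to verify the stream end to end.
(
    df.writeStream
    .option("checkpointLocation", "/tmp/pulsar_checkpoint")  # placeholder path
    .toTable("pulsar_raw_events")
)
```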
- 1044 Views
- 0 replies
- 0 kudos
Bootstrap Timeout during cluster start on AWS cloud
Hi! We had a bunch of strange failures for our jobs during 28-29 September. Some job runs could not start for some time (30-50 mins) and then failed with an error: Unexpected failure while waiting for the cluster (0929-002141-2zkekhdj) to be rea...
- 2367 Views
- 2 replies
- 0 kudos
com.microsoft.azure.storage.StorageException: The specified resource name contains invalid characters.
Hi guys, I'm relatively new to Databricks and struggling to implement Auto Loader (with trigger once = true) in file notification mode. I have CSV files in one container (landing zone). I would like Auto Loader to pick up new and existing file...
Hi Kaniz, thank you for your reply. I initially made the mistake of using a capital letter in the queue name in the config. I can now write with no errors as a batch process. However, when I try to run the write stream, it says "Running Comma...
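For reference, a minimal sketch of Auto Loader in file notification mode with a run-once trigger, roughly matching the setup described in the question; the storage path, schema location, checkpoint location, and target table are placeholders.

```python
# Hypothetical Auto Loader read of CSVs from a landing container using file
# notifications; adjust the path, schema, and options to your environment.
df = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "csv")
    .option("cloudFiles.useNotifications", "true")
    .option("header", "true")
    .option("cloudFiles.schemaLocation", "/mnt/landing/_schema")          # placeholder
    .load("abfss://landing@<storageaccount>.dfs.core.windows.net/")       # placeholder
)

# Trigger once and write to a Delta table, as in the original question.
(
    df.writeStream
    .option("checkpointLocation", "/mnt/landing/_checkpoint")             # placeholder
    .trigger(once=True)
    .toTable("bronze_landing")
)
```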
- 9385 Views
- 8 replies
- 1 kudos
My exam Databricks Data Engineer Associate got suspended - need immediate help please (10/09/2023)
Hello Team, I had a pathetic experience while attempting my Databricks Data Engineer certification. Abruptly, the proctor asked me to show my desk; after I showed it, he/she asked multiple times, wasted my time, and then suspended my exam. I want to file ...
- 3275 Views
- 0 replies
- 0 kudos
Very large binary files ingestion error when using binaryFile reader
Hello, I am facing an error while trying to read a large binary file (rosbag format) using the binaryFile reader. The file I am trying to read is approximately 7 GB. Here's the error message I am getting: FileReadException: Error while reading file dbfs:/mn...
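For reference, a minimal binaryFile read looks like the sketch below; the mount path and glob filter are placeholders. Note that binaryFile loads each file's bytes into a single content column, so files much larger than roughly 2 GB generally cannot be read this way and usually need to be split or handled with a format-specific reader.

```python
# Hypothetical binaryFile read; path and filter are placeholders.
df = (
    spark.read
    .format("binaryFile")
    .option("pathGlobFilter", "*.bag")
    .load("dbfs:/mnt/<mount-point>/rosbags/")
)

# Inspect metadata columns without materializing the (very large) content column.
df.select("path", "length", "modificationTime").show(truncate=False)
```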
- 817 Views
- 0 replies
- 0 kudos
Init script uploaded as a notebook instead of a file when imported to the workspace using the Databricks CLI
Trying to import an init script from local to a workspace location using the Databricks CLI via a YAML pipeline, but it is getting uploaded as a notebook. It needs to be uploaded in file format via a CLI command, since a workspace init script must be a file. Does anyo...
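One possible workaround sketch: call the Workspace Import REST API directly (the same endpoint the CLI wraps) with format AUTO, which is expected to import a .sh file as a workspace file rather than a notebook. The host, token, and paths are placeholders, and the AUTO behavior is an assumption worth verifying against the Workspace API docs.

```python
import base64
import requests

host = "https://<workspace-host>"                 # placeholder
token = "<personal-access-token>"                 # placeholder
local_script = "init.sh"                          # placeholder local file
workspace_path = "/Shared/init-scripts/init.sh"   # placeholder target path

# Read and base64-encode the script, as the import API requires.
with open(local_script, "rb") as f:
    content = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"{host}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "path": workspace_path,
        "format": "AUTO",   # assumption: AUTO imports non-notebook content as a file
        "content": content,
        "overwrite": True,
    },
)
resp.raise_for_status()
```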
- 1835 Views
- 1 replies
- 1 kudos
Is Databricks-Salesforce already available?
Reference: Salesforce and Databricks Announce Strategic Partnership to Bring Lakehouse Data Sharing and Shared AI Models to Businesses - Salesforce News. I was going through this article and wanted to know if anyone in the community is planning to use this...
- 3121 Views
- 2 replies
- 2 kudos
Terraform provider, problems creating a dependent task!
Hello all. I have a serious problem; perhaps I missed something, but I can't find the solution. I need to push a job description to Databricks using Terraform. I wrote the code, but there is no way to make a task depend on two different tasks. Conside...
@6502 You need a separate depends_on block for each dependency, e.g.:
depends_on { task_key = "ichi" }
depends_on { task_key = "ni" }