- 1341 Views
- 1 replies
- 0 kudos
Databricks workflow job run is taking double the time in the EU region
We have a scheduled job in a Databricks workflow. This job run is taking around 5 hours; a month ago it was taking around 2.5 hours. Can anyone tell what may be the reason behind this? Note: no change has been made in this period of ...
You can check if you are using spot instances on your Job Cluster. By the way, if you are using Azure, West Europe is in very high demand and it sometimes takes time to provision compute, but that should be a matter of minutes, not hours. Check maybe if your data v...
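As a quick way to check the spot-instance setting the reply suggests, one can pull the job's cluster spec over the Jobs API. A minimal sketch, assuming a shared job cluster, with host, token, and job ID as placeholders:

```python
# Sketch: check whether a job's clusters request Azure spot capacity.
# Host, token, and job_id below are placeholders.
import requests

host = "https://<workspace-host>"
headers = {"Authorization": "Bearer <token>"}

resp = requests.get(f"{host}/api/2.1/jobs/get",
                    headers=headers, params={"job_id": 123})
resp.raise_for_status()

# Shared job clusters live under settings.job_clusters.
for jc in resp.json().get("settings", {}).get("job_clusters", []):
    azure = jc.get("new_cluster", {}).get("azure_attributes", {})
    # "SPOT_AZURE" or "SPOT_WITH_FALLBACK_AZURE" means spot capacity is used.
    print(jc.get("job_cluster_key"), azure.get("availability"))
```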
- 12838 Views
- 3 replies
- 0 kudos
Resolved! How to pass variables to a Python file job
Hi everyone, it's relatively straightforward to pass a value to a key-value pair in a notebook job, but for a Python file job I couldn't figure out how to do it. Does anyone have any idea? I have been trying out different variations for a job wi...
Thanks so much for this! By the way, is there a way to do it with the JSON interface? I am struggling to get the parameters when they are entered that way.
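For reference, parameters on a Python file task (spark_python_task) are delivered to the script as command-line arguments, so a minimal sketch is to parse them with argparse; the parameter names below are illustrative, not from the thread:

```python
# my_job_script.py - minimal sketch; parameter names are hypothetical.
# For a Python file task, the job's "parameters" array arrives in sys.argv.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--env", default="dev")
parser.add_argument("--load_date", required=True)
args = parser.parse_args()

print(f"env={args.env}, load_date={args.load_date}")
```

In the JSON view of the job, the same values go into the task's `parameters` array, e.g. `"parameters": ["--env", "prod", "--load_date", "2024-01-01"]`, which may be what the JSON-interface question above is after.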
- 1662 Views
- 1 replies
- 0 kudos
Chat bot with Azure Blob and Databricks
Hi Team, I am planning to build a chat bot application for Teams that queries data from Azure Blob and Databricks tables, in Python. Please help me out on how I can start and which tools I can use for this requirement. Thanks in advanc...
@Nagrjuna, that's a great idea! Although we do not know your use case completely, I am sure you would fall in love with our AI/ML products. To create a Python chat bot application that can pull data from Azure Blob Storage and Datab...
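As a starting point, here is a hedged sketch of the two data-access pieces such a bot needs, using the azure-storage-blob and databricks-sql-connector packages; every name, path, and credential below is a placeholder:

```python
# Sketch: read a document from Azure Blob Storage and query a Databricks
# table; all connection values and names are placeholders.
from azure.storage.blob import BlobServiceClient
from databricks import sql

# Fetch a blob's contents.
blob_service = BlobServiceClient.from_connection_string("<connection-string>")
blob = blob_service.get_blob_client(container="docs", blob="faq.txt")
text = blob.download_blob().readall().decode("utf-8")

# Query a table through a SQL warehouse.
with sql.connect(
    server_hostname="<workspace-host>",
    http_path="<warehouse-http-path>",
    access_token="<token>",
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT * FROM main.default.example_table LIMIT 10")
        rows = cur.fetchall()
```

The chat layer itself (Teams bot framework, LLM, etc.) would sit on top of these two calls.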
- 2582 Views
- 1 replies
- 1 kudos
How to configure GitHub credentials for a service principal NOT using Azure
I want a service principal to run a job that uses a notebook in our GitHub. We are on AWS, not Azure. How do I configure Git credentials for the service principal? Does this use deploy keys?
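One possible approach, sketched here rather than confirmed as a resolution: authenticate as the service principal and register a GitHub personal access token through the Git Credentials REST API. Host and tokens are placeholders:

```python
# Sketch: register Git credentials for the calling identity. Run this
# authenticated AS the service principal (e.g. with its OAuth or PAT token);
# the API stores a GitHub PAT, not a deploy key.
import requests

host = "https://<workspace-host>"
sp_token = "<service-principal-token>"

resp = requests.post(
    f"{host}/api/2.0/git-credentials",
    headers={"Authorization": f"Bearer {sp_token}"},
    json={
        "git_provider": "gitHub",
        "git_username": "<github-username>",
        "personal_access_token": "<github-pat>",
    },
)
resp.raise_for_status()
print(resp.json())
```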
- 2279 Views
- 1 replies
- 1 kudos
Resolved! Workspace FileNotFoundException
I have a model created with CatBoost and exported in ONNX format in the workspace, and I want to download that model to my local machine. I tried to use the Export option in the three-dot menu to the right of the model, but the model is larger than 10 MB ...
You need to put the file in FileStore: https://docs.databricks.com/en/dbfs/filestore.html#save-a-file-to-filestore
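Following the reply's FileStore pointer, a minimal sketch of the copy-then-download flow; the source path and file names are assumptions:

```python
# Run in a Databricks notebook (dbutils is ambient there).
# Copy the exported ONNX model into DBFS FileStore so it can be fetched
# over HTTP; both paths are placeholders.
dbutils.fs.cp(
    "file:/Workspace/Users/<you>/model.onnx",
    "dbfs:/FileStore/models/model.onnx",
)
# The file is then downloadable in a browser at:
#   https://<workspace-host>/files/models/model.onnx
```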
- 2083 Views
- 1 replies
- 0 kudos
VS Code integration with Python Notebook and Remote Cluster
Hi, I'm trying to work in VS Code remotely on my machine instead of using the Databricks environment in my browser. I have gone through the documentation to set up the Databricks extension and also set up Databricks Connect, but I don't feel like they work ...
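As a sanity check that Databricks Connect is wired up, a minimal sketch, assuming databricks-connect 13+ and a configured default auth profile:

```python
# Sketch: verify Databricks Connect reaches the remote cluster.
# Assumes databricks-connect >= 13 and a configured "DEFAULT" profile.
from databricks.connect import DatabricksSession

spark = DatabricksSession.builder.getOrCreate()
print(spark.range(5).collect())  # executes on the remote cluster
```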
- 3243 Views
- 1 replies
- 0 kudos
What happened to the ephemeral notebook links? And the job IDs?
Hey Databricks, why did you remove the ephemeral notebook links and job IDs from the parallel runs? This has created a huge gap for us. We can no longer view the ephemeral notebooks, and the job IDs are missing from the output. Whatcha doing?...
Hi Kaniz, it's funny you mention these things - we are doing some of them - but the problem now is that the job ID is obscured from the output, meaning we can't tell which ephemeral notebook goes with which job ID. It looks like the ephemeral notebook ...
- 5379 Views
- 0 replies
- 0 kudos
Updating Databricks SQL Warehouse using Terraform
We can update a SQL warehouse manually in Databricks: click SQL Warehouses in the sidebar, and under Advanced options there is a Unity Catalog toggle. While updating an existing SQL warehouse in Azure to enable Unity Catalog using Terraform, however, I couldn'...
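For reference, warehouse settings can also be changed over the SQL Warehouses REST API. A minimal sketch of an edit call follows; host, token, and ID are placeholders, and whether the Unity Catalog toggle is exposed as a field on this endpoint is not confirmed, so this only shows the general mechanism:

```python
# Sketch: edit an existing SQL warehouse via the REST API. Whether the
# Unity Catalog toggle is exposed as a field on this endpoint is NOT
# confirmed; only the general edit mechanism is shown.
import requests

host = "https://<workspace-host>"
warehouse_id = "<warehouse-id>"

resp = requests.post(
    f"{host}/api/2.0/sql/warehouses/{warehouse_id}/edit",
    headers={"Authorization": "Bearer <token>"},
    json={"name": "my-warehouse", "auto_stop_mins": 30},
)
resp.raise_for_status()
```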
- 5675 Views
- 5 replies
- 3 kudos
OSError: [Errno 78] Remote address changed
Hello :) As part of moving an app that previously ran directly on EMR over to Databricks, we are running experiments using LTS 9.1 and getting the following error: PythonException: An exception was thrown from a UDF: 'pyspark.serializers.SerializationEr...
Hi @liormayn , I can understand. I see the fix went out on 20 March 2024; you would have to restart the clusters. Thanks!
- 2768 Views
- 1 replies
- 0 kudos
Py4JError: An error occurred while calling o992.resourceProfileManager
Hello, I am trying to run the SparkXGBoostRegressor and I am getting the following error: Py4JError: An error occurred while calling o992.resourceProfileManager. Trace: py4j.security.Py4JSecurityException: Method public org.apache.spark.resource...
- 3195 Views
- 0 replies
- 0 kudos
Stream to stream join NullPointerException
I have a DLT pipeline running in continuous mode. I have a stream-to-stream join which runs for the first 5 hours but then fails with a NullPointerException. I need assistance to know what I need to do to handle this. My code is structured as below: @dl...
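A frequent cause of instability in continuous stream-stream joins is unbounded state. Here is a hedged sketch of the standard mitigation, watermarks plus a time-range join condition, in a DLT pipeline; the table and column names are made up, not the poster's code:

```python
# Sketch: watermarked stream-stream join in Delta Live Tables.
# Table and column names are hypothetical.
import dlt
from pyspark.sql.functions import expr

@dlt.table
def joined_events():
    impressions = (
        dlt.read_stream("impressions")
        .withWatermark("impression_time", "2 hours")
    )
    clicks = (
        dlt.read_stream("clicks")
        .withWatermark("click_time", "3 hours")
    )
    # The time-range condition lets Spark expire old join state instead
    # of buffering it forever.
    return impressions.join(
        clicks,
        expr("""
            click_ad_id = impression_ad_id AND
            click_time BETWEEN impression_time
                           AND impression_time + INTERVAL 1 HOUR
        """),
    )
```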
- 5817 Views
- 4 replies
- 2 kudos
Resolved! How to choose a compute, and how to find alternatives for the current compute being used?
We are using a compute type for an interactive cluster in production which incurs X amount of cost. We want to know what options are available with nearly the same processing power as the current compute but that incur a cost of Y, which is less...
Hello @Ikanip, you can utilize the Databricks Pricing Calculator to estimate costs. For detailed information on compute capacity, please refer to your cloud provider's documentation regarding virtual machine instance types.
- 1605 Views
- 0 replies
- 0 kudos
Databricks Running Jobs and Terraform
What happens to a currently running job when a workspace is deployed again using Terraform? Are the jobs paused/resumed, or are they left unaffected without any downtime? Searching for this specific scenario doesn't seem to come up with anything and...
- 11807 Views
- 5 replies
- 1 kudos
Job parameters to get date and time
I'm trying to set up a workflow in Databricks and I need my job parameter to get the date and time. I see in the documentation there are some options for dynamic values. I'm trying to use this one: {{job.start_time.[argument]}}. For the "argument" there, ...
Then please change the code to: `iso_datetime = dbutils.widgets.get("LoadID")`
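Putting the two together, a minimal sketch, assuming a job parameter named LoadID whose configured value is the dynamic reference {{job.start_time.iso_datetime}}:

```python
# Sketch, to run inside a notebook task (dbutils is ambient there).
# The job defines a parameter LoadID = {{job.start_time.iso_datetime}},
# which Databricks resolves when the run starts.
iso_datetime = dbutils.widgets.get("LoadID")
print(iso_datetime)  # e.g. "2024-05-01T12:34:56Z" (exact format per the docs)
```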
- 950 Views
- 0 replies
- 0 kudos
Archive file support in Jar Type application
In my Spark application, I am using a set of Python libraries. I am submitting the Spark application as a JAR task, but I am not able to find any option to provide archive files. So, in order to handle Python dependencies, I am using this approach: create an archive file...