- 1084 Views
- 1 replies
- 0 kudos
Init script doesn't finish its run when it is located in the workspace; it runs fine when I have it in DBFS
The first attached image runs fine when the cluster starts. This next image of the script does not finish when I include the -y parameter. If I include the -y I get "failed: Script exit status is non-zero". Can someone please help?
Hi, if I understand your issue correctly, when you run it directly in a notebook it ends with an error, but when you run it inside a .sh with -y it runs okay?
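One pattern worth checking, sketched below: write the init script with a strict shell preamble so the actual failing command shows up in the cluster logs, since -y itself is rarely the problem. The script path and package name here are hypothetical placeholders, not taken from the thread.

# Minimal sketch: materialize an init script from a notebook, then reference
# its path in the cluster's init-script settings. Path and package are placeholders.
dbutils.fs.put(
    "/databricks/init-scripts/install-tools.sh",
    """#!/bin/bash
set -euxo pipefail          # fail fast and echo each command to the logs
sudo apt-get update         # skipping this can make 'install -y' exit non-zero
sudo apt-get install -y jq  # placeholder package
""",
    True,  # overwrite
)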
- 277 Views
- 0 replies
- 0 kudos
Import notebooks in Databricks
I am using databricks-connect and VS Code to develop some Python code for Databricks. I would like to code and run/test everything directly from VS Code using databricks-connect to avoid dealing with the Databricks web IDE. For basic notebooks, it works j...
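For reference, a minimal databricks-connect (v2, DBR 13+) session looks roughly like the sketch below. It assumes host, token, and cluster are already configured in ~/.databrickscfg or environment variables, and the table name is only an example.

from databricks.connect import DatabricksSession

# Builds a remote Spark session from the local Databricks configuration.
spark = DatabricksSession.builder.getOrCreate()

# Ordinary PySpark code then executes on the remote cluster.
df = spark.read.table("samples.nyctaxi.trips")  # example table; substitute your own
print(df.limit(5).toPandas())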
- 11357 Views
- 5 replies
- 2 kudos
Is there a way to use wildcard expressions inside dbutils functions?
I am working on a migration project to migrate the HDFS command executions I do inside my Python code via the os.system() function by replacing them with dbutils functions. dbutils functions are working as expected if I pass the fully qualified path of a fi...
As I mentioned in my problem statement, I have already implemented the required functionality with alternative approaches (AWS S3 API and Boto3 API). Still, it is an outstanding issue with dbutils.
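dbutils.fs itself does not expand wildcards, but dbutils.fs.ls returns FileInfo objects, so the pattern can be applied client-side. A minimal sketch, assuming a notebook context where dbutils is available; the bucket path and pattern are placeholders:

from fnmatch import fnmatch

def ls_glob(dir_path, pattern):
    # dbutils.fs.ls lists a single directory; apply the wildcard on the client side
    return [f.path for f in dbutils.fs.ls(dir_path) if fnmatch(f.name, pattern)]

# Example: delete every CSV in one directory (placeholder path)
for path in ls_glob("s3://my-bucket/landing/", "*.csv"):
    dbutils.fs.rm(path)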
- 12432 Views
- 3 replies
- 1 kudos
Resolved! Databricks Monitoring
Hi everyone, can someone suggest the best native job monitoring tool available in Databricks to fulfill my needs? We need to monitor the following: number of failed jobs and their names for the last 24 hours; tables that are not getting data; latest inge...
Let me go through these one by one. Number of failed jobs and their names for the last 24 hours: your best bet will be to use the upcoming system tables integration. This is in preview, I believe. The general idea is that you will get a Delta tab...
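Once the system tables are enabled, a query along the lines of the sketch below should cover the failed-jobs part. The schema, table, and column names follow the lakeflow system tables as I understand them and may differ in the preview, so verify against your workspace's system catalog.

# Failed job runs in the last 24 hours (table/column names are an assumption).
failed = spark.sql("""
    SELECT t.job_id, j.name, t.run_id, t.period_end_time, t.result_state
    FROM system.lakeflow.job_run_timeline AS t
    JOIN system.lakeflow.jobs AS j USING (job_id)
    WHERE t.result_state = 'FAILED'
      AND t.period_end_time >= current_timestamp() - INTERVAL 24 HOURS
""")
display(failed)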
- 4709 Views
- 0 replies
- 1 kudos
Model Serving Endpoint keeps failing with SIGKILL error
I am trying to deploy a model in the serving endpoints section, but it keeps failing after attempting to create for an hour. Here are the service logs: Container failed with: 9 +0000] [115] [INFO] Booting worker with pid: 115 [2023-09-15 19:15:35 +0000...
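A SIGKILL while workers are booting is often the container being OOM-killed; one thing worth trying is a larger workload size on the endpoint config. A rough sketch against the serving-endpoints REST API, where the host, token, endpoint, and model names are all hypothetical placeholders:

import requests

host = "https://<workspace-host>"              # placeholder
headers = {"Authorization": "Bearer <token>"}  # placeholder
payload = {
    "name": "my-endpoint",
    "config": {
        "served_models": [{
            "model_name": "my_model",
            "model_version": "1",
            "workload_size": "Medium",   # bump from Small if workers are OOM-killed
            "scale_to_zero_enabled": True,
        }]
    },
}
resp = requests.post(f"{host}/api/2.0/serving-endpoints", headers=headers, json=payload)
print(resp.status_code, resp.text)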
- 3002 Views
- 5 replies
- 1 kudos
Access RDS Postgres DB via SSH Tunnel
Hello, how can I configure a foreign catalog connection to use SSH tunneling? I want to be able to use Unity Catalog.
Hi, in addition to our previous message, you can try https://docs.databricks.com/en/query-federation/foreign-catalogs.html and https://grant-6562.medium.com/connecting-to-sql-server-through-an-ssh-tunnel-with-python-17de859caca5. Also please tag @Deb...
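The linked article's approach, roughly: open the tunnel from the driver with the sshtunnel package and connect through its local end. A sketch with placeholder hosts, paths, and credentials; note this works for ad-hoc queries from a notebook, while a foreign catalog itself needs network-level connectivity (e.g. VPC peering) rather than an in-process tunnel.

from sshtunnel import SSHTunnelForwarder
import psycopg2  # assumes the driver library is installed on the cluster

with SSHTunnelForwarder(
    ("bastion.example.com", 22),                      # placeholder bastion host
    ssh_username="ec2-user",
    ssh_pkey="/dbfs/keys/bastion.pem",                # placeholder key path
    remote_bind_address=("mydb.rds.amazonaws.com", 5432),
) as tunnel:
    conn = psycopg2.connect(
        host="127.0.0.1",
        port=tunnel.local_bind_port,                  # tunnel's local end
        dbname="postgres", user="dbuser", password="...",
    )
    with conn.cursor() as cur:
        cur.execute("SELECT 1")
        print(cur.fetchone())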
- 2080 Views
- 1 replies
- 0 kudos
Unity Catalog - Invalid configuration value detected for fs.azure.account.key
Hi there, I am having an issue with writing a df to a table or displaying it. I have three dataframes that I have unioned, and after I have done the union I cannot display the dataframe. df_table1 = spark.sql(f'SELECT * FROM {sql_full_name}') df_table2 = ... df...
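This error usually means some input in the unioned plan is falling back to storage-account-key auth instead of Unity Catalog credentials. The cleaner UC fix is to read every input through a catalog table or an external location rather than a raw abfss:// path; if key auth is genuinely intended, the session config has roughly the shape below, with the storage account and secret names as placeholders.

# Placeholder storage account and secret names; only if key auth is intended.
storage_account = "mystorageacct"
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    dbutils.secrets.get(scope="my-scope", key="storage-account-key"),
)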
- 988 Views
- 0 replies
- 0 kudos
Use a single cluster policy for multiple teams and capture team name via custom tags
I am planning to add team names in custom tags and was hoping I can do it with allowList and then have the user choose from the list. I am trying to avoid having multiple policy files per team. Has anybody found a good way to do this? Maybe using globa...
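For reference, a single policy can constrain a custom tag with an allowlist, and users then pick a value in the cluster UI. A sketch using the Python SDK, with the policy name and team values as placeholders:

import json
from databricks.sdk import WorkspaceClient  # assumes databricks-sdk is installed

# One policy, one tag attribute, many teams (placeholder values).
definition = {
    "custom_tags.team": {
        "type": "allowlist",
        "values": ["team-a", "team-b", "team-c"],
    }
}

w = WorkspaceClient()
w.cluster_policies.create(name="shared-team-policy", definition=json.dumps(definition))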
- 1521 Views
- 2 replies
- 0 kudos
Cannot see "Databricks Lakehouse Platform" on prod AWS Cost Explorer
Hi, currently we have two different AWS accounts: dev and prod. We also have two different workspaces: one for dev and another for prod. The strange thing is that prod costs are being added to the dev account on AWS Cost Explorer ("Databricks Lakeho...
Databricks uses tags and AWS CloudTrail logs to connect and report costs to AWS. Tags can be used to monitor costs and attribute Databricks usage to different business units and teams. AWS CloudTrail logs can be used to calculate the exact cost of API ...
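To see which account the usage is actually landing in, Cost Explorer can be grouped by the default tags Databricks applies to EC2 instances (e.g. Vendor, ClusterName, Creator). A boto3 sketch; the date range is an example and must be run with credentials for the account being audited.

import boto3

ce = boto3.client("ce")  # Cost Explorer, in the account you are auditing
resp = ce.get_cost_and_usage(
    TimePeriod={"Start": "2023-09-01", "End": "2023-10-01"},  # example month
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "TAG", "Key": "Vendor"}],  # Databricks tags instances Vendor=Databricks
)
for group in resp["ResultsByTime"][0]["Groups"]:
    print(group["Keys"], group["Metrics"]["UnblendedCost"]["Amount"])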
- 1165 Views
- 1 replies
- 0 kudos
Snowflake filter query giving empty results in Databricks; the same query works in Snowflake.
I am trying to fetch filtered data based on a date format on a date column. Below is the query formed in Databricks; in Databricks it gives empty results. SELECT * FROM Test.TestSchema.Address where TO_DATE(TO_VARCHAR(MODIFIEDDATE,'YYYY-MM-DD')) = '2023-09...
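One likely culprit is the connector's pushdown translating TO_DATE/TO_VARCHAR differently from Snowflake's own semantics. Passing the statement verbatim via the query option sidesteps that. In the sketch below, all connection values and the date literal are placeholders, not the thread's actual values.

# All connection values and the date are placeholders.
options = {
    "sfUrl": "myorg-myaccount.snowflakecomputing.com",
    "sfUser": "user",
    "sfPassword": "****",
    "sfDatabase": "Test",
    "sfSchema": "TestSchema",
    "sfWarehouse": "COMPUTE_WH",
}

# The 'query' option ships this SQL to Snowflake as-is, so the date functions
# run with Snowflake's semantics instead of being re-translated by pushdown.
df = (
    spark.read.format("snowflake")
    .options(**options)
    .option("query", "SELECT * FROM Address WHERE TO_DATE(MODIFIEDDATE) = '2023-09-01'")
    .load()
)
display(df)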
- 697 Views
- 1 replies
- 0 kudos
SQL warehouse setting permission
How do you set permissions so that someone can execute a query but cannot modify settings on the SQL warehouse?
Hi, have you tried this: https://docs.databricks.com/en/security/auth-authz/access-control/sql-endpoint-acl.html Please let us know if this helps. Thanks!
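Per that doc, the Can Use level allows running queries while Can Manage allows changing warehouse settings. For scripting it, the Permissions REST API takes a payload like the sketch below; the object-type path segment and every name/ID here are my assumptions, so check the API reference before relying on them.

import requests

host = "https://<workspace-host>"             # placeholder
warehouse_id = "<warehouse-id>"               # placeholder
headers = {"Authorization": "Bearer <token>"}

# CAN_USE = run queries; CAN_MANAGE = edit warehouse settings.
payload = {
    "access_control_list": [
        {"group_name": "analysts", "permission_level": "CAN_USE"},
    ]
}
resp = requests.patch(
    f"{host}/api/2.0/permissions/warehouses/{warehouse_id}",  # path is an assumption
    headers=headers, json=payload,
)
print(resp.status_code, resp.text)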
- 1534 Views
- 1 replies
- 0 kudos
Resolved! Structured streaming from Azure Event Hub, authenticating without SAS keys
Using SAS keys is a security issue that we would like to avoid. How do we utilize structured streaming from Event Hubs while authenticating to Azure AD (client_id and secret)? We know that we can use Python's Event Hub library, but that will make have t...
Hi, could you please try the structured streaming Event Hubs integration? https://docs.databricks.com/en/_extras/notebooks/source/structured-streaming-event-hubs-integration.html
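Event Hubs also exposes a Kafka-compatible endpoint that accepts Azure AD service-principal auth via OAUTHBEARER, which avoids SAS keys entirely. A sketch along the lines of the Databricks docs for this setup; the namespace, event hub, tenant, and secret-scope names are placeholders, and the exact option strings should be checked against the current docs.

# Placeholders: namespace, event hub (topic), tenant, and service principal.
namespace = "my-eh-namespace"
tenant_id = "<tenant-id>"
client_id = dbutils.secrets.get("scope", "sp-client-id")
client_secret = dbutils.secrets.get("scope", "sp-client-secret")

jaas = (
    "kafkashaded.org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required "
    f'clientId="{client_id}" clientSecret="{client_secret}" '
    f'scope="https://{namespace}.servicebus.windows.net/.default" ssl.protocol="SSL";'
)

df = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", f"{namespace}.servicebus.windows.net:9093")
    .option("subscribe", "my-event-hub")
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "OAUTHBEARER")
    .option("kafka.sasl.jaas.config", jaas)
    .option("kafka.sasl.oauthbearer.token.endpoint.url",
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token")
    .option("kafka.sasl.login.callback.handler.class",
            "kafkashaded.org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerLoginCallbackHandler")
    .load()
)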
- 1006 Views
- 1 replies
- 1 kudos
How to connect to a database in Azure Databricks from a VS 2022 console application to access the data
Hi team, I am new to Databricks; please find my question below and do me the favor. I have created a cluster, then a database and tables, and data is also inserted. Now I want to access this table data from my .NET console application, which is on the 6.0 framework (v...
Hi, the Databricks extension will help you to do so: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/vscode-ext Please let us know if this helps. Thanks!
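For pulling table data into an external application, the usual route is the SQL endpoint over the Databricks JDBC/ODBC driver, which is what a .NET 6 console app would use. The same connection parameters are shown here as a Python sketch with placeholder values (copied from the warehouse's "Connection details" tab in a real setup):

from databricks import sql  # pip install databricks-sql-connector

# Placeholders: hostname, HTTP path, token, and table name.
with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<personal-access-token>",
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT * FROM my_database.my_table LIMIT 10")
        for row in cur.fetchall():
            print(row)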
- 3442 Views
- 3 replies
- 3 kudos
I created the workspace successfully using the QuickStart (recommended) method; now I want to create one more workspace and it is showing an error.
Hi, I created the workspace using the QuickStart (recommended) method, and at the time of creating the workspace it asked for the following parameters: AccountId, AWSRegion, BucketName, DataS3Bucket, IAMRole, Password, Username, WorkspaceName. The workspace was created successf...
Hi, could you please elaborate on the error code here? There is some misconfiguration which is causing the error. Thanks!
- 1592 Views
- 2 replies
- 0 kudos
What's going wrong in my attempt to start with Databricks?
I'm trying to get going with Databricks for the first time. It's told me to create a workspace, which takes me to AWS (I'm also new to AWS). Following the instructions through there gets it to start creating something, but then it just gets stuck on ...
Hi, this looks like a workspace creation failure. We would like to know more about the error details. Thanks!
Labels: 12.2 LST (1), Access Data (2), Access Delta Tables (2), Account reset (1), ADF Pipeline (1), ADLS Gen2 With ABFSS (1), Analytics (1), Apache spark (1), API (2), API Documentation (2), Architecture (1), Auto-loader (1), Autoloader (2), AWS (3), AWS security token (1), AWSDatabricksCluster (1), Azure (2), Azure data disk (1), Azure databricks (10), Azure Databricks SQL (5), Azure databricks workspace (1), Azure Unity Catalog (4), Azure-databricks (1), AzureDatabricks (1), AzureDevopsRepo (1), Best Practices (1), Big Data Solutions (1), Billing (1), Billing and Cost Management (1), Bronze Layer (1), Bug (1), Catalog (1), Certification (1), Certification Exam (1), Certification Voucher (1), CICD (2), Cli (1), Cloud_files_state (1), cloudera sql (1), CloudFiles (1), Cluster (3), clusterpolicy (1), Code (1), Community Group (1), Community Social (1), Compute (2), conditional tasks (1), Cost (2), Credentials (1), CustomLibrary (1), CustomPythonPackage (1), DABs (1), Data Engineering (2), Data Explorer (1), Data Ingestion & connectivity (1), DataAISummit2023 (1), DatabrickHive (1), databricks (2), Databricks Academy (1), Databricks Alerts (1), Databricks Audit Logs (1), Databricks Certified Associate Developer for Apache Spark (1), Databricks Cluster (1), Databricks Clusters (1), Databricks Community (1), Databricks connect (1), Databricks Dashboard (1), Databricks delta (2), Databricks Delta Table (2), Databricks Documentation (1), Databricks JDBC (1), Databricks Job (1), Databricks jobs (2), Databricks Lakehouse Platform (1), Databricks notebook (1), Databricks Notebooks (2), Databricks Platform (1), Databricks Pyspark (1), Databricks Python Notebook (1), Databricks Repo (1), Databricks SQL (1), Databricks SQL Alerts (1), Databricks SQL Warehouse (1), Databricks UI (1), Databricks Unity Catalog (3), Databricks Workflow (2), Databricks Workflows (2), Databricks workspace (1), DatabricksJobCluster (1), DataDays (1), DataMasking (2), dbdemos (1), DBRuntime (1), DDL (1), deduplication (1), Delt Lake (1), Delta (12), Delta Live Pipeline (3), Delta Live Table (5), Delta Live Table Pipeline (5), Delta Live Table Pipelines (4), Delta Live Tables (6), Delta Sharing (2), deltaSharing (1), denodo (1), Deny assignment (1), Devops (1), DLT (9), DLT Pipeline (6), DLT Pipelines (5), DLTCluster (1), Documentation (2), Dolly (1), Download files (1), dropduplicatewithwatermark (1), Dynamic Variables (1), Engineering With Databricks (1), env (1), External Sources (1), External Storage (2), FAQ for Databricks Learning Festival (1), Feature Store (1), Filenotfoundexception (1), Free trial (1), GCP Databricks (1), Getting started (1), glob (1), Good Documentation (1), Google Bigquery (1), hdfs (1), Help (1), How to study Databricks (1), informatica (1), Jar (1), Java (1), JDBC Connector (1), Job Cluster (1), Job Task (1), Kubernetes (1), Lineage (1), LLMs (1), Login (1), Login Account (1), Machine Learning (1), MachineLearning (1), masking (1), Materialized Tables (2), Medallion Architecture (1), Metastore (1), MlFlow (2), Mlops (1), Model Serving (1), Model Training (1), Mount (1), Networking (1), nic (1), Okta (1), ooze (1), os (1), Password (1), Permission (1), Permissions (1), personalcompute (1), Pipeline (1), policies (1), PostgresSQL (1), Pricing (1), pubsub (1), Pyspark (1), Python (2), Python Code (1), Python Wheel (1), Quickstart (1), RBAC (1), Repos Support (1), Reserved VM's (1), Reset (1), run a job (1), runif (1), S3 (1), SAP SUCCESS FACTOR (1), Schedule (1), SCIM (1), Serverless (1), Service principal (1), Session (1), Sign Up Issues (2), Significant Performance Difference (1), Spark (2), sparkui (2), Splunk (1), sqoop (1), Start (1), Stateful Stream Processing (1), Storage Optimization (1), Structured Streaming ForeachBatch (1), suggestion (1), Summit23 (2), Support Tickets (1), Sydney (2), Table Download (1), tabrikck (1), Tags (1), Troubleshooting (1), ucx (2), Unity Catalog (1), Unity Catalog Error (2), Unity Catalog Metastore (1), UntiyCatalog (1), Update (1), user groups (1), Venicold (3), volumes (2), Voucher Not Recieved (1), Watermark (1), Weekly Documentation Update (1), with open (1), Women (1), Workflow (2), Workspace (2)