- 1157 Views
- 1 replies
- 0 kudos
2023 AI+Data Summit
Hello Everyone, I'm here to learn about technology advancements and how they can help my company. The keynotes were great, I learned a lot through the breakout sessions, and I'm looking forward to giving back to the community.
Hi @shahebaj, We're thrilled to hear about your positive experience at DAIS 2023 and your eagerness to give back to the community! Your participation and engagement are what make events like DAIS so valuable. We appreciate your support and look forwa...
- 723 Views
- 1 replies
- 0 kudos
multi-cloud data streaming talk today at Data + AI Summit 2023
The talk highlighted the benefits of using an open data lake for unified batch and streaming workloads and showcased features like Autoloader for data discovery, streaming triggers for seamless switching, and streaming aggregation for incremental com...
Hi @Hyperparam42, Thank you for sharing your experience and insights from the Data + AI Summit 2023! We're thrilled to hear that you found the talk on multicloud data streaming to be enlightening. We wanted to let you know that the Databricks Commun...
- 945 Views
- 1 replies
- 0 kudos
Great Conference!!
Great conference! It was cool learning about creating my own LLM and the risks associated with LLMs. My favorite activity was the Spin at Dark on Tuesday night!
Hi @dy, Great to hear you had a valuable experience at DAIS 2023! We appreciate your attendance and participation. We wanted to share that the Databricks Community Team will be returning to San Francisco to host the Databricks Community booth at DAIS...
- 618 Views
- 1 replies
- 0 kudos
Training at Summit
I enrolled in the “Machine Learning in Production” and “LLMs in Production” classes and completed both. Great training! Looking forward to implementing what I learned soon!
Hi @DatabricksBofA, That's fantastic to hear! It's always rewarding to see attendees like yourself finding value in the training and looking forward to implementing what you've learned. Your enthusiasm is truly appreciated! I wanted to share some exc...
- 740 Views
- 1 replies
- 1 kudos
First timer at Summit and super impressed!
What a great community of people and practitioners! Loved to learn more about the path forward and all the ways this community will certainly have an impact on the future of AI.
Hi @ValMir, That's a wonderful sentiment! It's fantastic to hear that you found the DAIS event so enriching. Your enthusiasm for the community and its impact on the future of AI is truly inspiring. We greatly appreciate your attendance and participat...
- 1320 Views
- 3 replies
- 0 kudos
VS Code integration with Python Notebook and Remote Cluster
Hi, I'm trying to work in VS Code on my machine instead of using the Databricks environment in my browser. I have gone through the documentation to set up the Databricks extension and have also set up Databricks Connect, but I don't feel like they work ...
Hi @mohaimen_syed, It sounds like you’re trying to use Databricks Connect to run a Python notebook on a remote Azure Databricks cluster from your local machine. Let’s break down the steps to achieve this: Configure Azure Databricks Authentication...
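For local development like this, both the VS Code extension and Databricks Connect can read connection details from a configuration profile. A minimal sketch of what that profile often looks like follows; the host URL, token placeholder, and cluster ID are all illustrative values, not real ones, and whether you keep `cluster_id` in the profile or configure it in the tool is a setup choice:

```ini
; ~/.databrickscfg — shared profile for the VS Code extension and
; Databricks Connect (all values below are placeholders)
[DEFAULT]
host       = https://adb-1234567890123456.7.azuredatabricks.net
token      = <personal-access-token>
cluster_id = 0123-456789-abcdef12
```

With a working profile in place, Databricks Connect v2 code can typically obtain a session with `DatabricksSession.builder.getOrCreate()` from the `databricks.connect` package, rather than building a local `SparkSession`.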
- 1557 Views
- 2 replies
- 0 kudos
Py4JError: An error occurred while calling o992.resourceProfileManager
Hello, I am trying to run the SparkXGBoostRegressor and I am getting the following error: Py4JError: An error occurred while calling o992.resourceProfileManager. Trace: py4j.security.Py4JSecurityException: Method public org.apache.spark.resource...
Hi @rahuja, The error you’re encountering might be related to the interaction between PySpark and XGBoost. Let’s explore some potential solutions: PySpark Version Compatibility: Ensure that your PySpark version is compatible with the XGBoost vers...
- 606 Views
- 1 replies
- 0 kudos
Getting databricks-connect com.fasterxml.jackson.databind.exc.MismatchedInputException parse warning
Hi community, I am getting the warning below when I run PySpark code for some of my use cases using databricks-connect. Is this a critical warning, and any idea what it means? Logs: WARN DatabricksConnectConf: Could not parse /root/.databricks-c...
Hi, @Surajv, The warning you’re encountering is related to using Databricks Connect with PySpark. Databricks Connect: Databricks Connect is a Python library that allows you to connect your local development environment to a Databricks cluster. I...
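A "Could not parse" warning like this usually means the configuration file exists but is not the JSON the client expects. As a rough sketch, a well-formed legacy `~/.databricks-connect` file looks like the fragment below; all values are placeholders, and the exact set of keys depends on the databricks-connect version in use:

```json
{
  "host": "https://adb-1234567890123456.7.azuredatabricks.net",
  "token": "<personal-access-token>",
  "cluster_id": "0123-456789-abcdef12",
  "org_id": "0",
  "port": "15001"
}
```

If the file contains anything other than a single valid JSON object (trailing commas, shell-style comments, leftover prompts from an interrupted `databricks-connect configure`), regenerating it typically clears the warning.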
- 955 Views
- 1 replies
- 0 kudos
Optimal Strategies for downloading large query results with Databricks API
Hi everyone, I'm currently facing an issue with handling a large amount of data using the Databricks API. Specifically, I have a query that returns a significant volume of data, sometimes resulting in over 200 chunks. My initial approach was to retriev...
Hi @rafal_walisko, Handling large volumes of data using the Databricks API can indeed be challenging, especially when dealing with numerous chunks. Let’s explore some strategies that might help you optimize your approach: Rate Limits and Paral...
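Whatever endpoint is used to fetch the individual chunks, the download loop itself tends to follow the same shape: walk the chunk indexes in order, retry transient failures with backoff so a rate-limit blip doesn't abort a long download, and stream each chunk out instead of accumulating 200+ of them in memory. A minimal sketch, where `fetch_chunk` is a hypothetical stand-in for whatever call retrieves one chunk (for example, following a result's external links):

```python
import time
from typing import Callable, Iterator, Optional

def iter_chunks(
    fetch_chunk: Callable[[int], Optional[bytes]],
    max_retries: int = 3,
    backoff_s: float = 1.0,
) -> Iterator[bytes]:
    """Yield result chunks one at a time until fetch_chunk returns None.

    fetch_chunk(i) stands in for the API call that retrieves chunk i.
    Transient errors are retried with exponential backoff so a single
    rate-limit response does not abort the whole download.
    """
    index = 0
    while True:
        for attempt in range(max_retries):
            try:
                chunk = fetch_chunk(index)
                break
            except IOError:
                if attempt == max_retries - 1:
                    raise
                time.sleep(backoff_s * 2 ** attempt)
        if chunk is None:  # no more chunks to fetch
            return
        yield chunk
        index += 1
```

Because this is a generator, the caller can write each chunk to disk as it arrives (`for chunk in iter_chunks(...): out.write(chunk)`), keeping memory use constant regardless of how many chunks the query produces.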
- 1738 Views
- 1 replies
- 1 kudos
connection from databricks to snowflake using OKTA
Hi team, this is how I connect to Snowflake from a Jupyter Notebook:
```
import snowflake.connector

snowflake_connection = snowflake.connector.connect(
    authenticator='externalbrowser',
    user='U1',
    account='company1.us-east-1',
    database='db1',
    ...
```
Hi @ymt, It seems you’ve encountered an issue while connecting to Snowflake from your Databricks Notebook. The error message you received is: ImportError: cannot import name 'NamedTuple' from 'typing_extensions' (/databricks/python/lib/python3.9/s...
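An `ImportError` for `NamedTuple` from `typing_extensions` is usually a sign that the cluster's preinstalled `typing_extensions` is too old for the Snowflake connector's dependencies. One common remedy (a suggestion, not a guaranteed fix) is to upgrade the package in notebook scope and restart the Python process so the new version is picked up:

```
%pip install --upgrade typing_extensions snowflake-connector-python
dbutils.library.restartPython()
```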
- 4020 Views
- 5 replies
- 1 kudos
Job parameters to get date and time
I'm trying to set up a workflow in Databricks, and I need my job parameter to get the date and time. I see in the documentation there are some options for dynamic values. I'm trying to use this one: {{job.start_time.[argument]}}. For the "argument" there, ...
Then please change the code to:
```
iso_datetime = dbutils.widgets.get("LoadID")
```
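To make the round trip concrete: if the job parameter (here named LoadID, following the reply above) is set to a start-time dynamic value that renders as an ISO-8601 string, the notebook receives it as plain text and can parse it back into a datetime with the standard library. A small sketch, assuming an ISO-formatted value:

```python
from datetime import datetime

def parse_load_id(raw: str) -> datetime:
    """Parse the ISO-8601 start time the job passes in as a string.

    raw is what dbutils.widgets.get("LoadID") would return inside the
    notebook, e.g. "2024-03-01T06:30:00".
    """
    return datetime.fromisoformat(raw)
```

For example, `parse_load_id("2024-03-01T06:30:00")` yields a datetime for 06:30 on 2024-03-01, which can then be formatted or shifted as the workflow requires.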
- 3842 Views
- 2 replies
- 0 kudos
Getting python version errors when using pyspark rdd using databricks connect
Hi community, when I use PySpark RDD-related functions in my environment using databricks-connect, I get the error below. Databricks cluster version: 12.2. `RuntimeError: Python in worker has different version 3.9 than that in driver 3.10, PySpark cannot...
Got it. As a side note, I tried the above methods but the error persisted. Upon reading the docs again, I found this statement: You must install Python 3 on your development machine, and the minor version of your client Python installation must be t...
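A quick local check along those lines can catch the mismatch before any Spark job runs. This is only a sketch of the idea quoted above (client minor version must match the cluster's); the cluster version string would come from the runtime's release notes or the cluster UI:

```python
import sys

def versions_match(cluster_python: str) -> bool:
    """Return True if the local interpreter's major.minor version
    matches the cluster's Python version string (e.g. "3.9.5")."""
    major, minor = (int(part) for part in cluster_python.split(".")[:2])
    return sys.version_info[:2] == (major, minor)
```

Running this at the top of a script (and failing fast when it returns False) turns the opaque worker/driver `RuntimeError` into an explicit, local error message.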
- 2610 Views
- 3 replies
- 0 kudos
Workflows: Running dependent task despite earlier task fail
I have a scheduled task running in a workflow. Task 1 computes some parameters, which are then picked up by a dependent reporting task, Task 2. I want Task 2 to report "Failure" if Task 1 fails. Yet creating a dependency in workflows means that Task 2 wil...
Hi @sharpbetty, any suggestions how I can keep the parameter sharing and dependency from Task 1 to Task 2, yet also allow Task 2 to fire even on failure of Task 1? Setup: Task 2 dependent on Task 1. Challenge: to fire Task 2 even on Task 1 failure. Soluti...
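One way to keep the dependency while still firing the downstream task is the per-task "Run if dependencies" condition: with it set to "All done", Task 2 runs once Task 1 finishes, whether Task 1 succeeded or failed. A Jobs API-style fragment sketching this (the task keys and notebook paths are placeholder names, not from the thread):

```json
{
  "tasks": [
    {
      "task_key": "task1",
      "notebook_task": { "notebook_path": "/Jobs/compute_params" }
    },
    {
      "task_key": "task2",
      "depends_on": [ { "task_key": "task1" } ],
      "run_if": "ALL_DONE",
      "notebook_task": { "notebook_path": "/Jobs/report" }
    }
  ]
}
```

Parameter sharing can still flow across the edge: Task 1 publishes values with `dbutils.jobs.taskValues.set(...)`, and Task 2 reads them with `dbutils.jobs.taskValues.get(...)`, supplying a default for the case where Task 1 failed before setting them.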
- 2668 Views
- 2 replies
- 1 kudos
Databricks Users Access Control via Azure AAD?
Hi All, looking for suggestions on whether it is possible to control users via Azure AD (outside of Azure Databricks). I want to create new users in Azure and then give RBAC to individual users, rather than control their permissions f...
Thank you Kaniz, let me try some of the options, as my Databricks is integrated with AAD. Let me try Option 1, as that's my primary requirement.
- 3843 Views
- 4 replies
- 1 kudos
Resolved! Can't create cluster: "Aws Authorization Failure:" .. not authorized to perform: sts:AssumeRole
Full error here: Aws Authorization Failure: Failure happened when talking to AWS. AWS API error code: AccessDenied. AWS error message: User: arn:aws:iam::414351767826:user/ConsolidatedManagerIAMUser-ConsolidatedManagerUser-VX02FYW0SSCY is not authorized...