- 964 Views
- 2 replies
- 1 kudos
Enroll, Learn, Earn Databricks!!
Hello Team, I attended the session at CTS Manyata on 22nd April. I am interested in pursuing the certifications, but while enrolling it says you are not a member of any group. Link for the available certifications and courses: https://community...
Hi @samgupta88, you can find it in the Partner Academy. Everything is listed in the partner portal.
- 831 Views
- 4 replies
- 0 kudos
UCX Installation error
Error Message: databricks.sdk.errors.platform.ResourceDoesNotExist: Can't find a cluster policy with id: 00127F76E005AE12.
Click into each policy in the Compute UI of the workspace to see whether the policy ID exists. If it does, then the account that invoked the SDK method didn't have workspace admin permissions. A quick way to run the same check from code is sketched below.
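A minimal sketch of that check with the Databricks Python SDK, assuming the databricks-sdk package and default authentication; the policy ID is the one from the error message above:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.errors import ResourceDoesNotExist

w = WorkspaceClient()  # picks up DATABRICKS_HOST / DATABRICKS_TOKEN, etc.

# List every policy visible to the calling principal. If the ID from the
# error is missing here but shows up in the Compute UI, the caller most
# likely lacks workspace admin permissions.
for policy in w.cluster_policies.list():
    print(policy.policy_id, policy.name)

try:
    w.cluster_policies.get(policy_id="00127F76E005AE12")
except ResourceDoesNotExist:
    print("Policy is not visible to this principal")
```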
- 1313 Views
- 3 replies
- 0 kudos
Resolved! .py file running stuck on waiting
Hello, hope you are doing well. We are facing an issue when running .py files. This is fairly recent, and we were not experiencing it last week. As shown in the screenshots below, the .py file hangs on "waiting" after we press "run all". No matt...
Hello, thanks a lot for your answer. We were getting the required permissions to use Firefox in our org, but in the meantime it started working again in Edge after it updated to version 135.0.3179.85 (Official build) (64-bit).
- 7048 Views
- 2 replies
- 2 kudos
Resolved! requirements.txt with cluster libraries
Cluster libraries are supported from version 15.0 - Databricks Runtime 15.0 | Databricks on AWS. How can I specify the requirements.txt file path in the libraries of a job cluster in my workflow? Can I use a relative path? Is it relative from the root of th...
How to install requirements.txt using a GitHub Action:

    - name: Install workspace requirements.txt on cluster
      env:
        CLUSTER_ID: ${{ secrets.DATABRICKS_CLUSTER_ID }}
      run: |
        databricks libraries install \
          --cluster-id "$CLUSTER_ID" \
          --whl "dbfs:/FileStore/enginee...
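For the original question, a hedged sketch of attaching a requirements.txt file as a cluster library through the Python SDK, assuming DBR 15.0+ and an SDK version whose Library type exposes a requirements field; the cluster ID and the workspace path are hypothetical:

```python
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.compute import Library

w = WorkspaceClient()

# Relative paths are not resolved for you, so use an absolute
# /Workspace (or Unity Catalog volume) path.
w.libraries.install(
    cluster_id="1234-567890-abcde123",  # hypothetical cluster ID
    libraries=[Library(requirements="/Workspace/project/requirements.txt")],
)
```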
- 1549 Views
- 5 replies
- 0 kudos
Debugging notebook access to external REST API
I'm using a Python notebook with a REST API to access a system outside Databricks; in this case it's to call a SAS program. Identical Python code works fine if I call it from Jupyter on my laptop, but fails with a timeout when I run it from my Databr...
What happens if you run this command in a notebook: nc -vz hostname 443? If it fails to connect, this means the firewall or security groups associated with the VPC or VNet are not allowing the connection, and you will need to check with your networking te...
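If nc isn't available on the cluster, a rough Python equivalent of the same reachability check (the host and port here are placeholders):

```python
import socket

def check_port(host: str, port: int, timeout: float = 5.0) -> bool:
    """Equivalent of `nc -vz host port`: try a TCP handshake within `timeout`."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError as exc:
        print(f"Connection to {host}:{port} failed: {exc}")
        return False

print(check_port("sas-server.example.com", 443))  # replace with the real host
```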
- 664 Views
- 1 replies
- 0 kudos
Trusted assets vs query examples
Hi community! In recent days I explored trusted assets in my Genie space and it is working very well, but I feel a little confused. In my Genie space I have many query examples; when I create a new function with the same query example to verify th...
Hello @Dulce42! It depends on your use case. If your function covers the scenario well, you don’t need a separate query example. Having both for the same purpose can create redundancy and make things more complex. Choose the option that best fits you...
- 1041 Views
- 2 replies
- 0 kudos
Resolved! Need help to add personal email to databricks partner account
I have been actively using the Databricks Partner Academy for the past three years through my current organization. As I am planning to transition to a new company, I would like to ensure continued access to my training records and certifications. Cur...
- 903 Views
- 1 replies
- 0 kudos
Python versions - Notebooks and DBR
Hi, I have a problem with conflicting Python versions in a notebook running on the Databricks 14-day free trial. One example: spark.conf.get("spark.databricks.clusterUsageTags.clusterName") # Returns: "Python versions in the Spark Connect client and...
Hi @Terje, were you able to fix it? From what I know, during the free trial period we’re limited to the default setup, so version mismatches can’t be resolved unless we upgrade to a paid workspace.
- 550 Views
- 1 replies
- 1 kudos
Completed Machine learning course
I have completed my Machine Learning course as part of the Learning Festival.
- 1624 Views
- 2 replies
- 0 kudos
Python coding in notebook with a (long) token
I have written a Python program (called by a trigger) that uses a token issued by a third-party app (it's circa 400 bytes long, including '.' and '-'). When I copy/paste this token into a Databricks notebook, curious formatting takes place and a coup...
Hey Paul, you can use Databricks secrets to preserve the integrity of the token; a minimal example is sketched below. Here's the Databricks doc for reference: https://docs.databricks.com/aws/en/security/secrets
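A minimal sketch of reading such a token from a secret scope inside a notebook; the scope and key names are hypothetical and must be created beforehand:

```python
# Create the scope and secret once, e.g. with the Databricks CLI
# (assumed current-CLI syntax):
#   databricks secrets create-scope my-scope
#   databricks secrets put-secret my-scope third-party-token

# dbutils is predefined in Databricks notebooks. The token never has to be
# pasted into a cell, so notebook formatting cannot corrupt it.
token = dbutils.secrets.get(scope="my-scope", key="third-party-token")
```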
- 4435 Views
- 3 replies
- 0 kudos
Save output of show table extended to table?
I want to save the output of SHOW TABLE EXTENDED IN catalogName LIKE 'mysearchtext*'; to a table. How do I do that?
Use DESCRIBE EXTENDED customer AS JSON; this returns the metadata as JSON, which you can then load. Applicable to Databricks 16.2 and above: https://docs.databricks.com/aws/en/sql/language-manual/sql-ref-syntax-aux-describe-table
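For the original question, a sketch that should also work on earlier runtimes: SHOW TABLE EXTENDED run through spark.sql returns an ordinary DataFrame, which can be persisted directly (the target table name is hypothetical):

```python
# spark is predefined in Databricks notebooks.
df = spark.sql("SHOW TABLE EXTENDED IN catalogName LIKE 'mysearchtext*'")

# Persist the result; the output includes database, tableName, isTemporary,
# and an `information` text column with the extended metadata.
df.write.mode("overwrite").saveAsTable("my_schema.show_table_extended_output")
```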
- 5250 Views
- 2 replies
- 1 kudos
Missing Genie - Upload File Feature in Preview Section
Despite having admin privileges for both the workspace and the Genie workspace, we are unable to see the "Genie - Upload File" feature under the Preview section, even though the documentation indicates it should be available. We also attempted switching r...
For more information on the upload-a-file option, please refer to https://docs.databricks.com/aws/en/genie/file-upload. It supports CSV and Excel datasets as of now, with the condition that files must be smaller than 200 MB and contain fewer than 100 columns du...
- 1958 Views
- 4 replies
- 4 kudos
Resolved! using Azure Databricks vs using Databricks directly
Hi friends, a quick question regarding how data and workspace controls work while using "Azure Databricks". I am planning to use the Azure Databricks that comes as part of my employer's Azure subscriptions. I work for a public sector organization, which is ...
- 4650 Views
- 0 replies
- 0 kudos
Support for managed identity based authentication in python kafka client
We followed this document https://docs.databricks.com/aws/en/connect/streaming/kafka?language=Python#msk-aad to use the Kafka client to read events from our Event Hub for a feature. As part of the SFI, the guidance is to move away from client secrets and u...
- 555 Views
- 1 replies
- 0 kudos
Right course for ML engineer
Hi, I would like to learn Databricks so that I can look for job opportunities as an ML engineer. I have a background in Python programming and computer vision (OpenCV), but not much experience with Azure, AWS, and so on. Which course here is good with ...
Given your background in Python programming and computer vision but limited experience with cloud platforms, the best pathway to enter the job market as MLE using Databricks is to pursue the Databricks Certified Machine Learning Associate certificati...
Labels: .CSV (1), Access Data (2), Access Databricks (1), Access Delta Tables (2), Account reset (1), ADF Pipeline (1), ADLS Gen2 With ABFSS (1), Advanced Data Engineering (1), AI (1), Analytics (1), Apache spark (1), Apache Spark 3.0 (1), API Documentation (3), Architecture (1), asset bundle (1), Asset Bundles (3), Auto-loader (1), Autoloader (4), AWS security token (1), AWSDatabricksCluster (1), Azure (5), Azure data disk (1), Azure databricks (14), Azure Databricks SQL (6), Azure databricks workspace (1), Azure Unity Catalog (5), Azure-databricks (1), AzureDatabricks (1), AzureDevopsRepo (1), Big Data Solutions (1), Billing (1), Billing and Cost Management (1), Blackduck (1), Bronze Layer (1), Certification (3), Certification Exam (1), Certification Voucher (3), CICDForDatabricksWorkflows (1), Cloud_files_state (1), CloudFiles (1), Cluster (3), Cluster Init Script (1), Community Edition (3), Community Event (1), Community Group (2), Community Members (1), Compute (3), Compute Instances (1), conditional tasks (1), Connection (1), Contest (1), Credentials (1), Custom Python (1), CustomLibrary (1), Data (1), Data + AI Summit (1), Data Engineering (3), Data Explorer (1), Data Ingestion & connectivity (1), Databrick add-on for Splunk (1), databricks (2), Databricks Academy (1), Databricks AI + Data Summit (1), Databricks Alerts (1), Databricks Assistant (1), Databricks Certification (1), Databricks Cluster (2), Databricks Clusters (1), Databricks Community (10), Databricks community edition (3), Databricks Community Edition Account (1), Databricks Community Rewards Store (3), Databricks connect (1), Databricks Dashboard (3), Databricks delta (2), Databricks Delta Table (2), Databricks Demo Center (1), Databricks Documentation (3), Databricks genAI associate (1), Databricks JDBC Driver (1), Databricks Job (1), Databricks Lakehouse Platform (6), Databricks Migration (1), Databricks Model (1), Databricks notebook (2), Databricks Notebooks (3), Databricks Platform (2), Databricks Pyspark (1), Databricks Python Notebook (1), Databricks Repo (1), Databricks Runtime (1), Databricks SQL (5), Databricks SQL Alerts (1), Databricks SQL Warehouse (1), Databricks Terraform (1), Databricks UI (1), Databricks Unity Catalog (4), Databricks Workflow (2), Databricks Workflows (2), Databricks workspace (3), Databricks-connect (1), databricks_cluster_policy (1), DatabricksJobCluster (1), DataCleanroom (1), DataDays (1), Datagrip (1), DataMasking (2), DataVersioning (1), dbdemos (2), DBFS (1), DBRuntime (1), DBSQL (1), DDL (1), Dear Community (1), deduplication (1), Delt Lake (1), Delta Live Pipeline (3), Delta Live Table (5), Delta Live Table Pipeline (5), Delta Live Table Pipelines (4), Delta Live Tables (7), Delta Sharing (2), deltaSharing (1), Deny assignment (1), Development (1), Devops (1), DLT (10), DLT Pipeline (7), DLT Pipelines (5), Dolly (1), Download files (1), Dynamic Variables (1), Engineering With Databricks (1), env (1), ETL Pipelines (1), External Sources (1), External Storage (2), FAQ for Databricks Learning Festival (2), Feature Store (2), Filenotfoundexception (1), Free trial (1), GCP Databricks (1), GenAI (1), Getting started (2), Google Bigquery (1), HIPAA (1), import (1), Integration (1), JDBC Connections (1), JDBC Connector (1), Job Task (1), Lineage (1), LLM (1), Login (1), Login Account (1), Machine Learning (2), MachineLearning (1), Materialized Tables (2), Medallion Architecture (1), Migration (1), ML Model (2), MlFlow (2), Model Training (1), Module (1), Networking (1), Notebook (1), Onboarding Trainings (1), Pandas udf (1), Permissions (1), personalcompute (1), Pipeline (2), Plotly (1), PostgresSQL (1), Pricing (1), Pyspark (1), Python (5), Python Code (1), Python Wheel (1), Quickstart (1), Read data (1), Repos Support (1), Reset (1), Rewards Store (2), Schedule (1), Serverless (3), serving endpoint (1), Session (1), Sign Up Issues (2), Spark Connect (1), sparkui (2), Splunk (2), SQL (8), Summit23 (7), Support Tickets (1), Sydney (2), Table Download (1), Tags (1), terraform (1), Training (2), Troubleshooting (1), Unity Catalog (4), Unity Catalog Metastore (2), Update (1), user groups (1), Venicold (3), Voucher Not Recieved (1), Watermark (1), Weekly Documentation Update (1), Weekly Release Notes (2), Women (1), Workflow (2), Workspace (3)