- 2738 Views
- 8 replies
- 0 kudos
External API not returning any response
import requests
import json

url = "https://example.com/api"
headers = {"Authorization": "Bearer YOUR_TOKEN", "Content-Type": "application/json"}
payload = json.dumps(json_data)  # json_data: the request body dict
response = requests.post(url, headers=headers, data=payload)
print(response.status_code)
p...
- 0 kudos
How can I reduce the data size? The API is going to return all of the data in one call. Can you give an example?
res = requests.get("api")
The code above is taking a lot of time.
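One common way to avoid pulling everything in a single slow call is to page through the API. The sketch below separates the paging loop from the HTTP transport; the `page`/`per_page` parameter names and the example URL are assumptions for illustration, not details from this thread — adapt them to whatever the real API documents.

```python
def paginate(fetch_page):
    """Yield items page by page until an empty page is returned.

    `fetch_page(page_number)` must return a list of items for that page
    (an empty list when exhausted).
    """
    page = 1
    while True:
        batch = fetch_page(page)
        if not batch:          # empty/missing page => no more data
            break
        yield from batch
        page += 1

def fetch_page_http(page, url="https://example.com/api/items"):
    # Hypothetical endpoint and params; a timeout makes slow calls fail fast
    # instead of hanging.
    import requests
    resp = requests.get(url, params={"page": page, "per_page": 100}, timeout=30)
    resp.raise_for_status()
    return resp.json()

# Demonstrate the paging flow with a stubbed fetcher (no network needed):
pages = {1: [1, 2], 2: [3], 3: []}
print(list(paginate(pages.get)))  # [1, 2, 3]
```

With a real API you would pass `fetch_page_http` instead of the stub and process items as they stream out of the generator, instead of holding the whole response in memory.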
- 802 Views
- 1 replies
- 0 kudos
AccessDenied error on s3a:// bucket due to Serverless Network Policy in Databricks SQL Endpoint
I wrote this code in a Notebook:
files = dbutils.fs.ls("s3a://testbuket114/")
for f in files:
    print(f.name)
It caused the error:
s3a://testbuket114/: getFileStatus on s3a://testbuket114/: com.amazonaws.services.s3.model.AmazonS3Exception: Access to storage destina...
- 0 kudos
Hello @jeremylllin,
From the error message:
Access to storage destination is denied because of serverless network policy
Databricks serverless environments require explicit network access policies to reach AWS resources like S3. Even if you’ve already ...
- 404 Views
- 1 replies
- 0 kudos
query
I was unable to log in to Databricks Community Edition; it showed 'User is not a member of this workspace', even after entering the OTP.
- 0 kudos
@manoj991 Did you choose “Login to Free Edition” first? If so, please start from “Sign up.”
- 3021 Views
- 3 replies
- 0 kudos
SQL Warehouse does not work with Power BI online service
Whenever I try to use a SQL Warehouse serverless cluster with a Power BI dataset, it does not refresh in the Power BI online service. It works normally for other types of Databricks clusters. The catalog is being defined in the Power Query import. I...
- 0 kudos
Hi, we have the exact same issue, even if we specify the catalog in the connection parameters. However, OAuth authentication through a dataflow (instead of from Power Query Desktop) works fine. In Desktop we are on version 2.122.746.0, but the issue is...
- 606 Views
- 1 replies
- 1 kudos
Free Edition and Databricks Asset Bundles
Hi, I would like to learn more about DABs and gain practical knowledge. For this I want to use the Free Edition, but the authentication fails. I have tried both the Databricks extension in VS Code and the Databricks CLI. In the extension, it returns: C...
- 1 kudos
Hello Lebrown,
Here are the steps I used to deploy jobs and pipelines to Databricks using DABs, with an example (using Free Edition):
1. Install the Databricks CLI (latest version). On Windows, run:
winget search databricks
winget install Databricks.Databr...
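The reply above is cut off after the CLI install step. For orientation, a minimal `databricks.yml` bundle definition looks roughly like this — the bundle name, job name, notebook path, and workspace host below are placeholders, not values from this thread:

```yaml
# databricks.yml -- minimal Databricks Asset Bundle (all names are placeholders)
bundle:
  name: my_first_bundle

resources:
  jobs:
    hello_job:
      name: hello-job
      tasks:
        - task_key: hello
          notebook_task:
            notebook_path: ./src/hello.ipynb

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://<your-workspace>.cloud.databricks.com
```

After authenticating with the CLI, `databricks bundle validate` checks the configuration and `databricks bundle deploy -t dev` pushes the job to the target workspace.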
- 1310 Views
- 2 replies
- 4 kudos
Resolved! How To Remove Extra Databricks Free Edition Account
Accidentally made an extra Databricks Free Edition account/workspace on the same email while messing with the Legacy edition login. Is there a way to delete one of these? The old Community Edition had a "Delete Account" button, but I can't seem to find that f...
- 4 kudos
Hello Billyboy,
I can’t seem to find the option either, but one of the limitations of the Free Edition is:
“Databricks may delete Free Edition accounts that are inactive for a prolonged period.”
So, you could simply avoid logging into that account for a...
- 1155 Views
- 1 replies
- 0 kudos
I can't create a compute resource beyond "SQL Warehouse", "Vector Search" and "Apps"?
None of the LLMs even understand why I can't create a compute resource. I was using Community (now Free Edition) until yesterday, when it became apparent that I needed the paid version, so I upgraded. I've even got my AWS account connected, which was ...
- 0 kudos
Hello Jeremyy,
The Free Edition has some limitations in terms of compute. As you noticed, there is no option to create custom compute; custom compute configurations and GPUs are not supported. Free Edition users only have access to ser...
- 725 Views
- 1 replies
- 0 kudos
Delete workspace in Free account
I created a Free Edition account and used my Google account to log in. I see that two workspaces got created, and I want to delete one of them. How can I delete one of the workspaces? If that is not possible, how can I delete my account as a whole?
- 0 kudos
Hello @upskill! Did you possibly sign in twice during setup? That can sometimes lead to separate accounts, each with its own workspace. Currently, there’s no self-serve option to remove a workspace or delete an account. You can reach out to help@data...
- 2840 Views
- 3 replies
- 1 kudos
DQ Expectations Best Practice
Hi there, I hope this is a fairly simple and straightforward question. Is there a "general" consensus on where along the DLT data ingestion and transformation process data quality expectations should be applied? For example, two very si...
- 1 kudos
In my opinion, you can keep the bronze/raw layer as it is; quality checks should be applied at the silver layer.
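In DLT that pattern is usually expressed with expectation decorators such as `@dlt.expect_or_drop("rule", "condition")`, which only run inside a pipeline. As a plain-Python sketch of the same idea (the rule names and sample rows below are made up for illustration):

```python
# "Expectations at the silver layer": bronze rows are kept raw, and rules
# drop bad records on the way into silver, while counting failures per rule.
bronze = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},   # fails "valid_id"
    {"id": 3, "amount": -2.0},     # fails "positive_amount"
]

expectations = {
    "valid_id": lambda r: r["id"] is not None,
    "positive_amount": lambda r: r["amount"] > 0,
}

def to_silver(rows, rules):
    """Return rows passing every rule, plus per-rule failure counts."""
    failures = {name: 0 for name in rules}
    silver = []
    for row in rows:
        bad = [name for name, check in rules.items() if not check(row)]
        for name in bad:
            failures[name] += 1
        if not bad:
            silver.append(row)
    return silver, failures

silver, failures = to_silver(bronze, expectations)
print(silver)    # [{'id': 1, 'amount': 10.0}]
print(failures)  # {'valid_id': 1, 'positive_amount': 1}
```

The failure counts mirror the data quality metrics DLT surfaces for each expectation, which is the main practical argument for checking at silver: bronze stays a faithful copy of the source, and the drop decisions stay observable.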
- 1276 Views
- 2 replies
- 1 kudos
Resolved! Struggle to parallelize UDF
Hi all, I have 2 clusters that look identical, but one runs my UDF in parallel and the other does not. The one that does is personal; the bad one is shared.
import pandas as pd
from datetime import datetime
from time import sleep
import threading
# test f...
- 1 kudos
As a side note, a "No Isolation Shared" cluster has no access to Unity Catalog, so no table queries. I resorted to using personal compute assigned to a group.
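The original test function in the question is truncated, but it appears to time sleeping threads to detect parallelism. A minimal, self-contained version of that kind of check (my own sketch, not the original code) can be run on either cluster type to compare:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def slow_task(n):
    # time.sleep() releases the GIL, so these calls can overlap in threads
    time.sleep(0.2)
    return n * n

start = time.monotonic()
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(slow_task, range(4)))
elapsed = time.monotonic() - start

print(results)  # [0, 1, 4, 9]
# If the four 0.2 s sleeps overlap, elapsed is ~0.2 s; run serially they
# would take ~0.8 s, which is the symptom described on the shared cluster.
print(round(elapsed, 1))
```

Note this only measures driver-side threading; whether a Spark UDF itself runs in parallel also depends on the cluster access mode and how many partitions the input DataFrame has.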
- 1589 Views
- 1 replies
- 0 kudos
How to override a in-built function in databricks
I am trying to override the built-in is_member() function so that it always returns true. How can I do that in Databricks using SQL or Python?
- 0 kudos
To revive this question: I have a similar requirement. I want to override shouldRetain(log: T, currentTime: Long) in the class org.apache.spark.sql.execution.streaming.CompactibleFileStreamLog so that it, too, always returns true.
- 639 Views
- 1 replies
- 0 kudos
Requirements for Managed Iceberg tables with Unity Catalog
Does Databricks support creating native Apache Iceberg (managed) tables in Unity Catalog, or is it possible only in private preview? What are the requirements?
- 0 kudos
Hello @zent! Databricks now fully supports creating Apache Iceberg managed tables in Unity Catalog, and this capability is available in Public Preview (not just private preview). These managed Iceberg tables can be read and written by Databricks and ...
- 2514 Views
- 2 replies
- 1 kudos
Resolved! New Regional Group Request
Hello! How may I request and/or create a new Regional Group for the DMV Area (DC, Maryland, Virginia)?
Thank you,
Anton
@DB_Paul @Sujitha
- 1 kudos
Is there a group you have already created?
- 1311 Views
- 3 replies
- 3 kudos
Resolved! How be a part of Databricks Groups
Hello, I am part of Community Databricks Crew LATAM, where we have 300 people connected and have run 3 events, one per month. We want to be part of Databricks Groups, but we don't know how to do that. If somebody can help me I will a...
- 3 kudos
Hi Ana, Thanks for reaching out! I won’t be attending DAIS this time, but we do have a Databricks Community booth set up near the Expo Hall. My colleague @Sujitha will be there. Do stop by to say hi and learn about all the exciting things we have go...
- 157 Views
- 0 replies
- 0 kudos
How is your experience with dbx in 2025?
How is your experience with dbx in 2025?