- 3582 Views
- 9 replies
- 2 kudos
Spark configuration to access Azure Blob or ADLS
I am new to Databricks and am using the Free Edition. I have tried:

```python
spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "SAS")
spark.conf.set("fs.azure.sas.token.provider.type.<storage-account>.dfs.core.win...
```
- 2 kudos
This is the legacy way of accessing a data lake; with the Free Edition and serverless compute it is not supported. Try creating a storage credential and an external location for the data lake: https://learn.microsoft.com/en-us/azure/databricks/sql/language-manual/sql...
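The external-location approach suggested in the reply might look like this in SQL (a sketch only: `my_adls_location`, `my_storage_credential`, and the container/account names are placeholders, and the storage credential must already exist in Unity Catalog):

```sql
-- Bind an external location to an existing storage credential
CREATE EXTERNAL LOCATION IF NOT EXISTS my_adls_location
  URL 'abfss://mycontainer@mystorageaccount.dfs.core.windows.net/'
  WITH (STORAGE CREDENTIAL my_storage_credential);

-- Data under that path can then be queried directly, no spark.conf calls needed
SELECT * FROM read_files(
  'abfss://mycontainer@mystorageaccount.dfs.core.windows.net/raw/'
);
```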
- 678 Views
- 1 replies
- 0 kudos
How to use the Databricks Community version
Hi guys, as I am new to Databricks, I use the Databricks free tier, but I want to explore it more and practice more. How do we get access to the Community version? When I go to the Community version, it redirects me to the Free Edition, but ...
- 0 kudos
Hello @Amit110409, Databricks has announced the Free Edition of Databricks. It contains many more capabilities than the Community Edition. Regarding DBFS, there is no official update on enabling full access in the Free Edition. Also, DBFS is n...
- 2780 Views
- 2 replies
- 0 kudos
Converting managed tables to external tables
I have some managed tables in the catalog which I plan to convert to external tables, but I want to preserve the version history of the tables as well. I have tried deep cloning, but it builds the external table as version 0. Is there any way I can achieve t...
- 0 kudos
I'm curious to know: what is the reason you are looking to convert from managed to external? Most customers are looking to convert from external to managed, since UC managed tables help them reduce storage costs and increase query speeds. If ther...
- 1159 Views
- 1 replies
- 0 kudos
Azure File Share Connect with Databricks
Hi all, I am working on a task where I need to access an Azure File Share from Databricks and move files from there to a storage account blob container. I found one solution, which is to use the azure-file-share Python package, and it needs a SAS token. But I don't...
- 0 kudos
I think you are on the right track, just getting a bit more granular: once the Azure File Share is mounted, use Spark to move the data from the source path and write it to a blob container.
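A minimal sketch of the move step, assuming both the file share and the blob container are already mounted (the `/dbfs/mnt/...` paths below are placeholders; on Databricks, `dbutils.fs.cp`/`dbutils.fs.rm` on `dbfs:/mnt/...` paths would work equally well):

```python
import shutil
from pathlib import Path

def move_files(src_dir, dst_dir, pattern="*.csv"):
    """Copy every file matching `pattern` from src_dir to dst_dir,
    then delete the source copy. Returns the moved file names."""
    dst = Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in sorted(Path(src_dir).glob(pattern)):
        shutil.copy2(f, dst / f.name)  # copy data and metadata
        f.unlink()                     # remove the original, completing the "move"
        moved.append(f.name)
    return moved

# Hypothetical mount paths on Databricks:
# move_files("/dbfs/mnt/fileshare/incoming", "/dbfs/mnt/blob/landing")
```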
- 3336 Views
- 3 replies
- 0 kudos
Error using VectorAssembler in Unity Catalog
Hello, I am getting the below error while trying to convert my features using VectorAssembler on a Unity Catalog cluster. I tried setting up the config as mentioned in a different post, but it still did not work. Could use some help here. Thank you...
- 0 kudos
I am stuck on the same issue; it does not make any sense to keep these blocked by UC. Let me know if someone has found a solution to this issue.
- 676 Views
- 1 replies
- 0 kudos
Does Databricks run its own compute clusters?
Rather new to Databricks, so I understand this might be a silly question, but from what I understand so far, Databricks leverages Spark for parallelized computation. But when we create a compute, is it using the compute power from whatever cloud provider...
- 0 kudos
Hello billyboy, you can start by looking at the official architecture documentation: https://learn.microsoft.com/en-us/azure/databricks/getting-started/overview And here is an article I like that goes into more detail: https://www.accent...
- 3524 Views
- 2 replies
- 0 kudos
How to create a mount point to File share in Azure Storage account
Hello all, I have a requirement to create a mount point to a file share in an Azure Storage account. I did follow the official documentation; however, I could not create the mount point to the file share, and the documentation described the mount point creatio...
- 0 kudos
Hi Raja, you're correct that the wasbs:// method is for Azure Blob Storage, not File Shares! I believe File Share mounting is different and would require you to use the SMB protocol mounted outside of Databricks, since File Shares aren't natively supported!...
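Since File Shares can't be mounted with wasbs://, one alternative is to go through the SDK instead of a mount. A sketch, assuming the azure-storage-file-share package (whose real entry point is `from azure.storage.fileshare import ShareClient`) is installed on the cluster; the client is passed in as a parameter so the filtering logic itself carries no Azure dependency:

```python
def list_share_files(share_client, directory=""):
    """Return the names of plain files (not sub-directories) in `directory`.

    `share_client` is anything exposing list_directories_and_files(), such as
    a ShareClient built from the account URL and a SAS token.
    """
    return sorted(
        item.name
        for item in share_client.list_directories_and_files(directory)
        if not item.is_directory
    )

# On Databricks (hypothetical account, share, and token):
# from azure.storage.fileshare import ShareClient
# share = ShareClient(account_url="https://<account>.file.core.windows.net",
#                     share_name="myshare", credential="<sas-token>")
# print(list_share_files(share))
```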
- 1734 Views
- 4 replies
- 3 kudos
Databricks Apps Deployment with React codebase
Hi, I need help understanding how we can deploy a frontend (React) codebase via Databricks Apps, as I have tried all templates and custom app creation. It seems only Python-based codebases can be deployed. Let me know if anyone can help me with an approach or fesi...
- 3 kudos
Here is some new documentation for using Node.js with Databricks Apps.
- 1649 Views
- 1 replies
- 0 kudos
Integrate Genie with Teams
Hey, I'm trying to integrate Genie into Teams. I am an admin and have all rights, and I created a Genie to test. We are encountering a PermissionDenied error while interacting with the Genie API via the SDK and a workspace token. Details: Workspace URL: https://dbc-125a3...
- 2051 Views
- 4 replies
- 0 kudos
Resolved! UDF fails with "No module named 'dbruntime'" when using dbutils
I've got a UDF which I call using applyInPandas. That UDF is used to distribute API calls. It uses my custom .py library files that make these calls. Everything worked until I used `dbutils.widgets.get` and `dbutils.secrets.get` inside these libraries. It thro...
- 0 kudos
Answering my own question. Similar to the original response, the answer was to pass the secret in as a function argument: `CREATE OR REPLACE FUNCTION geocode_address(address STRING, api_key STRING) RETURNS STRUCT<latitude: DOUBLE, longitude: DOUBLE> ...`
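The same pattern sketched in Python (hypothetical names throughout): fetch the secret once on the driver, where `dbutils` exists, and hand it to the applyInPandas function through a closure instead of calling `dbutils.secrets.get` inside the worker, where the `dbruntime` module is absent.

```python
import pandas as pd

def make_geocoder(api_key):
    """Return an applyInPandas-compatible function that carries api_key."""
    def geocode(pdf: pd.DataFrame) -> pd.DataFrame:
        out = pdf.copy()
        # A real implementation would call the geocoding API with api_key here;
        # this stand-in only records which key would be used.
        out["key_used"] = api_key
        return out
    return geocode

# Driver side on Databricks (hypothetical scope/key names):
# api_key = dbutils.secrets.get("geo-scope", "geo-api-key")
# df.groupBy("batch_id").applyInPandas(make_geocoder(api_key), schema="...")
```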
- 2568 Views
- 8 replies
- 0 kudos
External API not returning any response
```python
import requests
import json

url = "https://example.com/api"
headers = {
    "Authorization": "Bearer YOUR_TOKEN",
    "Content-Type": "application/json"
}
Payload = json.dumps({json_data})
response = requests.post(url, headers=headers, data=Payload)
print(response.status_code)
p...
```
- 0 kudos
How can I reduce the data size? The API is going to give all the data in one go. Can you give an example? This code is taking a lot of time: `res = requests.get("api")`
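If the API supports pagination, fetching page by page avoids one giant response. A hedged sketch: the `page`/`per_page` parameter names are assumptions, so substitute whatever your API's documentation specifies; the HTTP function is injected (in production, pass `requests.get`) so the loop itself needs no network.

```python
def fetch_all(url, get, headers=None, per_page=100):
    """Accumulate JSON results page by page until an empty page comes back.

    `get` is the HTTP function to use, e.g. requests.get.
    """
    results, page = [], 1
    while True:
        resp = get(url, headers=headers,
                   params={"page": page, "per_page": per_page}, timeout=30)
        resp.raise_for_status()  # fail fast on 4xx/5xx instead of hanging
        batch = resp.json()
        if not batch:            # empty page signals the end of the data
            break
        results.extend(batch)
        page += 1
    return results

# Usage (hypothetical endpoint):
# import requests
# rows = fetch_all("https://example.com/api", requests.get,
#                  headers={"Authorization": "Bearer YOUR_TOKEN"})
```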
- 355 Views
- 0 replies
- 0 kudos
Cannot get tracing to work on GenAI app deployed on Databricks
Hi, I have a Gradio app that is deployed on Databricks. The app comes from this example provided by Databricks. The app works fine, but when I want to add tracing I cannot get it to work. I keep getting the error mlflow.exceptions.MlflowException...
- 700 Views
- 1 replies
- 0 kudos
AccessDenied error on s3a:// bucket due to Serverless Network Policy in Databricks SQL Endpoint
I wrote this code in a notebook:

```python
files = dbutils.fs.ls("s3a://testbuket114/")
for f in files:
    print(f.name)
```

It caused the error: `s3a://testbuket114/: getFileStatus on s3a://testbuket114/: com.amazonaws.services.s3.model.AmazonS3Exception: Access to storage destina...`
- 0 kudos
Hello @jeremylllin, from the error message ("Access to storage destination is denied because of serverless network policy"): Databricks serverless environments require explicit network access policies to reach AWS resources like S3. Even if you've already ...
- 345 Views
- 1 replies
- 0 kudos
Query
I was unable to log in to Databricks Community Edition; I was shown 'User is not a member of this workspace', even after entering the OTP.
- 0 kudos
@manoj991 Did you choose “Login to Free Edition” first?If so, please start from “Sign up.”
- 2862 Views
- 3 replies
- 0 kudos
SQL Warehouse does not work with Power BI online service
Whenever I try to use a SQL Warehouse serverless cluster with a Power BI dataset, it does not refresh in the Power BI online service. It works normally for other types of Databricks clusters. The catalog is being defined in the Power Query import. I...
- 0 kudos
Hi, we have the exact same issue, even if we specify the catalog in the connection parameters. However, OAuth authentication through a dataflow (instead of from Power Query Desktop) works fine. In Desktop we are on version 2.122.746.0, but the issue is...