Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Upendra_Dwivedi
by New Contributor III
  • 1109 Views
  • 1 replies
  • 0 kudos

Azure File Share Connect with Databricks

Hi All, I am working on a task where I need to access an Azure File Share from Databricks and move files from there to a storage account blob container. I found one solution, which is to use the azure-file-share Python package, and it needs a SAS token. But I don't...

Latest Reply
Omerabbasi
New Contributor II
  • 0 kudos

I think you are on the right track, just getting a bit more granular: once the Azure File Share is mounted, use Spark to read the data from the source path and write it to a blob container.
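For the SAS-token route the original post mentions, a minimal sketch could look like the following. This assumes the azure-storage-file-share and azure-storage-blob packages are installed; the SAS URLs, share paths, and the "imported" prefix are all hypothetical placeholders, not anything from the thread.

```python
# Hedged sketch: copy one file from an Azure File Share to a blob container
# using SAS URLs. All names below are hypothetical placeholders.

def blob_name_for(share_path, prefix="imported"):
    """Pure helper: map a file-share path to a blob name under a prefix."""
    return f"{prefix}/{share_path.lstrip('/')}"

def copy_share_file_to_blob(share_sas_url, container_sas_url, share_path):
    # Imports kept inside the function so the helper above works even where
    # the Azure SDKs are not installed.
    from azure.storage.fileshare import ShareClient
    from azure.storage.blob import ContainerClient

    share = ShareClient.from_share_url(share_sas_url)  # SAS-authenticated share
    data = share.get_file_client(share_path).download_file().readall()

    container = ContainerClient.from_container_url(container_sas_url)
    container.upload_blob(name=blob_name_for(share_path), data=data, overwrite=True)
```

For many or large files, iterating the share with `list_directories_and_files` and streaming each file would avoid holding everything in memory at once.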

sue01
by New Contributor II
  • 3271 Views
  • 3 replies
  • 0 kudos

Error with using Vector assembler in Unity Catalog

Hello, I am getting the below error while trying to convert my features using VectorAssembler on a Unity Catalog cluster. I tried setting up the config as mentioned in a different post, but it still did not work. Could use some help here. Thank you...

Get Started Discussions
unitycatalog mlflowerror
Latest Reply
ankur2917
New Contributor II
  • 0 kudos

I am stuck on the same issue; it does not make any sense for these to be blocked by UC. Let me know if someone has found a solution to this issue.

2 More Replies
billyboy
by New Contributor II
  • 640 Views
  • 1 replies
  • 0 kudos

Does Databricks run its own compute clusters?

Rather new to Databricks, so I understand this might be a silly question, but from what I understand so far, Databricks leverages Spark for parallelized computation. When we create a compute, is it using the compute power from whatever cloud provider...

Latest Reply
ilir_nuredini
Honored Contributor
  • 0 kudos

Hello billyboy, you can start by looking at the official architecture documentation: https://learn.microsoft.com/en-us/azure/databricks/getting-started/overview And next, this is an article I like that goes into more detail: https://www.accent...

RajaDOP
by New Contributor
  • 3215 Views
  • 2 replies
  • 0 kudos

How to create a mount point to File share in Azure Storage account

Hello All, I have a requirement to create a mount point to a file share in an Azure Storage account. I followed the official documentation; however, I could not create the mount point to the file share, and the documentation described the mount point creatio...

Latest Reply
Aaaddison
New Contributor II
  • 0 kudos

Hi Raja, you're correct that the wasbs:// method is for Azure Blob Storage, not File Shares. File Share mounting is different and would require you to use the SMB protocol, mounted outside of Databricks, since File Shares aren't natively supported...

1 More Replies
amitkumarvish
by New Contributor II
  • 1553 Views
  • 4 replies
  • 3 kudos

Databricks Apps Deployment with React codebase

Hi, I need help understanding how we can deploy a frontend (React) codebase via Databricks Apps, as I have tried all the templates and Custom App creation. It seems only Python-based codebases can be deployed. Let me know if anyone can help me with an approach or fesi...

Get Started Discussions
Databricks Apps
Latest Reply
cgrant
Databricks Employee
  • 3 kudos

Here is some new documentation for using node.js with Databricks apps.
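To give a rough idea of what a Node.js entry point for a Databricks App can look like, here is a hypothetical app.yaml sketch; the file names and environment variable are placeholders, and the linked documentation is the authoritative reference. A React build would typically be served as static files by this Node server.

```yaml
# Hypothetical app.yaml for a Databricks App with a Node.js entry point.
command: ["node", "server.js"]
env:
  - name: NODE_ENV
    value: production
```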

3 More Replies
Tarun-1100
by New Contributor
  • 1575 Views
  • 1 replies
  • 0 kudos

Integrate Genie with Teams

Hey, I'm trying to integrate Genie into Teams. I am an admin and have all rights, and I created a Genie space to test. We are encountering a PermissionDenied error while interacting with the Genie API via the SDK and a workspace token. Details: Workspace URL: https://dbc-125a3...

Get Started Discussions
API
Genie
integration
Latest Reply
chanukya-pekala
Contributor II
  • 0 kudos

Check this repo - TeamsGenieIntegration

Dimitry
by Contributor
  • 1885 Views
  • 4 replies
  • 0 kudos

Resolved! UDF fails with "No module named 'dbruntime'" when using dbutils

I've got a UDF which I call using applyInPandas. That UDF distributes API calls. It uses my custom .py library files that make these calls. Everything worked until I used `dbutils.widgets.get` and `dbutils.secrets.get` inside these libraries. It thro...

Latest Reply
df_dbx
New Contributor II
  • 0 kudos

Answering my own question. Similar to the original response, the answer was to pass in the secret as a function argument: CREATE OR REPLACE FUNCTION geocode_address(address STRING, api_key STRING) RETURNS STRUCT<latitude: DOUBLE, longitude: DOUBLE> ...
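The pattern in this thread (fetch the secret on the driver, then pass it into worker code as a plain argument instead of calling dbutils inside the UDF) can be sketched without Spark. Here `get_secret_on_driver` is a hypothetical stand-in for `dbutils.secrets.get`, which only exists on the driver, never on workers.

```python
# Sketch of the "pass the secret in as an argument" pattern from this thread.
# get_secret_on_driver is a hypothetical stand-in for dbutils.secrets.get.

def get_secret_on_driver():
    return "sk-demo-key"  # in a notebook: dbutils.secrets.get("scope", "api-key")

def make_geocode_fn(api_key):
    # The returned function closes over a plain string, so it can be shipped
    # to workers (e.g. via applyInPandas) without importing dbruntime there.
    def geocode(address):
        # Hypothetical body: a real version would call an API using api_key.
        return {"address": address, "key_fingerprint": api_key[:2]}
    return geocode

geocode = make_geocode_fn(get_secret_on_driver())  # secret resolved on the driver
result = geocode("1 Main St")
```

The same idea underlies the SQL answer above: the secret travels as a function parameter, not as a runtime lookup inside the function body.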

3 More Replies
sshukla
by New Contributor III
  • 2426 Views
  • 8 replies
  • 0 kudos

External API not returning any response

import requests

url = "https://example.com/api"
headers = {
    "Authorization": "Bearer YOUR_TOKEN",
    "Content-Type": "application/json"
}
Payload = json.dumps({json_data})
response = requests.post(url, headers=headers, data=Payload)
print(response.status_code)
p...

Latest Reply
guptaharsh
New Contributor III
  • 0 kudos

How do I reduce the data size? The API returns all of the data in one call. Can you give an example? res = requests.get("api") - the above code is taking a lot of time.
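One common way to cut the per-call payload is to page through the API instead of requesting everything at once, assuming the API supports it. Below is a minimal sketch where `fetch_page` is a hypothetical stand-in for a real HTTP call such as `requests.get(url, params={"page": page, "page_size": page_size})`.

```python
# Paging sketch: fetch_page stands in for a real paginated HTTP call.

_SERVER_DATA = list(range(100))  # pretend dataset held by the API

def fetch_page(page, page_size):
    start = page * page_size
    return _SERVER_DATA[start:start + page_size]

def fetch_all(page_size=25):
    # Request small chunks until the server returns an empty page.
    results, page = [], 0
    while True:
        chunk = fetch_page(page, page_size)
        if not chunk:
            return results
        results.extend(chunk)
        page += 1
```

If the API instead offers a streaming response, `requests.get(url, stream=True)` with `iter_content` serves a similar purpose: process the body in chunks rather than loading it whole.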

7 More Replies
jeremylllin
by New Contributor
  • 640 Views
  • 1 replies
  • 0 kudos

AccessDenied error on s3a:// bucket due to Serverless Network Policy in Databricks SQL Endpoint

I wrote this code in a notebook:

files = dbutils.fs.ls("s3a://testbuket114/")
for f in files:
    print(f.name)

It caused this error: s3a://testbuket114/: getFileStatus on s3a://testbuket114/: com.amazonaws.services.s3.model.AmazonS3Exception: Access to storage destina...

Latest Reply
Isi
Honored Contributor II
  • 0 kudos

Hello @jeremylllin, from the error message: "Access to storage destination is denied because of serverless network policy". Databricks serverless environments require explicit network access policies to reach AWS resources like S3. Even if you've already ...

manoj991
by New Contributor
  • 313 Views
  • 1 replies
  • 0 kudos

query

I was unable to log in to Databricks Community Edition; I was shown "User is not a member of this workspace", even after entering the OTP.

Latest Reply
Takuya-Omi
Valued Contributor III
  • 0 kudos

@manoj991 Did you choose "Login to Free Edition" first? If so, please start from "Sign up" instead.

AshwinGujarathi
by New Contributor
  • 470 Views
  • 0 replies
  • 0 kudos

Issue with SQL Server ingestion in Databricks Lakeflow - Gateway stuck at "Starting"

Context: I am working on a POC to explore the Databricks Lakeflow ingestion feature. I successfully created an ingestion pipeline with the Salesforce connector and was able to load the data into a Delta table. However, when I am trying to create an ingestion ...

Get Started Discussions
Databricks Lakeflow
andre_gonzalez
by New Contributor
  • 2773 Views
  • 3 replies
  • 0 kudos

SQL Warehouse does not work with Power BI online service

Whenever I try to use a SQL Warehouse serverless cluster with a Power BI dataset, it does not refresh in the Power BI online service. It works normally with other types of Databricks clusters. The catalog is being defined in the Power Query import. I...

Latest Reply
ChuckyDee25
New Contributor II
  • 0 kudos

Hi, we have the exact same issue, even if we specify the catalog in the connection parameters. However, OAuth authentication through a dataflow (instead of from Power Query Desktop) works fine. In Desktop we are on version 2.122.746.0, but the issue is...

2 More Replies
Lebrown
by New Contributor
  • 467 Views
  • 1 replies
  • 1 kudos

Free Edition and Databricks Asset Bundles

Hi, I would like to learn more about DABs and gain practical knowledge. For this I want to use the Free Edition, but the authentication fails. I have tried both the Databricks extension in VS Code and the Databricks CLI. In the extension, it returns: C...

Latest Reply
ilir_nuredini
Honored Contributor
  • 1 kudos

Hello Lebrown, here are the steps I used to deploy jobs and pipelines to Databricks using DABs (using the Free Edition):

1. Install the Databricks CLI (latest version). On Windows, run:

winget search databricks
winget install Databricks.Databr...
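For context, the bundle definition those CLI steps deploy is a databricks.yml at the project root; a minimal sketch might look like this, with the bundle name and workspace host as placeholders to replace with your own values:

```yaml
# databricks.yml - minimal hypothetical bundle definition
bundle:
  name: free-edition-demo

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://dbc-xxxxxxxx.cloud.databricks.com
```

With this in place, `databricks bundle validate` and `databricks bundle deploy -t dev` operate against the `dev` target.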

billyboy
by New Contributor II
  • 1040 Views
  • 2 replies
  • 4 kudos

Resolved! How To Remove Extra Databricks Free Edition Account

Accidentally made an extra Databricks Free Edition account/workspace on the same email while messing with the Legacy edition login. Is there a way to delete one of these? The old Community Edition had a "Delete Account" button, but I can't seem to find that f...

Latest Reply
ilir_nuredini
Honored Contributor
  • 4 kudos

Hello Billyboy, I can't seem to find the option either, but one of the limitations of the Free Edition is: "Databricks may delete Free Edition accounts that are inactive for a prolonged period." So, you could simply avoid logging into that account for a...

1 More Replies
