Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

Dimitry
by Valued Contributor
  • 3544 Views
  • 4 replies
  • 0 kudos

Resolved! UDF fails with "No module named 'dbruntime'" when using dbutils

I've got a UDF which I call using applyInPandas. That UDF is to distribute API calls. It uses my custom .py library files that make these calls. Everything worked until I used `dbutils.widgets.get` and `dbutils.secrets.get` inside these libraries. It thro...

Latest Reply
df_dbx
New Contributor II
  • 0 kudos

Answering my own question. Similar to the original response, the answer was to pass in the secret as a function argument: CREATE OR REPLACE FUNCTION geocode_address(address STRING, api_key STRING) RETURNS STRUCT<latitude: DOUBLE, longitude: DOUBLE> ...

3 More Replies
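The accepted fix above generalizes: `dbutils` is only importable on the driver, so resolve widgets and secrets there and pass the values into the UDF as plain arguments, rather than calling `dbutils` inside code that runs on the workers. A minimal sketch of the pattern (function and variable names here are illustrative, not from the thread):

```python
# dbutils exists only on the driver; workers raise "No module named
# 'dbruntime'" if library code tries to import or call it. So: fetch
# the secret once on the driver, then close over the plain string.

def make_geocode_udf(api_key: str):
    """Build a batch UDF that closes over a plain string, not dbutils."""
    def geocode_batch(addresses):
        # A real version would call the geocoding API with api_key;
        # here we just tag each row to show the key reached the worker.
        return [{"address": a, "key_present": bool(api_key)} for a in addresses]
    return geocode_batch

# On Databricks, on the driver only:
#   api_key = dbutils.secrets.get("my-scope", "geocode-key")
api_key = "dummy-key"  # stand-in so the sketch runs anywhere
udf = make_geocode_udf(api_key)
result = udf(["1 Main St", "2 Elm St"])
```

In a real pipeline the inner function would be handed to `applyInPandas`; the closure serializes the string to the workers, so no worker ever imports `dbruntime`.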
sshukla
by New Contributor III
  • 3313 Views
  • 8 replies
  • 0 kudos

External API not returning any response

import requests
url = "https://example.com/api"
headers = {"Authorization": "Bearer YOUR_TOKEN", "Content-Type": "application/json"}
Payload = json.dumps({json_data})
response = requests.post(url, headers=headers, data=Payload)
print(response.status_code)
p...

Latest Reply
guptaharsh
New Contributor III
  • 0 kudos

How can I reduce the data size? The API is going to return all the data in one go. Can you give an example? res = requests.get("api") The code above is taking a lot of time.

7 More Replies
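For the follow-up about a single slow, oversized response: if the API supports it, requesting the data in pages keeps each call small, and a timeout prevents a call from hanging indefinitely. A sketch of the idea, assuming the API honors `offset`/`limit` query parameters (those names are an assumption, not from the thread):

```python
def page_urls(base_url: str, total: int, page_size: int):
    """Build paginated request URLs so each call fetches a small chunk."""
    pages = (total + page_size - 1) // page_size  # ceiling division
    return [
        f"{base_url}?offset={i * page_size}&limit={page_size}"
        for i in range(pages)
    ]

# Example: fetch 250 records 100 at a time instead of one huge response.
urls = page_urls("https://example.com/api", total=250, page_size=100)

# Against the real API (network call, so shown commented out):
# import requests
# for u in urls:
#     r = requests.get(u, timeout=30)  # timeout avoids waiting forever
#     r.raise_for_status()
#     handle_page(r.json())
```

Whether the server actually supports pagination (and under what parameter names) depends on the API in question; check its documentation first.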
jeremylllin
by New Contributor
  • 1218 Views
  • 1 reply
  • 0 kudos

AccessDenied error on s3a:// bucket due to Serverless Network Policy in Databricks SQL Endpoint

I wrote this code in Notebook:
files = dbutils.fs.ls("s3a://testbuket114/")
for f in files:
    print(f.name)
It caused error: s3a://testbuket114/: getFileStatus on s3a://testbuket114/: com.amazonaws.services.s3.model.AmazonS3Exception: Access to storage destina...

Latest Reply
Isi
Honored Contributor III
  • 0 kudos

Hello @jeremylllin,
From the error message: "Access to storage destination is denied because of serverless network policy"
Databricks serverless environments require explicit network access policies to reach AWS resources like S3. Even if you’ve already ...

manoj991
by New Contributor
  • 565 Views
  • 1 replies
  • 0 kudos

query

I was unable to log in to Databricks Community Edition; I was shown 'User is not a member of this workspace', even after entering the OTP.

Latest Reply
Takuya-Omi
Valued Contributor III
  • 0 kudos

@manoj991 Did you choose “Login to Free Edition” first? If so, please start from “Sign up.”

andre_gonzalez
by New Contributor
  • 3565 Views
  • 3 replies
  • 0 kudos

SQL Warehouse does not work with Power BI online service

Whenever I try to use a SQL Warehouse serverless cluster on a Power BI dataset, it does not refresh on the Power BI online service. It does work normally for other types of Databricks clusters. The catalog is being defined on the Power Query import. I...

Latest Reply
ChuckyDee25
New Contributor II
  • 0 kudos

Hi, we have the exact same issue, even if we specify the catalog in the connection parameters. However, OAuth authentication through a dataflow (instead of from Power Query Desktop) works fine. In Desktop we are on version 2.122.746.0, but the issue is...

2 More Replies
Lebrown
by New Contributor
  • 1039 Views
  • 1 reply
  • 1 kudos

Free Edition and Databricks Asset Bundles

Hi, I would like to learn more about DABs and gain practical knowledge. For this I want to use the Free Edition, but the authentication fails. I have tried both the Databricks extension in VS Code and the Databricks CLI. In the extension, it returns: C...

Latest Reply
ilir_nuredini
Honored Contributor
  • 1 kudos

Hello Lebrown,
Here are the steps I used to deploy jobs and pipelines to Databricks using DABs, with an example (using Free Edition):
1. Install the Databricks CLI (latest version). On Windows, run:
winget search databricks
winget install Databricks.Databr...

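Once the CLI is installed and authenticated, the bundle itself is defined in a `databricks.yml` at the project root. A minimal sketch of one, assuming a single notebook job deployed to a dev target (the bundle name, job name, host placeholder, and notebook path below are all illustrative, not from the reply):

```yaml
# databricks.yml - minimal bundle sketch (names are illustrative)
bundle:
  name: my_free_edition_bundle

targets:
  dev:
    mode: development
    default: true
    workspace:
      host: https://<your-workspace>.cloud.databricks.com

resources:
  jobs:
    hello_job:
      name: hello-job
      tasks:
        - task_key: hello
          notebook_task:
            notebook_path: ./src/hello.ipynb
```

With this in place, `databricks bundle validate` checks the config and `databricks bundle deploy -t dev` pushes it to the workspace configured for the `dev` target.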
billyboy
by New Contributor II
  • 1890 Views
  • 2 replies
  • 4 kudos

Resolved! How To Remove Extra Databricks Free Edition Account

Accidentally made an extra Databricks Free Edition account/workspace on the same email while messing with the Legacy edition login. Is there a way to delete one of these? The old Community Edition had a "Delete Account" button, but I can't seem to find that f...

Latest Reply
ilir_nuredini
Honored Contributor
  • 4 kudos

Hello Billyboy,
I can’t seem to find the option either, but one of the limitations of the Free Edition is: “Databricks may delete Free Edition accounts that are inactive for a prolonged period.” So, you could simply avoid logging into that account for a...

1 More Replies
upskill
by New Contributor
  • 952 Views
  • 1 reply
  • 0 kudos

Resolved! Delete workspace in Free account

I created a Free Edition account and I used my Google account for logging in. I see 2 workspaces got created. I want to delete one of them. How can I delete one of the workspaces? If it is not possible, how can I delete my account as a whole?

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @upskill! Did you possibly sign in twice during setup? That can sometimes lead to separate accounts, each with its own workspace. Currently, there’s no self-serve option to remove a workspace or delete an account. You can reach out to help@data...

ChristianRRL
by Honored Contributor
  • 3211 Views
  • 3 replies
  • 1 kudos

DQ Expectations Best Practice

Hi there, I hope this is a fairly simple and straightforward question. I'm wondering if there's a "general" consensus on where along the DLT data ingestion + transformation process data quality expectations should be applied. For example, two very si...

Latest Reply
dataoculus_app
New Contributor III
  • 1 kudos

In my opinion, you can keep the bronze/raw layer as it is; the quality checks should be applied at silver.

2 More Replies
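The advice above ("keep bronze raw, check at silver") can be sketched concretely. The quality rule itself is plain logic, which DLT then wraps as an expectation; the column names and rule below are illustrative, not from the thread:

```python
# Silver-layer quality rule as a plain predicate (columns illustrative).
def is_valid(row: dict) -> bool:
    """Row passes if it has an id and a non-negative amount."""
    return row.get("id") is not None and row.get("amount", 0) >= 0

# In a DLT pipeline the same rule would become an expectation, e.g.:
# import dlt
# @dlt.table
# @dlt.expect_or_drop("valid_row", "id IS NOT NULL AND amount >= 0")
# def silver_orders():
#     return dlt.read("bronze_orders")  # bronze stays untouched

# Plain-Python equivalent of expect_or_drop, runnable anywhere:
rows = [
    {"id": 1, "amount": 10.0},
    {"id": None, "amount": 5.0},   # dropped: missing id
    {"id": 2, "amount": -1.0},    # dropped: negative amount
]
silver = [r for r in rows if is_valid(r)]
```

Keeping the predicate separate from the DLT decorator also makes the rule unit-testable outside a pipeline run.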
Dimitry
by Valued Contributor
  • 1576 Views
  • 2 replies
  • 1 kudos

Resolved! Struggle to parallelize UDF

Hi all, I have 2 clusters that look identical, but one runs my UDF in parallel and the other does not. The one that does is personal; the bad one is shared.
import pandas as pd
from datetime import datetime
from time import sleep
import threading
# test f...

Latest Reply
Dimitry
Valued Contributor
  • 1 kudos

As a side note, a "No Isolation Shared" cluster has no access to Unity Catalog, so no table queries. I resorted to using personal compute assigned to a group.

1 More Replies
Jerry01
by New Contributor III
  • 1788 Views
  • 1 reply
  • 0 kudos

How to override an in-built function in Databricks

I am trying to override the is_member() built-in function in such a way that it always returns true. How can I do it in Databricks using SQL or Python?

Latest Reply
xbgydx12
New Contributor II
  • 0 kudos

To reactivate this question: I have a similar requirement. I want to override shouldRetain(log: T, currentTime: Long) in the class org.apache.spark.sql.execution.streaming.CompactibleFileStreamLog so that it also always returns true.

zent
by New Contributor
  • 1275 Views
  • 1 reply
  • 0 kudos

Requirements for Managed Iceberg tables with Unity Catalog

Does Databricks support creating native Apache Iceberg tables (managed) in Unity Catalog, or is it possible only in private preview? What are the requirements?

Latest Reply
Advika
Community Manager
  • 0 kudos

Hello @zent! Databricks now fully supports creating Apache Iceberg managed tables in Unity Catalog, and this capability is available in Public Preview (not just private preview). These managed Iceberg tables can be read and written by Databricks and ...

Anton_Lagergren
by Contributor II
  • 2760 Views
  • 2 replies
  • 1 kudos

Resolved! New Regional Group Request

Hello! How may I request and/or create a new Regional Group for the DMV Area (DC, Maryland, Virginia)? Thank you, —Anton @DB_Paul @Sujitha

Latest Reply
nayan_wylde
Esteemed Contributor
  • 1 kudos

Is there a group you already created?

1 More Replies
darkanita81
by New Contributor III
  • 1574 Views
  • 3 replies
  • 3 kudos

Resolved! How be a part of Databricks Groups

Hello, I am part of a Community Databricks Crew LATAM, where we have reached 300 people connected and have run 3 events, one per month. We want to be part of Databricks Groups but we don't know how to do that. If somebody can help me I will a...

Latest Reply
Rishabh_Tiwari
Community Manager
  • 3 kudos

Hi Ana, Thanks for reaching out! I won’t be attending DAIS this time, but we do have a Databricks Community booth set up near the Expo Hall. My colleague @Sujitha  will be there. Do stop by to say hi and learn about all the exciting things we have go...

2 More Replies