Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

LLLMMM
by New Contributor III
  • 2587 Views
  • 4 replies
  • 2 kudos

Resolved! Try Databricks sign up failed

Hi, I am trying to use Databricks with the community edition. However, when I tried to create an account, the sign-up failed after I completed the puzzle. 

[Attachment: Screenshot 2024-07-05 at 20.45.53.png]
Latest Reply
sreedevi
New Contributor II
  • 2 kudos

Unable to sign up for Try Databricks.

3 More Replies
tarun_singh
by New Contributor
  • 220 Views
  • 1 reply
  • 1 kudos
Labels: data-skipping, Databricks, delta-lake, z-order
Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

We are going to need a little more information to help you better. What is the scenario? Louis

jact
by New Contributor II
  • 296 Views
  • 1 reply
  • 1 kudos

Why keep both Azure OpenAI and Databricks?

Hi everyone, I’m curious to hear your thoughts on the benefits of having both Azure OpenAI and Azure Databricks within the same ecosystem. From what I can see, Databricks provides a strong foundation for data engineering, governance, and model lifecycl...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 1 kudos

Two use cases I can think of: RAG: use Databricks for vector indexing (e.g., via Delta Lake or FAISS) and Azure OpenAI for inference. Example: a chatbot that queries Databricks-hosted documents and uses GPT-4 for response generation. Agentic workflows:...
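A minimal sketch of that RAG pattern, assuming a Databricks Vector Search index and an Azure OpenAI deployment already exist; the endpoint, index, and deployment names below are placeholders, not anything confirmed in the thread:

```python
# Placeholders throughout: vector search endpoint, index, Azure OpenAI resource and deployment.
from databricks.vector_search.client import VectorSearchClient  # pip install databricks-vectorsearch
from openai import AzureOpenAI                                   # pip install openai

question = "How do I rotate my credentials?"

# 1) Retrieve supporting chunks from a Databricks Vector Search index.
vsc = VectorSearchClient()
index = vsc.get_index(endpoint_name="vs_endpoint", index_name="main.docs.chunks_index")
hits = index.similarity_search(query_text=question, columns=["chunk_text"], num_results=3)
context = "\n\n".join(row[0] for row in hits["result"]["data_array"])

# 2) Generate the answer with an Azure OpenAI GPT-4 deployment.
aoai = AzureOpenAI(api_key="<key>", api_version="2024-02-01",
                   azure_endpoint="https://<resource>.openai.azure.com")
completion = aoai.chat.completions.create(
    model="<gpt-4-deployment-name>",  # the Azure *deployment* name
    messages=[
        {"role": "system", "content": f"Answer using only this context:\n{context}"},
        {"role": "user", "content": question},
    ],
)
print(completion.choices[0].message.content)
```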

int32lama
by New Contributor II
  • 290 Views
  • 2 replies
  • 1 kudos

Resolved! Ingesting data from APIs like Shopify (for orders), Meta Ads, Google Ads, etc.

Hi, I am trying to create some tables by calling the APIs of Shopify/Meta Ads/Google Ads and so on. Where should I make the API calls? Is making API calls in notebooks considered the standard way to ingest in these cases? I intend to make a daily call to ge...
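One hedged sketch of the notebook approach being asked about, assuming the notebook is scheduled as a daily job; the Shopify shop, secret scope, and target table are illustrative placeholders, and a real pipeline would add pagination and error handling:

```python
# Runs inside a Databricks notebook (spark and dbutils are notebook globals).
import json
import requests

token = dbutils.secrets.get(scope="ingest", key="shopify_token")  # placeholder secret scope/key
resp = requests.get(
    "https://<shop>.myshopify.com/admin/api/2024-01/orders.json",  # placeholder shop
    headers={"X-Shopify-Access-Token": token},
    params={"status": "any", "created_at_min": "2024-01-01T00:00:00Z"},
    timeout=60,
)
resp.raise_for_status()
orders = resp.json().get("orders", [])

# Append the raw payload to a bronze Delta table; downstream tasks can parse and model it.
if orders:
    rows = [(o["id"], json.dumps(o)) for o in orders]
    (spark.createDataFrame(rows, ["order_id", "raw_json"])
        .write.mode("append")
        .saveAsTable("bronze.shopify_orders_raw"))
```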

Latest Reply
dejivincent
New Contributor II
  • 1 kudos

Hello @int32lama, I can help you with that if you are interested.

1 More Reply
nageswara
by New Contributor III
  • 147 Views
  • 0 replies
  • 4 kudos

Databricks One

Databricks One is a user interface designed for business users, giving them a single, intuitive entry point to interact with data and AI in Azure Databricks, without needing to navigate technical concepts such as clusters, queries, models, or noteboo...

Hubert-Dudek
by Databricks MVP
  • 174 Views
  • 0 replies
  • 0 kudos

SQL warehouse: A materialized view is the simplest and most cost-efficient way to transform your data

Materialized views running on a SQL warehouse are very cost-efficient, and they are also a simple and powerful data engineering tool. Just be sure that Enzyme updates them incrementally. Read more: https://databrickster.medium.com/sql-wa...
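As a rough illustration, here is a sketch that creates and refreshes a materialized view through the databricks-sql-connector against a SQL warehouse; the hostname, HTTP path, token, and table names are placeholders:

```python
# Placeholders: workspace hostname, warehouse HTTP path, token, and table names.
from databricks import sql  # pip install databricks-sql-connector

ddl = """
CREATE MATERIALIZED VIEW sales.daily_revenue AS
SELECT order_date, SUM(amount) AS revenue
FROM sales.orders
GROUP BY order_date
"""

with sql.connect(server_hostname="<workspace-host>",
                 http_path="/sql/1.0/warehouses/<warehouse-id>",
                 access_token="<token>") as conn:
    with conn.cursor() as cur:
        cur.execute(ddl)
        # Later refreshes are applied incrementally by Enzyme where the plan allows it.
        cur.execute("REFRESH MATERIALIZED VIEW sales.daily_revenue")
```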

[Attachment: mv.png]
Hubert-Dudek
by Databricks MVP
  • 205 Views
  • 0 replies
  • 2 kudos

The purpose of your All-Purpose Cluster

A small, hidden, but useful cluster setting. You can specify that no jobs are allowed on an all-purpose cluster, or vice versa, that an all-purpose cluster can be used only by jobs. Read more: https://databrickster.medium.com/purpose-for-your-...

[Attachment: no_jobs_cluster.png]
tarunnagar
by Contributor
  • 266 Views
  • 1 reply
  • 1 kudos

How to Integrate Machine Learning Model Development with Databricks Workflows?

Hey everyone, I’m currently exploring machine learning model development and I’m interested in understanding how to effectively integrate ML workflows within Databricks. Specifically, I’d like to hear from the community about: How do you structure ML pi...

Latest Reply
jameswood32
Contributor
  • 1 kudos

You can integrate machine learning model development into Databricks Workflows pretty smoothly using the platform’s native tools. The main idea is to treat your ML lifecycle (data prep → training → evaluation → deployment) as a series of tasks within...
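A hedged sketch of that idea, defining the lifecycle as dependent tasks in a single job via the Jobs 2.1 REST API; the workspace host, token, notebook paths, and cluster spec below are placeholders:

```python
# Placeholders: workspace host, token, notebook paths, and cluster spec.
import requests

host = "https://<workspace-host>"
token = "<pat-or-oauth-token>"

job_spec = {
    "name": "ml-model-pipeline",
    "job_clusters": [{
        "job_cluster_key": "ml",
        "new_cluster": {
            "spark_version": "15.4.x-cpu-ml-scala2.12",
            "node_type_id": "Standard_DS3_v2",
            "num_workers": 2,
        },
    }],
    "tasks": [
        {"task_key": "prep", "job_cluster_key": "ml",
         "notebook_task": {"notebook_path": "/ML/01_data_prep"}},
        {"task_key": "train", "job_cluster_key": "ml",
         "depends_on": [{"task_key": "prep"}],
         "notebook_task": {"notebook_path": "/ML/02_train"}},
        {"task_key": "evaluate", "job_cluster_key": "ml",
         "depends_on": [{"task_key": "train"}],
         "notebook_task": {"notebook_path": "/ML/03_evaluate"}},
        {"task_key": "register_and_deploy", "job_cluster_key": "ml",
         "depends_on": [{"task_key": "evaluate"}],
         "notebook_task": {"notebook_path": "/ML/04_register_and_deploy"}},
    ],
}

resp = requests.post(f"{host}/api/2.1/jobs/create",
                     headers={"Authorization": f"Bearer {token}"},
                     json=job_spec)
resp.raise_for_status()
print("Created job", resp.json()["job_id"])
```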

Kruthika
by New Contributor
  • 5495 Views
  • 1 reply
  • 0 kudos

Support for managed identity-based authentication in the Python Kafka client

We followed this document https://docs.databricks.com/aws/en/connect/streaming/kafka?language=Python#msk-aad to use the Kafka client to read events from our Event Hub for a feature. As part of the SFI, the guidance is to move away from client secrets and u...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

Currently, Databricks does not support using Managed Identities directly for Kafka client authentication (e.g., MSK IAM or Event Hubs Kafka endpoint) in Python Structured Streaming connections. However, there is a supported and secure alternative tha...

Sarathk
by New Contributor
  • 3686 Views
  • 2 replies
  • 1 kudos

Resolved! Databricks is not mounting with the storage account, giving java.lang exception error 480

Hi everyone, I am currently facing an issue in our Test environment where Databricks is not able to mount the storage account. We are using the same mount in the other environments (Dev, Preprod, and Prod) and it works fine there witho...

Latest Reply
mark_ott
Databricks Employee
  • 1 kudos

This issue in your Test environment, where Databricks fails to mount an Azure Storage account with the error java.lang.Exception: 480, is most likely related to expired credentials or cached authentication tokens, even though the same configuration w...
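If stale credentials are indeed the cause, a typical remediation is to drop and recreate the mount with fresh service-principal secrets. A sketch, with the mount point, container, storage account, tenant, and secret scope as placeholders:

```python
# Runs in a Databricks notebook; mount point, container, storage account, tenant,
# and secret scope/keys are placeholders.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": dbutils.secrets.get("kv", "sp-client-id"),
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get("kv", "sp-client-secret"),
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# Drop the stale mount if it exists, then remount with the fresh credentials.
if any(m.mountPoint == "/mnt/testdata" for m in dbutils.fs.mounts()):
    dbutils.fs.unmount("/mnt/testdata")

dbutils.fs.mount(
    source="abfss://<container>@<storageaccount>.dfs.core.windows.net/",
    mount_point="/mnt/testdata",
    extra_configs=configs,
)
dbutils.fs.refreshMounts()
```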

1 More Reply
newenglander
by New Contributor II
  • 3031 Views
  • 2 replies
  • 1 kudos

Cannot import editable installed module in notebook

Hi, I have the following directory structure: mypkg/ (setup.py, mypkg/ (__init__.py, module.py), scripts/ (main notebook)). From the `main` notebook I have a cell that runs: %pip install -e /path/to/mypkg. This command appears to succ...
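A sketch of one common workaround for this situation (not confirmed as the fix in this thread): restart the Python process after the editable install so the new path entry is picked up, with a sys.path fallback. The package path is taken from the post:

```python
# Run each "cell" separately in the notebook.

# Cell 1 (notebook magic): %pip install -e /path/to/mypkg

# Cell 2: restart the Python process so the editable install's path entry is picked up.
dbutils.library.restartPython()

# Cell 3: the import should now resolve; appending the source path is a fallback.
import sys
sys.path.append("/path/to/mypkg")
from mypkg import module
```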

Latest Reply
Louis_Frolio
Databricks Employee
  • 1 kudos

Hey @newenglander, always great to meet a fellow New Englander! Could you share a bit more detail about your setup? For example, are you running on classic compute or serverless? And are you working in a customer workspace, or using Databricks Free ...

1 More Reply
GMB
by New Contributor II
  • 8216 Views
  • 5 replies
  • 1 kudos

Spatial Queries

Hi, I'm trying to execute the following code: %sql SELECT LSOA21CD, ST_X(ST_GeomFromWKB(Geom_Varbinary)) AS STX, ST_Y(ST_GeomFromWKB(Geom_Varbinary)) AS STY FROM ordnance_survey_lsoas_december_2021_population_weighted_centroids WHERE LSOA21CD ...

Latest Reply
ivan-kurchenko
New Contributor II
  • 1 kudos

@Corar You might want to enable that explicitly by setting the 'spark.databricks.geo.st.enabled' configuration to 'true'.
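Putting the reply together with the original query, a minimal sketch (table and column names come from the post; the truncated WHERE clause is omitted):

```python
# Enable the ST_* expressions, as suggested in the reply, then run the query from the post.
spark.conf.set("spark.databricks.geo.st.enabled", "true")

centroids = spark.sql("""
    SELECT LSOA21CD,
           ST_X(ST_GeomFromWKB(Geom_Varbinary)) AS STX,
           ST_Y(ST_GeomFromWKB(Geom_Varbinary)) AS STY
    FROM ordnance_survey_lsoas_december_2021_population_weighted_centroids
""")
centroids.show(5)
```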

4 More Replies
Saubhik
by New Contributor III
  • 1240 Views
  • 6 replies
  • 0 kudos

Getting [08S01/500593] Can't connect to database - [Databricks][JDBCDriver](500593) Communication

I am getting the below error when connecting to a Databricks instance using the JDBC driver. ERROR: [08S01/500593] Can't connect to database - [Databricks][JDBCDriver](500593) Communication link failure. Failed to connect to server. Reason: HTTP Response code: 401, ...

Latest Reply
Saubhik
New Contributor III
  • 0 kudos

I am trying to connect to Databricks from Mainframe z/OS using the JDBC driver, with the below IBM Java version: java version "11.0.26" 2025-01-21, IBM Semeru Runtime Certified Edition for z/OS 11.0.26.0 (build 11.0.26+4), IBM J9 VM 11.0.26.0 (build z/OS-Release...
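For reference, a sketch of the JDBC URL shape with personal-access-token authentication (AuthMech=3, UID fixed to "token"); a 401 response usually points at this auth section or an expired token. Host, HTTP path, and token are placeholders:

```python
# Illustrative URL only; it does not open a connection by itself.
jdbc_url = (
    "jdbc:databricks://<workspace-host>:443/default;"
    "transportMode=http;"
    "ssl=1;"
    "httpPath=/sql/1.0/warehouses/<warehouse-id>;"
    "AuthMech=3;UID=token;PWD=<personal-access-token>"
)
```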

5 More Replies
maikel
by New Contributor III
  • 533 Views
  • 5 replies
  • 1 kudos

Resolved! External MCP representing user data permissions

Hello Community! I am writing to you with a question and hope that you will help me find the right approach. I am building an AI enterprise system, and the organization stores the data on Databricks. To access the given data, you have to raise a request...

Latest Reply
smithsonian
New Contributor III
  • 1 kudos

Ignore for now that you have an MCP server. The problem you are trying to solve: 1) An AI agent needs to access data inside Databricks. 2) The agent needs to operate with the user's permissions. There are multiple paths: 1) Directly using OAuth/HTTP: https://docs.databric...

4 More Replies
__angel__
by New Contributor III
  • 1835 Views
  • 1 reply
  • 1 kudos

CREATE Community_User_Group [IF NOT EXISTS] IN MADRID(SPAIN)

Hi, I would like to get some support in creating a Community User Group in Madrid, Spain. It would be nice to host events/meetings/discussions... Regards, Ángel

Latest Reply
anastasia_lc
New Contributor II
  • 1 kudos

Hi Ángel, I see your post is from quite some time ago, but I wanted to say that I’d also love to see a Databricks User Group here in Madrid. Although I’m not new to Databricks, I haven’t really taken much advantage of the community so far due to lack o...

