Get Started Discussions
Start your journey with Databricks by joining discussions on getting started guides, tutorials, and introductory topics. Connect with beginners and experts alike to kickstart your Databricks experience.

Forum Posts

edice
by New Contributor
  • 261 Views
  • 1 reply
  • 0 kudos

Creating links in embedded dashboards

I am curious what Databricks AI/BI's capabilities are in terms of linking out to other dashboards from your dashboards. Can I create a text box that links out to another dashboard? I noticed you can also create links in a pivot table. What will happen if I e...

Latest Reply
yogeshsingh
Databricks Employee
  • 0 kudos

You can add a text widget to the canvas to insert hyperlinks to any URL, including other dashboards. In pivot tables, you can define URL paths for cell values from the Actions tab to make them clickable. For navigation within a dashboard, use “drill th...

ckoeber
by New Contributor
  • 34 Views
  • 1 reply
  • 0 kudos

Attempting to Use VSCode Extension: Error: No sync destination found

Hello, I have the Databricks Extension + Databricks Connect Python module installed. I am connected to the target environment with a running cluster, and I have all green checkmarks under the "Configuration" section. However, when testing a python file a...

Latest Reply
mark_ott
Databricks Employee
  • 0 kudos

The error message "No sync destination found" when running your Python file in Databricks with the Extension and Databricks Connect module, even with all green configuration checkmarks, typically indicates a configuration issue with the sync destinat...

Suheb
by New Contributor
  • 30 Views
  • 2 replies
  • 0 kudos

What’s the easiest way to clean and transform data using PySpark in Databricks?

You have some raw data (like messy Excel files, CSVs, or logs) and you want to prepare it for analysis — by removing errors, fixing missing values, changing formats, or combining columns — using PySpark (Python for Apache Spark) inside Databricks.

Latest Reply
ShaneCorn
New Contributor III
  • 0 kudos

The easiest way to clean and transform data using PySpark in Databricks is by leveraging the DataFrame API. Start by loading data into a Spark DataFrame with spark.read. Use built-in functions like dropna, fillna, and withColumn to handle missing val...

1 More Replies
tarunnagar
by New Contributor II
  • 43 Views
  • 1 reply
  • 0 kudos

How to Leverage Databricks for End-to-End AI Model Development

Hi everyone, I’m exploring how to use Databricks as a platform for end-to-end AI and machine learning model development, and I’d love to get insights from professionals and practitioners who have hands-on experience. Specifically, I’m curious about: Set...

Latest Reply
jameswood32
New Contributor III
  • 0 kudos

You can leverage Databricks for end-to-end AI model development by using its Lakehouse Platform, which unifies data engineering, analytics, and machine learning in one workspace. Start by ingesting and transforming data using Apache Spark and Delta L...

LLLMMM
by New Contributor III
  • 2340 Views
  • 4 replies
  • 2 kudos

Resolved! Try Databricks sign up failed

Hi, I am trying to use Databricks with the community edition. However, when I tried to create an account, the sign-up failed after I completed the puzzle. 

Latest Reply
sreedevi
New Contributor
  • 2 kudos

Unable to sign up for Try Databricks.

3 More Replies
jact
by New Contributor II
  • 101 Views
  • 1 reply
  • 1 kudos

Why keep both Azure OpenAI and Databricks?

Hi everyone, I’m curious to hear your thoughts on the benefits of having both Azure OpenAI and Azure Databricks within the same ecosystem. From what I can see, Databricks provides a strong foundation for data engineering, governance, and model lifecycl...

Latest Reply
nayan_wylde
Esteemed Contributor
  • 1 kudos

Two use cases I can think of: RAG: Use Databricks for vector indexing (e.g., via Delta Lake or FAISS) and Azure OpenAI for inference. Example: a chatbot that queries Databricks-hosted documents and uses GPT-4 for response generation. Agentic Workflows:...

int32lama
by New Contributor
  • 107 Views
  • 2 replies
  • 1 kudos

Resolved! Ingesting data from APIs Like Shopify (for orders), Meta Ads, Google Ads etc

Hi, I am trying to create some tables by calling the APIs of Shopify/Meta Ads/Google Ads and so on. Where will I make the API call? Is making API calls in notebooks considered the standard way to ingest in these cases? I intend to make a daily call to ge...

Latest Reply
dejivincent
New Contributor
  • 1 kudos

Hello @int32lama, I can help you with that if you are interested.

1 More Replies
nageswara
by New Contributor
  • 59 Views
  • 0 replies
  • 2 kudos

Databricks One

Databricks One is a user interface designed for business users, giving them a single, intuitive entry point to interact with data and AI in Azure Databricks, without needing to navigate technical concepts such as clusters, queries, models, or noteboo...

Hubert-Dudek
by Esteemed Contributor III
  • 80 Views
  • 0 replies
  • 1 kudos

The purpose of your All-Purpose Cluster

Small, hidden, but useful cluster setting. You can specify that no jobs are allowed on an all-purpose cluster, or, vice versa, configure an all-purpose cluster that can be used only by jobs. Read more: https://databrickster.medium.com/purpose-for-your-...
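This setting corresponds to the cluster's `workload_type` field in the Clusters API. A sketch of the relevant JSON fragment (the cluster name is hypothetical, and other required cluster fields are omitted for brevity):

```json
{
  "cluster_name": "shared-interactive",
  "workload_type": {
    "clients": {
      "notebooks": true,
      "jobs": false
    }
  }
}
```

With `"jobs": false`, job runs cannot attach to this all-purpose cluster; flipping the two booleans gives the reverse, a cluster usable only by jobs.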

tarunnagar
by New Contributor II
  • 106 Views
  • 1 reply
  • 1 kudos

How to Integrate Machine Learning Model Development with Databricks Workflows?

Hey everyone, I’m currently exploring machine learning model development and I’m interested in understanding how to effectively integrate ML workflows within Databricks. Specifically, I’d like to hear from the community about: How do you structure ML pi...

Latest Reply
jameswood32
New Contributor III
  • 1 kudos

You can integrate machine learning model development into Databricks Workflows pretty smoothly using the platform’s native tools. The main idea is to treat your ML lifecycle (data prep → training → evaluation → deployment) as a series of tasks within...
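As a sketch of that task-based structure, here is what a Jobs API 2.1 payload for such a pipeline could look like (the job name and notebook paths are hypothetical, and cluster settings are omitted for brevity):

```json
{
  "name": "ml-pipeline",
  "tasks": [
    {
      "task_key": "data_prep",
      "notebook_task": {"notebook_path": "/ML/data_prep"}
    },
    {
      "task_key": "train",
      "depends_on": [{"task_key": "data_prep"}],
      "notebook_task": {"notebook_path": "/ML/train"}
    },
    {
      "task_key": "evaluate",
      "depends_on": [{"task_key": "train"}],
      "notebook_task": {"notebook_path": "/ML/evaluate"}
    },
    {
      "task_key": "deploy",
      "depends_on": [{"task_key": "evaluate"}],
      "notebook_task": {"notebook_path": "/ML/deploy"}
    }
  ]
}
```

Each `depends_on` entry enforces the data prep → training → evaluation → deployment ordering, so a failed task stops the downstream steps.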

